
Data Analyst ETL

Location:
Sugar Land, TX
Posted:
September 17, 2020

Resume:

Chandrasekaran Vadivel

(US Citizen)

Mobile: 832-***-****

Email: adf6sg@r.postjobfree.com

Title: Data Analyst

SUMMARY

More than 10 years of experience in management, technology, the software development life cycle (SDLC), database programming, and data analyst roles on IT projects in the banking, financial, manufacturing, production, public-sector, and information technology industries.

Technology:

Enterprise Data Warehouse

Information Strategy and Architecture

Data Modeling

Business Intelligence

ETL & BI Tool consolidation

Data Integration

More than 8 years of programming experience as a Sr. Oracle PL/SQL Developer in the analysis, design, and implementation of business applications using the Oracle relational database management system.

Experienced in application development across all phases of the software development life cycle (SDLC), including design, development, testing, implementation, and maintenance, with timely delivery.

Proficient in object-oriented programming, with in-depth experience in database design and development using stored procedures, triggers, functions, and materialized views in Oracle 11g/10g.

Strong expertise in PL/SQL, dynamic SQL, collections, bulk operations, and analytical functions (a bulk-processing sketch follows this summary).

Good understanding of AutoSys jobs, using JIL to automate and monitor scripts that run periodically.

Extensively worked on Performance Tuning, Partitioning, Parallelism, Cursors and Ref Cursors.

Expertise in configuring & managing Oracle Table/Index Partitions in large Production databases.

Experienced in working with TOAD, PL/SQL Developer, Perforce, VSS, etc.

Proficient in Performance Tuning, Query Optimization and Data Migration using SQL*Loader.

Strong experience in RDBMS Concepts, Dimensional Data Modeling, Logical and Physical Data Modeling.

Extensively worked on ETL of data from flat-file and Oracle sources to Oracle databases.

Sound understanding and experience in all stages of a data warehousing project lifecycle (data requirements, data profiling and analysis, source to target mapping, reference data mapping, ETL Framework design, ETL design, ETL development, testing, system integration, operational design, job scheduling and documentation).

Developed presentations and presented data visualizations to management.

Worked as a Team Lead & Developer for large and small groups.

Strong analytical skills; creative in solving complex problems by thinking outside the box.

Good domain knowledge in Investment Banking and hands-on experience with trade and reference data, including securities, client, marketer, trader, and product data.

Highly motivated, organized, and results-oriented, with the ability to work independently and excellent interpersonal skills.
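The bulk-operations pattern mentioned above, as a minimal illustrative sketch; the table names (src_table, tgt_table) are hypothetical, not from any specific engagement.

DECLARE
  CURSOR c_src IS
    SELECT * FROM src_table;                   -- hypothetical source table
  TYPE t_rows IS TABLE OF src_table%ROWTYPE;
  l_rows t_rows;
BEGIN
  OPEN c_src;
  LOOP
    -- BULK COLLECT fetches rows in batches rather than one at a time
    FETCH c_src BULK COLLECT INTO l_rows LIMIT 1000;
    EXIT WHEN l_rows.COUNT = 0;
    -- FORALL binds the whole batch to the SQL engine in one round trip
    FORALL i IN 1 .. l_rows.COUNT
      INSERT INTO tgt_table VALUES l_rows(i);  -- hypothetical target table
    COMMIT;
  END LOOP;
  CLOSE c_src;
END;
/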

TECHNICAL SKILLS

Business Intelligence Tools

Business Objects XI 3.1 Enterprise Suite – Central Management Console, Central Configuration Manager

ETL Tools

SAP BI 7.0, SAP BW 3.5, Data Warehousing, Microsoft BI (SSRS, SSIS)

Web Design Tools

HTML, XHTML, CSS, XML, XSLT, XPath, XQuery, XLink

Databases

Oracle 11g/10g/9i/8i/7.3, SQL server 2012/2008/2005, Netezza, Informix, MS Access

Operating Systems

Windows 2000/XP, Windows Vista, UNIX, Solaris, LINUX

Front-end Tools

TOAD 8.5, Business Objects, VBA, BEx, Developer 2000 (Forms 4.5, Reports 2.5), Discoverer 2000/3.x, SQL*Loader, MS Office (Access, Excel, Word, PowerPoint), Excel Macros

Programming Languages

C, C++, Java, Visual Basic, VBA, Python, SQL, PL/SQL, UNIX scripting, JavaScript, VBScript

EDUCATION

M.S. in Computer Science and Applications

M.S. in Mathematics and Computer Applications

Yellow Belt Six Sigma Certification

PROFESSIONAL EXPERIENCE

Client: XPO Logistics Corp, NC Feb 2020 – Jun 2020

Title: Data Analyst

EBI Scorecard: This was a change to an existing application. The business team wanted a change made at the source level and carried from source to the target systems; I was involved in the analysis, design, development, testing, and implementation of this change. Previously, data was brought from source to target based on location alone. The change was to bring the data based on the combination of location and cost center for 12 KPI codes, with the data differing for each KPI code. I modified the entire ETL process to load data by the combination of location and cost center, which reduced duplicates and data-entry time in the application. The reporting team used the talent reporting tool to process the data from the target system and built the EBI Matrix in the reporting system.
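A minimal sketch of the new load grain; the staging table and columns (stg_kpi_feed, location_id, cost_center_id, kpi_value) are hypothetical names for illustration, and the actual rules differed per KPI code.

SELECT kpi_code,
       location_id,
       cost_center_id,                -- new: cost center added to the grain
       SUM(kpi_value) AS kpi_value
  FROM stg_kpi_feed                   -- hypothetical staging table
 WHERE kpi_code IN ('KPI01', 'KPI02' /* ... the 12 in-scope KPI codes */)
 GROUP BY kpi_code, location_id, cost_center_id;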

Role

Involved in requirement gathering, analysis, design, and development to achieve the data objectives.

Designed and developed the ETL process for the change requested by the business team.

Developed new programs using SQL/PLSQL and stored procedures (SPs) to process the data for business needs.

Wrote multiple SQL and PL/SQL programs, stored procedures, cursors, triggers, and packages to extract the data.

Processed large volumes of data using SQL/PLSQL code per business rules and checked data quality in the target system.

Heavily involved in coding, testing, debugging, and tuning the code.

Exported and manipulated data for business needs and report analysis.

Developed database triggers for audit and validation purposes.

Analyzed data lineage: the origination, movement, and changes of the data over time through the process.

Used Visio and Erwin tools for data modeling.

Worked with the business, ETL, testing, and DBA teams to streamline and fix the data.

Experienced in creating, documenting, and maintaining processes.

Client: Chevron Corp, Houston, TX Aug 2019 – Feb 2020

Role: Data Analyst

Equipment Usage Project: This project analyzed data on used, unused, and currently in-use plant equipment. The purpose was to identify the age of equipment across different overseas plants. I was fully involved on the database side, writing many queries, functions, procedures, cursors, and triggers to perform the ETL of the data. Once the ETL was complete, the data was validated for use; when ready, the reporting team pulled it from the destination for analysis through reporting.
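An illustrative query for the kind of age and usage classification described above; every table and column name here (equipment, commissioned_date, last_used_date) is an assumption, not the project's actual schema.

SELECT plant_id,
       equipment_id,
       FLOOR(MONTHS_BETWEEN(SYSDATE, commissioned_date) / 12) AS age_years,
       CASE
         WHEN last_used_date IS NULL                    THEN 'NEVER USED'
         WHEN last_used_date < ADD_MONTHS(SYSDATE, -12) THEN 'NOT IN USE'
         ELSE 'IN USE'
       END AS usage_status
  FROM equipment;   -- hypothetical table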

Role

Involved in requirement gathering and in analyzing the data to achieve the goal.

Designed the process document and converted the design document into a technical document.

Developed new programs using SQL/PLSQL and stored procedures (SPs) to process the data.

Wrote multiple SQL and PL/SQL programs, stored procedures, cursors, triggers, and packages.

Processed large volumes of data using SQL/PLSQL code per business rules and checked data quality in the system.

Heavily involved in coding, testing, debugging, and tuning.

Developed database triggers for audit and validation purposes (a sample audit trigger follows this list).

Used Visio and Erwin for data modeling.

Identified and fixed defects in coordination with the ETL team.
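A minimal sketch of the audit-trigger pattern mentioned above; the base table and audit columns are illustrative assumptions.

CREATE OR REPLACE TRIGGER trg_equipment_audit
AFTER INSERT OR UPDATE OR DELETE ON equipment    -- hypothetical base table
FOR EACH ROW
BEGIN
  INSERT INTO equipment_audit                    -- hypothetical audit table
    (equipment_id, action_cd, changed_by, changed_on)
  VALUES
    (COALESCE(:NEW.equipment_id, :OLD.equipment_id),
     CASE WHEN INSERTING THEN 'I' WHEN UPDATING THEN 'U' ELSE 'D' END,
     USER, SYSDATE);
END;
/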

Client: Wells Fargo Bank, Chandler, AZ Jul 2018 – Jul 2019

Title: Data Analyst

VRDW Project: The Vulnerability Remediation Data Warehouse is an application that provides a consistent, secure, and comprehensive platform through which all current and historical enterprise computer-system vulnerability data can be tracked, analyzed, and reported. Vulnerability data, both historical and current, is collected from different sources (IRDB, QUALYS, TMRO) and fed into a CAR database through an ETL process. The data from the CAR DB is extracted and transferred to a data warehouse called VRDW; from VRDW, data is pushed to local data marts, REG_DM and QVR_DM, from which reporting is done using Tableau. Multiple reports were designed, developed, and generated (weekly, monthly, quarterly, and yearly) for analysis. On this project, I was involved in all phases: requirement gathering, design, development, deployment, and maintenance.
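A sketch of the kind of incremental load used between a staging area and the warehouse; all object names (car_vuln_stg, vrdw_vulnerability) are illustrative, not the project's actual schema.

MERGE INTO vrdw_vulnerability tgt
USING (SELECT vuln_id, host_id, severity, status, last_scan_date
         FROM car_vuln_stg) src                  -- hypothetical staging table
   ON (tgt.vuln_id = src.vuln_id AND tgt.host_id = src.host_id)
 WHEN MATCHED THEN UPDATE SET
      tgt.severity       = src.severity,
      tgt.status         = src.status,
      tgt.last_scan_date = src.last_scan_date
 WHEN NOT MATCHED THEN INSERT
      (vuln_id, host_id, severity, status, last_scan_date)
      VALUES (src.vuln_id, src.host_id, src.severity, src.status, src.last_scan_date);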

Involved in requirement analysis, design, and development of the entire project.

Created conceptual, logical and physical data models for this project.

Helped development teams normalize and de-normalize data from the physical data models.

Developed new programs using SQL/PLSQL and stored procedures (SPs) to process data per business needs.

Wrote multiple SQL and PL/SQL programs, stored procedures, cursors, triggers, and packages to implement new business rules per the specification defined for the project.

Processed large volumes of data using SQL/PLSQL code per business rules and checked data quality in the system.

Heavily involved in coding, testing, debugging, and tuning.

Developed database triggers for audit and validation purposes.

Used Visio and Erwin for data modeling.

Identified and fixed defects in coordination with the ETL team.

Interacted with DBAs on data-load delays and issues.

Interacted with Business Owners and IT partners during project development and releases.

Experienced in Microsoft BI (SSIS, SSRS, SSAS); used the SSIS ETL tool for data integration.

Involved in Metadata Analysis

Prepared documentation for the entire project, including the Sunrise and Sunset processes.

The entire project was developed in an agile environment.

Environment: Oracle 11g, SQL Server 2014, Netezza, SQL, PL/SQL, ETL, Data Warehouse, Microsoft BI, SSIS, VB, VBScript, JavaScript, MS Excel, MS PowerPoint, MS Access, Visio, Erwin, Windows 10

Client: Bank of America, Dallas, TX Sep 2016 – Jul 2018

Role: Data Analyst/Database Programmer

Database migration: Involved in an Oracle 11g-to-12c database migration. I took ownership of replicating the data from the Foredm schema and later migrating the schema objects, moving the entire Foredm schema from 11g to 12c. In this migration, I was involved in analysis, development, functional testing, and release. The database size was 400 TB.
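A schema-mode migration like this is typically driven by Oracle Data Pump. Below is a minimal, illustrative export sketch using the DBMS_DATAPUMP PL/SQL API; the job, file, and directory names are assumptions, not the job actually run, and the corresponding import into 12c would use the same API with operation => 'IMPORT' (or the expdp/impdp command-line tools).

DECLARE
  h  NUMBER;
  st VARCHAR2(30);
BEGIN
  h := DBMS_DATAPUMP.OPEN(operation => 'EXPORT',
                          job_mode  => 'SCHEMA',
                          job_name  => 'FOREDM_EXP');
  DBMS_DATAPUMP.ADD_FILE(h, 'foredm.dmp', 'DP_DIR');  -- dump file in a DIRECTORY object
  DBMS_DATAPUMP.METADATA_FILTER(h, 'SCHEMA_EXPR', 'IN (''FOREDM'')');
  DBMS_DATAPUMP.START_JOB(h);
  DBMS_DATAPUMP.WAIT_FOR_JOB(h, st);                  -- blocks until the job completes
END;
/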

CFPB300 (forecast of foreclosure loans): Designed and developed a database application in which data flows from source to landing, landing to staging, and staging to the data marts. Developed the code to meet the business requirements and defined new reports per business needs: daily, monthly, quarterly, and yearly, as requested by the business users. The reports show the forecast of foreclosure loans in the next 30, 60, 90, and 180 days from the current date, as well as the percentage increase in foreclosure loans over the past three years.
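The 30/60/90/180-day view reduces to a simple bucketing query; a sketch with hypothetical names (dm_fc_forecast, projected_fc_date), not the production report SQL.

SELECT forecast_bucket, COUNT(*) AS loan_count
  FROM (SELECT CASE
                 WHEN projected_fc_date <= SYSDATE + 30  THEN '01: 0-30 DAYS'
                 WHEN projected_fc_date <= SYSDATE + 60  THEN '02: 31-60 DAYS'
                 WHEN projected_fc_date <= SYSDATE + 90  THEN '03: 61-90 DAYS'
                 ELSE                                         '04: 91-180 DAYS'
               END AS forecast_bucket
          FROM dm_fc_forecast                 -- hypothetical data mart table
         WHERE projected_fc_date <= SYSDATE + 180)
 GROUP BY forecast_bucket
 ORDER BY forecast_bucket;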

Involved in the GMT (Global Markets and Trading) QZ Trades Mortgages project. There are three types of data: market data, reference data (static/dynamic, to help classify instrument-level data), and trade data, all currently in Netezza (RDBMS). The objective of the project was to move the data from Netezza into a new system called the Sandra database (an object-oriented database) running on the QZ platform. My role was to write SQL/PLSQL programs to check the data in the source system and the quality of the data in the target system, and to tune SQL queries in Netezza. I also developed reports from the source data to verify that the target-system data was transformed per business rules.
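Source-versus-target checks of this kind typically reduce to reconciliation queries run on both sides and compared; a minimal sketch in which the trade table and columns are assumptions.

-- Run against both the source (Netezza) and the target, then diff the outputs.
SELECT trade_date,
       COUNT(*)          AS row_cnt,
       SUM(notional_amt) AS total_notional   -- control total to catch value drift
  FROM trades                                -- hypothetical table
 GROUP BY trade_date
 ORDER BY trade_date;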

Involved in the EIT (Enterprise Independence Testing) project. EIT is essentially data-quality testing: once the database is built in the source system, the quality of the data is analyzed and tested against standards before the data is moved to the target system; after the move, the quality of the data is analyzed and tested in the target system as well.

The GDPP-GFiR project aggregates (point-of-instrument-level) data for multiple finance groups. Heavily involved in data analysis and database testing to maintain the quality of the data. Helped create data flow diagrams (DFDs) for the current and future states, prepared the PDE (Physical Data Elements) list for all fields required for reporting, and created a new SLD document for the existing system that applies to the future system; this document is highly useful to the ETL team for bringing data from source to target per business needs. Involved in checking data quality, validating data, and identifying and fixing defects. Wrote multiple test cases to check the data against the business requirements for reporting. The same data analysis applied to the weekly, monthly, quarterly, and yearly data loads.

Involved in requirement analysis, design, and development of the entire project, including data modeling.

Developed new programs using SQL/PLSQL functions, stored procedures (SPs), cursors, and packages to process data per business needs.

Heavily involved in data analysis and gap analysis, and provided recommendations for better performance.

Optimized and tuned SQL queries for better performance of the query results.

Created the PDE list and SLD, and followed the BRD to implement the new business needs for reporting.

Involved in requirement analysis, data analysis, database design, and report analysis.

Processed large volumes of data using SQL/PLSQL code per business requirements.

Checked data quality in the target system against source systems.

Heavily involved in coding, testing, debugging, documenting and maintaining programs.

Developed database triggers for audit and validation purposes.

Used AutoSys to automate the data load (a sample JIL job definition follows this list).

Identified and fixed defects in coordination with the ETL team.

Troubleshot major issues in database programming and SQL programs.

Interacted with DBAs on data-load delays and issues.

Experienced in Microsoft BI (SSIS, SSRS, SSAS); used the SSIS ETL tool for data integration.
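A minimal AutoSys JIL definition of the kind used to schedule such a load; every value below (job name, script path, machine, owner, times) is illustrative.

insert_job: daily_data_load   job_type: CMD
command: /apps/etl/bin/run_daily_load.sh     /* hypothetical wrapper script */
machine: etl_host01
owner: etluser
start_times: "02:00"
days_of_week: all
std_out_file: /apps/etl/logs/daily_data_load.out
std_err_file: /apps/etl/logs/daily_data_load.err
alarm_if_fail: 1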

Environment: Oracle 11g, SQL Server 2012, Netezza, Python, SQL, PL/SQL, ETL, Data Warehouse, Microsoft BI, SSIS, VB, VBScript, JavaScript, ASP, MS Excel, MS PowerPoint, MS Access, Visio, Erwin

Client: City of Houston, Houston, TX Aug 2015 – May 2016

Title: Sr. Information/Data Analyst / Database Programmer

ARDatamart and Executive Insight were two different project streams. ARDatamart covers accounts receivable for the City of Houston (COH); Executive Insight covers cost savings and account savings for COH. Parking, Fire Alarm, EMS, and Ad Valorem are projects under ARDatamart; Fleet, HR (missed discounts, employee productive and non-productive hours), Court, Library, and Recreation come under Executive Insight. I was involved in both the ARDatamart and Executive Insight projects.

I successfully completed the Fleet Management project. Across all projects, my involvement covered requirement gathering, database design, data modeling, data extraction, data transformation, data loading, and data analysis. The projects dealt with producing several business-critical reports as insights for executives; detailed functional and technical documentation was prepared and submitted to the technical teams as well. Involved in requirement analysis, design, and development of the entire projects, including data modeling.

Developed new programs using SQL/PLSQL functions, stored procedures (SPs), cursors, and packages to process data per business needs.

Involved in gathering functional/business requirements and converting them into technical designs. Developed many manual reports (weekly/monthly/periodic/ad hoc) and sent them to the business team.

Involved in data profiling, data cleansing, data standardization, and data management.

Provided user and system support and handled all kinds of ad hoc requests within the agreed time frame.

Optimized and automated various manual reports: weekly, monthly, quarterly, and yearly.

Heavily involved in data analysis and gap analysis, and provided recommendations for better performance.

Experienced in Microsoft BI (SSIS, SSRS, SSAS); used the SSIS ETL tool for data integration.

Optimized and tuned SQL queries for better performance of the query results.

Environment: Oracle 11g, Oracle 12c, SQL Server 2012, Data Warehouse, SQL, PL/SQL, Functions, Procedures, Packages, AutoSys, MS Excel, MS PowerPoint, MS Access, Visio.

Client: JP Morgan Chase, Dallas, TX Oct 2011 – Jul 2015

Title: Data Analyst/Database Programmer

Owned Reporting (MIS) is an important portfolio within Chase Home Finance. Owned Reporting mainly comprises reporting on loans primarily owned and serviced by Chase, WaMu, and EMC. The reporting categories include reporting based on the Mortgage Service Portfolio (MSP), Home Equity (HE), and VLS (Vendor Loan Servicing) platforms. The data from most of these systems resides in the INFOPROD source Oracle DB; based on business requirements, some of these loans are transformed into the BUSPROD target DB. Most reporting is done from the BUSPROD database, though reporting is possible from both INFOPROD and BUSPROD. The Obama administration and Federal Housing Administration policy of helping people “stay in their homes” created many loss-mitigation opportunities, especially loan modification and extension, and reporting based on these loss mitigations. The project involved the design, development, and automation of several waterfall, pipeline, and performance reports for each of these initiatives (Account Manager Modification, 2MP, HAMP, CHAMP, FHA/VA, etc.).

Technical Information Analyst Lead for the VLS Home Equity Modification Team.

Involved in requirement analysis and design and development of new Reports and reporting architecture.

Involved in developing and testing Oracle stored procedures, functions, and packages for several reports.

Created new reports, namely the 2MP Template Pipeline, and modified the 2MP Fortracs Pipeline.

Implemented UNO changes in existing reports using PL/SQL code.

Involved in development of various strategic initiatives for 2MP namely 2MP Performance Tracker, 2MP LPS Ever Matched Report, and 2MP Waterfall Reports.

Extensively involved in developing PL/SQL packages: functions, procedures, cursors, and triggers.

Involved in tuning queries in Oracle SQL and PL/SQL, including advanced bulk techniques.

Experienced in SQL/PLSQL, functions, procedures, and Microsoft BI (SSIS, SSRS, SSAS); used data integration tools to integrate data.

Heavily involved in database design and data structures using ER and MDM models, and in database programming using SQL.

Gathered functional/business requirements for the majority of the manual reports sent to the business team on a daily basis.

Optimized and automated various manual reports.

Troubleshot minor business questions on the HE Ownership Tracker and the PR Ownership Trackers.

Troubleshot minor user/business questions on the 120Plus Loss Saves Tracker report.

Gathered functional/business requirements for the 150 Plus Delinquency Report (during the initial stages).

Prepared modification-related documentation for the Chase Owned Reporting team (write-up based on the master table).

Collected and consolidated several business data points.

Interacted with DBAs in case of data load delays or issues.

Environment: Oracle 11g/10g/9i, SQL Server 2008, SQL, PL/SQL, Business Objects XI R3, Live Office, Webi, QaaWS, Universes, Crystal Reports, Cognos Reporting, Tableau, Excel automation, TOAD, MS Office, Windows

Client: Lufkin Industries, Lufkin, TX Jan 2011 – Sep 2011

Title: Data Analyst

Founded in 1902, Lufkin Industries Inc. is a vertically integrated company that designs, engineers, manufactures, sells, installs and services high quality and high value-added oil field equipment and power transmission products across the globe. The main business divisions are Oil Field, Power Transmission, Automation and Foundry.

Worked with business users on requirements gathering and business analysis, and wrote technical specifications for the reports. Involved in analyzing, understanding, finalizing, and documenting the data from the reports.

Created several reports and modified the existing reports as per the reporting requirements.

Interacted with the MIS team and LOB to ensure reports were accurate and deployed in a timely manner.

Worked on Oracle databases and wrote SQL, PL/SQL, functions, procedures, triggers, and cursors.

Created process chains to automate the data-loading process into different data targets.

Involved in translating business process into business and technical requirements.

Extensive experience in the ETL process from source systems to target systems (SAP).

Designed extraction logic & applied business transformations to load data from multiple Data sources.

Experienced in Data warehouse, data migration, data transformation and data transfer process.

Developed ad hoc and flexible queries using filters and navigational attributes in the BEx Analyzer to facilitate drill-down data analysis at detailed levels of information.

Created analytical reports using advanced Excel features and macros, such as pivot tables and charts.

Involved in multi-dimensional data analysis, using slice-and-dice and drill-down techniques to identify patterns.

Assisted in testing, created test scenarios, and troubleshot ETL data issues.

Created Universes, Webi and Crystal Reports, QaaWS, and Xcelsius dashboards using BO Enterprise.

Worked with end users, data architects, and the ETL team to evaluate and validate source-to-target mappings.

Wrote SQL code in start, end, and expert routines to modify the data, in transfer rules to clean up the data, and in update rules to implement the business rules.

Created and developed custom programs for data conversion for applications.

Interacted with Business Analysts, DBAs, ETL team and Production Support team to solve data issues.

Created MultiProviders to analyze data across different modules, and DSOs to store document-level data and restrict duplicate data. Checked the data quality in different targets by comparing the data with the tables.

Responsible for the delta update process, loading data from the Oracle RDBMS at specific intervals.

Improved query performance by using the appropriate query read mode, aggregates, compression, indexes, and partitions, and by running BI and DB statistics.

Worked with InfoPackages for transaction data and master data to bring the data into the PSA, and used DTPs to load the data from the PSA into the data targets.

Created process chains for effective scheduling of various batch processes to automate and streamline the data-loading process.

Identified efficient data-processing and loading approaches by writing SQL code in transformations.

Developed flexible queries using reusable components to facilitate data analysis (slicing and dicing the data) with drill-down to detailed levels of information.

Created aggregates, compressed InfoCubes, updated DB statistics regularly, and rebuilt missing indexes to improve query response.

Created various Crystal Reports, weekly and monthly, per the client’s requirements.

Developed ad-hoc reports using pre-built templates with report layouts, structures for internal users

Involved in creating multi-dimensional reports using slice-and-dice and drill down, up, and across different hierarchies per the scope of the data analysis.

Involved in user training, knowledge transfer and documentation.

Environment: Oracle 10g/11g, SQL, PL/SQL, SQL Server 2005, SAP BI/BW 3.5, 7.0, R/3, BEx Query Designer, WAD, Report Designer, Business Objects XI R3, Live Office, Webi, QaaWS, Universes, Crystal Reports, TOAD, MS Office

Company: eTransX, Inc, Houston, TX May 2007 – Jan 2011

Title: Data Analyst

eTransX implemented its data integration (ETL) tool for clients to extract, transform, and load data from various RDBMS and flat-file systems into a data warehouse system. I was involved as a Data Analyst; my key responsibilities included the following:

Responsibilities:

Involved in requirement gathering across the Project Preparation, Business Blueprint, Realization, Final Preparation, and Go-Live and Support phases (SDLC).

Translated Business requirements into ETL design specifications.

Assisted business analysts and the ETL team with subject expertise in source-to-target data flows.

Maintained data integrity during extraction, manipulation, processing, analysis and storage.

Built data flows/routes for data transactions, data conversions, and data transformations.

Developed Data Conversion Matrix to carry out the required data transformation

Created and developed custom programs for data conversion for applications.

Performed database transaction and security audits

Resolved database access and performance issues.

Experience with Skyward information system

Developed complex reports using various functions and advanced Excel features.

Wrote SQL and PL/SQL code to process data and transform data between tables.

Developed ad hoc reports and customized existing reports per user requirements.

Prepared documentation and ad-hoc reports using VBA, Query writing, SQL and Macros.

Wrote scripts (JavaScript, VBScript) where custom changes were required.

Built transformations and mappings between data sources and data targets.

Worked extensively on extracting data from source systems to target systems (a sample flat-file load control file follows this list).

Involved in unit testing, integration testing, data validation, and UAT.

Provided support to business users on how to use the software and how to interpret the contents of the data.
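A minimal SQL*Loader control file for the kind of flat-file load referenced above; the file, table, and column names are illustrative assumptions.

-- load_customers.ctl (run as: sqlldr control=load_customers.ctl)
LOAD DATA
INFILE 'customers.dat'
APPEND
INTO TABLE stg_customers
FIELDS TERMINATED BY ',' OPTIONALLY ENCLOSED BY '"'
TRAILING NULLCOLS
(
  customer_id,
  customer_name,
  city,
  state,
  created_dt  DATE "YYYY-MM-DD"
)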

Environment: Oracle 8i, SQL Server, VB, VBScript, ASP, SQL, PL/SQL


