Data Analyst

Location: Cary, NC
Posted: February 25, 2021


Data Analyst/ETL Developer with about **+ years of consulting experience driving work streams focused on, but not limited to, data standards, data integration, data architecture, data quality management, metadata and data semantics, data loading, and data analytics for BI and data governance. Skills include:

Data Governance / Data Quality

ETL / Data Integration

Data Modelling

Data Analysis

Data Profiling

Reporting & Analytics

Test Data Management

Hadoop, HDFS, Hive, Gremlin, HBase

MS Power BI, Tableau

IBM DataStage

PL/SQL, XML, Unix, Shell Scripts

Oracle, SQL Server, PostgreSQL, DB2, Performance Tuning (DB Hints)

Agile Methodology

Cross Functional Team Coordination

SQL Developer, AQT Tool, Toad, PgAdmin4

JIRA, HP-ALM Tool, QTest

MS Office (Access, Excel, Word, PowerPoint, Visio, Project Planning)

Information Analyzer, QualityStage, Business Glossary

Experience Summary

Well versed in the end-to-end Data Management cycle, spanning Data Governance, Data Quality, Master Data Management, Data Integration, and Reporting layers.

Significant business knowledge in the Healthcare, Telecommunications, Retail & Insurance industries.

Experience developing strategies for Extraction, Transformation and Loading (ETL) mechanisms using IBM Information Server (DataStage).

Strong knowledge of ETL methodology supporting data extraction, transformation, and loading processes.

Good experience with Information Management tools such as Information Analyzer, QualityStage & Business Glossary.

Good experience generating business reports using data visualization tools such as Tableau & Power BI.

Good knowledge of cloud databases such as BigQuery.

Good knowledge of big data ecosystems, tools, and techniques.

Experience migrating data from legacy systems into data lake platforms.

Strong experience in data mining using SQL queries and in performance tuning of long-running queries.

Strong hands-on knowledge of application development, DW platforms, and databases (Oracle, MS SQL, DB2): data warehousing, application software SDLC, and business reporting.

Extensive Experience in writing SQL Scripts and Unix Scripts.

Proficient in the collation and analysis of business requirements, conducting feasibility studies, estimation, development, implementation, and testing of deliverables.

Led an offshore team of around 5–7 members, assigning work and monitoring day-to-day activities through daily status meetings.

Professional Experience

MetLife Inc. - Cary, NC. Apr 2018 – Present

Data Analyst/Data Lead – As a member of the Data Management team, responsible for data integration and big data transformation assignments.

Responsible for ingesting data into Hadoop environments from admin systems and core finance systems.

Validate data from source systems across the landing zone, Hadoop cluster, and Oracle data marts before it is consumed by the analytics layer (a count-reconciliation sketch follows this section).

Work with AD and QA teams to validate transformation rules and migrate the data.

Work with data quality teams before pushing data into the staging area and analytics platform.

Collaborate with the reporting team to ensure the required data is available in the Oracle data marts for QlikView and Tableau reports.

Work with business teams to analyze various databases, such as Account, Party, Plans & Claims data, for developing models.

Extract data from data warehouse tables and data marts.

Worked on data migration using DataStage from admin systems to the data warehouse used by MetLife applications.

Working on a data experimentation effort to ease data mining for testers by linking legacy SOR databases (Customer & Plan data) with employee extracts and then using Power BI to mine the employee data.

Managing a team of onsite and offshore members for this effort, and serving in the program command center to work with other teams on critical data support activities.

Driving tasks from the project plan and updating stakeholders on progress and best practices for the data migration.
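
Below is a minimal sketch of the kind of count reconciliation used in that validation step, assuming hypothetical staging and data mart tables (stg_claims, dm_claims) and a load_date column; the actual table and column names varied by feed:

    -- Compare row counts per load date between the staging table and the
    -- Oracle data mart; mismatches flag loads to investigate.
    -- stg_claims, dm_claims, and load_date are hypothetical placeholders.
    SELECT s.load_date,
           s.src_count,
           t.tgt_count,
           s.src_count - NVL(t.tgt_count, 0) AS diff
      FROM (SELECT load_date, COUNT(*) AS src_count
              FROM stg_claims
             GROUP BY load_date) s
      LEFT JOIN (SELECT load_date, COUNT(*) AS tgt_count
                   FROM dm_claims
                  GROUP BY load_date) t
        ON t.load_date = s.load_date
     WHERE s.src_count <> NVL(t.tgt_count, 0)
     ORDER BY s.load_date;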

Macy’s Inc. - Johns Creek, GA. May 2017 – Apr 2018

Senior ETL Developer – Member of the Data Services team responsible for transferring data from EOS to marketing platform databases, from which campaigns generate real-time offers via direct mail, SMS, or email.

Involved in business requirements gathering discussions and analysis.

Loaded data into the MODS schema, from which SAS campaigns trigger to generate the real-time offers.

Performed impact analysis of new data elements and data type changes, along with the workflow of the current ODS and flattened target structures and the new elements introduced in 1B & 3C.

Updated the source-to-target mapping documents per the phase 1A to 1B changes.

Performance-tuned DataStage jobs to reduce loading time for the current phase 1A and for the historical and conversion data for phase 1B.

Developed complex DataStage jobs to load high-volume and flattened tables quickly.

Reviewed code and unit testing results and analyzed DataStage job performance.

Defined the DataStage sequence- and job-level dependencies to schedule them in Control-M at daily, weekly, and monthly frequencies, keeping load balance in check.

Ran the production DataStage jobs for conversion and historical data from the 1A to the 1B structure.

Analyzed rejected data by comparing it against the source system tables.

Created a method to reprocess rejected data over the weekend to meet the daily SLA (a sketch of this reprocessing follows this section).

Supported the QA team during the testing phase of conversion and historical loading.

Coordinated daily with the offshore team on development, defect analysis, and fix tracking during the development and QA phases of the project.

Loaded historical data and analyzed target tables with respect to rejects that occurred during loading.

Captured loaded-data counts for the audit process from RTO staging to ODS and then to the SASFS data mart, and validated the data flow all the way to the target tables.

Troubleshot data loading errors and duplicate data received, coordinating with source systems IDM, CIM & CC2 to fix issues on their side.

Created and maintained source code versioning in SVN for DataStage packages, scripts, and DDL.

Actively participated in the post-implementation process and tracked issues to closure.

Handled the PSS handover process and application training, covering batch run modes and the different batch run frequencies used.
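
Below is a minimal sketch of the weekend reject-reprocessing and audit-count checks described above; rto_rejects, rto_staging, ods_target, sasfs_target, and the status columns are hypothetical placeholders for the actual RTO/ODS/SASFS structures:

    -- Requeue rejected rows so the weekend batch picks them up again.
    -- process_status and retry_count are hypothetical placeholders.
    UPDATE rto_rejects
       SET process_status = 'REPROCESS',
           retry_count    = retry_count + 1
     WHERE process_status = 'REJECTED';

    -- Audit check: counts captured at each hop should reconcile end to end.
    SELECT (SELECT COUNT(*) FROM rto_staging)  AS staging_cnt,
           (SELECT COUNT(*) FROM ods_target)   AS ods_cnt,
           (SELECT COUNT(*) FROM sasfs_target) AS mart_cnt
      FROM dual;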

Verizon Inc. - Colorado Springs, CO. Jul 2016 – May 2017

Senior DataStage Developer – Worked with the business team on migrating data from the newly acquired company (MCI Circuits) to Verizon systems, implementing the ETL process with IBM Information Server for this migration activity.

Worked closely with the business teams in gathering business requirements and turning them into functional requirements.

Extensively involved in the data migration team, building reusable DataStage job templates, common parameter sets, common DataStage job containers, SQL extract procedures, and common reusable shell scripts.

Worked with business analysts to identify and develop business requirements, transform them into technical requirements, and take responsibility for deliverables.

Designed and developed DataStage jobs for loading staging data from sources such as Oracle and DB2 into the data warehouse, applying business rules that cover data loads, data cleansing, and data massaging.

Provided staging solutions for data validation and cleansing with PL/SQL and DataStage ETL jobs.

Implemented Aggregate, Filter, Join, Lookup, Rank and Update Strategy transformations.

Extensively worked with DataStage 9.1 Designer and Director to load data from source extract files to the warehouse.

Created DataStage jobs to load data from sequential files, CSV, Flat files and DB2.

Cleansed and standardized data using QualityStage.

Developed various jobs using Aggregator and Sequential File stages.

Used FastLoad and MultiLoad scripts to load data into DB2 from flat files.

Worked towards optimal performance when using stages such as Lookup, Join, and Merge.

Used the DataStage Director and its run-time engine to schedule solution runs, test and debug components, and monitor the resulting executable versions.

Created Parameters and Parameter sets where necessary.

Tuned DataStage jobs for better performance by creating DataStage hashed files for staging data and lookups.

Scheduled the server jobs, which are controlled by the DataStage engine, using DataStage Director, and used it for monitoring and per-stage performance statistics.

Developed Shell Scripts to automate file manipulation and data loading procedures.

Humana Inc. - Louisville, KY. Mar 2015 – Jul 2016

Senior Process Analyst – Worked with business SMEs from Humana Financial Recovery teams on loading, mining & analyzing Members, Providers, Plans & Claims data. The SMEs in turn use this data to analyze and identify fraudulent payments made to providers on claims & pharmacy, and, where found, initiate recovery of the overpaid claims from providers.

Worked closely with the business teams in gathering business requirements from the Query Doc tool (a Humana tool) and turning them into functional requirements.

Designed SQL queries based on those requirements to mine and analyze data.

Tuned DB queries and improved performance using Oracle DB hints.

Designed, developed and tested the DataStage jobs using Designer and Director based on business requirements and business rules to load data from source to target tables.

Designed the source-to-target mapping document linking different source systems to the data warehouse for loading data into Oracle tables.

Extensively used built-in stages to build server and parallel jobs that extract, transform, and load data.

Worked with the client on ad hoc reports and monthly generated reports.

Involved in configuring and installing IBM InfoSphere Enterprise Edition software on workstations.

SCIO Health Analytics. - Chennai, India. Sep 2011 – Feb 2015

Subject Matter Expert.

Analyzed requirements gathered from the Query Doc tool (a Humana tool) and turned them into functional requirements.

Designed SQL queries based on those requirements to mine and analyze data.

Worked on a large set of databases to mine data for payment analysis, implementing query performance tuning techniques by forcing DB hints in queries, such as USE_HASH, USE_NL, and PARALLEL (a hedged example follows this section).

Used appropriate join, index, and correlated-query techniques based on the volume of data to traverse when mining data from tables.

Designed, developed and tested the DataStage jobs using Designer and Director based on business requirements and business rules to load data from source to target tables.

Interviewed and trained new recruits on domain and functional knowledge as part of their onboarding process.
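
A hedged illustration of the hint-based tuning mentioned above: USE_HASH and PARALLEL are standard Oracle optimizer hints, but claims_fact, provider_dim, and the degree of parallelism shown are placeholders, not the actual structures used:

    -- Force a hash join and a parallel scan of the large claims table
    -- so a full-table aggregation beats nested-loop index probing.
    -- Table names and parallel degree are hypothetical.
    SELECT /*+ USE_HASH(c p) PARALLEL(c, 8) */
           p.provider_id,
           SUM(c.paid_amount) AS total_paid
      FROM claims_fact c
      JOIN provider_dim p
        ON p.provider_id = c.provider_id
     WHERE c.service_date >= DATE '2015-01-01'
     GROUP BY p.provider_id;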

Techno Soft Solutions. - Visakhapatnam, India. Feb 2010 – Sep 2011

Trainee/PL/SQL Developer.

Developed ER diagrams and data flow diagrams based on the requirements.

Developed Model Diagrams and Input/Output structures for the data entry screens.

Developed control files and SQL*Loader scripts for migration and backup of Oracle data.

Wrote DDL & DML statements and designed SQL queries.

Extensively used PL/SQL to implement cursors, triggers, and packages, and to develop stored procedures and functions.

Implemented record inserts and record updates, and used INSERT ALL to load data into multiple tables (a sketch follows below).
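
A minimal INSERT ALL sketch of that multi-table load pattern; stg_orders, orders, and orders_audit are hypothetical tables:

    -- One pass over the staged rows populates both the base table
    -- and an audit table. All names are hypothetical placeholders.
    INSERT ALL
      INTO orders (order_id, customer_id, amount)
           VALUES (order_id, customer_id, amount)
      INTO orders_audit (order_id, loaded_on)
           VALUES (order_id, SYSDATE)
    SELECT order_id, customer_id, amount
      FROM stg_orders;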

Education:

Bachelor's in Computer Science & Engineering (2005 – 2009), Jawaharlal Nehru University – AP, India.


