
Data Manager

Location:
Glendora, CA
Posted:
February 16, 2020

Neha Talwar

adbs70@r.postjobfree.com

415-***-****

Green Card Holder

ETL DEVELOPER

PROFESSIONAL SUMMARY

● 7+ years of IT experience in the analysis, design, development, and implementation of software applications in data warehousing.

● Deployed, managed, and maintained applications and systems on AWS.

● Experience in all the phases of Data warehouse life cycle involving Requirement Analysis, Design, Coding, Testing, and Deployment.

● Experience in Designing, developing data conversion/data integration modules to support large scale system migrations and implementations.

● Expertise in metadata management, importing metadata from different sources such as relational databases and XML sources into Framework Manager.

● Expertise in Data Warehouse development, working with Extraction/Transformation/Loading using Informatica PowerCenter with Oracle, SQL Server, and heterogeneous sources.

● Developed Data Warehouse architecture, ETL framework and BI Integration using Pentaho Reports and Pentaho Dashboards.

● Experience in building the Data warehouse using the Ralph Kimball methodology.

● Extensive experience in developing mappings for Extraction, Transformation, Loading (ETL) data from various sources into Data Warehouse/Data Marts.

● Experience in creating Reusable Transformations (Joiner, Lookup, Sorter, Aggregator, Expression, Update Strategy, Router, Filter, Sequence Generator, Normalizer and Rank) and Mappings using Informatica Designer and processing tasks using Workflow Manager to move data from multiple sources into targets.

● Extensively worked on developing and debugging Informatica mappings, mapplets, sessions and workflows.

● Worked on Performance Tuning, identifying and resolving performance bottlenecks in various levels like sources, targets, mappings and sessions.

● Experience in performance tuning of SQL queries and ETL components.

● Extensively used SQL and PL/SQL in creation of Triggers, Functions, Indexes, Views, Cursors and Stored Procedures.
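
For illustration, a minimal sketch of one such PL/SQL object (an audit trigger), created from Python with cx_Oracle. The credentials, table, and column names here are hypothetical, not taken from any actual project:

    import cx_Oracle

    # Placeholder connection details, not a real environment
    conn = cx_Oracle.connect("etl_user", "password", "dbhost/ORCLPDB1")
    cur = conn.cursor()

    # Audit trigger: stamps who changed a row and when (hypothetical table)
    cur.execute("""
        CREATE OR REPLACE TRIGGER trg_orders_audit
        BEFORE INSERT OR UPDATE ON orders
        FOR EACH ROW
        BEGIN
            :NEW.updated_by := USER;
            :NEW.updated_at := SYSTIMESTAMP;
        END;
    """)
    cur.close()
    conn.close()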

● Proficient in writing documents, preparing presentations and presenting them.

● Experienced in UNIX shell scripting for file manipulation, scheduling, and text processing.

● Well organized and goal oriented, with excellent troubleshooting and problem-solving skills.

● Strong ability to quickly adapt to new applications, platforms and languages.

PROFESSIONAL EXPERIENCE:

Role: ETL Developer/Designer Duration: Sept 2017 – Present
T-Mobile – Bay Area

Responsibilities:

● Designed and coded application components in an Agile environment utilizing a test driven development approach and extensively worked on understanding the business requirements, Data modeling and ETL structure.

● Designed ETL loading process and data flow.

● Involved in the metadata management of the schema designs.

● Developed ETL scripts to extract and cleanse data from SQL databases.

● Generated the required metadata at the source side in the Informatica mappings.

● Trained technical and non-technical users on the use of Informatica.

● Prepared the impact analysis document and high-level design for the requirements.

● Involved in performance tuning by collecting statistics and reviewing explain plans.

● Carried out performance tuning on both the Informatica side and the database side.

● Worked on Slowly Changing Dimensions and operational schemas.

● Well versed in creating ETL transformations and jobs using the Pentaho Kettle Spoon designer and Pentaho Data Integration designer, and scheduling them on the Pentaho BI Server.

● Extensive experience in installing and configuring Pentaho BI Server 5.0 for ETL and reporting purposes.

● Experience in different domains, including Financial Services, Media, and Investment Banking.

● Used Pentaho Report Designer to create various reports with drill-down functionality by creating Groups in the reports, and drill-through functionality by creating sub-reports within the main reports.

● Strong knowledge of Entity-Relationship modeling, Fact and Dimension tables, Slowly Changing Dimensions, and Dimensional Modeling (Star Schema and Snowflake Schema).

● Configured AWS S3 buckets to host static web content.

● Experienced in S3 versioning and lifecycle policies and in creating functions in AWS Lambda with Python.
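
As a concrete illustration, a minimal Python/boto3 sketch of the S3 versioning and lifecycle setup described above, plus a bare-bones Lambda handler. The bucket name and rule values are hypothetical:

    import boto3

    s3 = boto3.client("s3")
    BUCKET = "example-static-site"  # hypothetical bucket name

    # Enable versioning so overwritten or deleted objects stay recoverable
    s3.put_bucket_versioning(
        Bucket=BUCKET,
        VersioningConfiguration={"Status": "Enabled"},
    )

    # Lifecycle rule: archive old object versions to Glacier, then expire them
    s3.put_bucket_lifecycle_configuration(
        Bucket=BUCKET,
        LifecycleConfiguration={
            "Rules": [{
                "ID": "archive-old-versions",
                "Status": "Enabled",
                "Filter": {"Prefix": ""},
                "NoncurrentVersionTransitions": [
                    {"NoncurrentDays": 30, "StorageClass": "GLACIER"}
                ],
                "NoncurrentVersionExpiration": {"NoncurrentDays": 365},
            }]
        },
    )

    # Minimal Lambda handler (deployed separately), e.g. reacting to S3 events
    def lambda_handler(event, context):
        for record in event.get("Records", []):
            print(record["s3"]["bucket"]["name"], record["s3"]["object"]["key"])
        return {"statusCode": 200}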

● Used AWS Redshift for data warehousing.

● Experienced in Performance Tuning and Query Optimization in AWS Redshift.
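
A common first step in Redshift query tuning is inspecting the plan with EXPLAIN. Below is a minimal sketch using psycopg2 (Redshift speaks the PostgreSQL wire protocol); the cluster endpoint, credentials, and table names are hypothetical:

    import psycopg2

    # Placeholder connection details
    conn = psycopg2.connect(
        host="example-cluster.abc123.us-west-2.redshift.amazonaws.com",
        port=5439, dbname="dw", user="etl_user", password="...",
    )

    query = """
        SELECT d.region, SUM(f.amount)
        FROM fact_sales f
        JOIN dim_store d ON f.store_id = d.store_id
        GROUP BY d.region
    """

    with conn.cursor() as cur:
        # DS_BCAST/DS_DIST steps in the plan signal data redistribution,
        # often addressed by better DISTKEY/SORTKEY choices on the tables
        cur.execute("EXPLAIN " + query)
        for (line,) in cur.fetchall():
            print(line)
    conn.close()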

● Used the Pentaho import/export utility to migrate Pentaho transformations and jobs from one environment to another.

● Used various Pentaho components such as Database Lookup & Join, Generate Rows, Calculator, Row Normalizer & De-normalizer, Add Constant, and Add Sequence. Used Pentaho Enterprise Console (PEC) to monitor ETL jobs/transformations on the production database.

● Developed Data Warehouse architecture, ETL framework and BI Integration using Pentaho Reports and Pentaho Dashboards.

● Involved in analysis, requirements gathering, and documentation of functional & technical specifications.

● Worked on data analysis to find data duplication and existing data patterns.

● Developed high-level technical design specifications and low-level specifications based on business requirements.

● Worked on preparing unit test plans and functional validation.

● Designed, developed and tested data conversion/data integration modules to support large scale system migrations and implementations.

● Worked closely with ETL analysts and project subject matter experts to create technical designs; developed ETL modules using ETL software (Informatica) and stored procedures; thoroughly tested the software; integrated the code into workflows or scheduling solutions; and executed the code modules as necessary for final conversion.

● Analyzed the source data coming from different sources (Oracle, XML, Flat files, Excels) and worked on developing ETL mappings.

● Developed mappings to load data from multiple data sources to targets.

● Developed complex mappings using Lookups (both connected and unconnected), Rank, Sorter, Joiner, Aggregator, Filter, Router, and SQL transformations to transform the data as per the requirements.

● Extensively used SQL overrides and filter conditions in the Source Qualifier, thereby improving mapping performance; involved in performance tuning on Informatica at all levels.

● Configured ODBC connectivity to various source/target databases.

● Defined production release requirements and sustainment architecture.

● Created Workflows and used various tasks like Email, Event-wait and Event-raise, Timer, Scheduler, Control, Decision, Session in the Workflow Manager.

● Used Workflow Monitor to monitor the jobs, review error logs that were generated for each session, and resolved them.

● Involved in administration tasks, including upgrading Informatica and importing/exporting mappings.

● Optimized the performance of the mappings by various tests on sources, targets and transformations.

● Identified bottlenecks, removed them, and implemented performance-tuning logic on targets, sources, mappings, and sessions to provide maximum efficiency and performance.

● Used Pushdown Optimization extensively to improve the performance of the ETL processes.

● Tuned the performance of Informatica sessions for large data files by using partitioning and by increasing block size, data cache size, sequence buffer length, and the target-based commit interval.

● Worked closely with the Managed Services team to provide high-level design documents, monitor progress, provide guidance, and review and sign off on all documentation and testing.

Environment: Pentaho 7.6/8.1, Informatica PowerCenter 9.6, AWS, Oracle 10g/11g, MySQL, UNIX.

Role: Informatica ETL Developer Duration: Oct 2014 – Aug 2017
BlazeClan Technologies – Pune, India

Responsibilities:

● Worked with Filter transformations and flat files to identify source and target bottlenecks.

● Worked with various transformations including router transformation, update strategy, expression transformation, lookup transformation, sequence generator, aggregator transformation and sorter transformation.

● Used Oracle to write SQL queries that create, alter, and drop tables, and to extract the necessary data.

● Used UNIX to navigate the system, check for specific files and their contents, change permissions, and see who the current users are.

● Generated the required metadata at the source side in the Informatica mappings.

● Trained technical and non-technical users on the use of Informatica.

● Prepared the impact analysis document and high-level design for the requirements.

● Involved in performance tuning by collecting statistics and reviewing explain plans.

● Carried out performance tuning on both the Informatica side and the database side.

● Worked on Slowly Changing Dimensions and operational schemas.

● Created Transformations and Mappings using the Informatica PowerCenter Designer.

● Processed tasks using Workflow Manager to move data between multiple sources and targets.

● Created and managed schema objects such as Tables, Views, Indexes and referential integrity depending on user requirements.

● Created Data Maps / Extraction groups in Power Exchange Navigator for legacy IMS Parent sources.

● Staged data from the legacy Proclaim IMS system into Oracle 11g master tables.

● Performed CDC capture registrations.

● Assisted in building the ETL source-to-target specification documents by understanding the business requirements.

● Developed mappings that perform extraction, transformation, and loading of source data into the Derived Masters schema, using various PowerCenter transformations such as Source Qualifier, Aggregator, Filter, Router, Sequence Generator, Lookup, Rank, Joiner, Expression, Stored Procedure, SQL, Normalizer, and Update Strategy to meet the business logic in the mappings.

● Built reusable transformations and Mapplets wherever redundancy was needed.

● Performed performance tuning at the mapping level as well as the database level to increase data throughput.

● Designed the Process Control Table that would maintain the status of all the CDC jobs and thereby drive the load of Derived Master Tables.
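
A hypothetical sketch of what such a process-control table and its status updates could look like, with Oracle DDL issued from Python via cx_Oracle; every name and column here is an assumption, not the project's actual schema:

    import cx_Oracle

    conn = cx_Oracle.connect("etl_user", "password", "dbhost/ORCLPDB1")  # placeholders
    cur = conn.cursor()

    # One row per CDC job; last_extract_ts is the high-water mark that
    # drives the incremental load of the Derived Master tables
    cur.execute("""
        CREATE TABLE etl_process_control (
            job_name        VARCHAR2(100) PRIMARY KEY,
            status          VARCHAR2(20),
            last_extract_ts TIMESTAMP,
            updated_at      TIMESTAMP DEFAULT SYSTIMESTAMP
        )
    """)

    def mark_success(job_name, new_high_water_mark):
        # Called after a CDC job completes successfully
        cur.execute(
            """UPDATE etl_process_control
               SET status = 'SUCCESS', last_extract_ts = :1,
                   updated_at = SYSTIMESTAMP
               WHERE job_name = :2""",
            [new_high_water_mark, job_name],
        )
        conn.commit()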

● Investigated and fixed bugs that occurred in the production environment and provided on-call support.

● Performed Unit testing and maintained test logs and test cases for all the mappings.

● Maintained warehouse metadata, naming standards and warehouse standards for future application development.

● Parsed high-level design specifications into simple ETL coding, along with mapping standards.

Environment: Informatica PowerCenter 9.6.x, Oracle 11g, MySQL, WinSCP, PuTTY, Toad.

Role: Data Analyst Duration: June 2012 – Sept 2014
Quantiphi Analytics – Mumbai, India

Responsibilities:

● Executed data integration using ETL processes of the source systems.

● Analyzed data using data visualization tools and reported key features using statistical tools and supervised machine learning techniques to achieve project objectives. Utilized Python/pandas to gather, compile, and analyze data from flat files and created graphs/charts with matplotlib. Provided analysis on claims data discrepancies in reports or dashboards.
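
A minimal pandas/matplotlib sketch of the kind of flat-file claims analysis described above; the file name and column names are hypothetical:

    import pandas as pd
    import matplotlib.pyplot as plt

    # Hypothetical flat-file extract with hypothetical columns
    claims = pd.read_csv("claims_extract.csv", parse_dates=["service_date"])

    # Basic cleaning: drop exact duplicates, treat missing payments as zero
    claims = claims.drop_duplicates()
    claims["paid_amount"] = claims["paid_amount"].fillna(0)

    # Flag discrepancies where the paid amount exceeds the billed amount
    discrepancies = claims[claims["paid_amount"] > claims["billed_amount"]]
    print(f"{len(discrepancies)} discrepant claims found")

    # Monthly paid totals, charted with matplotlib
    monthly = claims.set_index("service_date")["paid_amount"].resample("M").sum()
    monthly.plot(kind="bar", title="Paid amount by month")
    plt.tight_layout()
    plt.savefig("paid_by_month.png")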

● Integral part of the team throughout data gathering and the data analysis framework, involving data cleaning, processing, analysis, and visualization with Python/pandas/NumPy.

● Defined configuration specifications and business analysis requirements.

● Performed quality assurance and defined reporting and alerting requirements.

● Assisted in designing, documenting and maintaining system processes.

● Reported on common sources of technical issues or questions and made recommendations to the product team.

● Communicated key insights and findings to the product team.

● Performed data mining tasks related to system break/fix issues and provided workarounds and problem solving. Worked independently and collaboratively throughout the complete analytics project lifecycle, including data extraction/preparation, design and implementation of scalable machine learning analyses and solutions, and documentation of results.

● Developed and deployed machine learning models.

● Partnered with the Sales and Marketing teams and collaborated with a cross-functional team to frame and answer important data questions.

● Tracked key project milestones and adjusted project plans and resources to meet the needs of customers.

● Provided support for projects in project planning, quality plan, risk management, requirements management, change management, defect management and release management.

● Built and maintained queries for analysis/extraction for different databases.

● Developed Excel Services Reports for the Network Team.

● Created technical documentation for each mapping for future development.

● Maintained data warehouse tables through the loading of data and monitored system configurations to ensure data integrity.

● Applied supervised and unsupervised machine learning algorithms such as Regression, Decision Trees, and Random Forests.
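
For illustration, a minimal scikit-learn sketch covering the algorithms named above, run on synthetic stand-in data (the real features and labels came from project data):

    import numpy as np
    from sklearn.model_selection import train_test_split
    from sklearn.linear_model import LogisticRegression
    from sklearn.tree import DecisionTreeClassifier
    from sklearn.ensemble import RandomForestClassifier

    # Synthetic stand-in data, not project data
    rng = np.random.default_rng(0)
    X = rng.normal(size=(500, 4))
    y = (X[:, 0] + X[:, 1] > 0).astype(int)

    X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

    # Fit each classifier and report hold-out accuracy
    for model in (LogisticRegression(),
                  DecisionTreeClassifier(max_depth=5),
                  RandomForestClassifier(n_estimators=100)):
        model.fit(X_train, y_train)
        print(type(model).__name__, model.score(X_test, y_test))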

EDUCATIONAL DETAILS:

● B.Sc. (I.T.) from I.P. University, Delhi.

● Certificate in Data Analytics from UC Berkeley Extension, California.


