Santhosh
ETL Developer
Mobile: 614-***-****
Email: *********@*****.***
Professional Summary
* ***** ** ********** ** the field of information technology and data warehousing, mainly emphasizing extraction, transformation, and loading (ETL) using Informatica across maintenance and development projects, along with AWS services.
Strong ETL experience using Informatica Power Center (Repository Manager, Designer, Workflow Manager, and Workflow Monitor), Mappings, Mapplets, Transformations, and Worklets.
Good knowledge of Normalization and Data Modelling.
Strong working knowledge of AWS services such as Amazon S3, Lambda, and DynamoDB, with Python scripting.
Extensive experience in Extraction, Transformation and Loading (ETL) of data from multiple sources into Data Warehouse and Data Mart.
Designed and developed mappings using various transformations such as Unconnected/Connected Lookup, Router, Filter, Expression, Aggregator, Joiner, and Update Strategy.
Well experienced in performance tuning of Informatica mappings and databases.
Good knowledge of session partition types such as Pass-Through, Round-Robin, and Key Range.
Excellent knowledge and experience in creating source-to-target mappings, edit rules and validations, transformations, and business rules.
Extensively worked on Informatica Designer Components - Source Analyzer, Target Designer, Transformation Developer, Mapplet Designer, and Mapping Designer.
Strong experience on Workflow Manager Tools - Task Developer, Workflow Designer & Worklet Designer.
Implemented SCD Type 1 and Type 2 mappings.
Developed several reusable transformations.
Worked on building mappings for dimension and fact tables.
Very strong experience in writing and tuning SQL queries, and in designing and writing on-demand UNIX shell scripts.
Familiar with bug reporting and tracking using bug tracking tools like Mantis, JIRA, and Remedy.
Worked extensively with Dimensional modeling, Data migration, Data cleansing, Data profiling, and ETL Processes features for data warehouses.
Extensive experience with SQL, PL/SQL and database concepts.
Experience in all stages of SDLC (Agile, Waterfall), writing Technical Design document, Development, Testing and Implementation of Enterprise level Data mart and Data warehouses.
Strong in data analysis & understanding of dimensional modeling, star and snowflake schemas.
Good interpersonal skills and ability to work as part of a team. Exceptional ability to learn and master new technologies and to deliver outputs on short deadlines.
Delivery Assurance - Quality Focused & Process Oriented:
Ability to work in high-pressure environments delivering to and managing stakeholder expectations.
Application of structured methods to project scoping and planning, and to managing risks, issues, schedules, and deliverables.
Strong analytical and Problem-solving skills.
Employment History:
Organization | Designation | Duration
Rythmos India Private Limited, Hyderabad | Informatica Developer | June 2013 – June 2017
CGI Group Inc | Senior Software Engineer | July 2017 – May 2018
Eastern Illinois University (Independent Study) | AWS Data Engineer | January 2018 – Present
Technical Skills:
ETL Tools: Informatica 9.X, 10.X
Operating Systems: Windows and UNIX
DBMS/Databases: Oracle 11g & 12c, PL/SQL, DynamoDB
Programming Languages: Core Java
Methodologies: Agile, Waterfall
Modular Tools: Siebel
Environment: AWS, S3, Lambda, Python
Educational Qualification:
Master’s in Technology at Eastern Illinois University, Illinois, USA; expected to be completed by May 5th.
WORK EXPERIENCE:
Client: Eastern Illinois University (Independent Study Project) Start Date: Jan 19
Role: AWS Data Engineer End Date: N/A
Responsibilities:
•Performed extraction, transformation, and loading (ETL) of data using AWS tools.
•Loaded CSV files from an FTP server to Amazon S3 using Python scripts.
•Loaded data into DynamoDB tables from files in Amazon S3 by writing AWS Lambda triggers.
•Performed data analysis, cleansing, and profiling on data from different sources and loaded the results into the database.
Environment: AWS, S3, DynamoDB, Lambda, Python, EC2.
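The S3-to-DynamoDB flow above can be sketched as a Python Lambda handler; the table name and the `csv_to_items` helper are hypothetical, a minimal sketch of the approach rather than the project's actual code:

```python
import csv
import io

def csv_to_items(csv_text):
    """Parse CSV text into a list of dicts, one per row, ready for put_item."""
    return list(csv.DictReader(io.StringIO(csv_text)))

def handler(event, context):
    """S3 PUT trigger: read the uploaded CSV object and batch-write its rows."""
    import boto3  # AWS SDK; imported lazily so the parsing helper is testable offline
    s3 = boto3.client("s3")
    table = boto3.resource("dynamodb").Table("target_table")  # hypothetical table name
    for record in event["Records"]:
        bucket = record["s3"]["bucket"]["name"]
        key = record["s3"]["object"]["key"]
        body = s3.get_object(Bucket=bucket, Key=key)["Body"].read().decode("utf-8")
        with table.batch_writer() as batch:  # batches writes to reduce request count
            for item in csv_to_items(body):
                batch.put_item(Item=item)
```

The handler is wired to the bucket through an S3 event notification, so each uploaded file is loaded without polling.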
Client: Cigna Health Insurance Start Date: July 17
Role: ETL Developer End Date: May 18
Responsibilities:
•Performed extraction, transformation, and loading (ETL) of data using Informatica PowerCenter.
•Worked with Informatica Designer components – Source Analyzer, Transformation Developer, and Mapping Designer.
•Developed Informatica mappings and tuned them when necessary.
•Developed Informatica mappings and mapplets using transformations such as Lookup, Joiner, Expression, Aggregator, and Update Strategy.
•Worked with mapping parameters and session parameters to pass values dynamically.
•Used Informatica extensively to load data into the data warehouse.
•Implemented Type 1 Slowly Changing Dimensions.
•Created Pass-Through partitions to improve data-load performance.
•Extracted data from heterogeneous sources such as Oracle and fixed-width and delimited flat files, transformed it according to business requirements, and loaded it into multiple databases.
•Monitored ETL jobs in CA Workstation and developed tasks in the ESP job scheduler.
•Mapped requirements to the BI solution using the Informatica ETL tool.
•Tuned ETL mappings to reduce loading times.
•Supported day-to-day technical issues of business users.
•Involved in code migration from Dev to Test and from Test to Production.
Environment: Informatica, SQL, UNIX, Remedy, ESP job scheduler, CA Workstation, Rally scrum tool.
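The Type 1 SCD logic above (overwrite on change, keep no history) lived in an Informatica mapping (Lookup plus Update Strategy); a minimal standalone sketch of the same upsert behavior, with a hypothetical customer dimension, is:

```python
import sqlite3

# In-memory stand-in for the dimension table; names are illustrative only.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE dim_customer (cust_id INTEGER PRIMARY KEY, city TEXT)")
conn.execute("INSERT INTO dim_customer VALUES (1, 'Austin')")

def scd_type1_upsert(cust_id, city):
    """Type 1 SCD: overwrite the existing row for the key, keeping no history."""
    conn.execute(
        "INSERT INTO dim_customer (cust_id, city) VALUES (?, ?) "
        "ON CONFLICT(cust_id) DO UPDATE SET city = excluded.city",
        (cust_id, city),
    )

scd_type1_upsert(1, "Dallas")   # existing key: row is overwritten
scd_type1_upsert(2, "Chicago")  # new key: row is inserted
```

The trade-off Type 1 makes is simplicity over auditability: after the overwrite, the prior city value is unrecoverable.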
Client: Starbucks Start Date: June 15
Role: ETL Developer End Date: June 17
Responsibilities:
•Performed extraction, transformation, and loading (ETL) of data using Informatica PowerCenter.
•Worked with Informatica Designer components – Source Analyzer, Transformation Developer, and Mapping Designer.
•Developed Informatica mappings and tuned them when necessary.
•Developed Informatica mappings and mapplets using transformations such as Lookup, Joiner, Expression, Aggregator, and Update Strategy.
•Worked with mapping parameters and session parameters to pass values dynamically.
•Used Informatica extensively to load data into Siebel database tables.
•Implemented Type 1 Slowly Changing Dimensions.
•Extracted data from heterogeneous sources such as Oracle and fixed-width and delimited flat files, transformed it according to business requirements, and loaded it into multiple databases.
•Monitored ETL jobs in DAC and developed DAC tasks.
•Mapped requirements to the BI solution using the Informatica ETL tool.
•Tuned ETL mappings to reduce loading times.
•Supported day-to-day technical issues of business users.
•Involved in code migration from Dev to Test and from Test to Production.
Environment: Informatica, SQL, Unix, Remedy, Jira, DAC.
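Mapping and session parameters like those used above are typically supplied through an Informatica parameter file at run time; the sketch below uses hypothetical folder, workflow, session, and parameter names ($$ prefixes a mapping parameter, $ a session parameter):

```
[Global]
$$ENV=TEST

[SBUX_EDW.WF:wf_sales_load.ST:s_m_load_sales]
$$LOAD_DATE=2016-06-30
$InputFile_sales=/data/incoming/sales_20160630.csv
```

The session references this file through its parameter filename property, so values can change per run without editing the mapping.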
Client: South West Airlines Start Date: June 13
Role: ETL Developer End Date: May 15
Responsibilities:
•Performed extraction, transformation, and loading (ETL) of data using Informatica PowerCenter.
•Worked with Informatica Designer components – Source Analyzer, Transformation Developer, and Mapping Designer.
•Developed Informatica mappings and mapplets using transformations such as Lookup, Joiner, Expression, Aggregator, and Update Strategy.
•Worked with mapping parameters and session parameters to pass values dynamically.
•Used Informatica extensively to load data into Siebel database tables.
•Implemented Type 1 and Type 2 Slowly Changing Dimensions.
•Extracted data from heterogeneous sources such as Oracle and fixed-width and delimited flat files, transformed it according to business requirements, and loaded it into multiple databases.
•Wrote test cases, prepared test data, and executed unit testing.
Environment: Informatica, SQL, Unix, Remedy, Jira, DAC.
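The Type 2 SCD logic above (expire the current row, insert a new current row so history is preserved) was implemented as an Informatica mapping; a minimal standalone sketch with hypothetical table and column names is:

```python
import sqlite3

# Effective-dated dimension; in the project this was a mapping with a
# Lookup and an Update Strategy flagging rows for update vs. insert.
conn = sqlite3.connect(":memory:")
conn.execute("""CREATE TABLE dim_customer (
    cust_id INTEGER, city TEXT,
    eff_date TEXT, end_date TEXT, is_current INTEGER)""")
conn.execute("INSERT INTO dim_customer VALUES (1, 'Austin', '2013-06-01', '9999-12-31', 1)")

def scd_type2_change(cust_id, city, change_date):
    """Type 2 SCD: close out the current row, then add a new current row."""
    conn.execute(
        "UPDATE dim_customer SET end_date = ?, is_current = 0 "
        "WHERE cust_id = ? AND is_current = 1",
        (change_date, cust_id),
    )
    conn.execute(
        "INSERT INTO dim_customer VALUES (?, ?, ?, '9999-12-31', 1)",
        (cust_id, city, change_date),
    )

scd_type2_change(1, "Dallas", "2014-03-15")  # the Austin row is kept as history
```

Unlike Type 1, every change adds a row, so fact rows can join to the dimension version that was current on their transaction date.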