ETL Informatica Developer email@example.com
Around 5 years of IT experience in the design, development, and implementation of data warehouses as an ETL Informatica Power Center developer and Teradata developer.
Informatica Power Center Certified Developer V9.5.
Experience in data warehousing applications; responsible for the extraction, cleansing, transformation, and loading of data from various sources into the data warehouse.
Planned and coordinated the analysis, design, and extraction of encounter data from multiple source systems into the data warehouse relational database (Teradata) while ensuring data integrity.
Designed and implemented data marts used to support multiple business functions.
Experience as an ETL developer using Informatica Master Data Management and Informatica Power Center (Source Analyzer, Warehouse Designer, Mapping Designer, Mapplet Designer, Transformation Developer), Repository Manager, Workflow Manager, and Workflow Monitor.
Imported the IDQ address-standardization mappings into Informatica Designer as mapplets.
Very strong knowledge of relational databases (RDBMS), data modeling, and building data warehouses and data marts using star and snowflake schemas; experienced in logical, physical, and star-schema data modeling with Erwin.
Worked on Informatica versions 7.x, 8.x, 9.5, 9.6, and 10.1.
Created detailed functional and technical design documents.
Integrated a wide variety of source file layouts into the data warehouse.
Experience in RDBMS such as ORACLE 12c/11g/10g/9i/8i, MS SQL Server 2008/2005/2000, Teradata 12.0, DB2, SQL, T-SQL and PL/SQL.
Good understanding and working experience in Logical and Physical data models that capture current/future state data elements and data flows using Erwin.
Extracted data from various sources such as Oracle, DB2, flat files, SQL Server, XML files, and Teradata, and loaded it into an Oracle database.
Worked on PL/SQL procedures and UNIX scripts to automate UNIX jobs and run files in batch mode.
In-depth understanding of star schema, snowflake schema, normalization (1NF, 2NF, 3NF), fact tables, and dimension tables.
Assisted in data analysis, star schema data modeling and design specific to data warehousing and business intelligence environment. Concurrently, responsible for the support and administration of large (2TB) enterprise level data warehouses.
Experience in creating Transformations and mappings using Informatica designer.
Experience in implementing complex mappings/mapplets, reusable transformations, and session partitioning; experience in writing daily batch jobs using UNIX shell scripts.
Exposure to UNIX; worked with Control-M and TWS to schedule jobs.
Implemented performance-tuning techniques and error handling at various levels, such as sources, targets, and mappings.
Implemented SCD Type 1 through Type 6, incremental load, and CDC logic according to the business requirements.
Extensive experience with Teradata load utilities such as FastLoad, TPump, MultiLoad, FastExport, and TPT (Teradata Parallel Transporter); used pushdown optimization (PDO) and BTEQ scripts.
Created UNIX shell scripts to execute SAS programs through Autosys scheduler.
Experience in Query Optimization and Performance Tuning of Stored Procedures, Functions.
Analyzed highly complex business requirements and designed and wrote technical specifications for ETL processes.
Performed Informatica object migration using Repository Manager, Informatica administration activities (using the Admin Console), and code review of mappings for various applications. Able to work effectively and efficiently both in a team and individually.
Experience in using the IDQ tool for profiling, applying rules, and developing mappings to move data from source to target systems; exposure to OLTP and OLAP transactions.
Installed Informatica Power Center and set up services and client tools.
Used TortoiseSVN for version control and JIRA for tracking the movement of code to upper environments such as Testing, UAT, Pre-production, and Production.
Strong experience in extraction, transformation, and loading (ETL) of data from various sources into data warehouses and data marts using Informatica Power Center (Repository Manager, Designer, Workflow Manager, Workflow Monitor, Metadata Manager), Power Exchange, and Power Connect as ETL tools on Oracle, DB2, and SQL Server databases.
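As a concrete illustration of the BTEQ scripting and flat-file loading mentioned above, a load script can be generated from a shell wrapper along these lines. This is a minimal sketch: the logon string, table, column, and file names are hypothetical placeholders, not values from any real environment.

```shell
#!/bin/sh
# Sketch only: generate a BTEQ script that loads a pipe-delimited flat file
# into a staging table. All names and paths below are hypothetical.
OUT=load_stg_customer.bteq

cat > "$OUT" <<'EOF'
.LOGON tddev/etl_user,etl_password;
.IMPORT VARTEXT '|' FILE = /data/in/customer.dat;
.REPEAT *
USING (cust_id VARCHAR(18), cust_name VARCHAR(60))
INSERT INTO stg.customer (cust_id, cust_name)
VALUES (:cust_id, :cust_name);
.LOGOFF;
.QUIT;
EOF

echo "wrote $OUT"    # the generated script would be run with: bteq < $OUT
```

Generating the script from a wrapper keeps logon details and file paths in one place, which is the usual pattern when the same load has to run against several environments.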
UNIX, Linux, Windows XP/2000/98/95/Win 7
C, C++, HTML, PL/SQL, ASP.NET, Java, UNIX Shell Scripting, VB
VMware vSphere, ESX/ESXi, vCenter Server.
Oracle 11g/10g/9i, Teradata V2R4, DB2, Netezza, TOAD, MS SQL Server 2008/2012, SAP BW, SAP Logon
Informatica Power Center 10.1.1/9.6.1/9.1/8.6, Informatica Data Quality (IDQ), Informatica BDM.
Oracle 10g/11g/12c RAC, DB2, MS Access, PL/SQL, MySQL, PostgreSQL
SAS, Cognos 8.0, Tableau
Ipswitch, WinSCP
Servlets, JDBC, JSP, HTML, CSS, XML.
Master of Science in Computer Science.
Morgan Stanley New York Nov ’18 to Present
Role: ETL Developer
Description: As an ETL developer, I have been part of the Field Strategic Services (FSS) project. This is a visualization dashboard project that displays information about field users (Financial Advisors) and their active clients across metrics such as Premium Cash Management (PCM), Morgan Stanley Online/Mobile (MSO), E-Delivery, E-Sign, E-Authorization (EAuth), and Asset Aggregation at different organizational levels, showcasing reports of clients' eligibility and enrollment information. These visualization reports are used by field managers and service professionals to identify their clients' status and encourage clients to become eligible and enrolled in the different metrics.
Participated in business meetings and requirement-gathering sessions with the BSA and PM, and translated the requirements into a technical document.
Involved in the full development lifecycle from requirements gathering through development and support using Teradata SQL Assistant, Informatica Power Center, Repository Manager, Designer, UNIX. Extensively worked with large Databases in Production environments.
Involved in Data analysis by writing complex SQL queries before starting with development to understand different legacy data systems.
Responsible for creating Teradata tables, views, cross-database views, and stored procedures.
Wrote very large stored procedures in Teradata SQL Assistant to perform the ETL process, with Teradata as both upstream and downstream.
Responsible for creating Teradata tables, identifying the required columns, primary index, secondary indexes, and join indexes as needed to improve dashboard performance.
Responsible for creating Informatica workflows that call the Teradata stored procedures.
Worked closely with downstream Tableau developers to implement business requirements after making the appropriate changes to Teradata tables, and identified the optimal way to represent data in Tableau by performing most calculations on the Teradata side, leaving fewer calculations for the Tableau server and improving Tableau performance.
Involved in extensive testing of 3D (the internal platform for all Tableau dashboards) dashboards at each level against the Teradata tables.
Involved in loading tables from flat files using the export and import features of SQL Assistant.
Involved in performing business unit testing of the stored procedures by migrating them to QA using Morgan Stanley's internal TDTS shell application.
Extensively worked in testing the business test cases after development.
Responsible for participating in dashboard performance-testing meetings to understand possible changes to a table's PI, SI, and JI to improve performance, working closely with the DBA team.
Responsible for bug fixes during UAT.
Involved in production support in case of batch job failures and supported ad-hoc requests from business users.
Implemented production releases of database objects, ETL changes, and Tableau migrations, working closely with the OPS team.
Took part in the production-release change-management process and in business checkouts of database objects and dashboard applications after production deployment.
Implemented history runs on the production tables after every new release or upstream data correction, using JCL scripts and working closely with the production support team.
Environment: Teradata SQL Assistant, Informatica Power Center 9.6, Oracle 11g, Tivoli, Tableau, UNIX, Flat Files, XML Files.
Discount Tire Scottsdale, AZ Mar’18 to Nov ‘18
Role: Informatica Developer
Description: As the sole developer on the Scheduling project at DTC during this period, I was involved in requirement gathering with the BSA and technical lead, designed the solution, developed the Informatica code, and unit tested it. As part of development I used different transformations in mappings and created reusable sessions and worklets. Per the requirement standards agreed with the vendor (Reflexis), I created flat files that are sent to Reflexis over SFTP.
Extracted data from flat files and an Oracle database, and applied business logic to load the data into the central Oracle database.
Developed mappings/Reusable Objects/Transformation/mapplets by using mapping designer, transformation developer and mapplet designer in Informatica Power Center 9.6.0/9.6.1.
Created reusable transformations and mapplets and used them in mappings.
Used Informatica Power Center for extraction, loading and transformation (ETL) of data in the data warehouse.
Developed strategies for incremental data extraction as well as data migration to load into the target.
Implemented slowly changing dimension loads to maintain both current and historical information in dimension tables.
Changed data types in Informatica using XML data types such as namespaces, binary types, logic types, XML types, date-time types, and number types.
Used Informatica Power Center Workflow manager to create sessions, batches to run with the logic embedded in the mappings.
Involved in creation of Folders, Users, Repositories, Deployment Group using Repository Manager.
Worked on different data sources such as SAP HANA, Oracle, SQL Server, Flat files etc.
Created complex mappings in Power Center Designer using Aggregator, Expression, Filter, Sequence Generator, Update Strategy, Union, Lookup, Joiner, XML Source Qualifier, and Stored Procedure transformations.
Scheduled and monitored workflows.
Unit tested the developed code.
Drove workflows based on real-time system events across multiple servers and applications using Skybot.
Used Skybot to automatically send job logs via email, save important files to the server, send output files as email attachments, and securely transfer files to remote systems.
Created e-mail notification tasks using post-session scripts.
Worked with the command-line program pmcmd to interact with the server: starting and stopping sessions and batches, stopping the Informatica server, and recovering sessions.
Expertise in writing and modifying UNIX shell scripts and Perl scripts; good knowledge of programming in C and Java. Created deployment groups and migrated code into different environments. Reviewed project requirements and designs for performance risks.
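The pmcmd usage described above can be sketched as a small shell wrapper. The domain, service, folder, workflow, and credential-variable names here are hypothetical, and the DRY_RUN guard keeps the sketch runnable on a host without a PowerCenter installation.

```shell
#!/bin/sh
# Sketch only: start an Informatica workflow via pmcmd and report failure.
# All names below are hypothetical placeholders; PMUSER/PMPASS are the names
# of environment variables that would hold the repository credentials.
DOMAIN=Domain_Dev
SERVICE=IS_Dev
FOLDER=FSS_ETL
WORKFLOW=wf_load_customer
DRY_RUN=${DRY_RUN:-1}    # set to 0 on a host where pmcmd is installed

CMD="pmcmd startworkflow -sv $SERVICE -d $DOMAIN -uv PMUSER -pv PMPASS -f $FOLDER -wait $WORKFLOW"

if [ "$DRY_RUN" = "1" ]; then
    echo "would run: $CMD"
else
    $CMD
    RC=$?
    [ "$RC" -eq 0 ] || echo "workflow $WORKFLOW failed with rc=$RC" >&2
fi
```

The `-wait` flag makes pmcmd block until the workflow completes, so the wrapper's exit status can be checked by whatever scheduler invokes it.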
Environment: Informatica Power Center 9.6, Oracle 11g, SQL Server 2008/2005, Erwin, SAP HANA, Skybot, Data Quality V10.1, Tableau, UNIX, Flat Files, XML Files, PL/SQL.
Dollar General Nashville, TN June’17 to Feb ‘18
Role: Teradata Developer
Worked closely with Business Analyst and interacted with Report Users to understand and document requirements, translated them to Technical Specifications.
Involved in the full development lifecycle from requirements gathering through development and support using Informatica Power Center, Repository Manager, Designer, Server Manager, Workflow Manager, and Workflow Monitor. Extensively worked with large Databases in Production environments.
Involved in the design of the data warehouse using star-schema methodology, and converted data from various sources into Oracle tables.
Worked on Informatica Power Center tools- Designer, Repository Manager, Workflow Manager, and Workflow Monitor.
Worked on data migration of Informatica Mappings, Sessions, and Workflows to Testing, Pre-live and Production Environments.
Used Informatica power center ETL tool to extract the data from different source systems to target systems.
Worked on loading data from different sources, such as Oracle, DB2, EBCDIC files (created copybook layouts for the source files), and ASCII-delimited flat files, into Oracle targets and flat files.
Utilized Informatica Data Quality (IDQ v10.1) for data profiling, matching and removing duplicate data, fixing bad data, and fixing NULL values.
Expertise in exporting IDQ mappings into Informatica Designer, creating tasks in Workflow Manager, and scheduling these mappings to run with scheduling tools.
Designed & implemented various data marts with integration into an enterprise data warehouse from heterogeneous data sources.
Extracted flat-file source data using the Informatica Designer tool into a staging database, then implemented business logic and rules to load the data into target tables.
Worked on different tasks in workflows, such as Session, Event Raise, Event Wait, Decision, E-mail, Command, Worklet, Assignment, and Timer, as well as scheduling of the workflow.
Developed mappings using different transformations, such as Source Qualifier, Expression, Filter, Aggregator, Update Strategy, Router, Sequence Generator, and Joiner.
Created UNIX scripts to execute in Command tasks and configured the E-mail task to send error reports to users.
Involved in writing shell scripts to schedule and automate the ETL jobs.
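A minimal sketch of the kind of wrapper a scheduler would invoke around an ETL step. The job name, log location, and the placeholder `run_step` body are assumptions for illustration; in a real job the placeholder would be a pmcmd, BTEQ, or sqlplus call.

```shell
#!/bin/sh
# Sketch only: scheduler-friendly ETL job wrapper with logging and a clear
# status. Job name and log directory are hypothetical placeholders.
JOB=daily_sales_load
LOG="${TMPDIR:-/tmp}/${JOB}_$(date +%Y%m%d).log"

run_step() {
    # Placeholder for the real work (e.g. a pmcmd or bteq invocation).
    echo "running $JOB"
}

if run_step >> "$LOG" 2>&1; then
    STATUS=SUCCESS
else
    STATUS=FAILED
    # mailx -s "$JOB failed" ops@example.com < "$LOG"   # hypothetical alert
fi
echo "$JOB finished: $STATUS (log: $LOG)"
```

Capturing stdout and stderr in a dated log and reporting a single SUCCESS/FAILED status is what lets tools like Control-M or cron decide whether to page the on-call.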
Developed back end interfaces using PL/SQL Packages, Cursors, Stored Procedures, Functions, Triggers, Collections API's.
Involved in writing PL/SQL stored procedures, functions for extracting as well as writing data.
Developed mappings/Transformation/mapplets by using mapping designer, transformation developer and mapplet designer in Informatica Power Center and BDE.
Created action filters, parameters and calculated sets for preparing dashboards and worksheets in Tableau.
Worked extensively on developing and supporting data extraction, transformation, and loading using Informatica Power Center.
Involved in Dimensional modeling (Star Schema) of the Data warehouse and used Erwin to design the business process, dimensions and measured facts.
Involved in optimizing database performance by analyzing database objects, creating indexes, creating materialized views etc. Performed SQL tuning using Explain Plan, Tkprof, Hints and indexes.
Performed package, stored procedure, function, and trigger development in support of an enterprise-level, multi-user data aggregation, analysis, and reporting system.
Created Tasks, Workflows, Worklets and Sessions using Workflow Manager Tool and Monitored Workflows using Informatica Workflow Monitor.
Involved in error handling using session logs and workflow logs.
Used Unix Commands and shell scripts to schedule the jobs.
Used Type 1 and Type 2 SCD mappings to update slowly changing dimension tables.
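The Type 2 pattern referenced above (expire the current dimension row, then insert the new version) can be sketched as SQL emitted from a shell heredoc, as it would be for a BTEQ or sqlplus deployment. All table and column names are hypothetical, not from any real model.

```shell
#!/bin/sh
# Sketch only: write out the classic SCD Type 2 update/insert pair.
# dim_customer, stg_customer, and every column name are hypothetical.
OUT=scd2_customer.sql

cat > "$OUT" <<'EOF'
-- Step 1: expire the current row when a tracked attribute has changed
UPDATE dim_customer d
SET    eff_end_dt   = CURRENT_DATE,
       current_flag = 'N'
WHERE  d.current_flag = 'Y'
AND    EXISTS (SELECT 1
               FROM   stg_customer s
               WHERE  s.cust_id   = d.cust_id
               AND    s.cust_addr <> d.cust_addr);

-- Step 2: insert the changed rows as the new current version
INSERT INTO dim_customer
       (cust_id, cust_addr, eff_start_dt, eff_end_dt, current_flag)
SELECT s.cust_id, s.cust_addr, CURRENT_DATE, DATE '9999-12-31', 'Y'
FROM   stg_customer s
WHERE  EXISTS (SELECT 1
               FROM   dim_customer d
               WHERE  d.cust_id      = s.cust_id
               AND    d.eff_end_dt   = CURRENT_DATE
               AND    d.current_flag = 'N');
EOF

echo "wrote $OUT"
```

A Type 1 mapping would instead run only an in-place UPDATE, overwriting the attribute with no history row.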
Troubleshoot problems associated with the designed workflows and provide technical guidance on complex workflows.
Monitored production issues and inquiries and provided efficient resolutions and answers to these requests.
Performed incremental aggregation to load incremental data into Aggregate tables.
Preparation of test data for regression testing and validating the target data with the source data.
Involved in creating dashboards and reports in Tableau.
Developed Tableau data visualization using Scatter Plots, Geographic Map, Pie Charts and Bar Charts, Page Trails, and Density Chart.
Worked on unit testing and prepared unit test documents and uploaded in JIRA to review.
Environment: Informatica Power Center 10.1, Oracle 11g, SQL Server 2008/2005, Erwin, Data Quality V10.1, Tableau, Teradata, UNIX, Flat Files, XML Files, PL/SQL.
ICREA Solutions HYDERABAD, INDIA Aug ’14-Dec ‘15
Role: Informatica Developer/Oracle Developer
Worked with business analysts and stakeholders to understand the business requirements.
Designed blueprint and approach documents for the high-level development and design of the ETL applications.
Developed complex ETL mappings to load dimension and fact tables.
Worked with type 1 and type 2 dimensional mappings to load data from source to target.
Worked on agile methodology and JIRA.
Designed blueprint and approach documents for each story and obtained approval from the business analysts before designing the ETL mappings.
Developed mappings, sessions, and workflows, and ran the jobs.
Used transformations such as Source Qualifier, Normalizer, Expression, Filter, Router, Update Strategy, Sorter, Lookup, Aggregator, Joiner, and Sequence Generator to develop the mappings.
Used reusable transformation from shared folder to implement the same logic in different mappings.
Involved with testing team to resolve the issues while testing ETL applications.
Resolved testing issues and conducted meetings with the testing team to clarify the business requirements and how to test the applications from business and technical perspectives.
Involved in moving the ETL jobs from development to test environment.
Involved in extensive performance tuning, determining bottlenecks using the Debugger at various points such as the target, source, mapping, and session levels.
Participated in weekly status meetings, conducted internal and external reviews as well as formal walkthroughs among various teams, and documented the proceedings.
Experience in PL/SQL programming that included writing Stored Procedures, Packages, and Functions.
Wrote SQL, PL/SQL codes, stored procedures and packages.
Optimized SQL queries for better performance.
Developed packages, procedures, and functions to maintain the business logic using PL/SQL.
Created complex PL/SQL procedures.
Created database objects including tables, indexes, views, sequences, and synonyms, and granted roles and privileges to system users.
Complex SQL queries were used for data retrieval.
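As an illustration of the PL/SQL procedure work described above, a small upsert procedure of the kind deployed via a SQL*Plus script might look like this. The `orders` table, its columns, and the procedure name are hypothetical placeholders.

```shell
#!/bin/sh
# Sketch only: emit a small PL/SQL upsert procedure as a deployment script.
# Table, column, and procedure names are hypothetical.
OUT=upsert_order_status.sql

cat > "$OUT" <<'EOF'
CREATE OR REPLACE PROCEDURE upsert_order_status (
    p_order_id IN orders.order_id%TYPE,
    p_status   IN orders.status%TYPE
) AS
BEGIN
    -- Try the update first; insert only when no row matched.
    UPDATE orders SET status = p_status WHERE order_id = p_order_id;
    IF SQL%ROWCOUNT = 0 THEN
        INSERT INTO orders (order_id, status) VALUES (p_order_id, p_status);
    END IF;
END upsert_order_status;
/
EOF

echo "wrote $OUT"   # would be deployed with: sqlplus user/pass @$OUT
```

Anchoring the parameter types to the table columns with `%TYPE` keeps the procedure in sync when the table's column definitions change.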
Environment: Informatica Power Center 9.0/8.6, Windows NT, Oracle 9i/10g, DB2, Erwin 4.0, SQL, PL/SQL, T-SQL, TOAD, Business Objects, CARS, XML.