Vanaja
Senior ETL Informatica Developer
Email: *******************@*****.***
Ph: +1-617-***-****
Professional Summary:
• ETL developer with 10+ years of strong experience in data warehousing applications, extracting, transforming, and loading (ETL) data from various sources into data warehouses and data marts using Informatica PowerCenter 10.x/9.x/8.x/7.x/6.x (Workflow Manager, Workflow Monitor, Warehouse Designer, Source Analyzer, Transformation Developer, Mapplet Designer, Mapping Designer, Repository Manager, and Informatica Server).
• Strong experience in writing UNIX shell scripts for data file validation and working with Informatica components in Unix environment.
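For illustration, a minimal sketch of the kind of data-file validation script described above. The feed layout (an `H|date` header and a `T|count` trailer) and the file path are hypothetical examples, not any specific client's format:

```shell
#!/bin/sh
# Illustrative data-file validation: check the header record and verify
# that the trailer count matches the number of detail rows.
# The layout (H|date header, T|count trailer) is a hypothetical example.

FILE=/tmp/sample_feed.dat

# Build a small sample feed: header, two detail rows, trailer
printf 'H|20240101\nACCT1|100.00\nACCT2|250.50\nT|2\n' > "$FILE"

status=OK
head -n 1 "$FILE" | grep -q '^H|' || status=FAIL          # header check
expected=$(tail -n 1 "$FILE" | cut -d'|' -f2)             # trailer count
actual=$(($(wc -l < "$FILE") - 2))                        # detail rows only
[ "$expected" -eq "$actual" ] || status=FAIL

echo "validation: $status ($actual detail records)"
rm -f "$FILE"
```

In practice such a script would take the file name as an argument and return a non-zero exit status on failure so the calling Informatica command task can halt the workflow.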
• Extensive experience with Informatica 10.2/9.6.1/9.1.1/8.x, Informatica PowerExchange 9.x, and IICS.
• Strong experience with various AWS services, including S3 and RDS.
• Strong experience in writing SQL using Oracle, SQL Server, and DB2.
• Experience in PL/SQL development and SQL Query Tuning by analyzing Explain Plan for better performance.
• Experience in the Implementation of Informatica Data Quality (IDQ) profiling and scorecards.
• Familiarity with Informatica administration, including user management and folder/project management.
• Good working knowledge of data modeling: dimensional data modeling, star and snowflake schemas, fact and dimension tables, and physical and logical data modeling.
• Designed and developed ETL processes using Talend Studio to extract data from various sources, transform it according to business rules, and load it into target databases.
• Extensive experience in creating various transformations (Joiner, Sorter, Aggregator, Expression, Lookup, Router, Filter, Transaction Control, Update Strategy, Sequence Generator, Normalizer, and Rank), mappings, mapplets, sessions, and tasks (Command, Email, Decision, Event Wait, Event Raise, Timer, Assignment).
• Created customized BDM mappings/workflows for incremental loads using Informatica Developer and deployed them as part of an application on Data Integration Service.
• Strong experience with database objects; writing stored procedures, functions, triggers, and DDL/DML; and SQL query performance tuning using Oracle 11g/10g/9i/8i, PL/SQL, SQL*Plus, and SQL*Loader.
• Experience in tuning Informatica Mappings and Sessions for optimum performance.
• Good experience working with Agile process.
• Experience in debugging mappings. Identified bugs in existing mappings by analyzing the data flow and evaluating transformations.
• Designing, developing, testing, and deploying ETL processes using Oracle Data Integrator (ODI) to move data between systems and databases.
• Expert in implementing business rules by creating Re-Usable Transformations like Mapplets and Mappings
• Proficient in using Informatica Repository Manager to create & schedule sessions and batches.
• Good experience scheduling ETL jobs with tools such as AutoSys and crontab.
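As an illustration, crontab entries for such ETL scheduling typically look like the following (script paths, log paths, and run times are hypothetical examples):

```
# min hour day month weekday  command
0 2 * * *   /app/etl/scripts/run_daily_load.sh  >> /app/etl/logs/daily.log 2>&1
0 4 * * 0   /app/etl/scripts/run_weekly_agg.sh  >> /app/etl/logs/weekly.log 2>&1
```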
• Provided production support, handled new enhancements, and delivered customer support.
• Exceptionally well organized, with a strong work ethic and willingness to work hard to achieve employer objectives.
• Experience with defect-tracking tools such as ServiceNow and JIRA.
Technical Skills:
Package & ETL Tools
Informatica PowerCenter 10.x/9.x/8.x/7.x, IDQ, TOAD, OLAP, OLTP, JMeter, Selenium, CI/CD pipelines
Databases
SQL Server Management Studio (2008), Oracle SQL Developer (3.0), Toad 11.6 (Oracle), Teradata, AQT v10 (Advanced Query Tool) (Oracle/Netezza), DB Artisan 9.0 (Sybase), Talend, SQL Browser (Oracle/Sybase), Visio, ERWIN, relational databases, DB objects, data integrity, delta tables, data initiatives
Languages
SQL, PL/SQL, T-SQL, Teradata SQL, MySQL, UNIX Shell Scripting, Python, Jenkins
Data Modeling
Star-Schema Modeling, Snowflakes Modeling
Methodologies
Agile, Waterfall, Change Management, Release Management
Operating Systems
Unix, Windows NT, MS-DOS, Solaris
Client: Fannie Mae, Washington, DC Jan 2023 – Present
Role: ETL Developer
Responsibilities:
• Interacted with Business partners for requirement analysis and implemented the same into a functional ETL design.
• Creation of ETL Mapping Document.
• Involved in Project analysis Phase and Business discussion and coordination with different teams.
• Analyzed business requirements and designed conceptual and logical data models.
• Provided the Technical implementation estimates based on the functional requirements.
• Designed data flows and process flows.
• Loaded data to and from flat files and databases such as DB2 and SQL Server.
• Developed IICS Informatica Cloud mappings to extract the data from SFDC.
• Created IICS mapping tasks, data replication tasks, and data synchronization tasks.
• Worked with Connected, Unconnected, Static, Dynamic and Persistent lookups.
• Designed the ETL processes using Informatica to load data from Oracle, SQL Server, Flat Files (Fixed Width), XML, and Excel files to the staging database and from staging to the target Oracle database.
• Developed ETL processes to handle complex business logic for high-volume data loads using various transformations.
• Developed the ETL processes to handle History and Incremental logic.
• Identifying performance bottlenecks in Informatica mappings, sessions, and systems and eliminating them for better performance.
• Extensively used transformations such as Router, Aggregator, Normalizer, Joiner, Expression, Lookup, Update Strategy, Sequence Generator, and Stored Procedure.
• Defined and used Mapping Parameters and Parameter files to provide values dynamically at session run time.
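For illustration, a PowerCenter parameter file of the kind referenced above, using the standard `[folder.WF:workflow.ST:session]` section syntax; the folder, workflow, session, parameter, and connection names here are hypothetical examples:

```
[DWH_FOLDER.WF:wf_daily_load.ST:s_m_load_customers]
$$LOAD_DATE=2024-01-01
$$SRC_SCHEMA=STG
$DBConnection_Target=DWH_ORA_CONN
```

Passing such a file to the session at run time lets the same mapping run against different dates, schemas, and connections without code changes.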
• Managing metadata within ODI repositories, documenting ETL processes, mappings, and transformations for future reference and maintenance.
• Managing version control of ETL code and configurations, coordinating deployment of ETL jobs across different environments (development, testing, production).
• Proficient in utilizing IDMC features to streamline data operations, improve decision-making processes, and drive business value through effective data management strategies.
• Experienced in architecting and implementing IDMC solutions to optimize data workflows, enhance data quality, and ensure compliance with regulatory requirements
• Promoted the code base to QA and UAT, and performed one-time production deployment.
• Created Unix shell scripts to invoke the ETLs.
• Created AutoSys jobs to schedule the ETL workflows.
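A minimal sketch of an AutoSys JIL definition for such a scheduled ETL job; the job name, wrapper script, machine, owner, and log paths are hypothetical examples:

```
/* Hypothetical AutoSys JIL for a daily Informatica workflow */
insert_job: etl_daily_load   job_type: cmd
command: /app/etl/scripts/run_wf.sh wf_daily_load
machine: etl_server01
owner: etladmin
start_times: "02:00"
days_of_week: all
std_out_file: /app/etl/logs/etl_daily_load.out
std_err_file: /app/etl/logs/etl_daily_load.err
```

The `command` here would typically be a shell wrapper that calls `pmcmd startworkflow` and propagates the workflow's exit status back to AutoSys.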
• Designed and implemented data lakes utilizing AWS Lake Formation, ensuring efficient storage and retrieval of large volumes of data.
• Orchestrated data workflows and scheduled tasks using AWS Step Functions and AWS Managed Workflows for Apache Airflow (MWAA), optimizing data processing pipelines.
• Proficient in configuring connectors, APIs, and event-driven architectures within CAI frameworks to enable real-time data synchronization and communication between applications
• Managed AWS data environment, including EMR, Glue, S3, DocumentDB, Redshift, RDS, and Athena, ensuring smooth operation and integration of various data services.
• Implemented and optimized data warehouses/RDBMS like Redshift, as well as NoSQL data stores such as DocumentDB, DynamoDB, and OpenSearch, to support diverse data requirements and use cases.
• Strong expertise in Collibra for managing data governance and ensuring compliance with legal and regulatory standards.
• Skilled in establishing and managing stakeholder engagement and communication for data governance and quality initiatives.
• Extensive experience in metadata collection efforts, documenting system processes, and establishing standards and practices for efficient data management and governance.
Environment: Informatica Big Data Management, Informatica PowerCenter 10.2, IICS, Informatica Power Exchange 10.2, SSIS, Windows Secure Agent, Redshift, Teradata v1310, Azure Synapse (Azure SQL DW), Azure Data Lake Store, SQL Database, Power BI reporting
Client: Wells Fargo, Santa Clara, CA July 2021 - Dec 2022
Role: Informatica Developer
Responsibilities:
• Developed a standard ETL framework to enable reusability of similar logic across the board; involved in system documentation of data flow and methodology.
• Extensively developed Low-level Designs (Mapping Documents) by understanding different source systems
• Designed complex mappings, sessions, and workflows in Informatica PowerCenter to interact with MDM and EDW
• Design and develop mappings to implement full/incremental loads from the source system.
• Integrating ODI with other Oracle technologies such as Oracle Database, Oracle Exadata, Oracle Business Intelligence (OBIEE), etc.
• Design and develop mappings to implement type1/type2 loads.
• Experienced with Talend for ETL process design and implementation
• Skilled in SQL, data modeling, and database technologies (e.g., SQL Server, Oracle)
• Responsible for ETL requirement gathering and development with end-to-end support.
• Responsible for coordinating the DB changes required for ETL code development.
• Responsible for ETL code migration, DB code changes, and scripting changes to higher environments.
• Experienced in utilizing CDI platforms to seamlessly integrate data from disparate sources, including on-premises systems and cloud environments, enabling real-time data exchange and synchronization.
• Expertise in Cloud Application Integration (CAI), with a track record of designing and implementing robust integration solutions to seamlessly connect cloud-based applications and services.
• Responsible for supporting the code in the production and QA environment.
• Developed complex IDQ rules that can be used in Batch Mode
• Developed an Address Validator transformation in IDQ for use within Informatica PowerCenter mappings.
• Extensive experience in integration of Informatica Data Quality (IDQ) with Informatica PowerCenter
• Worked closely with the MDM team to identify the data requirements for their landing tables and designed the IDQ process accordingly.
• Extensively used transformations like router, lookup, source qualifier, joiner, expression, sorter, XML, Update strategy, union, aggregator, normalizer, and sequence generator
• Created reusable mapplets, and reusable transformations and performed Unit tests over Informatica code.
• Responsible for providing daily status reports for all Informatica applications to the customer, and for monitoring and tracking critical daily applications and code migration during deployments.
• Responsible for reloads of Informatica applications data in production and closing user tickets and incidents.
• Identify performance issues and bottlenecks.
Environment: Informatica Big Data Management, Informatica PowerCenter 10.2, Informatica Power Exchange 10.2, SSIS, Talend, Windows Secure Agent, Teradata v1310, Azure Synapse (Azure SQL DW), Azure Data Lake Store, SQL Database, Power BI reporting
Client: Centene, St. Louis MO Feb 2019 to June 2021
Role: ETL Informatica Developer
Responsibilities:
• Gathered business requirements, studied the application, and collected the required information from developers and business users.
• Participate in design and analysis sessions with business analysts, source-system technical teams, and end users.
• Responsible for developing and maintaining ETL jobs, including ETL implementation and enhancements, testing and quality assurance, troubleshooting issues, and ETL/Query performance tuning.
• Designing, developing, maintaining, and supporting Data Warehouse or OLTP processes via Extract, Transform, and Load (ETL) software using Informatica.
• Manage and expand the current ETL framework for enhanced functionality and expanded sourcing.
• Implemented complex data transformations and mappings within Talend jobs to support business requirements and data quality standards.
• Collaborated with data architects to design data models and optimize ETL workflows for performance and scalability.
• Translate business requirements into ETL and report specifications. Performed error handling using session logs.
• Worked on PowerCenter tools: Designer, Workflow Manager, Workflow Monitor, and Repository Manager.
• Involved in creating database objects such as tables, views, procedures, triggers, and functions using T-SQL to provide definition and structure and to maintain data efficiently.
• Performed code reviews of ETL and SQL processes. Worked on upgrading Informatica from version 9.x to 10.x.
• Created complex mappings in PowerCenter Designer using Aggregator, Expression, Filter, and Sequence Generator transformations.
• Identifying performance issues in the code and tuning it to complete the jobs within the given SLA time.
• Performance tuning at Informatica and database levels depending on the bottlenecks identified with the jobs.
• Developed mappings using Informatica Power Center Transformations - Lookup (Unconnected, Connected), Filter, Expression, Router, Joiner, Update Strategy, Aggregator, Stored Procedure, Sorter, and Sequence Generator.
• Effectively used Informatica parameter files for defining mapping variables, workflow variables, FTP connections, and relational connections.
• Worked with Informatica Data Quality 9.x (IDQ) toolkit, analysis, data cleansing, data matching, data conversion, exception handling, and reporting and monitoring capabilities of IDQ.
• Performed relational data modeling, ER diagrams (forward and reverse engineering), dimensional modeling, OLAP multidimensional cube design and analysis, defining slowly changing dimensions, and surrogate key management.
• Worked with the testing team to resolve bugs related to day one ETL mappings before production.
• Maintained ETL release document for every release and migrated the code into higher environments through deployment groups.
• Created change requests before code deployment into production to get required approvals from the business.
• Created weekly project status reports, tracking the progress of tasks according to schedule and reporting any risks and contingency plans to management and business users.
• Experienced working with team leads and interfaced with business analysts, third-party vendors, and end users.
Environment: Informatica Power Center 10.x/9.x, IDQ, OLAP, Talend, Oracle, MS SQL Server, PL/SQL, SSIS, SSRS, SSAS, SQL*Plus, SQL*Loader, Windows, UNIX.
Client: Albertsons, Pleasanton, CA May 2017 - Jan 2019
Role: ETL Developer
Responsibilities:
• Extensively involved in all stages of the testing life cycle, Test-Driven Development, and the Software Development Life Cycle (SDLC) using Agile methodologies.
• Understand the Requirements and Functionality of the application from specs.
• Involved in the development of Informatica mappings and tuned existing mappings for better Performance.
• Develop required UNIX shell scripts to automate ETL processes.
• Developed, enhanced, and tested Pro*C/batch code and shell scripts for subsystems such as Buy-In, TPL (Third Party Liability), Managed Care, and Claims.
• Translated business requirements into code (C, SQL) and met project deadlines on time.
• Responsible for testing the mappings and ensuring that the mappings do the transformation as proposed.
• Developed Mapplets and Worklets for reusability.
• Installed and configured Linux network operating systems.
• Used WinSCP to transfer data between Windows and Linux systems.
• Troubleshot and repaired corrupted Linux operating systems.
• Created mappings using different transformations like Source Qualifier, Joiner, Aggregator, Expression, Filter, Router, Lookup, Update Strategy, data masking, and Sequence Generator, etc.
• Involved in Unit Testing, creating test cases to check whether the data loads into the target.
• Prepared technical specifications for the development of Informatica (ETL) mappings to load data into various target tables and define ETL standards.
• Scheduling daily, weekly, and monthly loads for the managers according to the business specs.
• Developed workflows and sessions using command tasks in Workflow Manager.
• Responsible for developing various data extracts and loading routines using Informatica, Oracle stored procedure (pl/sql), packages, triggers, and UNIX.
• Created detailed process flow designs and drafted high-level Standard Operation Procedures.
• Performed tuning for PL/SQL procedures, Views, and Informatica objects.
• Monitored data warehouse month-end loads to ensure successful completion.
• Used Informatica Power Center Workflow Manager to create sessions, and batches to run with the logic embedded in the mappings.
• Tuned mappings and SQL queries for better performance and efficiency.
• Created and ran workflows using Workflow Manager in Informatica. Maintained source definitions, transformation rules, and target definitions using Informatica Repository Manager.
• Performed gap analysis by mapping the functional requirements to the business requirements, and worked on the report design to create user-interface mock-ups.
Environment: Informatica Power Center 9.5.1, Oracle 11g, T-SQL, PL/SQL, TOAD, SQL*Loader, UNIX, Windows 2007, MS Access, MS Excel, MS Word.
Client: Tri Bro Soft Tech, India June 2013 – Dec 2016
Role: ETL Informatica Developer
Responsibilities:
• Involved in all phases of SDLC from requirement, design, development, testing and support for production environment.
• Extensively used Informatica Client tools like Informatica Repository Manager, Informatica Designer, Informatica Workflow Manager and Informatica Workflow Monitor.
• Created sources and targets in a shared folder and developed reusable transformations, mapplets, and user-defined functions (UDFs) to reuse these objects in mappings and save development time.
• Created mappings involving Slowly Changing Dimensions Type 1 and Type 2 to implement business logic and capture records deleted in the source systems.
• Involved in analyzing site-usage file data from external vendors using SQL after loading it into the work database.
• Used debugger extensively to identify the bottlenecks in the mappings.
• Modified PL/SQL stored procedures for Informatica mappings.
• Created Sessions and Workflows to load data from the SQL server, flat file and Oracle sources that exist on servers located at various locations all over the country.
• Configured session properties, e.g., a higher commit interval, to increase performance.
• Involved in unit testing, Integration testing and User acceptance testing of the mappings.
• Involved in Migrating the Informatica objects using Unix SVN from Dev to QA Repository.
• Worked on developing workflows and sessions and monitoring them to ensure data is properly loaded on to the target tables.
• Responsible for scheduling workflows, error checking, production support, maintenance and testing of ETL procedures using Informatica session logs.
• Performed tuning on sources, targets, and mappings, as well as SQL query optimization.