
SQL Server ETL Informatica

Location:
Dallas, TX
Salary:
90000
Posted:
July 30, 2025

Resume:

PRATIMA PANDEY

ETL Informatica Developer

Email: **************@*****.***

Cell: +1-984-***-****

LinkedIn: www.linkedin.com/in/pratima-pandey-76489a2a4

An ETL Informatica Cloud developer with 7+ years of IT experience. My main responsibility is to design, develop, and maintain data integration solutions using Informatica Cloud (IICS) and Informatica PowerCenter. I work with various data sources to extract, transform, and load data into target databases, and I ensure the accuracy and integrity of the data throughout the ETL/integration process. I am well versed in data warehousing concepts, star schema, dimensional data modelling, and SCD, and have worked with various databases, such as Oracle, SQL Server, Snowflake, and MySQL. My expertise also includes designing ETL workflows, creating mappings and transformations, and troubleshooting data quality and data integration issues. My skills include data profiling, data cleansing, data mapping, and data validation. I am also proficient in writing and optimizing SQL queries and UNIX commands. I have worked on several complex ETL projects in the healthcare insurance and banking domains and have successfully delivered solutions that meet the business requirements. I am passionate about using technology to solve real-world problems and am always eager to learn new things.

•Excellent experience in designing and developing ETL solutions using Informatica Cloud (IICS/IDMC) and Informatica PowerCenter 10.5, 9.6.1, and 9.1.1.

•Experience working with Oracle 11g/10g, MySQL, and SQL Server 2017/2014 databases.

•Expert in Change Data Capture (CDC) methodology using Informatica PowerCenter 10.x/9.x.

•Worked extensively on requirements gathering, development, testing, and production deployment.

•Prepared various types of documents, such as data mappings, test cases, and data dictionaries.

•Expert in writing SQL queries against multiple databases, including Oracle, SQL Server, and MySQL.

•Extensive knowledge of Informatica tuning and SQL tuning.

•Experience in integrating various data sources into a staging area.

•Experience with Informatica Cloud Services (ICS), using bundles, integration templates, mappings, task flows, saved queries, input/output macro fields, mapping configuration tasks, and cascading filters.

•Performed unit, system, and integration testing, supported users during UAT, and prepared test reports in different phases of the project.

•Extensive experience with Data Extraction, Transformation, and Loading (ETL) from Multiple Sources.

•Designed and developed complex mappings, mapplets, tasks, and workflows, and tuned them.

•Experience in Debugging and Performance tuning of targets, sources, mappings and sessions.

•Good understanding of Data modeling concepts like Star-Schema and Snowflake Schema Modeling.

•Experience in optimizing the Mappings and implementing the complex business rules by creating reusable transformations like mapplets.

•Extensively used the Slowly Changing Dimension (SCD) technique in business applications.

•Used UNIX commands for file operations such as data viewing, parsing, and filtering (a short sketch follows this list).

•Experience in migrating code through CI/CD pipelines using Git and Bitbucket.

•Experience in performance tuning of ETL processes using pushdown optimization.

•Understanding of AWS cloud and some of its products, such as S3 and EC2.

•Good experience working with onshore, offshore, and end-to-end teams to ensure commitments are met.
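
A few representative UNIX one-liners for the kind of file viewing, parsing, and filtering mentioned above; the file name, delimiter, and field positions are hypothetical:

# Peek at the first rows of a pipe-delimited extract (hypothetical file)
head -5 claims_extract.dat

# Count data records, skipping the header line
tail -n +2 claims_extract.dat | wc -l

# Print member ID (field 1) and state (field 4) for one state only
awk -F'|' '$4 == "NC" { print $1, $4 }' claims_extract.dat

# Drop rows flagged as errors before staging the file
grep -v '^ERR|' claims_extract.dat > claims_extract_clean.dat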

EDUCATION

Master of Science from Sam Higginbottom University of Agriculture Science & Technology, India

ETL TECHNOLOGY
Informatica PowerCenter 10.5, 9.x; Informatica Cloud (IICS/IDMC); ADF

DATA WAREHOUSE
Star Schema, Snowflake Schema

DATA MODELLING
Erwin 9.5/7.1

DATABASES
Oracle 12c, MS SQL Server 2017/2014, MySQL, Snowflake, Salesforce

APPLICATIONS
MS Office, Toad 9.2/8.6

OTHERS
TOAD, SQL, PL/SQL, WinSCP, UNIX, Shell Scripting, PuTTY, Rally, REST API, Postman, Jenkins, Bitbucket, Git

CLIENT: LABCORP, BURLINGTON, NC DURATION: JUN 2021 TO CURRENT

PROJECT: LDDW (LABCORP DIAGNOSTICS DATA WAREHOUSE) ROLE: INFORMATICA DEVELOPER

Description: Laboratory Corporation of America Holdings operates as a life sciences company that provides vital information to help doctors, hospitals, pharmaceutical companies, researchers, and patients make clear and confident decisions. The company offers various tests, such as blood chemistry analyses, urinalyses, blood cell counts, thyroid tests, PAP tests, hemoglobin A1C and vitamin D, prostate-specific antigens, hepatitis C tests, microbiology cultures and procedures, and alcohol and other substance-abuse tests. The objective of these data marts is to enable improved decision-making, strategic plans, support and solutions that favorably impact costs, quality of care, outcomes, and customer satisfaction through an information-driven environment that leverages their integrated data assets for competitive advantage.

Responsibilities:

Collaborated with business users to capture and document requirements for Business Analysis.

Documented Data Mappings and Transformations according to business needs.

Conducted source data analysis and profiling to understand business processes.

Created and managed source-to-target mapping documents for ETL development.

Developed mappings using Designer, extracting and transforming data from various sources.

Created Informatica Mappings and Reusable Transformations for efficient data loading into a star schema.

Utilized Aggregator, SQL overrides in Lookups, source filters in Source Qualifiers, and Routers for managing data flows in Informatica Mappings.

Created Sessions for data extraction, transformation, and loading into data warehouse tables.

Used SQL*Loader to load data into target tables.

Developed reusable transformations and mapplets for integration into other mappings.

Designed Workflows incorporating various tasks such as command, email, session, and decision.

Implemented transformation logic, created mappings, and loaded data into target systems.

Performed data validation using SQL following successful ETL loads.

Migrated Mappings, Sessions, and Workflows across Development, Test, and UAT environments.

Authored numerous SQL queries for data analysis and validation against business rules.

Identified and resolved performance bottlenecks through tuning.

Developed Unit Test plans and conducted unit, system, regression, and UAT testing.

Managed ETL and database code migrations across environments using deployment groups.

Implemented business rules through mappings to populate target tables.

Created and managed parameter files and initiated sessions using pmcmd commands to dynamically adjust session parameters and variables (see the first sketch after this list).

Conducted end-to-end integration and performance testing, validating results with SQL queries.

Optimized slow-running SQL queries from Production by rewriting and improving them.

Enhanced mapping performance by addressing Source/Target bottlenecks.

Developed batch scripts for automated database deployment.

Conducted extensive Unit Testing of Informatica code with SQL Queries and Debugger.

Managed server operations using pmcmd commands in UNIX and automated tasks with shell scripts.

Enhanced performance through tuning at the mapping, session, source, and target levels.

Coordinated with the scheduling team to execute Informatica jobs for historical data loading in production.

Prepared Technical Design documents and Test Cases.

Executed scripts in IICS/IDMC to download, unzip, and process data files (JSON, CSV, XML) using IICS connectors and transformations (see the second sketch after this list).

Analyzed and redesigned existing mappings from PowerCenter 10.x to IICS using Cloud Data Integration (CDI).

Managed source-to-target mapping documents for the ETL development team.

Worked with Informatica Intelligent Cloud Services (IICS), CDI, and CAI for data extraction from cloud environments.

Created summarized, control, and staging tables to boost system performance and facilitate immediate database recovery.

Monitored data quality issues and metrics, generating reports and dashboards.

Migrated code to Bitbucket using CI/CD.

Collaborated with Business Analysts to gather and understand requirements and business processes.

Generated Technical Design Documents based on Business Requirement Documents (BRDs), analyzing system impacts.

Conducted comprehensive data analysis and profiling, creating a data dictionary with numerous SQL queries.

Prepared source-to-target mapping documentation and engaged with stakeholders to understand data and transformation logic.

Designed and maintained mappings, sessions, and workflows for data loading.

Engineered mappings for Slowly Changing Dimensions (Type 1, Type 2, etc.), using transformations like Lookup, Update Strategy, and Filters.

Employed various transformations, including Router, Aggregator, Lookup, Source Qualifier, Joiner, Expression, and Sequence Generator, to extract data according to business logic.

Implemented SQL overrides in Source Qualifiers to filter data as per business requirements.

Analyzed mapping logic to assess and improve code reusability.

Created Mapping Parameters, Session Parameters, Mapping Variables, and Session Variables.

Conducted performance tuning by identifying and addressing bottlenecks at various levels to enhance session performance.

Participated in ETL code migration from Development (DEV) to Quality Assurance (QA) and User Acceptance Testing (UAT) using CI/CD pipelines integrated with GitHub.

Maintained daily communication with the offshore team to ensure project commitments were met.
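
The following is a minimal sketch of the parameter-file and pmcmd usage described above; the domain, service, folder, workflow, and path names are hypothetical placeholders, not actual project values:

#!/bin/sh
# Hypothetical parameter file contents (/opt/infa/params/wf_daily_load.parm):
#   [FolderName.WF:wf_daily_load.ST:s_m_stage_claims]
#   $InputFile1=claims_20250730.dat
#   $$LOAD_DATE=2025-07-30

# Start the workflow with that parameter file and wait for completion
pmcmd startworkflow -sv INT_SVC_DEV -d DOM_DEV \
  -u "$INFA_USER" -p "$INFA_PWD" -f FolderName \
  -paramfile /opt/infa/params/wf_daily_load.parm \
  -wait wf_daily_load

# A non-zero exit status means the workflow failed or could not start
if [ $? -ne 0 ]; then
  echo "wf_daily_load failed" >&2
  exit 1
fi

And a sketch of the kind of shell script IICS/IDMC can invoke to download and unpack an API response before processing; the endpoint, token variable, and directories are hypothetical:

#!/bin/sh
# Download a zipped API response (hypothetical endpoint and token)
curl -s -H "Authorization: Bearer $API_TOKEN" \
  -o /data/inbound/response.zip \
  "https://api.example.com/v1/extracts/daily"

# Unzip into the staging directory, overwriting older files
unzip -o /data/inbound/response.zip -d /data/staging

# Hand each extracted JSON/CSV file to the next processing step
for f in /data/staging/*.json /data/staging/*.csv; do
  [ -e "$f" ] || continue   # skip patterns that matched no files
  echo "staging file ready: $f"
done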

Environment: Informatica Cloud IICS, Informatica PowerCenter 10.5, SQL*Loader, SQL, Oracle 19c, SQL Server 2017, TOAD, AutoSys, PL/SQL, JIRA, REST API, Postman

CLIENT: BB&T (NOW TRUIST), RALEIGH, NC DURATION: SEPT 2019 TO JUN 2021

PROJECT: CONSOLIDATED DATA WAREHOUSE ROLE: INFORMATICA DEVELOPER

Description: Truist Financial Corporation, a holding company, provides banking and trust services in the Southeastern and Mid-Atlantic United States. Relevant data from BB&T were consolidated into the CDW from various source systems using Informatica ETL tools. Reports and interactive dashboards were developed that gave the company better executive reporting.

Responsibilities:

•Worked with business users to gather requirements and was involved in the complete SDLC of the project.

•Analyzed source data and gathered requirements from the business users and source data team

•Created Technical Design Document from business requirements document (BRD).

•Analyzed business and system requirements to identify system impacts.

•Studied the existing system and conducted reviews to provide a unified view of the program.

•Created detailed technical design documents containing the ETL technical specifications for the given functionality.

•Prepared source-to-target mappings and conducted meetings with the business to understand the data and transformation logic.

•Used SQL*Loader to load data into target tables, saving load time for large data sets (see the first sketch after this list).

•Analyzed the existing mapping logic to determine the reusability of the code.

•Created Mapping Parameters, Session parameters, Mapping Variables and Session Variables.

•Performed extensive performance tuning by identifying bottlenecks in targets, sources, mappings, sessions, and the system, which led to better performance.

•Created unit test plans and performed unit testing with separate scenarios for every process; involved in system and regression testing and supported UAT for the client.

•Performed ETL and database code migrations across environments using deployment groups.

•Populated the target tables by implementing business rules in mappings.

•Involved in end-to-end system testing, performance and regression testing, and data validation.

•Managed performance tuning of SQL queries and fixed slow-running queries in production.

•Extensively worked on Mapping Variables, Mapping Parameters, Workflow Variables and Session Parameters.

•Involved in creating Informatica mappings, mapplets, worklets, and workflows to populate the warehouse with data from different sources.

•Created batch scripts for automated database build deployment (see the second sketch after this list).

•Used prewritten scripts to run Informatica workflows.
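
A minimal SQL*Loader sketch of the bulk load described above; the table, file, and column names are hypothetical:

#!/bin/sh
# Write a hypothetical control file describing a comma-delimited extract
cat > accounts.ctl <<'CTL'
LOAD DATA
INFILE 'accounts_20210601.csv'
APPEND INTO TABLE STG_ACCOUNTS
FIELDS TERMINATED BY ',' OPTIONALLY ENCLOSED BY '"'
(ACCOUNT_ID, BRANCH_CODE, OPEN_DATE DATE 'YYYY-MM-DD', BALANCE)
CTL

# Direct-path load is what saves time on large data sets
sqlldr userid="$DB_USER/$DB_PWD" control=accounts.ctl \
       log=accounts.log bad=accounts.bad direct=true

And a sketch of a batch script for automated database build deployment, applying versioned SQL files in order; the paths and connection variables are hypothetical:

#!/bin/sh
# Apply each versioned DDL/DML script in order; stop on the first failure
for script in /release/sql/V*.sql; do
  echo "applying $script"
  sqlplus -s "$DB_USER/$DB_PWD@$TNS_ALIAS" <<EOF
WHENEVER SQLERROR EXIT SQL.SQLCODE
@$script
EXIT
EOF
  if [ $? -ne 0 ]; then
    echo "deployment failed at $script" >&2
    exit 1
  fi
done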

Environment: Informatica PowerCenter 9.6.1, SQL Server 2017, Oracle 12c, SQL, UNIX, Toad, SQL Developer, SQL*Loader, Cognos 9, Control-M, JIRA

CLIENT: CAREFIRST BLUECROSS BLUESHIELD, CHARLOTTE, NC DURATION: JAN 2018 TO SEPT 2019

PROJECT: CLAIM DATA MART ROLE: INFORMATICA DEVELOPER

Description: The organization provides managed care services targeted at government-sponsored health care programs, focusing on Medicaid and Medicare. This project delivered an innovative Tableau performance dashboard that gives a quick snapshot of all relevant data in a user-centric view, offering detailed analysis of the government business (Medicare and Medicaid) so that business users can make informed decisions efficiently. The project developed multiple dashboards for internal users and stakeholders of the government segment business, as part of an integrated data mart drawing on Medicare and Medicaid data sources such as Facets, EZCAP, LEON, and QNXT, to support short-term and long-term decision-making, strategic plans, and solutions, and to give stakeholders the ability to access and analyze data and to automate extract processes for downstream applications.

Responsibilities:

•Performed analysis, requirements gathering, technical specification, development, deployment, and testing.

•Extensive use of Informatica tools like Designer, Workflow Manager, and Workflow Monitor.

•Used transformations like Aggregator, Sorter, Dynamic Lookup, Connected and Unconnected Lookup, Filter, Expression, Router, Joiner, Source Qualifier, Update Strategy, and Sequence Generator.

•Used Workflow Manager/Monitor for creating and monitoring workflows and worklets.

•Used mapping parameters, mapping variables, and parameter files.

•Used SQL to analyze source data, perform data analysis, and validate the data.

•Created Informatica mappings to load the data from staging to dimensions and fact tables.

•Conducted unit testing and prepared unit test specification requirements.

•Configured the mappings to handle updates while preserving existing records using the Update Strategy transformation (a SQL sketch of this Type 2 pattern follows this list).

•Used transformations such as Aggregator, Filter, Router, Sequence Generator, Update Strategy, Rank, Expression, and Lookup (connected and unconnected) to transform the data according to the business logic.

•Worked on tuning SQL statements

•Prepared Test Cases and performed system and integration testing

•Created and monitored sessions.

•Involved in troubleshooting the loading failure cases, including database problems.

•Created sessions, reusable worklets, and workflows in Workflow Manager.

•Improved performance by identifying the bottlenecks in Source, Target, Mapping and Session levels.

•Facilitated load testing and benchmarked the developed mappings/sessions against the set performance standards.

•Involved in testing the database using complex SQL scripts and handled the performance issues effectively.

•Scheduled Informatica jobs through the AutoSys scheduling tool.
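
The Update Strategy logic above preserves history in the Type 2 style; a minimal SQL sketch of the equivalent expire-and-insert pattern, with hypothetical table and column names, might look like this:

#!/bin/sh
# Run the Type 2 expire-and-insert pattern against Oracle (hypothetical names)
sqlplus -s "$DB_USER/$DB_PWD@$TNS_ALIAS" <<'SQL'
WHENEVER SQLERROR EXIT SQL.SQLCODE

-- Close out current rows whose tracked attributes changed in staging
UPDATE dim_member d
   SET d.eff_end_date = SYSDATE, d.current_flag = 'N'
 WHERE d.current_flag = 'Y'
   AND EXISTS (SELECT 1 FROM stg_member s
                WHERE s.member_id = d.member_id
                  AND s.plan_code <> d.plan_code);

-- Insert incoming rows that have no open version as the new current rows
-- (this also picks up the members expired by the update above)
INSERT INTO dim_member
  (member_key, member_id, plan_code, eff_start_date, eff_end_date, current_flag)
SELECT dim_member_seq.NEXTVAL, s.member_id, s.plan_code, SYSDATE, NULL, 'Y'
  FROM stg_member s
 WHERE NOT EXISTS (SELECT 1 FROM dim_member d
                    WHERE d.member_id = s.member_id
                      AND d.current_flag = 'Y');

COMMIT;
EXIT
SQL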

Environment: Informatica PowerCenter 9.1, Oracle 12c, SQL, SQL Server 2014, Autosys, JIRA.


