
DEEPALI THAKRE

ETL Informatica Developer

202-***-**** • **********@*****.***

SUMMARY:

Over 7 years of experience in ETL Informatica development and the full Software Development Life Cycle (SDLC), including business requirement analysis, application design, development, implementation, and testing of data warehousing and database business systems. Proven expertise in delivering solutions for insurance and banking domains, ensuring data accuracy, integrity, and performance optimization.

Excellent experience in designing and developing ETL solutions using Informatica PowerCenter 10.5 and 10.4.

Experience in performance tuning on both the source and target sides.

Performed performance tuning of SQL queries.

Experience working with Postgres, SQL Server 2012/2014, and Oracle databases.

Worked extensively on requirements analysis and design, development, testing, and production deployment.

Experience in designing and managing complex ETL workflows.

Expert in writing SQL queries across multiple databases, including Oracle, SQL Server, and Postgres.

Extensive knowledge of Informatica tuning and SQL tuning.

Experience in integration of various data sources into staging areas.

Extensive experience with Data Extraction, Transformation, and Loading (ETL) from Multiple Sources.

Performed data profiling, cleansing, and validation of source data.

Designed and developed complex mappings, tasks, and workflows, and tuned them for performance.

Experience in handling high-volume data and tuning loads for performance.

Performed root cause analysis and troubleshooting of code issues.

Completed ETL development within agreed timeframes.

Performed QA functions for ETL applications, ensuring that unit and integration test plans were developed and executed.

Provided code migration instructions in standard documentation.

Experience in debugging and performance tuning of targets, sources, mappings, and sessions in Informatica.

Streamlined ETL procedures to improve efficiency and reduce processing time.

Reconciled data against business requirements across different functional areas; a minimal SQL sketch of this type of check follows the summary.

Experience in optimizing the Mappings and implementing complex business rules by creating re-usable transformations.

Experience in performance tuning of ETL processes; reduced execution times for very large data volumes on a company merger project, making heavy use of user-defined functions, reusable transformations, and lookups.

Working experience in Agile and Waterfall methodologies.

Excellent communication skills, with the ability to communicate effectively with executive and management teams, along with strong analytical and problem-solving skills.

Designed and implemented end-to-end data integration workflows using Informatica Cloud Data Integration, improving ETL process efficiency.

Developed reusable mappings, tasks, and parameterized workflows to streamline data transformation.

Migrated legacy ETL pipelines to IICS, reducing operational costs and improving system reliability.

Led a team of 4 developers to design and implement ETL workflows for a data warehouse supporting enterprise reporting.

Conducted code reviews and implemented best practices to ensure data quality and maintainability.

Created reusable components and templates to standardize ETL development across projects.

As team lead, partnered with QA teams to troubleshoot data inconsistencies and resolve issues promptly.

Created data ingestion pipelines from on-premises and cloud sources (SQL Server, Oracle, APIs) into Azure Data Lake and Blob Storage.

Designed and deployed ETL solutions to integrate data from multiple sources into Azure Data Lake Storage (ADLS) and Azure SQL Data Warehouse.

Experience using Azure Functions and Logic Apps to automate ETL processes and handle event-driven workflows.

Implemented and optimized data integration processes in Azure Synapse Analytics, improving data processing efficiency.
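
The reconciliation and validation work described above generally reduces to comparing row counts and key totals between a source or staging table and its warehouse target. Below is a minimal SQL sketch of such a check; the table and column names (STG_CLAIMS, DW_CLAIMS, CLAIM_AMT) are illustrative assumptions, not objects from an actual project.

-- Minimal reconciliation sketch: compare counts and totals between staging and target.
-- STG_CLAIMS, DW_CLAIMS, and CLAIM_AMT are illustrative names only.
SELECT 'SOURCE' AS side, COUNT(*) AS row_cnt, SUM(claim_amt) AS total_amt
FROM   stg_claims
UNION ALL
SELECT 'TARGET', COUNT(*), SUM(claim_amt)
FROM   dw_claims;
-- A mismatch in row_cnt or total_amt flags the load for investigation.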

TECHNICAL SKILLS:

ETL TECHNOLOGY: Informatica PowerCenter 10.5, 10.4, IICS, AZURE

DATA WAREHOUSE: Star Schema, Snowflake schema

DATABASES: Postgres, Oracle, MS SQL Server, MySQL

Others: SQL, WinSCP, Azure, ADF, ServiceNow, UNIX shell

PROFESSIONAL EXPERIENCE:

Infinite Computer Solutions, Rockville, MD (Remote) Nov 2022 - Present

Role: Informatica Developer

Client: Conduent (State government project) - MMIS

Modernizing legacy systems with scalable, cloud-based solutions like Conduent Medicaid Suite (CMdS) can streamline operations, reduce costs, and enhance service delivery. These solutions also ensure compliance with CMS standards and adapt to evolving regulatory requirements.

CMdS Financial: provides accounts payable, accounts receivable, and financial management solutions, eliminating the need to piece together information from multiple systems.

CMdS Pharmacy: offers real-time, comprehensive NCPDP-compliant claims adjudication functionality that automatically evaluates eligibility, drug coverage, benefit limitations, and pharmacy network enrollment prior to dispensing.

CMdS Drug Rebate: maximizes efficiency with configurable, integrated flexibility to support all major rebate types.

Responsibilities:

Worked with business analysts on requirement gathering, business analysis, testing, business process descriptions, scenarios, and workflow analysis.

Created Technical Design Document from business requirements document (BRD).

Analyzed business and system requirements to identify system impacts.

Created detailed technical design documents containing the ETL technical specifications for the given functionality.

Prepared source-to-target mappings and conducted meetings with the business to understand data and transformation logic.

Analyzed the existing mapping logic to determine the reusability of the code.

Created Mapping Parameters, Session parameters, Mapping Variables and Session Variables.

Performed extensive performance tuning by identifying bottlenecks at targets, sources, mappings, sessions, or the system level, leading to better performance.

Performed ETL and database code migrations across environments using deployment groups.

Populated target tables by applying business rules in mappings.

Involved in end-to-end system testing, performance and regression testing and data validation.

Managed performance and tuning of SQL queries and fixed slow-running queries in production.

Created batch scripts for automated database build deployment.

Expert in all phases of Software development life cycle (SDLC) - Project Analysis, Requirements, Design Documentation, Development, Unit Testing, User Acceptance Testing, Implementation, Post Implementation.

Designed, developed, and implemented ETL processes using IICS Data Integration.

Created IICS connections using various cloud connectors in IICS administrator.

Designed and built IICS mappings to extract data from Oracle and SQL Server and load it into target tables.

Designed the ETL flows for generating flat file extracts as per the business needs.

Worked on converting Informatica PowerCenter ETL code to IICS using PC to Cloud Conversion service.

Implemented slowly changing dimensions Type 1 and Type 2 in IICS; a SQL sketch of the Type 2 pattern follows this project section.


Performed loads into Snowflake instance using Snowflake connector in IICS.

Extensively used Informatica Power Center, IICS to create mappings, sessions and workflows for populating the data into dimension, fact, and lookup tables from different source systems (SQL server, Oracle, Flat files, tables).

Integrated Data Quality routines in the Informatica mappings to standardize and cleanse the data.

Created complex data transformations using IICS mappings, including aggregations, lookups, and joins, to standardize data across multiple sources.

Designed error-handling workflows and alerts to monitor and resolve data pipeline issues.

Maintained compliance by implementing data masking and encryption for sensitive information.

Performed performance tuning of databases running on Oracle Exadata to improve query response times.

Managed Oracle databases hosted on Exadata and performed day-to-day operations.

Led a team of 4 developers to design and implement ETL workflows for a data warehouse supporting enterprise reporting.

Conducted code reviews and implemented best practices to ensure data quality and maintainability.

Created reusable components and templates to standardize ETL development across projects.

Tuned mappings against performance criteria and created session partitions to address performance issues.

Performed data validation after successful end-to-end tests and implemented appropriate error handling in ETL processes.

Resolved tickets raised by the QA team based on priority levels.

Expertise in data profiling, standardization, and validation using Informatica Data Quality (IDQ).

Hands-on experience in creating base objects, foreign keys, lookup tables, and staging tables.

Implemented MDM solutions to streamline customer and product data across multiple systems, enhancing data accuracy.

Automated user account creation, permission modifications, and access management with Shell scripts.

Environment: Informatica PowerCenter 10.5, 10.4, IICS, Snowflake, SQL, Oracle, MySQL, Autosys.
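
The slowly changing dimension work mentioned above is typically realized as an expire-and-insert pattern for Type 2. The following is a minimal SQL sketch of that pattern, assuming illustrative Oracle-style objects (DIM_MEMBER, STG_MEMBER) rather than actual project tables; in IICS the same logic is usually expressed through Lookup, Expression, and Update Strategy transformations.

-- Hedged sketch of SCD Type 2: expire the current row, then insert the new version.
-- DIM_MEMBER, STG_MEMBER, and their columns are illustrative names only.

-- 1) Close out current rows whose tracked attributes changed in the latest load.
UPDATE dim_member d
SET    d.curr_flag = 'N',
       d.end_date  = TRUNC(SYSDATE) - 1
WHERE  d.curr_flag = 'Y'
AND    EXISTS (SELECT 1 FROM stg_member s
               WHERE  s.member_id = d.member_id
               AND   (s.address <> d.address OR s.plan_code <> d.plan_code));

-- 2) Insert a new current version for changed and brand-new members.
INSERT INTO dim_member (member_id, address, plan_code, start_date, end_date, curr_flag)
SELECT s.member_id, s.address, s.plan_code, TRUNC(SYSDATE), DATE '9999-12-31', 'Y'
FROM   stg_member s
WHERE  NOT EXISTS (SELECT 1 FROM dim_member d
                   WHERE  d.member_id = s.member_id
                   AND    d.curr_flag = 'Y'
                   AND    d.address   = s.address
                   AND    d.plan_code = s.plan_code);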

Client: CareFirst BlueCross BlueShield, Baltimore, MD Jul 2020 to Oct 2022

Role: Informatica Developer

Responsibilities:

Maintained warehouse metadata, naming standards and warehouse standards for future application development.

Developed complex PowerCenter mappings using different transformations and components to meet data integration and data quality requirements.

Created and monitored workflows/sessions using Informatica Server Manager/Workflow Monitor to load data into target Oracle database.

Performed unit testing, integration testing, and user acceptance testing to proactively identify data discrepancies and inaccuracies; an example discrepancy check appears after this project section.

Involved in performance tuning at source, target, mapping and session level.

Prepared design documents, ETL Specifications and migration documents.

Maintained daily Tech Tracker for the updates from the team regarding their objects, issues and progress.

Created the design and technical specifications for the ETL process of the project.

Used Informatica as an ETL tool to create source/target definitions, mappings and sessions to extract, transform and load data into staging tables from various sources.

Responsible for mapping and transforming existing feeds into new data structures and standards using Router, connected and unconnected Lookup, Expression, Aggregator, and Update Strategy transformations.

Worked with various complex mappings and designed slowly changing dimensions Type 1 and Type 2.

Performance tuning of the process at the mapping level, session level, source level, and target level.

Environment: Informatica PowerCenter, SQL, Oracle, MySQL, Autosys
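
One of the checks used to surface data discrepancies during testing (see the testing bullet above) is an orphan-key query against the star schema: fact rows whose dimension key has no match indicate late-arriving or missing dimension members. The sketch below assumes illustrative table names (FACT_CLAIM, DIM_PROVIDER) rather than actual project objects.

-- Hedged discrepancy check: fact rows with no matching dimension row (orphans).
-- FACT_CLAIM, DIM_PROVIDER, and PROVIDER_KEY are illustrative names only.
SELECT f.provider_key, COUNT(*) AS orphan_rows
FROM   fact_claim f
LEFT JOIN dim_provider d ON d.provider_key = f.provider_key
WHERE  d.provider_key IS NULL
GROUP  BY f.provider_key;
-- Any rows returned point to late-arriving or missing dimension members.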

Client: BOK Financial Corporation, Richmond, VA Apr 2018 to Jul 2020

Role: Informatica Developer

Responsibilities:

Actively interacted with business users to record user requirements and perform business analysis.

Used Informatica PowerCenter for extraction, transformation, and loading (ETL) of data from heterogeneous source systems such as flat files, SQL Server 2012, and Oracle 11g into an Oracle 11g target database.

Documented data mappings and transformations per the business requirements.

Prepared Technical Design documents and Test cases.

Created mappings using Designer and extracted data from various sources, transformed data according to the requirement.

Developed Informatica Mappings and Reusable Transformations to facilitate timely Loading of Data of a star schema.

Developed Informatica mappings using the Aggregator transformation, SQL overrides in Lookups, source filters in Source Qualifiers, and Router-based data flow management into multiple targets; a sketch of a Lookup SQL override follows this project section.

Created Sessions and extracted data from various sources, transformed data according to the requirement and loaded it into data warehouse tables.

Developed several reusable transformations and mapplets that were used in other mappings.

Developed transformation logic as per the requirement, created mappings and loaded data into respective targets.

Used SQL tools like TOAD to run SQL queries to view and validate the data loaded into the warehouse.

Migrated Mappings, Sessions, Workflows from Development to Test and then to UAT environment.

Responsible for identifying bottlenecks and fixing them through performance tuning.

Extensively worked on Unit testing for the Informatica code using SQL Queries and Debugger.

Improved performance testing at the mapping and session levels.
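
The SQL overrides and source filters mentioned above aim to push filtering and aggregation into the database so that Informatica caches or reads only the rows the mapping actually needs. Below is a minimal sketch of such a Lookup override query; ACCOUNT and TXN_DETAIL are illustrative table names, not actual project objects.

-- Hedged sketch of a Lookup SQL override: pre-filter and pre-aggregate in the
-- database so the lookup cache holds only the active accounts the mapping needs.
-- ACCOUNT and TXN_DETAIL are illustrative names only.
SELECT a.account_id,
       a.account_status,
       MAX(t.txn_date) AS last_txn_date
FROM   account a
JOIN   txn_detail t ON t.account_id = a.account_id
WHERE  a.account_status = 'ACTIVE'        -- filter pushed to the source database
GROUP  BY a.account_id, a.account_status; -- aggregate before caching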

EDUCATION:

Bachelor’s in computer technology from Nagpur University, Maharashtra, India


