
Informatica PowerCenter, IICS, SQL Server, Snowflake, SQL, UNIX Scripting

Location:
Jacksonville, IL
Salary:
Company Standards
Posted:
August 12, 2025

Resume:

Swapna Medukonduru

ETL IICS/Informatica Cloud Developer

Email: ********************@*****.***  Cell: +1-217-***-****
http://www.linkedin.com/in/swapna-medukonduru-7a9441303

IT professional with 7+ years of experience in ETL development spanning the entire development life cycle. From requirements gathering and design through implementation, testing, and production deployment, I have a comprehensive understanding of the process and am able to work efficiently and effectively within each phase. In addition, I have experience working with multiple RDBMS, including Oracle, Snowflake, and SQL Server, as well as experience building integrations with a range of enterprise data warehouse applications in the healthcare insurance and financial domains.

• Experience performing a POC on IICS as part of an Informatica PowerCenter to Informatica Cloud conversion project; created IICS mappings using Source, Target, Lookup, Expression, and Filter transformations.

• Good knowledge of Azure Data Factory and Azure SQL DB; also worked on POCs using these skills.

• Experience writing SQL using analytical functions such as ranking and CASE statements in Oracle 12c/11g, Snowflake, and SQL Server 2017 (a brief example follows this summary).

• Excellent experience in the design and development of ETL using IICS/IDMC and Informatica PowerCenter 10.4/9.6.

• Advanced knowledge of managing Snowflake databases, writing SQL queries, and optimizing data loading strategies.

• Proficient in integrating Snowflake and AWS S3 within ETL workflows, leveraging cloud-based solutions for efficient data storage and transformation.

• Interacted with end users to identify needs, helped update the BRD, and transformed it into technical requirements.

• Experience in Debugging and Performance tuning of targets, sources, mappings and sessions.

• Worked with Change Data Capture (CDC) methodology using Informatica PowerCenter 10.x/9.x.

• Prepared various documents such as requirements, ETL specifications, data mappings, test cases, and data dictionaries.

• Expert in writing SQL queries across multiple databases such as Oracle, Snowflake, and SQL Server.

• Good in OLTP/OLAP system study, analysis, and E-R diagrams; understand dimensional models such as star schema and snowflake schema used in dimensional modeling.

• Experience in creating generic mappings using Expression Macros (Vertical) in IICS.

• Performed unit, system, and integration testing, and supported users during UAT in different phases of the project.

• Experience with Data Extraction, Transformation, and Loading from Multiple Sources.

• Experience in creating different types of Taskflows like Parallel tasks with Decision, Sequential Tasks with Decision, and Linear Taskflow.

• Good in Data modeling concepts like Star-Schema Modeling, Snowflake Schema Modeling.

• Experience in optimizing mappings and implementing complex business rules by creating reusable transformations, Mapplets, and lookups.

• Extensively used the Slowly Changing Dimension (SCD) technique in business applications.

• Experience in Performance tuning of ETL process using pushdown optimization and other techniques.

• Strong in data analysis, problem-solving, and team collaboration; proven ability to work effectively in a fast-paced environment and manage multiple priorities.
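
For illustration, a minimal sketch of the kind of analytical SQL described in this summary; the table, columns, and thresholds are hypothetical, and the syntax shown is common to Oracle, Snowflake, and SQL Server:

    -- Rank each member's claims by paid amount and bucket them with a CASE expression.
    -- "claims" and its columns are placeholder names, not from a specific project.
    SELECT member_id,
           claim_id,
           paid_amount,
           RANK() OVER (PARTITION BY member_id ORDER BY paid_amount DESC) AS claim_rank,
           CASE
               WHEN paid_amount >= 10000 THEN 'HIGH'
               WHEN paid_amount >= 1000  THEN 'MEDIUM'
               ELSE 'LOW'
           END AS cost_band
    FROM claims
    WHERE claim_status = 'PAID';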

Education and certifications:

M.Sc. (Computer Science), Kakatiya University Campus, Warangal, India, 2007

Technical Skills:

ETL Technology: Informatica Intelligent Cloud Services (IICS/IDMC – CDI/CAI), Informatica PowerCenter 10.4/9.x

Databases/Cloud: Oracle 18c/12c, MS SQL Server 2017/2014, AWS, GCP, Microsoft Azure, Salesforce, Snowflake

Web Services: HTML, XML, CSS, JavaScript, SOAP, REST, REST API
Others: TOAD, SQL, JIRA, Jenkins, Airflow, MS Office, GIT

Client: Heartland Bank and Trust Company, Bloomington, IL    May 2023 to Current
Role: Application Programmer / ETL Informatica Developer

Description: Heartland Bank and Trust Company offers a broad range of banking services, with teams ready to serve their communities. This project integrates data from multiple applications into an enterprise data warehouse to build an analytics solution for better reporting.

• Actively involved in interacting with business users to record user requirements and Business Analysis.

• Created Data Mappings/Transformations as per requirement. Created & maintained source-target mapping documents for ETL team.

• Heavily used IICS to design, develop, and deploy data integration workflows, ensuring seamless data flow between various systems, and leveraged REST APIs for integration with external applications.

• Good in creating and managing data pipelines/workflows using IICS, leveraging its robust features for data transformation, aggregation, and filtering in cloud environments.

• Designed/implemented data integration solutions using IDMC/IICS, integrating data from disparate sources, including databases, files, and APIs, and developed custom REST APIs for data ingestion and exposure.

• Developed and deployed workflows in IICS, utilizing advanced features such as error handling and data quality checks, including scheduling, monitoring, and troubleshooting in lower environments.

• Collaborated with cross-functional teams to design and implement data governance policies using Informatica Data Management Cloud (IDMC), and developed custom REST APIs for data policy enforcement.

• Implemented data quality and data validation rules using IDMC, ensuring data accuracy and integrity across various systems (an illustrative validation query appears at the end of this section).

• Utilized IDMC/IICS to integrate data from cloud-based applications with on-premises systems, and developed custom REST APIs to handle real-time data synchronization in lower environments.

• Created and managed data lakes using IDMC/IICS, storing and processing large volumes of structured and unstructured data, with an understanding of access control and data governance.

• Migrated code using CI/CD pipelines into GitHub repositories in lower environments.

• Worked on cloud-based EDW solutions using IDMC/IICS for populating data in data marts and other BI applications, and developed custom REST APIs for data exposure and consumption in lower environments.

• Designed and implemented real-time data integration solutions using IDMC/IICS in lower environments, enabling business users to make data-driven decisions.

• Worked with cross-functional teams, such as database administrators, management, and UNIX admins, to address dependencies, resolve issues, and ensure smooth execution of the project.

• Actively participated in team discussions and knowledge-sharing sessions to exchange ideas, leverage collective expertise, and foster a collaborative work environment.

Environment: IICS/IDMC, SQL, Oracle 18c, Snowflake DB, TOAD, Airflow, CI/CD pipeline, Jira
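
As an illustration of the kind of validation rule referenced in the data quality bullet above (the actual rules were implemented in IDMC), a simple reconciliation query comparing a hypothetical staging table with its warehouse target might look like:

    -- Compare row counts and totals between staging and target; table and column names are placeholders.
    SELECT s.src_rows,
           t.tgt_rows,
           s.src_amount,
           t.tgt_amount,
           CASE WHEN s.src_rows = t.tgt_rows AND s.src_amount = t.tgt_amount
                THEN 'PASS' ELSE 'FAIL'
           END AS check_status
    FROM  (SELECT COUNT(*) AS src_rows, SUM(txn_amount) AS src_amount FROM stg_transactions) s
    CROSS JOIN
          (SELECT COUNT(*) AS tgt_rows, SUM(txn_amount) AS tgt_amount FROM dw_transactions) t;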

Client: Aetna, a CVS Health Company, Hartford, CT    Mar 2021 to May 2023
Role: ETL Informatica Developer

Description: Aetna is one of the nation's leading diversified health care benefits companies, serving an estimated 46.7 million people with information and resources to help them make better decisions about their healthcare. The objective of this project is to enrich data from new sources into the existing IDW (integrated data warehouse) to enable improved decision-making, strategic plans, support, and solutions that favorably impact costs, quality of care, outcomes, and customer satisfaction through an information-driven environment that leverages integrated data assets for competitive advantage.

• Actively involved in interacting with business users to record user requirements and Business Analysis.

• Documented Data Mappings/ Transformations as per the business requirement.

• Prepared Technical Design documents and Test cases.

• Created and maintained source-target mapping documents for the ETL development team.

• Created mappings using Designer and extracted data from various sources, transformed data according to the requirement.

• Developed Informatica Mappings and Reusable Transformations to facilitate loading data into a star schema.

• Developed the Informatica Mappings by usage of Aggregator, SQL overrides usage in Lookups, source filter usage in Source qualifiers, and data flow management into multiple targets using Router.

• Created Sessions, extracted data from various sources, transformed it according to the requirements, and loaded it into data warehouse tables.

• Developed several reusable transformations and mapplets that were used in other mappings.

• Developed transformation logic as per the requirement, created mappings & loaded into respective targets.

• Used SQL tools like TOAD to run SQL queries to view and validate the data loaded into the warehouse.

• Migrated Mappings, Sessions, Workflows from Development to Test and then to UAT environment.

• Wrote hundreds of SQL queries for data analysis to validate the input data against business rules.

• Responsible for determining the bottlenecks and fixing the bottlenecks with performance tuning.

• Extensively worked on Unit testing for the Informatica code using SQL Queries and Debugger.

• Used the PMCMD command to start, stop, and ping the server from UNIX.

• Improved performance at the mapping and session levels.

• Migrated code using CI/CD pipelines in lower environments.

• Coordinated with scheduling team to run Informatica jobs for loading historical data in production.

• Worked on a POC on IICS to convert Informatica PowerCenter workflows to Informatica Cloud.

Environment: Informatica PowerCenter 10.4, SQL, Oracle 12c, Snowflake, TOAD, Control M, JIRA

Client: The Hartford, Hartford, CT    Jan 2020 to Mar 2021
Role: ETL Informatica Developer

Description: The Hartford is an insurance company that provides a range of insurance and financial products and services to individuals and businesses. The offerings include property and casualty insurance, group benefits, and mutual funds. The Data Warehouse project involves designing and building a large-scale database that consolidates data from various internal and external sources to support the Hartford's business intelligence and reporting needs, enabling it to make data-driven decisions.

• Involved in all phases of SDLC from requirement gathering, design, development, testing, Production, user training and support for production environment.

• Actively involved in interacting with business users to record user requirements and Business Analysis.

• Outlined the complete process flow and documented the data conversion, integration and load mechanisms to verify specifications for this project.

• Parsed high-level design specs into simple ETL coding and mapping standards.

• Maintained warehouse metadata standards for current ETL development.

• Created the design and technical specifications for the ETL process of the project.

• Used Informatica as an ETL tool to create source/target definitions, mappings and sessions to extract, transform and load data into staging tables from various sources.

• Responsible for mapping and transforming existing feeds into the new data structures and standards, utilizing Router, connected and unconnected Lookups, Expression, Aggregator, Update Strategy, and Stored Procedure transformations.

• Worked with various complex mappings; designed Slowly Changing Dimension Type 1 and Type 2.

• Performance tuning of the process at the mapping level, session level, source level, and the target level.

• Created Workflows containing command, email, session, decision and a wide variety of tasks.

• Tuned mappings based on criteria and created partitions in case of performance issues.

• Performed data validation after successful End to End tests and appropriate error handling in ETL processes.

• Resolved tickets based on priority levels raised by the QA team.

• Developed parameter files for passing values to the mappings as per requirements (a sample layout follows this list).

• Scheduled batches and sessions within Informatica using the Informatica scheduler, and customized pre-written shell scripts for job scheduling.
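
A minimal sketch of how such a parameter file is typically laid out; the folder, workflow, session, connection, and parameter names below are illustrative placeholders, not taken from this project:

    [DWH_Folder.WF:wf_load_policy.ST:s_m_load_policy]
    $DBConnection_Source=ORA_POLICY_SRC
    $DBConnection_Target=ORA_EDW_TGT
    $$LoadDate=2021-01-31
    $$SourceSystem=POLICY_ADMIN
    $PMSessionLogFile=s_m_load_policy.log

Here $$LoadDate and $$SourceSystem would be mapping parameters referenced inside the mapping, while $DBConnection_Source and $DBConnection_Target override session connection values at run time.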

Environment: PowerCenter 9.6.1, Oracle 12c, Control M, Putty, WinSCP, Notepad++, JIRA

Client: Memorial Health Hospitals and Health Care, Springfield, IL    May 2018 to Dec 2019
Role: ETL Informatica Developer

Description: Memorial Health is one of the leading healthcare organizations in Illinois. Founded in 1897 to meet the expanding needs of their communities, they have assembled an ever-growing team and vast resources. This project involves building a data warehouse by consolidating data from a variety of sources into a central data warehouse to solve multiple use cases and support downstream applications.

• Worked with business team to gather requirements for projects & created strategies to handle requirements.

• Documented Data Mappings/ Transformations as per the business requirement.

• Worked on project documentation which included the Technical and ETL Specification documents.

• Created mappings to extract data from various sources and transform it according to requirements.

• Involved in extracting the data from the Flat Files and Relational databases into staging area.

• Migrated Mappings, Sessions, and Workflows from Development to Test and then to the UAT environment.

• Developed Mappings and Reusable Transformations to facilitate timely loading of data into a star schema.

• Developed the Informatica Mappings by usage of Aggregator, SQL overrides usage in Lookups, source filter usage in Source qualifiers, and data flow management into multiple targets using Router.

• Created Sessions, extracted data from various sources, transformed it as per requirements, and loaded it into targets.

• Used various transformations like Filter, Expression, Sequence Generator, Update Strategy, Joiner, Router and Aggregator to create robust mappings in the Informatica Power Center Designer.

• Extensively worked on Unit testing for the Informatica code using SQL Queries.

• Implemented various Performance Tuning techniques.

• Wrote SQL queries to view and validate the data loaded into the warehouse.

• Developed logic as per the requirement, created mappings and loaded data into respective targets.

• Responsible for determining the bottlenecks and fixing the bottlenecks with performance tuning.

• Used the PMCMD command to run workflows from the command-line interface (a sample invocation follows the Environment line below).

• Improved performance at the mapping and session levels.

Environment: Informatica PowerCenter 9.1, SQL Server 2014, Oracle 11g, TOAD, Putty, Control M, UNIX
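
A minimal sketch of the kind of pmcmd call used from UNIX; the integration service, domain, folder, and workflow names are placeholders, and the credentials are assumed to be supplied through environment variables rather than hard-coded:

    #!/bin/ksh
    # Start a workflow and wait for it to finish; exit non-zero if the run fails.
    pmcmd startworkflow -sv IS_DEV -d Domain_Dev \
          -u "$INFA_USER" -p "$INFA_PWD" \
          -f DWH_FOLDER -wait wf_load_claims
    if [ $? -ne 0 ]; then
        echo "wf_load_claims failed"
        exit 1
    fi

The -wait flag makes pmcmd block until the workflow completes, so the exit status can be checked by the calling scheduler or script.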


