Priyanka
********.********@*****.*** +1-737-***-**** Austin, TX
Professional Summary
●10+ years of experience designing and developing end-to-end ETL solutions and analytics using Informatica PowerCenter, IICS, IDQ, IDMC, and Snowflake.
●Expertise in building reusable, scalable, and performance-optimized data pipelines that extract data from diverse source systems and load it into staging areas, enterprise data warehouses such as Snowflake, and data marts to support analytics and downstream reporting needs.
●Experienced in extracting, transforming, and loading data from a wide range of sources, including Oracle, SQL Server, flat files, XML, and JSON, into Data Warehouses and Data Marts using Informatica.
●Experienced in Data Warehousing, Data Migration, Data Profiling, and Data Cleansing.
●Experienced in creating Informatica Cloud (IICS) Mappings, Mapplets, Taskflows, and data integration tasks, including Data Synchronization, Data Transfer, and Data Replication, to support end-to-end ETL workflows.
●Expertise in creating mappings using transformations such as Joiner, Lookup, Filter, Router, Sequence Generator, Hierarchy Parser, Hierarchy Builder, Rank, Aggregator, Update Strategy, Transaction Control, and Expression.
●Performed data profiling to assess data quality, identify anomalies, and ensure datasets were accurate and analysis-ready.
●Experienced in building reusable Informatica Cloud Data Quality (CDQ) assets such as Rule Specifications, Verifiers, Deduplication rules, Cleansing and Parsing logic, Dictionaries, Labelers, and Exception handling components to improve data accuracy and consistency.
●Experienced in loading fact and dimension tables using Star schema and Snowflake schema designs, implementing Slowly Changing Dimension (SCD) strategies, and optimizing ETL workflows in Informatica.
●Optimized ETL pipelines by applying partitioning, pushdown optimization, lookup caching, and pipeline parallelism, significantly improving throughput and reducing end-to-end load times.
●Experienced working with PII/PHI data and following Health Insurance Portability and Accountability Act (HIPAA) compliance standards in Agile environments.
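The SCD loading strategy mentioned above follows the standard Type 2 pattern: expire the current dimension row when a tracked attribute changes and insert a new current version. A minimal sketch, with all field names (cust_id, city, effective dates) purely illustrative and not tied to any actual schema:

```python
from datetime import date

def scd2_merge(dimension, incoming, key, tracked, today=None):
    """Minimal Slowly Changing Dimension Type 2 merge (sketch).

    dimension: list of dicts carrying effective_from/effective_to/is_current
    incoming:  list of source dicts identified by `key`
    tracked:   attributes whose change expires the current row
    """
    today = today or date.today()
    current = {row[key]: row for row in dimension if row["is_current"]}
    for rec in incoming:
        old = current.get(rec[key])
        if old is None or any(old[c] != rec[c] for c in tracked):
            if old is not None:
                # Expire the superseded version in place.
                old["effective_to"] = today
                old["is_current"] = False
            # Insert the new current version of the row.
            dimension.append({**rec,
                              "effective_from": today,
                              "effective_to": None,
                              "is_current": True})
    return dimension
```

In Informatica this logic would typically live in an Update Strategy transformation feeding insert and update branches; the sketch only captures the row-versioning rule itself.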
Tools & Technologies
●ETL: Informatica PowerCenter (9.x, 10.x), IICS, IDQ, IDMC
●Databases: Oracle, SQL Server, MySQL, Netezza, Informix
●Cloud: AWS (S3, Glue, Glue Crawler, Athena, RDS, DMS), OpenSearch
●DevOps & Scripting: Docker, UNIX/Linux, UNIX Shell Scripting (SFTP/FTP automation), GitHub, Bitbucket, Jira, PuTTY, WinSCP
●Enterprise Apps: Guidewire PolicyCenter, AccountCenter
●Programming: SQL (advanced), Java (Basic)
●Security/Compliance: Best practices for information systems handling sensitive data (PII/PHI, HIPAA)
●Other Tools: Oracle SQL Developer, MySQL Workbench, Aginity, SoapUI, GitHub Copilot, LLMs, Ollama (LLaMA 3)
Professional Experience
Senior ETL Developer
Zensar Technologies Jun 2021 – Dec 2025
●Worked on end-to-end ETL development for Brotherhood Mutual Insurance, including requirement analysis, technical design, development, testing, and deployment.
●Analyzed application architecture and data flows for large-scale insurance data migration initiatives.
●Reviewed requirements from Confluence documentation to ensure accurate ETL implementation.
●Created Cloud Data Profiling tasks to identify data anomalies and shared findings with business teams for timely data corrections.
●Developed and implemented Cloud Data Quality (CDQ) assets in Informatica Intelligent Cloud Services (IICS) to cleanse, validate, and standardize incoming data, ensuring high-quality data for downstream ETL processes.
●Created Cleanse assets to handle null values, remove extra spaces, replace values, and standardize formats, improving overall data consistency and reliability.
●Built and configured deduplication assets to identify and eliminate duplicate records, enhancing data accuracy across systems.
●Created and maintained dictionary assets to assign reference values to standardized incoming data, ensuring data completeness and uniformity.
●Collaborated with ETL and business teams to integrate CDQ processes seamlessly into data pipelines, improving data quality before loading into target systems.
●Developed IICS Data Integration Mappings, Mapplets, Mapping tasks, and Taskflows to load data from SQL Server into staging and target tables and then into the Enterprise Data Warehouse.
●Created mappings and mapplets to implement complex business logic using transformations such as Expression, Lookup, Joiner, Filter, Router, Aggregator, Rank, Sequence Generator, Hierarchy Builder, and Hierarchy Parser.
●Implemented parameterization and macros to build reusable and dynamic mappings.
●Migrated insurance policies from SQL Server entities to Guidewire PolicyCenter UI using batch migration tools; validated migrated policy data across PolicyCenter, AccountCenter, and source systems to ensure data accuracy and completeness.
●Deployed ETL and migration code through Bitbucket and supported environment refresh activities using SoapUI; executed Guidewire PolicyCenter deployments and supported database schema changes, configuration updates, and environment refresh activities.
●Validated Guidewire Gosu code to ensure migrated policies met business rule conditions.
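The cleansing and deduplication work described above (null handling, whitespace trimming, default substitution, duplicate elimination) can be sketched outside of CDQ as plain record processing; field names and defaults here are hypothetical examples, not the actual rules used:

```python
def cleanse(record, defaults):
    """Trim whitespace and replace empty/null values with configured defaults."""
    out = {}
    for field, value in record.items():
        if isinstance(value, str):
            value = value.strip()
        if value in ("", None):
            value = defaults.get(field)
        out[field] = value
    return out

def deduplicate(records, match_fields):
    """Keep the first record per match key (simplified exact-match dedup)."""
    seen, unique = set(), []
    for rec in records:
        key = tuple(rec.get(f) for f in match_fields)
        if key not in seen:
            seen.add(key)
            unique.append(rec)
    return unique
```

In CDQ the equivalent rules are configured declaratively (Cleanse and Deduplicate assets rather than code); the sketch is only meant to show the transformation semantics.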
Data Specialist - ETL Informatica
IBM India Pvt. Ltd. Jun 2014 – Jun 2021
●Led ETL development and support for Blue Cross Blue Shield of Massachusetts and Royal Bank of Scotland (FATCA Reporting), including design and implementation of enterprise ETL solutions for healthcare and regulatory reporting.
●Worked closely with business users to gather requirements, create technical specifications, and design ETL workflows; prepared timelines and estimates and obtained business approval.
●Worked with the business to create High-Level Design (HLD) and Low-Level Design (LLD) documents.
●Coordinated with the Admin team to set up folders, create user logins, and assign appropriate access privileges.
●Created complex mappings, reusable Mapplets using transformations like Expression, Aggregator, Lookup, Joiner, Filter, Router, Rank, Update Strategy, Transaction Control, Sequence Generator.
●Created performance optimized mappings and workflows using Pushdown optimization, Partitioning and Error Handling.
●Created Sessions and Workflows for orchestration and scheduled the workflows on daily, weekly, and monthly cadences as per business requirements.
●Created migration documents to promote code to higher environments such as Test, UAT, Stage, and Production.
●Prepared unit test cases, data validation documents, and migration documents; debugged and resolved ETL issues during development, testing, and production support phases.
●Developed UNIX shell scripts for automated file transfers using SFTP/FTP.
●Executed Informatica PowerCenter migration projects (9.0.1 to 10.x) and Oracle database migration (11g to 12c), coordinating with Informatica Admin teams to migrate and validate ETL objects.
●Implemented performance tuning and contributed to IT-wide process improvements (code reviews, unit test documentation, knowledge sharing).
●Collaborated with Mainframe, Java, Informatica Admin, and Oracle DBA teams to deliver integrated solutions.
●Served as designated owner for critical ETL deliverables, ensuring accountability, quality, and timely delivery.
●Mentored new team members on Informatica tools, workflows, and enterprise data models.
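The SFTP/FTP automation mentioned above is commonly driven by a generated batch file passed to the `sftp` client. A minimal sketch of that pattern; host, user, and paths are placeholders, and the real jobs ran as UNIX shell scripts under scheduled execution:

```python
import tempfile

def build_sftp_transfer(host, user, local_file, remote_dir):
    """Build an sftp batch file and command line for an automated file drop.

    Returns (command_args, batch_text); the caller executes the command
    (e.g. via subprocess.run) under key-based authentication.
    """
    batch_text = f"cd {remote_dir}\nput {local_file}\nbye\n"
    batch = tempfile.NamedTemporaryFile("w", suffix=".sftp", delete=False)
    batch.write(batch_text)
    batch.close()
    # -b runs sftp non-interactively from the batch file.
    cmd = ["sftp", "-b", batch.name, f"{user}@{host}"]
    return cmd, batch_text
```

Batch mode (`sftp -b`) aborts on the first failed command, which makes it suitable for unattended scheduled transfers where a partial upload should fail loudly.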
Education
Bachelor of Engineering in Information Technology JNTU University – 2012