Sireesha Padala
Informatica IICS Expert
PROFESSIONAL SUMMARY
•12 years of hands-on ETL experience in the development, maintenance, and enhancement of data warehouses using Informatica IICS/PowerCenter.
•6+ years of leadership experience, including the management and coordination of an Informatica team.
•Experience working closely with Business Analysts and Product Owners at the client location in Helsinki, Finland.
•Created High-Level, Low-Level, and Design-Level documents for development.
•8+ years of experience working with complex SQL queries for generating reports.
•Proficient across the Software Development Life Cycle (SDLC): requirement analysis and definition, design, coding, testing, and implementation.
•Worked with Control-M to create and schedule batches; also created calendars and daily/monthly/quarterly schedules for critical batches.
•Created Linked Services for both source and destination servers.
•Moved data from on-premises Oracle to Snowflake using pipelines and data flows.
•Created Mapping Tasks and Taskflows in the IICS interface.
•Implemented complex logic in mapping tasks and reduced job execution times.
•Created Synchronization Tasks and Replication Tasks to copy data into lower environments for code testing.
•Created and automated workflows with the help of triggers.
•Developed complex BTEQ scripts in Teradata.
•Developed MLOAD, TPUMP, FLOAD, and TPT scripts.
•Worked on complex mappings including CDC (Change Data Capture), XML Parser, and large-volume data handling techniques.
•Excellent team player and effective communicator with strong analytical, leadership, and interpersonal skills, and the ability to work effectively in fast-paced environments.
•Production support experience identifying the root cause of failures and providing temporary fixes so schedules finish on time and reports are delivered to the business.
•Performed data quality analysis, reported data quality issues, and proposed solutions for data quality management.
•Implemented pushdown optimization (PDO) techniques as part of performance tuning of mapping execution.
•Actively worked on weekend and monthly maintenance activities.
•Created mapping documents and solution documents, and raised service requests, incidents, and problem tickets to obtain approvals for solution implementation.
•Proactively followed up on source files; tracked month-end loads and handled the Edward loads.
•Developed Python scripts for file-watcher functionality and tested them end to end in unit-test and test environments (an illustrative sketch follows this list).
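A minimal sketch of the file-watcher pattern referenced in the last bullet above, written in Python. The landing-zone path, polling interval, and downstream hand-off command are hypothetical placeholders, not the actual production script.

import glob
import subprocess
import time

WATCH_PATTERN = "/data/inbound/sales_*.csv"   # hypothetical landing-zone pattern
POLL_SECONDS = 60                             # how often to re-check the directory
TIMEOUT_SECONDS = 4 * 60 * 60                 # stop waiting after 4 hours

def wait_for_file(pattern, poll, timeout):
    """Poll the landing zone until a matching source file arrives or the wait times out."""
    waited = 0
    while waited < timeout:
        matches = glob.glob(pattern)
        if matches:
            return matches
        time.sleep(poll)
        waited += poll
    raise TimeoutError(f"No file matching {pattern} arrived within {timeout} seconds")

if __name__ == "__main__":
    files = wait_for_file(WATCH_PATTERN, POLL_SECONDS, TIMEOUT_SECONDS)
    print(f"Found source files: {files}")
    # Hypothetical hand-off: signal the downstream load once the files have landed.
    subprocess.run(["echo", "trigger-downstream-load"], check=True)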
Technical Skills:
PRIMARY SKILL – ETL tools: Informatica PowerCenter/IDMC, Ab Initio; Teradata, DB2, SQL
SECONDARY SKILL – Unix Scripting, Scala, Snowflake, Python
Cloud Services – AWS, Azure, GCP
Visualization Tools – Power BI, Tableau, BO
Scheduling Tools – WLM, Control-M, Autosys, CA Workload Automation
Ticket Tracking Tools – Jira, ServiceNow
Delivery Methodologies – Agile Methodologies (Scrum/SAFe, Kanban)
Awards and Recognition:
•Client Appreciations for smooth Migration of project into Cloud – Sep 2023
•Certificate of Completion for Data Analytics issued by LinkedIn.
•Achieved a Badge on completion of Generative AI Essentials Course.
•Achieved a Snowflake Badge for completing "Hands on Essentials - Data warehouse" issued by Snowflake.
•Awarded the “Significant Delivery Award” for delivering a project with zero post-production defects at Accenture.
•Client appreciation emails for reconciliation and delivering reports per business requirements within a short span of time for the Residential and SME LOBs.
•Topped the batch in engineering with an 8.22 CGPA.
PROFESSIONAL WORK EXPERIENCE
Role: Senior ETL Developer
Client: GAIG, OH Aug 2024 – Current
Responsibilities:
•Analysis of requirements
•Creating the Design documents (HLD, LLD)
•Created Linked Services for both source and destination servers.
•Performance tuning of SQL queries and stored procedures using SQL Profiler and Index Tuning Wizard.
•Troubleshot data issues and validation issues.
•Created Mapping Tasks and Taskflows in the IICS interface.
•Implemented complex logic in mapping tasks and reduced job execution times.
•Created Synchronization Tasks and Replication Tasks to copy data into lower environments for code testing.
•Developed Python scripts and SQL for data extraction, transformation, and aggregation from multiple file formats to analyze the data and uncover insights into customer usage patterns.
•Used the Snowflake cloud data warehouse for data integration from numerous source systems, including importing nested JSON data into Snowflake tables (an illustrative sketch follows this section).
•Testing (Unit Testing, Integration Testing, Regression Testing, System testing, User Acceptance Testing (UAT))
•Provided support and fixed defects raised during integration and UAT.
•Developed tasks using APIs as sources and implemented transformations to load the target tables.
•Reconciliation between the existing system and the newly developed code.
•Unix scripting for data validation of source files.
Environment: Informatica IDMC, Snowflake, UC4
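A minimal sketch of the nested-JSON-to-Snowflake pattern mentioned in the responsibilities above, assuming the snowflake-connector-python package; the connection parameters, named stage, and table/column names are hypothetical placeholders.

import snowflake.connector  # assumes the snowflake-connector-python package is installed

# Connection parameters are hypothetical placeholders.
conn = snowflake.connector.connect(
    account="my_account", user="etl_user", password="***",
    warehouse="ETL_WH", database="EDW", schema="STAGE",
)
cur = conn.cursor()
try:
    # Land the raw JSON into a single VARIANT column, then flatten the nested array.
    cur.execute("CREATE TABLE IF NOT EXISTS RAW_POLICY (SRC VARIANT)")
    cur.execute("""
        COPY INTO RAW_POLICY
        FROM @POLICY_STAGE/policy.json      -- hypothetical named stage
        FILE_FORMAT = (TYPE = 'JSON')
    """)
    cur.execute("""
        INSERT INTO STG_COVERAGE (POLICY_ID, COVERAGE_CODE, LIMIT_AMT)
        SELECT r.SRC:policyId::STRING,
               c.value:code::STRING,
               c.value:limit::NUMBER
        FROM RAW_POLICY r,
             LATERAL FLATTEN(input => r.SRC:coverages) c
    """)
finally:
    cur.close()
    conn.close()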
Role: Sr. Consultant
Client: EuroClear, Finland Oct 2021 – July 2024
Responsibilities:
•Collaborate with data analysts, engineers, and stakeholders to understand data requirements and design efficient data models.
•Created, tested, and maintained new connectors and drivers.
•Supported monthly/bi-weekly releases of new IDMC features, ensuring existing code was not affected.
•Implemented a framework to detect server issues or IICS downtime and notify the affected email distribution list.
•Provided operations and support activities related to connectors, drivers, Secure Agent issues, etc.
•User access management, including creating access-level permissions, groups, and SAML configuration.
•Worked on GitHub activities such as access management, user creation, repository management, and onboarding.
•Data analysis for the new systems.
•Loaded data into BigQuery from Google Cloud Storage using tools such as the BigQuery Data Transfer Service (an illustrative sketch follows this list).
•Created Linked Services for both source and destination servers.
•Moved data from on-premises Oracle to Snowflake using pipelines and data flows.
•Created and automated workflows using triggers.
•Performance tuning of SQL queries and stored procedures using SQL Profiler and Index Tuning Wizard.
•Troubleshot data issues and validation issues.
•Developed Python scripts and SQL for data extraction, transformation, and aggregation from multiple file formats to analyze the data and uncover insights into customer usage patterns.
•Migrated the current project, consisting of 450+ mappings, to IDMC.
•Developed and tested Mapping Tasks and Taskflows end to end, and successfully migrated them to production.
•Unix scripting for data validation of source files.
•Testing (Unit Testing, Integration Testing, Regression Testing, System testing, User Acceptance Testing (UAT)).
•Strong data warehousing ETL experience using Oracle PL/SQL and Informatica PowerCenter tools (Designer, Workflow Manager, Workflow Monitor, Repository Manager).
•Worked with Control-M to create and schedule batches; also created calendars and daily/monthly/quarterly schedules for critical batches.
•Strong knowledge of implementing SCD Type 1/Type 2/Type 3/Type 6 (an illustrative Type 2 sketch follows this section).
•Designed and developed transformation logic using Source Qualifier, Expression, Filter, Joiner, Lookup, Router, Union, Aggregator, and Update Strategy transformations.
•Good knowledge of implementing mapplets, partitions, parameters, and variables.
•Unix Scripting for Source File Validation.
•Unix scripts for data manipulation: parsing logs, extracting and formatting data, and performing bulk operations on files.
•Working experience with operating systems including Windows, Ubuntu, and Unix.
•Actively worked on weekend and monthly maintenance activities.
•Performed the on-call support role: when a production job failed, took the call and fixed the issue within 4 hours of the failure.
•Created mapping documents and solution documents, and raised service requests, incidents, and problem tickets to obtain approvals for solution implementation.
•Used the Power BI visualization tool for data interpretation.
•Wrote complex SQL queries ingested into Power BI for generating reports.
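A minimal sketch of the GCS-to-BigQuery load referenced in the list above, using the google-cloud-bigquery client library; the project, bucket path, and table names are hypothetical placeholders.

from google.cloud import bigquery  # assumes the google-cloud-bigquery package is installed

client = bigquery.Client()  # uses application-default credentials

# Hypothetical GCS path and destination table.
gcs_uri = "gs://edw-landing/transactions/2024-06/*.json"
table_id = "my-project.edw_stage.transactions_raw"

job_config = bigquery.LoadJobConfig(
    source_format=bigquery.SourceFormat.NEWLINE_DELIMITED_JSON,
    autodetect=True,                                           # infer the schema from the files
    write_disposition=bigquery.WriteDisposition.WRITE_APPEND,  # append to the staging table
)

# Start the load job and block until it finishes.
load_job = client.load_table_from_uri(gcs_uri, table_id, job_config=job_config)
load_job.result()
print(f"Loaded {client.get_table(table_id).num_rows} rows into {table_id}")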
Environment Used:
ETL Tools: Informatica PowerCenter, Azure Data Factory
Database: Snowflake, Oracle
Scheduling Tools: Control-M
Scripting Language: Python Scripting
Reporting Tool: PowerBI
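A minimal SCD Type 2 sketch in Python/pandas illustrating the versioning pattern referenced in the responsibilities above (expire the changed current row, insert a new current version); the customer dimension columns and the load date are hypothetical placeholders, not the actual Informatica implementation.

import pandas as pd

TRACKED = ["ADDRESS", "SEGMENT"]        # attributes whose changes create a new version
LOAD_DATE = pd.Timestamp("2024-06-30")  # hypothetical load date
HIGH_DATE = pd.Timestamp("9999-12-31")

def scd_type2(dim: pd.DataFrame, incoming: pd.DataFrame) -> pd.DataFrame:
    """Expire superseded current rows and append new versions (SCD Type 2)."""
    current = dim[dim["CURRENT_FLAG"] == "Y"]
    merged = incoming.merge(current, on="CUST_ID", how="left",
                            suffixes=("", "_old"), indicator=True)

    # A new version is needed for brand-new customers and for matched rows
    # where any tracked attribute differs from the current dimension row.
    matched = merged["_merge"] == "both"
    diffs = pd.Series(False, index=merged.index)
    for col in TRACKED:
        diffs |= merged[col] != merged[f"{col}_old"]
    changed = matched & diffs
    is_new = merged["_merge"] == "left_only"

    to_insert = merged.loc[changed | is_new, ["CUST_ID"] + TRACKED].copy()
    to_insert["EFF_DATE"] = LOAD_DATE
    to_insert["END_DATE"] = HIGH_DATE
    to_insert["CURRENT_FLAG"] = "Y"

    # Close out the superseded current rows.
    expire_keys = merged.loc[changed, "CUST_ID"]
    dim = dim.copy()
    mask = dim["CUST_ID"].isin(expire_keys) & (dim["CURRENT_FLAG"] == "Y")
    dim.loc[mask, ["END_DATE", "CURRENT_FLAG"]] = [LOAD_DATE, "N"]

    return pd.concat([dim, to_insert], ignore_index=True)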
Role: Technical Lead
Client: Chubb Insurance Aug 2018 - Oct 2021
Responsibilities:
•Analysis of requirements
•Creating the Design documents (HLD, LLD)
•Created Linked Services for both source and destination servers.
•Performance tuning of SQL queries and stored procedures using SQL Profiler and Index Tuning Wizard.
•Troubleshot data issues and validation issues.
•Created Mapping Tasks and Taskflows in the IICS interface.
•Implemented complex logic in mapping tasks and reduced job execution times.
•Created Synchronization Tasks and Replication Tasks to copy data into lower environments for code testing.
•Developed Python scripts and SQL for data extraction, transformation, and aggregation from multiple file formats to analyze the data and uncover insights into customer usage patterns.
•Developed mappings, mapplets, reusable transformations, sessions, and workflows.
•Used the Snowflake cloud data warehouse and AWS S3 buckets for data integration from numerous source systems, including importing nested JSON data into Snowflake tables.
•Testing (Unit Testing, Integration Testing, Regression Testing, System testing, User Acceptance Testing (UAT))
•Provided support and fixed defects raised during integration and UAT.
•Developed tasks using APIs as sources and implemented transformations to load the target tables (an illustrative sketch follows this section).
•Reconciliation between the existing system and the newly developed code.
•Unix scripting for data validation of source files.
•Used the QlikView visualization tool for data interpretation.
Environment: Informatica PowerCenter/IDMC, SQL, Control-M, Python Scripting
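A minimal sketch of pulling data from a REST API source and staging it for the ETL load, as referenced in the bullets above, using Python's requests library; the endpoint, pagination scheme, and field names are hypothetical placeholders rather than the actual client API.

import csv
import requests  # assumes the requests package is installed

# Hypothetical REST endpoint and local staging file.
BASE_URL = "https://api.example.com/v1/claims"
OUTPUT_FILE = "claims_stage.csv"

def extract_claims():
    """Page through the API and stage the records as a flat CSV for the downstream load."""
    page = 1
    with open(OUTPUT_FILE, "w", newline="") as f:
        writer = csv.DictWriter(f, fieldnames=["claim_id", "policy_id", "amount", "status"])
        writer.writeheader()
        while True:
            resp = requests.get(BASE_URL, params={"page": page, "page_size": 500}, timeout=60)
            resp.raise_for_status()
            records = resp.json().get("results", [])
            if not records:
                break
            for rec in records:
                writer.writerow({k: rec.get(k) for k in writer.fieldnames})
            page += 1

if __name__ == "__main__":
    extract_claims()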
ROLE: Senior ETL Developer
Client: Anthem Insurance Feb 2016 – Aug 2018
Responsibilities:
•Coordinated with the onsite team and provided seamless support, analyzing issues, providing solutions, and handling change requests.
•Developed complex BTEQ scripts in Teradata (an illustrative sketch follows this section).
•Developed MLOAD, TPUMP, and FLOAD scripts.
•Monitored 10k+ jobs in the Control-M (CTM) scheduling tool.
•Actively worked on weekend and monthly maintenance activities.
•Proactively followed up on source files; tracked month-end loads and handled the Edward loads.
•Developed Unix scripts for file-watcher functionality and tested them end to end in unit-test and test environments.
Environment: Informatica PowerCenter 9.6, Teradata, WLM, SNOW, Control-M.
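A minimal sketch of wrapping a Teradata BTEQ load in Python, illustrating the scripting mentioned above; it assumes the Teradata BTEQ client is installed on the host, and the logon string, databases, and table names are hypothetical placeholders (real credentials would come from a secured logon file).

import subprocess
import tempfile

# Hypothetical BTEQ script: truncate-and-load a staging table from a landing table.
BTEQ_SCRIPT = """
.LOGON tdprod/etl_user,placeholder_password;

DELETE FROM STG.DAILY_CLAIMS;

INSERT INTO STG.DAILY_CLAIMS
SELECT claim_id, policy_id, claim_amt, load_dt
FROM   LND.DAILY_CLAIMS
WHERE  load_dt = CURRENT_DATE;

.IF ERRORCODE <> 0 THEN .QUIT 8;
.LOGOFF;
.QUIT 0;
"""

def run_bteq(script_text):
    """Write the BTEQ script to a temporary file and execute it with the bteq client."""
    with tempfile.NamedTemporaryFile("w", suffix=".btq", delete=False) as f:
        f.write(script_text)
        path = f.name
    with open(path) as script:
        result = subprocess.run(["bteq"], stdin=script, capture_output=True, text=True)
    print(result.stdout)
    return result.returncode  # non-zero return code means the load failed

if __name__ == "__main__":
    raise SystemExit(run_bteq(BTEQ_SCRIPT))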
Role: Associate Engineer
Client: Liberty Mutual Jun 2013 – Jan 2016
Responsibilities:
•Developed mappings, mapplets, reusable transformations, sessions, and workflows.
•Testing (Unit Testing, Integration Testing, Regression Testing, System testing, User Acceptance Testing (UAT))
•Provided support and fixed defects raised during integration and UAT.
•Created status reports.
Technologies used: Informatica PowerCenter, Teradata, DB2, Unix Scripting, CA Workload Manager
ACADEMICS
Bachelor's degree in engineering from Andhra University, 2013.