VIJAYA BHASKAR REDDY
ETL Developer (Informatica/IICS)
***********@*****.*** | +1-770-***-**** | Houston, Texas

Summary
Twelve-plus years of IT experience in the design and development of data integrations using ETL tools (Informatica PowerCenter 10.5.2, IICS/IDMC, IDQ) and reporting tools such as Microsoft Power BI and Business Objects to support data warehouse and reporting requirements.
Five-plus years of experience building ETL mappings, mapping tasks, taskflows, processes, Intelligent Structure Models, and Service/App/Add-On Connectors using Informatica Intelligent Cloud Data and Application Integration tools (IICS/IDMC).
Eight-plus years of experience with Informatica PowerCenter/Data Quality, building ETL mappings, sessions, worklets, workflows, reusable objects, and error-handling methodologies.
Strong experience extracting and integrating data from relational sources such as Oracle, Snowflake, Teradata, and SQL Server, as well as Kafka, AWS S3, and Azure Blobs, using Informatica Intelligent Cloud Services.
Good experience developing IICS ETL mappings/processes with AWS S3, Azure Blobs, or REST APIs as the source, formatting the unstructured data extracted from the source application according to business requirements, and loading it into Snowflake and other databases.
Good experience in Unix Shell & Python Scripting.
Good experience debugging ETL and SQL code and applying optimization and performance-improvement techniques.
Good experience in building Snowflake objects like External Stages, Pipes, Streams, Tasks, Storage Integration, Dynamic tables, and Stored Procedures to extract data from Azure blob storage.
Involved in Informatica PowerCenter migration to IICS/IDMC environment and Teradata to Snowflake migration.
Involved in all phases of the SDLC: requirements gathering, analysis, effort estimation, development, testing, deployment, and go-live support.

Skills
Informatica Intelligent Cloud Data and Application Integration Services (IICS/IDMC)
Informatica PowerCenter 10.5.2/Administration/Data Quality/Master Data Management
Snowflake, AWS S3, Amazon Redshift, Azure Blobs, Teradata, Oracle, SQL Server, DB2, SAP R/3, etc.
Business Objects & Microsoft Power BI
Unix Shell Scripting & Python Scripting
Azure DevOps, Azure Data Factory, AWS & GCP
Certifications & Training
Informatica Certified Professional (IICS/IDMC).
Snowflake SnowPro Certified Professional.
Azure Data Factory and AWS training completed successfully.

Education
Bachelor of Engineering in Computer Science from the University of Madras.
Diploma in Electrical and Electronics Engineering from Sri Venkateswara University.

Project Name : Data Modernization and Architecture
Client : ConocoPhillips, USA
Company : Rapid IT Inc, GA
Role : ETL Developer/Data Architect
Duration : June 2023 to March 2024
Tools & Environment: IICS/IDMC, Informatica, Snowflake, Teradata 17.0, Oracle, SQL Server, Kafka, AWS S3, Azure Blobs, SharePoint, Unix, Python Scripting, Tidal, DevOps and Microsoft Power BI

Project Description
ConocoPhillips Company is an international energy corporation and the second-largest crude oil refiner in the United States. Business segments consist of natural gas distribution operations, retail operations, wholesale services, midstream operations, and cargo shipping. The IICS/IDMC application extracts and integrates the upstream, midstream, and downstream data and loads it into the Enterprise Data Warehouse. It provides centralized access for data analysis and visualization, optimizing operations, risk assessment, and equipment and well-production monitoring.
Roles and Responsibilities
Worked with business users/analysts on requirements gathering and business analysis.
Created technical design and ETL mapping documents based on the business requirements.
Created data models and designs to meet business needs that require sourcing data from multiple source systems.
Developed mappings, mapping tasks, taskflows, processes, and components to integrate data from source systems such as Snowflake, Teradata, Oracle, SQL Server, Azure Blobs, AWS S3, Kafka, REST APIs, and various files using IICS, and loaded the data into Snowflake, SQL Server, and file systems.
Extensively used transformations such as Expression, Joiner, Lookup, Filter, Aggregator, Hierarchy Parser/Builder, Web Service, and Intelligent Structure Model to integrate data from various source systems.
Worked with service-oriented and event-driven architectures, including Kafka topics, messaging queues, and SOAP and RESTful services.
Analyzed source data to identify data quality issues and develop strategies to address them, resulting in a 25% reduction in data errors and inconsistencies.
Implemented error-handling methodologies to capture invalid/bad records coming from source systems.
Created Snowflake external tables, Snowpipes, streams, tasks, dynamic tables, stored procedures, clones, and shares to extract real-time data from Azure Blobs and load it into the Snowflake database.
Created UNIX shell scripts/Python scripts to automate the tasks as part of the data integration.
Involved in performance tuning of existing ETL integrations and SQL queries to reduce bottlenecks, resource utilization, and job run times through pushdown optimization and other techniques.
Hands-on experience in Azure DevOps CI/CD process to migrate the code from Dev to Test/Prod.
Involved in development, unit testing, SIT, UAT, and post-go-live support.

Project Name : Enterprise Data Warehouse (EDW)
Client : ConocoPhillips, USA
Company : Accenture Services India Ltd
Role : Informatica/IICS Technical Lead
Duration : April 2017 to March 2023
Tools & Environment: IICS/IDMC, Informatica PowerCenter 10.5.3, Snowflake, Teradata 17.0, Oracle, SQL Server, AWS S3, Azure Blobs, Kafka, Linux, Python Scripting, Control-M, DevOps and Microsoft Power BI.

Project Description
ConocoPhillips Company is an international energy corporation and the second-largest crude oil refiner in the United States. Business segments consist of natural gas distribution operations, retail operations, wholesale services, midstream operations, and cargo shipping. The Informatica application helps the business connect upstream and downstream data to optimize supply and trading and to manage risk and compliance in a highly regulated industry. It provides centralized access to enterprise data. In response, the company has turned its attention to better understanding customers and consumption trends to expand market share and to create a global picture of the company's energy business, helping create business value and enhance compliance.
Role and Responsibilities
Played the role of IICS/Informatica application lead, responsible for managing tasks and deadlines for the offshore ETL team.
Involved in Business requirements gathering, creating technical design and ETL mapping documents.
Developed the IICS mappings, mapping tasks, components, REST V2/Service/App connectors, taskflows, processes to extract and integrate the data from different source systems.
Developed Informatica PowerCenter mappings, reusable components, mapplets, sessions, worklets, and workflows to extract and integrate data from different source systems.
Worked on transformations such as Source Qualifier, Lookup, Expression, Aggregator, Update Strategy, Filter, Joiner, Normalizer, and Web Service.
Implemented slowly changing dimension (SCD) methodologies for different types of data loads.
Implemented error-handling methodologies to capture invalid/bad records coming from source systems.
Designed and developed Teradata data models and table structures based on the source systems.
Created Teradata BTEQ, FastExport, MultiLoad, and FastLoad scripts for extracting data from various source systems.
Developed Teradata custom SQL queries/views for ETL and reporting tools.
Designed, developed, and supported application solutions leveraging Teradata and Teradata Tools and Utilities.
Created the Unix Shell scripts and parameter files to automate the ETL process.
Involved in optimizing ETL processes and database queries to improve performance and reduce load times.
Involved in requirements gathering, development, unit testing, UAT, and post-go-live support.

Project Name : Pfizer BTAMS (Business Technology Managed Services)
Client : Pfizer, USA
Company : Accenture Services India Ltd
Role : Senior ETL Developer
Duration : Oct 2015 to March 2017
Tools & Environment: Informatica PowerCenter 9.5.1, Oracle, Salesforce, SQL Server, UNIX Shell Scripting, Autosys and Cognos.
Project Description
Pfizer Inc. is an American global pharmaceutical corporation headquartered in New York City. It develops and produces medicines and vaccines for a wide range of medical disciplines. Medicaid, ECP, ESI, and GP are some of the applications that BTAMS supports for Pfizer. Medicaid provides health coverage to millions of Americans. This application supports Government Pricing to quickly and efficiently monitor and comply with all government-mandated pricing/reporting requirements.

Role and Responsibilities
Worked as a team lead and handled multiple applications with a team of five.
Responsible for delivering Design/Technical specification documents, data flow diagrams, design specs, Unit test documents and other relevant documentation for the Deployment.
Performed lead development, deployment, and support activities such as source-to-target mapping validations; identified, documented, and executed unit test cases/scripts; and documented test and review results. Reviewed the deployment checklist before deploying into Pre-Prod and Production environments.
Developed mappings, tasks, worklets & workflows.
Identified bottlenecks in transformations, mappings, and sessions and tuned them to improve performance.
Developed error-handling methodologies to identify data issues and prepared test cases and documents.
Provided daily and weekly status reports to management/onshore.

Project Name : KPN Operations
Client : KPN, Netherlands
Company : Accenture Services India Ltd
Role : Senior ETL Developer
Duration : Nov 2014 to Aug 2015
Tools & Environment: Informatica PowerCenter 9, Oracle, SQL Server, Salesforce, UNIX Shell Scripting, Autosys and Business Objects.
Project Description
KPN is a Dutch landline and mobile telecommunications company. It has 6.3 million fixed-line telephone customers and more than 33 million mobile subscribers in the Netherlands, Germany, Belgium, France, and Spain under different brand names. KPN provides Internet services, business network services, and data transport throughout Western Europe. This project delivers application development and maintenance for all of KPN's applications, including Internet, Television, and Fixed & Mobile services.

Role and Responsibilities
Experience installing, configuring, and administering Informatica MDM Hub, Process Server, and ActiveVOS 10.1.
Experienced in creating, configuring, and registering (administering) system and operational ORSs for the MDM Hub and Cleanse/Process Servers.
Involved in gathering business requirements, converting them into technical designs, and providing estimates.
Document and review requirements, technical specifications, and customer expectations.
Developed the stage, load, match, and merge processes.
Developed trust and validation rules, data cleansing configurations, and user exits.
Good knowledge of IDD functionality and the Services Integration Framework.

Project Name : Vodafone Data Warehouse Implementation
Client : Vodafone, India
Company : HP Global Soft. PVT. Ltd
Role : Senior ETL Developer
Duration : Sep 2012 to Sep 2014
Tools & Environment: Informatica PowerCenter, Oracle, UNIX, Autosys and Business Objects.

Project Description
Vodafone India is one of the largest telecommunications companies in India. The mobile business is a highly competitive environment; succeeding in such a market requires developing strategies that differentiate the services, products, and market approach. HP introduced a data warehouse system as a solution to the challenges Vodafone faced. The data warehouse is a centralized, secure solution that provides a high level of data accuracy, supports understanding of market strategy, helps in decision making, improves customer solutions, and increases revenue and performance.

Role and Responsibilities
Developed technical and functional specifications for data acquisition, transformation, and load processes from different source systems.
Developed mappings, tasks, worklets & workflows.
Identified data issues in the staging database/flat files and, after cleanup, sent the data to the targets.
Identified bottlenecks in transformations, mappings, and sessions and tuned them to improve performance.
Developed error-handling methodologies to identify data issues and prepared test cases.

Project Name : HDFC Bank Enterprise Data Warehouse
Client : HDFC Bank, India
Company : HP Global Soft. PVT. Ltd
Role : Senior ETL Developer
Duration : April 2011 to Aug 2012
Tools & Environment: Informatica PowerCenter, Oracle, UNIX, Control M and Business Objects.

Project Description
HDFC Bank offers a wide range of commercial and transactional banking services and treasury products to wholesale and retail customers. HDFC Bank faced severe performance issues in providing tactical information to its customers. It also anticipated huge data-volume growth; the existing DWH was not flexible or scalable enough to support the bank's anticipated needs, and the bank received ad hoc requests for data not covered by its present EDW solution. To resolve these issues, HP provided a data warehousing solution to build a scalable, high-performance Enterprise Data Warehouse for HDFC Bank that replicates the as-is functionality of the bank's existing data warehouse.

Role and Responsibilities
Involved in all phases of SDLC from requirement gathering, design, development, testing, Production, user training and support for production environment.
Created new mapping designs using various tools in Informatica Designer, such as Source Analyzer, Warehouse Designer, Mapplet Designer, and Mapping Designer, as per the business requirements.
Created complex mappings that implemented business logic to load data into the staging area.
Used Informatica reusability at various levels of development.
Performed data manipulations using Informatica transformations such as Filter, Expression, Lookup, Aggregator, Update Strategy, Normalizer, Joiner, Router, Sorter, and Union.
Extracted data from Oracle and SQL Server then used Teradata for data warehousing.
Implemented slowly changing dimension methodology for accessing the full history of accounts.
Performed performance tuning at the source, target, mapping, and session levels.

Project Name : Amgen HealthCare
Client : Amgen, USA
Company : HP Global Soft. PVT. Ltd
Role : Informatica Administrator
Duration : July 2009 to March 2011
Tools & Environment: Informatica PowerCenter, Oracle, UNIX

Project Description
Amgen is a leading human therapeutics company in the biotechnology industry. For more than 25 years, the company has tapped the power of scientific discovery and innovation to advance the practice of medicine. Amgen pioneered the development of novel products based on advances in recombinant DNA and molecular biology and launched the biotechnology industry's first blockbuster medicines. Today, as a Fortune 500 company serving millions of patients, Amgen continues to be an entrepreneurial, science-driven enterprise dedicated to helping people fight serious illness.

Role and Responsibilities
Responsible for Installation & configuration of Informatica PowerCenter 8.6 (Client, Repository Service & Integration Service) on UNIX machine.
Configured Master Node and Backup nodes under High Availability environment.
Created and maintained Users, Groups, Roles and Privileges.
Repository Service Backup, Recovery and Migration between Dev., UAT and Prod environments.
Migration of changes, enhancements, upgrades, etc. through various environments.
Used pmrep and pmcmd to automate tasks such as code backups and triggering jobs.
Migration of mappings, sessions, and workflows from Dev to UAT and PROD environments.

Project Name : Royal Bank of Scotland Loan IQ System
Client : Royal Bank of Scotland, U.K.
Company : Satyam Computer Services Ltd.
Role : Software Developer
Duration : April 2008 to June 2009
Tools & Environment: Informatica PowerCenter, SQL Server, UNIX.

Project Description
Loan IQ is a comprehensive tool that covers the entire processing life cycle of a loan, from origination and deal tracking to administration and record maintenance. It reduces the need for manual workarounds in loan processing and provides strong control over the transaction flow. It enables the business to take on transactions with more participants and to increase the volume and complexity of deals.
Role and Responsibilities
Involved in all phases of SDLC from requirement gathering, design, development, testing, Production, user training and support for production environment.
Involved in the project from the initial stages, from design through data validation, development, and testing.
Performed data validation and cleansing before loading the data.
Preparation of test cases for unit testing.
Ensured the delivered ETL code runs and conforms to specifications and design guidelines.

Project Name : Honda Motors Business Intelligence Dashboard
Client : Honda Motors, Japan
Company : Satyam Computer Services Ltd.
Role : Software Developer
Duration : May 2007 to March 2008
Tools & Environment: Informatica PowerCenter, Oracle, UNIX, Control M and Business Objects.

Project Description
The main objective of the project was to create dashboards for the Honda management team in Japan, through which senior management could analyze car sales, parts sales, and the service efficiency of different dealers across China, Vietnam, Korea, and Japan.

Role and Responsibilities
Responsible for Installation & configuration of Informatica PowerCenter 8.6 (Client, Repository Service & Integration Service) on a UNIX machine.
Involved in gathering, analyzing, and documenting business requirements, functional requirements and data specifications for Business Objects Universes and Reports.
Designed, developed, and managed Universes in Business Objects Designer to generate payroll reports for sales users and marketing reports for marketing group users.
Created Universes by defining connections and retrieving data from the database.
Generated various reports on a daily, weekly, monthly, and yearly basis, such as Gross Margin, Account Query, Aging, Balance Sheet, Income Statement, Bad Debt, and Payment reports.