
Ramana Boddu
Staff Engineer – Data Engineering
Location: Hyderabad, Telangana, India
Email: ad2wg3@r.postjobfree.com
Contact: +91-998*******

Experience Summary:

8.8 years of experience in data warehousing and cloud, working as a Data Engineer (Informatica PowerCenter / GCP cloud); currently working as Staff Engineer – Data Engineering.

Expertise in Informatica PowerCenter, SQL (Oracle, DB2), and shell scripting.

Good experience in UNIX shell scripting (created multiple automation scripts that reduced testing time for ETL testers) and scheduling tools such as Control-M.

Experience with Hive/Impala.

Extensive knowledge of the banking, insurance, and payments domains.

Experience in creating transformations and mappings using Informatica Designer, and in building processing tasks using Workflow Manager to move data from multiple sources into targets.

Other activities include design and creation of workflows, sessions, and other reusable components; monitoring daily, weekly, monthly, and quarterly production jobs; resolving failures and data issues; performing data validations; and generating reports for clients.

Maintained and enhanced mappings as requirements changed, and guided team members in implementing the changes.

Knowledge of various quality practices within a project, such as technical documentation, unit test documentation, code review tools, and database comparator tools for testing.

Performance improvement advocate with expertise in partitioning, concurrent workflows, persistent caches, and converting UNIX scripts to Informatica mappings.

Implemented the Slowly Changing Dimension (SCD) Type 2 methodology for retaining the full history of account and transaction information (a representative SQL sketch follows this summary).

Implemented SCD Type 1.

Good experience with Agile methodology: facilitated daily Scrum and sprint activities, communicated evolving requirements and plans across the team, and coached team members on delivering results.

Good Knowledge of Software Development Life Cycle (SDLC).

Knowledge of AWS/GCP cloud tools and CI/CD pipelines.
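
The SCD Type 2 pattern referenced above, as a minimal Oracle SQL sketch. The STG_ACCOUNT (daily feed) and DIM_ACCOUNT (history-keeping dimension) tables and their columns are hypothetical names used only for illustration; in the projects themselves this logic was built as Informatica mappings (a Lookup on the dimension plus Router/Update Strategy transformations), not hand-written SQL.

-- Step 1: expire the current dimension row when a tracked attribute has changed.
UPDATE dim_account d
   SET d.eff_end_dt   = TRUNC(SYSDATE) - 1,
       d.current_flag = 'N'
 WHERE d.current_flag = 'Y'
   AND EXISTS (SELECT 1
                 FROM stg_account s
                WHERE s.account_id = d.account_id
                  AND (s.status <> d.status OR s.branch_cd <> d.branch_cd));

-- Step 2: insert a fresh "current" row for new accounts and for accounts just expired above.
INSERT INTO dim_account (account_id, status, branch_cd, eff_start_dt, eff_end_dt, current_flag)
SELECT s.account_id, s.status, s.branch_cd, TRUNC(SYSDATE), DATE '9999-12-31', 'Y'
  FROM stg_account s
 WHERE NOT EXISTS (SELECT 1
                     FROM dim_account d
                    WHERE d.account_id   = s.account_id
                      AND d.current_flag = 'Y');

-- SCD Type 1, by contrast, simply overwrites the changed attributes in place and keeps no history rows.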

Education Qualification:

Completed an MCA at JNTU, Hyderabad.

Professional Experience:

Working as a Data Engineer at Altimetrik India Pvt. Ltd. from July 2022 to date.

Worked as a Team Lead at Accenture Solutions Pvt. Ltd. from June 2015 to June 2022.

Technical Certification:

Completed the AWS Certified Solutions Architect – Associate certification.

Technical Expertise:

ETL Tools : Informatica PowerCenter

Ticketing Tools : HPSM, ServiceNow.

Databases : Oracle 10g/9i, DB2, Hive/Impala.

Scheduling Tools : Control-M.

Scripts : UNIX Shell, Python

Cloud Tools : Working knowledge of Amazon Web Services & GCP.

Project #1

Project Name : Rate-Pay PUI Billing System Integration

Client : PAYPAL

Duration : June 2022 to Dec 2023

Role : Data Engineer

Project Description:

PayPal Holdings, Inc. is an American multinational financial technology company operating an online payments system in the majority of countries that support online money transfers. It serves as an electronic alternative to traditional paper methods such as checks and money orders. The company operates as a payment processor for online vendors, auction sites, and many other commercial users, for which it charges a fee.

The goal is to generate an invoice for the VAT amount to be recovered from each merchant through an auto-debit process, collect the breakdown of un-debited details from Rate-Pay, and then generate reports to be placed in the HAWK UI, from which merchants can download the reports for internal review.

Roles & Responsibilities:

Prepared design documents and mapping documents based on the BRS.

Created complex stored procedures and Informatica mappings (using transformations such as Lookup, Router, and Union).

Extracted unpaid transaction details from the APM source and loaded them into the Kenan system through the ETL process.

Set up an account for each PUI merchant in the billing system.

Generated invoices with the VAT amount to claw back those amounts from merchants, and sent the invoices through e-mail (see the SQL sketch after this list).

Generated breakdowns of unpaid transaction details as reports and pushed the reports to the HAWK UI.

Generated journal files for revenue recognition in SAP ECC.

Prepared test plans and test cases for PL/SQL procedures and Informatica mappings.

Prepared high-level documents for the mappings, such as the ETL document and data dictionaries. Conducted a training session on Informatica for the team.

Responsible for all deliverables, reviews and functional understanding.

Achieved the goals defined for each sprint. Attended daily Scrum meetings, reporting the planned tasks for each day. Assured the Product Owner and Scrum Master that the allocated work was being performed as planned.
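
A hedged SQL sketch of the VAT invoicing step mentioned above. The table and column names (UNPAID_TXN, MERCHANT_ID, TXN_AMOUNT, VAT_RATE, DEBIT_STATUS) are assumptions for illustration; the production logic ran as stored procedures and Informatica mappings against the Kenan billing schema.

-- Hypothetical source: UNPAID_TXN, one row per un-debited transaction loaded from the APM feed.
-- Produces one invoice line per merchant for the billing cycle, with the VAT amount to be clawed back.
SELECT t.merchant_id,
       t.billing_cycle,
       COUNT(*)                                 AS txn_count,
       SUM(t.txn_amount)                        AS gross_amount,
       ROUND(SUM(t.txn_amount * t.vat_rate), 2) AS vat_amount
  FROM unpaid_txn t
 WHERE t.billing_cycle = :cycle_month
   AND t.debit_status  = 'UNPAID'
 GROUP BY t.merchant_id, t.billing_cycle;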

Project #2

Project Name : Happy Returns Kenan Integration – (OFI to Kenan billing system)

Client : PAYPAL

Duration : July 2022 to May 2023

Role : Data Engineer

Project Description:

Happy Returns offers return services for its clients in which the end customer can process a return by printing a label at home and then mailing the item back to the retailer through Happy Returns shipping accounts. The by-mail returns billing process begins with identifying labels that are billable back to Happy Returns clients.

Customers ship items back using Happy Returns shipping accounts, so the data comes through the various portals of the carriers that PayPal uses, such as FedEx, USPS, and DHL. The goal of the project is the integration of Happy Returns billing processes into PayPal's global billing and payment infrastructure. Current processing involves extracting data from different systems such as Salesforce and the pricing engine.

Roles & Responsibilities:

Created complex stored procedures and Informatica mappings (using transformations such as Lookup, Router, and Union).

Supervised the testing of ETL code to deliver quality work to clients.

Prepared test plans and test cases for PL/SQL procedures and Informatica mappings.

Prepared high-level documents for the mappings, such as the ETL document and data dictionaries. Conducted a training session on Informatica for the team.

Responsible for all deliverables, reviews and functional understanding.

Achieved the goals defined for each sprint. Attended daily Scrum meetings, reporting the planned tasks for each day. Assured the Product Owner and Scrum Master that the allocated work was being performed as planned.

Project #3

Project Name : Data and Information Strategies – (Shoppers)

Client : State Farm

Duration : Jan 2020 to June 2022

Role : Software Engineering Lead (Team Lead)

Project Description:

Shoppers captures the shopping experience of a customer within State Farm. This experience can come from the agent channel or the customer channel. The Shoppers system integrates all these different experiences and creates one single experience. This data is then used extensively by planning & analysis, marketing, and strategic resources. In essence, Shoppers integrates quote and application data from all the different channels the client has for the customer. This process currently covers the Auto and Fire lines of business.

Current processing involves extracting data from different systems, such as the Enterprise App and the Necho application, and using it to create Shoppers records.

Roles & Responsibilities:

Created complex Informatica mappings (using transformations such as Lookup, Router, and Union).

Performed requirements analysis.

Handled a team of more than five members.

Supervised the testing of ETL code to deliver quality work to clients.

Prepared test plans and test cases for Informatica mappings.

Prepared high-level documents for the mappings, such as the ETL document and data dictionaries.

Responsible for all deliverables, reviews and functional understanding.

Automated the weekly data load and validation plots, which saved the team a significant amount of time (see the SQL sketch after this list).

Achieved the goals defined for each sprint. Attended daily Scrum meetings, reporting the planned tasks for each day. Assured the Product Owner and Scrum Master that the allocated work was being performed as planned.

Ensured a clear understanding of epics and personas. Provided input to the Product Owner on the creation of user stories, and worked through the user stories with the team.
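
A hedged sketch of the kind of weekly load validation that was automated (see the bullet above). STG_SHOPPER, SHOPPER_FACT, CHANNEL, and LOAD_DT are hypothetical names used only to illustrate the reconciliation idea.

-- Hypothetical reconciliation: compare weekly row counts between staging and target by channel.
SELECT NVL(s.channel, t.channel)              AS channel,
       NVL(s.src_cnt, 0)                      AS staged_rows,
       NVL(t.tgt_cnt, 0)                      AS loaded_rows,
       NVL(s.src_cnt, 0) - NVL(t.tgt_cnt, 0)  AS diff
  FROM (SELECT channel, COUNT(*) AS src_cnt
          FROM stg_shopper
         WHERE load_dt >= TRUNC(SYSDATE) - 7
         GROUP BY channel) s
  FULL OUTER JOIN
       (SELECT channel, COUNT(*) AS tgt_cnt
          FROM shopper_fact
         WHERE load_dt >= TRUNC(SYSDATE) - 7
         GROUP BY channel) t
    ON s.channel = t.channel
 ORDER BY 1;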

Project #4

Project Name : MASS (Marketing Agency Sales Summary).

Client : State Farm.

Duration : June 2018 to Dec 2019

Role : Software Engineering Senior Analyst (SSE)

Project Description:

State Farm is one of the largest insurance companies in the US. The firm originally specialized in auto insurance for farmers and later expanded its services into other types of insurance, such as homeowners and life insurance, and into banking, financial services, and mutual funds.

Roles & Responsibilities:

Reviewed the documentation provided by the System Operations team for system analysis.

Identified, along with the team, the tables and columns required to generate the desired output, as per the source system.

Designed and developed Informatica ETL mappings/transformations for data extraction, transformation, cleansing, integration, and loading into the target/staging area.

Created batch files to transfer data files from the Informatica Windows server to the Active Data Warehouse UNIX server through the WinSCP utility.

Created load scripts to load data from data files on the UNIX server into tables in the database (see the sketch after this list).

Created, configured, scheduled, and ran sessions, worklets, and workflows.

Reviewed the completed work.
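
One hedged way to picture the load scripts mentioned above: an Oracle external table over the delimited file landed on the UNIX server, followed by an INSERT into staging. The directory object, file name, columns, and staging table are assumptions; SQL*Loader control files invoked from shell scripts would be an equally plausible implementation.

-- Hypothetical external table over the flat file landed by the batch transfer.
CREATE TABLE ext_sales_summary (
  agent_id    VARCHAR2(20),
  sales_month VARCHAR2(6),
  sales_amt   NUMBER(15,2)
)
ORGANIZATION EXTERNAL (
  TYPE ORACLE_LOADER
  DEFAULT DIRECTORY data_in_dir          -- assumed Oracle directory object on the UNIX server
  ACCESS PARAMETERS (
    RECORDS DELIMITED BY NEWLINE
    FIELDS TERMINATED BY '|'
    MISSING FIELD VALUES ARE NULL
  )
  LOCATION ('mass_sales_summary.dat')    -- assumed file name
);

-- Load the (assumed) staging table from the external table.
INSERT INTO stg_sales_summary (agent_id, sales_month, sales_amt)
SELECT agent_id, sales_month, sales_amt
  FROM ext_sales_summary;
COMMIT;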

Project #5

Project Name : Enterprise Data Warehouse (EDW).

Client : Philips

Duration : Sept 2016 to May 2018

Role : Software Engineering Analyst (SEA)

Project Description:

Philips is a multinational conglomerate headquartered in Amsterdam and one of the largest electronics companies in the world, currently focused on healthcare and lighting. It was founded in 1891. It was once one of the largest electronics conglomerates in the world and currently employs around 74,000 people across 100 countries.

Roles & Responsibilities:

Reviewed the documentation provided by the System Operations team for system analysis.

Identified, along with the team, the tables and columns required to generate the desired output, as per the source system.

Designed and developed Informatica ETL mappings/transformations for data extraction, transformation, cleansing, integration, and loading into the target/staging area.

Created batch files to transfer data files from the Informatica Windows server to the Active Data Warehouse UNIX server through the WinSCP utility.

Created load scripts to load data from data files on the UNIX server into tables in the database.

Created, configured, scheduled, and ran sessions, worklets, and workflows.

Created procedures, functions, packages, and triggers based on the requirements, and updated existing procedures, triggers, and packages (see the PL/SQL sketch after this list).

Designed and developed mappings, and performed database management.

Prepared unit test cases and performed unit testing and peer reviews at the end of each phase.

Reviewed the completed work.
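
A small hedged PL/SQL sketch in the spirit of the procedures described above. STG_ORDERS, EDW_ORDERS, and the simple upsert requirement are assumptions for illustration, not the actual EDW objects.

-- Hypothetical procedure: merge one day's staged orders into the warehouse table.
CREATE OR REPLACE PROCEDURE load_edw_orders(p_load_dt IN DATE) IS
BEGIN
  MERGE INTO edw_orders t
  USING (SELECT order_id, customer_id, order_amt, order_dt
           FROM stg_orders
          WHERE load_dt = p_load_dt) s
     ON (t.order_id = s.order_id)
   WHEN MATCHED THEN
     UPDATE SET t.customer_id = s.customer_id,
                t.order_amt   = s.order_amt,
                t.order_dt    = s.order_dt
   WHEN NOT MATCHED THEN
     INSERT (order_id, customer_id, order_amt, order_dt)
     VALUES (s.order_id, s.customer_id, s.order_amt, s.order_dt);

  COMMIT;
EXCEPTION
  WHEN OTHERS THEN
    ROLLBACK;
    RAISE;  -- surface the failure to the calling workflow/scheduler
END load_edw_orders;
/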

Project #6

Project Name : LIFE ASIA.

Client : ICICI PRUDENTIAL.

Duration : June 2015 to Aug 2016

Role : Software Engineering Analyst (SEA)

Project Description:

ICICI is one of the largest insurance companies in India. ICICI aims to retain and develop its market position through its behavior, by striking a good balance between price and risk, and by offering dynamic, compassionate, and innovative service to its customers in claims situations. Its insurance products include workers' compensation, motor, building, contents, transport, house, personal accident, and health care. It uses the product named LIFE ASIA for its IT implementation.

Roles & Responsibilities:

Analyzed functional requirements and mapping documents, and assisted in problem solving and troubleshooting.

Responsible for designing the documents based on the requirement specifications.

Extensively worked on Data Extraction, Transformation, and Loading with RDBMS, Flat files and XML.

Developed mappings for the different stages of the ETL using various transformations such as Expression, Filter, Joiner, Lookup, Router, and Rank.

Debugged sessions using the session logs.

Performed extensive testing and wrote SQL queries to verify that the data was loaded correctly (see the sample query after this list).

Developed test cases and plans to complete unit testing. Supported system testing.

Performed unit testing at various levels of the ETL.
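
A hedged example of the kind of SQL written to verify the loads (see the bullet above). STG_POLICY, POLICY_DW, PREMIUM_AMT, and BATCH_ID are hypothetical names; the real checks compared counts and key sums per load batch.

-- Hypothetical post-load check: row counts and amount totals must match between staging and target.
SELECT src.cnt        AS source_rows,
       tgt.cnt        AS target_rows,
       src.amt        AS source_amount,
       tgt.amt        AS target_amount,
       CASE WHEN src.cnt = tgt.cnt AND src.amt = tgt.amt
            THEN 'PASS' ELSE 'FAIL' END AS load_check
  FROM (SELECT COUNT(*) AS cnt, SUM(premium_amt) AS amt
          FROM stg_policy
         WHERE batch_id = :batch_id) src,
       (SELECT COUNT(*) AS cnt, SUM(premium_amt) AS amt
          FROM policy_dw
         WHERE batch_id = :batch_id) tgt;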


