Data Engineer / ETL Developer

Location: San Ramon, CA
Salary: $140,000
Posted: February 07, 2024

Resume:

SUMMARY

Seasoned Senior Data Engineer / ETL Developer / Consultant with 11+ years of experience designing and building data pipelines for Data Warehousing and Data Mart projects across diverse industries within the Analytics and Information Management portfolio.

SKILLS & CAPABILITIES

Participates in the full development life cycle of data pipeline and data architecture projects end to end, from design, implementation, and testing to documentation, delivery, support, and maintenance.

Extensive technical experience in Data Integration using Informatica services

Expert in writing, tuning, and debugging SQL queries, and in performance tuning of ETL jobs by identifying bottlenecks and providing solutions.

Ability to learn and adapt to new concepts quickly, with demonstrated skill in data analysis, data integration, data modeling, analytics, data visualization, and storytelling.

Highly motivated to take on independent responsibility, with the ability to contribute as a productive team member.

EDUCATION

PG Diploma in Advanced Computing, C-DAC ACTS, Pune, India.

Bachelor of Computer Applications, Sambalpur University, India.

Certificates and Training:

1. Informatica 8.6 - Informatica Certified Developer (2011)
2. AWS Certified Solutions Architect (2020)
3. CCA Spark and Hadoop Developer (2021)

Technical Skills:

ETL Tools: Informatica PowerCenter 10.2, Informatica Developer
Programming Languages: Scala, Python, Unix Shell Scripting, SQL, PL/SQL
Big Data Ecosystem: HDFS, Spark, Scala
Databases: Snowflake, Oracle 10g/9i/8i, MS SQL Server 12, MS Access 7.0/2000, MySQL
Environments: UNIX, Linux, MS Windows 10/7/XP
Job Scheduling: UC4, Informatica Scheduler, Control-M, Tivoli
Tools and Utilities: SQL*Loader, Quest Toad, Quality Center, PuTTY, Jira, Autosys, Tidal

PROFESSIONAL EXPERIENCE:

Saama Technologies Inc., Campbell, CA Mar 2020 – June 2023

Role: ETL Developer / Data Engineer

As a Data Engineer for the client AZ DES, Phoenix, AZ – Used the Harvester tool to integrate data from various sources and populate the relevant reports for users' further analysis.

As a Senior ETL developer for the client Genentech, Inc., South San Francisco, CA –

The EPICX project aimed to migrate all payer sources from the Informatica MDM ecosystem to the Reltio MDM ecosystem. As part of the migration, a data ingestion pipeline was set up to onboard data onto the Reltio platform and consume the mastered data back for downstream usage.

Designed and developed a new integration framework to address the existing system's issues: removed multiple layers, improved data processing time by 4x, made the data flow transparent, and tracked master data changes using Informatica PowerCenter and Oracle SQL/PL/SQL. Created a metadata-driven centralized repository to capture all business rules and CMS validation rules instead of hardcoding them in Informatica mappings, which significantly reduced the time needed to add or modify any business rule. Implemented a PL/SQL-procedure-based source data validation process that validates all incoming source data against the validation rules defined in a metadata table and logs bad records as errors when data issues are found (a sketch of this pattern follows). Automated sending the error report to the concerned stakeholders so the data could be corrected.
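A minimal sketch of that metadata-driven validation pattern in Oracle PL/SQL. All names here (etl_validation_rules, etl_error_log, stg_incoming, and their columns) are illustrative placeholders, not the actual project schema.

-- Hypothetical rules table: each row is one validation rule kept as data.
CREATE TABLE etl_validation_rules (
    rule_id     NUMBER         PRIMARY KEY,
    rule_name   VARCHAR2(100)  NOT NULL,
    bad_row_sql VARCHAR2(4000) NOT NULL, -- WHERE-clause predicate matching bad rows
    active_flag CHAR(1)        DEFAULT 'Y'
);

CREATE OR REPLACE PROCEDURE validate_staged_data IS
BEGIN
    -- Apply every active rule to the staging table; rules live in data,
    -- so adding or changing one is an INSERT/UPDATE, not a mapping change.
    FOR r IN (SELECT rule_id, bad_row_sql
                FROM etl_validation_rules
               WHERE active_flag = 'Y')
    LOOP
        EXECUTE IMMEDIATE
            'INSERT INTO etl_error_log (rule_id, src_row_key, logged_at) '
         || 'SELECT :rid, row_key, SYSDATE FROM stg_incoming WHERE '
         || r.bad_row_sql
        USING r.rule_id;
    END LOOP;
    COMMIT;
END validate_staged_data;
/

With this shape, a rule row such as (1, 'Missing payer id', 'payer_id IS NULL', 'Y') quarantines offending records into the error log without touching any Informatica mapping.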

Collabera Inc., Basking Ridge, NJ Dec 2017 – Dec 2018

Role: ETL Developer

As a Senior ETL developer for the client Kaiser Permanente, Pleasanton, CA – Used Informatica PowerCenter 10.2 to ingest data through pipelines and integrate it from various sources. Utilized Informatica mappings, SQL, Jira, etc., to provide the deliverables in a timely manner.

As a Senior ETL developer for the client US Bank, San Francisco, CA: The Wealth Management project aims at populating and maintaining the Client Team for Wealth customers. The scope includes extracting Account, Customer, and Product related data from Hogan through an additional ELZ system layer and populating the Client Team for each identified wealth customer with a suitable customer product. Worked extensively on Informatica PowerCenter to implement a source data validation process and the business logic to extract data, delivered as user stories across different sprints.

Cognizant Technology Solutions, San Ramon, CA Mar 2015 – May 2017

Role: ETL Developer

As an ETL developer for the client Kaiser Technology, Pleasanton, CA: Utilized Informatica PowerCenter for data integration and implemented ETL processes through mappings, SQL, and Jira.

The Claims Data Warehouse (CDW) program provides a central, consistent, industry-standard view of finalized claims data for all regions to perform reporting, analytical, and operational functions. The scope includes integrating all internal and external claims data into the CDW and giving end users the ability to generate reports from one central repository. The project also goes through periodic enhancement iterations and keeps adding new regions to the claims data processing.

Trident Consulting, Dublin, CA Oct 2013 – Oct 2014

Role: ETL Developer

As an ETL developer for the client JDA Software, Inc., Santa Clara, CA: Used Informatica PowerCenter 9.5 to implement business logic and extract data through mappings and SQL, with performance tuning, to provide integrated data to users for proper analysis.

A powerful cloud delivery platform enables JDA customers to achieve and maintain long-term value from their software investments, from rapid deployment to ongoing optimization services. The project integrates the inbound data from all three types of source systems and provides the outbound data to the MDM hub incrementally, for relational as well as SFDC data; a sketch of the incremental pattern appears below.
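A minimal sketch, in Oracle PL/SQL, of the watermark-style incremental (delta) extract such an outbound feed typically relies on. The table and feed names (etl_watermark, src_orders, mdm_outbound_stage, MDM_OUTBOUND) are assumptions for illustration, not the actual JDA objects.

CREATE OR REPLACE PROCEDURE extract_mdm_delta IS
    v_last_run  DATE;
    v_high_mark DATE;
BEGIN
    -- Timestamp of the last successful extract for this feed.
    SELECT last_extract_ts INTO v_last_run
      FROM etl_watermark
     WHERE feed_name = 'MDM_OUTBOUND';

    -- High-water mark for this run: the newest change visible in the source.
    SELECT NVL(MAX(last_update_ts), v_last_run) INTO v_high_mark
      FROM src_orders;

    -- Stage only the rows that changed since the previous run.
    INSERT INTO mdm_outbound_stage (record_id, payload, changed_ts)
    SELECT record_id, payload, last_update_ts
      FROM src_orders
     WHERE last_update_ts >  v_last_run
       AND last_update_ts <= v_high_mark;

    -- Advance the watermark so the next run resumes where this one stopped.
    UPDATE etl_watermark
       SET last_extract_ts = v_high_mark
     WHERE feed_name = 'MDM_OUTBOUND';

    COMMIT;
END extract_mdm_delta;
/

Bounding the window with an explicit high-water mark (rather than SYSDATE) avoids missing rows that commit while the extract is running.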

Cognizant Technologies, Pune, India Feb 2010 – Nov 2011

Role: ETL Developer

As an ETL developer for the client Merck & Co., Pune, India: Worked on Informatica PowerCenter 8.6, using mappings, mapplets, and transformations to load data from relational and flat-file sources into the data warehouse. Wrote SQL queries and did database programming in PL/SQL. Applied transformations such as Source Qualifier, Lookup, Update Strategy, Router, Filter, Sequence Generator, and Joiner to the extracted source data according to the business rules and technical specifications. Loaded data from files into Oracle tables using Informatica, with any required data cleansing (a comparable file-load-with-cleansing sketch follows). Used SQL tools like TOAD to run queries and validate the data, and worked with database connections, SQL joins, and views at the database level.
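A minimal sketch of a flat-file load into Oracle with light in-flight cleansing, here via an external table. In the project this was done through Informatica mappings, so the file, directory, and table names below (customers.csv, etl_in_dir, ext_customers, dw_customers) are purely illustrative.

-- etl_in_dir is an assumed Oracle DIRECTORY object pointing at the landing folder.
CREATE TABLE ext_customers (
    cust_id   VARCHAR2(20),
    cust_name VARCHAR2(100),
    join_date VARCHAR2(20)
)
ORGANIZATION EXTERNAL (
    TYPE ORACLE_LOADER
    DEFAULT DIRECTORY etl_in_dir
    ACCESS PARAMETERS (
        RECORDS DELIMITED BY NEWLINE
        FIELDS TERMINATED BY ','
        MISSING FIELD VALUES ARE NULL
    )
    LOCATION ('customers.csv')
);

-- Cleanse while loading: trim padding, normalize name casing, parse dates,
-- and drop rows that arrive without a key.
INSERT INTO dw_customers (cust_id, cust_name, join_date)
SELECT TO_NUMBER(TRIM(cust_id)),
       INITCAP(TRIM(cust_name)),
       TO_DATE(TRIM(join_date), 'YYYY-MM-DD')
  FROM ext_customers
 WHERE TRIM(cust_id) IS NOT NULL;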

Infosys, Pune, India Jan 2009 – Feb 2010

Role: Informatica Developer

As an ETL developer for the client Procter & Gamble:

The project used Informatica for processes such as data cleansing, data extraction into the staging database, and applying business rules in transformations. It also involved error logging and job automation.

Fujitsu, Pune, India Sep 2007 – Nov 2008

Role: ETL Informatica Developer

As an Informatica developer trainee, learned and implemented Informatica PowerCenter 8.6 in the existing project. Worked with the QA team to ensure data quality and with support engineers for seamless product delivery and initial support.

The project involved implementing an enterprise-wide data warehouse over existing operational data for better end-user reporting, analysis, and decision support, and to improve client services by preventing errors and providing real-time data.


