
ETL Informatica / Big Data Developer


Pragathi Macha

***** ****** ******, *******, ********, USA C: 703-***-**** adhsh4@r.postjobfree.com

EXPERIENCE SUMMARY

• 10 years of total IT industry experience, including technology consulting and professional services

• Domain experience across Mortgage Insurance, Oil and Gas, Marketing, and Manufacturing

• Hands-on development experience in data warehousing implementation projects using Informatica PowerCenter 9.x/10.x, Oracle, PL/SQL, Netezza, Redshift, PostgreSQL, Spark SQL, Unix shell scripting, etc.

• Strong expertise in ETL with Informatica PowerCenter, along with reporting tools such as OBIEE

• Knowledge of the full SDLC, from requirements gathering through mapping design and development, testing, debugging/validation, job scheduling, UAT, performance tuning of both Informatica and SQL, documentation, and support

• Experience in code migration, testing and defect resolution, and both initial and ongoing data loads

• Responsible for end-to-end technical implementation and planning of projects

• Experience with both Agile and traditional SDLC-driven project approaches

• Ability to present architecture and strategy to management, stakeholders, and customers

• Basic knowledge of Amazon Web Services (AWS)

• Developed PySpark scripts to ingest required data from multiple sources and load it into AWS data stores (an illustrative PySpark sketch of this pattern appears after this list)

• Created Python scripts to upload data to AWS S3 buckets for data analysis on EMR clusters

• Wrote and scheduled PySpark scripts to filter and transform unstructured data into structured form

• Experience with AWS services such as Lambda, EMR, SNS, SQS, Redshift, and AWS Glue

• Created and updated documentation in support of development efforts, including detailed specifications, implementation guides, architecture diagrams, and design documents

• Self-motivated team player with an eye for continuous improvement and a strong inclination toward adopting new technologies

• Technical lead/mentor experience: training and mentoring new associates, prioritizing, assigning, and tracking tasks against project targets, and providing work breakdown structure inputs for new SOWs

• Self-starter who excels in high-intensity, challenging work environments

• Excellent problem-solving and communication/interpersonal skills

• AWS Certified Developer - Associate
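The following is a minimal, illustrative PySpark sketch of the ingest/transform/load pattern referenced in the bullets above. All bucket names, connection details, and column names are hypothetical placeholders, not taken from the actual projects.

from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("ingest_to_s3_sketch").getOrCreate()

# Source 1: raw CSV extracts landed in an S3 "raw" area (hypothetical path).
orders = spark.read.option("header", True).csv("s3://example-raw-bucket/orders/")

# Source 2: a relational source read over JDBC (hypothetical connection details).
customers = (spark.read.format("jdbc")
             .option("url", "jdbc:postgresql://example-host:5432/sales")
             .option("dbtable", "public.customers")
             .option("user", "etl_user")
             .option("password", "********")
             .load())

# Filter and transform the semi-structured input into a structured form.
structured = (orders
              .where(F.col("order_id").isNotNull())
              .withColumn("order_ts", F.to_timestamp("order_ts"))
              .withColumn("order_date", F.to_date("order_ts"))
              .join(customers, "customer_id", "left"))

# Land the result in an S3 data store as partitioned Parquet for EMR analysis.
(structured.write.mode("overwrite")
 .partitionBy("order_date")
 .parquet("s3://example-curated-bucket/orders/"))

In practice a script like this would be parameterized and scheduled, for example via Autosys, as described in the experience sections below.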

TECHNICAL SKILLS

Operating Systems

UNIX, Windows, Linux

Development Tools

Informatica PowerCenter, Informatica IDQ Developer, Informatica Data Replicator

Reporting Tools

OBIEE 10g, BI Publisher

Languages/Technology

SQL, PL/SQL, UNIX Shell Scripting, Python, Spark SQL

Databases

Oracle, Netezza, Redshift, PostgreSQL, MySQL

Project tracking tools

Jira, Rally

Other Tools

Toad, Oracle SQL Developer, WinSCP, PuTTY, Autosys, GitHub, Bitbucket

AWS Services

EC2, ECS, ELB, EBS, S3, VPC, IAM, SQS, RDS, Lambda, CloudWatch, Storage Gateway, CloudFormation, Elastic Beanstalk, Auto Scaling, EMR, AWS Glue, SNS

PROFESSIONAL EXPERIENCE

Company: Hexaware Technologies Inc

Client: Fannie Mae

Industry: Mortgage Insurance

Description of the project: Implementing the Enterprise Data Integration and Information Governance programs, along with enhancement activities for the Credit Enhancement and Target State (CETS) module, by loading data from various sources into the Target Base State layer. In addition, hydrating EDW business capability tables, designed specifically for vending purposes, in the EDW staging schema layer. The vends are sent to CETS via ESB.

Role: Senior ETL Developer

Duration: Nov 2015 to Present

Software: Informatica PowerCenter 9.6.1, Informatica 10.1.1, Informatica IDR, Oracle 11g, WinSCP, SQL Developer, Unix, Autosys, shell scripting, Netezza, NZ scripting, Git

Responsibilities:

•Working with Business customers and Business Analysts to analyze business processes, procedures and user requirements in order to establish system requirements and help create DW technology solutions to meet clients’ business needs

•Designing systems and processes to capture, integrate and distribute information in an enterprise

•Developing solutions and specifications so that agreed requirements can be converted into functional systems

•Identifying and communicating potential impacts of a change/issue to other areas

•Leading system/application design, including production of system and program documentation, as well as ongoing maintenance

•Developing logical and physical data flow models for ETL applications

•Developing ETL mappings, workflows, and sessions that read data from various sources and apply various transformations to load data into the OLAP database per business requirements

•Developing complex SQL queries to load data into the database

•Hands-on experience with databases and SQL, e.g., MySQL, Oracle, Netezza, and Redshift

•Using shortcuts to reuse objects without creating multiple copies in the repository, automatically inheriting changes made to the source

•Creating mappings to capture audit and reject statistics for the support team

•Delivering assigned work within specified timelines and to a high standard of quality

•Planning and conducting Informatica ETL unit & development testing activities ensuring that quality meets agreed specifications/requirements

•Optimizing query and session performance

•Delivering correct objects to the client on time

•Using Autosys to create, schedule, and monitor jobs and to send notifications on process failures

•Achieving key Systems Development Life Cycle (SDLC) stages in terms of quantity, timing and quality of all aspects of work allocated

•Knowledge of key features of Cloud Service Providers

•AWS Certified Developer - Associate

•Ability to use continuous integration and delivery (CI/CD) pipelines to deploy applications

•Developed PySpark scripts to ingest required data from multiple sources and load it into AWS data stores

•Created Python scripts to upload data to AWS S3 buckets for data analysis on EMR clusters (an illustrative boto3 sketch of this step appears after this list)

•Wrote and scheduled PySpark scripts to filter and transform unstructured data into structured form

•Experience with AWS services such as Lambda, EMR, SNS, SQS, Redshift, and AWS Glue

•Created and updated documentation in support of development efforts, including detailed specifications, implementation guides, architecture diagrams, and design documents

•Contributing to the review and redesign of processes or procedures, on an ongoing basis, to deliver improved system productivity or efficiency

•Resolving production incidents (through diagnosis, testing & applying fix) for assigned application

•Preparing the test cases for Major releases

•Responsible for communication with Client Manager and Server Hosting/Operational Support Vendors in case of Production load failures

•Understanding of application lifecycle management

•Worked in Agile Methodology

•Worked with product owners and other development team members to determine new features and user stories needed in new/revised applications or large/complex development projects

•Participated in all team ceremonies including planning, grooming, product demonstration and team retrospectives

•Strong skills in developing, deploying, and debugging cloud applications

•Skilled in using APIs, the command-line interface, and SDKs for writing applications

•Participate in code reviews with peers and managers to ensure that each increment adheres to original vision as described in the user story and all standard resource libraries and architecture patterns as appropriate

•Responding to trouble/support calls for applications in production, making quick repairs to keep them running

•Mentored less experienced technical staff; used high-end development tools to assist and facilitate the development process

•Set up and configured a continuous integration environment

•Advanced proficiency in unit testing
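A minimal boto3 sketch of the S3 upload step mentioned in the list above. The bucket name, local directory, and key prefix are hypothetical placeholders, not details from the actual project.

import boto3
from pathlib import Path

s3 = boto3.client("s3")
BUCKET = "example-analytics-bucket"  # hypothetical bucket name

def upload_extracts(local_dir: str, prefix: str) -> None:
    # Upload every CSV extract in local_dir under the given S3 key prefix,
    # where an EMR cluster (hypothetical) can pick the files up for analysis.
    for path in sorted(Path(local_dir).glob("*.csv")):
        key = f"{prefix}/{path.name}"
        s3.upload_file(str(path), BUCKET, key)
        print(f"uploaded s3://{BUCKET}/{key}")

if __name__ == "__main__":
    upload_extracts("/tmp/extracts", "incoming/daily")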

Company: Capgemini Consulting

Client: Schlumberger

Projects: Reporter L3 and Operational Team Performance (OTP)

Industry: Oil and Gas

Description of the project: Supporting around 20 applications, communicating with the client manager and all third parties involved in case of production issues, and working on major and minor enhancements and support tickets.

Role: Senior ETL Developer

Duration: Jun 2014 to Nov 2015

Software: Informatica PowerCenter 9.6.1, Oracle 11g, WinSCP, SQL Developer, Autosys, Great BI, Unix, OBIEE 10g

Responsibilities:

•Estimating and analyzing the work process, and designing business requirement and technical design documents per client specifications

•Designing the logical and physical models, along with mapping sheets, for the ETL layer

•Designing and implementing the ETL layers for master entities and their information schema

•Developing ETL mappings, workflows, and sessions that read data from various sources and apply various transformations to load data into the OLAP database per business requirements

•Developing complex SQL queries

•Using shell scripting to create bash scripts

•Creating Unix scripts to load source files, including reading/copying data from the SFTP server, creating file lists, archiving the source files, and merging files for publishing to downstream applications (an illustrative Python rendering of this flow appears after this list)

•Executing test cases

•Testing: unit, integration, and end-to-end

•Resolution of Informatica loading errors in production

•Documentation of Operational Manuals, Technical Design Specification, ETL Mapping Sheets, Analysis Sheet and Code Review Documents

•Involvement in code migration of modules into SIT and pre-production environments, along with phased releases into production for go-live

•Working on the resolution of incidents as a part of maintenance activity

•Resolving production incidents (through diagnosis, testing & applying fix) for assigned application

•Preparing test cases for Major releases

•Responsible for communication with Client Manager and Server Hosting/Operational Support Vendors in case of Production load failure
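The file-handling scripts described in the list above were written in Unix shell; the sketch below renders the same list/merge/archive flow in Python purely for illustration. All paths are hypothetical, and the SFTP pull is assumed to have already landed the files in the inbound directory.

import shutil
from pathlib import Path

INBOUND = Path("/data/inbound")            # hypothetical landing directory
ARCHIVE = Path("/data/archive")            # hypothetical archive directory
MERGED = Path("/data/publish/merged.dat")  # hypothetical downstream file

def process_source_files() -> None:
    files = sorted(INBOUND.glob("*.dat"))  # build the file list
    MERGED.parent.mkdir(parents=True, exist_ok=True)
    ARCHIVE.mkdir(parents=True, exist_ok=True)
    with MERGED.open("wb") as out:
        for f in files:
            out.write(f.read_bytes())                   # merge for publishing
            shutil.move(str(f), str(ARCHIVE / f.name))  # archive the source file

if __name__ == "__main__":
    process_source_files()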

Technical Skills - Cloud Experience:

•Ability to understand Migration requirements and bridge the gaps

•Basic hands-on experience with EC2, ECS, ELB, EBS, S3, VPC, IAM, SQS, RDS, Lambda, CloudWatch, Storage Gateway, CloudFormation, Elastic Beanstalk, and Auto Scaling

•Demonstrable experience with developer tools such as CodeDeploy, CodeBuild, and CodePipeline, and with designing the overall Virtual Private Cloud (VPC) environment

Company: Deloitte Consulting

Client: USCC

Industry: Telecommunication

Role: ETL Tester

Duration: Dec 2013 to Jun 2014

Software: HP Quality Center, Informatica PowerCenter, Oracle 10g, Toad

Responsibilities:

•Estimating and analyzing the work process, and designing business requirement and design documents per client specifications

•Analyzing the technical design document and creating the SQL queries

•Designing KPIs based on the metrics used in Cognos reports

•Thorough testing of KPIs and raising defects

•Testing code developed in Informatica

•Performing different kinds of testing, such as sanity, functional, integration, re-testing, and regression testing (an illustrative reconciliation-check sketch appears after this list)

•Executing test cases, finding defects, and verifying fixed defects

•Assisting the team in understanding Informatica development code

•Documenting Informatica objects and job scheduling information to support quality testing
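As a rough illustration of the SQL-based validation described above, the sketch below reconciles row counts and a summed measure between a source and a target table. It uses an in-memory SQLite database purely as a stand-in for the actual Oracle source/target, and all table and column names are hypothetical.

import sqlite3

conn = sqlite3.connect(":memory:")
cur = conn.cursor()

# Stand-in source and target tables (hypothetical schema).
cur.execute("CREATE TABLE src_sales (id INTEGER, amount REAL)")
cur.execute("CREATE TABLE tgt_sales (id INTEGER, amount REAL)")
cur.executemany("INSERT INTO src_sales VALUES (?, ?)", [(1, 10.0), (2, 20.5)])
cur.executemany("INSERT INTO tgt_sales VALUES (?, ?)", [(1, 10.0), (2, 20.5)])

def reconcile(src: str, tgt: str) -> None:
    # Compare row counts and a summed measure between source and target;
    # a mismatch would be raised as a defect in the test cycle.
    for check in ("SELECT COUNT(*) FROM {t}", "SELECT SUM(amount) FROM {t}"):
        s = cur.execute(check.format(t=src)).fetchone()[0]
        t = cur.execute(check.format(t=tgt)).fetchone()[0]
        status = "PASS" if s == t else "FAIL"
        print(f"{status}: {check.format(t='...')} source={s} target={t}")

reconcile("src_sales", "tgt_sales")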

Company: KPIT Global Solutions Ltd.

Program Name: Tej - IBM Cognos based Project

Client: Ultratech Cement Ltd, Aditya Birla Group

Industry: Manufacturing, Marketing

Project Description: Data warehousing implementation for BI Cognos reporting purposes

Role: ETL Consultant/Developer

Duration: Aug 2011 to Nov 2013

Software: Informatica PowerCenter 9.1, Oracle 10g, Toad

Responsibilities:

•Analyzing different lines of business

•Analyzing client requirements and understanding their business and data workflows

•Developing complex SQL queries and designing Informatica mappings to load data into the warehouse

•Developing staging tables from flat file, RDBMS, and SAP R/3 sources, loading from source to staging, and implementing incremental load logic

•Implementing historical data maintenance logic (SCD Type 1 and Type 2; an illustrative sketch of the Type 2 pattern appears after this list)

•Using transformations such as Lookup (connected and unconnected), Union, Joiner, BAPI, Normalizer, Router, Expression, Aggregator, Sequence Generator, Update Strategy, and Stored Procedure to implement the transformation logic in mappings

•Creating sessions and scheduling workflows using PowerCenter Workflow Manager/Monitor for the daily load

•Designing the Error Table for easier validations

•Creating new database objects like Procedures, Partitions, Indexes and Views in ODS and DWH

•Developing scripts for various ETL & integration needs

•Performance tuning of partitioned tables for optimal query performance

•Creating batch files, parameter files, and parameter variables to implement different logic

•Assisting BI developers in building reports using Cognos

•Providing modifications to existing functions initiated by Problem Report/Change Request (PRCR) and maintaining product compatibility

•Identifying the functional and non-functional requirements

•Creating unit test cases and executing them on or before release

•Serving as a single point of contact/responsibility for building and executing the technical implementation or changes in the Production Server

•Creating Design documents, ETL Specifications

•Providing production support to users

•Training new joiners on the Informatica tool
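The SCD Type 1/2 logic above was implemented in Informatica mappings; the sketch below shows the equivalent Type 2 expire-and-insert pattern in PySpark, included for illustration only. The table names (dwh.customer_dim, stg.customer) and columns (customer_id, address, is_current, start_date, end_date) are hypothetical.

from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("scd2_sketch").getOrCreate()

dim = spark.table("dwh.customer_dim")  # current dimension (hypothetical)
src = spark.table("stg.customer")      # staged daily extract (hypothetical)

load_dt = F.current_date()
current = dim.where("is_current = 1")

# Rows whose tracked attribute changed since the last load.
changed = (src.alias("s")
           .join(current.alias("d"), "customer_id")
           .where(F.col("s.address") != F.col("d.address"))
           .select("s.*"))
changed_ids = changed.select("customer_id")

# Keep all history except the soon-to-expire current rows of changed keys.
untouched = dim.join(changed_ids.withColumn("is_current", F.lit(1)),
                     ["customer_id", "is_current"], "left_anti")

# Expire the old current versions of changed rows.
expired = (current.join(changed_ids, "customer_id")
           .withColumn("is_current", F.lit(0))
           .withColumn("end_date", load_dt))

# Insert new current versions for changed rows plus brand-new keys.
new_rows = (src.join(current.select("customer_id"), "customer_id", "left_anti")
            .unionByName(changed)
            .withColumn("is_current", F.lit(1))
            .withColumn("start_date", load_dt)
            .withColumn("end_date", F.lit(None).cast("date")))

result = untouched.unionByName(expired).unionByName(new_rows)
result.write.mode("overwrite").saveAsTable("dwh.customer_dim_rebuilt")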

Company: Systime-KPIT Global Solutions Ltd.

Project Description: Internal Project

Role: OBIEE Consultant/Developer

Duration: Nov 2010 to Jul 2011

Software: OBIEE 10g, Oracle 10g, BI Publisher

Responsibilities:

•Participating in business and requirement-gathering workshops

•Mapping business requirements to the design and configuration of the BI Server repository

•Extensively using the OBIEE Administration Tool to customize and modify the physical, business model, and presentation layers of the repository

•Creating dimension and fact tables, logical columns, hierarchies, and level-based measures in the Business Model and Mapping (BMM) layer

•Using OBIEE Answers to create reports as per the client requirements

•Catering to reporting requirements for developing interactive dashboards, reports with different Views (Drill-Down, guided navigation, Pivot Table, Chart, Column Selector, dashboard and page prompts) using Oracle Presentation Services

•Translated Business requirements into technical requirements

EDUCATION & QUALIFICATION

2006 - 2010 Bachelor of Biomedical Engineering (Computer Engineering) from Mumbai University

REWARDS & RECOGNITION

•Several recognition emails from Clients and Stakeholders from 2017-2019

•Awarded by Capgemini for outstanding contribution in 2015

•Received KPIT's WoW award in 2013 for the project's consistent delivery and smooth, successful implementation

•Appreciation from Senior Managers and Directors for consistent performance, great commitment towards work and being a team asset

PROFESSIONAL TRAINING

•Attended internal overview trainings on Big Data and Hadoop

•Attended AWS trainings

•Attended training on Informatica IDQ basics, Informatica MDM

•Attended training on Undraleau

•Trained on OBIEE 10g


