
ETL Developer

Location:
Middlesex, NJ
Salary:
90 on C2C
Posted:
April 06, 2020


Resume:

Summary

**+ years of experience in the IT industry with a strong background in SDLC, data integration (ETL) tools, and databases/queries.

Experience across these technologies includes 7+ years in Informatica, 3+ years in Talend, 5+ years in Teradata, 2+ years in Snowflake, 3+ years in AWS/Azure, and 7+ years in Unix/Linux.

Primary skills and experience: Informatica PowerCenter, DT Studio, Teradata, Oracle, Autosys, and Informatica admin activities.

Secondary skills: Salesforce, DB2, Netezza, IDR, DVO, TDM, PWX, MDM, web services, Talend for Big Data, NoSQL (MongoDB), Azure/AWS, and Tableau.

Experience with the complete software development life cycle (SDLC), from requirements gathering through production support.

Extensive experience in production support using Informatica, Teradata, Unix, and IXP.

Experience in data warehousing techniques for data analysis, profiling, cleansing, transformation, consolidation, integration, import, and export, using various sources and targets (Oracle, MS SQL Server, DB2, Teradata, XML, and flat files).

Experienced in developing packages, procedures, functions, partitions, and views.

Worked on Informatica Data Quality (IDQ) for profiling, analysis, data quality rules, reference tables, maps, and more.

Worked on dimensional (dimension/fact) tables and star/snowflake schemas.

Good working experience with NoSQL databases (MongoDB) using various tools.

Good working experience with Salesforce, AWS (Lambda, Redshift), and Azure (SQL Server), and good knowledge of Big Data/Hadoop technology.

Worked on creating indexes, collecting statistics, and applying different strategies based on EXPLAIN plans, as required, for Teradata.

Knowledge of Informatica/Teradata data warehouse architecture, including system queries to retrieve details such as skew factor, space utilization, user groups, and row distribution across AMPs.

Experience in performance tuning, performance monitoring, and query optimization using Teradata EXPLAIN and table partitioning.

Experience working with, and troubleshooting, the Teradata utilities MultiLoad, FastLoad, TPump, and BTEQ.

Used B2B/DT Studio extensively to create parser, serializer, and mapper projects for various unstructured data files.

Extensive experience in the finance, retail, pharmacy, and healthcare domains.

4+ years of experience working in Agile methodology.

Good experience in Unix scripting and working knowledge of Python.

Knowledge of the reporting/testing tools BO, MSRS, Cognos, Tableau, and Cucumber.

Developed strong professional skills working both independently and as a team member, analyzing functional/business requirements and preparing work estimates covering analysis, code, test plans, test scripts, and test cases for manual testing.

5+ years of experience as a team lead (TL), leading teams of 4 to 10 associates in an onshore-offshore model across various projects.

Working Knowledge of Oracle Data Integrator (ODI), Datastage, SQL Server Integration Services (SSIS)

Excellent troubleshooting, analytical, and interpersonal skills; open to taking up challenges.

Certified PowerCenter Data Integration 9.x: Developer Specialist.

Education

Bachelor of Engineering with specialization in Electrical & Electronics from RV College of Engineering, Visvesvaraya Technological University, Belgaum, India, 2008.

Technical Skills:

Operating Systems

AIX, Linux, Sun Solaris, Windows 2000/XP

RDBMS

Oracle, Teradata, DB2, Netezza, MongoDB, Snowflake

Languages

SQL, PL/SQL, Unix, Python, Azure (SQL Server), NoSQL

Tools and Utilities

Informatica Tools, AUTOSYS (IXP), Talend, CA Desk, HPOV, CA SCM, SVN, Cognos, BO, Microstrategy, SSIS, ODI, Datastage, Control M, AWS S3, AWS Lambda, Salesforce, Robo 3T, MongoDB Compass, Azure, Tableau

Cloud

Informatica PC, AWS, Salesforce, Azure, Snowflake

Work Experience:

I> Akshaya Inc: Sep 2014 to date:

Clients/Projects:

1> EDW, Vonage, Holmdel, NJ: Oct 2018 – till date

Role: Talend/Snowflake Developer

Vonage has been a cloud communications (VoIP) provider for residential users in the US, Canada, and the UK for a decade.

It is now focusing on VoIP for business and has acquired a few other companies in the same domain (TokBox, Nexmo) to grow further.

Worked on several projects using Talend as the ETL tool, Teradata/Snowflake as the databases, and S3 as intermediate storage.

Worked extensively with tMap, tJavaFlex, tFile, tTeradata, and tJDBC components, pre/post-job steps, context variables, iterate links, and subjob/component/conditional/main/reject flows.

Loaded Snowflake tables from S3 files using the COPY command.
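
As a rough illustration of that pattern, the sketch below uses the snowflake-connector-python package to run a COPY INTO from an external S3 stage; the account, stage, table, and credential values are hypothetical placeholders, not the actual project objects.

    # Illustrative only: load a Snowflake table from files already staged on S3.
    # All connection details and object names below are hypothetical placeholders.
    import snowflake.connector

    conn = snowflake.connector.connect(
        account="my_account",
        user="etl_user",
        password="********",
        warehouse="LOAD_WH",
        database="EDW",
        schema="STG",
    )
    try:
        cur = conn.cursor()
        # Assumes an external stage pointing at the S3 landing bucket,
        # e.g. CREATE STAGE S3_LANDING URL='s3://my-bucket/landing/' ...
        cur.execute("""
            COPY INTO STG.CUSTOMER
            FROM @EDW.STG.S3_LANDING/customer/
            FILE_FORMAT = (TYPE = CSV FIELD_OPTIONALLY_ENCLOSED_BY = '"' SKIP_HEADER = 1)
            ON_ERROR = 'ABORT_STATEMENT'
        """)
        # COPY returns one row per file with its load status.
        for row in cur.fetchall():
            print(row)
    finally:
        conn.close()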

Worked on jobs converting from Teradata to Snowflake.

Worked with semi-structured files, querying them directly from the S3 stage or loading them into Snowflake.
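
A hedged sketch of querying staged JSON directly; the stage, file format, and field names are assumptions, reusing the same placeholder connection as above.

    # Illustrative only: query semi-structured (JSON) files sitting in the S3 stage
    # without loading them first; $1 refers to the whole staged document.
    import snowflake.connector

    conn = snowflake.connector.connect(account="my_account", user="etl_user",
                                        password="********", warehouse="LOAD_WH",
                                        database="EDW", schema="STG")
    cur = conn.cursor()
    cur.execute("""
        SELECT $1:account_id::STRING  AS account_id,
               $1:event_ts::TIMESTAMP AS event_ts
        FROM @EDW.STG.S3_LANDING/events/ (FILE_FORMAT => 'EDW.STG.JSON_FMT')
    """)
    for row in cur.fetchall():
        print(row)
    conn.close()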

Worked on different AWS technologies (S3, Lambda, SQS, SNS, Python)

Created Lambda jobs (Python) triggered by S3 events and by cloud events.

Configured Lambda failure handling, timeouts, and retries to manage failures and job execution.
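
One way to set those knobs programmatically is through boto3, roughly as below; the function name, limits, and dead-letter queue ARN are hypothetical.

    # Illustrative boto3 calls for tuning a Lambda function's timeout and async retry behaviour.
    import boto3

    lam = boto3.client("lambda")

    # Raise the execution timeout (seconds) and memory for longer-running loads.
    lam.update_function_configuration(
        FunctionName="edw-s3-ingest",   # hypothetical function name
        Timeout=300,
        MemorySize=512,
    )

    # Limit retries / event age for asynchronous (S3-triggered) invocations and
    # send events that still fail to an SQS dead-letter destination.
    lam.put_function_event_invoke_config(
        FunctionName="edw-s3-ingest",
        MaximumRetryAttempts=2,
        MaximumEventAgeInSeconds=3600,
        DestinationConfig={
            "OnFailure": {"Destination": "arn:aws:sqs:us-east-1:123456789012:edw-ingest-dlq"}
        },
    )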

Used the AWS boto3 library to write the Python code for Lambda to call Lambda, SQS, and S3 events.
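
A minimal handler sketch along those lines, assuming a hypothetical queue URL and downstream function name:

    # Illustrative Lambda handler (Python + boto3) reacting to an S3 object-created event.
    import json
    import boto3

    sqs = boto3.client("sqs")
    lam = boto3.client("lambda")

    QUEUE_URL = "https://sqs.us-east-1.amazonaws.com/123456789012/edw-load-requests"  # placeholder

    def handler(event, context):
        records = event.get("Records", [])
        for record in records:
            bucket = record["s3"]["bucket"]["name"]
            key = record["s3"]["object"]["key"]

            # Hand the file reference to the loader via SQS ...
            sqs.send_message(QueueUrl=QUEUE_URL,
                             MessageBody=json.dumps({"bucket": bucket, "key": key}))

            # ... or fan out by invoking another Lambda asynchronously.
            lam.invoke(FunctionName="edw-snowflake-loader",   # placeholder
                       InvocationType="Event",
                       Payload=json.dumps({"bucket": bucket, "key": key}))
        return {"processed": len(records)}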

Created SNS topics and subscribed SQS queues and email addresses to them.
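
The topic/subscription wiring can be scripted with boto3 roughly as follows; the ARNs, names, and address are placeholders.

    # Illustrative boto3 wiring of an SNS topic with SQS and e-mail subscribers.
    import boto3

    sns = boto3.client("sns")

    topic_arn = sns.create_topic(Name="edw-load-events")["TopicArn"]

    # Subscribe an existing SQS queue by ARN (normally looked up via get_queue_attributes).
    sns.subscribe(TopicArn=topic_arn,
                  Protocol="sqs",
                  Endpoint="arn:aws:sqs:us-east-1:123456789012:edw-load-requests")

    # Subscribe an e-mail address; the recipient must confirm before delivery starts.
    sns.subscribe(TopicArn=topic_arn,
                  Protocol="email",
                  Endpoint="etl-oncall@example.com")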

Created a GDPR process to meet business/regulatory requirements: first identify inactive accounts, then delete the information the organization holds about them, whether in files or in the database.

Environment/Tools/Technologies: Talend (6.3), AWS (S3, Lambda, SNS, SQS, CloudEvents), Snowflake, Teradata, Tableau

2> Sapphire, Symantec, Chennai: Jun 2018 – Oct 2018

Role: Informatica Developer

Symantec is one of the largest providers of cybersecurity software and services in the world, headquartered in CA, USA. After acquiring LifeLock, a US-based identity protection services company, in 2017, they are integrating the data from both companies and building a new DWH in Azure.

Gathered requirement details and worked with the BSA on any clarifications.

Developed SCD Type 2 and other standard and advanced mappings.

Worked with Azure SQL Server, Salesforce, and MongoDB as sources and targets, creating ODBC, application, MySQL ODBC, and Azure-based connections.
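
Outside of the Informatica connections themselves, an equivalent ODBC connection to the Azure SQL source looks roughly like this; the server, database, credentials, and table are placeholders.

    # Illustrative ODBC connection to an Azure SQL database using pyodbc.
    import pyodbc

    conn = pyodbc.connect(
        "Driver={ODBC Driver 17 for SQL Server};"
        "Server=tcp:myserver.database.windows.net,1433;"
        "Database=edw_stage;"
        "Uid=etl_user;Pwd=********;"
        "Encrypt=yes;TrustServerCertificate=no;"
    )
    cur = conn.cursor()
    cur.execute("SELECT TOP 10 account_id, status FROM dbo.accounts")  # placeholder table
    for row in cur.fetchall():
        print(row.account_id, row.status)
    conn.close()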

Worked with MongoDB, a NoSQL database with JSON-like documents, and used a MySQL ODBC connection through Informatica to query it like SQL.
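
For reference, the same JSON-like documents can be read directly with pymongo; the host, database, collection, and field names here are assumptions.

    # Illustrative pymongo access to a MongoDB collection of JSON-like documents.
    from pymongo import MongoClient

    client = MongoClient("mongodb://etl_user:********@mongo-host:27017/")  # placeholder URI
    db = client["crm"]

    # Filter and project documents, much like a SQL SELECT ... WHERE.
    for doc in db["accounts"].find({"status": "active"},
                                   {"_id": 0, "account_id": 1, "email": 1}):
        print(doc)

    client.close()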

Converted Python/SQL scripts to Informatica jobs.

Worked on production support activities.

Worked with Informatica, Teradata, Linux, and SQL Server (Azure).

Developed Tableau workbooks for year-over-year, quarter-over-quarter, YTD, QTD, and MTD analysis.

Automated monthly Excel reports into Tableau workbooks and dashboards.

Designed dashboards in Tableau using filters, parameters, and action filters.

Generated various dashboards on Tableau Server from various data sources.

Environment/Tools/Technologies: Informatica PC (10.1), Azure (SQL Server), Teradata, Tableau, Linux

3> Wholesale Data Mart (WDM), Wells Fargo, Fremont, CA: May 2017 – March 2018

Role: Informatica/Teradata Developer

The Wholesale Data Mart team is an intermediate team that holds data from the sources and keeps it historically for further use by other data marts or for direct reporting.

Gathered requirements, understood them, and shared the details with the support team and users.

Did the development using framework tables to generate the Informatica mappings.

Worked on creating indexes, collecting statistics, and applying different strategies based on EXPLAIN plans as required.

Queried the Teradata system database to retrieve skew factor, user groups, space utilization, and row distribution.
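
A sketch of that kind of system query through the teradatasql driver; the host, credentials, and database filter are placeholders.

    # Illustrative Teradata dictionary query for per-table space use and skew,
    # run via the teradatasql driver.
    import teradatasql

    SKEW_SQL = """
    SELECT  DatabaseName,
            TableName,
            SUM(CurrentPerm) AS CurrentPermBytes,
            CAST((1 - AVG(CurrentPerm) / NULLIFZERO(MAX(CurrentPerm))) * 100
                 AS DECIMAL(5,2)) AS SkewPct
    FROM    DBC.TableSizeV
    WHERE   DatabaseName = 'WDM_STG'          -- placeholder database
    GROUP BY DatabaseName, TableName
    ORDER BY SkewPct DESC
    """

    with teradatasql.connect(host="tdprod", user="etl_user", password="********") as con:
        with con.cursor() as cur:
            cur.execute(SKEW_SQL)
            for dbname, tabname, perm_bytes, skew_pct in cur.fetchall():
                print(dbname, tabname, perm_bytes, skew_pct)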

Worked with Informatica, Teradata, Linux, Autosys, SQL Server, and COBOL files.

Produced standard monthly reports using Tableau.

Created thematic heat maps using MapInfo.

Good understanding of advanced Tableau features, including calculated fields, parameters, table calculations, row-level security, R integration, joins, data blending, and dashboard actions.

Environment/Tools/Technologies: Informatica PC (9.6), SQL Server, Teradata, Tableau, Linux

4> RDW Loan Retirement, Fannie Mae, Reston, VA: Jan 2017 – April 2017

Role: Informatica/Netezza Developer

Fannie Mae serves the people who house America. It is a leading source of financing for mortgage lenders, providing access to affordable mortgage financing in all markets at all times.

In this project I worked on the retirement of the RDW system, loading data directly into the integration layer (IDS) and bypassing the intermediate RDW. I also worked on loading the loan modification tables from IDS into the EDW data mart tables.

Gathered requirements, understood them, and shared the details with the support team and users.

Developed PL/SQL packages, procedures, and functions in accordance with the business requirements for loading data into database tables.

Worked on Slowly Changing Dimensions Type 2 and Type 1
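
As an illustration of the Type 2 pattern (not the delivered Informatica mappings), the SQL below expires the changed current rows and inserts new versions; the table and column names are invented.

    # Illustrative SQL for the SCD Type 2 pattern: expire the changed current rows,
    # then insert new current versions. Table and column names are placeholders.
    EXPIRE_CHANGED_ROWS = """
    UPDATE dim_loan d
    SET    eff_end_dt   = CURRENT_DATE - 1,
           current_flag = 'N'
    WHERE  d.current_flag = 'Y'
    AND    EXISTS (SELECT 1
                   FROM   stg_loan s
                   WHERE  s.loan_id = d.loan_id
                   AND   (s.loan_status <> d.loan_status OR s.servicer_id <> d.servicer_id))
    """

    INSERT_NEW_VERSIONS = """
    INSERT INTO dim_loan (loan_id, loan_status, servicer_id, eff_start_dt, eff_end_dt, current_flag)
    SELECT s.loan_id, s.loan_status, s.servicer_id, CURRENT_DATE, DATE '9999-12-31', 'Y'
    FROM   stg_loan s
    LEFT JOIN dim_loan d
           ON d.loan_id = s.loan_id AND d.current_flag = 'Y'
    WHERE  d.loan_id IS NULL
    """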

Used and created various reusable components, from reusable user-defined variables to mapplets.

Used pre/post SQL to truncate tables and drop/create indexes.

Provided warranty-period production support for the code we delivered.

Loaded ODS tables and EDW tables.

Created mapplets for audit-table loads, and built mappings/sessions/workflows extensively using Expression, Joiner, Lookup, Normalizer, Filter, Router, Transaction Control, Update Strategy, Source Qualifier, and Sequence Generator transformations.

At the session level, also used pre/post SQL for source and target and pre/post session commands.

Involved in development, unit, performance, and integration testing for the jobs.

Migrated the developed components to higher environments and coordinated with testers.

Worked on creating JIL files, loading them to create the Autosys jobs, and running commands to process the jobs as required.

Worked on the EDW framework to load data into the Netezza database using NZ scripts and Informatica.

Worked with the Informatica Data Replication (IDR) tool to replicate data from IDS to EDW.

In IDR, used configurations, tasks (extractor/applier/send file), and the scheduler.

Used SVN version control tool for code check-in to higher environments.

Worked on Agile Methodology (Rally)

Environment/Tools/Technologies: Linux, Autosys, Informatica PC, IDR, Toad, Netezza, Oracle

5> EUO Support, AMEX, Phoenix, AZ: Oct 2015 – Dec 2016

Role: Informatica Admin/IDQ & Talend Developer

Amex is the world's largest card issuer, a leader in loyalty programmes, with a strong global presence across the entire payment chain.

In this project, as part of the Enterprise Utilities Operation (EUO) team, we administered various BI-related utilities. This covered Informatica products, MicroStrategy, and Cognos infrastructure, and involved migration, uplift, support, and keeping the entire BI environment up and running 24*7.

Supported the Informatica grid environments and was introduced to the Informatica TDM and DVO utilities.

Supported PowerExchange (PWX) for DB2, CDC, and data map files.

Provided production support for all Informatica applications and utilities in the production environment.

Watched PWX starter tasks, restart tokens, task logs, and task recycle order, and monitored for any issues connecting from the Informatica server.

Worked extensively on CDC task restarts with restart tokens and cold starts.

Supported the Informatica Analyst projects, B2B (DT Studio), and web services (to get run statistics for sessions/workflows).

Worked on any service failures using the Admin Console and the Informatica command-line utilities.

Worked on various ad hoc requests using Informatica command-line utilities or metadata table queries.

Monitored Informatica health and automated the critical alerts.

Worked on regular admin activities as requested (creating connection strings, native access groups, assigning the Integration Service, file changes, permission/owner changes, sequence resets, etc.).

Created deployment groups in the lower environment and migrated them to production.

Worked on IDQ for reference tables, profiling, and Address Doctor.

Troubleshot long-running jobs, crashes, and service restarts; suggested performance improvements to the application teams; and made configuration changes in the Admin Console to improve the system.

Worked on Netezza and Teradata scripts and procedures using informatica

Team lead for Informatica in a team of 8; trained the team on all the admin activities and made them understand our roles and responsibilities to avoid any unforeseen issues.

Involved in migrating mappings from IDQ to PowerCenter.

Applied the rules and profiled the source and target tables' data using IDQ.

Worked on Cognos/MicroStrategy admin activities and monitored scheduled Control-M jobs.

Worked on a Talend POC to evaluate its performance and flexibility from a developer's perspective.

Extensively created mappings in Talend using its most commonly used transformations like tMap, tJoin, tflowtoIterate, tAggregate, tSortRow, tNormalize, tDenormalize, tSetGlobalVar, tHashInput, tHashOutput, tJava, tJavarow, tFilter etc.

Extensive experience in using Talend features such as context variables, triggers, connectors for Database and flat files like tMySqlInput, tFileCopy, tFileInputDelimited, tFileExists.

Environment/Tools/Technologies: AIX, Linux, Mainframe, Informatica Family (PC, Analyst, B2B, TDM, DVO, PWX, Webservices), Microstrategy, Netezza, Teradata, Cognos, Talend for Bigdata, Control M

6> SWP Migration, Wells Fargo, Minneapolis, MN: Jan 2015 – Sep 2015

Role: Informatica/Talend Developer

Created a new ODS layer synchronizing the new system with the old SWP system, so that the business and stakeholders were not affected after the cutover to the new system.

Worked with the BA and architect to decide on the overall new design.

Worked on Agile Methodology with 2 Week sprint

Created the HLD and then the LLD and got them approved by the TL and then the architect.

Prepared documentation for requirements, design, installation, and unit testing.

Developed the code by creating reusable components.

Created mapplets/worklets as well as mappings/sessions/workflows extensively using Expression, Joiner, Lookup, Normalizer, Filter, Router, Transaction Control, Update Strategy, Source Qualifier, Sequence Generator, and UDT transformations.

At the session level, also used pre/post SQL for source and target and pre/post session commands.

Prepared the final Handover Document

Worked on IDQ for profiling tables/columns.

Worked on different Talend and Informatica mappings to load the data.

Extensively used Salesforce and AWS cloud solutions as sources and targets.

Extensively created mappings in Talend using its most of transformations like tMap, tJoin, tReplicate, tParallelize, tConvertType, tflowtoIterate, tAggregate, tSortRow, tHashOutput, tJava, tJavarow, tAggregateRow, tWarn, tLogCatcher, tMysqlScd, tFilter, tGlobalmap, tDie etc.

Extensive experience in using Talend features such as context variables, triggers, connectors for Database and flat files.

Used B2B/DT Studio extensively to create parser, serializer, and mapper projects for various unstructured data files.

Environment/Tools/Technologies: AQT (DB2), UNIX, Informatica, Talend for Bigdata, AWS, Salesforce

7> CL, Nationwide Insurance, Dublin, OH: Aug 2014 – Jan 2015

Role: Informatica Developer/IDQ Developer

Nationwide is one of the largest insurance and financial services companies in the world, with dozens of affiliated companies, and still growing.

In Centralized Licensing (CL), we are integrating all of the licensing information into one system.

Nationwide has different systems, some from acquisitions and some for different types of business, so the aim is to integrate all of the systems to avoid duplicate data and to have only one source of truth.

Responsibilities:

Involved in discussions with the business to clarify the requirements.

Involved in discussions with testers to get the scenarios checked.

Created the HLD and then the LLD and got them approved by the TL and then the architect.

Prepared documentation for requirements, design, installation, and unit testing.

Kept in touch with the other development teams to make sure we were heading in the right direction, checking progress from time to time.

Developed the mappings and workflows based on the requirements.

Involved in development, unit, performance, and integration testing for the jobs.

Created implementation plans and reviewed them with the team before the implementation date.

Used Informatica MDM to select the golden record and load it to the address target.

Involved in migrating mappings from IDQ to PowerCenter.

Applied the rules and profiled the source and target tables' data using IDQ.

Environment/Tools/Technologies: AQT (DB2), UNIX, Informatica

II> TCS: Sep 2008 to Sep 2014:

Clients/Projects:

8> CCRA ODS, Fifth Third Bank, Cincinnati, OH: Jul 2013 – Aug 2014

Role: Data Analyst

Commercial Credit and Risk Analysis (CC&RA) is a project in which we were working to meet business requirements so that the credit and risk teams could run ahead of other market players.

Several types of projects ran together to meet the business requirements and enable the business team to make more accurate decisions.

Responsibilities:

Met with the business to gather requirements, improvements, and suggestions.

Met with the source-system teams to learn of changes and replicate them on the ETL side as required.

Worked as team lead (TL) to make sure developers understood the requirements and delivered everything on time; also divided the work among the other 3 members according to their competence and completed the development on schedule.

Data profiling, data modeling, and then creating the technical specs.

Prepared documentation for requirements, design, installation, and unit testing.

Kept in touch with the development team to make sure we were heading in the right direction, checking progress from time to time.

Involved in development, unit, performance, and integration testing for the jobs.

Developed test cases and reviewed the unit testing.

Created implementation plans and reviewed them with the team before the implementation date.

Knowledge of the US financial domain and of the credit and risk functions within it.

Environment/Tools/Technologies: AQT (DB2), UNIX, Datastage

9> EDW Support, Independent Health, Bangalore, India: May 2011 – Jun 2013

Role: Application Support/Informatica-Teradata Developer

Independent Health works with different data warehousing users across the country and with extracts of data from OLTP databases on regular schedules. The job involved cleaning the source data, transforming it in the staging area, and then loading it into the warehouse, working across the various stages used to transform and load the data.

Designed the ETL processes using Informatica to load data from Mainframe DB2, Oracle, SQL Server, Flat Files, XML Files and Excel files to target Teradata warehouse database.

Designed Reusable Transformations, Mapplets, reusable Tasks and designed Worklets as per the dependencies of various sessions and parent-child tables.

Worked on Dimensional (Dim-fact) - Star/Snowflake Schemas

Worked on Teradata stored procedures (SPs) and the BTEQ, MultiLoad, and TPump utilities, and tuned SQL.

Queried Teradata system DB to retrieve the Skew factor, User group.

Created Oracle stored procedure, package and triggers. Worked on analytical query to format report. Created materialized view to store summarized data.

Investigate, debug and fix problems with Informatica Mappings and Workflows.

Implemented SCD - Type I & II in different mappings as per the requirements.

Worked extensively with HIPAA 4010 and HIPAA 5010 transactions as source data (834, 835, 276, 277, and more).

Involved in analyzing ICD-9 and ICD-10 codes for the data mapping from ICD-9 to ICD-10 and ICD-10 to ICD-9 at the source and target levels.

Production support for the migrated code for 2 months

Environment/Tools/Technologies: Informatica Power Center 8.6, Oracle 11g, UNIX, SQL, Autosys, Teradata, Utilities BTEQ, FLOAD, HPOV.

10> EDW Support, Supervalu Inc, Bangalore, India: Sep 2007 – Apr 2011

Role: Application Support/Informatica-Teradata Developer

Under this project we provided maintenance and support services for Tier 3 legacy SUPERVALU EDW applications in subject areas such as CDMA, TCRM, and Pharmacy. We also supported minor enhancements and other small development activities tracked by support, users, or the BSA. The project also assisted SUPERVALU with upgrades and new releases of third-party application software products.

Responsibilities:

Gathered requirements, understood them, and shared the details with the support team and users.

Developed PL/SQL packages, procedures, and functions in accordance with the business requirements for loading data into database tables.

Worked on Slowly Changing Dimensions Type 1, Type 2, and Type 3.

Handled system utilization issues, performance statistics, capacity planning and recovery of databases.

Worked on Dimensional (Dim-fact) - Star/Snowflake Schemas

Monitored day-to-day processes for the different data loads and resolved issues.

Provided post-production support and performance tuning on the Informatica and database sides.

Provided status reports to the core team and stakeholders, including progress, milestones, and issues.

Kept track of the scheduled Supervalu DWH jobs through the Autosys and ESP scheduling tools and the HPOV and CA ticketing tools.

Troubleshot problems in Informatica, Teradata, Unix, and Oracle code.

Worked with the Teradata utilities (FastLoad/MultiLoad/BTEQ) to load files into Teradata tables.

Queried Teradata system DB to retrieve the Skew factor and User group

Performed data fixes on request or as needed, depending on the job failures.

Involved in training and mentoring of new joiners in process and technical areas.

Worked as team lead (TL) in the most critical and secure subject areas, such as CDMA, Marketing, and Pharmacy, where we needed to work with extra precision and alertness. Also scheduled and planned knowledge transfer (KT) on the business, tools, and learning for new as well as existing team members.

Used the CA SCM tool to promote the development work into production.

Communicated efficiently with clients in weekly meetings and whenever necessary for schedule updates during outages and environment migrations.

Migrated few ETL jobs from existing Informatica to ODI (Oracle Data Integrator)

Used the MDM concept to find the best record and update it as per Data quality rule

Also partly involved in Informatica Admin Activities

Co-ordinated with different support teams and development teams.

Environment/Tools/Technologies: Oracle 10g, PL/SQL, Teradata, Sun Solaris 5.10, Informatica, ODI, CA Service Desk, HPOV, SCM, PWX


