
Kumar

Sr. ETL/Informatica Developer

Email: ac40t4@r.postjobfree.com

Phone: 732-***-****

PROFESSIONAL SUMMARY

Over 8 years of experience as a Sr. ETL/Informatica Developer.

Good experience in the analysis, design, development, testing and implementation of business application systems for the Healthcare, Insurance and Financial domains.

Experience in Business Intelligence solutions using Data Warehousing/Data mart design, ETL and reporting tools.

Expertise in implementing complex business rules by creating robust Mappings, Mapplets, Sessions and Workflows using Informatica Power Center.

Experience in Dimensional Data Modeling using Erwin and Microsoft Visio, Ralph Kimball and Bill Inmon methodologies, Star and Snowflake Schemas, Data Warehouses, Data Marts, FACT tables and Dimension tables.

Experience in working with DataStage Manager, Designer, Administrator and Director.

Extensive experience in Extraction, Transformation, and Loading (ETL) data from various data sources into Data Warehouse and Data Marts using Informatica Power Center.

Hands-on experience working with scheduling tools like Autosys, Control-M and Maestro, as well as data cleansing tools.

Experience with Informatica Data Quality and Master Data Management products.

Experience in working with relational databases such as Oracle 11g/10g/9i/8i, SQL Server 2008/2005, DB2 8.0/7.0, MS Access and Teradata.

Expert in Teradata database and Teradata Load and Unload utilities (FASTLOAD, FASTEXPORT, MULTILOAD, TPUMP, BTEQ and TPT).

Experience in Quality Assurance and testing phases including System Testing, User Acceptance Testing (UAT), Integration Testing, Smoke Testing, Regression Testing, post-deployment testing and shake-out testing.

Extensively worked with all phases of testing deliverables including creation of Test Strategy, Test plans, RTM, Test case specifications, Test execution, Defect tracking, Resolution, Documenting and Reporting.

Designed and developed a process that loaded data from source to staging and from staging to destination, using SSIS transformations at the staging-table level and table partitioning to optimize the load process within SSIS.

Designed parallel jobs using various stages like Join, Merge, Look up, Filter, Remove Duplicates, Data Set, Look up File Set, Complex Flat File, Modify, Aggregator, XML.

Good Experience in developing complex mapping using transformations like Source Qualifier, Router, Filter, Expression, Sorter, Aggregator, Normalizer, Joiner, Sequence Generator, Connected and Unconnected Lookup and Update Strategy, XML Source Qualifier.

Extensively used SQL and PL/SQL to write Stored Procedures, Functions, Packages, Cursors, Triggers, Views and Indexes in a distributed environment.
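
As a flavor of this work, a minimal PL/SQL sketch of a package that upserts staged rows into a dimension table (the package, table and column names are hypothetical):

    CREATE OR REPLACE PACKAGE etl_util AS
      PROCEDURE load_customer_dim (p_batch_id IN NUMBER);
    END etl_util;
    /
    CREATE OR REPLACE PACKAGE BODY etl_util AS
      PROCEDURE load_customer_dim (p_batch_id IN NUMBER) IS
      BEGIN
        -- upsert staged rows for the given batch into the dimension
        MERGE INTO customer_dim d
        USING (SELECT customer_id, customer_name
               FROM   stg_customer
               WHERE  batch_id = p_batch_id) s
        ON (d.customer_id = s.customer_id)
        WHEN MATCHED THEN
          UPDATE SET d.customer_name = s.customer_name
        WHEN NOT MATCHED THEN
          INSERT (customer_id, customer_name)
          VALUES (s.customer_id, s.customer_name);
        COMMIT;
      END load_customer_dim;
    END etl_util;
    /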

Participated in various integration, reporting, migration and enhancement engagements.

Expertise in analyzing defect severity and priority in defect-reporting systems using test management tools such as Mercury Quality Center, JIRA and Application Lifecycle Management (ALM), and in communicating with the team to resolve defects.

Proficient in developing SSIS packages to Extract, Transform and Load (ETL) data into the Data Warehouse from heterogeneous data sources such as Oracle, DB2, Excel, MS Access, CSV and flat files.

Experience in requirements gathering, planning, analysis, design, implementation, unit testing, code migration and production support, with expertise in developing business applications based on RDBMS.

Experience in leading teams and a good understanding of onsite-offshore environments.

Experience in the use of Agile methodologies, including Scrum and sprint-based delivery.

Strong understanding of UNIX shell scripts and writing SQL scripts for development, automation of ETL processes, error handling and auditing purposes.

Worked with cross-functional teams such as QA, DBA and Environment teams to deploy code from development to QA and from QA to Production server.

Involved in analyzing the source data coming from different Data sources such as XML, DB2, Oracle, flat files.

Excellent analytical and problem-solving skills with a strong technical background and good interpersonal skills.

TECHNICAL SKILLS

Data Warehousing: Informatica Power Center, IBM WebSphere DataStage and QualityStage

Reporting Tools: Cognos, Business Objects

Databases: Oracle 11g/10g, MySQL, MS SQL Server, Salesforce, Teradata, DB2

Defect Tracking Tools: HP Quality Center, JIRA

Languages/Web: SQL, PL/SQL, SQL*Plus, HTML, XML, XSD, XSLT, ASP, Java, UNIX Korn Shell Scripting, MS Excel

GUI Tools: SQL*Loader, TOAD, Data Loader

Modeling Tools: Erwin, Visio

Environment: AIX 4.5, Solaris 2.x, MS Windows, Windows NT

WORK EXPERIENCE

Client: State Street Bank (Quincy, Massachusetts) May 2016 – Present

Role: Sr. ETL/Informatica Developer

Description: State Street Corporation is an American financial services organization headquartered in Boston, Massachusetts. It is the second-oldest financial institution in the United States of America and is considered one of the largest asset management companies in the world. It offers research, trading and securities lending services for foreign exchange, equities, fixed income and derivatives.

Responsibilities:

Analyzed the business requirements.

Used Microsoft Team Foundation Server for creating the Product Backlog Items.

Involved in coordinating the Offshore Team by assigning them the requirements and reviewing their work.

Assisted in migrating the Autosys jobs to the Production.

Involved in problem identification and resolution for failed jobs, and assisted in troubleshooting issues in Autosys.

Used Autosys for running the Informatica workflows and Unix Shell Scripts.

Created Autosys JIL files defining the jobs and boxes that run the Informatica mappings and shell scripts.
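
A minimal Autosys JIL sketch of such a box/command job definition (the job names, machine and script path are hypothetical):

    /* illustrative box that starts the daily cycle at 2:00 AM */
    insert_job: bx_daily_etl   job_type: BOX
    owner: etladmin
    date_conditions: 1
    days_of_week: all
    start_times: "02:00"

    /* command job inside the box that launches an Informatica workflow */
    insert_job: cmd_run_wf   job_type: CMD
    box_name: bx_daily_etl
    command: /apps/scripts/run_wf.sh wf_daily_load
    machine: etl_prod01
    std_out_file: /logs/cmd_run_wf.out
    std_err_file: /logs/cmd_run_wf.err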

Used GitHub to create branches and pull requests.

Involved in pushing the Informatica components like sources, targets, mappings, workflows and Autosys components like JIL files into the GitHub.

Created shell scripts to fine tune the ETL flow of the Informatica workflows.

Provided production support to resolve ongoing issues and troubleshoot problems.

Employed Oracle database to create and maintain Data Marts.

Performed performance tuning at the functional and mapping levels.

Designed and supervised the overall development of the Data Mart.

Worked extensively with Informatica Power Center mappings to develop and feed the Data Mart.

Used relational SQL wherever possible to minimize the data transfer over the network.

Used various transformations like Filter, Expression, Aggregator, Sequence Generator, Source qualifier, Update Strategy, Joiner, Normalizer, Router, XML generator, XML Source qualifier, Connected Look up, Unconnected lookup, Stored Procedure and Union to develop mappings in the Informatica Designer.

Effectively used Informatica parameter files for defining mapping variables, workflow variables, FTP connections and relational connections.
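
An illustrative Informatica parameter file in the layout used for this (the folder, workflow, session and parameter names are made up):

    [Global]
    $$LOAD_DATE=04/01/2018

    [DWH_FOLDER.WF:wf_daily_load]
    $PMWorkflowLogDir=/infa/logs

    [DWH_FOLDER.WF:wf_daily_load.ST:s_m_load_fact]
    $$SOURCE_SYSTEM=GL
    $DBConnection_TGT=ORA_DWH
    $InputFile_SRC=/data/in/gl_feed.dat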

Involved in enhancement and maintenance activities of the data warehouse, including tuning and modifying stored procedures for code enhancements.

Maintained Data Marts up to date and in accordance with the company's requirements and practices.

Created Mapplets to use them in different Mappings.

Involved in the creation of the dimensional model for the reporting system by identifying facts and dimensions using Erwin r7.1.

Involved in loading data from legacy systems and flat files into Teradata using complex MultiLoad (MLOAD) and FastLoad scripts.

Involved in writing BTEQ scripts to run complex queries on the Teradata Database.
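
A small BTEQ sketch of this pattern, with dot-commands handling logon and error checking around the SQL (the TDPID, user and table names are hypothetical):

    .LOGON tdprod/etl_user,********;

    SELECT trade_dt,
           COUNT(*) AS row_cnt
    FROM   dwh.fact_trades
    GROUP  BY 1
    ORDER  BY 1;

    .IF ERRORCODE <> 0 THEN .QUIT 8;
    .LOGOFF;
    .QUIT 0;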

Effectively worked in Informatica version based environment and used deployment groups to migrate the objects.

Developed PL/SQL packages, procedures, triggers, functions, indexes and collections to implement the business logic using SQL Navigator.

Developed UAT test cases for the business users and worked closely with them to execute the test cases.

Used Erwin Model Mart for effective model management, enabling sharing, dividing and reusing of model information and designs to improve productivity.

Involved in testing Change Requests (CRs), i.e., testing enhancements to the existing code base before implementing them in production.

Generated server-side PL/SQL scripts for data manipulation and validation, and materialized views for remote instances.
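
For example, a sketch of a materialized view sourced from a remote instance over a database link (the link and object names are hypothetical):

    CREATE MATERIALIZED VIEW mv_daily_positions
      BUILD IMMEDIATE
      REFRESH COMPLETE ON DEMAND
    AS
    SELECT account_id,
           position_dt,
           SUM(qty) AS total_qty
    FROM   positions@remote_dwh   -- remote instance via database link
    GROUP  BY account_id, position_dt;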

Involved in designing data warehouses and data marts using Star Schema and Snowflake Schema.

Created the Shell Scripts and updated the scripts as per the requirement.

Involved in the creation of Job Sequences.

Used pre- and post-session variable assignments to pass variable values from one session to another.

Used Bash, Awk, Sed and Perl to automate most daily activities, such as log monitoring, log rotation and purging, and proactive system monitoring.

Developed UNIX scripts to process feeds from the upstream systems that need to be loaded into the database.

Involved in the automation of the manual support tasks using Unix Shell Scripts to improve system availability and avoid recurring issues.

Designed Informatica workflows with many sessions using Decision, Assignment, Event Wait and Event Raise tasks, and used the Informatica scheduler to schedule jobs.

Reviewed and analyzed functional requirements and mapping documents; performed problem solving and troubleshooting.

Performed unit testing at various levels of the ETL and actively involved in team code reviews.

Identified problems in existing production data and developed one time scripts to correct them.

Fixed invalid mappings and troubleshot technical problems with the database.

Environment: Informatica Power Center 9.6.0, Microsoft Team Foundation Server 2017, GitHub, COBOL files, XML Files, Oracle 11g, Erwin r7.1, Teradata R13, Unix Shell Scripting, Putty, CA Workload Automation AE (Autosys Edition) 4.5.1/R11.0, IBM DB2 8.0, TOAD 12.9, WinSCP, PL/SQL, SQL Server 2012, Data Warehouses.

Client: Measured Progress (Dover, New Hampshire) Feb 2015 – April 2016

Role: Sr. ETL/Informatica Developer

Description: Measured Progress is a not-for-profit assessment services company that develops customized K-12 student assessments. Its mission is to improve teaching and learning by providing customizable assessment products and educational services. It also provides services such as item creation, test construction, test administration, data analysis, reporting, professional development programs and assessments.

Responsibilities:

Involved in requirements gathering, analysis, design, development, testing, functional/technical specification and deployment.

Worked with Business Analyst to understand the business and to design the architecture of the data flow.

Designed logical and physical models for star schema based data marts using Erwin.

Tuned the Informatica mappings for optimal load performance.

Used Teradata utilities (FastLoad, MultiLoad, TPump) to load the data.

Optimized the performance of the queries running against the data mart by creation of the table partitions, Indexes and Indexed Views.

Involved in writing BTEQ scripts to run complex queries on the Teradata Database.

Used Mercury Quality Center for documenting the Test Cases, updating the test results after the test case execution has been completed and for tracking the defects.

Extensively used JIRA for executing the Test Cases, Defect Tracking and Test Management.

Good knowledge of Teradata Manager, TDWM, PMON, DBQL, SQL Assistant and BTEQ.

Designed and customized data models for the Data Warehouse, supporting data from multiple sources (Oracle, DB2, Excel, flat files).

Cleansed and scrubbed the data into uniform data types and formats using Informatica MDM, loaded it to the STAGE and HUB tables, then to the EDW, and finally to the Dimension tables, rolling up/aggregating the data by business grain into the FACT tables.
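
A simplified SQL sketch of that final rollup from EDW detail into a fact table at the business grain (all object names are hypothetical):

    INSERT INTO fact_assessment_daily (date_key, district_key, test_cnt, avg_score)
    SELECT d.date_key,
           di.district_key,
           COUNT(*),
           AVG(r.score)
    FROM   edw_test_results r
    JOIN   dim_date     d  ON d.calendar_dt  = r.test_dt
    JOIN   dim_district di ON di.district_id = r.district_id
    GROUP  BY d.date_key, di.district_key;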

Developed MDM Hub Match and Merge rules, Batch jobs and Batch groups.

Involved in creating Master data to populate the Dimension table attributes using the lookup transformation.

Used DataStage Director and its run-time engine to schedule and run the solution, to test and debug its components, and to monitor the resulting executable versions on a scheduled basis.

Involved in import and export of the DataStage jobs by using DataStage Manager.

Used Erwin for reverse engineering to connect to the existing database and ODS to create graphical representation in the form of entity relationships and elicit more information.

Created Informatica mappings for initial load and daily updates.

Designed and developed Informatica mappings to Extract, Transform and Load data into target tables.

Wrote, tested and implemented Teradata Fast load, Multiload and BTEQ scripts, DML and DDL.

Modified mappings as per the changed business requirements.

Created Dynamic parameter files and changed Session parameters, mapping parameters, and variables at run time.

Used Informatica to load the data to Teradata by making various connections to load and extract the data to and from Teradata efficiently.

Extensively used almost all transformations of Informatica including Lookups, Update Strategy and others.

Developed and delivered dynamic reporting solutions using MS SQL Server 2012 Reporting Services (SSRS).

Extensively worked on Performance Tuning of ETL Procedures and processes.

Extensively used PL/SQL programming in backend and front-end functions, procedures and packages to implement business rules.

Tested the data and data integrity among various sources and targets. Worked with the production support team on various performance-related issues.

Developed mappings for Type 1, Type 2 & Type 3 Slowly Changing Dimension (SCD) using Informatica Power Center.
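
The Type 2 logic itself was built as Informatica mappings (Lookup plus Update Strategy); in plain SQL terms the effect is roughly the following two steps (table and column names are hypothetical):

    -- step 1: expire the current row when a tracked attribute changed
    UPDATE dim_customer d
    SET    eff_end_dt   = CURRENT_DATE,
           current_flag = 'N'
    WHERE  d.current_flag = 'Y'
    AND    EXISTS (SELECT 1 FROM stg_customer s
                   WHERE  s.customer_id = d.customer_id
                   AND    s.address    <> d.address);

    -- step 2: insert a fresh current version for new and changed customers
    INSERT INTO dim_customer
           (customer_id, address, eff_start_dt, eff_end_dt, current_flag)
    SELECT s.customer_id, s.address, CURRENT_DATE, DATE '9999-12-31', 'Y'
    FROM   stg_customer s
    WHERE  NOT EXISTS (SELECT 1 FROM dim_customer d
                       WHERE  d.customer_id  = s.customer_id
                       AND    d.current_flag = 'Y');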

Worked with Session Logs, Workflow Logs and Debugger for Error Handling and Troubleshooting in all environments.

Reviewed QA Test Plans and provided technical support during QA and Stage testing (UAT).

Created and maintained the Shell Scripts and Parameter files in UNIX for the proper execution of Informatica workflows in different environments.

Environment: Informatica Power Center 9.5/9.6, Informatica Master Data Management 9.1, Mercury Quality Center 10, Informatica Hub, IBM Web Sphere DataStage 8.0.1, Parallel Extender, Data Warehouse, Quality Stage 8.0, TOAD, Visio, Oracle 11g, Erwin r7.1, IBM Mainframes DB2 7.0, Mercury Quality Center, SSRS, UNIX, SQL Server, Teradata R12/R13, Teradata SQL Assistant, PL/SQL.

Client: State Farm Insurance (Chicago, Illinois) Dec 2013 – Jan 2015

Role: Informatica Consultant/ETL Tester

Description: State Farm is a group of insurance and financial services companies in the United States. It ranked 44th among the Fortune 500 companies in the United States.

Responsibilities:

Involved in performing a high-level risk assessment and developing migration strategies to include in project plans.

Assisted with establishing and administering the Informatica environment.

Designed and documented all the ETL Plans.

Worked on Health Care ERP solutions such as FACETS, QNXT and other claim adjudication systems.

Worked closely with FACETS 4.48/4.51 and different EDI transaction files, such as 837, 834, 835, 270 and 271, to understand the source structure and source data patterns.

Worked with different data sources such as DB2 tables, Flat files, CSV files and responsible for cleansing data in flat files.

Involved in Data Mapping Specifications to create and execute detailed system test plans.

Implemented Slowly Changing Dimensions (SCDs, Type 1, Type 2 and Type 3).

Involved in developing detailed test plan, test cases and scripts using Quality Center for functional and regression testing.

Experience with Medicare, Medicaid and commercial insurance in HIPAA ANSI X12 formats, including 270/271, 276/277, 835, 837 and 997, and other NSF formats for interfaces and images to third-party vendor applications.

Worked with Oracle, SQL Server and flat file sources.

Involved in various HIPAA claims verification and validation processes to understand the source data.

Imported Erwin physical models into the Informatica Repository Manager.

Involved in writing conversion scripts using PL/SQL, stored procedures, functions, packages to migrate the data from SQL server database to Oracle database.

Studied the existing OLTP systems and created facts, dimensions and star schema representation for the data mart using Erwin.

Used SQL*Loader to first write the target rows to a flat file and then upload the data to the tables in the Oracle database.
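
A minimal SQL*Loader control file in that pattern (the data file, table and columns are hypothetical; it would be invoked with sqlldr control=claims_stage.ctl):

    LOAD DATA
    INFILE '/data/out/claims_extract.dat'
    APPEND
    INTO TABLE claims_stage
    FIELDS TERMINATED BY '|' OPTIONALLY ENCLOSED BY '"'
    TRAILING NULLCOLS
    ( claim_id,
      member_id,
      claim_dt   DATE "YYYY-MM-DD",
      paid_amt )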

Extensive ETL testing experience using Informatica 9.1/8.6.1 Power Center/Power Mart (Designer, Workflow Manager, Workflow Monitor and Server Manager), Teradata and Business Objects.

Involved in writing Stored Procedures and calling them in Informatica Workflow Manager to drop the data from stage tables.

Used Quality Center and JIRA to track and report system defects and project enhancements.

Responsible for tuning the ETL mappings to gain high performance.

Responsible for unit testing the developed mappings and involved in UAT sessions.

Involved in migrating the plans from Development to Production and automating the process.

Environment: Informatica 8.6, Oracle 8i/9i, SQL Server 2005, JIRA, Mercury Quality Center, PL/SQL, IBM AIX, UNIX, MS Excel, Erwin 4.5, Autosys, Teradata.

Client: Infotech Enterprises Ltd (Hyderabad, India) Aug 2011 – Nov 2013

Role: Informatica Developer

Description: Infotech Enterprises Ltd is a global solutions provider focused on engineering, manufacturing, data analytics, and networks and operations.

Responsibilities:

Developed ETLs to load data from Microsoft SQL Server 2000 into target Oracle Database.

Understood the customer's requirements; was involved in analysis, design, development and implementation of the system; and gathered and defined business requirements to improve business processes.

Developed simple & complex mappings using Informatica to load dimension & fact tables as per Star Schema techniques.

Worked with Static, Dynamic and Persistent Cache in the Look Up Transformation for the better throughput of Sessions.

Extensively used workflow variables, mapping parameters and mapping variables.

Created Tables, Keys and Indexes in SQL Server.

Developed DDLs, DMLs, PL/SQL stored procedures and indexes for ETL operations on Oracle and SQL Server databases.

Worked on different tasks in the Informatica Workflow Manager, such as Session, Event Raise, Event Wait, Decision, Email, Command, Assignment and Timer.

Developed reusable Mapplets and Transformations by using Informatica Power Center.

Created SQL, PL/SQL, scripts for data validation and reconciliation.

Designed and developed a process to load historical data.

Used Stored Procedure transformations to invoke Oracle PL/SQL procedures.

Used Informatica version control to check in all versions of the objects used in creating the mappings and workflows, to keep track of changes across development, test and production.

Identified performance issues in existing sources, targets and mappings by analyzing the data flow and evaluating transformations, and tuned them accordingly for better performance.

Involved in Informatica code deployments to all the environments.

Environment: Informatica Power Center 8.1, Erwin 4.0, PL/SQL, Windows 2000, UNIX, Oracle 10g, MS SQL Server 2000.

Client: Rofous Software Private Limited (Hyderabad, India) June 2009 – July 2011

Role: Jr ETL Developer

Description: Rofous Software Private Limited is an IT consulting and software services company offering training and staffing services as well as application development services such as custom software development, enterprise solutions, supply chain management solutions and customer relationship management solutions.

Responsibilities:

Deployed ETL Code to Production environment based on the instructions provided by development team.

Monitored the scheduled jobs in production in 24x7 shifts using the Tivoli tool.

Developed Reusable Mapplets and Transformations using Informatica Power Center Designer.

Gathered scheduled-run statistics and tracked them for metrics purposes.

Used the Informatica Debugger to debug mappings and gain troubleshooting information about data and error conditions.

Involved in the performance tuning of the mappings and sessions.

Fixed failed jobs by analyzing the issue using the session log.

Involved in writing Teradata Macros and used various Teradata analytical functions.
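
An illustrative Teradata macro combining an aggregate with an analytical (window) function (the database, table and column names are hypothetical):

    CREATE MACRO rpt.top_customers_by_day (in_dt DATE) AS (
      SELECT customer_id,
             SUM(order_amt) AS total_amt,
             RANK() OVER (ORDER BY SUM(order_amt) DESC) AS amt_rank
      FROM   rpt.orders
      WHERE  order_dt = :in_dt
      GROUP  BY customer_id;
    );

    -- executed as:
    EXEC rpt.top_customers_by_day (DATE '2011-06-30');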

Reran jobs that failed in the production environment.

Made sure that production was up and running, and cleaned up unnecessary files on the UNIX box.

Tuned user queries for better performance.

Wrote complex SQL using inner, left outer and right outer joins.
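
For instance, a small sketch mixing the three join types (tables and columns are hypothetical): the inner join keeps only orders with a matching customer, the left outer join keeps orders without shipments, and the right outer join keeps all regions.

    SELECT o.order_id,
           c.customer_name,
           s.ship_dt,
           r.region_name
    FROM   orders o
    INNER  JOIN customers  c ON c.customer_id = o.customer_id
    LEFT   OUTER JOIN shipments s ON s.order_id  = o.order_id
    RIGHT  OUTER JOIN regions   r ON r.region_id = c.region_id;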

Attended daily status calls and provided status updates to the onsite team.

Environment: Informatica Power Center 7.1.1, Oracle 9i, Teradata SQL Assistant, Tivoli, Toad, UNIX, PL/SQL, SQL Server, Putty, WinSCP.


