Data Manager

Location:
Bernards, NJ, 07920
Salary:
$60
Posted:
March 30, 2017

VAMSI

Sr. ETL/INFORMATICA DEVELOPER

Phone: +*(***) -(663)-9712

Email: aczkey@r.postjobfree.com

PROFESSIONAL SUMMARY:

Over 8 years of experience as a Sr. ETL/Informatica Developer.

Good experience in the analysis, design, development, testing, and implementation of business application systems for the healthcare, insurance, and financial domains.

Experience in Business Intelligence solutions using Data Warehousing/Data mart design, ETL and reporting tools.

Expertise in implementing complex business rules by creating robust Mappings, Mapplets, Sessions and Workflows using Informatica Power Center.

Experience in dimensional data modeling with Erwin and Microsoft Visio, the Ralph Kimball and Bill Inmon methodologies, Star and Snowflake schemas, Data Warehouses, Data Marts, FACT tables, and Dimension tables.

Experience in working with DataStage Manager, Designer, Administrator and Director.

Extensive experience in Extraction, Transformation, and Loading (ETL) data from various data sources into Data Warehouse and Data Marts using Informatica Power Center.

Hands-on experience with scheduling tools such as Autosys, Control-M, and Maestro, as well as data cleansing tools.

Experience with Informatica Data Quality and Master Data Management products.

Experience in working with relational databases such as Oracle 11g/10g/9i/8x, SQL Server 2008/2005, DB2 8.0/7.0, MS Access and Teradata.

Designed and developed SSIS processes that loaded data from source to staging and from staging to destination, using SSIS transformations at the staging-table level and table partitioning to optimize the load.

Designed parallel jobs using various stages such as Join, Merge, Lookup, Filter, Remove Duplicates, Data Set, Lookup File Set, Complex Flat File, Modify, Aggregator, and XML.

Good experience in developing complex mappings using transformations such as Source Qualifier, Router, Filter, Expression, Sorter, Aggregator, Normalizer, Joiner, Sequence Generator, Connected and Unconnected Lookup, Update Strategy, and XML Source Qualifier.

Extensively used SQL and PL/SQL to write Stored Procedures, Functions, Packages, Cursors, Triggers, Views, and Indexes in distributed environment.

Participated in various integration, reporting, migration, and enhancement engagements.

Proficient in developing SSIS packages to Extract, Transform and Load (ETL) data into the Data Warehouse from heterogeneous data sources such as Oracle, DB2, Excel, MS Access, CSV, and flat files.

Experience in requirement gathering, planning, analysis, design, implementation, unit testing, code migration, and production support, with expertise in developing business applications based on RDBMS.

Proficient in integrating CRM data sources such as Salesforce, with good knowledge of insert/update operations.

Experience in leading teams and a good understanding of onsite-offshore delivery models.

Experience with Agile methodologies, including Scrum and sprint-based delivery.

Strong understanding of UNIX shell scripts and SQL scripts for development, automation of ETL processes, error handling, and auditing.

Worked with cross-functional teams such as QA, DBA and Environment teams to deploy code from development to QA and from QA to Production server.

Involved in analyzing the source data coming from different Data sources such as XML, DB2, Oracle, flat files.

Excellent analytical, problem solving skills with strong technical background and interpersonal skills.

Experience in Performance tuning in Informatica Power Center.

TECHNICAL SKILLS:

Languages: C, Java, SQL, PL/SQL, VBA

Databases: Oracle, DB2, Teradata, SQL Server, MS Access, Mainframe

Packages: MS Word, MS Excel, MS PowerPoint, MS Visio, MS Project

Operating Systems: Windows 2000/2003/NT/XP, UNIX, Linux, Mainframes

Web Tools: HTML, CSS, XML

ETL Tools: DataStage 8.5, Informatica 8.x/9.x, SSIS, SSRS

Other Tools: SQL Developer, SAS, Tableau, Toad

Scripting Languages: UNIX Shell Scripting, VBScript, JavaScript

Methodologies: Agile Scrum, Waterfall, UML, Business/Data Modeling, ER Modeling

PROFESSIONAL EXPERIENCE:

United Health Care- Basking Ridge, NJ

Sr. ETL/Informatica Developer Jan 2016 - Present

The main goals of this portal are to make claims information and information about health conditions readily available to users, along with online details of the healthcare plans they have opted into. Online requests for ID cards, downloadable claim forms, and customized claim forms for employers are among the many services provided by MyUHC to reduce the number of calls made to UHG with health-plan-related queries.

Roles & Responsibilities:

Analyzed the business requirements.

Created shell scripts to fine-tune the ETL flow of the Informatica workflows.

Used Informatica file-watch events to poll the FTP sites for the external mainframe files.

Provided production support to resolve ongoing issues and troubleshoot problems.

Employed Oracle database to create and maintain Data Marts.

Performed performance tuning at both the functional and mapping levels.

Designed and supervised the overall development of the Data Mart.

Worked extensively with Informatica PowerCenter mappings to develop and feed the Data Mart.

Used relational SQL wherever possible to minimize the data transfer over the network.

Used various transformations like Filter, Expression, Aggregator, Sequence Generator, Source qualifier, Update Strategy, Joiner, Normalizer, Router, XML generator, XML Source qualifier, Connected Look up, Unconnected lookup, Stored Procedure and Union to develop mappings in the Informatica Designer.

Effectively used Informatica parameter files for defining mapping variables, workflow variables, FTP connections and relational connections.

Involved in enhancements and maintenance activities of the data warehouse including tuning, modifying of stored procedures for code enhancements.

Maintained Data Marts up to date and in accordance with the company's requirements and practices.

Created Mapplets to use them in different Mappings.

Effectively worked in Informatica version based environment and used deployment groups to migrate the objects.

Developed PL/SQL packages, procedures, triggers, functions, indexes and collections to implement the business logic using SQL Navigator.

Involved in creating, maintaining and tuning views, stored procedures, user defined functions and system functions using T-SQL.

Generated server side PL/SQL scripts for data manipulation and validation and materialized views for remote instances.
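
For illustration, a minimal Oracle PL/SQL sketch of the kind of package and materialized view described above; all object names (stg_claim, claim_dim, claim_dim_seq, mv_claim_summary) are hypothetical placeholders, not objects from the actual project.

    CREATE OR REPLACE PACKAGE claims_etl_pkg AS
      PROCEDURE load_claim_dim(p_batch_id IN NUMBER);
    END claims_etl_pkg;
    /

    CREATE OR REPLACE PACKAGE BODY claims_etl_pkg AS
      PROCEDURE load_claim_dim(p_batch_id IN NUMBER) IS
      BEGIN
        -- Move validated rows from the staging table into the dimension table
        INSERT INTO claim_dim (claim_key, claim_id, claim_status, load_batch_id)
        SELECT claim_dim_seq.NEXTVAL, s.claim_id, s.claim_status, p_batch_id
        FROM   stg_claim s
        WHERE  s.claim_id IS NOT NULL;
        COMMIT;
      END load_claim_dim;
    END claims_etl_pkg;
    /

    -- A simple materialized view of the kind refreshed for remote/reporting instances
    CREATE MATERIALIZED VIEW mv_claim_summary
      REFRESH COMPLETE ON DEMAND
    AS
      SELECT claim_status, COUNT(*) AS claim_cnt
      FROM   claim_dim
      GROUP  BY claim_status;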

Designed, implemented and tuned interfaces and batch jobs using PL/SQL.

Designed and developed approaches to acquire data from new sources like Mainframe (DB2), and AS400 (DB2).

Involved in designing data warehouses and data marts using Star Schema and Snow Flake Schema.

Effectively worked in an onsite-offshore delivery model.

Created the Shell Scripts and updated the scripts as per the requirement.

Involved in the creation of Job Sequences.

Used pre- and post-session variable assignments to pass variable values from one session to another.

Used bash, awk, sed, and Perl to automate most daily activities, such as log monitoring, log rotation and purging, and proactive system monitoring.

Designed workflows with many sessions using Decision, Assignment, Event-Wait, and Event-Raise tasks, and used the Informatica scheduler to schedule jobs.

Also involved in creation and scheduling of T-SQL jobs to run daily.

Reviewed and analyzed functional requirements, mapping documents, problem solving and trouble shooting.

Performed unit testing at various levels of the ETL and actively involved in team code reviews.

Identified problems in existing production data and developed one time scripts to correct them.

Fixed invalid mappings and troubleshot technical problems with the database.

Environment: Informatica PowerCenter 9.6.0, Control-M, COBOL files, Oracle 11g, TOAD, UNIX Shell Scripting, PuTTY, WinSCP, IBM DB2 8.0, PL/SQL, SQL Server 2012.

BBVA BANK-Birmingham, AL

ETL Developer March 2014 - Dec 2015

BBVA Compass Bancshares, Inc. (formerly Compass Bancshares) is a United States-based financial holding company headquartered in Birmingham, Alabama. BBVA Compass is one of the 25 largest banks in the U.S., with 688 branch locations, and was previously a member of the S&P 500 Index and the Dow Jones Select Dividend Index. With US$65 billion in total assets, BBVA Compass Bancshares is the 34th largest bank in the United States.

Roles & Responsibilities:

Involved in design, development, testing, analysis, requirements gathering, functional/technical specifications, and deployment.

Worked with Business Analyst to understand the business and to design the architecture of the data flow.

Developed Logical and Physical data models that capture current and future state data elements and data flows.

Designed logical and physical models for star schema based data marts using Erwin.

Tuned the Informatica mappings for optimal load performance.

Used the Teradata utilities FastLoad, MultiLoad, and TPump to load the data.

Optimized the performance of the queries running against the data mart by creation of the table partitions, Indexes and Indexed Views.
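
As a hedged, Oracle-flavored sketch of the range partitioning and indexing approach mentioned above; the table and column names (fact_txn, txn_date, account_id) are placeholders, not the project's actual objects.

    CREATE TABLE fact_txn (
      txn_id     NUMBER,
      txn_date   DATE,
      account_id NUMBER,
      amount     NUMBER(12,2)
    )
    PARTITION BY RANGE (txn_date) (
      PARTITION p2014q1 VALUES LESS THAN (DATE '2014-04-01'),
      PARTITION p2014q2 VALUES LESS THAN (DATE '2014-07-01'),
      PARTITION pmax    VALUES LESS THAN (MAXVALUE)
    );

    -- Local partitioned index on a common filter/join column, so data-mart queries
    -- benefit from both partition pruning and index access
    CREATE INDEX ix_fact_txn_account ON fact_txn (account_id) LOCAL;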

Good knowledge of Teradata Manager, TDWM, PMON, DBQL, SQL Assistant, and BTEQ.

Designed and customized data models for a Data Warehouse supporting data from multiple sources (Oracle, DB2, Excel, flat files).

Cleansed and scrubbed the data into uniform data types and formats using Informatica MDM, loaded it into STAGE and HUB tables, then into the EDW, and finally into Dimension tables, rolling up/aggregating the data by business grain into the FACT tables.
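
For illustration, a simplified SQL sketch of that final rollup step into a FACT table at the business grain; the table and column names below are hypothetical, not taken from the project.

    INSERT INTO fact_daily_balance (date_key, customer_key, product_key, total_amount, txn_count)
    SELECT d.date_key,
           c.customer_key,
           p.product_key,
           SUM(e.amount) AS total_amount,   -- rollup measures at the chosen grain
           COUNT(*)      AS txn_count
    FROM   edw_transaction e
    JOIN   dim_date     d ON d.calendar_date = e.txn_date
    JOIN   dim_customer c ON c.customer_id   = e.customer_id
    JOIN   dim_product  p ON p.product_code  = e.product_code
    GROUP  BY d.date_key, c.customer_key, p.product_key;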

Developed MDM Hub Match and Merge rules, Batch jobs and Batch groups.

Created Queries, Query Groups and packages in the MDM Hub Console.

Involved in creating Master data to populate the Dimension table attributes using the lookup transformation.

Used DataStage Director and its run-time engine to schedule and run the solution, test and debug its components, and monitor the resulting executables on a scheduled basis.

Involved in import and export of the DataStage jobs by using DataStage Manager.

Used Erwin for reverse engineering to connect to the existing database and ODS to create graphical representation in the form of entity relationships and elicit more information.

Involved in writing BTEQ scripts to transform the data.

Created Informatica mappings for initial load and daily updates.

Designed and developed Informatica mappings to Extract, Transform and Load data into target tables.

Wrote, tested, and implemented Teradata FastLoad, MultiLoad, and BTEQ scripts, including DML and DDL.
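
A minimal BTEQ-style sketch of the kind of transform script described above, assuming hypothetical object names (stage.customer_stg, edw.customer_dim) and placeholder logon details:

    .LOGON tdprod/etl_user,password

    -- Transform staged rows into the target table
    INSERT INTO edw.customer_dim (customer_id, customer_name, load_dt)
    SELECT s.customer_id,
           TRIM(s.customer_name),
           CURRENT_DATE
    FROM   stage.customer_stg s
    WHERE  s.customer_id IS NOT NULL;

    .IF ERRORCODE <> 0 THEN .QUIT 1
    .LOGOFF
    .QUIT 0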

Modified mappings as per the changed business requirements.

Created Dynamic parameter files and changed Session parameters, mapping parameters, and variables at run time.

Used Informatica to load the data to Teradata by making various connections to load and extract the data to and from Teradata efficiently.

Extensively used almost all transformations of Informatica including Lookups, Update Strategy and others.

Developed and delivered dynamic reporting solutions using MS SQL Server 2012 Reporting Services (SSRS).

Used SQL Server Reporting Services (SSRS) to deliver enterprise, web-enabled reporting, creating reports that draw content from a variety of data sources.

Extensively worked on Performance Tuning of ETL Procedures and processes.

Extensively used PL/SQL programming in backend and front-end functions, procedures and packages to implement business rules.

Tested the data and data integrity among various sources and targets, and worked with the production support team on various performance-related issues.

Developed mappings for Type 1, Type 2 & Type 3 Slowly Changing Dimension (SCD) using Informatica Power Center.
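
For illustration, a simplified SQL sketch of the Type 2 SCD pattern (close the old version, insert the new one); in practice the Informatica mappings implement this with Lookup and Update Strategy transformations, and the table and column names here (stg_customer, dim_customer, address) are placeholders.

    -- 1) Expire the current row when a tracked attribute has changed
    UPDATE dim_customer d
    SET    d.current_flag     = 'N',
           d.effective_end_dt = CURRENT_DATE
    WHERE  d.current_flag = 'Y'
    AND    EXISTS (SELECT 1
                   FROM   stg_customer s
                   WHERE  s.customer_id = d.customer_id
                   AND    s.address    <> d.address);

    -- 2) Insert a new current version for changed or brand-new customers
    INSERT INTO dim_customer (customer_id, address, effective_start_dt, effective_end_dt, current_flag)
    SELECT s.customer_id, s.address, CURRENT_DATE, DATE '9999-12-31', 'Y'
    FROM   stg_customer s
    WHERE  NOT EXISTS (SELECT 1
                       FROM   dim_customer d
                       WHERE  d.customer_id = s.customer_id
                       AND    d.current_flag = 'Y');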

Worked with Session Logs, Workflow Logs and Debugger for Error Handling and Troubleshooting in all environments.

Reviewed QA Test Plans and provided technical support during QA and Stage testing (UAT).

Created and maintained the Shell Scripts and Parameter files in UNIX for the proper execution of Informatica workflows in different environments.

Environment: Informatica Power Center 9.5/9.6, Informatica Master Data Management 9.1, Informatica Hub, IBM Web Sphere DataStage 8.0.1, Parallel Extender, Quality Stage 8.0, TOAD, Visio, Oracle 8i, Erwin r7.1, IBM Mainframes DB2 7.0, Mercury Quality Center, SSRS, UNIX, SQL Server, Teradata R12/R13, Teradata SQL Assistant, PL/SQL.

STRYKER (HEALTH CARE), Portage, MI

Informatica Consultant Aug 2012 - Feb 2014

Stryker is one of the world's leading medical technology companies and is dedicated to helping healthcare professionals perform their jobs more efficiently while enhancing patient care. Stryker is a broadly based, global leader in medical technology with a history of success and exceptional growth.

Roles & Responsibilities:

Involved in performing high-level risk assessments and developing migration strategies to include in project plans.

Assisted with establishing and administering the Informatica environment.

Designed and documented all the ETL Plans.

Developed solutions around healthcare ERP and claim adjudication systems such as FACETS and QNXT.

Worked closely with FACETS 4.48 and 4.51 and different EDI transaction files such as 837, 834, 835, 270, and 271 to understand source structures and source data patterns.

Worked with different data sources such as DB2 tables, flat files, and CSV files, and was also responsible for cleansing data in flat files.

Implemented Slowly Changing Dimensions (SCDs, Type 1, Type 2 and Type 3).

Experience with Medicare, Medicaid, and commercial insurances in HIPAA ANSI X12 formats, including 270/271, 276/277, 835, 837, 997, and other NSF formats, for interfaces and images to third-party vendor applications.

Worked with Oracle, SQL Server and flat file sources.

Imported Erwin physical models into the Informatica Repository Manager.

Involved in writing conversion scripts using PL/SQL, stored procedures, functions, packages to migrate the data from SQL server database to Oracle database.

Studied the existing OLTP systems and created facts, dimensions and star schema representation for the data mart using Erwin.

Used SQL*Loader to first write the target rows to a flat file and then upload the data to the tables in the Oracle database.

Extensive ETL testing experience using Informatica 9.1/8.6.1 (PowerCenter/PowerMart: Designer, Workflow Manager, Workflow Monitor, and Server Manager), Teradata, and Business Objects.

Involved in writing Stored Procedures and calling them in Informatica Workflow Manager to drop the data from stage tables.
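
A tiny illustrative PL/SQL procedure of the kind invoked from Workflow Manager to clear stage tables before a load; the table names are hypothetical placeholders.

    CREATE OR REPLACE PROCEDURE purge_stage_tables AS
    BEGIN
      -- Truncate the staging tables so each run starts from an empty stage area
      EXECUTE IMMEDIATE 'TRUNCATE TABLE stg_claim';
      EXECUTE IMMEDIATE 'TRUNCATE TABLE stg_member';
    END purge_stage_tables;
    /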

Responsible for tuning the ETL mappings in order to gain high performance.

Responsible for unit testing the developed mappings and involved in UAT sessions.

Involved in migrating the plans from Development to Production and automating the process.

Environment: Informatica 8.6, Oracle 8i/9i, SQL Server 2005, PL/SQL, IBM AIX, UNIX, MS Excel, Erwin 4.5, Autosys, Teradata.

IDBI BANK LIMITED - Hyderabad

Jr. ETL Developer April 2009 - June 2012

Worked on multiple IDBI Bank systems, including online banking, employee payroll, HRD, leave, pension, Employee Self Service, increments, training information, security payroll, scrolling news, departmental sites, and the gate-pass system used by the bank for its employee records.

Roles & Responsibilities:

•Gathered the various reporting requirements from the business analysts.

•Involved in JAD sessions with SMEs and business users to analyze the high-level requirements.

•Reverse-engineered the reports and identified the data elements (in the source systems), dimensions, facts, and measures required for the reports.

•Conducted design discussions and meetings to arrive at the appropriate Data Mart at the lowest level of grain for each of the dimensions involved.

•Created Data flow diagrams for current system.

•Facilitated JAD sessions for data requirements.

•Developed a data topology based on the data storage, various replications, and movements of data.

•Part of team conducting logical data analysis and data modeling JAD sessions, communicated data-related standards.

•Participated in several JAD (Joint Application Design/Development) sessions in order to track end to end flow of attributes starting from source screens to all the downstream systems.

•Designed the view models according to the requirements of downstream users such as web portals and other consumers.

•Checked for all the modeling standards including naming standards, entity relationships on model and for comments and history in the model. Conducted design walkthrough with project team and got it signed off.

•Created Technical Mapping documents for the development team to develop mapping workflows.

•Created and maintained Logical Data Model (LDM) for the project.

•Performed reverse engineering for a wide variety of relational DBMS, including Microsoft Access, Oracle and Teradata, to connect to existing database and create graphical representation (E-R diagram) using Erwin7.2.

•Used BTEQ for SQL queries for loading data from source into Teradata tables.

•Documented all entities, attributes, data relationships, primary and foreign key structures, business rules, codes, etc.

•Developed Logical data model using Erwin and created physical data models using forward engineering.

•Worked on a Master Data Management (MDM) database for managing customers' data across the different departments.

•Validated and updated the appropriate LDM's to process mappings, screen designs, use cases, business object model, and system object model as they evolve and change.

•Ensured the feasibility of the logical and physical design models.

•Worked with Data architects and IT architects to understand the movement of data and its storage.

•Worked closely with the ETL (SSIS) team to explain the complex data transformation logic.

•Coordinated with the QA team to test and validate the reporting system and its data.

•Suggested effective implementation of the applications, which are being developed.

Environment:

PL/SQL Developer, Teradata, TOAD Data Analyst 2.1, Oracle 9i, Quality Center 9.2, Informatica PowerCenter 9.1, Microsoft Access, Teradata SQL Assistant, Microsoft Visio, MS PowerPoint, MS Excel.

EDUCATION:

Bachelor’s Degree in Computer Science Engineering, Jawaharlal Nehru Technological University, Andhra Pradesh.


