
Data Developer

Location:
Wheeling, IL
Posted:
June 17, 2015

Contact this candidate

Resume:

Srinath Email: acp95r@r.postjobfree.com

Sr. Ab Initio Developer Ph: 408-***-****

Professional Summary

Over 8 years of experience in all phases of the Data Warehouse life cycle, including design, development, analysis, and testing, across various areas of Data Warehousing and ETL, working with Ab Initio, Informatica, and SSIS.

Strong working knowledge of Data Warehousing techniques and concepts, including ETL processes, Star Schema/Snowflake Schema design, and database performance tuning.

Extensively worked in Sandbox and EME environments.

Worked in the EME environment for check-in and check-out of graphs, graph versioning, and migrating graphs from the development environment to the testing environment using tags (AIR command groups: air lock, air project, air sandbox, air tag, air repository).

Highly proficient in development, implementation, administration, and support of ETL processes for large-scale data warehouses using Informatica PowerCenter and PowerMart.

Hands-on experience in tuning mappings, identifying and resolving performance bottlenecks at various levels such as sources, targets, mappings, and sessions.

Designed and developed efficient error-handling methods and implemented them throughout the mappings in various projects.

Involved in the development of an application replicating a legacy Mainframe process using Ab Initio.

Proficient in the Software Development Life Cycle (SDLC) methodology and Operational Data Store (ODS) concepts.

Responsible for QA activities. Worked as a defect tracking lead.

Experience in writing shell scripts in a UNIX environment for cleansing source files and preparing them for the load process.
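
A minimal sketch of such a pre-load cleansing filter (the rules shown — stripping DOS carriage returns, trailing whitespace, and blank lines — are illustrative, not from any specific engagement):

```shell
#!/bin/sh
# clean_feed: illustrative cleansing filter for a delimited source file.
# Reads stdin, writes stdout; intended to run before the load process.
clean_feed() {
  tr -d '\r' |                 # drop DOS carriage returns (CRLF -> LF)
    sed 's/[[:space:]]*$//' |  # trim trailing whitespace
    grep -v '^$'               # drop empty lines
}
```

Typical usage would be `clean_feed < raw_feed.dat > load_ready.dat` inside the wrapper that stages files for the loader.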

Demonstrated ability to complete multiple assignments simultaneously and maintain high standards of client organizations.

Sound knowledge of Data Warehouse/Data Mart, Data Modeling, ER Modeling, Dimensional Modeling, and Fact and Dimension Table techniques.

Experience in analysis, design, development, implementation and maintenance of various applications in all phases of project life cycle.

Expert in designing Star Schemas and well versed in UNIX shell wrappers, KSH, and Oracle PL/SQL programming and stored procedures.

TECHNICAL SKILLS:

ETL Tools : Ab Initio, Informatica, SSIS

Scheduler : Autosys, Control-M, Maestro, CA 7, ESP

BI Tools : Business Objects, Essbase 6.1, Cognos, SSRS

RDBMS : Teradata, Oracle, MS Access, MS SQL Server, DB2

Languages : C, SQL, PL/SQL, SAS 9, COBOL, PERL Scripting and Korn Shell Scripting.

Operating Sys : MS-DOS, AIX, HP-UX, Windows, Solaris 10/9, Ubuntu Hardy.

Education & Certification:

Bachelor of Technology, Jawaharlal Nehru Technological University, Hyderabad, India

Oracle Certified Associate

Professional Experience:

Walgreens, Deerfield, IL Duration: Apr 12 – Present

CTS

Sr. Ab Initio Developer

EMR KPI

The KPI system will receive immunization metrics from both the EMR and IC+ systems, but KPI requires that the data from EMR and IC+ be consolidated when displayed on the Immunization Summary screen. The KPI system cannot process and store multiple files with the same Stat IDs for the same day, as the affected records would be overwritten. If KPI is responsible for combining the KPI files received from EMR and IC+, the naming conventions for the Stat ID may need to change to ensure the same Stat IDs are not sent from both EMR and IC+ to KPI.

Analyzed business needs and documented functional and technical specifications based upon user requirements with extensive interactions with business users

Extensively worked with the Ab Initio Enterprise Meta Environment (EME) to obtain the initial setup variables and maintaining version control during the development effort.

Extracted data from various sources like databases, delimited flat files and XMLs.

Designed and developed parameterized generic graphs.

Improved performance of Ab Initio graphs using various performance techniques such as lookups, in-memory joins, and rollups.

Created sandbox utility using shell scripts for individual development and testing environment.

Responsible for the detailed design and documentation.

Involved in change management using Remedy

Worked closely with data modelers and helped in the design of schemas and tables including SCDs.

Created SSIS packages to load data into the data mart in SQL Server.

Developed wrapper scripts to periodically notify users in case of any failures with debugging information.

Developed UNIX Korn Shell wrappers to initialize variables, run graphs and perform error handling.
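
A minimal sketch of such a wrapper, assuming a deployed graph script and a log location (both placeholders; a real wrapper would also source the project's environment setup first):

```shell
#!/bin/ksh
# run_graph: illustrative Korn-shell wrapper around a deployed Ab Initio
# graph script -- initializes variables, runs the graph, and handles errors.
run_graph() {
  graph_script="$1"                                   # deployed .ksh from the GDE
  log_file="${2:-/tmp/$(basename "$graph_script").log}"

  "$graph_script" > "$log_file" 2>&1                  # run the graph, capture output
  rc=$?
  if [ "$rc" -ne 0 ]; then
    echo "ERROR: $graph_script failed with exit code $rc (see $log_file)" >&2
    return "$rc"
  fi
  echo "OK: $graph_script completed"
}
```

The non-zero return code lets a scheduler (Autosys, ESP, etc.) detect the failure and alert support.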

Developed, tested, and reviewed complex Ab Initio graphs, sub-graphs, DMLs, psets, XFRs, deployed scripts, and DBC files for connectivity.

STARS

STARS, the Strategic Tracking and Analytical Reporting System, is a reporting system that tracks pharmacy events to determine at what point in the workflow an event occurred. The STARS application makes it possible to identify negative performance patterns and reveals opportunities for training.

Responsibilities:

Participated in ETL development process using Ab Initio.

Performed analysis on the extracted data which is transformed and loaded into the Data Warehouse, to achieve quality and consistency.

Developed graphs to process large sets of input data.

Used Ab Initio partition components to achieve parallel data transformations and filtering.

Responsible for the performance tuning of the ETL graphs.

Prepared and implemented data verification and testing methods for the data warehouse.

Implemented transformation rules using Ab Initio components.

Created graphs for performing aggregation and joins on the records.

Improved the existing Ab Initio ETL process by identifying bottlenecks in the graphs and modifying them to make better use of resources by utilizing the capabilities of the tool.

Worked in close coordination with business analysts and reporting groups in designing aggregate business rules.

Used Ab Initio components like Reformat, Input file, Output file, Join, Sort, Partition By key, Normalize, Input Table, Output Table, Update Table and Gather Logs for developing graphs.

Developed Shell Scripts for Batch Processing, start and end scripts for invoking the Ab Initio graphs.
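
Start scripts for batch runs typically also guard against two invocations of the same flow overlapping; a minimal sketch using an atomic lock directory (the lock path is a placeholder):

```shell
#!/bin/sh
# Illustrative start/end guard for a batch run: mkdir is atomic, so a lock
# directory prevents two concurrent invocations of the same graph flow.
LOCK_DIR="${LOCK_DIR:-/tmp/batch_run.lock}"   # placeholder path

start_batch() {
  if mkdir "$LOCK_DIR" 2>/dev/null; then
    echo "batch started"
  else
    echo "another run is already in progress" >&2
    return 1
  fi
}

end_batch() {
  rmdir "$LOCK_DIR"   # release the lock when the run finishes
}
```

The end script releases the lock so the next scheduled run can proceed.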

Developed an ETL process to update the ODS database with the new and changed records as a result of Transactions at the end of each month.

Prepared Unit and Integration testing plans. Involved in Unit and Integration testing using the testing plans. Involved in UAT with user groups.

Environment: Ab Initio, ESP job scheduler, Oracle 11g, SQL Server 2012, SSIS, Sun Solaris

Fannie Mae, Washington DC Duration: Nov 11 – Mar 12

Hexaware Technologies

Sr. Ab Initio Developer

SFLA RDW Simplification

The Single Family Loan Acquisition (SFLA) Relational Data Warehouse, also known as the Housing and Community Development RDW, is a relational database containing SF loan, property, and CE bond data from across the mortgage loan and bond lifecycle. The data is used for Single Family analytics and reporting. This project is based on reverse-engineered requirements and existing code; new solution specifications were defined, including value additions, improvements, code optimization, and recommendations over the existing code.

Responsibilities:

Gathered business requirements, interacted with the business team on SFLA processes and created High Level Designs for the Ab Initio Graphs as per the use cases given by the business team.

Interacted intensively with the system analysts of the SFLA team in order to understand the requirements and to seek clarifications on the use case(s).

Converted High Level Designs into Ab Initio Graphs. Revised Ab Initio graphs as per the policy changes for the restatement.

Created complex graphs using various Ab Initio design components – joins, filters, reformat, denormalize, normalize, rollup, etc. – in a heavily partitioned environment.

Fine-tuned the Ab Initio graphs for the performance. Worked on phasing and partitioning of the graph.

Created Ab Initio graphs as per the FAST (Fannie Mae Ab Initio Support) team standards for the Ab Initio design and coding and got it reviewed from the FAST team.

Unit tested the graphs extensively by extracting data from the tables and comparing the results to the unit test plan. After successful testing the graphs were checked into the specific EME repositories for the string testing, negative testing, User Acceptance Testing (UAT), Integrated processing (IP) and production.

Mentored the new team members on the existing procedures and helped them in understanding the Subject Matter.

Participated in the Autosys Scheduling of all the SFLA processes and provided the graph and file details for each process.

Worked with the system analyst on automated compare of the expected and the actual results generated from the Ab Initio graphs

Worked on streamlining various SFLA interfaces that were required for processing.

Wrote complex SQL queries and SQL programs.

Wrote UNIX shell scripts and DOS batch programs for data management purposes.

Environment: Ab Initio GDE 1.15, Co>Operating System 2.13, EME, Autosys job scheduler, Erwin Data Modeling tool, Oracle 11g, Sun Solaris UNIX, MS Project

The Home Depot, Atlanta, GA Duration: Jul 10 – Nov 11

Sr. Ab Initio Developer

CRTV

For CRTV, Genco (a 3PL) will operate a Reverse Logistics Center (RLC) to process store RTVs, markdowns, donations, and salvage items. Stores will transfer product to the RLC, where it will be received, processed, and then dispositioned. While the RLC will use internal systems to manage the facility, THD will still own the inventory and will need visibility into it for nightly financial reconciliation.

Responsibilities:

Performed analysis, design, and preparation of functional and technical design documents and code specifications.

Developed and supported the extraction, transformation, and load (ETL) process for a Data Warehouse from its OLTP systems using Ab Initio, and provided technical support and hands-on mentoring in the use of Ab Initio.

Responsible for all pre-ETL tasks upon which the Data Warehouse depends, including managing and collecting various existing data sources.

Involved in developing UNIX Korn Shell wrappers to run various Ab Initio Scripts.

Developed Ab Initio XFR’s to derive new fields and solve various business requirements.

Developed a number of Ab Initio graphs based on business requirements using various components such as Partition by Key, Partition by Round Robin, Reformat, Rollup, Join, Scan, Normalize, Gather, Broadcast, Merge, etc.

Worked on improving the performance of Ab Initio graphs using various performance techniques, such as using lookups instead of joins.

Implemented lookups, lookup local, in-memory joins, and rollups to speed up various Ab Initio graphs.

Prepared design documentation for the developed graphs.

Environment: Ab Initio GDE 1.15, Co>Operating System 2.15, DB2, Teradata V2R14, Maestro, AIX

SunTrust Bank, Atlanta, GA Duration: Oct 09 – Jun 10

Sr. Ab Initio Developer

Sunmart Redesign

SunMart currently extracts data from four source systems and moves the data through DIME into EDM. The Ab Initio tool is used for extraction, transformation, and loading. The current process validates the data each time it is inserted into EDM; for this purpose, many load-ready files are generated for each element and loaded separately into the base tables. This is time consuming and does not meet the SLA. To reduce the run time and incorporate certain mapping changes, the process needed to be redesigned, and this document covers the logical design of the new process. It includes the removal of certain entities, skipping validation while the data is loaded into EDM, and creating a single file for each table so that the files are loaded into EDM in one go and processed only once.

Responsibilities:

Analyzed Business and Accounting requirements from the Accounting and Business Detail level Process design.

Involved in understanding the Requirements of the end Users/Business Analysts and Developed Strategies for ETL processes.

Responsible for the detailed design and documentation. Provided technical solutions for the Process requests raised by Data team to fix the issues in the existing system.

Designed, developed and Unit tested Ab Initio graphs using GDE for Extraction, Transformation and Loading of data from source to target.

Extracted data from Oracle legacy Data source tables and created various loan lookups, commitment type lookups, and security type lookups.

Created and modified various Loan, Property and Asset graphs based on the business rule enhancements.

Worked on improving performance of Ab Initio graphs using techniques such as lookups, in-memory joins, and rollups to speed up various Ab Initio graphs. Designed and developed parameterized generic graphs.

Worked closely with CM team to migrate the ETL code changes from development environment to System, Integration and Acceptance environments.

Used Rational ClearCase to migrate non-Ab Initio code by creating a baseline and checking in the scripts.

Extensively worked in the UNIX environment using Shell Scripts. Created test cases and performed unit testing for the Ab Initio graphs. Documented Unit testing. Logged and resolved defects in the roll out phase. Responsible for supporting the CM team and troubleshooting any production issues.

Created a Production support document and documented the Test case work book, High level Design and Detail Design documents.

Environment: Ab Initio GDE 1.15, Co>Operating System 2.15, Oracle 10g, DB2-UDB, CA 7, TOAD, Mainframes, UNIX Shell Scripting, Cognos, Sun OS Solaris 9.

KeyBank, OH Duration: Oct 08 – Oct 09

ETL Developer

Building Data Mart for their Accounts division

The bank's goal is to build the capability to better analyze its accounts. One of the bank's major objectives is to market more effectively by offering additional products to households that already have one or more accounts with the bank. Users want the ability to slice and dice individual accounts, as well as the residential household groupings to which they belong. We created the BDW and developed a reporting system, with report generation scheduled as per the users' requirements.

Responsibilities:

Developed Ab Initio graphs with complex transformation rules using the Graphical Development Environment (GDE) to read the data files and then reformat, filter, and roll up the data before loading it into the Oracle database.

Wrote UNIX scripts to automate the ETL flow, and to FTP files to other servers as they are generated.

Facilitated test data setup for other areas through temporary Ab Initio graphs, and created several BTEQ, FastLoad, and MultiLoad scripts to load backfill data to the Data Warehouse for performance testing.

Performed bulk data loads from multiple data sources (Oracle 8i, legacy systems) to the Teradata RDBMS using BTEQ, FastLoad, MultiLoad, and TPump.

Used the BTEQ and SQL Assistant (Queryman) front-end tools to issue SQL commands matching the business requirements to the Teradata RDBMS.

Modified BTEQ scripts to load data from Teradata Staging area to Teradata data mart and also wrote complex queries to load summary tables based on the core tables in DW.

Used the Ab Initio Data Profiler for data cleansing and analysis, and to analyze test output.

Created UNIX Shell scripts (wrapper scripts) for scheduling and also to FTP multiple target files as per user required naming conventions.
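
The naming conventions applied before shipping target files can be sketched as a small helper; the `<name>_YYYYMMDD.dat` convention here is an assumption for illustration, not the actual client standard:

```shell
#!/bin/sh
# stamp_name: illustrative helper applying a date-stamped naming convention
# (<name>_YYYYMMDD.dat) to a generated target file before it is FTPed.
stamp_name() {
  src="$1"
  run_date="${2:-$(date +%Y%m%d)}"   # default to today's date stamp
  base="${src%.*}"                   # strip the original extension
  echo "${base}_${run_date}.dat"
}
```

A wrapper would call this for each target file, e.g. `mv "$f" "$(stamp_name "$f")"`, before handing the files to the FTP step.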

Performed reviews and code walkthroughs for development done offshore. Developed shell scripts to perform automated unit testing.

Supported various business units during and after production rollout with ad-hoc queries to identify missing/incorrect data and resolved issues by fixing them. Also wrote complex queries and BTEQ scripts for users to run their reports.

Environment: Ab Initio Co>Op 2.15, GDE 1.15, Teradata V2R12/6, Oracle 10g, TOAD, SQL, PL/SQL, IBM Mainframes, UNIX, DB2, UC4, ClearCase, ClearQuest, Business Objects, SQL Assistant, AIX

Aventis, NJ Duration: Oct 06 – Oct 08

Birlasoft, Hyderabad, India

ETL Developer

Aventis is a Pharmaceutical company, which provides new and improved biotech drugs for various diseases and their symptoms. The objective of the project is to extract data stored in different databases and load into oracle system which is the staging area and the business logic is applied to transform the tables in the required way. The data warehouse is fed by marketing data, sample data, market (competitor) data, prescription data and others.

Responsibilities:

Extensively used ETL to load data from flat files, XML, and Oracle into Oracle 8i.

Involved in designing the data model for the Data Warehouse.

Involved in Requirement Gathering and Business Analysis

Developed data Mappings between source systems and warehouse components using Mapping Designer

Worked extensively on different types of transformations like source qualifier, expression, filter, aggregator, rank, update strategy, lookup, stored procedure, sequence generator, joiner, XML.

Setup folders, groups, users, and permissions and performed Repository administration using Repository Manager.

Involved in the performance tuning of the Informatica mappings, stored procedures, and the SQL queries inside the Source Qualifier.

Created, launched & scheduled sessions.

Involved in the performance tuning of the database and Informatica. Improved performance by identifying and rectifying performance bottlenecks.

Used Server Manager to schedule sessions and batches.

Involved in creating Business Objects Universe and appropriate reports

Wrote PL/SQL Packages and Stored procedures to implement business rules and validations.

Environment: Informatica 7.1.3, Oracle 10g, UNIX, Windows NT 4.0, UNIX Shell Programming, PL/SQL, TOAD (Quest Software)


