
Data Engineer

Location:
Laredo, TX, 78045
Posted:
July 11, 2016


Professional Summary:

Over eight (*+) years of total IT experience, with extensive Data Warehousing implementations across verticals such as Health Insurance (CIGNA), Brokerage Solutions (Schwab, JPMorgan Securities, FanMail), Mutual Funds (DST), Financial (USAA, BOA, HSBC) & Retail (GSK & Allergan).

Extensively worked on IBM DataStage versions 11.3/9.1/8.5/8.1/7.5.2, including one year of experience with IBM BigInsights 2.1. Performed various roles across the software development life cycle: Developer, Administrator, ETL Test Lead & Big Data Engineer.

Educational Qualification:

Master's in Software Engineering from JNTU Hyderabad.

Technical Skills:

ETL Tools : DataStage, Informatica, MDM, SSIS.

IBM Tools : DataStage Designer, Information Services Director, Information Analyzer, FastTrack, DataStage Administrator, DataStage Data Quality.

Databases : DB2, Oracle, SQL Server 2008/2012, Teradata, Netezza & other RDBMS

Scheduling Tools : ESP, Control-M, Autosys

Operating Systems : AIX, Linux, Windows XP/NT/7

Programming Languages : C, UNIX shell scripting, SQL, PL/SQL

Utilities : Remedy, Jira, BMRS, HP Service Manager, Jenkins, Toad, SSH Client, Teradata SQL Assistant, DbVisualizer, DB2 Control Center.

Version Control : TortoiseSVN 1.8.7

Big Data Utilities : Hive, Pig, Big Data Architecture

Responsibilities:

Extensive experience with IBM InfoSphere Information Server and its components (DataStage Designer, DataStage Director, DataStage Administrator) from versions 7.5.2 to 11.3.

Expertise in cleansing data using QualityStage in DataStage 8.7/9.1.

Highly skilled in handling big datasets on the order of terabytes (big data project).

Expert in designing parallel jobs using stages such as Join, Merge, Lookup, Remove Duplicates, Filter, Data Set, Lookup File Set, Complex Flat File, Modify, Aggregator and Transformer.

Expertise in Information Services Director, used for exposing DataStage jobs as services.

Expertise in Migration projects from 8.x to 9.x or 11.3.

Experienced in integration of various data sources (DB2-UDB, SQL Server, Teradata & Oracle)

Hands-on experience using Hive Query Language (HQL) for extracting data.
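A representative extract of this kind, sketched in HQL (the database, table and column names here are hypothetical, not taken from any project below):

```sql
-- Hypothetical HQL extract: pull one day's transactions, joined with
-- customer attributes, into a dated staging partition.
-- All object names are illustrative only.
INSERT OVERWRITE TABLE stg.daily_txn PARTITION (load_dt = '2016-07-11')
SELECT t.acct_id,
       t.txn_ts,
       t.txn_amt,
       c.cust_segment
FROM   raw.transactions t
JOIN   raw.customers    c ON c.acct_id = t.acct_id
WHERE  to_date(t.txn_ts) = '2016-07-11';
```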

Worked on IBM InfoSphere BigInsights 2.1.1

Good knowledge of big data supporting tools such as Pig, Hive, Ambari & HBase.

Worked on other ETL tools like Informatica & SSIS.

Skilled in handling tickets raised in the SIT & production environments.

Good at UNIX scripting for business requirements.

Worked on Teradata utilities (FastLoad, MultiLoad, TPump).
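A minimal FastLoad control script for a load of this kind (logon details, file path and table names are placeholders, not actual project objects):

```
/* Hypothetical FastLoad script; note FastLoad requires an empty target table. */
LOGON tdpid/etl_user,password;
DATABASE stage;
BEGIN LOADING stage.daily_txn
      ERRORFILES stage.txn_err1, stage.txn_err2;
SET RECORD VARTEXT "|";
DEFINE acct_id (VARCHAR(18)),
       txn_amt (VARCHAR(18)),
       txn_ts  (VARCHAR(26))
FILE = /data/in/daily_txn.dat;
INSERT INTO stage.daily_txn (acct_id, txn_amt, txn_ts)
VALUES (:acct_id, :txn_amt, :txn_ts);
END LOADING;
LOGOFF;
```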

Highly skilled with the workload management tool Control-M.

Worked extensively on Data modeling concepts and their implementation in Dimensional modeling using ETL Processes for Data warehouses.

Experience in troubleshooting of jobs and addressing production issues like data issues, Environment issues, performance tuning and enhancements.

Good experience in Data Warehouse Development, Enhancement, Migration, Maintenance and Production support projects.

USAA – Plano, Texas. Feb 2016 – till date

ETL/Consultant.

DEVCON (TRUST SERVICES):

USAA is a diversified financial services group of companies offering banking, investing & insurance to people and families of the US military. Trust Services is a part of Wealth Management that holds, manages and distributes assets according to the wishes of trust members.

Currently, the transaction summary reports of trust members are handled by FIS, the world's largest global provider. The purpose of the project is to build the reports internally. As part of the process, we extracted the customer information files and loaded them into an operational DB2 database in order to create a BO universe.

Responsibilities:

Extracted the data from mainframe files.

Applied complex business logic and created the operational database in DB2.

Scheduled the jobs using the workload management tool Control-M.

Utilized performance-tuning techniques for optimal execution of the jobs.

Version management and migration of the jobs are handled by IBM RTC.

Environment: IBM InfoSphere DataStage 9.1, Control-M 8.0, Mainframes, DB2 & UNIX.

Allergan – Irvine, California. Sep 2015 – Feb 2016

ETL/Consultant.

SMDM (Sales and Marketing Data Mart):

Sales and Marketing Data Mart (SMDM) helps management to analyze physician profile, sales allocation, account profile, calls details and RX data. SMDM sources data from CM, ASPIRE, AMS, and SAP/BW applications. CM provides accounts and physician data, BW provides COPA direct sales and SAP accents data, AMS provides allocation data, and ASPIRE provides call data to the SMDM.

Responsibilities:

Implemented new change requests in the existing SMDM module.

Loaded the data into SFDC and SAP.

Documented the changes per CR numbers and uploaded them into the repository.

Implemented performance tuning across the entire module and migrated it to higher environments.

Maintained version control for the modules.

Environment: IBM InfoSphere DataStage 9.1, ESP, TortoiseSVN, SQL Server, DB2, Oracle & UNIX.

Cigna Health Care – Bloomfield, Connecticut. Feb 2015 – Aug 2015

ETL/DataStage Consultant.

Compliance Screening:

The Compliance Screening and Process Improvement project is to analyze and document the current state of all enterprise-wide compliance screening processes, determine inconsistencies and gaps in that current state, and implement a consistent, scalable and standardized process for sanctions screening across the entire Cigna enterprise. Moreover, as part of the center of excellence team we render services to multiple projects, working on the current assignment while supporting other projects as and when required.

Responsibilities:

Understood, analyzed and designed the code per line of business, based on functional requirements.

Extracted data from different sources as defined in the Business Requirement Document.

Implemented complex business logic on the extracted data as per the mapping document.

Worked on Data Quality stages for filtering customer data throughout CIGNA.

Maintained version control throughout the project using SVN and Jenkins.

Worked on highly sensitive data and created data marts as and when required by the team.

As part of the center of excellence team, provided support for ongoing projects.

Migrated DataStage projects from 7.5.2 to 9.1.

Environment: IBM InfoSphere DataStage 9.1, ESP, TortoiseSVN, Jenkins, SQL Server, DB2, Oracle & UNIX.

DST Systems – Kansas City, Missouri. July 2013 – Jan 2015

Senior ETL Consultant.

Enterprise Services - B.I. Services:

The BI Services team is part of the Enterprise Services Application Development division. The goal of the Enterprise Reporting initiative is to design and implement an enterprise-wide mechanism for generating and distributing meaningful reports, gathering multiple sources of data into consistent formats from common, consistent, stable sources, with the ultimate goal of providing an easily accessible, secure and common repository for reporting.

Responsibilities:

Extraction and loading of data in a big data environment.

Used Hive for validating data in the big data environment.

Extracted data from different sources (Oracle, SQL Server, Teradata, MySQL, COBOL files) as and when required and made it available for easy reporting.

Successfully migrated/recoded all SSIS packages into equivalent DataStage jobs.

Worked on the Data Quality stage as part of a POC for the client McDonald's.

Used Information Services Director to export a few modules of DS jobs to SOA.

Involved in performance tuning and optimization of DataStage mappings, using features like pipeline and partition parallelism and data/index caches to manage very large volumes of data.

Loaded data into staging and moved it from staging to the ODS.

Involved in managing different versions of source code through TortoiseSVN.

Involved in the creation and maintenance of MDM.

Processed large volumes of data using MDM.

Administrative activities:

Created DataStage Development, Test & Prod environments.

Installed required patches as and when needed.

Indexed servers in order to access them from DataStage.

Created separate configuration files for each environment, depending on requirements.

Created users as and when required.

Imported the dsx files/packages from Development to Test/UAT/PROD environments.

Environment: IBM InfoSphere DataStage 9.1, WebFOCUS 8.1, IBM Big Data, Hive, SSIS 2005, SQL Server 2005, Oracle 10.0, Teradata 13.10, MySQL 8.12, DB2 9.5, Windows 7, UNIX

DataStage 11.3: hands-on experience with the IBM InfoSphere Information Server 11.3 beta version. We developed a few jobs on DataStage 11.3 and performance-tested both versions, 9.1 and 11.3.

DST Systems – Kansas City, Missouri. Feb 2012 – July 2013

Senior Software Engineer

BARS (Books and Records System):

The Books & Records (BARS) database provides aggregation of financial account information from a variety of custodians, including clearing firms, broker-dealers and DST. The aggregation includes security positions, transaction history and non-financial investor information. This high-quality data is used by a variety of applications in the routine operations of a broker-dealer and financial advisor.

Responsibilities:

Involved in the implementation of this application, which involved the extraction, transformation and loading of data into a DB2 database for various dealers.

Worked extensively on data profiling and data encryption.

Repartitioned job flows based on the best available DataStage PX resource consumption.

Implemented multi-node configuration using APT_CONFIG_FILE for performance enhancement.
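For reference, a two-node parallel configuration file of the kind pointed to by APT_CONFIG_FILE (the hostname and resource paths are illustrative, not actual project values):

```
{
    node "node1"
    {
        fastname "etlhost"
        pools ""
        resource disk "/ds/data/node1" {pools ""}
        resource scratchdisk "/ds/scratch/node1" {pools ""}
    }
    node "node2"
    {
        fastname "etlhost"
        pools ""
        resource disk "/ds/data/node2" {pools ""}
        resource scratchdisk "/ds/scratch/node2" {pools ""}
    }
}
```

Adding nodes in this file raises the degree of parallelism without redesigning the jobs, which is why it is a common performance lever.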

Used DataStage Director to schedule and run the jobs, test and debug their components, and monitor performance statistics through the Operations Console.

Involved in the design of dimensional data models: star schema and snowflake schema.

Used Unix shell scripting for file manipulation.
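A typical file-manipulation step of this kind can be sketched as a small shell function (the feed name, layout and paths are hypothetical, not taken from the project):

```shell
#!/bin/sh
# Sketch of a feed-file preparation step: strip the header record,
# drop blank lines, and archive the raw file with a date stamp.
# All file names and paths are examples only.

prepare_feed() {
    infile=$1
    outfile=$2
    archive_dir=$3

    # Remove the header line and blank lines before loading.
    tail -n +2 "$infile" | grep -v '^[[:space:]]*$' > "$outfile"

    # Keep a date-stamped copy of the raw file for reruns.
    mkdir -p "$archive_dir"
    cp "$infile" "$archive_dir/$(basename "$infile").$(date +%Y%m%d)"
}

# Example usage:
# prepare_feed /data/in/txn.dat /data/stage/txn.dat /data/archive
```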

Extensively worked on the version management of the jobs in the project.

Performed unit testing and system integration testing in dev and UAT environments

Environment: IBM InfoSphere DataStage 8.5, WebFOCUS 8.0, SSIS 2005, IBM Big Data, SQL Server 2005, Oracle 10.0, Teradata 13.10, MySQL 8.12, DB2 9.5, Windows 7, UNIX

Bank of America – Charlotte, North Carolina. July 2011 – Feb 2012

ETL Consultant

FSR – Profitability:

FSR Profitability is a module based on the old Clipper environment, broadly classified into cards, loans, investments and international banking transactions. Depending on the category, it extracts data from sources (Teradata, DB2, Oracle, flat files), applies business logic as defined in the SLD, and loads the data into three layers: Financial, Result Data and Volumes. Once the data in these three layers meets requirements, it is pushed into SAP Bank Analyzer for reporting purposes.

Responsibilities:

Designed DataStage ETL jobs for extracting and cleansing data.

Applied business logic in layers (Financial Transactions, Result Data Layer, Volumes) and finally loaded the data into SAP Bank Analyzer (DataStage plug-in stage).

Fixed the issues raised in the SIT environment.

Created error tables containing data with discrepancies, to analyze and re-process the data.

Used the Business Application Programming Interface (BAPI) stage for loading into SAP.

Worked on troubleshooting, performance tuning and enhancement of DataStage jobs.

Implemented multi-node configuration using configuration files for performance enhancement.

Worked with DataStage Manager to import/export metadata from databases, jobs and routines.

Environment: IBM InfoSphere DataStage 8.0, SAP BAPI, Oracle 9, Teradata 10, UNIX

HSBC (HSBC Global Software Delivery). Dec 2010 – June 2011

Senior Software Engineer

RMPR - Relationship Manager Portfolio Reporting:

The project RMPR (Relationship Manager Portfolio Reporting) is a sub-module of Customer Product Transaction. The vision is to capture all information about customers based on the level at which they operate, and to assign a Relationship Manager who is responsible for all transactions of that particular customer.

Responsibilities:

Involved in designing DataStage jobs using DataStage 8.1, supporting all phases of the ETL process.

Analyzed the existing legacy data source and extracted data from it using DataStage.

Created jobs to move data from source to staging and from staging to target as per the given logic.

Used the XML metadata importer to import XML table definitions from XSD documents.

Performed unit testing, impact analysis and other support activities.

Executed and validated UNIX scripts in the Development and Test environments.

Environment: IBM InfoSphere DataStage 8.0, Cognos, DB2, Oracle, HP Quality Center, Autosys, Windows XP, UNIX

GlaxoSmithKline – Mahindra Satyam. Mar 2010 – Dec 2010

Sr. Software Engineer

InSight:

InSight delivers Brand & Customer Account Profitability for all European markets at a highly detailed level. Profit & Loss statements are calculated daily for On-Invoice Discounts (OID) and Net Invoice Price (NIP); Below-Invoice Discounts (BID) and Cost of Goods Sold (CoGS) are calculated monthly. InSight collects & aggregates data from various sources, including Cognos Finance and the JDE General Ledger.

Responsibilities:

Responsible for daily verification that all scripts, downloads, and files were executed as planned.

Provided round-the-clock production support.

Handled the whole project by running the jobs and validating them for successful completion.

Responsible for triggering jobs individually if they were not triggered by the Perl script.

Provided status reports by validating the reports taken from the Cognos portal.

Handled special runs, daily loads and monthly loads for the project.

Monitored the project daily and performed the routine tasks required for it.

Modified and ran the cubes and validated the reports generated from them.

Migrated the project from 7.5.2 to 8.0.

Environment: DataStage 7.5.2, Cognos Finance 7.5, Oracle, Perl scripts, Windows XP, UNIX

GlaxoSmithKline – Satyam. Feb 2008 – Mar 2010

Software Engineer

GRS - Global Reporting System:

As part of the merger of the Beecham Group and SmithKline Beckman Corporation, the combined company wanted to establish a Global Reporting System by merging all of its modules into a single reporting system. GRS collects information from different sources and uses it to build separate data marts. This project mainly focuses on migrating transactional data into a single system.

Responsibilities:

Tested and executed the DataStage jobs and validated their functionality.

Prepared unit test cases and documented them.

Tested the jobs in different environments and captured the results.

Responsible for running the jobs and ensuring completion of data loads.

Provided round-the-clock production support.

Created jobs for assigning unique identifiers using database sequences.
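A minimal sketch of that pattern in DB2 SQL (the sequence, schema and column names are invented for illustration):

```sql
-- Hypothetical sequence handing out surrogate keys during a dimension load
CREATE SEQUENCE grs_cust_sk_seq
    START WITH 1
    INCREMENT BY 1
    NO CYCLE
    CACHE 100;

-- Each staged row receives the next key value at load time
INSERT INTO mart.customer_dim (cust_sk, cust_id)
SELECT NEXT VALUE FOR grs_cust_sk_seq, s.cust_id
FROM   stage.customer s;
```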

Created project documentation, which describes the process and the data flow.

Involved in the Production Test run of the jobs.

Environment: DataStage 7.5.1/7.5.2, Oracle, Windows XP, UNIX


