Chaithanya
Email: ******@*******************.***
516-***-*****107
Professional Summary:
Highly motivated, solution-driven professional with 10 years of Data
Warehousing experience in the areas of ETL design and development.
Involved in the complete software development life cycle of various
projects, including requirements gathering, system design, data modeling,
ETL design, development, production enhancements, DataStage
administration, support and maintenance using IBM InfoSphere Information
Server 11.3, 9.1, 8.5, 8.1, 7.5, 7.1 & 6.5.
. Ample experience in Retail, Financial, Brokerage and Industrial
   business processing domains.
. Extensive experience with IBM InfoSphere Information Server and its
   components like DataStage Designer, DataStage Director, DataStage
   Administrator & QualityStage.
. Highly skilled in handling large datasets on the order of terabytes
   (Big Data project).
. Worked extensively on Data modeling concepts and their implementation
in Dimensional modeling using ETL Processes for Data warehouses.
. Good experience in Data Warehouse Development, Enhancement, Migration,
Maintenance and Production support projects.
. Familiar with highly scalable parallel processing techniques using
   parallel jobs on multi-node configuration files.
. Experience in troubleshooting of jobs and addressing production issues
like data issues, Environment issues, performance tuning and
enhancements.
. Well versed in setting up DataStage Development, Test & Prod
   environments along with installation of patches and drivers.
. Expert in designing Parallel jobs using various stages like Join,
Merge, Lookup, Remove duplicates, Filter, Dataset, Lookup file set,
Complex flat file, Modify, Aggregator, transformer etc.
. Expert in designing Server jobs using various types of stages like
   Sequential File, ODBC, Hashed File, Aggregator, Transformer, Sort,
   Link Partitioner and Link Collector.
. Extensively utilized processing stages like Transformer, Aggregator,
Lookup, Join, Sort, Copy, Merge, Funnel, CDC, SCD & Quality stages.
. Expert in working with DataStage Manager, Designer and Director.
. Used Enterprise stages in Databases like Oracle, DB2, Teradata, &
Netezza for optimal performance while loading or updating the data.
. Experienced in the integration of various data sources (DB2-UDB, SQL
   Server, Oracle PL/SQL and MS Access) into the data staging area.
. Hands-on experience using Hive Query Language (HQL) for extracting
   data from Big Data platforms.
. Good knowledge of Big Data and its architecture.
. Good knowledge of Pig (programming tool).
. Worked on other ETL Tools like Informatica & SSIS.
. Good knowledge of and hands-on experience with reporting tools like
   WebFocus 8 & Cognos 8.
. Skilled in handling the tickets raised in the SIT & Production
   environments.
. Good at UNIX scripting for business requirements.
. Very good experience in SQL Querying using Oracle, DB2 & SQL Server.
. Excellent written and verbal communication skills in corporate
environment and the ability to perform tasks independently as well as
in team environments.
. Experience working in an onsite-offshore model, effectively
   coordinating tasks between onsite and offshore teams as a technical
   and onsite lead.
. Expertise in Big Data environments, extracting data from different
   sources for building a global DW.
. Exposure to scheduling tools like Control-M and Autosys.
. Experience in working on Waterfall/Agile Scrum project methodologies.
Technical Skills:
ETL Tools : IBM Information Server 11.3, 9.1, 8.5, 8.1 (DataStage,
Quality Stage, Information Analyzer), Ascential
DataStage V7.5 (Designer, Director, Manager,
Parallel Extender), SSIS 2005, Informatica 8.6
Database : Vertica, DB2, Oracle, SQL Server 2008/2012, Teradata, Netezza
Testing Tools : HP Quality Center, QTP, Selenium
Scheduling Tools : Autosys, Maestro
Operating System : AIX, Linux, Windows XP/NT/7
Programming Languages : C, UNIX shell scripting, SQL, PL/SQL
Utilities : Remedy, Jira, Quality Center, Toad, SSH
                           Client, Teradata SQL Assistant,
                           DbVisualizer, DB2 Control Center
Job Schedulers : Control-M, Autosys
Version Controllers : TortoiseSVN (Subversion) 1.8.7
Big Data : Hive, Pig, Big Data Architecture
Educational Qualification:
Master of Technology (Software Engineering) from Jawaharlal Nehru
Technological University, Hyderabad, India
Professional Trainings:
. Processing Big Data with Hadoop.
. WebFocus 8.0
. Autosys
. Data Analytics (Modeling)
. Master Data Management.
Professional Experience
DST Systems - Kansas City, Missouri                     July 2013 - Present
ETL/DataStage Senior Designer/ Developer/ Technical Lead
Enterprise Services - B.I. Services:
The BI Services team is part of the Enterprise Services Application
Development division. The goal of the Enterprise Reporting initiative is
to design and implement an enterprise-wide mechanism for generating and
distributing meaningful reports, gathering data from multiple common,
consistent, stable sources into consistent formats, with the ultimate
goal of providing an easily accessible, secure and common repository for
reporting. BI Services uses DataStage for ETL, WebFocus for reporting,
and DB2 for storing information. We are responsible for getting the right
information to the right people at the right time for greater efficiency
and better decision-making. We support Business Unit Application
Services, Data Management & Web Development.
Responsibilities:
. Extraction and loading of data in Big data environment.
. Extracted data from different data sources like Oracle, SQL Server,
Teradata, MySQL, Cobol Files as and when required and made it
available for easy reporting purpose.
. Successfully converted all the SSIS packages into equivalent DataStage
   jobs and tuned them for performance.
. Used Hive for validating data in Big Data Environment.
. Developed simple web focus reports as and when required.
. Created complex synonyms for building the reports and validated them
with the data from reports.
. Installed and configured required patches for Data Stage, Web Focus
and Databases.
. Extracted Big Data through Hive.
. Worked with Metadata Definitions, Import and Export of DataStage jobs
using DataStage Manager.
. Used DataStage Director and its run-time engine to schedule running
the solution, testing and debugging its components, and monitoring the
resulting executable versions
. Involved in performance tuning and optimization of DataStage mappings
using features like Pipeline and Partition Parallelism and data/index
cache to manage very large volume of data
. Created schema files and parameterized the job to run as multiple
   instances, so that a single job can work dynamically as per the
   requirements (see the schema file sketch after this list).
. Performed performance tuning in order to apply the parallelism concept
   throughout the module.
. Developed Jobs to load the data into the Warehouse environment on Bulk
and incremental loads
. Loading the data into the Staging and moving the data from Staging to
ODS.
. Involved in daily meetings with the client for requirements and
provided Scrum updates.
. Used Hive Query Language (HQL) for extracting data from Big Data
   sources (see the HiveQL example after this list).
. Involved in Business Requirement gathering sessions.
. Involved in critical multiple-instance DataStage jobs that send
   outbound files for different LOBs (lines of business) at the same
   time, and monitored the jobs accordingly.
. Involved in managing different versions of source code through
   Tortoise Subversion.
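A minimal sketch of the kind of DataStage schema file used with the
parameterized multiple-instance jobs above; the record layout and column
names are hypothetical, not the actual project metadata.

    // Hypothetical schema file supplied at run time through a job
    // parameter, so one generic job design can handle several layouts.
    record
    (
      CUSTOMER_ID:   int32;
      CUSTOMER_NAME: string[max=50];
      OPEN_DT:       date;
      BALANCE:       decimal[12,2];
    )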
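An illustrative HiveQL query of the type used to extract and validate
data in the Big Data environment; the table and column names (txn_detail,
load_dt) are hypothetical.

    -- Hypothetical extract of one day's transactions for downstream
    -- loading; names are illustrative only.
    SELECT account_id,
           txn_type,
           SUM(txn_amount) AS total_amount,
           COUNT(*)        AS txn_count
    FROM   txn_detail
    WHERE  load_dt = '2015-06-30'
    GROUP BY account_id, txn_type;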
Administrative activities:
. Created Data Stage Development, Test & Prod Environments.
. Installed patches as and when required.
. Indexed servers so that they could be accessed from DataStage.
. Created separate configuration files for each environment depending on
   the requirement.
. Created users as and when required.
Environment: DataStage 9.1, WebFocus 8.1, IBM Big Data, Hive, SSIS 2005,
SQL Server 2005, Oracle 10.0, Teradata 13.10, MySQL 8.12, DB2 9.5,
Windows 7, UNIX
DST Systems - Kansas City, Missouri                  Feb 2012 - July 2013
Senior Software Engineer
BARS (Books and Records Keeping System):
The Books & Records (BARS) database will provide aggregation of financial
account information from a variety of custodians including clearing
firms, broker-dealers, and DST. The aggregation will include security
positions, transaction history and non-financial investor information.
Broker-dealers who use applications and services from Brokerage Solutions
will authorize BARS to receive the data feeds from selected custodians.
BARS will then normalize the data and match the appropriate accounts to
individual investors, financial advisors, and branches within the
broker-dealer. The high-quality data can then be used by a variety of
applications in the routine operations of a broker-dealer and financial
advisor. Moreover, Books & Records refers to a regulatory requirement
that a broker-dealer maintain accurate and detailed records of securities
transactions, positions and account information.
Responsibilities:
. Acting Team Lead for the Team.
. Analyzed, designed, developed, implemented and maintained Parallel
   jobs using IBM InfoSphere DataStage.
. Involved in design of dimensional data model - Star schema and Snow
Flake Schema
. Worked extensively on Data Profiling and Data Encryption.
. Implemented multi-node declaration using configuration files
   (APT_CONFIG_FILE) for performance enhancement (see the configuration
   file sketch after this list).
. Used the DataStage Director to schedule and run the jobs, test and
   debug their components, and monitor performance statistics.
. Active participation in decision making, QA meetings and regularly
interacted with the Business Analysts and SMEs to gain a better
understanding of the Business Process, Requirements & Design.
. Used TOAD for writing SQL routines and functions
. Used UNIX shell scripting for file manipulation (see the shell sketch
   after this list).
. Configured ODBC drivers to connect to the Oracle database.
. Involved in the implementation of this application, which involved the
   extraction, transformation and loading of data into a DB2 database for
   various dealers.
. Responsible for daily verification that all scripts, downloads, and
file copies were executed as planned, troubleshooting any steps that
failed, and providing both immediate and long-term problem
resolution.
. Repartitioned job flows based on the best available resource
   consumption in DataStage PX.
. Extensively worked on the version management of the jobs in the
project.
. Performed unit testing and system integration testing in dev and UAT
environments
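A minimal sketch of a two-node configuration file of the kind declared
through APT_CONFIG_FILE; the host name and resource paths are
hypothetical.

    {
      node "node1"
      {
        fastname "etlhost"
        pools ""
        resource disk "/datastage/data/node1" {pools ""}
        resource scratchdisk "/datastage/scratch/node1" {pools ""}
      }
      node "node2"
      {
        fastname "etlhost"
        pools ""
        resource disk "/datastage/data/node2" {pools ""}
        resource scratchdisk "/datastage/scratch/node2" {pools ""}
      }
    }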
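A small shell sketch of the kind of file manipulation mentioned above;
the paths and file names are hypothetical.

    #!/bin/ksh
    # Hypothetical pre-load step: drop the header row, remove blank lines,
    # and archive the original feed before the DataStage job picks it up.
    FEED=/data/landing/bars_positions.dat
    STAGE=/data/staging/bars_positions.dat

    tail -n +2 "$FEED" | sed '/^$/d' > "$STAGE"
    mv "$FEED" /data/archive/bars_positions.$(date +%Y%m%d).dat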
Environment: DataStage 8.5, WebFocus 8.0, SSIS 2005, IBM Big Data, SQL
Server 2005, Oracle 10.0, Teradata 13.10, MySQL 8.12, DB2 9.5, Windows 7,
UNIX
Bank of America - Charlotte, North Carolina          July 2011 - Feb 2012
ETL Consultant
FSR - Profitability:
Bank of America is a multinational banking and financial services
corporation; the bank generates 90% of its revenues in its Global
Consumer and Small Business Banking operations. It has widely spread its
operations across Investment Management and international banking
transactions. FSR Profitability is a module based on the old Clipper
environment. It is broadly classified into cards, loans, investments and
international banking transactions; depending upon the category, it
extracts data from sources (Teradata, DB2, Oracle, flat files), applies
the business logic defined in the SLD and loads the data into three
layers: the Financial layer, the Result Data layer and Volumes. Once the
data in these three layers is as per requirements, it is pushed into SAP
Bank Analyzer for reporting purposes.
Responsibilities:
. Acting Team Lead for the Team.
. Designed DataStage ETL jobs for extracting data, cleansing data, and
   applying business logic in three layers (Financial Transactions,
   Result Data layer and Volumes), finally loading into SAP Bank Analyzer
   (DataStage plug-in stage).
. Fixed the issues that are raised in the SIT Environment.
. Imported and exported the jobs using DataStage Manager.
. Created snapshot for the transactional tables in distributed
databases.
. Analyzed existing legacy data source and extracted data from it using
Datastage
. Created Error Tables containing data with discrepancies to analyze and
re-process the data.
. Used information analyzer to perform primary key, foreign key and
cross domain analysis.
. Extensively used Business Application Programming Interface (BAPI)
stage in collaboration with Datastage
. Worked on troubleshooting, performance tuning and enhancement of
DataStage jobs.
. Created source to target mapping and job design documents from
staging area to Data Warehouse.
. Used DataStage Designer to create the table definitions for the CSV
and flat files, import the table definitions into the repository,
import and export the projects, release and package the jobs.
. Wrote several complex SQL queries to extensively test the ETL process
   (see the SQL example after this list).
. Implemented multi-node declaration using configuration files for
performance enhancement.
. Collaborated with the EDW team on high-level design documents for the
   extract, transform, validate and load ETL process, data dictionaries,
   metadata descriptions, file layouts and flow diagrams.
. Tuned transformations and jobs for Performance Enhancement.
. Executed Pre and Post session commands on Source and Target database
using Shell scripting.
. Utilized Parallelism through different partition methods to optimize
performance in a large database environment.
. Enhanced various complex jobs for performance tuning.
. Worked with DataStage Manager to import/export metadata from database,
jobs and routines between DataStage projects.
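An example of the style of SQL used to test the ETL process, reconciling
source and target row counts; the table names are hypothetical.

    -- Hypothetical reconciliation check: per-date row counts in the
    -- warehouse target should match the staging source.
    SELECT s.business_dt, s.src_cnt, t.tgt_cnt
    FROM   (SELECT business_dt, COUNT(*) AS src_cnt
            FROM   stg_financial_txn
            GROUP BY business_dt) s
    LEFT JOIN
           (SELECT business_dt, COUNT(*) AS tgt_cnt
            FROM   dw_financial_txn
            GROUP BY business_dt) t
    ON     s.business_dt = t.business_dt
    WHERE  t.tgt_cnt IS NULL OR s.src_cnt <> t.tgt_cnt;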
Environment: DataStage 8.0, SAP BAPI, SQL Server 2005, Oracle 9,
Teradata 10, UNIX
HSBC - Greater London                                Dec 2010 - June 2011
Senior Software Engineer
RMPR - Relationship Manager Portfolio Reporting:
HSBC is a global financial services company headquartered in London. The
bank is organized into four business groups: Commercial Banking, Global
Banking and Markets, Personal Financial Services and Private Banking. The
project RMPR (Relationship Manager Portfolio Reporting) is a sub-module
of Customer Product Transaction. The vision is to capture all the
information about customers based on the level at which they operate and
to assign a Relationship Manager who is responsible for all the
transactions of that particular customer. This helps provide 100%
customer satisfaction on day-to-day transactions. Senior management can
compare the performance of the Relationship Managers against their
targets and, at any instant of time, view a Manager's performance and the
level of service provided to a specific customer.
Responsibilities:
. Involved in Analysis, Requirements gathering, function/technical
specification, development, deploying and testing.
. Involved in designing DataStage jobs using DataStage 8.1, supporting
   all phases of the ETL process.
. Analyzed existing legacy data source and extracted data from it using
Datastage.
. Created jobs to move data from Source to Staging and staging to Target
as per given logic.
. Involved in the design, development and testing of the PL/SQL stored
   procedures, packages and triggers for the ETL processes (see the
   PL/SQL sketch after this list).
. Used the XML metadata importer to import XML table definitions from
   the XSD document and identify the repeating elements.
. Worked on interactive dashboards in the Cognos environment for
   business users to analyze the future of commodities in the present
   market and new trading opportunities.
. Created a job for assigning unique identifiers using the Surrogate Key
   Generator, Sort, Filter and Join stages.
. Performed Unit Testing and Impact analysis and other support
activities.
. Imported and exported the jobs using DataStage Manager.
. Fixed the issues that are raised in the SIT Environment.
. Used job parameters and Environment Variables.
. Executed and validated UNIX scripts in Development and Test
Environments.
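A short PL/SQL sketch in the style of the stored procedures referenced
above; the table, column and procedure names are hypothetical.

    -- Hypothetical procedure: moves one day's rows from staging into the
    -- target table and records the row count in an audit table.
    CREATE OR REPLACE PROCEDURE load_rm_portfolio (p_load_dt IN DATE) IS
      v_rows NUMBER;
    BEGIN
      INSERT INTO rm_portfolio_fact (rm_id, customer_id, balance, load_dt)
      SELECT rm_id, customer_id, balance, p_load_dt
      FROM   stg_rm_portfolio
      WHERE  load_dt = p_load_dt;

      v_rows := SQL%ROWCOUNT;

      INSERT INTO etl_audit (proc_name, load_dt, row_count, run_ts)
      VALUES ('LOAD_RM_PORTFOLIO', p_load_dt, v_rows, SYSDATE);

      COMMIT;
    END load_rm_portfolio;
    /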
Environment: DataStage 8.0, Cognos, DB2, Oracle, HP Quality Center,
Autosys, Windows XP, UNIX
GlaxoSmithKline - Brentford, England                April 2007 - Dec 2010
Software Engineer
InSight & Global Reporting System:
InSight delivers Brand & Customer Account Profitability for all European
markets at a highly detailed level. Profit & Loss statements are
calculated daily for On Invoice Discounts (OID) and Net Invoice Price
(NIP); Below Invoice Discounts (BID) and Cost of Goods Sold (CoGS) are
calculated monthly. InSight collects & aggregates data from various
sources: Cognos Finance and JDE General Ledger are the key sources for
Below Invoice Discounts (BID), and JDE SOP and SUN (CEE ERP) are the key
sources for TLP, NIP, OID, COGS and Volume.
Responsibilities:
. Responsible for daily verification that all scripts, downloads, and
file copies were executed as planned, troubleshooting any steps that
failed, and providing both immediate and long-term problem
resolution.
. Provided round-the-clock production support.
. Handled whole project by running the jobs and validating them for
successful completion of the reports
. Responsible for triggering the jobs individually if they were not
   triggered by the Perl script (see the dsjob example after this list).
. Providing status reports by validating the reports that are taken from
Cognos portal.
. Handled special runs and daily loads and monthly loads for the project
. Daily Monitoring the Project and performing the daily tasks that are
required for the project.
. Provided solutions for change requests and ensured that each change
   request was integrated with the live data.
. Implemented the Change Requests in the jobs for the specified
countries.
. Imported and exported the jobs using DataStage Manager.
. Primary resource for GRS, a monthly run in which we prepared the
   files, loaded them into the DWH and validated them.
. Modified and ran the cubes and validated the reports generated from
   them.
. Applied instant fixes in the report based on the data validation and
regenerated them.
. Migrated the project from 7.5x2 to 8.0.
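An illustrative command for triggering a job manually with the dsjob
client when the Perl scheduler did not fire; the project, job and
parameter names are hypothetical.

    # Hypothetical manual trigger: run the daily OID load for a given
    # date and wait for the final job status before reporting completion.
    dsjob -run -jobstatus -param RUN_DATE=2010-03-31 INSIGHT_PRJ jb_load_oid_daily
    echo "dsjob finished with status $?"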
Environment: DataStage 7.5.1, 7.5.2, 8.0, Cognos Finance 7.5, Oracle,
Perl scripts, Windows XP, UNIX
AXA - New York, NY                                   June 2005 - Apr 2007
Software Engineer
AXA-EQUITABLE LIFE INSURANCE COMPANY:
AXA Equitable Life Insurance Company is a leading financial protection
company and a premier provider of life insurance, annuity, and investment
products and services. The company offers many products and services to
consumers through financial professionals in its retail, wholesale and
corporate distribution channels. The project vision is to capture the
transactions of all customers based on their subject areas and load the
data into the corresponding dimension tables and facts, thus providing a
platform for rapid, consistent delivery of reports for insurance policies
and services. The main function of this module is to convert from one
stream to another so as to provide a view of a customer's daily
transactions, account summary, loans, fixed deposits and insurance
policies at any instant of time.
Responsibilities:
. Test and execute the Datastage jobs and validate their functionality.
. Responsible for running the jobs and ensuring completion of data
   loads.
. Responsible for daily verification that all scripts, downloads, and
file copies were executed as planned, troubleshooting any steps that
failed, and providing both immediate and long-term problem
resolution.
. Provided round-the-clock production support.
. Extracted the data from Oracle using oracle stage in Datastage
. Created a job for assigning unique identifiers using database
   sequences (see the sequence example after this list).
. Identified source systems, their connectivity and related tables and
   fields, and ensured data suitability for mapping.
. Involved in Analysis, Requirements gathering, function/technical
specification, development, deploying and testing.
. Created project documentation, which describes the process and the
data flow.
. Involved in the Production Test run of the jobs.
. Performed data cleansing as and when required.
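An illustrative example of assigning unique identifiers from a database
sequence during the load; the sequence and table names are hypothetical.

    -- Hypothetical Oracle sequence used as a surrogate key for the
    -- policy dimension during the insert from staging.
    CREATE SEQUENCE policy_dim_seq START WITH 1 INCREMENT BY 1 CACHE 1000;

    INSERT INTO policy_dim (policy_key, policy_no, product_cd, load_dt)
    SELECT policy_dim_seq.NEXTVAL, policy_no, product_cd, SYSDATE
    FROM   stg_policy;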
Environment: DataStage 7.5.1, Oracle, Windows XP, UNIX