
IBM InfoSphere DataStage and Information Server. Linux. DB2. Oracle.

Location:
Birmingham, AL
Salary:
Negotiable.
Posted:
February 16, 2024


Resume:

RAMESH RAGHOTHAMA

Ramesh has ** years of IT experience working on the complete SDLC for projects using data warehousing and client/server technologies, with experience in both Waterfall and Agile development methodologies, including the Kanban and Scrum models. He has proven abilities in extract, transform and load (ETL) of data using IBM DataStage Enterprise Edition. Ramesh has extensive exposure in the areas of data warehousing, BI/OLAP/ETL and databases, with 22 years of experience in DataStage (Server and Enterprise Edition) 11.x, 9.x/8.x/7.x/6.x and QualityStage 11.x, 8.x/7.x, and has worked with RDBMSs ranging from Oracle (including Exadata 12) and Teradata 14 to DB2 11, Db2 10.5, Sybase 12.5 and MS SQL Server. He also has significant experience with DataStage installation and administration. He has been involved in adopting new technologies to solve business problems in Health Insurance, Retail, Life Insurance, Media Distribution and Entertainment, Manufacturing, Technology, Capital Markets and Financial Services, specifically in functional areas such as Customer Management, Marketing & Distribution, Financial Consolidation, Budgeting, P&L, MDM and What-If analysis. He has also worked on Reporting & BI tools. Ramesh has excellent interpersonal & communication skills.

TECHNICAL SKILLS:

ETL: IBM DataStage EE & MVS & Server 11.7, 11.3, 9.1.2, 8.7, 8.1, 8.0.1/7.5.1A/6.0/4.1
Data Quality: IBM WebSphere QualityStage 11.3, 8.7, 7.5.1A; DataFlux (pre-SAS custom edition)
Languages: DB2 SQL, PL/SQL, Unix shell scripting (ksh), UniVerse Basic, T-SQL
Reporting Tools: OBIEE, Cognos Impromptu, Tableau (novice level)
Db2 Tools: IBM Data Studio, CA Platinum, WinSQL, QMF for Workstation
Oracle Tools: Oracle Enterprise Manager, SQL Developer, NetBeans
SQL Server Tools: SSMS
Design Tools: MS Visio Pro 2010, 2014; Rational Rose EE
Databases: Oracle 11g (Oracle Exadata 11), 10g, 9i/8i; Teradata v14; Sybase 12.0/12.5; DB2 (AS/400 & z/OS, UDB); MS SQL Server 7.0, 2000, 2008
Environment: UNIX (AIX 8, 7.1, 6.1; Solaris 2.6, 2.8; DEC UNIX 4.x), z/Linux (s390x), Linux x86/32 & x86/64 and Power, Windows 10/7/NT/XP, Windows Server 2003, 2008, 2012, z/OS-TSO
Other Tools: DB2 Data Studio, CA DB2 Mainframe tools (Platinum), Teradata Studio & SQL Assistant, SQL Server Management Studio, Sybase PowerCenter, Eclipse IDE, DataStage IIS client-side tools

Software Dev Processes: Waterfall, Agile (Scrum & Kanban)

TRAINING:

Participated in Capital Markets training conducted by Dun & Bradstreet.

HIPAA Training, records management and retention training for healthcare

Brainbench Certified in DW Concepts

Passed the following IBM assessment tests administered through Prometric & Pearson VUE online:
o A2090-421 Assessment: InfoSphere DataStage v8.5 (2013)
o A2090-424 ENU Assessment: InfoSphere DataStage v11.3 (2016)

SKILL MATRIX:

Applications & Database

Technology                     Years of Experience   Notes
IBM DataStage                  22 years              Parallel and Server Editions
DB2                            12 years              2 years Db2 iSeries; 8+ years Db2 Mainframe; 4+ years Db2 UDB
SQL Server                     9 years
Oracle Database                11 years              Versions 8i, 9i, 10g, 11g, 12c with Exadata; Oracle Express MDMS
Sybase                         2 years               Sybase Adaptive Server, PowerCenter
UniVerse DB (DataStage)        10 years              DataStage Server Engine database
Teradata                       1 year                V14 & V15
Informix                       10 months
XML & Web Services             4 years               Integration with DataStage
IBM IIS Information Analyzer   3 years
IBM QualityStage               4 years               1 year QualityStage Server Edition; 3 years Parallel Edition

Programming & Database Languages

SQL                            22 years
DS Basic                       20 years
C/C++                          6 years
Java                           1 year

Operating Systems

Windows NT & Windows Server    20 years              Server and client side applications
Unix/Linux OS                  17 years              DEC Tru64 Unix, Sun Solaris, IBM AIX, Linux (Intel, Power and z/Linux)
z/OS                           9 years               Only as database host
iSeries                        2 years               Only as database host

PROFESSIONAL EXPERIENCE:

ETL Designer/DataStage Specialist, BlueCross & BlueShield of Alabama 01/2017 to present

Member Interaction Centre

This project is a large enterprise-wide initiative to implement an analytical infrastructure on clinical member interaction data, to maximize utilization of the dollar amounts spent on proactively engaging health insurance plan members and to judge the effectiveness of such engagements in preventing cost overruns in providing health insurance services. It maps out the data & processes needed to provide information on performance measures that characterize agent productivity in managing member engagements, measure the effectiveness of outreach efforts, track clinical outcomes, and help visualize the most effective care-giving and engagement paths that reduce healthcare expenses and improve the overall wellbeing of members. This further assists compliance and member satisfaction & retention efforts to positively impact BCBS AL.

Responsibilities:

Worked on the following subject areas of health insurance data – Member, Coverage Eligibility, Agent, Engagement, Health Measure, Inquiry (with inquiry routing) and Claims.

Worked on a DB2/Z database with an IDAA acceleration layer, which acts as a storehouse of member, agent, engagement, health measure and claim information, to build the interaction centre datamart for analytical purposes using IBM DataStage 11.7 and 11.3

Worked on a DB2/UDB database with the IIAS engine, using column-organized fact tables and row-organized dimensions
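
For illustration, a minimal ksh sketch of that table layout (table, schema and database names are hypothetical; LUW/IIAS-style DDL via the db2 CLI):

db2 connect to MARTDB
# Column-organized fact with a distribution key, as on IIAS
db2 "CREATE TABLE DM.ENGAGEMENT_FACT (
       member_id  INTEGER NOT NULL,
       measure_cd CHAR(4),
       amt        DECIMAL(12,2)
     ) DISTRIBUTE BY HASH (member_id) ORGANIZE BY COLUMN"
# Row-organized dimension
db2 "CREATE TABLE DM.MEMBER_DIM (
       member_id INTEGER NOT NULL PRIMARY KEY,
       name      VARCHAR(100)
     ) ORGANIZE BY ROW"
db2 connect reset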

Involved fully in migration of data from the Db2/Z-based data mart to the Db2/UDB-based data mart optimized for analytic applications, testing data quality post-migration, analyzing discrepancies and applying necessary fixes/corrections.

Involved in migration of Datastage codebase from Datastage version 11.3 to 11.7 and testing the migrated code

Used CA Detector Toolkit for analyzing DB2 Mainframe query performance.

Used IBM Datastudio Visual Explain & Administrative View tools for analyzing DB2 UDB query performance

Used SQL Server management studio to access and query SQL server data sources

Involved in Design and development of IBM Db2 SQL Queries to access Database tables and Views.

Created unit test queries and test cases; created and tested SQL/PL procedures using IBM Data Studio GUI-based tools; ran and analyzed SQL explain plans; performance-tuned SQL queries; and used the Mainframe TSO console to view and browse Mainframe stored procedures for maintenance and interfacing with IBM DataStage via analysis of procedure call signatures and return codes.

Converted slower-performing jobs that used a combination of Db2 deletes, updates and inserts to faster truncate-and-load strategies, merging and excluding data using DataStage datasets; this drastically improved load times as well as code readability.

Used partition parallelism to speed up jobs downloading data from DB2 – Modulus partitioning, Max/Min Range (DB2/Z), and DB2 partitioning (for IIAS tables with a distribution key).

Familiar with project implementation using Agile; used Scrum-based development delivery methods and JIRA for Epic, Story & Task management

Processed various database datasets from SQL & Db2 pre-staging layers to the DB2 staging layer using an ETL paradigm that achieves high workload isolation from the database by using InfoSphere DataStage parallel datasets. A further set of ETL processes reads from staging, writes to ETL parallel datasets, and applies processing logic in terms of transformations, reformatting, lookups and aggregations prior to creating finalized datasets ready for load to the Datamart layer on DB2

Co-ordinated with the DataStage Admin team and the IBM support team on product issues that required raising PMRs with IBM

Extensive usage of available Db2 analytic & scalar functions and SQL for analysing ETL generated result sets for accuracy

Used complex DB2 stored procedures, typically written in SQL/PL or COBOL, and invoked them in DataStage using the Stored Procedure (STP) stage; procedures could have input, output and I/O parameters as well as cursor outputs, and could operate either row by row or return multiple rows per input row, with the calls configured in the DataStage STP stage.
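
As a hedged illustration of the pattern (procedure, table and database names are hypothetical; LUW-style SQL/PL), a procedure like those invoked from the STP stage can be created and smoke-tested from ksh with the db2 CLI:

cat > /tmp/get_member_status.sql <<'EOF'
CREATE OR REPLACE PROCEDURE ETL.GET_MEMBER_STATUS (
    IN  p_member_id INTEGER,
    OUT p_status    CHAR(1))
LANGUAGE SQL
BEGIN
    SELECT status INTO p_status
      FROM ETL.MEMBER
     WHERE member_id = p_member_id;
END@
EOF
db2 connect to MARTDB
db2 -td@ -f /tmp/get_member_status.sql        # @ as the statement terminator
db2 "CALL ETL.GET_MEMBER_STATUS(12345, ?)"    # ? receives the OUT parameter
db2 connect reset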

Used parallel datasets, sequential files and complex flat files to store data in between ETL processes

Used various change data capture mechanisms – both DataStage-based (Change Capture stage, Lookups, SCD stage) and database-based approaches (timestamp-based, batch-number-based, minus, intersect etc.)
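
A minimal sketch of the database-side delta detection (table names hypothetical; DB2 uses EXCEPT where Oracle uses MINUS):

db2 connect to MARTDB
# Timestamp / batch-number based delta
db2 "SELECT * FROM STG.CLAIM
     WHERE load_ts > (SELECT MAX(load_ts) FROM DM.CLAIM)"
# Set-difference based delta
db2 "SELECT claim_id, claim_amt FROM STG.CLAIM
     EXCEPT
     SELECT claim_id, claim_amt FROM DM.CLAIM"
db2 connect reset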

Data was sourced from Db2 using the DB2 connector and from SQL Server using the ODBC connector.

Configured DataStage JDBC connectors in the isjdbc.config file and tested JDBC connectivity features for comparison with the native and ODBC connectors (database driver protocol evaluation)
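
The registration file is a short keyword list; a hedged sketch (driver jar paths are illustrative, and the file's location under the engine install should be checked for the release in use):

cat > $DSHOME/isjdbc.config <<'EOF'
CLASSPATH=/opt/ibm/jdbc/db2jcc4.jar;/opt/ibm/jdbc/db2jcc_license_cisuz.jar
CLASS_NAMES=com.ibm.db2.jcc.DB2Driver
EOF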

Data was loaded to targets using Db2 connector inserts/appends

Provided guidance and consulting to the DataStage admin team on certain nuanced features of DataStage, including vector job stages, parallel transformer loops and the transformer cache (SaveInputRecord). Created parallel job transforms using beginner-to-intermediate level C/C++. Created Wrapped stages to invoke pipe-safe, single-threaded Unix commands on AIX (mailx etc.); used range lookups and conditional lookups.

Performed query tuning on Db2 with appropriate use of indexing strategies, statistics updates and table re-orgs where needed.

Used the CA Platinum Mainframe menu-driven UI to troubleshoot and analyse SQL performance issues; collected/updated stats and worked with DBA teams to resolve query performance issues such as check-pending states, outdated stats and table fragmentation (via reorg), re-running the jobs with refreshed stats.
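
On the UDB side this routine reduces to a handful of CLI calls; a minimal sketch with a hypothetical table:

db2 connect to MARTDB
db2 "RUNSTATS ON TABLE DM.CLAIM_FACT WITH DISTRIBUTION AND DETAILED INDEXES ALL"
db2 "REORG TABLE DM.CLAIM_FACT"
db2 "REORG INDEXES ALL FOR TABLE DM.CLAIM_FACT"
# Clear a check-pending state left behind by a LOAD, if present
db2 "SET INTEGRITY FOR DM.CLAIM_FACT IMMEDIATE CHECKED"
db2 connect reset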

Used Explain plans in Db2 & SQL Server

Created ETL jobs to compress parallel datasets using bzip2 & gzip using the Encode Stage

Created ETL jobs to decompress parallel datasets using Decode stage

Created Unix shell scripts and used Unix commands and Orchestrate shell commands to manipulate, view and test data files and parallel datasets on Red Hat Enterprise Linux and AIX servers.
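
A minimal ksh sketch of the dataset housekeeping involved (dataset path hypothetical; assumes the engine environment, dsenv, is sourced, and exact orchadmin options should be confirmed with its help output for the release):

orchadmin describe /data/etl/claims_delta.ds   # schema and partition layout
orchadmin dump /data/etl/claims_delta.ds       # print records to stdout
dsrecords /data/etl/claims_delta.ds            # record count from the descriptor
orchadmin rm /data/etl/claims_delta.ds         # removes descriptor AND segment files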

Developed custom routines for IBM Infosphere Datastage using DS Basic and C/C++ parallel routines on Red Hat Enterprise Linux and AIX for test and verification only.

Analyzed and troubleshot DataStage connectivity issues by working with Network & Platform Admin teams.

Built complex SQL queries from data stored in Datamarts to fulfil base reporting requirements for use by views as well as reporting layer developed in Tableau

Acquired Working knowledge of Mainframe TSO utilities including CA Platinum

Provided Production Support and Operational Support for deployed application modules, analyzing end user data queries. Involved in Fixing and Troubleshooting issues in Production and Test

(QA/UAT) Environments.

Involved in technical discussions with Architecture, Technology Support and business teams to ensure performance, stability of applications and developing additional features to scale application’s capabilities.


Provided advisory on design alternatives, performance tuning and enhancement of IBM Datastage Application tier services in line with evolving Industry standard Best practices.

Developed and executed functional unit tests to ensure requirement fulfillment, security, availability and performance of distributed application systems.

Maintained and developed Korn shell scripts to call ETL jobs on the AIX platform
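
A minimal sketch of such a wrapper (project, job and parameter names are hypothetical; the exit-code mapping follows the dsjob documentation for the installed release):

#!/bin/ksh
. $DSHOME/dsenv                                   # DataStage engine environment
dsjob -run -jobstatus \
      -param LoadDate=$(date +%Y-%m-%d) \
      MYPROJ nightly_claims_load
rc=$?
# With -jobstatus, dsjob's exit code reflects job status (1 = OK, 2 = warnings)
if [[ $rc -ne 1 && $rc -ne 2 ]]; then
    print -u2 "nightly_claims_load failed (rc=$rc)"
    exit 1
fi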

Provided UAT Support for weekly UAT cycle – clarifying end user questions on data

Used Git (Bitbucket) to store DataStage export files for version control
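
A minimal sketch of the version-control step (repository layout and file names hypothetical; the .dsx is exported beforehand from the Designer client or istool):

cd /repos/etl-exports
cp /stage/exports/mic_claims_jobs.dsx datastage/
git add datastage/mic_claims_jobs.dsx
git commit -m "MIC: claims load jobs - truncate-and-load rework"
git push origin master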

Profiled data using InfoSphere Information Analyzer and published reports

Used the IMAM tool (InfoSphere Metadata Asset Manager) to store/define table metadata for DataStage

Used InfoSphere Governance Catalog to analyse the ETL repository and perform object lineage analysis (metadata analysis).

Environment: DataStage 11.7, 11.3, Information Analyzer, IMAM, IGC, DB2 UDB, Db2/zOS v11.5, SQL Server, AIX, Linux, Windows

ETL Designer/DataStage Specialist, Safeway-Albertsons, Phoenix AZ 03/16 to 01/17

Enterprise Business Intelligence (Albertsons Accounting Information Mgmt.) – ETL Track: Safeway service delivery projects

This project is a large enterprise-wide initiative to manage & present Accounting & Sales information to key stakeholders across Albertsons-Safeway supermarkets.

Responsibilities:

Worked on MPP VLDB platforms processing billions of rows on a daily basis – Teradata V14 & V15 as well as Oracle Exadata 11, apart from source systems on Db2/zOS and Db2/i (formerly Db2/AS400) – dealing with store transaction data, GL balances etc.

Familiar with project implementation using the Agile Kanban model; used Scrum-based development delivery methods and VersionOne for Epic, Story & Task management

Processed various files into staging layer from where Database centric SQL and procedures transformed data prior to being used for post staging ETL – loading work area and Data marts

Extensive usage of available Oracle and Teradata analytic functions, set based processing

Created and debugged stored procedures and procedure blocks in Oracle where needed to do pre-extract or post-load processing (ETL paradigm in data integration)

Used various change data capture mechanisms – both DataStage-based (Change Capture stage) and database-based approaches (timestamp-based, batch-number-based, minus, intersect etc.)

Sales BI – Developed ETL jobs to load sales data to the Enterprise Business Intelligence warehouse in Oracle Exadata, with focus on the ETL paradigm – using DataStage to perform transfers across heterogeneous platforms and using database procedures and queries for data processing within the same database. Worked with DataStage admin teams to troubleshoot issues arising in DataStage, like lock and resource issues. Trained offshore resources based out of Manila (PH) for production support tasks

Provided guidance and consulting to DataStage admin team on certain nuanced features of DataStage. Used Parallel job transforms that linked to Voltage encryption shared library objects. Knowledge of Informix used at store level locations and serving as alternative source system

Working knowledge of Quality Stage jobs used in enterprise MDM project

Finance BI – Developed and tested jobs that would feed tables used in Profit and loss reporting and GL Balance reporting

Performed query tuning on Oracle with appropriate use of indexing strategies, hints and parallelism where needed

CV of Ramesh Raghothama ~ SVAM International Inc ~ 12/2016 5/15

Used the Oracle Enterprise Manager web-based UI to troubleshoot and analyse SQL performance issues; used gather-stats and worked with DBA teams to resolve query performance issues, such as disabling bad SQL profiles and re-running the jobs with refreshed stats.
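
The stats-refresh step reduces to a short sqlplus call; a minimal sketch with hypothetical schema/table/connection names:

sqlplus -s etl_user/"$ORA_PW"@EXDPRD <<'EOF'
EXEC DBMS_STATS.GATHER_TABLE_STATS(ownname => 'EDW', tabname => 'SALES_FACT', cascade => TRUE);
EXIT
EOF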

Used Explain Plans in both Teradata and Oracle

Used OBIEE clients to verify data in dashboards, create saved report customizations and export data from OBIEE reports; ran OBIEE-based tasks such as daily flash sales report publishing.

Earned Retail Allowances- This project would facilitate a reporting dashboard which would indicate to business the difference between actual billed allowance amounts from vendors to Safeway and the estimated allowance amounts – so shortfall in allowance cash flow could be tracked.

Developed scripts using Teradata BTEQ and used DataStage jobs to move data from Db2/zOS to Teradata. Performed query tuning on Teradata using SET and MULTISET tables and primary index design; used volatile SET tables.
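
A minimal BTEQ sketch of the volatile SET table pattern (object names and logon are hypothetical):

bteq <<'EOF'
.LOGON tdprod/etl_user,etl_pw;
CREATE VOLATILE SET TABLE vt_allow (
    vendor_id INTEGER,
    allow_amt DECIMAL(12,2)
) PRIMARY INDEX (vendor_id)
ON COMMIT PRESERVE ROWS;

-- SET semantics silently discard duplicate rows on INSERT-SELECT
INSERT INTO vt_allow
SELECT vendor_id, allow_amt FROM stg.allowances;

SELECT COUNT(*) FROM vt_allow;
.LOGOFF;
.QUIT;
EOF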

Working knowledge of Teradata utilities like MultiLoad (MLoad), FastLoad and TPump

Maintained and Developed Korn shell scripts to call ETL jobs on AIX platform

EBI Support and services track – Provided 24/7 on call support as part of Call escalation pathway for issues related to Production ETL job failures, long running jobs, data corruption issues and data certification failures

Requested change tickets for production deployments of ETL and database objects using the ServiceNow change ticketing process

Used Serena Change Man DS for DataStage ETL job version control and promotion from DEV to QA and PROD environments

Environment: DataStage 8.7, Teradata V14 & V15, Oracle Exadata 11, Db2/zOS, Informix, AIX

ETL Architect/Designer/DataStage Admin, AIG Accident, Nashville TN 04/12 - 02/16

DataStage Administration/IIS Infrastructure Maintenance: AIG L&R BAU environment

Responsibilities:

Administration and maintenance of the DataStage 8.7 environment for the AIG L&R organization, used by Marketing and Field Sales/Distribution.

Installed rollup patches and fix packs on the ETL client and server side.

Co-ordinated with DBAs and Unix system administrators on infrastructure issues; performed IVP (Install Verify Protocol) and AVP (Application Verify Protocol) upon changes in the infrastructure environment.

Used ServiceNow change-ticket and request-management workflows to implement or verify infrastructure changes. Used the ServiceNow incident-management workflow to report incidents to the respective support teams.

Liaised with IBM Product support and raised IBM PMRs as and when required – for example Product compatibility inquiries, patching and fix pack inquiries, general troubleshooting inquiries not resolvable by reference documents/manuals or public search results.

Performed IIS/DataStage health checks using the ISA Lite tool, recovering from backup using Xmeta roll-back and running the SyncProject tool.

Worked on IIS/DataStage Session management tools and command line tools, ASB Node and ASB server standard scripts.

Configured auditing in the WebSphere Admin console.

Created a process for DataStage code change management. Taking ETL codebase backups prior to changes.

Troubleshot environment for performance issues as reported by operations teams, disk space clean-up etc.

Configured the IIS/DataStage Operations Console interface.

Implemented user access, authentication, authorization and auditing policy in line with the client's security & audit requirements; implemented security exceptions upon client authorization.

Monitored disk, memory and CPU usage using the Ops Console and command line tools; monitored processes and cleared job locks.
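
A minimal sketch of the command-line side of that monitoring (paths and user names illustrative; flags vary between AIX and Linux):

ps -ef | grep dsapi_slave            # client sessions attached to the engine
ipcs -a | grep dsadm                 # engine shared-memory segments/semaphores
df -g /opt/IBM/InformationServer     # disk headroom (AIX; use df -h on Linux)
sar -u 5 3                           # short CPU utilization snapshot
# Orphaned segments left by an aborted job can be removed with ipcrm (with care)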

Worked on DataStage server runtime engine tools and utilities (smat, uvsh, uv -admin), Universe Editor, UV & DS Basic commands, Orchestrate & osh commands like osh, orchadmin, orchdbutil etc.

Worked on starting and restarting DataStage Engine services

Implemented changes to ODBC datasources, configuring ODBC, DB2, Oracle, Sybase datasources on DataStage server.

Requested root access for administering WAS/IIS/ASB Node agent services as needed, start-up and shut-down of complete service stack.

Performed Limited Administration activity on Non-PROD Oracle Databases – Resolving tablespace, constraints and index issues affecting ETL, gathering statistics, analysing tables/indexes, session management, explain plans etc.

Worked on Unix C/C++ compilers like GCC (z/Linux s390x, Linux x86/64, AIX ppc) & XL C/C++ (AIX ppc).

Used Unix commands & open-source utilities like tar, cpio, gzip/bzip2, as, ar, ldd, nm, od, vi, cat, sed, awk, grep, tr, ps, fuser, lsof, sar, ipcs, ipcrm, dd and objdump

Multiple Distribution and Marketing department projects: AIG Life and Retirement

Life Datawarehouse: LDW

This project aims to build a new standardized enterprise data warehouse with an integrated analytic platform using DataStage 9.1.2 and Cognos reports, sourcing data from various policy administration systems like Cyberlife/Consolidated Financials, Vantage, ALIP (Accenture Life Insurance Platform) and the in-house AGNIS.

Responsibilities:

ETL architect and DataStage design reviewer.

Assisted in troubleshooting environment, connectivity and programming issues

Participated in design, development and troubleshooting activities.

Provided core DataStage technical & design support as and when needed by the services provider.

Environment: DB2 client tools – Control Center, Data Studio, Db2 Connect, Db2 SQL and command line interface; IIS DataStage EE 9.1.2 Parallel jobs; z/Linux and AIX; C/C++ (for debugging Parallel Transformer issues and custom parallel routines)

Marketing Information Management (MIM)

This is a legacy marketing information management platform used in strategic decision making and in evaluating the effectiveness of sales campaigns and marketing department initiatives. It also has a customer keying module that uses Dataflux to create customer cluster IDs, which are used to build a unique customer key based on attributes like First Name, Last Name, Address and Date of Birth.

Responsibilities:

Technical Specialist & Developer – ETL and database tools, ETL administration and DBA roles.

Environment: Oracle client tools, SQL Developer, Oracle SQL and command line interface, IIS DataStage 8.7 jobs, DataStage CLI, AIX ksh scripts, Server jobs, Basic routines and batch programs, Dataflux (pre-SAS customized edition only)

Annuities Data Warehouse

This project aimed to create a single warehouse to report on both Fixed and Variable Annuities for SunAmerica & Western National (both AIG-acquired companies) and eliminate the overhead of maintaining multiple systems working in silos by consolidating infrastructure and eliminating custom ETL that used UNIX scripts and PL/SQL, using DataStage 9.1.2 Enterprise Edition.

Responsibilities:

ETL development using DataStage 9.1.2 Parallel jobs and PL/SQL

Environment: IIS DataStage EE 9.1.2, PL/SQL, Oracle client tools, AIX

Distribution/Field Sales – Consolidated Policy Database

This project aimed at recreating existing external file extracts to DTCC (multiple brokers/dealers) and First Command agencies containing customer policy information using ETL instead of COBOL programs, by sourcing data elements from the Consolidated Policy Database, which stored policy information from many policy admin systems, and developing a standardized extract creation process that would reduce the overall TCO of maintaining these critical, time-sensitive feeds for fixed and variable life, annuity and term insurance products. The interfaces included the Positions and Values File (POV/PVF) and Inforce Transactions (IFT), which are routed to multiple sales partners like Merrill Lynch, UBS, Edward Jones and JP Morgan Chase through the DTCC Insurance Processing Service (IPS) facility.

Responsibilities:

Technical Designer, ETL development

Developed a proof of concept for a workable & scalable ETL solution to generate these highly unconventional nested hierarchical-format flat files using DataStage jobs.

Used various design customizations to arrive at the solution.

Environment: Oracle client tools, SQL Developer, Oracle SQL and command line interface, IIS DataStage 8.7 jobs, DataStage CLI, AIX ksh scripts, Server jobs, Basic routines and batch programs, text editors like SPF-Lite and PS-Pad.

Customer Value Management (CVM) – Data Integration, DataStage Administration

This is a data integration initiative within ETL to provide Mainframe data to a web-based sales force dashboard and analytics system. The analytics system is designed to replace proprietary hardware that runs a version of Windows NT suited to field agents working with AGLA. The objective is to reduce and gradually eliminate any dependency on proprietary hardware systems and their associated maintenance and support costs. DataStage is used as the data integration platform providing a two-way data transfer mechanism from the Mainframe to the analytics database staging area, as well as sending corrected and modified feeds from the Sales Force Analytics app back to the mainframe (z/OS); the hub will also feed other systems over time.

Responsibilities:

Converted All Existing Smart Pad inbound file feeds (ETL jobs) from Mainframe to Jobs loading similar structures in CVM

Developed XML data processing jobs in DataStage to send analytics data back to mainframe for reconciliation and closure processes.

Used XML input pack

Developed Job Sequencers and job control (Master jobs) to minimize manual intervention in incremental loads.

Worked on Enhancement and maintenance, UAT support of CVM ETL jobs (Server jobs)

Worked with multi-format files, COBOL complex flat files and ASCII data files

Gained working knowledge and understanding of DataStage Mainframe edition jobs (S/390) and mainframe job stages.

Gained working knowledge of XML Transformer stage and Oracle XML functions

Worked on preparing installation roadmap for DataStage v 8.7 on AIX alongside Sys Admin and DBA team.

Used Unix filter commands to pre-process files

Created job description/design and unit test documents for newly developed interfaces

Liaised with various data source personnel to test and troubleshoot issues

Worked on data sources and targets including Db2/zOS, Oracle 11g R2, MS SQL Server Enterprise 2005, COBOL files and flat files.

Worked on Pre-Installation, Installation and configuration of IBM Information Server DataStage 8.7 Enterprise edition with MVS job support and Information Analyzer on AIX.

Configured PAM (Pluggable Authentication Module) with DataStage to use Centrify driven LDAP authentication scheme for DataStage users

Was the primary technical liaison with IBM Information server support for the installation & configuration of IBM information server and migration from 7.5

Setup and migrated DataStage projects from version 7.5 to 8.7. Configured users and user roles.

Tested and verified parallel job restructure stages to perform data transformations on complex data types like vectors (simple, complex, fixed and variable length) and subrecords (level 02 and level 03) structures for internal DataStage use (intermediate processing) with Mainframe data formats.

Co-ordinated offshore team of client’s preferred offshore vendor. Assigned and managed offshore tasks

Involved in production support of CVM and pre CVM application with offshore co-ordination.

Demonstrated performance characteristics of parallel jobs to the client – ESP jobs using the Modify stage, Join, Lookup, Merge and Difference/Change Capture, and new Pivot Enterprise stage features.

Used Loop Variables and Iterations in Parallel job transformer.

Successfully configured DataStage operations console (new feature in 8.7) and demonstrated its features

Gained working knowledge of Mainframe job stages including IMS stage and Multi format flat files, COBOL/JCL uploads.

Developed jobs to perform mass emails to agent offices for office specific data issues using email contact data stored in tables.

Verified new analytic functions in Oracle 11gR2 (e.g. LISTAGG)
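
For illustration, the kind of one-off check used for LISTAGG (table and connection are hypothetical; standard 11gR2 syntax):

sqlplus -s cvm_user/"$ORA_PW"@AGLPRD <<'EOF'
SELECT office_id,
       LISTAGG(agent_email, ';') WITHIN GROUP (ORDER BY agent_email) AS emails
  FROM cvm.agent_contacts
 GROUP BY office_id;
EXIT
EOF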

Created server job routines for use in server jobs and job control /sequencers

Worked on custom job control for DataStage (DS Basic batch programs).

Environment: IBM Information Server DataStage (Mainframe and Server Editions) 7.5.1a, SunOS for SPARC (Solaris), Db2/zOS, XML, Oracle SQL Analytic/OLAP, ODBC plugin, SQL Server Enterprise, Oracle 11g, IBM Information Server Version 8.7 Enterprise Edition (with mainframe/MVS jobs), AIX 6.1.

Sr. ETL Designer, Johnson Controls Inc., PA 06/10 - 03/12

Building Efficiency – Datawarehouse

This is a pure Sales and Supply Chain data warehousing system that extracts data from diverse ERP sources located globally, as well as text files, and loads to a standard Oracle 10g data warehouse consisting of dimension tables, fact tables and lookup/cross-reference tables that store dimension relations/attributes. Additionally, the reporting layer is based on aggregate facts that store data at a higher level of aggregation. Reporting is done via Business Objects

Responsibilities:


Converted all Supply Chain Datawarehouse incremental jobs from Server jobs to Parallel Extender (IIS DataStage version 8.0.1)

Developed Job Sequencers to minimize manual intervention in incremental loads.

Worked on Enhancement and maintenance of Existing Sales DW ETL jobs (Server jobs)

Worked with heterogeneous ERP systems – Oracle 11i, SAP, iSCALA, MFGPRO, MAPICS/IFM, SYMIX and other proprietary ERP systems (Costpoint, Navision etc.) – All data required for Incremental jobs was in a standardized format and routed through IBM Websphere MQ into Datawarehouse Staging area for Incremental Processing

Interfaced with MQ using Server and Parallel MQ plugin stages and the WSMQ Connector stage – the primary message format was XML, though some systems dispatched data in fixed-length format via MQ. XML data was parsed using the DS XML pack plugin

Dispatched data to MQ using MQ plugin stage

Worked on manually profiling data from the SQL Server data source for reconciliation of Supply Chain DW data; profiling was done using SQL and Microsoft Excel. Based on this, rules were deduced to reconcile reference data in SQL Server with data in the Supply Chain DW

Developed ETL Jobs to Process Reconciliation reference data from SQL server and compare with existing data in Supply Chain DW.

Used MS-OLEDB plugin to extract SQL server data. Working knowledge of AS/400 (iSeries Databases)

Developed and maintained jobs to pull Master data directly from AS/400 ERP sources using ODBC Connector plugin (parallel Jobs) and ODBC plugin (Server jobs). These jobs were used to maintain certain master data attributes that were not being sent via MQ.

Created System Documentation almost from scratch for Supply Chain DW, using available ERP to EAI Data mapping templates and by deciphering job logic and targeted questionnaires addressed to maintenance personnel.

Worked with EAI and


