Summary:
Over **+ years of experience in the end-to-end development of software applications using data warehouse methodology, spanning various phases of IT projects such as system analysis, design, coding, and testing.
13+ years of experience with ETL (Extraction, Transformation, and Loading) tools, including Ascential and IBM InfoSphere DataStage 7.5/9.1/11.3/11.7: DataStage Designer, Director, Manager, and Administrator.
Expertise in developing Parallel jobs, Server jobs, and Sequence jobs.
3+ years of data warehousing experience building and managing data warehouses and data marts using Informatica products such as Informatica PowerCenter 7.1.1/7.1.2/8.1.1.
Strong working experience in the analysis, design, development, implementation, and testing of data warehouses using data extraction, data transformation, and data loading (ETL).
Worked on various operating systems, including Windows, UNIX, and IBM AIX.
Strong teamwork and collaboration skills.
Good communication and analytical skills, with solid experience in programming and problem solving.
Technical Skills:
ETL Tools
IBM InfoSphere DataStage Parallel Extender 7.5/9.1/11.3/11.7 and Informatica PowerCenter 7.1.1/7.1.2/8.1.1.
BI & Reporting
Business Objects XI/6.5/6.0/5.0, BO Developer Suite 6.0/5.0 (Designer, Web Intelligence, Broadcast agent, Supervisor, Module)
Data Modeling Methodologies
Erwin 4.5; Bill Inmon and Ralph Kimball data modeling; Star and Snowflake schema modeling.
Databases
Oracle 12c/9i/8i/7.3, MS SQL Server 2000/7.0/6.5, IBM DB2 10.5/9.5, DB2/400, Teradata, MS Access, Netezza.
Operating System
Windows, UNIX, IBM AIX.
Languages/Utilities
SQL, PL/SQL, C, C++.
Professional Experience:
Bank of America
Plano, TX June ’23 – Present
Senior ETL Production Specialist
Description: The Bank of America Corporation (often abbreviated BofA or BOA) is an American multinational investment bank and financial services company. Bank of America is one of the Big Four banking institutions of the United States, serving approximately 10.73% of all American bank deposits.
Responsibilities:
•Provided support for CRH and RDS applications, including CNE, CMX, CRDS, FRE, FFD, and WCC.
•Managed over 500k DataStage jobs across various applications, ensuring seamless data operations and workflow continuity.
•Worked as L3 on-call production support, troubleshooting defects, documenting issues, and collaborating with development teams, database administrators, and other teams to ensure timely resolutions.
•Collaborated with the Platform team to resolve DataStage server-level issues, with documentation maintained in SharePoint for future reference.
•Provided L1 and L2 support, monitoring, identifying failures or performance issues, and ensuring jobs run as scheduled.
•Resolved routine issues after initial analysis and escalated critical issues for immediate attention.
•Skilled in AutoSys job management, including Force-Start, On Hold, On Ice, Kill, log checking, data execution, and managing job flows. Experienced in handling long-running jobs, coordinating with upstream teams, and notifying downstream teams based on identified issues and timelines.
•Proficient in bypassing threshold values using Unix.
•Experienced in executing jobs through DataStage Director.
•Utilized Oasis or UNIX to bring resources up and down during releases and patching.
•Experienced in designing technical specifications for mapping documents, including unit test scenarios.
•Developed parallel jobs using various stages such as Aggregator, Change Data Capture, Transformer, Join, Merge, Lookup, DB2 Connectors, and Sequential Files.
•Implemented Kafka with Java API connectors in DataStage parallel jobs to enable high-performance, real-time data streaming.
•Developed common modules for error handling and audit processes.
•Responsible for code review, setting design standards and coordinating tasks with offshore teams.
•Implemented branching strategies to streamline development and reduce integration issues using GitHub.
•Experienced in Agile methodologies including Scrum and Kanban.
Environment: IBM InfoSphere DataStage 11.7, Oracle 12c, DB2 10.5/9.5, SQL/PL-SQL, AutoSys, and Linux.
AAA Auto Club Group June ’11 – May ’23
Tampa, Florida
Senior ETL Developer
Description: AAA is a leading provider of roadside assistance, offering services such as lockouts, winching, tire changes, automotive first aid, and towing to its members. The AAA Auto Club Group (ACG, formerly AAA Auto Club South) also offers Auto, Home, and Flood Insurance, as well as travel services, including hotel, car, and flight reservations, and cruise bookings. ACG serves members in states including Florida, Georgia, and Michigan. The ACG Information Services team develops and maintains software to support member services. The team uses ETL processes to manage and load data for over 9 million members, facilitating billing, card generation, and statement production for various membership plans. ETL is also employed in strategic projects such as deferring unearned revenues, lead generation, and address matching.
Responsibilities:
•Developed and supported the Dues Deferral and Billing processes, along with existing ETL jobs, using DataStage.
•Utilized the DataStage ETL tool to load data from flat files and source tables into staging, dimension, and fact tables.
•Designed, developed, and tested DataStage jobs using Designer and Director based on business requirements to load data from source to target tables.
•Enhanced existing jobs by implementing new functionality.
•Conducted parallel testing of ETL code with legacy systems using live production data before deployment.
•Prepared test cases for system testing and participated in business requirement gathering sessions, creating high-level scope documents.
•Authored functional and ETL design documents based on business requirements and system architecture.
•Designed technical specifications for mapping documents and created unit test scenarios.
•Developed parallel jobs using stages such as Aggregator, Change Data Capture, Transformer, and DB2 Connectors.
•Migrated jobs from versions 7.5 to 9.1, and from 9.1 to 11.3.
•Developed common modules for error handling and auditing processes across the project.
•Managed performance tuning, code reviews, design standards, and coordinated work with offshore teams.
•Contributed to the data conversion process for transitioning from legacy systems to new maintenance systems.
•Developed and executed test plans, test scenarios, and test cases, and performed system and user acceptance testing.
•Provided 24/7 on-call production support.
Environment: Ascential and IBM InfoSphere DataStage 11.7/11.3/9.1/7.5, DB2 10.5/9.5, PL/SQL, SQL, InfoMaker, AIX (X-Manager), AutoSys Scheduler.
Advertising.com (AOL) Jan’10 – May’11
Baltimore, MD
ETL Developer
Description: Search Engine Management (SEM) includes comprehensive features for managing search campaigns across major platforms such as Google, Yahoo, and MSN. It primarily focuses on campaign configuration and control through a graphical user interface, database, enhanced decision engine (Ad Learn for Search), operational reporting, and web services.
Responsibilities:
•Performed discovery and identification of source data.
•Led system study and analysis to support project objectives.
•Translated business requirements into functional database designs.
•Defined the ETL strategy for populating the data warehouse.
•Extracted data from Oracle 9i and flat files.
•Developed data mappings to extract, transform, and load data from various sources into an Oracle data warehouse, using filters and expressions.
•Created workflows, tasks, and database connections in Workflow Manager.
•Developed mappings to transform source data, utilizing complex transformations such as Aggregator, Expression, Joiner, Update Strategy, and Lookup.
•Extensively used ETL tools to load data from multiple source databases into the target database.
•Connected to target Oracle and Netezza databases using ODBC.
•Developed and optimized complex Informatica mappings for improved performance.
•Created sessions and batches for scheduled and on-demand data movement using Server Manager.
•Managed session and batch scheduling, and handled session/batch recovery in case of failures.
•Performed testing and peer review of mappings.
Environment: Informatica PowerCenter 8.1.1, Oracle 9i/10g, Netezza, SQL, Toad, and Windows XP.
East Collaborative Pvt. Ltd. July ’08 – April ’09
Bangalore, India
ETL Programmer
Description: Systemic Enterprise Connector (SEC) is an online project life cycle management system designed to meet the complex Enterprise Application Integration needs of engineering and manufacturing enterprises. It supports various operational levels, from tactical to strategic and policy-driven functions across departments. The system's configuration control features facilitate decision-making by individuals or teams throughout the product development lifecycle, allowing them to evaluate, approve, or reject data and transactions.
SEC is a highly scalable, integrated internet/intranet solution with robust systems that manage large-scale data, information, and time-sensitive processes, specifically tailored for product development enterprises.
Responsibilities:
•Conducted systems study and analysis to support project requirements.
•Interpreted business needs and implemented them in functional database designs.
•Gathered user requirements from end users.
•Defined the ETL strategy for populating the data warehouse.
•Extracted data from Oracle and flat files for transformation and loading.
•Developed data mappings to extract, transform, and load data from various source files into an Oracle data warehouse using filters and expressions.
•Developed workflows, tasks, and database connections using Workflow Manager.
•Developed and optimized complex mappings using transformations such as Aggregator, Expression, Joiner, Update Strategy, and Lookup.
•Extensively utilized ETL processes to load data from multiple source databases into the target Oracle database, using ODBC to connect to Oracle.
•Leveraged Data Sets and Data Grids for report generation and rendered Data Grids based on user roles.
•Involved in use case analysis and developed use case diagrams using MS Visio.
Environment: Informatica PowerCenter 7.1.2, Oracle 10g, Toad 7.0, Windows 2000/NT, UNIX (HP-UX, Sun Solaris, AIX).
Accenture Services Pvt. Ltd. July ’07 – June ’08
Mumbai, India
Software Engineer
Description: SBC Communications, a Fortune 500 company, is a leading telecom provider in the U.S., offering local and long-distance services, wireless, voice, data network solutions, and internet access for both business and residential customers. The data warehouse operates in a Teradata environment. Due to multiple mergers, some dimensions are misaligned, and practices inherited from subsidiaries have turned the data warehouse into a complex system, making information extraction challenging. SBC aims to streamline the process and establish a consistent architecture for loading the warehouse.
Phase I: A Load Ready File (flat file) is created from the source file after multiple transformations and error handling. This file acts as a staging area for data cleansing, and errors are captured in a separate delimited file.
Phase II: The Load Ready File is loaded into Teradata tables using Teradata’s ODBC/FastLoad connection, with the load date and time also recorded in the table.
Responsibilities:
•Conducted requirements gathering and business analysis.
•Prepared documentation for design and ETL development.
•Developed mappings and workflows to generate staging files and load data from flat files into Teradata tables.
•Created transformations such as Source Qualifier, Sorter, Joiner, Update Strategy, Lookup, Expression, and Sequence Generator to load data into target tables.
•Built multiple mappings using reusable mapplets.
•Utilized ODBC/FastLoad to connect to Teradata targets.
•Implemented error handling logic, including validation and management of incorrect input values in mappings.
•Developed workflows, tasks, and database connections using Workflow Manager.
•Participated in system application testing before project implementations.
•Used shell scripts to compare incoming files with existing files, generating logs and loading changed values into the target tables.
•Provided production support, including job migration from UAT to production, job monitoring, and bug fixing.
•Documented mappings and participated in testing and peer reviews.
Environment: Informatica PowerCenter 7.1.1/7.1.2, Teradata SQL Assistant, Toad 7.0, Windows XP, UNIX (HP-UX, Sun Solaris, AIX).
Educational Qualifications:
Master of Science in Information Technology, Bharathidasan University, India.