
Data Analyst

Location:
Chennai, Tamil Nadu, India
Posted:
August 14, 2020


Amirthaprakash Chidambaram

adfbvt@r.postjobfree.com +91-638-***-****

Senior ETL/DataStage Developer

IT professional with 12+ years of overall experience in data warehousing application specification, design, coding, testing, debugging, and documentation. 8+ years of experience using ETL methodologies to extract, transform, and load data into data warehouse environments using IBM InfoSphere DataStage Parallel Extender, MDM, Oracle, SQL Server, UNIX scripting, and Data Vault modeling.

Major Strengths

Hands-on experience in three complete SDLC cycles, including requirements gathering, analysis, design, ETL development/coding in the parallel framework, testing, production implementation, enhancements, support, and maintenance; 4 years of experience in the banking domain.

Solid hands-on experience using processing stages such as Transformer, Aggregator, Lookup, QualityStage, Join, CDC with SAP, and XML Parser, with good knowledge of interpreting and manipulating hierarchical data documents.

Wrote UNIX scripts to run DataStage jobs and handle file operations, FTP/SFTP file transfers, and data cleansing.

Hands-on experience in dimensional modeling concepts using star/snowflake schemas, dimension tables, and fact tables.

Hands-on experience in Data Vault modeling and its components (Hubs, Links, Satellites); developed ETL jobs to load data into Data Vault models.

Developed ETL standards, Best Practices, technical document, troubleshooting document.

Excellent SQL and UNIX skills.

Performed data quality validation, data profiling, load and performance testing, and end-to-end (E2E) regression and integration testing.

Prepared high-level and low-level design documentation.

Experienced in troubleshooting data loads, addressing production issues, and performance-tuning slow-running ETL jobs.

Created reusable DataStage jobs to extract data from Oracle and SQL Server databases, transform it, and load it into the data warehouse.

Provided technical guidance and training for Information & Data Architects and developers.

Experienced in data warehouse projects in multiple roles, including team lead, senior programmer, data analyst, and production support and maintenance.

Experienced in working directly with end users to gather data requirements and in communicating and coordinating effectively with business users.

Highly driven, results-oriented, creative problem solver with a willingness to do what it takes to deliver quickly; able to prioritize and set direction in a fast-paced environment.
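As an illustration of the UNIX scripting work described above (running DataStage jobs, handling files, archiving), a minimal sketch follows. The project, job, and directory names are hypothetical, and the real `dsjob` invocation is shown in a comment and stubbed out so the file-handling logic is runnable on its own:

```shell
#!/bin/sh
# Sketch of a DataStage job-control wrapper: run a job, then archive the
# landed source files on success. Names below are hypothetical.

PROJECT="DW_PROJ"        # hypothetical DataStage project
JOB="LoadCustomerDim"    # hypothetical job name
LANDING="landing"        # directory where source files arrive
ARCHIVE="archive"        # directory for processed files

run_job() {
    # A real call would be along the lines of:
    #   dsjob -run -jobstatus "$PROJECT" "$JOB"
    # Stubbed to succeed for illustration.
    return 0
}

mkdir -p "$LANDING" "$ARCHIVE"
touch "$LANDING/customers.dat"   # simulate an arrived source file

if run_job; then
    # Archive each processed file with a date suffix.
    for f in "$LANDING"/*.dat; do
        [ -e "$f" ] || continue
        mv "$f" "$ARCHIVE/$(basename "$f").$(date +%Y%m%d)"
    done
    echo "job $JOB finished OK"
else
    echo "job $JOB FAILED" >&2
    exit 1
fi
```

The same wrapper pattern extends naturally to FTP/SFTP pushes and cleansing steps by adding further functions alongside `run_job`.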

Technical Proficiency:

ETL Tools

DataStage 7.x, 8.x, 9.1, and 11.3; QualityStage; MDM; SSIS.

RDBMS

Oracle, SQL Server, Sybase.

Database Tools

Toad, SQL Server Management Studio.

Modeling Concepts

Dimensional Modeling, Data Vault Modeling, Data Governance, and MDM.

Modeling Tools

Power Designer.

Scripting

UNIX shell scripting, SQL scripts, PL/SQL, Windows batch scripts.

Operating system

UNIX, Windows

Scheduling tool

Tivoli, Crontab.

SDLC

Agile Unified Process (AUP), Kanban

Educational Qualification:

Degree: M.C.A - Master's Degree in Computer Application Year: 2002

University: Madurai Kamaraj University, Madurai, Tamil Nadu, India

Degree: B.B.A - Bachelor of Business Administration Year: 1999

University: Madurai Kamaraj University, Madurai, Tamil Nadu, India

PROJECTS:

Veritiv Corp – Fairfield Ohio Mar 2016 to Present

Sr. DataStage Developer

Veritiv Data Mart Integration

Description

Veritiv Corp acquired Unisource and xpedx. Veritiv created a new data warehouse called VOS, which integrates both data marts into a single enterprise data warehouse and provides business users with the required reporting. All data from xpedx was converted into the Unisource system. Created a data conversion framework to handle conversions of multiple fact tables through a single framework; the framework was cost-effective and saved 400 work hours. Fact tables were converted through the framework and tested by the QA team. After QA validation, the converted data was published using the partition exchange method.

Created an ETL framework to execute the data conversion process.

Wrote SQL scripts for the fact data conversion and developed testing scenarios.

Validated and tested the partition exchange method for post-conversion data movement.

Worked with the solution architects to implement data strategies, build data flows and develop conceptual data models.

Developed best practices for standard naming conventions and coding practices to ensure consistency.

Formulated a data-loading ETL framework to support change tracking and apply business rules per business requirements.

Validated data files from the source to make sure the correct data was captured and loaded to the target tables.

Followed Veritiv's release and change request processes for smooth code deployment and change tracking.

Worked in an agile development approach.

Environment: IBM InfoSphere DataStage 11.5, Oracle, UNIX scripting, SFTP, FTP, and Agile.
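The partition-exchange publishing step used in the conversion framework could be driven by a small script along these lines. The table and partition names are hypothetical, and in practice the generated statement would be piped to sqlplus rather than printed:

```shell
#!/bin/sh
# Generates the Oracle partition-exchange DDL used to swap a QA-validated
# staging table into the fact table in a single dictionary operation.
# All object names below are hypothetical.

gen_exchange_ddl() {
    # $1 = fact table, $2 = validated staging table, $3 = partition name
    cat <<EOF
ALTER TABLE $1
  EXCHANGE PARTITION $3
  WITH TABLE $2
  INCLUDING INDEXES
  WITHOUT VALIDATION;
EOF
}

# Print the statement; a production wrapper would pipe this to sqlplus.
gen_exchange_ddl SALES_FACT SALES_FACT_STG P_2016_Q1
```

Because the exchange is a metadata-only swap, publishing the converted fact data this way avoids re-copying rows after QA sign-off.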

Chevron – Concord California Apr 2014 to Mar 2016

Marketing and Reporting Data Warehouse (MARS)

Sr. IBM InfoSphere DataStage Technical Analyst/Developer

Description

MARS is designed to provide the marketing business with an easy-to-access, centralized, reliable home for data that is not available in ERP systems for downstream applications. MARS is a data repository residing in a SQL Server database that is populated using IBM's DataStage Extract, Transform, Load (ETL) tool. At least 5 years of historical data are maintained for trend analysis efforts.

Responsibilities:

Identified application components in the landscape that will serve as the system of record or reference for enterprise master data, and documented them.

Architected the file processing system; designed, developed, and tested DataStage sequence jobs to control job flow using job activities, email notifications, and exception handling; used UNIX scripts to handle file operations.

Developed DataStage jobs to extract data from source systems (Oracle, flat files from SAP systems, store sales data in XML format, and external vendor survey flat files received through FTP), applied business rules and data enrichment, and loaded the results into the target system.

Defined ETL coding standards for DataStage job development and interface deployment process guidelines for production releases.

Set up the disaster recovery DataStage environment; tested and documented the DR procedures.

Interacted and worked directly with data modelers/architects and business owners.

Ensured that the availability, integrity, and quality of existing production applications met customer expectations and stayed within the established service level agreements.

Provided production support, troubleshooting and resolving issues as they arose.

Environment: IBM InfoSphere DataStage 8.1, 8.7, 9.1, QualityStage, MDM, SQL Server, Oracle, UNIX scripting, SFTP, FTP, Connect:Direct, Remedy incident system, and batch scripting.

Chevron – Concord California Feb 2015 to Dec 2015

MARS environment migration to a new data center

Sr. IBM InfoSphere DataStage Technical Analyst/Developer

Description

The ETL layer, database servers, and applications were to be moved to a new, cost-effective data center located in San Antonio, Texas. Interfaces from all applications connecting to MARS needed connectivity testing and performance testing. The business needed to provide User Acceptance Testing resources during migration and sign-off. Targeted for February 2016.

Responsibilities:

Planned and coordinated with different application teams to perform connectivity testing with the new environment.

Documented the technical readiness checklist for the new environment.

Documented the setup procedures for new usernames and passwords for the databases, FTP, SFTP, and Connect:Direct.

Requested and validated the new FTP, SFTP, and Connect:Direct environments.

Documented all batch jobs pertaining to the applications, as they needed to be set up in the NADC.

Prepared the cutover plan and reviewed each step with the source system technical analysts and application business analysts.

Migrated ETL jobs, tested all inbound and outbound interfaces, documented test results, and presented them to the migration specialist.

Environment: IBM DataStage 9.1, SQL Server, Oracle, UNIX scripts.

Chevron – Concord California August 2012 to Feb 2013

Downstream operations data quality and MDM

Sr. IBM InfoSphere DataStage Technical Analyst & Solution Architect

Description

The data capture process must be a source-system-independent standard interface to allow extensibility to any source system with data quality dashboard requirements. Source systems implement business rules in queries executed by a data movement technology (i.e., DataStage). The data quality audit tool provides insight into the accuracy, completeness, consistency, timeliness, and uniqueness of master data (customer, plant, currency, ship-to, sold-to). The system allows scheduling of data collection on a periodic basis and displays a simple standard interface depicting:

Current data quality scores

Data quality trending

Data quality rule exceptions

Responsibilities:

As the Data Quality Analyst, was responsible for detailed data definitions, data quality rules, and metrics for a data quality initiative

Worked with the appropriate data stewards to ensure implementation of data, rules, and metrics

Developed code to obtain and transform data; executed data clean-up activities

Assisted with monitoring and maintaining systems during project execution

Collaborated with the Data Architect on logical data models and the understanding of business and analytical processes as they relate to data quality rules, data transformations, and reporting and analytical needs for business intelligence

Environment: IBM DataStage 8.7, MDM, QualityStage, SQL Server, Oracle, UNIX scripts.

Chevron – Concord California Sep 2011 to Feb 2014

Chevron Lubricant Pricing Management System

IBM InfoSphere DataStage Architect and Technical Analyst

Description

Lubricant Pricing Analyzer is a reporting and analytics tool that allows the business to understand how the lubricants business is performing. It is used to determine whether transactions have yielded a positive Net Operating Income, so the business can make better pricing and discounting decisions to maximize profitability. Reviewed the Lubricant pricing mart architecture designed by the Lubricant information architect and suggested a simplified solution, thereby reducing the cost and risk involved in the project.

Responsibilities:

Designed and implemented DataStage jobs for extracting, transforming, and loading data from SQL Server and from external vendor data received via FTP.

Extensively used Transformer, Aggregator, Lookup, and Join stages to implement business logic, data mapping, data validation, and date formatting.

Created UNIX scripts to move files to the SFTP location, back up the files after they are successfully moved, and send a notification to the technical team on error.

Implemented an alerting mechanism in the developed jobs to notify the support team when a job fails.

Handled enhancements and production support for the Lubricant Price mart.

Environment: IBM DataStage 8.7, SQL Server, UNIX scripts, crontab, Chevron scheduling tool.
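The move-backup-notify script described in this project might look roughly like the following. The file name, remote target, and notification address are hypothetical, and the sftp and mail calls are stubbed (shown in comments) so the control flow is self-contained:

```shell
#!/bin/sh
# Sketch: push an extract to an SFTP drop, back it up on success,
# notify the technical team on failure. Names below are hypothetical.

FILE="lube_prices.csv"
BACKUP_DIR="backup"
REMOTE="etl@vendor.example.com:/inbound"   # hypothetical SFTP target

push_file() {
    # A real call would be along the lines of:
    #   echo "put $FILE" | sftp -b - "$REMOTE"
    return 0   # stubbed as a successful transfer
}

notify_team() {
    # A real call might use: mail -s "SFTP failed: $FILE" <team address>
    echo "ALERT: transfer of $FILE failed" >&2
}

mkdir -p "$BACKUP_DIR"
touch "$FILE"   # simulate the extract produced by the ETL job

if push_file; then
    # Keep a timestamped copy so reruns never clobber a prior transfer.
    mv "$FILE" "$BACKUP_DIR/$FILE.$(date +%Y%m%d%H%M%S)"
else
    notify_team
    exit 1
fi
```

Keeping the transfer, backup, and alerting in one wrapper lets a scheduler (crontab here) treat the whole hand-off as a single success/failure unit.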

Citibank – Singapore December 2008 to August 2011

Credit card data warehouse

Sr. IBM InfoSphere DataStage Technical Consultant and Onsite Coordinator

Description

The Next Gen Rainbow (credit card data warehouse) project delivered data warehouse solutions using IBM DataStage Parallel Extender versions 8.1 and 8.5, Oracle, and Teradata on the UNIX platform, with Business Objects for analysis and reporting. The project was implemented for 7 countries in the Asia-Pacific region.

Involved in gathering business requirements; created technical documents mapping the various source systems to targets.

Created reusable template DataStage jobs to load datasets into tables, compare datasets, and validate data when parsing files.

Developed UNIX shell scripts to automatically stop/start the applications on system reboot, run jobs, and perform file operations using UNIX commands.

Analyzed the source systems and data element mappings, and worked with the senior architect to finalize the design along with the warehouse architecture.

Fine-tuned DataStage jobs using partitioning techniques and environment variables; studied the file arrival pattern from the source systems and tuned the schedule to reduce job wait time in the queue.

During the deployment phase we faced many issues with the IBM DataStage application; coordinated with the IBM team and resolved them. When more than 20 parallel sessions ran, DataStage would hang; this was rectified with the help of IBM Support.

Designed data models and generated the DDL for the physical model from the logical model using Erwin.

Coordinated with the offshore team, source teams, business analysts, and infrastructure.

Environment: DataStage 8.0, 8.1, Oracle, zLinux, Tivoli, AML.

L&T – Mumbai Apr 2007 to Jun 2008

1-View Master Data Management (customer data management)

Informatica Developer & Technical Analyst

Description

MDM for the customer data application aims at conditioning vast amounts of data to ensure it meets various data quality dimensions and can be relied upon by users to make business-critical decisions. The right quality of data provides an opportunity for organizations to stay competitive.

Responsibilities:

Developed transformations for data validation.

Built the framework for customer data management.

Defined the strategy for data cleansing, master data formation, error rectification, and integration.

Tested various scenarios: full match, partial match, and partial match with auto-correction.

Tested and documented all code changes, including unit testing, system testing, performance testing, and capacity testing.

Wrote a white paper on MDM.

Environment: Informatica, Oracle, UNIX, MDM, SQL Server, Java.


