Dilip Ravichandran
(Teradata Certified Master V2R5)
*****************@*******.*** Cell: 419-***-****
PROFESSIONAL SUMMARY:
• Progressive and innovative software professional with 6 years of experience in requirements
understanding, system analysis, application design, development, testing, configuration
management, client/user support services and management of applications.
• Experience in designing and developing Extract, Transform and Load (ETL) processes
on Teradata, with extensive use of the ETL tools Ab Initio and Informatica.
• Strong Teradata experience along with SQL Assistant and BTEQ. Solid skills and expert
knowledge of Teradata V2R5.x/6.x and utilities including TPUMP, MLOAD,
FASTLOAD and FASTEXPORT.
• Good knowledge of Teradata architecture and concepts.
• Experience in performance tuning and optimization of Teradata SQL queries.
• Knowledgeable and experienced in data warehousing, Teradata utilities and MS Reporting.
• Proficient in Data Warehousing concepts and design techniques.
• Handled all domain and technical interaction with the client and the application users.
• Familiarity with Win XP/2000 Advanced Server /98/95/NT, UNIX, LINUX.
• Experience in SQL programming (Stored Procedures, Functions and Triggers) for faster
transformations and development using Teradata and Oracle.
• Experience in programming in C, Java and web technologies (Dreamweaver, HTML, and
XML).
• Excellent problem solver with a competitive spirit in finding various approaches to a
problem; highly skillful, creative and innovative in developing efficient logic/code.
• Team player with the ability to adapt to various environments, excellent communication
and interpersonal skills, and a strong commitment to customer satisfaction.
EDUCATION & CERTIFICATIONS:
• Bachelor in Electrical and Electronics Engineering.
• Teradata Certified Master V2R5
• Teradata Certified Design Architect V2R5
• Teradata Certified Administrator V2R5
• Teradata Certified Application Developer V2R5
• Teradata Advanced Certified Professional V2R5
• Teradata Certified SQL Specialist V2R5
• Teradata Certified Implementation Specialist V2R5
• Teradata Certified Professional V2R5
TECHNICAL SKILLS:
Primary Skills Data Warehousing, Teradata Interface tools, Teradata SQL,
Teradata Utilities
Languages C, Java, SQL, Shell scripting, Perl.
Web Technologies HTML, DHTML, XML, JavaScript, Dreamweaver, PHP.
Databases Teradata V2R5.x/6.x, Oracle 8i/9i, MS Access, SQL Server
7.0/2000/2005, MySQL
Platforms Windows 95/98/NT/2000/XP, Unix, Linux
Communications TCP/IP, Telnet, FTP
Methodologies Data Modeling – Logical/Physical/Dimensional, Star/Snowflake
schemas, ETL, OLAP, ROLAP, Complete Software
Development Cycle, MLDM, RLDM.
Remote Connection Tools Remote Desktop, XManager, HummingBird Exceed
Miscellaneous Tools MS Office, Autosys, Ab Initio, Teradata SQL Assistant,
Teradata Manager, TDQM, Queryman, TPT, BTEQ, OLE
Load, FastLoad, MultiLoad, TPump, FastExport.
PROFESSIONAL EXPERIENCE
INTEL – Hillsboro, OR Aug 09 - Present
Senior QA Analyst
Finance CFR (Forecasting, Closure and Reporting) is a part of the Finance Information Services
(FIS) in the company. The ETL process involves extracting data from SQL Server and SAP
sources and loading it into the Teradata data warehouse. The data generated is used for financial
earnings release/reporting. The project also involves tuning and improving the performance of the
business-critical queries used: a large number of queries, forming several batch jobs, need to
be tuned. The tuning exercise essentially analyzes these queries and improves their
performance using Teradata V2R5 features.
• Troubleshooting ETL designs when data discrepancies are reported; enhancing, revising and
optimizing Stored Procedures specific to Finance/Capital Reporting.
• Query optimization (explain plans, DQMF, collecting statistics, data distribution across AMPs,
primary and secondary indexes, locking, etc.).
• Involved in Data Validation, Data Conversion and Data Reconciliation Testing.
• Analyzed and optimized queries and wrote revised Stored Procedures.
• Database tuning/Performance improvement on SQL’s in Teradata.
• Collaborate with the Business Analysts to capture the Business Needs of the specific
projects.
• Created DQ SQLs between data layers to make sure data integrity is maintained.
• Tested data in the test environment and moved it to Production.
• Created views and tables in development, created test plans and checked the robustness of the code.
• Created complex Teradata SQLs and optimized their performance.
• Good understanding of Slowly Changing Dimensions and Star Schema.
• Used Ab Initio EME for dependency/impact analysis and configuration management.
• Testing the data present in fact tables and dimensions tables.
• Provided day-to-day support to the CFR production team on various production issues.
• Performed source data analysis and designed mappings for data extraction.
• Verified the Transformations that occur between Staging Tables – Preload Tables for
Delta Loads.
• Monitored batch cycles involving BI processing and jobs running in Autosys.
• Provided on-call support for all applications in Intel’s other development
centers, with responsibility for all applications involved in Data Warehouse
Services.
• Involved in migration of revised/enhanced codes and production support.
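The query-tuning workflow above can be sketched in Teradata SQL. The database, table and column names below are hypothetical placeholders for illustration, not the actual Finance CFR objects:

```sql
-- Hypothetical objects for illustration only (not the real CFR schema).
-- 1. Inspect the optimizer's plan for a suspect query.
EXPLAIN
SELECT acct_id, SUM(posted_amt)
FROM fin_cfr.fact_ledger
GROUP BY acct_id;

-- 2. Refresh statistics so the optimizer has current column demographics.
COLLECT STATISTICS ON fin_cfr.fact_ledger COLUMN (acct_id);

-- 3. Check row distribution across AMPs to spot skew on the primary index.
SELECT HASHAMP(HASHBUCKET(HASHROW(acct_id))) AS amp_no,
       COUNT(*) AS row_cnt
FROM fin_cfr.fact_ledger
GROUP BY 1
ORDER BY row_cnt DESC;
```

A heavily skewed result from step 3 typically suggests revisiting the primary index choice before deeper tuning.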
Environment: Teradata V2R5.x/6.x, SQL Assistant 7.2, SQL Server, MLoad, FastLoad, FastExport,
TPump, BTEQ, Unix, MicroStrategy 9, Ab Initio GDE 1.x, SAPGUI, Insight, iXp,
Queryman, TPT (Teradata Parallel Transporter), Autosys.
PEPSICO – Dallas, TX Jan 07 - July 09
ETL QA Analyst
PepsiCo is a manufacturer and supplier of beverages in many countries. To expand its business,
it acquired other companies such as Frito-Lay, Quaker and Gatorade. This project
consolidates the data from Pepsi and its recent mergers into one single data
warehouse.
Responsibilities:
• Generating scripts using Teradata utilities (FastLoad, MLoad, FastExport, TPump, BTEQ).
• Creating wrapper scripts to implement Teradata MLoads effectively and to collect error
counts to report to the source team.
• Producing scripts to validate the files received from the source systems, making sure
the files were received in full and are good to load into the tables.
• Working with different sources using ETL tools such as Informatica.
• Creating several mappings to implement business logic using ETL tools.
• Fine tuning and Optimization of complex queries to enhance performance.
• Analyzing the SQLs for common mistakes and making the necessary modifications.
• Modifying the queries to use the Teradata features for performance improvement.
• Creating and enhancing stored procedures and ad hoc queries in SQL and SAS.
• Perform Code reviews, Unit tests, Test plans & Cases.
• Designed/Created several BTEQ scripts with data transformations for loading the base
tables, applying the business rules and manipulating and/or cleansing the data according to the
requirements.
• Creating Workflows to run the mappings and to perform data loads.
• Verified Log files, Data sets and Reports by logging into the Mainframe.
• Extensively used various performance tuning techniques to improve system
performance (PPIs, compression techniques).
• Unit/Integration Testing of application.
• Writing verification / validation test procedures using SQL.
• Created and loaded views to extend the usability of data and enhance performance.
• Perform Planning, Coding and Testing and Deployment of the solution.
• Analyzing the data to make sure it can be used for reporting purposes.
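A minimal BTEQ load step of the kind described above might look like the following sketch. The logon details, schema and table names, the transformation rules, and the return-code convention are assumptions for illustration:

```sql
.LOGON tdpid/etl_user,password;

-- Hypothetical base-table load with simple business-rule cleansing.
INSERT INTO pep_dw.base_sales
SELECT src.store_id,
       src.sale_dt,
       COALESCE(src.net_amt, 0)        -- cleanse: default missing amounts to 0
FROM   pep_stg.stg_sales src
WHERE  src.sale_dt IS NOT NULL;        -- rule: reject undated rows

-- Abort with a non-zero code so a wrapper script can detect and report it.
.IF ERRORCODE <> 0 THEN .QUIT 8;

.LOGOFF;
.QUIT 0;
```

A shell wrapper would then test the BTEQ exit code and forward the error counts to the source team.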
Environment: Teradata V2R5.x/6.x, MLoad, FastLoad, FastExport, TPump, BTEQ, Unix,
Queryman, Teradata Manager, TDQM, Informatica Power Center/Power Mart 6.x, Mainframe,
MLDM, SQL Assistant 6.2, SAS, Oracle, SQL Server, Windows NT/2000.
GE CAPITAL - CT Aug 05 - Dec 06
ETL QA Analyst
This project involves the creation of monthly performance reports (MPR) for the Hispanic
Segment by leveraging the code from the USM, performing ad-hoc queries using SQL and SAS,
and importing data into MS-Excel or MS-Access based on the volume of data.
Responsibilities:
• Perform Ad-Hoc queries & extracting data from existing data stores (Data Pulling).
• Creation of monthly performance reports (MPR) for the Hispanic Segment.
• Used Rational Test Manager to create the structure of the test cases and associate them with the
written test scripts.
• Create aggregations on the resultant Data Set.
• Generate Ad-hoc data queries utilizing SQL and SAS.
• Mining data to derive specific and mission critical information for the organization.
• Interact with teams and leverage their existing SQL scripts for Marketing & Analysis
needs.
• Tested the User Interface of the applications and validated its data with the backend
data.
• Provide feasibility analysis from a data perspective in response to business requests.
• Prepared the Test Data required for Testing.
• Copying production data to test environment for testing.
• Export data into MS-Excel or MS-Access.
• Create Pivot tables on the resultant Data Set.
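An ad-hoc aggregation of the kind pulled for the MPR and then pivoted in Excel could be sketched as below. The table, column names and segment code are hypothetical, not the actual GE data stores:

```sql
-- Hypothetical monthly roll-up for one customer segment.
SELECT EXTRACT(YEAR FROM txn_dt)  AS yr,
       EXTRACT(MONTH FROM txn_dt) AS mth,
       COUNT(DISTINCT cust_id)    AS active_custs,   -- customers seen in month
       SUM(txn_amt)               AS total_spend     -- segment spend per month
FROM   mkt.card_txns
WHERE  segment_cd = 'HISP'
GROUP BY 1, 2
ORDER BY 1, 2;
```

The result set would then be exported to MS-Excel or MS-Access for pivoting and report formatting.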
Environment: Oracle 8i, Teradata Database V2R5.x/6.x, Erwin, PL/SQL, SQL, SAS, Unix (Korn
Shell), MVS, MS-Access, MS-Office, Cognos, Windows-XP, TOAD
CINGULAR – TX June 04 - Jul 05
PRDW (Patron Research Data Warehouse)
DW QA Analyst
PRDW (Patron Research Data Warehouse) was developed to analyze the effectiveness of the
marketing campaigns for the incentives and offerings offered by Cingular. The ETL process involved
extracting the data from SQL Server and Oracle sources and loading them into Teradata Data
Warehouse. The data was used to analyze customer opinions, loyalty, customer profiles and
customer satisfaction.
Responsibilities
• Experience in using various Teradata utilities like Fastexport, Fastload,
Multiload, BTEQ, TPUMP etc.
• Familiar with concepts involving different types of Indexes, Joins, Spaces
etc. for optimum performance of Teradata.
• Development of BTEQ/FastLoad/MultiLoad scripts for loading purposes.
• Designed/Created several BTEQ scripts to apply the business rules and
manipulate and/or massage the data according to the requirements.
• Designed/Developed scripts to move data from the staging tables to the
target tables.
• Designed the order of execution for the jobs and scheduled them.
• Fine tuning of SQL to optimize the performance, spool space usage and
CPU usage.
• Responsible for migration and production support.
• Created stored procedures for various business functions.
• Involved in UAT testing and assisted the users in testing the applications
according to their business needs.
• Generated reports for analyzing various customer patterns.
• Extraction and transformation were performed using SQL, PL/SQL and
embedded-SQL COBOL programs to build the company data
warehouse.
• Wrote hundreds of DDL scripts to create tables, views and indexes in the
company Data Warehouse.
• Provided on-call trouble-shooting support for all applications run in the west
coast data center with responsibility for all applications involved in Data
Warehouse Services.
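A FastLoad script of the kind developed above follows this general shape. The file layout, delimiter, and staging-table names are illustrative assumptions, not the actual PRDW objects:

```sql
LOGON tdpid/etl_user,password;

-- Hypothetical staging load from a pipe-delimited source extract.
BEGIN LOADING prdw_stg.stg_campaign
      ERRORFILES prdw_stg.stg_campaign_e1, prdw_stg.stg_campaign_e2;

SET RECORD VARTEXT "|";

DEFINE cust_id     (VARCHAR(18)),
       campaign_cd (VARCHAR(10)),
       resp_dt     (VARCHAR(10))
FILE = campaign_extract.dat;

INSERT INTO prdw_stg.stg_campaign
VALUES (:cust_id, :campaign_cd, :resp_dt);

END LOADING;
LOGOFF;
```

Rows rejected on conversion or duplication land in the two error tables; a BTEQ step would then move clean staging rows to the target tables.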
Environment: Informatica PowerCenter/PowerMart 6.x, Informatica PowerExchange, IBM-AIX,
Teradata Database V2R5.x/6.x, NCR Servers, UNIX, Perl, Teradata Utilities (BTEQ, FastLoad,
MultiLoad, TPump), Teradata SQL Assistant.