Dinesh Kurumati
**** * ********* ****. # ****, Irving, TX 75063
acghrd@r.postjobfree.com
. Over 8 years of IT experience in the analysis, design, development,
testing and implementation of ETL & Business Intelligence solutions
using Data Warehouse/Data Mart design, ETL, OLAP and Client/Server
applications.
. Strong knowledge of Business Intelligence, Data Warehousing and
Business Architecture
. Extensive ETL testing experience using Informatica 8.6/8.1/7.1/6.2/5.1
(Power Center/Power Mart) (Designer, Workflow Manager, Workflow
Monitor and Server Manager)
. Experienced with different databases including Oracle, DB2, Teradata,
Sybase and SQL Server 2008
. Extensively used ETL methodology for supporting data extraction,
transformation and loading in a corporate-wide ETL solution using
Informatica, DataStage, Ab Initio and SSIS
. Good expertise in using TOAD, SQL*Plus and SQL Advantage.
. Experience in testing with HP Quality Center.
. Experience in Dimensional Data Modeling using Star and Snowflake
schemas.
. Designed data models using Erwin.
. Experience in automated testing of WEB-based applications using
Selenium IDE/RC.
. Solid Back End Testing experience by writing and executing SQL
Queries.
. Co-ordinate testing efforts between onsite and offshore teams.
. Has expertise in Test Case Design, Test Tool Usage, Test Execution,
and Defect Management.
. Experience in UNIX shell scripting and configuring cron jobs for
scheduling Informatica sessions
. Experience in testing and writing SQL and PL/SQL statements.
. Experience in writing test cases to test the application manually and
automated using Quick Test Pro.
. Experience in Business Intelligence in generating various reports
using Cognos and BO.
. Expertise in Developing PL/SQL Packages, Stored Procedures/Functions,
triggers.
. Expertise in utilizing the Oracle utility SQL*Loader and in Toad for
developing Oracle applications.
. Extensively worked on Dimensional modeling, Data cleansing and Data
Staging of operational sources using ETL processes.
. Extensive Working experience in applying Relational Database concepts,
Entity Relation diagrams and Normalization concepts.
. Experience in Performance Tuning of sources, targets, mappings and
sessions.
. Automated and scheduled the Informatica jobs using UNIX Shell
Scripting.
. Experience in all phases of Software Development life cycle.
. Application data warehousing experience in the Banking, Financial,
Trading, Retail and Consumer domains.
. Performed Unit, Integration and System Testing.
. Good analytical and problem solving skills.
. Team player with excellent Organization and Interpersonal skills and
ability to work independently and deliver on time.
Education: Kakatiya University, India
ETL Tools Informatica 8.1/7.1/6.2/5.1 (Power Mart/Power
Center) (Designer, Workflow Manager, Workflow
Monitor, Server Manager, Power Connect), Data
Stage 8.1, Ab Initio (GDE 1.15, Co>Op 2.15), SSIS
Data Modeling Erwin 4.0/3.5, Star Schema Modeling, Snowflake
Modeling
Databases Oracle 10g/9i/8i, MS SQL Server 2008/2005/2000,
DB2, Teradata, Sybase 12.5
OLAP Tools Cognos 8.0/7.0/6.0
Languages SQL, PL/SQL, Unix Shell Script, PERL, Visual
Basic, XML
Tools Toad, SQL* Loader, Teradata SQL Assistant, WinSQL,
BI Studio
Operating Systems Windows 2003/2000/NT, AIX, Sun Solaris, Linux
Testing Tools QTP 9.0, Quality Center 10.0, Test Director,
Bugzilla
Rational Tools RUP, Rational Rose, Rational Clear Case, Clear
Quest, Test Manager, Robot, Rational Requisite Pro
Bank of America, Dallas, TX
April 2012 - Present
ETL Manual Tester/ QA - Home Loans Mortgage
Bank of America is a global financial institution with branches spread all
over the country. It deals primarily with mortgage home loans and with
changes to its existing systems for better customer satisfaction and
profits. The project involves an application called DCM, which processes
the letters sent to customers, such as monthly statements, year-end
statements and welcome letters. A print vendor associated with the
application handles the printing and mailing of the documents to the
customer. Worked on validating the data received by the customer, the
reports, and the end-to-end flow of each letter.
Responsibilities:
. Developed test plans based on the test strategy; created and executed
test cases based on the test strategy, test plans and ETL mapping
document.
. Involved in understanding the ETL mapping document and Source to
Target mappings.
. Prepared the test plans and test cases for different functionalities
involved in the application.
. Used Quality Center as the test management tool for maintaining the
test cases and tracking the defects.
. Extracted data from SQL Server & iSeries using Informatica Power
Center ETL and DTS packages to the target database, including SQL
Server, and used the data for reporting purposes
. Used Quality Center as the repository for the requirements.
. Performed the Smoke, Functional testing, Ad Hoc testing and Regression
testing of the application.
. Established and organized offshore development teams.
. Involved in creating the test design techniques and testing
specifications for the ETL process.
. Worked on Informatica Power Center tools - Source Analyzer, Warehouse
Designer, Mapping & Mapplet Designer and Transformation Developer.
. Designed and tested UNIX shell scripts as part of the ETL process to
automate loading and pulling of data.
. Interacted with functional and end users to gather requirements for
the core reporting system, to understand the features users expected
from the ETL and reporting systems, and to successfully implement the
business logic.
. Wrote several PL/SQL stored procedures and tested their output
against the ETL output.
. Performed tests in the SIT, QA and contingency/backup environments
. Performed all aspects of verification, validation including
functional, structural, regression, load and system testing
. Defects were tracked, analyzed, and documented using Quality Center
. Worked with Informatica session log files for data reconciliation and
verifying the data conditions.
. Conducted defect review meetings with the development team members
. Involved in peer review of test cases, Test Pool meetings and Impact
analysis
. Involved in testing the applications in Target host in IBM Mainframe
environment
. Involved in production support as needed
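The data-reconciliation and end-to-end validation work described above follows a common source-to-target pattern, sketched below. The table names, columns and rows are illustrative assumptions, and an in-memory SQLite database stands in for the actual SQL Server/iSeries sources:

```python
import sqlite3

# Minimal sketch of a source-to-target reconciliation check; schema and
# data are hypothetical, SQLite stands in for the project databases.
conn = sqlite3.connect(":memory:")
cur = conn.cursor()
cur.execute("CREATE TABLE src_letters (letter_id INTEGER, letter_type TEXT)")
cur.execute("CREATE TABLE tgt_letters (letter_id INTEGER, letter_type TEXT)")
cur.executemany("INSERT INTO src_letters VALUES (?, ?)",
                [(1, "MONTHLY"), (2, "YEAREND"), (3, "WELCOME")])
cur.executemany("INSERT INTO tgt_letters VALUES (?, ?)",
                [(1, "MONTHLY"), (2, "YEAREND"), (3, "WELCOME")])

# Row-count reconciliation: source and target totals must agree.
src_count = cur.execute("SELECT COUNT(*) FROM src_letters").fetchone()[0]
tgt_count = cur.execute("SELECT COUNT(*) FROM tgt_letters").fetchone()[0]
assert src_count == tgt_count, f"count mismatch: {src_count} vs {tgt_count}"

# Minus/EXCEPT check: any source row absent from the target is a defect.
missing = cur.execute(
    "SELECT * FROM src_letters EXCEPT SELECT * FROM tgt_letters").fetchall()
print(len(missing))  # 0 when the load is complete
```

In practice the same two queries (a count comparison plus a minus query) are run per mapping against the real source and target connections.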
Paychex, Rochester, NY
Jan 2011 - April 2012
Sr. SQL/ETL/BI/UNIX Tester/Analyst CMS (Customer Management System)
The Enterprise Data Warehouse implements a centralized database that
collects and organizes the CMS (Customer Management System) data from
operational systems to provide a single source of integrated, historical
data for end-user reporting, analysis and decision support, and improves
client service by preventing errors, providing real-time data and updating
records as transactions are completed. The project involved extracting data
from different sources and loading it into data marts/DWH based on the
business requirements and mapping rules. We were primarily involved in
loading and transforming the data into the staging/bridge/target area and
then into the data warehouse. Worked on the Enterprise Data Warehouse for
extensive reporting, pulling dashboard and drill-down reports for business
decisions on various customer data.
Responsibilities:
. Responsible for Business analysis and requirements gathering.
. Tested several UNIX shell scripts for file validation, data parsing,
cleanup and job scheduling.
. Involved in understanding Logical and Physical Data model using Erwin
Tool.
. Involved in writing Test Plan, Developed Test Cases and supported all
environments for testing.
. Reported bugs and tracked defects using Quality Center.
. Extensively used Informatica to load data from Flat Files to Teradata,
Teradata to Flat Files and Teradata to Teradata.
. Worked on a Business Intelligence reporting system that was
primarily functioning on Oracle Applications OLTP environment with
Business Objects for Business Intelligence reporting
. Tested the reports using Business Objects functionalities like
Queries, Slice and Dice, Drill Down, Cross Tab, Master Detail and
Formulae etc.
. Responsible for testing Business Reports developed by Business Objects
XIR2
. Conducted Weekly Onsite-Offshore Status Meetings.
. This work included loading historical data onto the Teradata
platform, as well as reference data and metadata.
. Tested new Data migration tool used in the mapping of data to new
system.
. Migrated raw data to Excel documents for clients
. Involved with discussion with Business for supporting functional
specifications.
. Involved in creating the test design techniques and testing
specifications for the ETL process.
. Worked on Informatica Power Center tools - Source Analyzer, Warehouse
Designer, Mapping & Mapplet Designer and Transformation Developer.
. Designed and tested UNIX shell scripts as part of the ETL process to
automate loading and pulling of data.
. Interacted with functional and end users to gather requirements for
the core reporting system, to understand the features users expected
from the ETL and reporting systems, and to successfully implement the
business logic.
. Wrote several PL/SQL stored procedures and tested their output
against the ETL output.
. Tested data warehouse ETL process using SSIS (Integration Service)
. Created ETL test data for all ETL mapping rules to test the
functionality of the SSIS Packages
. Prepared extensive set of validation test cases to verify the data
. Tested different type of reports including detailed, summary reports,
KPI's and on demand ad-hoc reports.
. Involved in full cycle of code management and reported defects and
initiated Change requests using Rational Clear Quest.
. Used data sets provided by the Business Analyst as the expected
results.
. Used TOAD to perform manual tests on a regular basis; UNIX and Oracle
were used in this project to write shell scripts and SQL queries.
. Performed tests in the SIT, QA and contingency/backup environments
. Performed all aspects of verification, validation including
functional, structural, regression, load and system testing
. Defects were tracked, analyzed, and documented using Quality Center
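The ETL test data created for the mapping rules above amounts to re-deriving the expected target from the source and diffing it against what the packages actually loaded. A rough sketch follows; the mapping rule here (trimming names, defaulting NULLs to 'UNKNOWN') and all table names are assumptions for illustration, with SQLite standing in for the project databases:

```python
import sqlite3

# Hypothetical staging and warehouse tables for a mapping-rule test.
conn = sqlite3.connect(":memory:")
cur = conn.cursor()
cur.execute("CREATE TABLE stg_customer (cust_id INTEGER, cust_name TEXT)")
cur.execute("CREATE TABLE dwh_customer (cust_id INTEGER, cust_name TEXT)")
cur.executemany("INSERT INTO stg_customer VALUES (?, ?)",
                [(1, "  Alice "), (2, None), (3, "Bob")])
# Rows as the (assumed) ETL package would have loaded them.
cur.executemany("INSERT INTO dwh_customer VALUES (?, ?)",
                [(1, "Alice"), (2, "UNKNOWN"), (3, "Bob")])

# Re-derive the expected target from the source using the mapping rule,
# then diff it against what the ETL actually loaded.
mismatches = cur.execute("""
    SELECT cust_id, TRIM(COALESCE(cust_name, 'UNKNOWN')) FROM stg_customer
    EXCEPT
    SELECT cust_id, cust_name FROM dwh_customer
""").fetchall()
print(len(mismatches))  # 0 when every row obeys the mapping rule
```

Each ETL mapping rule in the mapping document gets one such expected-vs-actual query, which makes the validation test cases mechanical to execute and easy to rerun in regression.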
Environment: Business Objects XIR2/6.5.1/6.0/5.1.x/4.0, SQL, PL/SQL, SQL
Server 2008, SSIS, XML, XSD, XSLT, XML Spy 2010, Korn shell scripting,
Informatica 8.6, Teradata V2R6 (SQL Assistant, FastLoad, MultiLoad,
TPump, BTEQ, FastExport), HP Quality Center 10, Oracle 10g, Erwin 4.0, UNIX,
TOAD, WinSQL, IBM DB2 9.1
MetLife, New York, NY
Mar 2008 - Dec 2010
ETL/Data Warehouse Tester - EDWH Group
MetLife is a global insurance organization serving businesses and
individuals with a wide range of insurance products and insurance-related
services, including Auto, Personal, Casualty and Life insurance. The Claims
Management system, together with Investments (Mutual, REI, Fixed Income,
etc.), provides the technology that assists claim professionals in
administering claim practices in a timely and effective manner. This
application involves the design and development of the data warehouse. The
company's data, coming from operational sources such as Oracle, SQL Server
and DB2, is loaded into the claims data warehouse built on Oracle using
Informatica. Various business rules and business processes were applied to
extract, transform and load the data into the data warehouse.
Responsibilities:
. Developed test plans based on the test strategy; created and executed
test cases based on the test strategy, test plans and ETL mapping
document.
. Involved in understanding the ETL mapping document and Source to
Target mappings.
. Prepared the test plans and test cases for different functionalities
involved in the application.
. Used Quality Center as the test management tool for maintaining the
test cases and tracking the defects.
. Extracted Data from Teradata using Informatica Power Center ETL and
DTS Packages to the target database including SQL Server and used the
data for Reporting purposes
. Used Quality Center as the repository for the requirements.
. Performed the Smoke, Functional testing, Ad Hoc testing and Regression
testing of the application.
. Established and organized offshore development teams.
. Involved extensively in UAT testing of various functionalities along
with the implementation team; resolved issues and problems. Used Test
Director for bug tracking.
. Involved in testing cubes for the data warehouse
. Wrote several UNIX scripts for invoking data reconciliation.
. Execute and analyze complex test cases and compare the results with
expected results and product requirements.
. Tested ad hoc and canned reports for Business Objects.
. Tested Business Objects reports and Web Intelligence reports.
. Managed user accounts and security using Business Objects Supervisor
. Tested the universes and reports in Business Objects 6.0
. Tested database integrity, referential integrity and constraints
during the database migration testing process.
. Responsible for the management, design, and continuous development of
the Data Migration.
. Involved in Teradata SQL Development, Unit Testing and Performance
Tuning
. Used query software for Oracle, and Teradata SQL Assistant for
querying Teradata
. Responsible for different Data mapping activities from Source systems
to Teradata
. Performed Regression Testing of the affected test cases to confirm
that the defect has been resolved when defect corrections are
delivered in a build.
. Tested different detail, summary reports and on demand reports.
. Wrote several PL/SQL stored procedures and tested their output
against the ETL output.
. Involved in backend testing for the front end of the application
using SQL queries in the SQL Server database.
. This work included loading historical data onto the Teradata
platform, as well as reference data and metadata.
. Involved in designing logical and physical data models using the
Erwin tool; responsible for different data mapping activities from
source systems to Teradata.
. Handling the daily status meetings with QA team and business people in
prioritizing the defects as per the business need.
. Extensively used SQL programming in backend and front-end functions,
procedures, packages to implement business rules and security
. Experience in loading from various data sources like Oracle, DB2, SQL
Server, Fixed Width and Delimited Flat Files, MS Access, COBOL files &
XML Files.
. Tested transformed data from various sources such as Excel and text
files into the reporting database to support the analytical reporting
system.
. Solid testing experience in working with SQL Stored Procedures,
triggers, views and worked with performance tuning of complex SQL
queries.
. Wrote test plans and test cases in Mercury's Test Director tool.
. Defects identified in the testing environment were communicated to
the developers using the defect tracking tool Mercury Test Director.
. Optimizing/Tuning several complex SQL queries for better performance and
efficiency.
. Created various PL/SQL stored procedures for dropping and recreating
indexes on target tables.
. Worked on issues with migration from development to testing.
. Tested UNIX shell scripts as part of the ETL process to automate
loading and pulling of data.
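The referential-integrity and constraint testing listed above typically reduces to orphan-key queries against the migrated tables. A minimal sketch, with a hypothetical claims schema and SQLite standing in for the Oracle warehouse:

```python
import sqlite3

# Hypothetical dimension and fact tables for a referential-integrity check.
conn = sqlite3.connect(":memory:")
cur = conn.cursor()
cur.execute("CREATE TABLE dim_policy (policy_id INTEGER PRIMARY KEY)")
cur.execute("CREATE TABLE fact_claim (claim_id INTEGER, policy_id INTEGER)")
cur.executemany("INSERT INTO dim_policy VALUES (?)", [(10,), (11,)])
cur.executemany("INSERT INTO fact_claim VALUES (?, ?)",
                [(1, 10), (2, 11), (3, 99)])  # 99 has no parent: an orphan

# Fact rows whose foreign key has no matching dimension row are FK failures.
orphans = cur.execute("""
    SELECT f.claim_id FROM fact_claim f
    LEFT OUTER JOIN dim_policy d ON f.policy_id = d.policy_id
    WHERE d.policy_id IS NULL
""").fetchall()
print(len(orphans))  # 1: claim 3 references a missing policy
```

Running one such left-outer-join query per parent/child pair after migration surfaces every constraint violation without relying on the target database enforcing the FKs itself.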
Environment: Informatica 8.1/7.1/6.2/5.1, Teradata V2R6 (MLOAD, FLOAD, FAST
EXPORT, BTEQ), Teradata SQL Assistant 7.0, SQL Server 2005/2008, Mercury
Test Director 8.0, QTP 8.0, SQL*Loader, Business Objects
XIR2/6.5.1/6.0/5.1.x/4.0, Oracle 9i, Erwin 3.5, Windows 2000, TOAD 7, Shell
Scripting, PERL, IBM AIX 5.2, WinSQL, SQL*Plus, XSD, XML, VSAM files,
MVS, COBOL II, ISPF, JCL
Freddie Mac, McLean, VA
Oct 2007 - Feb 2008
Data Stage ETL Tester - Mortgage Securities Data Mart - Data Loading Team
Net Yield Statistics (NYS): Net Yield Statistics is one of the CDW
(Corporate Data Warehouse) work streams; it provides a history of Single
Family and Multi Family statistical, General Ledger and aggregate
calculation results for inclusion in Net Yield Statistics reports, and
stores each statistical result, which can be calculated or overridden by a
user. NYS calculates the interest rates on yearly fixed-rate mortgages
(FRM) and adjustable-rate mortgages (ARM). Net Yield Statistics is
responsible for financial reporting, a complex process that includes
preparing Freddie Mac's financial information and reports and ensuring that
they are accurate, complete and in compliance with the laws and regulations
that govern them. Information gathered for the financial reporting process
comes from all over the company and contains both financial and operational
data. The data originates in a number of subsystems and subledgers and is
summarized and compiled in the General Ledger. The timeliness and accuracy
of this information is critical for Freddie Mac to successfully complete
its financial reporting deliverables.
Responsibilities:
. Reviewing the business requirements for Data Warehouse ETL process and
working with business and requirements team for gaps found during the
review.
. Developed a detailed Test Plan, Test strategy, Test Data Management
Plan, Test Summary Report based on Business requirements
specifications.
. Wrote extensive SQL and PL/SQL scripts to test the ETL flow, Data
Reconciliation, Initialization, and Change Data Capture, Delta
Processing.
. Prepared a UAT Plan and set up UAT environment.
. Prepared an execution procedure document format for preparing the
test cases based on the mapping document.
. Defined and ran jobs using the AutoSys Graphical User Interface (GUI)
and Job Information Language (JIL), monitoring and managing AutoSys
jobs and alarms.
. Ran jobs using the AutoSys to load the data from Source to Target.
. Executed DataStage jobs to test the ETL processes.
. Developed Strategies for Data Analysis and Data Validation.
. Used the DataStage Designer to develop various job processes for
extracting, cleansing, transforming, integrating and loading data
into the data warehouse database
. Used the DataStage Director to schedule jobs, test and debug their
components, and monitor results
. Utilized Cognos 7.3 Series for building the Reports.
. Involved in testing the Cognos reports by writing complex SQL queries.
. Worked with DataStage Manager for importing metadata from repository,
new job Categories and creating new data elements.
. Validated the Data in the Warehouse using SQL queries
. Tested the reports generated by the BI tools (MicroStrategy,
Hyperion) and validated the data on the reports
. Validated test cases for Source to Target mappings (STM).
. Validated data flow from source through to FACT tables and Dimension
tables using complex queries (left outer joins, subqueries, etc.)
. Validated data model for landing zone, staging zone and Warehouse
tables for column length and data type consistencies
. Validated FK failures and data integrity checks for all the tables.
. Validated Business codes, Error process and Purge process.
. Extensively used UNIX shell scripts and compared the files in SAS to
make sure the extracted data is correct.
. Involved in developing scenarios of testing and maintaining testing
standards
. Involved in Business functionality review meetings and Use-Case
Analysis
. Participated in requirement/use-case analysis, risk analysis and
configuration management.
. Developed the templates for User/Customer Training and documentation.
. Participated in developing and implementing End-End testing manually.
. Used SharePoint for document management and Dimensions for version
control.
. Involved in setting up QA environment for testing the applications.
. Used Quality Center and Test Director for defect tracking and
reporting
. Assisted Team members in knowledge transfer
. Involved in Regression, UAT and Integration testing
. Coordinated with Different project teams to set up common test
environment and common Integration for different applications
. Conducted defect review meetings with the development team members
. Involved in peer review of test cases, Test Pool meetings and Impact
analysis
. Involved in testing the applications in Target host in IBM Mainframe
environment
. Used QMF for writing and executing DB2 queries in the host
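The landing/staging-zone validations above (duplicate keys, column length and data-type consistency) can be sketched as simple profiling queries. The table, the natural key and the 3-character length limit are all illustrative assumptions, with SQLite standing in for the warehouse:

```python
import sqlite3

# Hypothetical staging table for duplicate-key and length-limit checks.
conn = sqlite3.connect(":memory:")
cur = conn.cursor()
cur.execute("CREATE TABLE stg_rates (loan_id TEXT, rate_code TEXT)")
cur.executemany("INSERT INTO stg_rates VALUES (?, ?)",
                [("L1", "FRM"), ("L2", "ARM"), ("L2", "ARM"),  # dup key
                 ("L3", "TOOLONGCODE")])                       # bad length

# Duplicate-key check: each loan_id should appear exactly once.
dups = cur.execute("""
    SELECT loan_id FROM stg_rates GROUP BY loan_id HAVING COUNT(*) > 1
""").fetchall()

# Length check against the (assumed) data-model limit of 3 characters.
too_long = cur.execute(
    "SELECT loan_id FROM stg_rates WHERE LENGTH(rate_code) > 3").fetchall()

print(len(dups), len(too_long))  # one duplicate key, one oversized code
```

Each check is run per staging table before the load promotes data to the warehouse zone, so purge and error processing can quarantine the offending rows.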
Environment: DataStage 7 (Designer, Director, Manager), Windows XP, Cognos
7.3 Series, Oracle 10g, TOAD for Oracle, SQL Developer, Embarcadero Rapid
SQL 7.5, IBM DB2 8.1, Sybase, DART (Desktop Asset Request & Tracking),
Rational ClearCase v7, Telelogic DOORS 8.3, UNIX, Reflections, Telnet,
Mercury Quality Center, PVCS (Dimensions), MS Project, Hyperion Performance
Suite 8.5, AutoSys, SAS, Mainframe, MicroStrategy
Wipro Systems, Chennai, INDIA
Oct 2006 - Aug 2007
Client: Wells Fargo Financial Services, San Francisco, CA
Programmer/QA Engineer - Credit/Market Risk Compliance Data Warehouse
Responsibilities:
. Involved in Data Analysis, Database Designing.
. Involved in creation of tables, writing stored procedures, Triggers,
PL/SQL packages.
. Developed user interface screens using VB, ADO to connect to backend
and stored procedures to manipulate data.
. Handling user defined exceptions.
. Developed PL/SQL scripts for data loading.
. Analysis of Physical Data Model for ETL mapping.
. Identification of various Data Sources and Development Environment.
. Writing Triggers enforcing Integrity constraints, Stored Procedures
for Complex mappings, and cursors for data extraction.
Environment: Informatica 6.2, BO 6.5, Oracle 9i, Windows XP, Flat files,
VSAM Files, XML Files, XSD, IBM AIX 5.1, PERL, Shell Scripting, Autosys