Harika
Richardson, TX - ****1 571-***-****
e: *************@*****.***
Over 7 years of IT experience in the analysis, design, development,
testing and implementation of ETL & Business Intelligence solutions using
Data Warehouse/Data Mart design, ETL, OLAP and Client/Server applications.
. Strong knowledge of Business Intelligence, Data Warehousing and
Business Architecture
. Extensive ETL testing experience using Informatica 9.1/8.1/7.1/6.2/5.1
  (Power Center/Power Mart): Designer, Workflow Manager, Workflow
  Monitor and Server Manager.
. Experienced in different Databases like Oracle, DB2, Teradata, Sybase
& SQL Server 2008
. Good expertise in using TOAD, SQL*Plus and SQL Advantage.
. Experience in Dimensional Data Modeling using Star and Snow Flake
Schema.
. Designed data models using Erwin.
. Solid Back End Testing experience by writing and executing SQL
Queries.
. Experience in Data Analysis, Data Validation, Data Cleansing, Data
Verification and identifying data mismatch.
. Has expertise in Test Case Design, Test Tool Usage, Test Execution,
and Defect Management.
. Experience in testing and writing SQL and PL/SQL statements.
. Experience in writing test cases to test applications manually and in
  automating them using Quick Test Pro.
. Expertise in Developing PL/SQL Packages, Stored Procedures/Functions,
triggers.
. Extensively worked on Dimensional modeling, Data cleansing and Data
Staging of operational sources using ETL processes.
. Extensive working experience in applying relational database concepts,
  Entity-Relationship diagrams and normalization concepts.
. Experience in Performance Tuning of sources, targets, mappings and
sessions.
. Automated and scheduled the Informatica jobs using UNIX Shell
Scripting.
. Experience in all phases of Software Development life cycle.
. Data warehousing application experience in the Banking, Financial,
  Trading, Retail and Consumer domains.
. Team player with excellent Organization and Interpersonal skills and
ability to work independently and deliver on time.
Technical Skills:
ETL Tools Informatica 9.1/8.1/7.2/6.1 (Power Mart/Power
Center) (Designer, Workflow Manager, Workflow
Monitor, Server Manager, Power Connect),
Data Modeling Erwin 4.0/3.5, Star Schema Modeling, Snow Flake
Modeling
Databases Oracle 10g/9i/8i, MS SQL Server 2008/2005/2000,
DB2, Teradata, Sybase 12.5
OLAP Tools OBIEE 8.0/7.0/6.0
Languages SQL, PL/SQL, Unix Shell Script, PERL, Visual
Basic, XML
Tools Toad, SQL*Loader, Teradata SQL Assistant, WinSQL,
BI Studio
Operating Systems Windows 2003/2000/NT, AIX, Sun Solaris, Linux
Testing Tools Test Director, Bugzilla
Rational Tools RUP, Rational Rose, Rational ClearCase, ClearQuest,
Test Manager, Robot, Rational RequisitePro
Barclays, NYC, NY
Artemis Data warehouse (ADW)
Jan 2012 - Dec 2012
Sr. ETL/BI Tester
The Artemis project was divided into 14 work streams; my work stream was
work stream 06 (Banking, Cashiering, Credit & Retirement). Barclays
Investment Banking specializes in innovative solutions, drawing on
expertise from across the full spectrum of products: debt and equity
underwriting, sales and trading, mergers and acquisitions, investment
research, and correspondent and prime brokerage services.
CFT Migration Project
Jan 2013 - Till Date
Sr. ETL/UNIX Tester
Barclays publishes and delivers millions of index attributes for over
30,000 benchmarks on a daily basis. The process that generates and
delivers this data is maintained by the Index Production team and runs on
the Solaris 10 platform. The project is to migrate the legacy CFT Index
Production application from Solaris to Linux.
Responsibilities:
. Responsible for Business analysis and requirements gathering.
. Tested several UNIX shell scripts for file validation, data parsing,
  cleanup and job scheduling.
. Involved in understanding Logical and Physical Data model using Erwin
Tool.
. Involved in writing Test Plan, Developed Test Cases and supported all
environments for testing.
. Involved with discussion with Business for supporting functional
specifications.
. Involved in creating the test design techniques and testing
specifications for the ETL process.
. Supported the extraction, transformation and load process (ETL) for a
Data Warehouse from their legacy systems using Informatica and provide
technical support and hands-on mentoring in the use of Informatica for
testing.
. Performed functional testing and automated test cases using QTP.
. Promoted Unix/Informatica application releases from development to QA
and to UAT environments as required.
. Worked with Quality Center to document test results and validate test
  scripts.
. Used Quality Center in routing defects, defect resolution and
assigning business cases to business users.
. Reviewed Informatica mappings and test cases before delivering to
Client.
. Performed front-end testing on the OBIEE Executive Dashboard portal.
. Assisted in System Test and UAT testing scenarios as required.
. Designed and tested UNIX shell scripts as part of the ETL process to
  automate the loading and pulling of data.
. Developed test plans & test Cases for the project.
. Verified the processes on Solaris and set up benchmarks and processes
  on Linux to ensure the system on Linux worked as it did on Solaris.
. Automated and scheduled the Informatica jobs using UNIX Shell
Scripting.
. Performed system testing and regression testing.
. Interacted with the production and business teams to make sure
requirements are as expected.
. Extensively used Informatica power center for extraction,
transformation and loading process.
. Extensively used Informatica tool to extract, transform and load the
data from Oracle to DB2.
. Checked the data flow through the front and backend and used SQL
queries to extract the data from the database.
. Performed back-end testing by writing and executing SQL queries; a
  representative source-to-target comparison is sketched after this list.
. Extensively used SQL programming in backend and front-end functions,
procedures, packages to implement business rules and security
. Fine-tuned for performance and incorporated changes to complex PL/SQL
  procedures/packages for updating the existing dimension tables using
  PL/SQL Developer on Oracle 8i RDBMS.
. Performed backend database testing by writing SQL and PL/SQL scripts
to verify data integrity
. Interacted with functional/end users to gather requirements for the core
  reporting system, understand the features users expected from the ETL
  and reporting system, and successfully implement the business logic.
. Wrote several PL/SQL stored procedures and tested the output of the
  stored procedures against the ETL output.
. Created ETL test data for all ETL mapping rules to test the
functionality of the SSIS Packages
. Prepared an extensive set of validation test cases to verify the data
  against the data sets provided by the Business Analyst as the expected
  results.
. Used TOAD to perform manual tests on a regular basis; used UNIX and
  Oracle in this project to write shell scripts and SQL queries.
. Performed the tests in the SIT, QA and contingency/backup
  environments.
. Performed all aspects of verification and validation including
  functional, structural, regression, load and system testing.
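A minimal sketch of the source-to-target comparison referenced above, in
Oracle SQL; the table and column names (src_index_attribute,
tgt_index_attribute, index_id, attr_code, attr_value) are hypothetical
placeholders, assuming both sides are reachable from one test schema.

    -- Row-count comparison between the extracted source and the loaded target
    SELECT (SELECT COUNT(*) FROM src_index_attribute) AS src_cnt,
           (SELECT COUNT(*) FROM tgt_index_attribute) AS tgt_cnt
      FROM dual;

    -- Rows present in the source but missing from the target
    SELECT index_id, attr_code, attr_value FROM src_index_attribute
    MINUS
    SELECT index_id, attr_code, attr_value FROM tgt_index_attribute;

    -- Duplicate check on the target's natural key
    SELECT index_id, attr_code, COUNT(*) AS dup_cnt
      FROM tgt_index_attribute
     GROUP BY index_id, attr_code
    HAVING COUNT(*) > 1;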
Environment: Informatica 9.1/8.1/7.2/6.1, SQL, Solaris 10, OBIEE, PL/SQL,
SQL Server 2008, SSIS, XML, XSD, XSLT, XML Spy 2010, HP QTP 10.0,
HP Quality Center 10.0, Korn shell scripting, Teradata V2R6 (SQL Assistant,
FastLoad, MultiLoad, TPump, BTEQ, FastExport), Oracle 10g, Erwin 4.0, UNIX,
TOAD, WinSQL, IBM DB2 9.1
MetLife, NYC, NY
Mar 2009 - Dec 2011
ETL/Data Warehouse Tester - EDWH Group
MetLife is a global insurance organization serving businesses and
individuals with a wide range of insurance products and insurance-related
services, including Auto Insurance, Personal Insurance, Casualty
Insurance, and Life Insurance. The Claims Management system, along with
Investments (Mutual, Rei, Fixed Income, etc.), provides the technology
that assists claim professionals in administering claim practices in a
timely and effective manner. This application involves the design and
development of the Data Warehouse. The company's data, coming from
different operational sources such as Oracle, SQL Server, and DB2, is
loaded into the Claims Data Warehouse built on Oracle using Informatica.
Various business rules and business processes were applied to Extract,
Transform and Load the data into the Data Warehouse.
Responsibilities:
. Developed test plans based on the test strategy; created and executed
  test cases based on the test strategy, test plans and the ETL mapping
  document.
. Involved in understanding the ETL mapping document and Source to
Target mappings.
. Prepared the test plans and test cases for different functionalities
involved in the application.
. Used Quality Center as the test management tool for the maintenance of
  the test cases and to track the defects.
. Involved in error checking and testing of the ETL procedures and
  programs using the Informatica session log.
. Tested the ETL Informatica mappings and other ETL Processes (Data
Warehouse Testing).
. Used Quality Center as the repository for the requirements.
. Performed the Smoke, Functional testing, Ad Hoc testing and Regression
testing of the application.
. Executed SQL queries to test the integrity of data in the database
  (backend testing); a sample integrity check is sketched after this list.
. Performed backend and database testing against Oracle, MS SQL Server
  and MS Access databases.
. Involved extensively in UAT testing of various functionalities along
  with the implementation team; resolved issues and problems. Used Test
  Director for bug tracking.
. Involved in testing cubes for the data warehouse.
. Executed and analyzed complex test cases and compared the results with
  the expected results and product requirements.
. Tracked and reported bugs using Quality Center.
. Automated detailed test cases by using Quick Test Pro.
. Used Quick Test Pro to write the automated test scripts by using
various actions and reusable actions.
. Performed Regression Testing of the affected test cases to confirm
that the defect has been resolved when defect corrections are
delivered in a build.
. Tested different detail, summary reports and on demand reports.
. Involved in backend testing of the front end of the application using
  SQL queries in the SQL Server database.
. Responsible for designing, developing and testing the software
  (Informatica, PL/SQL, UNIX shell scripts) that maintains the data marts
  (loading data and analyzing it using OLAP tools).
. Used all the Informatica PowerCenter 9.1.0 client components, such as
  Designer, Workflow Manager and Workflow Monitor.
. Used Informatica PowerCenter 8.6.1 for extraction, transformation and
  loading (ETL) of data in the data warehouse.
. Handled daily status meetings with the QA team and business people to
  prioritize the defects as per the business need.
. Solid testing experience in working with SQL Stored Procedures,
triggers, views and worked with performance tuning of complex SQL
queries.
. Written Test Plans and Test Cases on Mercury's Test Director Tool.
. Defects identified in the testing environment were communicated to the
  developers using the defect-tracking tool Mercury Test Director.
. Optimized/tuned several complex SQL queries for better performance and
  efficiency.
. Created various PL/SQL stored procedures for dropping and recreating
indexes on target tables.
. Worked on issues with migration from development to testing.
. Tested UNIX shell scripts that, as part of the ETL process, automate
  the loading and pulling of data.
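A minimal sketch of the backend integrity checks referenced above, in
Oracle SQL; claim_detail, claim_id, claim_status and load_date are
hypothetical names, not the actual warehouse schema.

    -- A mandatory attribute should never be NULL after the ETL load
    SELECT COUNT(*) AS missing_status
      FROM claim_detail
     WHERE claim_status IS NULL;

    -- The natural key should be unique in the target table
    SELECT claim_id, COUNT(*) AS dup_cnt
      FROM claim_detail
     GROUP BY claim_id
    HAVING COUNT(*) > 1;

    -- Sanity check that loaded dates fall inside the expected extract window
    SELECT MIN(load_date) AS earliest_load, MAX(load_date) AS latest_load
      FROM claim_detail;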
Environment: Informatica 8.1/7.1/6.2/5.1, HP QTP 10.0, HP Quality Center
10.0, Teradata V2R6 (MLOAD, FLOAD, FastExport, BTEQ), Teradata SQL
Assistant 7.0, SQL Server 2005/2008, Mercury Test Director 8.0,
SQL*Loader, SSIS, Oracle 9i, Erwin 3.5, Windows 2000, TOAD 7, Shell
Scripting, PERL, IBM AIX 5.2, WinSQL, SQL*Plus, XSD, XML, VSAM files, MVS,
COBOL II, ISPF, JCL
Freddie Mac, McLean, VA
Oct 2007 - Feb 2009
ETL Tester - Mortgage Securities Data Mart - Data Loading Team
Net Yield Statistics (NYS): Net Yield Statistics is one of the CDW
(Corporate Data Warehouse) work streams; it provides the history of Single
Family and Multi Family statistical, General Ledger and aggregate
calculation results for inclusion in Net Yield Statistics reports, and
also stores each statistical result that can be calculated or overridden
by a user. NYS calculates the interest rates on yearly fixed-rate
mortgages (FRM) and adjustable-rate mortgages (ARM). Net Yield Statistics
is responsible for financial reporting, a complex process that includes
preparing and ensuring that Freddie Mac's financial information and
reports are accurate, complete and in compliance with the laws and
regulations that govern them.
Information gathered for the financial reporting process comes from all
over the company and contains both financial and operational data. The data
originates in a number of subsystems and sub ledgers and is summarized and
compiled in the General Ledger. The timeliness and accuracy of this
information is critical for Freddie Mac to successfully complete its
financial reporting deliverables.
Responsibilities:
. Reviewing the business requirements for Data Warehouse ETL process and
working with business and requirements team for gaps found during the
review.
. Developed a detailed Test Plan, Test strategy, Test Data Management
Plan, Test Summary Report based on Business requirements
specifications.
. Wrote extensive SQL and PL/SQL scripts to test the ETL flow, data
  reconciliation, initialization, Change Data Capture and delta
  processing.
. Worked on Informatica Power Center tool - Source Analyzer, warehouse
designer, Mapping Designer, Mapplet Designer and Transformation
Developer
. Extensively used Informatica power center for ETL process.
. Prepared an execution procedure document format for preparing the test
  cases based on the mapping document.
. Extensively used ETL methodology for testing and supporting data
  extraction, transformation and loading processing in a corporate-wide
  ETL solution using Informatica.
. Worked on Informatica Power Center tool - Source Analyzer, Data
warehousing designer, Mapping & Mapplet Designer and Transformation.
. Promoted Unix/Informatica application releases from development to QA
  and to UAT environments as required.
. Validated the Data in the Warehouse using SQL queries
. Validated test cases for Source to Target mappings (STM).
. Validated data flow from the source through to FACT tables and
  dimension tables using complex queries (left outer joins, sub-queries,
  etc.); a sample orphan-key check is sketched after this list.
. Validated data model for landing zone, staging zone and Warehouse
tables for column length and data type consistencies
. Validated FK failures and data integrity checks for all the tables.
. Validated Business codes, Error process and Purge process.
. Extensively used UNIX shell scripts and compared the files in SAS to
make sure the extracted data is correct.
. Involved in developing scenarios of testing and maintaining testing
standards
. Involved in Business functionality review meetings and Use-Case
Analysis
. Developed the templates for User/Customer Training and documentation.
. Participated in developing and implementing End-End testing manually.
. Used SharePoint for document management and Dimensions for version
  control.
. Involved in setting up QA environment for testing the applications.
. Assisted Team members in knowledge transfer
. Involved in Regression, UAT and Integration testing
. Coordinated with Different project teams to set up common test
environment and common Integration for different applications
. Conducted defect review meetings with the development team members
. Involved in peer review of test cases, Test Pool meetings and Impact
analysis
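A minimal sketch of the orphan-key and staging-to-warehouse checks
referenced above, in Oracle SQL; nys_stat_fact, loan_group_dim,
stg_nys_stat and their columns are hypothetical names, and :business_dt is
a bind variable supplied at run time.

    -- Fact rows whose foreign key does not resolve to a dimension row
    SELECT f.stat_result_id, f.loan_group_key
      FROM nys_stat_fact f
      LEFT OUTER JOIN loan_group_dim d
        ON f.loan_group_key = d.loan_group_key
     WHERE d.loan_group_key IS NULL;

    -- Column-level comparison of the staging zone against the warehouse
    SELECT loan_group_key, stat_type_cd, stat_value
      FROM stg_nys_stat
     WHERE business_dt = :business_dt
    MINUS
    SELECT loan_group_key, stat_type_cd, stat_value
      FROM nys_stat_fact
     WHERE business_dt = :business_dt;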
Environment: Informatica 7.1.x, Windows XP, Oracle 10g, TOAD for Oracle,
SQL Developer, Embarcadero Rapid SQL 7.5, IBM DB2 8.1, Sybase, DART
(Desktop Asset Request & Tracking), Rational ClearCase v7, Telelogic DOORS
8.3, UNIX, Reflections, Telnet, Mercury, PVCS (Dimensions), MS Project,
Hyperion Performance Suite 8.5, AutoSys Jobs, SAS, Mainframe, MicroStrategy
Wells Fargo Financial Services, SFO, CA
Mar 2007 - Aug 2007
Programmer/QA Engineer - Credit/Market Risk Compliance Data Warehouse
The system was primarily written to provide reports for their portfolio
managers who could use these reports to provide their customers with the
highest total rate of return consistent with their performance goal, cash
flow requirements, need for liquidity, and preservation of capital. The
reports also provided their portfolio managers the unique objectives of
each client with due regard to tax consideration, preferences toward
quality and market risk. The reports also provided their portfolio
managers information to help their customers identify investment
opportunities that provide above average potential for price appreciation
and total return. Reports were created using the Reporting Tool Business
Objects.
Responsibilities:
. Involved in Data Analysis, Database Designing.
. Involved in creation of tables, writing stored procedures, Triggers,
PL/SQL packages.
. Developed user interface screens using VB, ADO to connect to backend
and stored procedures to manipulate data.
. Handling user defined exceptions.
. Developed PL/SQL scripts for data loading.
. Analysis of Physical Data Model for ETL mapping.
. Identification of various Data Sources and Development Environment.
. Writing Triggers enforcing Integrity constraints, Stored Procedures
for Complex mappings, and cursors for data extraction.
. Worked extensively with mappings using expressions, aggregators,
filters, lookup and procedures to develop and feed Data Mart.
. Extensively used ETL to load data from flat files, Excel and MS Access,
  involving both fixed-width and delimited files, as well as from the
  relational database, which was Oracle 9i.
. Developed and tested all the Informatica mappings, sessions and
  workflows, involving several tasks.
. Worked on Dimension as well as Fact tables, developed mappings and
loaded data on to the relational database.
. Responsible for creating business solutions for Incremental and Full
loads.
. Worked extensively on different types of transformations like Source
  Qualifier, Expression, Filter, Aggregator, Update Strategy, Lookup,
  Sequence Generator, Joiner and Stored Procedure; a sample aggregate
  reconciliation check is sketched after this list.
. Generated reports using Desk-I and InfoView with analytical ability and
  creativity, using multi-data providers, synchronization, BO formulas,
  variables, sections, breaks, formatting, drilldowns, hyperlinks, etc.
. Designed Universes with data modeling techniques by implementing
  contexts, aggregate awareness, hierarchies, predefined conditions,
  linking, joining tables, indicating cardinalities, creating aliases to
  resolve the loops, subdividing into contexts and creating the objects,
  which are grouped into classes.
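A minimal sketch of the aggregate reconciliation check referenced above,
in Oracle SQL; portfolio_txn_detail, portfolio_month_agg and their columns
are hypothetical names.

    -- Accounts and months where the aggregated mart figure does not match
    -- the sum of the detail rows it was built from
    SELECT d.account_id,
           a.month_key,
           SUM(d.txn_amount)  AS detail_total,
           MAX(a.month_total) AS mart_total
      FROM portfolio_txn_detail d
      JOIN portfolio_month_agg a
        ON a.account_id = d.account_id
       AND a.month_key  = TO_CHAR(d.txn_dt, 'YYYYMM')
     GROUP BY d.account_id, a.month_key
    HAVING SUM(d.txn_amount) <> MAX(a.month_total);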
Environment: Informatica 6.2, BO 6.5, Oracle 9i, Windows XP, Flat files,
VSAM Files, XML Files, XSD, IBM AIX 5.1, PERL, Shell Scripting, Autosys