
Sajini A Naik

acp9ak@r.postjobfree.com

614-***-****

Experience Summary

Over 6 years of total experience in Software Quality Assurance (QA), covering Data Warehouse Testing, Database (ETL & BI), Web, and Client-Server systems and applications for the Banking, Insurance, and Telecom domains.

Experience in defining testing methodologies, creating test plans and test cases, and verifying and validating application software and documentation based on standards for software development and effective QA implementation in all phases of the Software Development Life Cycle (SDLC).

Hands-on experience with Linux, Solaris, Windows 2000, and MS Windows XP.

Experience in ETL-based testing, functional testing, web-based testing, and Oracle database testing, mainly with Ab Initio, ensuring that quality Ab Initio code is delivered to the user by testing each piece of functionality in the code.

Experience in Data warehouse and Data migration Technologies.

Experienced in system, end-to-end, functional, regression, performance, and compatibility testing techniques.

Experience in troubleshooting and performance enhancement through regression testing and providing feedback on where the most time is being consumed.

Experience working with Agile and Waterfall methodologies.

Programming skills in Ab Initio, SQL, PL/SQL, and UNIX shell scripting.

Good understanding of SDLC and STLC

Good communication & analytical skills, process driven, excellent documentation and reporting skills.

Trained in testing tools such as WinRunner, LoadRunner, and QTP. Participated in all phases of the Software Development Life Cycle (SDLC).

Solid Back End Testing experience by writing and executing SQL Queries.

Experience in Data Analysis, Data Validation, Data Cleansing, Data Verification and identifying data mismatch.

Strong working experience with DSS (Decision Support System) applications and with Extraction, Transformation and Load (ETL) of data from legacy systems using Ab Initio.

Expertise in extended validation of report functionality developed in Business Objects, by writing complex SQL at the backend.

Good understanding of ETL, data warehouse, and BI application integration.

Good experience in working with onsite-offshore team model projects.

Technology

Below is a list of all important software products, tools and methods that I have worked with.

Qualifications

Degree and Date | Institute | Major and Specialization

Bachelor of Engineering | VCET | Computer Science

Diploma in Computer Science | VCET | Computer Science

Assignments

The details of the various assignments that I have handled are listed here, most recent first.

Project Name: NMS Pre-Foreclosure Solution

Duration: June 2014 – March 2015

End Client: Citi Mortgage

About the project

The intent of this request is to create a supported workflow tool that provides users with a frontend interface for monitoring, QC, workflow management, accurate inventory, and productivity reporting. These efforts assist the business in tracking and managing pre-foreclosure activities and assignments, with additional lift from checklists, document tracking, and other manual entry forms and calculations.

The National Mortgage Settlement has directed the 5 largest mortgage servicers to comply with a list of revised servicing standards to mitigate alleged deficiencies in the servicing process which allegedly may have led to improper foreclosures and borrower financial harm.

Systems involved: CitiMortgage, Citibank, and CitiFinancial must comply with a number of updated servicing standards as outlined in the formal NMS settlement document.

Goals of the tool (PreFCL) effort include:

Give users an automated tool for a process currently run manually

Simplify the database technology

Simplify the workflow process

Make the process system-driven

Role: ETL Tester / QA Analyst

Activities

• Identified field and data defects, with the required supporting information, in the ETL process across various jobs and one-to-one mappings.

• Tested Complex ETL Mappings and Sessions based on business user requirements and business rules to load data from Mainframe and RDBMS tables to target tables.

• Validated the data in the reports by writing simple to complex SQL queries against the ODS.

• Ran workflows through Control-M jobs; attended client calls and provided updates on testing status.

• Optimizing/Tuning several complex SQL queries for better performance and efficiency.

• Prepared the test cases after thorough analysis of each requirement.

• Prepared test data by modifying the sample data in the source systems, to cover all the requirements and scenarios.

• Conducted and actively participated in reviews, walkthroughs of test cases.

• Reviewed the ETL mappings (Ab initio) to ensure the transformation rules are applied correctly.

• Executed the workflows in the workflow manager, to start the ETL process.

• Created and executed SQL queries to perform source-to-target testing on Oracle and SQL Server databases; a minimal query sketch follows this activities list. Performed the following tests:

o Data completeness

o Validity

o Uniqueness

o Data integrity

o Data transformation

o Data quality

o Initial and incremental load tests

• Executing and monitoring jobs through Control-M.

• Extensively used Control-M jobs to load data from Mainframe to Oracle and from Oracle to SQL Server.

• Designed and developed UNIX shell scripts as part of the ETL process to automate loading and pulling data for testing ETL loads.

• Tested the entire data reconciliation process for multiple source and target systems.

• Involved in creating the test data for generating sample test reports before releasing to production.

• Performed data validation testing by writing SQL queries. Tested ETL batch processes using UNIX shell scripting.

• Wrote complex SQL scripts using joins, subqueries, and correlated subqueries.

• Provided input into project plans for database projects providing timelines and resources required.

• Involved in the application tuning of database by identifying the suitable Indexes.

• Worked on UNIX Shell wrapper scripts

• Worked on issues with migration from development to testing.

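For illustration only, a minimal sketch of the kind of source-to-target validation queries referred to above; the table names SRC_ACCOUNT and TGT_ACCOUNT and their columns are hypothetical, not the actual project schema.

    -- Row-count comparison between source and target (completeness)
    SELECT (SELECT COUNT(*) FROM SRC_ACCOUNT) AS src_count,
           (SELECT COUNT(*) FROM TGT_ACCOUNT) AS tgt_count
    FROM   dual;

    -- Keys present in the source but missing from the target (data integrity)
    SELECT s.account_id FROM SRC_ACCOUNT s
    MINUS
    SELECT t.account_id FROM TGT_ACCOUNT t;

    -- Correlated-subquery variant of the same check
    SELECT s.account_id
    FROM   SRC_ACCOUNT s
    WHERE  NOT EXISTS (SELECT 1 FROM TGT_ACCOUNT t WHERE t.account_id = s.account_id);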

Environment: SQL, Quality Center 9.2, Oracle 10g, Control-M, Ab Initio, SQL Server 2000/2005, IBM Mainframe, UNIX

Location: Bangalore, India

Project Name: Enterprise System Re-Engineering

Duration: April 2012 – Oct 2013

Client: JPMorgan Chase

About the project

Enterprise Systems (ES) is a technology group within the Consumer and Business Banking organization. The bulk of what ES does centers on a set of systems called Enterprise Customer Deposits Systems (ECD). ECD systems are mission-critical and perform the most basic functions for JPMC, such as maintaining customers' accounts, balances, transaction history, and customer information. There are over 100 million customers and over 40 million bank accounts stored in these systems, and trillions of dollars in financial transactions are processed per day.

As a result, Enterprise Systems has started a platform re-engineering initiative.

The project is a migration from a COBOL applications and IMS databases model to a Java applications and DB2 databases model.

Goals of the re-engineering effort include:

Move Enterprise Systems to forward-looking technology

Enable distributed processing on less expensive platforms

Simplify the database technology

Reuse processes between our applications

Expose the functionality to our Channel Partners as Web Services.

Role: QA Analyst

Activities

Developed Test Plans, test scripts and executed the test scripts.

Throughout the project, across various iterations/sprints, conducted system testing, integration testing, functional testing, and regression testing.

Developed complex SQL queries for querying data against different databases for the data verification process.

Prepared the Test Plan and Testing Strategies for Data Warehousing Application

Developed ETL test scripts based on technical specifications/Data design documents and Source to Target mappings.

Extensively interacted with developers, business & management teams to understand the business requirements and ETL design document specifications.

Participated in regular project status meetings and QA status meetings.

Extensively used and developed SQL scripts/queries in backend testing of Databases.

Wrote test cases to test the application in Quality Center.

Defects identified in the testing environment were communicated to the developers using the Quality Center Defects module.

Prepared daily status reports with details of executed, passed, and failed test cases and defect status.

Tested a number of complex ETL mappings, mapplets and reusable transformations for daily data loads.

Creating test cases for ETL mappings and design documents for production support

Setting up, monitoring and using Job Control System in QA

Worked with ETL groups, the acquisition team, and business analysts to understand the mappings for dimensions and facts.

Performed Count Validation, Dimensional Analysis, Statistical Analysis and Data Quality Validation in Data Migration.

Extracted data from various sources like Oracle, flat files

Worked on issues with migration of Data from development to QA-Test Environment

Extensively tested several Business Objects reports to validate the report data and cosmetics.
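As an illustration of the backend report validation described above, a sketch of an aggregation query whose results would be compared against the corresponding report figures; DEPOSIT_FACT, its columns, and the date are hypothetical examples, not the actual project schema.

    -- Aggregate balances by account type for a given business date
    SELECT account_type,
           COUNT(*)         AS account_count,
           SUM(balance_amt) AS total_balance
    FROM   DEPOSIT_FACT
    WHERE  as_of_date = DATE '2013-06-30'
    GROUP  BY account_type
    ORDER  BY account_type;
    -- Each aggregated figure is compared against the matching total shown in the report.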

Environment: UNIX, Java, Oracle 10g, SOAPUI, QC, BO

Location: Columbus, Ohio

Project Name: Linux Migration

Duration: April 2010 – Feb 2011

Client: L&G (Legal & General)

About the project

The Business Intelligence Competency Centre (BICC/S&C) is an initiative that aims to provide leadership to Legal & General in its growing use of Business Intelligence technology (using Ab Initio). Ab Initio applications across environments are being migrated from the UNIX platform to the Linux platform in order to save the costs associated with the HP UNIX OS. When this project started, there were 40+ applications running on HP UNIX.

The purpose of the project was to achieve cost savings by migrating to Linux servers, while adding functionality and changing the mapping process for the source systems. More than 40 feeds from different source systems landed in the data warehouse.

Role: Sr. ETL Tester

Activities

Involved in gathering requirements and analysis for applications.

Developed Test Plans, Test Cases, Test Scripts

Developed test cases based on test matrix including test data preparation for Data Completeness, Data Transformations, Data quality, Performance and scalability.

Developed SQL queries/scripts to validate the data, such as checking for duplicates, null values, and truncated values, and ensuring correct data aggregations (a sample sketch follows this activities list).

Performed data quality analysis using advanced SQL skills.

Performed Count Validation, Dimensional Analysis, Statistical Analysis and Data Quality Validation in Data Migration.

Performed data validation between the old system (UNIX) and the new system (Linux).

Tested staging tables using SQL and UNIX scripts.

Used Ab Initio as an ETL tool in developing the data warehouse and for creating test data.

Created test cases in Quality Center and the RTM, and mapped test cases to requirements in ReqPro.

Developed the traceability matrix and test coverage reports.

Managed and conducted System testing, Integration testing, Functional testing, UAT and Regression testing.

Used the Ab Initio designer to develop processes for extracting, cleansing, transforming, integrating, and loading data into the data warehouse database.

Responsibilities include acceptance testing, functional testing, Integration testing, System testing.

Loaded data to different databases using SQL scripts and maintained a repository for data loading scripts

Used Shell scripts extensively for automation of file manipulation and data loading procedures.

Tracked the defects using Quality Center and generated defect summary report

Prepared status summary reports with details of executed, passed and failed test case

Interacted with developers, Business & Management Teams and End Users

Participated in regular project status meetings related to testing
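A minimal sketch, for illustration only, of the duplicate/null/truncation checks and the UNIX-vs-Linux output comparison mentioned above; STG_POLICY, OLD_FEED_RESULT, NEW_FEED_RESULT, and their columns are hypothetical names, not the actual L&G schema.

    -- Duplicate check on the business key
    SELECT policy_no, COUNT(*)
    FROM   STG_POLICY
    GROUP  BY policy_no
    HAVING COUNT(*) > 1;

    -- Null and truncation checks on a mandatory column (target assumed VARCHAR2(50) in this example)
    SELECT COUNT(*)  FROM STG_POLICY WHERE policy_holder_name IS NULL;
    SELECT policy_no FROM STG_POLICY WHERE LENGTH(policy_holder_name) > 50;

    -- Old (UNIX-loaded) vs new (Linux-loaded) output comparison; both queries should return no rows
    SELECT * FROM OLD_FEED_RESULT
    MINUS
    SELECT * FROM NEW_FEED_RESULT;

    SELECT * FROM NEW_FEED_RESULT
    MINUS
    SELECT * FROM OLD_FEED_RESULT;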

Environment: Quality Center, Java, J2EE, XML, Oracle 10g, TOAD, DB2, UNIX shell scripting, MS Project, SQL Navigator, Windows XP, Ab Initio/BO

Location: Bangalore, India

Project Name: Broadband Data Synchronization

Duration: Sep 2006 – Apr 2010

Client: BT.com

About the project

As part of the migration from Classic to the SFJ provisioning stack, a process was required to keep service inventory data synchronized across SIDB, SPACE and CDA platforms.

This SFJ synchronization process is to be developed and run on the BIP platform.

This process is designed to run 24x7 and captures and forwards service inventory updates frequently throughout the day. The Service Inventory Database (SIDB, held in an Oracle database) holds the BIP view of the current inventory for each service instance and version, with details of the latest BIP update timestamp and the latest donor system that updated the inventory.

Role: ETL Tester

Activities

Performed the Product testing for checking the functionality of the application.

Handled order management system (OMS) functions such as order entry, sales analysis, inventory planning, and accounting, among others.

Queried databases with SQL to obtain sample data for billing verification.

Prepared Functional Requirements Documents to redesign two IDMS legacy systems into a DATACOM DB environment.

Developed a smooth handoff scheme in Mobile IP to decrease the handoff time between networks, together with a Linux-based implementation.

Defined defects and produced test reports using Quality Center as the defect tracking system.

Responsible for validations using SQL Queries

Worked with the components of Data Warehousing like ODS, ETL tools.

Developed Actuate reports and SQL Server stored procedures.

Identified the source tables from which the data warehouse extracts data.

Followed the Software Development Life Cycle in all stages of the project.

Prepared test data for positive and negative test scenarios for Functional Testing as documented in the test plan.
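Purely as an illustration of positive and negative test data of the kind mentioned above; STG_SERVICE_ORDER and its columns are hypothetical, not the actual BT schema.

    -- Positive scenario: a valid record expected to load successfully
    INSERT INTO STG_SERVICE_ORDER (order_id, service_id, order_status) VALUES (1001, 'BB1001', 'ACTIVE');

    -- Negative scenario: an invalid status expected to be rejected by the load validation
    INSERT INTO STG_SERVICE_ORDER (order_id, service_id, order_status) VALUES (1002, 'BB1002', 'XX');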

Environment: Quality Center, VSS 6.0, WebLogic 8.1, SQL Server 2000, Web Services, HTML, Oracle 8i, Windows NT, Ab Initio

Location: Pune, India

Career Profile

Dates | Organization | Designation

June 2014 – March 2015 | Citi Mortgage / TCS / Anush Infobase | ETL Tester/QA Analyst

April 2012 – Oct 2013 | JPMorgan Chase / TCS / ittblazers | QA Analyst

Apr 2010 – Feb 2011 | L&G | Sr. ETL Tester

Mar 2008 – Apr 2010 | MBT | ETL Tester

Sep 2006 – Mar 2008 | MBT | ETL Tester

Training / Continuing Education Summary

Program or Course | Year

Attended Hadoop certification course | Dec 2014

Attended training in Ab Initio/Informatica | Apr 2014

Project Lead Training | Aug 2008

Advanced Concepts in Software Testing | Nov 2008

WinRunner, LoadRunner, QTP | Nov 2006


