
Big Data Testing Analyst

Location:
McKinney, TX
Salary:
As per company standard
Posted:
October 27, 2023


Resume:

PROFESSIONAL SUMMARY:

About ** + years of IT experience, with strong data and testing analyst experience across Data Warehouse, Business Intelligence, and Big Data programs and projects

Strong SQL-writing skills across database environments such as Oracle, Teradata, MS SQL Server, MySQL, NoSQL stores, and Hive

Worked on Migration, Conversion, ETL, Big Data, Data Warehouse, and Application testing projects

Well versed in SDLC and Agile methodologies across the Banking and Financial, Life Sciences, Insurance, and GIS business domains

Extensive experience in QA process design and implementation, estimation, and strategic QA planning

Experience with Hadoop, and with Query IT and Express IT alongside Ab Initio

Good knowledge of Excel macros, PL/SQL, BTEQ, and UNIX shell scripting

Fairly good knowledge, from a QA perspective, of ETL tools such as Informatica, DataStage, Ab Initio, and SSIS, and BI tools such as Cognos and QlikView, plus OLAP methodology

Use advanced Excel for data analysis and data validation

Experienced in performing data analysis from source systems to data marts and providing root-cause analysis for defects

Experience with environments such as UNIX and Windows, and a theoretical understanding of Hadoop and Big Data

Experience working with cross-functional teams to execute organization-wide projects and programs

Quick learner with the ability to grasp new technologies; energetic and self-motivated team player

Proven ability to work in both independent and team environments

EDUCATION

Title of the Degree with Branch | College/University | Year of Passing

Bachelor's Degree in Electronics & Communication Engineering | Visvesvaraya Technological University, Belgaum, India | 2003

TECHNICAL SKILLS

Area | Skills

RDBMS | Oracle, Teradata, MySQL, SQL Server

Domains | Banking and Financial Services, Life Sciences, Insurance, GIS

Operating Systems | Windows 2000, Windows NT, UNIX

Testing Concepts / ETL Testing | Test Strategy, Test Analysis, Test Design, UAT (User Acceptance Testing)

Tools | Toad, SQL*Plus, HP ALM, MQC (Mercury Quality Center), JIRA, SFTP, SCP (Secure Copy), Oracle Data Pump Utility, Control-M, SVN (Subversion), SharePoint

Programming | SQL, PL/SQL, Basic UNIX Scripting, BTEQ (Teradata)

ETL/Reporting Tools | Informatica, DataStage, Ab Initio, Cognos, QlikView

EMPLOYMENT HISTORY:

Name of the Company | Designation | Duration

TEKSYSTEMS/COMPUNEL | Data Quality Analyst | 2+ years

SiriusXM Satellite Radio | Data and Quality Engineer | 10 months

Bank of America | AVP Projects | 1 year 9 months

Texas Health Resources | Quality Assurance II | 3 months

Cognizant Technology Solutions | Sr. Associate | 10 years 3 months

Patni Computer Systems | Sr. Software Engineer | 3 years

Certification:

Oracle 9i Certified Associate in SQL and PL/SQL

PROJECT PROFILE

Thermo Fisher (TekSystems) Sep 2022 – Oct 2022

Data Quality Analyst

Worked mainly on the Redshift database and SQL Server (using SQL Developer) against specific business requirements

Worked as QA coordinator across different development teams

Raised and tracked defects in JIRA through closure

Performed SQL scripting and query-based validations

Held defect triage meetings for each sprint

Environment: Redshift, SQL Server, JIRA, Oracle, SQL Developer

USAA - Plano, TX (TekSystems) Mar 2020 – Aug 2022

Data Quality Analyst

Worked on a data migration project from Netezza to Snowflake (cloud)

Validated data using Python-based automation

Performed six different sets of test scenarios, including record count, integer summation, data mismatch, primary-key uniqueness, not-null, and tokenization checks

Validated and verified data in all environments, including QA, UAT, and PRODUCTION

Also worked with the Deposit ART, performing data warehousing QA activities including PROD validations

Also worked with Information Management on table/column-level metadata verification and ingestion using IGC

Raised and tracked defects in JIRA through closure

Performed SQL scripting and query-based validations

Held defect triage meetings for each sprint

Environment: Netezza, Snowflake, JIRA, Oracle
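
Migration checks of this kind reduce to running the same aggregate query against source and target and comparing the results. The snippet below is a minimal sketch only: it uses in-memory SQLite as a stand-in for the actual Netezza and Snowflake connections, and the `customer` table and column names are hypothetical.

```python
import sqlite3

# Stand-ins for the source (Netezza) and target (Snowflake) connections;
# the table and column names here are hypothetical.
src = sqlite3.connect(":memory:")
tgt = sqlite3.connect(":memory:")
for conn in (src, tgt):
    conn.execute("CREATE TABLE customer (cust_id INTEGER, balance INTEGER)")
    conn.executemany("INSERT INTO customer VALUES (?, ?)",
                     [(1, 100), (2, 250), (3, 175)])

def run_checks(src, tgt, table, key, num_col):
    """Record-count, summation, primary-uniqueness, and not-null checks."""
    results = {}
    # Record count: both sides must hold the same number of rows.
    q_count = f"SELECT COUNT(*) FROM {table}"
    results["record_count"] = (src.execute(q_count).fetchone()[0]
                               == tgt.execute(q_count).fetchone()[0])
    # Summation of an integer column must match across sides.
    q_sum = f"SELECT SUM({num_col}) FROM {table}"
    results["summation"] = (src.execute(q_sum).fetchone()[0]
                            == tgt.execute(q_sum).fetchone()[0])
    # Primary uniqueness: no duplicate key values in the target.
    q_uniq = f"SELECT COUNT(*) - COUNT(DISTINCT {key}) FROM {table}"
    results["primary_uniqueness"] = tgt.execute(q_uniq).fetchone()[0] == 0
    # Not-null check on the key column in the target.
    q_null = f"SELECT COUNT(*) FROM {table} WHERE {key} IS NULL"
    results["not_null"] = tgt.execute(q_null).fetchone()[0] == 0
    return results

print(run_checks(src, tgt, "customer", "cust_id", "balance"))
```

In practice each check would be parameterized over the full table list and the pass/fail results logged per table, with failures raised as JIRA defects.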

SiriusXM Satellite Radio - Irving, TX May 2019 – Feb 2020

Data and Quality Engineer

Worked on Informatica ETL, Hadoop, Big Data, Oracle, Teradata, and Python scripting

Performed data warehouse data analysis for marketing campaigns

Work included stories delivered in weekly sprint releases

Used Python scripting to extract data from SQL Server for some of the validations

Performed Excel data analysis, creating mappings and validating them against PROD data

Validated and verified data in all environments, including QA, UAT, and PRODUCTION

Raised and tracked defects in JIRA through closure

Analyzed defects to find and document the root cause

Held defect triage meetings for each sprint

Environment: Informatica, Teradata, JIRA, Tidal, UNIX, Oracle, Excel, SQL Server (legacy server)

Bank of America - Plano, TX Jul ’17 – May ’19

Quality Data Analyst

Worked on ETL testing and manual testing with Informatica and Oracle

Analyzed data from different source systems and customer data

Used the mappings created by Data Analysts/BAs to create all test cases and test scripts

Validated and verified data in all environments, including QA, UAT, and PRODUCTION

Raised and tracked defects in ALM/JIRA through closure

Handled the defect triage meetings

Analyzed defects to find and document the root cause

Assigned, monitored, and managed tasks among the other onsite analysts

Environment: Informatica, Oracle, HP Quality Center ALM/JIRA, Autosys, UNIX

Texas Health Resources - Arlington, TX Mar ’17 – Jun ’17

Quality Assurance Analyst II

Worked on a healthcare data warehouse with Netezza as the database and multiple source systems (Clarity (Oracle), SQL Server, flat files)

ETL was built using IBM DataStage, with the IBM data model taken into consideration

Analyzed data from different source systems and customer data, and learned the new healthcare domain

Used the FastTrack mapping tool as the basis for all mappings created by the Data Architect when creating test cases and test scripts

Validated and verified data in all environments, including QA, UAT, and PRODUCTION

Raised and tracked defects in ALM through closure

Handled the defect triage meetings

Analyzed defects to find and document the root cause

Worked on regression and manual testing

Assigned, monitored, and managed tasks among the other analysts

Environment: Netezza, Oracle 12c, AquaData (SQL query tool), DataStage, HP Quality Center ALM

Cognizant Technology Solutions (JP Morgan Chase) - Lewisville, TX Mar ’14 – Mar ’17

Sr. DW Quality and Data Analyst

Coordinated with key stakeholders, SMEs, application owners, and other analysts for business requirement gathering and analysis

Profiled and analyzed customer data against information quality expectations and specifications

Unwound the existing logic in Oracle to create STT (Source to Target) documents using Express IT

Created mappings involving complex business rules; prepared and reviewed detailed design and technical specification documents

Wrote complex SQL in Hadoop (Hive) and Query IT to redefine business requirements from different data sources

Validated and verified data in all environments, including QA, UAT, and PRODUCTION

Executed manual test cases

Raised and tracked defects in ALM through closure

Handled the defect triage meetings

Analyzed defects to find and document the root cause

Involved in the migration of the database from Oracle to Teradata

Involved in technical discussions to resolve complex data issues, and helped developers with implementation

Assigned, monitored, and managed tasks among the other analysts

Environment: Teradata, Oracle 11g, SQL Assistant, Ab Initio, Toad for Oracle, HP Quality Center ALM, Excel macros, Hadoop, Query IT, Express IT
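
A source-to-target (STT) check of the kind described above can be expressed as a set of mapping-rule functions applied to each source row, with the computed value compared against the loaded target row. The sketch below is illustrative only: the column names and rules are hypothetical, and plain dictionaries stand in for rows fetched from the actual Oracle and Teradata tables.

```python
# Sketch of a source-to-target (STT) mapping validation. Each mapping rule
# is a function of the source row; its expected output is compared with the
# loaded target row. Column names and rules here are hypothetical.
mapping_rules = {
    "full_name": lambda s: f"{s['first_name']} {s['last_name']}".strip(),
    "status_cd": lambda s: "A" if s["active"] else "I",
}

def validate_stt(source_rows, target_rows, key="cust_id"):
    """Return a list of (key, description) defects found in the target."""
    targets = {t[key]: t for t in target_rows}
    defects = []
    for s in source_rows:
        t = targets.get(s[key])
        if t is None:
            defects.append((s[key], "row missing in target"))
            continue
        for col, rule in mapping_rules.items():
            expected = rule(s)
            if t.get(col) != expected:
                defects.append(
                    (s[key], f"{col}: expected {expected!r}, got {t.get(col)!r}"))
    return defects

source = [{"cust_id": 1, "first_name": "Ada", "last_name": "Lovelace",
           "active": True}]
target = [{"cust_id": 1, "full_name": "Ada Lovelace", "status_cd": "I"}]
print(validate_stt(source, target))
```

Each defect found this way would be logged in ALM with the offending key and column, which is what makes the later root-cause analysis straightforward.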

Cognizant Technology Solutions (Kemper Insurance, FL) Jul ’13 – Feb ’14

DW/ETL Test Lead

Involved in creating the test strategy and approach for the overall conversion testing

Understood the requirements and analyzed the source systems

Reviewed ETL specification documents and data models

Designed test plans and defined cases for functional and integration system testing

Coordinated with the onsite and offshore testing teams on a day-to-day basis for data validation and verification

Defect management, chairing the triage calls, and Quality Center administration; managed transitioning projects to offshore and knowledge transition to offshore, including onsite/offshore liaison and resourcing

Defect identification and analysis; established guidelines for defect severity and priority

Status reporting to Kemper and Cognizant program management

Liaised with design and development teams to undertake manual testing against agreed test schedules and to report results

Environment: Guidewire (ClaimCenter), Informatica, WinSQL, Toad, Quality Center

Cognizant Technology Solutions (Hartford Insurance, CT) Oct ’12 – Jul ’13

DW Functional Data and Test Analyst

Understood the requirements, found gaps in the testing, and helped the QA team understand business scenarios beyond data validation and verification

Raised issues that could lead to potential delays; frequent customer interaction and client meetings to understand the existing functionality and processes

Understood the business criticality and added the missing scenarios to help improve the overall testing process

Held defect triage with BAs, Data Architects, and developers to identify valid defects

Reviewed the QA team's work on a day-to-day basis

Built relationships to be leveraged for references to other business lines/prospects

Helped the QA team work on data quality checks and present them to the client business team

Environment: Oracle 11g, Informatica, PL/SQL packages, Toad for Oracle, HP Quality Center ALM

Cognizant Technology Solutions (Forest Laboratories, India) Sep ’11 – Sep ’12

Sr. DW Data & Test Analyst

Led and managed a team covering QA, production support, and manual testing

Managed the overall testing of the application; analyzed the design documents and understood the functionality of both source and target systems

Raised issues that could lead to potential delays

Configuration management and change management of all project artifacts

Frequent customer interaction to resolve issues; defect management in QC and weekly and monthly test report generation

Prioritized activities and maximized the effectiveness of resource allocation

Ensured adherence to project lifecycle processes, including preparation and/or review of key testing deliverables (test approaches, test strategy, master test plans, status reports, and test summary reports), and streamlined business processes

Environment: Informatica 9.1, Oracle 11g, Toad for Oracle, Quality Center, QlikView

Cognizant Technology Solutions (UBS, Zurich) Jul ’09 – Jul ’11

Test Analyst

Supported a legacy Credit Risk & Monitoring application in day-to-day execution and change request management

Understood the business requirements, analyzed the design documents, and understood the functionality of both source and target systems

Modified PL/SQL code and developed code for replicating the database using Oracle Data Pump and UNIX shell scripts

Logged defects in QC (Mercury Quality Center)

Involved in writing automation scripts to avoid manual intervention

Helped the Credit Risk business team test the code

Interacted with business users and SMEs, prepared Documents of Understanding for various applications, and obtained sign-off on them

Held knowledge transfer sessions for new team members; coordinated the overall testing activities, including manual, unit, and system testing and production implementation

Environment: Oracle, Sybase, Toad, SSH (file transfer), Control-M, jisql, Oracle Data Pump

Cognizant Technology Solutions (Lloyds TSB, UK) Aug ’07 – Jun ’09

ETL Tester

Analyzed the design documents and understood the functionality of both source and target systems

Prepared the traceability matrix and captured the complete requirements

Frequent customer interaction to resolve issues

Provided on-call support for the complete application, fixing issues as and when required

Logged defects and managed QC (Mercury Quality Center) across the complete testing life cycle

Worked on changes to PL/SQL code to improve performance

Later took on the additional responsibility of the Test Lead position

Environment: DB2, Teradata, DataStage, Quality Center, Control-M

Cognizant Technology Solutions (Merck & Co., Inc.) Dec ’06 – Jul ’07

Software Developer

Understood change requests, got them clarified by the business team, and analyzed the design documents

Customer interaction to resolve issues in the requirements

Developed PL/SQL procedures to deliver the required functionality per the changes

Made changes to Pro*C and shell script code per the requirements

Tested the overall functionality

Prepared the necessary documents, such as the unit test plan and deployment checklist

Environment: Oracle, Toad, Pro*C, UNIX shell scripting, Subversion

Patni Computer Systems (GE Capital) Feb ’06 – Dec ’06

Database Developer

Maintained the database

Made changes to PL/SQL procedures, functions, and packages

Tested the overall functionality and created unit test cases

Client interactions and attending meetings to clarify the requirements

Maintained the complete process and documentation

Environment: Oracle, Toad for Oracle, batch processing, SQL*Loader

Patni Computer Systems (Navigation Technologies) Jul ’04 – Jan ’06

Test Analyst

Validated various navigation rules on the Oracle database (Bison); GIS database testing using the GUI tool Atlas; batch validation

Compared the D96 (mainframe) reports with the Bison reports by running appropriate PL/SQL scripts and SQL queries against the Bison (Oracle) database

Determined the mismatches between the two reports and reported the legitimate mismatches

Analyzed results and reported defects according to the requirements; communicated and interacted with the client

Documented the test reports in Quality Center, noting the pass or fail of each test case with adequate supporting reasons

Environment: Atlas (GUI tool), Bison (Oracle) database, Quality Center, Query Tool (GIS)
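
A report-to-report comparison like the D96-versus-Bison check reduces to matching records on a key and flagging rows that are missing on one side or whose field values differ. The sketch below is a minimal Python illustration; the record layout (`road_id`, `speed_limit`) is made up, since the actual report formats are not described here.

```python
# Minimal sketch of a keyed report comparison: records from two report
# extracts are matched on an ID and differing rows are flagged.
# The record layout used here is hypothetical.
def compare_reports(report_a, report_b, key="road_id"):
    a_by_key = {row[key]: row for row in report_a}
    b_by_key = {row[key]: row for row in report_b}
    mismatches = []
    # Walk the union of keys so rows missing on either side are caught.
    for k in sorted(a_by_key.keys() | b_by_key.keys()):
        if k not in a_by_key or k not in b_by_key:
            mismatches.append((k, "missing in one report"))
        elif a_by_key[k] != b_by_key[k]:
            mismatches.append((k, "field values differ"))
    return mismatches

d96 = [{"road_id": 1, "speed_limit": 50}, {"road_id": 2, "speed_limit": 60}]
bison = [{"road_id": 1, "speed_limit": 50}, {"road_id": 2, "speed_limit": 65}]
print(compare_reports(d96, bison))  # [(2, 'field values differ')]
```

Only "legitimate" mismatches would then be reported; known, explainable differences between the two systems would be filtered out before logging.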

Patni Computer Systems (VMware) Jan ’04 – Jul ’04

Tester

Executed test cases to test the functionality of the product

Modified host machines by installing operating systems such as Windows and Linux as required daily for mobility testing

Carried out manual testing in multiple cycles to test both individual functionality and the integrated product

Reported bugs to the client in the Bugzilla tool, with steps to reproduce, build number and hosted branch, category, and component

Environment: VMware Server, GSX, ACE, and Workstation


