
Quality Analyst

Atlanta, Georgia, United States
August 29, 2018

Professional Profile

Total of 9 years of experience in software testing of applications in client/server environments, web-based applications, data warehousing, and business intelligence solutions.

5+ years of experience in ETL and BI testing.

Experience in functional testing in the financial and retail domains.

Experience in web services testing using SoapUI.

Successfully executed large and complex projects as a QA Test Lead and proficient in working in an Onsite/Offshore model.

Extensive experience in various phases of STLC such as Requirement Analysis, Estimation, Test planning, Test Strategy, Test Execution, Test Reporting and Test Closure Activities for various projects.

Experience working in Waterfall, Iterative, V-model, and Agile methodologies.

ISTQB certified (Foundation Level) testing professional.

Strong analytical skills to translate project requirements and Business Process Flows to appropriate test scenarios and test cases to ensure test coverage and traceability.

Ability to handle multiple projects simultaneously.

Proficient in SQL queries and UNIX commands.

Good knowledge of designing and analyzing test cases, test plans, and test procedures.

A good team player with leadership qualities, excellent communication skills, and the ability to work in a team-oriented environment.

Handled many releases as the QA SPOC and point of contact for the UAT team, interacting with senior management and clients via calls.

Created knowledge-sharing documents and conducted training sessions for new team members.

Proven track record of meeting targets under stringent deadlines. Adept at quickly picking up new technologies and domains.

Knowledge of Selenium WebDriver.

Basic hands-on training in big data testing (Cloudera, Cassandra, Hadoop, Hive).

Skill Set

Test Methodologies: Agile, Iterative and V model, Waterfall.

Testing Types: ETL Testing, BI Report Testing, Database Testing, Functional Testing, API testing, Integration Testing, Regression Testing, User Acceptance Testing, Usability Testing and UI testing.

Software Testing Tools: HP ALM, HP Quality Center, JIRA, IBM Rational ClearQuest, HP QTP 8.2

Database/ETL Tools: SQL Developer, OWB, DataStage, TOAD

Other Tools: VSS, IBM Rational ClearCase, SVN, Maven, Git/GitHub, PuTTY, WinSCP, Cardpac, Autoscore, PEGA, Cognos

Operating Systems: Windows 98, NT, XP, UNIX

Languages: C, SQL, PL/SQL, UNIX, JAVA

RDBMS: Oracle, DB2, Microsoft SQL Server

Recent Accomplishments

Project Name: IKEA BI AMS Bundle

Organization: IBM

Client: IKEA

Role: Test Lead

Period: From Feb 2012 till Jan 2015

Technologies: SQL Developer 3.1, OWB, DataStage, HP Quality Center 10.0, HP ALM 11.0, TOAD, Cognos, ClearCase, ClearQuest, PuTTY

IKEA is a privately held, Dutch company of Swedish origin that designs and sells ready-to-assemble furniture (such as beds, chairs, and desks), appliances, and home accessories. The company is the world's largest furniture retailer. The BI AMS project maintains and enhances IKEA's existing data warehouses along with 36 different reporting services. There are four BI database platforms at IKEA, namely IBIS, IDW, IDSS, and RT70, which support decision-making at the tactical and strategic levels. The BI systems are sourced from various source systems catering to the IKEA business processes of Creating, Supplying, Communicating & Selling, and Support functions.

The data is transferred through IDRS agreements directly with source systems or with integration/consolidation points like SCPIX, HELENA, EDDA, etc. IDRS is a product developed for handling the data exchange between two or more source systems/databases.

IDW is the main source for supply-chain-related data. Integration between these data warehouses is done mainly via IDRS, DB links, and FTP. The data is then transformed into fact and dimension data using the Oracle tool OWB, PL/SQL, and DataStage. There is no direct end-user access to the IDW platform; any end-user access needs to be managed through the tools used (Cognos, web/ASP solutions) to retrieve data from IDW.

Played the role of Test Lead for the IKEA BI AMS Bundle, responsible for testing Change Requests and Problem Reports.

Highly dynamic environment with Agile scrum teams

Participated in various meetings like Sprint Planning and Sprint Demos

Requirement gathering from BA and stakeholders

Reviewing the business rules and design documents

Designing the test approach and end to end testing plan

Assigning the testing assignments to the team members

Involved in Designing Activities like Test Case Template and Test Plan Template

Reviewing and Executing test cases

Led daily defect calls

Held triage calls with client Business Analysts and SQAs regarding issues faced during execution and their resolution

Raising defects and following them up through their lifecycle

Defect Analysis

Involved in preparing Project Final Summary Report

Providing support to Business Users during UAT

Played the role of QC Admin including user creation, project creation, adding or removing users from projects

Verifying BI Reports in Cognos

Project Name: IKEA KPI Dashboard Project

Client: IKEA

Organization: IBM

Role: Test SPOC

Period: From Jan 2011 till Feb 2012

Technologies: SQL Developer 3.1, DataStage, HP Quality Center 10.0, Cognos, ClearCase, ClearQuest, PuTTY, WinSCP

The KPI Dashboard gives a store the opportunity to easily identify whether it has potential in staff costs and productivity, how the store is performing against comparable stores, whether staff costs and productivity are performing well at the cost of other important factors, and (most importantly) which similar store can provide a long-term, sustainable solution to the problem.

Under the Game Changer #2 initiative there are around 17 KPIs and 9 measures. Requirements for each of these KPIs are addressed individually and captured in a separate BRS, and the design follows a similar pattern: the design elements discussed in the general HLD apply to all KPI/measure-specific HLDs, and each specific HLD handles the design for one or more particular KPIs/measures.

Requirement gathering from BA

Reviewing the business rules and design documents

Designing the test approach and end to end testing plan

Providing training to new members

Involved in Scrum meetings

Reviewing test cases

Executing the Test Cases

Meetings with clients and stakeholders for issue resolutions and clarifications

Raising defects and following them up through their lifecycle

Defect Analysis

Involved in preparing Project Final Summary Report

Providing support to Business Users during UAT

Project Name: IKEA CLA QA Project

Client: IKEA

Organization: IBM

Role: Test Lead

Period: From June 2010 till Feb 2011

Technologies: SQL Developer 3.1, DataStage, HP Quality Center 10.0

The purpose of the CLA project is to have a common landing area where all the data from the legacy source systems is collected and stored.

Qualified data is extracted from legacy source systems and transferred directly via the IDRS system and FTP into a Source Receiving Area, based on full or partial loads. At the physical level, the Source Receiving Area is defined as a mix of physical data formats: database, text file, XML, etc.

From the Source Receiving Area, an ETL process transfers and loads the data into the Initial Staging Area (CLA Source Validation). After the data validations are done, the data is loaded into the Landing Area (LA) tables. The LA data in turn feeds the IDSS system (published area).

In CLA R1.0, 5 source tables were used for implementing the reusable common components. In CLA R2.0, another 12 source tables will be used for this purpose, and the number of target tables will be 5.

The testing activities comprise complete ETL testing: running the DataStage jobs to move the data from the source system to the target tables, then validating the data in the target tables by running SQL queries that implement the specified business transformation rules. In addition, checks of data type, data accuracy, data consistency, and data uniqueness are part of the system testing.
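The target-table validation described above can be sketched as a small, self-contained check. This is illustrative only: the table names, columns, and conversion rule are hypothetical, and SQLite stands in for the project's Oracle/DataStage stack.

```python
import sqlite3

# Hypothetical source and target tables; in practice these would already
# exist in the warehouse after the DataStage jobs have run.
conn = sqlite3.connect(":memory:")
cur = conn.cursor()
cur.executescript("""
    CREATE TABLE src_orders (order_id INTEGER, amount REAL, currency TEXT);
    CREATE TABLE tgt_fact_orders (order_id INTEGER, amount_eur REAL);
    INSERT INTO src_orders VALUES (1, 100.0, 'SEK'), (2, 50.0, 'EUR');
    -- Assumed ETL rule under test: SEK amounts converted at a fixed 0.1 rate
    INSERT INTO tgt_fact_orders VALUES (1, 10.0), (2, 50.0);
""")

# Row-count check: every source row should land in the target.
src_count = cur.execute("SELECT COUNT(*) FROM src_orders").fetchone()[0]
tgt_count = cur.execute("SELECT COUNT(*) FROM tgt_fact_orders").fetchone()[0]
assert src_count == tgt_count, "row counts differ"

# Transformation check: recompute the expected value from the source and
# compare against the target; any rows returned are defects.
mismatches = cur.execute("""
    SELECT s.order_id
    FROM src_orders s
    JOIN tgt_fact_orders t ON t.order_id = s.order_id
    WHERE t.amount_eur <> CASE s.currency
                              WHEN 'SEK' THEN s.amount * 0.1
                              ELSE s.amount
                          END
""").fetchall()
assert mismatches == [], f"transformation-rule mismatches: {mismatches}"

# Uniqueness check on the target's business key.
dupes = cur.execute("""
    SELECT order_id FROM tgt_fact_orders
    GROUP BY order_id HAVING COUNT(*) > 1
""").fetchall()
assert dupes == [], f"duplicate keys: {dupes}"
```

The same three checks (row count, rule reconciliation, key uniqueness) generalize to any source-to-target load; only the rule in the `CASE` expression changes per business requirement.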

Requirement Analysis

Reviewing the business rules and design documents

Preparation of test plans

Preparation of test scripts

Preparation of Test Data

Reviewing test cases

Executing the Test Cases

Raising defects and following them up through their lifecycle

Defect Analysis

Involved in preparing Project Final Summary Report

Project Name: iSMS Application

Client: PMI

Organization: IBM

Role: Senior Test Analyst

Period: From Sep 2009 till June 2010

Technologies: Java, JSP, XML, SQL Server 2005 and 2008, Mercury Quality Center 10.0, JIRA

iSMS is a custom application for sales force automation and merchandising activity. It interfaces with a PDA application used by the sales force; handheld (HH) mobile devices are used by sales representatives and merchandisers to perform sales and merchandising activities, and merchandisers usually use iSMS offline on their laptops. iSMS consists of an evolving “core” software that gets customized for each market's requirements. The extent of customization needed for a market is identified by a fit-gap analysis, and a subset of market customizations/enhancements is retrofitted into the “core” software, usually after UAT of the market customization.

iSMS has been in place for the last 4 years, but from the early days of its existence no significant testing activities were planned in the SDLC. Due to the nature of the iSMS delivery processes, a significant number of defects were introduced into the application and remained there, and the count kept increasing: as more affiliates adopted the application, more defects were introduced through the various customization requests made by those affiliates. As the complexity and usage of the application increased, the need for a testing team was felt in order to enhance and maintain the quality of delivery.

Working as the Senior Test Analyst.

Requirement gathering from the Market teams.

Preparing Test Cases and circulating to Stakeholders.

Test Case execution and mentoring the members of the Team.

Defect Analysis.

Involved in Change Request Analysis.

Performed Web Services testing using SOAP UI

Involved in preparing Project Final Summary Report
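Web-service testing of the kind done in SoapUI amounts to sending a request and asserting on the response envelope. A minimal sketch of those response assertions against a canned payload follows; the service, namespace, and fields are invented for illustration.

```python
import xml.etree.ElementTree as ET

# Canned SOAP response standing in for what the service would return.
response = """\
<soap:Envelope xmlns:soap="http://schemas.xmlsoap.org/soap/envelope/">
  <soap:Body>
    <GetOrderStatusResponse xmlns="http://example.com/orders">
      <OrderId>1001</OrderId>
      <Status>SHIPPED</Status>
    </GetOrderStatusResponse>
  </soap:Body>
</soap:Envelope>
"""

# Parsing doubles as a well-formedness check: malformed XML raises here.
root = ET.fromstring(response)
ns = {
    "soap": "http://schemas.xmlsoap.org/soap/envelope/",
    "o": "http://example.com/orders",
}
order_id = root.findtext(
    "soap:Body/o:GetOrderStatusResponse/o:OrderId", namespaces=ns)
status = root.findtext(
    "soap:Body/o:GetOrderStatusResponse/o:Status", namespaces=ns)

# Typical response assertions: expected fields present, expected values back.
assert order_id == "1001"
assert status == "SHIPPED"
```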

Project Name: E-application and E-servicing

Client: GE Money UK

Organization: Capgemini

Role: Functional Test Analyst

Period: From Feb 2008 till Aug 2009

Technologies: Java, JSP, XML, VisionPlus, Cardpac, Autoscore, PEGA, Mercury Quality Center 9.0

GE Money operates in a number of product segments in the consumer credit market. It is predominantly a major business-to-business provider offering a range of consumer credit products and supporting the retail sales of its clients.

GE Money UK Card Services has a web-based front-end system to originate accounts and e-service them; the back end resides on mainframes (VisionPlus). E-Apps is a digitized account-opening system with a browser front end developed by GE Money UK. Four versions are currently in use: Call Centre, Client E-Apps, Internet E-Apps, and XML. The Originations application is used mainly to provide a decision (Accept, Refer, or Decline) to a customer applying for a MasterCard or private-label credit card for which GE is the service provider; the decision-making data for an application is fetched from the Experian bureau. The E-Servicing application enables credit card customers to use online services to maintain their existing accounts, offering features such as user registration, sign-in, account summary, amending personal and account details, making payments, and Direct Debit setup.

Working as a UAT Functional Test Analyst.

Reviewing the BRD and FS and providing FS sign-off.

Preparing Test Plans and circulating to Stakeholders.

Preparing Test Scenarios and Test Scripts.

Updating the client's onsite PTC on project planning status via weekly calls and emails.

Test Case execution and mentoring the Test Execution Analysts.

Involved in API Testing and XML Validation.

Execute the test automation scripts.

Defect Analysis.

Daily Defect calls with Clients and Development team.

Involved in Change Request Analysis.

Involved in preparing Project Final Summary Report

Project Name: Apollo Workstation

Client: GE Consumer Finance US

Organization: Birlasoft

Role: Execution Test Analyst and Functional Test Analyst

Period: From Jan 2006 till Jan 2008

Technologies: Java, Oracle 9i, JSP, Mainframe, Toad 8.5.1, Test Director

Workstation is a user-friendly, “self-help” enabled, web-based front-end system designed to be accessible from both the Collection and Customer Service call centers and via the Internet. The software is accessed by GECF-A employees on their desktops upon initiation of outbound collection calls and inbound customer service calls. Additionally, the application supports third-party vendors. The application initiates various transactions that cascade into logical monetary and non-monetary transactions impacting cardholder accounts and receivables, enhancing operator usability and productivity.

Preparing and maintaining Test Cases and Test Cycles in Test Director

Test Case execution.

Defect Analysis.

Involved in Use Case Analysis and Change Control Analysis.

Involved in understanding the DDD, database structural documents, and entity-relationship diagrams

Involved in preparation of positive and negative functional and database test cases

Involved in Database Testing (front end vs. back end and data validation)

Performed System Testing and Integration Testing.

Execute the test automation scripts.

Review the test automation scripts
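The front-end vs. back-end database testing listed above can be sketched as comparing a value displayed by the application with the value stored in the database. This is illustrative only: SQLite stands in for the project's Oracle back end, and the displayed string is hardcoded where a real test would capture it from the UI.

```python
import sqlite3

# Back-end side: the value as stored in the database (hypothetical schema).
db = sqlite3.connect(":memory:")
db.execute("CREATE TABLE accounts (account_id INTEGER, balance REAL)")
db.execute("INSERT INTO accounts VALUES (42, 1234.5)")
(db_balance,) = db.execute(
    "SELECT balance FROM accounts WHERE account_id = 42"
).fetchone()

# Front-end side: in a real test this string would be captured from the
# application screen; hardcoded here for illustration.
ui_balance_text = "$1,234.50"

# Normalize the displayed text and compare with the stored value.
ui_balance = float(ui_balance_text.replace("$", "").replace(",", ""))
assert ui_balance == db_balance, "front end and back end disagree"
```

The same pattern covers both directions: values entered through the front end are checked against what lands in the database, and values computed in the database are checked against what the screen displays.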

Educational Background

Master of Computer Applications from Bengal Engineering and Science University, Shibpur, India (2003-2006).

Organizational Career

IBM India Pvt. Ltd.: Sep 2009 to Jan 2015

Capgemini India Pvt. Ltd.: Feb 2008 to Sep 2009

Birlasoft India Pvt. Ltd.: Jan 2006 to Feb 2008

Work Authorization Status

Currently on H4EAD.
