
Project Manager Sales

Location: Edmonton, AB, Canada
Posted: October 31, 2013


Summary

Around * years of experience in the field of Software Quality Assurance and Testing, with strong exposure to all phases of the Software Development Life Cycle (Initiation, Design, Development, Testing and Implementation) and the Software Testing Life Cycle (Test Planning, Test Execution and Test Analysis)

3 years of experience as QA Lead/Lead Tester handling a multi-million dollar project

3 years of extensive experience performing data warehousing testing using Informatica

Gathered, reviewed, understood and performed gap analysis on business, functional and use case requirements

Understood the project requirements and created test strategies, test plans, test scenarios and test conditions/cases

Ability to organize test strategy and test plan review sessions with stakeholders and end users

Established communication bridges with development, implementation and end-user teams

Created the QA project plan based on the master project plan created by the project manager

Organized knowledge transfer sessions with the business team to better understand the applications involved in the project

Performed test planning, test execution and test analysis using Quality Center, including importing requirements and test cases from Excel or Word

Managed a team of 15 QA analysts in both onsite and offshore capacities

Coordinated and managed all testing tasks across the onsite and offshore teams

Organized testing status meetings and provided inputs to the project manager and QA manager

Involved in project risk analysis meetings and reported testing-related risks to the project manager

Ensured that all testing-related risks were addressed by coordinating with the development, DBA, infrastructure and deployment teams

Created WBS (work breakdown structure) and RACI (responsible, accountable, consulted, informed) documents outlining all the tasks to be completed for the project and reviewed them with the team

Created a daily testing checklist to keep testing on track against timelines and deadlines

Coordinated and performed automated and manual testing on the in-scope web-based, client/server and mainframe applications in Windows, UNIX, mainframe and web environments

Executed testing as per the plan using testing techniques such as Sanity, Functional, Black Box, System, Penetration, Interoperability, Integration, End-to-End, Ad hoc, Smoke, Cross Browser, Cross Platform, Regression, Batch, Database (Data Model, Data Mapping, Data Validation, Data Integrity), User Acceptance, Load, Stress and Performance testing on client/server, online and mobile applications

Created the automation test strategy and used HP Mercury Quick Test Professional to build automated regression suites with VBScript

Maintained automated regression suites for multiple projects

Experienced in working with multiple browsers and cross-platform environments

Extensive experience with UNIX (AIX, Solaris, HP-UX 11.x) and Windows operating systems

Extensively used SQL and PL/SQL to write stored procedures, functions and packages; a representative validation sketch appears at the end of this summary

Goal-oriented and self-motivated, with excellent communication, analytical and interpersonal skills

Able to work in a team-oriented, collaborative environment; customer-oriented, highly versatile and innovative, with strong interpersonal, problem-solving, troubleshooting and communication skills

Quick learner and adaptive to new and challenging technological environments
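
The following is a minimal sketch of the kind of PL/SQL validation routine referred to above; the procedure name, parameters and output are illustrative assumptions rather than code from any specific project (it also assumes DBMS_OUTPUT is enabled, e.g. SET SERVEROUTPUT ON in SQL*Plus).

    -- Illustrative sketch only: compares row counts between two tables passed in by name.
    CREATE OR REPLACE PROCEDURE chk_row_counts (
        p_src_table IN VARCHAR2,
        p_tgt_table IN VARCHAR2
    ) AS
        v_src_cnt NUMBER;
        v_tgt_cnt NUMBER;
    BEGIN
        EXECUTE IMMEDIATE 'SELECT COUNT(*) FROM ' || p_src_table INTO v_src_cnt;
        EXECUTE IMMEDIATE 'SELECT COUNT(*) FROM ' || p_tgt_table INTO v_tgt_cnt;
        IF v_src_cnt = v_tgt_cnt THEN
            DBMS_OUTPUT.PUT_LINE('PASS: row counts match (' || v_src_cnt || ')');
        ELSE
            DBMS_OUTPUT.PUT_LINE('FAIL: source=' || v_src_cnt || ', target=' || v_tgt_cnt);
        END IF;
    END chk_row_counts;
    /

A tester would call it after an ETL run, for example EXEC chk_row_counts('STG_TABLE', 'DW_TABLE'), with the two hypothetical table names replaced by the actual source and target.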

Education

Bachelor's in Computer Science & Engineering, JNTU University, India, 2005

Experience

Alberta Blue Cross, Edmonton, AB

Jun 2012 - Present

QA Lead/Lead Tester

Administration Redevelopment System (ASR-Sales) is an enhancement application designed and developed to replace the existing application. The application is designed with robust features to satisfy the organization's goal of improving the customer service experience and the handling of current loads in claims processing and administration.

Worked closely with team management in creating the master test plan, test strategy, resource plan and risk mitigation plan

Identified the complexities involved in the spaghetti architecture and created the test scenarios

Wrote test cases in the Test Plan module of Application Lifecycle Management (ALM 11)

Created test conditions and mapped requirements traceability from the Requirements folder to the test cases in the Test Plan to ensure complete coverage

Worked across teams to set up test data to facilitate test execution

Assigned tasks and worked alongside team members to perform sanity testing, system testing, system integration testing, etc.

Created the automation test strategy and used Quick Test Professional (QTP) to build automated regression suites with VBScript

Executed test cases in the Test Lab and logged defects accordingly

Worked with the development team to ensure defects were fixed, then followed up with team members to retest and take further action on each defect depending on whether it passed or failed

Cloned XML files to create sample data for testing the application; a sample backend verification query is sketched after this section

Conducted weekly team meetings to identify roadblocks in test case writing, test case execution and defect fixing, and reported them to team management immediately

Assigned tasks so that if the application went down during test execution, a parallel task was available to keep each resource's time and productivity fully utilized according to priorities

Participated in defect triage meetings and planned out solutions

Since the application comprises multiple projects, proper handling of Service Requests (SRs) was a major task requiring close work with the management of the other project teams

Coordinated BPV (Business Process Validation) with business sponsors of the other project teams involved in the application

Performed system integration testing with other projects and smoke testing to ensure the accuracy of the application

Worked closely with the UAT team of business users on production data and published reports on a daily basis

Environment: HP Application Lifecycle Management 11.0, HP Quick Test Professional, UNIX (HP-UX), Oracle 11g, Windows 7, Java, Putty, SQL Developer (Query Tool), MS Office Suite, MS Visio, Apache, Weblogic, Snag It.
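
A minimal sketch of the kind of backend check run in SQL Developer after loading cloned XML test data; the CLAIM table, CREATED_BY marker and column names are hypothetical and only illustrate the approach.

    -- Hypothetical table/columns: confirm cloned test claims landed with the expected statuses.
    SELECT claim_status,
           COUNT(*) AS claim_count
    FROM   claim
    WHERE  created_by = 'TEST_CLONE'   -- marker assumed to flag cloned records
    GROUP  BY claim_status
    ORDER  BY claim_status;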

Ministry of Education, Calgary, AB

Jun 2010 - Mar 2012

QA Lead

The Approval Framework data warehousing process is a combination of legacy and current data sources. The approval framework (DF) is a decision-making process designed to assist school boards in making informed decisions, based on their analytical data, about areas that need special attention in order to improve student results and performance.

Performed business analysis to plan and create the master test plan

As the QA stakeholder, conducted meetings with business users to create the UAT and use cases

Worked as a QA in testing the Data Migration project

Understood the complex architecture of the project

Identified the different data sources in scope for the project and understood their processes

Organized and understood the complete end-to-end business process

Involved in the project as QA Lead, managing and performing testing on the complete data warehousing process

Involved in documenting the master test strategy with inputs from the QA manager

Published the draft version of the master test strategy to the project members for review and sign-off

Provided test estimates to the project manager based on testing scope and tasks

Estimated the work effort, created the Work Breakdown Structure and published it to the team members

Performed test management and defect management using HP Mercury Quality Center

Created detailed manual test scenarios and test cases according to the business/functional requirements

Tested various Informatica mappings with tuning techniques to improve mapping/session performance

Performed data testing, transformation testing, business rule validation and data quality checks

Wrote complex SQL queries to incorporate ETL logic and validate test cases; a representative source-to-target comparison is sketched after this section

Performed functionality testing on both online and client/server applications using converted data

Checked actual results against expected results and validated data validity

Involved in data warehouse testing by checking ETL procedures/mappings

Worked on understanding the ETL plan and ETL mappings for the data flow information

Also worked on transformation rules in the ETL tool

Worked with the Informatica mapping ETL plan to verify that business rules were mapped and designed properly

Involved in testing the output of transformation data in the destination tables

Worked extensively with Source Qualifier, Sorter, Aggregator, Normalizer, Expression, Filter, Router, Update Strategy, Lookup, XML, Stored Procedure, Sequence Generator and Union transformations

Tested the complex ETL mappings and configured the sessions and workflows to populate the Decision Framework tables in the data warehouse

Worked with the star schema model to design dimension and fact tables

Performed manual testing for functional, integration, end-to-end and UAT testing for the Data Migration project

Reported defects with proper priorities and severities using HP Mercury Quality Center

Generated and published customized defect reports to the project team and organized defect triage meetings

Environment: Informatica Power Center 9.0, Business Objects Crystal Reports, HP Mercury Quality Center 10.0, HP Mercury Quick Test Professional 10.0, Mercury LoadRunner 9.0, UNIX (HP-UX), Oracle 11g, Windows XP Professional, Flat files, SQL Server 2008, SQL Developer (Query Tool)
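
A minimal sketch of the source-to-target comparison queries referred to above; STG_ENROLMENT, DW_ENROLMENT_FACT and their columns are hypothetical names standing in for an actual mapping's source and target.

    -- Rows present in the source that never reached the target (hypothetical tables):
    SELECT student_id, school_year, enrolment_count
    FROM   stg_enrolment
    MINUS
    SELECT student_id, school_year, enrolment_count
    FROM   dw_enrolment_fact;

    -- The reverse direction catches rows the mapping should not have created:
    SELECT student_id, school_year, enrolment_count
    FROM   dw_enrolment_fact
    MINUS
    SELECT student_id, school_year, enrolment_count
    FROM   stg_enrolment;

Both result sets should be empty when the mapping and its transformation rules are correct.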

AMA - Alberta Motor Association Insurance, Edmonton, AB

Jan 2009 - May 2010

Senior QA Analyst, Marketing Analytics Portal System

The AMA Marketing Analytics System will deliver a business intelligence tool that enables the sales and marketing departments to capture and leverage information to support and provide insights into the business. The solution involves the extraction of customer-level data; merging with outside lists and demographics; and includes data from AXA transactional systems, results of sales efforts from call center and sales systems, as well as other vendor services.

Involved in the project as ETL Informatica QA Analyst testing the Marketing Analytics System

Requested an overview of the complete business process from the business users of AXA insurance

Identified and gathered all the project documents and the MAS project strategy, and understood the conversion process and conversion timelines

Worked closely with the project manager and QA manager to understand the deliverables

Understood the functional and business requirements of the project by organizing review sessions with the business analyst

Involved in the creation of the master test plan, outlining test assumptions, risks, timelines, roles and responsibilities, build management, entry/exit criteria and deliverables

Worked closely with the ETL and development teams to understand the business process

Created detailed test scenarios and test cases according to the business/functional requirements for the MAS application

The solution provides the ability to integrate external list purchases while maintaining the required data quality against AXA's information for use in the marketing process

Tested the Sales & Marketing data mart, which includes the data model, business rules, Extract/Transform/Load procedures and schedules

Tested the common definitions using the business rules for all business regions reporting on member, lead or prospect data

Validated the forecast pipeline activity, which defines when a lead becomes a member, by generating reports

Performed system integration testing with the Lead Management System (LMS) historical data that is critical for campaign management (primarily campaign-related measures based on campaign history needs) and that was to be migrated to the MAP environment

Verified and tested Informatica transformations such as Expression, connected and unconnected Lookup, Source Qualifier, Filter, Aggregator, Sorter, Sequence Generator, Normalizer, Router, Joiner, Stored Procedure, etc.

As part of performance testing, validated and verified performance tuning, ETL procedures and processes

Tested ETL mappings built in Informatica Power Center that move information from multiple sources into staging tables, then into the common consolidated data area of the data warehouse, and then into flat files as requested by the vendor

Executed PL/SQL procedures and functions for Stored Procedure transformations; also created and used tasks such as Decision and Email tasks

Performed analytics report testing based on the requirements

Identified and reported data mapping, data integrity and data corruption issues to the ETL and development teams; a sample integrity check is sketched after this section

Used File-AID and TSO to perform data validation checks

Provided daily and weekly status reports to the Test Manager

Environment: Informatica Power Center 8.6, HP Mercury Quality Center 9.0, HP Mercury Quick Test Professional 9.0, Mercury LoadRunner 9.0, Cognos, Windows XP, UNIX, Oracle 11g, SQL Server 2008, MS Excel (Source), Flat files (End Targets), TOAD (Query Tool), PL/SQL.
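
A minimal sketch of the data integrity checks mentioned above; MAS_MEMBER and MAS_LEAD are hypothetical consolidated tables used only to illustrate duplicate-key and orphan checks.

    -- Duplicate business keys in the consolidated member table (hypothetical):
    SELECT member_id, COUNT(*) AS dup_count
    FROM   mas_member
    GROUP  BY member_id
    HAVING COUNT(*) > 1;

    -- Leads that reference a member that does not exist (orphan check, hypothetical):
    SELECT l.lead_id, l.member_id
    FROM   mas_lead l
    LEFT   JOIN mas_member m ON m.member_id = l.member_id
    WHERE  l.member_id IS NOT NULL
    AND    m.member_id IS NULL;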

GSK Pharmaceutical, Montreal, QC Jul 2007 - Dec 2008

Senior QA Analyst, Online Oracle Siebel Sales Application

The Siebel Sales application is used across the GSK organization throughout the world to project and forecast sales. It collects data from data sources in different markets (Asia, Africa, Europe) and loads it into a historical database. Analytics processing is performed on the historical data to generate multiple reports that are used by the marketing and sales departments to analyze the data.

Identified all the project documents and gathered them from Microsoft SharePoint

Requested walkthrough sessions from the development team to understand the overall Oracle Siebel Sales process

Involved in requirement discussions to understand the priorities of the requirements from the business side

Involved in creating the test strategy and test plan based on the requirements and published them to the project team

Drafted detailed testing scenarios to cover all the requirements and published these scenarios to the DBA for the creation of test data and test beds

Understood and analyzed market business data from different countries for the creation of test data

Involved in test data creation for specific types of testing

Created detailed test cases in Excel and imported them into Quality Center using the export process

Imported requirements from Word documents into Quality Center

Mapped test cases to specific requirements to ensure requirement test coverage

Included test data parameters during test case creation in Quality Center

Executed test cases for sanity, functionality, integration, security, backend, shakedown, end-to-end, regression and performance testing

Published test planning and test execution reports generated from Quality Center to management

Identified defects, reported them to the development team with full defect details, and coordinated with the development team on a regular basis to track the status of the defects

Performed UAT testing as requested by the business support team

Performed data conversion, quality and verification testing on the data from the different data sources

Developed UNIX (Korn shell) scripts for FTPing files and executing SQL scripts, stored procedures, etc.

Used SQL*Loader to load data from flat files into database tables for testing purposes; a sample post-load verification query is sketched after this section

Scheduled UNIX jobs/batches comprising Informatica workflows, Oracle SQL scripts and FTP processes using the Control-M job scheduler

Updated all the test documentation with actual results and stored it in the release binder for auditing purposes

Environment: Informatica Power Center 8.5/8.1, HP Mercury Quality Center 9.0, HP Mercury Quick Test Professional 9.0, Mercury LoadRunner 9.0, Informatica Power Exchange 8.x, Crystal Reports, Mainframes, UNIX (SUN-Solaris), Oracle Siebel 6.0, Oracle 10g, Windows XP Professional, Flat files, SQL Server 2005, SQL Loader, Serena Version Manager, Toad (Query Tool), Control-M scheduler, Korn shell scripting.
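
A minimal sketch of the kind of post-load verification run after a SQL*Loader load; SIEBEL_SALES_STG and its columns are hypothetical names used only to illustrate the check.

    -- Hypothetical staging table: loaded row counts per market and missing key values.
    SELECT market_code,
           COUNT(*) AS loaded_rows,
           SUM(CASE WHEN product_code IS NULL THEN 1 ELSE 0 END) AS missing_product_codes
    FROM   siebel_sales_stg
    GROUP  BY market_code
    ORDER  BY market_code;

The loaded_rows figures would then be reconciled against the flat-file record counts and the skipped/discarded totals in the SQL*Loader log.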

Canadian Western Bank, AB

Mar 2006 - Jun 2007

QA Analyst, Banking Portal Application

The Banking Portal Application is used by the internal and external employees of the bank to access banking products, services and rates and to apply for the multiple credit products offered by the bank. The banking portal application is integrated with Vignette content management, which centralizes all of the content that gets published to the application.

Reviewed and understood the Banking portal application requirements

Requested the BA and PM to organize a walkthrough and overview from the business to better understand the application and project

Involved in the creation of the test plan with inputs from the stakeholders and manager

Used Quality Center to manage the complete end-to-end testing process

Administered the projects in Quality Center from a testing perspective

Imported and created requirements from Word documents in the Requirements module

Documented detailed test cases in the Test Plan tab and mapped the test cases to the requirements to ensure requirement coverage and traceability

Created different test sets based on priority and business tests and mapped the specific test cases into the test sets

Executed the test cases from the different test sets and reported defects in the Defects tab with proper priorities and severities

Produced daily test execution and defect status reports for the project team

Executed test cases for functionality, system integration and end-to-end testing

Involved in the creation of the UAT test plan, UAT test scenarios and UAT test cases

Performed backend database testing on the Oracle database using SQL; a sample content verification query is sketched after this section

Executed test cases to check the content and its integration with the Vignette content management server

Participated in the daily defect status and test execution status meetings

Environment: ASP, ASP.NET, XML, Weblogic, HP Mercury Quality Center 8.0, HP Mercury Quick Test Professional 8.0, Mercury LoadRunner 8.0, NUnit, Vignette, Oracle 10g, SQL Plus, Toad, UNIX Shell Scripting.
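
A minimal sketch of the backend SQL used to confirm that Vignette-published content is present in the portal database; PORTAL_CONTENT and its columns are hypothetical.

    -- Hypothetical content table: items published in the last day, by type and status.
    SELECT content_type,
           status,
           COUNT(*) AS item_count
    FROM   portal_content
    WHERE  publish_date >= TRUNC(SYSDATE) - 1
    GROUP  BY content_type, status
    ORDER  BY content_type, status;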

Hartford Life Insurance, CT

Oct 2005 - Feb 2006

QA Analyst, Claims Management System

The Claims Management System is a web-based application used by customers and underwriters to report, view and process the claims reported by life insurance policy customers. The online Claims Management System is integrated with the printfree application, which the underwriter uses to print and send office documents to the customer.

Created test cases in Excel and mapped the test cases to the requirements using an RTM (requirement traceability matrix)

Provided the test cases to the BA for review and sign-off

Stored test cases in specific folders based on the type of testing, for test execution purposes

Performed system integration testing between the Claims Management System and FASAT

Performed system and functionality testing on the FASAT application

Produced daily test execution and defect status reports for the project team

Performed sanity, functionality, system integration, end-to-end, user acceptance, security, navigation, content, regression, load, performance and stress testing

Participated in the daily defect status and test execution status meetings and published the reports

Environment: Java, JavaScript, J2EE, EJB, HP Mercury Quality Center 8.0, HP Mercury Quick Test Professional 8.0, Mercury LoadRunner 8.0, WebSphere, FASAT, MQ Series, Oracle 9i, Oracle Reports, XML, PL/SQL, SQL, Windows NT, UNIX (HP-UX 11.x), Unix Shell Scripting.

Skills

Testing Tools: HP Mercury Quick Test Professional 10.0, HP Mercury Quality Center 10.0, HP Mercury LoadRunner 10.0, FASAT

Data Warehousing: Informatica Power Center 6.x/7.x/8.x/9.0, Power Exchange, Repository Server Administrator Console, Star Schema, Snowflake Schema, Fact and Dimension tables

Databases: Oracle 11g/10g/9i/8i, T-SQL, MS SQL Server 7.0/2000/2005, Teradata, TOAD 8.0/7.1, WinSQL, SQL*Plus, SQL Developer, Squirrel SQL

Languages: C, C++, Java, JavaScript, J2EE, EJB, JUnit, NUnit, Weblogic, WebSphere, MQ Series, ASP, ASP.NET, XML and CSS.

Reporting Tools: Cognos, Crystal, MS Access Reports, Oracle reports 6i

Environment: UNIX (Solaris, AIX, HP-UX), LINUX, Windows 9x/2000/NT/XP/VISTA

Other Tools: MS Visio, Legacy (COBOL 3.x/2.x), Control-m, AutoSys, Redwood


