SAROJA NAYAK
Cell: 609-***-****
*****-******@*********.***
Objective
To obtain a leading position as a QA Lead/QA Analyst in a reputable organization that provides professional growth and the chance to utilize my domain expertise and professional skills.
Professional Summary
10+ years of strong IT experience in Quality Assurance (QA) and software development.
Extensive experience in QA and development in the Telecom, Finance and Retail domains.
Extensive experience in testing CRM applications (Salesforce.com and Microsoft Dynamics CRM).
Involved extensively in Integration Testing, Regression Testing, End-to-End Testing, Functional Testing, Database Testing, System Testing, Performance Testing and User Acceptance Testing.
Expertise in project management and team management.
Experience in testing Distributed, Client/Server, SOA and Web applications using manual testing and automation tools such as QTP and LoadRunner.
Proficient in writing SQL queries to set up test data and execute use cases against SQL/Oracle back ends.
Expertise in analyzing functional requirements and preparing Test plans, Test scripts and Test cases for
both Automated and Manual Testing.
Experience in ClearQuest, Test Director, Quality Center, SharePoint and Informatica.
Expertise in logging defects and generating defect-tracking reports using Quality Center and ClearQuest.
Experience in preparing UAT plans and test cases and participating in UAT execution.
Strong experience in testing applications using the Scrum methodology in an Agile environment.
Experience with Software Development Life Cycle (SDLC) methodologies.
Strong experience with the UNIX operating system.
Project management experience includes preparing requirements documents, technical specifications, test plan documents and use cases.
Maintained onsite-offshore coordination as the point of contact for all testing-related queries, issues and guidance.
Education
Master of Computer Applications (MCA) - India
Bachelor of Science (B.Sc.) - India
Technical Skills
Test Tools : Test Director, Quality Center, QTP, LoadRunner
Operating Systems : Sun Solaris, Linux/Unix, Windows Vista, Windows NT/XP/2000
Databases : Oracle 7.x/8.x/8i/9i/10g, MS SQL Server, MySQL, Mainframe, DB2
Languages : Java, JSP, Servlet, Java Beans, C/C++, HTML, XML, JavaScript, VBScript, Struts
Tools : Toad, Eclipse, SharePoint, Informatica, Ant, WinCVS
Environment : Agile, Citrix
Work Summary
Merrill Lynch/Bank of America, NJ
GWM Online Portal (Mobile Testing)-Oct 2010- Jun 2014.
QA Lead
Project Description
This is an integrated site that supports institutional clients, improves their ability to support their end clients, and creates linkages to maximize retail conversion from retirement relationships. It brings both domestic and international clients onto the GWM online platform and integrates domestic/international clients and accounts with GWMT business processes to extend capabilities, accelerate time to market and improve efficiencies. It provides a single-access solution across all benefit-related products, retail accounts and bank accounts offered by Merrill Lynch/Bank of America, including Credit Cards, VISA, ADVISORY, HELOC, MORTGAGE and BANK, which helps in aggressively growing the non-US GPC business.
Involved in end-to-end testing of Merrill Lynch/Bank of America online applications, including account opening, login, portfolio, custom view, user profile, client vault, SMC, Move Money and trading, and in extending trading features for online users.
Responsibilities
Analyzed requirement specifications and their related user stories.
Mapped all requirements to functional design and business scenarios to ensure that the applications and requirements were covered under integration testing.
Set up test data for all test cases before execution.
Extensively involved in developing test plans and test cases for manual testing.
Created and managed the test requirements hierarchy while developing test scripts and test cases in Quality Center.
Performed User Acceptance, Regression, Integration, Functional, Database, End-to-End and Back-end testing on multiple releases and verified actual results against expected results.
Tested all modules involved in this project on mobile devices (iPad, iPod/iPhone, BlackBerry).
Involved in collecting information for UAT by working closely with the business and user teams.
Logged & Tracked defects using Quality Center.
Isolated and simplified problems discovered during testing so that developers could fix them easily.
Guided and trained new team members on testing procedures, scripting and test cases.
Environment: .NET, Oracle 10g/11g, Windows XP, Quality Center (ALM)
Wells Fargo, Columbia, MD
Relationship View (RV)
Jan 2010- Oct 2010
QA Analyst
Project Description
RV is a Microsoft Dynamics CRM application for Wells Fargo customer relationship management. It handles Opportunities, Relationships, Contacts and Call Reports, and integrates with different applications such as Profit Max, FP, MCV and Credit View.
All of the above modules are tested in Outlook, SharePoint and on mobile devices.
Responsibilities
Analyzed requirement specifications and their related user stories.
Set up test data for all test cases before execution.
Extensively involved in developing test plans and test cases for manual testing.
Performed integration testing with Credit View, Profit Max, A++ and MCV for Relationship, Contact and Opportunity integration.
Created and managed the test requirements hierarchy while developing test scripts and test cases in Quality Center.
Performed User Acceptance, Regression, Integration, Functional, Database, End-to-End and Back-end testing on multiple releases and verified actual results against expected results.
Tested Accounts and Opportunities in RV and verified those accounts in different applications such as Credit View.
Developed SQL queries to collect test data and validate the data in the Oracle database (see the sketch at the end of this responsibilities list).
Performed database/stored-procedure testing by running SQL queries using Toad.
Involved in collecting information for UAT by working closely with the business and user teams.
Performed testing following Agile methodologies.
Logged & Tracked defects using Quality Center.
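The back-end validation referenced above can be illustrated with a minimal JDBC sketch: after an Opportunity is saved in RV, a query against the Oracle QA database confirms that the record was persisted with the expected status. The connection URL, credentials, and the rv_opportunity table with its opp_id/status columns are illustrative assumptions, not the actual Wells Fargo schema.

import java.sql.Connection;
import java.sql.DriverManager;
import java.sql.PreparedStatement;
import java.sql.ResultSet;

// Hedged sketch only: schema and connection details are hypothetical.
public class OpportunityBackendCheck {
    public static void main(String[] args) throws Exception {
        String url = "jdbc:oracle:thin:@//dbhost:1521/RVQA";   // hypothetical QA database
        try (Connection con = DriverManager.getConnection(url, "qa_user", "qa_pwd");
             PreparedStatement ps = con.prepareStatement(
                     "SELECT status FROM rv_opportunity WHERE opp_id = ?")) {
            ps.setString(1, "OPP-10042");                      // test data created before execution
            try (ResultSet rs = ps.executeQuery()) {
                if (rs.next() && "OPEN".equals(rs.getString("status"))) {
                    System.out.println("PASS: opportunity persisted with expected status");
                } else {
                    System.out.println("FAIL: record missing or status mismatch");
                }
            }
        }
    }
}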
Environment: .NET, SQL Server, Oracle 9i, Windows XP, Quality Center
Cisco Systems, San Jose, CA
Customer Data Integration Investment (CDII)
FDEV and Salesforce.com (SFDC) Account Integration
Jan 2009- Dec 2009
QA Lead
Responsibilities
Extensively involved in developing test plans and test cases for manual testing.
Performed integration testing with SFDC for account creation.
Created and managed the test requirements hierarchy while developing test scripts and test cases in Quality Center.
Performed User Acceptance, Regression, Integration, Functional, Database, End-to-End and Back-end testing on multiple releases and verified actual results against expected results.
Created account data with different contact details in one application (FDEV), and the Informatica tool placed the entered accounts in the other application (Salesforce.com). This procedure was followed to validate (one such check is sketched at the end of this responsibilities list):
Csc_site_id
CSC_id and Country combination
DUNS number and Country combination
Account name and Country combination
Inactive account owner
Data quality check
Ran Informatica workflows to verify the data flow from the FDEV application to the Salesforce.com application.
Tested Account, Opportunity and Sales Account View modules in
Salesforce.com (SFDC).
Verified the Territory assignment, Ownership assignment and Group visibility
in SFDC
Developed SQL queries to collect test data and validate the data in the Oracle database.
Performed Database/Stored procedure testing by running SQL queries using Toad.
Prepared User Acceptance Test (UAT) plan and test cases.
Used checkpoints and synchronization statements extensively to customize the Quick
Test Pro scripts.
Logged & Tracked defects using Quality Center.
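One of the data-quality checks listed above (the DUNS number and Country combination) can be sketched as a JDBC query that lists accounts in the FDEV staging schema sharing the same DUNS/Country pair, i.e. rows the Informatica load into Salesforce.com should treat as duplicates. The connection details and the fdev_account table with duns_no/country columns are illustrative assumptions, not the actual CDII schema.

import java.sql.Connection;
import java.sql.DriverManager;
import java.sql.ResultSet;
import java.sql.Statement;

// Hedged sketch only: table, columns and connection details are hypothetical.
public class DunsCountryDuplicateCheck {
    public static void main(String[] args) throws Exception {
        String url = "jdbc:oracle:thin:@//fdev-db:1521/CDIIQA";   // hypothetical QA instance
        String sql = "SELECT duns_no, country, COUNT(*) cnt "
                   + "FROM fdev_account "
                   + "GROUP BY duns_no, country "
                   + "HAVING COUNT(*) > 1";
        try (Connection con = DriverManager.getConnection(url, "qa_user", "qa_pwd");
             Statement st = con.createStatement();
             ResultSet rs = st.executeQuery(sql)) {
            while (rs.next()) {
                // Each row is a DUNS/Country pair the SFDC load should reject as a duplicate.
                System.out.printf("Duplicate DUNS/Country: %s / %s (%d rows)%n",
                        rs.getString("duns_no"), rs.getString("country"), rs.getInt("cnt"));
            }
        }
    }
}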
Environment: Java/J2EE, IBM WebSphere, Oracle 9i, Windows XP, Quality Center, QTP, Informatica.
Ann Taylor, NY, NY
Zymmetry- Jan 2008- Dec 2008
QA Lead
Responsibilities
Prepared test plans and test cases for manual and automated testing.
Managed and executed test scripts and maintained defects using Quality Center.
Performed SQL Server back-end testing by writing SQL queries and stored procedures.
Made extensive use of the SharePoint portal as the document repository.
Developed UAT test plan and test cases.
Tested all the .NET web services using a SOAP client (see the sketch below).
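A minimal sketch of exercising one of the .NET web services through a raw SOAP request is shown below. The endpoint URL, SOAPAction header and the GetItemStatus operation are illustrative assumptions, not the actual Zymmetry service contract; a 200 response with a SOAP body is treated as a pass for this smoke check.

import java.io.OutputStream;
import java.net.HttpURLConnection;
import java.net.URL;
import java.nio.charset.StandardCharsets;

// Hedged sketch only: endpoint, namespace and operation are hypothetical.
public class SoapSmokeTest {
    public static void main(String[] args) throws Exception {
        String envelope =
            "<soap:Envelope xmlns:soap=\"http://schemas.xmlsoap.org/soap/envelope/\">"
          + "  <soap:Body>"
          + "    <GetItemStatus xmlns=\"http://example.com/zymmetry\">"
          + "      <ItemId>12345</ItemId>"
          + "    </GetItemStatus>"
          + "  </soap:Body>"
          + "</soap:Envelope>";

        HttpURLConnection con = (HttpURLConnection)
                new URL("http://qa-server/ZymmetryService.asmx").openConnection();
        con.setRequestMethod("POST");
        con.setRequestProperty("Content-Type", "text/xml; charset=utf-8");
        con.setRequestProperty("SOAPAction", "\"http://example.com/zymmetry/GetItemStatus\"");
        con.setDoOutput(true);
        try (OutputStream out = con.getOutputStream()) {
            out.write(envelope.getBytes(StandardCharsets.UTF_8));   // send the SOAP request body
        }
        System.out.println("HTTP status: " + con.getResponseCode()); // 200 expected for a pass
    }
}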
Environment: Quality Center, QTP, Scrum Methodology, Oracle, SharePoint, MS .Net 2.0/3.0, SOAP
and Windows XP.
JPMorgan Chase, Iselin, NJ
HP3000 System- Oct 2007- Dec 2007
QA Lead
Responsibilities
Reviewed the Business Requirements document with the business and development team to
understand the architecture and functionality of the application.
Reviewed the test cases with business analysts and dependent partners and set up the test data before execution.
Prepared Test Plans and Test Cases for manual and automated testing.
Created criteria for data conditioning needed for Automated and manual testing.
Involved in the execution and validation of the following activities:
Transforming, mapping and storing data from various inbound systems.
Storing historical data for inbound systems as well as for applications that reside on HP3000.
Filtering, enriching and transforming data to generate the outbound feeds/reports.
Providing interfaces, processes and reports/feeds for applications that reside on HP3000 (Inventory Control, HMDA, Finance, etc.).
Expertise in setting up the test environments and preparing test data for automation of regression
test cases.
Prepared and executed User Acceptance Test (UAT) scripts.
Wrote performance test specifications and executed performance tests using LoadRunner.
Involved in performance testing using LoadRunner by creating virtual users and setting test durations.
Performed data-driven testing using SQL queries and stored procedures (a sample check is sketched at the end of this responsibilities list).
Implemented and automated regression test scripts based on business requirements using QTP.
Participated in both System and User Acceptance Testing.
Tested mainframe web services using the SOLA tool to retrieve and update data in the Mainframe/DB2 database.
Validated test results in SQL, Mainframe and DB2 databases.
Executed the test scripts on different releases and validated the actual results against the expected results.
Debugged the test scripts, created batch tests, verified test results and reported defects in Quality Center.
Responsible for creating and maintaining test cases in Quality Center.
Wrote SQL and PL/SQL queries to access data from the database tables and validate results; these queries were also used in automated regression testing.
Extensively used TOAD and SQL Station database tools for data validation.
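A minimal sketch of the stored-procedure, data-driven checks described above: each input row drives a procedure call whose output is compared with the expected result. The SQL Server connection details and the GET_LOAN_STATUS procedure with its parameters are illustrative assumptions, not the actual HP3000 system interfaces.

import java.sql.CallableStatement;
import java.sql.Connection;
import java.sql.DriverManager;
import java.sql.Types;

// Hedged sketch only: procedure name, parameters and connection details are hypothetical.
public class StoredProcDataDrivenTest {
    public static void main(String[] args) throws Exception {
        String[][] testData = {            // {input id, expected output} pairs driving the test
            {"L-1001", "APPROVED"},
            {"L-1002", "DECLINED"}
        };
        String url = "jdbc:sqlserver://qa-db:1433;databaseName=HP3000QA";   // hypothetical QA database
        try (Connection con = DriverManager.getConnection(url, "qa_user", "qa_pwd");
             CallableStatement cs = con.prepareCall("{call GET_LOAN_STATUS(?, ?)}")) {
            for (String[] row : testData) {
                cs.setString(1, row[0]);                       // input parameter from the data set
                cs.registerOutParameter(2, Types.VARCHAR);     // output parameter to validate
                cs.execute();
                String actual = cs.getString(2);
                System.out.printf("%s -> %s (%s)%n", row[0], actual,
                        row[1].equals(actual) ? "PASS" : "FAIL");
            }
        }
    }
}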
Environment: Java, JSP, Servlet, SQL Server, Mainframe, JDBC, EJB, Quality Center, QTP, LoadRunner, WebSphere
Cisco Systems, San Jose, CA - April 2004- Sept 2007
QA Lead
Environment: Java, JSP, Servlet, Struts, SQL Server, XML, PL/SQL, WinCVS, Ant, Weblogic, Quality
Center.
IIT Bombay, Mumbai
Automated Protein Design– July 2002- March 2004
Developer
Environment: Linux 9, C/ C++