Padma Nandaram
Phone: 732-***-**** Email: *********@*****.***
Objective:
With eight years of experience in software quality assurance, test planning, and execution, I am seeking a position where my knowledge and skills are relevant. I am a team player who takes responsibility for my tasks and takes pride in the team's success.
Summary:
Over 8 years of experience in QA Testing and Analysis
Experienced in test execution using Mercury Interactive automated tools such as Quality Center, QTP, WinRunner, LoadRunner, and TestDirector
Involved in testing Web based and Client/Server Applications.
In-depth technical knowledge of the software test cycle, system testing techniques, and methodologies.
Experience in System, Regression, and Performance testing of software applications across different releases, builds, and patches.
Actively involved in all phases of testing: test planning, documentation, test data set-up, execution, and defect analysis and tracking. Worked extensively on creating and executing test conditions, test cases, and test scripts using manual and automated methods.
Familiar with working with remote teams, including coordinating with offshore teams for 24x7 test execution
Experience in testing Windows and Unix based applications.
Experience in using the SSH, SCP, and FTP protocols
Experience in configuring applications on web and application servers (WebSphere, WebLogic) in both UNIX and Windows environments
Expertise in Test Planning, Test Case preparation and Execution based on Requirements Specification as well as Design Specification.
Experience in creating test procedures, defining test cases, developing and maintaining test scripts, analyzing bugs, and interacting with team members to fix errors.
Thorough knowledge of RDBMS concepts, including Oracle, SQL Server, DB2, and SQL. Tested reporting applications, manually verifying results using direct database queries.
Worked individually and in team environments; possess good communication skills.
Flexible and versatile in adapting to new environments
Experience in testing Web services using SOAP UI
Experience with Agile and Scrum development methodologies
Education:
Postgraduate Diploma in Computer Applications, India, 2001
MS in Chemistry, Osmania University, India, 2000
Technical Skills:
Operating Systems: Windows 98/2000/NT/XP, UNIX, MS-DOS
Languages: SQL, Shell, PL/SQL, C, C++, Java
GUI: Visual Basic 4.0/5.0/6.0, JSP
Web technologies: HTML, HTTP/HTTPS, XML, JavaScript, IIS 4.0/5.0
RDBMS: Oracle 9i/10g, MS Access 97, SQL Server, DB2
Browsers: Netscape Navigator 4.7 and MS Internet Explorer 8.0
Testing Tools: Quick Test Professional, WinRunner, LoadRunner, Rational Robot
Experience Summary
Project: Master Data Management
Client: JCREW
Duration: September 2015 – Present
Role: Senior QA Analyst
Responsibilities:
Developed Test Plan and Test Case documents by analyzing the business requirements
Prepared test cases and test data to test the IIB project; performed functional testing using SOAP UI
Prepared test data for performance testing in MDM
Validated MDM database using SQL queries
Prepared data for ETL jobs and validated it in the MDM database.
Worked in Agile and Scrum development methodologies.
Used JIRA to track defects.
Environment: JIRA, Oracle 11g, SQL Developer, SOAP UI, JSON
Project: ORT
Client: AT&T
Duration: June 2014 – June 2015
Role: Sales Test Lead
Responsibilities:
Worked as Test Lead on the AT&T sales system using tools such as ROME, ASAP, and GCSM
Developed Test Plan and Test Case documents by analyzing the business requirements, HLD, TDD, and use cases of the application.
Performed manual and functional testing to verify the application's functionality across different products such as MIS, AVPN, and BVOIP
Participated in different types of testing (ORT, CRT, and UCT) and supported all phases of the software testing life cycle (STLC), from business requirements gathering and analysis through design, implementation, testing, and support.
Actively participated during the release of the product and provided go-live support during deployment.
Streamlined the process after analyzing the gaps to improve the efficiency of the support procedure for time sensitive projects.
Coordinated with Off-shore Test and Development resources organizing meetings and translating requirements into technical specifications.
Led teams through the life cycle of projects; experienced in mentoring, projecting effort estimates, and working with management to report the team's status. Acted as primary point of contact for all work taken offshore.
Discussed business scenarios with the PTEs and BSMs.
Created orders in the FMO and PMO flows in ROME, ASAP, and GCSM for different products such as MIS, AVPN, and BVOIP.
Environment: HP-UX, Solaris, Oracle 9i, Quality Center
Project: Universal Platform (UP)
Client: IBM/AT&T, NJ
Duration: Sep 2011- Mar 2014
Role: QA Lead
The Universal Platform is a robust set of applications supporting the billing functions for AT&T Business Programs such as AT&T Business Network, Managed Network Services, Private Line, Frame Relay, the MPLS family of services, Business VOIP, Enhanced VPN, Customer Premise Equipment, Most of World, Small Business Services, Unified Communications, and Cloud Computing. At one time, AT&T maintained separate IT platforms for each program. Today, the UP is AT&T's billing system supporting all of them.
Responsibilities:
As lead of the IST team, responsible for coordinating between the onsite and offshore teams on customer requirements.
Collected the status of each project and updated management with the latest status. Prepared the IST schedule, reviewed it with IBM management and the client, and ensured it was baselined.
Prepared project deliverables such as the data requirement spreadsheet, test plan, and test cases.
Executed the project test cases that I owned.
Provided daily project status to the client. Conducted status calls with the US team, clients, and all other stakeholders.
Created defects and followed up with the teams of the different billing applications.
Generated daily defect reports from Quality Center and ran status calls with all billing application stakeholders
Attended Agile development methodology and Scrum training sessions provided by the client, IBM.
Worked with Development and Architecture teams to understand requirements and communicate software defects.
Environment: Mercury Quality Center, ClearQuest, UNIX, Perl, Shell, DB2, C++, Java/J2EE, JSP, WebSphere, Windows 7, NDM Connect:Direct.
Project: Master Data Management
Client: J&J, NJ
Duration: Oct 2010- Sep 2011
Role: Data Analyst/QA Specialist
Johnson & Johnson Health Care Systems Inc. uses the MDM (Master Data Management) system to provide account management and customer support services to key health care customers, including hospital systems and group purchasing organizations, leading health plans, pharmacy benefit managers, and government health care institutions. The company also provides contract management, logistics, and supply chain functions using MDM for the major Johnson & Johnson franchises.
Responsibilities:
Responsible for Quality Assurance planning and documentation.
Analyzed the Software Design Specification (SDS) and Functional Requirements Specifications (FRS) with the business and development teams.
Performed pre- and post-reviews of required system test scripts and executions to meet functional requirements
Prepared Test plan, Test Cases from Functional Requirements.
Used Quality Center for tracking defects and reporting purposes.
Attended defect status meetings and prioritized defects.
Tracked and reported test cycle status.
Collected and reported testing metrics for all aspects of testing, including test script/plan condition, test case pass/fail ratios, test cases executed per hour, and overall project progress.
Interacted and worked with the business/developers for implementing Change Requests.
Created, maintained, and reconciled customer data, including Sales Force Alignment.
Ensured timely processing of various customer rosters and third-party data
Proactively monitored gray-area and potential customer data queues and worked on matching, merging, and unmerging customers
Proactively monitored various end-to-end data error and data quality reports through resolution
Interacted with U.S. and international customers in accordance with current healthcare compliance policy and the Credo values
Identified various system and data quality issues; prepared resolution approaches and recommendations; led change through implementation in collaboration with business and IT partners
Environment: Windows XP, UNIX, Java, HTML, XML, JSP, Servlets, Quality Center 9.2, Oracle 11i, MS Excel
Project: Check Image Archive
Client: Viewpointe Archive Services, NJ
Duration: Aug 2009- Sep 2010
Role: QA Analyst
Viewpointe is one of the largest providers of check image exchange and archive services in the United States. The company has developed a highly secure, scalable, high-performance national archive and delivers strategic value to many of the nation's top-tier financial institutions. Its primary focus is to encourage the electronic processing of check payments by fostering the use of image exchange and image sharing on a national scale. Viewpointe also provides a number of value-added products apart from archive services, such as Check Fraud-Guard, Image Statements, and Image Integrity Analysis.
Responsibilities:
Created Test Plans and Test Cases based on the business requirements & technical specifications and Created Test Data (Image Files).
Executed Test Cases and Test Scenarios authored above.
Installed software packages in various QA and implementation environments.
Modified and rewrote installation scripts for each environment and installed applications accordingly.
Created test data, large performance data sets, and check images with specific test requirements in different file formats such as CIFF, X9.37, and MO:DCA.
Deployed and Configured Applications in WebSphere Application Server.
Executed run-level scripts in Perl and Korn shell to install applications and to debug and resolve install issues.
Worked with Development and Architecture teams to understand requirements and communicate software defects.
Provided daily status updates to the business team.
Logged defects in Mercury Quality Center and reported their status.
Environment: Mercury Quality Center, UNIX, Perl, Shell, DB2, C++, Java, JSP, WebSphere, Windows XP, IBM OnDemand Content Management.
Project: Fixed Income - TRACE (Trade Reporting and Compliance Engine)
Client: UBS, NJ
Duration: Feb 2008 – Jul 2009
Role: QA Analyst
The Securities and Exchange Commission ("SEC") approved proposed rules requiring FINRA members to report over-the-counter ("OTC") secondary market transactions in eligible fixed income securities to FINRA and to subject certain transaction reports to dissemination. The Trade Reporting and Compliance Engine ("TRACE") is the FINRA-developed vehicle that facilitates this mandatory reporting and also provides increased price transparency on an immediate basis to market participants and investors in corporate bonds. FINRA members are obligated to report secondary market trades to TRACE within fifteen minutes of trade execution. Fixed income transactions that must be reported under the TRACE 6700 Series Rules are those OTC secondary market transactions involving a "TRACE-eligible security". Effective March 1, 2010, the Rules require that primary market transactions involving a "TRACE-eligible security" also be reported.
Responsibilities:
Created Test Plans and Test Cases based on the business requirements & technical specifications and Created Test Data
Used Quality Center for tracking defects and reporting purposes.
Attended defect status meetings and prioritized defects.
Executed run-level shell scripts to install applications and to debug and resolve install issues.
Tracked and reported test cycle status.
Collected and reported testing metrics for all aspects of testing, including test script/plan condition, test case pass/fail ratios, test cases executed per hour, and overall project progress.
Coordinated with automation team to automate the application using QTP for regression testing.
Organized QA Entrance & Exit criteria meetings.
Environment: Java, HTML, XML, JSP, Servlets, Quality Center 9.2, QTP 9.2, Oracle 10g, DB2, MS Project, and Visio
Project: Capstone
Client: Capital One, Richmond, VA
Duration: Oct 2006 – Nov 2007
Role: QA Specialist
Capstone (Credit Card Application Decisioning Engine) is a rules-engine based decision server.
Capital One receives credit card applications via mail, voice response units, third-party vendors, partner systems, etc. (Lowe's, TJ Maxx, Amerifee, etc.), and these applications are loaded into Capstone using both the online decisioning mode and batch file load processes. Once an application is received, Capstone uses the various rule set-ups, credit policies, bureau results, and the fraud metamodel to approve or reject it. For online modes, the decision is conveyed immediately by sending the approval status as a response.
Responsibilities:
Analyzed the user/business requirements and functional specification documents.
Set up the automation test environment with Mercury tools such as QuickTest Pro.
Installed Quick Test Pro and configured Shared Object repository.
Designed, Developed Test automation Strategy for Quick Test Pro.
Extensively used Quick Test Pro methods to create automated scripts.
Set up the QA environment for manual testing as well as for automation.
Created Project, Requirements, Test Plans and Tests in Test Director.
Wrote test cases to test the application manually in TestDirector and automated them using Quick Test Pro.
Used Test Director for defect reporting and tracking.
Generated Virtual load using web (HTTP/HTML) protocol.
Verified all functionality, front end and back end, using WinRunner.
Performed backend testing using UNIX shell scripts
Developed SQL Queries to perform transaction testing
Developed LoadRunner Vuser scripts to simulate real-time load.
Wrote a detailed User Acceptance Test (UAT) plan and executed UAT with clients and customers.
Interacted with developers, discussed technical problems, and reported bugs.
Environment: TestDirector 8.0, LoadRunner 8.0, QTP 8.2, C++, Java, JSP, JNLP, WebLogic 8.1, JDK, Windows XP, UNIX, Oracle 9i, WebClient, EJB