Chirag Patel
**********@*****.***
585-***-**** (Cell)
Address: **** ****** **** **, ****, NC 27519
Summary
Over six years of experience in Information Technology, including
approximately four years of Amdocs experience, with an emphasis on
Software Quality Assurance. Worked on testing of stand-alone, web-based,
and client/server applications using both manual and automated tools.
Possess experience in the Telecom industry. Very strong communication
and judgment skills, teamwork, international work experience, quick
adaptability to ever-changing technology, and confidence are my
strongest points.
Specific Expertise
. Excellent working experience in various industries including Freight,
  Health Care Providers, Media, Publishing, and Telecom
. Excellent working knowledge of various Amdocs tools, applications,
  processes, and workflows in the Telecom industry, such as the Amdocs
  Ensemble Billing System, Amdocs Enabler Billing System, Clarify CRM,
  OMS, and eCare
. Experience in testing through a full system development and testing
life cycle.
. Experience in End-to-End System Testing and System Support
. Experience in creating and validating Test Plans, Test Cases, and
  Test Scripts.
. Strong analytical, problem solving, and testing skills
. Worked through all stages of testing, namely Integration Testing,
  System Testing, and User Acceptance Testing (UAT).
. Experience in Manual, Regression, Functional and Configuration
Testing.
. Experience in Performance Testing, Stress Testing, Security Testing,
Sanity Testing of Web-based and Client/Server based applications.
. Worked with, and have a solid understanding of, release and quality
  management processes
. Experience in writing SQL to confirm the system is writing and
  displaying correct data; a sketch of this kind of check follows this
  list. SQL tools used: TOAD, SQL*Plus, MySQL, etc.
. Performed Backend Testing of applications using SQL queries to
  validate the consistency of data.
. Strong experience using UNIX commands; used Exceed and PuTTY to
  access UNIX environments.
. Performed Functional Testing and GUI Testing of applications,
  checking them against standards and business requirements.
. Experience in documenting test results for corrective actions,
reporting and audits.
. Experience in testing performance of web applications.
. Worked with the development team and business users to create and
  document business scenarios for testing
. Ability to work within a team environment and to communicate well
  both orally and in writing.
. Extremely diligent; a strong team player with the ability to take on
  new roles.
. Excellent verbal and written communication skills. Exceptional
  ability to learn new concepts and technologies quickly.
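As an illustration of the backend validation noted above, here is a
minimal sketch of running consistency checks through SQL*Plus from a
shell script. The connection string and the orders/customers schema are
hypothetical examples, not taken from any of the projects below.

    #!/bin/bash
    # Hedged sketch: backend data validation via SQL*Plus from the shell.
    # Connection string, tables, and columns are hypothetical examples.
    run_sql() {
        echo "$1" | sqlplus -s qa_user/qa_pass@QADB
    }
    # Count of orders created today, compared by hand against the UI figure.
    run_sql "SELECT COUNT(*) FROM orders
              WHERE TRUNC(created_dt) = TRUNC(SYSDATE);"
    # Consistency check: no order should reference a missing customer.
    run_sql "SELECT COUNT(*) FROM orders o
              WHERE NOT EXISTS
                    (SELECT 1 FROM customers c WHERE c.cust_id = o.cust_id);"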
Technical Skills
Operating Systems: Windows 95/98/2000/2003/XP, UNIX (HP-UX)
Database: Oracle 9i, 10g
Web Servers: WebSphere, WebLogic
Software Packages: MS Office and Oracle Forms
Testing Tools: WinRunner, LoadRunner
Bug Reporting Tools: MS Excel, Test Director, AMC, TeamTrack, Mercury
Quality Center, and Bugzilla
SQL Tools: TOAD, MySQL, SQL Server, and SQL*Plus
Amdocs Applications: Clarify CRM (6.0), OMS (6.0), eCare, Customer Service
Management
Middleware Applications: Tuxedo, Atlas, SOAC, SDP, Granite, and SAP
Professional Experience
FedEx Services, Cary, NC Position: QA Lead
Mar 10 - Aug 10
Today's FedEx is led by FedEx Corporation, which provides strategic
direction and consolidated financial reporting for the operating
companies that compete collectively under the FedEx name worldwide:
FedEx Express, FedEx Ground, FedEx Freight, FedEx Office, FedEx Custom
Critical, FedEx Trade Networks, FedEx SupplyChain, and FedEx Services.
Through FedEx Services I supported the FRATX project, which merged FedEx
Ground and FedEx Home Delivery as part of their re-organization. I was
involved in this project as a QA Lead, leading various aspects of FRATX
while simultaneously carrying out hands-on QA work to meet the deadline.
Responsibilities:
. Gathering Software Requirement Specifications from the Business team
  related to the FRATX project
. Performing Manual, Regression, and Functional testing for the FRATX
  project
. Creation and distribution of the Detailed Test Plan Specification
. Creation of Test Scenarios based on Software Requirements using
Microsoft Excel
. Leading the Test Scenario Review with the Application Development Lead
and Business Area Test Lead
. Creation of Test Cases based on Software Requirements using Microsoft
Excel
. Leading the Test Case Review with the Application Development Lead and
Business Area Test Lead
. Lead QA Test Effort with assistance of the Application Development
Lead and Business Area Test Lead
. Posted all test scenarios and test cases on Source Forge for
Developers, UAT and QA teams to review and approve them
. Reviewed all test scenarios and test cases in person with development
and UAT teams
. Executed test cases and updated the results using Microsoft Excel
. Ensured completion of all test cases
. Validated all different reports developed in AS400 that were impacted
for FRATX changes
. Logged defects in Quality Center during execution of test cases
. Created a report in Microsoft Excel of all defects logged during
  execution, for defect review meetings and fix tracking
. Retested defects as they were fixed
. Worked through approximately 18 software specifications for the FRATX
  project
. Actively mentored the UAT, Development, and QA teams through the test
  cases, helping them verify that no existing functionality was broken
  by the FRATX changes
. Executed test cases as part of my own active testing, in addition to
  mentoring, for some of the software requirements
. Attended weekly meetings with the management team and shared the
  status report on a weekly basis
. Worked closely with the Development and UAT teams
Sandata Technologies, Port Washington, NY Position: Sr. QA
Nov 09 - Feb 10
Sandata Technologies improves the business of care by providing
leading-edge, market-responsive information technology solutions and
services to home health and human service agencies performing vital
work: the care of our nation's seniors, children, and sick and disabled
people. The project at Sandata involved performance and manual testing
of web and client/server applications. As a QA, I worked with the
global QA and Development teams in analyzing system requirements and
data requirements, testing end-to-end test environments, fulfilling
data requests, and testing applications.
Responsibilities:
. Experience working with both web-based and client/server applications
. Compared end results, actual results, and performance between the
  web-based and client/server applications
. Worked through and tested different versions of the applications over
  both HTTP and HTTPS
. Analyzed Business Requirement Document, Software Design Document,
Software Requirement Specification and Functional Requirement
Document.
. Worked closely with Project Manager, Development Lead, Developers,
Network Admins to gather requirements in order to formulate the
Performance and Functional Test Plan.
. Executed various LoadRunner scripts using the LoadRunner Virtual User
  Generator (VuGen)
. Executed various LoadRunner scripts using the LoadRunner Controller
. Analyzed the test results of all scripts using both VuGen and the
  Controller
. Created the Performance Strategy document for each script based on
  the Controller and VuGen test results
. Correlated the Controller and VuGen graphs to closely validate
  virtual user behavior
. Performed validation of automation scripts and test cases, and
  documented comments for changes made in the scripts.
. Performed Load & Performance testing on the Linux and Windows
machines.
. Ran approximately 120 reports on conversion data and compared the
  data between the old version (mainframe DB/client-server app) and the
  new version (Oracle DB/web-based app); a sketch of this comparison
  follows this list
. Reported bugs based on the comparison between the two databases and
  the data converted from the old DB to the new DB
. Retested and regression-tested the reported bugs after the fixes were
  applied in QA
. Tested use case scenarios and test cases per business requirements
  with various user loads using Performance Center.
. Updated use case scenarios and test cases to add missing steps and
  ensure testing quality
. Generated LoadRunner reports with Running Vusers, Hits per Second,
Throughput, Average Response time and Total Transactions Per Second
graphs.
. Created a Performance Report for the end to end testing in comparison
with the baseline and final values.
. Uploaded the Performance Test Plan, test scripts, scenarios, and
  final reports to Quality Center for every application.
. Extensively used SQL to query the Database to validate the test data
for each application
. Worked closely with the developers on performance tuning and other
  related bugs by gradually scaling the application.
. Overcame many testing challenges and issues using my knowledge and
  expertise.
. Met weekly with the management team to share the status report.
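As an illustration of the report-comparison step mentioned in the list
above, here is a minimal sketch assuming each report was exported to
CSV from both the old and the new system; the directory and file names
are hypothetical.

    #!/bin/bash
    # Hedged sketch: compare report extracts from the legacy system
    # against the new Oracle/web-based system. Names are hypothetical.
    OLD_DIR=old_reports    # extracts from the mainframe/client-server app
    NEW_DIR=new_reports    # extracts from the Oracle/web-based app
    for old in "$OLD_DIR"/*.csv; do
        name=$(basename "$old")
        new="$NEW_DIR/$name"
        if [ ! -f "$new" ]; then
            echo "MISSING in new system: $name"
            continue
        fi
        # Sort both extracts so row order cannot mask a real difference.
        if ! diff <(sort "$old") <(sort "$new") > /dev/null; then
            echo "DATA MISMATCH: $name"    # candidate for a bug report
        fi
    done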
DirecTV, El Segundo, CA Position: QA Lead
Feb 08 - Oct 09
DirecTV (trademarked as "DIRECTV") is a direct broadcast satellite (DBS)
service based in El Segundo, California that was founded in 1990. It
transmits digital satellite television and audio to households in the
United States, the Caribbean and Latin America except for Mexico. DirecTV
is owned by DirecTV Group, which is controlled by Liberty Media.
Responsibilities:
* Leading the UAT team and making sure all new production defects have
  been tested in UAT before the code is submitted to production.
* Understanding the Business Requirements and the applications'
  end-to-end functionality
* Preparation of Detailed Test Plan and Test cases for functional
testing.
* Prepared and issued weekly status reports and conducted Defect Review
meetings.
* Performed and actively participated in User Acceptance Testing (UAT).
* Ensured Use-Cases were consistent and covered all aspects of the
Requirements document.
* Performed Negative and Positive Testing.
* Designed and developed scenarios based on business requirements.
* Executed test cases and reported defects in Mercury Quality Center.
* Supporting Production applications for emergency defect requests
* Validated the SQL scripts to make sure they were executed correctly
  and met the scenario descriptions.
* Reviewing and re-testing reported defects in the concerned
applications using Manual as well as Automation tools
* Creating test cases and test steps for the reported defects based on
the business requirements
* Executing test scripts for functional and regression testing related
to the defects fixed using QTP
* Executed the scripts in Firefox and IE, testing browser compatibility
  using QTP for QA purposes
* Validated various logs generated by Java and Perl applications on
  UNIX; a sketch of this kind of log check follows this list
* Supported and resolved various data issues reported by different
  teams
* Troubleshot various issues encountered during test execution
* Writing SQL queries to access the Oracle database using the 'SQLyog'
  Enterprise tool
* Experience with UNIX commands to access UNIX servers for reading log
files, executing shell scripts and running various jobs
* Conducted routine meetings with team members and developers whenever
  requirements changed, to build a better understanding of the
  requirements.
* Interacting with other teams through walk-through, teleconferences,
meetings, etc. to resolve various issues.
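As an illustration of the UNIX log validation mentioned above, here is
a minimal sketch of scanning a day's application logs for errors; the
log directory, file-naming convention, and patterns are hypothetical.

    #!/bin/bash
    # Hedged sketch: scan the day's application logs for errors and
    # exceptions. Paths and log-name patterns are hypothetical examples.
    LOG_DIR=/apps/logs
    TODAY=$(date +%Y-%m-%d)
    for log in "$LOG_DIR"/*"$TODAY"*.log; do
        [ -f "$log" ] || continue
        echo "== $log =="
        # Count exceptions and severe-level messages (Java-style logs).
        grep -cE 'Exception|SEVERE' "$log"
        # Show the last few ERROR lines, with line numbers, for the
        # defect report.
        grep -n 'ERROR' "$log" | tail -5
    done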
Houghton Mifflin Harcourt, Boston, MA Position: QA Lead
Apr 07 - Jan 08
Houghton Mifflin is one of the leading educational publishers in the United
States, publishing textbooks, instructional technology, assessments, and
other educational materials for teachers and students of every age. As a QA
Engineer/Data Migration and Validation Analyst, I was involved in multiple
projects, sometimes concurrently. I wrote the QA requirements and then
created the data and process flow diagrams and documents for HMH content
management applications in preparation for testing. As a lead, I
coordinated the data input for the BTI project into Test Director for the
purpose of status and bug tracking. I worked directly with the QA Director
and the lead Automation Tester and communicated directly to division owners
to create and input test scenarios and test cases into Test Director. I was
also largely involved in test execution and bug tracking thereafter.
Responsibilities:
* Involved in gathering business requirements, studying the application
  and data, and collecting information from developers and business users.
* Worked with and reported directly to the QA Manager, who was offsite.
* Primarily involved in data validation efforts using Microsoft Excel.
* Designed and executed Test Plans and Test Cases, and generated Test
  Scripts and Test Scenarios. Created a daily status report to track
  data validation efforts.
* Responsible for Back End Testing. Wrote extensive SQL queries for Back
End testing
* Reviewed the converted files in XML format to validate the conversion
process.
* Analyzed XML data to make sure that the transformation process
  occurred correctly; a sketch of these checks follows this list.
* Compared converted data against the original XML files to check for
  discrepancies.
* Involved in Unit Testing and Integration Testing to check that the
  data came through correctly from the different source systems.
* Coordinated the testing efforts with the Developers, QA Team, the
Functional Specialists and the Business Analyst to ensure that the
testing requirements were being fulfilled.
* Performed sanity and smoke tests before performing extensive manual
  functional testing on each application.
* Assisted in Designing, Communicating, and Enhancing QA testing plan for
the application.
* Wrote the project description and defined the design steps in Test
  Director.
* Manually executed System Test scripts developed in Test Director.
* Performed extensive Manual Functionality Testing on all the applications
assigned.
* Performed referential integrity checks on the Oracle database.
* Designed and executed Test Plans and Test Cases, and generated Test
  Scripts and Test Scenarios for the eCommerce and Supplemental sites.
* Manually executed testing steps
* Queried the Oracle database using Oracle Enterprise Manager for Back End
Testing.
* Optimized SQL queries and interacted with development team to resolve
anomalies in the database development.
* Conducted User Acceptance Testing (UAT) and Usability Testing (UT) on
  the applications under test (AUT) for projects migrated from Test
  Director
* Maintained test matrix and bug database and generated weekly reports.
* Actively participated in enhancement meetings focused on strategizing the
testing effort of subsequent projects.
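As an illustration of the XML conversion checks mentioned above, here
is a minimal sketch verifying that every converted file is well-formed
and that no records were dropped; the directory names and the record
element are hypothetical.

    #!/bin/bash
    # Hedged sketch: validate converted XML and compare record counts.
    # Directory names and the record element are hypothetical examples.
    SRC_DIR=source_xml
    OUT_DIR=converted_xml
    for f in "$OUT_DIR"/*.xml; do
        # xmllint --noout exits non-zero if the XML is malformed.
        xmllint --noout "$f" || echo "MALFORMED: $f"
    done
    # Sum the record counts on each side; a mismatch means dropped rows.
    src=$(grep -c '<record' "$SRC_DIR"/*.xml | awk -F: '{s+=$2} END {print s}')
    out=$(grep -c '<record' "$OUT_DIR"/*.xml | awk -F: '{s+=$2} END {print s}')
    echo "source records: $src, converted records: $out"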
IntraISP, USA, St. Louis, MO Position: Lead QA
Apr 06 - Mar 07
IntraISP (www.intraisp.com) is a leading global software company focused on
Billing, CRM, and OSS for the communications services industry (WISP/WiMAX,
ISP, VoIP, Web Hosting, IP Video, Gaming, and other IP-based service
providers). At IntraISP, I was involved with Production Support and testing
production defects for various ClearWire applications such as Full_SignUp,
Sales OE, and Boss.
Responsibilities:
. Leading UAT testing and making sure all the new production defects
have been tested in UAT before submitting the code into production.
. Coordinating with the Business and development teams on product
  delivery, and working with the testing team to write test cases and
  test scenarios using Bugzilla
. Supporting Production applications for emergency defect requests
. Managing and Tracking defects through bug tracking tool Bugzilla
. Reviewing and testing reported defects in the concerned applications
using Bugzilla
. Coordinating with concerned developer/development teams for design
reviews per the business requirements
. Creating test plan, test cases and test steps for the reported defects
based on the business requirements using Bugzilla
. Executing test scripts for functional and regression testing related
to the defects fixed using Bugzilla
. Writing SQL statements against the MySQL database using the 'SQLyog'
  Enterprise tool; a sketch of one such check follows this list
. Team coordination and customer interaction for daily reporting
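As an illustration of the MySQL validation mentioned above, here is a
minimal sketch run from the mysql command-line client (SQLyog is a GUI
front end for the same queries); the credentials and the signup/account
schema are hypothetical.

    #!/bin/bash
    # Hedged sketch: verify a fixed sign-up defect left no orphan rows.
    # Database name, credentials, and schema are hypothetical examples.
    mysql -u qa_user -p billing_db -e "
      SELECT COUNT(*) AS orphan_signups
        FROM signups s
        LEFT JOIN accounts a ON a.account_id = s.account_id
       WHERE a.account_id IS NULL;"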
Cingular/AT&T Mobility, Dallas, TX Position: Sr. QA Analyst
Nov 04 - Mar 06
The FQC (Functional Quality Control) project involved working through
six application version upgrades. As a Senior QA/Business Analyst, I
worked with the global QA and Support teams in analyzing system
requirements and data requirements, supporting end-to-end test
environments, fulfilling data requests, and testing applications.
Responsibilities:
* Gathering business requirements from the Business team for the new
  functionality in each version
* Creating test plans and test cases in Quality Center or in Excel
  based on the business requirements
* Executing test cases using the Test Lab in Quality Center
* Reporting defects for critical failed test cases using the
  bug-reporting tool Quality Center
* Escalating any show-stopper or critical defects
* Communicating or attending meetings with developers, business, leads
and managers for critical issues
* Supporting and resolving various job issues, including but not
  limited to MAF, MPS, Billing, EndOfDay, AR, CSM Telegence Online, and
  other jobs running in the environments under the team's control
* Configuring and Supporting the Test environment set-up including
testing servers (UNIX), Application and operational database (Oracle),
and applications running on UNIX server boxes
* Working issues arising from the Reference, Cross-Reference, and
  Operational tables in the Oracle database, and updating the tables
  per the requirements using Oracle SQL
* Working with data validation in the Application Databases using Oracle
SQL
* Collecting global data requirements related to the environment and
  Billing Account types, and running SQL queries against the production
  database to create BAN copies from the production environment and
  copy them into the Quality Assurance environment/database using BAN
  copying tools (CopyBAN Tool)
* Made recommendations for changes while working with clients to
  design, set up, and test applications prior to the implementation date.
* Supporting all teams worldwide with system environment issues, either
  resolving them or identifying the causes and routing them to the
  respective development or system maintenance (Infra) teams
* Working very closely with Production Support team in testing all the
Emergency Change Requests/Tickets opened by the developers to validate
the code change
* Collecting, creating, and validating domestic and international voice
  and data usage samples using UNIX shell scripts (a sketch follows
  this list), and creating templates to store them based on the usage
  requirements received from worldwide testing teams
* Creating template(s) to support the automation application (run by
  WinRunner) for bulk data usage creation and to store the manual data
  from CSM Online
* Attending various technical training sessions provided by Amdocs on
  the application to enhance technical skills
* Working closely with Development and Infra teams
* Analyzing logs generated by different applications to research the
possible causes of issues
* Point of Contact for offshore teams such as DVCI Amdocs, India and
Abilene, TX for any data or environment issues
* Point of Contact for worldwide voice and data usage requests from
various Quality Assurance/Testing teams
* Traveling to different Amdocs locations for cross training and
providing application support
* Regularly providing web-based training on applications and data usage
  to both onshore and offshore teams as part of application support
* Providing 24x7 on-call support
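As an illustration of the shell-scripted usage-sample creation
mentioned above, here is a minimal sketch that generates a bulk
voice-usage file; the pipe-delimited record layout and field values are
hypothetical, not the actual Amdocs usage format.

    #!/bin/bash
    # Hedged sketch: generate a bulk voice-usage sample file. The record
    # layout and field values are hypothetical examples.
    OUT=usage_sample.dat
    : > "$OUT"                      # start with an empty sample file
    for i in $(seq 1 100); do
        # fields: phone number | call date | duration (seconds) | type
        printf '214555%04d|%s|%d|VOICE\n' \
            "$i" "$(date +%Y%m%d)" "$((RANDOM % 600))" >> "$OUT"
    done
    echo "Created $(wc -l < "$OUT") usage records in $OUT"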
Sai Infonet Solutions Private Limited, Bangalore, India
May 02 - Oct 03
I was part of the Amdocs ETS (End-to-End Testing Support) group for the
SBC Lightspeed project. As part of this group I provided full support
to the SBC onshore and offshore ETS teams. The project involved
customization and regression testing of various Amdocs applications
such as Clarify CRM, OMS, and eCare.
Responsibilities:
* Transferred Amdocs application knowledge to both the onshore and
  offshore SBC teams.
* Gathering business requirements from the Business team for the new
  functionality in each version
* Creating test plans and test cases in Quality Center or in Excel
  based on the business requirements
* Executing test cases using the Test Lab in Quality Center
* Reporting defects for critical failed test cases using the
  bug-reporting tool Quality Center
* Escalating any show-stopper or critical defects
* Participated in End-to-End testing of the Amdocs Clarify CRM
  application, including creating contacts, service (AM BIS) and USPS
  (LIM BIS) address validations, credit evaluation, verifying or
  editing contact roles, creating accounts, validating tentative BANs,
  capturing account information, validating authorization, and
  launching OMS and eCare.
* Also took part in End-to-End testing of the Amdocs OMS application,
  covering the entire life cycle of order processing in the
  application. The OMS process flow: Negotiating >>> Provisioning >>>
  Notification.
* Used the eCare application. Located and viewed the Bill Summary:
  locate the bill summary (the account statement and its parameters,
  for example previous balance, adjustments, and current charges).
  Created adjustments: create an adjustment to resolve the customer's
  complaint (tax-, event-, or charge-level adjustment). Located
  unbilled usage: locate the pending events that will affect the
  balance in the next cycle. Located billed events: locate the billed
  events (usage charges, credits, and adjustments) affecting the
  current balance.
Education & Training
. Bachelor of Science in Computer Engineering - VNSGU, India
. MBA (Finance) - ICFAI University, India
. M.S. in Technology Management - University of Bridgeport, USA