Amarnath Ravi
********.*****@*****.***
Sr Test Consultant with Wipro Technologies
. 9+ Years of diverse experience in Software Testing and Quality
Assurance & Control.
. Extensive experience in designing and execution of Test Cases, Test
Scripts and Test reports for manual and automated tests.
. Expertise in analysis of Requirement Traceability Matrix (RTM),
Problem severity, Defect tracking and Defects reporting system.
. Ability to work independently or as a team member; excellent written and verbal communication skills.
. Excellent interpersonal skills and ability to work in a cross-
functional environment.
. Excellent problem-solving and multitasking skills. Quick learner, able to work under pressure to meet deadlines.
. Strong skills in Batch & back-end testing using SQL queries.
. Complete Knowledge of SDLC.
EDUCATIONAL QUALIFICATION:
> Master of Computer Applications (MCA)
TECHNICAL SUMMARY:
> Efficient in implementing QA processes and establishing standards, procedures and methodologies.
> Strong Knowledge on the phases of Software Development Life Cycle.
> Strong Knowledge in Perl and Shell Scripting.
> Identify the Test Requirements based on User Requirements and Program
Specifications.
> Experienced in preparing Test Plans, Test Scripts, Test Cases and Test
Data.
> Ability to identify tasks that require manual versus automated test cases.
> Expertise in System, Smoke, Integration, Regression, & UAT tests.
> Experience in the areas of GUI, Functional, Integration, System, Front-
end, Regression & Web Services testing for Client Server and web
applications.
> Good knowledge of PL/SQL, QTP, Selenium and Python.
> Expertise in risk analysis, RTM preparation and test case prioritization.
> Strong experience in interacting with business analysts and developers to
analyze the user requirements, functional specifications and system
specifications.
> Proactively studied various kinds of testing concepts and templates from
various sources and effectively implemented them across the organization.
> Professional experience in designing test cases and test scripts for
Windows, Java and UNIX/LINUX environments.
> Experienced in distributed testing, multi-user testing and database
testing.
TECHNICAL PROFICIENCY:
Programming Languages: C, C++
Web Development Tools: VBScript
Testing Tools: Manual Testing, Quality Center 10.0, QTP 10.0, Load Runner
Operating Systems: Windows series, HP-UX, Linux
Data Access Languages: MySQL
Database Servers: Oracle
Scripting Languages: Perl, Shell, Python
Professional Experience
1. TAS Consolidation and Upgrade Project for BP US FVC
Location Wipro Technologies, Chicago, IL
Project Duration Sep'10 - Current
Role Senior QA Analyst
Environment Linux, .NET client application, MySQL and ALM 11.0
The project is for British Petroleum, an oil and gas major. The Terminal Automation System currently used at the terminals is TopTech TMS 5, and support for it is ending in the near future. Continuing with the current systems would challenge the License to Operate, so it was decided to upgrade the Terminal Automation System to TMS 6 and replace the TMS hosts with TopHat. Both TMS6 and TopHat use Red Hat Linux as the operating system and a .NET GUI frontend with a MySQL database. TMS 5 and the TMS hosts currently run on an unsupported operating system, QNX. Two TMS hosts are in use, one each for East of Rockies and West of Rockies. These are core changes to the system, which necessitate system, regression, integration and UAT testing to ensure business continuity without flaws. The scope of the project includes various kinds of testing of the software received from the vendor Toptech, validating it against the client's business requirements, setting up a lab with development, test and operational acceptance test environments with Terminal Automation Devices, and deploying the tested product in all the BP terminals.
Responsibilities:
. As a consultant, understood the various business processes implemented via TMS6: loading processes, End of Day and End of Month activities, and the testing and deployment process.
. Defined and analyzed the testing requirements based on the application functionality.
. Gathered test data requirements for data conditioning from Business Units
to test total application functionality.
. Prepared Traceability Matrix, Test Results Documents for manual and
automation testing.
. Conducted Test Plan walkthroughs and wrote test scripts for manual
testing.
. Created test scenarios for System testing, Regression testing and
Integration testing.
. Prepared Test Plans, Test Cases for both positive and negative scenarios
and mapped the same to requirements in Quality Center.
. Validated the implementation of business processes via TMS6 and reported defects in the application or suggested process improvements to the customer, based on the testing results.
. Gained basic working knowledge of the TMS5 application.
. Executed test scripts and analyzed the test results in system and UAT
testing.
. Performed rigorous regression testing and end-to-end integration testing from the TMS6 system to SAP, following up the flow of data with all the teams.
. Tested communication with hardware terminal devices such as RCUs, PLCs and Accuload.
. Defect Management involved reporting and tracking using ALM 11.0 (HP
Quality Center).
. Defect Reviews, Test suite creation for test lab, Project user
maintenance, generation of reports.
. Developed Automation Scripts using Perl and shell scripts to perform the
Data Validation Testing.
. Performed testing at the vendor site (Toptech) for critical deliveries.
. Performed database testing, using SQL to pull data from the database and check whether it matched the GUI.
. Tested in a VMware desktop environment.
. Tested defect fixes at the vendor site, working with the development teams.
. Coordinated with the offshore team: assigned tasks, helped them understand the requirements and provided daily status reports to the manager.
. Supported deployment activities, tracked the data flow from the TMS6 application to the TopHat application and other external systems, and performed bubble support activities.
. Served as deployment lead, overseeing activities at the terminal sites during the deployment of TMS6.
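For illustration, a database-versus-GUI validation check of the kind described in the responsibilities above could be sketched as follows. This is only a sketch: the table, columns and BOL values are hypothetical, and an in-memory SQLite database stands in for the project's MySQL instance.

```python
import sqlite3

# Stand-in for the TMS6 database (hypothetical schema and data).
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE loads (bol_number TEXT, gross_volume REAL)")
conn.executemany("INSERT INTO loads VALUES (?, ?)",
                 [("BOL-1001", 8500.0), ("BOL-1002", 7250.5)])

def fetch_db_rows(conn):
    """Pull the records under test directly from the database."""
    cur = conn.execute(
        "SELECT bol_number, gross_volume FROM loads ORDER BY bol_number")
    return {bol: vol for bol, vol in cur.fetchall()}

def validate_against_gui(db_rows, gui_rows):
    """Compare DB values with values captured from the GUI;
    return a list of mismatches for the defect report."""
    mismatches = []
    for bol, gui_vol in gui_rows.items():
        db_vol = db_rows.get(bol)
        if db_vol is None or abs(db_vol - gui_vol) > 1e-6:
            mismatches.append((bol, db_vol, gui_vol))
    return mismatches

# Values as they would be read off the GUI screens (hypothetical).
gui_rows = {"BOL-1001": 8500.0, "BOL-1002": 7250.5}
print(validate_against_gui(fetch_db_rows(conn), gui_rows))  # [] when DB matches GUI
```

An empty mismatch list means the GUI agrees with the database; any entries become candidate defects.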
2. NetApp - DPG
Location Wipro Technologies, Bangalore, India
Project Duration March' 10 - Aug'10
Role QA Analyst
Environment QTP 10, Linux, Filers, Perl
The project is for a company in the business of storage and data management solutions. It involves automating the test cases in Perl for the DPG area of NetApp storage devices. The scripting language used for test case development is Perl, and the framework used for script development and execution is NATE. The developed test cases were unit and system tested on NetApp filers. SnapMirror is a feature of Data ONTAP that enables you to replicate data from specified source volumes or qtrees to specified destination volumes or qtrees.
Responsibilities:
. Involved in developing detailed test strategy, test plan, test cases
and test scripts for Automation Testing.
. Set up the test environment, defining detailed Test Requirements,
converting them into Test Cases and collected Test Metrics for
analyzing the Testing Effort.
. Developed, maintained and conducted smoke test cases for QA environments.
. Development of library functions for the Automation Test cases.
. Coordinated and worked along with the development and business teams.
Controlled testing projects at every step of the quality cycle from
test planning through execution of defect management.
. Involved as part of automation team for the development of Perl and
Shell Scripts to automate the DPG Product.
. The DPG product involves snap mirroring among volumes and aggregates, disk-to-tape backup and other operations on filers.
. Debugged the logs when problems occurred during execution.
. Unit testing and System testing of the scripts.
3. QCHAT AUTOMATION
Location Wipro Technologies, Hyderabad, India
Project Duration Aug'09 - Feb'10
Role QA Engineer
Environment Linux, Perl
QChat is the future of push-to-talk communications. QUALCOMM developed QChat to provide a reliable method of instant connection and two-way communication between users in different locations. QChat gives subscribers easy-to-use, instantaneous two-way push-to-talk (PTT) communications. QChat is an application developed for the BREW (Binary Runtime Environment for Wireless) platform. QChat handsets and server software allow users to connect instantaneously with other QChat users anywhere in the world at the push of a button. QChat will provide a new suite of enterprise communications for organizations faced with time-critical emergency situations.
Responsibilities:
. Involved in developing detailed test strategy, test plan, test cases
and test scripts for Automation Testing.
. Set up the test environment, defining detailed Test Requirements,
converting them into Test Cases and collected Test Metrics for
analyzing the Testing Effort.
. Developed, maintained and conducted smoke/sanity test cases for QA environments.
. Involved in developing the framework for the Automation.
. Development of Test Driver scripts for the automation test cases.
. Coordinated and worked along with the onsite coordinator
. Automation of scripts in Perl.
. Debugged the logs when problems occurred during execution.
. Unit testing and System testing of the scripts.
. Performed regression, functional, system and UAT testing on the main application.
. Developed and maintained test scripts, analyzed bugs and interacted with development team members to fix the defects.
. Responsible for writing simple to complex SQL queries to verify the
data in database.
. Responsible for analysis, reports and defect tracking.
Environment: Quality Center 10.0, Linux, Perl
4. NetAct Reporter
Location Wipro Technologies, Hyderabad, India
Project Duration Feb' 09 - Jul '09
Role Module Lead
Environment GUI based application, C++, HP UNIX, Perl,
Shell, Load Runner, Oracle 7.3
The project deals with developing and enhancing the platform for processing and aggregating the Performance Management (PM) data received from the network elements. It handles PM data for IEdge, 3G and various other network elements. The project has four modules: ETLOAD, Aggregate, PM Integration Toolkit and Topology History. The ETLOAD module processes the PM data from XML files and inserts it into the database; it is a generic module that can process any network element's data with respect to the metadata available for that network element. The Aggregate module summarizes the PM data available in the PM database and stores the summarized data in the summary tables; it also forwards the summarized data to the other clusters of the NMS. The PM Integration Toolkit module fetches PM files from network elements (PACO NEs and SNMP-related NEs) and works in two modes: 1) polling from the NE, 2) notifications from the NE. The Topology History module collects and maintains the topology history.
Responsibilities:
. Solved bugs raised by customers as problem reports, helpdesk requests and NOKs, ensuring precise and speedy resolution for Nokia's global clients.
. Performed software maintenance and enhancements: fixed bugs, released change/technical notes.
. Responsible for leading module-related activities.
. Developed Perl scripts to verify the output of the Network Elements.
. Queried the data using SQL*Plus.
. Responsible for extracting and loading data into database for report
generation and other functionalities.
5. NetAct Core Platform
Location Wipro Technologies, Hyderabad, India
Project Duration June' 05 - Jan '09
Role Project Engineer/Quality Coordinator/Module Lead
Environment GUI based application, C++, HP UNIX, Perl,
Shell, Load Runner, Oracle 7.3
The project is for a telecom major in Europe and America.
The OSS (Operations Support Systems) provides a future-proof, scalable framework for operating the entire managed network, including GSM/EDGE, GPRS and WCDMA. OSS can flexibly adopt new network technologies with minimal site intervention. The Core Platform provides all the essential network management services in a generic way. The services provided by the platform include core applications, user interface services, system management services, communication services, database management services, and common libraries and services in the form of processes, applications, utilities and libraries. The scope of this project is to maintain these applications and develop enhancements through proper analysis.
Responsibilities:
. Reviewed Design specifications, created test plans, test cases and
executed them.
. Developed Perl scripts to verify the output of the Network Elements.
. Developed and maintained automation shell scripts using HP-UX tools such as ksh, awk and sed to test whether data integrity and referential integrity were being met.
. Extensively utilized SQA to automate test procedures & test cases that
perform regression testing on front-end of the system.
. Testing of various objects on the front-end.
. On the server, performed stress testing and regression testing.
. Queried the data using SQL*Plus.
. Worked on fixing critical bugs found in customer labs: analyzed bugs, prepared implementation analysis reports and, once approval was obtained from product management, implemented the fixes.
. Involved in the phases of System testing for every release.
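An integrity check of the kind the ksh/awk scripts above performed can be sketched in Python; the file contents, column names and key layout here are hypothetical, chosen only to illustrate a referential-integrity check between two database extracts.

```python
import csv
import io

# Hypothetical extracts, as the shell scripts would have pulled them
# from the database (column layout is illustrative).
parents_csv = "ne_id,ne_name\n101,RNC-01\n102,RNC-02\n"
children_csv = "sample_id,ne_id\n1,101\n2,102\n3,103\n"

def orphan_rows(parents_csv, children_csv, key="ne_id"):
    """Return child rows whose foreign key has no matching parent row,
    i.e. referential-integrity violations."""
    parent_keys = {row[key] for row in csv.DictReader(io.StringIO(parents_csv))}
    return [row for row in csv.DictReader(io.StringIO(children_csv))
            if row[key] not in parent_keys]

# The row with sample_id 3 references ne_id 103, which has no parent row.
print(orphan_rows(parents_csv, children_csv))
```

An empty result means every child record resolves to a parent; any returned rows are integrity violations to investigate.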
6. Interconnect Billing
Location MIC Electronics, Hyderabad, India
Project Duration Apr'04 - May'05
Role Software Engineer
Environment C++, Linux, MySQL
This project is for a telecom major in India. Interconnect billing is the process of handling calls for other service providers; it allows the customers of one service provider to communicate with the customers of another service provider.
The project has two phases: offline CDR collection and online CDR collection. In offline CDR collection, data from the switch is collected on cartridges or drives; the application must access the drive, retrieve the data and store it in files. These files are sent from the aggregation center (PoI) to the data center through the application. The local and remote paths of those files, the PoI name and its details are all entered through the user interface and stored in the database. During online CDR collection, the manager application communicates with the agent application through the CMIP protocol (UHC's Q3 stack) to get the CDR files; these are sent to the data collection center as 2 MB files over a 64 kbps link as and when collected.
Responsibilities:
. Handled Collection and distribution modules.
. Requirements gathering for the project.
. Involved in the development of the collection and distribution modules in C++, incorporating the CMIP protocol.
. Processing of CDR (Call Detail Record) information in the Raw CDR
files.
. Unit testing of the developed modules.
. Involved in the module of putting the processed information into the
database.
. Responsible for unit Testing and System testing of the product.
7. CDR Normalizer
Location MIC Electronics, Hyderabad, India
Project Duration Mar'03 - Nov'03
Role Software Engineer
Environment C++, Linux, MySQL
The project normalizes the data of switches such as Alcatel's OCB-283 and Siemens' EWSD. The normalizer takes a file as input and, depending upon the technology, processes the raw data of the switch and produces rich formatted files used for billing purposes. The formatted data is then updated into the database.
Responsibilities:
. Normalization of the Raw CDR files
. Requirements Gathering for the project
. Processing of raw CDR files of the Alcatel OCB-283 switch
. Processing of Raw CDR files of EWSD Siemens switch
. Responsible for the unit testing and System testing of the project.