
Senior Automation Engineer

King of Prussia, PA
April 19, 2016



Madhuri Buddha



* ***** ** ******* ********** in Information Technology with emphasis on Quality Assurance, Performance Testing and Automation testing of Web and Client/Server based applications.

Experience developing software tests for functional, performance (Load Runner) and regression testing for complex applications

Certified by ISEB (Information Systems Examinations Board)

Experience in automating functional test suites using Selenium WebDriver/Java/Cucumber/Maven

Implement Page-Object Model (POM) framework for Selenium projects

Hands-on experience using Git, Maven, and Jenkins for build management and continuous integration

Hands on experience in implementing automation scripts using TestNG

Hands-on experience in developing Data-Driven and Hybrid frameworks

Expertise in Gathering Test Requirements, developing Test plans, Use cases, Test data (when needed) and Workflows based on business requirements.

Proficient in SQL queries

Experience in writing SQL, Oracle PL/SQL queries using SQL Developer and Toad

Validate applications developed using Hadoop

Experience in RESTful web services and SOAP

Using JIRA for Agile implementation

Experience with HP testing tools (LoadRunner, Performance Center/ALM, Diagnostics) and monitoring tools (Wily Introscope, SiteScope, AppDynamics)

Work with various LoadRunner protocols such as Web (HTTP/HTML), Web (Click & Script), Ajax (Click & Script), Ajax TruClient, and Java.

Handle various types of performance testing, including load, stress, endurance, failover, and spike testing.

Identify Performance bottlenecks in the application and provide recommendations to improve the performance of the application

Proficient in working with Windows, UNIX environments

Familiar with the QTP/UFT data-driven framework

Work independently as well as part of a team; adhere to strict quality assurance guidelines; familiar with the entire software development lifecycle

Coordinate onshore and offshore teams: resource allocation, project planning, forecasting, scheduling, and reporting

Ability to quickly assimilate new technologies
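The Page-Object Model mentioned above can be sketched framework-agnostically as follows; the `Driver` class here is an illustrative stand-in for Selenium's WebDriver, not its real API:

```java
import java.util.HashMap;
import java.util.Map;

// Minimal stand-in for a browser driver; in a real project this
// role is played by Selenium's WebDriver (an assumption here).
class Driver {
    private final Map<String, String> fields = new HashMap<>();
    void type(String locator, String text) { fields.put(locator, text); }
    String read(String locator) { return fields.getOrDefault(locator, ""); }
}

// Page object: locators and page actions live in one class, so
// tests never touch raw locators and a UI change is fixed in one place.
class LoginPage {
    private static final String USER = "id=username";
    private static final String PASS = "id=password";
    private final Driver driver;
    LoginPage(Driver driver) { this.driver = driver; }

    LoginPage enterCredentials(String user, String pass) {
        driver.type(USER, user);
        driver.type(PASS, pass);
        return this; // fluent chaining, common in POM frameworks
    }
    String typedUsername() { return driver.read(USER); }
}

public class PomSketch {
    public static void main(String[] args) {
        LoginPage page = new LoginPage(new Driver());
        page.enterCredentials("qa_user", "secret");
        System.out.println(page.typedUsername()); // prints "qa_user"
    }
}
```

The key design choice is that test code talks only to `LoginPage`, never to locators directly.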

Technical skills:

Testing tools

Selenium WebDriver, LoadRunner, VuGen, Performance Center, QTP/UFT, JIRA, SoapUI, Quality Center

Programming/Scripting Languages

C/C++, Unix Shell scripting, XML, HTML, DHTML, JavaScript, Java, VB Script


Databases

MS SQL Server, MS Access, Oracle, DB2, MySQL, Hadoop

Monitoring Tools

SiteScope, Perfmon, Diagnostics, WebSphere Console, HTTP Watcher, IBM Tivoli, Wily Introscope, AppDynamics

Servers Monitored

WebSphere server, IIS server, database server, AIX

Operating Systems

Windows, UNIX
Other tools

WinSCP, Putty, Quest Central, Visual SourceSafe, Git, SourceTree, Jenkins

Educational Qualification and Certifications:

Bachelor of Technology, Jawaharlal Nehru Technological University, India.

Certified by ISEB (Information Systems Examinations Board)


Comcast Spotlight, PA Nov 2013 – Present

Sr. QA Automation Engineer

Comcast Corporation is a global media and technology company with two primary businesses, Comcast Cable and NBCUniversal. Comcast Cable is the nation's largest video, high-speed Internet, and phone provider to residential customers under the XFINITY brand, and also provides these services to businesses. Comcast Spotlight is the advertising sales division of Comcast Cable, helping put the power of cable to use for local, regional, and national advertisers. Provided QA services for the following applications:

Analytics, Central Billing, VOD-DAI, Online Bill Payment, OperativeOne, SalesLook


Develop automation scripts using Selenium WebDriver/Java/TestNG/Maven

Enhance numerous test scripts to handle changes in the objects, application, and testing environment using Selenium.

Implement the Page-Object Model (POM) pattern for Selenium projects

Develop Cucumber feature files (Gherkin) based on scenarios gleaned from business stakeholders

Implement Cucumber feature step definitions using Java and the Page Object Model.

Perform Input Validations, User Interface Validations, Browser Compatibility testing and Navigation testing.

Execute Selenium automation scripts on different browsers/environments and share Surefire/HTML reports with the concerned teams.

Maintain the Selenium Java Automation code and resources in GitHub

Use GitHub, SourceTree, and Jenkins for continuous integration (CI)

Hands on experience on Maven build tool

Familiar with Selenium Webdriver advanced APIs

Follow Agile methodology – Scrum and Kanban

Use JIRA for Agile and bug tracking

Use SoapUI for web service testing (SOAP and REST APIs)

Work with various protocols like Web (HTTP/HTML), Web (Click & Script), Ajax (Click & Script), Ajax TruClient, and LR Java for performance testing using HP Performance Center/ALM

Perform various performance tests like Baseline, Load, Stress, Endurance, Volume, Spike, Failover and Fallback testing

Ramp up virtual users in a load test to reach a peak load of 3,000 concurrent users

Verify application performance using Introscope/SiteScope monitoring tools and UNIX server metrics after each load test

Use SQL Developer and Toad for validating backend databases

Check data in HBase and Hive

Use Quality Center/ALM to manage the entire QA effort, from test planning to generating test reports after execution (before the Agile methodology was adopted)

Environment: Selenium WebDriver, JIRA, SoapUI, Windows 7, LoadRunner 11.52, Performance Center 11.5, Oracle, SQL Server, Hadoop
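A representative Cucumber feature file of the kind described above; the feature, customer, and step wording are illustrative placeholders, not taken from the actual Comcast Spotlight suites:

```gherkin
Feature: Online bill payment
  As an XFINITY customer
  I want to pay my bill online
  So that I do not have to mail a check

  Scenario: Successful one-time payment
    Given a customer is signed in to the billing portal
    When the customer submits a payment of "$50.00"
    Then a confirmation number is displayed
```

Each `Given`/`When`/`Then` line maps to a Java step-definition method, which in turn calls into the page objects.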


Accenture, Charlotte, NC, USA

Technical/Performance Test Lead

Client: Bank of America Mar ’09 – Sep ’12

Bank of America’s Copernicus Generation Project is focused on retiring the existing “Know The Customer” application to provide complete, accurate, and secure information about all customers to all channels and all sales and service processes. The project’s objective is to significantly lower the operational costs of the current environment.


Managed major modules (OFFER, CNE, EServices, WebSphere Customer Center, IDV) under the COPERNICUS space and led a team of ten.

Managed Onshore, Offshore, Nearshore teams.

Involved in forecasting and resource planning activities.

Worked closely with Business partners in capturing and understanding requirements.

Created the test plan and published it for customer sign-off.

Understood the application architecture in order to conduct performance and negative/failover tests.

Designed performance test scenarios that represent real production conditions.

Created and enhanced LoadRunner scripts

Executed load tests

Created scripts by plugging in XMLs

Created Performance Center scenarios and a regression suite based on the requirements to ensure response times stayed within SLA

Tuned scenarios using the performance model

Executed long-duration, bell-curve, stress, and scalability tests using Performance Center.

Cleaned up queues using MQ self-service scripts (“pbrun” commands) on AIX boxes and bounced servers using the WebSphere console.

Monitored queue depth using WebSphere MQ Explorer and MQ self-service scripts.

Analyzed results using application logs, CICS reports, and Introscope and Diagnostics graphs, and developed observations and recommendations based on the performance test results.

Published final results to the business partners and conducted walkthroughs of the final results document.

Created Introscope dashboards to monitor user defined metrics

Specialized in assessing the application to identify the business critical functions that need to be Performance evaluated.

Involved in meetings with testing delivery managers and architects to build a scaled-down testing environment for new applications.

Conducted performance testing on OLB web-based applications

Developed SQL queries to pull data based on the requirements

Trained in, and gained hands-on experience with, automating the functional test suite using Selenium WebDriver (Java)

Implemented Data-Driven and Hybrid frameworks for the “Eservices” application

Executed Selenium automation scripts on different browsers/environments and reported defects/results to the concerned teams.

Hands-on experience with the Maven build tool

Coordinated nearshore and offshore teams

Worked with users to ensure quality browsing experience

Environment: Load Runner 9.X, Performance Center, Diagnostics, Java, J2EE, XML, Shell scripts, Introscope, UNIX, Selenium.
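The data-driven framework mentioned above can be sketched as follows; the validation rule and data rows here are invented for illustration (real frameworks typically pull rows from a CSV or Excel sheet and feed them to a TestNG DataProvider):

```java
import java.util.List;

public class DataDrivenSketch {
    // In a data-driven framework the same test logic runs once per
    // data row, so new cases are added as data, not as code.
    static boolean isValidLogin(String user, String pass) {
        // Illustrative rule: non-empty user, password of 8+ chars.
        return !user.isEmpty() && pass.length() >= 8;
    }

    public static void main(String[] args) {
        // Stand-in for rows read from an external data file:
        // username, password, expected result.
        List<String[]> rows = List.of(
            new String[]{"qa_user", "Passw0rd!", "true"},
            new String[]{"", "Passw0rd!", "false"},
            new String[]{"qa_user", "short", "false"});

        for (String[] row : rows) {
            boolean actual = isValidLogin(row[0], row[1]);
            boolean expected = Boolean.parseBoolean(row[2]);
            System.out.println(row[0] + " -> " + (actual == expected ? "PASS" : "FAIL"));
        }
    }
}
```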

Customer Notification Engine

Accenture, Charlotte, NC, USA

Performance Test Engineer

Client: Bank of America Jan ’08 – Mar ’09

CNE (Customer Notification Engine) came into the picture as part of moving alerts functionality onto the target customer-solutions infrastructure. It serves as the ‘hub’ of Online Banking eAlerts and ODAO eAlerts: it stores all alert preferences set up by the customer and has multiple processes that support creating alerts in both online and batch mode.


Created test plan and test conditions based on the requirements.

Worked with Business partners in understanding High level and low level diagrams to perform Failover and negative testing activities

Created virtual user scripts and scenarios based on the requirements

Executed regression tests, breaking-point tests, load tests, and volume tests using Performance Center

Executed scalability tests after introduction of new servers

Logged and traced defects and reported defect metrics using Quality Center

Monitored message queues using Tivoli

Cleaned up the JVM queues using MQ self-service scripts and local queues using MQ Explorer

Monitored technical testing dashboards for queue replication activities, DB errors, API error count

Prioritized testing efforts based on release plan, analyze and interpret test results, finalize the test plan and get approvals

Involved in monitoring different metrics on the web servers and application servers using Wily Introscope to identify threads, connections, memory allocations, heap dumps, and heap size

Used manual and automatic correlation and parameterization techniques when generating LoadRunner test scripts

Environment: Performance Center 9.1, Quality Center 9.2, Tivoli, Wily Introscope 7.1, IBM MQ, SiteScope 9.02
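Correlation, mentioned above, means capturing a dynamic server value (such as a session token) from one response and replaying it in the next request; in VuGen this is typically done with `web_reg_save_param` and left/right boundaries. The same idea can be sketched in plain Java; the response HTML and token format below are invented for illustration:

```java
import java.util.regex.Matcher;
import java.util.regex.Pattern;

public class CorrelationSketch {
    // Extract the dynamic value between known left/right boundaries,
    // analogous to LoadRunner's "LB=..." / "RB=..." arguments.
    static String capture(String response, String lb, String rb) {
        Pattern p = Pattern.compile(Pattern.quote(lb) + "(.*?)" + Pattern.quote(rb));
        Matcher m = p.matcher(response);
        return m.find() ? m.group(1) : null;
    }

    public static void main(String[] args) {
        // Stand-in for the HTML returned by a recorded login request.
        String response = "<input name=\"sessionId\" value=\"AB12CD\">";
        // The captured token replaces the hard-coded recorded value
        // in the next request, so the script works on every replay.
        String token = capture(response, "value=\"", "\">");
        System.out.println(token); // prints "AB12CD"
    }
}
```

Parameterization is the complementary technique: replacing recorded literals (user IDs, account numbers) with values drawn from a data file so each virtual user sends distinct data.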

Know the Customer (KTC)

Accenture, Bangalore, India

Performance Test Engineer

Client: Bank of America May ’05 - Jan ’08

Know The Customer (KTC) involves capturing customer data from across the enterprise, consolidating all internally and externally acquired customer-related data in a central database. Various interfaces talk to KTC to retrieve and store the information. Some of the interfaces are web channel, BOSS, COIN (Customer Online Information Network) etc. Web interfaces have to pass through iPlanet web server and others through IBM MQ series.


Created Regression approach Document

Prepared Test Conditions with the help of HLD and LLD

Mined data using Quest Central

Created Performance Model

Created Vuser Scripts and Scenarios

Created MQ Load Runner scripts

Executed load tests: baseline load and peak production mix

Executed failover tests to find FCIs (Failed Customer Interactions) when a component is down and to monitor the component’s recovery time

Executed environment shake-out, short-duration, and long-duration tests.

Analyzed SiteScope and Introscope graphs and application logs

Analyzed Performance Center results in order to check the performance of APIs

Pulled the Records and Reports for Defect meeting, Status meeting, business meeting using Quality Center

Environment: LoadRunner 7.8, Test Director/Quality Center, Tivoli, Quest Central, SiteScope, Java, webMethods, Oracle.
