Balaji Gunasekaran Selvi
Email: ***********.******@*****.***
Skype Id – gunasekaran.balaji1983
PROFESSIONAL SUMMARY:
Accomplished, highly motivated technology professional with 13 years of hands-on IT and quality assurance experience, specializing in both Functional and Performance testing.
11 years of experience in performance testing.
2 years of experience in UI automation testing.
Proficient in performance engineering and performance testing concepts, the performance testing life cycle, and methodologies in the Telecom domain.
Hands-on experience in the design and execution of automation test scripts using tools such as Silk Performer, HP LoadRunner, Performance Center, Quality Center, and Silk Central.
Functional testing and UI testing on mobile devices and tablets across different operating systems (iOS, Android, Blackberry, Windows); UI testing on different desktop browsers (IE11+, Microsoft Edge, Safari, Firefox, Chrome).
Responsibilities as a Test Analyst include assignments in regression testing, performance testing, PIT (Project Integration Testing), SIT (System Integration Testing), and UAT (User Acceptance Testing).
Hands-on experience with multiple protocols such as HTTP/HTML and Web Services.
Good exposure to the design and execution of automation test scripts using Silk Performer and HP LoadRunner.
Hands-on experience in developing scripts using VuGen and the Silk Performer Workbench.
Good experience in maintaining and executing scripts in HP LoadRunner, Silk Performer, Quality Center, and JMeter (an illustrative load-scenario sketch follows this summary).
Good experience in preparation of performance test plans for both Web applications and Web services.
Solid knowledge of Software quality assurance processes and procedures that includes test planning, design, development, execution and evaluation phases, set up and execution of performance testing.
Responsible for writing and running manual test cases in Microsoft Excel and Microsoft Word.
Experience in analyzing server-side performance using performance counters on both Windows and Linux servers and identifying bottlenecks, if any.
Effectively tracked all bugs and updated them in bug-tracking tools such as Silk Central, Quality Center, and CMIS Web.
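The load-scenario sketch referenced in the summary above is shown here. It is not a LoadRunner VuGen or Silk Performer script; it is a minimal Java illustration of the kind of concurrent-virtual-user HTTP workload those tools script, using only the JDK 11+ HttpClient. The target URL, user count, and iteration count are hypothetical placeholders.

// Minimal sketch only: models the kind of concurrent-virtual-user HTTP workload
// normally scripted in LoadRunner VuGen or Silk Performer. Uses only the JDK 11+
// HttpClient; the target URL, user count, and iteration count are placeholders.
import java.net.URI;
import java.net.http.HttpClient;
import java.net.http.HttpRequest;
import java.net.http.HttpResponse;
import java.util.concurrent.ExecutorService;
import java.util.concurrent.Executors;
import java.util.concurrent.TimeUnit;

public class SimpleLoadScenario {
    private static final String TARGET_URL = "https://example.com/app/login"; // placeholder
    private static final int VIRTUAL_USERS = 10;
    private static final int ITERATIONS_PER_USER = 5;

    public static void main(String[] args) throws InterruptedException {
        HttpClient client = HttpClient.newHttpClient();
        ExecutorService pool = Executors.newFixedThreadPool(VIRTUAL_USERS);

        for (int u = 0; u < VIRTUAL_USERS; u++) {
            pool.submit(() -> {
                for (int i = 0; i < ITERATIONS_PER_USER; i++) {
                    HttpRequest request = HttpRequest.newBuilder(URI.create(TARGET_URL)).GET().build();
                    long start = System.nanoTime();
                    try {
                        // Each timed request corresponds to what a named transaction
                        // would capture in a LoadRunner or Silk Performer script.
                        HttpResponse<String> response =
                                client.send(request, HttpResponse.BodyHandlers.ofString());
                        long elapsedMs = (System.nanoTime() - start) / 1_000_000;
                        System.out.printf("status=%d responseTime=%dms%n",
                                response.statusCode(), elapsedMs);
                    } catch (Exception e) {
                        System.err.println("request failed: " + e.getMessage());
                    }
                }
            });
        }
        pool.shutdown();
        pool.awaitTermination(5, TimeUnit.MINUTES);
    }
}

In a real LoadRunner scenario the timed block would be wrapped in a named transaction and driven from Performance Center load generators rather than a local thread pool.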
TECHNICAL SKILLS:
Primary Skills: Performance Testing & Functional UI Automation Testing
Test Management Tools: Silk Performer, HP LoadRunner 12.02/11.5, Silk Central, Quality Center 11.0, Performance Center 11, JMeter, Selenium
Operating Systems: Windows NT, Windows 2000/XP, Unix, z/OS
Scripts and Languages: Visual Basic 6.0, VB.Net, ASP.Net, Java basics
Databases: SQL Server 2000, MS Access, Oracle (AWR reports)
Documentation: MS Word and MS Excel
Server Monitoring Tools: HP SiteScope 11.0, WebLogic Console, JVM, Introscope, HttpWatch
Protocols: HTTP/HTML, Web Services, MQ Series, CEM, Ajax TruClient
Other Tools: Applitools
Testing Phases: Black Box / White Box / Grey Box, Functional, Integration, GUI Navigation, Regression, Performance, Stress and Load, User Acceptance
PROFESSIONAL EXPERIENCE:
Salesforce – San Francisco, CA
ROLE: QA Engineer Dec 2015 – Present
PROJECT DETAILS:
Q3 Team – SFX (Lightning Reports and Dashboards & Wave):
The Q3 engineering team is focused on automated validation of key customer scenarios for Wave and Operational Analytics builds before the builds are released to production. Each customer scenario is ideally validated with all services up and running, without mocking any responses, and on real Salesforce data in the org.
KEY RESPONSIBILITIES:
Functional and UI automation testing on web applications (see the Selenium sketch after this list).
Work on defining key customer scenarios across Wave and Operational Analytics. This list feeds into the automation work the Q3 team does, as well as the exploratory testing that the Q3 team and scrum teams run periodically; the same list is used during Blitzes to get more coverage.
Ensure the build is running, create work items for unassigned failures, and track flappers on these autobuilds.
Testing on real mobile devices and tablets of different operating systems (iOS, Android, Blackberry, Windows).
Organize platform Blitzes and Q3 VAT sessions to provide a forum to discuss upcoming release features and ensure the new features are well integrated. Provide guidance to the individual scrum teams on the testing they need to do and evaluate whether Q3 needs to enhance its automation to test new features.
Focus on discovering existing E2E automation buckets and analyzing them further to evaluate whether they can help in validating key customer scenarios in builds.
Create net-new automation from scratch if existing automation does not meet the requirements. The automated tests owned by Q3 are grouped and added to Q3-owned jobs that run against builds.
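The Selenium sketch referenced in the list above. It assumes the Selenium 4 Java bindings and a local chromedriver setup; the org URL, CSS locator, and class name are hypothetical placeholders rather than actual Salesforce or Q3 internals, and a production test would run through the Q3-owned jobs rather than a main method.

// Minimal Selenium sketch: open a page and wait for a dashboard container to
// render. Selenium 4 Java bindings and a local chromedriver are assumed; the
// URL and CSS locator are hypothetical placeholders, not Salesforce internals.
import java.time.Duration;
import org.openqa.selenium.By;
import org.openqa.selenium.WebDriver;
import org.openqa.selenium.WebElement;
import org.openqa.selenium.chrome.ChromeDriver;
import org.openqa.selenium.support.ui.ExpectedConditions;
import org.openqa.selenium.support.ui.WebDriverWait;

public class DashboardSmokeTest {
    public static void main(String[] args) {
        WebDriver driver = new ChromeDriver();
        try {
            driver.get("https://example.my.salesforce.com"); // placeholder org URL
            WebDriverWait wait = new WebDriverWait(driver, Duration.ofSeconds(30));
            // Wait until the dashboard container is visible (placeholder locator).
            WebElement dashboard = wait.until(
                    ExpectedConditions.visibilityOfElementLocated(By.cssSelector(".dashboard-container")));
            // A real key-scenario test would assert on widget contents against
            // known org data; this sketch only confirms the page rendered.
            System.out.println("Dashboard rendered: " + dashboard.isDisplayed());
        } finally {
            driver.quit();
        }
    }
}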
VERIZON – Tampa, FL
ROLE: SENIOR PERFORMANCE QA SPECIALIST Apr 2015 – Dec 2015
PROJECT DETAILS:
VEC (VERIZON ENTERPRISE CENTER):
Verizon's online customer self-service tool, allowing customers to manage transactions online such as orders, invoices, and trouble tickets.
VIMPACT-REQUESTNET:
Single source for mechanizing and automating field sales requests for high-capacity services; it processes these requests and generates a response back to the originating sales/customer service personnel or originator.
Environment:
Java, JSP, MXML, J2EE, Flex Builder, Adobe Flex 3.0, Adobe Flash Player 9.0, Windows 2000/XP/Vista/NT, Firefox 2.0, Oracle, JVM, JSF, WebLogic
Testing Tools: LoadRunner 12.02, Performance Center 11, Silk Performer 15.5, Quality Center 9.0
Monitoring Tools: SiteScope 11.0, WebLogic Console, vmstat (from the production support team), Introscope
Database Monitoring: Oracle AWR reports
KEY RESPONSIBILITIES:
Interact with cross-functional teams regarding project and testing status.
Providing leadership throughout the performance testing process by identifying core test strategies, plans, and cases, and by analyzing design documents to validate business and technical needs.
Script development for the existing functionality and for add-on functionality per release.
Test and measurement activity per release to ensure all performance-related parameters are fully satisfied.
Business requirements analysis, risk analysis of requirements, and tests to mitigate the identified risks.
Create the test plan, strategy document, and master test scenarios with an overall integration test approach.
Studying and analyzing the functionalities of the work requests.
Communication with business analysts, technical leads, and application teams to get issues resolved.
Hands-on experience in developing scripts using VuGen; execute the tests using Performance Center (Load Generator and Controller).
Performance testing of web applications and web services APIs using Silk Performer and LoadRunner, and monitoring server-side performance using PerfMon on Windows and SAR on Linux platforms (see the monitoring sketch after this list). Supported end-to-end performance testing. Monitoring JVM machines.
Experience in analyzing server-side performance using performance counters on Windows and identifying bottlenecks, if any.
Hands-on experience in processing AWR reports and WebLogic console reports and in the analysis of test results.
Defect logging and reporting with Silk Central and Quality Center.
Test data analysis and guidelines for production-like data creation as per UAT needs.
Report preparation for testing and knowledge sharing across the team.
Preparing the Test Summary Report, result analysis, review, and sign-off.
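The monitoring sketch referenced in the list above: a minimal Java illustration of capturing SAR CPU samples on a Linux host during a test window. It assumes the sysstat package is installed so the sar command is available; the interval and sample count are arbitrary, and in practice such samples were reviewed alongside LoadRunner/Silk Performer results and perfmon counters.

// Minimal sketch: collect SAR CPU samples on a Linux host during a test window.
// Assumes the sysstat package provides the sar command; interval and count are
// arbitrary. Output would be archived alongside the load test results.
import java.io.BufferedReader;
import java.io.InputStreamReader;

public class SarCpuSampler {
    public static void main(String[] args) throws Exception {
        // "sar -u 1 5" reports overall CPU utilization every second, five times.
        ProcessBuilder pb = new ProcessBuilder("sar", "-u", "1", "5");
        pb.redirectErrorStream(true);
        Process process = pb.start();
        try (BufferedReader reader =
                     new BufferedReader(new InputStreamReader(process.getInputStream()))) {
            String line;
            while ((line = reader.readLine()) != null) {
                System.out.println(line); // raw samples; parse or graph as needed
            }
        }
        process.waitFor();
    }
}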
VERIZON DATA SERVICES – India
ROLE: SENIOR PERFORMANCE QA SPECIALIST Feb 2007 – Apr 2015
PROJECT DETAILS:
BDMS:
Broadband Data Management System application is intended to support FTTP technologies. BDMS consists of workflow and reporting functionality based on engineering data from multiple applications.
ICONFIG:
High-performance, high-availability SIP/HTTP(S)/J2EE server used to provision and activate customer premises equipment for FTTP SIP / FiOS voice. iConfig is used to provision and activate ONTs.
Environment:
Java, JSP, MXML, J2EE, Flex Builder, Adobe Flex 3.0, Adobe Flash Player 9.0, Windows 2000/XP/Vista/NT, Firefox 2.0, Oracle, JVM, JSF, WebLogic
Testing Tools: LoadRunner 12.02, Performance Center 11, Silk Performer 15.5, Quality Center 9.0
Monitoring Tools: SiteScope 11.0, WebLogic Console, vmstat (from the production support team), Introscope
Database Monitoring: Oracle AWR reports
KEY RESPONSIBILITIES:
Strategic analysis of each application release with respect to performance-related parameters.
Script development for the existing functionality and for add-on functionality per release.
Test and measurement activity per release to ensure all performance-related parameters are fully satisfied.
Business requirements analysis, risk analysis of requirements, and tests to mitigate the identified risks.
Create the test plan, strategy document, and master test scenarios with an overall integration test approach.
Studying and analyzing the functionalities of the work requests.
Hands-on experience in developing scripts using VuGen; execute the tests using Performance Center (Load Generator and Controller).
Communication with business analysts, technical leads, and application teams to get issues resolved.
Performance testing of web applications and web services APIs using Silk Performer and LoadRunner, and monitoring server-side performance using PerfMon on Windows and SAR on Linux platforms. Supported end-to-end performance testing (a response-time measurement sketch follows this list).
Experience in analyzing server-side performance using performance counters on Windows servers and identifying bottlenecks, if any.
Communicate with support and development teams to get issues resolved.
Defect logging and reporting with Silk Central and Quality Center.
Test data analysis and guidelines for production-like data creation as per UAT needs.
Report preparation for testing and knowledge sharing across the team.
Preparing the Test Summary Report, result analysis, review, and sign-off.
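The response-time measurement sketch referenced in the list above. It complements the earlier load-scenario sketch by summarizing latency for a single web service endpoint as an average and a 90th percentile; the endpoint URL, sample count, and class name are hypothetical placeholders, and the actual release measurements came from Performance Center / Silk Performer scenarios.

// Minimal sketch: sample a web service endpoint sequentially and report the
// average and 90th-percentile response time. Endpoint and sample count are
// hypothetical placeholders for the SLA checks run per release.
import java.net.URI;
import java.net.http.HttpClient;
import java.net.http.HttpRequest;
import java.net.http.HttpResponse;
import java.util.ArrayList;
import java.util.Collections;
import java.util.List;

public class ResponseTimeSampler {
    public static void main(String[] args) throws Exception {
        String endpoint = "https://example.com/api/status"; // placeholder
        int samples = 50;
        HttpClient client = HttpClient.newHttpClient();
        List<Long> timingsMs = new ArrayList<>();

        for (int i = 0; i < samples; i++) {
            HttpRequest request = HttpRequest.newBuilder(URI.create(endpoint)).GET().build();
            long start = System.nanoTime();
            client.send(request, HttpResponse.BodyHandlers.discarding());
            timingsMs.add((System.nanoTime() - start) / 1_000_000);
        }

        Collections.sort(timingsMs);
        double avg = timingsMs.stream().mapToLong(Long::longValue).average().orElse(0);
        long p90 = timingsMs.get((int) Math.ceil(0.90 * samples) - 1);
        System.out.printf("avg=%.1fms p90=%dms over %d samples%n", avg, p90, samples);
    }
}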
MICROBASE TECHNOLOGIES
ROLE: SOFTWARE ENGINEER Jul 2004 – Jan 2007
KEY RESPONSIBILITIES:
Design, coding, and integration of the various modules in the product.
Developed test plans and test cases based on functional specifications and design documents.
Maintained requirements and created traceability between requirements and test cases.
Involved in analysis and design of the database, screen design, coding, and testing in different sub-modules.
Involved in coding for generating demand, issue to personnel, issue of salvage items to the salvage depot, generation of quotations and comparative statements, and generation of different reports.
Testing of the individual modules in the system to check for unit dependencies between the modules.
Testing the overall system for the integration of the different modules with each other.
Performed the tasks of evaluating the software testing process and participated in various IT testing activities.
ACADEMIC PROFILE:
B.E. Computer Science – St. Peter's Engineering College, University of Madras, 2004