
Web Services Test

Location:
Herndon, VA
Posted:
July 31, 2017

Contact this candidate

Resume:

Laxman Kandumalla

ac1lac@r.postjobfree.com

Phone#: 703-***-****

Summary:

Overall 14+ years of IT experience in Automation Testing, Performance Testing, analysis, development, production support, implementation, and quality assurance of web and multi-tier applications on UNIX and Windows environments, across domains including Federal, Mortgage, Insurance, Telecommunications, and Financial.

Hands on experience with Selenium WebDriver and Performance Testing with LoadRunner and JMeter.

Good experience developing automated tests with Selenium WebDriver, JUnit, TestNG, data-driven and Page Object frameworks, and Robot Framework. Good experience with programming languages such as Java, C, XML, and SQL.
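
A minimal sketch of the Page Object pattern mentioned above, with a stubbed driver standing in for Selenium's WebDriver so the example runs without a browser (the `FakeDriver`, `LoginPage` class, and its locators are hypothetical illustrations, not taken from any project listed here):

```python
# Page Object pattern sketch. In real Selenium code the driver would be
# selenium.webdriver.Chrome(); a fake driver stands in here so the
# example is self-contained.

class FakeElement:
    """Canned element that records interactions."""
    def __init__(self):
        self.value = ""
        self.clicked = False
    def send_keys(self, text):
        self.value += text
    def click(self):
        self.clicked = True

class FakeDriver:
    """Maps (by, locator) pairs to canned elements."""
    def __init__(self, elements):
        self.elements = elements
    def find_element(self, by, locator):
        return self.elements[(by, locator)]

class LoginPage:
    """Page Object: locators and page actions live together in one class."""
    USER = ("id", "username")
    PASSWORD = ("id", "password")
    SUBMIT = ("xpath", "//button[@type='submit']")

    def __init__(self, driver):
        self.driver = driver

    def login(self, user, password):
        self.driver.find_element(*self.USER).send_keys(user)
        self.driver.find_element(*self.PASSWORD).send_keys(password)
        self.driver.find_element(*self.SUBMIT).click()

elements = {LoginPage.USER: FakeElement(),
            LoginPage.PASSWORD: FakeElement(),
            LoginPage.SUBMIT: FakeElement()}
LoginPage(FakeDriver(elements)).login("tester", "secret")
```

The point of the pattern is that tests call `login()` and never touch locators directly, so a UI change is fixed in one place.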

Familiar with OOP concepts, with a good understanding of J2EE architecture, SOA architecture, and DevOps SDLC practices.

Good experience with JMeter, SoapUI, SOAP Web Services, and RESTful Web Services.

Good experience identifying workloads; performing load, stress, and endurance tests with HP Performance Center; monitoring with CA Wily Introscope and SiteScope; and identifying application bottlenecks.

Good experience with Web (HTTP/HTML), Web Services, AJAX, SAP-Web, and Oracle NCA protocols.

Ability to handle baseline and comparison performance tests from beginning to end (test planning, VuGen scripting, load test scenario execution, analysis, and reporting of results).
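
The analysis-and-reporting step of such a test can be sketched as below. The response-time samples are invented for illustration, and the simple nearest-rank percentile is one of several conventions a real tool might use:

```python
# Summarize raw response-time samples into the metrics usually reported
# from a load test: average, 90th percentile, and throughput (TPS).

def summarize(samples_ms, duration_s):
    ordered = sorted(samples_ms)
    avg = sum(ordered) / len(ordered)
    # Nearest-rank 90th percentile: the value 90% of samples fall at or below.
    p90 = ordered[int(0.9 * len(ordered))]
    tps = len(ordered) / duration_s  # transactions per second
    return {"avg_ms": round(avg, 1), "p90_ms": p90, "tps": round(tps, 2)}

# Invented baseline samples; a comparison run would be summarized the
# same way and the two dicts compared side by side.
samples = [120, 135, 150, 110, 480, 140, 125, 900, 130, 145]
print(summarize(samples, duration_s=5))
```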

In depth knowledge of architectures and technologies as they relate to performance testing.

Good experience in production support and system support during IST/E2E/UAT and Production Implementation.

Knowledge of all phases of the Software Development Life Cycle (SDLC) and project life cycle, including software quality standards, configuration management, change management, and QA procedures.

Expertise in mocking XML test data and flat-file data using mapping documents and control files for various systems.
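
Mocking XML test data from a field-to-value mapping can be sketched like this; the `Order` record and its field names are hypothetical, standing in for whatever a mapping document would define:

```python
# Build a mock XML record from a mapping of field -> value, as one might
# when fabricating feeds for a downstream system under test.
import xml.etree.ElementTree as ET

def mock_record(root_tag, mapping):
    root = ET.Element(root_tag)
    for field, value in mapping.items():
        ET.SubElement(root, field).text = str(value)
    return ET.tostring(root, encoding="unicode")

# Hypothetical record shape; a real one would follow the mapping document.
xml = mock_record("Order", {"LoanId": "0000123", "State": "VA", "Amount": 250000})
print(xml)
```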

Experience testing in Agile and highly iterative software development environments.

Proficient in UNIX environments and AutoSys, with knowledge of shell scripting.

Experience with defect tracking processes and source code management tools.

Ability to interact with developers and product analysts regarding Testing status.

Ability to come up to speed quickly by learning the system and business rules.

Able to work on complex quality assurance tasks where a variety of factors may impact results; able to work independently and take initiative, as well as actively participate in team efforts.

Excellent analytical, root cause analysis and troubleshooting skills and ability to work with others in identifying major issues during the testing phases.

Excellent analytical, programming abilities to create elegant, flexible and maintainable solutions.

Technical skills

Operating Systems:

Windows NT/2000/XP, UNIX, AIX 5.2, Sun Solaris, Linux

Testing Tools:

Selenium WebDriver, LoadRunner, Performance Center, JMeter, SoapUI, WebLoad, Wily Introscope, HP SiteScope, ALM, Dynatrace, JProfiler, BlazeMeter.

Languages:

JAVA, C, SQL, PL/SQL, HTML, XML

Scripting Languages:

JavaScript, UNIX Shell Scripting.

Databases:

ORACLE, SQL Server, MySQL, DB2, Teradata, MS Access

Web, Application Servers:

BEA WebLogic, IBM WebSphere, Apache Tomcat, JBoss, IIS

Version Control:

GitHub, SVN, Clear Case, SourceTree, CM Synergy

Management & Tracking Tools:

ALM, Quality Center (QC), JIRA, ClearQuest, VersionOne, Remedy

Technologies:

Java, JSP, Servlets, JDBC, HTML, DHTML, XML, API, JSON, WML, Ajax, Middleware (JMS, MQSeries, RMI, SOA), Web Services (SOAP, REST), Selenium WebDriver, Selenium Grid, TestNG, Cucumber, BDD, Ruby, Jenkins, Splunk, DevOps, Maven, Git, GitHub, Bamboo, VisualVM, Eclipse, AWS, UFT, SOA Tools, AutoSys, HP Service Test, Applets, RUP, UML, LDAP, DNS, Data Warehouse, DataStage, Agile Methodology, Mainframe, TIBCO.

Protocols:

WEB (HTTP/HTML), Web Services, Oracle NCA, SAP-WEB, AJAX, Citrix, RTE, RDP, Java Vuser, TCP/IP, IIOP, VOIP, SNMP.

Education Details:

Bachelor of Engineering (Electrical & Electronics Engineering), Sri Venkateswara University, Tirupati, India.

Professional Experience

Vixxo, Baltimore, MD Feb 2017 – Present

Automation Test Engineer

Description: Vixxo offers full-service asset management and maintenance solutions to multi-site facilities across North America. By combining emerging technologies with its extensive network of service providers, Vixxo delivers operational efficiency, detailed analytics, and significant cost savings to its clients.

Responsibilities:

Gathered requirements and project-related documents to develop test plans and scenarios.

Developed Functional Automation scripts with Selenium WebDriver, Eclipse, Java.

Developed PageObjects, TestNG, DataDriven and Robot Framework.

Used web-debugging tools such as Firebug and FirePath to build XPath expressions and locate web elements.

Developed Selenium Java automation scripts using XPath for complex situations.
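
XPath-based element locating, the technique in the bullets above, can be illustrated against a small snippet of markup using the standard library's limited XPath support (the form markup here is invented):

```python
# Locate an element by a relative XPath with an attribute predicate --
# the same kind of expression one would pass to Selenium's
# driver.find_element(By.XPATH, ...).
import xml.etree.ElementTree as ET

page = ET.fromstring("""
<form>
  <div class="row"><input id="email" type="text"/></div>
  <div class="row"><button type="submit">Save</button></div>
</form>""")

button = page.find(".//button[@type='submit']")
print(button.text)
```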

Integrated automation scripts (Selenium WebDriver API) into the continuous integration tool Jenkins for nightly batch runs. Created the project job in Jenkins and configured scheduled runs using a cron expression.

Used Maven to build the project, GitHub version control for the code repository, and VersionOne for test defects.

Developed DataProvider test data objects for multiple test runs and captured screenshot results.

Automated cross-browser testing and created custom reports.

Performed automation tests on multiple environments with Selenium Grid.

Performed REST API testing with DHC using JSON payloads, adding validation checkpoints with assertions.
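
The kind of validation checkpoint applied to a REST API's JSON response can be sketched as below; the payload shape and field names are invented stand-ins for a real service response:

```python
# Parse a (canned) JSON response body and assert on its fields, the way
# checkpoint assertions validate a REST response in API testing.
import json

response_body = '{"workOrder": {"id": 4711, "status": "OPEN", "assignee": "tech-42"}}'

def check_work_order(body):
    data = json.loads(body)
    order = data["workOrder"]
    # Checkpoints: type and allowed-value assertions on key fields.
    assert isinstance(order["id"], int), "id must be numeric"
    assert order["status"] in {"OPEN", "IN_PROGRESS", "CLOSED"}, "unexpected status"
    return order

order = check_work_order(response_body)
print(order["status"])
```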

Performed manual testing for the Vixxo SEI Employee Portal, ORMB, and MyFSN apps, tracked with VersionOne.

Performed load and stress tests using WebLoad and presented performance statistics to the project team.

Analyzed performance test results and worked with technical teams to resolve bottlenecks.

Validated Backend test scenarios using Oracle SQL Developer SQL Queries.

Interacted with project management to discuss the performance test results.

Attended daily stand-up meetings to report the status of automation suites, existing defects, and functionality.

Environment: Selenium WebDriver, Selenium Grid, Eclipse, PyCharm IDE, Jenkins, VersionOne, Apache Tomcat, Maven, Git, GitHub, Web Services, REST APIs, Agile environment, DHC, JXL, POI, AWS, XML, JSON, Python, SoapUI, SQL Developer.

DOD/DTS (Defense Travel System), Fairfax, VA June 2016 – Feb 2017

Sr Performance/Automation Test Engineer

Description: Defense Travel System (DTS) is a fully integrated, end-to-end travel management system that enables DoD travelers to create authorizations (TDY travel orders), prepare reservations, receive approvals, generate travel vouchers, and receive a split reimbursement between their bank accounts and the Government Travel Charge Card.

Responsibilities:

Analyzed requirements and other project related documents to create test plans and scenarios.

Coordinated and collaborated with multiple groups to determine, gather, and verify performance test needs.

Developed VUGen Scripts, correlated dynamic values and load projections needed to simulate virtual users.

Developed test execution scenarios for various test types (smoke, load, stress, endurance) and ran the tests.

Analyzed the Throughput, Hits/Second, and Average Transactions per Second graphs using the Analyzer.

Performed application monitoring with HP SiteScope during performance testing, and analyzed server-side component graphs and heap usage for memory leaks.

Performed monitoring, diagnostics, analysis, tuning and reporting of results to identify and troubleshoot system performance issues and bottlenecks.

Designed and developed test data strategies to drive workloads for performance test reports.

Developed automation scripts with Selenium WebDriver using Eclipse and Java.

Performed automation testing and reports with TestNG Framework and Data Driven Framework.

Performed Web Services performance tests using WSDL, web service calls, and SOAP requests.
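
Hand-building the kind of SOAP request sent during such tests can be sketched as follows; the target namespace and the `GetAuthorization` operation are hypothetical, since a real request's shape would come from the service's WSDL:

```python
# Build a SOAP 1.1 request envelope for a (hypothetical) operation.
import xml.etree.ElementTree as ET

SOAP_NS = "http://schemas.xmlsoap.org/soap/envelope/"

def soap_request(operation, params, ns="http://example.com/travel"):
    env = ET.Element(f"{{{SOAP_NS}}}Envelope")
    body = ET.SubElement(env, f"{{{SOAP_NS}}}Body")
    op = ET.SubElement(body, f"{{{ns}}}{operation}")
    for name, value in params.items():
        ET.SubElement(op, f"{{{ns}}}{name}").text = str(value)
    return ET.tostring(env, encoding="unicode")

# The resulting XML would be POSTed to the service endpoint under load.
print(soap_request("GetAuthorization", {"travelerId": "T-1001"}))
```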

Performed REST Web Services functional testing with SoapUI.

Developed JMeter scripts with samplers, validated with assertions, and monitored results with listeners.

Parameterized input values with the CSV Data Set and correlated dynamic values with the Regular Expression Extractor.
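
The two techniques in the bullet above can be sketched together: parameterizing inputs from CSV rows, and "correlating" a dynamic value by extracting it from a prior response with a regular expression. The CSV fields, the session-token markup, and the request path are all invented for illustration:

```python
# CSV parameterization + regex correlation, as done in JMeter/LoadRunner
# scripts, shown with plain stdlib code.
import csv
import io
import re

# Parameterization: each CSV row supplies one virtual user's inputs.
csv_data = "username,password\nuser1,pw1\nuser2,pw2\n"
rows = list(csv.DictReader(io.StringIO(csv_data)))

# Correlation: extract a server-generated token from a prior response...
login_response = '<input name="sessionToken" value="AB12-XYZ"/>'
token = re.search(r'name="sessionToken" value="([^"]+)"', login_response).group(1)

# ...and replay it in the next request of the script.
next_request = f"/booking/search?user={rows[0]['username']}&token={token}"
print(next_request)
```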

Logged defects in JIRA and followed up with developers until they were fixed.

Performed in-depth analysis to isolate points of failure in the application and troubleshoot performance issues.

Interacted with project managers and management in the development, execution and reporting of test results.

Attended status meetings and reported weekly testing status to the Manager.

Environment: LoadRunner 12.02, Web (HTTP/HTML), Web Services, HP SiteScope, Splunk, Eclipse, Selenium WebDriver, Selenium Grid, TestNG, JMeter, ALM, JIRA, Bamboo, XML, Apache Tomcat, Oracle, Linux, Agile environment, SoapUI, MS Office.

USPTO (United States Patent and Trademark Office), Alexandria, VA Sept 2015 – May 2016

Automation Test Engineer

Description: The United States Patent and Trademark Office (USPTO) is the federal agency for granting U.S. patents and registering trademarks. The USPTO furthers effective IP protection for U.S. innovators and entrepreneurs worldwide by working with other agencies to secure strong IP provisions in free trade and other international agreements.

Responsibilities:

Gathered requirements and project-related documents to develop test plans and scenarios.

Developed detailed user stories out of high level business requirements.

Developed Functional automation scripts with Selenium WebDriver using Eclipse IDE and Java Language.

Developed Selenium Java automation scripts using XPath for complex situations.

Developed TestNG and data-driven frameworks to drive the automation scripts.

Performed Continuous Integration (CI) automation testing with Jenkins, Maven, SVN, and batch files.

Developed DataProvider test data objects for multiple test runs and captured screenshot results.

Automated cross-browser testing and created custom reports.

Performed automation tests on multiple environments with Selenium Grid.

Analyzed PALM project Web Services security enhancements and developed security test scenarios (user authentication, authorization, SQL injection) using the SoapUI tool and the web services WSDL.

Logged defects in the ALM bug tracker and followed up with developers until they were fixed.

Developed JMeter scripts with HTTP samplers (JSON payloads) and SOAP requests, performed validation checkpoints using assertions, and monitored results using listeners.

Parameterized input values with the CSV Data Set Config and correlated dynamic, system-unique values using JMeter's Regular Expression Extractor post-processor.

Performed load and stress tests using BlazeMeter and presented performance statistics to the project team.

Analyzed performance test results and worked with technical teams to resolve bottlenecks.

Monitored the Resources metrics to find the performance bottlenecks of the application.

Interacted with project management to discuss the performance test results.

Attended status meetings and walkthroughs and reported weekly testing status to the Manager.

Analyzed the Throughput, Hits/Second, and Average Transactions per Second graphs using BlazeMeter.

Environment: Selenium WebDriver, Selenium Grid, TestNG, Cucumber, BDD, Ruby, Eclipse IDE, Java, Jenkins, JMeter, Apache Tomcat, Maven, SVN, Web Services, REST API, SoapUI, VMware, AWS, XML, JSON, WSDL, JBoss, LDAP, Linux, Oracle, JIRA.

Social Security Administration, Baltimore, MD Jan 2012 – July 2015

Sr Performance/Automation Test Engineer

Description: The Social Security Administration (SSA) offers online information and services to third parties. SSA provides services to government organizations and authorized individuals to conduct business with Social Security, and provides guidance to employers on reporting wages to Social Security. The Disability Case Processing System (DCPS) is one of SSA's servicing systems; it serves disabled people by analyzing disability cases and processing disability claims.

Responsibilities:

Analyzed business requirements and other project related documents to create test plans and scenarios for SSA Applications like DMA, CFRMS, BIDS, ASA and DCPS.

Parameterized various input values and correlated dynamic unique session IDs with VuGen, simulated concurrent users for load, and created and executed load test scenarios using Performance Center.

Performed test execution, monitoring, diagnostics, analysis, tuning and reporting of results to identify and troubleshoot system performance issues and bottlenecks.

Performed Web Services performance tests using WSDL, web service calls, and SOAP requests, and simulated virtual data with Service Virtualization.

Performed Web Services functional testing with SoapUI, analyzed WSDL, prepared web service XML requests.

Developed JMeter scripts with HTTP samplers (JSON payloads) and SOAP requests, performed validation checkpoints using assertions, and monitored results using listeners.

Extensively used Web (HTTP/HTML), Web Services, and Oracle NCA protocols for SSA applications.

Performed XML API functional and performance testing; monitored WebSphere logs, CPU, and GC heap usage.

Used CA Wily Introscope to monitor SSA production and integration performance testing applications, and analyzed Servlet, JSP, EJB, and DB2 component graphs and heap usage for memory leaks.

Monitored the Resources metrics to find the performance bottlenecks of the application.

Analyzed the Throughput, Hits/Second, and Average Transactions per Second graphs using the Analysis tool.

Created VuGen scripts for test data purposes and performed root cause analysis of application issues.

Worked on production to find the current and projected user volume and transaction density.

Analyzed JVM heap dumps, JVM thread dumps, GC logs, and web access logs.

Performed functional automation testing using Selenium WebDriver with Java language.

Developed Selenium automation scripts using locators such as ID, Name, and XPath for complex situations.

Developed TestNG Framework and Data Driven Framework to perform the automation testing and reports.

Created CSV Test data and developed Data Provider Objects for test runs.

Automated cross-browser testing and created custom reports.

Performed Continuous Integration (CI) automation testing with Jenkins and batch files.

Developed JDBC database connections to validate back-end database tests.
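
The back-end validation idea above can be illustrated with the standard library's sqlite3 standing in for a real JDBC/Oracle connection; the `claims` table and its row are invented test fixtures:

```python
# Back-end checkpoint: after a UI action, query the database directly
# and assert the row is in the expected state. sqlite3 is a stand-in
# for a real JDBC/Oracle connection.
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE claims (claim_id TEXT, status TEXT)")
conn.execute("INSERT INTO claims VALUES ('C-100', 'APPROVED')")

def backend_status(claim_id):
    row = conn.execute(
        "SELECT status FROM claims WHERE claim_id = ?", (claim_id,)).fetchone()
    return row[0] if row else None

# Validation checkpoint: the claim should have ended up APPROVED.
print(backend_status("C-100"))
```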

Opened and tracked defects with ALM and followed up with developers until they were fixed.

Involved in various meetings with project team to gather the performance requirements and SLAs before testing.

Ran the data refresh shell scripts before running the performance load tests.

Worked with development, web admin and DBA to find the issues.

Provided support to the development team in identifying real world use cases and appropriate workflows.

Performed in-depth analysis to isolate points of failure in the application and troubleshoot performance issues.

Used PuTTY to connect to the Sun Solaris WebSphere servers and review error logs to identify server-side issues.

Interacted with project managers and management in the development, execution and reporting of test results.

Attended status meetings and walkthroughs and reported weekly testing status to the Manager.

Environment: LoadRunner 11.5, Performance Center 11.5, ALM 11.5, Virtual Table Server (VTS), Web (HTTP/HTML), Web Services, RESTful Web Services, Oracle NCA, SoapUI, SAP-Web, AJAX, UFT, Sun Solaris, VMware, JMeter, BlazeMeter, JavaScript, XML, WSDL, JSON, XML API, JMS, WebSphere, HP Diagnostics, CA Wily Introscope, SiteScope, RUM, Oracle, DB2, J2EE, Mainframe, MS Office, PuTTY, Toad, Fiddler, Selenium WebDriver, Selenium Grid, Eclipse, TestNG, Jenkins.

Monster Government Solutions, McLean, VA July 2011 – Dec 2011

Software Test Engineer

Description: Monster Government Solutions (MGS) works with leaders in government and education. MGS transforms the way people look for government jobs and make a difference in the lives of people across the country, the way employers look for people, and how organizations connect with their target audiences. MGS's mission is to help organizations achieve their workforce mission.

Responsibilities:

Gathered requirements, specifications, developed test plans, test scripts and executed scenarios.

Collected performance metrics on various configurations, performed baseline and comparison tests.

Parameterized test scripts for various parameter values and simulated number of concurrent users for load scenarios and correlated dynamic unique session Ids.

Performed Automated web based functional testing using Selenium WebDriver and cross browser testing.

Developed Selenium automation scripts using locators such as ID, Name, and XPath for complex situations.

Developed TestNG Framework and Data Driven Framework to perform the automation testing and reports.

Performed Continuous Integration (CI) automation testing with Jenkins and batch files.

Executed VuGen test scripts with concurrent Vusers using ramp-up, ramp-down, and duration options.

Designed and maintained performance and scalability test suites based on product specifications.

Identified appropriate protocols on various applications with Protocol Advisor utility.

Performed Web Services tests using WSDL, web service calls, SOAP requests, and RESTful Web Services.

Analyzed the Throughput, Hits/Second, and Transactions per Second graphs using the LR Analysis tool.

Worked on test results and application resource usage graphs to identify performance bottlenecks.

Performed extensive back-end database testing with complex PL/SQL queries in Toad to test different scenarios, and configured TNS names for the Toad database connection.

Developed test plan, test cases and executed with required test data and followed up status in Quality Center.

Prepared operational documents and uploaded them to SharePoint for the application support team.

Attended status meetings and walkthroughs and reported weekly testing status to the Manager.

Environment: LoadRunner 11, Apache JMeter, Selenium, Eclipse, Web (HTTP/HTML), Web Services, SAP-Web, AJAX, FTP, QTP, JIRA, SourceTree, Sun Solaris, XML, JSON, XML API, JMS, Tomcat, Git, Oracle 10g, Quality Center 11, PL/SQL, SQL Navigator, J2EE, MS Office, Shell Scripts, SSH, PuTTY.

Fannie Mae, Herndon, VA Oct 2010 – June 2011

Software Test Engineer

Description: Fannie Mae provides financial products and services that increase the availability and affordability of housing for low, moderate and middle income. Credit Loss Management (CLM) Systems help manage the risk of credit loss. Servicer Default Management Utility (SDMU) is a new business tool system provides various loan workout options Fannie Mae offers, with an emphasis on preferred order of programs and online resources for communication with borrowers.

Responsibilities:

Participated in the application technical design process and performed User Acceptance Testing (UAT) and Vendor Integration Testing (VIT) environment setup; worked on properties files and set FACLs.

Performed SDMU application technical shakeout with Java Client to ensure no environment/application issues after code migration in User acceptance testing and vendor integration testing environment.

Gathered requirements, specifications, developed test plans, test scripts and executed scenarios.

Collected performance metrics on various configurations, performed baseline and comparison tests.

Parameterized test scripts for various parameter values and simulated number of concurrent users for load scenarios and correlated dynamic unique session Ids.

Executed VuGen test scripts with concurrent Vusers using ramp-up, ramp-down, and duration options.

Designed and maintained performance and scalability test suites based on product specifications.

Identified appropriate protocols on various applications with Protocol Advisor utility.

Performed Web Services test using WSDL, web service call and soap request.

Analyzed the Throughput, Hits/Second, and Transactions per Second graphs using the LR Analysis tool.

Worked on test results and application resource graphs to identify performance bottlenecks.

Performed TIBCO server operations and monitored TIBCO services in the UAT and VIT environments.

Performed WebLogic server operations and monitored instances and logs.

Managed JMS queues with Hermes and drained queues as required.

Supported SDMU Auth service, Decision service, Workflow service and Reporting service operations.

Communicated the test results and test status to the manager and project team.

Identified errors, defects and data quality issues and performed root cause analysis.

Worked with File Transfer Portal setup team and performed SDMU FTP operations.

Performed testing using the front-end CLM GUI and back-end database queries.

Performed extensive back-end database testing with complex PL/SQL queries in Toad to test different scenarios, and configured TNS names for the Toad database connection.

Developed test plan, test cases and executed with required test data and followed up status in Quality Center.

Supported SDMU application during the production implementation.

Connected to UNIX Sun Solaris servers and transferred files via FTP across different test environments using SSH, PuTTY, and the command prompt.

Environment: Quality Center, LoadRunner 9.5, Web (HTTP/HTML), Web Services, SAP-Web, AJAX, FTP, QTP, AutoSys, Rational ClearQuest, Remedy, DOORS, TIBCO, Sun Solaris, XML API, JMS, WebLogic, Oracle 10g, Toad, Java, J2EE, MS Office, Shell Scripts, SSH, PuTTY, Hermes queues.

New York Life Insurance, Clinton, NJ May 2009 – Oct 2010

Performance Test Engineer

Description: New York Life Insurance is one of the leading life insurance companies. NYL provides life insurance, lifetime income, investment annuities, long-term care insurance, and mutual funds.

Responsibilities:

Reviewed requirements and specifications for potential performance risks/issues.

Conducted walkthroughs with Project team/business analyst to understand the application process.

Developed performance Test Plan for load and performance testing.

Developed test scripts using the LoadRunner Virtual User Generator and designed test data for individual test cycles.

Performed various Load and Performance test scenarios using Controller.

Correlated VuGen scripts to handle dynamic values.

Parameterized automation scripts for various parameter values and simulated number of concurrent users for load scenarios to determine response times and bottlenecks.

Determined appropriate protocols for assigned applications with Protocol Advisor utility.

Executed VuGen test scripts in the Controller with concurrent Vusers using ramp-up, ramp-down, and duration options.

Performed baseline and comparison performance tests from beginning to end (developing test plan, test scripts, scenario execution, analysis, and reporting of results)

Tested Web Services using WSDL, web service calls, and SOAP requests.

Monitored various system resources on application and database servers.

Analyzed Throughput Graph, Hits/Second graph, Transactions per second graph using LR Analysis tool.

Worked on analysis of test results to identify performance, bottlenecks and scalability issues in application using various system and application resource usage graphs.

Worked with technical teams such as Development, Operations, and Architecture to discuss performance/stress test results and troubleshoot issues; represented the Performance Testing team in meetings with tech managers, business owners, and stakeholders.

Managed multiple tasks and met scheduled test dates.

Created Requirement Traceability Matrix with requirements and test cases to ensure complete test coverage.

Performed extensive back-end database testing with complex PL/SQL queries in Toad.

Opened defects in Quality Center and followed up with developers until they were fixed.

Environment: LoadRunner 9.5, TeamQuest, Quality Center, JMeter, Sun Solaris, XML, JMS, SOA Tools, WebSphere, MySQL, Oracle 10g, J2EE, MS Office, Web (HTTP/HTML), Web Services, SAP-Web, AJAX, JDBC, Lotus Notes, LapLink Gold, PuTTY.

Verizon, Ashburn/Arlington, VA Feb 2005 – Apr 2009

Software Test Engineer/Production support

Description: StatusPro is a web-based integrated reporting and tracking system that offers end users a way of tracking orders through the order process. StatusPro receives milestones from service ordering and provisioning systems and processes them to enterprise and billing systems; it can schedule reports for future use and save frequently used sets of criteria. XRM is a solution designed to automatically generate service orders for the Local Service Requests (LSRs) submitted by Competitive Local Exchange Carriers (CLECs). An LSR is received by one of Verizon's wholesale interface gateway systems (LSI-EDI, WEB GUI, and LSI) through MQ listeners and socket listeners, and arrives in internal format (EIF) from LSI and WEB GUI. The application then starts a multi-step workflow: validation of the input LSR, storage of the input LSR for tracking and review, and population of the data required for service orders.

Responsibilities:

Analyzed business requirements and software requirement documents.

Developed test plans and detailed functional and usability test cases for each deliverable of the software and project implementation services. Maintained Requirement Traceability Matrix (RTM) between Requirements.

Participated in design reviews and Test case reviews to ensure scenarios captured business functionality.

Opened defects under Defects Tab in Quality Center and followed with developers to fix the bugs.

Performed regression testing after developer fixed the defects and updated the changes.

Mocked XML test data and flat-file data for the service order and reporting systems to satisfy the required test scenarios.

Configured SoapUI tool and performed Functional Web Service tests and Web Service Mocking.

Submitted mainframe JCL jobs with TSO and the ISPF editor to extract the required test data.

Involved in the build (.ear, .jar, .war) and deployment process and tested applications on WebLogic. Used CM Synergy for checkout/check-in.

Provided production support during production implementation.

Performed backend testing with SQL Queries, triggers, functions and verified data integrity.

Performed source-to-target data mapping during system and integration testing, and performed ILOG JRules business process testing.

Tested production defects, followed up with developers on fixes, and managed defects in ClearQuest.

Worked on QTP automation scripts and executed regression scripts.

Used parameterization and correlation while creating scripts to avoid hard-coded values.

Performed Data Analysis testing for various source and target systems.

Identified critical scenarios in the application to create, enhance, and develop scripts required for performance testing using the Virtual User Generator (VuGen).

Performed various Load and Performance test scenarios using Controller.

Used the Controller in LoadRunner to create manual as well as goal-oriented scenarios and ran the scenarios.

Worked on analysis of test results to identify performance and scalability issues in application using various system and application resource usage graphs.

Tested production defects and followed up with developers to fix the bugs.

Prepared documentation on all test scenarios with the expected results.

Managed the overall test process according to the Agile Methodology.

Performed environment shakeout and smoke tests before starting functional testing.

Sent messages from SPRO to the EOM system through SOA using MQSeries, JMS, BOD XML, and Ascential DataStage jobs.

Mentored the offshore team and coordinated with onsite and offshore teams to resolve issues via the Microsoft Office Live Meeting utility.

Environment: HP Quality Center 9.2, LoadRunner 9.1, JMeter, QTP 9.2, Oracle 10g, PL/SQL, SQL Developer, Ascential DataStage, Sun Solaris 8, J2EE, XML, XML Schema, JMS, SoapUI, Web Services, VOIP, FTP, MQSeries, IBM WebSphere, Hummingbird, PuTTY, Agile, Mainframe, MVS, TSO, ISPF, WebLogic Server 8.1, Web 2.0, VATT, SOA Testing tool, WLI, Wily Introscope, ILOG, LDAP, DNS, IREP, CM Synergy.

Wachovia Bank, Winston Salem, NC Jan 2003 – Dec 2004

QA Engineer

Description: This system, developed by Wachovia Bank, provides online information about its clients' holdings, loans, maturities, and realized or unrealized gains and losses.

Responsibilities:

Analyzed Business and Functional requirements of the application and worked with Business users to understand project requirements and to figure out the scope of test strategy.

Developed Test Scripts and Maintained Requirement Traceability Matrix (RTM) to track the requirements to the test cases to ensure complete test coverage in the Rational Test Manager.

Prepared personal Loans test data and financial information exchange (FIX) messages with FIX Trading Interactive tool and tested process flow.

Performed system, integration, regression, and smoke testing.

Performed Manual testing and Front-end Web based Functionality testing, Navigation testing, links validation, Session management, cookies and Security testing in different web browsers like Internet Explorer (IE), Netscape, and Firefox.

Performed End-to-End testing of the application.

Involved in User Acceptance testing of the application with business experts.

Wrote SQL queries to validate various reports, data loads, and extract processes.

Environment: Test Director, RequisitePro, FIX Protocol, DB2, COBOL, JCL, HTML, XML, Java, Servlets, JavaScript, Apache Tomcat, PL/SQL, MS Office.


