
Test Cases Engineer

Location:
Norwood, MA
Posted:
March 23, 2017

MAJOR PROJECTS:

Client: Ascensus College Savings November 2016 – Present

Position: Sr. Automation Engineer (Selenium Automation)

Project: State Sponsored Retirement Plan and Record Keeping

A state-sponsored retirement plan for private-sector workers who do not have access to a retirement plan through their employers.

Participate in daily scrum meetings to report daily progress.

Contributed to introducing the agile process, JIRA workflows, and dashboards.

Designed and maintained test artifacts such as test plans, test cases, bug reports, and test completion reports with minimal to no requirements documentation.

Performed validations on the back office and record keeping feeds across multiple environments.

Created and maintained XML schemas to make test data files easier to read.

Tested REST API calls and validated the JSON responses using the REST Assured framework (see the sketch following this project's Environment line).

Performed DB validation using complex SQL Queries.

Developed and maintained an automation framework for feed data generation using Selenium WebDriver, Apache Velocity, Java, and TestNG.

Wrote scripts to generate test data files that read data from the database or the UI, or generate random data (see the data-generation sketch following this project's Environment line).

Environment: Selenium WebDriver, Java, TestNG, Maven, Apache Velocity, RESTAssured, JIRA
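
The REST API validation described above can be pictured with a minimal REST Assured + TestNG sketch. The base URI, endpoint, and JSON field below are hypothetical placeholders, not the client's actual services:

```java
import static io.restassured.RestAssured.given;
import static org.hamcrest.Matchers.equalTo;

import org.testng.annotations.Test;

public class RecordKeepingApiTest {

    // Placeholder base URI; the real services are internal to the client.
    private static final String BASE_URI = "https://api.example.com";

    @Test
    public void accountLookupReturnsActiveStatus() {
        given()
            .baseUri(BASE_URI)
            .accept("application/json")
        .when()
            .get("/accounts/{id}", "12345")        // hypothetical endpoint and path parameter
        .then()
            .statusCode(200)
            .body("status", equalTo("ACTIVE"));    // hypothetical JSON field and expected value
    }
}
```

The template-driven feed data generation with Apache Velocity can be sketched the same way, here with an inline template string so the example stays self-contained; the delimited layout and field names are made up for illustration, and the real templates would live in .vm files modeling the actual feed:

```java
import java.io.StringWriter;

import org.apache.velocity.VelocityContext;
import org.apache.velocity.app.VelocityEngine;

public class FeedRecordGenerator {

    // Made-up pipe-delimited layout standing in for the real feed record format.
    private static final String TEMPLATE = "$accountId|$planId|$amount";

    public static String render(String accountId, String planId, String amount) {
        VelocityEngine engine = new VelocityEngine();
        engine.init();

        VelocityContext context = new VelocityContext();
        context.put("accountId", accountId);
        context.put("planId", planId);
        context.put("amount", amount);

        StringWriter out = new StringWriter();
        // Merge the context values into the template to produce one feed record line
        engine.evaluate(context, out, "feed-record", TEMPLATE);
        return out.toString();
    }
}
```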

Client: PTC May 2016 – November 2016

Position: Sr. Automation Engineer (Selenium Automation)

Project: ThingWorx IoT Platform

The ThingWorx Technology Platform is the only enterprise-ready platform that enables innovators to rapidly develop and deploy smart, connected solutions for the Internet of Things. ThingWorx contains the most complete set of integrated IoT-specific development tools and capabilities, making solution development simple, time-to-market fast, and solutions built on the platform more compelling.

Responsibilities:

Automation

Participate in daily scrum meetings to report daily progress of automation activities.

Perform automation feasibility study & tool selection.

Develop and maintain an automation framework using Selenium WebDriver, Java, TestNG, and Gradle.

Document the best practices, guidelines, dos and don’ts on wiki pages.

Prioritize test cases, estimate automation efforts, and plan sprint-wise automation activities.

Write automation test scripts involving complex steps such as drag and drop (see the sketch following this project's Environment line).

Get automation code reviewed by the manager each sprint before merging it to the development branch.

Execute the automation test suite on weekends and provide execution reports with trends to team members and seniors.

Perform test failure analysis using screenshots and debug failing test scripts.

Demoed the automation framework in sprint demos.

Manual Testing

Collaborate with product management and team members to understand new feature requirements.

Develop and maintain test cases/test data for new features

Execute test cases for new features

Log, track and retest defects in JIRA

Participate in regression testing

Environment: Selenium WebDriver, Java, TestNG, Gradle, Git, Bitbucket, JIRA, AutoIt
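
A drag-and-drop step like the one mentioned above is typically scripted with Selenium's Actions class. This is a minimal sketch; the element locators are placeholders for illustration, not actual ThingWorx widget ids:

```java
import org.openqa.selenium.By;
import org.openqa.selenium.WebDriver;
import org.openqa.selenium.WebElement;
import org.openqa.selenium.interactions.Actions;

public class DragAndDropSteps {

    // Locator ids below are hypothetical placeholders.
    public static void dragWidgetToCanvas(WebDriver driver) {
        WebElement source = driver.findElement(By.id("widget-palette-item"));
        WebElement target = driver.findElement(By.id("mashup-canvas"));

        // Built-in drag-and-drop gesture
        new Actions(driver).dragAndDrop(source, target).perform();
    }

    // Fallback using explicit mouse actions, useful when dragAndDrop() alone does not trigger the drop
    public static void dragWithExplicitActions(WebDriver driver, WebElement source, WebElement target) {
        new Actions(driver)
                .clickAndHold(source)
                .moveToElement(target)
                .release(target)
                .perform();
    }
}
```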

Client: IBM May 2011 – March 2016

Position: Sr. Software Engineer (Selenium Automation/REST API Automation/Security)

Project: IBM Emptoris Sourcing Portfolio

IBM Emptoris Sourcing enables companies to optimize value, as well as price, from their supply base by factoring cost, risk and performance into sourcing decisions. It automates various sourcing events from reverse auctions to complex multi-stage negotiations and provides both broad and granular visibility into corporate sourcing data. IBM Emptoris Sourcing can help companies improve supplier collaboration and innovation, reduce supply base risk and lower costs.

Responsibilities:

Automation

Participated in daily scrum meetings to report daily progress of automation activities and sprint planning for test planning activities.

Led a REST API automation team of 3 members.

Performed automation feasibility study & tool selection.

Generated HTML reports with trends using TestNG and ExtentReports, and provided execution reports to team members and seniors.

Involved in automation framework development using Selenium WebDriver, Java, TestNG, and REST Assured.

Mobile browser and mobile application automation testing with Appium.

Used Selenium Grid to run automation scripts in a distributed environment (see the cross-browser sketch following this project's Environment line).

Maintained the Selenium and REST API automation frameworks; wrote and maintained common library methods so other teams could proceed with scripting.

Conducted automation training for new team members and provided all types of technical support and guidance.

Documented the best practices, guidelines, dos and don’ts on wiki pages.

Developed Selenium WebDriver automation test scripts for new modules using reusable components to ensure a robust code structure.

Used Firebug and FirePath to locate elements with XPath and CSS selectors.

Used UI Automator and Xcode to identify elements of mobile applications.

Used AutoIt to record Windows-based actions and integrated them into the Selenium WebDriver automation framework.

Wrote manual REST API test cases with JSON payloads and tested them manually using tools like the Firefox REST Client.

Wrote automation test scripts to test REST APIs using the REST Assured and TestNG frameworks.

Executed cross-browser and parallel testing using TestNG.

Peer-reviewed automation code written by the team before it was merged to the central repository.

Ran overnight executions of both the Selenium and REST API automation suites using Maven/Ant and Jenkins.

Manual Testing

Collaborated with product management teams in an agile environment to develop a comprehensive set of tests for both web application and REST APIs.

Developed and maintained test documentation including test plans, test cases, and test data.

Developed test cases for functional and regression testing.

Logged, tracked, and retested defects in RTC and JIRA

Managed VMs and installations

Led and managed mid-size releases

Performed root cause analysis (RCA) of client-reported issues

Performed Migration Testing from old release to new release.

Security Testing

AppScan: Set up the environment for security scanning, scanned applications with AppScan, generated and analyzed reports, and verified bugs manually

Resolved problems encountered during scanning, including cleaning the database

Performed manual security testing for XSS and CSRF (Cross-Site Request Forgery) attacks on core functional areas

Analyzed customer-reported security issues

Environment: Selenium WebDriver, Appium, TestNG, REST Assured, AutoIt, UI Automator, Xcode, FirePath, FireBug, Firefox REST Client, Eclipse IDE, AppScan, WebScarab, Burp, IBM Rational Quality Manager (RQM), Rational Team Concert (RTC), JIRA, Java/J2EE, AngularJS, Oracle, DB2
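
The Selenium Grid and cross-browser/parallel execution described above can be sketched with a RemoteWebDriver test parameterized through TestNG. The hub URL and application URL are hypothetical; the browser name would come from a testng.xml suite that defines one test per browser:

```java
import java.net.URL;

import org.openqa.selenium.WebDriver;
import org.openqa.selenium.remote.DesiredCapabilities;
import org.openqa.selenium.remote.RemoteWebDriver;
import org.testng.Assert;
import org.testng.annotations.AfterMethod;
import org.testng.annotations.BeforeMethod;
import org.testng.annotations.Parameters;
import org.testng.annotations.Test;

public class CrossBrowserGridTest {

    private WebDriver driver;

    // The "browser" parameter is supplied per <test> in testng.xml; the hub URL is a placeholder.
    @Parameters("browser")
    @BeforeMethod
    public void setUp(String browser) throws Exception {
        DesiredCapabilities caps = new DesiredCapabilities();
        caps.setBrowserName(browser);
        driver = new RemoteWebDriver(new URL("http://grid-hub.example.com:4444/wd/hub"), caps);
    }

    @Test
    public void loginPageLoads() {
        driver.get("https://sourcing.example.com/login");   // hypothetical application URL
        Assert.assertTrue(driver.getTitle().toLowerCase().contains("login"));
    }

    @AfterMethod
    public void tearDown() {
        if (driver != null) {
            driver.quit();
        }
    }
}
```

Running this class under several testng.xml test entries with different browser parameters, and parallel="tests" set on the suite, gives both the cross-browser coverage and the parallel execution.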

Client: Dell Sep 2010 – May 2011

Position: Data Warehouse Development and Testing (ETL Testing)

Project: Float Dashboard – Data warehousing and Reporting

Float is defined as all service parts which have left the fulfillment center but have not yet been received back. Float is measured for 90 days, and after that period of time product that has not yet been received is written off.

The frontend of Float Dashboard lets users view predefined reports, run ad hoc reports, browse cubes, and view logs of scheduled jobs. In the backend, scheduled jobs run on a daily and weekly basis to populate data from various data sources and generate report data.

Responsibilities:

Understood new data feeds and reporting requirements

Created and scheduled ETL packages using SSIS and MS SQL Server Agent

Extensively used SSIS for extraction, transformation and loading process.

Wrote and executed test cases for ETL mappings from source to target.

Prepared test data in database tables, Excel spreadsheets, and FTP locations to test ETL packages.

Wrote and executed test cases to verify that data was populated correctly into the respective tables.

Performed SQL validation to verify the data extracts and record counts in the database tables (see the reconciliation sketch following this project's Environment line).

Extensively used SQL queries to verify and validate database updates.

Created reports in SSRS and deployed them to the reporting server

Created, deployed and processed cubes and dimensions in SSAS

Scheduled jobs to execute ETL packages and process cubes

Managed ad hoc client requests for data or reporting

Environment: MS SQL 2005, SSIS, SSRS, SSAS, ASP
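
The record count validation above can be illustrated with a simple source-versus-target reconciliation. On this project the checks ran as plain SQL against MS SQL Server 2005; the Java/JDBC harness, connection string, and table names below are assumptions used only to keep the sketch consistent with the other examples:

```java
import java.sql.Connection;
import java.sql.DriverManager;
import java.sql.ResultSet;
import java.sql.Statement;

public class EtlCountReconciliation {

    // Placeholder connection string, credentials, and database name.
    private static final String JDBC_URL =
            "jdbc:sqlserver://dbhost:1433;databaseName=FloatDW;user=qa;password=secret";

    public static void main(String[] args) throws Exception {
        try (Connection conn = DriverManager.getConnection(JDBC_URL);
             Statement stmt = conn.createStatement()) {

            // Hypothetical staging and target tables for the float feed
            long staged = count(stmt, "SELECT COUNT(*) FROM stg_service_parts");
            long loaded = count(stmt, "SELECT COUNT(*) FROM fact_float_parts");

            // Simple reconciliation: every staged row should have been loaded
            if (staged != loaded) {
                throw new AssertionError("Row count mismatch: staged=" + staged + ", loaded=" + loaded);
            }
            System.out.println("Counts match: " + loaded + " rows");
        }
    }

    private static long count(Statement stmt, String sql) throws Exception {
        try (ResultSet rs = stmt.executeQuery(sql)) {
            rs.next();
            return rs.getLong(1);
        }
    }
}
```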

Client: Entercoms Inc Dec 2007 - Sep 2010

Position: Sr. Test Engineer

Projects: ServiceNet, DoD (Dashboard on Demand), AgentNet

Entercoms ServiceNet is an integrated decision support solution that enables you to manage your installed base, field service, aftermarket sales, and service parts planning processes. It is a platform that enables customers to get on-demand access to intelligence in the service chain, while giving users (field dispatch, parts management, customer service teams, and senior management) the visibility and actionability they need to manage operations.

DoD (Dashboard on Demand) is a data analysis tool that helps users prepare reports on the fly which can be used as dashboards. Users can create reports with grids and various charts, and have many options to get different views of a report, including sorting, filtering, drill down, and ranges.

Responsibilities:

Led and managed medium-size releases.

Analyzed the requirements and streamlined the testing activities.

Wrote test cases to check complex reports and their outputs.

Developed and maintained Test Scenarios and detailed test cases based on limited and dynamic requirements.

Created and maintained Traceability Matrix.

Prepared test data based on the reports/features to be tested, including data that required real geocoding.

Executed test cases, logged, tracked and retested defects in Redmine.

Provided status reports to the manager and seniors.

Extensively used SQL queries to verify and validate report outputs and address-based search results using geocoding.

Performed presales activities such as updating sales machines with new builds and preparing client-specific datasets for demos.

Worked on process improvement activities.

Automation testing with QTP 9.1

Environment: TestLink, Redmine, QTP 9.1, Microsoft .NET, Flex 3.5, MS SQL Server, MySQL

Client: Metropolitan Life Insurance Company Apr 2007 - Dec 2007

Position: Test Engineer

Project: ITAC (In-Force Tracking and Communication) and MLAC to MIC merger

ITAC (In-Force Tracking and Communication) was a common system where users could see messages and alerts sent to them from 11 different admin systems. Several kinds of messages were generated by the admin systems, and we were required to test the flow and content of the messages and alerts. Messages and alerts were generated from a mainframe system for different events in the admin systems and sent to ITAC, where users could view their messages with all the data associated with each message or alert. Users could perform operations such as forward and reply on these messages.

In the MLAC to MIC merger module, I was required to test the company name change in several applications and reports resulting from the merger of the two companies. I was the only resource working on this module and was required to identify changes to the screens, reports, and databases of several applications.

Responsibilities:

Understood requirements

Functional test case creation, review and execution

Created detailed test data and had it reviewed by peers and the client

Prepared complex data in the mainframe to test various messages

Logged, tracked, and retested defects in an in-house tool

Prepared weekly and monthly status reports.

Environment: Quality Center 9.0, DTS, Windows, JSP, Mainframe

Client: Motorola July 2006 - Apr 2007

Position: Test Engineer

Project: Motorola claim processing system upgrades

Under this project we were required to test the claim processing system for Motorola. Authorized service providers can register with the company and perform several tasks such as repairing mobiles, replacing parts, and providing mobile services. These dealers can claim their charges from the company for the services they have provided to clients. Dealers are required to create claim files and upload them to the respective server; these files are validated and processed in several applications.

There were 4 applications: SIFT, Service Link, Mclaims, and UPD. We were required to test each system separately as well as integrated. I was also part of the end-to-end (ETE) testing team, where we tested the application with business users.

Responsibilities:

Understood complex requirements with SMEs

Created, reviewed, and executed unit, system, system integration, and regression test cases

Created a vast range of test data to test validations

Coordinated with development team for new builds and installations of different applications on different servers

Monitored FTP and Linux boxes for failures where claims were processed.

Logged, tracked and retested defects

Coordinated with development team for defects, troubleshooting and impact of fixes

End to end testing with business users and SMEs

Faced rigorous client audit

Environment: J2EE, Windows, Linux, VSS 6.0, Oracle (SQL/PL-SQL), Quick base, DTS, TOAD SQL Client

Client: Coca Cola Mar 2006 – June 2006

Position: Test Engineer

Project: Clarity Workflow testing

Clarity PPM Process Manager enables you to display and automate the workflow of all business processes. We were required to test workflows developed by Coca Cola, verifying the flow of newly created projects through different entry and exit points.

Client: Metlife Oct 2005 - Feb 2006

Position: Test Engineer

Project: Metlife CP

The objective of the project was to test several applications integrated via a common platform (CP). We were required to test applications such as illustrations, DES (data entry system), underwriting systems, policy admin (eSeg), the customer portal, and management reports.

I worked on the Policy Admin System, which supports life, annuities, ULIP, and group insurance. I tested functions such as remittance, disbursements, claims, and surrender.


