SUMMARY
•**+ years of experience in Software Development, Testing, and Test Management.
•Over 10 years of experience in Manual and Automated Testing and as a Business Process Analyst for web-based applications.
•Extensive work experience with Microsoft IoT Edge, IoT Hub, Azure Cloud, Cosmos DB, Azure Blob Storage, RabbitMQ, and Telemetry Services
•Good experience in testing Deep Neural Network (DNN) and Convolutional Neural Network (CNN) models
•Good experience in testing applications in Linux environments
•Extensive work experience in business applications such as CRM and ERP and in complex software integration applications in web-based and client/server environments
•Expertise in developing Software Test Plans, Test Case Design, and Test Scripts, and in System/Functional/Integration, Regression, and Stress Testing.
•Excellent experience in developing and maintaining robust test scripts according to given business specifications.
•Excellent experience in defect tracking within applications.
•Good experience working in J2EE environments and with Microsoft technologies.
•Good experience in testing Mainframe applications
•Involved in the testing of data validations for Mobile Applications
•Excellent leadership skills
•Excellent communication, technical and interpersonal skills.
EDUCATION
Bachelor's Degree in Mechanical Engineering
SOFTWARE SKILLS
Test Automation Tools: ALM, Selenium, SOAPUI, Postman, Quality Center 9.0, Version One, Lotus Notes
Development Tools: C#, .NET, Azure Cloud, Azure IoT Edge, Cosmos DB, Azure Blob Storage, Machine Learning, CNN, DNN, AI Models, Python, PyCharm, Java, Node.js, JavaScript, XML, JSON, RabbitMQ
Application Servers: WebSphere
Web Servers: IIS, Apache, WebSphere
Databases: Oracle, IBM Mainframes, DB2, MS SQL Server, Cosmos DB, Microsoft Azure Storage Explorer
Internet Tools: Eclipse IDE, Visual Studio, Git, Jenkins
OS Expertise: Windows, UNIX, Linux
ERP: Baan ERP, CRM
EXPERIENCE
Image Analytics – Azure IoT Edge Flow
BNSF Railway, Fort Worth, TX
Senior QA Analyst
Feb ’19 – Present
BNSF is implementing Azure IoT based AI image analytics to find defects on the wheels of trains. Trackside detectors capture images that are fed into the EdgeFlow interface, where Deep Neural Network and AI models identify defects on the wheels and send the defective images to the Azure cloud; the defects are then forwarded to the EQMS application, which handles train car wheel defects at the NOC center.
Responsibilities:
Followed Agile Scrum Methodology.
Involved in stand-up, sprint capacity planning & retrospective meetings.
Worked with product owners and Dev team to understand requirements.
Prepared Test Plan and Test Cases in Version One.
Involved in preparation of CSV files for testing the Pipelines
Involved in the testing of Azure IoT EdgeFlow Pipeline (BV Polling, Polling, Machine Learning, Detection Rules Engine, Uploader) to Azure Cloud
Involved in the testing of images sent from IoT Edge to Microsoft Azure Blob Storage (a validation sketch follows this project's environment listing)
Involved in the testing of metadata sent from IoT Edge to Cosmos DB
Involved in the testing of AI Cogniac Models sending defective images
Involved in the integration testing of the IoT Edge and Azure Cloud Pipeline
Involved in the testing of Telemetry services of Azure Pipeline
Involved in the configuration testing of Azure function apps
Involved in the configuration of the IoT Edge Device
Involved in End to End testing of Pipeline
Used Version One to log and track defects.
Participated in resolution of road blocks involving Scrum master, Product owner & Dev team.
Prepared Test Evidence Document for each task in the sprint.
Involved in Integration, System, Regression and End to End testing of the application with other teams.
Environment: Machine Learning, AI Models, DNN, CNN, Python, Microsoft Azure Portal, Microsoft IoT Hub, Microsoft Azure Storage Explorer, JSON, RabbitMQ, Blob Storage, Cosmos DB, Version One, PyCharm IDE, Microsoft Visual Studio, Git, Linux, Kubernetes, Docker Containers
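A minimal Python sketch of the kind of post-run validation performed against Azure Blob Storage and Cosmos DB, using the public azure-storage-blob and azure-cosmos SDKs. The connection strings, container, database, and field names (for example "defect-images" and "imageId") are hypothetical placeholders rather than the project's actual configuration.

# Sketch only: names and credentials below are illustrative assumptions.
from azure.storage.blob import BlobServiceClient
from azure.cosmos import CosmosClient

BLOB_CONN_STR = "<storage-connection-string>"            # assumed test config
COSMOS_URL = "https://<account>.documents.azure.com"     # assumed account URL
COSMOS_KEY = "<cosmos-key>"

def blob_exists(container: str, blob_name: str) -> bool:
    """Check that an image uploaded from the IoT Edge device landed in Blob Storage."""
    service = BlobServiceClient.from_connection_string(BLOB_CONN_STR)
    return service.get_blob_client(container=container, blob=blob_name).exists()

def metadata_for_image(image_id: str) -> list:
    """Query Cosmos DB for the metadata document written for a given image."""
    client = CosmosClient(COSMOS_URL, credential=COSMOS_KEY)
    container = client.get_database_client("edgeflow").get_container_client("image_metadata")
    return list(container.query_items(
        query="SELECT * FROM c WHERE c.imageId = @id",
        parameters=[{"name": "@id", "value": image_id}],
        enable_cross_partition_query=True,
    ))

if __name__ == "__main__":
    assert blob_exists("defect-images", "wheel_001.jpg"), "image missing from Blob Storage"
    assert metadata_for_image("wheel_001"), "no metadata document found in Cosmos DB"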
BNSF Railway, Fort Worth, TX
Jan’16 – Jan’ 19
Graphical Restriction Input Tool (GRIT) is used by Field Users, Dispatchers, and ACDs to create requests and restrictions for FORM A and FORM B. Field Users use the tool to create requests from the field, and Dispatchers approve the requests. Once approved, the requests become restrictions, which are sent to TSS and PTC.
Responsibilities:
Followed Agile Scrum Methodology.
Involved in stand-up, sprint capacity planning & retrospective meetings.
Worked with product owners and Dev team to understand requirements.
Prepared Test Plan and Test Cases in Version One.
Involved in preparation of Data staging.
Tested various web services using SOAP UI (a request/response sketch follows this project's environment listing)
Involved in the testing of track highlighting for all subdivisions
Used Version One to log and track defects.
Participated in resolution of road blocks involving Scrum master, Product owner & Dev team.
Prepared Test Evidence Document for each task in the sprint.
Involved in Integration, System, Regression and End to End testing of the application with other teams.
Involved in the testing of downloading data from Mainframe to GRIT
Environment: Mainframe, DB2, Groovy on Grails, React.js, XML, Java, Version One, JSON, MQSeries, GIS, Eclipse IDE, Git, Jenkins, MS SQL Server
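The GRIT web services were exercised in SOAP UI; the sketch below shows an equivalent request/response assertion written in Python with the requests library. The endpoint URL, namespace, operation, and element names are illustrative assumptions, not the actual service contract.

# Sketch only: endpoint and SOAP contract details are hypothetical.
import requests
import xml.etree.ElementTree as ET

ENDPOINT = "https://example.internal/grit/RestrictionService"   # assumed endpoint
SOAP_BODY = """<?xml version="1.0" encoding="UTF-8"?>
<soapenv:Envelope xmlns:soapenv="http://schemas.xmlsoap.org/soap/envelope/"
                  xmlns:res="http://example.internal/grit/restriction">
  <soapenv:Body>
    <res:getRestriction>
      <res:restrictionId>12345</res:restrictionId>
    </res:getRestriction>
  </soapenv:Body>
</soapenv:Envelope>"""

response = requests.post(
    ENDPOINT,
    data=SOAP_BODY,
    headers={"Content-Type": "text/xml; charset=utf-8", "SOAPAction": ""},
    timeout=30,
)

# Assert on the HTTP status and on the presence of the expected response element.
assert response.status_code == 200, f"unexpected status {response.status_code}"
root = ET.fromstring(response.text)
assert root.find(".//{http://example.internal/grit/restriction}status") is not None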
BNSF Railway, Fort Worth, TX
April’14 – Dec’15
BNSF Railway is the second-largest freight railroad network in North America. Involved in the testing of Detectors using Web Services, ATRAC, Intermodal trains, TSS Waybill, Demurrage, WOPRT, and TSS Xpress.
Responsibilities:
Followed Agile Scrum Methodology.
Involved in stand-up, sprint capacity planning & retrospective meetings.
Worked with product owners and Dev team to understand requirements.
Prepared Test Plan and Test Cases in ALM.
Involved in preparation of Data staging.
Tested various web services using a REST client API (see the sketch after this project's environment listing).
Tested batch jobs using TSOB and ISPF
Involved in the testing of Intermodal trains.
Involved in the testing of Demurrage Notifications
Involved in the testing of TSS Waybill.
Involved in the testing of TSS Xpress.
Tested rules-based decision services implemented using ILOG Rules Team Server 7.1.1.
Used TeamForge to log and track defects.
Participated in resolution of road blocks involving Scrum master, Product owner & Dev team.
Attended daily defect triage meetings.
Involved in Integration, System, Regression and End to End testing of the application with other teams.
Involved in UAT along with Business.
Responsible for Validating Backend Data using SQL Queries.
Prepared Test Evidence Document for each task in the sprint.
Involved in post deployment validations.
Environment: Mainframe Testing, DB2, Informix, TOAD, XML, Unix, Java, ILOG Rules Team Server 7.1.1, ALM, TeamForge, JSON, REST Client, IBM Intelligent Operations Center (IOC).
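A minimal Python sketch of the kind of REST client check used for the web-service testing above, asserting on key response fields before the backend data was validated separately with SQL against DB2/Informix. The base URL, resource path, and field names are hypothetical placeholders.

# Sketch only: URL and JSON field names are assumptions.
import requests

BASE_URL = "https://example.internal/tss/api"   # assumed test endpoint

def check_waybill(waybill_id: str, expected_equipment: str) -> None:
    """Fetch a waybill and verify key fields against expected (staged) data."""
    resp = requests.get(f"{BASE_URL}/waybills/{waybill_id}", timeout=30)
    assert resp.status_code == 200, f"unexpected status {resp.status_code}"
    body = resp.json()
    assert body["waybillId"] == waybill_id
    assert body["equipmentId"] == expected_equipment
    # Backend values were then cross-checked with SQL queries.

check_waybill("WB-0001", "BNSF123456")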
Apex Systems Inc. /BankAmerica, Dallas TX
Jun’13 – Mar’14
UAT Lead
The objective of the MDMI core capabilities release is to test the systems to ensure that new features and enhancements perform as designed per the specified requirements and do not impact the existing functionality of other areas within these systems.
Responsibilities:
•Planning and Designing tests based on user expectations and release expectations
•Designing high-level test scenarios that feed into detailed test case design
•Preparing test data requirement worksheets based on test cases and test scenarios to source data from CDM and other systems
•Continuous interaction with project stakeholders.
•Working with UAT Testers and providing them continuous support.
•System test environment setup and connectivity testing
•Prepared End to End Test Plans and Test Cases and executed the test cases in Quality Center.
•Analyzed business requirements, and suggested change requests if necessary.
•Documented daily defect status with the help of QA metrics
•Creating Road Map for the project
•Creating Traceability Matrix for the modules.
Environment: Mainframe Testing, DB2, XML, Java, MS Access, HP Quality Center, UNIX, Windows XP Professional, MS Outlook
Accenture/BankAmerica, Dallas TX
Sep’10 – Mar’13
Sr. QA Analyst
Portfolio Services Group (PSG) offers a variety of portfolio management services such as sub-servicing, interim servicing, backup servicing and special servicing to various financial institutions. The Department manages the process of acquiring and releasing loans related to structured transactions, new loan transfers and other servicing portfolios for an assortment of different business partners. PSG also monitors the transfer of the loan portfolio and communicates with all parties as needed to ensure a smooth timely transition.
Responsibilities:
•Reviewed business requirement documents, application software design, implementation, upgrades & technical documentation, and developed test cases.
•Carried out system testing for production fixes, opened tickets in HP Quality Center for detected bugs, and closed them after successful system tests.
•Involved in team meetings and providing estimates for the projects.
•Involved in DB2 database testing.
•Involved in successfully releasing the projects on time.
•Prepared End to End Test Plans and Test Cases and executed the test cases in Quality Center.
•Used Selenium IDE for web testing (an equivalent WebDriver sketch follows this project's environment listing)
•Responsible for setting up Web Services projects using WSDL in SOAPUI and provided setup help to other team members
•Analyzed business requirements, and suggested change requests if necessary.
•Prepared test plans based on test requirements which includes all test items and test cases.
•Documented daily defect status with the help of QA metrics.
•Provided management with test metrics, reports and schedules as necessary and participated in the design walkthroughs and meetings.
•Documented test cases based on corresponding Business/Users requirement documents and technical specifications and other operating conditions.
•Used FTP Client for checking the logs on Servers
•Involved in writing complex SQL Queries.
Environment: PSG Application, Mainframes, DB2, MS SQL Server 2005, XML, ASP.NET, MS Access, QTP, HP Quality Center, UNIX, Windows XP Professional, Lotus Notes, VBScript, SOAPUI, Selenium
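The web testing above used Selenium IDE; the sketch below is an equivalent, minimal Selenium WebDriver script in Python. The URL and element locators are hypothetical placeholders.

# Sketch only: page URL and locators are assumptions.
from selenium import webdriver
from selenium.webdriver.common.by import By

driver = webdriver.Chrome()
try:
    driver.get("https://example.internal/psg/login")            # assumed test URL
    driver.find_element(By.ID, "username").send_keys("qa_user")
    driver.find_element(By.ID, "password").send_keys("secret")
    driver.find_element(By.ID, "loginButton").click()
    # Verify the landing page shows the expected portfolio summary header.
    header = driver.find_element(By.CSS_SELECTOR, "h1.page-title")
    assert "Portfolio" in header.text
finally:
    driver.quit()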
Verizon, Irving, TX
Oct’07 to Aug’10
Sr. QA Analyst
Verizon has an automated Voice Portal system for customers in different parts of the country, where users navigate automated voice menus for the mobile phone services offered by Verizon. Voice Portal testing involves maintaining and testing the servers.
Responsibilities:
•Responsible for defining Test Strategies, Test Plans, Estimating the test execution
•Wrote Test cases for all the possible scenarios for every requirement
•Prepared End to End Test Cases and obtained sign-off before test execution
•Engaged in meetings with business users/clients to gather requirements.
•Worked with Development Managers and Product Managers to standardize on a Software Development Life Cycle
•Evaluated tools to automate the test process based on the applications and technologies
•Prepared End to End Test Plans and Test Cases and executed the test cases
•Loaded requirements into QC and mapped defects and test cases for the traceability matrix
•Enhanced existing automation scripts for regression testing in every build life cycle.
•Extensively performed System, Integration, Functional, Smoke, and Regression testing of application functionality, for both manual and automated test cases.
•Performed health checks on all test systems on a daily basis
•Coordinated with different teams for integration testing.
•Participated in Go/No-Go calls for every build sign-off
•Used an FTP client for checking the logs on UNIX servers (a minimal sketch follows this project's environment listing)
•Involved in writing complex SQL Queries.
•Involved in the testing of data validations for Mobile Applications
•Involved in the testing of Content Management templates of the website
•Involved in recycling UNIX servers using an FTP client
•Communicated with the team and other teams throughout the project life cycle regarding change requirements and updates to the Test Plan/Test Cases
•Analyzed the Test Data for every scenario and coordinated with different teams for Test Data
•Coordinated with release teams for every build
•Executed SQL queries for data validation and back end testing
•Reviewed UAT test cases and supported UAT (User Acceptance Testing)
•Performed Post Production Testing
Environment: J2EE, JSP, JavaScript, VB, QTP 9.5, ClearQuest, Lotus Notes, WebLogic, UNIX, XML, Oracle 10g, FTP, Linux
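A minimal sketch, using Python's standard ftplib, of pulling an application log from a UNIX server for review, similar to the FTP client checks noted above. The host, credentials, and log path are placeholders.

# Sketch only: host, credentials, and path are assumptions.
from ftplib import FTP

HOST = "logs.example.internal"
USER, PASSWORD = "qa_user", "secret"
LOG_PATH = "/var/log/voiceportal/app.log"

with FTP(HOST) as ftp:
    ftp.login(user=USER, passwd=PASSWORD)
    lines = []
    ftp.retrlines(f"RETR {LOG_PATH}", lines.append)

# Scan the retrieved log for errors raised during the test run.
errors = [line for line in lines if "ERROR" in line]
print(f"{len(errors)} error lines found")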
Cingular, Richardson, TX
Aug’06 to April’07
Positively Retail Application
Sr. QA Lead
Positively is a retail Point of Sale (POS) web application for Cingular to sell various models of mobile phones and monthly plans to consumers, existing customers, and businesses.
Responsibilities:
•Responsible for defining Test Strategies, Test Plans, Estimating the test execution
•Developed standardized testing methodology and procedures across QA department.
•Engaged in meetings with business users/clients to gather requirements.
•Prepared the prerequisites for the test environment.
•Worked with Development Managers and Product Managers to standardize on a Software Development Life Cycle
•Created resource plans, schedules, and weekly tracking of tasks and milestones
•Evaluated tools to automate the test process based on the applications and technologies
•Performed health checks on all test systems on a daily basis
•Involved in the testing of data validations for Mobile Applications
•Involved in the testing of Reports generated by Crystal Reports
•Involved in the testing of data migration from the old systems to the new Positively application (a reconciliation sketch follows this project's environment listing).
•Used the data migration tool Informatica to migrate the data.
•Enhanced existing automation scripts for regression testing in every build life cycle.
•Involved in the troubleshooting of IP connectivity variables
•Involved in the testing of firewalls, router configurations, and IP addresses.
•Extensively performed System, Integration, Functional, Smoke, and Regression testing of application functionality, for both manual and automated test cases.
•Set Team goals for the whole fiscal year
•Prepared End to End Test Plans and Test Cases and executed the test cases
•Communicated with the team and other teams throughout the project life cycle regarding change requirements and updates to the Test Plan/Test Cases
•Analyzed the Test Data for every scenario and coordinated with different teams for Test Data
•Mentored team members to bring them up to speed
•Developed a complete end-to-end test strategy for assigned applications
•Coordinated with release teams for every build
•Involved in the testing of e-commerce transactions
•Maintained the bug status for every build for management review.
•Presented demos to management on the status of project execution and the Traceability Matrix
•Executed SQL queries for data validation and back end testing
•Engaged in meetings with team members to discuss findings from executed tests, decide next steps, and streamline the QA process for bug-free applications.
Environment: J2EE, JSP, JavaScript, VBScript, ClearQuest, Lotus Notes, WebLogic, XML, Oracle 10g, SQL Server 2005, Crystal Reports, E-commerce
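A rough sketch of the kind of post-migration reconciliation behind the data migration testing above: comparing row counts between the legacy source and the new Positively schema. The connection strings and table names are hypothetical, and the migration itself was performed with Informatica; this only illustrates the validation side.

# Sketch only: connection strings and table names are assumptions.
import cx_Oracle

src = cx_Oracle.connect("qa_user", "secret", "legacy-db.example.internal/ORCL")
tgt = cx_Oracle.connect("qa_user", "secret", "positively-db.example.internal/ORCL")

def row_count(conn, table):
    """Return the row count of a table on the given connection."""
    cur = conn.cursor()
    cur.execute(f"SELECT COUNT(*) FROM {table}")
    return cur.fetchone()[0]

for source_table, target_table in [("CUSTOMERS", "POS_CUSTOMERS"), ("PLANS", "POS_PLANS")]:
    assert row_count(src, source_table) == row_count(tgt, target_table), \
        f"row count mismatch for {source_table} -> {target_table}"
print("migration counts reconciled")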
SSA Global, Dallas, TX
Oct’05 to July’06
Team Lead
SSA Interactive Selling (Baan CRM) is a web-based CRM application that helps customers manage contacts and opportunities, create quotes and orders, and build sales proposals.
•Involved in the testing of the web-based CRM application Interactive Selling, developed using ASP.NET, XML, EDI, C#, SQL Server, and Oracle
•Wrote and executed Test Plans and Test Cases for web applications based on system requirements.
•Involved in the testing of EDI.
•Involved in the development and testing of BOIs for integration between CRM & ERP.
•Involved in the testing of enterprise-level COTS (commercial off-the-shelf) applications
•Prepared VB test shells for testing BOIs
•Prepared test cases and test scripts based on user requirements and functional specifications
•Used automation scripts developed in WinRunner; TSL was used to enhance the scripts for handling complicated scenarios.
•Mismatches and bugs were identified and reported back to the development team.
•Performed System Testing and regression testing.
•Interacted with developers to resolve technical issues and provided production support
•Developed a Customized Mercury Test Director for defect tracking of various products in the company
•Used Web Defect Manager (Test Director) as the defect tracking system.
•Involved in various Certifications for different releases with complex matrices.
•Involved in releasing sustaining patches for various modules.
•Involved in releasing service packs for various releases
Environment: ASP, VBScript, JavaScript, LoadRunner 7.5/7.5.1/7.6, QTP 8.2, Quality Center 9.0, IIS, Apache, XML, SQL Server 2000, Oracle 9i, Windows 2000, Web Services.
SSA Global, Golden, CO
Nov’00 to Sep’05
QA Analyst
•Rolled out Mercury Test Director 8.0 for various products to log defects and create Test Plans, Test Cases, and Business Requirements across the company.
•Mercury Test Director is a Defect Tracking System for managing defects of various products.
•Involved in the customization of the Test Director according to the requirements of various departments.
PROJECT: SSA iPack
SSA iPack is an integration package for SSA CRM (Baan CRM) and ERP (Baan4c4 and Baan5.0) that provides customers with real-time data flow of customers, contacts, products, quotes, and orders.
•Involved in the testing of Middleware Baan Open World.
•Involved in the testing of EDI for integrations between CRM and ERP
•Involved in the development and testing of BOIs for integration.
•Prepared VB test shells for testing BOIs
•Involved in the testing of enterprise-level COTS (commercial off-the-shelf) applications
•Prepared test cases and test scripts based on user requirements and functional specifications
•Used Mercury Test Director 8.0 for writing Test Plans, Test Cases and logging defects
•Involved in the Functional Testing, System Testing, and Integration Testing of various Components.
•Involved in white box Testing.
•Mismatches and bugs were identified and reported back to the development team.
•Performed System Testing and regression testing.
•Involved in various Certifications for different releases with complex matrices.
•Involved in releasing sustaining patches
•Involved in releasing service packs for various releases
•Interacted with developers to resolve technical issues and provided production support.
•Used Web Defect Manager (Mercury Test Director) as the defect tracking system
PROJECT: SSA CRM
SSA CRM, formerly Baan CRM, is a client/server application consisting of the Sales Force Automation, Marketing, E-Configuration, Outlook Integration, Lotus Notes Integration, and ERP Integration modules, plus Baan Assistant for data synchronization.
•Involved in the end-to-end testing of the client/server CRM SSA CRM (Baan CRM), developed using C, C++, VB 5.0/6.0, SQL Server, and Oracle
•Involved in the installation of software for creating Test Environment
•Prepared test cases and test scripts based on user requirements and functional specifications
•Involved in the testing of enterprise-level COTS (commercial off-the-shelf) applications
•Used Mercury Test Director 8.0 for writing Test Plans, Test Cases and logging defects
•Involved in the Functional Testing, System Testing, and Integration Testing of various Components.
•Involved in white box Testing.
•Mismatches and bugs were identified and reported back to the development team.
•Performed System Testing and regression testing.
•Involved in various Certifications for different releases
•Involved in releasing sustaining patches
•Involved in releasing service packs for various releases
•Interacted with developers to resolve technical issues and provided production support.
•Used Web Defect Manager (Mercury Test Director) as the defect tracking system.
Baan CRM, San Jose, CA
April’00-Nov’00
QA Engineer
Baan CRM is a client/server application consisting of the Sales Force Automation, Marketing, E-Configuration, Outlook Integration, Lotus Notes Integration, and ERP Integration modules, plus Baan Assistant for data synchronization.
•Involved in the end-to-end testing of the client/server CRM SSA CRM (Baan CRM), developed using C, C++, VB 5.0/6.0, SQL Server, and Oracle
•Involved in the installation of software for creating Test Environment
•Prepared test cases and test scripts based on user requirements and functional specifications
•Used Mercury Test Director 8.0 for writing Test Plans, Test Cases and logging defects
•Involved in the Functional Testing, System Testing, and Integration Testing of various Components.
•Involved in white box Testing.
•Mismatches and bugs were identified and reported back to the development team.
•Performed System Testing and regression testing.
•Involved in various Certifications for different releases
•Involved in releasing sustaining patches
•Involved in releasing service packs for various releases
•Interacted with developers to resolve technical issues and provided production support.
•Used Web Defect Manager (Mercury Test Director) as the defect tracking system.
Environment: VB, MS SQL Server, Oracle 8, Windows 95/98/NT