Jayshri Vyas
** ****** **. ******* ** *****. PH: 732-***-****. E-mail: ********@***.***
Objective
Seeking senior-level opportunities in IT testing and consulting.
Summary of Qualifications
• Over 12 years of IT experience in systems testing.
• Hands-on experience setting up a defense/military environment in an IT lab.
• Broad knowledge and experience across the IT industry, including management and systems and integration testing.
• Tested security features under FIPS (Federal Information Processing Standards) at AT&T and Lockheed Martin.
• Experience with automated testing tools including QTP and Selenium IDE. Familiar with Agile and cloud systems.
• Experience with Sybase and SQL database querying, and with UNIX.
• Extensive experience in cutting waste and increasing efficiency.
• Involved extensively in Integration, Regression, Functional, Database, System, Performance, and User Acceptance Testing. Familiar with Production Testing. Coordinated multiple projects with multinational teams to ensure on-time completion and deployment.
• Expert in negative and positive testing; thoroughly familiar with the SDLC process.
• Experienced in analyzing project requirements and designing software applications using the Unified Modeling Language (UML). Executed SQL queries for back-end testing.
• Expert in Software Development Life Cycle, Product Development Life Cycle and Test Methodologies.
• Strong in coordinating and analyzing product design with customers, including related business concepts; able to extract key information quickly and efficiently.
• Excellent organizational and analytical skills.
Technical Skills
Operating Systems: Windows XP/NT/Server 2003, Windows Vista, UNIX.
Languages: SQL, UNIX shell scripting.
RDBMS: Oracle, Informix, Sybase.
Tools and Utilities: WinRunner, ClearQuest, Quality Center, DDTS (Distributed Defect Tracking System), Test Director, Quick Test Pro (QTP), Selenium IDE/RC, Sublime Text, Visio, FileNet P8, XML, and SharePoint.
Johnson & Johnson (Raritan NJ) UAT Technical Writer / UAT Tester Nov 2011 to May 2012
Worked on Johnson & Johnson work-streams for the design and construction of a validated ITIL infrastructure management application built on ServiceNow, a cloud-based service that automates enterprise IT and manages support workflows for the IRIS program. ServiceNow creates a single system of record for enterprise IT, ensures consistent system builds, and automates business requirements. Responsibilities included writing UAT test cases and test scripts.
• Created technical documentation for 8 process areas including CAPA, NC Management, Incident Management, Problem Management, Knowledge Management, Change Management, Configuration Management (CMDB), Service Request Management, an Operational Data Store and the main platform shared by all applications.
• Captured user requirements and defined functional specifications (total project scope: 8,500 specifications). Created and documented a style guide and documentation standards so that a team of writers working in individual process teams was able to produce consistent, high-quality, accurate specifications. Cooperated with Quality & Compliance, Project Leadership, Testing, and other leads to ensure that all project goals were met according to the active project plan.
• Maintained UAT test scripts with requirements traceability in Quality Center and SharePoint. Responsible for follow-up on defects with key stakeholders across the application team.
• Provided guidance at the project leadership level on GxP and quality concerns to help ensure appropriate resources were available to produce quality deliverables.
• Managed efforts to create various SDLC deliverables including User Requirement Specifications, Functional Requirement Specifications, Detailed Design Specifications, SOPs and Work Instructions, Validation UAT, Validation System Testing, Knowledge Management Articles.
• Performed impact analysis on a continual basis, greatly improving the project management team's ability to forecast and avoid issues and conflicts for the collective benefit of the project.
• Managed and prioritized work efforts across all work-streams to make optimal use of available resources and meet all project dates.
• Responsible for and participated in all deployment GO/NO-GO meetings.
• Responsible for end to end testing including UAT, functional and regression testing of all scripts.
• Performed analysis of Requirements Traceability Matrix (RTM) for accuracy against relevant SDLC deliverables.
• Collaborated with developers and architects in the creation of Detailed Design documents.
Environment: Windows XP, IRIS, ServiceNow, SharePoint, and Quality Center.
Markel (Middletown NJ) QA Tester Jul 2011 to Sep 2011
Markel Corporation is an insurance holding company that writes specialty insurance products and programs for a variety of niche markets through its insurance subsidiaries.
• Responsibilities included testing an insurance billing application and developing test plans for each requirement.
• Maintained and handled test plans in Quality Center. Uploaded test cases from Excel spreadsheets to Quality Center.
• Created and updated the latest results for new and existing issues (bug reports) in Quality Center.
• Responsible for system and regression testing of a web/Windows-based application.
Environment: Windows XP, Quality Center, QTP Test Tool.
AT&T (Middletown NJ) QA Tester Jun 2010 to March 2011
Worked on AT&T Online/Web Telephone Conference Services and their billing application designed to provide a bridge between two independent applications - Online Customer Provisioning System (OCP) and AT&T teleconference billing system (ATBS).
Roles and Responsibilities:
• Responsibilities included developing test plans, test procedures, and test cases in Quality Center for validating various IP teleconferencing and web conferencing products for billing.
• Responsibilities included accounts payable and receivable verification, ensuring the system generated billing information for each customer based on the contract and agreement.
• Testing was done against Sybase and SQL Server to retrieve, update, and modify data in the database and ensure data integrity.
• Responsible for system, regression, and performance testing of a web/Windows-based application. All backend testing was done in SQL against Sybase in a UNIX environment (a representative verification query is sketched after the Environment line below).
• Created and managed MRs in the SUNTOS system and followed up on new and existing MRs to retest and update their status in SUNTOS.
• Managed and coordinated onshore and offshore QA team members, accelerating the test case execution process to reach defined quality goals.
Environment: SQL, Sybase Server, UNIX, Windows XP, XML, Quality Center, Sublime Text.
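Illustrative sketch only: the kind of backend verification query described above, written in Python with pyodbc. The DSN, table names, and column names are hypothetical placeholders, not the actual OCP/ATBS schema.

    # Hedged sketch (hypothetical schema): check that every provisioned
    # conference in OCP has a matching billing record in ATBS.
    import pyodbc  # assumes an ODBC driver/DSN for the billing database is configured

    conn = pyodbc.connect("DSN=atbs_test;UID=qa_user;PWD=***")  # placeholder DSN/credentials
    cursor = conn.cursor()

    # Hypothetical tables: ocp_orders (provisioning) and atbs_invoices (billing).
    cursor.execute("""
        SELECT o.conference_id
        FROM   ocp_orders o
        LEFT JOIN atbs_invoices b ON b.conference_id = o.conference_id
        WHERE  b.conference_id IS NULL
    """)
    missing = cursor.fetchall()
    assert not missing, f"{len(missing)} provisioned conferences have no billing record"
    conn.close()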
AT&T (Middletown NJ) Sr. QA Tester Jun 2006 to Jun 2009
Worked on an AT&T enterprise web application as a Sr. System Tester for a new application start-up, ensuring the system was configured and working per the specified requirements.
Roles and Responsibilities:
• Managed and coordinated multiple applications hosted in an Apache and Tomcat web portal, including Adopt-Hosting, WHSD FileNet P8.0, GEOLink, ICDS, EFMS, IDCs, Cloud, and Synaptic Storage projects, in testing and production environments with national and international user groups. Conducted performance and regression testing. Proud to be part of a team that maintained an overall 95% on-time delivery record under tight deadlines.
• Executed and tested SQL queries for backend testing to retrieve and verify data.
• Managed and coordinated onshore and offshore QA team members, accelerating the test case execution process to reach defined quality goals.
• Estimated the QA members/resources needed based on project requirements and categorized the resources into manual and automation testing groups.
• Created and ran system and regression test cases in the Selenium IDE automation tool, using features such as clickAndWait, verifyTextPresent, assert commands to check values, and breakpoints, to ensure all IBM functions were captured and the scripts passed (a representative WebDriver equivalent is sketched after the Environment line below).
• Created and ran system and regression test cases in the QTP automation tool, using its key features to ensure all new and existing functions of the AT&T application were captured.
• Created traceability matrices covering requirements to design documents, design documents to test cases, and test cases to defects, and vice versa.
• Performance Testing and User Acceptance Testing (UAT) planning was done taking into account the changes/modifications made during functional/regression testing. Test cases were gathered from the test plan in Quality Center to create Performance and UAT test sets.
• Coordinated with product system engineers, developers, and architecture design teams to gather key requirements information, create workflows, and implement processes that improved system test plans for multiple projects. Participated in, and was responsible for, design/architecture specification compliance through detailed document reviews with senior team members.
• Managed and handled application system test plans, status reports, bug reports, and requirements documents in Quality Center.
• Investigated defect reports and assisted in conducting root cause analysis; escalated issues as required to ensure that customer needs were met and project deadlines achieved.
Environment: SQL Server, UNIX, Windows XP, Java, J2EE, XML, HTML, QTP, Quality Center, FileNet P8.0, and PRISM.
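Illustrative sketch only: the Selenium flow described above, rendered as Python WebDriver code equivalent to the Selenese commands named (clickAndWait, verifyTextPresent, assert). The URL, locators, and expected text are hypothetical placeholders, not the actual AT&T portal.

    # Hedged sketch: WebDriver equivalents of clickAndWait / verifyTextPresent / assert.
    from selenium import webdriver
    from selenium.webdriver.common.by import By
    from selenium.webdriver.support.ui import WebDriverWait
    from selenium.webdriver.support import expected_conditions as EC

    driver = webdriver.Firefox()
    try:
        driver.get("https://portal.example.com/login")            # placeholder URL
        driver.find_element(By.ID, "username").send_keys("qa_user")
        driver.find_element(By.ID, "password").send_keys("***")
        driver.find_element(By.ID, "submit").click()               # clickAndWait ~ click + explicit wait
        WebDriverWait(driver, 10).until(
            EC.presence_of_element_located((By.ID, "dashboard")))
        # verifyTextPresent / assert equivalent
        assert "Welcome" in driver.page_source, "expected welcome text not found"
    finally:
        driver.quit()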
Arbor Glen.Org (Somerset NJ) Sr. Quality Analyst Nov-2005 to Feb-2006
Responsible for new software maintenance with the finance department, covering all types of billing, accounts payable, and receivables.
Roles and Responsibilities:
• Managed and coordinated billing and accounts payable/receivable for all expenses of Arbor Glen, a non-profit assisted-living organization.
• Responsible for corporate-wide business processes for investigating and resolving billing issues with the Medical Manager, Add-On, and Life-Care software systems. Interfaced with clients, team members, and third-party vendors to coordinate efforts and implement business compliance.
• Responsible for installing new software called Add-On, making sure it was configured properly with the other systems and that all systems worked smoothly.
• Resolved uncovered database deficiencies to maximize productivity and profitability. Partnered with both Internal and External customers to verify corrective action.
Environment: Windows Server 2003, Windows XP, Medicare and Medicaid systems.
Lockheed Martin Inc. (Red Bank NJ) Sr. Software Engineer Jul-2000 to Oct-2005
Worked on Maneuver Control Systems (MCS) for the 2nd and 4th Infantry Divisions of the US Army; the program was used for military deployments in Iraq. Actively involved in testing software to ensure all systems were configured properly and worked according to the specified requirements; responsible for e-mail transfer and message transfer, including special messages involving user accounts and user permissions.
Roles and Responsibilities:
• Worked on Sun Ultra 10 systems in UNIX environments with Java/J2EE to identify and document all problems. Analyzed and interpreted customer requirements and established the best test plans and test procedures.
• Responsibilities included loading and installing segments weekly in the lab to create military environments. This involved client/server configuration with other hardware and software; also responsible for hub and router connections over TCP/IP. Updated defects, system test plans, and test-related documents in ClearCase.
• Participated in code walkthroughs to validate the application design against the written test cases. Provided a second review of the system architecture design to ensure it matched the lab designs and written requirements. Developed test plans according to requirements. Volunteered to update the architecture design document in Visio with team members' approval.
• Coordinated UAT and performance testing by working directly with the end user customers and several internal departments to ensure successful client relations.
• Created SQL queries to test data consistency against SQL Server 2003 and Informix databases; the test setup was done in a UNIX environment.
• Defects found in the application were sent to the developers through e-mail, and after a fix the same defects were reassigned to the tester for regression testing. Defect reports were generated daily or weekly based on priority. Bug tracking was done in Rational's ClearQuest tool.
• Managed a defense project with system installation in a client/server test lab at a US Army base facility. Performed integration testing and scheduled all test-related activities.
• Worked as part of a team with the top-level manager, developers, and team leader. Assisted in evaluating other software applications that were a part of, and supported, the MCS project with the FBCB2 box.
Environment: Windows NT, XP, 2000, 2003; Java, XML, JavaScript, Visual Basic, Informix, ClearQuest.
Hardware: FBCB2, ASAS, Ultra 10, Sun SPARC, Sun Solaris, SQL Server, ISO 9000.
VerticalNet.com (Horsham, PA) Test Engineer Feb-2000 to Jul-2000
Roles and Responsibilities:
• Worked to identify and document all problems on a C and C++ web-based application. Updated and checked account information. Performed analysis of requirements.
• Created Test Execution scripts and Test Environment set-up steps which were also used as training material for new members transitioning into the team.
• Executed and tested SQL queries for backend testing to retrieve and verify data.
• Carried out functional, regression, and performance tests for Client Server GUI Screens. Updated defects and system test plans and test related documents in Clear Quest.
Environment: Window NT, C, C++, XML.
Education & Certifications
• BA, India: Psychology and Sociology (Ravishanker University, India - CG).
• Middlesex College, NJ: Certificate in Technical Writing in Business.
• ITM Tec, NJ: Oracle/Unix; C Programming.
• Soft Vision, NJ: SQA-ROBO; Win-Runner 6.0; and X-Runner.
• Sun Microsystems, NJ: C++ and Object-Oriented Programming.
• Rational University, NJ: Clear Case Fundamentals for UNIX; Introduction to Rational Rose 98i.
• Ascend Training Institute, NJ: Microsoft System Administration; MCSA.
• SQE.com: Certified Software Quality Tester (eMastering Test Design).
• Online training in Agile (familiar with the Agile process).
• Status: US citizen