Software Development Test Automation

Location:
Gainesville, VA
Salary:
90K Annually
Posted:
January 25, 2024

Resume:

Saima Khan

Gainesville VA *****

571-***-**** (Cell)

Email: ad23je@r.postjobfree.com

OBJECTIVE

Seeking a position as a Software QA Test Automation Engineer / QA Analyst.

SUMMARY

• Strong knowledge of software development processes and methodologies (SDLC)

• Familiarity with testing distributed applications in large web-based environments

• Strong knowledge of SQA and testing philosophies and methodologies

• Ability to design and implement customized test fixtures

• Solid experience with manual testing, including test planning and execution

• Familiarity with the process for releasing a test set into production

• Highly experienced in developing automated tests using test tools and scripting languages

• Expert-level experience with STLC management tools such as Quality Center and Test Director

•Experience in authoring load, performance, and endurance test scripts in LoadRunner

•Experience in working with test automation frameworks, such as keyword and data-driven with Quick Test Professional

•Strong SQL, PL/SQL skills

• Solid analytical and problem-solving abilities

• Knowledge and experience working in an iterative/agile test process

• Knowledge of special testing needs relevant to mobile solutions, internet portals, and web-based applications (security, load, application servers, differences in browsers)

• Expert in industry-standard software development methodologies and life cycles

• Knowledge of test planning, product verification, product validation, and test automation across the implementation phases of the development methodology and life cycle

• Ability and desire to work in a spirited, collaborative environment

• Ability to identify and prioritize important tasks independently

• Self-motivated, willing to learn new concepts and technologies, and able to deliver quickly

TECHNICAL SKILLS

STLC Tools

Quick Test Professional, LoadRunner, Quality Center, Test Director, Selenium, JMeter, ALM Performance Center, Unified Functional Testing

Programming Languages

VB.NET, Java, Visual Basic, JavaScript, VBScript, HTML, XML

Application Software

Microsoft Visio, Excel, Word, PowerPoint

Databases

Oracle, Microsoft Access, Microsoft SQL Server

Tools

SQL*Loader, TOAD, SQL Analyzer, SQL Profiler

Operating Systems

Windows XP, Windows 2003, UNIX, Linux, MS-DOS

Others

Web Services, IE, Firefox, Opera

PROFESSIONAL EXPERIENCE

Bank of America, Richmond, VA

July 2020 – Present

Software Automation Tester

Job Description:

Perform quality assurance, quality control, and security tests for system designs, processes, and security features

• Production support: smoke testing of all production updates in various environments; script writing, test planning, and test execution, including automation with Selenium

• Use automated testing tools such as JUnit and Selenium to conduct system, integration, user acceptance, positive and negative, functionality, object, and regression tests

• Design, create, and customize scripts using various scripting languages and testing tools, such as JavaScript, Selenium with Java, JUnit, TestNG, and QTP 11, for data-driven network systems and other Java-based applications

• Worked as an Automation Tester responsible for the development and maintenance of automation frameworks, tools, and solutions; managed and coordinated onsite/offshore functional test efforts and automated functional testing

• Write and execute automation test scripts in UFT

•Participate in the automated testing tool vendor selection process. Conduct a Pros & Cons analysis of HP UFT

• Performed manual and lightweight Selenium IDE script-driven sanity, regression, and cross-browser testing to ensure consistency

• Create solutions to improve scripts by designing new functions, synchronization of threads and processes, and checkpoints

•Test system requirements for bugs and glitches using various web-based test management software such as ALM Quality Center 11.00

•Analyze system designs, requirements, and documentation to effectively develop test scripts, and test specific scenarios for required levels of security and quality-control testing

•Identify and resolve technical problems with systems by comparing newly designed project interface requirements with current interfaces in the mainframe-based legacy system

•Analyze and verify data requirements and layout reports for various database designs and other systems

•Collaborate with business users and customers to clarify system requirements to improve the user interface and the design and development of the system processes

•Work directly and independently with customers to perform usability testing to thoroughly review and test scripts

• Perform complex analysis and testing support for government clients by executing regression and system testing and manually integrating system improvements

•Analyze physical system designs to develop system test plans and outline an estimated timeline for test schedules

• Work closely with the development team to identify and resolve any system-related issues

•Configured Salesforce to align with specific business requirements, enhancing overall productivity.

•Customized page layouts, record types, and profiles to optimize user experience.

•Implemented and maintained data integration between Salesforce and other systems.

•Proficient in building Lightning pages, components, and apps to enhance user interaction.

•Leveraged Lightning App Builder to create custom dashboards and home pages.

•Integrated Salesforce with various external systems and applications using Postman APIs.

• Worked closely with POs and developers to gather requirements and prioritize feature testing.

•Developed end-to-end automated test scripts using Cypress and JavaScript to ensure the reliability and functionality of web applications.

• Implemented Cypress test suites to perform regression testing, resulting in reduced manual testing effort and improved release quality.

• Created custom Cypress commands and utilities to enhance test readability, reusability, and maintainability across multiple projects (a brief sketch follows this section).

•Stayed updated on the latest Cypress and JavaScript advancements and recommended relevant improvements to the testing process.

•Regularly updated and maintained existing Cypress test suites to keep pace with evolving application features.

•Executed regression tests to ensure new code changes did not introduce regressions.

•Conducted comprehensive cross-browser testing using Cypress to ensure compatibility with major browsers, including Chrome, Firefox, and Edge.

• Created a test automation framework with Cucumber and Selenium WebDriver.

• Converted manual test scripts to automated test scripts in Selenium WebDriver and JavaScript, and enhanced the scripts by adding user-defined functions (see the sketch after this section).

• Extensive experience in writing and implementing complex test plans and in the design, development, and execution of test scripts for system, integration, user acceptance (UAT), and regression testing.

•Administered and monitored AWS instances using AWS console and AWS CLI tools.

•Developed and implemented cloud-based solutions using Amazon Web Services.

•Collaborated with developers, architects, and other stakeholders to migrate legacy applications to the AWS cloud.
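
Illustrative sketch of the Cypress work described above (a hypothetical example, not client code): a reusable custom login command plus a small end-to-end spec that uses it. The routes, data-cy selectors, credentials, and environment variable name are placeholders.

// cypress/support/commands.js -- hypothetical reusable login command
Cypress.Commands.add('login', (username, password) => {
  cy.visit('/login');                        // baseUrl comes from the Cypress config
  cy.get('[data-cy="username"]').type(username);
  cy.get('[data-cy="password"]').type(password, { log: false });
  cy.get('[data-cy="submit"]').click();
  cy.url().should('include', '/dashboard');  // confirm the login landed on the dashboard
});

// cypress/e2e/accounts.cy.js -- minimal regression-style spec built on the command
describe('Account summary', () => {
  beforeEach(() => {
    cy.login('test.user', Cypress.env('TEST_PASSWORD'));
  });

  it('lists at least one account for a signed-in user', () => {
    cy.visit('/accounts');
    cy.get('[data-cy="account-row"]').its('length').should('be.gte', 1);
  });
});

The same spec can be exercised against Chrome, Firefox, or Edge by changing the --browser flag passed to the Cypress runner, which is how the cross-browser checks above would be driven.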
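
A minimal sketch of the manual-to-automated conversion pattern mentioned above, assuming the selenium-webdriver Node.js bindings; the URL, locators, and the typeInto helper are hypothetical.

// loginSmoke.js -- a manual login check rewritten as a Selenium WebDriver script
const { Builder, By, until } = require('selenium-webdriver');

// Hypothetical user-defined function of the kind added to converted scripts:
// wait for an element to appear, then type into it, as a single step.
async function typeInto(driver, locator, text, timeoutMs = 10000) {
  const element = await driver.wait(until.elementLocated(locator), timeoutMs);
  await element.sendKeys(text);
}

(async function loginSmokeTest() {
  const driver = await new Builder().forBrowser('chrome').build();
  try {
    await driver.get('https://example.test/login');           // placeholder URL
    await typeInto(driver, By.id('username'), 'test.user');
    await typeInto(driver, By.id('password'), process.env.TEST_PASSWORD || '');
    await driver.findElement(By.css('button[type="submit"]')).click();
    // The manual step "verify the dashboard loads" becomes an explicit wait.
    await driver.wait(until.titleContains('Dashboard'), 10000);
    console.log('PASS: login smoke test');
  } catch (err) {
    console.error('FAIL: login smoke test', err);
    process.exitCode = 1;                                      // non-zero exit so CI flags the run
  } finally {
    await driver.quit();
  }
})();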

JP Morgan Chase, Richmond, VA

May 2018 – June 2020

Quality Assurance Analyst

Job Description:

• Informed supervisor of important developments and obtained guidance and direction on individual assignments

•Represented the company through customer visits and consultation for the solution of technical problems

•Conceived ideas and developed testing events and actions for products to meet objectives

•Performed business analysis in accordance with established theories and methods

• Planned, designed, and conducted lab tests of developmental and competitive products

• Accountable for complete results on development projects and special functions within the assigned area

•Communicated technical results and information effectively both in written and oral form

•Developed and created master test plans and related documents, test cases, and test schedules

•Executed test cases and test scenarios across development projects

•Involved in functionality, user interface, regression, security, and UAT

•Identified and tracked defects, issues, risks, and action items

•Validated requirements for system testing, report preparation, defect recording, and defect tracking

•Performed regression testing to validate the resolution of any software or system defects

• Used Quality Center, a web-based test management tool, for centralized control over the entire testing life cycle

•Wrote and executed SQL queries to interpret test results and create test data

• Created, enhanced, and maintained a high-end object repository for various functional and regression tests using Quick Test Professional

•Executed written test case scenarios, including manual, automated, and data-driven regression testing, and GUI verification by using Quick Test Professional (QTP).

• Developed keyword-driven and data-driven framework test scripts using VBScript (an illustrative sketch follows this section)

• Worked on automation testing, creating scripts and using test frameworks to automate repetitive tasks and improve testing efficiency; responsible for development and maintenance of automation frameworks, tools, and solutions; managed and coordinated onsite/offshore functional test efforts and automated functional testing

• Wrote and executed automation test scripts in UFT

• Participated in the automated testing tool vendor selection process and conducted a pros-and-cons analysis of HP UFT
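
The keyword-driven and data-driven scripts above were written in VBScript with QTP; as a language-neutral illustration of the data-driven pattern only, here is a minimal Mocha-style sketch in JavaScript in which each row of a data table generates one test. The login() stub and the data values are hypothetical.

// dataDrivenLogin.spec.js -- data-driven pattern: one table row drives one generated test
const assert = require('assert');

// Hypothetical stand-in for the application call under test.
async function login(username, password) {
  return username === 'valid.user' && password === 'Secret123!'
    ? { status: 'ok' }
    : { status: 'denied' };
}

// The "data table"; in a QTP-style framework this would live in a spreadsheet or CSV.
const rows = [
  { name: 'valid credentials', user: 'valid.user', pass: 'Secret123!', expected: 'ok' },
  { name: 'wrong password',    user: 'valid.user', pass: 'oops',       expected: 'denied' },
  { name: 'unknown user',      user: 'ghost.user', pass: 'Secret123!', expected: 'denied' },
];

describe('login (data-driven)', () => {
  rows.forEach(({ name, user, pass, expected }) => {
    it(`returns "${expected}" for ${name}`, async () => {
      const result = await login(user, pass);
      assert.strictEqual(result.status, expected);
    });
  });
});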

GEICO, Cleveland OHIO

March 2015 - April 2018

Automation Test Engineer

Job Description:

• Translated user stories and collected test requirements into test cases and test procedures, with emphasis on automation testing and scripting

• Applied the set of operations and disciplines for the planning, analysis, design, and construction of information systems across a major sector of the organization

•Participated in test case coverage, test case design, and script design and reviews

• Developed test scripts while maintaining and enhancing the automated test framework supporting a continuous integration environment with automated smoke and regression testing (a minimal sketch follows this section)

•Ensured high test and code coverage, maintainability of scripts, reliability of equipment, and overall robustness of environment and solution during the entire development cycle

•Responsible for performing analysis of requirements, writing requirements verification points and providing feedback on requirement testability

• Responsible for having a thorough understanding of the project's test environment(s) and the project's policies for working in the test environment(s)

•Responsible for installing software into and upgrading test environments (hardware and software) including in-house applications and 3rd party applications

•Responsible for integration testing applications as appropriate to use on the internet portal

•Responsible for working with a team including development, system engineering and customer representatives in combined integration test efforts

•Involved in continuous support of overall software quality and testing with continuing refactoring of scripts and test cases as required and enhanced test coverage (system, performance, interoperability, stress, negative testing, etc.)

•Clearly logged defects, maintained test data and results, and monitored/analyzed automated test runs and reports

•Responsible for developing manual test cases in HP Quality Center and executing tests according to software test processes and procedures

•Responsible for developing automated test cases within Quick Test Professional and custom scripting as appropriate to the test case

•Used Quality Center to manage and organize STLC activities like Requirements coverage, Test Case Management, Test Execution Reporting, Defect Management, and Test Automation
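
A minimal sketch of one common way to keep smoke and regression suites separate for a continuous integration pipeline, assuming a Mocha-based JavaScript runner (the actual framework used here is not detailed in this resume); the tag convention, test names, and scripts are illustrative.

// checkout.spec.js -- tag suites so CI can run a fast smoke pass or the full regression set
const assert = require('assert');

describe('Checkout @smoke', () => {
  it('sums line items into an order total', () => {
    const total = [20, 5].reduce((sum, price) => sum + price, 0);
    assert.strictEqual(total, 25);
  });
});

describe('Checkout @regression', () => {
  it('applies a flat promotional discount', () => {
    const discounted = 100 - 15;           // hypothetical $15 promo
    assert.strictEqual(discounted, 85);
  });
});

// CI wiring via package.json scripts, so the pipeline chooses the pass it needs:
//   "test:smoke":      "mocha --grep @smoke"
//   "test:regression": "mocha"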

REFERENCES

AVAILABLE UPON REQUEST


