Srilatha Ch
Cell # 703-***-**** Email: ***********@*****.***
Professional Summary:
•10+ years of professional experience in Information Technology with emphasis on Quality Assurance, Data Warehouse testing, and manual and automation testing of web and client/server based commercial applications.
•Well-versed with all stages of Software Development Life Cycle (SDLC) and Software Testing Life Cycle (STLC).
•Experience in Test Environment Setup and Test Infrastructure Development in both Manual and Automation.
•Experience in test automation programming using Selenium WebDriver.
•Extensively used Automated Testing Tools like Selenium for Functional and Regression Testing.
•Strong knowledge of backend service testing (REST) using Postman.
•Good exposure to requirements analysis, streamlining, and management.
•Experience in WCAG accessibility testing using axe DevTools.
•Well versed with various Testing Stages/Levels/Phases, Testing Types, Testing Techniques and Quality Work Products.
•Hands on experience in preparing Test Plans, Test Cases, Automated Test Scripts, Test Data and executing the same.
•Involved in the Analysis, design, testing and Implementation of Business Intelligence solutions using Data Warehouse/Data Mart Design, ETL, OLAP, Client/Server applications.
•Well versed with software testing concepts, SDLC, and Agile and Waterfall methodologies.
•In depth technical understanding of Data Warehousing, Data Validations, OLAP, SQL Server, Oracle and ETL.
•Hands-on experience in mobile and iPad/tablet testing on iOS and Android devices.
•Understanding and identifying testing scope based on Functional specifications & Use case documents.
•Used Agile Test Methods to provide rapid feedback to the developers significantly helping them uncover important risks.
•Extensive experience in reviewing Business Requirement Documents, Software Requirement Documents and preparing Test Cases, Test scripts and Execution.
•Experience in Black box testing with a complete QA cycle - from testing, defect logging and verification of fixed bugs.
•Expert in using test management tools – TestRail, Jira Xray, HP Quality Center, HP ALM.
•Experience in writing complex SQL statements and understanding and updating PL/SQL statements.
•Experience in integration of various data sources with Multiple Relational Databases like Oracle, SQL Server, MS Excel, MS Access, and Flat Files.
•Well versed in GUI application testing, Database testing and Front-end testing.
•Expert in Test plan, Test strategy, Test Case Design, Test Tool Usage, Test Execution, and Defect Management.
•Implemented test cases for functional, smoke, re-testing, regression, interface, performance, back-end, and user acceptance testing.
•Ability to understand and adapt quickly to new technologies and environments; good at analysis and troubleshooting, with excellent analytical, programming, written, and verbal communication skills.
Education:
•B. Tech (JNTU, Andhra Pradesh, INDIA)
Technical Skills:
Test Management Tools
Quality Center 11/10.0, ALM 11.52, JIRA, Xray, Confluence, TestRail
Automation tools
Selenium WebDriver/IDE, Web Services (SOAP, REST), Postman
Programming/Scripting Languages
SQL, Java, JavaScript, HTML, XML
ETL Tools
Ab Initio CO>Op 2.15, GDE 1.15, Informatica 8.6/8.1
Bug Reporting
Jira, Quality Center 9.2, Rational ClearQuest, Test Director, Bugzilla
Databases
Aurora Postgres, Oracle 11g/10g/9i/8i/7.x, MS SQL Server 2008
Reporting Tools
Power BI, Tableau, Business Objects 6.5
Professional Experience:
Cambium Assessment, DC Jan ’20 – Current
Sr. QA / Lead
The AIR Ways Reporting system provides information about the structure of AIR Ways and explains which assessments and students are included in AIR Ways reports.
Responsibilities:
•Performed the role of QA Lead, including resource planning, status tracking, and work distribution among the teams.
•Managed an offshore team of 8 testers and was responsible for status reporting on system and regression test activities.
•Participated in Agile planning, execution, review, and retrospective meetings.
•Designed comprehensive test strategies, plans, and test cases for assigned applications.
•Served as Subject Matter Expert (SME) for assigned products, ensuring data integrity across integrated systems.
•Analyzed use case requirements to gain detailed knowledge of the application.
•Worked closely with the DevOps team to incorporate test automation into the development process.
•Delivered clear and timely status updates to stakeholders and leadership.
•Participated in Daily Standups, Sprints and Demonstrations & Retrospectives.
•Participated in sizing sessions; created tests based on user stories and executed them within the sprint.
•Developed the testing plan/schedule and maintained the test matrix based on development completions and project timelines.
•Actively involved in End of sprint meetings to present demo on completed Stories with Product Owner.
•Documented test acceptance criteria and executed test sessions in Jira.
•Reviewed test scripts to ensure testing efficiency and code coverage, assisting programmers in resolving any problems; also responded to inquiries from users and technical support.
•Tested Complex ETL Mappings and Load rules based on business user requirements and business rules to load data from source to target tables.
•Identified the test cases to be automated and performed data-driven testing and GUI checkpoint validation using Selenium WebDriver.
•Performed regression testing on new builds and on every modification to the application using Selenium WebDriver.
•Tested web services (XML/SOAP) and RESTful services using the Postman tool.
•Performed data quality analysis using advanced SQL skills.
•Developed advanced SQL queries to extract, manipulate, and calculate information to fulfill data and reporting requirements, including identifying the tables and columns from which data is extracted.
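The Postman REST checks described above typically assert on status codes, required fields, and value ranges. A minimal offline sketch in Python — the endpoint, field names, and values are hypothetical stand-ins for a live service response:

```python
import json

# Canned payload standing in for a hypothetical GET /api/students/123 response.
raw = '{"status": 200, "body": {"studentId": "123", "assessments": [{"id": "A1", "score": 87}]}}'
resp = json.loads(raw)

# Postman-style assertions: status code, required fields, and score range.
assert resp["status"] == 200, "expected HTTP 200"
body = resp["body"]
assert "studentId" in body and "assessments" in body, "missing required fields"
for a in body["assessments"]:
    assert 0 <= a["score"] <= 100, f"score out of range: {a}"

print("all response checks passed")
```

In Postman itself the same assertions would live in a test script attached to the request; the point is the shape of the checks, not the transport.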
American Institutes for Research July ’17 – Dec ’19
QA Specialist-II
Responsibilities:
•Participated in Agile planning, execution, review, and retrospective meetings.
•Developed test strategies based on customer requirements, project constraints, and risk analysis to deliver high-quality products.
•Analyzed manual test cases for automation feasibility in the regression phase.
•Part of the automation development team.
•Worked on integrating Selenium automation scripts with Jenkins.
•Worked with the development team to create a suite of test data (both input files and expected results) that fully exercises data validation.
•Performed functional and automated web services testing.
•Worked on a BDD framework using Selenium.
•Involved in Web Services Testing using Postman
•Worked closely with the DevOps team to incorporate test automation into the development process.
•Worked with the Product Owner (PO) on a regular basis to develop a comprehensive backlog of user stories defining the work to be developed by the team.
•Added information into Rally, including user stories, velocity charts, burn-down charts, and sprint retrospective notes.
•Participated in daily standups, sprints, demonstrations, and retrospectives.
•Participated in sizing sessions; created tests based on user stories and executed them within the sprint.
•Developed the testing plan/schedule and maintained the test matrix based on development completions and project timelines.
•Actively involved in End of sprint meetings to present demo on completed Stories with Product Owner.
•Document Test Acceptance Criteria and execute Test Sessions in Rally
•Reviewed test scripts to ensure testing efficiency and code coverage, assisting programmers in resolving any problems; also responded to inquiries from users and technical support.
•Actively participated in system test environment setup which includes source data validation, database connections to multiple upstream and downstream users and new code deployment to system test.
•Performed data quality analysis using advanced SQL skills.
•Developed advanced SQL queries to extract, manipulate, and calculate information to fulfill data and reporting requirements, including identifying the tables and columns from which data is extracted.
•Reviewed testing execution and ensured project testing deadlines and milestones were met utilizing various reporting features on ALM.
•Logged defects and worked with the team to assign priorities.
•Prepared Dashboard Reports and produced QA reporting decks for project stakeholders.
•Utilized automation framework for Selenium test scripts and managed automated test databases in maintaining automation environments.
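The suite of test data described above — input records paired with expected results — is the core of data-driven testing. A minimal sketch in Python; the `validate_record` rule and field names are hypothetical:

```python
# Each case pairs an input record with the expected validation outcome.
cases = [
    ({"grade": 5, "score": 92}, True),    # valid record
    ({"grade": 5, "score": 120}, False),  # score above maximum
    ({"grade": -1, "score": 80}, False),  # invalid grade
]

def validate_record(rec):
    """Hypothetical data-validation rule: grade 1-12, score 0-100."""
    return 1 <= rec["grade"] <= 12 and 0 <= rec["score"] <= 100

# Run every case; collect any mismatch between actual and expected outcome.
failures = [(rec, exp) for rec, exp in cases if validate_record(rec) != exp]
assert not failures, f"unexpected results: {failures}"
print(f"{len(cases)} data-driven checks passed")
```

In practice the `cases` table would be loaded from the input files the team maintains, so new scenarios are added without touching test code.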
AA Software & Networking Inc Nov ’15 – June ’17
Sr. Test Engineer
Responsibilities:
•Analyzed the User/Business Requirements and Functional specifications documents as per the client’s requirement.
•Converted business requirements and design documentation into test design products: Test Scenarios, Test Cases and Test Scripts.
•Developed test plan adhering to specifications and business requirements
•Collaborated with client and business analyst regarding application functionality requirements
•Played an active role in the full SDLC and STLC for the project using Agile methodology.
•Responsible for defect tracking and reporting, and worked closely with the technology team to ensure high-quality and timely SIT releases.
•Assured that delivered software met all requirements and expectations of the final test plan.
•Designed test cases at various phases based on business flow and change requirements.
•Identified automation test cases from manual test packages.
•Maintained and enhanced test repositories to ensure accuracy and traceability.
•Enforced test management best practices and drove corrective actions when required
•Identified and resolved web service and user interface issues.
•Retested and re-evaluated applications after debugging.
•Participated in walkthrough and business requirements meetings.
•Prepared and executed SQL queries for Database verifications
•Performed sanity check testing whenever there was a new build and was responsible for communicating the results to the team.
•Extensively used the Microsoft Office suite (Word, Excel, and PowerPoint).
Tech Mahindra, India Aug ’13 – Oct ’15
DWH/ETL/Backend Tester
Responsibilities:
•Participated in business requirement walk through, design walk through and analyzed Business requirements.
•Involved in developing detailed test plan, Test summary report, test cases, Test scenarios and test scripts for Functional, Regression and Integration Testing.
•Reviewed Design documents, Requirements, Solution specifications and Data mapping documents to create and execute detailed test plans.
•Involved in end-to-end testing, starting from smoke and data quality testing through integration testing.
•Tested Complex ETL Mappings and Load rules based on business user requirements and business rules to load data from source to target tables.
•Tested the Informatica mappings by running the workflows.
•Actively participated in system test environment setup which includes source data validation, database connections to multiple upstream and downstream users and new code deployment to system test.
•Performed data mining and manipulations on stage tables to create the test data required for testing.
•Wrote several complex SQL queries for validating target data population based on derivation logic.
•Reviewed system use cases and functional specs with business and System Analysts.
•Performed regression testing to ensure the existing functionality is not broken with all positive, negative and reconciliation test conditions.
•Tested all the Autosys jobs to validate the functional business process flow.
•Ran shell scripts on UNIX to test any specific functionality.
•Tested the ETL process for both before and after data validation process. Tested the messages published by ETL tool and data loaded into various databases.
•Coordinated with downstream users for validating the data passed to downstream systems as a part of integration testing.
•Responsible for status reporting on system and regression test activities.
•Used Quality Center to track and report all system defects.
•Conducted peer reviews of code developed by other team members.
•Created Requirement Traceability Matrix to keep the project management informed of the progress of testing.
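The source-to-target SQL validations described above commonly take a "source minus target" form: any row returned is a load defect. A minimal sketch using SQLite as the host database — table names, columns, and the 1:1 load rule are hypothetical:

```python
import sqlite3

con = sqlite3.connect(":memory:")
cur = con.cursor()
# Stage (source) and warehouse (target) tables -- names are illustrative.
cur.execute("CREATE TABLE src_orders (id INTEGER, amount REAL)")
cur.execute("CREATE TABLE tgt_orders (id INTEGER, amount_usd REAL)")
cur.executemany("INSERT INTO src_orders VALUES (?, ?)", [(1, 10.0), (2, 20.0)])
# Under a 1:1 load rule the target should carry the same rows.
cur.executemany("INSERT INTO tgt_orders VALUES (?, ?)", [(1, 10.0), (2, 20.0)])

# Source-minus-target: any returned row failed to load correctly.
missing = cur.execute(
    "SELECT id, amount FROM src_orders EXCEPT SELECT id, amount_usd FROM tgt_orders"
).fetchall()
assert missing == [], f"rows not loaded to target: {missing}"

# Reconciliation count check between source and target.
src_n = cur.execute("SELECT COUNT(*) FROM src_orders").fetchone()[0]
tgt_n = cur.execute("SELECT COUNT(*) FROM tgt_orders").fetchone()[0]
assert src_n == tgt_n, f"count mismatch: src={src_n} tgt={tgt_n}"
print("source-to-target validation passed")
```

Against Oracle the same check is usually written with MINUS instead of EXCEPT; the EXCEPT and COUNT queries together cover both the content and reconciliation conditions mentioned above.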