Srividya Koganti
Mobile: 903-***-****
**** ********* ** ************@*****.***
Chester Springs, PA - 19425
SUMMARY
Around 10 years of experience in manual software testing/quality assurance, business analysis, data and requirements analysis, and the Software Development Life Cycle (SDLC) and Software Testing Life Cycle (STLC) processes.
Excellent familiarity with all phases of the Software Development Life Cycle (SDLC).
Experience working in Agile/Scrum and Waterfall models, with a good understanding of Agile methodology.
Extensive experience in developing, executing and maintaining test plans, test scenarios, test cases, and creating the Traceability Matrix based on business requirements.
Experience working with testing tools such as Jira, X-Ray Test Management, Microsoft Test Manager, Automation Test Manager, Microsoft Visual Studio, and Team Foundation Server for test management purposes: writing test cases, configuring test scripts, and maintaining traceability by linking Functional Requirements (FRs), System Requirements (SRs), and Business Requirements (BRs) to test cases.
Experience with Veeva Vault, Confluence, MasterControl, Dot Compliance, and Box for uploading and approving test documents and for reviewing SOPs and standard template documents.
Experience in analyzing bugs and interacting with team members to resolve errors.
Experience in writing SQL queries in SQL Server to validate data during testing.
Experience in GUI Testing, Functional/System Testing, Integration Testing, Function Flow Testing, Load Testing, Automated GUI Testing, User Acceptance Testing (UAT), Regression Testing, End-to-End Testing, Front-End Testing, Black Box Testing, and Database Testing.
Extensive experience in testing Windows, Web & Mobile applications.
Experience in the design, development, and execution of Test Plans, Test Scenarios, and Test Cases for web-based and client/server applications.
Performed System Integration, End-to-End, and User Acceptance Testing for data services.
Used Jira, Assembla, MTM, and VSO for defect management.
Generated reports in Assembla and VSO based on the status of defects, test cases, and test scripts.
Prepared test environments by creating robust test data and preparing baseline documents.
Prepared clear documentation of all test results for validation with the client/business.
Prepared daily QC reports on test execution and defects.
Coordinated testing activities with the development team.
Results-oriented, dedicated, and self-motivated professional with excellent verbal and written communication, organizational, and interpersonal skills.
Comprehensive understanding of QA standards, flows, methodologies, procedures, and QA documentation.
Experience in implementing complex functional tests that require an understanding of the application logic and excellent problem analysis and bug reporting.
Self-motivated professional, capable of working independently or as part of a team.
Expert at analyzing and solving the most complex of problems.
TECHNICAL SKILLS
Languages:
C, SQL, PL/SQL, HTML, JavaScript
Databases:
Oracle 10g/9i, MS SQL Server 2008/2005/2000, MS Access 2003
Front End Tools:
SQL* Plus, TOAD, SQL Enterprise Manager, SQL Profiler, SQL Query Analyzer, SQL Server Management Studio
Methodologies:
SDLC, Agile, Waterfall
OLAP Tools:
SQL Server Reporting Services (SSRS), Business Objects XI 3.0
Operating Systems:
Windows 7, Windows Vista, Windows XP/2000/98, Windows Server 2003/2000, MS-DOS
Office Tools:
MS Word, Excel, PowerPoint, Visio, Outlook
Web Technologies:
HTML, XML, VBScript, CSS
Testing Tools:
X-Ray Test Management Tool, Microsoft Test Manager, Automation Test Manager
Defect Tracking Tools:
Assembla, Bugzilla, VSO, Jira
Project Management Tools:
Jira, Team Foundation Server, Visual Studio Team Services
PROFESSIONAL EXPERIENCE
Senior Software Test Engineer Mar 2023 – Present
Signant Health, Blue Bell, PA
Rater Station is an enhanced type of electronic Clinician-Reported Outcome (eClinRO) designed to improve interview quality while reducing administration and scoring errors.
Execute tests for all technology platforms.
Lead testing efforts on assigned projects in the pod.
Mentor team members and oversee the quality of deliverables for all projects in the pod.
Understand functional business processes across the entire organization.
Work closely with Global Technical Delivery staff to formulate the high-level testing solution and ensure that the testing strategy provides complete coverage of requirements and business processes.
Create and execute test cases, manage bugs and report testing status to project teams.
Communicate project status, bugs, blockers, and risks in daily stand-up meetings with the team for assigned projects.
Estimate the level of effort required to test each functionality and communicate the overall time frame for completing a study or change request to management.
Provide feedback and input to development on suggested unit testing approaches and techniques.
Review test cases and test scripts for quality and coverage.
Draft all required SDLC and validation documents.
Facilitate the review and approval of all SDLC and validation documents, as required.
Follow the defined validation process and produce validated documents for all initiatives.
Support User Acceptance Testing when appropriate.
Review UAT results and ensure quality and adherence to SDLC processes.
Contribute to the authoring and review of departmental procedures, testing process documentation, and work instructions, as necessary.
Self-manage the testing workload across multiple systems in various phases of development to meet deadlines.
Document traceability between testing and system requirements documentation.
Resolve all related tasks and requirements upon completion of testing, with appropriate objective evidence attached.
Manage various aspects of in-house testing documentation.
Participate in testing process improvement activities as needed.
Demonstrate total ownership, accountability, and commitment to testing deliverables.
Review overall timelines for testing tasks on assigned projects and support other testing resources.
Provide support to GTD staff on other internal initiatives as assigned.
Draft and review training materials for the testing team.
Train new team members on all testing-related activities.
Environment: SQL Server 2008, SQL Server Management Studio, ASP .NET, HTML, Jira, X-Ray Test Management Tool
Software Test Engineer II Mar 2019 – Feb 2023
Signant Health, Blue Bell, PA
Rater Station is an enhanced type of electronic Clinician-Reported Outcome (eClinRO) designed to improve interview quality while reducing administration and scoring errors.
Executed tests for all technology platforms.
Led testing efforts on assigned projects.
Developed an understanding of functional business processes across the entire organization.
Worked closely with Technical Delivery staff to formulate the high-level testing solution and ensured that the testing strategy provided complete coverage of requirements and business processes.
Created and executed test cases, managed bugs, and reported testing status to project teams.
Developed test tools and test solutions.
Reviewed project plans and estimated hours for testing tasks based on project scope.
Drafted all required SDLC and validation documents.
Facilitated the review and approval of all SDLC and validation documents, as required.
Followed the defined validation process and produced validated documents for all initiatives.
Supported User Acceptance Testing when appropriate.
Reviewed IQ, OQ, and UAT results and ensured quality and adherence to SDLC processes.
Tracked issues and resolutions throughout the testing process.
Self-managed the testing workload across multiple systems in various phases of development to meet deadlines.
Documented traceability between testing and system requirements documentation.
Managed various aspects of in-house testing documentation.
Participated in testing process improvement activities as needed.
Provided after-hours, on-call technical support as required.
Demonstrated total ownership, accountability, and commitment to testing deliverables.
Provided support to TD staff on other internal initiatives as assigned.
Environment: SQL Server 2008, SQL Server Management Studio, ASP .NET, HTML, Jira, X-Ray Test Management Tool
Software Test Analyst II Sep 2017 – Dec 2018
Almac Group, Lansdale, PA
Interactive technology and service solutions included the eClinical Interactive Voice and Web Response System (IVRS/IWRS/IXRS) for patient randomization, tracking, and clinical supply management; electronic patient-reported outcomes data collection; and web drug reconciliation. The base platform incorporated the basic clinical functionality common across most clinical studies to dynamically manage clinical supplies and patient interactions. Clinical-study-specific requirements were built on the base platform and customized based on trial-specific needs. Validation of a clinical study included testing of customized functionality in the web and voice platforms.
Reviewed new and modified customized requirements and wrote test cases for functionalities following a risk-based approach.
Developed a good understanding of how the base core functionality works and the ability to identify risks based on the customizations added.
Planned, executed, and documented testing of the full study system (IXRS), including manual and automated testing and tracking of errors to resolution.
Estimated the level of effort required to test each functionality and communicated the overall time frame for completing a study or change request to management.
Involved in creating manual and automated test cases to cover all possible positive and negative scenarios.
Linked requirements and change requests to test cases for better traceability.
Performed system, regression, and functional/performance testing of applications using both automated and manual testing methods.
Leveraged knowledge of automation to help validate software applications for internal projects.
Executed automated test plans, test cases and scripts and collected objective evidence.
Followed a logical, risk-based test approach for each assigned project.
Managed and prioritized workload and escalated conflicting work schedules to the Group Leader.
Coordinated the project test effort and communicated progress during project status meetings.
Logged bugs with detailed documentation in Microsoft Test Manager, reproduced them as needed, tracked resolution, and escalated issues when appropriate.
Communicated project test status to project team members and was responsible for the timely delivery of all test deliverables.
Tested web-based (IWR), voice-based (IVR), and locally deployed client/server projects.
Reviewed randomization specification requirements and verified that the internally generated dummy list met the requirement criteria.
Tested all custom requirements in each module and validated against the approved requirements.
Used Team Foundation Server to track all project related information for all releases.
Generated creation, execution and QC tasks for all the features to be tested in the release.
Used Microsoft Test Manager to write test cases covering all possible scenarios for customized features and linked the related requirements.
Used Automation Test Manager to create automation scripts for customized features and performed regression testing by executing them over multiple iterations.
Performed code deployments to Test and UAT environments using the Octopus tool.
Created the test data in Test Environment according to study specific requirements to test the various functional aspects of the clinical study.
Created the test data in IUAT Environment according to client requested scenarios to cover various functional aspects of the clinical study.
Upon completion of testing, resolved all related tasks and requirements with appropriate objective evidence attached.
Interacted with Project Managers, Design Managers, Developers and Group Leaders to understand custom functional requirements.
Performed problem-solving and root-cause analysis for system functionality challenges.
Environment: SQL Server 2008, SQL Server Management Studio, ASP .NET, HTML, Team Foundation Server, Microsoft Test Manager, Automation Test Manager, Web Services.
Software QA Tester Jan 2012 – Jun 2017
Y-Prime, PA, USA
The IRT and eCOA system is a home-grown, platform-independent application that provides one-step access to all clinical trial management needs through a seamless unified interface, both web-based and on mobile, to automate and connect key data. The base platform incorporated the basic clinical functionality common across most clinical studies to dynamically manage clinical supplies and patient interactions. Clinical-study-specific requirements were built on the base platform and customized based on trial-specific needs. Validation of a clinical study included testing of all study-specific functionality as well as certain basic functionality in both the IWR (web-based) and eCOA (electronic clinical outcome assessment) platforms.
Ensured the correct functionality of clinical trial software, with a focus on patient safety and the integrity of clinical study data.
Ensured that requirements were clear, testable, and consistent with the clinical trial protocol.
Reviewed study-specific requirements along with the design document as approved by the client.
Analyzed business requirements and application design documents (SRD) and created a Traceability Matrix interlinking test requirements and test cases per SOPs.
Reviewed Randomization specification requirements and dummy kit list.
Involved in documenting the risk assessment based on the protocol and requirements document.
Involved in creating the Validation Plan according to a risk-based testing methodology and risk-based change controls.
Wrote functional test plans from the IWR system requirements documents.
Created formal and scenario test scripts for all functional modules included in the clinical study.
Interacted with Business Analysts to keep functional requirements continuously up to date.
Manually tested all custom requirements in each module and validated against the approved requirements and design document.
Performed manual verification of all backend tables being impacted as a result of testing.
Created the test data according to study specific requirements to test the various functional aspects of the clinical study.
Performed both positive and negative testing for all possible scenarios.
Interacted with Project Managers, Validation Leads and Developers to understand custom functional requirements.
Performed problem-solving and root-cause analysis for system functionality challenges.
Accurately identified and communicated bugs and system enhancements to the development teams.
Utilized a defect tracking system to document defects and track resolution.
Created Test Summary Reports and captured test metrics upon completion of the validation effort.
Involved in mobile application testing with virtual mobile devices (Android, iPhone, HTC), including iOS (iPhone, iPad) app functional and UAT testing to record patient test data on mobile devices and verify that the results were captured in the database.
Performed regression testing on new code prior to deploying it into the test environments.
Used SQL Server Management Studio to execute SQL queries validating back-end table updates (see the illustrative query below).
Checked data flow and extensively used SQL queries to extract data from the database.
Involved in writing and modifying queries, stored procedures, and triggers using SQL Server.
Performed switch shake-out end-to-end regression in test environments prior to releasing to the functional testing teams.
Worked closely with developers to resolve bugs by helping them to replicate the same scenario in the development environment.
Involved in System Testing, Regression testing, and Black Box testing.
Performed defect tracking and reporting using department-approved tools such as Assembla and Microsoft Visual Studio (VSO).
Performed verification and validation of kit and randomization list loads prior to the system going live.
Involved in verification and validation of change requests in production environment.
Performed data creation and wrote scripts upon request for UAT (User Acceptance Testing).
Prepared the test environment by creating robust test data and preparing baseline documents.
Prepared daily QC reports on test execution and defects and communicated them to the team.
Participated in daily Scrum meetings and discussed defect status with developers.
Involved in testing complex functionalities like patient randomization, product inventory management, pooled material system, drug accountability and reconciliation and study reporting.
Involved in Data Transfer Integration testing.
Environment: SQL Server 2008, SQL Server Management Studio, ASP .NET, HTML, Microsoft Office SharePoint Portal Server 2007, Web Services, FileZilla
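Below is a minimal sketch of the kind of back-end validation query used during this testing; the table and column names (Patient, KitAssignment, and so on) are hypothetical and shown only for illustration, not taken from an actual study database.

    -- Hypothetical check: every randomized patient should have exactly one kit assignment
    SELECT p.PatientId, COUNT(k.KitId) AS KitsAssigned
    FROM Patient p
    LEFT JOIN KitAssignment k ON k.PatientId = p.PatientId
    WHERE p.RandomizationDate IS NOT NULL
    GROUP BY p.PatientId
    HAVING COUNT(k.KitId) <> 1;  -- any rows returned indicate a discrepancy to investigate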
Junior Business Analyst May 2010 – Dec 2011
Y-Prime, PA, USA
Translated high-level business and user requirements into detailed Functional Requirement Specifications.
Involved in analyzing business requirements and preparing Test Scenarios and Test data for testing purposes
Tested Business Objects XI 3.0 reports and validated values as per user requirements using the IMPACT application.
Wrote several SQL queries using TOAD to validate the BO reports against back-end data.
Involved in testing the BO reports during Service Pack upgrades.
Conducted test case reviews to ensure scenarios accurately captured business functionality.
Prepared status summary reports with details of executed, passed, and failed test cases.
Took part in requirements walkthrough meetings.
Participated in QA team meetings and bug tracking meetings.
Involved in documenting Test Scenarios, Test Scripts, and Test Results for business review.
Environment: Oracle 10g, TOAD, Microsoft Office SharePoint Portal Server 2007, Business Objects XI 3.0
EDUCATION
Master of Science, University of Texas at Tyler, Tyler, Texas. Dec 2009
GPA: 3.86/4.0
B.E. in Electronics and Communications Engineering, Jawaharlal Nehru Technological University, India. May 2006
GPA: 3.6/4.0
ACCOMPLISHMENTS
Honored with a gold medal by the Alpha Xi committee for achieving the highest GPA in the Master’s degree program.
CERTIFICATIONS
Microsoft Certified Technology Specialist (MCTS): SQL Server 2008, Business Intelligence Development and Maintenance
Designing and Implementing Databases with Microsoft SQL Server 2008
Certified SAFe 5 Practitioner: Scaled Agile, Inc.