Nithya Muthyala ****************@*****.*** +1-847-***-****
Business and Quality Analyst with an outstanding record of delivering cost-effective, high-performance technology solutions to meet challenging business demands. Extensive qualifications in all facets of information systems methodology, from conceptual design through documentation, implementation, user training, quality review, and enhancement.
Results-focused, customer-oriented QA lead with professional expertise in test strategy design, test planning, estimation, and leading testing efforts for large, complex projects.
Seasoned IT professional with 7+ years of experience in the Software Quality Assurance vertical, specializing in validations across all phases of the Software Development Life Cycle (SDLC).
Broad experience in requirements analysis, gap analysis, SWOT analysis, 'to-be' process design, process management, and data conversion management.
Strong understanding of QA principles, QA processes, use cases, and SDLC (Agile and Waterfall).
Proven ability to lead complex testing projects from initial conceptualization through implementation in a global delivery model with both onshore and offshore testing.
Core group member responsible for transitioning a large project from Waterfall to Agile.
Extensive working experience in functional testing (web, Windows, and mainframe applications), black-box testing, smoke testing, system testing, and regression testing.
Strong experience working with STIBO applications including Workbench
Performed UI testing of native mobile applications on simulators, emulators, and real devices.
Strong ability to create and present reports using SQL
Extensively used SQL and PL/SQL to retrieve data from databases for data validations and comparisons during regression testing (a minimal illustrative sketch follows this summary).
Strong experience in mobile application testing on both iOS and Android platforms, including end-to-end (E2E) testing on LTE networks.
Performed UX, UI, and back-end validation against web services and Swagger-documented APIs.
Established risk-based testing and a defect management process for the offshore testing team using JIRA.
Worked with project management tools such as JIRA, TestRail, Zephyr, and Rally.
Over 2 years of experience with Riversand PIM, JDA, ATG, and Stibo STEP applications.
Built comprehensive, intuitive dashboards for daily status reports, defect analysis, speed-to-market analysis, and the steering committee.
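Illustrative sketch (referenced above): a minimal, hypothetical example of the kind of SQL-driven data comparison used during regression validation; the JDBC URL, credentials, and table names are placeholders, not client systems.

    // DataComparisonCheck.java - minimal sketch of a SQL-driven regression comparison.
    // The JDBC URL, credentials, and table names below are hypothetical placeholders.
    import java.sql.Connection;
    import java.sql.DriverManager;
    import java.sql.ResultSet;
    import java.sql.Statement;

    public class DataComparisonCheck {
        public static void main(String[] args) throws Exception {
            String url = "jdbc:oracle:thin:@//dbhost:1521/TESTDB"; // placeholder
            try (Connection conn = DriverManager.getConnection(url, "qa_user", "qa_pass");
                 Statement stmt = conn.createStatement();
                 ResultSet rs = stmt.executeQuery(
                     "SELECT (SELECT COUNT(*) FROM stg_member_feed) AS src_cnt, " +
                     "       (SELECT COUNT(*) FROM member_master)  AS tgt_cnt FROM dual")) {
                // Compare record counts between the source feed and the target table loaded
                // by the current release; a mismatch is logged as a potential regression.
                if (rs.next()) {
                    long src = rs.getLong("src_cnt");
                    long tgt = rs.getLong("tgt_cnt");
                    System.out.println(src == tgt
                            ? "PASS: counts match (" + src + ")"
                            : "FAIL: source=" + src + " target=" + tgt);
                }
            }
        }
    }

A count-level check like this is typically followed by column-level comparisons on sampled records.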
EXPERIENCE
Client: Florida Blue (BCBS) May 2019 - Present
QA Engineer/Lead
•Planned, created, maintained, and executed detailed test plans and test scripts, both automated and manual, to verify software functionality, load, and performance, among other areas.
•Worked closely with the development team to clarify business requirements and to ensure they were met.
•Suggested changes to ensure the accuracy of documented requirements; reviewed technical documentation such as user manuals, suggested improvements, and tested the documented procedures.
•Participated in daily and weekly meetings, leveraging ALM and Rally to maintain dashboards and provide real-time metrics to management.
•Led a team of 6 and built a positive, productive work team.
•Reviewed user stories and the backlog of the existing system and provided feedback from a QA standpoint to the team and management.
•Participated in daily scrum meetings following Agile methodology and provided day-to-day QA burn-down hours and task progress for each sprint using Rally.
•Established test methodologies and used HP Application Lifecycle Management (ALM) for defect tracking, defect reporting, requirements gathering, planning, analyzing results, and storing and executing automated and manual scripts; evaluated test processes and procedures and recommended improvements.
•Created test data by mocking up EDI 834 files for member maintenance using EDI X12 Add, Term, and Change (021, 024, 001) transactions, and created CIC files for plan changes.
•Validated DB2 tables after enrollment loads to verify data consistency across multiple systems.
•Validated web service integration with member portals using SoapUI, including schema verification, request and response validation, and database verification of the data received.
•Performed UX, UI, and back-end validation against web services and Swagger-documented APIs; validated member billing details after case or policy issuance, as well as daily and monthly feeds and reports.
•Worked on multiple projects in parallel; developed expertise in the 'Over65 age group' native mobile application.
•Validated the different payment methods available to members as well as the bills generated by the system.
•Tested member payments through the self-service portal web application, which uses RESTful web services, SoapUI, XML, DOM, and JSON (a minimal illustrative sketch follows this role's environment line).
•Read, understood, and correctly interpreted business, certification, and technical requirements; developed plans, strategies, and scripts to test them.
•Participated in sprint review sessions to ensure the business requirements were met.
•Participated in sprint planning and retrospective sessions after each sprint to ensure quality.
•Developed QA project schedules and QA plans.
•Developed a UAT reference document with step-by-step validations, helped ensure UAT testing went as smoothly as possible, and was responsible for obtaining sign-off from business users before deployment.
Environment: Rally, JSON, Visual Studio Code, native mobile apps, Swagger web services, mlp & mws native apps, iOS/Android platforms
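Illustrative sketch (referenced in the payment-portal bullet above): a minimal, hypothetical example of the kind of RESTful response validation performed; the endpoint, JSON field names, and expected values are placeholders, and the org.json library is assumed.

    // MemberServiceCheck.java - minimal sketch of a REST/JSON response validation.
    // Requires Java 11+ (java.net.http) and the org.json library; the endpoint and
    // JSON field names below are hypothetical placeholders, not the real service.
    import java.net.URI;
    import java.net.http.HttpClient;
    import java.net.http.HttpRequest;
    import java.net.http.HttpResponse;
    import org.json.JSONObject;

    public class MemberServiceCheck {
        public static void main(String[] args) throws Exception {
            HttpClient client = HttpClient.newHttpClient();
            HttpRequest request = HttpRequest.newBuilder()
                    .uri(URI.create("https://test.example.com/api/members/12345/billing")) // placeholder
                    .header("Accept", "application/json")
                    .GET()
                    .build();

            HttpResponse<String> response = client.send(request, HttpResponse.BodyHandlers.ofString());

            // Status-code check, then spot-check a couple of fields in the JSON payload
            // against expected test data (the kind of assertion also scripted in SoapUI).
            if (response.statusCode() != 200) {
                System.out.println("FAIL: unexpected HTTP status " + response.statusCode());
                return;
            }
            JSONObject body = new JSONObject(response.body());
            boolean ok = "12345".equals(body.getString("memberId"))
                    && body.getJSONArray("invoices").length() > 0;
            System.out.println(ok ? "PASS: member billing payload as expected"
                                  : "FAIL: payload did not match expected test data");
        }
    }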
Client: Bed Bath & Beyond - Union Jun 2017 to Apr 2019
Quality Assurance Analyst / QA Lead
The legacy item entry system (Riversand PIM) was to be replaced by Stibo PDM as part of an initiative to create an enterprise-wide product/item master. The legacy item master was JDA, and because different concepts had different JDA instances, there was no single enterprise master; item setup processes also varied across concepts. The new system brought all enterprise item data into one place and enforced uniform item setup and maintenance processes.
Led a six-person product quality team in developing and executing test scenarios and test cases, increasing productivity by 40%.
Sustained a 96% on-time completion rate for software tests by effectively managing assignments and schedules and balancing tester workload.
Developed creative testing methods drawing on embedded, integration, and system testing experience, resulting in more thorough software testing.
Utilized collaborative tools such as Slack and Trello to communicate project updates and requirements to third-party vendors.
Built comprehensive dashboards with near-real-time updates, including RAG (Red/Amber/Green) status, giving senior management and other stakeholders insight into project health and progress.
Implemented RAID logs to track Risks, Assumptions, Issues, and Dependencies, improving agility and visibility.
Developed and implemented quality assurance metrics such as 'QA Status' and 'Resolution Type' to measure defect validity, QA team performance, and completed assignments.
Generated and transferred performance statistics and reports using FTP on UNIX.
Ran brown-bag sessions and quiz programs to gather feedback on new releases and product enhancements.
Analyzed business requirements and software requirement specifications to create test plans and test cases for manual testing.
Held discussions with the development team to review code and release notes.
Ran workshops within the team to discuss testing strategy, design, execution, and continuous improvement planning.
Switched between Waterfall and Agile depending on the nature and impact of each project.
Built user stories in TestRail/Zephyr based on features and their supporting requirement documents.
Streamlined the QA process for every build release by starting with a smoke test to confirm build validity before moving to functional testing.
Optimized test cases based on product enhancements, improving reusability and eliminating re-scripting.
Established post-UAT 'dress rehearsal' sessions to ensure code migrations were validated in a near-real production environment before moving to production.
Drove development of automation and testing in a CI/CD environment.
Worked closely with e-commerce teams, buyers, and various vendors to understand the post-production experience.
Environment: JIRA Quality Management System, Confluence, TestRail, Oracle DB, Stibo STEP, Workbench, TIBCO, JDA, Riversand PIM, IBM mainframes, IE, Chrome, Safari, Windows 7, UNIX, IBM Lotus Notes
Client: Indus Group Inc. - Hoboken, NJ Apr 2016 to May 2017
Role: Business Analyst/Quality Analyst
Created test plans, test scenarios, and test cases, and performed test execution.
Involved in the development of system testing strategies, plans, cases, and conditions to ensure processes and products met standards.
Analyzed business needs, distinguished between needs and wants, and identified gaps between business needs and standard application functionality.
Heavily involved in QA activities and development support for various interfaces, including writing test plans and test cases for system, interface, and business requirements.
Extensively involved in various phases of manual testing.
Experienced in system, regression, and integration testing.
Developed test suites and services for testing the platform APIs.
Created SQL scripts for validation.
Extensively used SQL and PL/SQL to retrieve data from databases for data validations and comparisons during regression testing.
Prepared test plans, test cases, test scripts, and test data for both the application and database verification, based on functional requirements and test specs.
Solely performed API testing against different subsystems as part of back-end testing.
Responsible for creating test plans, test cases, and test cards based on use cases in the business requirements, covering both functional and non-functional requirements.
Created test cases using element locators and Selenium WebDriver (Java) methods (a minimal illustrative sketch follows this role's environment line).
Expertise in writing SQL queries to fetch test data from RDBMSs such as Oracle and SQL Server.
Assisted in identifying entity types to determine the customer KYC requirements to be requested from clients.
Assisted in creating and presenting informational reports for management based on SQL data.
Managed JIRA tickets and worked closely with the production support team on scoping tickets for upcoming releases.
Coordinated testing efforts among all the technical teams involved and set up comprehensive test data sets across different testing environments.
Responsible for performance test strategy design, performance test plan preparation, and presentations.
Environment: Manual testing, QTP, Agile, Firebug, FirePath, Quality Center, MS Word, MS Excel, SVN, PL/SQL, Windows 7.
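Illustrative sketch (referenced in the Selenium bullet above): a minimal, hypothetical example of a locator-driven Selenium WebDriver (Java) test case; the URL, locator values, and expected text are placeholders, not the actual application under test.

    // LoginSmokeTest.java - minimal sketch of a Selenium WebDriver (Java) test case
    // built around element locators. The URL, locator values, and expected text are
    // hypothetical placeholders; assumes chromedriver is available on the PATH.
    import org.openqa.selenium.By;
    import org.openqa.selenium.WebDriver;
    import org.openqa.selenium.chrome.ChromeDriver;

    public class LoginSmokeTest {
        public static void main(String[] args) {
            WebDriver driver = new ChromeDriver();
            try {
                driver.get("https://test.example.com/login"); // placeholder URL

                // Locate elements by id/name/css and drive a basic login flow.
                driver.findElement(By.id("username")).sendKeys("qa_user");
                driver.findElement(By.name("password")).sendKeys("qa_pass");
                driver.findElement(By.cssSelector("button[type='submit']")).click();

                // Simple verification step: confirm the landing-page header is shown.
                String header = driver.findElement(By.tagName("h1")).getText();
                System.out.println(header.contains("Dashboard")
                        ? "PASS: login landed on dashboard"
                        : "FAIL: unexpected landing page header: " + header);
            } finally {
                driver.quit();
            }
        }
    }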
Client: Sutherland Global Services - Hyderabad, India May 2012 to Feb 2015
Role: Business Analyst
•Analyzed business needs, distinguished between needs and wants, and identified gaps between business needs and standard application functionality.
•Involved in the development of system testing strategies, plans, cases, and conditions to ensure processes and products met standards.
•Worked with internal experts and senior client staff to develop and gather formal customer requirements, project plans, system designs, and end-user documents in an iterative development environment.
•Collaborated with the project team to define, build, and roll out innovative technology and business process solutions to increase efficiency and improve customer satisfaction.
•Led ongoing reviews of business processes and developed optimization strategies.
•Conducted meetings and presentations to share ideas and findings.
•Performed requirements analysis.
•Documented and communicated the results of analysis efforts.
•Effectively communicated insights and plans to cross-functional team members and management.
•Managed JIRA tickets and worked closely with the production support team on scoping tickets for upcoming releases.
•Coordinated testing efforts among all the technical teams involved and set up comprehensive test data sets across different testing environments.
Environment: SQL, JIRA, Quality Center, MS Word, MS Excel, Windows 7.
Education:
Master of Science in Information Technology, GPA: 3.8