
Quality Analyst

Location:
Edison, NJ
Posted:
October 17, 2018


Dharani Subramaniam

ac7etc@r.postjobfree.com

609-***-****

SUMMARY

A dynamic professional with 8+ years of experience as a Technical Consultant across multiple domains, focusing on Quality Assurance and Software Testing.

5 years of solid experience as a QA Lead in both manual and automated testing processes.

Experience in Content Management System (CMS) testing, including Adobe Experience Manager (AEM).

Experience with cross-browser, cross-platform, and mobile device testing.

Extensive experience with automation tools such as Selenium and QTP.

Extensive experience in all phases of the SDLC and in methodologies such as Agile (Scrum) and Waterfall.

Experience with bug tracking tools such as JIRA, HP QC, and Bugzilla.

Extensive experience in Scrum QA testing, Scrum integration testing, defect triage, Scrum stand-up calls, Scrum exit criteria, and other agile practices.

Experience in functional and regression testing across the Software Development Life Cycle.

Experience in SQL and database connectivity for back-end testing.

Hewlett-Packard certified tester in HP Quality Center/ALM and HP QTP/UFT.

Provided technical solutions by implementing, testing, and maintaining enterprise HP OSS and BSS products (HP OVO, HP OVPA, HP OMU) developed for monitoring, automating, and provisioning enterprise networks, systems, and applications.

Good understanding of all major operating system families (UNIX, Linux, and Windows).

Hewlett-Packard certified in Open View Operations (OVO) I 8.x UNIX and AIS - Universal Configuration Management Database v9.

Also worked on MYCOM NIMS-PrOptima, managing and implementing performance reports and geographical map views.

Strong communication, interpersonal, and analytical skills, with proficiency at grasping new concepts quickly and applying them productively.

EDUCATION

B.Tech (Bachelor’s Degree) in Information Technology, Anna University, India.

TECHNICAL SKILLS

Tools

Adobe Experience Manager,

MYCOM NIMS-PrOptima, HP OM, HP OVPM, HP OVR, HP UCMDB, HP SM

Testing Tools

Selenium, HP QC/ALM, HP QTP/UFT, Litmus, TestRail

DevOps Tools

Jenkins, GitHub

Defect Tracking Tool

JIRA, HP QC, Bugzilla, Mantis

Servers

UNIX (Solaris, AIX, HP-UX), Linux (Red Hat, SuSE), Windows 2000/2003/2008

Languages

Oracle PL/SQL, Shell/Perl Scripting, Java

DB

SQL

TRAINING AND CERTIFICATIONS

Hewlett-Packard Internally Certified QC/ALM

Hewlett-Packard Internally Certified QTP/UFT

Sun Certified Java Programmer (SCJP) 1.6

Hewlett Packard Certified Open View Operations (OVO) I 8.x UNIX

Hewlett Packard Certified AIS - Universal Configuration Management Database v9

Trained in Oracle PL/SQL and UNIX Shell/Perl Scripting

Trained in MYCOM Software

PROFESSIONAL EXPERIENCE

Bose Corporation - PCE Aug 2018 to Present

Client: Bose – Framingham, MA, USA

Role: AEM QA Lead

Tools: AEM, JIRA, Agile

Project Description: Bose's Personalized Customer Experience (PCE) program is committed to driving customer-centric value and enabling capabilities that allow Bose to deliver astonishing experiences at scale.

Responsibilities

Worked with cutting-edge technologies such as Adobe Marketing Cloud (AEM, Campaign, Analytics, Audience Manager, Target).

Worked closely with Concept Owner and provided inputs so they can manage the backlog, define concept roadmap, and conduct sprint planning, reviews, and retrospectives.

Assessed and understood business requirements and translated them into test cases and acceptance criteria.

Validated business requirements against the implementation through comprehensive testing

Delivered detailed test plans, test cases, test scripts, test reports, and other assets.

Ensured deliverables were of the highest quality to promote client satisfaction.

Acted as an Agile tester and played multiple roles in the team at times. Participated as a key member of the concept test team with a focus on collaboration, issue resolution, and task estimation.

Created thorough and detailed training documentation for business and functional users

Independently managed assigned tasks

Samsung.com Sep 2015 to Aug 2018

Client: Samsung SDSA – Ridgefield Park, NJ, USA

Role: AEM QA Lead

Tools: AEM, Selenium, Jenkins

Project Description: Samsung.com projects including the Shop site and Support re-design, developed on AEM.

Responsibilities

Led & managed quality assurance efforts for high-visibility Samsung.com projects such as the Shop site re-design (phases 1 & 2) and the Support re-design, developed on AEM.

Designed process flow and testing strategy for new features on Samsung.com, including the proprietary CMS system as well as front-end validation.

Created, prioritized, and tracked user stories in JIRA per business needs; led scrum meetings.

Coordinated & tested weekly bug fixes & new enhancements.

Tested author configurations and the UI display of AEM components and templates per the requirements, as well as workflows, tagging, and user permissions per the business requirements.

Authored pages with the developed components and tested them in both author and publish environments, providing digital content per the style guides.

Provided updated and accurate testing status to project management through status reports and meetings.

Tested API services using Postman and built deployments using Jenkins for testing.

Provided oversight of defects from point of discovery through resolution and business sign-off, in partnership with business stakeholders and project management, to record, prioritize, implement, and retest fixes for reported defects in a deadline-driven environment.

Prepared Test Plan, Test Scenarios and Test Cases and documented them in JIRA.

Responsible for GUI and functional testing.

Prepared Automation test scripts using Selenium WebDriver and TestNG.

Performed testing of backend transactions in the database using SQL queries.

Prepared weekly and monthly test reports, participated in desk checks with business analysts and developers, and provided test data for UAT testing. Created JIRA cards for all defects found during functional and regression testing.

Coordinated & led User Acceptance Testing with business users.

Participated in content authoring and validation of content in staging and production environments, and in regression testing across browsers, platforms, and mobile devices.
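
The back-end SQL transaction checks mentioned above can be illustrated with a minimal sketch using an in-memory SQLite database. The schema, table names, and order data below are hypothetical stand-ins, not the actual Samsung.com backend.

```python
import sqlite3

# Hypothetical schema and data standing in for the real backend; the table
# and column names here are illustrative assumptions, not the actual schema.
conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE orders (order_id INTEGER PRIMARY KEY, total_cents INTEGER);
    CREATE TABLE order_items (
        order_id INTEGER, sku TEXT, qty INTEGER, price_cents INTEGER
    );
    INSERT INTO orders VALUES (1, 149800);
    INSERT INTO order_items VALUES
        (1, 'TV-55', 1, 99900),
        (1, 'SOUNDBAR', 1, 49900);
""")

# Back-end check: the stored order total must equal the sum of its line items.
stored_total, computed_total = conn.execute("""
    SELECT o.total_cents, SUM(oi.qty * oi.price_cents)
    FROM orders o JOIN order_items oi ON oi.order_id = o.order_id
    WHERE o.order_id = 1
    GROUP BY o.order_id
""").fetchone()

assert stored_total == computed_total, "order total does not match line items"
```

The same query shape applies against a production database: join the transactional tables, aggregate, and compare against the stored value.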

MYCOM OSI Implementation Sep 2013 to Apr 2015

Client: Alcatel Lucent – NJ, USA

Role: Domain Lead

Tools: MYCOM NIMS-PrOptima, JIRA

Project Description: MYCOM's NIMS-PrOptima(TM) solution was implemented in Vodafone's Super NOC (Network Operations Centre) primarily to provide performance reports, analysis, end-to-end optimization, capacity management, automatic performance alarming, and automation across all network domains and technologies. The system processed very large volumes of performance, configuration, and services data and implemented correlation, analysis, reporting, and visualization modules via automation, workflow, and customization capabilities.

Responsibilities

Interacted with business analysts and developers on a daily basis for requirements analysis and gathering, testing, bug review, and documentation, for an application developed in an agile process with frequently changing requirements and feature sets.

Used agile methodology to ensure new features met customer requirements and acceptable quality assurance standards.

Managed the Core Domain, including multi-report management, chaining, and synchronization.

Built reports for the Core Domain containing tables, graphs, and charts that are dynamic and drill-down/roll-up enabled.

Created, implemented, and tested performance reports.

Conducted black-box testing, including GUI, functional, integration, regression, and UAT testing.

Developed and executed test cases manually to verify expected results.

Wrote positive and negative test cases for functional testing and executed them.

Efficiently distributed information to all organizational levels via a number of options, including web-based reporting.

Implemented correlation and dynamic drill-down/roll-up of cross-domain performance, configuration, fault, and other data.

Documented test procedures for release acceptance and regression testing of new builds

Identified and isolated software defects and reported them via JIRA

Actively participated in QA team meetings and discussions
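
The positive and negative functional test cases described above follow a simple pattern: inputs the system must accept and inputs it must reject. A minimal sketch, where the validation rule itself is a hypothetical stand-in, not an actual NIMS-PrOptima rule:

```python
import re

# Hypothetical field validator; the hostname pattern below is an
# illustrative assumption, not a real business rule from the project.
def is_valid_hostname(name: str) -> bool:
    # Lowercase letter first, then up to 62 lowercase letters/digits/hyphens.
    return bool(re.fullmatch(r"[a-z][a-z0-9-]{0,62}", name))

# Positive cases: inputs the system must accept.
for name in ["nodehost1", "web-prod-01"]:
    assert is_valid_hostname(name), name

# Negative cases: inputs the system must reject.
for name in ["", "1badstart", "UPPER-CASE", "has space"]:
    assert not is_valid_hostname(name), name
```

Keeping the two input sets side by side makes it easy to spot which boundary conditions (empty input, leading digit, case, whitespace) the negative suite actually exercises.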

USM-SOT Automation Dec 2012 to Aug 2013

Client: Telstra - Australia

Role: Test Lead

Tools: HP OMU, HP QC/ALM, HP QTP/UFT, SQL Developer

Project Description: USM Service Ordering Tool (SOT) is a web-based tool to submit and provision monitoring requests into the USM platform. It automates the current manual USM Design and Build (USM-D&B) process. USM-SOT captures all the business rules involved in this manual process and provides a system that greatly reduces implementation time and effort for any USM monitoring request. SOT follows a three-tiered architecture consisting of the User Interface Layer, the Business Logic Layer, and the Database Layer. SOT involves development activities at the different layers of the application, requiring effort for the design, development, testing, and rollout of new catalogue items in SOT.

Responsibilities

Responsible for QA & UAT testing of USM Automation (automation of the OM process on the client environment).

Worked with the Design & Build team to obtain requirements and test inputs.

Defined testing process documents, including the Test Strategy, Test Plan, and requirement understanding documents.

Performed test planning and test execution in Quality Center.

Set up test environment and created test data for both positive and negative test cases.

Developed SQL queries/procedures for database and back-end testing.

Created manual test cases that were then used to create automation test scripts.

Identified the components to be automated for the regression pack.

Scheduled overnight automated test execution and triggered emails with the test results to the project team.

Performed defect tracking and test reporting in HP Quality Center.

Conducted defect meetings with development and business analysts.
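
The overnight run-and-report pattern above can be sketched with Python's unittest: a scheduler (e.g. cron) runs the suite, the runner's output is captured, and a results email is composed. The tests, addresses, and subject line below are placeholders, and the actual SMTP send step is omitted.

```python
import io
import unittest
from email.message import EmailMessage

# Two stand-in tests; in the real project this would be the regression pack.
class SmokeTests(unittest.TestCase):
    def test_addition(self):
        self.assertEqual(2 + 2, 4)

    def test_string_check(self):
        self.assertTrue("SOT".isupper())

# Run the suite the way a nightly scheduler would, capturing the runner's
# output so it can be mailed to the project team.
stream = io.StringIO()
suite = unittest.defaultTestLoader.loadTestsFromTestCase(SmokeTests)
result = unittest.TextTestRunner(stream=stream, verbosity=2).run(suite)

# Compose the results email; the addresses are placeholders, and the
# smtplib send step is intentionally omitted from this sketch.
msg = EmailMessage()
msg["Subject"] = (
    f"Nightly regression: {result.testsRun} run, "
    f"{len(result.failures) + len(result.errors)} failed"
)
msg["From"] = "qa-automation@example.com"
msg["To"] = "project-team@example.com"
msg.set_content(stream.getvalue())
```

Capturing the runner output in a string (rather than letting it go to stderr) is what makes the full per-test log available as the email body.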

USM Agent Remediation – Agent Remediation V8 Nov 2011 to Nov 2012

Client: Telstra - Australia

Role: Data/Technical Analyst, SQA & CM Lead

Tools: HP OM, Borland StarTeam, HP QC

Responsibilities

Performed requirements determination and first-line data analysis.

Performed data analysis to determine the supported agent version (V8) on various OS platforms (22 flavours of Windows, UNIX, Linux, and AIX).

Identified and analysed the agent version/patch/hotfix details applicable to different OS flavours.

Estimated effort and duration for tasks related to the various OS platforms.

Reported progress of assigned tasks.

Created system and user documentation as required.

Assured the completeness, accuracy, and integrity of deliverables.

Participated in Work Product Reviews.

Communicated with the customer managers of the various server divisions and obtained their approval to implement the product on the servers they own.

Communicated any issues, concerns, or problems with the server approval process to the Project Manager for resolution.

Coordinated with the server managers to obtain approval for implementation, including prerequisite confirmation, implementation time, and downtime data.

Managed the implementation team to complete the implementation within the approved time period.

Had the team execute defined test plans and evaluate test results.

Identified and isolated software defects and reported them via HP QC.

Ensured test results were communicated and documented according to standards.

Ensured issues found during product implementation on servers were logged, monitored, fixed, and reported complete to the server managers.

Recorded metrics to monitor key testing criteria.

Reported progress of testing activity and any issues, risks, or hazards that could impact successful completion.

Maintained awareness of the policies, methods, and quality assurance standards applicable to the testing process and ensured they were applied consistently.

Agreed to and met delivery commitments under the direction of a Project Manager.

Ensured accurate time recording for all product implementation and project effort.

USM SYSEXT – Agent v11 Certification Aug 2011 to Oct 2011

Client: Telstra - Australia

Role: Agent Implementer

Tools: HP OM, HP QC

Project Description: This project certified the latest released version of the agent (v11) on the different OS flavours deployed in the Telstra environment. The latest versions of the OVO and OVPA agents were identified during the solution definition phase of the project and deployed on different OS flavours. Deployment involves the manual installation and configuration of an Open View Operations (OVO) HTTPS agent and a Performance (OVPA) HTTP agent.

Responsibilities

Performed requirements analysis to determine the supported versions (major and minor) of the agent.

Identified and analysed the agent version/patch/hotfix details applicable to different OS flavours.

Performed OS requirement checks as part of pre-installation of the identified agent.

Installed the identified agent, with the patch and hotfix of the supported version, for each OS.

Configured the installed agent with the management server.

Performed post-installation steps to ensure connectivity between the managed node and USM's OVO, OVPM, and OV Reporter management servers.

Prepared the deployment and test documentation.

Developed new scripts for regression and sanity testing and executed them.

Involved in test script reviews.

Verified test results and logged defects.

Mapped test cases and test scripts on the Requirements Traceability Matrix.

Identified and isolated software defects and reported them via HP QC
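
The traceability mapping above can be sketched as a simple coverage check over a Requirements Traceability Matrix; the requirement and test-case IDs below are made-up placeholders.

```python
# Minimal Requirements Traceability Matrix (RTM): each requirement ID maps
# to the test cases that cover it. IDs are illustrative placeholders.
rtm = {
    "REQ-001": ["TC-001", "TC-002"],   # covered by two test cases
    "REQ-002": ["TC-003"],             # covered by one test case
    "REQ-003": [],                     # no test case mapped yet
}

def uncovered_requirements(matrix):
    """Return requirement IDs that have no mapped test case."""
    return sorted(req for req, cases in matrix.items() if not cases)

# Flag any requirement the test suite does not yet exercise.
gaps = uncovered_requirements(rtm)
assert gaps == ["REQ-003"], f"uncovered requirements: {gaps}"
```

The same check scales to an RTM exported from a test-management tool: any requirement with an empty test-case list is a coverage gap to close before sign-off.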

USM Agent Remediation – Template Remediation V8 Jan 2011 - Jul 2011

Client: Telstra - Australia

Role: Template Certifier

Tools: HP OM, HP QC

Responsibilities

Identified OVO Agent 7.x templates, scripts, ECS circuits, and documentation that needed to be remediated.

Performed requirements analysis to determine the supported agent V8 templates for the corresponding agent V7 templates.

Identified the required templates for the various OS flavours.

Migrated OVO Agent 7.x templates, scripts, and ECS circuits to the identified OVO 8.x versions.

Certified the migrated templates on different OS flavours.

Prepared the user implementation document.

Developed unit/system/integration test plans and data.

Executed test plans and evaluated test results.

Coordinated and communicated the review of all unit tests.

Ensured test results were communicated and documented according to standards.

Ensured defects found in products were logged, monitored, fixed, and reported complete in HP QC.

USM Agent Remediation – Agent Certification V8 May 2010 to Dec 2010

Client: Telstra - Australia

Role: Agent Certifier

Tools: HP OM, HP QC/ALM

Project Description: USM provides a unified and consolidated application service monitoring system, event management and performance monitoring for all applications within Telstra and will contribute to Telstra meeting its capacity management and availability requirements for both customer facing and internal systems.

There are approximately 4000 deployed agents for which vendor support has expired. Agents are required to provide continued monitoring of Telstra’s customer-facing and revenue-generating applications. This project intends to replace the installed base of obsolete and unsupported server and application monitoring agents with the newest supported versions. The project will integrate USM’s agent release management with Telstra’s existing software distribution systems, to ensure future agent versions are deployed with much reduced effort.

Responsibilities

Performed requirements analysis to determine the supported agent version (V8) on various OS platforms (22 flavours of Windows, UNIX, Linux, and AIX).

Performed requirements determination and first-line data and causal analysis.

Identified and analysed the agent version/patch/hotfix details applicable to different OS flavours.

Estimated effort and duration for tasks to be performed on the various OS platforms.

Performed OS requirement checks as part of pre-installation of the identified agent on different OS platforms.

Installed the identified agent, with the patch and hotfix of the supported version, for each OS.

Configured the installed agent with the management server.

Performed post-installation steps to ensure connectivity between the managed node installed on the server and USM's OVO, OVPM, and OV Reporter management servers.

Prepared the system and user deployment documentation as required.

Reported progress of assigned tasks.

Assured the completeness, accuracy, and integrity of deliverables.

Participated in Work Product Reviews.

Developed unit/system/integration test plans and data.

Executed test plans and evaluated test results in Quality Center.

Coordinated and communicated the review of all unit tests.

Ensured test results were communicated and documented according to standards.

Ensured defects found in products were logged, monitored, fixed, and reported complete.

Ensured the testing environment was set up correctly.

Agreed to and met delivery commitments under the direction of a Project Manager.

Adhered to objectives set by the Project Manager.



