
Quality Assurance Functional Testing

Location:
Chicago, IL, 60602
Posted:
May 18, 2024


Resume:

Rupesh Gautam

Email: ad5r8w@r.postjobfree.com

Phone: 773-***-****

Professional Summary:

Over 5 years of IT experience as a Quality Assurance Engineer, with hands-on experience in manual testing, functional testing, and Selenium automation of various business and web applications.

Strong knowledge of the SDLC and the Software Test Life Cycle; highly proficient in SQA methodologies for both Waterfall and Agile.

Hands-on experience in automation testing using Selenium.

Data validation using REST APIs and user interface APIs for multiple web-based applications.

Experience in creating test strategies, building automation test suites, automating test scripts, and validating test results.

Involved in data extraction from Teradata and flat files using SQL Assistant.

Used Jira for defect tracking.

Experience with UNIX commands and shell scripting.

Experienced in functional, performance, load, integration, system, regression, black-box, stress, smoke, recovery, and GUI testing.

Strong experience in preparing documentation and test environments, executing tests, and analyzing results.

Wrote various documents describing DVO functionality and how DVO should be used by the group, based on discussions with the team lead.

Involved in testing batch jobs using UNIX and SQL jobs.

Experience in testing reconcile (initial) and delta (daily) loads.

Involved in preparation of Requirement Traceability Matrix (RTM), Defect Reports, and Weekly Status Reports.

Experienced in using Oracle SQL*Loader for bulk loading data from flat files into database tables.

Proficient in defect management, including defect creation, modification, tracking, and reporting, using industry-standard tools such as Jira and Team Foundation Server.

Expert in writing complex SQL queries for back-end testing.

Experience in interacting with clients, business analysts, UAT users, and developers.

Technical Skills

Defect Tracking: Jira

Databases: SQL Server 2005/2008, Oracle, MySQL

Scripting languages: JavaScript, VBScript, HTML, XML

Application environments: C#, .NET Framework, .NET Web Services, SSIS/SSRS/SSAS, SOA, CSS, Java, J2EE, ADF

MS Suite/Project Tools: MS Office (Word, Excel, PowerPoint, Outlook), MS Visio, MS Project, MS SharePoint

Operating Systems/Products: Windows, UNIX, Linux

DWH/ETL Tools: SSIS, Informatica, SSRS, Cognos, Tableau

Professional Experience:

Advantasure (End Client), Implementation – HCL America, Inc., VA Nov 2020 – Present

QA Engineer / QA Lead

Roles & Responsibilities:

Involved in analyzing system design specifications and developing the test plan, test scenarios, and test cases to cover overall business needs and quality assurance testing.

Conducted smoke, regression, system, performance, end-to-end, and user acceptance testing.

Analyzed the Requirements from the client and developed Test cases based on functional requirements, general requirements, and system specifications.

Worked on MA and Comm files for the client application and business.

Exposure to working with medical, dental, and professional claim files for the clients.

Worked on validation of data from source to target as per the mapping document, including data flow as per the transformation logic and business rules.

Created detailed, comprehensive, well-structured test plans and test suites to perform functional and data-driven testing of web services.

Created test cases and automated them using Selenium WebDriver (an illustrative sketch appears at the end of this section).

Executed the automated test scripts and analyzed and validated errors in the log files.

Created and managed Excel sheets for test scenarios and test data for automation.

Tested and executed the CI/CD pipeline in Azure Data Factory.

Created triggers to automate testing of the ADF pipeline.

Validated the pipelines, storage containers, and process logs in Azure Data Factory.

Created test scripts to test and validate APIs using Swagger and Postman.

Worked on the file validation process using the company's internal file specifications, and validated ETL configurations and data transformations as per the ETL mapping from source to target tables.

Experienced in working in Agile Scrum and Waterfall SDLC methodology environments.

Tested complex SQL scripts for the Teradata database used to create the BI layer on the DW for Tableau reporting.

Verified session logs to identify errors that occurred during ETL execution.

Verified SQL job logs to identify errors that occurred during ETL execution, and debugged the error codes in SSIS packages to perform root-cause analysis of job failures.

Wrote various documents describing DVO functionality and how DVO should be used by the group, based on discussions with the team lead.

Tested Informatica ETL transformations, ETL mappings, and other ETL processes (DW testing).

Validated SSIS packages, executed stored procedures, and validated transformations in data and control flows.

Tested the business transformation rules and logic implemented in SQL jobs and ETL packages during data migration from source to target.

Created a test environment for the staging area, loading data from flat files and different databases by executing SQL jobs and stored procedures.

Worked with data analysts to implement Informatica mappings and mapping parameters, and created workflows, shell scripts, and stored procedures to meet business requirements.

Mocked up data sets to test business rules and logic, covering positive and negative scenario testing.

Involved in back-end testing for the front end of the Salesforce application using SOQL queries and the Oracle database.

Performed defect tracking and reporting in Jira, with a strong emphasis on root-cause analysis to determine where and why defects were being introduced in the development process.

Used the requirement module for requirements and the defect module for entering defects in Jira.

Experienced in Jira workflows such as creating test plans and test cases with the Jira plugin Zephyr.

Created batch processes using FastLoad and SQL to transfer, clean up, and summarize data.

Tested complex ETL mappings and sessions based on business user requirements and business rules to load flat files and RDBMS tables into target tables, and validated Cognos reports.

Worked on ETL mapping analysis and documentation of OLAP report requirements, with a solid understanding of OLAP concepts and the challenges of larger datasets.
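
For illustration, a minimal sketch (in Python) of the kind of automated Selenium WebDriver check referenced above; the URL, element locators, and expected header text are hypothetical placeholders, not details of the actual client application.

# Minimal Selenium WebDriver sketch: log in and verify a landing-page header.
# All locators and the URL below are hypothetical placeholders.
from selenium import webdriver
from selenium.webdriver.common.by import By
from selenium.webdriver.support.ui import WebDriverWait
from selenium.webdriver.support import expected_conditions as EC

driver = webdriver.Chrome()
try:
    driver.get("https://example.com/login")                      # hypothetical URL
    driver.find_element(By.ID, "username").send_keys("qa_user")  # hypothetical locator
    driver.find_element(By.ID, "password").send_keys("secret")   # hypothetical locator
    driver.find_element(By.ID, "login-button").click()

    # Wait for the landing page to render, then assert on the expected header text.
    header = WebDriverWait(driver, 10).until(
        EC.visibility_of_element_located((By.CSS_SELECTOR, "h1.page-title"))
    )
    assert header.text == "Dashboard", f"Unexpected header: {header.text}"
finally:
    driver.quit()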

Environment: SQL Server Management Studio 2016, Jira, Salesforce, PL/SQL, RabbitMQ, Microsoft Visual Studio 2010, Bitbucket, Bamboo, Microsoft Azure, SSIS, TIBCO.

Centene Health, St. Louis, MO Dec 2019 – Oct 2020

Test Engineer Analyst

Roles & Responsibilities:

Analyzed the Requirements from the client and developed Test cases based on functional requirements, general requirements, and system specifications.

Prepared test data for positive and negative test scenarios for Functional Testing as documented in the test plan.

Prepared Test Cases and Test Plans for the mappings developed through the ETL tool from the requirements.

Scheduled and automated jobs to run as batch processes.

Good experience in web-related technologies including HTML, XML, SOAP, JavaScript, ASP, VB, VB.NET, and ASP.NET.

Validated the archived jobs using DVO (Informatica Data Validation Option).

Experienced in configuration management using Visual Studio Team System (VSTS), Team Foundation Server (TFS), VSS and Subversion.

Experienced in working in Agile Scrum and Waterfall SDLC methodology environments.

Created regression test cases and automated them using QTP.

Tested complex SQL scripts for the Teradata database used to create the BI layer on the DW for Tableau reporting.

Verified session logs to identify errors that occurred during ETL execution.

Created test cases and a traceability matrix based on the mapping document and requirements.

Wrote Teradata SQL queries and created tables and views following Teradata best practices.

Verified the logs to identify errors that occurred during ETL execution.

Wrote several complex SQL queries for data verification and data quality checks on SCD Type 1 and SCD Type 2 tables.

Tested SCD Type 2 tables with an initial history load and incremental loads, and verified the inserts and updates of data (an illustrative check appears at the end of this section).

Reviewed the test cases written based on the Change Request document; testing was performed based on Change Requests and Defect Requests.

Wrote various documents describing DVO functionality and how DVO should be used by the group, based on discussions with the team lead.

Tested Informatica ETL transformations and other ETL processes (DW testing).

Tested Cognos reports to check whether they are generated as per the company standards.

Prepared Test Scenarios by creating Mock data based on the different test cases.

Performed defect tracking and reporting with a strong emphasis on root-cause analysis to determine where and why defects were being introduced in the development process.

Used the Test Plan module to write test cases and the Test Lab module to execute them in HP Quality Center/ALM.

Used the Requirements module for requirements and the Defects module for entering defects in HP Quality Center/ALM.

Created batch processes using FastLoad, BTEQ, UNIX shell, and Teradata SQL to transfer, clean up, and summarize data.

Involved in testing the batch programs by using the Autosys tool.

Involved in back-end testing for the front end of the application using SQL queries in the Teradata database.

Performed smoke or shake-out tests before a load was accepted for testing.

Validated various transformations of SSIS data flow and control flow.

Developed UNIX Shell Scripts for scheduling various data cleansing scripts and loading process.

Involved in testing the Cognos reports by writing complex SQL queries.

Provided management with weekly QA documents such as test metrics, reports, and schedules.
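
For illustration, a minimal, self-contained sketch (in Python) of the kind of SCD Type 2 data-quality check mentioned above, run against an in-memory SQLite table standing in for the warehouse dimension. The table and column names (dim_member, eff_start_dt, eff_end_dt, current_flag) are hypothetical.

# SCD Type 2 integrity checks: exactly one current row per member, no overlapping ranges.
import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE dim_member (
        member_id    TEXT,
        member_name  TEXT,
        eff_start_dt TEXT,
        eff_end_dt   TEXT,
        current_flag TEXT
    );
    -- history row closed out by an incremental load, plus the current row
    INSERT INTO dim_member VALUES ('M1001', 'Jane Doe', '2019-01-01', '2019-06-30', 'N');
    INSERT INTO dim_member VALUES ('M1001', 'Jane D.',  '2019-07-01', '9999-12-31', 'Y');
""")

# Check 1: every member must have exactly one current (open) row.
bad_current = conn.execute("""
    SELECT member_id, COUNT(*) AS current_rows
    FROM dim_member
    WHERE current_flag = 'Y'
    GROUP BY member_id
    HAVING COUNT(*) <> 1
""").fetchall()

# Check 2: effective-date ranges for the same member must not overlap.
overlaps = conn.execute("""
    SELECT a.member_id
    FROM dim_member a
    JOIN dim_member b
      ON a.member_id = b.member_id
     AND a.rowid < b.rowid
     AND a.eff_start_dt <= b.eff_end_dt
     AND b.eff_start_dt <= a.eff_end_dt
""").fetchall()

print("members with != 1 current row:", bad_current)   # expected: []
print("members with overlapping ranges:", overlaps)    # expected: []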

Environment: Oracle 11g, Informatica PowerCenter 9.5 (PowerExchange, Informatica DVO), ClearCase, Cognos 8.0 Series, Autosys, Teradata SQL Assistant, Agile, ClearQuest, SQL, PL/SQL, TOAD, UNIX, TFS, Putty, SharePoint, flat files, XML files, DB2, Netezza, Tableau (Desktop, Server), SOAPUI, QTP, HP Quality Center/ALM, VSTS, SQL Server 2012.

Emblem Health, NY Dec 2018 – Nov 2019

ETL QA Engineer

Roles & Responsibilities:

Designed and created test plans, test scenarios, and test cases for data warehouse and ETL testing.

Worked with the business team to test the reports developed in Cognos.

Responsible for translating business requirements into quality assurance test cases.

Reviewed test scenarios, test cases, and data warehouse test results.

Developed test scripts using SQL queries to validate data.

Prepared dashboards using advanced table calculations, parameters, calculated fields, groups, sets, forecasting, and hierarchies in Tableau.

Used Informatica Data Validation Option (DVO) and its large set of pre-built operators to build ETL tests with no programming required.

Prepared regression test plans, the Requirements Traceability Matrix (RTM), positive and negative test scenarios, detailed test scripts, test kickoff documents, a test scorecard for test progress status, test results, a release checklist, lessons-learned documents, and a regression test suite for future use.

Involved in error checking and testing of ETL procedures and programs using the Informatica session logs.

Tested ETL data movement from the Oracle data mart to the Netezza data mart on an incremental and full-load basis (an illustrative reconciliation sketch appears at the end of this section).

Used the application's import and export facilities to download/upload XMLs of failed test cases for re-verification.

Tested the batch scripts to automate the cube building.

Performed unit, web, and performance testing using VSTS 2005.

Hands-on experience in automation testing using QTP.

Generated reports using SOAPUI.

Responsible for testing initial and daily loads of ETL jobs.

Interacted with the design team to decide on the various dimensions and facts needed to test the application.

Wrote several complex SQL queries and PL/SQL stored procedures for validating Cognos reports.

Extracted data from Teradata to the target database using Informatica PowerCenter ETL and DTS packages.
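
For illustration, a minimal sketch (in Python) of a source-to-target reconciliation check of the kind used for initial and incremental ETL load testing. It compares row counts and a simple column checksum between two databases; the table and column names are hypothetical, and the in-memory SQLite databases below merely stand in for the Oracle source and Netezza target marts.

# Source-to-target reconciliation: row counts and a SUM checksum on a numeric column.
import sqlite3  # SQLite stands in here for the real source/target connections


def reconcile(src_conn, tgt_conn, src_table, tgt_table, amount_col):
    """Compare row counts and a simple checksum (SUM of a numeric column)."""
    def summarize(conn, table):
        cur = conn.cursor()
        cur.execute(f"SELECT COUNT(*), COALESCE(SUM({amount_col}), 0) FROM {table}")
        return cur.fetchone()

    src_cnt, src_sum = summarize(src_conn, src_table)
    tgt_cnt, tgt_sum = summarize(tgt_conn, tgt_table)
    assert src_cnt == tgt_cnt, f"Row count mismatch: {src_cnt} vs {tgt_cnt}"
    assert src_sum == tgt_sum, f"Checksum mismatch: {src_sum} vs {tgt_sum}"
    return src_cnt


# Self-contained demo: two in-memory tables standing in for the source and target marts.
src = sqlite3.connect(":memory:")
tgt = sqlite3.connect(":memory:")
for conn in (src, tgt):
    conn.execute("CREATE TABLE claims (claim_id TEXT, paid_amount REAL)")
    conn.executemany("INSERT INTO claims VALUES (?, ?)", [("C1", 100.0), ("C2", 250.5)])

print("rows reconciled:", reconcile(src, tgt, "claims", "claims", "paid_amount"))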

Environment: HP Quality Center/ALM, Informatica PowerCenter 9.5 (PowerExchange, Informatica DVO), Rational ClearCase, Rational ClearQuest, Teradata SQL Assistant, Teradata V2R6, QTP, Agile, Rational RequisitePro, Autosys, Rally, TOAD, Putty, Tableau, XML, VSTS, Oracle 11g/10g, SQL Server 2012, SSIS, SSRS, SOAPUI, TIBCO, PL/SQL, Linux.


