
Manager Digital Marketing

Location:
Plano, TX
Posted:
July 12, 2020


Delwar Hossain

Plano, Texas ● 469-***-**** ● *********@*****.***

Accomplishments

Experience in manual and automated testing of the GUI and functional aspects of client-server and web-based applications across multiple levels of the SDLC and Software Testing Life Cycle (STLC).

Expertise in requirement analysis, code design, and production implementation of web and client-server applications in a variety of technologies.

Good knowledge of object-oriented programming (OOP) concepts and Python skills.

Good experience in creating test scripts using WebDriver, Selenium RC, Selenium IDE, Pytest, and Selenium Grid in Python.

Good experience in data-driven and hybrid testing.

Experience in functional, regression, and system testing using manual and automated testing tools, including Selenium WebDriver.

Used Firebug to inspect the XPath of web elements and the PyCharm IDE for script development.

Experienced in providing efficient locator strategies, such as XPath and CSS selectors, to run WebDriver scripts under stable conditions. Good experience in Robot Framework and reporting tools.

Worked on cross-browser testing with browsers such as Mozilla Firefox, Google Chrome, and IE using WebDriver.

Well versed in the Agile process.

Proficient in manual testing of desktop, web, and mobile native and hybrid applications on both iOS and Android devices, and in automation testing using the Appium automation tool.

Experience in analyzing Business, Functional and Technical Specifications

Experience in writing Test Plans, Test Cases, Test Procedures and Test Scripts from requirements and Use-Cases

Expertise in performing different types of Testing: White Box (Unit Testing, Integration testing), Black Box, Smoke, Functionality, Integration, Stress, Volume, System, Performance, Regression Testing and Full Life Cycle Testing.

Experience in Software Verification and Validation based on Testing Methodology.

Proficient in Functional Testing tool Quick Test Professional (QTP/UFT) and the various frameworks in QTP/UFT.

Experienced in giving training to the functional testers to execute Test Cases using QTP/UFT as part of Regression Testing.

Working knowledge of UNIX, SQL, Python, and Windows platforms.

Responsible for testing both front-end and back-end processes in Oracle E-Business Suite 11i.

Managed Oracle Apps Testing and test planning

Documentation, integration, and conversion of data from non-Oracle legacy systems into Oracle EBS.

Responsible for testing both front-end and back-end processes in Oracle EBS R12.

As an Oracle Financial QA, worked on modules such as Fixed Assets, Procure to Pay, Accounts Receivable, General Ledger, and Inventory, with primary involvement in eAM (Enterprise Asset Management) in Oracle EBS R12.

Ability to work in fast-paced, deadline-driven environments.

Excellent interpersonal and customer-relations skills.

Experience working in domains such as Insurance, HR & Payroll, Banking, Healthcare, Retail/e-commerce, and Mortgage.

Sound technical knowledge, excellent exposure, and the ability to learn any tool quickly.

Experience in Oracle, SQL server

Expertise in testing REST APIs (Postman/Newman) and SoapUI (SmartBear).

Expertise in architecting manual testing strategies to exercise negative paths of critical business processes.

Practical comprehension of Agile (Kanban and Scrum), Waterfall and V-Model

Practical knowledge of code walkthroughs and test fixture development for classes under test in the Test-Driven Development methodology.

Extensively used Quality Center / Test Director for Test Planning, Test Designing, Test Analysis, Test Execution, Defect Tracking and Test Evaluation.

Experience in leading projects as Technical Lead in multiple phases of the SDLC

Experience in training team in processes, methodology and utilization of automation frameworks

Responsible for interacting with business partners to identify information needs and business requirements for developers

Exceptional analytical and problem solving skills.

Team Player with the ability to communicate effectively at all levels of the development process.
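The Selenium WebDriver scripting and locator-strategy experience described above typically centers on the Page Object Model, where each page's locators and actions live in one class so tests stay short and readable. A minimal sketch: the LoginPage class, its locators, and the field names here are hypothetical examples, and the driver can be any object exposing Selenium's find_element(by, value) interface.

```python
# Minimal Page Object Model sketch in Selenium style.
# LoginPage and its locators are hypothetical; the driver is any
# object that implements find_element(by, value).

class LoginPage:
    # Locator tuples use the same (strategy, expression) shape as Selenium's By
    USERNAME = ("css selector", "#username")            # hypothetical locator
    PASSWORD = ("css selector", "#password")            # hypothetical locator
    SUBMIT = ("xpath", "//button[@type='submit']")      # hypothetical locator

    def __init__(self, driver):
        self.driver = driver

    def login(self, user, password):
        # Each step delegates to the driver, so tests never touch raw locators
        self.driver.find_element(*self.USERNAME).send_keys(user)
        self.driver.find_element(*self.PASSWORD).send_keys(password)
        self.driver.find_element(*self.SUBMIT).click()
```

Because the page object only depends on the find_element interface, it can be exercised with a fake driver in unit tests and with a real Chrome or Firefox WebDriver in the regression suite.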

SKILLS

Testing Tools: Selenium 3.x, Robot Framework, Selenium WebDriver, Pytest, Atlassian Jira, Zephyr, Xray, HP ALM, UFT, Microsoft Test Manager, VSTS, Gatling, JMeter, BlazeMeter, Postman, Newman, SmartBear SoapUI, LoadRunner, TestRail, Eclipse, PyCharm, ChroPath, Atom, Sublime Text, Notepad++, Katalon Studio, Fiddler, PuTTY, WinSCP, HTML, CSS, XML, XPath

Programming Languages: Python, JavaScript, VBScript, CL, JCL, .NET/C#

Databases: Elasticsearch, SQL Server, PostgreSQL, SQLite, MySQL, MongoDB, Oracle, DB2

Web Technologies: AngularJS, Node.js, Python Flask, XML, PHP, Web Services, REST API, SOAP API

MS Office Tools: Word, Excel, PowerPoint, and Visio

Automation Technology: Page Object Model, Behavior-Driven Testing, Automation Acceptance Testing, Keyword-Driven Framework, Data-Driven Framework, Hybrid Automation Framework, Test-Case-Driven Framework, TDD

CI/CD: Git, GitHub, Git Bash, Jenkins, GitLab, Kubernetes, Virtual PC
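The keyword-driven and data-driven frameworks listed above share a common pattern: test steps are expressed as rows of keywords and arguments (often maintained in Excel) and dispatched to small action functions. A minimal sketch, with hypothetical keywords and an in-memory context standing in for a real WebDriver session:

```python
# Keyword-driven test sketch. The keywords, rows, and context are
# hypothetical; in practice the rows would be loaded from a spreadsheet
# and the actions would drive a browser.

def open_url(ctx, url):
    ctx["url"] = url                        # stand-in for driver.get(url)

def type_text(ctx, field, text):
    ctx.setdefault("fields", {})[field] = text   # stand-in for send_keys

def verify_field(ctx, field, expected):
    assert ctx["fields"][field] == expected, f"{field} mismatch"

# Keyword name -> action function; new keywords extend the table
KEYWORDS = {"open": open_url, "type": type_text, "verify": verify_field}

def run_test(rows):
    """Execute rows of the form (keyword, *args) against one shared context."""
    ctx = {}
    for keyword, *args in rows:
        KEYWORDS[keyword](ctx, *args)
    return ctx
```

Swapping the data source (Excel sheet, CSV, database table) without touching the dispatcher is what makes such frameworks both data-driven and keyword-driven at once.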

CERTIFICATIONS

Certified Software Test Professional-CSTP (IIST)

Certified Test Manager-CTM (IIST)

Certified Scrum Master-CSM (Mountain Goat)

EDUCATION:

Post Baccalaureate - 2001

Virginia Commonwealth University, Richmond, VA

Major – MIS

Bachelor’s Degree - 1997

West Virginia University Tech., Montgomery, WV

Major – Graphic Communications

EXPERIENCE

Senior QA Automation Developer (Functional, API and Performance) Oct. 2015 – Present

VPay, Inc

Plano, TX

Built smoke, BVT (build verification test), functional automation, and regression scripts using Selenium/Python and Robot Framework.

Reviewed and analyzed the system requirement specs and understood the control flow of the whole system.

Followed the Agile methodology to accommodate continuously changing requirements.

Performed ad-hoc/exploratory testing prior to automating tests on the application.

Developed reusable code to share functionality across different tests, making them easier to maintain.

Created functional automation scripts using Selenium 3.x

Actively participated in writing Test Plan and Test cases and creation of automation framework.

Worked with deployment team to create nightly automation regression execution builds.

Trained and helped other testers on Selenium/API testing tools.

Created test scripts (JMeter) to monitor response time during the development phase.

Prepared test cases, VuGen scripts, load tests, and test data; executed tests, validated results, managed defects, and reported results.

Interface with developers, project managers, and management in the development, execution and reporting of test automation results

Identify and eliminate performance bottlenecks during the development lifecycle

Shifted roles based on Agile testing needs (manual testing, automation, load testing).

Participated effectively in cross-platform and cross-browser testing of all responsive-design websites on mobile phones and tablets, verifying that page layouts met the design specifications and identifying key issues.

Performed both manual and automated tests using Selenium and Appium on mobile devices covering Android phones, iPhones, Android tablets, iPads, and Windows phones, and on desktops including Windows and Macintosh.

As a tester, ensured that daily automated regression testing ran and completed successfully.

Performed UI and functional testing on Android and iOS devices (smartphones, tablets) using Katalon Studio.

Worked closely with the developers within the Scrum/Kanban team.

Tested RESTful APIs using Postman/Newman.

Created selenium automation scripts in Python.

Supported agile testers in the creation of test scripts.

Established a common approach for cross-platform and cross-browser testing.

Built and maintained a Selenium regression test suite.

Involved in the tool evaluation process.

Used Excel to create data-driven and keyword-driven scripts.

Involved in setting up the testing lab for the automation system.

Designed the framework to be reusable for future versions of the project.

Applied test automation framework on CI/CD process using Jenkins.
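The REST API checks described above with Postman/Newman can also be expressed as a small Python validator: assert on the status code, then on required payload fields. In this sketch the expected status and field names are hypothetical, and the response values are passed in directly rather than fetched from a live endpoint, so the check itself stays unit-testable.

```python
# Sketch of a Postman/Newman-style response check in plain Python.
# The required fields and expected status are hypothetical examples;
# in a live suite the inputs would come from an HTTP client call.

def check_response(status_code, payload, required_fields):
    """Return a list of problems; an empty list means the response passed."""
    problems = []
    if status_code != 200:
        problems.append(f"expected HTTP 200, got {status_code}")
    for field in required_fields:
        if field not in payload:
            problems.append(f"missing field: {field}")
    return problems
```

Collecting problems into a list instead of failing on the first one mirrors how Newman reports every failed assertion per request, which makes triaging a broken build faster.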

Quality Assurance Analyst Oct. 2012 – Jul. 2015

StoneEagle Insurance Systems

Richardson, TX

Gathered the requirements and compiled them into a test plan.

Prepared test cases, load tests, and test data; executed tests, validated results, managed defects, and reported results using Apache JMeter.

Developed functional testing plans, performed end-to-end phases of test execution, and built automation scripts using QTP 11.

Solid working experience with the TestRail testing tool.

Utilized TestRail (a test case management tool) to capture and execute all test cases and test suites.

Created, modified, and executed test cases for each functionality based on the product functional specification in TestRail.

Managed automated regression and enhanced scripts.

Understanding application architecture and overall technology landscape

Documenting functional, regression, and acceptance testing procedures

Refining full functional test plans and developing Quality Assurance test plans

Tracking defects found in the software release phase and developing test data to be used in the software development life cycle.

Estimating work load requirements and developing standard test strategy plans

Developed and executed functional test plans and test cases

Provided support in all phases of functional test execution and documentation

Identified bugs and monitored defect tracking systems

Communicated test results and performed tracking of non-testable software

Performance Consultant July. 2012 – Sept. 2012

Metlife, Irving, TX

Gathered the requirements and compiled them into a test plan.

Followed Agile Methodologies (scrum)

Responsible for implementing LoadRunner-based infrastructure, including architecting the load testing infrastructure and integrating hardware and software with LoadRunner.

Prepared test cases, VuGen scripts, load tests, and test data; executed tests, validated results, managed defects, and reported results.

Created next-generation usage pattern analyses from production logs to generate performance load.

Performed optimization; advised on overall BI infrastructure, ETL, and BI (front-end) tools; performed BI reporting.

Gathered and finalized specs, defining business and functional requirements for BI reporting.

Identified real-world scenarios and day-in-the-life performance tests.

Performed complex usage pattern analysis.

Used LoadRunner to define performance requirements, such as SLAs, in tests.

Interface with developers, project managers, and management in the development, execution and reporting of test automation results

Identify and eliminate performance bottlenecks during the development lifecycle

Accurately produce regular project status reports to senior management to ensure on-time project launch.

Performed black-box, white-box, performance, regression, and validation testing during the testing life cycle of the product release.

Participated in Integration, System, Smoke and User Acceptance Testing.

Wrote User Acceptance Test (UAT) Plans and User Acceptance Test Cases.

Verify that new or upgraded applications meet specified performance requirements.

Lead QA Engineer/QA Automation Engineer Oct. 2011 – Jul. 2012

Dealer Track (Texas), Dallas, TX

Project: NADA, eCL, iOS

DealerTrack eCarList is one of the fastest-growing solution sets in the automotive retail industry, with a focused array of award-winning products including vehicle inventory management, merchandising, appraisal, pricing, mobile software, dealership health reporting, CRM, custom web design, and digital marketing solutions via an integrated online platform.

Assist in the development of a Keyword Data Driven Automation Framework by developing the backend functional web control library

As Quality Assurance Lead, led a team of three QA Engineers through all phases of strategic project identification, documentation, preparation, execution, and implementation of testing strategies for small, medium, and large-scale applications.

Assigned manual testing tasks for software changes originating from upgrades and new software development for a variety of applications throughout the Agile SDLC, including functional verification, verification of acceptance criteria, system testing, integration testing, regression testing, interface testing, and batch testing.

Ensured multiple web applications met the user and business stories by executing test cases written to satisfy acceptance criteria and through exploratory testing.

Led the team in the design, development, testing, and test-related deliverables for application enhancements and new product releases.

Worked closely with developers to discuss and identify technical stories and develop the user acceptance testing criteria for functional verification.

Coordinated testing resources and ensured test cases were correct and had appropriate application coverage.

Prepared detailed white-box testing artifacts to interrogate back-end web services communication.

Managed and led the team to meet project milestones and maintain application quality.

Mentored junior QA Engineers in procedures and best-practices for all testing areas

Collected status from the QA team, and prepared and issued final audit reports summarizing findings and noting corrective actions.

Assisted in the design, development, testing and test related deliverables for system enhancements and new production releases.

Ensure enterprise applications meet the functional, business and system requirements

Senior QA Analyst

Pegasus Solutions Inc. (Texas), Dallas, TX Feb. 2011 – Sept. 2011

Project: Electronics Distributions (ED) Systems for travel industry

As a pioneer in the travel technology industry, Pegasus Solutions Inc. has a long history of breakthroughs. With Electronic Distribution, reservation requests can be processed and confirmed in a matter of seconds. When a reservation is transmitted to a hotel from a GDS travel agent or online consumer, the Pegasus Switch delivers the request transaction in the appropriate message format, allowing the central reservation system to seamlessly process the request and respond, confirming the information with the requestor.

The SMP (Screen Management Program) project is to re-write this tool and pair it with the Content Manager application, since their front-end functionalities are very similar. A new WS (Web Service) was added on the back end for SMP processing. The SMP tool is now accessed through the portal instead of the old tool's HTML page. The re-write consisted of changes to the look and feel of the tool, but all back-end database functionality for the property information remained the same - there were no changes to the data written to and retrieved from the Corse Filter Database (CFDB) in USW. A new session table was added to the CM database, which is written to when a user logs into the tool.

Testing (including unit, integration, regression, and UAT) a new .NET application that uses Microsoft Visual Studio, under the Scrum (Agile) methodology.

Provide assistance and consulting to Development on project activities.

Working from functional specifications to write test plans and test scripts.

Provide technical leadership; responsible for overall technical solutions and quality.

Documenting test requirements, test designs and test plans (writing and reviewing of unit and system testing documentation for technical accuracy).

Utilizing XML Web Service.

Properly diagnose test results and document product defects.

Consult with Development and Business Analysts to identify test data.

Coordinate application builds and release processes with the Project Manager.

Analyze, develop, and manage test plans and test cases utilizing the software automated testing tool (Quick Test Professional).

Virginia Farm Bureau, Richmond, VA

Senior QA Analyst Dec. 07 – Jan. 2011

Project: Proprietary insurance Application (Auto, Home and Farmers Insurance)

Primia Collaborator is a new web-based Auto/Home/Farmers Insurance maintenance application developed to replace an existing legacy system. It is developed on Oracle Forms and the .NET platform. The Collaborator module is used by insurance agents throughout the state, while the Primia application is used by the home office (Claims, Underwriters, and Management).

My responsibility was to write test cases, automate all application testing, and compare the test results of the new system with those of the legacy system (PMS). In addition, I developed an LR script (parameterized, correlated, troubleshot, analyzed, and monitored hardware resources) that runs in the QA/Test/Prod environments with specific load parameters. This LR script also generates reports, which are crucial to upper management and the audit department for determining the overall progress of the project.

Responsibilities:

Write complex SQL to create test data and to validate test cases.

Involved in requirement analysis and Gap analysis of Business System Design and Use cases

Designed and developed the Test plan, Test Scripts and test strategies for the applications and executed them.

Participated in Business/Development meetings to understand user requirements.

Used Quality Center as repository for maintaining test cases, execution and tracking the defects.

Involved in User Interface, Functionality, Navigation, Volume, Load, Performance, Security testing and Beta Testing.

Produced performance test reports with the aid of SiteScope tools.

Prepared test data for positive and negative testing, used in data-driven testing to exercise the application dynamically.

Performed Sanity testing for each new build of the application.

Performed functional, system, and regression testing manually and with automated test tools like Quick Test Professional.

Created User defined functions to avoid redundancy and increase flexibility within the testing repository.

Developed VuGen scripts using LoadRunner for conducting performance testing with the Controller, and submitted reports to upper management after completing the report analysis.

Actively participate in project meetings, release meetings, and QA status meetings.

Used GUI, Bitmap and Text checkpoints in QTP scripts for validations

Parameterized test scripts and correlated dynamic data (server responses) based on load test objectives using LoadRunner.

Environment/Tools: Quality Center 11, LoadRunner 11 (Protocol: Ajax-Click and Script, Oracle Web App 11i), SiteScope, VSTS 2008 (for .NET application), Quick Test Professional 11, ASP.NET, C, Windows XP/NT/2000/2003, SQL Server, Toad, Snagit, AutoIt, Oracle Forms (Oracle 10g), and IIS

Cenveo Inc., Richmond, VA

Software Test Engineer Nov. 03 – Nov. 2007

Project: Web-based peer-review system

Rapid Review is a web-based peer-review system that puts the entire peer-review process on the web. This workflow application automatically delivers assignments, sends notification of tasks, tracks schedules, monitors deliverables, and provides overall management of the entire peer-review process. With 24/7 access, authors, reviewers, and editors can contribute to the peer-review process, eliminating the barriers of time and geography.

Responsibilities:

Worked closely with Development team to discuss the design and testing aspects of the applications to design the Test plans

Performed manual, automated, and regression testing

Developed LR Scripts (AQT) for conducting Performance Testing.

Constructed SQL scripts using PL/SQL on Oracle 10g database.

Tested front-end and back-end web interfaces in different environments.

In every testing phase, ensured the changes did not break the existing functionality.

Used Test Director for tracking the defects, and building test cases based on customer requirements provided by the business analyst.

Involved in all phases of the Software Test Life Cycle.

Performed application testing (including GUI, functional, regression, integration, performance, customer acceptance (beta testing), system, multi-user and others) and validating actual results against expected results.

Demonstrated ability in having initiative, creativity, independence and responsibility in testing assignments.

Environment/Tools: Oracle 10g, Microsoft Office 1997, MS Project, Windows XP, DB Visualizer, Mercury Quality Center 9.0, QTP 8.2, AQT, Visio, Cognos, Linux, Shell Scripts, SQL Server 2005

Cadmus Communications, Richmond, VA

Computer Programmer/Data Analyst June. 97 – Oct. 2003

Project: Cadmus Mailing Service and Software Integration with USPS

Responsibilities:

Managed and performed automated data conversions and system customizations. A conversion involved analyzing customer data and working closely with the customer to convert data in various formats (dBase, ASCII, CSV, tab-delimited, etc.) into a mailing-system standard that met all of their requirements. Further responsibilities included developing new code to implement new customer requirements.

Created custom programs to improve ease of data access, and created GUIs that presented the data to end users in different ways. This eliminated the need to provide raw access to the data, improving security, confidentiality, and protection from loss. Also created various scripts to automate data integrity checks once new data was sent for processing.

HONORS AND ACTIVITIES:

Virginia Farm Bureau Excellence Award, 2008, 2009, and 2010

Diversity Facilitator Award 2002, Cadmus Specialty Publications

Who’s Who Among American Universities and Colleges

Academic Dean’s List

References: Available upon request


