
QA Analyst Systems

Location:
San Diego, CA, 92101
Posted:
September 20, 2023


Resume:

RAMESH MAMILLAPALLI

adzt4j@r.postjobfree.com 410-***-****

Summary

Progressive IT experience across several domains and industries.

More than 12 years of experience in Quality Assurance (QA) and Application/Test Support for multiple applications running on UNIX, Linux, Windows, Mainframe, Cloud, and Client/Server platforms.

Worked on applications for DHS/CBP/USBP, Census, Healthcare, Medicare, Government, Marketing, Educational, Banking, Financial, Telecom, and Internet industries.

Experienced in all phases of Software Development Life Cycle (SDLC).

Ability to understand and translate business requirements and to operate independently or in a team environment.

Experienced in databases including Oracle, Sybase, and SQL Server; monitoring, scripting, and generating reports to ensure data integrity and validate business rules.

Experienced in Data Warehousing/ETL (Informatica, Ab Initio, Netezza, SQL Server, Oracle), backend testing, maintenance, and support, as well as data analysis, functional requirements analysis, and effort estimation.

Experienced in installing application Releases/patches in Test and Production environments.

Create test scenarios and test cases, execute them, and thoroughly document the process to perform functional testing of the application. Experienced in cross-browser testing. Expertise in testing Web and Client/Server applications.

Extensive experience in Quality Analysis, Business Analysis, Database Testing, 508 testing and documentation, and Big Data Testing. Strong experience in eliciting, analyzing, and documenting critical testing artifacts.

Worked with the systems engineering team to deploy and test new environments. Worked closely with Business Analysts, Developers, and Product Owners.

Experienced in creating and maintaining the Requirements Traceability Matrix (RTM).

Team player with excellent technical, process-improvement, analytical, and problem-solving skills, and good oral/written communication skills.

Ability to work and manage projects independently, with a proven ability to meet deadlines.

Transfer data to relational databases and support report generation by the BI team.

Create Hive queries that help analysts spot emerging trends by comparing fresh data with historical claim metrics. Involved in creating Hive tables and in loading and analyzing data using Hive queries.
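
For illustration only, a minimal sketch of this kind of trend-comparison Hive query, run from Python via PyHive; the host and the table and column names (claims_current, claims_history) are hypothetical and not taken from the resume.

```python
# Minimal sketch, assuming a reachable HiveServer2 instance and the
# hypothetical tables claims_current and claims_history.
from pyhive import hive

conn = hive.Connection(host="hive-gateway.example.com", port=10000, username="qa_user")
cursor = conn.cursor()

# Flag claim categories whose current volume runs well above the historical average.
cursor.execute("""
    SELECT cur.claim_category, cur.current_claims, hist.historical_avg
    FROM (
        SELECT claim_category, COUNT(*) AS current_claims
        FROM claims_current
        GROUP BY claim_category
    ) cur
    JOIN (
        SELECT claim_category, AVG(monthly_claims) AS historical_avg
        FROM claims_history
        GROUP BY claim_category
    ) hist ON cur.claim_category = hist.claim_category
    WHERE cur.current_claims > 1.2 * hist.historical_avg
""")

for category, current, historical in cursor.fetchall():
    print(f"{category}: current={current}, historical_avg={historical:.1f}")
```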

Load and transform large sets of structured and unstructured data.

Set up testing environments and prepare test data for testing flows to validate and prove positive and negative cases. Test and validate data at all stages of the ETL process.

Proficient in Manual, Functional, Regression, and Integration/End to End testing.

Exposure to automated test tools such as Selenium and JMeter for GUI and web-based applications.

Nearly 2 years of Oracle database development experience.

Experienced working in Agile and Waterfall development methodologies, covering user stories, allocation of QA tasks, progression and regression testing, and conducting defect triage meetings on a daily basis.

Status: US Citizen

Clearance: Position of Public Trust and Full (BI) Background Investigation (DHS/CBP) Cleared.

Education

Master of Science (M.S.) in Electrical and Computer Engg., Western Michigan University, MI, USA.

Master of Engineering (M.E.) in Electronics and Controls Engg., Birla Institute of Tech. and Science, India.

Bachelor of Engineering (B.E.) in Electronics and Communications Engg., M.K. University, India.

Certifications

DHS Level 1 Systems Engineering from the DHS Homeland Security Acquisition Institute.

DHS Level 1 Test and Evaluation from the DHS Homeland Security Acquisition Institute.

Technical Skills

Industries

DHS/CBP/USBP, Insurance, Govt, Telecom, Healthcare, Medicare, Census, Marketing, Banking, Educational, Internet, and Financial

ETL Tools

OBIEE, Informatica, Ab Initio, SPSS, Netezza

Cloud

AWS, Azure

Automated Test Tools

Selenium, LoadRunner, JMeter

Operating Systems

UNIX/Linux, Windows Server

Languages/Applications

C, C++, Java, JavaScript, PL/SQL, SQL, Unix Shell Scripting, Python, PERL, REST API, Spark

Middleware

Tuxedo, BEA Message Queue, IBM MQ Series

508 Tools

JAWS, Web Accessibility Evaluation Tool (WAVE)

Database Systems

ORACLE, MS SQL Server, MS Access, Sybase, Hive, HBase, DB2

Tools

HP ALM, Bamboo, SOAP UI, SOAP, Bitbucket, Jenkins, ClearQuest, Gitlab, JIRA, Splunk, ServiceNow, Control-M, AutoSys, Confluence

Methodologies

Agile and Waterfall Methodologies, Software Development Life Cycle (SDLC)

Experience

DHS-CBP-OT (Office of Trade)

QA Validation Tester/Software Systems Analyst 03/2023 – Present

Contracted to DHS/CBP/OT OPS; Projects: CBP OT ITAS, ASB UTED, ATAP

Worked on the ITAS Project (International Travel Authorization System) with the CBP Office of Trade (OT OPS). The ITAS application collects travel information and tracks/gains authorization for all international travel that occurs within OT. It provides travelers an easy-to-use tool to submit travel approvals and ensures the required prerequisite steps have been completed. ITAS is a PowerApps application that uses SharePoint Lists as its data source, with features including homepages, views, forms, and approvals.

Worked with the OT OPS Team on the ASB (Analytical Services Branch) UTED (Unified Trade Enforcement Database) application by performing a thorough evaluation of the current state of the monthly data translation process/application. The goal of the data analysis is to improve the existing tools and processes applied to the monthly data translation activities for Liquidated Damages (LD), Penalties (PENS), Trade Pulse, and Clean Seizures data in UTED. Assessed the current source requirements, processing steps, manual inputs, macros, and SQL queries leveraged for data translation.

Supported the ATAP (Advanced Trade Analytics Platform) application team in CBP CSPD (Cargo Systems Program Directorate). Evaluated current processes to develop run and maintenance standards that support dashboard Operations and Maintenance (O&M) procedures; reviewed and documented standards for the Extract, Transform, Load (ETL) workflow between Databricks and QLIK as they pertain to dashboard visualizations. Ensured that enhancements to TERM and Import Profile align with the general workflow of the tool so that dashboards can begin using it alongside other ATAP tools.

Validation testing in ATAP involves testing the functionality of TERM and Import Profile and new updates to dashboards in QLIK:

Generating output and views from the dashboard

Testing that filters work

Testing that app chaining works

Testing that ODAG works.

Comparing output of TERM ODAG (Trade Entity Details) with ES Tariff Line Details

Capturing screenshots and documenting findings in ATAP testing document

Reviewing findings with CO + TERM Team

Summary of testing findings (including a list of identified bugs) detailed in the test log.

Provide evaluation and documentation for the 508 testing and remediation workflow process and continuous functional testing in accordance with business requirements. Establish and conduct technical ETL code peer reviews. Performed testing on the ATAP scheduling app with MS Lists and Power Automate.

Tools: Power Apps, Databricks, QLIK, SailPoint IdentityIQ 8.3, Power Automate, SharePoint, MS Access, Cognos, MS Lists, Excel, Planner, Project, JIRA, Confluence, Visio.

LMI, Inc

Technical Tester/Software Systems Engineer 11/2019 – 09/2022

Client: DHS-CBP-USBP, DC/VA

Projects: USBP PMOD COP, MVSS, MSC, TAK

Worked on the Common Operating Platform (COP) project with USBP PMOD, supporting the PMOD PM, Govt Lead System Engineer, Systems Engineer, and the Govt. team. Provided support to the PMOD acquisition life cycle process by working with the Lead System Engineer(s) on defining, selecting, acquiring, testing, securing, and maintaining the technology required to support the COP program on the borders of the U.S. Worked on COP, DHS S&T (Science and Technology), DHS ACQ, DHS SE, DHS T&E, DHS EAD (Enterprise Architecture), Risk Management, and SELC topics/issues.

Involved in writing Test plan based on Business Requirement Documents and System Requirements Specifications.

Provide level of effort (LOE) estimates from a testing point of view and provide input for budgeting.

Working with upper management and project managers throughout the project lifecycle to provide accurate testing status and issues in a timely manner.

Validated the Request/Response XML files against the XSD document.
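
For illustration only, a minimal sketch of validating a request/response XML file against an XSD using Python's lxml; the file names are hypothetical and not taken from the resume.

```python
# Minimal sketch, assuming lxml is installed; request.xml and service.xsd
# are hypothetical file names standing in for the actual artifacts.
from lxml import etree

schema = etree.XMLSchema(etree.parse("service.xsd"))
doc = etree.parse("request.xml")

if schema.validate(doc):
    print("request.xml conforms to service.xsd")
else:
    # Log each schema violation for the defect report.
    for error in schema.error_log:
        print(f"line {error.line}: {error.message}")
```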

Used ALM to maintain requirements, test cases, test executions, and defect logging, and to establish the RTM.

Kafka was used as the messaging service to produce and consume the data.

Performed 508 testing with 508 Testing tools.

Testing of Web Services calls in an integrated platform between Client’s products and other software solutions using SOAPUI. Support Production environments for each build and release.

UNISYS Federal

Technical Tester/Application Support 05/2019 – 11/2019

Client: DHS-CBP-USBP, DC/VA

Projects: TASPD-UPAX/Biometrics

Worked on the Traveler Verification System – Biometrics Applications. Congress transferred entry/exit policy and operations to U.S. Customs and Border Protection (CBP). As part of the border security mission, the agency is deploying new technologies to verify travelers' identities – both when they arrive and when they leave the United States – by matching a traveler to the document they are presenting. CBP biometrically records persons leaving the United States using facial recognition technology. CBP's goal is to expedite lawful travelers while enhancing national security and protecting a traveler's identity against theft using biometrics.

Worked on Traveler Resolution and Trip Leg Resolution (UPAX targeting Test team) as part of Targeting and Analysis Systems Program Directorate (TASPD). The Entitlement System is the primary system used by the Targeting and Analysis Systems Program Directorate (TASPD) for authenticating and authorizing users of CBP applications.

Worked on Unified Passenger Vetting, the process of vetting travelers (Traveler Verification System – Biometrics Applications) against the Unified Hot-lists, in which officers identify more accurate biographic information (Name, DOB, Document) based on research and analysis compared to the biographic data transmitted via the APIS or PNR feeds. Responsibilities involved validating the source data in the flat files and the web user interface applications in the DEV, Test, and Prod environment(s).

Involved in writing the Test Plan based on Business Requirement Documents and System Requirements Specifications.

Provide level of effort (LOE) estimates from a testing point of view and provide input for budgeting.

Worked with upper management and project managers throughout the project lifecycle to provide accurate testing status and issues in a timely manner. Performed 508 testing with 508 testing tools.

Validated the Request/Response XML files against the XSD document.

Used ALM to maintain requirements, test cases, test executions, and defect logging, and to establish the RTM.

Validated the HBase NoSQL database, which stores the historical data.

Kafka was used as the messaging service to produce and consume the data.
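
For illustration only, a minimal sketch of producing and consuming a test message with the kafka-python client; the broker address and topic name are hypothetical and not taken from the resume.

```python
# Minimal sketch, assuming the kafka-python package and a reachable broker;
# the broker address and topic name are hypothetical.
import json
from kafka import KafkaProducer, KafkaConsumer

producer = KafkaProducer(
    bootstrap_servers="broker.example.com:9092",
    value_serializer=lambda v: json.dumps(v).encode("utf-8"),
)
producer.send("traveler-events", {"traveler_id": "TEST-001", "status": "vetted"})
producer.flush()

consumer = KafkaConsumer(
    "traveler-events",
    bootstrap_servers="broker.example.com:9092",
    auto_offset_reset="earliest",
    consumer_timeout_ms=5000,            # stop polling after 5s of inactivity
    value_deserializer=lambda v: json.loads(v.decode("utf-8")),
)
for message in consumer:
    print(message.value)                 # verify the test event round-trips
```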

Extensively worked on JMeter to create Thread Groups and test the web application under various loads on key business scenarios.

Worked on testing the deployment of RDS from the blue to the green stack (and vice versa).

Testing of Web Services calls in an integrated platform between Client’s products and other software solutions using SOAPUI.

Worked on CloudWatch logs for AWS and Cognito for user management.

Worked on AWS Data Pipeline to validate data loads from S3 into Redshift.
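
For illustration only, a minimal sketch of one way such a load could be validated, comparing S3 object counts (boto3) with Redshift row counts (psycopg2); the bucket, prefix, table, and connection details are hypothetical and not taken from the resume.

```python
# Minimal sketch, assuming boto3 and psycopg2 are available; the bucket,
# prefix, table, and connection details are hypothetical.
import boto3
import psycopg2

s3 = boto3.client("s3")
objects = s3.list_objects_v2(Bucket="etl-landing-bucket", Prefix="daily/2019-10-01/")
file_count = objects.get("KeyCount", 0)

conn = psycopg2.connect(
    host="redshift-cluster.example.com", port=5439,
    dbname="warehouse", user="qa_user", password="***",
)
with conn.cursor() as cur:
    # Count the rows loaded for the same business date in the target table.
    cur.execute("SELECT COUNT(*) FROM stage.daily_load WHERE load_date = %s", ("2019-10-01",))
    row_count = cur.fetchone()[0]

print(f"S3 files: {file_count}, Redshift rows loaded: {row_count}")
```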

Support Production environments for each build and release.

Designed and executed the tests to verify the web GUI using Selenium. Performed Functional testing, Regression testing, and GUI Testing.
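
For illustration only, a minimal sketch of a Selenium functional check of a web GUI in Python; the URL and element locators are hypothetical and not taken from the resume.

```python
# Minimal sketch, assuming Selenium and a Chrome driver are installed;
# the URL and element locators are hypothetical.
from selenium import webdriver
from selenium.webdriver.common.by import By

driver = webdriver.Chrome()
try:
    driver.get("https://app.example.com/login")
    driver.find_element(By.ID, "username").send_keys("qa_user")
    driver.find_element(By.ID, "password").send_keys("***")
    driver.find_element(By.ID, "login-button").click()

    # Simple functional check: the dashboard header should appear after login.
    header = driver.find_element(By.CSS_SELECTOR, "h1.dashboard-title")
    assert "Dashboard" in header.text
finally:
    driver.quit()
```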

Dynamed, Inc

QA Analyst/Systems Analyst/Support 02/2019 – 05/2019

Client: CRISP, Columbia, MD

Projects: CRISP Applications

Worked for CRISP (Chesapeake Regional Information System for our Patients).

Worked as systems analyst in systems integrations - TECHOPS team providing application support, and maintenance.

Supporting providers/clients on the Care Alert, Overdose, PDMP, Immunizations, Healthshare, and PatientMatch medical applications.

Hands-on with extensive SQL queries and populating tables as per the functional specifications.

Working with upper management and project managers throughout the project lifecycle to provide accurate testing status and issues in a timely manner.

Environment: SQL Server Management Studio 17.9, PuTTY, Notepad, Snipping Tool, Windows PowerShell ISE, CRISP API Gateway, Postman, Azure, JIRA, GitHub, CRISP ULP (Unified Landing Page), CRISP TEST Initiate software (IBM InfoSphere MDM Inspector), 508 testing tools, Slack for messaging, Remote Desktop Connection, XML, JSON, Adobe Acrobat Reader DC, Pulse Secure for VPN.

Enterprise Information Systems (EIS), Inc

QA Analyst/Systems Analyst/Data Quality Analyst 01/2018 – 10/2018

Client: Decennial Response Processing System/Decennial Information Technology Division (U.S. Census Bureau), MD

Worked on Census applications (CUF, PCUF, SMaRCS, NRFU, CEF, ADCAN) at the U.S. Census Bureau, Suitland, MD.

Worked on DRPS projects at the U.S. Census Bureau. These involved the SMaRCS (Sampling, Matching, Review and Coding System), CUF (Census Unedited File), CEF (Census Edited File), and SMaRCS AdCan (Address Canvassing) modules. All these applications/modules are part of the U.S. Census Bureau's Census 2020 project.

Attended daily team status meetings and daily stand-up meetings on the projects. Attended current and upcoming Sprint planning sessions to discuss the system requirements and user stories for the applications involved.

Tested both backend and web UI applications using SQL and PL/SQL programs. Also performed 508 testing with 508 tools. Analyzed reports using Tableau, OBIEE, and Teradata.

Wrote test cases in HP ALM and executed them against the requirements. Created test defects in ALM while testing the application. Created test execution reports in ALM and also provided a Test Analysis Report.

General Dynamics Information Technology (GDIT)

QA Analyst/Application Support 06/2013 – 11/2017

Client: CMS (Centers for Medicare & Medicaid Services), MD

Projects: CCSQ, HQR

Worked on Centers for Medicare & Medicaid Services, Center of Clinical Standards and Quality (CCSQ) Project.

Participated in Business Requirements review, Functional Requirements review, Detailed Design Document, Transformation rules review, and Source to Target Mapping review.

Analyze the user stories for HQR to determine functional requirements for a wide variety of application projects. Provide technical support to the V&V Testing, 508 Testing, Production Support, and Help Desk teams in resolving issues.

Performed application testing for the Centers for Medicare & Medicaid Services (CMS) on a variety of projects. Exported the test data from the Oracle database to validate backend PL/SQL scripts and Informatica workflows/mappings. Tested the Informatica ETL mappings and other ETL processes (data warehouse testing).
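
For illustration only, a minimal sketch of a source-to-target check of this kind using cx_Oracle and a MINUS query; the connection details and table names (stg_claims, dw_claims) are hypothetical and not taken from the resume.

```python
# Minimal sketch, assuming cx_Oracle is installed and a test-environment
# Oracle connection is available; credentials and table names are hypothetical.
import cx_Oracle

conn = cx_Oracle.connect("qa_user", "***", "testdb-host/ORCLPDB")
cur = conn.cursor()

# Source-to-target check: rows present in the source staging table but
# missing from the warehouse target after the ETL load.
cur.execute("""
    SELECT claim_id, provider_id, paid_amount FROM stg_claims
    MINUS
    SELECT claim_id, provider_id, paid_amount FROM dw_claims
""")
missing = cur.fetchall()
print(f"{len(missing)} source rows missing from target")
```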

Research, identify, and recommend resources required for task execution and completion. Exported the test cases to ALM and mapped the requirements in the Quality Center (QC)/ALM tools. Participated in QA environment setups by means of triage meetings. Conducted multi-browser testing on the portal website.

Validated the loan exports in XML format and ensured that the correct loan data had been exported to XML.

Merkle, Inc, MD

QA Analyst/Systems Analyst 04/2012 – 02/2013

Participated in Business Requirements review, Functional Requirements review, Detailed Design Document review, Transformation rules review, Source to Target Mapping review, and Data Dictionary review meetings. Validated the source/target mapping document and the transformation rules applied. Wrote SQL/PLSQL scripts to verify the backend data and reports generated.

Involved in System Testing, Regression Testing, System Integration Testing, 508 testing, Functional Testing, and other testing activities. The testing involved validating all the business and technical requirements for each scenario in the data set. Validation included verifying the data source and target data tables based on requirements and rules. JIRA was used to log all the defects.

Performed data analysis/profiling using IDE. Involved in writing the Test Plans, Test Scripts, and Test Cases, recording Test Results, and providing Test Status Reports and Defect Status Reports to the team.

Clovis Inc/ACS, Inc.

QA Analyst/Data Quality Analyst 08/2010 – 09/2011

Client: Federal Dept. of Education (DOE)

Projects: FSA, CSB, DMCS, DLSS, and CDDTS

Worked on the Federal Education Dept.'s DMCS2 Replacement and Conversion Data Warehouse project. The Common Services for Borrowers (CSB) project is an effort by the Federal Student Aid (FSA) office of the U.S. Dept. of Education to manage the Federal loan program more effectively and efficiently by integrating its many legacy systems. The core business processes of Borrower Services are being reengineered into the CSB solution, replacing the Direct Loan Servicing System (DLSS), Debt Management and Collections System (DMCS), and Conditional Disability Discharge Tracking System (CDDTS).

Participated in Business Requirements review, Functional Requirements review, Detailed Design Document review, Transformation rules review, Logical Data Model review, and Source to Target Mapping review. Validated the source/target mapping document and the transformation rules applied. Wrote SQL/PLSQL scripts to validate the backend data and reports generated.

Involved in System Testing, Regression Testing, System Integration Testing, Functional Testing, and other testing activities. Validation included verifying the data source and target data tables based on requirements and rules. ClearQuest was used to log all the defects.

MARKEL Corp., VA

QA Analyst/System Analyst 01/2010 – 08/2010

Worked on Markel's Atlas Data Warehouse project. Its main goal is to consolidate all of Markel's legacy data (PRIMIS, ISIS, RE-PRIM, PROFIT, ADVANTAGE) into the ODS and to support a daily interface of new policy-related charges as needed by Markel's new Billing System. ODS 2.2 addresses changes to the ODS model and ETL processes to fulfill the policy transaction data requirements of the new Guidewire Billing Center.

Involved in validating Informatica Mappings, Transformations, Workflows, and Sessions in the TEST, DEV, UAT, and Pre-Production environments. Validated and verified the data source and target data tables based on requirements and rules. Involved in System Testing, Regression Testing, System Integration Testing, and Functional Testing.

Responsibilities involved validating the source (legacy) data by running SQL Server Wrapper Scripts, Running SQL queries in DEV, and in the Test environments.

Validation included verifying the data source and target data tables based on requirements and rules. HP Quality Center was used to log all the defects. Generated the reports using Cognos.

CACI, Inc.

QA Analyst/Data Quality Analyst 03/2008 – 10/2009

Client: DoD (Walter Reed Army Medical Center), DC

Projects: AMSA, DMSS, and AFHSC

Worked as Informatica/Backend developer and validation tester for Army Medical Surveillance System (AMSA)/Armed Forces Health Surveillance Center (AFHSC). AMSA/AFHSC is part of the Defense Medical Surveillance System (DMSS) which is the central repository of medical surveillance and seroepidemiologic data related to members of the DOD.

Testing involved validating Informatica mappings, transformations, workflows, and sessions in the NORM, WAREHOUSE, and Production environments. Responsibilities involved validating the source data in the flat files against the Informatica mappings and workflows in the DEV and Test environments.

Validated the SERUM, GAIN/LOSS, DEPLOY FORMS, DEPLOY ROOSTER, MEPS, OUTPATIENT, INPATIENT, CASUALITY, NAVY REPEVENTS, and NEW DEPLOY FORMS data sets for U.S. Department of Defense Army medical personnel data.

Involved in validating all the business and technical requirements for each scenario in the data set. Validation included verifying the data source and target data tables and workflows in the DEV and Test environments based on requirements and rules. Performed 508 testing validation.

TD Ameritrade, MD

Software QA Analyst 09/2006 – 02/2008

Projects: TAOS, Branch Score Card, and Card Integration

Worked as Data Warehouse tester on EDW (Enterprise Data Warehouse Expansion), Branch Score Card Redesign, Branch Score, Card Integration, Salesforce.com database validation, TAOS (TD AMERITRADE Opportunity System), Campaign Tracking, and Translog Process Application.

Performed System Testing and System Integration Testing, which included testing the source tables/files against target tables and checking dependencies on other tables. Wrote SQL/PLSQL scripts to validate the backend data and reports generated. Also involved in Regression Testing and Functional Testing.

Participated in Business Requirements review, Functional Requirements review, Detailed Design Document review, Transformation rules review, Logical Data Model review, and Source to Target Mapping review.


