Data Manager

Location:
Edison, NJ
Posted:
May 26, 2020

Senior ETL Quality Assurance Analyst

Professional Summary

More than 9 years of IT experience in quality assurance for ETL, backend, web-based, and client/server applications using ETL, manual, and automated testing tools, including work on cloud computing platforms such as AWS (Amazon Web Services) and Google Cloud Platform (GCP)

8 years of experience in data analysis and in supporting data extraction, transformation, and loading processes in corporate-wide ETL data warehouse systems

Expert in testing cubes across various subject areas and report dashboards for data analytics

Proficient in gathering and analyzing business requirements and documenting System Requirement Specifications, Functional Requirement Specifications, and Requirements Traceability Matrices

Experience in Travel, Insurance, Healthcare, Retail and Public Services applications

Experience in defining testing methodologies: test estimation, requirements review, creating test plans and test cases, and verifying and validating application software and documentation based on software development standards, with effective QA implementation in all phases of the Software Development Life Cycle (SDLC)

Strong working experience in validating DSS (Decision Support System) applications and the extraction, transformation, and loading (ETL) of data from legacy systems using SSIS, Informatica, and Pentaho, as well as cloud data warehousing tools such as Snowflake for data analytics

Hands-on experience with programming languages such as C# and Python

Wrote test cases in HipTest/Cucumber to automate the application and tracked defects in Quality Center, JIRA, and Rational ClearQuest

Expertise in understanding business and functional requirements and decomposing HLDs and ETL specifications into test cases for positive and negative test conditions

Experience in web services testing using SOAP UI and Postman

Experience in testing and writing SQL statements

Expertise in multiple testing techniques including functional, regression, integration, system, parallel, database, performance, smoke, and user acceptance testing

Experience in dimensional data modeling using Star and Snowflake schemas

Expertise in using data centric testing for data migration

Extensive experience in testing and implementing Extraction, Transformation and Loading of data from multiple sources into Data warehouse

Worked on scheduling tools such as Autosys and IBM Tivoli Workload Scheduler

Used Visual Studio Test Professional to configure or record manual tests

Solid Back End Testing experience by writing and executing SQL Queries

Involved in Developing Data Marts for specific Business aspects like Marketing & Finance.

Involved in working on reporting services such as SSRS, Business Objects, and Crystal Reports, SQL Server Analysis Services (SSAS), Jasper, and Dundas BI

Performed backend testing for Database integrity by executing complex PL/SQL queries for validating the data in the backend database tables

Experience using query tools for PostgreSQL, SSMS, NoSQL, Oracle, and DB2 to validate reports and troubleshoot data quality issues

Expertise in working in Agile (Scrum), Waterfall, Spiral methodologies

Involved in using agile management tools like Mingle and JIRA

Expertise in version control tools such as Subversion, TFS, and Git

Involved in standalone and distributed GUI-based applications and SOA Web Services

Experienced in executing UNIX shell scripts for monitoring batch jobs

Involved in testing both .NET and web based components

Education:

Bachelor of Technology in Mechanical Engineering from GITAM in May 2007

Master of Science in Computer Science from Texas A&M University in May 2010

Technical Skills:

ETL Tools

Pentaho, SSIS, Informatica

Agile/Testing Tools

JIRA, Zephyr, Cucumber, HP ALM, Rational ClearQuest, Microsoft Test Manager, Team Foundation Server, QTP, Mercury Quality Center

Microsoft Office

Office, Word, Excel, Office Timeline, PowerPoint

BI/DWH Tools

Jasper, Business Objects, SSRS, SSAS, Snowflake, Dundas, Tableau

API tools

Soap UI, Postman

Big Data

Hadoop, Spark, Kafka

Operating Systems

Windows 7, Windows XP, Windows NT, Windows 95/98/2000, OS

Cloud Environment

Amazon Web Services (AWS), Google Cloud Platform (GCP)

SQL Tools

PostgreSQL, Oracle 8.1/9i/10g/11g, PL/SQL, SQL Server 2000/2005/2008/2010, Azure, DB2, Siebel, SQL*Plus, SQL*Loader, SAS, Rapid SQL

NoSQL

MongoDB

Scheduling

Autosys, Tivoli Workload Scheduler

Programming Languages

.Net, C#, Python

Repository Tools

GitHub

Environment

UNIX, MVS, HP-UX, IBM AIX 4.2/4.3, Novell NetWare, Win 3.x/95/98, NT 4.0, Sun Ultra, Sun SPARC, MS Visual Studio 2010 Professional, Visual Studio Test Professional 2010, Sun Classic

Professional Experience

Employer: Symbioun Technologies Oct 2016 – Present

Client: CareCloud, Miami, Florida

Senior ETL Tester

Responsibilities:

Identifying the business requirements for multiple projects in each release and participating in meetings with SMEs and business analysts

Testing different kinds of analytics and advanced analytics report data for the analytics app of CareCloud healthcare SaaS clients

Writing advanced SQL scripts to test different report dashboards in data analytics projects

Creating test cases, action words, and test scenarios in the HipTest/Cucumber tool and executing them through Jenkins

Ensuring smooth execution of GUI tests according to policies and procedures

Creating and monitoring automated jobs using the Pentaho and Informatica ETL tools, validating data between the source and target data warehouses, and reporting issues to developers

Tested the Kafka data flowing into the Snowflake data warehouse through the Alooma pipeline

Worked with the data warehouse team on the development and execution of data conversion and data standardization of several tables into a single master data management (MDM) repository

Verified data coming from different databases such as Postgres and MongoDB and from the Snowflake cloud data warehouse

Used Postman and SOAP UI for REST service testing

Tested data for different claim extracts loaded using UNIX and WinSCP

Performed batch and file validation testing using UNIX commands

Ran and monitored job streams using the Tivoli Workload Scheduler

Used SAS to query and extract data from various customer databases (SQL Server, Oracle, and DB2) and to prepare analyses

Participated in fact and dimension table implementation in Star Schema model based on requirements.

Tested various types of cubes and MDM data across subject areas such as appointments, encounters, health claims, and denials by writing MDX queries and views in the Pentaho tool

Conducted various management activities by analyzing and verifying test results, providing status reports

Analyzed and validated different types of data related to claims, remittance requests, patients, and appointments

Involved in profiling the data using Python pandas and Apache Spark (see the sketch at the end of this section)

Expertise in understanding and testing MIPS-Advanced Care Information transitional measures and Eligible Professional measures

Analyzing the graphs and data generated from key performance indicators, revenue, and practice performance for monthly, daily, and yearly data of different clients and matching them against the data source

Troubleshooting the issues raised by the different clients and responding or creating defects accordingly

Created monthly, yearly, and daily check-and-balance jobs to validate the data from sources for different DWH fact and dimension tables

Involved in creating pie charts, bar charts, and other graphical representations of visualized data using the Dundas and Tableau reporting tools

Involved in testing data generated from cubes into data warehouse tables and reports

Communicating effectively with developers and other QA engineers to understand how the business works with the associated data

Providing reports in Excel to the sales and marketing teams as requested based on business needs
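
The profiling and source-to-target checks above can be illustrated with a short Python/pandas sketch; it is a hypothetical example rather than project code, and the extract file names and the claim_id business key are assumptions.

    # Minimal sketch of profiling and source-vs-target validation with pandas.
    # The CSV extract names and the "claim_id" business key are hypothetical.
    import pandas as pd

    source = pd.read_csv("source_claims_extract.csv")    # assumed source extract
    target = pd.read_csv("dwh_fact_claims_extract.csv")  # assumed warehouse extract

    # Basic profiling: row counts, null counts, and duplicate business keys.
    print("source rows:", len(source), "target rows:", len(target))
    print("nulls per source column:\n", source.isna().sum())
    print("duplicate source keys:", source["claim_id"].duplicated().sum())

    # Source-to-target reconciliation: rows present on only one side.
    merged = source.merge(target, on="claim_id", how="outer",
                          suffixes=("_src", "_tgt"), indicator=True)
    print("missing in target:", (merged["_merge"] == "left_only").sum())
    print("unexpected in target:", (merged["_merge"] == "right_only").sum())

A comparable check can also be expressed directly in SQL against the warehouse; the pandas form is shown only because it is self-contained.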

Environment: Pentaho 5.0, Informatica, PostgreSQL 9.4, GitHub, Jira, Zephyr, Tableau, T-SQL, SQL, PL/SQL, SAS, UNIX, Charles, Dundas, Python, Snowflake, Kafka, Apache Spark, Tivoli, Windows OS, MS Office, Visio, MS Excel, AWS, GCP

Employer: Symbioun Technologies May 2016 – Oct 2016

Client: Carnival Cruise Line, Florida

Project Title: Data Warehouse Enhancements

Test Lead

Responsibilities:

Understanding technical specifications and business requirements and translating them into test artifacts

Performed all aspects of verification and validation, including functional, structural, regression, load, and system testing

Writing complex SQL queries for data validation to verify the ETL mapping rules (see the sketch at the end of this section)

Created weekly status reports and defect-tracking dashboard reports for the management team

Worked on different projects related to gratuity, pricing, promotions, and guest operations

Presented DWH testing roadmap timelines to the management team using the PowerPoint Office Timeline tool

Used VLOOKUP and HLOOKUP in Excel to compare different sets of data extracted from the CRM and revenue management data warehouse systems

Worked with the BI team to extract data generated from Hyperion tools, analyze the data necessary for performing regression scenarios, and resolve issues in production that impacted the business

Created Test input requirements and prepared the test data for Data Driven testing

Collected requirements and tested several business reports

Used SQL tools for Oracle and NoSQL databases to run SQL queries and validate the data loaded into the target tables

Validated report data by writing SQL queries in PL/SQL Developer against the ODS

Involved in user training sessions and assisting in UAT (User Acceptance Testing).

Developed advanced SQL queries to extract, manipulate, and/or calculate information to fulfill data and reporting requirements, including identifying the tables and columns from which data is extracted

Interacting with senior peers or subject matter experts to learn more about the data

Wrote extensive test scripts for back-end validations

Experience in creating UNIX scripts for file transfer and file manipulation.

Created ETL execution scripts for automating jobs.

Performed extensive data validation using SQL queries and back-end testing

Tested different sources such as flat files and mainframe legacy flat files loaded into the Oracle data warehouse

Used workflow manager for session management, database connection management and scheduling of jobs.
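
The mapping-rule validation referenced above can be sketched as follows; the tables, columns, and the gratuity rule are hypothetical, and SQLite's EXCEPT stands in for Oracle's MINUS, so this shows only the shape of the query.

    # Sketch of a source-vs-target reconciliation query used to verify an
    # ETL mapping rule. Tables, columns, and the rule itself are hypothetical;
    # SQLite's EXCEPT plays the role of Oracle's MINUS.
    import sqlite3

    conn = sqlite3.connect(":memory:")
    conn.executescript("""
        CREATE TABLE src_booking (booking_id INTEGER, fare REAL, gratuity REAL);
        CREATE TABLE dwh_fact_booking (booking_id INTEGER, total_charge REAL);
        INSERT INTO src_booking VALUES (1, 100.0, 15.0), (2, 200.0, 30.0);
        INSERT INTO dwh_fact_booking VALUES (1, 115.0), (2, 230.5);
    """)

    # Mapping rule under test: total_charge = fare + gratuity.
    # Any row returned below is a mapping-rule violation.
    violations = conn.execute("""
        SELECT booking_id, fare + gratuity AS expected_total FROM src_booking
        EXCEPT
        SELECT booking_id, total_charge FROM dwh_fact_booking
    """).fetchall()
    print("mapping-rule violations:", violations)  # -> [(2, 230.0)]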

Environment: Informatica PowerCenter 9.6.1, Toad, Oracle 10g, Quality Center 10.0, T-SQL, SQL, PL/SQL, Unix, OpCon scheduling, Windows 7, MS Office, Visio, MS Project Timeline, MS Visual Studio 2012 Professional, PowerPoint Office Timeline, Visual Studio Test Professional 2012, Microsoft Test Manager, Team Foundation Server, Rational ClearQuest

Employer: Symbioun Technologies Jan 2014 – April 2016

Client: Volkswagen Credit Inc., Libertyville, IL

Project Title: VCI Cloverleaf Project

Test Lead

Responsibilities:

Involved in writing test strategies, cases and scripts based on functional and business requirements

Involved in proactively working with the networking and server teams to resolve testing issues

Developed technical documentation and gave presentations to diverse audiences such as technical staff, business users, and management

Worked on Agile management tools such as the TFS workbench and enterprise software to track story and product delivery requirements

Involved in test planning, writing test cases and defect tracking using Microsoft Test Management tool

Worked on different projects such as Customer Payments, Payment Services, Canada Insurance Import/Extract, PTMS, Quick Fund/Net Fund dealer payments, and various Cloverleaf data warehousing ODS, dimension, and fact tables

Involved in writing SQL queries using the RapidSQL programming tool, which is used across platforms including Oracle, Microsoft SQL Server, and Sybase

Extensively used Informatica PowerCenter 9.5.1 to load data from source to target databases

Expertise in executing Informatica workflows and testing the output based on the requirement criteria

Ran jobs through scheduling tools like Autosys

Execution of manual testing and automation testing for GUI using various tools

Experience in handling data coming from various sources like Relational, XML and flat files

Tested the service contract for each release and the data passed through XML

Worked with Visual Studio Test Case Manager to execute tests and post results to Team Foundation Server

Worked on web services testing against the service endpoints using the passed parameters (see the sketch at the end of this section)

Validated data flow from source through to fact and dimension tables using complex queries (left outer joins, subqueries, etc.)

Involved in testing the BizTalk applications hitting the web services to validate data coming from different sources such as relational, XML, and flat files

Used UNIX shell scripts and compared files in .csv format to make sure the extracted data was correct

Coordinated with developers in performance tuning of mappings and sessions

Documented dependencies of workflows and load process along with logic in specification documents and test results

Wrote complex SQL queries to validate EDW data versus EDM source data including identification of duplicate records and quality of data based on Mapping/Transformation rules

Involved in peer review of test cases, Test Pool meetings, Impact analysis and test estimate meetings

Used Microsoft Test Manager and TFS for defect tracking and reporting
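
The parameterized endpoint checks referenced above can be sketched with the third-party Python requests library; the URL, query parameter, and expected response fields are made-up placeholders, not the actual service contract.

    # Sketch of a parameterized web-service check. The endpoint URL, query
    # parameter, and expected response fields are hypothetical placeholders.
    import requests

    BASE_URL = "https://example.internal/api/dealer-payments"  # assumed endpoint

    def check_payment_service(dealer_id: str) -> None:
        response = requests.get(BASE_URL, params={"dealerId": dealer_id}, timeout=30)
        # Basic contract checks: status code and required response fields.
        assert response.status_code == 200, f"unexpected status {response.status_code}"
        payload = response.json()
        for field in ("dealerId", "paymentStatus", "amount"):
            assert field in payload, f"missing field: {field}"

    if __name__ == "__main__":
        check_payment_service("D12345")

The same kind of assertion is what Postman and SOAP UI test steps express interactively; the script form just makes the checks explicit.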

Environment: Oracle SQL/PL-SQL, RapidSQL, MS SQL Server 2008/2010, Windows 7, MS Office, Visio, MS Visual Studio 2012 Professional, Visual Studio Test Professional 2012, Microsoft Test Manager, XML, CSV files, XML files, Subversion, Informatica PowerCenter 9.5.1, Unix, PuTTY, SOAP UI 5.0

Employer: Symbioun Technologies March 2012 – Dec 2013

Senior ETL Quality Assurance Analyst

Client: Redbox Outerwall

Project Title: Punch Card/Gift Card

Responsibilities:

Using the Agile project management tools Mingle and TFS to determine the stories in BI and DI projects, respectively, and derive the test details

Preparing and Maintaining the Test strategy, test design, test cases, and traceability for applications under test

Involved in the preparation of Technical design documents, Source to target (S2T) document.

Worked in various projects like PunchCard, GiftCard, Inventory, Customer Platform Service (CPS), BI/DI data warehouse enhancements

Involved in understanding the logic behind the stored procedures and running packages from SQL Server Management Studio (SSMS)

Create and execute manual and automated tests, document testing practices, and communicate results/recommendations to team

Involved in testing ETL solutions developed using SQL Server Integration Services (SSIS) 2005/2008 and SQL/T-SQL

Develop and maintain the manual and automated test scripts, functions / SQL code

Involved in testing file transfers of flat files from local to remote servers over FileZilla and CoreSFTP using PuTTY (see the sketch at the end of this section)

Responsible for testing business intelligence reporting solutions developed using SQL/T-SQL, Reporting Services (SSRS), and SQL Server Analysis Services (SSAS)
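
One of the simpler file-transfer checks referenced above, confirming that a flat file landed intact after transfer, can be sketched as follows; the file names are hypothetical and the comparison is deliberately basic (checksum plus row count).

    # Sketch of a post-transfer flat-file check: compare an MD5 checksum and
    # the row count of the file before and after transfer. Paths are hypothetical.
    import csv
    import hashlib

    def file_md5(path: str) -> str:
        digest = hashlib.md5()
        with open(path, "rb") as handle:
            for chunk in iter(lambda: handle.read(8192), b""):
                digest.update(chunk)
        return digest.hexdigest()

    def row_count(path: str) -> int:
        with open(path, newline="") as handle:
            return sum(1 for _ in csv.reader(handle))

    local_file = "giftcard_feed_local.csv"    # assumed pre-transfer file
    remote_copy = "giftcard_feed_remote.csv"  # assumed post-transfer copy

    assert file_md5(local_file) == file_md5(remote_copy), "checksums differ"
    assert row_count(local_file) == row_count(remote_copy), "row counts differ"
    print("file transfer validated:", local_file)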

Environment: SSIS, SSRS, SSAS, TFS, Mingle, MS SQL Server 2008/2010, SSMS, SQL/PL-SQL, Unix, Windows 7, MS Office, Visio, MS Visual Studio 2010 Professional, Visual Studio Test Professional 2010, XML, CSV files, XML files, FileZilla, PuTTY, CoreSFTP, Subversion

Employer: IT America, Inc. June 2011 – Nov 2011

Data Warehouse Tester

Client: JP Morgan Chase

Project Title: UNO-CHASE/Heloc-MSP to VLS conversion

Responsibilities:

Analyzed business requirements, system requirements, data mapping requirement specifications, and responsible for documenting functional requirements and supplementary requirements in Quality Center

Experience with Toad or other SQL query tools

Demonstrated experience in QA, SIT, and UAT testing for a large data warehouse including writing and executing test scripts

Experience with verification of master reference data, duplicate checks, data cleansing, transformation, and attribute-correctness checks in MDM (see the sketch at the end of this section)

Tested mappings against the design documents and also performed testing with various sample data

Involved in the post implementation validation of UNO CHASE and HELOC-MSP to VLS CONVERSION projects

Expert in understanding ETL Packages using several transformations to perform Data profiling, Data Cleansing and Data Transformation

Designed and developed UNIX shell scripts as part of the ETL process to automate loading and pulling of data

Performed system testing, regression testing, integration testing and documented results

Worked with the ETL group and data analysts on understanding and preparing technical specification mappings for dimensions and facts

Writing complex SQL queries using Case Logic, Intersect, Minus, Sub Queries, Inline Views, and Union in Oracle

Automating/ handling and recording scripts using QTP

Reviewed and tested database modifications as per Physical Data Model

Involved in converting Informatica code into SQL queries and generating reverse-engineered STTs
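
The duplicate-check part of the master-reference-data verification referenced above follows the usual GROUP BY / HAVING pattern; the sketch below uses a hypothetical mdm_customer table purely for illustration.

    # Sketch of a duplicate-record check on master reference data. The table
    # and key column are hypothetical; the GROUP BY / HAVING pattern is the point.
    import sqlite3

    conn = sqlite3.connect(":memory:")
    conn.executescript("""
        CREATE TABLE mdm_customer (customer_id INTEGER, ssn_hash TEXT, name TEXT);
        INSERT INTO mdm_customer VALUES
            (1, 'a1b2', 'J. Smith'), (2, 'c3d4', 'A. Jones'), (3, 'a1b2', 'John Smith');
    """)

    duplicates = conn.execute("""
        SELECT ssn_hash, COUNT(*) AS occurrences
        FROM mdm_customer
        GROUP BY ssn_hash
        HAVING COUNT(*) > 1
    """).fetchall()
    print("duplicate reference keys:", duplicates)  # -> [('a1b2', 2)]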

Environment: Informatica 8.6.1, Quality Center 10.0, Oracle 9i/10g, TOAD, SQL/PL-SQL, Unix, PuTTY, Business Objects, Windows 2000/XP, MS Office, Visio, XML, CSV files, .NET, XML files, QTP

Employer: Accretion Consultants LLC Apr 2010 – March 2011

Client: Broadridge Financial

Programmer Analyst

Project Title: Relocation Translation

Responsibilities:

Involved in all phases of the Software Development Life Cycle, from requirements analysis through design, development, integration, regression, functionality, and usability testing, and maintenance

Developed programming solutions according to technical specifications while adhering to policies and standards for the integrity and safety of data

Designed Windows forms using Data Grid, Validation, Login Controls, User Controls

Designed Custom classes for Data Validations

Used Atalasoft controls for building capture applications, document processing and .Net Imaging

Created XML files for each document, converted them to TIFF, and grouped them into PDFs

Involved in creating Zip files using PGP Encryption

Worked with the ETL group for understanding mappings for dimensions and facts

Performed testing for integration, functional and unit testing of standalone and distributed GUI-based applications and SOA Web Services

Extracted data from various sources like flat files and SQL Server

Design new systems or enhancements to existing systems, modify, code, debug, test, and document moderately complex application systems

Environment: .NET Framework 3.5, C#, VB.NET, testing tools, Atalasoft DotImage, Visual Studio 2008/10, SQL Server 2005/2008, XML, MS Visio, Subversion, SSRS, Informatica 9.1

Employer: Data Metrics, Inc. June 2010 – Oct 2010

Intern

Responsibilities:

Manually testing web-based applications and working on .NET and database-related objects using SQL Server 2005

Involved in working on mainframe related applications using Unix/Linux

Assisting technical lead in performing integration testing, functional and unit testing

Understanding the business requirements and creating test cases and running the test scripts using Quality Center

Coordinated with PM, BA's, Developers and functional teams and coordinated meetings.

Design and build efficient SQL queries, analyze query cost comparison, use indexes efficiently

Environment: .NET Framework 3.5, VB.NET, testing tools, Visual Studio 2008/10, SQL Server 2005/2008, XML, MS Visio, SSRS, Quality Center

Additional Information:

Relocation: Yes

Availability: 1-2 weeks' notice

References: Provided upon request


