
Data Manager

Location:
San Ramon, CA
Posted:
July 02, 2020

Career Summary:

Data Warehousing and BI QA Analyst with 8 years of experience in software testing and development.

Good understanding of SDLC and STLC, including Waterfall, Iterative, Agile, and Scrum methodologies.

Good knowledge of functional areas including Healthcare and Banking & Financial Services.

Involved in preparation of Test Metrics, Status Report and Defect Metrics.

High degree of analytical skill, aiding data analysis and root cause analysis.

Worked extensively on BI and Data Warehousing projects.

Worked on all phases of Software Testing Life Cycle.

Good understanding of onsite/offshore work model.

Proficient with MS Excel, using advanced functions such as VLOOKUP, Pivot Tables, and Pivot Charts for data analysis and reporting.

Extensive knowledge of ETL testing using SQL, UNIX, and PL/SQL.

Good exposure to various test management and defect management tools such as Quality Center, JIRA, Rally, and HP ALM.

Good knowledge of writing SQL queries to extract data from databases and respond to ad-hoc requests.

Extensively used joins and analytical functions in SQL to extract data stored across multiple tables in a relational data warehouse and prepare it for further analysis.
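
For illustration, a representative query of this kind joins a dimension to a fact and uses the ROW_NUMBER() analytic function to rank each customer's orders; all table and column names below are hypothetical:

-- Rank each customer's orders, most recent first
SELECT c.customer_id,
       c.customer_name,
       o.order_id,
       o.order_total,
       ROW_NUMBER() OVER (PARTITION BY c.customer_id
                          ORDER BY o.order_date DESC) AS order_rank
FROM   dw.customer_dim c
JOIN   dw.order_fact   o ON o.customer_key = c.customer_key
WHERE  o.order_date >= DATE '2020-01-01';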

Proficient in testing complex reports generated by OBIEE, including Dashboards, Summary, Master-Detail, and Drill-down reports.

Expert in data analysis to ensure accurate reporting.

Good communication skills with co-workers, including business teams.

Flexible and versatile; able to adapt to new environments, multitask, and work independently.

Technical Skills:

Operating Systems: Windows XP, UNIX

Testing Tools: HP ALM, JIRA, Rally, Informatica DVO

Programming Languages: SQL, PL/SQL, C, XML

ETL Tools: Informatica 10.2/10, SSIS

BI Tools: OBIEE, Tableau

Testing Methodologies: Waterfall, STLC, Agile

Databases/Database Tools: Oracle 11g/10g, SQL Server, Netezza

Documentation Tools: MS Office, MS Visio, MS Project, PowerPoint

Scheduling Tools: Autosys

EDUCATIONAL BACKGROUND

Master of Technology, VNIT, India

Projects Profile

Client: Blue Shield of California, San Francisco, CA

Position: Sr QA Analyst

Date: Aug 2019 – present

This project is part of the maintenance of the Data Store Provider Book of Records, which stores provider information from the various Blue Shield of California systems of record. Provider information includes data from vendor provider source systems and from the internal Provider Information Management portal. The objective was to ensure that all system components (new and existing) function according to specifications.

Responsibilities

Analyzed the requirements in BRD and other specification documents to understand the desired changes in the functionality of system components.

Documented the exact scope of functional, system integration, and regression testing with thorough impact analysis.

Actively participated in the daily scrum meetings to discuss tasks and issues relating to the project.

Created test documents such as the Test Plan and Test Cases/Test Scripts, covering all possible positive and negative test scenarios for impacted and dependent downstream objects.

Updated HP Application Lifecycle Management (Quality Center) tool with the test cases and scripts.

Executed UNIX scripts to load data into the intake layer, staging layer, and structural data model.

Validated the Informatica workflows for connections and properties.

Performed DDL validations as part of functional testing.

Performed data validation in Netezza, both quantitative checks using record counts and qualitative checks through field-by-field verification.
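
As a sketch, both kinds of checks can be expressed in SQL; the staging and target table names below are hypothetical, and Netezza's EXCEPT operator handles the row-by-row comparison:

-- Quantitative check: row counts should match between staging and target
SELECT 'STAGE' AS layer, COUNT(*) AS row_cnt FROM stg.provider_stage
UNION ALL
SELECT 'TARGET', COUNT(*) FROM edw.provider_dim;

-- Qualitative check: any rows returned indicate field-level mismatches
SELECT provider_id, provider_name, npi, effective_date
FROM   stg.provider_stage
EXCEPT
SELECT provider_id, provider_name, npi, effective_date
FROM   edw.provider_dim;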

Verified that all expected data was loaded from source to target as part of data consistency and completeness checks.

Executed referential integrity and default-value validations as part of data integrity testing.
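
A minimal sketch of these two validations, assuming hypothetical fact and dimension tables and an agreed column default:

-- Referential integrity: fact rows whose key has no matching dimension row
SELECT f.provider_key, COUNT(*) AS orphan_rows
FROM   edw.claim_fact f
LEFT JOIN edw.provider_dim d ON d.provider_key = f.provider_key
WHERE  d.provider_key IS NULL
GROUP BY f.provider_key;

-- Default-value check: a column expected to default (e.g., to 'UNK')
-- should never arrive NULL
SELECT COUNT(*) AS bad_rows
FROM   edw.provider_dim
WHERE  specialty_cd IS NULL;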

Checked for duplicate rows in target tables caused by incorrect natural key selection.
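
A typical duplicate check groups on the natural key; any rows returned are duplicates (the key columns below are hypothetical):

-- Assumed natural key: (provider_id, source_system)
SELECT provider_id, source_system, COUNT(*) AS dup_cnt
FROM   edw.provider_dim
GROUP BY provider_id, source_system
HAVING COUNT(*) > 1;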

Participated in the System Integration Testing to test the functionality across the modules and interfaces.

Performed end-to-end testing in the release environment to ensure modules interact appropriately, with proper functioning of parameter files, load sequences, and cross-reference/look-up tables.

Documented the test results and test result summary and walked through them with the lead.

Reported any defects/issues in JIRA and helped the dev team resolve issues quickly through detailed investigation and by providing the likely root cause.

Created automated scripts using an Infosys in-house tool, where applicable, to run regression for source-to-staging and staging-to-target tables.

Participated in the prep-work needed for UAT validations and was involved in UAT support.

Tools: Aginity Workbench for Netezza, Netezza, Informatica PowerCenter Client 10.2.0, PuTTY, JIRA, HP ALM, WinSCP

Client: Wells Fargo (Dealer Services- Auto) Concord, CA

Position: QA Lead

Date: Feb 19, 2019 – Aug 2, 2019

This project is part of Wells Fargo Dealer Services, with a primary focus on WF Auto Collections and Repossessions to address regulatory issues. The data warehouse serves as a repository mainly for AFS feeds, including all loan-related transactions, and CARS feeds from collections and recovery accounting functions, providing a simplified approach to assess, aggregate, analyze, and report on risk across Wells Fargo.

Responsibilities

Participated in user story refinement to clearly evaluate acceptance criteria in JIRA and to discuss and understand the requirements for DW updates and QA validations.

Helped create and translate user stories defining the functions the system must provide, based on a thorough understanding of requirements.

Actively participated in the daily scrum meetings to discuss tasks and issues relating to the project.

Created Test documents such as Test Plan, Test Cases / Test Scripts with thorough understanding of the requirements and careful impact analysis of dependent objects.

Coordinated with the upstream application team on the availability of test files, and with developers on the correct job execution process.

Validated files using UNIX and reported any issues to the concerned team.

Validated ETL sessions and workflow details, including parameter files, before execution, and reported any transformation errors observed in the Informatica Monitor and session logs.

Coordinated with DBAs when table refreshes were required in the QA environment to keep it in sync with Production.

Created and executed test cases in Informatica DVO.

Executed SQL queries for Oracle object validations, documented the test results and test result summary, and walked through them with the lead.

Reported any test result discrepancies and helped the dev team resolve issues quickly through detailed investigation and by providing the likely root cause.

Updated the HP Application Lifecycle Management (Quality Center) tool with the defects observed and followed up with the dev team for defect resolution.

Created automated scripts using the Data Validation Option (DVO) tool, where applicable, to run regression for source-to-staging and staging-to-target tables.

Executed and validated backfill process jobs, correcting issues related to missing or erroneous data.

Participated in monthly maintenance validation activities, such as validating encrypted fields, regression testing the remaining fields, and validating new fields and their expected values.

Regression tested existing objects as part of process updates.

Participated in post-production deployment validations.

Tools: Toad for Oracle 13.0, Informatica PowerCenter Client 10.2.0, Informatica DVO (Data Validation Option) 10.2.0, PuTTY, JIRA, HP ALM

Client: Wells Fargo (Wealth Management Group) San Francisco

Position: QA Lead

Date: Nov 13, 2017 – Jan 31, 2019

The source platform for Abbot Downing changed from the SEI Trust 3000 platform to the SEI Wealth Platform (SWP).

This change resulted in modifications to existing applications in terms of new field mappings and new functionality. Work involved validations in the process of sourcing, hosting, and reporting many such applications, each treated as a separate project, viz. Quantifacts Fees, AD Agg Master, AD Portfolio Management, Beta Bridge, and Investment Manager Navigator Reporting.

Responsibilities:

Analyzed available documents (BRD, FSD, and TSDs) and discussed with other teams to understand the requirements for QA validations.

Clarified requirements by communicating with business users.

Created test plans for ETL and OBIEE validations of the above applications.

Created and documented test scenarios and test scripts for Smoke, GUI, Functional, and Data Integrity Testing for ETL and OBIEE validations.

Analyzed data displayed on previous reports to derive the exact join conditions and tables/views to be used for new reports.

Thoroughly validated all Informatica mappings, workflows, target tables/files, triggers, and audit tables using complex SQL queries.
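
One common pattern for such mapping validations is to recompute the transformation rule on the source side and compare it with what the ETL loaded; any rows returned are defects. A sketch with hypothetical tables and a made-up status-code rule:

-- Expected vs. actual: recompute the mapping rule and flag mismatches
SELECT s.acct_id,
       CASE s.status_cd WHEN 'A' THEN 'Active'
                        WHEN 'C' THEN 'Closed'
                        ELSE 'Unknown' END AS expected_status,
       t.status_desc                       AS actual_status
FROM   src.account     s
JOIN   tgt.account_dim t ON t.acct_id = s.acct_id
WHERE  t.status_desc <> CASE s.status_cd WHEN 'A' THEN 'Active'
                                         WHEN 'C' THEN 'Closed'
                                         ELSE 'Unknown' END;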

Created test data when needed for accurate execution of all test scenarios.

Executed regression tests for ETL and OBIEE validations.

Validated the subject areas in OBIEE used for Ad-hoc Reporting by Business Users.

Validated the filter criteria, grouping, etc., used in reports against the business logic.

Logged issues and defects in JIRA and ALM with clear descriptions, screenshots, and comments.

Helped dev teams resolve issues/defects quickly through detailed investigation and by providing the likely root cause.

Worked with dev teams to map new source fields accurately to reporting fields per business logic.

Validated Autosys job parameters for automated batch cycles.

Collaborated with multiple teams in resolving issues.

Initiated requests for, and monitored installation of, new versions of required software applications for the QA team.

Contributed to project compliance for all documentation, including Test Plan, Test Case, and Test Summary artifacts.

Assisted in identifying missing/erroneous data and disparities/inconsistencies between data sources, as well as validating accuracy and consistency with stated business rules.

Created a defect log and metrics summary to report to management.

Coordinated with offshore team members for various testing activities.

Environment: Windows 10, Informatica 10/10.2, OBIEE 11g, IBM Data Studio 4.1.3, HP ALM (Application Lifecycle Management) 12.53, SQL, PL/SQL, Oracle 11g, UNIX, Informatica DVO 10.2.0, Toad

Client: Credit Suisse, NYC, NY

Position: Sr. QA Analyst

Date: Apr 1, 2015 – Oct 27, 2017

Credit Suisse is a leading wealth manager with strong investment banking capabilities. This Wealth Management project deals with building and enhancing Client Analytical Services (CAS). The CAS data mart is intended to provide analytics and data management capabilities that enable deeper client and FA insight to influence marketing, sales, and service strategies.

Responsibilities:

Analyzed functional and business requirements and technical specifications to determine intended functionality and to develop test scenarios for system and functional tests.

Developed unit test cases, configured source and target environments and prepared test data for testing.

Designed test scripts and tested SCD Type I, II, and III dimensions involving transformations such as Joiner, Lookup, Update Strategy, Expression, and Filter.
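
For SCD Type II in particular, two sanity checks are standard: exactly one current row per business key, and no overlapping effective-date ranges. A sketch against a hypothetical customer dimension (assumes eff_end_dt is populated, e.g., with a high date for the current row):

-- (1) Exactly one current row per business key
SELECT customer_id, COUNT(*) AS current_rows
FROM   dw.customer_dim
WHERE  current_flag = 'Y'
GROUP BY customer_id
HAVING COUNT(*) <> 1;

-- (2) No overlapping effective-date ranges for the same business key
SELECT a.customer_id, a.dim_key, b.dim_key
FROM   dw.customer_dim a
JOIN   dw.customer_dim b
  ON   b.customer_id   = a.customer_id
 AND   b.dim_key       < a.dim_key
 AND   a.eff_start_dt <= b.eff_end_dt
 AND   b.eff_start_dt <= a.eff_end_dt;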

Tested complex ETL mappings and sessions, based on business user requirements and business rules, to load data from source files and RDBMS tables into target tables.

Worked with the BI team on database testing, writing validation SQL queries in SQL Developer and Toad.

Wrote several complex SQL queries for validating Tableau reports.
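
Such report validations typically reproduce the dashboard's aggregation directly against the data mart so the figures can be tied out; a sketch with hypothetical tables and dates:

-- Totals here should match the Tableau report for the same as-of date
SELECT region,
       SUM(market_value)          AS total_mv,
       COUNT(DISTINCT account_id) AS account_cnt
FROM   mart.position_fact
WHERE  as_of_date = DATE '2017-09-30'
GROUP BY region
ORDER BY region;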

Assisted business users in creating and executing UAT test scenarios under the guidance of the Test Manager.

Environment: Windows 7, Informatica 9, OBIEE 11g, QTP (QuickTest Professional), HP ALM (Application Lifecycle Management), SQL, PL/SQL, Oracle 11g, UNIX, Toad, Tableau

Client: Magellan Health, Richmond, VA

Position: QA Analyst

Date: Apr 2, 2012 – Mar 31, 2015

Responsibilities:

Analyzed business requirements, system requirements, and data mapping requirement specifications.

Developed complex SQL scripts to validate the data loaded into warehouse and Data Mart tables using Informatica.

Performed regression, smoke, sanity, and data quality testing by writing complex SQL queries.

Prioritized and identified regression test cases based on the dependencies for each data move.

Involved in analyzing requests for new, enhanced, and modified systems against user and business requirements.

Conducted functional testing to ensure the application meets the requirements and to verify usability readiness.

Involved in creating defect analysis reports for System Test and UAT.

Validated data for completeness and quality when moving complete deposit data from files to the staging layer and onward.

Environment: SQL, UNIX, Toad, SSIS, Windows NT, Sun Solaris, MS SQL Server, Rally, Oracle 10g


