VinayKumar Dori
Data Lead, ETL Testing, Data Analyst, Data Warehousing and BI Report Testing
Mobile: +1-657-***-****
Email: *****.******@*****.***
LinkedIn: https://www.linkedin.com/in/vinaykumardori7507144b/
Professional Summary:
●With 13.5+ years of experience in Software Quality Assurance, I specialize in testing ETL/Data Warehousing, BI reports, Azure Cloud, ADF, Databricks, databases, data analysis, Spark, Matillion, and MDM solutions (including Reltio and Stibo), along with web and client-server applications.
●Extensive experience across the Banking, Insurance, Healthcare, and Telecom industries, working with prominent clients such as Wells Fargo, Microsoft, UNUM, Delta Dental, Hertz, CNO Financial, Kaiser Permanente, Novartis, Northern Trust, COX Communications, and HyperionX.
●Proficient in Agile and Waterfall software development methodologies.
●Expertise in ETL processes, supporting data extraction, transformation, and loading using tools such as Informatica and SQL Server Integration Services (SSIS).
●Skilled in testing BI applications, including Qlikview, OBIEE, MSBI, Angular, Power BI, and Tableau.
●Strong ability to write and test complex SQL and Oracle queries for BI report validation.
●Extensive experience with ETL testing using Informatica PowerCenter, including components such as Source Analyzer, Warehouse Designer, Transformation Developer, and Mapplet Designer.
●Adept at performing data analysis, validation, cleansing, standardization, verification, and resolving data discrepancies.
●Expertise in Snowflake and Azure cloud environments, including loading nested JSON data into Snowflake tables.
●Significant experience in analyzing, designing, and testing data warehousing solutions, with a focus on extraction, transformation, and loading using Ab Initio, and familiarity with job scheduling using Autosys.
●Proven ability in gathering business requirements, conducting product demonstrations, and managing client interactions.
●Experience in data quality and ingestion testing using Azure Cloud, Power Exchange, and Informatica Test Data Management tools.
●Involved in various data migration projects, including SSIS to cloud, Informatica upgrades, and Teradata to SQL Server migrations.
●Strong understanding of data warehousing concepts, including fact and dimension tables, and Star/Snowflake schema modeling.
●Worked on data conversion projects, performing data reconciliation between QA/UAT and production environments.
●Involved in data cleansing projects, ensuring data optimization for improved retrieval and usability.
●Extensive experience in analyzing business, technical, and functional requirements, and developing, executing, and testing test plans, cases, and strategies.
●Led QA efforts for ETL processes in Master Data Management (MDM) systems, validating the transformation and integration of large-scale data across multiple sources using tools like Informatica, Reltio, and Stibo.
●Experienced in defect tracking and analysis, using tools like TFS (Team Foundation Server) and HP ALM.
●In-depth knowledge of Snowflake database structures, schemas, and table designs.
●Managed offshore/onsite models and coordinated offshore resources effectively.
●Proficient in Azure, Jira, HP Quality Center, and MTM, handling test estimation, planning, design, execution, and defect tracking.
●Skilled in JavaScript for responsive website development and familiar with Python (NumPy, Pandas, PySpark) scripting.
●Extensive experience in data migration testing from Informatica PowerCenter and SSIS to Azure Cloud, defining migration strategies and plans.
●Participated in RFP (Request for Proposal) meetings with banking and insurance clients.
Education Qualifications:
●Master of Computer Applications (MCA) from Osmania University in 2010
●Bachelor of Science (BSc, MPC) from Kakatiya University in 2007
Technical Skills:
Azure, Cloud, ETL, Database Testing, Big Data Testing, Data Warehouse, SQL Server, Informatica, SSRS, SSIS, MSBI, Reltio, MDM, OBIEE, BI, Angular Dashboards, QlikView, Oracle, Informatica Test Data Management, Data Analysis, Teradata, Databricks, DataStage, Matillion, Autosys, Tableau, Jira, HP QC, MTM, SDLC, STLC, Data Cleansing, AWS, Spark, Data Migration, Snowflake
Relevant Project Experience
Client: Delta Dental Role: ETL/SQL/BI Test Lead Duration: May 2023 – Present, Location: California, USA
Roles and responsibilities:
●Developed Snowpipe for continuous and automated data loading.
●Designed and executed test cases for ETL pipelines in Databricks, ensuring accurate and efficient data transformations.
●Utilized Databricks notebooks for real-time data validation and quality checks, detecting and resolving discrepancies.
●Collaborated with data engineers to optimize Spark jobs in Databricks, reducing data processing time by 20%.
●Implemented automated data quality monitoring with Databricks, enhancing the detection and resolution of data issues.
●Integrated data from multiple source systems into Snowflake Cloud Data Warehouse and Azure.
●Performed end-to-end data validation between source and target systems using the Snowflake database.
●Worked on highly efficient data pipelines to ingest, store, and process data from various sources using Databricks.
●Developed and executed complex SQL queries to verify data quality, accuracy, and consistency across MDM repositories, ensuring compliance with data governance standards during data migration and transformation.
●Analyzed and validated data transformations, integrations, and syncs between ETL pipelines and MDM systems like Reltio, STIBO, and others.
●Used Matillion ETL for data transformations and Power BI for reporting.
●Created CI/CD pipelines with Azure Data Factory (ADF) to automate end-to-end testing processes.
●Validated data ingested from the Azure bronze zone into Snowflake using Scala and SQL, and prepared a regression suite to check data after ingestion from source files into Snowflake.
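The source-to-target validation work above can be illustrated with a minimal sketch. The table and column names below are purely illustrative (not the actual Delta Dental schema), and SQLite stands in for Snowflake; the row-count reconciliation and EXCEPT-style diff are the core techniques.

```python
import sqlite3

# Illustrative source-to-target validation; 'members' tables are hypothetical.
conn = sqlite3.connect(":memory:")
cur = conn.cursor()
cur.executescript("""
    CREATE TABLE src_members (id INTEGER, name TEXT, plan TEXT);
    CREATE TABLE tgt_members (id INTEGER, name TEXT, plan TEXT);
    INSERT INTO src_members VALUES (1, 'A', 'PPO'), (2, 'B', 'HMO');
    INSERT INTO tgt_members VALUES (1, 'A', 'PPO'), (2, 'B', 'DMO');
""")

# 1. Row-count reconciliation between source and target.
src_count = cur.execute("SELECT COUNT(*) FROM src_members").fetchone()[0]
tgt_count = cur.execute("SELECT COUNT(*) FROM tgt_members").fetchone()[0]

# 2. EXCEPT query: rows present in source but missing or changed in target.
diff = cur.execute(
    "SELECT * FROM src_members EXCEPT SELECT * FROM tgt_members"
).fetchall()

print(src_count == tgt_count)  # counts match
print(diff)                    # the mismatched row surfaces here
```

Counts alone can match while values drift, which is why the EXCEPT diff is run in both directions in a full regression suite.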
Client: Hertz Role: ETL Engineer Duration: July 2022 – Apr 2023, Location: Estero, Florida, USA
Roles and responsibilities:
●Designed and developed QA documentation, including test cases and test scenarios, from business and functional requirements.
●Performed data validation by pulling data from the Hertz data warehouse and validating it against the mapping sheet.
●Applied data engineering expertise to load data using ADF pipelines.
●Built bulk ETL processes that ingest transactional data from various sources into target tables.
●Tested complex Ab Initio graphs with various components, such as Join and Join with DB, including database validations.
●Worked extensively with PowerCenter tools such as Designer, Workflow Monitor, and Workflow Manager to monitor and schedule sessions and workflows.
●Managed and collaborated on code repositories using GitHub, ensuring seamless integration and continuous delivery.
●Managed ETL QA processes for migrating MDM data from on-premises to the cloud using Reltio, ensuring data integrity and seamless integration.
●Ensured accurate data lineage and traceability for all master data sources across the ETL and MDM ecosystem.
●Performed end-to-end data validations between source and target using the Snowflake database.
●Identified the requirements and arranged peer reviews, walkthroughs and sign off meetings.
●Performed regression testing using a Python (NumPy and Pandas) automation tool.
●Used the Control-M scheduler to schedule jobs and load data into the QA environment.
●Used DataStage as the ETL tool for the project, performing data validations according to the mapping documents.
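The Pandas-based regression testing mentioned above can be sketched as a baseline-versus-current comparison. The reservation data here is hypothetical; the technique is an outer merge with an indicator column to surface added, dropped, or changed rows between runs.

```python
import pandas as pd

# Hypothetical baseline extract vs. fresh QA extract of the same query.
baseline = pd.DataFrame({"res_id": [101, 102, 103], "amount": [250.0, 90.0, 40.0]})
current = pd.DataFrame({"res_id": [101, 102, 104], "amount": [250.0, 95.0, 60.0]})

# Outer merge with an indicator flags rows that exist on only one side,
# then a value comparison flags rows whose amounts changed between runs.
diff = baseline.merge(current, on="res_id", how="outer",
                      suffixes=("_base", "_curr"), indicator=True)
mismatches = diff[(diff["_merge"] != "both") |
                  (diff["amount_base"] != diff["amount_curr"])]
print(mismatches[["res_id", "_merge"]])
```

In practice the baseline comes from a trusted prior run, so an empty `mismatches` frame means the regression passed.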
Project Title: Accelerator – Hyperion Client: Hyperion Insurance Group Company: Accenture, India
Role: ETL/BI Tester/Test Lead/Data Analyst Duration: Aug 2018 – June 2022
Roles and responsibilities:
●Managed a team of testers, leading testing efforts for multiple projects to meet user requirements and technical specifications in an Agile environment with two-week iterations.
●Handled offshore scrums, story planning, and sprint estimations.
●Performed data analysis, data migration, and data validation in the cloud environment after code migration from Informatica PowerCenter, Informatica PowerExchange, and Test Data Management.
●Created, executed, and maintained detailed test plans, test cases, and test scripts to verify data accuracy, consistency, and completeness in Stibo-based MDM implementations.
●Automated end-to-end ETL testing for MDM platforms with a regression test suite, significantly reducing manual testing effort and improving overall test coverage for data validation.
●Performed data migration testing from various databases to Snowflake.
●Used MDM to pull data from multiple sources and match records to a single person based on demographics and other identifiers, creating a golden record.
●Validated data integration and migration processes within the Stibo MDM environment, ensuring seamless data flow between source systems and MDM repositories
●Tested the data acquisition, data transformation and data cleansing approach for the MDM implementation
●Used the Spark framework to pull data into the ingestion layer and transform files into Parquet format.
●Performed end-to-end data validations between source and target using the Snowflake database.
●Developed a data migration test suite to compare data between older and newer versions.
●Participated in regression testing, UAT, and data conversion checks between QA and Production.
●Validated Informatica transformations between databases based on mapping documents.
●Managed defects in Azure and Jira for the different modules of the project
●Worked on MDM-related data quality issues and verified data quality parameters in MDM.
●Performed data reconciliation between QA and Production EDW databases using the NBI automation tool.
●Used SQL Server to verify reports between backend and frontend data.
●Participated in the migration from SSIS to the cloud platform, defining the migration strategy and plan.
●Implemented Python as an automation tool for regression testing and target data validations.
●Created and scheduled Control-M jobs, setting up job dependencies and alerts per application-team and business requirements.
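The MDM golden-record matching described above can be sketched in a few lines. The records and the match rule below are illustrative only; real MDM platforms such as Reltio and Stibo apply far richer fuzzy-match and survivorship rules.

```python
from collections import defaultdict

# Hypothetical person records from two source systems.
records = [
    {"source": "CRM",    "first": "John", "last": "Doe", "dob": "1980-01-02", "phone": ""},
    {"source": "Claims", "first": "JOHN", "last": "DOE", "dob": "1980-01-02", "phone": "555-0100"},
    {"source": "CRM",    "first": "Jane", "last": "Roe", "dob": "1975-07-14", "phone": "555-0111"},
]

def match_key(rec):
    # Deterministic match rule: normalized name plus date of birth.
    return (rec["first"].lower(), rec["last"].lower(), rec["dob"])

# Cluster records that refer to the same person.
clusters = defaultdict(list)
for rec in records:
    clusters[match_key(rec)].append(rec)

# Survivorship: take the first non-empty value for each attribute.
golden = []
for recs in clusters.values():
    merged = {f: next((r[f] for r in recs if r[f]), "")
              for f in ("first", "last", "dob", "phone")}
    golden.append(merged)

print(len(golden))  # 2 golden records from 3 source records
```

QA for such a pipeline then asserts that cluster counts, attribute survivorship, and source-record traceability all match the mapping and match-rule specifications.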
Project Title: NTS (National Transplant Services) Client: Kaiser Permanente Company: Cognizant, India
Role: Data Analyst/ETL/BI Tester/Test Lead/MDM Tester Duration: Aug 2017 – July 2018
Roles and responsibilities:
●Performed data migration testing between Informatica PowerCenter and backend data.
●Worked extensively with PowerCenter tools such as Designer, Workflow Monitor, and Workflow Manager to monitor and schedule sessions and workflows.
●Solid understanding of FHIR core concepts, including resources, profiles, extensions, and operations.
●Identified the requirements and arranged peer reviews, walkthroughs and sign off meetings.
●Tested the application used for checking eligibility, claims processing, and claim status.
●Validated patient claim status, claims processing, and benefits as part of end-to-end validations.
●Exported Manual Test Cases from MS Excel template directly to HP Quality Center and executed all the Test Cases in Quality Center with Pass/Fail/Blocked status
●Used Informatica PowerCenter as the ETL tool and validated Informatica mappings according to the mapping documents shared by BAs.
●Performed data validation and data reconciliation between latest changes in QA with existing data in production
●Expertise in creating complex SQL Queries to verify and validate the Database Updates.
●Performed end-to-end validations as part of code migrations from lower to higher Informatica versions and from Teradata to SQL Server.
Project: Northern Trust Credit Application Client: Northern Trust Corporation Company: Cognizant, India
Role: Data Analyst/ETL/BI Tester/Test Lead Duration: Apr 2017 – Aug 2017
Roles and responsibilities:
●Prepared test cases to check that data populated correctly into fact and dimension tables.
●Contributed to quality and process improvement initiatives
●Coordinated and controlled testing projects at every step of the quality cycle from test planning through execution to defect management.
●Reviewed system use cases and functional specifications with the appropriate business analyst.
Project Title: Alcon Surgical US/Canada Deploy Client: Novartis Company: Cognizant, India
Role: ETL/Tableau Testing/Lead Duration: Jan 2017 – Apr 2017
Roles and responsibilities:
●Prepared and executed test cases and test scripts with the team.
●Used DataStage to transform and load data from source to target tables, performing end-to-end ETL validations.
●Wrote queries to validate reports.
●Validated dashboards in the Tableau reporting tool.
●Worked with Ab Initio Transform, Partition, Dataset, and Database components.
●Applied debugging strategies in the Ab Initio ETL tool.
Project Title: CNO AIM Agent Incentive Management Client: CNO Financial Group Company: Cognizant, India
Position: BI/ETL Test Lead/MDM/Data Analyst Duration: Jan 2016 – Jan 2017
Roles and responsibilities:
●Performed Ab Initio mapping testing and ETL validation between source and target.
●Tested complex Ab Initio graphs with various components, such as Join and Join with DB, including database validations.
●Involved in preparation of test data to test the functionality of Ab Initio graphs
●Tracked defects using the Quality Center tool and generated defect summary reports.
●Created test cases in Quality Center and mapped them to requirements.
Project Title: CAR ECP (Credit Analysis Reporting Enterprise Common Process) Client: Wells Fargo
Company: Cognizant, India Position: BI/ETL Tester/Report Testing Duration: Jan 2015 – Dec 2015
Roles and responsibilities:
●Coordinated and controlled testing projects at every step of the quality cycle from test planning through execution to defect management.
●Used DataStage as the ETL tool for this project, pulling data into SQL Server.
●Generated and documented test cases from functional requirements and design documents.
●Performed integration, system, and customer acceptance testing; created new tests based on customer feedback during the acceptance test phase.
●Defined and executed test strategies and associated scripts for verification and validation of the application, ensuring it met all defined business requirements and associated functionality.
●Used Teradata as the backend database and performed validations between source and target data.
Project Title: Pearson Enhancement Client: Pearson Publications Company: Cognizant, India
Position: BI/ETL Test Engineer Duration: Aug 2014 – Dec 2014
Roles and responsibilities:
●Handled teamwork, client meetings, and status reports.
●Developed test plans, test cases, and test scripts.
●Created Test Cases and developed Traceability Matrix and Test Coverage reports.
●Managed and conducted System testing, Integration testing and Functional testing.
●Worked on data validation between Teradata as source and Oracle as target databases
Project Title: Microsoft Volume License Client: Microsoft Company: Accenture, India
Position: BI Testing/MSBI Report Testing/SDET Duration: May 2013 – Jul 2014
Roles and responsibilities:
●Understood the scope, wrote the ATP, and prepared test cases.
●Reviewed test cases with the engineering and development teams.
●Executed test cases and tested web services.
●Defect tracking and Report Generation in MTM
●Used TFS for bug tracking and work item tracking, and published test results in TFS to share with the client and managers.
●Used Oracle as backend database for this project and completed the data validations according to the BRD, FSD, TDD and Test plans
Project: COX Upgrade Client: COX Communications, USA Company: GGK Technologies, India
Position: ETL Test Engineer/OBIEE Report Tester Duration: Jan 2011 – Feb 2013
Roles and responsibilities:
●Created Test Cases and developed Traceability Matrix and Test Coverage reports.
●Tracked defects using the Quality Center tool and generated defect summary reports.
●Created test cases in Quality Center and mapped them to requirements.
●Used the Informatica Designer to develop processes for extracting, cleansing, transforming, integrating and loading data into data warehouse databases
●Performed end-to-end data validations after code migrations from one platform to another.