Resume

Data ETL

Location:
Milpitas, CA
Posted:
May 13, 2020

Vidya Sagar Nattala

California

714-***-****

adc7ft@r.postjobfree.com

GC

Professional Summary

8+ years of IT experience in the analysis, design, development, testing, and implementation of Business Intelligence solutions using Data Warehouse/Data Mart design, ETL, OLAP, and client/server applications

Solid back-end testing experience writing and executing SQL queries

Excellent testing experience in all phases and stages of Software Testing Life Cycle and Software Development Life Cycle (SDLC) with good working knowledge of testing methodologies, disciplines, tasks, resources and scheduling

Expertise in creating Test Plan documents, developing test strategy documents, and preparing Traceability Matrices

Expertise in designing test scenarios, scripting test cases to test the application, and preparing the Test Data Management Plan

Expert in Perl, UNIX shell scripting, and SQL for a variety of technology needs

Involved in Developing Data Marts for specific Business aspects like Marketing & Finance

Created test data, automated test cases for the system, and analyzed bug tracking reports

Extensive shell scripting experience in UNIX, Linux, and Windows environments

Extensively used ETL methodology to support data extraction, transformation, and load processing in a corporate-wide ETL solution using tools such as Informatica and DataStage

Good working experience in Oracle, Unix Shell Scripting

Strong background in databases including Oracle, MS SQL Server, and Teradata

Strong in testing Stored Procedures, Functions, Triggers and packages utilizing PL/SQL

Good experience in data modeling using Star and Snowflake schemas; well versed in UNIX shell wrappers, KSH, and Oracle PL/SQL programming

Experience in testing and writing complex SQL, T-SQL and PL/SQL statements to validate the database systems and for backend database testing

Technical Skills Summary

Big Data: Hadoop, Hive, GCP BigQuery, Sqoop, Kafka, Spark (PySpark) and Tableau.

Programming: Python, Shell scripting, Understanding of Core Java.

RDBMS: Oracle 10g & 11g, Teradata. ETL: Informatica.

Wipro Technologies 06/2019 – Present

Test Lead

Client: Kohl’s, CA.

June 2019 to Present

PROJECT DESCRIPTION:

The scope of this project is to process sales and customer data to analyze customer behavior (Customer 360), the impact of offers (loyalty programs), and to generate data for campaigns.

Responsibilities:

Develop the automation scripts using UNIX to validate Hive & PySpark applications. Coordinate with all the QA Leads to prepare automated scripts for regression testing.

Validate the webstore data application, which provides transactional data for the marketing team, using Hive.

Validate the PySpark applications sourcing data from Mosaic (ecom) & Sales Hub system to generate Analytical reports of sales.

Validate the Demand, Fulfilled Sales data & Offers to analyze the various loyalty programs using Spark.

Validate incremental model for multiple databases to ingest data from GCP MySQL source to GCP bucket using Sqoop.

Validate the Pig scripts to process online transactions data.

Validate the reports in Tableau using the aggregated data to give a visual representation.

Analyze the aggregated data migrated from Hive to BigQuery for faster query responses for the business teams.

Worked closely with GCP DevOps, Security developers to fix environmental issues and integrate projects into GCP environment.

Worked closely with Product Owners (PO) for requirements, Jira stories, and source-to-target mappings (STM). Analyzed STMs based on source databases.

Environment: Google Cloud Platform (GCP), HDFS, Hive, Sqoop, Gerrit, Kafka, Linux Shell, Python, Spark, Agile and Scrum model.
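The source-to-target validations described above follow a common pattern: compare counts between the source system and the loaded target. A minimal sketch of that pattern, using sqlite3 as a stand-in for the actual Hive/BigQuery connections (table names here are hypothetical, not from the project):

```python
import sqlite3

def validate_counts(conn, source_table, target_table):
    """Compare row counts between a source and target table.

    A stand-in for warehouse-to-warehouse validation; sqlite3
    replaces the real Hive/BigQuery clients, and the table names
    are illustrative only.
    """
    cur = conn.cursor()
    src = cur.execute(f"SELECT COUNT(*) FROM {source_table}").fetchone()[0]
    tgt = cur.execute(f"SELECT COUNT(*) FROM {target_table}").fetchone()[0]
    return {"source": src, "target": tgt, "match": src == tgt}

# Demo with in-memory tables standing in for source and target.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE sales_src (id INTEGER, amount REAL)")
conn.execute("CREATE TABLE sales_tgt (id INTEGER, amount REAL)")
conn.executemany("INSERT INTO sales_src VALUES (?, ?)", [(1, 10.0), (2, 20.0)])
conn.executemany("INSERT INTO sales_tgt VALUES (?, ?)", [(1, 10.0), (2, 20.0)])
result = validate_counts(conn, "sales_src", "sales_tgt")
```

In practice such a check is usually only the first gate; column-level checksums or sampled row comparisons follow once counts match.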

Bank of America, Plano, TX Aug 2017 – Jun 2019

ETL Tester

Environment: Informatica, UNIX, Hadoop/Bigdata, Oracle, Teradata, ALM/HP Quality Center, Agile, PL/SQL, Cognos

Responsibilities:

Extensively tested several Cognos reports for data quality, fonts, headers, and cosmetic issues

Highly experienced with different RDBMS and NoSQL databases such as Oracle, MS SQL Server, Hadoop, MongoDB, Cassandra, Teradata, and MySQL, along with the ETL tool Talend

Proficient experience in different Databases like Oracle, SQL Server, DB2 and Teradata

Tested data migration to ensure that integrity of data was not compromised.

Automated and scheduled the Informatica jobs using UNIX Shell Scripting

Coordinating data movement and validation with third party vendors.

Implementing ATDD Test concepts for defined User Stories.

Wrote PL/SQL stored procedures for certain data validations

Tested several Data warehouse ETL Informatica Mappings to validate the business conditions.

Wrote SQL and PL/SQL scripts to perform database testing and to verify data integrity.

Involved in testing the Cognos reports by writing complex SQL queries

Worked on issues with migration from development to testing

Created Requirement Traceability Matrix by mapping the requirements with test cases to ensure that all requirements are covered in testing

Writing Test Hive queries to test development algorithms in Hadoop cluster

Validated data moving from heterogeneous sources to target using Informatica

Monitored the workflow transformations in Informatica work flow monitor

Preparing manual test cases and test data for Scenarios

Developed test plans, test cases and test scripts for data validation testing.

Experience using query tools for Oracle, DB2 and MS SQL Server to validate reports and troubleshoot data quality issues.

Experienced in ETL testing on data warehouse projects using Informatica, Ab Initio, and DataStage

Experience in creating UNIX scripts for file transfer and file manipulation
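Validating Informatica mappings, as in the bullets above, often comes down to a "minus query": rows present in the source but missing from the target after the load. A small sketch of that check, using sqlite3 and hypothetical table/column names:

```python
import sqlite3

def minus_check(conn, source_table, target_table, columns):
    """Return rows present in the source but missing from the target
    (a 'minus query', a standard ETL-testing pattern). Table and
    column names here are illustrative, not from any actual mapping.
    """
    cols = ", ".join(columns)
    sql = (f"SELECT {cols} FROM {source_table} "
           f"EXCEPT SELECT {cols} FROM {target_table}")
    return conn.execute(sql).fetchall()

# Demo: one source row never made it into the target.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE src (id INTEGER, name TEXT)")
conn.execute("CREATE TABLE tgt (id INTEGER, name TEXT)")
conn.executemany("INSERT INTO src VALUES (?, ?)", [(1, "a"), (2, "b"), (3, "c")])
conn.executemany("INSERT INTO tgt VALUES (?, ?)", [(1, "a"), (2, "b")])
missing = minus_check(conn, "src", "tgt", ["id", "name"])
```

An empty result means the mapping moved every source row; any returned rows point directly at the records the transformation dropped.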

Cytel, Waltham, MA Mar 2017 – Aug 2017

ETL Tester

Environment: Data Stage, UNIX, Teradata, Hadoop, Oracle, PL/SQL, ALM/HP Quality Center, Web Services, Agile, Cognos

Responsibilities:

Created various PL/SQL stored procedures for dropping and recreating indexes on target tables.

Tested whether the reports developed in Cognos are as per company standards.

Managed and executed the test process, using Agile Methodology.

Prepared and validated several SQL queries for data retrieval from Teradata and used Teradata utilities including MLOAD, FLOAD, FASTEXPORT, BTEQ, and TPUMP

Performed pre-migration assessment, migration, and post-migration validation.

Validated data moving from different sources to target using Informatica

Prepared test data based on data mapping document.

Have experience in validating the business transformations by writing complex SQL queries.

Executing Autosys jobs and validating the source and the target files.

Extensively used ETL methodology for supporting data extraction, transformations and loading processing, in a corporate-wide ETL solution using Informatica, SSIS, DataStage and Ab Initio

Tested Informatica ETL mappings that transfer data from Oracle source systems to the Data Mart.

In depth technical knowledge and understanding of Data Warehousing, Data Validations, SQL, and Hadoop

Review each PBI with the Product Owner as part of Definition of Done.

Hands on experience in using Autosys scheduling.

Onsite/offshore coordination on a daily basis.

Wrote SQL, PL/SQL Testing scripts for Backend Testing of the data warehouse application.

Developed Test Plan, Test Cases, Test Data and Test Summary Reports and followed Agile/Scrum process.

Created UNIX scripts for file transfer and file manipulation

Identified business rules for data migration and performed data validations.

Involved in various testing phases like Integration Testing, System Testing, Regression testing and Inactive prod testing.

Managed and reviewed Hadoop log files.

Ensured defect management activities for all bugs identified, reported, and worked on.

Draft and review test plans every sprint.

Tested several DataStage jobs, ran them for loading purposes, and checked the log files

Prepared Test Cases for the mappings developed through the tool Data Stage and executed those Test Cases

Worked with the ETL group to understand DataStage jobs for dimensions and facts.

Assisted in promotion of Data Stage code and UNIX Shell scripts from UAT to Production.
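One recurring validation against Teradata targets, implied by the business-transformation checks above, is a primary-key uniqueness test: no key combination should load more than once. A minimal sketch with sqlite3 standing in for Teradata (the table and key names are assumptions):

```python
import sqlite3

def duplicate_check(conn, table, key_columns):
    """Find key values that occur more than once in a target table --
    a typical uniqueness validation after a load. sqlite3 stands in
    for the real warehouse; names are hypothetical.
    """
    keys = ", ".join(key_columns)
    sql = (f"SELECT {keys}, COUNT(*) AS cnt FROM {table} "
           f"GROUP BY {keys} HAVING COUNT(*) > 1")
    return conn.execute(sql).fetchall()

# Demo: order (2, 1) was loaded twice.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE orders (order_id INTEGER, line INTEGER)")
conn.executemany("INSERT INTO orders VALUES (?, ?)", [(1, 1), (2, 1), (2, 1)])
dupes = duplicate_check(conn, "orders", ["order_id", "line"])
```

Each returned row names a duplicated key plus its occurrence count, which makes triaging the offending load batch straightforward.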

Alta, Madison, WI Jan 2016 – Mar 2017

ETL Tester

Environment: Data Stage, Oracle, SQL, PL/SQL, UNIX, Teradata, ALM/HP Quality Center 11, TOAD, Cognos, MS Excel, Agile, XML, XSD, XSLT, XML Spy

Responsibilities:

Wrote extensive UNIX shell scripts for error logging, cleanup, data archiving, and job scheduling

Conducted backend testing by querying databases to synchronize testing databases and checked for data integrity and proper routing based on workflow rules at each step.

Wrote Teradata SQL queries and created tables and views following Teradata best practices

Generated the detailed Bug reports and Defects were tracked, reviewed and analyzed.

Involved in meetings to discuss the findings in the executed tests and decide the next steps.

Developed and Performed execution of Test Scripts manually to verify the expected results.

Validated that the application meets requirements and functions per the technical and functional specifications.

Created complex SQL queries for querying data against different data bases for data verification process.

Tested new Data migration tool used in the mapping of data to new system.

Migrated raw data to Excel documents for clients

Participated in defining and executing test strategies using agile methodology.

Tested Complex ETL Mappings and Sessions based on business user requirements and business rules to load data from source flat files and RDBMS tables to target tables.

Created Test cases for Data Acquisition and Data Delivery and tested the data accuracy for all the transformations.

Executed the Test cases for Cognos Reports.

Used TOAD & SQL Navigator GUI tools for Querying Database.

Interacted with the Business users to identify the process metrics and various keys dimensions and measures.

Developed the detail ETL Test Plan, Test Resource Utilization and Close Down document for multiple products.

Developed complex queries to test the reports developed.

Involved in testing the XML files and checked whether data is parsed and loaded to table.

Tested the data and data integrity among various sources and targets.
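The XML-file testing mentioned above — checking that the feed parses and that each record carries the fields the load expects — can be sketched with the standard library. The record schema here (record/id/amount) is purely illustrative:

```python
import xml.etree.ElementTree as ET

# Hypothetical required elements; the real feed's schema would differ.
REQUIRED_FIELDS = ("id", "amount")

def validate_xml_records(xml_text):
    """Parse an XML feed and report (record_index, field) pairs for
    any required fields that are missing, before the data is loaded
    to a table. Schema and field names are assumptions.
    """
    root = ET.fromstring(xml_text)
    errors = []
    for i, rec in enumerate(root.findall("record")):
        for field in REQUIRED_FIELDS:
            if rec.find(field) is None:
                errors.append((i, field))
    return errors

sample = """<feed>
  <record><id>1</id><amount>9.99</amount></record>
  <record><id>2</id></record>
</feed>"""
problems = validate_xml_records(sample)
```

In a real pipeline this pre-load check would typically be paired with XSD validation (the environment lists XSD/XML Spy), which catches structural errors the field scan above does not.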

Baxter Health, Chicago, IL Aug 2014 to Dec 2015

ETL Tester

Environment: Teradata SQL Assistant, Oracle, MicroStrategy, SQL Server, ALM/HP Quality Center, SQL, TOAD, XML, XSLT, XPath, XQuery.

Responsibilities:

Validated the data of reports by writing SQL queries in PL/SQL Developer against ODS

Performed end-to-end testing of applications and coordinated with other teams.

Created ETL execution scripts for automating jobs

Prepared Execution procedure document format to prepare the Test cases based on mapping document.

Ensured that the mappings are correct and Conducted data validation testing

Worked with business analyst and developers in setting up test data

Tested several Data warehouse ETL Informatica Mappings to validate the business conditions.

Extensively tested several MicroStrategy reports for data quality, fonts, headers, and cosmetic issues

Writing complex SQL queries for data validation for verifying the ETL Mapping Rules

Extensively tested the reports for data accuracy and universe-related errors

Used Toad for query Oracle, SQL Assistant for Teradata and SQL Management studio for SQL Server

Used T-SQL for Querying the SQL Server database for data validation and data conditioning

Tested several dashboards and deployed them across the organization to monitor the performance

Used SQL tools to run SQL queries and validate the data loaded into the target tables

Involved in extensive DATA validation using SQL queries and back-end testing

Tested Analytical Data Mart (ADM) of Oracle system and Stored Procedures

Tested Operation Data Store process of ODS/DWL product

Used workflow manager for session management, database connection management and scheduling of jobs
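Validating report data against the ODS, as described above, usually means reconciling an aggregate shown on the report with the same aggregate computed directly in SQL. A small sketch, with sqlite3 in place of the real ODS and hypothetical table/column names and tolerance:

```python
import sqlite3

def reconcile_report_total(conn, table, column, report_total, tolerance=0.01):
    """Compare a total displayed on a report against the underlying
    database aggregate. Table/column names and the tolerance are
    assumptions; sqlite3 stands in for the actual ODS.
    """
    (db_total,) = conn.execute(f"SELECT SUM({column}) FROM {table}").fetchone()
    return abs(db_total - report_total) <= tolerance

# Demo: the report shows 350.00; the ODS rows sum to the same figure.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE ods_claims (claim_id INTEGER, paid REAL)")
conn.executemany("INSERT INTO ods_claims VALUES (?, ?)",
                 [(1, 100.50), (2, 200.25), (3, 49.25)])
ok = reconcile_report_total(conn, "ods_claims", "paid", 350.00)
```

A tolerance is worth keeping even for exact-looking totals, since report layers often round currency values independently of the database.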

Patni Computers Aug 2011 to Jun 2014

ETL Tester

Environment: UNIX Shell Scripting, Oracle, Informatica Power Center, Mercury Test Director, SQL *Loader, Cognos, SQL Server, Windows, TOAD

Responsibilities:

Assisted in creating fact and dimension table implementation in Star Schema model based on requirements

Written several complex SQL queries for validating Cognos Reports

Worked with business team to test the reports developed in Cognos

Tested a number of complex ETL mappings, mapplets and reusable transformations for daily data loads

Extensively used Informatica power center for extraction, transformation and loading process

Creating test cases for ETL mappings and design documents for production support

Extensively worked with flat files and excel sheet data sources. Wrote scripts to convert excel to flat files

Scheduling and automating jobs to be run in a batch process

Effectively communicate testing activities and findings in oral and written formats

Reported bugs and tracked defects using Test Director

Worked with the ETL group to understand mappings for dimensions and facts

Extracted data from various sources like Oracle, flat files and SQL Server

Develop test plans based on test strategy. Created and executed test cases based on test strategy and test plans based on ETL mapping document

Written complex SQL queries for querying data against different data bases for data verification process

Prepared the Test Plan and Testing Strategies for Data Warehousing Application

Preparation of technical specifications and Source to Target mappings

Extensively used SQL programming in backend and front-end functions, procedures, packages to implement business rules and security

Written test cases to test the application manually in Quality Center and automated using Quick Test Pro

Defects identified in the testing environment were communicated to the developers using the defect tracking tool Mercury Test Director

Developed scripts, utilities, simulators, data sets and other programmatic test tools as required executing test plans
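The Excel-to-flat-file conversion scripts mentioned above generally produce delimited files suitable for SQL*Loader-style loads. A minimal sketch of the writing half; reading the workbook itself (e.g. via an Excel library) is assumed and omitted here, and the delimiter and columns are illustrative:

```python
import csv
import io

def rows_to_flat_file(rows, delimiter="|"):
    """Write tabular rows out as a delimited flat file suitable for a
    SQL*Loader-style load. In practice the rows would come from Excel
    sheets; that reading step is assumed and not shown.
    """
    buf = io.StringIO()
    writer = csv.writer(buf, delimiter=delimiter, lineterminator="\n")
    writer.writerows(rows)
    return buf.getvalue()

# Demo: a header row plus two data rows, pipe-delimited.
flat = rows_to_flat_file([("id", "name"), (1, "alpha"), (2, "beta")])
```

A pipe delimiter is a common choice for such loads because it rarely appears inside business data, avoiding the quoting issues commas invite.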


