
Data Azure

Location:
Phoenix, AZ
Posted:
July 07, 2020


Ranadeesh Kumar

adeek9@r.postjobfree.com

518-***-****

PROFESSIONAL SUMMARY:

6+ years of extensive experience in Quality Assurance testing of data warehousing and Business Intelligence processes, and of web-based and client/server applications.

Excellent analytical skills for understanding the business requirements, business rules/processes, and detailed design of the application.

Experience in creating functional/technical specifications, data design documents based on the requirements.

Expert in writing Test Plans, defining Test Cases, developing and maintaining Test Scripts, Test Case Execution, Analyzing Bugs and interacting with team members in fixing the errors as per specifications and requirements.

Experienced with Full Life Cycle and Methodology for implementing Data warehouse and Business Intelligence Reporting System.

Extensive experience in quality assurance using manual testing and automated testing tools such as Quick Test Pro and Quality Center.

Expertise in working in Agile (Scrum), Waterfall, Spiral methodologies.

Strong database experience including Oracle 11g/10g/9i/8i, MS SQL Server 2012/2008 R2, IBM DB2, and Teradata; proficient with TOAD.

Experienced with both manual and automated testing using HP Quality Center, Test Director, and QTP with VBScript.

Expert in writing parameterized queries, drill-through reports, and formatted SQL Server reports in SSRS 2005/2008/2008 R2 using data from ETL loads, SSAS cubes, and various heterogeneous data sources.

Good understanding of Relational Database Design, Data Warehouse/OLAP concepts and methodologies.

Plan and develop roadmaps and deliverables to advance the migration of existing on-premises systems/applications to the Azure cloud.

Design and implement streaming solutions using Azure Stream Analytics.

Experienced in Azure Data Factory (ADF), Integration Runtime (IR), file system data ingestion, and relational data ingestion.

Extensive experience in writing SQL and PL/SQL scripts to validate the database systems and for backend database testing.

Experience in complete Software Testing life cycle: Unit Testing, Functional Testing, Regression, Integration, System Testing, Performance testing and Data driven testing.

Expertise in Defect Reporting and Tracking using Test Director/ HP Quality Center.

Extensive experience in testing applications on UNIX environment.

Excellent time management, presentation, and deadline-management skills; a strong team player with problem-solving and troubleshooting experience, good communication, interpersonal, and analytical skills, and the ability to perform both in a team and individually.

Technical Skills:

Quality Assurance Tools /Test Management

HP ALM 11.5/HP Quality Center 10, Test Director 7.5, IBM Clear Quest

Environment

Windows 98/2000/NT/XP, Unix, Linux.

Languages

SQL, PL/SQL

RDBMS

Oracle 8i/9i/10g/11g, SQL Server 2000/2005/2008, Teradata, Sybase, DB2.

GUI/RDBMS Query Tools

SQL*Plus, Oracle SQL Developer, SQL Server Management Studio, Teradata SQL Assistant 13.0, TOAD, WinSQL, SQuirreL SQL Client.

ETL Tools

Informatica PowerCenter, SSIS, and DataStage

Bug Tracking / CI Tools

JIRA, Jenkins, GitHub.

Other Tools

PuTTY, WinSCP, FileZilla, MS Office.

PROFESSIONAL EXPERIENCE:

Chico’s FAS, Fort Myers, Florida September 2019 – Present

Sr. ETL – QA Tester

Responsibilities:

Reviewed mapping documents, formulas, and the rule engine.

Analyzed change requests for existing mappings and created an impact analysis for each change on the data.

Wrote new mapping specifications according to the business data requirements.

Developed Test Plan, Test Cases, Test Data and Test Summary Reports and followed Agile/Scrum process.

Involved in the Agile Scrum Process.

Provided daily status reports for the scrum process as well as project status reports.

Worked with QA team members to advise and assist on ongoing projects.

Worked closely with the DBA team to understand the database architecture in depth.

Involved in designing, developing and deploying reports in MS SQL Server environment using SSRS-2008 and SSIS in Business Intelligence Development Studio (BIDS).

Used ETL (SSIS) to develop jobs for extracting, cleaning, transforming and loading data into data warehouse.

Prepared the complete data mapping for all the migrated jobs using SSIS.

Worked on different tracks of the project, each involving big data volumes ranging from over 30 million to a billion records.

Design and implement database solutions in Azure SQL Data Warehouse, Azure SQL.

Deploying Azure Resource Manager JSON Templates from PowerShell.

Worked on importing data from SAP systems, quality-checking it in the Hive environment, transferring it to the MapR system (mainly JSON messaging systems), and testing the data there.

Wrote high-level and complex SQL queries for testing data transformations and their logic.

Worked with ETL tools like Informatica to manage and view the relations and mappings of the databases.

Prepared the test data preparation strategy and data masking strategy for the shared test environment.

Wrote test cases for ETL to compare source and target database systems.
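A minimal sketch of the kind of source-to-target comparison test described above, using Python's built-in sqlite3 as a stand-in for the actual Oracle/SQL Server systems; the table and column names here are illustrative assumptions, not taken from the project:

```python
import sqlite3

# Hypothetical staging (source) and warehouse (target) tables.
conn = sqlite3.connect(":memory:")
cur = conn.cursor()
cur.execute("CREATE TABLE src_orders (order_id INTEGER, amount REAL)")
cur.execute("CREATE TABLE tgt_orders (order_id INTEGER, amount REAL)")
rows = [(1, 10.0), (2, 25.5), (3, 7.25)]
cur.executemany("INSERT INTO src_orders VALUES (?, ?)", rows)
cur.executemany("INSERT INTO tgt_orders VALUES (?, ?)", rows)

# Check 1: row-count reconciliation between source and target.
src_count = cur.execute("SELECT COUNT(*) FROM src_orders").fetchone()[0]
tgt_count = cur.execute("SELECT COUNT(*) FROM tgt_orders").fetchone()[0]
assert src_count == tgt_count, f"count mismatch: {src_count} vs {tgt_count}"

# Check 2: aggregate (checksum-style) comparison on a numeric column.
src_sum = cur.execute("SELECT SUM(amount) FROM src_orders").fetchone()[0]
tgt_sum = cur.execute("SELECT SUM(amount) FROM tgt_orders").fetchone()[0]
assert src_sum == tgt_sum, "amount totals diverge after the ETL load"

print("source/target reconciliation passed")
```

Count and aggregate checks like these catch dropped or duplicated rows cheaply before any row-by-row comparison is attempted.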

Worked on Hadoop big data to verify the migration of data from one server to another without any signs of data loss or corruption.

Created and wrote test scripts in Sqoop for handling Hadoop big data.

Expertise in designing tables in Hive and MySQL using Sqoop and processing data, including importing and exporting databases to HDFS.

Created a pathway from Hadoop to the data lakes (MapR) and ensured the data was migrated properly all the way.

Gained exposure to Python scripts: how to use them and how to configure them to run with the current settings of local systems as well as servers.

Worked with different Department stakeholders to gather requirements in Scrum sessions to identify the business requirements, financial requirements to be processed. Identify and Create Test Cases and Scenarios.

Expertise in modifying and executing the python scripts to run the tests on big data.

Interacted with senior peers and subject matter experts to learn more about the data migration; involved in completing data testing before UAT.

Coordinated with development and QA teams on the handling of data-related defects.

Tested several Informatica Mappings to validate the business conditions.

Extensively written test scripts for back-end validations.

Worked in all areas of Jenkins setting up CI for new branches, build automation, plugin management and securing Jenkins and setting up master/slave configurations.

Expertise in using JIRA with Jenkins and GitHub for real-time bug tracking and issue management.

Troubleshot build and performance issues in Jenkins and generated metrics on the master's performance along with job usage.

Environment: HP ALM, SQL, PL/SQL, TOAD, Oracle 11g, Agile/Scrum, MS Office, Hive, Hadoop, SAP HANA Studio, UNIX, Data Lakes (MapR), Shell Scripting, Informatica, Jenkins, GitHub, JIRA, Python, Sqoop.

Great American Insurance Group, Cincinnati, OH Oct 2018 – Aug 2019

Sr. ETL - QA Tester

Responsibilities:

Experience in migrating data from the database by generating SQL from pivot tables.

Performed dynamic pivoting by building the pivot query with dynamic SQL.

Created custom SQL queries on the database to test the query, then embedded the query in an Excel pivot table to display the data.
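The dynamic-pivoting idea above can be sketched as follows: discover the pivot columns at run time from the data, then build the pivot query as a string, which is the essence of dynamic SQL pivoting. This is a hypothetical illustration in Python with sqlite3 (the real work used SQL Server); the `sales` table and its columns are assumptions:

```python
import sqlite3

# Stand-in table with one measure (amount) and one pivot column (quarter).
conn = sqlite3.connect(":memory:")
cur = conn.cursor()
cur.execute("CREATE TABLE sales (region TEXT, quarter TEXT, amount REAL)")
cur.executemany(
    "INSERT INTO sales VALUES (?, ?, ?)",
    [("East", "Q1", 100), ("East", "Q2", 150), ("West", "Q1", 80)],
)

# Step 1: discover the distinct pivot values at run time.
quarters = [r[0] for r in cur.execute(
    "SELECT DISTINCT quarter FROM sales ORDER BY quarter")]

# Step 2: build one conditional-aggregate column per pivot value.
cols = ", ".join(
    f"SUM(CASE WHEN quarter = '{q}' THEN amount ELSE 0 END) AS {q}"
    for q in quarters
)
pivot_sql = f"SELECT region, {cols} FROM sales GROUP BY region ORDER BY region"

# Step 3: execute the dynamically built query.
result = cur.execute(pivot_sql).fetchall()
print(result)  # one row per region, one column per quarter
```

The same three-step pattern (probe distinct values, concatenate the column list, execute) is what a SQL Server dynamic `PIVOT` does with `STUFF`/`FOR XML` or `STRING_AGG` on the database side.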

Used Sqoop for data transfer between MS SQL Server and HDFS.

Created Complex ETL Packages using SSIS to extract data from staging tables to partitioned tables with incremental load.

Created reusable SSIS packages to extract data from multi-formatted flat files, Excel, and XML files into the UL database and DB2 billing systems.

Developed, deployed, and monitored SSIS Packages.

Created SSRS reports using report parameters, drop-down parameters, and multi-valued parameters; debugged parameter issues; built matrix reports and charts.

Worked on the Azure transformation project and Azure architecture decisions.

Deployed Azure Resource Manager JSON templates from PowerShell; worked on the Azure suite: Azure SQL Database, Azure Data Lake, Azure Data Factory, Azure SQL Data Warehouse, and Azure Analysis Services.

Developed and implemented ETL and data movement solutions using Azure Data Factory and SSIS.

Design and implement end-to-end data solutions (storage, integration, processing, visualization) in Azure.

Proposed architectures considering cost/spend in Azure and developed recommendations to right-size data infrastructure.

Created reports with Analysis Services Cube as the data source using SSRS.

Performed integration testing and validated SIT and CT environments for UAT and carrier testing.

Tested the reports generated by OBIEE and verified and validated the reports using SQL.

Manipulated, cleansed, and processed data using Excel, Access, and SQL.

Wrote SQL scripts to manipulate data for data loads and extracts.

Wrote test cases to compare data between source and target databases.

Wrote complex SQL queries to check the views built on the source tables and to compare data between source and target.

Worked with SQL queries using joins to facilitate the use of huge tables when processing large numbers of records in the database.
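One common shape for the source-vs-target comparison queries described above is a set difference in both directions, which surfaces rows that were dropped, duplicated, or altered by the load. A minimal sketch with Python's sqlite3 standing in for the actual databases; all names here (policy tables, a deliberately divergent premium) are assumptions for illustration:

```python
import sqlite3

# Hypothetical source and target copies of the same entity.
conn = sqlite3.connect(":memory:")
cur = conn.cursor()
cur.execute("CREATE TABLE src_policy (policy_id INTEGER, premium REAL)")
cur.execute("CREATE TABLE tgt_policy (policy_id INTEGER, premium REAL)")
cur.executemany("INSERT INTO src_policy VALUES (?, ?)",
                [(1, 500.0), (2, 750.0)])
cur.executemany("INSERT INTO tgt_policy VALUES (?, ?)",
                [(1, 500.0), (2, 700.0)])  # row 2 altered on purpose

# Rows present in source but missing/altered in target...
missing_in_tgt = cur.execute(
    "SELECT * FROM src_policy EXCEPT SELECT * FROM tgt_policy").fetchall()
# ...and rows in target that have no exact match in source.
extra_in_tgt = cur.execute(
    "SELECT * FROM tgt_policy EXCEPT SELECT * FROM src_policy").fetchall()
print(missing_in_tgt, extra_in_tgt)
```

An empty result in both directions means the two tables match exactly; Oracle's `MINUS` plays the role of `EXCEPT` here.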

Identified business rules for data migration and performed data validations.

Developed stored procedures to validate the data obtained.

Testing the source data for data completeness and data correctness.

Participate in the creation of Test Scenarios Test Cases with the UAT Team and the Business Analysts.

Played a functional data SME role in the development of the Marketing Measurement and Metrics reporting dashboard implemented in Siebel Analytics /OBIEE environment.

Checking the PL/SQL procedures that load data into the target database from standard tables.

Testing flat file data in Unix environment by using complex Unix commands.
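The flat-file checks above ran as Unix commands in practice (`wc -l`, `awk`, `cut`, `sort | uniq -d`); this Python sketch shows equivalent logic on a hypothetical pipe-delimited feed, with all file contents and field layouts assumed for illustration:

```python
import csv
import io

# Stand-in for a pipe-delimited flat-file feed.
flat_file = io.StringIO(
    "1|ACME|2020-01-15\n"
    "2|GLOBEX|2020-02-03\n"
    "3|INITECH|2020-02-20\n"
)
rows = list(csv.reader(flat_file, delimiter="|"))

# Record count (the `wc -l` check) and field count per record.
assert len(rows) == 3, "unexpected record count in the feed"
assert all(len(r) == 3 for r in rows), "malformed record: wrong field count"

# Duplicate-key check on the first field (the `sort | uniq -d` check).
keys = [r[0] for r in rows]
assert len(keys) == len(set(keys)), "duplicate keys found in the flat file"

print("flat-file validation passed")
```

The same three checks (row count, field count, key uniqueness) are usually run before the file is allowed into staging, so a malformed feed fails fast instead of poisoning the load.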

Coordinated with the offshore team for testing purposes and assigned tasks to all testing team members.

Tested several data migration applications for security, data protection, and data corruption during transfer.

Involved in the Agile Scrum Process.

Verified the ETL process by running the Informatica workflows (both from Informatica Workflow Manager and through UNIX scripting), monitoring the status in Informatica Monitor, and verifying logs.

Worked with both offshore and onsite teams and coordinated with business and technology teams.

Testing and resolving loops and contexts to ensure the correct results from the query.

Participating in the requirements gathering meetings, sprint planning meetings and defect review meetings.

Environment: Agile, Informatica, Oracle 11g, Unix Shell Scripting, OBIEE, SQL, PL/SQL, Toad for Oracle, Jira, Unix, Windows XP, MS Office, SoapUI, Data Profiling, Excel, Sqoop.

Texas Mutual Insurance Company, Austin, TX June 2016 – Sep 2018

ETL/Big data Tester

Responsibilities:

Developed and conducted a wide range of tests and analysis to ensure that software, systems, and services meet minimum company standards and defined end-user and system requirements.

Experience managing Azure Data Lake Storage (ADLS) and Data Lake Analytics, with an understanding of how to integrate with other Azure services. Knowledge of U-SQL and how it can be used for data transformation as part of a cloud data integration strategy.

Involved in high and detail level design reviews to ensure requirement traceability and to determine application/component functional readiness requirements.

Used various sources to pull data into Power BI such as SQL Server, Excel, Oracle, SQL Azure etc.

Being part of the test team, responsibilities involved writing complex queries using SQL and PL/SQL to generate data based on the complex derivations for each attribute.

Worked with systems engineering team to deploy and test new Hadoop environments and expand existing Hadoop clusters.

Developed Pig UDFs for preprocessing the data for analysis and handle any kinds of additional functionalities needed.

Used Spark Streaming APIs to perform transformations and actions on the fly for building the common learner data model, which gets data from Kafka in near real time and persists it into Cassandra.

Complex SQL queries were written to compare data generated by the application against the expected results generated based on mapping requirements for each interface.

Extensively involved in the execution of Autosys jobs and PL/SQL batch programs, and responsible for reporting defects to the development team.

Ran UNIX shell scripts to count the records for EDW source to staging tables.

Worked on Autosys, Unix, Hadoop, Hive, Impala, and shell scripting for big data testing, leading the team on this effort.

Expert in Waterfall, Agile, and iterative project testing methodologies.

Performed manual backend testing using UNIX shell scripts and SQL queries.

Validating the data passed to downstream systems.

Involved in maintaining the test environments, with activities like requesting data loads, database backups, restarting the servers, requesting deployments, and troubleshooting issues.

Tracked and executed the User Acceptance Test Cases with respect to the requirements to determine the feature coverage.

Involved in backend testing for the front end of the application using SQL queries in a Teradata database.

Responsible for testing Initial and daily loads of ETL jobs.

Worked with non-transactional data entities for multiple feeds of reference data using Master Data Management.

Involved in Database Validations, Writing Test scripts (Including the related SQL Statements by joining various tables) depending on Requirements for both Positive and Negative Scenarios.
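A typical negative-scenario script of the kind described above joins a fact table to its dimension to surface orphaned foreign keys. This is a hedged sketch with sqlite3 standing in for Teradata/Oracle, and the claim/customer tables are hypothetical:

```python
import sqlite3

# Hypothetical dimension and fact tables.
conn = sqlite3.connect(":memory:")
cur = conn.cursor()
cur.execute("CREATE TABLE dim_customer (cust_id INTEGER PRIMARY KEY)")
cur.execute("CREATE TABLE fact_claims (claim_id INTEGER, cust_id INTEGER)")
cur.executemany("INSERT INTO dim_customer VALUES (?)", [(1,), (2,)])
cur.executemany("INSERT INTO fact_claims VALUES (?, ?)",
                [(10, 1), (11, 2), (12, 99)])  # cust 99 has no dimension row

# LEFT JOIN + IS NULL: fact rows whose cust_id matches no dimension record.
orphans = cur.execute("""
    SELECT f.claim_id, f.cust_id
    FROM fact_claims f
    LEFT JOIN dim_customer d ON f.cust_id = d.cust_id
    WHERE d.cust_id IS NULL
""").fetchall()
print(orphans)  # the orphaned claim rows, if any
```

The positive scenario is the same query with an inner join and an expected row count; the negative scenario passes only when the orphan list is empty.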

Automated data validation across various builds using QTP and Quality Center.

Attended and participated in Agile Scrum standup meetings.

Environment: Agile, Informatica, Big Data/Hadoop, SQL, PL/SQL, Toad, UNIX, Shell Scripting, Business Objects XI 3 Reports, DOORS, Oracle SQL Developer, HP ALM/Quality Center, QTP.

G.E Research, Albany, New York May 2015 – Sep 2016

ETL – QA Tester

Responsibilities:

Developed ETL Test Plans, Test Strategies and Test Cases.

Coordinated with Business Users to understand business needs and implement the same into a functional Data warehouse design.

Interacted with managers and the development team to identify issues in the employment exchange program databases and troubleshot problems involving the development of stored procedures, triggers, and privileges.

Wrote and modified the required UNIX scripts and SQL validation scripts to validate the outputs.

Involved in defect reporting, tracking, and reproduction using the Jira bug tracking system within the Agile process.

Opened tickets in JIRA and assigned them to IT and DBA teams, following up until the issues were resolved and customer services restored.

Coordinated with on-site and offshore teams to work on those issues and fix them on time.

Participate in the creation of Test Scenarios & Test Cases with the UAT Team and the Business Analysts.

Developed UNIX shell scripts to format the session log files and to extract the information from error logs.

Involved in scrum meetings and Agile storyboard systems to stay on schedule for deliverables.

Involved in testing ETL and PL/SQL batches scheduled by using Autosys.

Tested UI Screens and Reports for cosmetic and data requirements.

Involved in Usability Testing and coordinating with end users for giving Training on Web Based application.

Defined and implemented verification and validation processes across the project and manage QA and regression testing environments.

Analyzed software failures and reported detailed steps to reproduce them. Identified, analyzed, and documented defects using JIRA and Asana as defect tracking systems.

Created UAT test cases and executed them; helped end users understand and use the application.

Tested the web pages, user login pages according to the user security roles and requirements.

Worked with the business team to test their reports developed in Cognos.

Expertise in analyzing test scenarios, documenting test cases, developing SIT and UAT plans, and extensively testing the performance of PL/SQL stored procedures.

Tested the data consolidation for Master Data Management.

Ensured developers coordinated their efforts in managing and maintaining different application databases.

Tested OLAP cubes to verify and validate the functionality of the reports.

Monitored the performance of the reports.

Environment: Cognos, Informatica, UNIX, Shell Scripting, DB2, PL/SQL, Autosys, SQL*Plus, JIRA, Netezza, IBM AIX 5.0, Windows XP, Oracle SQL Developer, Log Files, PuTTY, SQL*Loader, TOAD, Flat Files, Windows NT.

Birla Soft, India Apr 2014 - Jan 2015

SOFTWARE TESTER

Responsibilities:

Understanding the business process; manage testing activities.

Formulated the test strategy, design, system plan, and validations.

Involved in life cycle testing, functionality testing, regression testing, performance testing, volume testing.

Wrote SQL/shell scripts to perform data integrity testing and log validation.

Performed system test following the standard guidelines provided in the design documents.

Performed Cognos report testing, including data validation and cosmetic validation.

Ran manual batch jobs using UNIX shell scripts in PuTTY.

Created and Executed System Test cases and User Acceptance Test cases for the application.

Interacted with developers for resolving Change Request issues.

Tested the application manually for its GUI objects and their functionality and reported the bugs.

Performed extensive manual testing of the entire application suite.

Developed Test Data and individual test cases to satisfy positive and negative testing.

Environment: Oracle, SQL, Cognos 7.0, J2EE, Apache Tomcat, UNIX, Shell Scripting, Manual Testing, TOAD, PuTTY, WinSCP, Log Files, JSP, XML, Visual Basic, ASP, VB.Net, ASP.Net, Windows.


