Data ETL

Location: Schaumburg, IL
Posted: June 24, 2020

SUMMARY:

*+ years of IT experience in data warehousing/ETL/QA testing, with extensive use of SQL, PL/SQL, UNIX and shell scripting, and HP ALM/Quality Center.

Good understanding of data and testing, with working knowledge of the Banking, Insurance, Retail, and Healthcare domains.

Experience with complete Software Development Life Cycle (SDLC) and STLC.

Extensive experience in Functional, Integration, Regression, User Acceptance, System, Smoke, Sanity, Black Box, Performance, Load and Backend Testing.

Experience with Waterfall, V-Model and Agile-Scrum Methodologies.

Good experience with data sources, data profiling, data validation, and developing low-level design patterns based on business and functional requirements.

Very good understanding of data warehousing concepts, data analysis, and data warehouse architecture and design.

Solid Back-end testing experience in writing and executing SQL Queries.
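
For illustration, a minimal sketch of the kind of back-end validation query this refers to, reconciling source and target row counts after a load (table names are hypothetical):

    -- Compare row counts between a source staging table and its target
    -- dimension table; a non-zero diff flags a load discrepancy.
    SELECT s.src_count,
           t.tgt_count,
           s.src_count - t.tgt_count AS diff
    FROM   (SELECT COUNT(*) AS src_count FROM stg_customer) s,
           (SELECT COUNT(*) AS tgt_count FROM dim_customer) t;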

Experienced working with SQL Server 2008, Netezza, Oracle 10g/11g, Teradata, Sybase & DB2.

Extensive experience with Oracle packages, stored procedures, functions, and database triggers using PL/SQL. Efficiently used Oracle tools such as TOAD and SQL*Loader.

Good experience in manual testing on Hadoop and Mainframes environments.

Experienced in writing UNIX Korn shell scripts for job scheduling and sequencing.

Hands-on expertise in data warehousing concepts and tools. Involved in ETL processes at organizations using DataStage, Informatica, and Ab Initio, with Autosys for scheduling.

Automated and scheduled the Informatica jobs using UNIX Shell Scripting.

Strong knowledge of and experience with metadata and star/snowflake schemas. Analyzed source systems, staging areas, and fact and dimension tables in target data warehouses.

Extensive experience in UNIX Shell Scripting, AWK and file manipulation techniques.

Experience with QA methodology and QA validations to ensure quality.

Experience in testing business reports developed in Business Objects XI R2, Cognos 8 series, OBIEE, and Hyperion.

Ability to interact with all levels of personnel from technical to high level executive management within the information technology and business communities.

SKILL MATRIX:

Quality Assurance/Defect Tracking Tools: DOORS, Requisite Pro, HP ALM 11/Quality Center 10, IBM ClearQuest 9.0, IBM ClearCase 9, TestDirector 7.5, Bugzilla, Jira

SDLC/STLC: Agile-Scrum, Waterfall, RAPID, Spiral, V-Model

RDBMS Technologies & Data Sources: Oracle 11g/10g/9i, Teradata 13.0/12.0, SQL Server 2012/2008R2, IBM DB2, SYBASE, MS Access, Netezza, MS EXCEL, FLAT Files, XML

RDBMS Query Tools/GUI Tools: TOAD 10.6, SQL*Loader, Oracle SQL Developer, Teradata SQL Assistant 13.0, WinSQL, and MySQL

ETL/Data Warehouse: Informatica 9, SSIS, Ab Initio 2.15, IBM Data Stage 8.1

BI/Reporting Tools: Business Objects XI R4, Cognos 8.4, MicroStrategy, OBIEE 10, Hyperion 11.1.2

Operating Systems: MS Windows 9x/NT/2000/XP, Red Hat Linux, UNIX

PROFESSIONAL EXPERIENCE:

INFOSYS/AMERICAN EXPRESS, PHOENIX, AZ - FEB 2019 – TILL DATE

TECHNICAL TEST LEAD/ QA ENGINEER

Responsibilities:

Involved in Data mapping specifications to create and execute detailed system test plans. The data mapping specifies what data will be extracted from an internal data warehouse, transformed and sent to an external entity.

Wrote SQL and PL/SQL scripts to validate the database systems and for backend database testing.

Involved in Teradata SQL Development, Unit Testing and Performance Tuning.

Tested complex ETL mappings and sessions based on business user requirements and business rules to load data from source flat files and RDBMS tables to target tables.

Based on the business requirements, created Test Plan, Test Cases, and Test Script documents to test the Business Objects.

Performed troubleshooting and deployed many Python bug fixes for the two main applications that maintained the main source of data.

Good knowledge of the Swagger API framework and of web and mobile software development.

Support the development of virtualized APIs by creating sample API request/response messages.

Tested REST and SOAP APIs for web services using Karate and Postman.

Performed integration testing of Hadoop into traditional ETL, extraction, transformation, and loading of massive structured and unstructured data.

Successfully planned and executed QA process in AGILE environment.

Performed database testing to validate the database using TOAD; checked data integrity and constraints with SQL queries.
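
A minimal sketch of the kind of integrity and constraint checks this refers to (table and column names are hypothetical):

    -- Duplicate-key check: each primary key should occur exactly once.
    SELECT customer_id, COUNT(*) AS occurrences
    FROM   dim_customer
    GROUP  BY customer_id
    HAVING COUNT(*) > 1;

    -- Orphan-record check: every fact row should reference an existing
    -- dimension row.
    SELECT f.customer_id
    FROM   fact_sales f
    LEFT JOIN dim_customer d ON d.customer_id = f.customer_id
    WHERE  d.customer_id IS NULL;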

Extracted test data from tables and loaded it into SQL tables.

Tested UNIX shell scripts that automate the ETL loading and data-pull processes.

Used Agile test methods to provide rapid feedback to the developers, significantly helping them uncover important risks.

Solid experience in implementing and recording automation test scripts using QTP and UFT.

Loaded data from Hadoop to Teradata using an external framework.

Tested the PL/SQL package that loaded data into staging from the source database.

Created and executed a strategy to build mindshare and broad use of AWS within a wide range of customers and partners.

Created and maintained complex business reports using the Business Objects reporting module as per user requirements.

Created ETL test data for all ETL mapping rules to test the functionality of the Informatica Mappings.

Tested the ETL mappings and other ETL Processes (Data Warehouse Testing).

Wrote several complex SQL queries for validating business reports.

Tested several stored procedures.

Worked in an Agile software development environment with Agile team members to find and fix defects in production.

Worked in the Agile environment on a project called Microbiology Orders.

Tested the ETL process with both pre-load and post-load data validation.

Tested the messages published by ETL tool and data loaded into various databases.

Involved in testing the metadata to verify the integrity of the Business Objects Universe.

Responsible for data mapping testing by writing complex SQL queries using WinSQL.

Created new connections in QuerySurge for flat files, databases, and Hadoop.

Experience in creating UNIX scripts for file transfer and file manipulation.

Followed Agile programming methodology extensively to produce high-quality software.

Involved in data extraction from Teradata and flat files using SQL Assistant.

Wrote PL/SQL and Complex SQL queries for system testing.

Used Quality Center to track and report system defects.

Involved in testing XML files and verifying that data was parsed and loaded to staging by the jobs.

Environment: Ab Initio, Rally, Hadoop, Big Data, API, AWS, Spark, Python, Agile, Business Objects, QTP/UFT, WinSCP, PuTTY, DB2, SQL, PL/SQL, TOAD, XML, IBM AIX 5.3, Java, UNIX, Shell Scripting, WinSQL.

CIOX HEALTH, ALPHARETTA, GA - AUG 2017 – TILL DATE

SR. ETL/QA/BI/DWH Tester - BIG DATA TESTER

Responsibilities:

Wrote test scenarios and then test cases and created test data for testing the application.

Actively involved in functional, unit and integration testing in agile methodology.

Performed integration testing of Hadoop into traditional ETL, extraction, transformation, and loading of massive structured and unstructured data.

Followed agile development methodology and adhered to strict quality standards in requirement gathering.

Worked with the Waterfall methodology for testing and used HP ALM.

Integrated Informatica Data Quality (IDQ) with MDM, created various data quality mapplets in the IDQ tool, and imported them as IDQ cleanse functions.

Analyzed the data by performing Hive queries.
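
As an illustration, the Hive analysis queries were typically of this shape (table and column names are hypothetical):

    -- Profile record volumes per load date to spot missing or abnormal
    -- daily feeds.
    SELECT load_dt, COUNT(*) AS rec_cnt
    FROM   claims_raw
    GROUP  BY load_dt
    ORDER  BY load_dt;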

Worked with data mapping specifications defining what data would be extracted from an internal data warehouse, transformed, and sent to an external entity.

Created Source to Target mapping document from Mainframe logic.

Worked as a consultant delivering quality data-centric solutions for various business domains.

Mapped all test cases with Requirement Traceability Matrix to ensure coverage of all requirements.

Used HP Quality Center to upload all test cases in Test plan and Test lab modules.

Loaded data from Hadoop to Teradata using an external framework.

Executed all test cases and loaded the test results in QC.

Created mappings using Informatica Data Quality to cleanse data and feed it back into tables.

Raised defects in the QC defects module and retested them once they were in test status.

Tested the ETL Informatica mappings and other ETL Processes (Data Warehouse Testing).

Tested ETL Mappings and Sessions based on business user requirements and business rules to load data from source flat files to target tables.

Tested data migration to ensure that integrity of data was not compromised.

Performed ETL Data Validation, testing functionality of mapping by preparing test data for all ETL mapping rules by loading data from various sources using Informatica Power Center.

Created partitioned tables in Hive.
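
A minimal sketch of such a partitioned Hive table (names and types are hypothetical):

    -- Partitioning by load date keeps daily validation queries from
    -- scanning the full history.
    CREATE TABLE claims_staged (
        claim_id  STRING,
        member_id STRING,
        amount    DECIMAL(12,2)
    )
    PARTITIONED BY (load_dt STRING)
    STORED AS ORC;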

Created new connections in QuerySurge for flat files, databases, and Hadoop.

Tested ETL process (Data Loading, Extraction and transformations) using Informatica.

Supported presales activity for data-centric warehouse testing projects.

Performed System, Integration, User Acceptance and Regression testing.

Worked on various projects involving Data warehousing using Informatica.

Logged defects in ALM and communicated that to the development team during daily defect status call.

Created detailed manual test scripts in HP ALM to test application functionality, end-to-end flows, GUI, look and feel, and content.

Acted as Vertica support, assisting in troubleshooting and improving slow-running GUI-based queries against Vertica.

Designed data-driven tests in QuickTest Professional (QTP) scripts to validate the application with different sets of test data.

Have been actively involved in monthly team meetings to discuss the project status and bottlenecks.

Validated pre-Hadoop processing to ensure that data was processed without errors.

Involved in testing the Cognos reports by writing complex SQL queries.

Validated cosmetic aspects of Cognos reports, such as color and size.

Environment: Informatica 9.6.1, Oracle 11g, IDQ, SQL Server, Data Quality, TOAD, QTP, Oracle SQL Developer, PL/SQL, Cognos, Agile, HP ALM, Web Services, Hadoop, Big Data, Vertica, SQL, MS Outlook, MySQL, MS Office, Windows 7, Unix.

PREMIER HEALTH CARE, CHARLOTTE, NC - JUL 2016 TO JUL 2017

SR. ETL/QA/BI/DWH Tester - BIG DATA TESTER

Responsibilities:

Verified Client Segment Roll-Up Codes for Combine. Implemented Agile-Scrum methodology to handle frequent changes to client requirements, following parallel development and testing.

Used Agile testing methodology to meet deadlines in UAT testing.

Validated Hadoop MapReduce output data to ensure that data was processed without errors.

Tested several data migration applications for security, data protection, and data corruption during transfer.

Based on the business requirements, created Test Plan, Test Cases, and Test Script documents to test the Business Objects.

Wrote the SQL queries on data staging tables and data warehouse tables to validate the data results.
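
For example, a typical staging-versus-warehouse comparison might look like the sketch below (Teradata/Oracle-style set operator; table names are hypothetical):

    -- Any rows returned exist in staging but never reached the warehouse.
    SELECT acct_id, balance FROM stg_accounts
    MINUS
    SELECT acct_id, balance FROM dw_accounts;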

Created SQL scripts for sourcing data from Teradata, including creating tables, views, and stored procedures, and loading data into the tables.

Involved in backend testing for the front end of the application using SQL queries against the Teradata database.

Installed and configured Hadoop MapReduce and HDFS; developed multiple MapReduce jobs and Pig UDFs in Java for data cleaning and preprocessing.

Worked with XML feeds from multiple source systems and loaded them into the enterprise data warehouse.

Involved in unit testing and integration testing of Informatica mappings.

Validated data Sqooped from Oracle to Hadoop for reports generated by the BI team.

Performed Verification, Validation, and Transformations on the Source data before loading into target database.

Tested PL/SQL programs to prepare and clean data for data migration, and to retrieve data from the database for comparison against the input sets.

Wrote SQL queries in TOAD and SQL Server Management Studio 2008 to validate the source and target systems.

Clearly communicated and documented test results and defect resolution during all phases of testing.

Focused on data quality issues and problems including completeness, conformity, consistency, accuracy, duplicates, and integrity.

Tested several stored procedures used in ETL data migration.

Involved in fixing defects and validating data for partially converted and fully converted reports.

Good experience in testing Business Objects reports.

Involved in performing extensive back-end testing by writing SQL queries to extract data from Hadoop.

Performed data profiling and analysis using Informatica Data Quality (IDQ).

Tested and maintained the Business Objects universe and reports.

Involved in testing the data warehouse for both the initial load and the incremental loads of the target.
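
A sketch of the kind of incremental-load check this involved (cutoff value and names are hypothetical):

    -- After a delta run, every newly loaded target row should exist in the
    -- delta staging set; any rows returned are unexpected.
    SELECT o.order_id
    FROM   dw_orders o
    WHERE  o.load_ts > TIMESTAMP '2020-01-01 00:00:00'  -- last-load cutoff
      AND  NOT EXISTS (SELECT 1
                       FROM   stg_orders_delta d
                       WHERE  d.order_id = o.order_id);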

Worked with systems engineering team to deploy and test new Hadoop environments and expand existing Hadoop clusters.

Evaluated critical problems and issues during testing and reported them in Quality Center.

Environment: Informatica, Agile, Oracle 10g, TOAD, Hadoop, Big Data, PL/SQL, SQL Server 2008 R2, SQL Server Management Studio 2008, Web Services, Business Objects Enterprise XI R3.1, MySQL, Teradata, Business Objects Designer, HP Quality Center 10

ALLSCRIPTS HEALTHCARE, ATLANTA, GA - JUN 2015 TO JUN 2016

ETL/QA/BI/DWH Tester - BIG DATA TESTER

Responsibilities:

Assisted in creating fact and dimension table implementation in Star Schema model based on requirements.

Prepared the Test Plan and Testing Strategies for Data Warehousing Application.

Performed data integration testing on data extracted from various sources such as Oracle, flat files, and SQL Server.

Wrote complex SQL queries against different databases for the data verification process.

Validated data imported into the Hadoop/HDFS environment from RDBMS sources.

Involved in error checking and testing of the ETL procedures and programs using the Informatica session log.

Scheduled UNIX scripts from the crontab.

Performed extensive Data Validation, Data Verification against Data Warehouse.

Tested the reports generated in Hyperion Interactive Reporting by comparing the data against the source system.

Tested a number of complex ETL mappings, mapplets and reusable transformations for daily data loads.

Set up, monitored, and used the Job Control System in Development/QA/Prod.

Worked with the ETL group to understand mappings for dimensions and facts.

Performed various Hadoop controls such as date checks, record checks, balance checks, and threshold limits for records and balances, as sketched below.
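
A minimal sketch of one such balance control, assuming both layers are queryable from the same Hive metastore (schema and column names are hypothetical):

    -- A non-zero variance between landing and staging totals fails the control.
    SELECT ABS(l.total - s.total) AS variance
    FROM  (SELECT SUM(claim_amt) AS total FROM landing.claims) l
    CROSS JOIN
          (SELECT SUM(claim_amt) AS total FROM staging.claims) s;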

Tested several Data warehouse ETL Informatica Mappings to validate the business conditions.

Involved in testing the Cognos reports by writing complex SQL queries.

Validated cosmetic aspects of Cognos reports, such as color and size.

Environment: Informatica PowerCenter (Designer, Workflow Manager, Workflow Monitor), ODI, Cognos, Hadoop, Big Data, Data Centric, Web Services, Agile, Mercury Quality Center, Teradata, SQL*Loader, MySQL, SQL Server 2005, Erwin 3.5, Windows 2000, TOAD 7.

PNC BANK, CLEVELAND, OH - JUN 2013 TO MAY 2015

ETL/QA/BI/DWH Tester

Responsibilities:

Based on the business requirements, created Test Plan, Test Cases, and Test Script documents to test the Business Objects.

Involved in unit testing and integration testing of Informatica mappings.

Performed data profiling and analysis using Informatica Data Quality (IDQ).

Created SQL scripts for sourcing data from Teradata, including creating tables, views, and stored procedures, and loading data into the tables.

Involved in performing source data validations in the HDFS environment using different Hadoop commands.

Prepared the Test Plan and Testing Strategies for Data Warehousing Application.

Wrote test scenarios and then test cases and created test data for testing the application.

Actively involved in functional, unit and integration testing in agile methodology.

Performed data integration testing on data extracted from various sources such as Oracle, flat files, and SQL Server.

Wrote complex SQL queries against different databases for the data verification process.

Validated data imported to Teradata from RDBMS.

Worked with big data tools such as Spark, Hadoop, and Hive.

Provided support to offshore QA team by giving them knowledge transfer and helping them in closing the defects by coordinating with the dev team.

Worked with the team to understand the source to target mapping document and accordingly helped them to clean the source data to decrease the defects and to ensure that final data matches with the data warehouse standards.

Environment: Informatica, Business Objects, Hadoop, Big Data, Teradata, Oracle, SQL, PL/SQL

BIRLASOFT, INDIA - MAR 2011 TO MAY 2013

DWH/ETL/QA/BI TESTER

Responsibilities:

Created test cases and test plans for testing the ETL process and reports.

Tested all the ETL processes developed for fetching data from OLTP systems to the target Data warehouse using complex SQL queries.

Tested the entire reports generated using Business Objects BI tool by validating the data in the report against the database according to the requirement specifications using SQL.

Tested PL/SQL procedures developed to load data from temporary staging tables into target tables in the data warehouse.
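
A minimal sketch of how such a load procedure might be exercised in a test (procedure and table names are hypothetical):

    -- Run the load, then assert that staging and target row counts match.
    DECLARE
        v_stg NUMBER;
        v_tgt NUMBER;
    BEGIN
        load_pkg.load_customers;  -- hypothetical procedure under test
        SELECT COUNT(*) INTO v_stg FROM stg_customer;
        SELECT COUNT(*) INTO v_tgt FROM tgt_customer;
        IF v_stg <> v_tgt THEN
            RAISE_APPLICATION_ERROR(-20001,
                'Row count mismatch: ' || v_stg || ' vs ' || v_tgt);
        END IF;
    END;
    /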

Provided support to offshore QA team by giving them knowledge transfer and helping them in closing the defects by coordinating with the dev team.

Performed data validation on the flat files that were generated in UNIX environment using UNIX commands as necessary.

Worked with the team to understand the source to target mapping document and accordingly helped them to clean the source data to decrease the defects and to ensure that final data matches with the data warehouse standards.

Prepared test data to cover all the test scenarios.

Prepared UNIX scripts to run the Informatica ETL jobs from command line.

Tested several business reports developed using Business Objects including dashboard, drill-down, summarized, master-detail & Pivot reports.

Maintained all the test cases in Quality Center and logged all the defects into the defects module.

Environment: Oracle 8i/9i, SQL, PL/SQL, Quality Center 8.0, XML, Informatica, Business Objects, TOAD for Oracle


