
SANDEEP SINGH NEGI

Mobile: 988******* * E-mail: ******************@*****.***

PROFESSIONAL SUMMARY

•Around 5.5 years of IT experience in Hadoop/Hive/ETL testing and the development of data warehousing projects.

•Seeking a challenging, growth-oriented environment in Hadoop testing where I can apply my knowledge and technical skills.

•Good knowledge of installing, configuring, and testing Hadoop ecosystem components.

•Knowledge of testing processes for the design and implementation of Hadoop-based applications.

•Expertise in working with operating systems such as Windows and Linux, and with languages and technologies such as JavaScript, Big Data tools, and SQL.

•Strong Understanding of Data Warehouse concepts.

•Experience with OLTP and OLAP systems and data marts.

•Familiar with the basic approaches used to create a data warehouse.

•Strong Understanding of Dimensional Modeling concepts.

•Hands-on experience with dimensions and facts, including grained dimensions and facts.

•Fundamental concepts of star schemas, including cardinality and the granularity of facts.

•Working experience with different schemas (star, snowflake, and constellation schemas).

•Types of dimensions and facts, including factless fact tables.

•Involved in handling SCD changes such as SCD1, SCD2, and SCD3.

•Clear understanding of Software Development Life Cycle models, including the Waterfall model.

•Thorough exposure to the ETL testing process and the Software Testing Life Cycle (STLC).

•Proficient in all phases of the STLC, using TestLink and HP ALM.

•Extensive experience in reviewing Business Requirement Documents and Software Requirement Documents, and in preparing and executing test cases.

•Involved in the design, preparation, execution, and maintenance of test cases.

•Thorough knowledge of software testing methodologies: functional, integration/system, regression, and user acceptance testing.

•Experience in coordinating testing activities with the development team.

•Involved in BI reporting using OBIEE 11.1.1.7.17512.

KNOWLEDGE OF SQL AND HQL

•DB objects, views, sequences, synonyms.

•DDL statements, DML statements, TCL statements, DCL statements, DQL statements.

•Joins (Inner, Left Outer, Right Outer); Partitioning and Bucketing.

•Analytic Functions: RANK, DENSE_RANK, ROW_NUMBER (see the sketch after this list).

•Aggregate Functions: SUM, COUNT, MAX, MIN, AVG.

•String Functions: SUBSTR, INSTR.

•Date Functions: TO_CHAR, TO_DATE.

•Normalization: 1NF, 2NF, 3NF.

•Pivoting and Un-pivoting.
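To illustrate a few of these features, here is a minimal Hive/SQL sketch; the table and column names (sales, customer_id, region, txn_date, amount) are hypothetical, chosen only for the example.

-- Hypothetical Hive table: partitioned by date, bucketed by customer id.
CREATE TABLE sales (
  customer_id INT,
  amount      DECIMAL(10,2),
  region      STRING
)
PARTITIONED BY (txn_date STRING)
CLUSTERED BY (customer_id) INTO 8 BUCKETS
STORED AS ORC;

-- Analytic vs. aggregate functions: RANK, DENSE_RANK, and ROW_NUMBER
-- differ only in how they number ties within each region.
SELECT region,
       customer_id,
       amount,
       SUM(amount)  OVER (PARTITION BY region)                      AS region_total,
       RANK()       OVER (PARTITION BY region ORDER BY amount DESC) AS rnk,
       DENSE_RANK() OVER (PARTITION BY region ORDER BY amount DESC) AS dense_rnk,
       ROW_NUMBER() OVER (PARTITION BY region ORDER BY amount DESC) AS rn
FROM sales;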

INFORMATICA 9.0.1 ETL TOOL

Client Components:

•Repository Manager

•PowerCenter Designer

•Workflow Designer

•Workflow Monitor

KNOWLEDGE OF PL/SQL

•Blocks in PL/SQL (Anonymous Blocks, Named Blocks)

•Static SQL and Dynamic SQL

•Identifiers; Datatypes (User-Defined, %TYPE, %ROWTYPE, LOB); NULL in PL/SQL

•Variables (Global, Local), Constants, Operators

•Conditional Statements, CASE Statements, Basic Loops, WHILE Loops, FOR Loops, Loop Control Statements (EXIT, CONTINUE, GOTO)

•Strings (Fixed-Length, Variable-Length, CLOB)

•Procedures (Schema-Level, Inside a Package, Inside a PL/SQL Block) and Functions

•Cursors (Implicit and Explicit); see the sketch after this list

•Implicit Cursor Attributes (%NOTFOUND, %FOUND, %ISOPEN, %ROWCOUNT)

•Records (Table-Based, Cursor-Based, User-Defined)

•Exceptions (System-Defined, User-Defined)

•Triggers (Row-Level and Statement-Level; BEFORE and AFTER triggers)

•Packages (Package Specification and Package Body)
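A minimal PL/SQL sketch tying several of the constructs above together (a schema-level procedure, a cursor FOR loop with WHERE CURRENT OF, an implicit-cursor attribute, and a user-defined exception); the emp table and its columns are hypothetical.

-- Hypothetical table: emp(employee_id, salary).
CREATE OR REPLACE PROCEDURE raise_low_salaries (p_threshold IN NUMBER) AS
  CURSOR c_emp IS
    SELECT employee_id, salary
    FROM   emp
    WHERE  salary < p_threshold
    FOR UPDATE OF salary;
  e_no_rows EXCEPTION;                        -- user-defined exception
  v_updated PLS_INTEGER := 0;
BEGIN
  FOR r IN c_emp LOOP                         -- cursor FOR loop: open/fetch/close are implicit
    UPDATE emp
    SET    salary = r.salary * 1.05
    WHERE  CURRENT OF c_emp;
    v_updated := v_updated + SQL%ROWCOUNT;    -- implicit-cursor attribute
  END LOOP;
  IF v_updated = 0 THEN
    RAISE e_no_rows;
  END IF;
  COMMIT;
EXCEPTION
  WHEN e_no_rows THEN
    DBMS_OUTPUT.PUT_LINE('No salaries below ' || p_threshold);
  WHEN OTHERS THEN
    ROLLBACK;
    RAISE;                                    -- re-raise anything unexpected
END raise_low_salaries;
/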

INVOLVEMENT IN ETL PROCESS

•Solid back-end testing experience writing and executing SQL queries.

•Experience in testing, data validation, data cleansing, data verification, and identifying data mismatches (see the validation queries after this list).

•Strong working experience on DSS (Decision Support Systems) applications and on the Extraction, Transformation, and Loading (ETL) of data from legacy systems using Ab Initio.

•Experience in using Oracle databases (Oracle 10g/11g).

•Expertise in QA testing in distributed Unix/Windows environments with Oracle databases as the back end; performed end-to-end testing.

•Expertise in extended validation of report functionality developed using OBIEE 11.1.x and Business Objects, by writing complex SQL at the back end.

•Good understanding of the integration of ETL, warehouse, and BI applications.

•Four years with SQL/PL-SQL and expertise in writing complex SQL queries.

•Worked extensively with Slowly Changing Dimensions (SCDs) and audit tables.

•Experienced SQL Data Analyst / Data Reporting Analyst with a strong background in the design, development, and support of online databases and information products, as well as in testing and reporting.

•Extensive experience in the ETL process, consisting of data transformation, sourcing, mapping, conversion, and loading.

•Well versed in manual and automated testing methodologies and principles.

•Proficient in Oracle, PL/SQL Developer, SQL Developer, and Linux.
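As a sketch of the back-end validation described above, the following example compares a source table against its target; the names src_customer and tgt_dim_customer are hypothetical.

-- 1) Row-count reconciliation between source and target.
SELECT (SELECT COUNT(*) FROM src_customer)     AS src_count,
       (SELECT COUNT(*) FROM tgt_dim_customer) AS tgt_count
FROM dual;

-- 2) Mismatch check: rows present in the source but missing or
--    different in the target (Oracle MINUS).
SELECT customer_id, first_name, last_name, city
FROM   src_customer
MINUS
SELECT customer_id, first_name, last_name, city
FROM   tgt_dim_customer;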

PROFESSIONAL EXPERIENCE

•Worked as a Software Engineer at WebKriya Pvt. Ltd. from February 2013 to February 2016.

•Worked at IBM Pvt. Ltd. as an ETL Tester from 4 May 2016 to 31 October 2016.

•Currently working at L&T Infotech Pvt. Ltd. as an ETL Tester, since 1 November 2016.

TECHNICAL SKILLS

Testing Tools: HP Quality Center 9.0/11.0 (ALM)

Databases: Oracle 10g/11g, Hive

Utility Tools: SQL Developer, SQL Assistant

ETL Tools: Informatica 9.0.1, Ab Initio 3.2.7.2

Reporting Tools: OBIEE 11.1.1.7.17512

Operating Systems: Windows XP/7, Linux

Microsoft Office: Word, PPT, Excel

Programming: PL/SQL, Core Java

ACADEMIC CREDENTIALS

B.Tech (Computer Science) from Uttarakhand Technical University, Dehradun; secured 68%.

PROJECT OVERVIEW

PROJECT# 1

Project Name: RISE

Client: TD-BANK

Environment: MS SQL SERVER

Role: TEST ENGINEER

DESCRIPTION:

TD Bank (Toronto-Dominion Bank) provides various services to its customers, such as deposits, insurance, investments, personal banking, small business banking, commercial banking, wholesale banking, and auto financing. RISE stands for Risk Investigation Standard Edition; it was essentially a data mart on the basis of which reports were generated.

ROLES & RESPONSIBILITIES:

•Developed test plans and test cases, and wrote test scripts to support UAT; Informatica was used as the ETL tool for developing the data warehouse.

•Managed and conducted sanity testing, integration testing, full-load testing, incremental-load testing, and SCD testing (an SCD2 consistency check is sketched after this list).

•Ran the jobs/workflows for the ETL process and ETL test scripts; prepared SQL queries to verify the data, using column mapping documents.

•Verified the ETL data in the target database using the column mapping between the source and target databases, and reported daily testing status.

•Designed and peer-reviewed the test cases.

•Used the Designer to develop processes for extracting, cleansing, transforming, integrating, and loading data into the data warehouse.

•Extensively worked with Informatica (Repository Manager, PowerCenter Designer, Workflow Manager, and Workflow Monitor).

•Used Informatica and its run-time engine to schedule jobs, validate and debug components, and monitor the resulting executions.

•Tracked defects using the ClearQuest tool and generated defect summary reports.

•Prepared status summary reports with details of executed, passed, and failed test cases.

•Interacted with developers, business and management teams, and end users.

•Participated in regular project status meetings related to testing.
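The SCD testing above typically includes consistency checks on the type-2 dimension; a minimal sketch follows, assuming a hypothetical dim_customer table with customer_nk (natural key), dim_key (surrogate key), eff_start_dt, eff_end_dt, and current_flag columns.

-- Check 1: each natural key must have exactly one current row.
SELECT customer_nk, COUNT(*) AS current_rows
FROM   dim_customer
WHERE  current_flag = 'Y'
GROUP  BY customer_nk
HAVING COUNT(*) <> 1;

-- Check 2: history rows for the same natural key must not overlap in time.
SELECT a.customer_nk, a.dim_key, b.dim_key
FROM   dim_customer a
JOIN   dim_customer b
  ON   a.customer_nk = b.customer_nk
 AND   a.dim_key     < b.dim_key
WHERE  a.eff_start_dt < b.eff_end_dt
  AND  b.eff_start_dt < a.eff_end_dt;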

ENVIRONMENT: SQL, PL/SQL, Test Cases, Test Scripts, Test Plan, Traceability Matrix, Test Director, Flat Files, Ctrl-M Scheduler, Informatica PowerCenter 9.0.1 (PowerCenter Designer, Workflow Manager, Workflow Monitor).

PROJECT# 2

Project Name: BCBS Regulatory Reporting

Client: Citibank

Environment: FAEM Portal, Maphub, Hadoop, Hive, Unix, and Data Warehouse

Role: Test Engineer

DESCRIPTION:

The Basel Committee on Banking Supervision's regulation number 239 ("BCBS 239"), titled "Principles for effective risk data aggregation and risk reporting", is a set of supervisory guidelines put in place to bolster banks' risk data aggregation capabilities and internal risk reporting practices, thereby enhancing risk management and decision-making processes at banks.

As part of Citi’s commitment to Basel Committee regulators, it has identified 31 management reports that fall under BCBS 239. The program aims to eliminate manual data derivations and calculations currently managed in local files or End User Computing (EUC) and source all data that feed the 31 reports through a Data Bridge subscription via CTO Data Services; this will streamline the report data acquisition process and uphold the integrity of the attributes used in risk reporting.

ROLES & RESPONSIBILITIES:

•Developed test plans and test cases, and wrote test scripts to support UAT; Ab Initio was used as the ETL tool for developing the data warehouse.

•Managed and conducted sanity testing, integration testing, full-load testing, incremental-load testing, and SCD testing.

•Ran the jobs/workflows for the ETL process and ETL test scripts; prepared SQL queries to verify the data, using column mapping documents.

•Verified the ETL data in the target database using the column mapping between the source and target databases, and reported daily testing status.

•Designed and peer-reviewed the test cases.

•Used the Designer to develop processes for extracting, cleansing, transforming, integrating, and loading data into the data warehouse.

•Extensively worked with Ab Initio (job monitoring).

•Used Ab Initio and its run-time engine to schedule jobs, validate and debug components, and monitor the resulting executions.

•Tracked defects using the ClearQuest tool and generated defect summary reports.

•Prepared status summary reports with details of executed, passed, and failed test cases.

•Interacted with developers, business and management teams, and end users.

•Participated in regular project status meetings related to testing.

ENVIRONMENT: SQL, PL/SQL, Test Plan, Test Cases, Test Scripts, Traceability Matrix, Flat Files, Run Book, Ab Initio 3.2.7.2, Oracle 11g, PuTTY, WinSCP

PROJECT# 3

Project Name: Yield Book

Client: Citibank

Environment: Unix and Data Warehouse, Web Services

Role: Test Engineer

DESCRIPTION:

GENESIS sends the US REL mortgage data required by Yield Book, and Yield Book sends the calculated output to a Data Bridge layer. The calculated output is provisioned to RUBY through the Data Bridge via subscriptions.

ROLES & RESPONSIBILITIES:

•Developed test plans and test cases, and wrote test scripts to support UAT; Ab Initio was used as the ETL tool for developing the data warehouse.

•Managed and conducted sanity testing, integration testing, full-load testing, incremental-load testing, and SCD testing.

•Ran the jobs/workflows for the ETL process and ETL test scripts; prepared SQL queries to verify the data, using column mapping documents.

•Verified the ETL data in the target database using the column mapping between the source and target databases, and reported daily testing status.

•Designed and peer-reviewed the test cases.

•Used the Designer to develop processes for extracting, cleansing, transforming, integrating, and loading data into the data warehouse.

•Extensively worked with Ab Initio (job monitoring).

•Used Ab Initio and its run-time engine to schedule jobs, validate and debug components, and monitor the resulting executions.

•Tracked defects using the ClearQuest tool and generated defect summary reports.

•Prepared status summary reports with details of executed, passed, and failed test cases.

•Interacted with developers, business and management teams, and end users.

•Participated in regular project status meetings related to testing.

ENVIRONMENT: SQL, PL/SQL, Test Plan, Test Cases, Test Scripts, Traceability Matrix, Flat Files, Run Book, Ab Initio 3.2.7.2, Oracle 11g, PuTTY, WinSCP

PROJECT# 4

Project Name: IFRS Impairment Module

Client: Citi-Bank

Environment: FAEM Portal, Unix, Data Warehouse and IFW Portal

Role: Test Engineer

DESCRIPTION:

The Impairment Module will be part of the Finance Full Suite Global framework. It will provide a common platform for viewing and finalizing the Impairment Loss numbers for IFRS 9. Management will be able to see the results of the Impairment Loss calculations received from the Impairment Calculations executed in the Wholesale Optima and Retail Impairment Calculators. Additional adjustment amounts may also need to be allocated at a population level through Management Adjustments, to compensate for lag in information at the calculation stage as well as for new macro information that may have become available very recently.

Wholesale Optima acts as an input to the Impairment Module for the Wholesale work stream, and the Impairment Calculator acts as input for the Retail work stream. Data inputs received into the Impairment Module, as well as Management Adjustments executed in it, will be available for reporting via the Inquiry Framework (IFW) platform.

ROLES & RESPONSIBILITIES:

•Involved in analyzing user requirements and identifying the data sources.

•Involved in checking that the data was transformed correctly according to business requirements.

•Involved in checking that the projected data was loaded into the data warehouse without any truncation or data loss.

•Tested Ab Initio mappings and sessions against business user requirements and business rules, loading data from source flat files and Oracle tables into target tables.

•Involved in the execution of test cases based on duplicate checks, null parameters, and constraint checking (see the sketch after this list).

•Involved in the various stages of ETL testing.

•Involved in testing various types of dimension and fact tables.

•Worked with different data sources, such as flat files and subscription files.

•Involved in monitoring the workflows and optimizing the load times.

•Prepared test scenarios and test cases, and was involved in unit testing of mappings, system testing, and user acceptance testing.
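The duplicate and null checks mentioned above can be expressed as simple SQL probes; a sketch follows, assuming a hypothetical target table tgt_impairment with key column account_id and a mandatory reporting_date column.

-- Duplicate check: keys that occur more than once in the target.
SELECT account_id, COUNT(*) AS occurrences
FROM   tgt_impairment
GROUP  BY account_id
HAVING COUNT(*) > 1;

-- Null check on columns that the mapping document marks as mandatory.
SELECT COUNT(*) AS null_violations
FROM   tgt_impairment
WHERE  account_id IS NULL
   OR  reporting_date IS NULL;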

ENVIRONMENT: SQL, PL/SQL, Test Plan, Test Cases, Test Scripts, Traceability Matrix, Flat Files, Run Book, Ab Initio 3.2.7.2, Oracle 11g, PuTTY, WinSCP

PROJECT# 5

Project Name: Stock Records

Client: Citi-Bank

Environment: Unix, Data Warehouse and IFW Portal

Role: Test Engineer

DESCRIPTION:

All the stock records data from various sources, such as GPDW (Global Position Data Warehouse), is consolidated into the centralized data warehouse, which is further provisioned to the consumer (Treasury) via the Data Bridge.

ROLES & RESPONSIBILITIES:

•Involved in analyzing user requirements and identifying the data sources.

•Involved in checking that the data was transformed correctly according to business requirements.

•Involved in checking that the projected data was loaded into the data warehouse without any truncation or data loss.

•Tested Ab Initio mappings and sessions against business user requirements and business rules, loading data from source flat files and Oracle tables into target tables.

•Involved in the execution of test cases based on duplicate checks, null parameters, and constraint checking.

•Involved in the various stages of ETL testing.

•Involved in testing various types of dimension and fact tables.

•Worked with different data sources, such as flat files and subscription files.

•Involved in monitoring the workflows and optimizing the load times.

•Prepared test scenarios and test cases, and was involved in unit testing of mappings, system testing, and user acceptance testing.

ENVIRONMENT: SQL, PL/SQL, Test Plan, Test Cases, Test Scripts, Traceability Matrix, Flat Files, Run Book, Ab Initio 3.2.7.2, Oracle 11g, PuTTY, WinSCP

PROJECT# 6

Project Name: Non-Financial Data

Client: Citi-Bank

Environment: Unix, Data Warehouse

Role: Test Engineer

DESCRIPTION:

The data from Archer will be the golden source of information. The granular contractual data from Archer will be sent to Reginsight via the Data Bridge and will be used for FR Y-9C reporting.

ROLES & RESPONSIBILITIES:

•Involved in analyzing user requirements and identifying the data sources.

•Involved in checking that the data was transformed correctly according to business requirements.

•Involved in checking that the projected data was loaded into the data warehouse without any truncation or data loss.

•Tested Ab Initio mappings and sessions against business user requirements and business rules, loading data from source flat files and Oracle tables into target tables.

•Involved in the execution of test cases based on duplicate checks, null parameters, and constraint checking.

•Involved in the various stages of ETL testing.

•Involved in testing various types of dimension and fact tables.

•Worked with different data sources, such as flat files and subscription files.

•Involved in monitoring the workflows and optimizing the load times.

•Prepared test scenarios and test cases, and was involved in unit testing of mappings, system testing, and user acceptance testing.

ENVIRONMENT: SQL, PL/SQL, Test Plan, Test Cases, Test Scripts, Traceability Matrix, Flat Files, Run Book, Ab Initio 3.2.7.2, Oracle 11g, PuTTY, WinSCP

PERSONAL SNIPPETS

•Father's Name: Fateh Singh Negi

•Mother's Name: Uma Devi

•Gender: Male

•Nationality: Indian

•Religion: Hinduism

•Languages Known: English, Hindi, Garhwali

•Address: Roorkee, Uttarakhand - 240766

DECLARATION:

I hereby declare that all the information furnished above is true & correct to the best of my knowledge & belief.

Place: Chennai SANDEEP SINGH NEGI


