
RAJITH T S

New Providence, NJ

adfgbm@r.postjobfree.com

908-***-****

PROFESSIONAL SUMMARY

8.5 years of experience in ETL, big data, and automation testing across a range of technologies and domains in the IT industry, with expertise in analysis, documentation, reporting, and test automation.

Proactively communicate and collaborate with external customers to analyze information and functional requirements.

Experience in validating connectivity products that allow efficient exchange of data between core database engines and the Hadoop ecosystem.

Experienced in Java Selenium automation and web services automation with the BDD/Cucumber framework, including TestNG classes, Page Object Model (POM) classes, locators, error and exception handling, hooks, tagged hooks, and Maven.

Experience in the MapReduce programming model for analyzing data stored in HDFS (a minimal word-count sketch follows this summary).

Familiar with data architecture, including data ingestion pipeline design, Hadoop information architecture, data mining and transformation, distributing and optimizing ETL workflows, and entitlement systems documented with Swagger.

Experienced in Hadoop testing: HDFS, QuerySurge, data lakes, Kafka messages, NDM, Spring XD, Autosys jobs, Spark jobs, and Hive and HBase databases with different data formats and batch jobs.

Expertise in ETL testing, reporting, and data transformation testing on banking applications.

Experienced with slowly changing dimensions (SCD), snowflake and star schemas, dimension and fact tables, and partition types.

Expertise in DDL, DML, and TCL; staging, ODS, mart, and DWH layers; and database testing of functions and triggers.

Experienced with structured, semi-structured, and unstructured data.

Experienced with different databases such as Teradata, PostgreSQL, SQL Server, MySQL, Sybase, and NoSQL stores.

Expertise in the finance, insurance, and BFSI domains (lump-sum payments, withdrawals, transfers, loans), with good experience in Agile methodology and tools such as RTC.

Understanding of data models, data schemas, and ETL; created extensive stored procedures and SQL queries to perform back-end data warehouse testing.

Expertise in Amdocs billing, web services APIs, SOA, batch jobs, database testing, triggers, functions, and back-end application processes.

Experienced with testing tools such as SoapUI, FileZilla, mobile apps, and Sauce Labs, and with Unix/Linux operating systems.

Experienced in configuring Snowflake with AWS S3 buckets, Azure, and various databases.

Experienced in test estimation and in identifying test cases for automation.

Experienced with test management tools such as QC ALM, Jama, Jira, and Confluence.

Experienced in using BI tools (Cognos, Tableau dashboards) and in Informatica, Ab Initio, and DataStage testing.
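
As a concrete illustration of the MapReduce experience noted above, here is a minimal word-count job written against the standard Hadoop Java API. It is a generic sketch, not code from any of the projects below; the class names and input/output paths are placeholders.

```java
import java.io.IOException;

import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.Path;
import org.apache.hadoop.io.IntWritable;
import org.apache.hadoop.io.Text;
import org.apache.hadoop.mapreduce.Job;
import org.apache.hadoop.mapreduce.Mapper;
import org.apache.hadoop.mapreduce.Reducer;
import org.apache.hadoop.mapreduce.lib.input.FileInputFormat;
import org.apache.hadoop.mapreduce.lib.output.FileOutputFormat;

public class WordCount {

    // Map phase: emit (token, 1) for every whitespace-separated token in a line
    public static class TokenMapper extends Mapper<Object, Text, Text, IntWritable> {
        private static final IntWritable ONE = new IntWritable(1);
        private final Text word = new Text();

        @Override
        protected void map(Object key, Text value, Context ctx)
                throws IOException, InterruptedException {
            for (String token : value.toString().split("\\s+")) {
                if (token.isEmpty()) continue;
                word.set(token);
                ctx.write(word, ONE);
            }
        }
    }

    // Reduce phase: sum the counts emitted for each token
    public static class SumReducer extends Reducer<Text, IntWritable, Text, IntWritable> {
        @Override
        protected void reduce(Text key, Iterable<IntWritable> values, Context ctx)
                throws IOException, InterruptedException {
            int sum = 0;
            for (IntWritable v : values) sum += v.get();
            ctx.write(key, new IntWritable(sum));
        }
    }

    public static void main(String[] args) throws Exception {
        Job job = Job.getInstance(new Configuration(), "word count");
        job.setJarByClass(WordCount.class);
        job.setMapperClass(TokenMapper.class);
        job.setCombinerClass(SumReducer.class);   // combiner reuses the reducer
        job.setReducerClass(SumReducer.class);
        job.setOutputKeyClass(Text.class);
        job.setOutputValueClass(IntWritable.class);
        FileInputFormat.addInputPath(job, new Path(args[0]));   // HDFS input dir
        FileOutputFormat.setOutputPath(job, new Path(args[1])); // HDFS output dir
        System.exit(job.waitForCompletion(true) ? 0 : 1);
    }
}
```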

TECHNICAL SKILLS:

Technologies

Apache Spark, HDFS, YARN, Hive, MapReduce, Pig, Sqoop, Flume, Oozie, Kafka, BFSI, Amdocs, NoSQL

Programming Languages

Core Java

RDBMS

PostgreSQL, MySQL, TOAD, SQL Server, SQL Developer, Oracle 11g, Snowflake, Aginity, MongoDB

NoSQL Databases

HBase

Operating Systems

Linux and Windows

Testing Tools

Test Manager, JMeter, SoapUI, mobile apps, Sauce Labs, Unix, ALM, Jama, Jira, GitHub, Autosys, MTM, TFS, Cognos BI, Tableau, Informatica, Ab Initio

EDUCATION, AWARDS AND CERTIFICATIONS:

Bachelor of Technology, GSSIT, Affiliated to VTU, 2011

Spark and Scala Developer certificate, Edureka

PROFESSIONAL EXPERIENCE:

Client 1: PetSmart, Phoenix, Arizona, as Quality Assurance Engineer, Nov 2019 – till date

Project Description: Enhancing the customer website, securing customer data with new software, and upgrading to the latest technology.

Responsibilities:

Understanding new enhancements, creating stories, and handling grooming and test estimation.

Involved in developing test cases and test plans, participating in Agile ceremonies, and maintaining artifacts in Jira, qTest, and Jama.

Handling team collaboration, documentation, and coordination with the offshore team.

Incremental data imports to HDFS using Sqoop, with jobs scheduled in Oozie.

Testing the MDM flow with different source-system integrations for customer journeys, such as match and merge.

Validating data ingestion using ADLP (Databricks) and validating Scala Spark SQL.

Testing the data flow across different stages of the ETL process using complex queries, running the Informatica ETL workflows, and monitoring job status.

Creating higher-order function data and DataFrames for different tables.

Validating structured, unstructured, and semi-structured data in HDFS using Hive and HBase.

Verifying the data flow between Netezza tables and TIBCO in Azure queues; running Informatica workflows, checking session logs, and verifying the data loaded into Snowflake.

Building Java automation scripts for data comparison using Spark RDDs, Spark SQL/Scala, and lambda functions (see the comparison sketch after this list).

Automating performance test cases using JMeter.

Setting up defect triage meetings and interacting with different stakeholders.

Identifying regression and performance cases and adding them to JMeter and Git as part of the CI/CD process.
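
A minimal sketch of the kind of Spark-based source-to-target comparison described in the list above, using the public Spark Java API. The landing path and curated table name are hypothetical placeholders, and the two datasets are assumed to share a schema; this is not the project's actual code.

```java
import org.apache.spark.sql.Dataset;
import org.apache.spark.sql.Row;
import org.apache.spark.sql.SparkSession;

public class SourceTargetDiff {
    public static void main(String[] args) {
        SparkSession spark = SparkSession.builder()
                .appName("source-target-diff")
                .getOrCreate();

        // Source: raw extract landed in the lake (hypothetical path)
        Dataset<Row> source = spark.read().parquet("/landing/customer/");
        // Target: curated table loaded by the ETL workflow (hypothetical name)
        Dataset<Row> target = spark.table("curated.customer");

        // Set difference in both directions; both should be zero after a clean load
        long missing = source.except(target).count(); // in source but not target
        long extra   = target.except(source).count(); // in target but not source

        System.out.printf("missing=%d extra=%d%n", missing, extra);
        spark.stop();
    }
}
```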

Technical Environment: TIBCO 6.4, Informatica 9.6, Aginity 4.9, Tidal, Azure, Jira, Git, PuTTY, JMeter 5.2.1, Postman, Hive, HBase, MongoDB, Jama

Client 2: REI, Seattle, Washington, as Onsite Coordinator, May 2019 – Oct 2019

Project Description: Data migration to Snowflake from different source systems.

Responsibilities:

Handling team collaboration, documentation, and coordination with the offshore team.

Involved in setting up test data and environments for the SAP to BI/BW process, T-codes, and queues.

Verifying data between Snowflake and Netezza tables and the data flow into the S3 bucket; running jobs from DataStage and checking the EC2 logs (see the reconciliation sketch after this list).

Validating each piece of ETL logic by running the individual workflows and verifying the data flow against the ETL mapping and transformations.

Testing the customer data flow in the MDM database.

Involved in developing test cases and test plans, test execution, and defect tracking in QC ALM.

Building data validation automation using Java and Selenium in Eclipse.

Running shell and Python script jobs from PuTTY or Tidal and verifying the logs on the EC2 server.

Setting up defect triage meetings and attending onshore scrum calls with the team.

Experience in preparing weekly status, test execution, and test metrics reports

Experience in testing BI/BW DataStage flows, cloud Snowflake, Netezza DB, and AWS S3 buckets.
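
A minimal sketch of the Netezza-to-Snowflake reconciliation described in the list above, reduced to a row-count check over plain JDBC. The connection URLs, credentials, and table names are placeholders, and the vendor JDBC drivers are assumed to be on the classpath; real checks would also compare column-level aggregates.

```java
import java.sql.Connection;
import java.sql.DriverManager;
import java.sql.ResultSet;
import java.sql.Statement;

public class RowCountReconciler {

    // Count rows in one table; the driver is resolved from the JDBC URL
    private static long countRows(String url, String user, String pwd, String table)
            throws Exception {
        try (Connection c = DriverManager.getConnection(url, user, pwd);
             Statement s = c.createStatement();
             ResultSet rs = s.executeQuery("SELECT COUNT(*) FROM " + table)) {
            rs.next();
            return rs.getLong(1);
        }
    }

    public static void main(String[] args) throws Exception {
        // Placeholder hosts, credentials, and table names
        long src = countRows("jdbc:netezza://nz-host:5480/EDW",
                             "qa_user", "***", "SALES.ORDERS");
        long tgt = countRows("jdbc:snowflake://acct.snowflakecomputing.com/",
                             "qa_user", "***", "EDW.SALES.ORDERS");
        System.out.println(src == tgt
                ? "PASS: counts match (" + src + ")"
                : "FAIL: source=" + src + " target=" + tgt);
    }
}
```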

Technical Environment: Python 2.6.6, DataStage, Aginity 4.9, Snowflake, Tidal, EC2, SQL, Git, Unix, API, AWS S3 bucket

ITC Infotech (Eurofins), Bangalore, India, as Lead Consultant, Sep 2018 – Feb 2019

Role: Selenium/Big Data Lead Consultant.

Responsibilities:

Understanding data mapping from source to target tables, including business logic and ETL transformations across tables.

Experience in preparing weekly status, test execution, and test metrics reports

Verifying the data flow (data mapping, counts, and transformation logic) from source to target.

Verifying database structural changes and the SCD implementation.

Involved in developing test cases and test plans, test execution, defect tracking, and report generation in Jira against functional specifications.

Involved in end-to-end defect management of assigned projects: identified defects, assessed root causes, and prepared detailed information for developers and business stakeholders.

Experienced in data validation and back-end database testing to check data integrity.

Used HQL queries extensively to analyze the HDFS data (see the HQL sketch after this list).

Experience in testing data warehouse/ETL applications developed in Informatica and Ab Initio using SQL Server, Oracle, Hadoop, and DB2.

Evaluating ETL/BI specifications and processes and testing APIs in Postman.

Handling the offshore team and attending scrum, requirement, and defect triage calls.

Experience with different file systems and databases, such as Oracle, HDFS, and MS SQL Server, extracting and loading data using Sqoop.

Validating mappings in Informatica IDQ: mapplets, worklets, workflows, and sessions.

Extensively tested data warehouse and big data applications using Hadoop and ETL.

Worked with query and scripting languages such as HQL, Apache Pig, and SQL.
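
A minimal sketch of the HQL analysis described in the list above: a per-partition count run through the HiveServer2 JDBC driver so the result can be asserted from Java. The host, database, and table names are illustrative assumptions.

```java
import java.sql.Connection;
import java.sql.DriverManager;
import java.sql.ResultSet;
import java.sql.Statement;

public class HqlCheck {
    public static void main(String[] args) throws Exception {
        // HiveServer2 JDBC driver (hive-jdbc jar assumed on the classpath)
        Class.forName("org.apache.hive.jdbc.HiveDriver");
        try (Connection c = DriverManager.getConnection(
                 "jdbc:hive2://hive-host:10000/default", "qa_user", "");
             Statement s = c.createStatement();
             ResultSet rs = s.executeQuery(
                 "SELECT load_date, COUNT(*) FROM stg.transactions GROUP BY load_date")) {
            while (rs.next()) {
                // Each row is a partition-level count to compare with the source feed
                System.out.println(rs.getString(1) + " -> " + rs.getLong(2));
            }
        }
    }
}
```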

Environment: Java 1.8, Informatica 10.1, AQT, PuTTY, SQL, SQL Server 2005/2008, Unix, Postman, REST API, Oracle 9i, Maven, Jenkins, TestNG, IntelliJ, Git

Synechron (Wells Fargo), Bangalore, India May 2017 – Sep 2018

Role: Senior ETL/Big Data and Selenium Automation Tester.

Responsibilities:

Involved in requirement walkthroughs and test estimation.

Selenium automation using BDD/Cucumber, TestNG classes, POM classes, hooks, and tagged hooks (see the step-definition sketch at the end of this list).

Code check-in using Git.

Tested a range of big data architectures and frameworks, including Hadoop, and the deployment of code builds.

Validating data flow between GitHub, Symphony, MapR (HBase) tables, the Swagger REST API for the entitlement system, and batch jobs.

Involved in validating HDFS for the client system against the source feed type (Parquet and Avro files or DB pulls) and validating data using Hadoop commands.

Triggering Spring XD jobs and validating NDM data downstream and upstream and in the data lake portal.

Involved in validating different Ab Initio pset jobs, deploying Autosys jobs, and scheduling jobs with the Control-M tool.

Validating Hive internal and external tables used to load large sets of structured and semi-structured data, NoSQL data, a variety of portfolios, and Spark jobs.

Involved in testing BI reports using the Cognos dashboard tool.

Validating REST API requests and automating them using Postman.

Involved in validating data ingestion and data across the different ETL stages using HQL and SQL queries.

Followed Agile testing practices and guidelines, including defect management using HP ALM Quality Center, and handled a small team.

Automating large data-set comparisons using the QuerySurge tool and MD5 checksums.
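
A minimal sketch of the BDD/Cucumber structure referred to in the list above: a tagged Before hook, step definitions, and a small Page Object Model class, using the standard Cucumber, Selenium, and TestNG Java APIs. The LoginPage class, locators, and URL are hypothetical.

```java
import io.cucumber.java.After;
import io.cucumber.java.Before;
import io.cucumber.java.en.Given;
import io.cucumber.java.en.Then;
import org.openqa.selenium.By;
import org.openqa.selenium.WebDriver;
import org.openqa.selenium.chrome.ChromeDriver;
import static org.testng.Assert.assertTrue;

public class LoginSteps {
    private WebDriver driver;
    private LoginPage loginPage;

    @Before("@smoke")  // tagged hook: runs only before scenarios tagged @smoke
    public void setUp() {
        driver = new ChromeDriver();
        loginPage = new LoginPage(driver);
    }

    @Given("the user signs in as {string}")
    public void theUserSignsInAs(String user) {
        loginPage.open();
        loginPage.loginAs(user, "***"); // masked test credential
    }

    @Then("the dashboard is displayed")
    public void theDashboardIsDisplayed() {
        assertTrue(loginPage.isDashboardVisible(), "dashboard not shown");
    }

    @After
    public void tearDown() {
        if (driver != null) driver.quit();
    }
}

// Hypothetical page object (POM): locators live here, not in the steps
class LoginPage {
    private final WebDriver driver;

    LoginPage(WebDriver driver) { this.driver = driver; }

    void open() { driver.get("https://example.test/login"); }

    void loginAs(String user, String pwd) {
        driver.findElement(By.id("username")).sendKeys(user);
        driver.findElement(By.id("password")).sendKeys(pwd);
        driver.findElement(By.id("signin")).click();
    }

    boolean isDashboardVisible() {
        return !driver.findElements(By.cssSelector(".dashboard")).isEmpty();
    }
}
```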

NTT DATA (Arqiva), Bangalore, India Feb 2016 – May 2017

Role: ETL/Selenium Tester.

Responsibilities:

Identifying test artifacts such as test outlines and scenarios, and writing test case outlines for HLD and LLD documents.

Involved in validating data across different tables using join queries in PostgreSQL and validating data at different ETL stages per the mapping FSD (see the join-check sketch after this list).

Involved in configuring the job scheduler in Informatica and validating data in staging, ODS, and mart tables.

Modifying transformations, mapplets, and workflows in Informatica IDQ.

Involved in sanity, functional, system, and regression testing and retesting in different environments, and verifying DWH files.

Involved in testing BI reports in Tableau.
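
A minimal sketch of the PostgreSQL join validation described in the list above: a LEFT JOIN that flags source rows that never reached the mart table. The connection URL, schemas, and column names are hypothetical placeholders.

```java
import java.sql.Connection;
import java.sql.DriverManager;
import java.sql.ResultSet;
import java.sql.Statement;

public class OrphanRowCheck {
    public static void main(String[] args) throws Exception {
        // Rows present in ODS but missing from the mart (placeholder names)
        String sql = "SELECT COUNT(*) "
                   + "FROM ods.orders o "
                   + "LEFT JOIN mart.orders m ON m.order_id = o.order_id "
                   + "WHERE m.order_id IS NULL";
        try (Connection c = DriverManager.getConnection(
                 "jdbc:postgresql://db-host:5432/edw", "qa_user", "***");
             Statement s = c.createStatement();
             ResultSet rs = s.executeQuery(sql)) {
            rs.next();
            System.out.println("orphan rows: " + rs.getLong(1));
        }
    }
}
```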

Environment: Java 1.7, Informatica 9.1.0, PostgreSQL 10.6, Linux (openSUSE 10.1), FileZilla, XML, SoapUI 4.5, Oracle 11g, JUnit, Jira, Jama

NTT DATA (TIAA-CREF), Bangalore, India April 2014 – Feb 2016

Role: ETL/Selenium Tester.

Responsibilities:

Involved in requirement walkthrough discussions, mapping banking terminology to requirements, test estimation, and handling the team.

Developing automation scripts for new modules and working on the hybrid module.

Involved in preparing automation test data and Selenium scripts for new functionality.

Involved in automating lump-sum payments and end-to-end testing of withdrawals, transfers, and loans.

BDD through Cucumber with Selenium automation in core Java; responsible for planning regression test stories in every sprint.

Maintaining code versions through Git, with Maven and Log4j.

Environment: JDK 1.7, Selenium 3.1.0, PRISM, QC ALM 12.5, RTC, SQL Developer 18, SoapUI 5.2, Oracle 11g, Eclipse 4.5, Maven, Jenkins, TestNG, IntelliJ, Git

Datamatics (Amdocs), Pune, India July 2013 – April 2014

Role: Associate Consultant.

Responsibilities:

Involved in gaining a detailed understanding of the application and data flow for new CRs.

Involved in testing web management, order management, 2G and 3G usage, MFS, AAM, POS, CRM, and CSM releases and builds, and in testing SOA/API applications using SoapUI.

Involved in processing flat and XML files, running batch jobs in the back end, and checking logs in PuTTY.

Involved in testing Amdocs mobile finance services such as e-wallet, m-wallet, and auto payment.

Environment: Enabler and Ensemble, Mainframe, TOAD (SQL) 6.5, SoapUI 4.5, Quality Center 11, CRM8, CSM, Amdocs Billing, Linux, XML Bridge

Syniverse, Bangalore, India July 2011 – July 2013

Role: Test Engineer.

Responsibilities:

Analyzing the problem description and resolving it by understanding the GHLD.

Configuring the test environment in the database based on the error logs.

Performing unit testing of triggers, converters, and flat files on the Unix platform, performing regression testing for different operators, and preparing detailed test reports for the client.

Environment: C++, Unix, QC, SQL Developer, web services, RESTful API


