
Developer Data

Location:
The Bronx, NY
Posted:
June 18, 2020


Resume:

Shruthi.J

Email: addwu6@r.postjobfree.com, Phone: 732-***-****

Over 7 years of IT experience in the Design, Analysis, Development, Modeling, Implementation, Administration, and Testing of various applications, Decision Support Systems, and Data Warehousing applications using ETL tools like Talend.

5+ years of experience in Talend Data Integration/Big Data Integration/Talend Administration Center (TAC).

Experience working with Data Warehousing concepts like OLAP, OLTP, Star Schema, Snowflake Schema, and Logical/Physical/Dimensional Data Modeling.

Highly proficient in Agile, Scrum, and Waterfall software development life cycles.

Extracted data from multiple operational sources for loading the staging area, Data Warehouse, and Data Marts using SCD (Type 1/Type 2/Type 3) loads.
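As a minimal illustration of the SCD decision logic behind loads like these (an assumed sketch, not the actual Talend job; the class, enum, and hash-map representation are hypothetical), each incoming row is classified against the active dimension version for its business key:

```java
import java.util.*;

// Hypothetical sketch of SCD Type 2 routing: unseen business keys are
// inserted, changed rows expire the current version and insert a new
// one, and unchanged rows are skipped.
public class ScdType2Sketch {

    /** Possible actions for an incoming source row. */
    enum Action { INSERT_NEW, EXPIRE_AND_INSERT, NO_CHANGE }

    /**
     * Compare an incoming row's attribute hash against the active
     * dimension version for the same business key.
     *
     * @param current map of business key -> hash of the active version
     */
    static Action classify(Map<String, String> current, String key, String rowHash) {
        String activeHash = current.get(key);
        if (activeHash == null) {
            return Action.INSERT_NEW;        // key never seen: new dimension row
        }
        if (!activeHash.equals(rowHash)) {
            return Action.EXPIRE_AND_INSERT; // attributes changed: close old version
        }
        return Action.NO_CHANGE;             // identical: nothing to load
    }

    public static void main(String[] args) {
        Map<String, String> dim = new HashMap<>();
        dim.put("CUST-1", "hashA");
        System.out.println(classify(dim, "CUST-2", "hashX")); // INSERT_NEW
        System.out.println(classify(dim, "CUST-1", "hashB")); // EXPIRE_AND_INSERT
        System.out.println(classify(dim, "CUST-1", "hashA")); // NO_CHANGE
    }
}
```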

Experience in creating Joblets in Talend for processes that can be reused across most jobs in a project, such as Start Job and Commit Job.

Expertise in creating subjobs that run in parallel to maximize performance and reduce overall job execution time, using the parallelize component of Talend in TIS and the Multithreaded Executions option in TOS.
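The benefit of parallel subjobs can be sketched with plain `java.util.concurrent` (an illustrative analogy, not Talend's internals; the subjob names are hypothetical): independent subjobs run concurrently, so total runtime approaches the slowest subjob rather than the sum of all of them.

```java
import java.util.*;
import java.util.concurrent.*;

// Illustrative sketch of running independent subjobs in parallel,
// as a parallelized Talend job would, instead of sequentially.
public class ParallelSubjobs {

    /** Run each subjob on its own thread and wait for all to finish. */
    static List<String> runInParallel(List<Callable<String>> subjobs) throws Exception {
        ExecutorService pool = Executors.newFixedThreadPool(subjobs.size());
        try {
            List<String> results = new ArrayList<>();
            // invokeAll preserves submission order in its result list
            for (Future<String> f : pool.invokeAll(subjobs)) {
                results.add(f.get()); // propagates any subjob failure
            }
            return results;
        } finally {
            pool.shutdown();
        }
    }

    public static void main(String[] args) throws Exception {
        List<Callable<String>> subjobs = Arrays.asList(
            () -> "load_customers done",
            () -> "load_orders done",
            () -> "load_products done");
        System.out.println(runInParallel(subjobs));
    }
}
```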

Worked on RESTful APIs and SOAP.

Adaptive to Cloud Strategies based on AWS (Amazon Web Services).

Technical knowledge of EC2 and S3.

Extensive experience in using the AWS S3 components of Talend, such as tS3BucketCreate, tS3BucketDelete, tS3BucketExist, tS3Copy, tS3Put, tS3Get, and tS3Connection.

Experienced in using CDC (Change Data Capture) to monitor changes in different target databases.

Experienced in creating triggers on the TAC server to schedule Talend jobs to run on the server.

Worked extensively on Error Handling, Performance Analysis, and Performance Tuning of Talend ETL components.

Strong decision-making and interpersonal skills with result-oriented dedication toward goals.

Education:

Master's in Computer Applications, Bangalore, India.

Technology Skills:

ETL Tools

Talend Open Studio (TOS) for Data Integration (5.6/6.2/6.4), Talend for Big Data, SSIS

Big Data

Hadoop, Sqoop, Hive

DB Tools

Toad, SQL Developer, WinSQL, Teradata SQL Assistant

Programming Language

Java, SQL, C, C++, HTML, XML

Operating Systems

Windows, UNIX (Sun Solaris 10, HP, AIX) & Linux

Databases

Netezza, Oracle 10g & 11g, Teradata, MS SQL Server, MySQL, Hive, Redshift

Methodologies

Agile and Waterfall

Scheduling Tools

TAC

BI Tools

Tableau Desktop

Cognizant, NYC, NY July 2019 – Current

Talend Developer

Responsibilities:

Analyzed and performed data integration using Talend open integration suite.

Worked on the design, development and testing of Talend mappings.

Created ETL job infrastructure using Talend Open Studio.

Worked on Talend components like tReplace, tMap, tSortRow, tFilterColumns, tFilterRow, tJava, tJavaRow, tConvertType, etc.

Used database components like tMSSqlInput, tMSSqlRow, tMSSqlOutput, tOracleOutput, tOracleInput, etc.

Worked with various file components like tFileCopy, tFileCompare, tFileExist, tFileDelete, tFileRename.

Worked on improving the performance of Talend jobs.

Created triggers for a Talend job to run automatically on server.

Worked on Exporting and Importing of Talend jobs.

Created jobs to pass parameters from child job to parent job.

Exported jobs to Nexus and SVN repository.

Implemented update strategy on tables and used tJava, tJavaRow components to read data from tables to pull only newly inserted data from source tables.

Observed statistics of Talend jobs in Airflow to improve performance and identify the scenarios in which errors occur.
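The incremental pull described above can be sketched as a simple watermark filter (an assumed illustration; the `Row` class, field names, and id-based watermark are hypothetical stand-ins for the real source tables): only rows with an id above the last loaded id survive.

```java
import java.util.*;
import java.util.stream.*;

// Hypothetical sketch of incremental extraction: keep only source rows
// whose id exceeds the stored watermark (the highest id already loaded),
// so each run pulls newly inserted rows only.
public class IncrementalPull {

    /** A minimal source row: a monotonically increasing id plus a payload. */
    static class Row {
        final long id;
        final String payload;
        Row(long id, String payload) { this.id = id; this.payload = payload; }
    }

    /** Return rows inserted after the watermark, oldest first. */
    static List<Row> newRowsSince(List<Row> source, long watermark) {
        return source.stream()
                     .filter(r -> r.id > watermark)
                     .sorted(Comparator.comparingLong((Row r) -> r.id))
                     .collect(Collectors.toList());
    }

    public static void main(String[] args) {
        List<Row> source = Arrays.asList(
            new Row(1, "already loaded"), new Row(3, "new"), new Row(2, "new"));
        for (Row r : newRowsSince(source, 1)) {
            System.out.println(r.id + " " + r.payload); // rows 2 and 3 only
        }
    }
}
```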

Created Generic and Repository schemas.

Developed a project-specific deployment job responsible for deploying Talend jar files to the Windows environment as a zip file; this zip file is later unzipped and the files are deployed to the UNIX box.

This deployment job is also responsible for maintaining versioning of the Talend jobs deployed in the UNIX environment.

Environment: Talend 7.2, UNIX, SQL Server, BigQuery, PostgreSQL, Flat Files, SQL, Airflow, MySQL.

HighPoint Solutions, East Norriton, PA Jan 2019 – July 2019

ETL/Talend Developer

Responsibilities:

Participated in all phases of development life-cycle with extensive involvement in the definition and design meetings, functional and technical walkthroughs.

Created Talend jobs to copy files from one server to another, utilizing Talend FTP components.

Created and managed Source to Target mapping documents for all Facts and Dimension tables.

Created and deployed physical objects including custom tables, custom views, stored procedures, and Indexes to SQL Server for Staging and Data-Mart environment.

Designed and implemented ETL for data loads from heterogeneous sources to SQL Server and Oracle target databases, including Fact and Slowly Changing Dimension (SCD Type 1 and SCD Type 2) loads.

Utilized Big Data components like tHDFSInput, tHDFSOutput, tHiveLoad, tHiveInput, tSqoopImport, and tSqoopExport.

Imported and exported data into Confidential and Hive using Sqoop.

Extensively used the tMap component, which performs lookup and joiner functions, along with tJava, tOracle, tXMLMap, tFileInputDelimited, tLogRow, tLogCatcher, etc. in many of my jobs; created and worked on over 100 components across my jobs.

Used Talend most used components (tMap, tDie, tConvertType, tFlowMeter, tLogCatcher, tRowGenerator, tSetGlobalVar, tHashInput & tHashOutput and many more).

Created many complex ETL jobs for data exchange from and to Database Server and various other systems including RDBMS, XML, CSV, and Flat file structures.

Worked on various Talend components such as tMap, tFilterRow, tAggregateRow, tFileExist, tFileCopy, tFileList, tDie etc.

Developed stored procedures to automate the testing process, easing QA efforts and reducing test timelines for data comparison on tables.

Extensively used Talend Administration Center (TAC) for running jobs on different servers by passing various context parameters.

Developed mappings to load Fact and Dimension tables, SCD Type 1 and SCD Type 2 dimensions, and incremental loads, and unit tested the mappings.

Used tStatsCatcher, tDie, and tLogRow to create a generic Joblet that stores processing stats in a database table to record job history.
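A minimal sketch of what such a stats Joblet records (an assumed design, not Talend's internals; the `JobStatsRecorder` class, field layout, and pipe-delimited format are hypothetical stand-ins for the real history table):

```java
import java.util.*;

// Illustrative sketch of a generic job-history recorder: each job run
// appends one record, where the real Joblet would write a row to a
// database table fed by tStatsCatcher.
public class JobStatsRecorder {

    private final List<String> history = new ArrayList<>();

    /** Record one job run as "name|status|rows" and keep it in history. */
    String record(String jobName, String status, long rowsProcessed) {
        String entry = jobName + "|" + status + "|" + rowsProcessed;
        history.add(entry);
        return entry;
    }

    /** Read-only view of everything recorded so far. */
    List<String> history() {
        return Collections.unmodifiableList(history);
    }

    public static void main(String[] args) {
        JobStatsRecorder recorder = new JobStatsRecorder();
        recorder.record("load_customers", "OK", 120000);
        recorder.record("load_orders", "FAILED", 0);
        System.out.println(recorder.history());
    }
}
```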

Used the S3 components tS3BucketCreate, tS3BucketDelete, tS3BucketExist, tS3Copy, tS3Put, tS3Get, and tS3Connection to create buckets, copy files, apply transformations, and load the files into the target system.

Integrated Java code inside Talend Studio using components like tJavaRow, tJava, tJavaFlex, and Routines.

Worked on RESTful APIs and SOAP.

Environment: Oracle 10g/11g, Talend Open Studio (TOS) for Data Integration (5.6/6.2/6.4), Talend for Big Data, HP QC 9.0/11.0, SQL Developer, TOAD, SQL.

Citi, Citicards Way, Jacksonville, FL Apr 2018 – Dec 2018

ETL/Talend Developer.

Responsibilities:

Worked with Data mapping team to understand the source to target mapping rules.

Analyzed the requirements and framed the business logic and implemented it using Talend.

Involved in ETL design and documentation.

Analyzed and performed data integration using Talend open integration suite. Worked on the design, development and testing of Talend mappings.

Created ETL job infrastructure using Talend Open Studio.

Worked on Talend components like tReplace, tMap, tSortRow, tFilterColumns, tFilterRow, tJava, tJavaRow, tConvertType, etc. Used database components like tMSSqlInput, tMSSqlRow, tMSSqlOutput, tOracleOutput, tOracleInput, etc.

Worked with various file components like tFileCopy, tFileCompare, tFileExist, tFileDelete, tFileRename. Worked on improving the performance of Talend jobs.

Created triggers for a Talend job to run automatically on the server. Worked on Exporting and Importing of Talend jobs.

Created jobs to pass parameters from child job to parent job.

Exported jobs to Nexus and SVN repository. Implemented update strategy on tables and used tJava, tJavaRow components to read data from tables to pull only newly inserted data from source tables. Observed statistics of Talend jobs in AMC to improve performance and identify the scenarios in which errors occur. Created Generic and Repository schemas.

Developed a project-specific deployment job responsible for deploying Talend jar files to the Windows environment as a zip file; this zip file is later unzipped and the files are deployed to the UNIX box.

This deployment job is also responsible for maintaining versioning of the Talend jobs deployed in the UNIX environment. Developed shell scripts in the UNIX environment to support scheduling of the Talend jobs.

Monitored the daily, weekly, and ad hoc runs that load data into the target systems.

Environment: Talend 5.5.2, UNIX, Shell script, SQL Server, Oracle, Business Objects, ERwin, SVN.

Corussoft Inc, Edison, NJ Oct 2015 – Mar 2018

Informatica/Talend ETL Developer

Responsibilities:

Worked with Data mapping team to understand the source to target mapping rules.

Analyzed the requirements and framed the business logic and implemented it using Talend.

Involved in ETL design and documentation. Analyzed and performed data integration using Talend open integration suite.

Worked on the design, development and testing of Talend mappings. Created ETL job infrastructure using Talend Open Studio.

Worked on Talend components like tReplace, tMap, tSortRow, tFilterColumns, tFilterRow, tJava, tJavaRow, tConvertType, etc. Used database components like tMSSqlInput, tMSSqlRow, tMSSqlOutput, tOracleOutput, tOracleInput, etc.

Worked with various file components like tFileCopy, tFileCompare, tFileExist, tFileDelete, tFileRename. Worked on improving the performance of Talend jobs.

Created triggers for a Talend job to run automatically on the server. Worked on Exporting and Importing of Talend jobs.

Created jobs to pass parameters from child job to parent job. Exported jobs to Nexus and SVN repository.

Implemented update strategy on tables and used tJava, tJavaRow components to read data from tables to pull only newly inserted data from source tables.

Created Generic and Repository schemas. Developed a project-specific deployment job responsible for deploying Talend jar files to the Windows environment as a zip file; this zip file is later unzipped and the files are deployed to the UNIX box.

This deployment job is also responsible for maintaining versioning of the Talend jobs deployed in the UNIX environment.

Environment: Talend 5.5.2, UNIX, SQL Server, Oracle, PostgreSQL, Flat Files, SQL.

BT Telecom, India Jan 2012 – Aug 2014

Jr. ETL/Business Objects Reports Tester/QA Tester

Responsibilities:

Developed ETL test scripts based on technical specifications/Data design documents and Source to Target mappings.

Extensively interacted with developers, business & management teams to understand the project business requirements and ETL design document specifications.

Participated in regular project status meetings and QA status meetings.

Extensively used and developed SQL scripts/queries in backend testing of Databases.

Defects identified in the testing environment were communicated to the developers using the Quality Center Defects module.

Prepared daily status reports with details of executed, passed, and failed test cases and defect status.

Tested a number of complex ETL mappings, mapplets and reusable transformations for daily data loads.

Created test cases for ETL mappings and design documents for production support.

Extensively worked with flat files and excel sheet data sources. Wrote scripts to test the flat files data in the databases.

Environment: Oracle 10g/11g, Informatica PowerCenter 10 (PowerCenter Designer, Workflow Manager, Workflow Monitor), SAP Business Objects 3.1 – Crystal Reports and Dashboard, HP QC 9.0/11.0, SQL Developer, TOAD, SQL, Web Services
