
Talend Developer Data Migration

Location:
Jacksonville, FL
Posted:
November 05, 2023


Siva Kumar

Talend Integration Developer

ad0vfg@r.postjobfree.com

Contact No.: 815-***-****

PROFESSIONAL SUMMARY:

● 10+ years of IT experience in the analysis, design, development, testing, and implementation of business application systems.

● Extensive experience in ETL methodology for performing data profiling, data migration, extraction, transformation, and loading using Talend; designed data conversions from a wide variety of source systems including Oracle, DB2, SQL Server, Teradata, Hive, flat files, XML and mainframe files, and ActiveMQ.

● Good experience with relational database management systems and in integrating data from various data sources such as Oracle, MS SQL Server, MySQL, and flat files.

● Excellent knowledge of the deployment process from DEV to QA, UAT, and PROD, using both the deployment-group and import/export methods in TAC.

● Working knowledge of Waterfall, Agile methodologies.

● Familiar with design and implementation of Data Warehouse life cycle and excellent knowledge of entity-relationship/multidimensional modeling (star schema, snowflake schema), Slowly Changing Dimensions (SCD Type 1, Type 2, and Type 3).
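
For illustration, the SCD Type 2 pattern mentioned above can be sketched in plain Python; the record layout, column names, and tracked attributes here are hypothetical examples, not taken from any specific project:

```python
from datetime import date

def scd2_upsert(dimension, incoming, key, tracked_cols, today=None):
    """Apply an SCD Type 2 change: expire the current row and insert a new
    version when any tracked attribute differs; insert brand-new keys directly."""
    today = today or date.today()
    current = next((r for r in dimension
                    if r[key] == incoming[key] and r["is_current"]), None)
    if current and all(current[c] == incoming[c] for c in tracked_cols):
        return dimension  # no attribute changed: keep history as-is
    if current:
        current["is_current"] = False  # expire the old version
        current["end_date"] = today
    new_row = dict(incoming)
    new_row.update(is_current=True, start_date=today, end_date=None)
    dimension.append(new_row)
    return dimension

# Hypothetical customer dimension: a city change creates a second version.
dim = [{"cust_id": 1, "city": "Tampa", "is_current": True,
        "start_date": date(2020, 1, 1), "end_date": None}]
scd2_upsert(dim, {"cust_id": 1, "city": "Jacksonville"}, "cust_id", ["city"])
# dim now holds the expired Tampa row plus the current Jacksonville row
```

Type 1 would simply overwrite the attribute in place, and Type 3 would keep the prior value in a dedicated "previous" column instead of a new row.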

● Debugged ETL job errors and performed ETL sanity checks and production deployments in the TAC (Talend Administration Center) using SVN.

● Experience in troubleshooting and performance tuning at various levels (source, target, mapping, session, and system) in the ETL process.

● Experience in converting stored procedure logic into ETL requirements.

● Good communication and interpersonal skills, ability to learn quickly, with good analytical reasoning and adaptive to new and challenging technological environments.

● Experience in ETL methodology for performing data migration, data profiling, extraction, transformation, and loading using Talend; designed data conversions from a wide variety of source systems including Oracle, DB2, Netezza, SQL Server, Teradata, and Hive, and from non-relational sources such as flat files, XML, and JSON files.

● Designed tables, indexes and constraints using TOAD and loaded data into the database using SQL*Loader.

● Involved in code migrations from Dev. to QA and production and provided operational instructions for deployment.

● Experience in SQL*Plus and TOAD as interfaces to databases, to analyze, view, and alter data.

● Extensive experience in writing UNIX shell scripts and automating ETL processes with UNIX shell scripting.

Technical Skills:

Development Tools: Talend Data Services Platform 7.1, Talend Open Studio for Data Integration 7.1, Talend Studio 6.2, Talend ESB 6.2, Eclipse, PuTTY, FileZilla, SQL Developer, Toad, Teradata SQL Assistant, Aginity Workbench for Netezza, Quality Center, Harvest, SQL Server Management Studio

Programming Languages: Core Java, SQL, PL/SQL, C, C++

Operating Platforms: MS Windows 10/2000/XP/NT, Mac OS, Unix and Linux

Web Technologies: HTML, XML, CSS, XSD, JavaScript and JSON

Client: TIAA, NC

Apr 2022 – Present

Role: Sr. Talend Developer

Responsibilities:

● Designing and delivering complex, large-volume data warehouse applications using Talend ETL tool.

● Developed ETL mappings for XML, CSV, and TXT sources and loaded data from these sources into relational tables with Talend ETL; developed joblets for reusability and to improve performance.

● Used components such as tXMLMap, tFileInputXML, tFileOutputXML, tAdvancedFileOutputXML, and tExtractXMLField when dealing with XML structures.

● Worked on Error handling techniques and tuning the ETL flow for better performance.

● Created complex mappings in Talend using tHash, tDenormalize, tMap, and tUniqRow components.

● Implemented Performance tuning in Mappings and Sessions by identifying the bottlenecks and Implemented effective transformation logic.

● Developed joblets using the tLibraryLoad component in Talend to fetch encrypted database connection details from CyberArk and pass the encrypted password to the Talend connection components.

● Migrated jobs from MuleSoft to Talend.

● Analyzed MuleSoft code to understand it and create the mapping documents.

● Built jobs using the tPrejob, tDBConnection, tLibraryLoad, tJMSInput, tMap, tFileInputDelimited, tSchemaComplianceCheck, tDBOutput, tDBInput, tFileOutputDelimited, tJMSOutput, tParallelize, tNormalizer, tAggregateRow, tFileStreamInputJson, tFileInputJson, tFileOutputJson, tJoin, tExtractJsonFields, tPostjob, tFileDelete, and other components.

● Used Talend Salesforce components such as tSalesforceConnection, tSalesforceInput, tSalesforceOutput, tSalesforceOutputBulk, tSalesforceOutputBulkExec, and tSalesforceBulkExec when working with CRM sources or targets.

● Used tSnowflakeConnection, tSnowflakeInput, tSnowflakeRow, tSnowflakeOutput, and tSnowflakeClose components when using Snowflake as a source database.

● Worked on web services using Talend components such as tSOAP, tREST, tWebService, tWebServiceInput, tRESTClient, and tFileFetch.

● Performed data manipulations using various Talend components such as tMSSqlInput, tMSSqlOutput, tOracleInput, tOracleOutput, tMap, tJavaRow, tJava, tFileExist, tFileCopy, tFileList, tDie, tSendMail, and many more.

● Also used the tSystem component to change file properties and delete files from the Talend (Linux) servers based on a specified retention window for inbound/outbound files, e.g., 90 days.
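
A minimal Python sketch of that kind of time-bound cleanup; the 90-day window and directory layout are illustrative only, and a production job would run this server-side rather than from the Studio:

```python
import os
import time

def purge_old_files(directory, max_age_days=90, now=None):
    """Delete regular files whose modification time is older than max_age_days;
    return the names of the files removed."""
    now = now if now is not None else time.time()
    cutoff = now - max_age_days * 86400
    removed = []
    for name in os.listdir(directory):
        path = os.path.join(directory, name)
        # skip subdirectories; only purge plain files past the cutoff
        if os.path.isfile(path) and os.path.getmtime(path) < cutoff:
            os.remove(path)
            removed.append(name)
    return removed
```

The same effect can be had from a shell call such as `find <dir> -type f -mtime +90 -delete`, which is closer to what a tSystem component would actually invoke.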

● Developed and promoted code to the ST2, ST4, DR, and then PROD environments.

● Mentored other junior team members.

● Updated the Confluence page with job summary, details, connections, dependencies, Autosys job names (box and command jobs), Autosys schedule, Jira story link, unit tests, release date, QA, and UAT.

Skills Used:

Talend Studio 7.3.1, flat files, CSV files, XML files, SoapUI, TMC, WinSCP, Autosys Scheduler, PuTTY, shell scripts, Salesforce, Postman, MuleSoft 3.9.5 Exception, MuleSoft 4.1 Runtime, MuleSoft CloudHub, Agile methodology (Jira), Confluence, Clarity.

Client: Walgreens, IL

May 2021 – Apr 2022

Role: Sr. Talend Developer

Responsibilities:

● Provides technical direction for the design of complex and large scale Data & Analytics solutions to effectively meet business needs.

● Provides leadership for the solution design of projects that address business needs, including the analysis of multiple solution options.

● Partners with the Enterprise Architecture team on the creation of the conceptual architecture that addresses the high level business requirements.

● Participates in Solution Architecture Reviews for applicable projects.

● Authors complex macro-level technical designs that define how the functional requirements will be technically enabled.

● Reviews and approves macro-level technical designs for small- to medium-scale solutions.

● Provides guidance to an extended project team related to the solution architecture and technical design.

● Assists and advises project teams in the configuration, testing, implementation and support of secure technical solutions.

● Actively worked on solving the visible and complex business problems.

● Worked on Error handling techniques and tuning the ETL flow for better performance.

● Developed ETL mappings for JSON, CSV, and TXT sources and loaded data from these sources into relational tables with Talend ETL; developed joblets for reusability and to improve performance.

● Created complex mappings in Talend using tHash, tDenormalize, tMap, and tUniqRow components.

● Implemented Performance tuning in Mappings and Sessions by identifying the bottlenecks and Implemented effective transformation logic.

● Developed unit test cases for all enhancements/ new requirements and executed the tests.

● Built jobs using the tPrejob, tDBConnection, tLibraryLoad, tJMSInput, tMap, tFileInputDelimited, tSchemaComplianceCheck, tDBOutput, tDBInput, tFileOutputDelimited, tJMSOutput, tParallelize, tNormalizer, tAggregateRow, tFileStreamInputJson, tFileInputJson, tFileOutputJson, tJoin, tExtractJsonFields, tPostjob, tFileDelete, and other components.

● Also used the tSystem component to call other processing commands already running as part of a large job.

● Used tJMSInput and tJMSOutput components to post messages to and consume messages from JMS queues.

● Mentored other junior team members.

Skills Used:

Talend Data Management 6.3.1, Talend Studio 7.1.1, flat files, CSV files, JSON, SoapUI, HermesJMS queues, WinSCP, ESP Scheduler, PuTTY, shell scripts.

Client: BCBS, FL

June 2019 – April 2021

Role: Sr. Talend Developer

Responsibilities:

● Research, analyze and prepare logical and physical data models for new applications and optimize the data structures to enhance data load times and end-user data access response times.

● Design and develop Talend prototypes for batch acquisition and product fulfillment.

● Design Talend jobs as per Service Oriented Architecture.

● Work on optimization of Talend Jobs for better performance.

● Design normalized data structures and layouts for Consumer and Commercial data stores.

● Create RPDM projects to read data from web services and load into Salesforce using the RPDM ETL Tool.

● Develop and execute various member acquisition campaigns through RPI Tool, as part of the Member engagement process.

● Design and develop various RPDM Projects as part of Digital Campaign Management.

● Design campaigns to support Facebook, Instagram, Twitter campaigns as part of web and mobile applications.

● Design and develop generic framework using RPDM projects to dynamically generate products/segments to support Multi-Tenant processing.

● Work with SQL Tools like TOAD to run SQL queries and validate the data.

● Developed stored procedures and views in DB2 and used them in Talend and RPDM for loading staging and warehouse data marts.

● Developed various Unix scripts.

● Work with migrating and deploying the code to UAT and Production environments.

● Install and upgrade Talend and RPDM ETL Tools.

● Responsible for tuning ETL mappings, Workflows, and underlying data models to optimize load and query Performance.

● Developed unit test cases for all enhancements/ new requirements and executed the tests.

● Worked on application design and development solutions that meet client product requirements.

● Develop solutions to meet design specifications from clients.

● Build features that meet design solutions and that are compliant with client design practices.

● Lead the ETL Offshore Development teams and provided the recommendations to optimize the data extraction and loading using ETL Talend jobs.

● Worked on end-to-end development of software products from requirement analysis to system study, designing, coding, testing (Unit & Performance), documentation and implementation.

● Working on Talend Management Console for Job Scheduling, Server Monitoring, Task Creation, Plan Creation, Job Deployment etc.

● Implemented Performance tuning in Mappings and Sessions by identifying the bottlenecks and Implemented effective transformation logic.

● Use of Visual Studio Team Services (VSTS) for Source Code Control, project related document sharing and team collaboration.

Skills Used:

Talend Data Services Platform 7.1, Talend Open Studio for Data Integration 7.1, Talend Big Data 6.2, Talend MDM, Hive, Oracle 11g, AWS, XML files, flat files, HL7 files, JSON, Talend Administration Center, IMS, Agile methodology.

Client: SIRVA, Chicago, IL

March 2018 – May 2019

Role: Sr. ETL Talend Consultant.

Responsibilities:

● Collaborated with the Data Integration Team on data and application integration, with the goal of moving data more effectively, efficiently, and with high performance to support business-critical projects involving large data extractions.

● Writing T-SQL (DDL and DML) statements to create tables, user-defined functions, views, complex stored procedures, common table expressions (CTEs), temporary tables, clustered/non-clustered indexes, unique/check constraints, relational database models, SQL joins, and triggers to facilitate efficient data manipulation and consistent data storage.
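
As a small illustration of the CTE pattern mentioned above, here is a sketch using Python's built-in sqlite3 module; SQLite's WITH syntax is close to, though not identical to, T-SQL, and the table and column names are invented for the example:

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE orders (order_id INTEGER, customer TEXT, amount REAL);
    INSERT INTO orders VALUES
        (1, 'acme', 120.0), (2, 'acme', 80.0), (3, 'globex', 50.0);
""")

# The CTE computes per-customer totals; the outer query filters on the
# aggregate -- the same shape as a T-SQL CTE feeding a report query.
rows = conn.execute("""
    WITH totals AS (
        SELECT customer, SUM(amount) AS total
        FROM orders
        GROUP BY customer
    )
    SELECT customer, total FROM totals WHERE total > 100
""").fetchall()
print(rows)  # [('acme', 200.0)]
```

The same filtering could be done with HAVING here; CTEs earn their keep when the intermediate result is referenced more than once or chained through several steps.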

● Worked on Tool Migration POC to Migrate Code from SSIS to Talend.

● Created ETL design and mapping document of the code for tool migration.

● Prepared ETL mapping Documents for every mapping and Data Migration document for smooth transfer of projects from development to testing environment and then to production environment.

● Created complex mappings in Talend using tHash, tDenormalize, tMap, and tUniqRow.

● Used tStatsCatcher, tDie, and tLogRow to create a generic joblet to store processing stats into a Database table to record job history.
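
The essence of that stats-capturing joblet can be sketched in Python with sqlite3; the job_history schema and column names here are hypothetical stand-ins for whatever the real audit table used:

```python
import sqlite3
from datetime import datetime

conn = sqlite3.connect(":memory:")
conn.execute("""
    CREATE TABLE job_history (
        job_name       TEXT,
        started_at     TEXT,
        ended_at       TEXT,
        rows_processed INTEGER,
        status         TEXT
    )
""")

def record_job_run(job_name, started_at, ended_at, rows_processed, status):
    """Persist one job execution's stats, mimicking a tStatsCatcher-style
    joblet writing to an audit table at the end of every job."""
    conn.execute(
        "INSERT INTO job_history VALUES (?, ?, ?, ?, ?)",
        (job_name, started_at.isoformat(), ended_at.isoformat(),
         rows_processed, status),
    )
    conn.commit()

record_job_run("load_customers",
               datetime(2023, 1, 1, 2, 0), datetime(2023, 1, 1, 2, 5),
               15000, "OK")
```

Centralizing this in one reusable piece means every job records history the same way, which is the point of making the joblet generic.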

● Performed data manipulations using various Talend components such as tMSSqlInput, tMSSqlOutput, tOracleInput, tOracleOutput, tMap, tJavaRow, tJava, tFileExist, tFileCopy, tFileList, tDie, tSendMail, and many more.

● Worked on web services using Talend components such as tSOAP, tREST, tWebService, and tWebServiceInput.

● Designed Talend jobs using Big Data components such as tHDFSInput, tHDFSOutput, tHDFSPut, tHiveLoad, tHiveInput, and tHiveCreateTable.

● Design, Develop and deploy jobs to load the data into Hadoop and Hive.

● Responsible for tuning ETL mappings, Workflows, and underlying data models to optimize load and query Performance.

● Developed unit test cases for all enhancements/ new requirements and executed the tests.

● Worked on application design and development solutions that meet client product requirements.

● Develop solutions to meet design specifications from clients.

● Build features that meet design solutions and that are compliant with client design practices.

● Led the ETL Offshore Development teams and provided the recommendations to optimize the data extraction and loading using ETL Talend jobs.

● Worked on end-to-end development of software products from requirement analysis to system study, designing, coding, testing (Unit & Performance), documentation and implementation.

● Working on Talend Management Console for Job Scheduling, Server Monitoring, Task Creation, Plan Creation, Job Deployment etc.

● Implemented Performance tuning in Mappings and Sessions by identifying the bottlenecks and Implemented effective transformation logic.

● Use of Visual Studio Team Services (VSTS) for Source Code Control, project related document sharing and team collaboration.

Skills Used:

Talend Big Data 6.2, Talend MDM, Hive, Oracle 11g, AWS, XML files, flat files, HL7 files, JSON, Talend Administration Center, IMS, Agile methodology.

Client: Tractor Supply Company, Brentwood, TN

Jan 2017 – Feb 2018

Role: Talend Developer

Responsibilities:

● Designed and implemented ETL for data load from heterogeneous Sources to SQL Server and Oracle as target databases and for Fact and Slowly Changing Dimensions SCD-Type1 and SCD-Type2.

● Utilized Big Data components like tHDFSInput, tHDFSOutput, tHiveLoad, tHiveInput, tHbaseInput, tHbaseOutput, tSqoopImport and tSqoopExport.

● Created Talend jobs to retrieve data from Legacy sources and to retrieve user data from the Flat files on a monthly and weekly basis.

● Wrote Hive Queries to fetch Data from HBase and transfer them to HDFS through HIVE.

● Optimized the performance of the mappings by various tests on sources, targets and transformations.

● Used debugger and breakpoints to view transformation output and debug mappings.

● Developed ETL mappings for XML, CSV, and TXT sources and loaded data from these sources into relational tables with Talend ETL; developed joblets for reusability and to improve performance.

● Imported the data from different sources like HDFS/HBase into Spark RDD.

● Involved in converting Hive/SQL queries into Spark transformations using Spark RDD, Scala and Python.
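
The shape of that SQL-to-transformation rewrite can be shown without a Spark cluster: the same filter/map/aggregate pipeline in plain Python, where a list stands in for the RDD and the data and column names are invented:

```python
from itertools import groupby
from operator import itemgetter

# Equivalent SQL: SELECT dept, SUM(salary) FROM emp WHERE active = 1 GROUP BY dept
emp = [
    {"dept": "etl", "salary": 90, "active": 1},
    {"dept": "etl", "salary": 80, "active": 0},
    {"dept": "dba", "salary": 70, "active": 1},
]

# filter -> WHERE clause; the generator -> SELECT of the needed columns;
# sorted + groupby + sum -> GROUP BY with its aggregate.
active = filter(lambda r: r["active"] == 1, emp)
pairs = sorted(((r["dept"], r["salary"]) for r in active), key=itemgetter(0))
totals = {dept: sum(s for _, s in rows)
          for dept, rows in groupby(pairs, key=itemgetter(0))}
print(totals)  # {'dba': 70, 'etl': 90}
```

In PySpark the same steps would be `rdd.filter(...).map(...).reduceByKey(add)`; the mechanical part of the conversion is mapping each SQL clause onto one transformation.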

● Involved in unit testing and system testing and in preparing Unit Test Plan (UTP) and System Test Plan (STP) documents.

● Migrated the code and release documents from DEV to QA (UAT) and to Production.

● Troubleshooting, debugging & altering Talend issues, while maintaining the health and performance of the ETL environment.

● Scheduled Talend jobs with Talend Admin Console, setting up best practices and migration strategy.

● Created Talend Mappings to populate the data into dimensions and fact tables.

● Responsible for Broad design, development and testing with Talend Integration Suite and Performance Tuning of mappings.

● Performed Talend Data Integration, Talend Platform Setup on Windows, and UNIX systems.

● Created complex mappings in Talend 6.0.1/5.5 using tMap, tJoin, tReplicate, tParallelize, tJava, tJavaRow, tJavaFlex, tAggregateRow, tDie, tWarn, tLogCatcher, etc.

● Created joblets in Talend for the processes which can be used in most of the jobs in a project like Start job and Commit job.

● Used Repository Manager for Migration of Source code from Lower to higher environments.

● Developed jobs to move inbound files to vendor server location based on monthly, weekly, and daily frequency.

● Implemented Change Data Capture (CDC) in Talend to load deltas to a data warehouse.
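
A toy Python version of the delta-loading idea, comparing key/hash snapshots to classify changes; Talend's CDC actually works off database triggers or logs, so this only illustrates the concept, and the row layout is invented:

```python
import hashlib

def row_hash(row):
    """Stable digest of a row's non-key attributes."""
    payload = "|".join(str(v) for k, v in sorted(row.items()) if k != "id")
    return hashlib.md5(payload.encode()).hexdigest()

def detect_deltas(previous, current):
    """Classify rows as inserts, updates, or deletes between two snapshots."""
    prev = {r["id"]: row_hash(r) for r in previous}
    curr = {r["id"]: r for r in current}
    inserts = [r for k, r in curr.items() if k not in prev]
    updates = [r for k, r in curr.items()
               if k in prev and row_hash(r) != prev[k]]
    deletes = [k for k in prev if k not in curr]
    return inserts, updates, deletes

prev_snap = [{"id": 1, "city": "Tampa"}, {"id": 2, "city": "Miami"}]
curr_snap = [{"id": 1, "city": "Orlando"}, {"id": 3, "city": "Ocala"}]
ins, upd, dels = detect_deltas(prev_snap, curr_snap)
# ins -> id 3 (new), upd -> id 1 (city changed), dels -> [2] (gone)
```

Only the classified deltas then flow to the warehouse load, which is what keeps incremental loads cheap compared with full refreshes.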

● Created jobs to perform record count validation and schema validation.

● Created contexts to use the values throughout the process to pass from parent to child jobs and child to parent jobs.

● Developed joblets that are reused in different processes in the flow.

● Developed an error logging module to capture both system errors and logical errors that contain Email notification and moving files to error directories.
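
The error-handling module described above, reduced to a Python sketch; the paths are placeholders and the notification hook is injected, since a real job would wire in SMTP or a tSendMail-style component:

```python
import logging
import shutil
from pathlib import Path

logger = logging.getLogger("etl")

def process_file(path, error_dir, parse, notify=lambda msg: None):
    """Parse one inbound file; on any failure, log the error, quarantine the
    file in error_dir, and fire the notification hook."""
    try:
        return parse(path)
    except Exception as exc:  # catches both system and logical errors
        logger.error("failed to process %s: %s", path, exc)
        Path(error_dir).mkdir(parents=True, exist_ok=True)
        moved = shutil.move(str(path), error_dir)  # keep the file for inspection
        notify(f"ETL failure on {path}; file moved to {moved}")
        return None
```

Keeping the notifier as a parameter makes the module reusable across jobs: each job supplies its own email or alerting callback while the logging and quarantine behavior stays identical.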

● Provided Production Support by running the jobs and fixing the bugs.

● Used Talend database components, File components and processing components based on requirements.

● Responsible for developing, support and maintenance for the ETL (Extract, Transform and Load) processes using Talend Integration Suite.

● Performed unit testing and integration testing after development and got the code reviewed.

● Responsible for code migrations from Dev. to QA and production and providing operational instructions for deployments.

Skills Used:

Talend 6.0.1, Oracle 11g, Hive, Sqoop, Teradata V13.0, FastLoad, MultiLoad, Teradata SQL Assistant, MS SQL Server 2012/2008, PL/SQL, Agile methodology, T-SQL, SSIS, TOAD, Erwin, AIX, shell scripts, Autosys.

Client: Ryaan Tech. Solutions, Bangalore, India

June 2015 - Dec 2016

Role: Talend Developer (Internship)

Responsibilities:

● The project aimed to support existing clients by moving from a manual process to automation through ETL development using Talend and Cognos Data Manager.

● Closely worked with Data Architects in designing tables and was involved in modifying technical Specifications.

● Designed and Implemented the ETL process using Talend Enterprise Big Data Edition to load the data from Source to Target Database.

● Involved in data extraction from Oracle, flat files, and XML files using Talend, with Java as the backend language.

● Used the tWaitForFile component for file watch event jobs.

● Used over 20 components in Talend (tMap, tFileList, tJava, tLogRow, tOracleInput, tOracleOutput, tSendMail, etc.).

● Used debugger and breakpoints to view transformation output and debug mappings.

● Develop ETL mappings for various Sources (.TXT, .CSV, .XML) and load the data from these sources into relational tables with Talend Enterprise Edition.

● Worked with global context variables and context variables, and extensively used over 100 components in Talend to create jobs.

● Created child jobs to use them in parent jobs using tRunJob.

● Extracting transformed data from Hadoop to destination systems, as a one-off job, batch process, or Hadoop streaming process.

● Worked on Error handling techniques and tuning the ETL flow for better performance.

● Worked Extensively at TAC (Admin Console), where we Schedule Jobs in Job Conductor.

● Extensively used Talend components tMap, tDie, tConvertType, tFlowMeter, tLogCatcher, tOracleInput, tOracleOutput, tFileList, tDelimited, etc.

● Worked with Oracle SQL Developer while Implementing Unit Testing of ETL Talend Jobs.

● Scheduling the ETL mappings on a daily, weekly, monthly and yearly basis.

● Worked on a Big Data POC, loading data into HDFS and creating MapReduce jobs.

● Worked on project documentation and prepared the source-to-target mapping specs with the business logic; also involved in data modeling.

● Worked on migrating data warehouses from existing SQL Server to Oracle database.

● Implemented Performance tuning in Mappings and Sessions by identifying the bottlenecks and Implemented effective transformation Logic.

Skills Used:

Talend 6.1/ 5.5.2, MS SQL Server, Teradata, Shell script, SQL Server, Oracle, Business Objects, ERwin, SVN.

Client: Sphinix Solutions, Hyderabad, India

April 2013 – May 2015

Role: SQL Developer

Responsibilities:

● Created and managed schema objects such as Tables, Views, Indexes, and referential integrity depending on user requirements.

● Actively involved in the complete software development life cycle for the design of the database for the new Financial Accounting System.

● Successfully implemented the physical design of the newly designed database into MS SQL Server 2008/2005.

● Used MS SQL Server 2008/2005 to design, implement and manage data warehouses, OLAP cubes and reporting solutions to improve asset management, incident management, data center services, system events support and billing.

● Utilized T-SQL daily in creating custom views for data and business analysis.

● Utilized Dynamic T-SQL within functions, stored procedures, views, and tables.

● Used the SQL Server Profiler tool to monitor the performance of SQL Server, particularly to analyze the performance of the stored procedures.

● Stored Procedures and Functions were optimized to handle major business calculations.

● Implementation of data collection and transformation between different heterogeneous sources such as flat file, Excel and SQL Server 2008/2005 using SSIS.

● Migrated all DTS packages to SQL Server Integration Services (SSIS) and modified the package according to the advanced features of SQL Server Integration Services.

● Defined Check constraints, rules, indexes, and views based on the business requirements.

● Extensively used SQL Reporting Services and Report Builder Model to generate custom reports.

● Designed and deployed Reports with a Drop Down menu option and Linked reports.

● Developed drill-down and drill-through reports from multi-dimensional objects like star schema and snowflake schema using SSRS and performance point server.

● Created subscriptions to provide daily basis reports and managed and troubleshoot report server related issues.

Skills Used:

MS SQL Server 2005/2008, SSIS, SSRS, Microsoft .NET, MS Access 2003, Excel, Erwin.


