
Sravya N

Sr. ETL Talend Consultant

North Brunswick, NJ

adlnou@r.postjobfree.com

404-***-****

PROFESSIONAL SUMMARY:

Around 8 years of IT experience in analysis, design, development, testing, and implementation of business application systems.

Experience developing and leading end-to-end implementations of Big Data projects using Talend Big Data, with comprehensive experience in the Hadoop ecosystem, including MapReduce, Hadoop Distributed File System (HDFS), and Hive.

Extensive experience in ETL methodology for performing data profiling, data migration, extraction, transformation, and loading using Talend; designed data conversions from a wide variety of source systems including Oracle, DB2, SQL Server, Teradata, Hive, HANA, flat files, XML, Mainframe files, and ActiveMQ.

Good experience with relational database management systems and in integrating data from various sources such as Oracle, MS SQL Server, MySQL, and flat files.

Hands-on experience with the Hadoop technology stack (HDFS, MapReduce, Hive, HBase, Pig, Sqoop, Oozie, Flume, and Spark).

Experience with NoSQL databases such as HBase and Cassandra.

Excellent knowledge of the deployment process from DEV to QA, UAT, and PROD using both the Deployment Group and Import/Export methods.

Excellent working experience with Waterfall and Agile methodologies.

Familiar with the design and implementation of the data warehouse life cycle, with excellent knowledge of entity-relationship/multidimensional modeling (star schema, snowflake schema) and Slowly Changing Dimensions (SCD Type 1, Type 2, and Type 3).
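
For illustration, a minimal SCD Type 2 sketch in generic SQL, assuming a hypothetical customer_dim dimension and stg_customer staging table (all table and column names are illustrative, not from a specific project):

  -- Close out the current version of rows whose tracked attribute changed.
  UPDATE customer_dim
  SET end_date = CURRENT_DATE, current_flag = 'N'
  WHERE current_flag = 'Y'
    AND EXISTS (SELECT 1 FROM stg_customer s
                WHERE s.customer_id = customer_dim.customer_id
                  AND s.address <> customer_dim.address);

  -- Insert a new current version for changed and brand-new customers.
  INSERT INTO customer_dim (customer_id, address, effective_date, end_date, current_flag)
  SELECT s.customer_id, s.address, CURRENT_DATE, DATE '9999-12-31', 'Y'
  FROM stg_customer s
  LEFT JOIN customer_dim d
    ON d.customer_id = s.customer_id AND d.current_flag = 'Y'
  WHERE d.customer_id IS NULL;

After the UPDATE, changed customers no longer have a current row, so the LEFT JOIN picks them up together with genuinely new keys. Type 1 would simply overwrite the attribute in place; Type 3 would keep the prior value in a dedicated column.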

Debugged ETL job errors and performed ETL sanity checks and production deployments in the Talend Administration Center (TAC) using SVN.

Experience in troubleshooting and performance tuning at various levels of the ETL process, such as source, target, mapping, session, and system.

Experience in converting stored procedure logic into ETL requirements.

Good communication and interpersonal skills, ability to learn quickly, good analytical reasoning, and adaptability to new and challenging technological environments.

Experience in Big Data technologies such as Hadoop/MapReduce, Pig, HBase, Hive, Sqoop, DynamoDB, Elasticsearch, and Spark SQL.

Experienced in ETL methodology for performing data migration, data profiling, extraction, transformation, and loading using Talend; designed data conversions from a large variety of source systems including Oracle […], DB2, Netezza, SQL Server, Teradata, Hive, and HANA, and non-relational sources such as flat files, XML, and Mainframe files.

Experience designing tables, indexes, and constraints using TOAD and loading data into the database using SQL*Loader.

Involved in code migrations from DEV to QA and production, providing operational instructions for deployments.

Strong data warehousing ETL experience using Informatica 9.x/8.x/7.x PowerCenter client tools (Mapping Designer, Repository Manager, Workflow Manager/Monitor) and server tools (Informatica Server, Repository Server Manager).

Experience using SQL*Plus and TOAD as database interfaces to analyze, view, and alter data.

Expertise in data warehouse/data mart, ODS, OLTP, and OLAP implementations, covering project scoping, analysis, requirements gathering, data modeling, effort estimation, ETL design, development, system testing, implementation, and production support.

Expertise in using Informatica transformations such as Joiner, Expression, Connected and Unconnected Lookup, Filter, Aggregator, Stored Procedure, Rank, Update Strategy, Java Transformation, Router, and Sequence Generator.

Experienced in writing Hive queries to load data into and out of HDFS.
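
A sketch of the kind of HiveQL involved, with hypothetical table and path names:

  -- Move a file already sitting in HDFS into a Hive table's warehouse directory.
  LOAD DATA INPATH '/landing/claims/2019-11-01' INTO TABLE claims_raw;

  -- Write query results back out to an HDFS directory.
  INSERT OVERWRITE DIRECTORY '/output/claims_summary'
  SELECT claim_id, SUM(amount) FROM claims_raw GROUP BY claim_id;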

Extensive experience with Pentaho Report Designer, Pentaho Kettle, Pentaho BI Server, and BIRT Report Designer.

Hands-on experience developing and monitoring SSIS/SSRS packages, with outstanding knowledge of high-availability SQL Server solutions, including replication.

Hands-on experience in deploying DTS and SSIS packages.

Excellent experience designing and developing multi-layer web-based information systems using web services, Java, and JSP.

Strong experience in dimensional modeling using star and snowflake schemas, identifying facts and dimensions, and physical and logical data modeling using ERwin and ER/Studio.

Extensive experience developing stored procedures, functions, views, triggers, and complex SQL queries using SQL Server T-SQL and Oracle PL/SQL.

Experienced with Teradata utilities: FastLoad, MultiLoad, FastExport, BTEQ scripting, and SQL Assistant.

Experienced in resolving ongoing maintenance issues and bug fixes, monitoring Informatica sessions, and performance tuning of mappings and sessions.

Extensive experience in MS SQL Server 2008/2008R2/2012/2014/2016/2017, with solid hands-on experience installing, configuring, managing, monitoring, and troubleshooting SQL Server in development, testing, and production environments.

Extensive experience writing UNIX shell scripts to automate ETL processes and using Netezza utilities to load data and execute SQL scripts from UNIX.

TECHNICAL SKILLS:

Development Tools

Talend 7.1, Talend ESB 6.2, Informatica, Eclipse, PuTTY, FileZilla, SQL Developer, Toad, Teradata SQL Assistant, Quality Center, Harvest, SQL Server Management Studio.

Programming Languages

Core Java, SQL, PL/SQL, C, C++

Web Technologies

HTML, XML, CSS, XSD, JavaScript and JSON.

ETL Tools

DTS, SSIS (SQL Server Integration Services), Informatica

Reporting Packages

SQL Server Reporting Services, MS Excel.

Tools/Methodologies

MS Project, SQL Profiler, Toad, TFS 2010, Agile, Jira, Waterfall.

Operating Platforms

MS Windows 2010/2000/XP/NT, Mac OS, Unix and Linux.

PROFESSIONAL EXPERIENCE:

Barnes & Noble - Monroe Township, NJ Nov 2019 to Present

Role: Sr. ETL Talend Consultant

Responsibilities:

•Worked on Talend 7.1.1

•Analyzed source data to assess data quality using Talend Data Quality.

•Broad design, development, and testing experience with Talend Integration Suite and knowledge of performance tuning of mappings. Developed jobs in Talend Enterprise Edition covering stage, source, intermediate, conversion, and target layers.

•Wrote SQL queries using joins to access data from Oracle and MySQL.

•Solid experience implementing complex business rules by creating reusable transformations and robust mappings using Talend components like tConvertType, tSortRow, tReplace, tAggregateRow, tUnite, etc.

•Developed Talend jobs to populate claims data into the data warehouse star schema.

•Utilized Big Data components like tHDFSInput, tHDFSOutput, tHiveLoad, tHiveInput, tHbaseInput, and tHbaseOutput.

•Developed mappings to load fact and dimension tables, implemented SCD Type 1 and SCD Type 2 dimensions and incremental loading, and unit tested the mappings.
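
One common way such incremental loads are driven is a watermark query; a minimal sketch in generic SQL, assuming a hypothetical etl_watermark control table and orders source (names are illustrative):

  -- Extract only rows changed since the last successful run.
  SELECT o.*
  FROM orders o
  WHERE o.updated_at > (SELECT last_load_ts FROM etl_watermark
                        WHERE table_name = 'orders');

  -- After a successful load, advance the watermark.
  UPDATE etl_watermark
  SET last_load_ts = CURRENT_TIMESTAMP
  WHERE table_name = 'orders';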

•Used tStatsCatcher, tDie, and tLogRow to create generic Joblets that store processing statistics in a database table to record job history.

•Created complex mappings with shared objects/reusable transformations/Mapplets for staging unstructured HL7 files into the Data Vault.

•Used the Data Vault for traditional batch loads and incremental builds.

•Integrated Java code inside Talend Studio using components like tJavaRow, tJava, tJavaFlex, and Routines.

•Experienced in using Talend's debug mode to step through a job and fix errors. Created complex mappings using tHashOutput, tHashInput, tNormalize, tDenormalize, tMap, tUniqRow, tPivotToColumnsDelimited, etc.

•Used tRunJob component to run child job from a parent job and to pass parameters from parent to child job.

•Created Context Variables and Groups to run Talend jobs against different environments.

•Used the tParallelize component and the multi-thread execution option to run subjobs in parallel, which improves job performance.

•Worked extensively with T-SQL for the various transformations needed while loading data into the Data Vault.
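
As one example of the pattern, a minimal T-SQL sketch of a Data Vault hub load, assuming hypothetical hub_patient and stg_hl7_patient tables (the hash-key and column names are illustrative):

  -- Insert only business keys not yet present in the hub.
  INSERT INTO hub_patient (patient_hash_key, patient_id, load_dts, record_src)
  SELECT HASHBYTES('SHA2_256', CAST(s.patient_id AS nvarchar(50))),
         s.patient_id, SYSUTCDATETIME(), 'HL7_FEED'
  FROM stg_hl7_patient s
  WHERE NOT EXISTS (SELECT 1 FROM hub_patient h
                    WHERE h.patient_id = s.patient_id);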

•Implemented a third-party scheduler (Automic Scheduler) to trigger jobs on the TAC server.

•Created Talend jobs to load data into various Oracle tables. Utilized Oracle stored procedures and wrote Java code to capture globalMap variables and use them in the job.

McDonald's - Chicago, IL March 2018 - Oct 2019

Role: Sr. ETL Talend Consultant

Responsibilities:

Worked on Talend Data Integration/Big Data Integration (6.1/5.x) / Talend Data Quality.

Expertise in creating mappings in Talend using tMap, tJoin, tReplicate, tParallelize, tConvertType, tFlowToIterate, tAggregate, tSortRow, tFlowMeter, tLogCatcher, tRowGenerator, tNormalize, tDenormalize, tSetGlobalVar, tHashInput, tHashOutput, tJava, tJavaRow, tAggregateRow, tWarn, tMysqlSCD, tFilter, tGlobalMap, tDie, etc.

Created Talend jobs to copy files from one server to another, utilizing Talend FTP components.

Created and managed source-to-target mapping documents for all fact and dimension tables.

Used ETL methodologies and best practices to create Talend ETL jobs. Followed and enhanced programming and naming standards.

Designed and implemented ETL to load data from heterogeneous sources into SQL Server and Oracle target databases, including fact tables and Slowly Changing Dimensions (SCD Type 1 and SCD Type 2).

Utilized Big Data components like tHDFSInput, tHDFSOutput, tPigLoad, tPigFilterRow, tPigFilterColumn, tPigStoreResult, tHiveLoad, tHiveInput, tHbaseInput, tHbaseOutput, tSqoopImport and tSqoopExport.

Worked extensively on the Talend Administration Center (TAC) and scheduled jobs in Job Conductor.

Developed Talend jobs to populate claims data into the data warehouse star schema.

Troubleshot long-running jobs and fixed the issues.

Extensively used the tSAPBapi component to load and read data from the SAP system.

Created jobs to pass parameters from child job to parent job.

Developed a data warehouse model in Snowflake for over 100 datasets.

Exported jobs to Nexus and SVN repository.

Implemented an update strategy on tables and used tJava and tJavaRow components to pull only newly inserted data from source tables.

Monitored Talend job statistics in the Activity Monitoring Console (AMC) to improve performance and identify the scenarios in which errors occur.

Implemented Java functionality using the tJava and tJavaFlex components.

Developed custom components and multi-threaded flat-file configurations by writing Java code in Talend.

Environment: Talend Data Integration 6.3.1, Snowflake, Talend Administration Center, Hive, HDFS, SAP HANA, SQL Navigator, Toad, Azure SQL Server, WinSCP.

Rockwell Automation, Bloomington, IL Jan 2017 to Feb 2018

Role: ETL Talend Developer

Responsibilities:

Acquired and interpreted business requirements, created technical artifacts, and determined the most efficient and appropriate solution design, thinking from an enterprise-wide view.

Worked in the Data Integration team to perform data and application integration, with the goal of moving more data more effectively and efficiently to support business-critical projects involving large data extractions.

Extensively used the tSAPBapi component to load and read data from the SAP system.

Performed technical analysis, ETL design, development, testing, and deployment of IT solutions as needed by business or IT.

Developed drill-down and drill-through reports from multi-dimensional objects such as star and snowflake schemas using SSRS and PerformancePoint Server.

Participated in designing the overall logical and physical data warehouse/data mart data models and data architectures to support business requirements.

Explored prebuilt ETL metadata, mappings, and DAC metadata, and developed and maintained SQL code as needed for the SQL Server database.

Performed data manipulations using various Talend components like tMap, tJavaRow, tJava, tOracleRow, tOracleInput, tOracleOutput, tMSSQLInput, and many more.

Analyzed source data to assess data quality using Talend Data Quality.

Troubleshot data integration issues and bugs, analyzed reasons for failure, implemented optimal solutions, and revised procedures and documentation as needed.

Worked on migration projects to move data from Oracle/DB2 data warehouses to Netezza.

Used SQL queries and other data analysis methods, as well as the Talend Enterprise Data Quality Platform, to profile and compare data, informing decisions on how to measure business rules and data quality.

Worked on the Talend RTX ETL tool; developed and scheduled jobs in Talend Integration Suite.

Wrote Netezza SQL queries for joins and other table modifications.

Used Talend reusable components like routines, context variables, and globalMap variables.

Responsible for tuning ETL mappings, workflows, and the underlying data model to optimize load and query performance.

Processed daily expenses through SSIS jobs by collecting data from Concur FTP servers.

Implemented fast and efficient data acquisition using Big Data processing techniques and tools.

Monitored and supported the Talend jobs scheduled through the Talend Administration Center (TAC).

Developed Oracle PL/SQL code, DDL, and stored procedures, and worked on performance tuning of SQL.
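
As an illustration of the kind of PL/SQL involved, a minimal stored procedure sketch (procedure and table names are hypothetical):

  -- Oracle PL/SQL: merge staged rows into a target table.
  CREATE OR REPLACE PROCEDURE load_dim_product AS
  BEGIN
    MERGE INTO dim_product d
    USING stg_product s
    ON (d.product_id = s.product_id)
    WHEN MATCHED THEN
      UPDATE SET d.product_name = s.product_name
    WHEN NOT MATCHED THEN
      INSERT (product_id, product_name)
      VALUES (s.product_id, s.product_name);
    COMMIT;
  END load_dim_product;
  /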

Environment: Talend 6.4, Netezza, Oracle 12c, IBM DB2, TOAD, Agility, BusinessObjects 4.1, MLOAD, SQL Server 2012, XML, SQL, Hive, Pig, PL/SQL, HP ALM, JIRA.

Prime East Investments, India June 2014 – Dec 2016

Role: ETL Developer

Responsibilities:

Worked with the offshore team on day-to-day tasks, reviewed the work done by them, and got status updates in daily meetings.

Involved in extraction, transformation, and loading of data from 10 source systems into the data warehouse.

Created matching and merging jobs to cleanse the data coming from source systems.

Implemented code in Talend where survivorship rules were required.

Scheduled Talend jobs using Talend Administration Center.

Worked with different Sources such as Oracle, SQL Server and Flat files.

Developed Informatica mappings, sessions and workflows as per the business rules and loading requirements.

Wrote test case scenarios and unit tested the code developed in Talend and Informatica.

Worked on project documentation, prepared source-to-target mapping specifications with the business logic, and was involved in data modeling.

Worked on migrating data warehouses from existing SQL Server to Oracle database.

Optimized and tuned SQL queries used in the Source Qualifier of certain mappings to eliminate full table scans.
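
A sketch of the classic rewrite behind this kind of tuning, using a hypothetical orders table: applying a function to an indexed column defeats the index, while an equivalent range predicate lets the optimizer use it.

  -- Before: the function on the indexed column forces a full table scan.
  SELECT * FROM orders WHERE TRUNC(order_date) = DATE '2016-01-15';

  -- After: a sargable range predicate can use the index on order_date.
  SELECT * FROM orders
  WHERE order_date >= DATE '2016-01-15'
    AND order_date <  DATE '2016-01-16';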

Used the command-line shell to run Informatica and Talend jobs.

Created workflows using various tasks like Session, Control, Decision, Email, Command, and Assignment, along with Mapplets/Joblets, and worked on scheduling of the workflows.

Migrated the code and release documents from DEV to QA (UAT) and to Production.

Created a data model from the existing database using reverse engineering in ER/Studio.

Environment: Informatica PowerCenter 8.6/9, Oracle 10g, SQL Server, ER/Studio, TOAD, Talend 5.5.1 Enterprise MDM Edition, Windows XP, Unix, SQL Developer.

Citrix Systems, India Jan 2013 - May 2014

Role: SQL Developer

Responsibilities:

Created and managed schema objects such as tables, views, indexes, and referential integrity constraints depending on user requirements.

Actively involved in the complete software development life cycle for the design of a database for a new financial accounting system.

Successfully implemented the physical design of the newly designed database in MS SQL Server 2008/2005.

Used MS SQL Server 2008/2005 to design, implement, and manage data warehouses, OLAP cubes, and reporting solutions to improve asset management, incident management, data center services, system events support, and billing.

Utilized T-SQL daily in creating custom views for data and business analysis.

Utilized Dynamic T-SQL within functions, stored procedures, views, and tables.
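
A minimal sketch of parameterized dynamic T-SQL inside a stored procedure, using sp_executesql (procedure, table, and column names are hypothetical):

  CREATE PROCEDURE dbo.GetAccountActivity
      @TableName sysname,
      @AccountId int
  AS
  BEGIN
      -- Build the statement at run time; QUOTENAME guards the identifier
      -- and sp_executesql binds the value, avoiding SQL injection.
      DECLARE @sql nvarchar(max) =
          N'SELECT * FROM ' + QUOTENAME(@TableName) +
          N' WHERE account_id = @acct;';
      EXEC sp_executesql @sql, N'@acct int', @acct = @AccountId;
  END;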

Used the SQL Server Profiler tool to monitor SQL Server performance, particularly to analyze the performance of stored procedures.

Optimized stored procedures and functions to handle major business-critical calculations.

Implemented data collection and transformation between heterogeneous sources such as flat files, Excel, and SQL Server 2008/2005 using SSIS.

Migrated all DTS packages to SQL Server Integration Services (SSIS) and modified the packages to use the advanced features of SSIS.

Defined Check constraints, rules, indexes and views based on the business requirements.

Extensively used SQL Server Reporting Services and the Report Builder model to generate custom reports.

Designed and deployed reports with drop-down menu options and linked reports.

Created subscriptions to provide reports on a daily basis, and managed and troubleshot report server related issues.

Environment: MS SQL Server 2005/2008, SSIS, SSRS, Microsoft .NET, MS Access 2003, Excel, ERwin.


