Data Developer

Location: Frisco, TX
Salary: $55-$60/hr on C2C
Posted: August 21, 2018

Vamshi K

Sr. Talend Developer

Professional Summary

Vision-driven change agent with a career-long record of successful data integration, data warehouse development, and Talend development for leading organizations.

Proven talent for aligning business strategy and objectives with established data integration and Talend development paradigms to achieve maximum operational impact with minimum resource expenditure. Growth-focused thought leader with expertise in Talend Data Integration and Big Data Integration, data profiling, data migration, extraction, transformation, and the design of data conversions from a wide variety of source systems, as well as business requirements analysis, technology solutions, cross-functional team leadership, performance assessment, and client relationship management. Exceptional IT professional with keen interpersonal, communication, and organizational skills.

Professional Highlights

8+ years of experience in the full life cycle of software project development, including design and application development of Enterprise Data Warehouses on large-scale development efforts, using best practices with Talend and Informatica PowerCenter.

5+ years of experience using Talend Data Integration/Big Data Integration (6.x/5.x).

Extensive knowledge of various business domains, including the Health Care, Restaurant, Mortgage, Financial, and Retail sectors.

Extensive experience working in Agile teams.

Extensive experience in ETL methodology for performing data profiling, data migration, extraction, transformation, and loading using Talend; designed data conversions from a wide variety of source systems including Netezza, Oracle, DB2, SQL Server, Hive, and SAP HANA, as well as non-relational sources such as flat files, XML, and mainframe files.

Expertise in Talend integration components such as tMap, tJoin, tReplicate, tParallelize, tConvertType, tFlowToIterate, tAggregateRow, tSortRow, tFlowMeter, tLogCatcher, tRowGenerator, tNormalize, tDenormalize, tSetGlobalVar, tHashInput, tHashOutput, tJava, tJavaRow, tWarn, tMysqlSCD, tFilter, tGlobalMap, tDie, etc.

Well versed with Talend Big data components like tHDFSInput, tHDFSOutput, tPigLoad, tPigFilterRow, tPigFilterColumn, tPigStoreResult, tHiveLoad, tHiveInput, tHbaseInput, tHbaseOutput, tSqoopImport and tSqoopExport.

Expertise in Tidal job scheduling, creation, and migration.

Expertise in SAP and Salesforce CRM marketing systems.

Expertise in processing data from HDFS, HBase using Hive, Sqoop and Pig components.

Used Spark and MapReduce frameworks to configure big data batch jobs.

Strong understanding of NoSQL databases like HBase, MongoDB.

Expertise in using context variables, Routines and metadata.

Experience working with various Hadoop distributions such as Cloudera, Hortonworks, and MapR.

Good Knowledge in Spark and MapReduce frameworks.

Good Knowledge on Big data Hadoop architecture.

Expertise in data modeling techniques such as dimensional modeling (Star Schema and Snowflake) and Slowly Changing Dimensions (SCD Types 1, 2, and 3).
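
For illustration, a minimal SCD Type 2 sketch in generic SQL, expiring the current dimension row and inserting a new version when a tracked attribute changes (the table and column names are hypothetical, not taken from any project listed here):

    -- Expire the current row for customers whose tracked attribute changed
    UPDATE dim_customer d
       SET effective_end_date = CURRENT_DATE,
           current_flag       = 'N'
     WHERE d.current_flag = 'Y'
       AND EXISTS (SELECT 1
                     FROM stg_customer s
                    WHERE s.customer_id = d.customer_id
                      AND s.address <> d.address);

    -- Insert the new version of each changed row (brand-new customers would be a separate insert)
    INSERT INTO dim_customer (customer_id, address, effective_start_date,
                              effective_end_date, current_flag)
    SELECT s.customer_id, s.address, CURRENT_DATE, DATE '9999-12-31', 'Y'
      FROM stg_customer s
      JOIN dim_customer d
        ON d.customer_id = s.customer_id
       AND d.effective_end_date = CURRENT_DATE
       AND d.current_flag = 'N';

SCD Type 1 simply overwrites the attribute in place; Type 3 keeps a previous-value column alongside the current one.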

Excellent working experience in Waterfall, Agile methodologies.

Proficient in performance analysis, monitoring, and SQL query tuning using EXPLAIN PLAN, statistics collection, hints, and SQL Trace in Oracle.
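
As a simple illustration of that Oracle tuning workflow (the table, index, hint, and schema names below are hypothetical):

    -- Capture and display the execution plan for a candidate query
    EXPLAIN PLAN FOR
    SELECT /*+ INDEX(o idx_orders_cust) */ o.order_id, o.order_total
      FROM orders o
     WHERE o.customer_id = :cust_id;

    SELECT * FROM TABLE(DBMS_XPLAN.DISPLAY);

    -- Refresh optimizer statistics so plans reflect current data volumes (SQL*Plus style)
    EXEC DBMS_STATS.GATHER_TABLE_STATS(ownname => 'ETL_STG', tabname => 'ORDERS');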

Experience in the development and design of ETL (Extract, Transform, and Load) methodology for supporting data transformations and processing in a corporate-wide ETL solution using Informatica PowerCenter.

Created mappings using Lookup, Aggregator, Joiner, Expression, Filter, Router, Update Strategy, and Normalizer transformations. Developed reusable Transformations and Mapplets.

Strong experience with shell scripting and a solid understanding of business intelligence and data warehousing approaches.

Good knowledge of UNIX shell scripting and basic Perl scripting, with experience on Linux and AIX platforms.

Expertise in deploying code from DEV to QA, UAT, and PROD using both the Deployment Group and Import/Export methods.

Experience in gathering and writing detailed business requirements and translating them into technical specifications and designs.

Expertise in understanding and supporting the client with requirements definition, analysis, design, testing, system documentation and user training.

Extensive experience in gathering requirements and documenting the same as per the industry best practices.

Designed the data conversion strategy, development of data mappings, source data profiling, and the design of Extraction, Transformation, and Load (ETL) routines for migrating data from relational and non-relational sources to target relational databases.

Strong analytical, logical, communication, and problem-solving skills, with the ability to quickly adapt to new technologies through self-learning.

Technical Skills

ETL Tools: Talend Data Integration/Big Data Integration/Data Quality 6.1/5.5/5.0, Talend Administrator Console, Informatica PowerCenter 9.0/8.6, SAP DI

Requirements Management: Microsoft Office (Word, PowerPoint, Excel)

RDBMS and SQL Tools: Oracle 11g/10g, DB2 9.x/8.x, Netezza, SQL Server, Hive, HBase, SAP HANA, MongoDB, Amazon Redshift

Programming Languages: Core Java, shell scripting

Code Management Tools: SVN, GIT, SharePoint, Quality Center

Methodologies: Kimball/Inmon, OLAP, Waterfall, Agile

Data Modeling Tools: Erwin, ER Studio, MS Visio

Operating Systems: Windows, Unix, Linux

Other Tools: Tidal, ServiceNow

Professional Experience

Tractor Supply Company, Brentwood TN March 2017 – Present

Sr. ETL/ Talend Developer

Integrated customer master data from various sources into the SAP CRM system and sent data to the SAP front end for customer support. Developed Talend jobs to send out campaigns, coupons, and rewards to customers based on purchase history, and sent data to the EDW to build reports on sales data and customers' purchase patterns.

Responsibilities:-

Participated in all phases of development life-cycle with extensive involvement in the definition and design meetings, functional and technical walkthroughs.

Processing delimited files and loading staging tables using Talend bulk load components.

Used temp tables and persistent tables for incremental loads via merge statements.
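
A minimal sketch of that temp-table-plus-merge pattern in Oracle-style SQL (table and column names are hypothetical):

    -- Upsert the day's delta from a temp/staging table into the persistent table
    MERGE INTO cust_master tgt
    USING stg_cust_delta src
       ON (tgt.customer_id = src.customer_id)
    WHEN MATCHED THEN
      UPDATE SET tgt.email        = src.email,
                 tgt.phone        = src.phone,
                 tgt.last_updated = CURRENT_TIMESTAMP
    WHEN NOT MATCHED THEN
      INSERT (customer_id, email, phone, last_updated)
      VALUES (src.customer_id, src.email, src.phone, CURRENT_TIMESTAMP);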

Consumed SAP IDocs and loaded them into staging tables.

Developed jobs using best practices for error logging and exception handling.

Posting data from Netezza to SAP CRM using SAP BAPI functions.

Rewrote SAP DI jobs as Talend jobs.

Followed and enhanced programming and naming standards.

Created local and global context variables in jobs for reusability and for execution across different environments.

Developed mappings to load Fact and Dimension tables, SCD Type 1 and SCD Type 2 dimensions, and incremental loads, and unit tested the mappings.

Used tStatsCatcher, tDie, and tLogRow to create a generic joblet that stores processing stats in a database table to record job history.

Frequently used the Talend Administrative Console for scheduling and running jobs and debugging logs.

Integrated java code inside Talend studio by using components like tJavaRow, tJava, tJavaFlex and Routines.

Experienced in using Talend's debug mode to debug jobs and fix errors.

Created complex mappings using tHashOutput, tHashInput, tNormalize, tDenormalize, tMap, tUniqueRow, tPivotToColumnsDelimited, etc.

Used tRunJob component to run child job from a parent job and to pass parameters from parent to child job.

Used the tParallelize component and the multi-thread execution option to run subjobs in parallel, which increases job performance.

Implemented FTP operations using Talend Studio to transfer files between network folders as well as to FTP servers, using components such as tFileCopy, tFileArchive, tFileDelete, tCreateTemporaryFile, tFTPDelete, tFTPCopy, tFTPRename, tFTPPut, tFTPGet, etc.

Experienced in building Talend jobs outside of Talend Studio as well as on the TAC server.

Experienced in writing expressions within tMap as per business needs.

Handled insert and update strategy using tMap.

Used ETL methodologies and best practices to create Talend ETL jobs.

Extracted data from flat files and databases, applied business logic, and loaded it into the staging database as well as flat files.

Environment: Talend Data Integration 6.3/5.6, Talend Administrator Console, Oracle 11g, Netezza, SAP HANA, SAP CRM, JIRA, ServiceNow, SVN, XML files, Putty.

Taco Bell, Irvine CA April 2016 – February 2017

Sr. ETL/ Talend Developer

The project was to integrate data and build analytics for Taco Bell chain restaurants in the US: integrating and transforming sales, customer, and vendor data from various sources and loading it into Hive and HBase using Talend jobs, based on business functionality, to feed every-morning dashboards that improve the customer experience and simplify processes.

Responsibilities:-

Cooperated with internal management and team members to understand and clarify requirements / needs.

Compiled, analyzed, and documented business requirements.

Utilized Big Data components like tHDFSInput, tHDFSOutput, tPigLoad, tPigFilterRow, tPigFilterColumn, tPigStoreResult, tHiveLoad, tHiveInput, tHbaseInput, tHbaseOutput, tSqoopImport and tSqoopExport.

Used ETL methodologies and best practices to create Talend ETL jobs.

Stored metadata files and file patterns in HBase.

Developed Hive queries based on the requirement.

Working on ETL data flow development using Hive/Pig scripts and loading from Hive views.

Worked with Sqoop import and export to move data in and out of HDFS.

Created and deployed physical objects including custom tables, custom views, stored procedures, and Indexes to SQL Server for Staging and Data-Mart environment.

Storing and processing tables from HDFS using Hive Components.

Connecting to the Hadoop cluster to process data on HDFS using Big Data batch jobs.

Converted standard Talend jobs to Big Data batch jobs.

Ingested the data from various heterogeneous sources into HDFS.

Generating and monitoring raw logs and enriched logs using the batch jobs and reports.

Creating Hive and Pig metadata manually.

Created tables, partitions and views on Hive and loaded data from various sources.
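
For example, a partitioned Hive table, partition registration, and reporting view of the kind described above might look like this (all names are hypothetical):

    -- External, partitioned Hive table over data landed in HDFS
    CREATE EXTERNAL TABLE IF NOT EXISTS sales_txn (
        store_id  INT,
        ticket_id STRING,
        item_id   STRING,
        amount    DECIMAL(10,2)
    )
    PARTITIONED BY (business_date STRING)
    STORED AS ORC
    LOCATION '/data/warehouse/sales_txn';

    -- Register a day's partition and expose an aggregated reporting view
    ALTER TABLE sales_txn ADD IF NOT EXISTS PARTITION (business_date = '2016-10-01');

    CREATE VIEW IF NOT EXISTS v_daily_sales AS
    SELECT store_id, business_date, SUM(amount) AS total_sales
      FROM sales_txn
     GROUP BY store_id, business_date;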

Created Talend jobs to copy the files from one server to another and utilized Talend FTP components.

Commonly used and well versed with components such as tMap, tDie, tConvertType, tFlowMeter, tLogCatcher, tRowGenerator, tSetGlobalVar, tHashInput, tHashOutput, and tFlowToIterate.

Created many complex ETL jobs for data exchange from and to Database Server and various other systems including RDBMS, XML, CSV, and Flat file structures.

Worked on Talend Administration Console (TAC) for scheduling jobs and adding users.

Developed stored procedures to automate the testing process, easing QA efforts and reducing test timelines for data comparison between tables.
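
A minimal sketch of the kind of comparison such a procedure might automate, written as an Oracle PL/SQL procedure with hypothetical names (the actual procedures were project-specific):

    CREATE OR REPLACE PROCEDURE compare_row_counts (
        p_src_table IN VARCHAR2,
        p_tgt_table IN VARCHAR2
    ) AS
        v_src_count NUMBER;
        v_tgt_count NUMBER;
    BEGIN
        -- Count rows in the source and target tables and report any mismatch
        EXECUTE IMMEDIATE 'SELECT COUNT(*) FROM ' || p_src_table INTO v_src_count;
        EXECUTE IMMEDIATE 'SELECT COUNT(*) FROM ' || p_tgt_table INTO v_tgt_count;
        IF v_src_count <> v_tgt_count THEN
            DBMS_OUTPUT.PUT_LINE('MISMATCH: ' || p_src_table || '=' || v_src_count ||
                                 ', ' || p_tgt_table || '=' || v_tgt_count);
        ELSE
            DBMS_OUTPUT.PUT_LINE('MATCH: ' || v_src_count || ' rows');
        END IF;
    END compare_row_counts;
    /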

Automated SFTP process by exchanging SSH keys between UNIX servers.

Worked extensively on the Talend Admin Console and scheduled jobs in Job Conductor.

Involved in production and deployment activities; created the deployment guide for migrating code to production and prepared production run books.

Environment: Talend Data Integration 6.1/5.5.1, Talend Big Data Platform 6.0.1/5.5, Talend Administrator Console, Oracle 11g, Hive, HDFS, Sqoop, SQL Navigator, MongoDB 3.0.12, XML files, Flat files, HL7 files, JSON 2.4.1, Toad, Control M, Putty, WinSCP.

EllieMae, Pleasanton, CA July 2015 to March 2016

Sr. ETL/ Talend Developer

EllieMae is a leading provider of innovative on-demand software solutions and services for the residential mortgage industry. EllieMae's all-in-one Encompass mortgage management solution provides one system of record that allows banks, credit unions and mortgage lenders to originate and fund mortgages and improve compliance, loan quality and efficiency. The project was to build a Business Intelligence (BI) system that consolidates disaggregated data, defines the relationships between the disaggregated data elements, standardizes and aligns key strategic measures/metrics and improves the execution of analyses and report building.

Responsibilities:-

Analyzed source data to assess data quality using Talend Data Quality.

Broad design, development and testing experience with Talend Integration Suite and knowledge in Performance Tuning of mappings.

Developed jobs in Talend Enterprise edition from stage to source, intermediate, conversion and target.

Involved in writing SQL queries and used joins to access data from Oracle and MySQL.

Developed Talend jobs to load data into Redshift via the COPY command from Avro data on S3.
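
For illustration, a Redshift COPY of Avro data from S3 typically takes a form like this (the bucket, table, and IAM role names are hypothetical):

    -- Bulk-load Avro files from S3 into a Redshift staging table
    COPY stg_loan_events
    FROM 's3://example-bucket/emr-output/loan_events/'
    IAM_ROLE 'arn:aws:iam::123456789012:role/redshift-copy-role'
    FORMAT AS AVRO 'auto';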

Developed Talend jobs to load data from Oracle into Amazon EMR with an S3 data storage layer, utilizing the Avro data format.

Transformed data using Redshift SQL and loaded stage data into fact and dimension structures.

Used tStatsCatcher, tDie, and tLogRow to create a generic joblet to store processing stats.

Participated in JAD sessions with business users and SMEs for a better understanding of the requirements.

Designed and developed end-to-end ETL processes from various source systems to the staging area, and from staging to data marts.

Solid experience in implementing complex business rules by creating re-usable transformations and robust mappings using Talend transformations like tConvertType, tSortRow, tReplace, tAggregateRow, tUnite etc.

Developed Talend jobs to populate the claims data to data warehouse - star schema.

Created cluster metadata both manually and automatically.

Read/write data and files on HDFS using Sqoop components.

Managing messages on Kafka topics using Talend Jobs.

Utilized Big Data components such as tSqoopExport, tSqoopImport, tHDFSInput, tHDFSOutput, tHiveLoad, tHiveInput, tPigLoad, tPigFilterRow, tPigFilterColumn, tPigStoreResult, tHbaseInput, and tHbaseOutput, executing the jobs in debug mode and utilizing the tLogRow component to view sample output.

Scheduled Big Data job execution from the Talend Administration Center (TAC) and enriched logs.

Environment: Talend 5.5, Talend Big Data 5.3, Oracle 11g, MySQL, Amazon Redshift, HDFS, Sqoop, Hive, MS SQL Server 2012/2008, PL/SQL, Kafka, Agile Methodology, Cloudera, TOAD, Erwin, AIX, Shell Scripts, AutoSys, SVN.

Tenet Health Care, Plano TX May 2014 to June 2015

Talend Developer

Project: Meaningful Use

Tenet Healthcare provides transformational medical technologies and services to meet the demand for increased access, enhanced quality, and more affordable healthcare around the world. The project was to integrate data and build analytics for Tenet's Centricity Practice Solutions and Electronic Medical Records software teams.

Worked closely with Business Analysts to review the business specifications of the project and also to gather the ETL requirements.

Developed jobs, components, and joblets in Talend.

Created complex mappings in Talend using tHash, tDenormalize, tMap, tUniqueRow, and tPivotToColumnsDelimited, as well as custom components such as tUnpivotRow.

Created Talend Mappings to populate the data into dimensions and fact tables.

Frequently used Talend Administrative Console for running jobs, debugging logs, removing locks.

Implemented new users, projects, and tasks within multiple environments.

Developed complex Talend ETL jobs to migrate the data from flat files to database.

Implemented custom error handling in Talend jobs and also worked on different methods of logging.

Created ETL/Talend jobs both design and code to process data to target databases.

Created Talend jobs to load data into various Oracle tables.

Utilized Oracle stored procedures and wrote Java code to capture global map variables and use them in jobs.

Successfully loaded data from various source systems such as Oracle Database, DB2, flat files, and XML files into staging tables and then into the target database.

Troubleshot long-running jobs and fixed the issues.

Prepared ETL mapping Documents for every mapping and Data Migration document for smooth transfer of project from development to testing environment and then to production environment.

Performed Unit testing and System testing to validate data loads in the target.

Environment: Talend Open Studio 5.3, UNIX, Oracle, TOAD, MS SQL Server 2012/2008, PL/SQL, DB2, Flat files, XML, Talend Integration Suite, Talend Administrative Console, AutoSys.

Tenet Health Care, Plano TX October 2013 to April 2014

Talend Developer

Project: Emergency Department/Operational Reports (EDOR)

The solution leverages business intelligence to enable the business to make smarter decisions on inpatient bed availability and bed-to-triage times, improve the inpatient admission process, better understand support-related needs, and increase physician availability for bed visits and surgeries.

Presented the IT solution implementation approach as dictated by the business.

Documented requirements, requirements classification, and methodology.

Presented the risks, dependencies, and outstanding items that required attention from project stakeholders.

Responsible for designing, developing and unit testing of the mappings.

Developed mappings using Informatica Power Center Designer to load data from various source systems to target database as per the business rules.

Used various transformations like Source Qualifier, Aggregators, Connected & unconnected lookups, Filters, Sequence generators, Routers, Update Strategy, Expression, etc.

Responsible for monitoring all the sessions that are running, scheduled, completed and failed. Debugged the mapping of the failed session.

Mapplets and reusable transformations were used to prevent redundant transformation usage and improve maintainability.

Performed the unit, system and integration testing for the jobs.

Validated the test results by executing the queries using the Toad software.

Prepared test plans for both unit and system tests.

Developed re-usable transformations, re-usable mapplets.

Wrote design documentation for the ETL process and Informatica mappings.

Unit tested the mappings by running SQL queries and comparing the data in source and target databases.

Worked with source teams to resolve data quality issues raised by end users.

Created TIDAL jobs and schedules for on-demand, run-on-time, run-once, and ad hoc execution.

Involved in writing pre-session and post-session Korn shell scripts for dropping and creating indexes on tables, and created shell scripts to substitute user and schema information in SQL queries.
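
For instance, the index maintenance those pre-/post-session scripts wrapped is plain Oracle SQL of this form (index and table names are hypothetical):

    -- Pre-load: drop the index so bulk inserts run faster
    DROP INDEX idx_loan_fact_acct;

    -- Post-load: recreate the index for query performance
    CREATE INDEX idx_loan_fact_acct ON loan_fact (account_id);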

Environment: Informatica Power Center 8.6, Oracle 10g, SQL Server 2008, Business Objects XI R2, TIDAL

Dell, Bangalore, India October 2011 – September 2013

ETL Developer/ Software Analyst SR.ASC

Project: Cerner

Cerner is an innovative leader in the health and well-being industry, serving more than 70 million Americans through its family of companies. I was responsible for creating the new design for the Salary Planning Tool (CASPER), which handled salary and budget assignment, and was also involved in the Line of Credit project, the Tenet Timekeeping project, CASPER Phase II, and the Voluntary Benefits Program.

Assisted in requirement analysis and planning of delivering projects using Waterfall model.

Aided developers and testing team with business cases and scenarios.

Worked on Informatica - Repository Manager, Designer, Workflow Manager & Workflow Monitor.

Integrated data into the CDW by sourcing it from different sources such as SQL, flat files, and mainframes (DB2) using PowerExchange.

Extensively worked on integrating data from mainframes using Informatica PowerExchange.

Extensively worked on Informatica tools such as Source Analyzer, Data Warehouse Designer, Transformation Designer, Mapplet Designer, and Mapping Designer to design, develop, and test complex mappings and mapplets that load data from external flat files and RDBMS sources.

Worked with output XML files to remove empty delta files and to FTP the output XML files to a different server.

Worked with the Business Analyst team during the functional design and technical design phases. Designed the mappings between sources (external files and databases) to operational staging targets.

Extensively used various transformations such as Source Qualifier, Joiner, Aggregator, Connected and Unconnected Lookup, Filter, Router, Expression, Rank, Union, Normalizer, XML transformations, Update Strategy, and Sequence Generator.

Used the XML transformation to load data from XML files.

Worked on Informatica Schedulers to schedule the workflows.

Extensively worked with target XSDs in order to generate the output XML files.

Created mappings to read parameterized data from tables to create parameter files.

Good experience in coordinating with offshore teams.

Environment: Informatica Power Center 8.6.1, Power Exchange 8.6.1, Windows, IBM DB2 8.x, Mainframes, SQL Server 2008, ERwin.

Dell, Bangalore, India April 2010 – September 2011

Software Analyst

Project: Experian

Experian Bank offers regional consumer and business banking, wealth management services, and global payments services. The project was to develop a data warehouse based on four data marts for Accounts, Loans, Credit Cards, and Insurance. The loan data mart involved the disbursal of loan amounts for various purposes, such as Personal, Educational, Vehicle, Housing, and Consumer Durable loans.

Responsibilities:-

Conducted performance tuning of complex SQL queries and stored procedures by using SQL Profiler and index tuning wizard.

Analyzed reports and fixed bugs in stored procedures using SSRS.

Used complex expressions to group data, filter and parameterize reports.

Performed various calculations using complex expressions in the reports and created report models.

Generated complex SSRS reports using cascading parameters, snapshot reports, drill-down reports, drill-through reports, parameterized reports, report models, and ad hoc reports based on the Business Requirement Document.

Provided production support to analyze and fix problems and errors on a daily basis by modifying SSIS packages and stored procedures as necessary.

Designed and developed tables, stored procedures, triggers, and SQL scripts using T-SQL, Perl, and shell scripting for enhancements and maintenance of various database modules.
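
As a small illustration of the kind of T-SQL objects involved (table, trigger, and column names are hypothetical):

    -- Audit trigger recording updates made to a customer table
    CREATE TRIGGER trg_customer_audit
    ON dbo.customer
    AFTER UPDATE
    AS
    BEGIN
        SET NOCOUNT ON;
        INSERT INTO dbo.customer_audit (customer_id, changed_on)
        SELECT i.customer_id, GETDATE()
        FROM inserted AS i;
    END;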

Environment: MS SQL Server 2005/2008, SSRS, SSIS, SSAS, T-SQL, Erwin, SQL Explorer.


