Post Job Free

Data Software Engineer

Location:
Coimbatore, Tamil Nadu, India
Posted:
May 02, 2019

Contact this candidate

Resume:

Venkat Devarapalli

+91-957*******

***********@*****.***

Objective

To be part of a challenging work environment where I can apply my technical expertise to the development and implementation of new ideas, contributing to the growth of the organisation.

Professional Summary

4 years of experience in data warehousing, with a strong background in ETL using Talend Data Integration and Talend Big Data.

Extensively used ETL methodology for data extraction, transformation and loading with Talend, and designed data conversions from a wide variety of source systems including flat files and SQL Server.

Capable of analysing and understanding business requirements.

Experienced in creating Technical Specification, Design and Migration documents.

Significant experience with data Extraction, Transformation and Loading (ETL) from disparate data sources such as multiple relational databases, and experience integrating data from flat files (CSV, PSV, TXT) and XML/JSON files into a common reporting and analytical data model.

Good experience in ETL development with the Talend Data Integration and Talend Big Data Integration tools.

Having knowledge on TAC (Talend Administration Centre).

Good knowledge on AWS Glue.

Excellent user communication, planning, analytical, problem-solving and presentation skills.

Self-motivated team player; highly dedicated, adaptable to the demands of projects, with a zeal to learn new technologies and tools.

Experience on AWS cloud services (EC2, S3, RDS, Redshift, IAM).

Experience migrating DataStage jobs to Talend jobs with the same functionality and improved performance.

Experience in scheduling and maintaining ETL jobs.

Professional Experience

Working as a Talend ETL Developer at Wavicle Data Solutions, Coimbatore, from March 2017 to date.

Worked as an Associate Software Engineer at Bahwan Iton Technologies Pvt Ltd, Hyderabad, from June 2015 to March 2017.

Academic Qualification

Bachelor of Engineering from O.U. (2010–2014).

Technical Skills

● Databases: MS SQL Server

● Software Tools: Putty, FileZilla and S3 Browser

● Software/Languages: SQL, Java SE

● Operating System: Windows, Linux

● Project Management Tools: JIRA, Confluence

● Project Methodology: Agile

● Cloud Technology: Amazon Web Services (ECS, S3, EC2)

● Data Warehouse DB: Amazon Redshift, Teradata

● ETL Technologies: Talend DI/Big Data 6.3/6.5, DataStage

● Hadoop Ecosystems: HDFS, Hive, Hue

Project #2: November 2018 to date

Project Title: eCommerce Platform (eCP)

Client: McDonald’s Corporation

Environment: SQL Server, Talend, AWS, HDFS

Role: Talend Developer

Description:

McDonald’s is an American fast food company and the world's largest restaurant chain by revenue, serving over 69 million customers daily in over 100 countries across approximately 36,900 outlets as of 2016. The eCommerce Platform (eCP) Operational Data Store (ODS) is part of the Global Data & Analytics Platform (GDAP) team. The eCP-ODS ETL and database structure is deployed into MDAP in five markets and coincides with the eCP version running in each market. Once deployed, the GDW Support team ensures that data from eCP is received and processed each day. Development tickets are tracked in JIRA under the ODS project.

The General Data Protection Regulation (GDPR) was approved and adopted by the EU. The purpose of the GDPR solution design is to translate business requirements and functional specifications into a logical and physical solution design that extracts GDPR information from eCP and passes it to other GDAP systems.

Roles and Responsibilities:

Worked with the business analysis team to gather requirements, and created functional and ETL specification documents.

Analysed the Business Requirement Documents (BRD) and laid out the steps for data extraction, business logic implementation and loading into targets.

Collected requirements, performed analysis and provided estimates (ETAs) to the business users.

Analysed the source-to-target mapping (STM) from the data analyst team and validated it before building the mapping in Talend.

Involved in the end-to-end development, implementation and production support process, from the dev to the prod environment.

Involved in testing that the ETL process meets the business requirements.

Developed Talend jobs to populate the claims data to data warehouse - star schema.

Worked with different sources such as Relational DB and flat files.

Extracted, transformed and loaded data from source to staging and from staging to target according to the business requirements.

Ingested large files from various sources into S3 buckets and processed the data.

Involved in all phases of the SDLC: requirement gathering, design, development, testing, production, user training, and support for the production environment.

Developed the mappings using needed Transformations in Talend tool according to technical specifications.

Ability to meet deadlines and handle multiple tasks.

Extensively used the Talend Administration Centre (TAC) for scheduling and monitoring.
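The source-to-staging-to-target flow into a star schema described above is normally built in Talend jobs; as a rough illustration only, a minimal, hypothetical Python sketch of the same pattern (all table and column names are invented for this example, not taken from the actual project) using an in-memory SQLite database:

```python
import csv
import io
import sqlite3

# Hypothetical flat-file extract; in the real project the sources were
# SQL Server tables and CSV/PSV/TXT files landing in S3.
SOURCE_CSV = """store_id,sale_date,amount
101,2019-01-05,12.50
102,2019-01-05,8.75
101,2019-01-06,20.00
"""

def load_star_schema(conn: sqlite3.Connection, csv_text: str) -> None:
    cur = conn.cursor()
    # Staging mirrors the flat file; the target is a simple dim/fact pair.
    cur.execute("CREATE TABLE stg_sales (store_id TEXT, sale_date TEXT, amount REAL)")
    cur.execute("CREATE TABLE dim_store (store_key INTEGER PRIMARY KEY, store_id TEXT UNIQUE)")
    cur.execute("CREATE TABLE fact_sales (store_key INTEGER, sale_date TEXT, amount REAL)")

    # Extract: read the flat file into the staging table.
    rows = list(csv.DictReader(io.StringIO(csv_text)))
    cur.executemany("INSERT INTO stg_sales VALUES (:store_id, :sale_date, :amount)", rows)

    # Transform/Load: populate the dimension, then the fact via surrogate keys.
    cur.execute("INSERT OR IGNORE INTO dim_store (store_id) "
                "SELECT DISTINCT store_id FROM stg_sales")
    cur.execute("""
        INSERT INTO fact_sales (store_key, sale_date, amount)
        SELECT d.store_key, s.sale_date, s.amount
        FROM stg_sales s JOIN dim_store d ON d.store_id = s.store_id
    """)
    conn.commit()

conn = sqlite3.connect(":memory:")
load_star_schema(conn, SOURCE_CSV)
total = conn.execute("SELECT SUM(amount) FROM fact_sales").fetchone()[0]
print(total)  # 41.25
```

In the actual pipeline this corresponds to a staging load followed by a lookup on the dimension to resolve surrogate keys before inserting into the fact table.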

Project #1: May 2017 to Oct 2017

Project Title: Semantic Layer

Client: McDonald’s Corporation

Environment: Talend, AWS, HDFS, Redshift

Role: Talend Developer

Description:

The Semantic Layer platform makes the unstructured NP6/POS XML data coming from McDonald's restaurants available as structured CSV and JSON data in the Data Lake, ready to be consumed by data analysis applications and by downstream systems for further ETL operations.

Roles & Responsibilities:

Interacted with business analysts to understand the database requirements whenever debugging was required, without changing the code.

Extensively identified and created reusable components and jobs using TIS.

Designed ETL processes in Talend to extract data from source-system XML files, apply data transformations, and load into the target database tables.

Worked with different sources such as XML files and S3, and with Redshift as the target.

Applied multiple lookups using tMap and database components.

Implemented error handling to provide detailed error messages. Experienced in processing data files through Talend Data Integration.

Involved in production deployment support and knowledge transfer (KT).

Worked on unit testing and performance tuning of the Jobs.
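The core of the Semantic Layer work, flattening nested POS XML into structured records ready for CSV/JSON output, can be sketched as follows. This is a hypothetical example: the tag and attribute names are invented for illustration and are not the real NP6 schema.

```python
import json
import xml.etree.ElementTree as ET

# Hypothetical NP6/POS-style transaction XML (illustrative names only).
POS_XML = """
<Transactions store="101">
  <Transaction id="t1"><Item code="BIGMAC" qty="2" price="3.99"/></Transaction>
  <Transaction id="t2"><Item code="FRIES" qty="1" price="1.49"/></Transaction>
</Transactions>
"""

def xml_to_records(xml_text: str) -> list:
    """Flatten nested transaction XML into flat records ready for CSV/JSON."""
    root = ET.fromstring(xml_text)
    records = []
    for txn in root.findall("Transaction"):
        for item in txn.findall("Item"):
            # One flat record per line item, carrying the parent keys down.
            records.append({
                "store": root.get("store"),
                "txn_id": txn.get("id"),
                "item": item.get("code"),
                "qty": int(item.get("qty")),
                "price": float(item.get("price")),
            })
    return records

records = xml_to_records(POS_XML)
print(json.dumps(records[0]))
```

Each flat record can then be written out with `csv.DictWriter` or serialized as JSON lines for the Data Lake; the same flattening step in the project was implemented with Talend XML components.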

-Venkat


