
Dhivya M.

Mobile: +**867-***-****

E-mail: adcsj5@r.postjobfree.com

Address: No. 5, MGR Nagar 2nd Street, Rathnapuri, Coimbatore - 641027

SUMMARY

Aspiring to a challenging career in software development where I can apply my skills and experience to deliver the best results.

Over 4 years of extensive hands-on experience in developing and deploying N-tier applications and Big Data solutions.

Cohesive team worker with strong analytical, problem-solving and interpersonal skills.

WORKING EXPERIENCE

Working as a Data Engineer at Hypersonix, Coimbatore, from March 2019 to date.

Worked as a Talend Developer at Spigot Solutions (Clarivate Analytics), Bangalore, from January 2019 to February 2019.

Worked as a Data Engineer at Onedata Software Solutions, Coimbatore, from October 2017 to December 2018.

Worked as a Programmer Analyst at Xortican Technologies, Coimbatore, from June 2015 to September 2017.

EDUCATION DETAILS

B.E. in Computer Science and Engineering from CMS College of Engineering and Technology, Coimbatore, affiliated to Anna University (2010 – 2014), with an aggregate CGPA of 7.79.

PROFESSIONAL SKILLS

Knowledge of the skill sets listed below, ready to be applied in challenging situations in an innovative and effective manner:

Java, JSP, Servlets, Scala (basics), HTML, CSS, JavaScript

Struts 2, Spring 3.0, Hibernate 3.0, Apache Tomcat 8.0, MantisHub

Talend Open Studio

Apache NiFi

Amazon S3

Amazon Redshift, MySQL

PROJECT PROFILE

Roles and Responsibilities in Hypersonix

Understanding the client's requirements.

Working on enhancements to meet the client's expectations.

Designing and developing ETL jobs as per the requirements.

Involved in developing ETL jobs using Talend Open Studio.

Performing cleansing operations on raw data.

Involved in writing SQL queries to generate consolidated tables (a minimal sketch follows this list).

Storing transformed data in Redshift databases.

Automating Talend jobs using Apache NiFi.

Participating in day-to-day meetings and status meetings, with consistent reporting and effective communication with the project manager and developers.

Creating dashboards in the UI to validate data.
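The sketch below is a rough illustration of how one such consolidation query could be run against Redshift from Python. It is a minimal, hypothetical example, not the actual job: the cluster endpoint, credentials, and the staging.stg_orders / analytics.sales_daily tables and their columns are placeholders.

```python
# Minimal sketch (not the production Talend job): refresh a consolidated
# table in Redshift from an already-cleansed staging table.
# Endpoint, credentials, and table/column names are hypothetical placeholders.
import psycopg2

CREATE_SQL = """
CREATE TABLE analytics.sales_daily AS
SELECT store_id,
       order_date,
       SUM(order_total) AS revenue,
       COUNT(order_id)  AS order_count
FROM   staging.stg_orders
GROUP  BY store_id, order_date;
"""

def refresh_sales_daily() -> None:
    # Redshift speaks the PostgreSQL wire protocol, so psycopg2 can run the SQL.
    conn = psycopg2.connect(
        host="example-cluster.redshift.amazonaws.com",  # placeholder endpoint
        port=5439, dbname="analytics", user="etl_user", password="***",
    )
    try:
        with conn, conn.cursor() as cur:
            cur.execute("DROP TABLE IF EXISTS analytics.sales_daily;")
            cur.execute(CREATE_SQL)
    finally:
        conn.close()

if __name__ == "__main__":
    refresh_sales_daily()
```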

Product Description

A well-developed business analytics tool that provides clear insights into business growth. Analytics is based on Elasticsearch; based on the search question, data is displayed dynamically from the database.

Data is loaded into Redshift in an OLAP structure, and a combination of MySQL and Redshift delivers the data to the client. Customized menus, dashboards and documents can be created in the UI.
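As a rough illustration of the search-driven lookup described above, the sketch below maps one dashboard question ("revenue by store over the last week") to an Elasticsearch aggregation. The endpoint, the sales_daily index, and the field names are hypothetical placeholders, and the question-to-query mapping is deliberately simplistic.

```python
# Minimal sketch: answer a dashboard question with an Elasticsearch aggregation.
# Index and field names are hypothetical placeholders.
from elasticsearch import Elasticsearch

es = Elasticsearch("http://localhost:9200")  # placeholder endpoint

def revenue_by_store(last_n_days: int = 7) -> dict:
    # Aggregate revenue per store over a recent date window; the UI would
    # render the returned buckets as a dashboard widget.
    resp = es.search(
        index="sales_daily",
        size=0,
        query={"range": {"order_date": {"gte": f"now-{last_n_days}d/d"}}},
        aggs={
            "by_store": {
                "terms": {"field": "store_id"},
                "aggs": {"revenue": {"sum": {"field": "order_total"}}},
            }
        },
    )
    return resp["aggregations"]
```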

Projects Handled

Data Engineer (Current Projects):

1. Corner Bakery Cafe

2. Brix Holdings

3. Zpizza

4. KFC

5. Winedirect

6. Panda

7. VCA

8. T2 Hospitality

9. Bashas

10. Smart and Final

Description: A solution built to perform ETL operations that extracts data from files, APIs and SFTP locations and analyzes it. Data is cleansed and stored in Amazon Redshift, configuration data is maintained in a MySQL database, and dashboards are created in the UI.
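A minimal sketch of one such file-based feed, assuming it lands in S3 as CSV: pull the file, cleanse it, stage it back to S3, then COPY it into Redshift. The bucket, table, column, and IAM role names are hypothetical placeholders rather than the actual project configuration.

```python
# Minimal sketch: S3 CSV feed -> basic cleansing -> Redshift COPY.
# Bucket, table, column, and IAM role names are hypothetical placeholders.
import boto3
import pandas as pd
import psycopg2

def load_daily_feed(bucket: str, key: str) -> None:
    s3 = boto3.client("s3")

    # Pull the raw file and apply basic cleansing (drop duplicates, fill nulls).
    raw = pd.read_csv(s3.get_object(Bucket=bucket, Key=key)["Body"])
    clean = raw.drop_duplicates().fillna({"store_id": "UNKNOWN"})

    # Stage the cleansed file back to S3 so Redshift COPY can read it directly.
    staged_key = f"staged/{key}"
    s3.put_object(Bucket=bucket, Key=staged_key,
                  Body=clean.to_csv(index=False).encode("utf-8"))

    # COPY from S3 into a Redshift staging table.
    conn = psycopg2.connect(host="example-cluster.redshift.amazonaws.com",
                            port=5439, dbname="analytics",
                            user="etl_user", password="***")
    with conn, conn.cursor() as cur:
        cur.execute(f"""
            COPY staging.stg_orders
            FROM 's3://{bucket}/{staged_key}'
            IAM_ROLE 'arn:aws:iam::123456789012:role/redshift-copy-role'
            FORMAT AS CSV IGNOREHEADER 1;
        """)
    conn.close()
```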

Talend Developer at Clarivate Analytics

Project #1: Elevate (Support project)

Description: Analyzing data based on requirements.

Data Engineer at Onedata Software

Project #5: Droot (September 2018 – December 2018)

Description: A solution built to perform ETL operations that extracts data from files, analyzes it and stores it in Redshift.

Project #4: Zenabi Data [Zenabi Data, US] (May 2018 – September 2018)

Description: A solution built to perform ETL operations that extracts data from Redshift and stores it back in Redshift in a star schema format.

Project #3: Urban One [Urban One, Inc., US] (February 2018 – May 2018)

Description: A solution built to perform ETL operations that extracts data from Google Analytics, analyzes it and stores it in Redshift.

Project #2: PAPAYA [PAPAYA GLOBAL] (December 2017 – January 2018)

Description: A solution built to perform ETL operations that extracts data from S3, analyzes it and stores it in Redshift.

Project #1: CUBE [ASPIRE GLOBAL] (October 2017 – December 2018)

Description: A solution built to perform ETL operations that extracts data from multiple data warehouses, de-duplicates it and stores it in Redshift.
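A minimal sketch of the de-duplication step, assuming the extracts from the different warehouses are merged into one staging table with a natural key. The table and column names are hypothetical, and the query would be executed over a Redshift connection like the one shown in the earlier sketches.

```python
# Keep only the most recent version of each record when merging extracts
# from multiple warehouses. Table and column names are hypothetical.
DEDUPE_SQL = """
CREATE TABLE analytics.cube_records AS
SELECT record_id, source_system, payload, loaded_at
FROM (
    SELECT *,
           ROW_NUMBER() OVER (
               PARTITION BY record_id       -- one row per natural key
               ORDER BY loaded_at DESC      -- keep the latest version
           ) AS rn
    FROM staging.stg_merged
) ranked
WHERE rn = 1;
"""
```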

DECLARATION

I hereby declare that the above-mentioned information is correct to the best of my knowledge, and I bear responsibility for the correctness of these particulars.

Date:
Place: Coimbatore

Sincerely,
(M. Dhivya)


