
Data Warehouse

Location:
Quan Ba Dinh, 118700, Vietnam
Posted:
August 09, 2023


Resume:

adytqh@r.postjobfree.com 092*******

CHU MINH TIEP

APPLICATION FOR POSITION: DATA ENGINEER FRESHER

https://github.com/crty1999

CAREER GOALS

Within the first two years, build sustainable core values and gain flexible, effective practical experience in the field of data engineering. Learn and develop data engineering skills, including building data pipelines to extract, load, and transform data. Implement methods to improve the reliability and quality of the company's data.

EDUCATION

2014-2017: Nguyen Hue High School for the Gifted - Math 1 specialization
2018-2023: ITMO University, Russian Federation - Major: Applied Informatics; GPA: 3.5

WORK EXPERIENCE

4/2021-7/2021: FPT Information System - Web Application Intern. Built a web application using PHP, JavaScript, and phpMyAdmin.

PERSONAL PROJECTS

Data Modeling with Apache Cassandra
Modeled data by creating tables in Apache Cassandra to run queries against business results. Built an ETL pipeline that transfers data from a set of CSV files within a directory into a single streamlined CSV file, then models and inserts the data into Apache Cassandra tables.

Question Answering using BERT
Role: Machine Learning Engineer. Built an application that takes a context and a question as input and returns an answer extracted from the context.
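The decoding step that sits on top of a BERT QA model — choosing the best (start, end) token span from the model's start and end logits — can be sketched as follows. Plain Python lists stand in for the real model outputs; the function itself is the standard extractive-QA decoding rule, not the project's exact code:

```python
def best_span(start_logits, end_logits, max_answer_len=30):
    """Return the (start, end) token indices that maximize
    start_logits[start] + end_logits[end], subject to start <= end
    and a bounded answer length -- standard extractive-QA decoding."""
    best = (0, 0)
    best_score = float("-inf")
    for i, s in enumerate(start_logits):
        # Only consider end positions at or after the start, within the cap.
        for j in range(i, min(i + max_answer_len, len(end_logits))):
            score = s + end_logits[j]
            if score > best_score:
                best_score = score
                best = (i, j)
    return best
```

In practice the logits come from a model such as `BertForQuestionAnswering`, and the chosen token span is mapped back to the answer text in the context.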


CERTIFICATIONS

Machine Learning certificate - DataScienceWorld.Kan
SQL (Intermediate) certificate - HackerRank
Python (Basic) certificate - HackerRank

TECHNICAL SKILLS

Programming Languages: Python, SQL
Libraries: Pandas, NumPy, BeautifulSoup, TensorFlow, Scikit-learn, PyTorch
Tools: Apache Spark
Cloud Computing: Azure (Azure Synapse, Azure Data Lake Storage Gen2)
Language skills: English, Russian

Building an Azure Data Warehouse for Bike Share Data Analytics
Used the Chicago Divvy bike share dataset and loaded the data into a PostgreSQL database. Ingested the data from PostgreSQL into Azure Data Lake Storage Gen2 (built on Azure Blob Storage), where all tables are represented as text files. Created staging tables from these files, ready for loading into the Synapse Analytics data warehouse. Designed a star schema based on the business outcomes listed below, creating fact and dimension tables from the staging tables in Synapse Analytics, then built reports to:
o Analyze how much time is spent per ride.
o Analyze how much money is spent.
o Analyze how much money is spent per member.
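The report logic over the star schema's fact table can be sketched in pandas. The column names below are assumptions for illustration, not the actual warehouse schema; in the project itself these would be SQL aggregates run in Synapse Analytics:

```python
import pandas as pd

# Hypothetical fact_trip rows, as they might look after the star-schema load.
fact_trip = pd.DataFrame({
    "rider_id":     [1, 1, 2],
    "duration_min": [12.0, 30.0, 18.0],
    "amount":       [3.0, 5.0, 4.5],
})

# How much time is spent per ride (average duration).
avg_ride_minutes = fact_trip["duration_min"].mean()

# How much money is spent overall.
total_spent = fact_trip["amount"].sum()

# How much money is spent per member.
spent_per_member = fact_trip.groupby("rider_id")["amount"].sum()
```

Each report is a single aggregate over the fact table; the dimension tables (rider, station, date) would supply the descriptive attributes for slicing these numbers further.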


