Nguyễn NamAnh
Tel: +84-967-***-*** Email: ad0cr5@r.postjobfree.com Skype: anhnguyen08231997
EXPERIENCE
Grab Vietnam
Payment Executive 12/2019 – 10/2020
I worked on the payment team; my main role was automating the team's business processes. By the end of the contract, I had eliminated two headcounts of manual work, reduced payment waiting times, and increased partner and driver satisfaction. My main work:
• Automated the payment and collection process for the Cash on Delivery service.
• Automated the top-up process for drivers.
• Built web crawlers to collect data and generate daily reports.
• Sent automated daily emails via AWS (Boto3).
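The daily auto-email step could look roughly like the sketch below, using Boto3's SES client. The sender address, region, and helper names are hypothetical, not taken from the original project:

```python
def build_report_email(recipient: str, report_date: str, body_text: str) -> dict:
    # Assemble the keyword arguments for SES send_email (field names per the boto3 SES API).
    return {
        "Source": "reports@example.com",  # hypothetical sender address
        "Destination": {"ToAddresses": [recipient]},
        "Message": {
            "Subject": {"Data": f"Daily report {report_date}"},
            "Body": {"Text": {"Data": body_text}},
        },
    }

def send_daily_report(recipient: str, report_date: str, body_text: str) -> None:
    # boto3 is imported here so build_report_email stays usable without boto3 installed.
    import boto3
    ses = boto3.client("ses", region_name="ap-southeast-1")  # hypothetical region
    ses.send_email(**build_report_email(recipient, report_date, body_text))
```

In a scheduler (cron or Airflow), `send_daily_report` would run once per day after the report data is generated.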
Other Python Skills
• Web scraping (Scrapy, Selenium)
• Data wrangling (Pandas)
Tinh Vân Outsourcing (onsite at NEC)
Backend Developer 10/2020 – 10/2021
I worked on the Food & Beverage team. Our mission was to provide restaurant chains with convenient software solutions, including ordering software, a delivery website, and a CRM platform.
My main work:
• Designed and coded the ordering software's backend API.
• Designed and coded the master tool (Admin Portal), used to update the ordering software's master data.
• Contributed to the CRM platform's features.
• Built services for the team.
The details of this work are presented on the fourth page.
Backend Skills
• Network protocols: HTTP, SSL/TLS, WebSocket
• Frameworks: Django + Django Rest Framework, Flask
• RDBMS: PostgreSQL, MySQL
• NoSQL: MongoDB, Redis
• Concurrency: Celery (with RabbitMQ and Redis), Python AsyncIO
• Task pipeline: Apache Airflow
• Deployment: Docker, Bitbucket Pipelines
• Cloud: GCP Cloud Functions, Cloud Run, Cloud Storage
Languages
• Fluent reading and listening in English.
• Good English writing and speaking.
My favorite quote
• "The only constant is change."
Frontend Skills
• Frontend libraries: ReactJS, jQuery
University Study Program (10/2021 – Present)
TotallyAwesome Vietnam
Software Developer 07/2023 – Present
I work on the Ads Tech team. Our mission is to develop and maintain an internal website that streamlines the business team's workflow and reduces their workload. My main work:
• Deploy and set up Airflow on the GCP platform.
• Build Airflow DAGs for daily tasks.
• Collect business information and design the database for the internal website.
• Design and code the website's backend API.
The details of this work are presented on the third page.
Home Credit Vietnam
Data Analyst Intern 09/2019 – 12/2019
I worked on the Sales Planning & Modeling team. My main role was writing SQL scripts to calculate bonuses for more than 2,000 sales agents.
• Built SQL scripts to compute the working-on-holiday bonus for sales agents.
• Built four Power BI reports for sales analysis.
EDUCATION
• Software Engineering, FPT University (Funix Online Program)
REFERENCES
Mr. Trương Tấn Sang
• Manager at NEC.
• Tel: +84-908-***-***
Mr. Kiều Duy
• Acting lead at NEC.
• Tel: +84-936-***-***
My Projects in TotallyAwesome
1. Deploy and set up Airflow on the GCP platform.
• I use Helm to deploy an Airflow instance on a Kubernetes cluster hosted on GCP's GKE service.
• I store credential files in Kubernetes Secrets and mount them into the Airflow containers for authentication and authorization with the GCP and Bitbucket platforms.
• I set up the gitSync feature so the Airflow Scheduler polls my Bitbucket repository for new or updated DAG files.
2. Build Airflow DAGs for daily tasks.
• I use Airflow's built-in operators in conjunction with the Google Cloud operators to design the DAG files.
• I deploy a web server hosted on GCP Cloud Functions; it exposes API endpoints that are called by the Airflow operators.
• I use GCP Cloud Storage as a staging area, storing data temporarily for processing before migrating it permanently to the database (hosted on Cloud SQL).
• I use a Bitbucket pipeline to automatically deploy a new Cloud Functions revision whenever code is merged to the master branch.
3. Collect business information and design the database for the internal website.
• I work closely with the Product Owner to gather business requirements.
• I design the database following Boyce–Codd Normal Form.
• I use Holistics's database markup language to design and create the ER diagrams.
4. Design and code the website's backend API.
• I use the FastAPI framework with a PostgreSQL database.
• I use the SQLAlchemy ORM to create models that map to database tables.
• I create Python scripts to migrate data from Excel files into the database.
• I also use a Bitbucket pipeline to automatically deploy the backend application to the GCP Cloud Run service.
My Projects in NEC
1. Designed and coded the ordering software's backend API.
• I designed the backend with Django + Django Rest Framework and used the PostgreSQL RDBMS to store data.
• I used JWT tokens to encode and decode staff IDs for authentication (the ordering software is for staff only).
• I used the serializer layer to validate data before further processing.
• I handled the logic to fetch information (table status, menu list, etc.), process orders, execute payments, apply promotions, and generate sales reports.
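The staff-ID token idea can be illustrated with a stdlib-only sketch of the encode/decode round trip. This uses a plain HMAC-signed payload rather than a full JWT library, and the secret and field names are hypothetical, not from the original project:

```python
import base64
import hashlib
import hmac
import json

SECRET = b"demo-secret"  # hypothetical key; a real service loads this from settings

def encode_staff_token(staff_id: int) -> str:
    # Serialize the claim, base64-encode it, and sign it with HMAC-SHA256.
    payload = base64.urlsafe_b64encode(json.dumps({"staff_id": staff_id}).encode())
    sig = hmac.new(SECRET, payload, hashlib.sha256).hexdigest()
    return payload.decode() + "." + sig

def decode_staff_token(token: str) -> int:
    # Verify the signature in constant time before trusting the payload.
    payload, sig = token.split(".")
    expected = hmac.new(SECRET, payload.encode(), hashlib.sha256).hexdigest()
    if not hmac.compare_digest(sig, expected):
        raise ValueError("invalid signature")
    return json.loads(base64.urlsafe_b64decode(payload))["staff_id"]
```

In a Django REST view, the decoded staff ID would identify the authenticated staff member on each request.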
2. Designed and coded the master tool.
• The ordering software uses master data such as the menu list, fixed comment list, and accepted payment methods.
• I designed the master tool to manage this master data.
• I used JWT tokens combined with Django's authorization layer to ensure that staff in a given store can only modify that store's master data.
• I used the Django model and serializer layers to enforce the relationships between data models.
3. Contributed to the CRM platform's features.
• I designed and coded the RFM feature (a marketing method for analyzing customer value).
• I used the ORM to fetch raw data and the Python Pandas library to process it.
4. Built services for the team.
• I created a PostgreSQL trigger to sync data from the stores' databases to a central data warehouse.
• I built Docker containers and deployed them for the testing team.
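The store-to-warehouse trigger sync can be shown with a small runnable sketch. For portability it uses SQLite's trigger syntax via Python's stdlib `sqlite3`, whereas the real project used a PostgreSQL trigger; the table and column names are hypothetical:

```python
import sqlite3

def sync_demo() -> list:
    # An AFTER INSERT trigger copies each new store order into the warehouse table,
    # mirroring the store-database-to-warehouse sync described above.
    conn = sqlite3.connect(":memory:")
    conn.executescript("""
        CREATE TABLE store_orders (id INTEGER PRIMARY KEY, store_id INTEGER, total REAL);
        CREATE TABLE warehouse_orders (id INTEGER, store_id INTEGER, total REAL);
        CREATE TRIGGER sync_orders AFTER INSERT ON store_orders
        BEGIN
            INSERT INTO warehouse_orders (id, store_id, total)
            VALUES (NEW.id, NEW.store_id, NEW.total);
        END;
    """)
    conn.execute("INSERT INTO store_orders (store_id, total) VALUES (?, ?)", (1, 42.0))
    return conn.execute("SELECT store_id, total FROM warehouse_orders").fetchall()
```

In PostgreSQL the equivalent would be a PL/pgSQL trigger function attached with `CREATE TRIGGER ... EXECUTE FUNCTION ...`.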