
Data Analyst Deployment Manager

Location: Manila, Philippines
Posted: August 14, 2021


Mark Anthony A. Villanueva

**** ********* **. ***. ****, Manila, Philippines

Email: adn6u9@r.postjobfree.com

GitHub: https://github.com/noknok79

Docker Hub: https://hub.docker.com/u/noknok79

Mobile #: 091*-*******

EXPERIENCE SUMMARY: General Manager of a garment company. Trained Cloud Engineer / Data Engineer / DevOps Engineer. Trained Cloudera Hadoop system administrator. Trained in Docker and Kubernetes. Multiple professional and specialization training certifications in Google Cloud Platform. I have managed a business for nearly 15 years and have industry experience in the Internet environment. I have strong hands-on exposure to cloud services such as AWS and Google Cloud Platform (GCP), as well as to B2C/B2B technical support and system administration.

PROFESSIONAL EXPERIENCE

Mang Igme Garments (formerly Yeye Garments & Manufacturing)

August 6, 2006 – December 31, 2015

General Manager

Job Description

Responsible for general management and marketing of the retail branch/store and the garment factory, resulting in a continuous increase in new customers.

Actively monitored and managed the factory's garment output, yielding an additional 0.9% to 1.2% gain in production per month.

Personally managed the ordering, provisioning, and delivery of raw materials and supplies required for the garment products, and carefully managed raw-material cost-price fluctuations.

Worked closely and dealt directly with regular and wholesale customers, resulting in a 6% to 8% increase in product orders every month.

Personally managed weekly payroll for staff, preserving a good working environment that increased workers' morale.

Yeye Garments & Manufacturing

August 1, 2001 – August 6, 2006

Assistant General Manager

Job Description

Responsible for general management and marketing of the retail branch/store and the garment factory when the General Manager was absent.

Actively monitored and managed the factory's garment output.

Personally managed the ordering, provisioning, and delivery of raw materials and supplies required for the garment products.

Worked closely and dealt directly with regular and wholesale customers.

Personally managed weekly payroll for staff.

InterDotNet Philippines Inc. (formerly known as PSINet Philippines)

June 9, 2000 – July 31, 2001

Technical Support / Customer Service Representative

Job Description:

Provided personalized service by guiding customers through the company's Internet interface to ensure a world-class customer Internet experience.

Responded to customer phone calls and e-mails to address a variety of inquiries and problems.

Performed 24-hour standby duties.

Conducted on-site visits for dial-up and system-configuration problems for corporate and consumer clients.

Supported clients on problems relating to Internet applications such as mail clients, browsers, and dial-up networking on the Windows and Macintosh platforms.

Supported corporate accounts on problems relating to mail servers, DNS servers, proxy servers, web servers, and DDU and leased-line connections on Windows NT and Linux systems.

Installed Linux operating systems and customized servers according to clients' needs.

Deployed and set up servers within clients' networks and established connections to the Internet.

Coordinated with telcos for leased-line trouble and problem isolation.

Coordinated with the Systems and Network Operations team when unexpected or unusual technical issues arose.

Provided reports and documentation on clients' cases.

PROJECTS BACKGROUND:

BlogNodeJS - DevOps Role – Continuous Integration / Continuous Deployment (CI/CD) in GCP

Perform the DevOps role in CI/CD for a microservice web application hosted on Google Cloud Platform

Technology Used:

NodeJS with MongoDB – Pre-existing project

Docker / Docker Compose

Google Cloud Platform

Terraform

GitHub

Ubuntu in WSL2

Project / Job Process:

Set up a GCP account/project for integrating the pre-existing Node.js with MongoDB application, a microservice.

Enable the necessary GCP APIs, such as the Compute Engine API, OAuth API, Secret Manager API, and Container Registry API, that are essential for the project.

Create/add IAM roles for Storage Object Admin and Storage User access to the storage bucket, and generate a service-account key (JSON) so it is accessible on the Ubuntu WSL2 local machine.

Install prerequisite applications such as Node.js/NPM, Docker, Terraform, and the GCP SDK on the Ubuntu WSL2 local machine.

Deploy the pre-existing Node.js project and MongoDB in Docker and run docker-compose on the Ubuntu WSL2 local machine.

Create the necessary .tf files for deploying the compute machines and integrating the Docker containers running on the local machine using Terraform (see the Terraform sketch after this list).

Execute terraform init, plan, and apply to create the GCP Compute Engine machine, including the containerized Node.js app with a running MongoDB container.

Push the project source files to GitHub.
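
The following is a minimal sketch of the Terraform step above; the project ID, zone, image, and container names are placeholders, not values from the actual project.

export GOOGLE_APPLICATION_CREDENTIALS=./sa-key.json   # service-account key from the IAM step (hypothetical path)

# Write a minimal main.tf: one Compute Engine VM whose startup script runs the containers.
cat > main.tf <<'EOF'
provider "google" {
  project = "my-gcp-project"          # placeholder project ID
  region  = "asia-southeast1"
  zone    = "asia-southeast1-a"
}

resource "google_compute_instance" "blognodejs" {
  name         = "blognodejs-vm"
  machine_type = "e2-medium"

  boot_disk {
    initialize_params {
      image = "ubuntu-os-cloud/ubuntu-2004-lts"
    }
  }

  network_interface {
    network = "default"
    access_config {}                  # ephemeral public IP
  }

  # Install Docker on first boot and start MongoDB plus the Node.js app (placeholder image tag).
  metadata_startup_script = <<-SCRIPT
    apt-get update && apt-get install -y docker.io
    docker network create blognet
    docker run -d --name mongo --network blognet mongo:4.4
    docker run -d --name blog --network blognet -p 80:3000 noknok79/blognodejs:latest
  SCRIPT
}
EOF

terraform init                        # download the Google provider
terraform plan                        # preview the resources to be created
terraform apply -auto-approve         # create the VM and start the containers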

CLOUDERA DISTRIBUTION HADOOP in Cloud Application

Well suited to big data processing and big data warehousing

Technology Used:

Red Hat Enterprise Linux 7 / CentOS 7 / Oracle Linux 7

Cloudera Distribution Hadoop

Google Cloud Platform

Amazon Web Service

PostgreSQL

MySQL

Project / Job Process:

Set up from scratch: deploy six VM instances on GCP or AWS using personal Free Trial / Free Tier accounts.

Deploy/set up one high-memory VM instance with 8 vCPUs / 60 GB RAM as the master node, and five high-memory VM instances with 4 vCPUs / 20 GB RAM as worker nodes.

Deploy/set up RHEL or CentOS as the operating system on each VM instance, interconnect them, and install the necessary server and agent applications on the master and worker nodes.

Deployment, setup, and installation became much faster with the use of my personal bash scripts, which automate package installation, file settings, and configuration on each master and worker node (a provisioning sketch appears after this list).

Deploy/install Cloudera Distribution Hadoop on the master node and connect it to the multiple worker nodes.

Deploy ZooKeeper, YARN, HDFS, and HUE/Hive, among others, making the cluster available for big data processing.

Up and running for big data set queries, which is ideal for big data warehousing and big data query processing.
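
The provisioning part of those personal bash scripts might look roughly like the sketch below; the instance names, zone, and predefined machine types are illustrative stand-ins for the master/worker shapes listed above.

#!/bin/bash
# Provision one master and five worker VMs for the CDH cluster (hypothetical names and zone).
ZONE="asia-southeast1-a"

# Master node: predefined high-memory shape (8 vCPUs / 52 GB RAM, close to the 8 vCore / 60 GB target).
gcloud compute instances create cdh-master \
  --zone="$ZONE" --machine-type=n1-highmem-8 \
  --image-family=centos-7 --image-project=centos-cloud \
  --boot-disk-size=200GB

# Worker nodes: smaller high-memory shape (4 vCPUs / 26 GB RAM).
for i in 1 2 3 4 5; do
  gcloud compute instances create "cdh-worker-$i" \
    --zone="$ZONE" --machine-type=n1-highmem-4 \
    --image-family=centos-7 --image-project=centos-cloud \
    --boot-disk-size=100GB
done

# Common OS preparation on every node (SELinux permissive, low swappiness), as Cloudera recommends.
for host in cdh-master cdh-worker-{1..5}; do
  gcloud compute ssh "$host" --zone="$ZONE" --command \
    "sudo setenforce 0 && echo 'vm.swappiness=1' | sudo tee -a /etc/sysctl.conf"
done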

CLOUDERA HIVE/IMPALA/Spark/PostgreSQL Data Warehousing in Cloud Application

Importing and integrating big data sets; big data warehousing for big data queries and analytics.

Technology Used:

Red Hat Enterprise Linux 7 / CentOS 7 / Oracle Linux 7

Cloudera Distribution Hadoop

Google Cloud Platform

Amazon Web Service

PostgreSQL

Hive / Impala / Spark

Project / Job Process:

Provision and configure the same six-node CDH cluster described in the previous project: one 8 vCPU / 60 GB RAM master and five 4 vCPU / 20 GB RAM workers on GCP or AWS, automated with personal bash scripts, with ZooKeeper, YARN, HDFS, and HUE/Hive deployed and running for big data warehousing and query processing.

Manipulate files using hdfs commands. Import big data sets as large as a 4 GB CSV file (61 million rows) for data warehousing and big data queries (see the sketch after this list).

Integrate the big data sets with Hive, Impala, PostgreSQL, and Spark for big data queries and data analytics.
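
The import and integration steps above reduce to commands along these lines; the file name, table name, and column list are hypothetical examples.

# Stage the CSV in HDFS and confirm the ~4 GB file landed.
hdfs dfs -mkdir -p /user/warehouse/trips
hdfs dfs -put trips_2020.csv /user/warehouse/trips/
hdfs dfs -du -h /user/warehouse/trips

# Expose it to Hive as an external table over the raw CSV.
beeline -u jdbc:hive2://cdh-master:10000 -e "
CREATE EXTERNAL TABLE IF NOT EXISTS trips (
  trip_id BIGINT, pickup_ts STRING, dropoff_ts STRING, fare DOUBLE
)
ROW FORMAT DELIMITED FIELDS TERMINATED BY ','
LOCATION '/user/warehouse/trips';"

# Refresh Impala's view of the new table and run a first aggregate over the ~61 million rows.
impala-shell -q "INVALIDATE METADATA trips"
impala-shell -q "SELECT COUNT(*) FROM trips"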

DEVOPS ROLE: PUSHING WORDPRESS WITH MYSQL IN HOST CENTOS / REDHAT / ORACLE LINUX INTO DOCKER REPOSITORY

Present – compiling and pushing a full WordPress-with-MySQL stack on a CentOS 7 / RHEL 7 host into Docker Hub

Technology Used:

CentOS 7

Red Hat Enterprise Linux 7

Oracle Linux 7

Rocky Linux 8

Docker / Docker Compose / Podman

WordPress

MySQL

Project / Job Process:

Docker pull WordPress and MySQL from the Docker repository, set the necessary credentials (username, password, host) for the services, and interconnect the two applications by creating a Docker network (a condensed command sketch follows this list).

Compile the whole host operating system in preparation for the image to be pushed to Docker Hub.

Create a docker-compose file for containerizing WordPress with MySQL running in the background.

Load the compiled image into Docker / Podman.

Run the image to create the Docker container process.

Tag and commit the container image in preparation for the push to Docker Hub.

Log in to the Docker account and push the container image to Docker Hub using the docker push command.

Docker pull c7-wordpress / rh7-wordpress from Docker Hub, run the image, and browse localhost:8080 to confirm WordPress is listening and running.
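
A condensed sketch of the Docker commands behind those steps; the passwords are placeholders, and the image tag follows the c7-wordpress naming used in the last step.

# Create a user-defined network and start MySQL and WordPress on it.
docker network create wpnet

docker run -d --name wp-mysql --network wpnet \
  -e MYSQL_ROOT_PASSWORD=changeme \
  -e MYSQL_DATABASE=wordpress \
  -e MYSQL_USER=wpuser -e MYSQL_PASSWORD=changeme \
  mysql:5.7

docker run -d --name wp --network wpnet -p 8080:80 \
  -e WORDPRESS_DB_HOST=wp-mysql:3306 \
  -e WORDPRESS_DB_USER=wpuser \
  -e WORDPRESS_DB_PASSWORD=changeme \
  -e WORDPRESS_DB_NAME=wordpress \
  wordpress:latest

# Snapshot the configured WordPress container, tag it, and push it to Docker Hub.
docker commit wp c7-wordpress:latest
docker tag c7-wordpress:latest noknok79/c7-wordpress:latest
docker login
docker push noknok79/c7-wordpress:latest

# Verify the round trip: remove the local container, pull the pushed image, and run it again.
docker rm -f wp
docker pull noknok79/c7-wordpress:latest
docker run -d --name wp --network wpnet -p 8080:80 noknok79/c7-wordpress:latest
# WordPress should now answer on http://localhost:8080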

PORTING OF CENTOS/REDHAT/ORACLE/ROCKY LINUX INTO WINDOWS SUBSYSTEM LINUX 2

Present – porting a full working Linux distro into WSL2

Technology Used:

CentOS 7 / 8

Red Hat Enterprise Linux 7 / 8

Oracle Linux 7 / 8

Rocky Linux 8

Docker / Podman

RPMBUILD/Compile Source Linux

Linux util-linux ver 2.36.2

Linux systemd ver 248.3 (latest)

Project / Job Process:

Prepare the preferred Linux distro by pulling a container image from the official CentOS or Red Hat container repositories (a command sketch appears after this list).

Configure the necessary porting tools, such as genie or a customized subsystem app, for porting the Linux distro to run in WSL2.

Build/compile the necessary RPMs, such as util-linux 2.36.2 and the latest systemd 248.3, using their make/compile and meson build commands respectively.

Save the container images as tar.gz and download them to the local machine.

Import them into the local machine via the wsl --import command and load the Linux operating system.

Install desired servers and services, such as MySQL and PostgreSQL, among others, and run them using the systemctl start command, confirming that the porting process has been successfully integrated into the ported Linux distro in WSL2.
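
A rough command outline of the pull/export/import flow above, assuming hypothetical distro and path names; the genie configuration and the util-linux/systemd builds are omitted here.

# Pull an official CentOS 7 image and flatten it to a root-filesystem tarball.
docker pull centos:7
docker create --name c7-src centos:7
docker export c7-src -o centos7-rootfs.tar
gzip -k centos7-rootfs.tar            # keep both .tar and .tar.gz (newer WSL builds accept .tar.gz, older ones need .tar)

# On the Windows side (PowerShell), import the tarball as a new WSL2 distro:
#   wsl --import CentOS7 C:\wsl\CentOS7 centos7-rootfs.tar --version 2
#   wsl -d CentOS7

# Inside the ported distro, install a service and start it under the ported systemd.
yum install -y postgresql-server
postgresql-setup initdb
systemctl start postgresql
systemctl status postgresql          # confirms systemd is working inside WSL2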

DEVOPS ROLE: COMPILING, PUSHING CENTOS / REDHAT / ORACLE / ROCKY LINUX FOR WINDOWS SUBSYSTEM LINUX 2 INTO DOCKER REPOSITORY

Present – compiling and pushing a full working WSL2 Linux distro into Docker Hub

Technology Used:

CentOS 7 / 8

Red Hat Enterprise Linux 7 / 8

Oracle Linux 7 / 8

Rocky Linux 8

Docker / Podman

Project / Job Process:

Prepare and port the Linux distro into WSL2 following the same steps as the porting project above (image pull, genie/subsystem configuration, util-linux and systemd builds, wsl --import, and service verification).

Compile the whole ported operating system in preparation for the image to be pushed to Docker Hub (a command sketch appears after this list).

Load the compiled image into Docker / Podman.

Run the image to create the Docker container process.

Tag and commit the container image in preparation for the push to Docker Hub.

Log in to the Docker account and push the container image to Docker Hub using the docker push command.
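
A rough outline of the export, tag, and push steps above; the distro name and repository tag are hypothetical, using the Docker Hub namespace shown in the contact details.

# On Windows (PowerShell), export the working WSL2 distro to a tarball:
#   wsl --export CentOS7 C:\wsl\centos7-wsl.tar

# On the Docker/Podman host, import the tarball as an image, tag it, and push it.
docker import centos7-wsl.tar c7-wsl2:latest
docker tag c7-wsl2:latest noknok79/c7-wsl2:latest
docker login
docker push noknok79/c7-wsl2:latest

# Quick check: run the pushed image and confirm the OS release inside.
docker run --rm -it noknok79/c7-wsl2:latest cat /etc/os-release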

EDUCATIONAL BACKGROUND:

San Sebastian College – Recoletos, Manila

Juris Doctor / Bachelor of Law

2016 – Present

San Sebastian College – Recoletos, Manila

Bachelor of Science in Computer Science, March 2000

1996 – 2000

TECHNICAL SKILLS:

Google Cloud Platform – GCP

(actual hands-on, can provide online technical demo)

Cloud Console/Shell, Compute Engine and Networking, Cloud Storage, Cloud SQL, Kubernetes-Docker, Virtual Network, BigQuery, Google Data Studio, Dataflow, Dataproc, TensorFlow – Jupyter Notebook

Cloud Deployment/Solution/Integration

(actual hands-on, can provide online technical demo)

IIS on Microsoft Server 2012/2016/2019, Jenkins (Bitnami), Deployment Manager, Terraform

Cloud Scripting / Programming Language

(actual hands-on, can provide online technical demo)

Bash, Python, YAML, HTML5 / CSS / JSON / XML

Amazon Web Services – AWS

(actual hands-on, can provide online technical demo)

Cloud Management, EC2, S3, Cloud9, EBS, AWS DMS, AWS IAM, Cloud Endure, AWS Lambda, Amazon Lex, API Gateway, DynamoDB

IBM Cloud Service

(actual hands-on, can provide online technical demo)

IBM DB2 Database, Cloud Foundry, Continuous Delivery, Toolchain, Git/Git Repo, Jupyter Notebook, Container Registry, Kubernetes-Docker, OpenShift

Data Engineering

(GCP personal account, actual hands-on)

Extract, Transform, Load (ETL), BigQuery, big data, Google Data Studio, pipelining with Hadoop – Apache Spark

DevOps Role

(VM test machines, actual hands-on)

CDH Deployment/Admin, CDH via Docker, WordPress via Docker + docker-compose, CI/CD via Jenkins, VM/VPC via Terraform. Docker Pull/Push development.

OpenSource Databases

(VM test machines, actual hands-on)

MySQL / mysql-workbench / phpMyAdmin, PostgreSQL / pgadmin4, Hue, Impala-shell, Hive

CLOUDERA DISTRIBUTION HADOOP - CDH

(actual hands-on, can provide online technical demo)

Installation from scratch up to a running/operational CDH server, or on GCP Dataproc / AWS EMR; Hadoop/HDFS (file manipulation), PostgreSQL queries, HUE – Impala queries, impala-shell queries, Beeline queries, Spark SQL queries, Spark DataFrames queries

Big Data Visualization Tools

(actual hands-on, can provide online technical demo)

KNIME, Neo4j, Hue, Splunk – Pivot, Splunk – Queries, Google Data Studio, spreadsheets

DATABRICKS Platform

(actual hands-on, can provide online technical demo)

Databricks Interactive Notebook, Apache Spark Queries, Built-in visualization tools

Agile Development

(Continuous Design/Delivery/Deployment)

Donald Norman’s 7 steps, The Venture Design Process, Managing Iteration, Agile Methodologies Fundamentals – Scrum, Kanban, Extreme Programming, Lean Startup, Lean UX, Culture of Experimentation, Customer/Testing Analytics, Agile Story – Personas/Playbook, Hypothesis, Prototyping, Product Pipeline, Elements of Agile Team Charter, Test Pyramid, Release Stage

Machine Learning / Data Science

(Applied data science concepts, Spark Python, Spark SQL)

SciKit-Learn, K-means clustering, Supervised/Unsupervised learning, Principal Component Analysis, Feature Relationships, Correlation Matrix, Feature Engineering, Decision Tree, Random Forest, Label Imbalance, Hyperparameters, Model Generalization, Cross Validation.

Operating Systems

Windows 2000 / 2003 / 10 with WSL2, Debian-based distros / Red Hat-based distros / Arch Linux / Clear Linux / VMware Photon OS

GPON ONU OLT Fiber Technology (actual hands-on)

ZTE CX300/CX320, Huawei MA5603T (VLAN, GEM port, T-CONT), basic to intermediate configuration

Troubleshooting / Networking / Telecommunication

Hardware (PC Assembly) / Software / Internet / Network (LAN, installation, configuration)

LAN / Installation / Configuration / Ubuntu Server / MySQL-SQL Server / VOIP, SIP, IP-Routing Basic Configuration

TRAINING COURSES:

GOOGLE/GOOGLE CLOUD

Google IT Automation with Python (6 courses)

Professional Certificate

Data Engineering with Google Cloud (6 courses)

Professional Certificate

SRE and DevOps Engineer with Google Cloud (5 courses)

Professional Certificate

Cloud Architecture with Google Cloud (7 courses)

Professional Certificate

Google Cloud Security (7 courses)

Professional Certificate

Google Cloud Networking (4 courses)

Professional Certificate

Cloud Engineering with Google Cloud (6 courses)

Professional Certificate

Data Engineering, Big Data, and Machine Learning on GCP

Specialization Certificate (5 courses)

From Data to Insights with Google Cloud - BigQuery

Specialization Certificate (4 courses)

Architecting Hybrid Cloud Infrastructure with Anthos Specialization Certificate (3 courses)

Install and Manage Google Cloud's Apigee API Platform Specialization Certificate (3 courses)

Architecting with Google Kubernetes Engine (4 courses)

Specialization Certificate

Architecting with Google Compute Engine (5 courses)

Specialization Certificate

Security in Google Cloud Platform (4 courses)

Specialization Certificate

Networking in Google Cloud (3 courses)

Specialization Certificate

AMAZON WEB SERVICES

AWS Fundamentals (4 courses)

Specialization Certificate

IBM

IBM Data Engineering Foundations (5 courses)

Specialization Certificate

RED HAT Enterprise Linux

Linux and Private Cloud Administration on IBM Power Systems (3 courses)

Specialization Certificate

University of Virginia

Darden School of Business

Digital Product Management Specialization (5 courses)

Agile Development Specialization (4 courses)

Continuous Delivery & DevOps

DATABRICKS

Data Science with Databricks for Data Analysts (3 courses)

Specialization Certificate

CLOUDERA

Modern Big Data Analysis with SQL (3 courses)

Specialization Certificate

University of California San Diego

Big Data (6 courses) with Capstone Project

Specialization Certificate

Hadoop Platform and Application Framework

Development Academy of the Philippines

Dept of Science & Technology – SPARTA Scholarship Grantee

Data Analyst

September 2020 – Present

Development Academy of the Philippines

SP101 Getting Grounded on Analytics

October 2020 – Professional Certificate


