
AWS Engineer

Location:
Philippines
Posted:
September 22, 2020


Joshua G. Calianga

I'm just a straightforward guy who happens to love backend development and the AWS Cloud. My core values are simplicity, innovation, and harmony. Away from work, I can often be found playing guitar, playing basketball, or watching YouTube tutorials.

Blk 9 Lot 11 Gabriel st.

Monterey Hills IV Brgy. Silangan

San Mateo Rizal

091********

adga1e@r.postjobfree.com

EXPERIENCE

Sqreem Technologies, Quezon City

- Senior Software Engineer / Cloud Engineer

SEP 2019 - SEP 2020

Pioneered both Test and Development environments for our AWS resources. Cooperated with the technical leads of different projects on infrastructure review and security implementation. Handled security and access control management for AWS servers and resources. Started automating resource provisioning using AWS CloudFormation.

Sqreem Technologies, Quezon City - Software Engineer

APR 2019 - SEP 2019

Tackled research and development for security and access control for incoming and existing web applications and APIs. Ensured quality by reviewing and optimizing code in existing API libraries.

Sqreem Technologies, Quezon City - Junior Software Engineer

JAN 2018 - APR 2019

Developed assigned project API libraries. Maintained live Web APIs by debugging, optimizing, and making use of the latest C# technologies.

ON Semiconductor Phils. Inc., Carmona, Cavite - On-the-Job Training

JUN 2017 - AUG 2017

Worked on build requests, software development, and wafer probing.

TaskUs, Inc., Imus, Cavite - Customer Service Representative

MAY 2016 - AUG 2016

EDUCATION

De La Salle University, Dasmarinas, Cavite— Bachelor of Science in Computer Engineering

JAN 2018

SKILLS

Node.js

Express

Vue.js

C# .NET

GraphCMS

Selenium

Systems Design

Security Token Service

JSON Web Tokens

AWS EC2

AWS Elastic Beanstalk

AWS CodePipeline

AWS Cognito

AWS S3

AWS Lambda

AWS API Gateway

AWS IAM

AWS DynamoDB

AWS Amplify

AWS CloudFormation

AWS CloudWatch

LANGUAGES

English

Filipino

CERTIFICATIONS

AWS Certified Solutions Architect – Associate

JUN 2020

AWS Certified Cloud Practitioner

APR 2020

TRAININGS AND SEMINARS

Security Engineering on AWS, Menon Network, 152 Beach Road, Level 28 Gateway East, Singapore

NOV 2019

Web Security, Pluralsight— Web Security and the OWASP Top 10 vulnerabilities

NOV 2019

CompTIA Security+, Pluralsight— Identity and Access management, Architecture and Design, Risk Management, Cryptography and PKI

NOV 2019

AWS Summit 2019, Singapore Expo, Singapore— Data & Analytics, App Modernization, Security & Compliance

APR 2019

AWS Solutions Deep-Dive, Amazon Web Services Singapore, 23 Church St, Singapore— Fine-grained access control, big-data processing, serverless computing

APR 2019

PROJECTS

Talent Center — API using AWS Elastic Beanstalk, GraphCMS, DynamoDB

Handled the backend for a website that accepts project inquiries from any client, where our admins can create custom workforce and budget proposal dashboards.

● Used Node.js with Express for the backend. Implemented CI/CD using CodePipeline connected to a GitHub repository, running builds into Elastic Beanstalk.

● Used GraphCMS for easier content management, and created GraphQL queries inside the Node.js backend to fetch content data.

● Created a Swagger documentation page for easier frontend integration and QA testing of API calls.
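A minimal sketch of how a Node.js backend might build a GraphQL request for GraphCMS content. The model and field names (proposals, clientId, budget) are illustrative, not the actual project schema:

```javascript
// Build the JSON body for a GraphQL POST to a GraphCMS endpoint.
// The query shape below is a hypothetical example, not the real schema.
function buildProposalQuery(clientId, limit) {
  const query = `
    query Proposals($clientId: ID!, $limit: Int!) {
      proposals(where: { clientId: $clientId }, first: $limit) {
        id
        title
        budget
      }
    }`;
  // GraphQL over HTTP expects { query, variables } as the POST body.
  return JSON.stringify({ query, variables: { clientId, limit } });
}
```

The returned string would be sent via an HTTP POST to the GraphCMS API endpoint with a Content-Type of application/json.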

Hiring Tools — AWS Lambda, DynamoDB, S3

Handled the backend and a simple frontend (Vue.js) for a hiring-tools webpage with an audio recording tool. The page gathers information from applicants and securely saves their recordings to our S3 bucket.

● Used Node.js with Lambda for the backend.

● Used Vue.js for the frontend, and implemented CI/CD using CodePipeline connected to a GitHub repository, deploying builds into S3 with a CloudFront configuration.

Accounts Library — API using Amazon Web Services, two-way encryption

When I was assigned to research and develop the integration of Amazon Web Services into a new accounts management library, I learned how to use AWS Cognito, AWS IAM, AWS Lambda, and AWS API Gateway. With this serverless implementation, our accounts management system no longer required server maintenance, and it eliminated the slow responses caused by many requests piling up on our server, since Amazon handles the elasticity and scalability of the API functions on the server side. The project also increased the security of our API functions through Amazon Cognito and Amazon API Gateway authentication and authorization features, along with permissions handling in AWS IAM.

● Eliminated the speed problems encountered with the previous accounts system.

● Implemented AWS Signature Version 4 on every API call to block unauthorized POST requests.

● Provided fine-grained access control for all APIs and web UIs, even for existing systems.

Task Control Center — API using AWS DynamoDB, SQS

A system for managing engine tasks for scraping and batch data processing. Built on AWS SQS and DynamoDB, this library replaces the existing task control center, which used websockets for server communication.

● Faster task display queries because of DynamoDB.

● Removed the issue of stuck tasks caused by lost websocket connections.

● Reduced the cost of running many EC2 instances just to maintain websocket connections.
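An in-memory sketch of the task lifecycle such a control center manages. In the real system the queue would be SQS and the state table DynamoDB; the class, method names, and timeout value here are illustrative:

```javascript
// Minimal task-control sketch: a FIFO queue (standing in for SQS) plus
// a status map (standing in for a DynamoDB table), with a visibility
// timeout so crashed workers can't leave tasks stuck.
class TaskControl {
  constructor(visibilityMs = 30000) {
    this.visibilityMs = visibilityMs;
    this.tasks = new Map(); // taskId -> { status, claimedAt }
    this.queue = [];        // pending taskIds, FIFO like SQS
  }
  submit(taskId) {
    this.tasks.set(taskId, { status: 'PENDING', claimedAt: null });
    this.queue.push(taskId);
  }
  claim(now = Date.now()) {
    const taskId = this.queue.shift();
    if (taskId === undefined) return null;
    this.tasks.set(taskId, { status: 'RUNNING', claimedAt: now });
    return taskId;
  }
  complete(taskId) {
    this.tasks.set(taskId, { status: 'DONE', claimedAt: null });
  }
  // Requeue tasks whose worker disappeared, avoiding the stuck tasks
  // that lost websocket connections used to cause.
  requeueStale(now = Date.now()) {
    for (const [id, t] of this.tasks) {
      if (t.status === 'RUNNING' && now - t.claimedAt > this.visibilityMs) {
        this.tasks.set(id, { status: 'PENDING', claimedAt: null });
        this.queue.push(id);
      }
    }
  }
}
```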

Documents Website — API using AWS S3, AWS Lambda

This gave SQREEM a website for displaying our legal and organizational documents without granting access to anyone outside the company, even if one of our own employees shares a document link.

● Implemented pre-signed URLs with short expiration times.

● Enabled our clients to review some of our documents without having access to anything else.

Alexa Site Screener and Website Ad Space Scraper Library — API using Selenium, AWS S3

This enabled us to build a list of sites related to a keyword and check whether each site has ad space, along with its ad dimensions.

● Relevant information is stored encrypted on S3.

● Enabled the media buying team to run our own website targeting for ad campaigns.

Chat Library with ChatBot — API using WebSockets, WebRTC

Using WebSocket and WebRTC, I created a library with messaging, calling, and bot features. The chatbot uses supervised learning; if its knowledge base contains no data to answer a question, it automatically pulls data from a Google search and stores the result in its knowledge base.

● Highlights of the project were implementing video calls and the chatbot's answer-generation function.

Help Center Library — API using WebSockets

With clients getting lost on how to use our web applications, the Help Center API guides users on what to do depending on the application being used. Users can also ask a chatbot about what they are trying to do, and the bot responds with instructions based on the help center database.

● Extended my bot answer generator (ChatBot API) to search the FAQ data before its own knowledge base; if there is truly no answer to a user's question, it scrapes Google results and returns possible answers. This simplified the learning process for clients.
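The lookup order described above (FAQ first, then the bot's own knowledge base, then a web-search fallback) can be sketched as a simple fallback chain. The matching here is naive keyword containment and the data shapes are illustrative, not the real API:

```javascript
// Fallback chain for bot answers: FAQ -> knowledge base -> web search.
// Entries are { topic, answer }; searchWeb stands in for the Google
// scraping step in the real project.
function answerQuestion(question, faq, knowledgeBase, searchWeb) {
  const match = (entries) =>
    entries.find((e) => question.toLowerCase().includes(e.topic));
  const hit = match(faq) || match(knowledgeBase);
  if (hit) return hit.answer;
  return searchWeb(question); // last resort: scrape search results
}
```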

Words Library — API

This library scrapes raw words across different websites and can analyze the data by extracting word counts for different keywords. The analysis can be narrowed down by handling multi-word combinations, which helps determine what the keywords have in common for search engine optimization.

● Optimized existing code and fixed the errors reported by the test team.

● Edited the code to accept combinations of more than three words when extracting word counts.

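Extracting counts for multi-word combinations of arbitrary length amounts to tallying n-grams over the scraped text. A sketch of that idea (the real library's tokenization rules are not shown; this splits on non-word characters):

```javascript
// Count every n-word combination (n-gram) in the given text.
// Returns a map from the lowercased combination to its frequency.
function countCombinations(text, n) {
  const words = text.toLowerCase().split(/\W+/).filter(Boolean);
  const counts = {};
  for (let i = 0; i + n <= words.length; i++) {
    const gram = words.slice(i, i + n).join(' ');
    counts[gram] = (counts[gram] || 0) + 1;
  }
  return counts;
}
```

Supporting "more than three-word combinations" then needs no special casing: n is just a parameter.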

Baidu Scraper Library — API

The Baidu search engine exposes keyword search trends by time and search frequency. I managed to get the trend values by decrypting the data from their website.

● Rewrote 80% of the existing code because Baidu's website responses had changed and the old decryption process no longer worked.

● In my first week on the project, I made a quick fix using Selenium to meet the data analysis team's need for data.

● The highlight of the project was nailing the decryption function to get the trend values without Selenium, which made the scraping process more than 10x faster.

Product Reviews Library — API using Selenium

Developed an API and desktop application for scraping words from product reviews on eBay, Amazon, Rakuten (Global), Rakuten (Japan), and Lazada. Also generated trends to identify which keywords customers attribute to a product for digital marketing.

Facebook Reach Extractor — API using Selenium

This allows us to discover the related interests and behaviors of people who use Facebook, and to estimate the potential reach of marketing campaigns. To obtain a Facebook access token automatically, I used Selenium to log in, navigate, and retrieve the needed token.

● Learned the Facebook Marketing API and developed our own API for mass Facebook reach data mining.

● Also handled the database structure and task management.

● Coordinated with front-end developers to release a Facebook Reach Data Miner tool.


