
AWS Engineer

Location:
Dallas, TX
Salary:
130k
Posted:
September 14, 2020

Resume:

Isah Junior Yusuf

adf3mb@r.postjobfree.com

469-***-****

Summary

Motivated professional with 9+ years of experience in cloud computing, microservices, DevOps, CI/CD, infrastructure automation, quality engineering, and the design and maintenance of client/server and web applications.

6+ years practicing cloud computing, DevOps, CI/CD, infrastructure automation, quality engineering, and release management.

Proven ability, demonstrated by my employment record, to make a positive impact in any business environment.

Expertise

Extensive knowledge of Agile and Scrum methodologies used to develop best practices for software development and implementation.

Experience developing scripts and automation tools used for building, integrating, and deploying software releases to multiple environments.

Experience in DevOps, cloud engineer, and build and release engineer roles, with strong experience in software integration, configuration, packaging, building, automation, and releasing code from one environment to another, deploying to servers, and providing support and maintenance on Unix/Linux, VMs, and cloud platforms.

Extensive experience setting up CI/CD pipelines using tools such as Jenkins, TeamCity, Bitbucket, GitHub, Maven, Nexus, Jira, Coverity, AccuRev, and ClearQuest.

Strong hands-on experience with scripting and configuration languages such as Groovy, JSON, YAML, and shell.

Experience integrating code quality tools such as SonarQube into CI/CD pipelines.

Good knowledge of virtualization and container technologies such as Docker; experienced in creating Dockerfiles, working with Docker containers, and deploying to AKS and EKS.

Exposure to scripting languages such as Groovy, shell (Bash), and Python.

Exposure to infrastructure automation tools such as Ansible, Terraform, and Docker.

Strong exposure to the AWS and Azure cloud platforms.

Experience with web servers such as Nginx.

Excellent understanding of source-code management principles and systems, particularly Git.

Strong knowledge of TDD practices and automating unit tests.

Participated in the release cycle of products across Development, QA, UAT, and Production environments.

Maintained project documentation and documented application-related issues and bugs on the internal wiki.

A highly motivated, energetic team player with excellent communication and interpersonal skills.

Education

Bachelor’s Degree in Medical Microbiology, Ahmadu Bello University, Nigeria, 2011

Master’s Degree in Public Health, Dubai School of Health, UAE, 2013

Technical Skills

Cloud Platforms: AWS, Azure

Frameworks/Tools: Nagios, SonarQube, Nexus, Jenkins, Coverity

Versioning Tools: Git, GitHub, Bitbucket, Micro Focus AccuRev

Infrastructure as Code: Ansible, Terraform, CloudFormation

Application/Web Servers: Tomcat, Nginx

Operating Systems: Windows, Linux, macOS

Databases: MySQL, PostgreSQL, MongoDB

Technologies: Machine learning, AI forecasting

Programming Languages: Python 2.7, Java

Scripting & Other Tools: Shell scripts (ksh, Bash), Git Bash, Python

Containerization Tools: Docker, ECS, Kubernetes

Task/Bug Management Tools: Atlassian Jira, IBM ClearQuest, AgileCraft

AWS Tools: S3, Auto Scaling, Load Balancers, DynamoDB, IAM, AMI, EC2, Route 53, VPC, CloudFormation, Device Farm, EKS, EBS, WAF

Certifications

AWS Certified Cloud Practitioner

AWS Certified Developer - Associate

AWS Certified Solutions Architect - Associate

AWS Certified Solutions Architect - Professional

AWS Certified Security - Specialty

AWS Certified SysOps Administrator - Associate

AWS Certified DevOps Engineer - Professional

Microsoft Certified - Azure Fundamentals

Microsoft Certified - Azure Administrator

Professional Experience

United States Navy Reserve June 2017 – Present

Information Systems Technician

Active Security Clearance

JHC Technology, MD Nov 2019 – Present

Senior Cloud SME III

Project Description

Part of an 11-member team tasked with deploying cloud solutions for a government agency and resellers, building a multi-cloud platform for a Drupal migration from on-premises to the AWS cloud.

Responsibilities

Deploy Active Directory Federation Services (AD FS) and integrate it with AWS Managed Microsoft AD

Configure trusted devices for Amazon WorkSpaces to restrict access to users with the root CA on local laptops

Integrate AD FS with AWS SSO

Configure cross-account S3 assume roles to download objects from S3 buckets in other GovCloud accounts

Containerize legacy Windows applications using the AWS containerization tool App2Container

Design a multi-account platform for the Drupal migration

Create Terraform and CloudFormation templates for AWS infrastructure

Create CloudTrail integration across the multi-account platform created for Drupal

Deploy a high-availability GitLab with multi-region failover

Configure AWS Single Sign-On for all AWS accounts to reduce the hassle of managing multiple sign-in credentials across account platforms

Create Amazon WorkSpaces for developers and other users to access the AWS platform

Manage O&M requests on the Jira board

Triage requests on the Jira board to determine which team member should work each ticket

Create S3 buckets and enable CloudTrail in other accounts to send their logs to them

Create cross-account IAM policies

Create AWS KMS keys to encrypt S3 logs

Enable CloudTrail, GuardDuty, and AWS Config for all accounts

Create Terraform templates to deploy load balancers and Auto Scaling groups

Create Terraform templates to deploy VPCs in different accounts

Manage users, servers, and permissions in Active Directory

Troubleshoot build failures and resolve all errors found
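
The CloudFormation templating above can be sketched in miniature. This illustrative Python snippet renders a template for a KMS-encrypted S3 bucket of the kind used to centralize CloudTrail logs; the bucket name and key ARN are placeholders, not the project's actual resources:

```python
import json

def cloudtrail_log_bucket_template(bucket_name, kms_key_arn):
    """Render a minimal CloudFormation template (as a dict) for an S3
    bucket that stores CloudTrail logs encrypted with a KMS key.
    Names and the key ARN here are hypothetical."""
    return {
        "AWSTemplateFormatVersion": "2010-09-09",
        "Resources": {
            "TrailLogBucket": {
                "Type": "AWS::S3::Bucket",
                "Properties": {
                    "BucketName": bucket_name,
                    "BucketEncryption": {
                        "ServerSideEncryptionConfiguration": [{
                            "ServerSideEncryptionByDefault": {
                                "SSEAlgorithm": "aws:kms",
                                "KMSMasterKeyID": kms_key_arn,
                            }
                        }]
                    },
                }
            }
        },
    }

template = cloudtrail_log_bucket_template(
    "org-cloudtrail-logs",  # hypothetical bucket name
    "arn:aws:kms:us-gov-west-1:111122223333:key/example",  # placeholder ARN
)
print(json.dumps(template, indent=2))
```

In practice a template like this would also carry a bucket policy granting CloudTrail write access from each member account; that part is omitted for brevity.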

Xome Labs, Lewisville, TX June 2019 – Present

Senior Cloud Infrastructure Engineer

Project Description

Part of a 4-member AWS infrastructure DevOps engineering team, providing DevOps thought leadership and mentoring in both advisory and delivery contexts, focusing on the requirements of our clients' customers and how these are best served by continuous improvements to our delivery approach.

Responsibilities:

Design a multi-account platform for the Drupal CMS

Provide DevOps thought leadership and mentoring in both advisory and delivery contexts

Architect a project's or program's deployment pipeline, with a set of technical and business assurance activities that support the transition of application and infrastructure services from development through to production

Assist in directing how a system is designed and architected

Actively manage, improve, and monitor cloud infrastructure on AWS (EC2, S3, and RDS), including backups, patches, and scaling

Design CloudEndure backup and restore as a disaster recovery mechanism for Drupal

Configure AWS SSO to link all accounts in the multi-account design for easy sign-in

Create CloudFormation templates for VPC, EC2, CodeBuild, CodePipeline, CodeCommit, Systems Manager, and IAM

Create load balancers and Auto Scaling groups

Focus on efficient use of CI/CD processes and tools for the AWS cloud migration, with special attention to characteristics like reuse, scalability, resilience, and performance of the migration solutions

Manage and create development and test environments on the AWS cloud, including creating application servers, provisioning databases, and migrating existing applications

Set up EFS mount points for the application

Infosys Consulting, Richardson, TX March 2019 – May 2019

DevOps AWS Architect

Project Description:

Part of a 6-member infrastructure team providing AWS consulting for AT&T. My role as a senior AWS infrastructure engineer was to help build and deploy applications using CloudFormation, streamlining the deployment process, planning with the dev team, and training and coaching on good practices, using tools such as S3, DynamoDB, VPC, CloudFormation, Packer, Stacker, Route 53, IAM, and AMI.

Responsibilities:

Built a custom VPC with multiple subnets in different Availability Zones for high availability of our application

Set up AWS Glue, SageMaker, and Athena so developers could deliver a POC for data lake ingestion

Created scaling policies for Auto Scaling groups

Set up AWS WAF for the application

Set up Device Farm for the mobile testing team, for both Android and iOS applications

Created an S3 bucket for dumping OpenDNS logs from Splunk

Set up RightScale for monitoring our cloud environment

Updated our CloudFormation templates before deployment

Set up a load balancer to balance application traffic during periods of high load

Set up Auto Scaling groups with different scaling policies to scale up and down as CPU usage changes in our application

Set up a demo environment for the dev team for a DataOps POC using AWS Glue, Lake Formation, etc.

Set permissions for users using IAM and created custom policies

Set up AWS EKS for a POC so the dev team could start using Docker for their applications and leveraging Kubernetes

Created a tagging document for all our AWS accounts so we could use RightScale to monitor cost

Created AWS Config rules to put guardrails in place and limit the resources created by developers

Created custom nodes for EKS and connected the nodes to the cluster

Set up an AWS EKS cluster and created IAM roles for EKS
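
A tagging standard like the one in the tagging document is straightforward to enforce mechanically. This sketch flags resources missing required cost-allocation tags; the tag keys and the inventory are illustrative assumptions, not the actual standard:

```python
REQUIRED_TAGS = {"CostCenter", "Environment", "Owner"}  # hypothetical required keys

def missing_tags(resource_tags):
    """Return the set of required cost-allocation tags a resource lacks."""
    return REQUIRED_TAGS - set(resource_tags)

# Illustrative inventory, e.g. pulled from a tagging or cost report.
resources = {
    "i-0abc": {"CostCenter": "123", "Environment": "prod", "Owner": "devops"},
    "i-0def": {"Environment": "dev"},
}
for rid, tags in resources.items():
    gaps = missing_tags(tags)
    if gaps:
        print(f"{rid} is missing tags: {sorted(gaps)}")
```

A check like this can run as a scheduled job or be expressed as an AWS Config required-tags rule; either way, the report drives the cost-monitoring view.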

Infosys Consulting, Richardson, TX Jan 2019 – March 2019

DevOps Consultant (lead)

Project Description:

Part of a 20-member DevOps team, both offshore and onshore. Our role was to build and improve the CI/CD pipeline for a machine learning platform and AI forecasting for predictions and training on data, using tools like Kubernetes, Docker, Docker central, Jenkins, AgileCraft, and MongoDB.

Responsibilities:

Built the CI/CD pipeline for the machine learning platform and AI forecasting: Git commit, build, and deploy

Built a new image from the develop or hotfix branch, tagged with the commit ID and an environment "latest" tag

Tagged and pushed the image to Docker central

Deployed tagged images to Kubernetes clusters

Created an approval workflow to take user input on the deployment environment: dev, QA, or prod

Made changes to Python projects to use pybuilder cdo as a dependency to generate tags, with the capability of adding more params before deployment to CDP

For Java projects, used the Docker Maven plugin for tag creation and pushing to Docker central

Set up a code review workflow with managers and the dev team

Named downloaded K8s images by domain and commit hash, e.g. domain-management-ABCD

Updated Jenkinsfiles for both Java and Python microservices

Installed Helm, the package manager for Kubernetes, in the dev environment for testing

Researched options for continuous monitoring tools for our production and dev servers

Collaborated with the dev team and production support on deployment readiness of software versions and updates

Integrated a code quality tool into the deployment process for code quality checks

Set sprint goals and made sure the team met them at the end of each sprint

Set up a Git branching strategy and workflow
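
The tagging step of the pipeline above can be sketched in a few lines. This hypothetical helper builds the two tags described (one pinned to the commit ID, one floating environment "latest"); the registry host and service name are illustrative, not the project's actual conventions:

```python
def image_tags(service, commit_id, environment,
               registry="docker-central.example.com"):
    """Build the two image tags used in the pipeline sketch: one pinned
    to the (shortened) commit ID and one floating '<env>-latest' tag.
    The registry host is a placeholder."""
    base = f"{registry}/{service}"
    return [f"{base}:{commit_id[:7]}", f"{base}:{environment}-latest"]

print(image_tags("forecasting-api", "9f2c1e7d4b", "dev"))
```

In a Jenkinsfile, the same two tags would feed the docker tag/push stages before the Kubernetes deployment stage.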

Abbott Laboratories, Plano, TX July 2018 – Jan 2019

DevOps Consultant

Project Description:

Part of a 3-member DevOps consultant team. Our role was to evaluate the current processes at Abbott Laboratories and recommend ways to improve the system using modern tools like Bitbucket, Jenkins, Nexus, SonarQube, Git, Jira, Coverity, Nginx, Ranorex, and Visual Studio; to maintain and improve the current system while planning upgrades; to train the developers on how the newly installed tools are used; and to set up a new development process.

Responsibilities:

Defined the criteria for a new set of infrastructure

Configured an Nginx reverse proxy to use CNAMEs and secured ports, and deployed SSL certificates on prod/dev to use secure ports for authentication

Evaluated the legacy application against a set of criteria using modern DevOps tools

Migrated the change management tool from IBM ClearQuest to Atlassian Jira

Defined Jira workflows for the CCB and STB teams

Integrated Jira with Jenkins to trigger builds from Jira tickets

Integrated Bitbucket and Jira to have one UI and link Jira tickets to Bitbucket branches

Integrated Sonatype Nexus with Jenkins for automatic deployment of build artifacts

Integrated SonarScanner for MSBuild to run analysis on Jenkins builds

Integrated Visual Studio for Windows with Bitbucket to allow developers to push their code to Bitbucket

Migrated from the legacy tool Micro Focus AccuRev to the modern tool Git

Automated the deployment of iOS applications

Set up the Bitbucket server and defined merge requirements

Set up Bitbucket code review and pull request functionality before merging to the master branch

Created a Micro Focus AccuRev to Bitbucket migration plan

Created a Micro Focus AccuRev to Bitbucket migration testing plan

Ran weekly legacy application builds while migration planning continued

Migrated the legacy build tool IBM Build Forge to a modern DevOps build tool, Jenkins

Set up a Git branching strategy and workflow

Dry-ran the AccuRev to Bitbucket migration testing plan

Updated Python build scripts to use Git commands to run build jobs in Jenkins

Created a staging/production environment implementation plan

Created documentation to support adding additional build agents to the production environment

Created Jira, Jenkins, Nexus, Bitbucket, and SonarQube maintenance documents

Designed the dev and prod environment architecture

Investigated the Bitbucket API

Set sprint goals and made sure the team met them at the end of each sprint

Installed the DevOps tools proposed by the team on the prod/dev environments

Conducted Git, Jira, Jenkins, and Bitbucket training for the development team on how the tools work

Defined an improved workflow strategy around the legacy tools during the migration

Set up a code review workflow with managers and the dev team

Set up quality gates in SonarQube to fail the build if the gates are not passed

Set up a Jenkins pipeline using the Blue Ocean plugin to include Coverity analysis

Set up a Jenkins pipeline to include automatic deployment of IPAs to iPad and iPod devices
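
The quality-gate behavior configured in SonarQube can be mimicked in a few lines to show the idea: evaluate metrics against thresholds and fail the build when a gate is not met. The metric names and thresholds below are assumptions for illustration, not the actual gates:

```python
GATES = {  # hypothetical thresholds, not the project's real gates
    "coverage": lambda v: v >= 80.0,          # minimum percent line coverage
    "new_blocker_issues": lambda v: v == 0,   # no new blocker-severity issues
}

def gate_failures(metrics):
    """Return the names of quality gates the given metrics fail."""
    return [name for name, passes in GATES.items()
            if name in metrics and not passes(metrics[name])]

failed = gate_failures({"coverage": 72.5, "new_blocker_issues": 0})
if failed:
    # A real pipeline step would exit non-zero here to fail the build.
    print("Quality gate failed:", failed)
```

SonarQube itself exposes the equivalent verdict through its quality-gate status API, which a Jenkins stage can poll before allowing the pipeline to proceed.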

Bank of America, Dallas/Fort Worth, TX October 2015 – June 2018

DevOps Engineer

Project Description:

Part of a seven-member DevOps team. Our role was to migrate legacy applications to the AWS cloud using DevOps tools like GitHub, Jenkins, Nexus, Docker, SonarQube, Git, and Tomcat.

Responsibilities:

Set up Git repositories and assigned SSH keys to my team

Worked on Jenkins, adding the necessary plugins and adding more agents to support scalability and agility

Improved system performance with continuous monitoring tools and resolved day-to-day issues

Automated the deployment of Java applications

Set up CI/CD pipelines for microservices on AWS using app services

Created Dockerfiles and automated Docker image creation using ECS and Docker

Automated infrastructure provisioning of EC2 on AWS using Terraform and Ansible

Automated deployment of webapps to Tomcat

Monitored deployments for rollback

Created nightly builds with integration to code quality tools such as SonarQube

Created quality gates in the SonarQube dashboard and enforced them in the pipelines to fail builds when conditions were not met

Worked on integrating Git into the continuous integration (CI) environment along with Jenkins

Managed and mentored both onsite and offshore teams

Enforced test-driven development for the dev teams every sprint

Built and deployed Docker containers to break a monolithic app into microservices, improving developer workflow, increasing scalability, and optimizing speed

Defined an improved workflow strategy around the legacy tools during the migration

Set up a code review workflow with managers and the dev team

Set up quality gates in SonarQube to fail the build if the gates are not passed

Defined the criteria for a new set of infrastructure

Set sprint goals and made sure the team met them at the end of each sprint

Implemented Agile Scrum methodology to improve SDLC time

Came up with a DevOps topology recommendation for the dev/prod teams

Generated Bitbucket release notes in Jenkins

Generated Bitbucket code review reports in Jenkins

Monitored the staging environment for performance bottlenecks

Environment: MySQL, PostgreSQL, Ansible, Terraform, Django, Jenkins, AWS, SonarQube, TDD, Tomcat
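
At its core, the automated Tomcat webapp deployment mentioned above amounts to copying a versioned WAR into Tomcat's webapps directory under the desired context name. This simplified sketch (paths and names are illustrative) demonstrates the idea against temporary directories rather than a real Tomcat install:

```python
import shutil
import tempfile
from pathlib import Path

def deploy_war(war_path, webapps_dir, context="ROOT"):
    """Copy a built WAR into Tomcat's webapps directory under the given
    context name; Tomcat's auto-deploy then expands and serves it.
    Paths here are illustrative."""
    target = Path(webapps_dir) / f"{context}.war"
    shutil.copyfile(war_path, target)
    return target

# Demonstrate with temp directories standing in for the build output
# and the Tomcat webapps directory.
with tempfile.TemporaryDirectory() as tmp:
    war = Path(tmp) / "app-1.2.3.war"
    war.write_bytes(b"\x50\x4b")  # placeholder bytes standing in for a real WAR
    webapps = Path(tmp) / "webapps"
    webapps.mkdir()
    deployed = deploy_war(war, webapps, context="myapp")
    print("deployed:", deployed.name)
```

A production version of this step would typically also stop/start the Tomcat service or use its manager API, and keep the previous WAR around to support the rollback monitoring described above.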

NSG Groups, Dubai, UAE / Oman March 2014 – June 2015

DevOps / Build & Release Engineer

As a member of the Foundry team, my responsibility was to implement DevOps transformation for agile teams.

Responsibilities:

Performed code migration from TFVC to Git using the TFS-Git utility

Set up VSTS CI/CD pipelines for microservices and deployed them to the Azure cloud using App Services

Created nightly builds with integration to code quality tools such as SonarQube

Created quality gates in the SonarQube dashboard and enforced them in the pipelines to fail builds when conditions were not met

Set up SonarLint and Codi scope plug-ins on developers' workstations

Maintained production applications and managed SLAs and metric performance

Implemented a release management workflow for the QA, UAT, and Prod environments

Collaborated with sysadmins and DBAs to identify deployment issues for component-based applications

Environment: Git, Azure, VSTS, Docker, Nagios, Oracle 12c, Java, Windows Servers, JIRA

Fosad Consulting, Lagos, Nigeria Jan 2012 – March 2014

Quality Automation Engineer

Responsibilities

Developed performance testing practices and procedures to identify application bottlenecks and provide suggestions to development teams

Prepared and executed LoadRunner test scripts using a modularized load testing framework

Debugged the development and execution of test scripts against the test targets

Analyzed client needs and developed unique solutions, choosing an approach or procedure for addressing a work task from multiple and varied options

Analyzed non-functional requirements to identify and prioritize test targets; verified that requirements were testable

Created test data and developed test data requests for each test script

Recorded time in the time reporting system (EPM)

Worked with performance testing tools including JMeter, HP LoadRunner, and Performance Center, with capabilities in analyzing performance test results; also used HP LoadRunner and HP Performance Center components: VuGen, HP Diagnostics, Controller, Analysis, Correlation Libraries, and CA Wily Introscope


