AWS Engineer

Location:
Owings Mills, MD
Salary:
70
Posted:
July 26, 2020


VIJAY KUMAR

Sr. AWS DevOps Engineer

adevfa@r.postjobfree.com

+1 (443) 377-3169

AWS Cloud Engineer | Sr. DevSecOps Engineer

DevOps Automation Engineer | Build & Release Engineer

Career Highlights:

Certified AWS Solutions Architect Professional with over 7 years of IT experience and expertise in DevOps, cloud engineering, and UNIX/Linux administration.

Extensive experience in Amazon Web Services (AWS) Cloud services such as EC2, VPC, S3, IAM, EBS, RDS, ELB, Route 53, DynamoDB, Lambda, GuardDuty, Config, Macie, Service Catalog, CloudFormation, Auto Scaling, CloudFront, CloudTrail, CloudWatch, Elasticsearch, Elastic File System (EFS), Elastic Beanstalk, AWS SNS, AWS SQS, AWS SES, AWS SWF, and AWS Direct Connect.

Having experience in automating infrastructure provisioning using Terraform on AWS.

Having hands on experience in creating jobs and pipeline jobs in Jenkins.

Experience designing, building, and operating virtualized solutions on private, hybrid, and public cloud technologies.

Knowledge of High Availability (HA) and Disaster Recovery (DR) options in AWS.

Experience migrating production infrastructure into Amazon Web Services using AWS CloudFormation.

Exposed to all aspects of the Software Development Life Cycle (SDLC), including analysis, planning, development, testing, implementation, and post-production analysis, and to methodologies such as Agile and Scrum.

Experience in configuring Docker containers for branching workflows and deploying them using Elastic Beanstalk.

Experience in designing, installing, and implementing Ansible configuration management for managing web applications, environment configuration files, users, mount points, and packages.

Extensively worked on Jenkins and Bamboo, installing, configuring, and maintaining them for Continuous Integration (CI) and end-to-end automation of all builds and deployments, including CI/CD for databases using Jenkins.

Experience managing deployment configuration, administration, upgrades, security, and maintenance of web and application platforms.

Experience using build utilities like Maven to build JAR, WAR, and EAR files.

Experience in using version control tools Git, GitHub, and Bitbucket.

Experience in developing cloud solutions using IaaS, SaaS and PaaS.

Strong in building object-oriented Python applications and shell scripts on UNIX/Linux.

Expertise in application builds, deployment, smoke testing, and release promotion for complex applications and infrastructure.

Performed several types of testing like smoke, functional, system integration, white box, black box, gray box, positive, negative and regression testing.

Integrated AWS S3 logs with the Athena service and exported the data in CSV format by leveraging Lambda with Python code.

Experience in installing and configuring web and application servers (Tomcat, JBoss, WebLogic, and Nginx) for application deployments on Linux, UNIX, and Windows.

Wrote AWS Lambda functions in Python that invoke scripts to perform various transformations and analytics on large data sets in EMR clusters.

Developed and ran UNIX shell scripts and implemented an automated deployment process.

Experience in administration, installation, configuration, support, and maintenance of Linux.

Experience in implementing hybrid cloud solutions with Direct Connect/VPN and Active Directory.

Working knowledge in Virtualization Technologies vSphere, VMware, Virtual Box and Hyper-V.

Created Step Functions to handle nested or parallel executions of multiple AWS Lambda functions.

Leveraged AWS SDKs to interact with AWS services from applications.

Managed containers using Docker by writing Dockerfiles, set up automated builds on Docker Hub, and installed and configured Kubernetes.

Experienced in deployment of applications on Apache Webserver, Nginx and Application Servers such as Tomcat, JBoss.

Involved in the development of the UI using JSP, HTML5, CSS3, JavaScript, jQuery. Worked on JavaScript framework to augment browser-based applications with MVC capability.

EDUCATION:

Bachelor’s degree in Electronics and Communication Engineering, 3.58 GPA – JNTUK, India

Technical Skills:

Operating Systems: RHEL/OEL/CentOS Linux 5/6/7

Cloud Technology: AWS (EC2, ELB, VPC, RDS, IAM, CloudFormation, S3, CloudWatch, Lambda, Service Catalog, Config, EFS, X-Ray, ECS, EKS, Step Functions, SNS, SQS, DynamoDB), Azure, GCP

Virtualization Technologies: vSphere, VMware Workstation, Oracle VirtualBox, Hyper-V

Scripting Languages: Shell scripting, Python, Bash, JavaScript, Groovy, Ruby

Container Tools: Docker, ECS, Kubernetes, EKS

Configuration Management Tools: Chef, Ansible

Versioning Tools: GitHub, GitLab, Bitbucket

CI Tools: Jenkins, Bamboo

Build Tools: Maven

Bug Tracking Tools: JIRA, ServiceNow

Web/Application Servers: Apache Tomcat, JBoss, Nginx

Databases: MySQL, DynamoDB, RDS

Monitoring Tools: Amazon CloudWatch, Nagios, Splunk

Networking/Protocols: VPC, subnets, VPN

Repositories: Git, Artifactory, Nexus

Networking: REST API, routing protocols, subnets, VPN

Work Experience:

AWS DEVOPS ENGINEER

EMC Ins – Des Moines, IA Feb 2020 – Present

Responsibilities:

•Maintaining the CI/CD pipeline with Bamboo in a Docker container environment, utilizing Docker Swarm and Docker as the runtime environment for the CI/CD system to build, test, and deploy to DEV, UAT, and PROD environments.

•Set up the ETL process by reading DDL statements from S3 buckets, validating the queries on the Athena platform, and loading data into DynamoDB tables.

•Automated tracking of succeeded and failed Athena queries by writing Python code in a Lambda function.
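
(A minimal illustrative sketch of this pattern, not taken from any actual project: the table name and event shape below are hypothetical, and Python 3 with boto3 is assumed.)

    # Sketch only: table name and incoming event shape are hypothetical.
    import boto3

    athena = boto3.client("athena")
    status_table = boto3.resource("dynamodb").Table("athena-query-status")  # hypothetical table

    def lambda_handler(event, context):
        # Assume the caller passes the Athena query execution id in the event.
        query_id = event["queryExecutionId"]
        state = athena.get_query_execution(QueryExecutionId=query_id)[
            "QueryExecution"]["Status"]["State"]  # e.g. SUCCEEDED or FAILED
        # Record the outcome so failed queries can be reviewed later.
        status_table.put_item(Item={"queryExecutionId": query_id, "state": state})
        return {"queryExecutionId": query_id, "state": state}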

•Experience in writing SAM templates to deploy serverless applications on the AWS cloud.

•Hands-on experience on working with AWS services like Lambda function, Athena, DynamoDB, Step functions, SNS, SQS, S3, IAM etc.

•Developing and maintaining applications written for AWS S3, Lambda, AWS DynamoDB, AWS SQS, AWS SNS and Amazon Serverless Application Model.

•Experienced with event-driven and scheduled AWS Lambda functions to trigger various AWS resources.

•Integrated SQS and DynamoDB with Step Functions to iterate through lists of messages and update the status in a DynamoDB table.

•Implemented different state machines using nearly all available Step Functions states, such as Map, Parallel, Choice, and Task.
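
(An illustrative sketch of a small state machine of this kind, defined in Python and registered with boto3; the Lambda ARNs, role ARN, and state machine name are placeholders, not real resources.)

    # Sketch only: ARNs and names below are placeholders.
    import json
    import boto3

    definition = {
        "StartAt": "ProcessInParallel",
        "States": {
            "ProcessInParallel": {
                "Type": "Parallel",
                "Branches": [
                    {"StartAt": "TaskA", "States": {"TaskA": {
                        "Type": "Task",
                        "Resource": "arn:aws:lambda:us-east-1:123456789012:function:task-a",
                        "End": True}}},
                    {"StartAt": "TaskB", "States": {"TaskB": {
                        "Type": "Task",
                        "Resource": "arn:aws:lambda:us-east-1:123456789012:function:task-b",
                        "End": True}}},
                ],
                "Next": "CheckResult",
            },
            "CheckResult": {
                "Type": "Choice",
                "Choices": [{"Variable": "$[0].status", "StringEquals": "OK", "Next": "Done"}],
                "Default": "Failed",
            },
            "Done": {"Type": "Succeed"},
            "Failed": {"Type": "Fail"},
        },
    }

    sfn = boto3.client("stepfunctions")
    sfn.create_state_machine(
        name="sample-order-workflow",                       # placeholder name
        definition=json.dumps(definition),
        roleArn="arn:aws:iam::123456789012:role/sfn-role",  # placeholder role
    )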

•Involved in Upgrade of Bamboo & Artifactory Server by scheduling backups in S3.

•Managed the code repository by maintaining code in Bitbucket and improved branching and code-merge practices to suit the development team’s needs.

•Worked on Docker container snapshots, attaching to a running container, removing images, managing Directory structures, and managing containers.

•Used Jenkins pipelines to drive all microservice builds out to the Docker registry, then deployed to Kubernetes; created and managed Pods using Kubernetes.

•Experience in maintaining Atlassian products like JIRA, Confluence, Bamboo etc. Configuring and managing AWS Simple Notification Service (SNS) and Simple Queue Service (SQS).

•Installation, Configuration and Management of RDBMS and NoSQL tools such as Dynamo DB.

•Created a Lambda deployment function and configured it to receive events from an S3 bucket.
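
(A minimal sketch of such an S3-triggered Lambda handler; the processing step is a placeholder, and the bucket/notification wiring is configured separately in S3 event settings.)

    # Sketch only: the real deployment logic would replace the print below.
    import urllib.parse

    def lambda_handler(event, context):
        # S3 event notifications deliver one or more records per invocation.
        for record in event.get("Records", []):
            bucket = record["s3"]["bucket"]["name"]
            key = urllib.parse.unquote_plus(record["s3"]["object"]["key"])
            print(f"Received object s3://{bucket}/{key}")  # placeholder processing step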

•Configured S3 buckets with various life cycle policies to archive the infrequently accessed data to storage classes based on requirement.

•Hands-on experience on Container Orchestration like AWS ECS, Kubernetes.

•Attach a resource policy to the Code Commit repository that denies members of the IAM developer group the actions of pushing commits, merging pull requests, and adding files to the master branch.

•Integrated services like Bitbucket, AWS Code Pipeline, Bamboo and AWS Elastic Beanstalk to create a deployment pipeline.

•Automated regular tasks using Python code and leveraged Lambda function wherever required.

•Configure the repository to generate an Amazon CloudWatch Events event upon changes.

•Integrated AWS S3 logs with the Athena service and exported the data in CSV format by leveraging Lambda with Python code.

•Working with the Data Engineering team to analyze the data in S3 buckets using AWS Athena and understand data patterns.

•Working on tools like Bamboo for building the job and putting the latest files for deployment in AWS S3 bucket.

•Configuring AWS SNS to publish messages that trigger AWS Lambda functions.

•Define and document best practices and strategies regarding application deployment and infrastructure maintenance.

•Implemented the application’s CI/CD pipeline using Bitbucket, Bamboo, and Lambda.

•Implemented Step Functions to trigger a series of Lambdas executing different functionalities.

•Automated application builds and deployments, with no human intervention during the build and deployment of the application.

Environment/Tools: EC2, Elastic Load Balancing, ECS, EKS, CloudFormation, CloudWatch, Route 53, Redshift, Lambda, DynamoDB, Terraform, Jira, PowerShell, Bitbucket, Bamboo, Apache Mesos, X-Ray, VPC, CloudTrail, IAM, S3, Elasticsearch, SNS and SQS.

CLOUD DEVOPS ENGINEER

Jefferies LLC. – Jersey City, NJ June 2019 – Jan 2020

Responsibilities:

•New AWS Accounts setup from scratch, enabling IAM Users, Roles, Policies as per the Organization Security Standard.

•Built CloudFormation templates for SNS, SQS, Elasticsearch, DynamoDB, Lambda, EC2, VPC, RDS, S3, IAM, CloudWatch services implementation and integrated with Service Catalog.

•Proficient in AWS services like VPC, EC2, S3, ELB, Autoscaling Groups (ASG), EBS, RDS, IAM, CloudFormation, Route 53, CloudWatch, CloudFront, CloudTrail.

•Implemented Config-Aggregator to enhance the compliance across the accounts and centralized management.

•Built an AWS Lambda function in Python to monitor the creation of new resources that did not meet organization standards, and enabled notifications and appropriate remediation actions.

•Experienced in creating multiple VPCs with public and private subnets as per requirements and distributing them across the availability zones of the VPC.

•Developed unit and functional tests in Python and managed the code migration from TFS, CVS, and StarTeam to a Bitbucket repository.

•Created NAT gateways and instances to allow communication from the private instances to the internet through bastion hosts.

•Used security groups, network ACLs, internet gateways, and route tables to ensure a secure zone for the organization in the AWS public cloud.

• Consulted with stakeholders to gather and document requirements for data governance projects, helping to establish agreed-upon data definitions and consistent data capture across the company.

• Critically evaluated information gathered from multiple sources and worked with customers to assess whether data conformed to data governance approved mappings and standards.

•Implemented a serverless architecture using API Gateway, Lambda, and DynamoDB, and deployed AWS Lambda code from Amazon S3 buckets. Created a Lambda deployment function and configured it to receive events from an S3 bucket.

•Implemented a centralized logging system using Logstash configured as an ELK stack (Elasticsearch, Logstash, and Kibana) to monitor system logs, AWS CloudWatch, VPC Flow Logs, CloudTrail events, changes in S3, etc.

•Wrote one-click deployments using the Serverless (SLS) framework and AWS CDK (backed by CloudFormation, implemented with the Python SDK).

•Hands-on experience on Container Orchestration like AWS ECS, Kubernetes.

•Implemented AWS cost budget notifications per environment using Python with Lambda functions and the SNS notification service.
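
(A hedged sketch of how such a per-environment cost check might publish to SNS; the topic ARN, tag key, threshold, and dates are hypothetical, and Cost Explorer access via boto3 is assumed.)

    # Sketch only: threshold, tag key, dates, and topic ARN are placeholders.
    import boto3

    ce = boto3.client("ce")        # Cost Explorer
    sns = boto3.client("sns")
    TOPIC_ARN = "arn:aws:sns:us-east-1:123456789012:cost-alerts"  # placeholder
    THRESHOLD = 500.0              # placeholder monthly budget in USD

    def lambda_handler(event, context):
        usage = ce.get_cost_and_usage(
            TimePeriod={"Start": "2020-07-01", "End": "2020-08-01"},  # placeholder period
            Granularity="MONTHLY",
            Metrics=["UnblendedCost"],
            GroupBy=[{"Type": "TAG", "Key": "environment"}],          # assumes an 'environment' tag
        )
        for group in usage["ResultsByTime"][0]["Groups"]:
            amount = float(group["Metrics"]["UnblendedCost"]["Amount"])
            if amount > THRESHOLD:
                sns.publish(
                    TopicArn=TOPIC_ARN,
                    Subject="Cost budget exceeded",
                    Message=f"{group['Keys'][0]} spent {amount:.2f} USD this month",
                )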

•Experience in setting up Baselines, Branching, Merging and Automation Processes using Shell, Ruby, and PowerShell scripts.

•Experienced with Kubernetes; created and maintained several production-grade EKS clusters.

•Experienced with event-driven and scheduled AWS Lambda functions to trigger various AWS resources.

•Wrote Ansible playbooks with Python SSH as the wrapper to manage configurations of AWS nodes and tested playbooks on AWS instances using Python. Experience with Ansible Tower to more easily manage enterprise Ansible deployments.

• Configure the pipeline to periodically check the repository. Start the pipeline when changes are detected.

• Configure the repository to generate an Amazon CloudWatch Events event upon changes.

• Configure the pipeline to start in response to the event.

• Created and configured Elastic Load Balancers and Auto Scaling groups to distribute the traffic and to provide a cost-efficient, fault-tolerant, and highly available environment.

• Wrote Ansible playbooks to launch AWS instances and used Ansible to manage web applications, configuration files, mount points, and packages.

• Attach an AWS IAM policy to the developer IAM group that denies the actions of pushing commits, merging pull requests, and adding files to the master branch.

• Attach a resource policy to the CodeCommit repository that denies members of the IAM developer group the actions of pushing commits, merging pull requests, and adding files to the master branch.

• Set up an AWS Lambda function that runs every 15 minutes to check for repository changes and publishes a notification to an Amazon SNS topic.

• Developed and maintained Python, shell, and PowerShell scripts for build, release, and task automation.

• Integrated services like Bitbucket, AWS CodePipeline, and AWS Elastic Beanstalk to create a deployment pipeline.

• Created S3 buckets in the AWS environment to store files, some of which are required to serve static content for a web application.

• Configured S3 buckets with various lifecycle policies to archive infrequently accessed data to storage classes based on requirements.

• Experienced with Ansible playbooks for virtual and physical instance provisioning, configuration management, patching, and software deployment.

• Implemented Ansible to manage all existing servers and automated build/configuration of new servers.

• Possess good knowledge in creating and launching EC2 instances using AMIs of Linux, Ubuntu, RHEL, and Windows, and wrote shell scripts to bootstrap instances.

• Used IAM to create roles, users, and groups and implemented MFA to provide additional security to the AWS account and its resources. Used AWS ECS and EKS for Docker image storage and deployment.

• Used Bamboo pipelines to drive all microservice builds out to the Docker registry, then deployed to Kubernetes; created and managed Pods using Kubernetes.

• Designed an ELK system to monitor and search enterprise alerts. Installed, configured, and managed the ELK stack for log management within EC2/Elastic Load Balancer for Elasticsearch.

• Wrote CloudFormation templates in JSON to create custom VPCs, subnets, and NAT gateways to ensure successful deployment of web applications.

• Implemented Domain Name Service (DNS) through Route 53 to have highly available and scalable applications.

• Maintained the monitoring and alerting of production and corporate servers using Cloud Watch service.

• Created EBS volumes to store application files for use with EC2 instances, mounting them to the instances as needed.

• Experienced in creating RDS instances to serve data through servers for responding to requests.

• Created snapshots to take backups of the volumes and also images to store launch configurations of the EC2 instances.

• Wrote templates for AWS infrastructure as code using Terraform to build staging and production environments.

• Automated regular tasks using Python code and leveraged Lambda function wherever required.

• Implemented Amazon Macie, GuardDuty, centralized CloudTrail, centralized AWS Config, and RedLock integration.

•Deployed Docker containers on AWS ECS.

• Involved in designing and deploying a multitude of applications utilizing almost all of the AWS stack (including EC2, Route 53, S3, RDS, VPC, ELB, CloudWatch, IAM, Lambda, SNS, CloudTrail, ECS, EBS), focusing on high availability, fault tolerance, and auto-scaling.

• Knowledge of containerization management and setup tools Kubernetes and ECS.

• Implemented WAF & Shield across all the AWS environments to keep the environments safe and secure.

Environment/Tools: EC2, Elastic Load Balancing, ECS, EKS, CloudFront, PowerShell, CloudFormation, ElastiCache, CloudWatch, Route 53, Redshift, Lambda, DynamoDB, RDS, Terraform, Jira, Ansible, Bash scripts, Bitbucket, Service Catalog, X-Ray, GuardDuty, Config, Macie, VPC, CloudTrail, IAM, RedLock, Organizations, WAF, S3, Elasticsearch, SNS, SQS and SES.

DEVOPS ENGINEER

eBay Inc. – San Jose, CA March 2018 – May 2019

Responsibilities:

•Implemented Management tools like Artifactory, Jenkins, Terraform and other infra on AWS.

•Used the services EC2, EBS, S3, IAM, VPC, EFS, ELB, CloudWatch.

•Implemented VMware ESX server to provide multiple virtual hardware platforms while keeping hardware costs and energy consumption down.

•Designed and setup CI/CD pipeline to deploy containerized applications in the cloud.

•Involved in installing Jenkins on Linux environment and implemented a Master and Slave configuration to run multiple build operations in parallel.

•Experience in setting up Baselines, Branching, Merging and Automation Processes using Shell, Ruby, and PowerShell scripts.

•Deployed and monitored microservices using Pivotal Cloud Foundry and managed domains and routes with Cloud Foundry.

•Configure the repository to periodically run an AWS Lambda function. The function should check the repository and start the pipeline when changes are detected.
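
(A rough sketch of such a scheduled check, assuming boto3; the repository, branch, pipeline, and SSM parameter names are placeholders.)

    # Sketch only: resource names are placeholders.
    import boto3

    codecommit = boto3.client("codecommit")
    codepipeline = boto3.client("codepipeline")
    ssm = boto3.client("ssm")
    PARAM = "/sample/last-seen-commit"  # placeholder SSM parameter

    def lambda_handler(event, context):
        latest = codecommit.get_branch(
            repositoryName="sample-repo", branchName="master")["branch"]["commitId"]
        try:
            last_seen = ssm.get_parameter(Name=PARAM)["Parameter"]["Value"]
        except ssm.exceptions.ParameterNotFound:
            last_seen = None
        if latest != last_seen:
            # New commit detected: start the pipeline and remember this commit.
            codepipeline.start_pipeline_execution(name="sample-pipeline")
            ssm.put_parameter(Name=PARAM, Value=latest, Type="String", Overwrite=True)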

•Configure the repository to publish an SNS notification upon changes. Subscribe the pipeline to the Amazon SNS topic.

•Building/Maintaining Docker container clusters managed by Kubernetes on GCP (Google Cloud Platform).

•Worked on Ansible for configuration management and infrastructure automation. Also created inventory in Ansible for automating continuous deployment and wrote playbooks using YAML scripting.

•Installed Docker using Docker Toolbox and worked on creating the Docker containers and Docker consoles for managing the application lifecycle.

•Collaborated in the automation of AWS infrastructure via Terraform, deployed micro services, including provisioning AWS environments using Ansible Playbooks.

•Created Kubernetes clusters with objects like Pods, Deployments, Services, and ConfigMaps; created reproducible builds of the Kubernetes applications; managed Kubernetes manifest files and Helm packages; and implemented Kubernetes to deploy, load balance, scale, and manage Docker containers across multiple namespaces.

•Configured the application to run on the datacenter using Terraform.

•Set up an Amazon CloudWatch Events rule triggered by a Code Commit Repository State Change event for the master branch and add an Amazon SNS topic as a target.

•Configure AWS CloudTrail to send log events to Amazon CloudWatch Logs.

• Define a metric filter to identify repository events. Create a CloudWatch alarm with an Amazon SNS topic as a target.

•Used Minikube to manage local deployments in Kubernetes, created local cluster and deployed application containers.

•Performed integration of code quality analysis techniques using SonarQube, Checkstyle, and FindBugs with CI tools.

•Implemented logging solutions with Elasticsearch, Logstash, and Kibana.

•Utilized Kubernetes for the runtime environment of the CI/CD system to build, test, and deploy.

•Used Nagios for application and hardware resource monitoring and wrote new plugins in Nagios to monitor resources.

•Used IAM to create new accounts, roles and groups which are engaged in enabling Lambda functions for dynamic creation of roles.

Environment/Tools: EC2, Elastic Load Balancing, CloudFront, CloudFormation, ElastiCache, CloudWatch, Route 53, Lambda, Nagios, PowerShell, Terraform, Kubernetes, Docker, Jira, Ansible.

Build and Release Engineer

MAXIMUS – Reston, VA Jan 2017 – Feb 2018

Responsibilities:

• Implemented AWS solutions using EC2, S3, EBS, Elastic Load Balancer, Auto-scaling groups.

• Managed cloud services using AWS CloudFormation, which gave developers and businesses an easy way to create a collection of related AWS resources and provision them in an orderly and predictable fashion.

• Experience in building CI/CD pipeline using Jenkins Multi Branch Pipeline Scripts.

• Created snapshots and Amazon machine images (AMI) of the instances for backup and creating clone instance.

• Built and configured a virtual data center in the Amazon Web Services cloud to support Enterprise Data Warehouse hosting, including a Virtual Private Cloud (VPC), public and private subnets, security groups, route tables, and an Elastic Load Balancer.

• Implemented Continuous Integration using Jenkins and GIT from scratch.

• Dealt with errors in the pom.xml file to obtain appropriate builds using the Maven build tool.

• Developed and scheduled bash shell scripts for various activities (deployed environment verification, running database scripts, file manipulations, etc.).

• Designed Methodologies to troubleshoot based on the issues and documented all the procedures to educate team members.

• Design, install, administer, and optimize hybrid cloud components to ensure business continuity

• Performed test case automation, code analysis, code reviews, continuous integration, and continuous deployment using Team Foundation Server.

• Designed EC2 instance architecture to meet high-availability application architecture and security parameters.

• Connected the continuous integration system (CI/CD) with the Git version control repository and continually built as check-ins came from the developers.

• Focus on continuous integration and continuous deployment (CI/CD) and promote enterprise solution deployment assets to target environments.

• Manage internal and external build, packaging, and release projects

• Maintained source code repositories and build scripts and installation processes.

• Integrated automated builds with the deployment pipeline. Installed Chef server and clients to pick up the build from the Jenkins repository and deploy to target environments (Integration, QA, and Production).

• Implemented rapid provisioning and lifecycle management for Ubuntu Linux using Amazon EC2, Chef, and custom Ruby/Bash scripts.

• Comfortable and flexible with installing, updating and configuring various flavors of UNIX and Windows.

• Defined dependencies and plugins in Maven pom.xml for various activities and integrated Maven with Git to manage and deploy project-related tags.

• Implemented & maintained the branching and build/release strategies utilizing GIT.

Environment: Subversion, Maven, Jenkins, GIT, Chef, Terraform, AWS (EC2, VPC, ELB, S3, CloudWatch and CloudTrail), Shell Scripting, PUT

SYSTEM ENGINEER

Live Code Technologies - Bangalore, India July 2014 – Dec 2015

Responsibilities:

• Performed hardware and software installations, upgrades, and maintenance, patch administration, kernel modification/upgrades, file system management, performance and security analysis, and network configuration/tuning.

• Configured Jenkins to perform builds in the non-production and production environments.

• Implementing build automation in Jenkins using Bash scripting for daily builds.

• Ensuring release to test environments by merging conflict code.

• Experience in managing GIT as source control systems.

• Used Chef for implementing Automated Application Deployment.

• Developed microservices using the Go language and developed corresponding test cases.

• Assisted developers in integrating their code with the mainline.

• Managed Nexus as the artifact repository and dependency management system.

• Created PDF reports using Golang and XML documents to send to all customers at the end of the month, with international language support.

• Implemented Continuous Integration and designed continuous delivery using Jenkins.

• Defined the build and automated testing infrastructure by educating the development and QA teams with the tools and processes.

• Configured and maintained Jenkins to implement the CI/CD process and integrated the tool with ANT and Maven to schedule the builds.

• Automated weekly releases with ANT/Maven scripting for compiling Java code, debugging, and placing builds into the Maven repository.

• Created and deployed a tool to automate branch and project creation in Subversion using Perl and Chef Scripts.

• Hands on experience with Ruby/Rails to deploy production and development stacks.

• Experience in designing and implementing continuous integration pipeline using Agile methodologies.

Environment: LAN/WAN Administration, Systems Installation, Configuration & Upgrading, Linux Servers, Oracle Databases, OS Patches & Updates.

Junior LINUX ADMINISTRATOR

Global Logic – Hyderabad, India Jan 2013 – June 2014

Responsibilities:

•Installation and configuration of Red Hat Enterprise Linux 5/6 systems.

•Involved in building servers using kickstart in RHEL.

•Installation and configuration of RedHat virtual servers using ESXi 4/5.

•Performed package and patches management, firmware upgrades and debugging.

•Addition and configuration of SAN disks for LVM on Linux.

•Configuration and troubleshooting of NAS mounts on Linux Servers.

•Configuration and administration of ASM disks for Oracle RAC servers.

•Analyzing and reviewing the System performance tuning and Network Configurations.

•Managed Logical volumes, Volume Groups, using Logical Volume Manager.

•Performed configuration and troubleshooting of services like NFS, FTP, LDAP and Web servers.

•Involved in patching RedHat servers.

•Worked with NAS and SAN concepts and technologies.

•Configured and maintained Network Multipathing in Linux.

•Configuration of Multipath, EMC power path on Linux Servers.

•Provided production support and 24/7 support on rotation basis.

•Performed a POC on Tableau, which included running load tests and evaluating system performance with large amounts of data.

Environment: RedHat Linux 4/5/6, HP & Dell blade servers, VMware ESX Server


