
Web Services Amazon

Location:
Irving, TX
Posted:
July 19, 2024

Resume:

+1-434-***-****

*****.*****@*****.***

PROFESSIONAL SUMMARY:

·Certified Amazon Web Services (AWS) and Azure professional with almost 8 years of experience as a DevOps Consultant, Build & Release Engineer, and Linux System Administrator.

·Good exposure to Cloud providers including Amazon Web Services (AWS) and Microsoft Azure.

·Designed, configured, and managed public/private cloud infrastructures utilizing Amazon Web Services (AWS) including EC2, Elastic Load Balancers, Elastic Container Service (Docker containers), S3, CloudFront, Elastic File System, RDS, DynamoDB, DMS, VPC, Direct Connect, Route53, Lambda, CloudWatch, CloudTrail, CloudFormation, IAM, Elasticsearch, ELK.

·Experience working on Azure Cloud services, Azure Storage, Azure CLI, Azure Key Vault, Azure Active Directory, and Azure Service Bus. Managed clients' Microsoft Azure-based PaaS and IaaS environments.

·Hands-on experience in Azure development: worked on Azure web applications, App Services, Azure Storage, Azure SQL Database, Virtual Machines, Azure AD, and Notification Hubs.

·Exposure to writing Groovy and Ruby scripts for build and infrastructure automation.

·Created instances in AWS as well as migrated data to AWS from private data centers.

·Dynamic configuration of servers on the Amazon Web Services (AWS) platform using Puppet and Chef.

·Expertise in CI (Continuous Integration) and CD (Continuous Deployment) methodologies using Jenkins.

·Experience with Google Cloud Platform (GCP): provisioned Compute Engine, Cloud Load Balancing, Cloud Storage, Cloud SQL, and Stackdriver monitoring components using the Terraform GCP Foundation modules.

·Set up CI/CD pipelines on various platforms including AWS CodePipeline, Azure DevOps, and Jenkins on remote Mac VMs for microservices, iOS, Ionic, and Android mobile applications.

·Experience configuring GCP firewall rules in Terraform scripts to allow or deny traffic to and from VM instances based on specified configuration; used GCP Cloud CDN (content delivery network) to deliver content from GCP cache locations, drastically improving user experience and latency.

·Experienced in scripting languages like Python, PowerShell, Ruby, Perl, and Bash.

·Responsible for Continuous Integration (CI) and Continuous Delivery (CD) process implementation using Jenkins along with Python and shell scripts to automate routine jobs.

·Experience reviewing Python code, running troubleshooting test cases, and fixing bug issues.

·Installed new software releases and system upgrades, evaluated and installed patches, and resolved software-related problems.

·Migrated applications and server instances from on-premise environments to the AWS and GCP clouds.

·Designed the front end of the application using Python, Java, HTML, CSS, AJAX, JSON, and jQuery.

·Technical experience designing, developing, deploying, and/or migrating to Azure, AWS, and other leading data environments.

·Work with Sales teams to identify and qualify opportunities hosted by Azure, AWS, and other Data Services.

·Experience in setting up Baselines, Branching, Merging and Automation Processes using Shell/bash and Batch Scripts.

·Exposure to administration of servers such as Apache, Tomcat, WebLogic, JBoss & WebSphere. Hands-on experience with Chef, Ansible, RunDeck, AWS, Ruby, Vagrant, Pivotal Tracker, Bash, and middleware tools.

·Migrated 400+ applications from SiteMinder to PingFederate.

·Designing and Implementing applications integration with Ping Federate/ Ping Access/ Ping ID in Non-Prod and Prod. Working with applications business and technical teams to gather requirements to integrate applications with Ping Federate/ Ping Access/ Ping ID.

·Knowledge of managing web and Windows platform applications built on Microsoft technologies like ASP.NET, VBA, Classic ASP, COM components, and COTS products.

·Also familiar with streaming sources and loading raw data from stream sources using Kafka and/or Stream Analytics.

·Experience with Jenkins and Add-on Plugins like Jenkins DSL Plugin and more. Extensive experience with software builds tools like Apache Maven and Apache Ant. Extensive experience in automation using UNIX Shell scripting, Perl, Ruby, Python, and Terraform.

·Expertise in build automation / continuous integration tools like Jenkins, CloudBees, and Bamboo. Configured, deployed, and maintained multi-node Dev and Test Kafka clusters.

·Configured load-balance sets (Azure Load Balancer, Internal Load Balancer, and Traffic Manager) and worked on Application Gateway.

·Experience in Linux/Unix and WebLogic administration. Gained expertise in Windows and Unix systems administration and transformed traditional environments into virtualized environments with AWS EC2, EBS, S3, and load balancing.
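As an illustration of the routine-job automation with Python mentioned above, here is a minimal stdlib-only sketch of a build-artifact retention task; the directory layout and retention count are hypothetical, chosen only for illustration.

```python
import shutil
from pathlib import Path

def prune_old_builds(artifact_dir: str, keep: int = 5) -> list[str]:
    """Keep the newest `keep` build directories and delete the rest.

    Assumes one subdirectory per build under `artifact_dir` (a made-up
    layout for this sketch), newest judged by modification time.
    """
    builds = sorted(
        (p for p in Path(artifact_dir).iterdir() if p.is_dir()),
        key=lambda p: p.stat().st_mtime,
        reverse=True,
    )
    removed = []
    for stale in builds[keep:]:
        shutil.rmtree(stale)          # delete everything past the cutoff
        removed.append(stale.name)
    return removed
```

Jobs of this shape would typically be wired into a Jenkins freestyle job or cron entry rather than run by hand.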

TECHNICAL SKILLS:

·CI Tools: Hudson, Jenkins, Build Forge, TFS, Docker, Kubernetes

·Cloud Computing: Amazon AWS, Azure, GCP

·Repositories: Nexus, Artifactory

·Languages: Python, Perl scripting, Bash, shell scripting, SQL, .NET, Java

·Web/App Servers: WebLogic, WebSphere, Apache Tomcat

·Operating Systems: Unix, Linux (Ubuntu, Debian, Red Hat (RHEL), CentOS), and Windows

PROFESSIONAL EXPERIENCE:

Client: DES Arizona State - Remote Dec 2022 – Present

DevOps Cloud Engineer

·Worked on setting up CI/CD pipeline integrating various tools with Azure DevOps

·Created CI/CD pipelines using YAML files in Azure DevOps

·Worked with Docker for convenient environment setup for development and testing.

·Responsible for Continuous Integration (CI) and Continuous Delivery (CD) process implementation-using Jenkins along with Python and Shell scripts to automate routine jobs.

·Configured AWS Route 53 to route traffic between different regions.

·Upgraded and maintained servers, operating systems, and patches.

·Configured AWS IAM and Security Group in Public and Private Subnets in VPC

·Involved in maintaining the user accounts (IAM), RDS, S3, EC2, Security Groups, Route 53, VPC, RDB, DynamoDB, SES, SQS and SNS services in AWS cloud.

·Improve speed, efficiency, and scalability of the continuous integration environment, automating wherever possible using Python, Ruby, bash, Shell, and PowerShell Scripts

·Implemented rapid-provisioning and life-cycle management for Ubuntu Linux using Amazon EC2, Chef, and custom Ruby and shell/Bash scripts.

·Automated Compliance Policy Framework for multiple projects in GCP.

·Created Service accounts, VM instances, VPC networks using Terraform on GCP.

·Configured Apigee Adapter and ISTIO mesh for our applications on GKE.

·Created scripts in Python which integrated with Amazon API to control instance operations.

·Defined AWS resources, Parameters, and mappings in JSON and YAML using CloudFormation.

·Configuring, Maintaining, and troubleshooting Cluster server issues.

·Work as part of a cross-functional team and participate in the design and development of Grafana Dashboard UI.

·Experience building and setting up system and application monitoring (Grafana).

·Research, plan, and execute the build of all on-prem Azure Stack for Confidential POC to migrate from Azure Websites

·Built Grafana integrations for common data sources such as Linux machines, databases, applications, and more.

·Created and pre-configured new dashboards to receive and visualize metrics from the Grafana Agent.

·Responsible for automating build scripts and their release process.

·Primary responsibilities include building and deploying Java applications onto different environments: Dev, QA, and UAT.

·Building the test environment to test the changes in the deployment for Containers.

·Creating a pipeline to integrate the Stage Deployments and setting up the Parallel execution.
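The CloudFormation bullet above (defining resources, parameters, and mappings in JSON and YAML) follows the general template shape sketched below; every name, AMI ID, and value here is a hypothetical placeholder, assembled in Python purely to show the structure.

```python
import json

# Hypothetical template: real work would use the organization's own
# AMI IDs, instance types, and parameter sets.
template = {
    "AWSTemplateFormatVersion": "2010-09-09",
    "Parameters": {
        "EnvType": {
            "Type": "String",
            "AllowedValues": ["dev", "qa", "uat"],
            "Default": "dev",
        }
    },
    "Mappings": {
        "RegionAmi": {
            "us-east-1": {"Ami": "ami-11111111"},
            "us-west-2": {"Ami": "ami-22222222"},
        }
    },
    "Resources": {
        "AppServer": {
            "Type": "AWS::EC2::Instance",
            "Properties": {
                # Fn::FindInMap resolves the AMI for the deploy region.
                "ImageId": {"Fn::FindInMap": ["RegionAmi", {"Ref": "AWS::Region"}, "Ami"]},
                "InstanceType": "t3.micro",
            },
        }
    },
}

print(json.dumps(template, indent=2))
```

The same document can be expressed in YAML; CloudFormation accepts either serialization of this structure.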

Client: CapitalOne, Dallas, TX - Remote Oct 2020 – Nov 2022

DevOps Cloud Engineer

·Built Jenkins jobs to create AWS infrastructure from GitHub repos containing Terraform code and administered/engineered Jenkins for managing weekly builds.

·Set up and built AWS infrastructure resources such as VPC, EC2, S3, IAM, EBS, Elasticsearch, Logstash, Security Groups, Auto Scaling, and RDS in Terraform.

·Used Kafka for messaging, publishing and subscribing to topics, where the producer produces to a topic and the consumer consumes the data via subscription. Built Grafana integrations for common data sources such as Linux machines, databases, applications, and more.

·Built a Continuous Integration environment (Jenkins) and a Continuous Delivery environment (Chef).

·Enabled API and created REST Wrapper in Python to automate agent procedures and patch deployment.

·Used test driven approach for developing the application and implemented the unit tests using Python unit test framework.

·Responsible for defining branching & merging strategy, checking policies, improving code quality, defining backup and archival plans.

·Automated setting up server infrastructure for the DevOps services, using Ansible, shell and python scripts.

·Designed the application layer of the product with EC2, RDS, and ElastiCache, and configured AWS CloudWatch for application monitoring.

·Implemented AWS solutions using EC2, S3, RDS, EBS, Elastic Load Balancer, Auto scaling groups.

·Experience in Setting up Continuous Integration and Demand Builds.

·Automated builds and deployment using Jenkins as part of Continuous Integration.

·Implemented AWS solutions using EC2, S3, AWS Lambda, RDS, IAM (Identity and Access Management), Route 53, Elasticsearch, CloudFront, EBS, Elastic Load Balancer, Auto Scaling groups, and API Gateway; optimized volumes and EC2 instances.

·Experience with Jenkins and Add-on Plugins like Jenkins DSL Plugin and more.

·Migrated the existing subversion repository including all history over to GIT.

·Integrated GitHub into Jenkins to automate the code check-out process.

·Used shell scripts to automate the deployment process.

·Implemented Maven workflow to use and publish JAR files and packages to the central repository.

·Migrated an existing on-premises application to AWS. Used AWS services like EC2 and S3 for small data set processing and storage; experienced in maintaining the Hadoop cluster on AWS EMR.

·Designed AWS infrastructure for new applications, using Terraform to manage all aspects.

·Installed and Managed Jenkins and Nexus for CI and Sharing Artifacts respectively within the company.

·Implemented Jenkins Workflow and Plugins for repeatable deployments of multi-tier applications, artifacts, and services to Docker.
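The unit-testing work mentioned above (implementing tests with the Python unittest framework) typically looks like the sketch below; the helper function and its alias table are invented solely to demonstrate the pattern.

```python
import unittest

def normalize_env(name: str) -> str:
    """Map free-form environment labels to canonical names.

    A hypothetical helper, stood up only to show the unittest pattern.
    """
    aliases = {"develop": "dev", "development": "dev", "quality": "qa"}
    key = name.strip().lower()
    return aliases.get(key, key)

class NormalizeEnvTest(unittest.TestCase):
    def test_alias_is_canonicalized(self):
        # Whitespace and case are stripped before the alias lookup.
        self.assertEqual(normalize_env(" Development "), "dev")

    def test_unknown_name_passes_through(self):
        # Names without an alias entry come back unchanged.
        self.assertEqual(normalize_env("uat"), "uat")
```

Such a module would be run with `python -m unittest` and wired into the Jenkins CI stage so each check-in executes the suite.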

Client: Bank of America Apr 2019 – Sept 2020

DevOps Engineer

·Performed WebLogic Server 8.x/9.x/10.x/11g, JBoss 4.x/5.x, and Tomcat 5.x server administration tasks such as installation, configuration, monitoring, and performance tuning.

·Responsible for starting up, configuring, administering, and maintaining the J2EE applications as part of the enterprise's computing/ networking infrastructure and applications.

·Core development experience with Groovy/Grails RESTful web services.

·Configured load-balance sets (Azure Load Balancer, Internal Load Balancer, and Traffic Manager) and worked on Application Gateway.

·Experience with Ping Federate/ Ping Access/ Ping ID for single sign-on.

·Designed a POC for ping federate and created policies and Rules for MFA.

·Migrated 400+ applications from SiteMinder to Ping.

·Monitoring the startup logs for any exceptions or errors. Performing regular health checks for the servers in the production environment.

·Maintained and configured standalone instances and application server clusters.

·Troubleshooting emerging application issues, from WebLogic configuration to code issues.

·Dealt with troubleshooting issues like Server hang, Application Deadlock, Out of Memory, and High CPU.

·Wrote Python routines to log into the websites and fetch data for selected options.

·Wrote Python scripts to parse XML documents and load the data into a database.

·Development and configuration experience with software provisioning tools like Chef, Puppet, Terraform, and Ansible.

·Developed and executed software systems utilizing JavaScript and Groovy.

·Configured and managed secured environments using SSL and digital certificates.

·Providing Support for Development and testing environments.

·Designing and Implementing applications integration with Ping Federate/ Ping Access/ Ping ID in Non-Prod and Prod. Working with applications business and technical teams to gather requirements to integrate applications with Ping Federate/ Ping Access/ Ping ID.

·Successfully optimized Python code for a variety of data mining and machine learning purposes.

·Automated configuration changes for all environments in the cloud using Chef and Puppet and developed various modules and templates for different application roles.

·Wrote Ansible playbooks to set up the Continuous Delivery pipeline, primarily consisting of Jenkins, a Sonar server, and Vagrant for the infrastructure to run these packages, plus various supporting software components such as Maven.

·Utilized Splunk for log analysis and improving server performance; performed issue identification, data analysis, and security analysis.

·Maintained the user accounts (IAM), CloudWatch, RDS, Route 53, VPC, RDB, Dynamo DB, SES, SQS, and SNS services in the AWS cloud.

·AWS cloud management and Chef automation. Wrote many Ansible playbooks, the entry point for Ansible provisioning, where the automation is defined through tasks in YAML format. Ran Ansible scripts to provision Dev servers.

·Implemented the project structure based on the Spring MVC pattern using Spring Boot.
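The XML-parsing scripts described above (parse documents, load the data into a database) generally follow this shape; the record layout, table schema, and sample data below are made up for illustration, built only on the Python standard library (`xml.etree` plus `sqlite3`).

```python
import sqlite3
import xml.etree.ElementTree as ET

# A made-up record layout; the real documents and schema are not
# described in this resume.
SAMPLE_XML = """
<accounts>
  <account id="1001"><owner>alice</owner><balance>250.75</balance></account>
  <account id="1002"><owner>bob</owner><balance>90.10</balance></account>
</accounts>
"""

def load_accounts(xml_text: str, conn: sqlite3.Connection) -> int:
    """Parse account records out of XML and insert them into SQLite."""
    conn.execute(
        "CREATE TABLE IF NOT EXISTS accounts (id INTEGER PRIMARY KEY, owner TEXT, balance REAL)"
    )
    root = ET.fromstring(xml_text)
    rows = [
        (int(acct.get("id")), acct.findtext("owner"), float(acct.findtext("balance")))
        for acct in root.findall("account")
    ]
    conn.executemany("INSERT INTO accounts VALUES (?, ?, ?)", rows)
    conn.commit()
    return len(rows)

conn = sqlite3.connect(":memory:")
print(load_accounts(SAMPLE_XML, conn))  # prints 2
```

In a real job the connection would point at the production database and the XML would come from files or a feed rather than an inline string.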

Client: Expedia Inc., Chicago, IL Sept 2018 – March 2019

Software Dev Engineer

·Designed the front end and back end of the application using Python on Django Web Framework.

·Developed consumer-based features and applications using Python and Django in test-driven development and pair-based programming.

·Creation of the DAGs using Python to integrate with Kubernetes Operators

·Design, Install, configure, and maintain heterogeneous Linux and cross-domain services/applications.

·Worked on Airflow provisioning of instances, creating the DAG and maintaining the DAGs with their dependent modules.

·Specialized in provisioning EKS Kubernetes clusters on AWS and GKE Kubernetes clusters on GCP, including masters, slaves, RBAC, Helm, kubectl, and ingress controllers, via Terraform foundation modules.

·Created Kubernetes clusters and added worker and master node authentication.

·Created Terraform scripts for EC2 instances, Elastic Load balancers, and S3 buckets.

·Implemented Terraform to manage the AWS infrastructure and managed servers using configuration management tools like Chef and Ansible.

·Used Python and Django to interface with the jQuery UI and manage the storage and deletion of content.

·Excellent hands-on experience in installation, configuration, troubleshooting, and performance tuning of WebLogic/Apache/IIS and Tomcat.

·Wrote shell scripts for end-to-end build and deployment automation. Ran Ansible scripts to provision Dev servers.

·Created Docker containers using Docker images to test, ship, and run applications.

·Built Jenkins jobs to create AWS infrastructure from GitHub repos containing Terraform code and administered/engineered Jenkins for managing weekly builds.

·Designed AWS infrastructure for new applications, using Terraform to manage all aspects.

·Wrote Ansible playbooks with Python and SSH as the wrapper to manage configurations of AWS nodes, and tested playbooks on AWS instances using Python. Ran Ansible scripts to provision Dev servers.

·Used Python-based GUI components for the front-end functionality such as selection criteria.

·Connected continuous integration system with GIT version control repository and continually build as the check-ins come from the developer.

·Worked with View Sets in Django-REST framework for providing web services and consumed web services performing CRUD operations.

·Refined automation components with scripting and configuration management (Ansible).
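The DAG creation described above can be illustrated without Airflow itself: the stdlib `graphlib` module resolves the same kind of task-dependency graph that an Airflow DAG declares. The task names below are hypothetical, chosen only to show the dependency-resolution idea.

```python
from graphlib import TopologicalSorter

# Hypothetical task graph: each task maps to the set of tasks it
# depends on, mirroring upstream/downstream declarations in a DAG.
dag = {
    "extract": set(),
    "transform": {"extract"},
    "validate": {"transform"},
    "load": {"transform", "validate"},
}

# static_order() yields tasks with every dependency scheduled first.
order = list(TopologicalSorter(dag).static_order())
print(order)  # ['extract', 'transform', 'validate', 'load']
```

Airflow performs this ordering internally; the scheduler runs each task only after its upstream tasks succeed, which is exactly the property the topological sort guarantees.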

Client: UBS, Chicago, IL Jun2017 – Aug2018

DevOps Cloud Engineer

Responsibilities:

·Implemented new project build frameworks using Jenkins & Maven as build tools.

·Implemented a Continuous Delivery framework using Jenkins, Chef, and Maven in a Linux environment.

·Experience with different flavors of Linux (RHEL, CentOS, Oracle Linux, Debian, and Ubuntu).

·Proficiency with Linux-based operating systems, commands, and utilities, as well as Configuration Management Tools, like Puppet.

·Converted production support scripts to Chef recipes.

·Maintained code quality across multiple mobile software development environments.

·Automated various infrastructure activities like Continuous Deployment, Application Server setup, and Stack monitoring using Ansible playbooks and Integrated Ansible with Jenkins.

·Experience writing various custom Ansible Playbooks and modules for Deployment Orchestration.

·Collaborated with internal and external teams such as Product Managers, QC Testers

·Contributed to the design and development of mobile software libraries, tools, and applications.

·Created efficient and fast frontends for our consumer site, partner portals, and monetization system.

·Created automation to build infrastructure for Kafka clusters and per-component instances, using Terraform to create multiple EC2 instances and attach ephemeral or EBS volumes per instance type across different availability zones and multiple regions in AWS.

·Involved in setting up application servers like Tomcat and WebLogic across Linux platforms, as well as wrote shell, Python, and Ruby scripts on Linux.

·Experienced in troubleshooting and automated deployment to web application servers like WebLogic and Apache Tomcat.

·Configured and managed Ansible playbooks with Ansible roles.

·Successfully optimized Python code for a variety of data mining and machine learning purposes.

·AWS server provisioning using Chef Recipes.

·Responsible for Database build, release, and configuration.

·Performed deployment of releases to various QA & UAT Linux environments.

·Work with different team members for automation of Release components.

·Supporting different projects building and releasing SCM efforts e.g. branching, tagging, merging, etc.

Client: Mindtree, Bangalore, Karnataka Nov 2014 - Dec 2015

Role: DevOps Engineer

·Installed/configured/managed Puppet Master. Wrote custom modules and manifests, and modified prewritten modules from Puppet Forge, making use of community-developed code to fit the organization's business case.

·Used Puppet to manage Web Applications, Config files, Database, Commands, user mount Points, and Packages.

·Wrote CloudFormation templates and deployed AWS resources using them.

·Implemented Git mirror for SVN repository, which enables users to use both Git and SVN.

·Implemented Continuous Integration using Jenkins and GIT.

·Integrated GitHub into Jenkins to automate the code check-out process.

·Automated the build and release management process including monitoring changes between releases.

·Participated in the release cycle of the product, which involves environments like development, QA, UAT, and production.

·Involved in branching and merging of code.

·Managed build results in Jenkins and deployed using workflows.

·Automated Build artifacts (jar, war & ear) using continuous integration tools.

·Deployed JAR, WAR & EAR artifacts into the Web Logic servers and Apache servers.

·Coordinated the resources by working closely with project Managers for the release.

·Monitor the progression of releases and keep them on track for delivery on the planned release date.

Educational Summary

·Bachelor's in Computer Science, JNTU, India, 2014.

·Master of Science, Texas A&M University, 2017.
