
CI/CD DevOps Engineer

Location: Miami, FL
Salary: $60
Posted: October 30, 2023

Resume:

Hindhu

Sr. DevOps/Cloud Engineer

PH: 980-***-****

Email: ad0p79@r.postjobfree.com

PROFESSIONAL SUMMARY:

An accomplished, results-oriented IT professional with 8 years of experience as an AWS/Azure DevOps Engineer and Linux Administrator, covering cloud management, AWS CDK, data analysis with Microsoft Azure, source code management, and building, deploying, and releasing code from one environment to another using configuration management, CI/CD, and DevOps processes. Extensive experience includes AWS, DevOps, CI/CD, SCM, build/release management, cloud management, and containerization.

Hands-on experience in automating, configuring, and deploying instances on Amazon Web Services (AWS), with experience in AWS components such as EC2, ELB, Auto Scaling, S3, VPC, Route 53, CloudWatch, CloudTrail, and CloudFormation templates.

Good working knowledge of the AWS IAM service, including IAM policies, roles, users, groups, AWS access keys, and Multi-Factor Authentication; migrated applications to the AWS Cloud.

Experience with the AWS Command Line Interface (CLI) and PowerShell for automating administrative tasks. Defined AWS security groups, which acted as virtual firewalls controlling the traffic reaching one or more AWS EC2 or Lambda instances.
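
For illustration only, a minimal sketch of this kind of security-group automation, written here with Python/boto3 rather than the AWS CLI or PowerShell mentioned above; the group ID, port, and CIDR are placeholders:

```python
# Illustrative sketch (boto3); group ID, CIDR, and port are placeholders.
import boto3

ec2 = boto3.client("ec2", region_name="us-east-1")

# Allow HTTPS from a single office CIDR into an existing security group.
ec2.authorize_security_group_ingress(
    GroupId="sg-0123456789abcdef0",  # hypothetical security group
    IpPermissions=[{
        "IpProtocol": "tcp",
        "FromPort": 443,
        "ToPort": 443,
        "IpRanges": [{"CidrIp": "203.0.113.0/24", "Description": "office"}],
    }],
)
```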

Collaborated with cross-functional teams to implement security policies and procedures in GCP, improving compliance with industry standards.

Hands-on experience in implementing, building, and deploying CI/CD pipelines and managing projects, often tracking multiple deployments across multiple pipeline stages (Dev, Test/QA, Staging, and Production).

Successfully deployed Azure Kubernetes Service (AKS) using Azure Container Service from Azure CLI, utilizing Kubernetes and Docker as the runtime environment for the CI/CD system to facilitate seamless testing and deployment.

Proficient in tracing complex build and release issues; implemented continuous integration and deployment using CI tools such as Jenkins, Bamboo, and Puppet.

Involved in migration from the current data center to the Azure cloud using Azure Site Recovery and Database Migration Service; performed API management in Azure for backend operations and data persistence.

Worked with the security team to ensure Azure data is highly secure and configured BGP routes to enable ExpressRoute connections between on-premises data centers and the Azure cloud.

Wrote many Ansible playbooks in YAML and used them for configuration management and orchestration of IT environments; also used Ansible for deploying and updating web application servers' code and data.

Wrote Chef cookbooks and recipes to automate the deployment process and integrated them into Jenkins jobs for a continuous delivery framework; developed Chef recipes and Terraform scripts to perform deployments onto application servers such as Tomcat and Nginx.

Successfully managed the migration of a large-scale application to GCP, reducing infrastructure costs and improving application performance.

Installed, configured, and maintained Jenkins for continuous integration and continuous deployment (CI/CD) and for end-to-end automation of all builds and deployments.

Hands-on experience with Chef as a configuration management tool to automate repetitive tasks, quickly deploy critical applications, and proactively manage change.

Worked with Docker and Vagrant for different infrastructure setups and testing of code; wrote build scripts from scratch for new projects and new modules within existing projects.

Worked with Vagrant for local development setup, moved the Vagrant setup to a Docker-based environment, and experienced in writing Vagrantfiles and Dockerfiles.

Created Docker containers and images, tagged and pushed images, and used Docker consoles to manage the application life cycle; deployed Docker Engine on virtualized platforms for containerization of multiple applications. Experienced in clustering and container management using Docker Swarm and Kubernetes.

Used Terraform to map more complex dependencies and identify network issues; hands-on experience with Terraform, a tool for building, changing, and versioning infrastructure safely and efficiently.

Skilled GCP DevOps engineer with a proven record of implementing automated monitoring and alerting systems, reducing incident response time by 50% and improving system uptime.

Created and Maintained Chef Recipes and cookbooks to simplify and expedite the deployment of applications and mitigate user error.

Used Python to develop a working and efficient network within the company; utilized Python to handle all hits on Django, Redis, and other applications, and researched Python programming, its uses, and its efficiency.
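
A minimal sketch of the Django/Redis caching pattern referred to above, assuming the django-redis cache backend is configured in settings; the key names and loader function are hypothetical:

```python
# Minimal sketch of serving hot reads through Django's Redis cache backend
# (django-redis assumed in CACHES settings); names and keys are illustrative.
from django.core.cache import cache

def get_user_profile(user_id, load_from_db):
    """Serve repeated lookups from Redis, falling back to the database."""
    key = f"profile:{user_id}"
    profile = cache.get(key)
    if profile is None:
        profile = load_from_db(user_id)       # hypothetical loader function
        cache.set(key, profile, timeout=300)  # cache for 5 minutes
    return profile
```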

Installed, configured, and managed monitoring tools such as the ELK Stack, Splunk, Nagios, and AppDynamics.

Experienced administrator of Linux/UNIX distributions (Red Hat/CentOS, Ubuntu, Solaris, and AIX) and Windows; installed Linux/UNIX distributions and configured various services as required using Puppet.

Designed and deployed a scalable LINUX/Windows infrastructure that seamlessly integrated a network storage server, LDAP servers, and SAMBA servers. Reduced infrastructure complexity by 60% and improved development productivity by 50%.

Worked with other internal IT teams to complete project activities (Architecture and Database teams).

Configured and maintained Jenkins to implement the CI/CD process and integrated the tool with Git, Maven, Nexus, Docker, and Nagios for end-to-end automation and took the sole responsibility to maintain the CI/CD Jenkins server.

Experience in Linux/Unix System Administration, Network Administration, and Application Support working on Red Hat Linux 6/7, SUSE Linux 10/11.

In-depth knowledge of computer applications and shell scripts (Bash), Ruby, Python, and PowerShell for automating tasks.

Experience in Build/Release/Deployment/Operations (DevOps) engineering with a strong background in Linux/Unix Administration and best practices of SDLC methodologies like Agile, Scrum, waterfall, and DevOps/Cloud processes.

Configured WebLogic JDBC connection pools with databases such as Oracle 9i/8i, MySQL, MS SQL Server, MS Access, and DB2.

Ability to quickly understand, learn and implement the new system design, new technologies, data models, and functional components of software systems in a professional work environment.

Hands-on experience with AWS services such as EC2, S3, ELB, RDS, SQS, EBS, VPC, AMI, SNS, CloudWatch, CloudTrail, CloudFormation, AWS Config, Auto Scaling, CloudFront, IAM, and Route 53.

Able to build solid relationships within as well as consensus across multiple teams on environment strategies of the build and release process.

TECHNICAL SKILLS

Cloud Environment: AWS (Amazon Web Services), Microsoft Azure, GCP, OpenStack

Operating Systems: RHEL, Red Hat Linux, CentOS, Apache Mesos, Unix, Windows

Version Control Tools: Subversion (SVN), Git, GitHub, and Bitbucket

CI Tools: Jenkins/Hudson, Build Forge, Capistrano, IBM uDeploy, and Bamboo

Configuration Tools: Chef, Puppet, Ansible, FAI, and Terraform

Bug Tracking Tools: JIRA, Remedy, JUnit, Confluence, Alfresco, and IBM ClearQuest

Monitoring Tools: Nagios, Splunk, Datadog, New Relic, CloudWatch

DevOps Tools: Chef, Puppet, Ansible, AWS (EC2, S3, VPC, ELB, Glacier, SQS, SNS, CloudWatch, Lambda, RDS, Route 53), Nexus, JFrog Artifactory, Terraform, CloudFormation, Docker, Kubernetes, Mesosphere

Virtual/Cloud Environment: AWS, GCP (Google Cloud Platform), Azure, CDK

Languages: C, Java, Shell, Perl, Python, SQL, J2EE, CSS, XML, HTML, JavaScript

Database Servers: Oracle RDBMS, IBM DB2, Microsoft SQL Server, PostgreSQL, and MySQL

Networking: TCP/IP, NIS, NIS+, NFS, DNS, DHCP, WAN, SMTP, LAN, FTP/TFTP, TELNET, Firewalls

Repositories: Nexus, Artifactory

Build Tools: Ant, Maven, Autosys, Grunt

Web/Application Servers: WebLogic, WebSphere, Apache Tomcat, JBoss, Docker

PROFESSIONAL EXPERIENCE

Client: U.S. Bank, FL Apr 2021 to Present

Role: DevOps Engineer

Responsibilities:

Managed Azure infrastructure including Azure Web Roles, Worker Roles, VM Roles, Azure SQL, Azure Storage, Azure AD licenses, and virtual machine backup and recovery through a Recovery Services Vault, using Azure PowerShell and the Azure Portal.

Managed user access and permissions within OpenShift. This includes creating and managing user accounts, setting role-based access controls (RBAC), and managing groups.

Monitored and managed resources within the OpenShift cluster, including CPU, memory, storage, and network usage, ensuring that resources were allocated efficiently.

As part of a continuously delivering Agile team, developed, tested, and deployed data platform features; developed ongoing test automation with Chef and a Python-based framework, using Ansible for setup and teardown of the ELK Stack.

Implemented high availability with Azure Resource Manager deployment models.

Deployed Windows Kubernetes (K8s) and set up continuous deployment pipelines using Jenkins across multiple Kubernetes clusters, stress-testing the clusters with new Kubernetes infrastructure tools in development environments.

Cloud DevOps professional with expertise in Linux system administration, server infrastructure development, and CI/CD pipelines across on-premises, AWS (Amazon Web Services), CDK, and GCP cloud environments.

Worked on migration of data from on-premises SQL Server to cloud databases (Azure Synapse Analytics (DW) and Azure SQL DB).

Created complex Splunk queries and search patterns to extract meaningful insights from data.

Built data platforms using modern data stacks such as AWS, Snowflake, and Databricks.

Worked on web servers such as Apache and Nginx and application servers such as WebLogic, Tomcat, WebSphere, JBoss, and IIS to deploy code.

Upgraded Atlassian products like Bamboo, JIRA, and Confluence.

Migrated the source code from the Git repository to Bitbucket.

Created scripts in Python that integrated with Azure API to manage instance operations.
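
A hedged sketch of such Azure instance operations using the azure-identity and azure-mgmt-compute SDKs (newer SDK versions expose begin_* pollers); the subscription ID, resource group, and VM name below are placeholders:

```python
# Hedged sketch using azure-identity and azure-mgmt-compute; IDs are placeholders.
from azure.identity import DefaultAzureCredential
from azure.mgmt.compute import ComputeManagementClient

credential = DefaultAzureCredential()
compute = ComputeManagementClient(credential, "00000000-0000-0000-0000-000000000000")

# Stop (deallocate) a VM outside business hours, then start it again later.
compute.virtual_machines.begin_deallocate("rg-demo", "vm-demo").result()
compute.virtual_machines.begin_start("rg-demo", "vm-demo").result()
```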

Developed a completely automated script for a continuous integration system using Git, Jenkins, MySQL, and custom tools developed in Python and Bash.
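
For illustration, a minimal sketch of the Jenkins side of such automation, triggering a parameterized job through the Jenkins REST API; the URL, job name, credentials, and parameter are assumptions, not the actual setup:

```python
# Sketch of triggering a parameterized Jenkins build via its REST API;
# URL, job name, and credentials are placeholders.
import requests

JENKINS = "https://jenkins.example.com"
AUTH = ("ci-bot", "api-token")  # hypothetical user / API token

def trigger_build(job, branch):
    """Kick off a parameterized Jenkins job for a given Git branch."""
    resp = requests.post(
        f"{JENKINS}/job/{job}/buildWithParameters",
        params={"BRANCH": branch},
        auth=AUTH,
        timeout=30,
    )
    resp.raise_for_status()
    return resp.headers.get("Location")  # queue item URL for follow-up polling

if __name__ == "__main__":
    print(trigger_build("app-pipeline", "main"))
```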

Automated and managed Azure services for the creation of storage accounts, subscriptions, and tables using Windows PowerShell.

Coordinated with business analysts, developers, and managers to make sure that code is deployed to the production and test environments using Terraform.

Designed and implemented Terraform to migrate legacy and monolithic systems to Azure; practiced infrastructure as code and modified Terraform scripts as configuration changes occurred.

Provisioned Datadog monitoring metrics at build time using Terraform.

Integrated Datadog with Azure cloud services to get a comprehensive view of Azure infrastructure.

Configured Ansible to manage Azure environments and automate the build process utilized by all application deployments including Auto Scaling and Formation Scripts with Kubernetes.

Ran Ansible playbooks/scripts and installed a registry for local upload and download of images, as well as pulling from Docker Hub.

AWS-focused DevOps: used CloudFormation and the AWS Cloud Development Kit (CDK) to develop tooling and provision resources.

Performed Proof of Concepts on Splunk, ELK Stack, Service Catalog, Kube-Apps, and Elastic Cloud Enterprise.

Experience with Agile Development Methodology (Scrum), and Waterfall.

Implemented a production-ready, load-balanced, highly available, fault-tolerant Kubernetes infrastructure and used Kubernetes for orchestration, automated deployment, scaling, load balancing, and management.

Configured and integrated Git into the continuous integration (CI) environment with Jenkins and wrote scripts to containerize applications using Ansible with Docker and orchestrate them using Kubernetes.

Developed builds using Maven as the build tool and Jenkins to kick off pipelines moving artifacts from one environment to another; scheduled cron jobs periodically via SCM and created CI/CD pipelines using Groovy scripts to automate build and deployment with Jenkins.

Transitioned real-time data from on-premises to Azure using Kafka and processed it using Spark in Databricks.

Managed and operated OpenShift clusters, including deployment, scaling, and maintenance of applications on OpenShift.

Environment: AWS (EC2, S3, EBS, ELB, IAM, SQS, RDS, Auto Scaling), Redshift, Python, Git, Bitbucket, CloudFormation templates, Jenkins, Docker, JIRA, Red Hat Linux, WebLogic servers, JFrog, shell scripts, Chef, Kubernetes, networking, Shibboleth, SSO, old-to-new hardware migration, CI/CD.

Client: Centene Corporation, St. Louis, Missouri Nov 2019 to Apr 2021

Role: DevOps Engineer

Responsibilities:

Planned, deployed, monitored, and maintained Amazon AWS cloud infrastructure consisting of multiple EC2 nodes and VMware VMs as required in the environment.

Developed the back end using Groovy and Grails, Value Objects, and DAOs; used design patterns such as Facade, Proxy, and Command to use resources efficiently.

Utilized Terraform to define the infrastructure-as-code (IaC) for provisioning the S3 bucket and EC2 instances. Wrote Terraform configuration files to specify the desired state of the infrastructure.

Provisioned the highly available EC2 instances using Terraform and wrote new plugins to support new functionality in Terraform.

Wrote JUnit test cases to test services implemented in Grails and Groovy; used the Postman browser plug-in to test web services.

Integrated Jenkins/Helm/Kubernetes/Vault with GCP to perform semi-automated and automated releases to lower and production environments.

Used the AWS Code suite (CodeCommit, CodeBuild, CodeDeploy, and CodePipeline) from the AWS CDK.

Good working knowledge of AWS IAM service - IAM policies, Roles, Users, Groups, AWS access keys, and Multi-Factor Authentication and migrating applications to the AWS Cloud through AWS console and API integration.

Created Chef Recipes for Infrastructure maintenance on VMWare, AWS EC2, and Physical Servers.

Integrated Docker into various infrastructure tools, including Ansible, Chef, and VMware Integrated Containers.

Converted production support scripts to Chef recipes.

Created and maintained many cookbooks and recipes using Ruby language in Chef to speed up automation of various applications, configuration, and deployment of software components.

Implemented automation of provisioning and deployment using AWS CloudFormation.

Set up and built AWS infrastructure as code using Terraform for Dev, QA, Staging, and Production environments.

Wrote Python scripts to parse XML documents and load the data in the database.
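
A self-contained sketch of that pattern, parsing an XML document with the standard library and loading rows into a database; SQLite stands in here for the production database, and the element and column names are invented:

```python
# Self-contained sketch: parse an XML document and load rows into a database.
# SQLite stands in for the production RDBMS; element/field names are invented.
import sqlite3
import xml.etree.ElementTree as ET

def load_orders(xml_path, db_path):
    conn = sqlite3.connect(db_path)
    conn.execute("CREATE TABLE IF NOT EXISTS orders (id TEXT, amount REAL)")
    root = ET.parse(xml_path).getroot()
    rows = [
        (order.findtext("id"), float(order.findtext("amount", "0")))
        for order in root.iter("order")
    ]
    conn.executemany("INSERT INTO orders VALUES (?, ?)", rows)
    conn.commit()
    conn.close()
```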

Migrated SQL Server Code to Oracle PL/SQL code.

Worked on Lambda Route to call the API Gateway based on servers.

Defined and deployed monitoring, metrics, and logging systems on AWS, primarily configuring CloudWatch metrics for RDS and Redshift.

Wrote Python scripts with CloudFormation templates to automate Auto Scaling, EC2, and VPC provisioning, among other services.
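
A hedged boto3 sketch of driving a CloudFormation template from Python in this way; the template file, stack name, and parameter are placeholders:

```python
# Hedged sketch of launching a CloudFormation stack from Python (boto3);
# template path, stack name, and parameters are placeholders.
import boto3

cfn = boto3.client("cloudformation", region_name="us-east-1")

with open("vpc-asg.yaml") as f:  # hypothetical template file
    template_body = f.read()

cfn.create_stack(
    StackName="demo-vpc-asg",
    TemplateBody=template_body,
    Parameters=[{"ParameterKey": "InstanceType", "ParameterValue": "t3.micro"}],
    Capabilities=["CAPABILITY_NAMED_IAM"],
)
cfn.get_waiter("stack_create_complete").wait(StackName="demo-vpc-asg")
```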

Involved in setting up and operating the AWS Relational Database Service (RDS) and the NoSQL database service DynamoDB.

Developed microservices onboarding tools leveraging Python and Jenkins, allowing for easy creation and maintenance of build jobs and Kubernetes deployments and services.
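
A hedged sketch of the Kubernetes-facing part of such onboarding tooling, using the official Kubernetes Python client to create a Deployment; the service name, image, and namespace are invented:

```python
# Sketch using the official Kubernetes Python client; names/images are invented.
from kubernetes import client, config

def create_deployment(name, image, namespace="dev", replicas=2):
    config.load_kube_config()  # or load_incluster_config() inside the cluster
    apps = client.AppsV1Api()
    labels = {"app": name}
    deployment = client.V1Deployment(
        metadata=client.V1ObjectMeta(name=name, labels=labels),
        spec=client.V1DeploymentSpec(
            replicas=replicas,
            selector=client.V1LabelSelector(match_labels=labels),
            template=client.V1PodTemplateSpec(
                metadata=client.V1ObjectMeta(labels=labels),
                spec=client.V1PodSpec(containers=[
                    client.V1Container(
                        name=name,
                        image=image,
                        ports=[client.V1ContainerPort(container_port=8080)],
                    ),
                ]),
            ),
        ),
    )
    apps.create_namespaced_deployment(namespace=namespace, body=deployment)
```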

Created CloudWatch dashboards to monitor application performance using CloudWatch Logs Insights.
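
An illustrative sketch of querying CloudWatch Logs Insights from Python with boto3; the log group and query string are assumptions, not the project's actual ones:

```python
# Illustrative CloudWatch Logs Insights query (boto3); log group and query
# string are assumptions.
import time
import boto3

logs = boto3.client("logs", region_name="us-east-1")

query_id = logs.start_query(
    logGroupName="/app/api",            # hypothetical log group
    startTime=int(time.time()) - 3600,  # last hour
    endTime=int(time.time()),
    queryString="stats avg(latency) by bin(5m)",
)["queryId"]

while True:
    result = logs.get_query_results(queryId=query_id)
    if result["status"] in ("Complete", "Failed", "Cancelled"):
        break
    time.sleep(1)
print(result["results"])
```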

Monitored and managed WebLogic server instances using WLST, also for automation purposes.

Built AWS Kinesis pipelines for processing real-time data, invoking Lambda functions and loading the data into DynamoDB tables.
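
A minimal sketch of a Lambda handler in that kind of pipeline, consuming Kinesis records and writing them to DynamoDB; the table name and item shape are assumptions:

```python
# Minimal sketch of a Lambda handler consuming Kinesis records and writing to
# DynamoDB; table name and item shape are assumptions.
import base64
import boto3

table = boto3.resource("dynamodb").Table("events")  # hypothetical table

def handler(event, context):
    for record in event["Records"]:
        data = base64.b64decode(record["kinesis"]["data"]).decode("utf-8")
        table.put_item(Item={
            "id": record["kinesis"]["sequenceNumber"],
            "body": data,
        })
    return {"processed": len(event["Records"])}
```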

Used MySQL, DynamoDB, CodeDeploy, and ElastiCache to perform basic database administration and builds.

Excellent working knowledge of configuring multiple WebLogic domains, including machines, managed servers, node managers, and cluster environments.

Environment: RHEL 6/7, Centos, Windows 2008/2012, VMware, WebLogic, Oracle DB, Morpheus Data, Apache, LVM, WebSphere, GIT, Maven, Jenkins, Nexus, SonarQube, Chef, Ansible, Docker, Selenium, AWS, Kubernetes, Splunk, JIRA, Python, Bash, and YAML scripting.

Client: GameStop, Grapevine, TX July 2018 to Nov 2019

Role: DevOps Engineer

Responsibilities:

Partnered with Business, Architecture, and Application teams to perform POCs and complete technical design for Infrastructure (CAAS/FAAS) solutions for the organization.

As part of the Content as a Service team, was responsible for leading, building, designing, and configuring event-driven infrastructure based on a serverless, Function-as-a-Service (FaaS) architecture in AWS, using resources such as Lambda, Kinesis Streams, Kinesis Firehose, DynamoDB, GuardDuty, S3, API Gateway, Direct Connect, SNS, CloudWatch, CloudFront, Route 53, and IAM, with an event-driven operating model making REST API calls to read, create, and delete data from Siebel Communications.

Used tools like JIRA for bug tracking, created tickets, and generated reports on different bugs and tickets.

Used GitHub across the enterprise through the organization feature and private repositories, making use of it for cost optimization.

Configured Node manager for administering the managed servers in WebLogic.

Implemented a new CI/CD pipeline with containerized application deployment using container orchestration and tooling such as ECS, ECR, CodeCommit, CodeBuild, and CodePipeline within the AWS environment.

Implemented a continuous delivery pipeline with Docker, Semaphore, and GitHub: whenever a new GitHub branch is started, Semaphore, our continuous integration server, automatically builds a new Docker container from preconfigured webhooks, pushes it to Docker Hub, and deploys it to a Kubernetes cluster on DigitalOcean via Semaphore using autoscaled, load-balanced instances.

Worked with Docker container snapshots, attaching to a running container, managing containers, directory structures, and removing Docker images.

Worked with various Docker components like Docker Engine, Hub, Machine, Compose, and Docker Registry.

Developed scripts for deploying customer environments into AWS using Bash and Python, and created scripts that integrated with the Amazon API to control instance operations via REST API calls.
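
A sketch of that kind of instance-operations scripting against the EC2 API with boto3; the tag used to select instances is a placeholder:

```python
# Sketch of instance-operations scripting against the EC2 API (boto3);
# the tag key/value used to select instances is a placeholder.
import boto3

ec2 = boto3.client("ec2", region_name="us-east-1")

def instances_by_env(env):
    """Return instance IDs tagged Environment=<env>."""
    pages = ec2.get_paginator("describe_instances").paginate(
        Filters=[{"Name": "tag:Environment", "Values": [env]}]
    )
    return [
        inst["InstanceId"]
        for page in pages
        for reservation in page["Reservations"]
        for inst in reservation["Instances"]
    ]

def stop_env(env):
    ids = instances_by_env(env)
    if ids:
        ec2.stop_instances(InstanceIds=ids)
    return ids
```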

Environment: Linux, TFS/Azure DevOps, Azure Portal, SSL, WebLogic, Azure CLI, Bash, SonarQube, Splunk, JIRA, PowerShell Scripting, IIS, Kubernetes, Terraform, Docker, Ansible, GitHub, JFrog, Confluence.

Client: Resideo – New York, NY Aug 2017 to July 2018

Role: GCP DevOps Engineer

Responsibilities:

Deployed GKE on GCP using GitLab-Jenkins-Terraform integration.

Automated Compliance Policy Framework for multiple projects in GCP.

Set up Cloud SQL and Proxy for DR situations.

Created Service accounts, VM instances, and VPC networks using Terraform on GCP.

Configured the Apigee Adapter and Istio mesh for our applications on GKE.

Aggregated logs from Stackdriver and monitored pod and node health by creating metric-based dashboards on GCP.

Created DR plans, SRE scorecards, Error Budget reports, and Blue Green planning assessments for auditor approval.

Renewed certificates in NonProd and Prod clusters and re-applied them to the namespaces.

Set up Istio across a shared cluster in production and configured it to send requests to the gateway.

Performed change planning and CAB for production maintenance and application deployments.

Automated the creation of VMs, security policies, instance templates, K8s clusters, databases, and proxies using Terraform and Jenkins pipelines.

Created alerts and dashboards in Stackdriver Monitoring and Logging.

Maintained SRE and DevOps runbooks for all processes and provided project documentation on Confluence.

Environment: ELK Stack, EKS, ECS, Docker, Kubernetes, Splunk, Open Policy Agent, Morpheus Data, Service Catalog, Terraform/Terragrunt, CloudFormation, GCP/AWS, CloudWatch, AWS Lambda, AWS CodeCommit, S3, SQS, Elastic Beats, Spinnaker, Kube Apps, Helm Charts, CJOC, GitLab, AWS Athena, AWS Config, CloudTrail, and SNS.

Client: Broadgate Technologies Pvt. Ltd. Aug 2015 – Dec 2016

Role: Software Developer

Responsibilities:

Integrated the Java Code (API) in JSP pages.

Used jQuery core library functions for the logical implementation part on the client side for all the applications.

Used HTML and JavaScript for client-side presentation and data validation within forms.

Worked on enhancements in designing and developing the GUI for the user interface using JavaScript, HTML, XHTML, XML, CSS, JSP, AJAX, and MySQL.

Used XML, XSLT, JSON, and Schemas for communication between different tiers of the application.

Migrated content from the existing website to a new, database-driven website.

Designed table-less layouts using CSS and appropriate HTML tags as per W3C standards.

Used jQuery plug-ins to implement features such as a lightbox and sliders; developed custom data grids in the jQuery framework to deliver business data.

Worked on the project dashboard, which contained various charts and draggable components, using the jQuery UI library.

Environment: AGILE, AJAX, JSPs, HTML, XHTML, XML, JSON, jQuery, CSS, JavaScript, MySQL, and Windows XP.


