Post Job Free

Cloud Architect / DevOps Operations Manager

Location:
Burlington, MA
Posted:
November 21, 2023

ANJANEYA PRASAD RAVI

Email: ad0s7g@r.postjobfree.com | Ph: 669-***-****

LinkedIn: https://www.linkedin.com/in/anjaneya-ravi/

SUMMARY:

Skilled Cloud DevOps Engineer with 8+ years of IT experience across Azure and AWS, with expertise in Continuous Integration, Continuous Deployment, Continuous Delivery (CI/CD), Software Configuration Management, Build and Release Management, Version Control, Troubleshooting, Automation, and Linux System Administration.

Experienced in all phases of the software development life cycle (SDLC), with a specific focus on the build and release of quality software. Experienced in Waterfall, Agile/Scrum, and most recently Continuous Integration (CI) and Continuous Deployment (CD) DevSecOps practices.

Experienced in Azure administration: deploying, configuring, and maintaining compute on the Azure cloud.

Experience in Build and Release Management, Software Configuration Management, Project Configuration and Change Management tools using JIRA, Workday, and Outlook.

Experience in Azure development: worked on Azure Web Apps, App Services, Azure Storage, Azure SQL Database, Virtual Machines, Fabric Controller, Azure AD, Storage Accounts, and ARM templates.

Expertise in automating the builds and releases of Java, .NET, and AngularJS applications using VSTS/Azure DevOps Services.

Extensive experience in cloud infrastructure and designing custom build steps using PowerShell.

Experience in CI/CD tools like Azure DevOps, Jenkins, GIT, Maven & Gradle.

Designed AWS CloudFormation templates to create custom-sized VPCs, subnets, and NAT to ensure successful deployment of web applications and database templates. Expertise in architecting and securing VPC solutions in AWS using network ACLs, security groups, and public and private network configurations.
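For illustration, a custom-sized VPC template of the kind described above can be generated in a few lines of Python. This is a minimal sketch, not the actual templates; all resource logical names and CIDR blocks are assumed for the example.

```python
import json

def make_vpc_template(vpc_cidr="10.0.0.0/16", subnet_cidr="10.0.1.0/24"):
    """Build a minimal CloudFormation template: a custom-sized VPC, one
    subnet, and a security group allowing HTTPS only. Names are illustrative."""
    return {
        "AWSTemplateFormatVersion": "2010-09-09",
        "Resources": {
            "AppVpc": {
                "Type": "AWS::EC2::VPC",
                "Properties": {"CidrBlock": vpc_cidr, "EnableDnsSupport": True},
            },
            "PrivateSubnet": {
                "Type": "AWS::EC2::Subnet",
                "Properties": {"VpcId": {"Ref": "AppVpc"}, "CidrBlock": subnet_cidr},
            },
            "WebSecurityGroup": {
                "Type": "AWS::EC2::SecurityGroup",
                "Properties": {
                    "GroupDescription": "Allow HTTPS only",
                    "VpcId": {"Ref": "AppVpc"},
                    "SecurityGroupIngress": [
                        {"IpProtocol": "tcp", "FromPort": 443, "ToPort": 443,
                         "CidrIp": "0.0.0.0/0"}
                    ],
                },
            },
        },
    }

# The JSON output can be passed to `aws cloudformation create-stack --template-body ...`
print(json.dumps(make_vpc_template(), indent=2))
```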

Implemented AWS Lambda functions to run scripts in response to events such as Amazon DynamoDB table updates, S3 bucket changes, and HTTP requests via Amazon API Gateway.
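An event-driven Lambda of this kind typically parses the S3 notification records it receives. The sketch below assumes the standard S3 event shape and returns what it would process; the real side effect (e.g., a DynamoDB write) would replace the `append`.

```python
import json
import urllib.parse

def handler(event, context=None):
    """Minimal AWS Lambda handler for S3 ObjectCreated notifications.
    Extracts (bucket, key) from each record; keys arrive URL-encoded."""
    processed = []
    for record in event.get("Records", []):
        bucket = record["s3"]["bucket"]["name"]
        key = urllib.parse.unquote_plus(record["s3"]["object"]["key"])
        processed.append({"bucket": bucket, "key": key})
    return {"statusCode": 200, "body": json.dumps(processed)}
```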

Good knowledge of Azure Active Directory and Azure Service Bus; created and managed Azure AD tenants and configured application integration with Azure AD.

Experience in administration and troubleshooting of Azure IaaS components (VM, Storage, VNET, OMS, NSG, Site-to-Site VPN, RBAC, Load Balancers, Availability Sets).

Worked on Azure databases and storage (Azure SQL DB, Cosmos DB, Blob Storage).

Experience with Google Cloud Platform (GCP) services such as Compute Engine, Cloud Load Balancing, Cloud Storage, Cloud SQL, Stackdriver Monitoring, and Cloud Deployment Manager.

Experience in Infrastructure as Code (IAC) tools like Ansible, Terraform, Chef, and Puppet.

Strong experience with version control tools like Git, TFS and SVN for code review.

Experience with containerization and orchestration tools such as Docker, Kubernetes, and OpenShift on cloud platforms (AWS, Azure & GCP).

Used Kubernetes to orchestrate the deployment, scaling, and management of Docker Containers.

Worked in deploying applications to Oracle WebLogic, WebSphere, JBOSS, TOMCAT Servers.

Performed troubleshooting, load balancing, clustering, application deployment, performance tuning, and maintenance in WebLogic, WebSphere, and other app servers as daily duties.

Good understanding of policies in CyberArk Central Policy Manager (CPM) and PAM.

Managing the source code control system. Developing automated, continuous, build process that reviews the source code, identifies build errors, and notifies appropriate parties to expedite/facilitate synchronization to the latest build.

Experienced in Clustering, Load Balancing techniques to ensure High Availability and Disaster Recovery.

Skilled in monitoring servers using Nagios, Splunk, Syslog, CloudWatch, and the ELK stack (Elasticsearch, Logstash & Kibana).

Performed real-time system monitoring, traffic tracking, and trend analysis using network management tools (Splunk, SiteScope, Insight Manager, OpenView).

Experience with Enterprise Vulnerability management using Qualys, Tenable security, and Checkmarx.

Experience with various bug tracking tools like Jira, Clear Quest, and Remedy.

Ability to use scripting languages such as Bash/Shell, PowerShell, and Python, along with JSON, for deployments, build scripts, and automated solutions.

Strong Experience as a Production support engineer providing 24/7 technical support for middleware application servers.

Experience in writing the infrastructure automation scripts in Python and Terraform.

Exposed to all aspects of software development life cycle (SDLC) such as Analysis, Designing, Planning, Developing, Testing, Implementing, Deployment and Support of distributed enterprise scalable, secure and transactional J2EE applications and post-production analysis of the projects.

Technical Skills:

SCM Tools

Subversion (SVN), GIT, ClearCase, Perforce

Build Tools

Ant, Maven, Gradle

CI Tools

Jenkins, Hudson, BuildForge, and CruiseControl

Configuration Tools

Chef, Puppet, Ansible

Automation Tools

Docker, Kubernetes, OpenShift and Vagrant

Monitoring Tools

Splunk, Nagios

Tracking Tools

Jira, Remedy, ClearQuest

Cloud Platforms

AWS, Azure, OpenStack, GCP, PCF

AWS Services

EC2, ELB, VPC, RDS, IAM, CloudFormation, S3, CloudWatch, CloudTrail, SNS, SQS, SWF, EBS, EMR, EKS, ELK, DynamoDB, Redshift

AWS RDS

PostgreSQL, Aurora, MySQL

Database System

SQL Server 2000/2005, MongoDB, Oracle 9i/10g (PL/SQL)

Scripting Languages

Python, Ruby, Perl, Shell Scripting, PowerShell

Languages

Java, C++, PHP

Web Technologies

Google Web Toolkit, HTML, CSS, XML, XSLT, JavaScript

Servers

WebLogic, WebSphere, JBoss, Apache Tomcat, Kafka, TFS, IIS, Nginx

Network Services

FTP (vsftpd), SSH, Telnet, TCP/IP, HTTP, DHCP, SNMP, SMTP, NFS, WinSCP, SAN/NAS, multipathing, RAID levels

Platforms

UNIX, Linux, HP-UX, Solaris, CentOS 6.5/7, Red Hat, Windows NT/2003/2008, Microsoft Windows Vista/XP/2000

PROFESSIONAL EXPERIENCE:

Client: Aware Inc., June 2021 – Present

Role: Cloud Architect

Responsibilities:

Worked as an active team member for both product development and the operations teams. As the DevSecOps Engineer, created and maintained Azure DevOps organizations, self-hosted Build agents and agent pools. Configured security policies and strategies for resources.

Worked on containerization to optimize the CI/CD workflow as a group effort.

Worked as an Azure cloud engineer configuring Ansible for continuous deployments, creating VMs with ARM templates, and building Azure Pipelines for continuous delivery with blue/green deployments.

Most recently, worked on multistage YAML pipelines for build and release.

Used Azure Key Vault to hold pipeline secrets and passwords, and integrated it with CyberArk.

Experience in Azure big data with HDInsight and Databricks (Spark).

Good understanding of policies in CyberArk Central Policy Manager (CPM) and PAM.

Experience in automating the securities and monitoring the Azure Key Vaults and Service Connections with CyberArk.

Created Ansible roles for deploying Azure services as part of continuous integration and delivery.

Worked on application Infrastructure setup with Ansible & Terraform for deployments across regions in Azure.

As an admin, managed Databricks services: user access, SSO setup, and workspace storage.

Experience on Azure using Azure Data Platform services (Azure Data Lake, Data Factory, Data Lake Analytics, Stream Analytics, Azure SQL DW, HDInsight/Databricks, NoSQL DB)

Experience in Azure serverless computing such as Logic Apps, Functions, and Event Grid.

Migrated SQL databases, ETLs, and SSIS packages to Azure Databricks and Azure Data Factory on the Azure cloud platform.

Experience in SSIS project-based deployments in Azure cloud

Designed SSIS packages to transfer data from flat files and Excel to SQL Server using Business Intelligence Development Studio.

Migrated ETLs to Azure Data Factory and Databricks for multiple teams.

Integrated Azure Pipelines with Selenium automation for continuous testing.

Experience on Visual studio, Visual Studio code, Microsoft Azure SQL server, Azure Database, Azure Data factory.

Migrated projects using a mechanism in which Jenkins jobs are created automatically when certain branches are created in Git and Bitbucket.

Migrated Jenkins deployment jobs to Azure Pipelines, building the CI/CD infrastructure with Ansible.

Migrated Ansible-based deployments to a declarative Terraform approach for faster deployments.

Experience in release management, managing end-to-end product delivery to production environments.

Migrated CI deployments from Jenkins to the Azure Pipelines framework using an Ansible architecture.

Integrated Azure Pipelines with static code analysis tools such as SonarQube and Checkmarx.

Integrated .NET Core and .NET Framework builds with Azure for continuous builds using Ansible roles.

Created and modified Python and PowerShell scripts for CI/CD.

Created shell scripts and Cron jobs that monitored and reported security issues.
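A cron-driven security monitor of the kind described above typically scans logs for failed logins and flags noisy sources. This is a hypothetical sketch, not the actual scripts; the log format assumed is standard sshd syslog output.

```python
import re
from collections import Counter

FAILED = re.compile(r"Failed password for (?:invalid user )?(\S+) from (\S+)")

def failed_login_report(log_lines, threshold=3):
    """Count failed SSH logins per source IP and return those at or
    above `threshold` -- the check a scheduled cron job might run."""
    hits = Counter()
    for line in log_lines:
        m = FAILED.search(line)
        if m:
            hits[m.group(2)] += 1
    return {ip: n for ip, n in hits.items() if n >= threshold}

sample = [
    "sshd[1]: Failed password for root from 203.0.113.9 port 22 ssh2",
    "sshd[2]: Failed password for invalid user admin from 203.0.113.9 port 22 ssh2",
    "sshd[3]: Failed password for root from 203.0.113.9 port 22 ssh2",
    "sshd[4]: Accepted password for deploy from 198.51.100.7 port 22 ssh2",
]
print(failed_login_report(sample))  # flags only the repeat offender
```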

Working closely with Architecture, Automation, Security and Data Engineers teams.

Client: Verizon. Mar 2021 – June 2021

Role: Sr. DevOps Engineer/Platform Engineer

●Designed roles and groups for users and resources using AWS Identity Access Management (IAM).

●Extensive experience with Agile Team Development and Test-Driven Development using JIRA

●Server-side microservices development in Spring 4.3

●Interacted with business and development teams to understand requirements, evaluate feasibility, and plan and implement integration and functional testing.

●Implemented Service layer using Spring IOC and annotation, controllers using Spring MVC

●Developed Docker images to support development and testing teams and their pipelines, including distributed Jenkins and JMeter images.

●Used different plug-ins of Maven to clean, compile, build, install, deploy and more for jars and wars

●Managed transactions using Spring AOP and Spring Transaction Management; used the Spring IoC container for dependency injection, along with Elasticsearch and Logstash.

●Working closely with Architecture, Development, Test, Security, and IT service teams.

●Developing scripts for build, deployment, maintenance, and related tasks using Jenkins, Docker, Gradle, Bash

●Involved in development of REST web services using Spring framework to extract data from databases.

●Automated deployment of applications using Jenkins Pipeline and infrastructure using Puppet as configuration management tool and Bitbucket as source management tool

●Designed and implemented scalable, secure cloud architecture using EC2, S3, VPC, and CloudFormation

●Leveraged AWS cloud services such as EC2, auto scaling and VPC to build secure, highly scalable, and flexible system

●Configured security groups for EC2 Windows and Linux instances; created users and groups in IAM

●Implemented core design patterns such as DAO, Singleton, Facade, Factory, and Observer

●Created SQL queries and PL/SQL stored procedures and functions for the database layer by analyzing the required business objects and validating them with stored procedures.

●Engaged in development and tasking of user and technical stories.

Client: Fresenius, MA. July 2019 – Feb 2021

Role: Platform Engineer

●Designed roles and groups for users and resources using AWS Identity Access Management (IAM).

●Implemented AWS solutions using EC2, S3, RDS, EBS, Elastic Load Balancer, Auto scaling groups, optimized volumes and EC2 instances.

●Utilized CloudWatch to monitor resources such as EC2 CPU and memory, Amazon RDS DB services, DynamoDB tables, and EBS volumes.

●Used Apache DB Utils package to parse Result Set object returned from a SQL query to Java Bean object which results in improving code quality.

●Designed a CI/CD pipeline for embedded IoT devices in healthcare systems at the clinics, gathering data with AWS Greengrass.

●Utilized AWS Lambda platform to upload data into AWS S3 buckets and to trigger other Lambda functions.

●Created users and groups using IAM and assigned individual policies to each group.

●Created SNS notifications and assigned ARN to S3 for object loss notifications.

●Created S3 backups with versioning enabled and moved objects to Amazon Glacier for archiving purposes.
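Moving backups to Glacier is usually done with a bucket lifecycle rule. The helper below builds the configuration dict in the shape boto3's `put_bucket_lifecycle_configuration` expects; the prefix and transition window are illustrative assumptions, not values from the original project.

```python
def glacier_archive_lifecycle(prefix="backups/", days=30):
    """Lifecycle configuration that transitions objects under an assumed
    prefix to the GLACIER storage class after `days` days."""
    return {
        "Rules": [
            {
                "ID": "archive-backups",
                "Filter": {"Prefix": prefix},
                "Status": "Enabled",
                "Transitions": [{"Days": days, "StorageClass": "GLACIER"}],
            }
        ]
    }

# With credentials configured, this would be applied as:
# import boto3
# boto3.client("s3").put_bucket_lifecycle_configuration(
#     Bucket="example-backup-bucket",  # hypothetical bucket name
#     LifecycleConfiguration=glacier_archive_lifecycle())
```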

●Used Java 1.8 features like Java Stream API, Lambda expressions with references, Collections, Concurrency API, etc. to enhance the code quality and readability.

●Wrote indexing queries on MongoDB collections to expedite fetches based on field values in the documents.

●Used Elasticsearch to fetch data while performing aggregations on data and also performing boosting on some data to sponsor them.

●Developed a RESTful web service as a microservice using Spring to fetch data from MongoDB, perform domain and business validations on entities, and publish messages to multiple Kafka queues.

●Developed data services backed by MySQL, Hive, PostgreSQL, Tableau, and Teradata, mapping results to domain entities and making HTTP Feign calls to process them.

●Used Jenkins and Docker container as Continuous Integration and Continuous Deployment tools.

Client: Zappos group of Companies, Nevada. Oct 2018 – July 2019

Role: Platform Engineer (AWS)/ Sr. Software Engineer

Project: Inventory in The Cloud

Zappos.com, Inc. announced its plans to join the Amazon.com, Inc. family. With both companies sharing such a strong passion for customer service, we were very excited to begin growing together. By 2010, Zappos had grown so much that there was a need to restructure the company so that we could continue to offer customers the very best service possible. For us to have the flexibility to possibly sell anything and everything one day, we needed to make this change. On May 1, 2010, Zappos was restructured into ten separate companies under the Zappos Family umbrella. Regardless of our structure, our goal is to position Zappos as the online service leader. If we can get customers to associate the Zappos brand with the absolute best service, then we can expand into other product categories beyond shoes. And we're doing just that.

Inventory in the Cloud is an auction management system that handles inventory for the Zappos Family of companies and Amazon, providing a service that lets all members of the organization procure and manage inventory from suppliers.

Responsibilities:

•Deployed infrastructure on AWS utilizing EC2 (Virtual Servers in the Cloud), RDS (Managed Relational Database Service), VPC and Managed Network and Security, Route 53, Direct Connect, IAM, CloudFormation, AWS OpsWorks (Automate operations), Elastic Beanstalk, AWS S3, Glacier (Storage in the cloud) and CloudWatch Monitoring Management.

•Used AWS Beanstalk for deploying and scaling web applications and services developed with Java, PHP, Node.js, Python, Ruby, and Docker on familiar servers such as Apache, and IIS.

•Designed AWS Cloud Formation templates to create custom sized VPC, subnets, NAT to ensure successful deployment of Web applications and database templates.

•Developed Controller for request, response paradigm by Spring Controllers using Spring-Boot. Used JSON as response type in REST services.

•Implemented Elastic Load Balancers (ELB's) and Auto Scaling groups in AWS of Production EC2 Instances to build Fault-Tolerant and Highly Available applications.

•Created Python scripts to fully automate AWS services, including web servers, ELB, CloudFront distributions, databases, EC2 and database security groups, S3 buckets, and application configuration. These scripts create stacks, build single servers, or join web servers to existing stacks.
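The core of such automation scripts is typically a pure helper that builds the request payloads, kept separate from the AWS calls. Here is a hedged sketch building the `IpPermissions` structure that boto3's `authorize_security_group_ingress` accepts; the office CIDR and group ID are illustrative.

```python
def web_tier_ingress(office_cidr="203.0.113.0/24"):
    """Build ingress rules for a web tier: HTTP/HTTPS from anywhere,
    SSH only from an assumed office CIDR."""
    def rule(port, cidr):
        return {"IpProtocol": "tcp", "FromPort": port, "ToPort": port,
                "IpRanges": [{"CidrIp": cidr}]}
    return [rule(80, "0.0.0.0/0"), rule(443, "0.0.0.0/0"), rule(22, office_cidr)]

# With credentials configured, the rules would be applied as:
# import boto3
# boto3.client("ec2").authorize_security_group_ingress(
#     GroupId="sg-0123456789abcdef0",  # hypothetical group ID
#     IpPermissions=web_tier_ingress())
```

Keeping the payload builder pure makes it easy to unit-test without touching AWS.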

•Built servers using AWS, importing volumes, launching EC2, RDS, creating security groups, auto-scaling, load balancers (ELBs) in the defined virtual private connection.

•Utilized Ansible and Jenkins to automate the provisioning of our identity management solution which is used to implement Single Sign On for AWS. EKS authentication integrated with SSO as well.

•Created Micro services applications with integrations to AWS services by using Amazon EKS, while providing access to the full suite of Kubernetes functionality.

•Used various services of AWS for this infrastructure. I used EC2 as virtual servers to host Git, Jenkins and configuration management tool like Ansible. Converted slow and manual procedures to dynamic API generated procedures.

•Worked on Cloud automation using AWS Cloud Formation templates and Implemented scheduled downtime for non-prod servers for optimizing AWS pricing.

•Responsible for monitoring servers using CloudWatch and the ELK stack (Elasticsearch, Logstash, and Kibana); developed a RESTful API on Elasticsearch to analyze, search, and visualize data.

•Created a new infrastructure setup in AWS involving automation scripts written in Terraform and integrated with the continuous integration channel using Jenkins.

•Experience with container-based deployments using Docker, working with Docker images, Docker HUB and Docker registries.

•Worked on creation of Docker containers and Docker consoles for managing the application life cycle

•Worked on Container Platform for Docker and Kubernetes. Used Kubernetes to manage containerized applications using its nodes, ConfigMaps, selector, Services and deployed application containers as Pods

•Ran Kubernetes locally with Minikube; created local clusters and deployable application containers.

•Used Kubernetes to provide a platform for automating deployment, scaling, and operations of application containers across clusters of hosts. Worked closely with development teams and performance test engineers for EC2 size optimization and Docker build containers

•Used Kubernetes to deploy, load balance, scale and manage Docker containers with multiple namespace versions.

•Imported and managed multiple corporate applications into the GitHub code management repo and Created user level access to GitHub project directories to the code changes.

•Managed central repositories: Implemented Atlassian Stash along with Git to host Git central repositories for source code across products, facilitate code reviews and login audits for Security Compliance.

•Implemented building tools such as Maven to automate and enhance the overall operational environment and Managed the Maven Repository using Nexus tool to automate the build process and used the same to share the snapshots and releases of internal projects.

•Set up CI (Continuous Integration) for major releases in Jenkins and TeamCity.

•Responsible for implementation of Ansible Tower as Configuration management tool, to automate repetitive tasks, and quick deployments of critical applications.

•Performed configuration automation and centralized management with Ansible and Cobbler. Implemented Ansible to manage all existing servers and automate the build/configuration of new servers. All server types were fully defined in Ansible, so that a newly built server could be up and ready for production within 30 minutes of OS installation.

•Implemented Ansible Vault to encrypt and decrypt sensitive information files so they are not exposed when placed in GitHub.

•Used chef for server provisioning and infrastructure automation in different environments.

•Integrated automated builds with the deployment pipeline. Installed Chef server and clients to pick up builds from the Jenkins repository and deploy to target environments (Integration, QA, and Production).

•Implemented Chef Recipes for Deployment on build on internal Data Centre Servers. Also re-used and modified same Chef Recipes to create a Deployment directly into Amazon EC2 instances.

•Wrote Chef Cookbooks for various DB configurations to modularize and optimize end product configuration.

•Automated deployment of builds to different environments using Jenkins/Hudson CI tools.

•Designed and implemented the backup strategy for all the critical systems such as build machines, bug tracking tools, central repositories etc.,

•Experience developing Splunk queries and dashboards targeted at understanding application performance and capacity analysis.

•Worked with different Bug tracking tools like JIRA. Experience on SonarQube for continuous inspection of code quality.

•Excellent TCP/IP Networking skills and Basic knowledge of networking, firewall and load balancing concepts and their configuration

•Developed Linux based Java systems that interface with Enterprise Class Layer 2-3 SDN/NFV Network Devices, with a deep understanding of IP Networking, network concepts, TCP stack, routers/switches.

Environment: AWS (EC2, VPC, ELB, S3, RDS, EBS, IAM, Route 53, CloudWatch, CloudFormation, CloudTrail, Auto Scaling, AWS CLI), VDI, Linux, Ansible, Git, GitHub, Subversion, Maven, Nagios, Jenkins, Chef, Unix/Linux, Shell scripting, Terraform, Docker, Kubernetes.

Client: Verizon Communications Inc., Texas. Feb 2018 - Oct 2018

Role: Tech Lead/Sr. Software Engineer

Project: Optix Telecom Services

Verizon Communications Inc., through its subsidiaries, provides communications, information, and entertainment products and services to consumers, businesses, and governmental agencies. Its segments include Wireless and Wireline. The Wireless segment offers communications products and services, including wireless voice and data services and equipment sales, to consumer, business, and government customers across the United States. The Wireline segment offers voice, data, and video communications products and services, such as broadband video and data, corporate networking solutions, data center and cloud services, security and managed network services, and local and long-distance voice services. Verizon Wireless had 114.2 million retail connections. It also provides fourth-generation (4G) Long-Term Evolution (LTE) and third-generation (3G) technology.

Optix Telecom Services is a CRM tool developed internally by Verizon for representatives at call centers and over the phone, providing ease of use for both customers and representatives. It includes various features such as claims, deposits, billing, and many more.

Responsibilities:

Responsible for deployment automation using multiple tools: Ansible, Chef, Jenkins, Git, and ANT scripts.

Wrote Puppet manifests and Chef cookbooks and recipes in Ruby to provision several pre-prod environments consisting of Cassandra DB installations, domain creations, and several proprietary middleware installations.

Implemented and maintained Docker and Kubernetes orchestration, spinning up the required images, mapping ports for each module, and integrating them via Dockerfiles.

Implemented Jenkins for continuous integration (CI) and continuous deployment (CD) by writing the Ansible playbook for deployment in different environments.

Deployed a centralized log management system and integrated into Chef to be used by developers.

Applied the test-driven development methodology while developing which yielded cohesive, loosely coupled and tested code.

Implemented modules using RESTful web services and published them to the end client for consumption.

Implemented Apigee (Google Cloud Platform) for publishing APIs to clients.

Implemented development and integration on the Amazon Web Services (AWS) cloud platform using Elastic Compute Cloud (EC2), Elastic Load Balancing (ELB), Virtual Private Cloud (VPC), Identity and Access Management (IAM), and Simple Storage Service (S3) for content storage.

Migrated the project from a monolithic to a microservice architecture, and migrated the existing .NET code to Java 8 with the latest updates.

Trained the team on continuous integration and automated testing practices and supported them throughout development.

Implemented Pivotal Cloud Foundry (PCF) and OpenStack development for the on-prem and cloud migration.

Good understanding of OSI Model, TCP/IP protocol suite (IP, ARP, ICMP, TCP, UDP, SMTP, FTP, TFTP)

Hands-on experience in software version control using Git in version and project management.

Configured Jenkins to run nightly builds and generate a change log covering the changes from the previous 24 hours.
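The change-log windowing a nightly job performs can be sketched as a simple filter over commit timestamps. This is an illustrative helper, not the actual Jenkins job; the commit data is made up for the example.

```python
from datetime import datetime, timedelta

def changes_since(commits, now, hours=24):
    """Filter (timestamp, message) commit pairs to those within the last
    `hours` hours -- the windowing a nightly change log performs."""
    cutoff = now - timedelta(hours=hours)
    return [msg for ts, msg in commits if ts >= cutoff]

now = datetime(2018, 6, 2, 0, 0)
commits = [
    (datetime(2018, 6, 1, 9, 30), "Fix billing validation"),
    (datetime(2018, 5, 30, 17, 0), "Refactor claims module"),
]
print(changes_since(commits, now))  # only the commit from the last 24h
```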

Experience in designing and implementing continuous integration system using Jenkins by creating Python and Perl scripts.

Connected continuous integration system with Git version control repository and continually build as the check-in's come from the developer.

Responsible for design and maintenance of the Subversion/Git Repositories, views, and the access control strategies.

Designed and implemented Subversion and Git metadata including elements, labels, attributes, triggers and hyperlinks.

Responsible for nightly and weekly builds for different modules.

Operations - Custom Shell scripts, Ruby scripts, VM and Environment management.

Used Maven and ANT scripts to build the source code. Supported and helped to create Dynamic Views and Snapshot views for end users

Developed maven and Shell scripts to automatically compile, package, deploy and test J2EE applications to a variety of Web Logic platforms.

Managed configuration of the web app and deployed it to AWS cloud servers through Chef.

Used Puppet as a configuration management tool; created modules and manifests in Puppet to automate various applications.

Worked with different team members for automation of Release components.

Built and Deployed Java/J2EE to a web application server in an Agile continuous integration environment and automated the whole process.

Environment: Java/J2EE, Subversion, Ant, Maven, Jenkins, GIT, SVN, Chef, Puppet, Python, Shell Scripting, Ruby.

Client: Versa Networks, Texas. Jan 2017 – Jan 2018

Role: DevOps Engineer / Sr. Software Engineer

Project: Asset Management and Security Services

Versa Networks is unique among software-defined networking vendors, providing an end-to-end solution that both simplifies and secures the WAN/branch office network. Based completely on software, Versa’s Cloud IP Platform delivers a broad set of capabilities for building agile and secure enterprise networks, as well as highly efficient managed service offerings.

Assets and Security Management Services is an internal enterprise asset management (EAM) software solution. As a provider of SD-WAN services, we needed to track application status, user status, and network progress, generate reports, and provide analysis through dynamic data visualization. Various modules were developed for deployment across different cloud environments.

Responsibilities:

•Implemented Azure SQL Server for storing recruitment-related data and worked extensively on queries and stored procedures.

•Provided high availability for IaaS VMs and PaaS role instances for access from other services in the VNet with Azure Internal Load Balancer.

•Configured VM availability sets using the Azure portal to provide resiliency for IaaS-based solutions, and scale sets using Azure Resource Manager to manage network traffic.

•Set-up a continuous build process in Visual Studio Team Services to automatically build on new check-in of code then deploy that new build to the Azure Web application.

•Wrote Ansible playbooks, the entry point for Ansible provisioning, in which the automation is defined through tasks in YAML format; ran Ansible scripts to provision dev servers.
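A minimal playbook of the kind described might look like the sketch below. This is an illustrative fragment only: the host group, package, and file paths are assumptions, not the original playbooks.

```yaml
---
- name: Provision dev web servers (illustrative host group and packages)
  hosts: dev_web
  become: true
  tasks:
    - name: Install nginx
      ansible.builtin.package:
        name: nginx
        state: present
    - name: Deploy app config from template
      ansible.builtin.template:
        src: app.conf.j2
        dest: /etc/nginx/conf.d/app.conf
      notify: Restart nginx
  handlers:
    - name: Restart nginx
      ansible.builtin.service:
        name: nginx
        state: restarted
```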

•Good understanding of OSI Model, TCP/IP protocol suite (IP, ARP, ICMP, TCP, UDP, SMTP, FTP, TFTP)

•Used Ansible to manage Web applications, Environments configuration Files, Users, Mount points and Packages.

•Communicated with team members on both the Ansible Core and Ansible Tower teams to clarify requirements and overcome obstacles.

•Involved in various Java/J2EE technologies including Core Java 1.7/1.8, J2EE (Servlets, JSP), Spring 4.0 (IOC, MVC, AOP, DAO), Hibernate 5.3/JPA and Java Web Framework along with Web Services (RESTful).

•Hands-on experience with Spring IO, Spring MVC, Spring IoC, and Spring Boot to implement a microservice architecture.

•Used Docker containers to package applications and deploy them onto WebSphere Application Server.

•Worked on Deployment Automation of all microservices to pull image from Private Docker registry and deploy to Kubernetes Cluster using Ansible.

•Administered Jenkins, setting up a master-slave architecture and setting permissions for users.

•Responsible for plugin management in Jenkins, upgrading and downgrading plugin versions as required.

•Created subversion/GIT repositories, imported projects into newly created subversion repositories as per the standard directory layout.

•Configured and maintained Jenkins to implement the CI process for major releases and integrated the tool with Ant and Maven to schedule the builds.

•Wrote Python scripts to monitor a variety of services, and Perl scripts with hashes/arrays to insert, delete, and modify content on multiple servers.

•Used Splunk for monitoring and metric collection for applications in a cloud-based environment.

•Integrated Splunk with the Azure deployment using Ansible to collect data from all database server systems into Splunk.

•Configured application servers (JBoss, Tomcat) and deployed applications to them.

•Created customized Kickstart profiles for each server and built ISOs for server builds.

•Implemented file sharing on network by configuring NFS on system to share essential resources.

Environment: Azure, Azure SQL, Azure AD, Linux.


