Murali Mohan
Senior DevOps Engineer
********************@*****.***
www.linkedin.com/in/murali-m-4a10bb211
Professional Summary:
** ***** ** ** ********** and 6 years as a DevOps cloud engineer in Configuration Management, Build Engineering, and Release Management processes, including building binaries, end-to-end code configuration, and deployment of artifacts across the entire life cycle of enterprise applications.
Experienced in DevOps, Build & Release and Configuration Management on Linux and Windows platforms.
Expert in applying the principles and best practices of Software Configuration Management (SCM) in Agile, Scrum, and Waterfall methodologies.
Automated deployment operations across DevOps, Configuration Management, and Cloud Infrastructure using Jenkins, Maven, Docker, AWS, Git, Linux, etc.
Extensively worked with Version Control Systems SVN (Subversion), GIT, and TFS.
Worked on continuous integration setup with GitLab, Jenkins, and Argo CD.
Managed GitHub repository (tree) access for CI/CD through Jenkins deployments.
Virtualized servers with Docker for test and other environment needs, and automated configuration using Docker containers.
Used Kubernetes to deploy, scale, load balance, and manage Docker containers across multiple namespaced versions (see the sketch after this summary).
Configured the Application Lifecycle Management (ALM) tool JIRA to track project progress.
Experienced working on Version control tools like Git, GitHub, SVN, Bitbucket.
Experienced in Amazon AWS Cloud administration, including services such as EC2 and S3.
Implemented Continuous Integration and Continuous deployment using CI Tools like Jenkins.
Integrated Jenkins with the version control tools to pull the latest pushed code.
Extensively used Docker to virtualize, run, ship, and deploy applications securely, speeding up build/release engineering.
Expert in creating Jenkins Environment and configuring end to end build pipelines.
Built servers on AWS, including importing volumes, launching EC2 instances, and creating security groups, auto scaling, and load balancers in the defined virtual private cloud (VPC).
Ensured data integrity and data security on AWS technology by implementing AWS best practices.
Worked on cloud migration from the physical data center to Amazon Web Services (AWS) and had a good understanding of Public, Private, and Hybrid cloud environments.
Expertise in using build tools such as Maven to build deployable artifacts (WAR, EAR) from source code.
Used Docker and OpenShift to manage microservices for development and testing. Expert with the bug-tracking tool JIRA. Wrote shell (Bash), Ruby, Python, and PowerShell scripts to automate tasks.
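A minimal sketch of the Docker and Kubernetes workflow described above; the image name, tag, namespace, ports, and replica count are illustrative assumptions:

    # Build and tag an application image for the test environment (name/tag are assumptions;
    # the image is assumed to be available to the cluster via a registry)
    docker build -t myapp:1.2.0 .
    # Create a version-specific namespace, then deploy, scale, and load-balance the container
    kubectl create namespace myapp-v1
    kubectl -n myapp-v1 create deployment myapp --image=myapp:1.2.0
    kubectl -n myapp-v1 scale deployment myapp --replicas=3
    kubectl -n myapp-v1 expose deployment myapp --port=80 --target-port=8080 --type=LoadBalancer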
Educational Qualification:
MS in Computers and Information Systems.
Technical Skills:
AWS Services: EC2, ELB, VPC, RDS, IAM, KMS, S3, CloudWatch, CloudTrail, SNS, SQS, EBS, AWS Direct Connect, EKS, ECS, Route 53, Load Balancer, Auto Scaling
Platform: UNIX, Linux, Windows
Build Tools: ANT, Maven, Jenkins
Scripting: Shell and Bash
Database: PL/SQL
Config Management: Ansible
Containers & Orchestration: Docker, Kubernetes
Enterprise Servers & Middleware: Apache, Tomcat, Nginx
Monitoring Tools: Control-M, ActiveBatch, Dollar U, Grafana, Nagios
Version Control Tools: Git, GitHub
DWH Tools: Ab Initio, Informatica
Other Tools: ServiceNow, WinSCP, Core FTP, vSphere Client
Certifications:
AWS Certified Solutions Architect - Associate.
ITILF - EXN5824855
Work Experience:
Client: AT&T, Bridgeton, MO
Sr DevOps Engineer Aug 2021 - Current
Responsibilities:
Handled a combination of process automation, deployment, patching, and troubleshooting; automated the release and deployment process from the development environment right up to production.
Deployed and managed Kubernetes clusters in AWS Cloud and on-premises environments
Built and deployed Java and Node.js applications in pre-production and production environments.
Handled Jenkins installation, job creations, plug-in installation, and administration.
Wrote scripts in Bash and Python for server administration and automation
Containerized applications using Docker and managed them with Kubernetes.
Implemented a continuous deployment system with Jenkins and Argo CD.
Configured a Jenkins multibranch pipeline job that is triggered whenever changes are pushed.
Configured network protocols and services such as TCP/IP, DNS, and DHCP
Managed backups, disaster recovery, and system upgrades
Implemented Continuous Integration using Jenkins and Stash.
Coordinated with application development teams to troubleshoot issues.
Responsible for the Continuous Delivery pipeline provided to all application teams as they onboard to Jenkins, as part of the migration team.
Created Windows and Linux AWS instances from AMIs.
Good experience in Infrastructure as Code (IaC) with Terraform for launching EC2 instances (see the sketch below).
Set up and managed Linux and Windows servers on AWS (EC2, EBS, ELB, S3, Route 53, RDS, and IAM).
Worked on creating and managing AMIs, snapshots, and volumes, and on upgrading/downgrading AWS resources (CPU, memory, EBS).
Worked on setting up and managing VPCs and subnets, and on connecting different zones.
Managed Linux servers on the AWS cloud: EC2, VPC, ELB, IAM, S3, AMIs, Route 53, Security Groups, CloudWatch.
Worked extensively on making applications more scalable and highly available in AWS (load balancing) with full automation.
Used AWS ECR (Elastic Container Registry) to store versioned Docker images and deploy those images to ECS services (see the sketch below).
Developed CloudFormation templates to build EC2 instances on demand.
Worked with Jenkins to build and deploy Java code from GitHub to production servers using a Continuous Integration and Continuous Delivery (CI/CD) pipeline; synced the applications in Argo CD for each data center.
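A minimal sketch of the Terraform workflow for launching EC2 instances; the configuration directory, variable, and instance type are illustrative assumptions:

    # Hypothetical Terraform run for an EC2 configuration (directory and variable are assumptions)
    cd terraform/ec2
    terraform init                                         # initialize the AWS provider and backend
    terraform plan -var="instance_type=t3.micro" -out=ec2.tfplan
    terraform apply ec2.tfplan                             # launch the EC2 instance(s) defined in the .tf files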
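A minimal sketch of the ECR push and ECS rollout described above; the account ID, region, repository, cluster, and service names are illustrative assumptions:

    # Authenticate Docker to ECR, push a versioned image, and force a new ECS deployment
    aws ecr get-login-password --region us-east-1 | \
      docker login --username AWS --password-stdin 123456789012.dkr.ecr.us-east-1.amazonaws.com
    docker build -t myapp:1.4.2 .
    docker tag myapp:1.4.2 123456789012.dkr.ecr.us-east-1.amazonaws.com/myapp:1.4.2
    docker push 123456789012.dkr.ecr.us-east-1.amazonaws.com/myapp:1.4.2
    aws ecs update-service --cluster prod-cluster --service myapp-service --force-new-deployment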
Environment: AWS Cloud, Ansible, Jenkins, Kubernetes, Docker, Terraform, Git, Stash, Shell, Python, Datadog, Splunk, Apache, Tomcat, Vault, Nginx, MySQL.
Client: Cetera Financial Group, El Segundo
AWS Cloud Engineer May 2018 to July 2021
Responsibilities:
Deployed microservices to an Elastic Kubernetes Service (EKS) cluster using Helm charts and Jenkins (see the sketch below).
Designed and implemented AWS Cloud architectures using services such as EC2, S3, RDS, and Lambda
Virtualized servers using Docker for all environments.
Implemented and maintained a continuous deployment system with Jenkins.
Automated deployments using Ansible (see the sketch below).
Worked closely with development teams to implement DevOps methodologies and practices
Conducted security assessments and implemented security measures such as SSL, SSH, and VPN
Used Maven as the build tool on Java projects to produce build artifacts from source code.
Provided support for DEV, QA, UAT and PROD environments.
Implemented server monitoring and alerting using tools such as Nagios and Zabbix
Provided technical support to customers for web hosting and email services
Created AMIs of critical EC2 instances as backups using the AWS CLI and console (see the sketch below).
Used GIT as SCM for branching, merging, and tagging.
Built end to end CI/CD Pipelines in Jenkins to retrieve code, compile applications, perform tests and push build artifacts to Nexus.
Experience with container-based deployments using Docker, working with Docker images, Docker Hub, and Docker registries, and installing, configuring, and clustering Kubernetes.
Experience with Docker and Kubernetes as a container security engineer, implementing monitoring/auditing of container security events and container network security detection.
Experience in monitoring system/application logs of the server using Splunk to detect production issues.
Involved in editing existing Maven files to resolve errors and accommodate changes in project requirements.
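A minimal sketch of a Helm deployment to EKS as described above; the cluster, region, release, chart path, namespace, and image tag are illustrative assumptions:

    # Point kubectl at the EKS cluster, then install/upgrade the microservice's Helm chart
    aws eks update-kubeconfig --region us-east-1 --name dev-eks-cluster
    helm upgrade --install orders-service ./charts/orders-service \
      --namespace orders --create-namespace \
      --set image.tag=1.0.3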
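A minimal sketch of an Ansible-driven deployment, assuming a hypothetical inventory, playbook, host group, and variable:

    # Run the deployment playbook against the production inventory (all names are assumptions)
    ansible-playbook -i inventories/prod/hosts deploy_app.yml \
      --limit webservers --extra-vars "app_version=2.3.1"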
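A minimal sketch of creating a backup AMI of a critical EC2 instance with the AWS CLI; the instance ID and image name are illustrative assumptions:

    # Create an AMI from the instance without rebooting it (ID and name are assumptions)
    aws ec2 create-image \
      --instance-id i-0123456789abcdef0 \
      --name "app-server-backup-2021-06-01" \
      --no-reboot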
Environment: AWS Cloud, Ansible, Jenkins, Kubernetes, Docker, Shell, Python, Splunk, Apache, Tomcat, Maven, GitLab, GIT.
Client: Cetera Financial Group, El Segundo
Sr System Admin Aug 2015 to May 2018
Responsibilities:
Good knowledge of ActiveBatch administration tasks such as agent management and application restarts.
Ownership of creating and maintaining jobs using the ActiveBatch product.
Responsible for creating the job schedules and job templates for new requirements.
Providing L1/L2 operational support and maintenance of the platforms.
Analyzing and creating job workflows and dependencies.
Migration of jobs to test and production environments.
Ensure that jobs run correctly and monitor the system via ActiveBatch (AB) Reporting.
Good knowledge of scheduling concepts including Time Zones, New Day, Resource Management, User Daily, etc.
Maintaining schedules, updating documentation and error handling; source code and documentation change control.
Ensure that changes meet quality standards before they are promoted to Production by completing functional testing.
Proactively manage the batch schedules as per agreed procedures/SLAs.
Implement authorized changes as appropriate. Escalate exceptions appropriately.
Handled the outages and patch upgrades that happen every quarter.
Standardize ActiveBatch process across all applications.
Client: American Express, Salt Lake City.
Production Support Analyst Jul 2010 to Jul 2015
Responsibilities:
Delivered mission-critical projects and initiatives while proactively maintaining and improving service delivery.
Actively Migrated jobs to test and production environments.
Monitored the migration process via Control-M Configuration Manager (CCM).
Experience with basic Control-M administration tasks such as agent management and application restarts.
Ownership of creation and maintenance of jobs using BMC Control-M.
Manage the batch schedules as per agreed procedures to meet SLA.
Review changes for operational and service impact.
Implement authorized changes as appropriate. Escalate exceptions appropriately.
Handled the outages and patch upgrades that happen every quarter.
Standardize Control-M process across all applications.
Monitor existing jobs and Mentor other team members in this product.
Provide day-to-day support for department applications related to Control-M.
Assisted proactively and practically in the delivery of mission-critical projects and initiatives for commercial IT services.
Proactively manage the batch schedules as per agreed procedures/SLAs.