
Big Data DevOps Engineer

Location: Aurangabad, Maharashtra, India
Salary: 24 LPA
Posted: February 14, 2024


Resume:

My Contact

Email: ad3m1x@r.postjobfree.com

Contact No.: +917*********

Address: Room No. 3, A Wing, Sr. No. 46, Mithanagar, Kondhwa, Pune 411048

Technical Skills

• DevOps Tools: Jenkins CI/CD, GitHub, Ansible, Terraform, Docker, Kubernetes

• Cloud Platform: Amazon Web Services (EC2, S3, VPC, IAM, EKS, CloudWatch, ECS, ECR, ELB, Route 53, and Lambda)

• Monitoring Tools: Prometheus, Grafana, and Alertmanager

• Big Data: Hadoop, MapReduce, Hive, and Kafka

• OS Platforms: Linux and Windows

• Scripting: Bash and MySQL

Soft Skills

• Observation

• Decision making

• Communication

• Multitasking

About Me

DevOps Engineer who aims to improve the efficiency of the organization by implementing automation best practices.

Profile Summary

6+ years of IT experience, including 3 years of relevant AWS and DevOps experience, with knowledge of version control systems, configuration management, build and release management, containerization, orchestration, artifact repositories, and monitoring tools in a cloud environment.

Hands-on experience with DevOps tools such as Git, Jenkins, Docker, Maven, SonarQube, Ansible, Terraform, and Kubernetes.

Experience in maintaining Git workflows for version control and source code management.

Installed, configured, and maintained Jenkins for continuous integration (CI) and for end-to-end automation of all builds and deployments.

Responsible for automating infrastructure creation as code using Terraform and CI/CD pipeline deployment with Jenkins.

Hands-on experience with Dockerfiles, image creation, container deployment, volume management, and registries.

Good experience in the Amazon Web Services environment and good knowledge of AWS services such as Elastic Compute Cloud (EC2), ELB, IAM, S3, VPC, EKS, CloudWatch, ECS, ECR, RDS, Route 53, and Lambda.

Created S3 buckets, managed their access policies, and utilized S3 and Glacier for storage and backup on AWS.
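
For illustration, a minimal sketch of this kind of setup with the AWS CLI (the bucket name, region, and JSON files are hypothetical):

  # Hypothetical bucket name and region; policy.json and lifecycle.json are assumed to exist.
  aws s3api create-bucket --bucket my-app-backup --region us-east-1
  aws s3api put-bucket-policy --bucket my-app-backup --policy file://policy.json
  # Lifecycle rule transitioning older objects to Glacier for low-cost backup storage.
  aws s3api put-bucket-lifecycle-configuration --bucket my-app-backup --lifecycle-configuration file://lifecycle.json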

Education Background

Bachelor of Engineering in Information Technology from P.E.S. College of Engineering (Dr. Babasaheb Ambedkar Marathwada University), completed in 2015.

Professional Experience

Tata Business Hub

DevOps Engineer (2021 – Present), Permanent Employee

Roles & Responsibilities:

Implemented a CI/CD pipeline involving GitHub, Jenkins, and Docker for complete automation from commit to deployment.

Automatically triggered Jenkins builds when changes are pushed to Git, using webhooks.
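
For illustration, such a webhook can be registered through the GitHub REST API (the owner/repo and Jenkins URL below are hypothetical, and $GITHUB_TOKEN needs the admin:repo_hook scope):

  curl -X POST -H "Authorization: token $GITHUB_TOKEN" \
    https://api.github.com/repos/OWNER/REPO/hooks \
    -d '{"name": "web", "active": true, "events": ["push"],
         "config": {"url": "https://jenkins.example.com/github-webhook/", "content_type": "json"}}'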

Created Jenkins pipeline scripts (Jenkinsfile) to automate the entire software delivery process, including building, testing, and deploying applications.
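
As a sanity check in that workflow, a declarative Jenkinsfile can be validated against Jenkins' built-in linter before committing; a sketch, assuming a hypothetical server URL and API token:

  curl -X POST --user "$JENKINS_USER:$JENKINS_API_TOKEN" \
    -F "jenkinsfile=<Jenkinsfile" \
    "https://jenkins.example.com/pipeline-model-converter/validate"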

Experienced in creating Ansible playbooks to provision servers and deploy Nginx and other applications.
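
A minimal sketch of running such a playbook (inventory and playbook names are hypothetical):

  ansible-playbook -i inventory/prod.ini nginx.yml --check --diff   # dry run first
  ansible-playbook -i inventory/prod.ini nginx.yml                  # apply for real
  # The equivalent ad-hoc install on a hypothetical 'web' host group:
  ansible web -i inventory/prod.ini -b -m ansible.builtin.apt -a "name=nginx state=present"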

Implemented infrastructure automation using Terraform to create or modify infrastructure, reducing deployment time and increasing efficiency.

Created Terraform scripts to provision AWS resources, including EC2 instances, VPCs, and S3 buckets.

Implemented separate development, staging, and production environments using Terraform workspaces and variable files, ensuring consistency across environments.
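
For illustration, a minimal sketch of the workspace-per-environment pattern (the workspace and tfvars file names are hypothetical):

  terraform workspace new staging          # isolated state for the staging environment
  terraform workspace select staging
  terraform plan  -var-file="staging.tfvars"
  terraform apply -var-file="staging.tfvars"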

Integrated Terraform with the CI/CD pipeline (Jenkins) to automatically deploy infrastructure changes on code commits.

Created Docker images, worked with Docker registries, and handled multiple images.

Worked on Docker containers: attaching to running containers, removing images, managing directory structures, and managing containers.
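
A minimal sketch of that container lifecycle (the registry, image, and volume names are hypothetical):

  docker build -t registry.example.com/my-app:1.0 .     # build an image from a Dockerfile
  docker push registry.example.com/my-app:1.0           # publish to a private registry
  docker run -d --name my-app -v app-data:/var/lib/app registry.example.com/my-app:1.0
  docker exec -it my-app /bin/sh                        # attach to the running container
  docker rm -f my-app && docker rmi registry.example.com/my-app:1.0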

Provisioned and managed Kubernetes clusters on AWS; installed, configured, and maintained Kubernetes components such as the control plane and nodes, and deployed applications to Kubernetes.

Created Kubernetes clusters, joined worker nodes to the master node, and created namespaces in the cluster.
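
A minimal sketch of joining a worker and creating a namespace with kubeadm/kubectl (the control-plane address and namespace are hypothetical; the token and hash come from 'kubeadm token create --print-join-command' on the master):

  sudo kubeadm join 10.0.0.10:6443 --token <token> --discovery-token-ca-cert-hash sha256:<hash>
  kubectl get nodes                        # verify the worker has joined
  kubectl create namespace staging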

Implemented EC2 Auto Scaling groups for web applications to improve fault tolerance and optimize cost by automatically adjusting capacity based on demand.
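
For illustration, a sketch of creating such a group with the AWS CLI (the group, launch template, and subnet names are hypothetical and assumed to exist):

  aws autoscaling create-auto-scaling-group \
    --auto-scaling-group-name web-asg \
    --launch-template LaunchTemplateName=web-lt \
    --min-size 2 --max-size 6 --desired-capacity 2 \
    --vpc-zone-identifier "subnet-aaa111,subnet-bbb222"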

Configured and maintained Kubernetes networking solutions such as Services, Ingress controllers, and network policies.

Created and configured EC2 instances and a VPC with Elastic IPs using AWS services.

Created S3 buckets for data storage and configured them with access control and security measures in AWS, such as setting permissions and using IAM roles effectively.

Maintained user accounts using IAM and created public and private subnets in VPCs.
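
A minimal sketch of both tasks with the AWS CLI (the user name, VPC ID, and CIDR blocks are hypothetical):

  aws iam create-user --user-name deploy-bot                              # hypothetical IAM user
  aws ec2 create-subnet --vpc-id vpc-0abc123 --cidr-block 10.0.1.0/24     # public subnet
  aws ec2 create-subnet --vpc-id vpc-0abc123 --cidr-block 10.0.2.0/24     # private subnet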

Designed and optimized VPC configurations to meet specific application requirements, considering security, availability, and scalability.

Developed IAM roles and policies to enforce the principle of least privilege, enhancing security posture and ensuring compliance with regulatory standards.
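
For illustration, a least-privilege policy sketch granting read-only access to a single hypothetical bucket:

  aws iam create-policy --policy-name s3-read-my-app --policy-document '{
    "Version": "2012-10-17",
    "Statement": [{
      "Effect": "Allow",
      "Action": ["s3:GetObject", "s3:ListBucket"],
      "Resource": ["arn:aws:s3:::my-app", "arn:aws:s3:::my-app/*"]
    }]
  }'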

Engineered a highly available architecture using ELB to distribute traffic; configured Application Load Balancers (ALB) and Network Load Balancers (NLB) to distribute traffic across EC2 instances.
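
A minimal sketch of wiring up an ALB with the AWS CLI (the subnet, security group, VPC IDs, and ARNs are hypothetical):

  aws elbv2 create-load-balancer --name web-alb --type application \
    --subnets subnet-aaa111 subnet-bbb222 --security-groups sg-0abc123
  aws elbv2 create-target-group --name web-tg --protocol HTTP --port 80 --vpc-id vpc-0abc123
  aws elbv2 create-listener --load-balancer-arn <alb-arn> --protocol HTTP --port 80 \
    --default-actions Type=forward,TargetGroupArn=<tg-arn>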

Defacto Veritas Pvt. Ltd.

Big Data Admin (2015 – 2021), Permanent Employee

Roles & Responsibilities:

Created branches and tags in GitHub.

Worked in GitHub to manage source code repositories.

Implemented a CI/CD pipeline involving GitHub, Jenkins, and Docker for complete automation.

Administered and implemented the CI tool Jenkins to automate builds and integrate unit tests and code quality analysis tools such as SonarQube.

Containerized applications with Docker.

Took periodic backups of Jenkins jobs and restored them whenever there were issues in Jenkins.
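
A minimal sketch of that backup/restore routine (assumes the default JENKINS_HOME of /var/lib/jenkins, where job configs live under jobs/<name>/config.xml):

  tar -czf "jenkins-jobs-$(date +%F).tar.gz" -C /var/lib/jenkins jobs/
  # Restore by unpacking over JENKINS_HOME and reloading the configuration from disk:
  tar -xzf "$BACKUP_FILE" -C /var/lib/jenkins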

Managed and configured AWS services (ELB, S3, CloudWatch, IAM, VPC) across the global infrastructure to meet business requirements.

Experience in planning, installing, configuring, deploying, and securing Hadoop clusters on the AWS cloud.

Hands-on experience in deploying, configuring, supporting, and managing Apache Hadoop clusters using Cloudera Enterprise, Hortonworks, and Apache distributions.

Configured and managed High Availability (HA) for services such as HDFS and the ResourceManager in a Hadoop cluster.
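
For illustration, the standard HA state checks (nn1/nn2 and rm1 are assumed NameNode and ResourceManager service IDs):

  hdfs haadmin -getServiceState nn1    # reports active or standby
  yarn rmadmin -getServiceState rm1    # same check for a ResourceManager
  hdfs haadmin -failover nn1 nn2       # manual failover when automatic failover is disabled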

Monitored and configured Hadoop services using Cloudera Manager.

Installed, configured, maintained, and troubleshot the Hadoop ecosystem: HDFS, YARN, Hive, Hue, Impala, Flume, Sqoop, Sentry, Kerberos, Kafka, StreamSets, Oozie, and ZooKeeper.

Troubleshot, diagnosed, tuned, and resolved Hadoop and job issues.

Experienced in securing complete access to the cluster and its components using Hadoop security authentication and authorization protocols (Kerberos, ACLs, and Apache Sentry).
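
A minimal sketch of the user-facing side of a Kerberized cluster (the principal and HDFS path are hypothetical):

  kinit alice@EXAMPLE.COM       # obtain a Kerberos ticket for a hypothetical principal
  klist                         # confirm the ticket was granted
  hdfs dfs -ls /user/alice      # HDFS access now succeeds only with a valid ticket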

Declaration

I hereby declare that all the details furnished above are true to the best of my knowledge and belief.

Place: Pune 411048

(Syed Saleem)


