Professional Summary:
**+ years of experience in the IT industry spanning Linux administration, build and release, DevOps, and AWS cloud services, with a focus on maintaining Continuous Integration, Continuous Delivery, and Continuous Deployment across environments (DEV/TEST/STAGE and PROD).
Experience with the AWS IaaS stack, including VPC, ELB, Auto Scaling, EBS, AMI, EMR, Kinesis, Lambda, CloudFormation templates, CloudFront, CloudTrail, the ELK stack, Elastic Beanstalk, CloudWatch, and DynamoDB.
Perform analysis of large data sets using components from the Hadoop ecosystem.
Experience in maintaining user accounts (IAM) and the RDS, Route53, VPC, DynamoDB, and SNS services in the AWS cloud.
Experienced with setup, configuration, and maintenance of the ELK stack (Elasticsearch, Logstash, and Kibana) and OpenGrok source code search.
Strong hands-on background in database technologies (Oracle, MySQL, MS SQL, PostgreSQL, RDS, DynamoDB).
Experience in microservices development using Spring Boot and deployment to Pivotal Cloud Foundry (PCF).
Work with the application team to design and develop an effective Hadoop solution. Be actively engaged and responsible in the development process.
Involved in designing and deploying a multitude of applications utilizing almost all of the AWS stack (including EC2, Route53, S3, RDS, DynamoDB, MariaDB, SNS, SQS, IAM), focusing on high availability, fault tolerance, and auto scaling with AWS CloudFormation.
Experience in security policies like Security Groups, IAM roles and Multi Factor Authentication.
Experienced in monitoring metrics on EC2, EBS, Elastic Load Balancer, and RDS using CloudWatch.
Designed AWS Cloud Formation templates to create custom sized VPC, subnets, NAT to ensure successful deployment of Web applications and database templates.
Hands-on experience with S3 buckets and managed policies; utilized S3 and Glacier for storage, backup, and archiving in PCF and AWS, and in setup and maintenance of auto-scaling AWS stacks.
Designed high availability environment for Application servers and database servers on EC2 by using ELB and Auto-scaling. Installed application on AWS EC2 instances and configured the storage on S3 buckets.
Extensively worked on Jenkins for continuous integration and for End to End automation for all build and deployments.
Hands-on experience in launching, configuring, and maintaining AWS cloud resources such as EC2, VPC, S3, Glacier, EBS, EFS, ELB, Auto Scaling, RDS, DynamoDB, Route53, Lambda, CloudWatch, IAM, ElastiCache, ECS, Cost Explorer, OpsWorks, SQS, SNS, SWF, SageMaker, Secrets Manager, StackSets, AWS Organizations, Athena, DataSync, and Elasticsearch Service.
Worked in a DevOps group running Jenkins in a Docker container with EC2 slaves in the AWS cloud; also gained familiarity with surrounding technologies such as Mesos (Mesosphere) and Kubernetes.
Hands-on with Bamboo modules such as Build Complete Action, Build Plan Definition, and Administration Configuration. Involved in provisioning AWS infrastructure using Terraform scripts from Jenkins.
Extensive hands-on experience with Chef: applying recipes and cookbooks across multiple nodes using templates, roles, knife plugins, and the Chef Supermarket, and deploying nodes in production environments.
Experience in writing Chef cookbooks and recipes to automate the build/deployment process and improve remaining manual processes.
Skilled in monitoring servers using Nagios and CloudWatch, and via the Elasticsearch/Fluentd/Kibana (EFK) stack.
Experience with the infrastructure automation tool Terraform; implemented Infrastructure as Code with Terraform for AWS resource provisioning and management.
Technical skills:
Operating Systems: Linux (RedHat, Ubuntu, CentOS), Windows, macOS
Build/Automation Tools: Jenkins, Maven, Ant, Bamboo, TeamCity, Build Forge, Gradle, TFS
Configuration Management Tools: Ansible, Chef, Puppet, SaltStack
Cloud Technologies: AWS, OpenStack, Azure, PCF
Scripting Languages: Shell, Bash, Perl, Python, Groovy, .NET, PowerShell, Terraform
Database Systems: MySQL, IBM DB2, DynamoDB, MongoDB, Cassandra, Hadoop
Web/App Servers: Apache, IIS, HIS, Tomcat, WebSphere Application Server, WebLogic, JBoss
Version Control Tools: Git, Subversion, Bitbucket, CVS
Web Technologies: Servlets, JDBC, JSP, XML, HTML, YAML, Swagger
Virtualization Tools: VMware, PowerVM, VirtualBox, vCenter, vSphere
Monitoring Tools: Nagios, CloudWatch, Splunk, ELK, AppDynamics, Datadog
Professional Experience:
PGE (Remote) March 2023 to Present
AWS/DevOps Engineer
Design and manage private cloud infrastructure using AWS, including VPC, EC2, S3, CloudFront, EBS, EFS, RDS, Direct Connect, Route53, CloudWatch, CloudTrail, CodePipeline, and IAM. Operations were automated using CloudFormation and Terraform.
Managed configuration of resources on environments such as DEV, TEST, QA, UAT and PROD for various releases using CloudFormation and Terraform.
Design and deploy dynamically scalable, available, fault-tolerant, and reliable applications on the AWS and OCI Cloud Infrastructure.
Set up CI/CD pipelines so that each commit a developer makes goes through the standard software development lifecycle and gets tested well enough before it can make it to production.
Develop automation scripting in Python to deploy and manage Java applications across Linux and Windows servers.
Utilize AWS step-function for orchestrating and automating the pipeline.
Branch, tag and maintain the version across the environments using SCM tools like GIT, Subversion (SVN) and TFS.
Create Docker images using a Dockerfile, work with Docker container snapshots, remove images, manage Docker volumes, and set up Docker hosts.
Migrated JIRA across environments and worked on JIRA database dumps.
Automated AWS components such as EC2 instances, Security Groups, ELB, RDS, and IAM through AWS CloudFormation templates.
Experience in designing and deploying AWS solutions using EC2, S3, EBS, Elastic Load Balancer (ELB), Auto Scaling groups, and OpsWorks.
Developed environments for Confidential integrating S3, EC2, Glue, Athena, AWS Data Pipeline, Kinesis Streams, Firehose, Lambda, Redshift, RDS, and DynamoDB. Created a React client web app backed by serverless AWS Lambda functions to interact with an AWS SageMaker endpoint.
Maintained user accounts IAM, RDS, Route53, SES, SNS and SQS services in AWS cloud.
Implemented Amazon S3 and Amazon SageMaker to deploy and host the model, provisioning AWS infrastructure using a JSON or YAML template.
Used the Boto3 library to connect to SageMaker from Python.
Worked on SageMaker to deploy models in a secure and scalable environment, launching from SageMaker Studio and the console.
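As a hedged sketch of the Boto3-based SageMaker hosting flow referenced above (create_model, create_endpoint_config, create_endpoint): the helper below only builds the request payload, and every name and instance type is an illustrative placeholder, not a value from the original work.

```python
# Sketch of the payload behind a SageMaker hosting deployment via boto3.
# All model names and sizes below are hypothetical placeholders.

def endpoint_config_payload(model_name, instance_type="ml.m5.large", count=1):
    """Build the request body for sagemaker.create_endpoint_config()."""
    return {
        "EndpointConfigName": f"{model_name}-config",
        "ProductionVariants": [{
            "VariantName": "AllTraffic",
            "ModelName": model_name,
            "InitialInstanceCount": count,
            "InstanceType": instance_type,
        }],
    }

# With boto3 this would typically be invoked as:
#   sm = boto3.client("sagemaker")
#   sm.create_endpoint_config(**endpoint_config_payload("churn-model"))
#   sm.create_endpoint(EndpointName="churn-model-ep",
#                      EndpointConfigName="churn-model-config")
cfg = endpoint_config_payload("churn-model")
print(cfg["EndpointConfigName"])  # -> churn-model-config
```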
Managed GIT source code repository and improved practices of branching and code merging to custom needs of the applications teams.
Performed Branching, Tagging, Release Activities on Version Control Tool GIT.
Create, configure, and maintain WebLogic runtime/build SSI files for middleware installation.
Install, maintain, and set up the Apache web server and app servers such as WebLogic and Tomcat on cloud instances.
Built and deployed Java/J2EE web applications to application servers in an agile continuous-integration environment, and automated labeling activities in TFS once deployment was done.
Set up database in AWS using RDS storage and configured instance backups to S3 bucket.
Experience modeling data and selecting application paradigms using PostgreSql and Oracle.
Used Sumo Logic for cloud log management and analytics.
Integrated AWS with Sumo Logic for monitoring, created dashboards for the appropriate log monitoring, and created stacks using CloudFormation templates.
Completed automated deployments on AWS by creating IAM roles, used the CodePipeline plugin to integrate Jenkins with AWS, and created EC2 instances to provide virtual servers.
Integrated with various data sources, including databases and monitoring tools.
Worked on vulnerabilities: weaknesses in a system that could be exploited to compromise its security.
Experience with declarative configuration using HashiCorp Configuration Language (HCL).
Worked with Terraform to create stacks in AWS from scratch and updated Terraform code per the organization's requirements.
Pipelined application logs from app servers to Elasticsearch (ELK stack) through Logstash, and built dashboards for app metrics using Kibana (ELK stack).
Designing and implementing Terraform modules for reusable infrastructure components.
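As an illustration of the reusable-module pattern described above, a minimal Terraform module might expose inputs and outputs like this (all resource, variable, and tag names are hypothetical, not taken from the original infrastructure):

```hcl
# Hypothetical reusable module layout, e.g. modules/s3_bucket/main.tf.
variable "bucket_name" { type = string }
variable "environment" { type = string }

resource "aws_s3_bucket" "this" {
  bucket = var.bucket_name
  tags = {
    Environment = var.environment
  }
}

output "bucket_arn" {
  value = aws_s3_bucket.this.arn
}
```

A caller would then instantiate it per environment, e.g. `module "logs" { source = "./modules/s3_bucket"  bucket_name = "app-logs-dev"  environment = "dev" }`, which is what makes the component reusable across Dev/Test/Prod stacks.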
Creating and modifying AWS infrastructure such as EC2 instances, VPCs, S3 buckets, etc.
Designing and configuring CodePipeline stages for different environments (e.g., Dev, Test, Prod).
Used Brinqa and PRISMA for the security and Vulnerabilities Management.
Worked closely with developers and managers to resolve issues that arose during deployments in different environments.
Integrating CodePipeline with other AWS services like CodeBuild, CodeDeploy, and Lambda.
Experience with Nagios, CloudWatch, Dynatrace, Grafana, and Sumo Logic monitoring and alerting services for servers, switches, applications, and services.
Environment: Java/J2EE, Ant, Maven, Nexus, Jenkins, Git, SVN, UiPath, Chef, Docker, Jira, VMware, CloudWatch, AWS (EC2, VPC, ELB, S3, Glacier, RDS, IAM, CloudTrail, Route53), Python, SQL, Shell Scripting, Terraform.
Cepheid (Remote) June 2021 to Feb 2023
AWS/DevOps Engineer
Design and deploy dynamically scalable, available, fault-tolerant, and reliable applications on the AWS and OCI Cloud Infrastructure.
Designed and deployed a large application utilizing Oracle Cloud and AWS resources focusing on high availability and auto-scaling.
Managed network security using a Load balancer, Auto-Scaling, Security groups on cloud instances.
Migrating Windows and Linux based applications from on-prem to AWS, OCI.
Extensively worked as a Production Support Engineer supporting various applications.
Set up the Apache Solr cluster, created indexes, integrated with Oracle databases, and indexed data at regular intervals to improve text search against billions of records.
Worked on setting up an Elasticsearch cluster, index creation, and indexing data for a data-intensive application.
Setup New Relic Infra and APM monitoring to monitor the Server utilization and application specific Memory, CPU utilization, thread count and setup alert conditions to send an email to the app team when it reached the threshold.
Created Ansible playbooks to automatically install packages from a repository, to change the configuration of remotely configured machines and to deploy new builds.
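A hedged illustration of a playbook of the kind described (package, file, and service names below are placeholders, not details from the original environment):

```yaml
# Hypothetical playbook: install packages, push config, restart the service.
- hosts: app_servers
  become: true
  tasks:
    - name: Install required packages
      yum:
        name: [httpd, git]
        state: present
    - name: Deploy application config from a template
      template:
        src: app.conf.j2
        dest: /etc/app/app.conf
      notify: restart app
  handlers:
    - name: restart app
      service:
        name: httpd
        state: restarted
```

The `notify`/handler pairing restarts the service only when the templated configuration actually changes, which is what makes repeated playbook runs idempotent.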
Worked on setting up and configuring RRDtool to collect network bandwidth consumption data for analysis.
Deployed JAR, WAR, EAR, and J2EE applications on Tomcat/WebLogic servers using Jenkins auto-deployment.
Worked with Ansible (automation tool) to automate the process of deploying/testing the new build in each environment, setting up a new node and configuring machines/servers.
Performed firewall and security checks accordingly, establishing secure communication between internal and external systems.
Write shell scripts to automate log file rotation/back-up management, discarding old feeds.
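A minimal sketch of the rotation/cleanup logic described above, written in Python for illustration; the directory layout and retention thresholds are assumptions, not values from the original scripts:

```python
# Sketch: gzip logs older than ROTATE_DAYS, purge archives older than PURGE_DAYS.
# Paths, extensions, and thresholds are illustrative assumptions.
import gzip
import shutil
import time
from pathlib import Path

ROTATE_DAYS = 7
PURGE_DAYS = 30

def rotate_logs(log_dir, now=None):
    now = now or time.time()
    # Compress logs past the rotation age, then remove the originals.
    for path in Path(log_dir).glob("*.log"):
        age_days = (now - path.stat().st_mtime) / 86400
        if age_days >= ROTATE_DAYS:
            with open(path, "rb") as src, gzip.open(f"{path}.gz", "wb") as dst:
                shutil.copyfileobj(src, dst)
            path.unlink()
    # Discard archives past the purge age ("discarding old feeds").
    for gz in Path(log_dir).glob("*.log.gz"):
        if (now - gz.stat().st_mtime) / 86400 >= PURGE_DAYS:
            gz.unlink()
```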
Worked on Administration, maintenance and support of Red Hat Enterprise Linux (RHEL) servers.
Managed configuration of resources on environments such as DEV, SIT, QA, UAT, and PROD for various releases using Ansible.
Built, deployed, and scaled machine learning models with AWS SageMaker.
Installed the SSO SiteMinder agent and generated certificates for the instances for secure authentication.
Configure Jenkins CI/CD pipeline jobs for end-to-end automation to build, test and deliver artifacts and troubleshoot the issues raised during the build process.
As a Jenkins admin, maintained and set up multiple Jenkins nodes, installed the software required to build and scan Java applications, and ensured data is stored securely and backed up regularly.
Environment: Java/J2EE, Ant, Maven, Nexus, Jenkins, Git, SVN, UiPath, Chef, Docker, Jira, VMware, CloudWatch, AWS (EC2, VPC, ELB, S3, Glacier, RDS, IAM, CloudTrail, Route53), Python, SQL, Shell Scripting, Terraform.
Lowe's Companies, Inc., Mooresville, North Carolina January 2020 to May 2021
AWS/DevOps Engineer
Configured and maintained user accounts for Shared, R&D, Staging and Application team and created roles for EC2, RDS, S3, Cloud Watch, EBS resources to communicate with each other using IAM.
Responsible for provisioning AWS services like EC2, S3, Glacier, ELB (Load Balancers), Auto Scaling Groups, Optimized volumes, VPC, RDS, SNS, and EBS, IAM, Redshift, EMR, Glue, etc.
Complete Git workflow for deployments to different environments, with deployments automated through the Jenkins pipeline.
Manage AWS EC2 instances utilizing Auto Scaling, Elastic Load Balancing and Glacier for our QA and UAT environments as well as infrastructure servers for GIT and Puppet.
Created users and groups using IAM and assigned individual policies to each group.
Monitoring each service deployment, and validating the services across all environments.
Set up and maintained logging and monitoring subsystems using tools like Elasticsearch, Fluentd, Kibana, Prometheus, Grafana, and Alertmanager.
Managed Amazon Redshift clusters, including launching clusters by specifying nodes and performing data-analysis queries.
Implemented a Continuous Delivery pipeline with Docker, Jenkins, GitHub, and AWS AMIs.
Infrastructure design for the ELK Clusters.
Designed AWS Cloud Formation templates to create custom sized VPC, subnets, NAT to ensure successful deployment of Web applications and database templates.
Integrate Splunk with AWS deployment using Puppet to collect data from all database server systems into Splunk.
Responsible for design and maintenance of Git repositories, views, and access control strategies.
Installed, configured, and administered the Jenkins Continuous Integration tool on Linux machines, along with adding/updating plugins such as Git, Maven, and Ansible.
Developed Views and Templates with Django view controller and template language to create a user-friendly website interface.
Creating Lambda function to automate snapshot back up on AWS and set up the scheduled backup.
Wrote Terraform scripts and parser files of varying complexity as required.
Extensive use of Terraform as Infrastructure as Code to automate creation of required services in AWS.
Used Ansible and other configuration management tools to deploy consistent infrastructure code across multiple environments.
Established infrastructure and service monitoring using Prometheus and Grafana.
Written scripts in Python to automate log rotation of multiple logs from web servers.
Used Maven to automate the build process; configured and automated the Jenkins build jobs for continuous integration.
Used Python and Django for JSON processing, data exchange and business logic implementation.
Configured the RPM build environments on CentOS for building and compiling packages.
Involved in writing custom Shell scripts, VM and Environment management.
Implemented a Continuous Delivery framework using Jenkins, Puppet, Maven and Nexus in Linux environment.
Handled integration of Maven/Nexus, Jenkins, GIT, Confluence and Jira.
Set up and configured Nagios, improved monitoring coverage in Nagios, and wrote custom plugins.
Automated AWS components such as EC2 instances, Security Groups, ELB, RDS, and IAM through AWS CloudFormation templates.
Implemented networking policies for Docker containers using open-source tooling.
Worked on Jira for sprint tasks, issue tracking and for submitting access issue tickets.
Worked on configuring S3 versioning and lifecycle policies to back up files and archive the backed-up files in Glacier, and created monitors, alarms, and notifications for EC2 hosts using CloudWatch.
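As a hedged sketch of the lifecycle policies mentioned above: the helper below builds one rule of the shape accepted by the S3 `put_bucket_lifecycle_configuration` API (a Glacier transition plus an expiration); the prefix, bucket name, and day counts are illustrative assumptions.

```python
# Sketch of an S3 lifecycle rule: transition objects under a prefix to
# Glacier after `glacier_after` days and expire them after `expire_after`.
# Prefixes and ages below are placeholders, not production values.

def lifecycle_rule(prefix, glacier_after=30, expire_after=365):
    """Build one rule for s3.put_bucket_lifecycle_configuration()."""
    return {
        "ID": f"archive-{prefix.rstrip('/')}",
        "Status": "Enabled",
        "Filter": {"Prefix": prefix},
        "Transitions": [{"Days": glacier_after, "StorageClass": "GLACIER"}],
        "Expiration": {"Days": expire_after},
    }

# With boto3 this would typically be applied as:
#   s3 = boto3.client("s3")
#   s3.put_bucket_lifecycle_configuration(
#       Bucket="my-backup-bucket",
#       LifecycleConfiguration={"Rules": [lifecycle_rule("backups/")]})
```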
Worked on Confluence for documenting tasks performed on a daily basis.
Used AWS Simple Storage Service (S3) to create buckets with virtually unlimited storage.
Environment: Java/J2EE, Ant, Maven, Nexus, Jenkins, Git, SVN, UiPath, Chef, Docker, Jira, VMware, CloudWatch, AWS (EC2, VPC, ELB, S3, Glacier, RDS, IAM, CloudTrail, Route53), Python, SQL, Shell Scripting, Terraform.
Capital One Financial Corporation, Dallas, Texas April 2019 to November 2019
AWS/DevOps Engineer
Use AWS services EC2, VPC, IAM, S3, AWS Resource Access Manager, ELB, ASG, CloudWatch, CloudTrail, SNS, Elasticsearch, ECS, etc.
Managed AWS EC2 instances utilizing Auto Scaling, Elastic Load Balancing and Glacier for our environments. Manage and maintained and deployed to Dev, QA and Prod environments.
Configured and managed AWS Glacier, to move old data to archives based on retention policy of databases/ applications (AWS Glacier Vaults).
Built and mentored software delivery teams in Waterfall and Agile projects.
Configured NGINX to proxy RESTful API calls to microservices in Docker containers.
Set up Datadog monitoring across different servers and AWS services.
Implemented Microservices in load balanced, highly available, fault tolerant Kubernetes infrastructure.
Worked on Bogie Pipeline, the internal CI/CD tool: onboarded various microservices using Bogie, performed microservice deployments, and troubleshot any errors.
Responsible for Continuous Integration (CI) and Continuous Delivery (CD) process implementation using Jenkins, Bogie along with Python and Shell scripts to automate routine jobs. Worked with the Architects on SDLC process being the part of post development environments.
Install, configure, and maintain ELK stack systems.
Developed an ETL process to pull dealer data from Snowflake to Oracle for Drive Train consumer needs.
Deployed Elasticsearch to assist with environment logging requirements; created different variations of Kibana dashboards running different instances of Elasticsearch, Logstash, Kibana, and Kubernetes.
Responsible for planning index, shard, and index-TTL strategies in Elasticsearch, and for troubleshooting Elasticsearch errors.
Used PostgreSQL widely as the open-source RDBMS, while Snowflake handled multi-structured data.
Involved in writing Java API for AWS Lambda to manage some of the AWS services and Kubernetes.
Designed a Lambda function to detect and trigger the application whenever a document is uploaded to an AWS S3 bucket.
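A hedged sketch of a handler for the S3-upload trigger described above: it extracts bucket and key from the S3 "ObjectCreated" event payload and hands off to the application. `process_document` is a hypothetical downstream call, not a function from the original code.

```python
# Sketch of a Lambda handler for S3 ObjectCreated events.
import urllib.parse

def handler(event, context=None):
    processed = []
    for record in event.get("Records", []):
        bucket = record["s3"]["bucket"]["name"]
        # S3 URL-encodes object keys in event payloads (spaces become '+').
        key = urllib.parse.unquote_plus(record["s3"]["object"]["key"])
        processed.append((bucket, key))
        # process_document(bucket, key)  # hypothetical application trigger
    return processed
```

In a real deployment the function would be wired to the bucket via an S3 event notification (or EventBridge rule) scoped to the relevant prefix.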
Migration project from Oracle to Snowflake warehouse to meet the SLA of customer needs, Created ETL mapping document and ETL design templates for the development team.
Scheduled various AWS Lambda functions to trigger various AWS resources.
Created different Elasticsearch queries and Python scripts to analyze data from different microservices; ran it through Logstash, passed it through Elasticsearch, and visualized it in Kibana depending on the different kinds of logs.
Managed Kubernetes charts using Helm. Created reproducible builds of the Kubernetes applications, managed Kubernetes manifest files and managed releases of Helm packages.
Created a SQL Server DB using RDS and generated the schema for the existing tables in S3 using AWS Glue; performed data extraction, aggregation, and consolidation of Adobe data within AWS Glue using PySpark, and installed HAProxy.
Setup AWS infrastructure monitoring through Datadog and application performance monitoring through App Dynamics.
Configured CloudWatch and Datadog to monitor real-time granular metrics of all AWS services and configured individual dashboards for each resource's agents.
Increased pre-production server visibility by producing Datadog metrics. Enabled Datadog APM, JVM metrics in different Microservices. Creating Datadog Dashboards to visualize different Microservices metrics.
Created indexes in the ELK stack to receive application logs via Fluentd; restricted access to ELK using Security Groups and IAM policies.
Created monitors for Datadog and CloudWatch using Terraform; integrated Datadog with Slack and PagerDuty.
Integrated Datadog into the Jenkins pipeline and automated dashboards and alerts.
Created system alerts using various datadog tools and alerted application teams based on the escalation matrix.
Integrated Maven with GIT to manage and deploy project related tags. Worked on Maven to create artifacts from source code and deploy them in Nexus central repository for internal deployments. Branching and merging code lines in the GIT and resolved all the conflicts raised during the merges.
Worked on creating queries for Kibana, Grafana, and New Relic; worked on InfluxDB for creating the data source; worked on creating ECS clusters in AWS using DevOps tools like Docker.
Wrote CloudFormation Templates (CFT) in JSON and YAML format to build AWS services under the Infrastructure as Code paradigm.
Experience setting up Chef Infra, bootstrapping nodes, creating and uploading recipes, and node convergence in Chef SCM.
Used Informatica PowerCenter 7.1/6.0/5.1 to extract data from flat files, Oracle, and Sybase databases and load it to Sybase, Oracle, Teradata, and flat files.
Involved in AWS EC2/VPC/S3/SQS/SNS-based automation through Docker, Terraform, Ansible, Python, and Bash scripts; adopted new features as they were released by Amazon, including ELB and EBS.
DevOps role converting existing AWS infrastructure to serverless architecture (AWS Lambda, Kinesis) deployed via Terraform.
Environment: AWS, Jenkins, Bogie, Datadog, CloudWatch, Terraform, Kafka, NGINX, ELK, EKS, EMR, EC2, S3, IAM, VPC, Security Groups, PCF, Snowflake, Python, Maven, Linux, Kubernetes, JIRA, Kanban, Elasticsearch, Logstash, Splunk, AWS Redshift, ECS, InfluxDB, Hadoop, Docker, ETL.
Virtustream Inc., McLean, Virginia October 2017 to March 2019
DevOps Engineer
Build AWS infrastructure using almost all the resources like VPC, EC2, S3, IAM, EBS, Security Group, Auto Scaling, EMR and RDS in Cloud Formation templates, Amazon ECR.
Implemented several Continuous Delivery Pipelines for different products using Jenkins, Go-CD and Bamboo. Set up build pipelines in Jenkins by using various plugins like Maven plugin, EC2 plugin, Docker, Terraform, JDK etc.
Built and Implemented collaborative development environment using GIT, GitHub and integrated it with Jenkins. Set up Jenkins master and added the necessary plugins and adding more slaves to support scalability and Agility. Experience with build tools Ant and Maven for writing build.xmls and pom.xmls respectively.
Created and maintained various DevOps related tools for the team such as provisioning scripts, deployment Tools and staged virtual environments using Docker and Vagrant.
Deployed and maintained Chef role-based application servers, including NGINX, Apache, and Tomcat.
Outlined ETL strategy in document to address the design in extracting, transforming and loading process to meet business requirements.
Configured and worked on static code quality and coverage tools like SonarQube. Onboard numerous applications into SonarQube and help maintain the SonarQube installation.
Created custom monitors, alarms, and notifications for EC2 hosts using CloudWatch. Configured and administered GitHub Enterprise in AWS with High Availability (HA) enabled, and maintained branches/forks in GitHub version control for the changes made in cookbooks per release.
Hands-on experience in the design, implementation, and support of automated containerized infrastructure (Docker), leveraging continuous integration and continuous delivery processes for service development, plus cluster monitoring for infrastructure service deployment and administration. Evaluated Kubernetes for Docker container orchestration.
Used the Maven dependency-management system to deploy snapshot and release artifacts to Nexus to share artifacts across projects; used build tools like Maven for building and deploying artifacts such as WARs from source code. Set up custom services, job scheduling, and repetition options using playbooks in Ansible.
Developed test scripts using Groovy for data-driven testing of SOAP and REST web services using SoapUI.
Performed upgrades for Team Foundation Server and helped migrate to Team Services; experience providing Continuous Integration/Delivery solutions (Jenkins, Maven, and uDeploy).
Documented the REST API using the Swagger tool; implemented, developed, and deployed Java microservices on the AWS cloud.
Used AWS Elastic Beanstalk for deploying and scaling web applications and services developed with Java, PHP, Node.js, Python, Ruby, and Docker on familiar servers such as Apache and IIS.
Environment: AWS, microservices, GitHub, Ansible, Jenkins, Tomcat, Apache, Python, Maven, Linux, Docker, Vagrant, SonarQube, JBoss, WebLogic, TeamCity, JIRA, RPM, Kanban, Elasticsearch, Logstash, Splunk, NGINX, AWS Redshift, Oracle, MSBuild, TFS, Concourse, CircleCI, Groovy, HAProxy.
PayPal Holdings, Inc., Palo Alto, California September 2016 to September 2017
DevOps Engineer
Experienced in configuration and maintenance of common Linux services such as Tomcat, Apache, MySQL, NFS, FTP, Postfix, LDAP, DHCP, DNS (BIND), HTTP, HTTPS, SSH, iptables, and firewalls.
Experience in installing, configuring, and maintaining file-sharing servers like Samba, NFS, and FTP, as well as WebSphere and WebLogic application servers and Nagios.
Primary responsibilities include Build and Deployment of the java applications into different environments like Dev, INT and QA.
Ability to handle load-balancer implementations, such as bonding multiple interfaces into a single bond in case of overload on LAN devices.
Experienced in deploying code through web application servers like Apache Tomcat and NGINX.
Experience in Software Configuration Management (daily build, release, and testing methodology) using tools like Microsoft Visual SourceSafe (VSS), Subversion, Kubernetes, and Git.
Administered local and remote servers on a daily basis, troubleshooting and correcting errors.
Experienced with internetworking using TCP/IP and resolving network connectivity issues using tools like dig, nslookup, and ping.
Monitoring of web servers and other services using Nagios monitoring tool.
Involved in partitioning and formatting disks and in filesystem management using software RAID, LVM, and VxVM.
Installed and configured various servers such as the Apache web server, HTTP server, and Samba.
Experience in RHEL provisioning, upgrades, patching, configuration, and performance tuning in Linux environments using Satellite Server.
Production support of Apache, Apache HTTPD, JBoss, Tomcat, and Oracle WebLogic 10.3 application servers, including installation, configuration, management, and troubleshooting. Strong experience with VM environments such as Xen, KVM, Oracle VirtualBox, and VMware 5.0.
Environment: Solaris 9/10/11, Java/J2EE, .NET, Ant, Maven, Git, RedHat Satellite Server, Apache Tomcat, Kickstart, Bonding, Jenkins, WebSphere, SQL, Agile, WebLogic, Subversion, Samba, NFS, FTP, LVM, Tomcat, Apache, Bash, Python.
Sagatianz IT Solutions Private Limited India December 2012 to July 2015
Linux Administrator
Sattva IT services Pvt ltd India May 2010 to November 2012
Linux Administrator
Education:
Master's in Computer Science Engineering from Northwest Polytechnical University, USA, 2016.
Bachelor's in Electronics & Communications from JNTU, Hyderabad, 2010.
LinkedIn: linkedin.com/in/kumar-k-832529175
Kiran Kumar
Carrollton, Texas
*************@*****.***