
Cloud Engineer / DevOps

Location: Bound Brook, NJ

Posted: March 29, 2023



Raavi Venkata Sairam

Sr. DevOps/ Cloud Engineer

Phone: +1-732-***-****

Email: adv7pt@r.postjobfree.com

Professional summary:

IT professional with 10 years of experience in DevOps, cloud, and secure methodologies, focused on automating build, deployment, and environment management by implementing CI/CD pipelines that help teams deliver more reliable releases to production environments. Expertise spans several areas of Software Configuration Management, including development cycles, compilers, test automation, server configuration, scripting, continuous integration, builds, product packaging, installers, and managing release candidates across a variety of technologies.

Highly experienced in using cloud services across AWS, Azure, GCP, and PCF to implement DevOps strategies such as build and release, infrastructure automation, configuration management, and observability of microservices-based applications.

Excellent working knowledge of GCP, PCF, and Rackspace cloud services - Cloud Sites, Cloud Servers, and Cloud Files.

Extensive experience with Terraform to map complex dependencies, identify network issues, and apply key Terraform features such as Infrastructure as Code, execution plans, resource graphs, and change automation.

Worked with the OpenShift architecture, covering build automation, OpenShift CLI operations, and cluster maintenance.

Led initiatives to define, design, and implement DevOps solutions: roadmaps, reference architectures, tool recommendations, practices, and processes, carrying out POCs and tool consultation for the target CI/CD framework.

Worked on several Docker components, including Docker Engine, Docker Compose, and Docker Registry, creating and handling multiple images, primarily for middleware installations and domain configuration.

Used Kubernetes as open-source platform for automating deployment, scaling, and operations of application containers across clusters of hosts, providing container centric infrastructure.

Experience in managing Ansible playbooks with Ansible roles, group variables, and inventory files, and in copying and removing files on remote systems using the file module.

Experience with AWS service OpsWorks for installing and configuring Chef Server and Chef Automate.

Involved in setting up a Jenkins master and multiple slaves for the entire team as a CI tool as part of the continuous development and deployment process.

Extensive experience in building CI/CD pipelines using Hudson, Bamboo, Jenkins, and TeamCity for end-to-end automation of all builds and deployments.

Experience in writing cookbooks with recipes to perform installation and configuration tasks, including JDK, Tomcat, and WebLogic binaries installation and domain creation using Chef.

Experience migrating infrastructure and applications from on-premises to Azure and from cloud to cloud, such as AWS to Microsoft Azure.

Replicated the build of an existing platform based on Puppet, CouchDB, PostgreSQL, MySQL, and Apache/Nginx, and wrapped automation around the build to deploy it into the DMZ and support a future upgrade of most of the underlying components.

Experience in architecting and deploying multiple monitoring solutions using tools such as CA APM, Confidential, Dynatrace, Grafana, Prometheus, and Splunk in production environments.

Experience in scripting using sh, Bash, PowerShell, Groovy, YAML, TypeScript, and Perl.

Experience with all phases of the Software Development Life Cycle (SDLC), including analysis, design, development, and testing of client-server and web-based n-tier architectures for web applications, with exposure to diverse business domains.

Good command on working with Tracking Tools Bugzilla, JIRA and ServiceNow.

Expertise in JIRA Software, JIRA Service Desk, JIRA Core, Confluence, Stride, BitBucket and Crowd.

Technical skills:

Cloud: AWS, Azure, GCP, PCF

Automation Tools: Chef, Ansible, Puppet, Terraform, CloudFormation

SCM: Git, Subversion (SVN), TFS, Stash/Bitbucket

Build Tools: Maven, Ant, Gradle, NPM, MSBuild

Monitoring Tools: Splunk, Nagios, Grafana, Prometheus, ELK, Datadog, AppDynamics, CloudWatch, App Insights

Bug Tracking & Testing Tools: JIRA, Bugzilla, Cucumber, JUnit, HP Quality Center, IBM ClearQuest

Virtualization & Containerization: Docker, Docker Swarm, VMware ESXi, Vagrant, Kubernetes

Operating Systems: Unix, Windows, Red Hat Linux (6.x, 5.x)

Programming Languages: Python, SQL, Java/J2EE, Ruby, .NET

Continuous Integration: Jenkins, Bamboo, GitLab, Azure Pipelines, AWS Pipeline, TeamCity

Web Technologies: HTML, XML, JSP

Scripting Languages: Bash, Perl, PowerShell, Shell, Groovy, YAML

Databases: MySQL, MongoDB, Oracle, NoSQL, SQL

Web/App Servers: WebLogic, WebSphere, Nginx, JBoss, Apache, Tomcat, GlassFish

Professional Experience:

Client: Pfizer, Brooklyn, NY Sept 2021 – Till date

Role: Lead GCP Cloud Engineer

Designing and deploying a variety of applications using AWS CloudFormation, with a focus on high availability, fault tolerance, and auto-scaling, including EC2, Route53, S3, RDS, SNS, SQS, and IAM.

Configured Amazon EC2 instances, security groups, and databases in AWS, with machine backups stored in S3 buckets and Glacier.

Created IAM (Identity and Access Management) accounts for a variety of users, including developers, system administrators, and network administrators.

Resolved daily issues using a range of networking tools, including SSH, Telnet, and ping, and designed and maintained up-to-date DNS records using Route53.

Deployed Azure IaaS virtual machines (VMs) and cloud services (PaaS role instances) into secure VNets and subnets, along with Azure Resource Manager-based resources.

Expertise in GCP DevOps tools, such as Cloud Build, Cloud Source Repositories, and Deployment Manager, for continuous integration and continuous deployment (CI/CD) pipelines.

Utilized Ansible and Ansible Tower for configuration management to automate repetitive operations, rapidly deploy key apps, and proactively manage change, and wrote Python code using the Ansible Python API to automate the cloud deployment process.
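
A minimal sketch of driving an Ansible deployment from Python, using the ansible-runner package as the programmatic entry point (an assumption for illustration; the directory, playbook, and inventory names are hypothetical):

    import ansible_runner

    def deploy(app_version):
        # Run the deployment playbook and pass the release version as an extra var.
        result = ansible_runner.run(
            private_data_dir="/opt/automation",      # holds project/, inventory/, env/
            playbook="deploy_app.yml",               # hypothetical playbook name
            inventory="inventory/production",        # hypothetical inventory path
            extravars={"app_version": app_version},
        )
        print("status:", result.status, "rc:", result.rc)
        return result.rc == 0                        # True when every host succeeded

    if __name__ == "__main__":
        deploy("1.4.2")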

Knowledgeable in designing IAC solutions that are reusable, scalable, and modular to ensure ease of management and cost-effectiveness.

Utilized a Kubernetes kops cluster and Docker as the runtime environment of the CI/CD system to build, test, and deploy.

Worked on Airflow 1.8 (Python 2) and Airflow 1.9 (Python 3) for orchestration; familiar with building custom Airflow operators and orchestrating workflows with dependencies spanning multiple clouds.
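
As an illustration of that orchestration pattern, a minimal Airflow 1.x DAG with two dependent tasks; the DAG id, schedule, and shell commands are hypothetical placeholders:

    from datetime import datetime, timedelta

    from airflow import DAG
    from airflow.operators.bash_operator import BashOperator

    default_args = {
        "owner": "devops",
        "retries": 1,
        "retry_delay": timedelta(minutes=5),
    }

    dag = DAG(
        dag_id="example_etl",              # hypothetical DAG name
        default_args=default_args,
        start_date=datetime(2021, 1, 1),
        schedule_interval="@daily",
        catchup=False,
    )

    extract = BashOperator(task_id="extract", bash_command="echo extract", dag=dag)
    load = BashOperator(task_id="load", bash_command="echo load", dag=dag)

    extract >> load   # load runs only after extract succeeds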

Experience with Ansible, which was used to manage web applications, environments, users, file systems, and packages. Created Ansible scripts to restart all production server services.

Set up GCP firewall rules to allow or deny traffic to and from VM instances based on specified configurations, and used GCP Cloud CDN (content delivery network) to deliver content from GCP cache locations, drastically improving user experience and latency.

Created a CI/CD pipeline in Jenkins and executed the build using ansible build and deploy scripts to integrate with the GitHub repository.

Experienced in creating, managing, and deploying infrastructure using IAC tools for public cloud platforms such as Amazon Web Services (AWS), Microsoft Azure, and Google Cloud Platform (GCP).

Designed a hybrid architecture based on microservices, AWS Elastic MapReduce (EMR), Spark Streaming, and event-driven architecture.

Skilled in building and deploying Docker Swarm and Kubernetes Container orchestration systems.

Skilled in creating Kubernetes clusters with AWS-Kops (EKS) and configuring and deploying the Kubernetes dashboard to provide web-based access to the cluster.

Led cross-functional teams in identifying and implementing process improvements that reduced costs while improving quality and customer satisfaction.

Developed PySpark and SparkSQL code to process the data in Apache Spark on Amazon EMR to perform the necessary transformations based on the STMs developed.
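
A hedged sketch of that kind of PySpark transformation on EMR; the S3 paths, column names, and mapping rules are placeholder assumptions rather than the actual STMs:

    from pyspark.sql import SparkSession
    from pyspark.sql import functions as F

    spark = SparkSession.builder.appName("stm-transform").getOrCreate()

    # Read raw CSV data landed in S3 (path is illustrative).
    raw = spark.read.option("header", "true").csv("s3://example-bucket/raw/orders/")

    # Apply a simple source-to-target mapping: rename, cast, derive, and filter.
    transformed = (
        raw.withColumnRenamed("ord_id", "order_id")
           .withColumn("amount", F.col("amount").cast("double"))
           .withColumn("order_date", F.to_date("order_ts"))
           .filter(F.col("amount") > 0)
    )

    transformed.write.mode("overwrite").parquet("s3://example-bucket/curated/orders/")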

Built a program with Python and Apache Beam and executed it in Cloud Dataflow to run data validation between raw source files and BigQuery tables.
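
A minimal Beam/Dataflow sketch of such a validation job, comparing row counts between a raw GCS file and a BigQuery table; the project, bucket, and table names are illustrative assumptions:

    import apache_beam as beam
    from apache_beam.options.pipeline_options import PipelineOptions

    def run():
        options = PipelineOptions(
            runner="DataflowRunner",      # use "DirectRunner" to test locally
            project="example-project",
            temp_location="gs://example-bucket/tmp",
            region="us-east1",
        )
        with beam.Pipeline(options=options) as p:
            raw_count = (
                p
                | "ReadRaw" >> beam.io.ReadFromText(
                    "gs://example-bucket/raw/orders.csv", skip_header_lines=1)
                | "CountRaw" >> beam.combiners.Count.Globally()
            )
            bq_count = (
                p
                | "ReadBQ" >> beam.io.ReadFromBigQuery(
                    table="example-project:staging.orders")
                | "CountBQ" >> beam.combiners.Count.Globally()
            )
            (
                (raw_count, bq_count)
                | "Merge" >> beam.Flatten()
                | "ToList" >> beam.combiners.ToList()
                # Each branch emits exactly one count, so the list has two values.
                | "Compare" >> beam.Map(lambda c: print("counts match:", c[0] == c[1]))
            )

    if __name__ == "__main__":
        run()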

Excellent communication and collaboration skills, including the ability to work effectively with cross-functional teams and stakeholders on GCP projects.

Conducted a comprehensive review of company spending and identified areas where expenses could be reduced through consolidation or elimination of unnecessary expenditures.

Created, updated, and versioned AWS infrastructure using Terraform and AWS CloudFormation.

Orchestration experience using Azure Data Factory, Airflow 1.8 and Airflow 1.10 on multiple cloud platforms and able to understand the process of leveraging the Airflow Operators.

Developed Ansible playbooks for delivering artifacts from Jenkins/S3 bucket to Tomcat server

Implemented multiple high-performance MongoDB replica sets on EC2 with durable, reliable storage.

Successfully led global deployment of WiFi infrastructure and continuous monitoring security solution in AWS datacenters with $500k in cost reduction and 30% reduction in schedule.

Configured Jenkins with the Nexus and JFrog plugins in order to pool binary artifacts into the Artifactory repository and troubleshoot the build problem during the Jenkins build process.

Familiar with the cloud-native approach and concepts such as Infrastructure as Code, Microservices, Containers, and Serverless computing.

Experienced in collaborating with cross-functional teams such as developers, operations, and security to ensure the alignment of IAC solutions with business requirements and industry standards.

Integrated JIRA with Git and Bamboo to automate end-to-end release cycle with Maven and Tomcat repository in order to set up continuous integration and formal builds as WAR's / JAR's via server console.

Designed PowerShell scripts for VM Machines and VM Host, as well as SQL Reporting.

Installed and configured Splunk to analyze application and server log files to monitor applications deployed on an application server.

Experience with the MuleSoft Anypoint API Platform, designing and implementing Mule APIs and documenting and designing REST APIs using RAML.

Skilled in designing, developing, and deploying Apache Beam pipelines on various data processing platforms like Google Cloud Dataflow, Apache Flink, and Apache Spark.

Used Logstash to ship application logs from app servers to Elasticsearch (ELK Stack), and built application metrics dashboards using Kibana.

Experience in migrating the Legacy application into GCP platform and managing the GCP services such as Compute Engine, cloud storage, Big Query, VPC, Stack Driver, Load Balancing and IAM.

Built Python programs to access and manage instance activities through the Amazon API.
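
A small sketch of that style of instance management with boto3; the tag filter, region, and the choice of stopping tagged instances are illustrative assumptions:

    import boto3

    ec2 = boto3.client("ec2", region_name="us-east-1")

    def stop_tagged_instances(env="dev"):
        """Stop all running instances carrying the given Environment tag."""
        resp = ec2.describe_instances(
            Filters=[
                {"Name": "tag:Environment", "Values": [env]},
                {"Name": "instance-state-name", "Values": ["running"]},
            ]
        )
        ids = [
            inst["InstanceId"]
            for res in resp["Reservations"]
            for inst in res["Instances"]
        ]
        if ids:
            ec2.stop_instances(InstanceIds=ids)
        return ids

    if __name__ == "__main__":
        print("stopped:", stop_tagged_instances())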

Maintained environments for continuous build and integration in SCRUM and Agile projects.

Environment: AWS, Azure, CI/CD pipeline, Jenkins, Github, Kubernetes, Agile, Docker, EC2, WebSphere, Nginx, MySQL, Ansible, Tomcat, Maven, JFrog, Openstack, Chef, Puppet, Splunk, SCRUM.

Client: Gerdau Ameristeel US, Miami, Fl Jan 2020 – Aug 2021

Role: Sr. AWS/DevOps Engineer

Wrote Cloud Formation templates for the provision of Infrastructure resources in the AWS cloud environments following best security practices and standardize the process and configuration.

Designed AWS Cloud formation templates to create custom sized VPC, subnets, NAT to ensure successful deployment of web applications and database templates.

Created AMIs and IAM policies for delegated administration within AWS.

Built and improved the reliability and performance of cloud applications and infrastructure deployed on Amazon Web Services, and configured VPCs with public and private subnets.

Implemented multi-tier application provisioning in the OpenStack cloud and integrated it with Chef/Puppet.

Experience with Google Cloud Platform (GCP), provisioning Compute Engine, Cloud Load Balancing, Cloud Storage, Cloud SQL, and Stackdriver monitoring components using the Terraform GCP foundation modules.

Designed and implemented automated provisioning of full stack application services (VPC, Security Groups, Instances, ELB) in AWS using Terraform.

Experience in creating RESTFUL Services and consuming Web API, Rest API which communicate data using JSON over HTTP protocol.

Worked with Terraform Template key features such as, Execution plans, Resource Graphs, Change Automation and extensively used Auto Scaling launch configuration templates for launching Amazon services while deploying microservices

Implemented new technologies or software to automate processes and reduce labor costs, resulting in significant cost savings.

Collaborated with finance and accounting teams to develop and implement cost reduction strategies that aligned with company goals and objectives.

Monitored and tracked expenses regularly to ensure that cost savings were sustained over time.

Converted existing Terraform modules that had version conflicts to use CloudFormation during Terraform deployments, enabling more control and capabilities missing from the modules.

Ability to use Dynatrace to generate insights and reports for key stakeholders, including developers, IT operations teams, and business executives.

Conducted a thorough analysis of company operations and identified areas where costs could be cut without sacrificing quality or efficiency.

Experience in integrating Dynatrace with other monitoring and analytics tools, such as Splunk, AppDynamics, and New Relic.

Experience configuring GCP firewall rules in Terraform scripts to allow or deny traffic to and from VM instances based on specified configurations, and using GCP Cloud CDN (content delivery network) to deliver content from GCP cache locations, drastically improving user experience and latency.

Worked on Docker registry, Hub and creating, attaching, networking of Docker containers, container orchestration using Kubernetes for clustering, load balancing, scaling and service discovery using selectors, nodes and pods.

Knowledgeable in integrating Apache Beam with other technologies like Apache Kafka, Apache Hadoop, and Google Cloud Pub/Sub for data ingestion and output.

Experience in job workflow scheduling and monitoring tools like Airflow and Autosys.

Specialized in provisioning EKS Kubernetes cluster on AWS, GKE Kubernetes cluster on GCP including masters, slave, RBAC, helm, kubectl, ingress controllers via Terraform foundation modules.

Worked on setting up Kubernetes dashboards with AAF and managing cluster access with kubeconfig.

Deployed Kubernetes container application using Azure Kubernetes Service (AKS), ACS, Azure CLI, Azure Active Directory, Azure Virtual Network, Azure Storage, and Azure Database for MySQL.

Configured Jenkins, identified and installed required plug-ins, and wrote Groovy scripts to configure build jobs and build pipelines.

Built data pipelines in Airflow on GCP for ETL jobs using different Airflow operators; worked in both GCP and AWS clouds in parallel.

Integrated Git into Jenkins to automate the code checkout process, and pushed code to GitHub to automate releases. Responsible for the design and maintenance of Git repositories, views, and access control strategies.

Good understanding of web services; expert in writing test cases and executing web services over SOAP and REST protocols in Postman and ReadyAPI.

Installed and administered Git source code tools, ensured the reliability of the application, and designed branching strategies for Git.

Experienced in Git forks, tagging, handling merge requests and notifications, and setting up Git repos for Jenkins build jobs.

Configured pipelines to build and deploy by setting up SonarQube for coverage reports, Maven, JIRA integration, and Nexus, creating a CI/CD pipeline that triggers automatic builds, runs code analysis, and deploys to Nexus for various projects.

Knowledgeable in monitoring and maintaining Apache Beam pipelines to ensure data integrity and compliance with regulatory requirements.

Automated Weekly releases with Maven scripting for Compiling Java Code, Debugging and Placing builds into Maven Repository.

Expertise includes bash and python scripting with focus on DevOps tools and CI/CD Architecture

Performed automation testing for new enhancements using the Selenium tool and Python scripting.
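
A hedged sketch of a Selenium smoke check in Python; the URL, locator, and assertion are hypothetical stand-ins for the actual enhancement tests:

    from selenium import webdriver
    from selenium.webdriver.common.by import By

    def smoke_test_homepage():
        driver = webdriver.Chrome()          # assumes chromedriver is on PATH
        try:
            driver.get("https://example.com")
            heading = driver.find_element(By.TAG_NAME, "h1")
            assert "Example" in heading.text
        finally:
            driver.quit()

    if __name__ == "__main__":
        smoke_test_homepage()
        print("smoke test passed")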

Worked with jQuery to make front-end components interact with JavaScript functions, adding dynamism to web pages on the client side.

Used AJAX and JavaScript for validations and for integrating business server-side components with the client side within the browser.

Designed and implemented a configurable Application and Data security model to ensure controlled access to the data on Production, Test and Development environments.

Experience in designing and implementing monitoring strategies using Dynatrace for large-scale, distributed software applications and infrastructure.

Involved in the installation, configuration, and administration of Apache web server, BEA WebLogic, IBM WebSphere, and Samba Server in UNIX, Linux, and Windows environments.

Monitored Kubernetes clusters using Splunk, Nagios and Grafana.

Experience with data analytics, advanced analytics, visualization, and dashboard customization (including advanced dashboards) in Splunk.

Extensively worked on configuring the Nagios monitoring tool and troubleshooting system-level issues and issues related to SS7 signaling or call flow.

Environments: Azure, Terraform, AWS, Openstack, IaaS, Ansible, Docker, Kubernetes, MySQL, GIT, AJAX, JavaScript, OLTP, Linux.

Client: Cowan Systems, Baltimore, MD Feb 2017 – Dec 2019

Role: AWS/DevOps Engineer

Set up, configured, and managed a variety of AWS resources, including EC2 instances, S3 storage buckets, Elastic File Storage (EFS) systems, Elastic Load Balancers (ELB), high availability zones, Route53, IAM roles, AWS Lambda, and AWS Elastic Beanstalk, to quickly deploy and manage the applications.

Integrated AWS CloudWatch with EC2 instances to monitor log files stored in CloudWatch Logs, and used Lambda functions written in Python to take regular EBS snapshots. Designed an AWS CloudFormation template to create a VPC.
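
A minimal sketch of such a scheduled Lambda handler using boto3; the Backup tag, region, and snapshot description are assumptions for illustration:

    from datetime import datetime

    import boto3

    ec2 = boto3.client("ec2", region_name="us-east-1")

    def lambda_handler(event, context):
        # Snapshot every volume tagged Backup=true (tag key/value is hypothetical).
        volumes = ec2.describe_volumes(
            Filters=[{"Name": "tag:Backup", "Values": ["true"]}]
        )["Volumes"]
        created = []
        for vol in volumes:
            snap = ec2.create_snapshot(
                VolumeId=vol["VolumeId"],
                Description=f"Scheduled backup {datetime.utcnow():%Y-%m-%d}",
            )
            created.append(snap["SnapshotId"])
        return {"snapshots": created}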

Used Ansible and Ansible Tower as configuration management tools to automate repetitive tasks, quickly deploy critical applications, and proactively manage change, writing Python code against the Ansible Python API to automate the cloud deployment process.

Familiar with version control systems like Git and SVN, and experienced in using them to manage code repositories.

Proficient in troubleshooting and debugging PHP applications, and identifying and resolving performance issues.

Comfortable working in a team environment and collaborating with other developers, designers, and stakeholders.

Continuously learning and keeping up-to-date with the latest trends and best practices in PHP development.

Configured and managed environments DEV, QA, UAT and PROD on OpenStack for various releases and designed instance strategies.

Installed and configured Jenkins, Jenkins plugins, job setup, pipeline and delivery pipeline views, and build automation using the Jenkins server.

Worked on branching, tagging, and maintaining versions across environments using SCM tools such as GitLab and Subversion (SVN) on Linux and Windows platforms.

Experience in building Docker images using the GitLab CI build automation runner.

Configured JUnit coverage reports and integration test cases as part of the build in GitLab Runner.

Setting GitLab repository and Runner for build automation.

Created .gitlab-ci.yml files to kick off the build process in stages that run in Docker containers.

Experience with GCP tools for monitoring, logging, and error reporting, such as Stackdriver and Cloud Logging and familiarity with GCP APIs and the ability to integrate with other Google Cloud services or third-party services.

Experience deploying applications on GCP or developing applications using Google Cloud services.

Performed daily system monitoring, verifying the integrity and availability of all hardware, server resources, systems and key processes, reviewing system and application logs, and verifying completion of scheduled jobs such as backups.

Provided support to Production, Staging, QA, Development environments for code deployments, changes and general support.

Used AWS Beanstalk for deploying and scaling web applications and services developed with Java, PHP, Node.js, Python and Docker on familiar servers such as Apache, and IIS

Understanding of cloud computing concepts such as IaaS, PaaS, and SaaS and how they relate to GCP

Worked on AppDynamics Monitoring tool, configuring dashboards for every application that's in the environment. Responsible for identifying, assessing and communicating process improvement recommendations around automation and service stability

Created and maintained system documentation including troubleshooting guides, support processes, and break-fix procedures based on lessons learned in resolving non-routine escalations

Experience working with GCP services such as Compute Engine, Kubernetes Engine, Cloud Storage, Cloud SQL, and Cloud Functions.

Involved in designing, integration, deployment and administration of Drupal, PHP, Tomcat across the LINUX platforms.

Worked on UNIX (Solaris) and Linux (RHEL) command-line environments and involved in writing Unix shell scripts to automate system tasks

Experience in supporting all phases of the system development life cycle including development, testing, QA and production

Experience with desired application instances: IIS, .Net, Liferay, Crystal Report Server, Siebel CRM, OBIEE, Oracle Forms and Reports, JIRA, IBM, InQuira, OnDemand, MicroStrategy, Docker, PCF, F5 and VMware Ops Center.

Environment: AWS, Terraform, EC2, Apache, S3, GCP, Hadoop, Ansible, Linux, Python, Openstack, Jenkins, GITLAB, SVN, JSP, Java, Docker, UNIX, JIRA.

Client: Inforlinx Solutions – Hyderabad, India May 2015 – Jan 2017

Role: Cloud Engineer

Used Puppet extensively to configure servers with Users, Keys and security configurations

Deployed Puppet to completely provision and manage AWS EC2 instances, volumes, DNS, and S3.

Extensively used Jenkins as a continuous integration tool to deploy Spring Boot microservices to AWS Cloud and Pivotal Cloud Foundry (PCF) using buildpacks.

Involved in setting up a Jenkins master and multiple slaves for the entire team as a CI tool as part of the continuous development and deployment process.

Hands-on experience using source code control systems like Git with repository management services such as GitHub, Bitbucket, and GitLab.

Used GIT for version Controlling and source code sharing and used MSBuild.

Executed test scenarios using the ANT tool, with Ixia, Landslide, and Real Confidential as traffic generator tools, along with Confidential & Confidential, Cricket, and other reseller subscribers.

Developed and maintained build script using Apache ANT and Maven for J2EE, ANT and MSBuild for .NET to perform builds efficiently.

Extensive experience architecting large scale performance monitoring solutions with Nagios 1x, 2x and 3x, up to 10k hosts and 100k services per monitoring instance.

Extensively worked on configuring the Nagios monitoring tool and troubleshooting system-level issues and issues related to SS7 signaling or call flow.

Wrote Python scripts to parse XML documents and load the data into a database, and developed web-based applications using Python, CSS, and HTML.
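
A small sketch of that parse-and-load pattern, assuming a simple <record> element layout and a SQLite target purely for illustration:

    import sqlite3
    import xml.etree.ElementTree as ET

    def load_records(xml_path, db_path):
        # Pull one row per <record> element (element and field names are hypothetical).
        tree = ET.parse(xml_path)
        rows = [
            (rec.findtext("id"), rec.findtext("name"), rec.findtext("amount"))
            for rec in tree.getroot().findall("record")
        ]
        conn = sqlite3.connect(db_path)
        with conn:  # commits on success
            conn.execute(
                "CREATE TABLE IF NOT EXISTS records (id TEXT, name TEXT, amount TEXT)")
            conn.executemany("INSERT INTO records VALUES (?, ?, ?)", rows)
        conn.close()
        return len(rows)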

Created a Python/Django-based web application using Python scripting for data processing, MySQL for the database, and HTML/CSS/jQuery with Highcharts for data visualization on the served pages.
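
A minimal sketch of the Django side of that setup: a view that queries the database through the ORM and returns JSON a Highcharts/jQuery front end can plot. The model and field names are assumptions, not taken from the actual application:

    from django.http import JsonResponse

    from .models import Measurement   # hypothetical model backed by MySQL

    def chart_data(request):
        # Return [timestamp, value] pairs ordered by time for the chart series.
        points = (
            Measurement.objects.order_by("recorded_at")
            .values_list("recorded_at", "value")
        )
        series = [[ts.isoformat(), value] for ts, value in points]
        return JsonResponse({"series": series})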

Experience in creating and writing shell scripts (Bash), Ruby, Python, and PowerShell for setting up baselines, branching, merging, and automation processes across environments using SCM tools like Git, Subversion (SVN), Stash, and TFS on Linux and Windows platforms.

Experienced in all the phases of software development lifecycle from Requirements Analysis, Design, Development, Testing and Deployment, UAT of software applications

Proficient in working with various technologies like Core Java, J2EE, Spring, Spring MVC, Spring Boot, JDBC, Hibernate, Ibatis, XML, REST Web Services and Design Patterns.

Skilled in Database Backup Restore, Attach and Detach, Recovery and Disaster Recovery Procedures.

Extensive experience in using all types of SQL Server Constraints, SQL Server Database design, Database maintenance, developing Transact- SQL queries, stored procedures, and triggers using SQL Servers.

Experience in designing, implementation and support of Linux infrastructure that utilizes both cloud based and physical servers.

Environment: Puppet, AWS, DNS, S3, Jenkins, GIT, ANT, XML, Python, .Net, SDLC, JDBC, Java, MySQL, Linux.

Client: Siri IT Solutions, Hyderabad, India March 2014 – April 2015

Role: Linux Engineer

Implementing 24/7 monitoring with Nagios, adding and removing hosts and services for ping monitoring.

Proficiency in Linux Bash scripting and following PEP guidelines in Python, as well as expertise in designing Unix shell scripts

Building servers with Kickstart for Linux, Jumpstart for Solaris, Ignite for HP-UX and knowledge of NIM for AIX installation

Installing, configuring, and maintaining application servers such as WebSphere and WebLogic, and web servers like Apache HTTP Server and Tomcat, on UNIX and Linux.

Experience in designing, implementing, and supporting a Linux infrastructure that uses both cloud-based and physical servers

Building a Red Hat Network Satellite Server for automated Linux installation and creating a Linux system image with SystemImager.

Installing, upgrading, and configuring Red Hat servers using Kickstart and Solaris servers using Jumpstart, and customizing profiles and scripts for various server installations

Building RHEL (Linux) servers with kickstart and obtaining the kickstart file from the server

Maintaining a file server and print server, as well as cloning desktops on a Windows network

Hands-on experience in project management, deployment, installation, administration, maintenance, and troubleshooting of various Microsoft operating systems, applications, networks, and computers

Managing the deployment, maintenance, support, and upgrade of servers, hardware, software, operating systems, and printers in a Microsoft Windows environment.

Environment: Ansible, Nagios, Linux, Python, Kickstart, Tomcat, RedHat, Windows.

Client: Neyvel Lignite Corporation, India May 2013 – Feb 2014

Role: System Administrator

Experience in managing Linux/Unix servers and providing production support for various applications on Red Hat Enterprise Linux and Windows environments.

Worked on Installation, configuration, upgrade, patching, and performance tuning of Unix and system software and hardware.

Managing file sharing servers such as Samba, NFS, and FTP, as well as WebSphere and WebLogic application servers and Nagios.

Configuring and maintaining common Linux services such as Tomcat, Apache, MySQL, NFS, FTP, Postfix, LDAP, DHCP, DNS BIND, HTTP, HTTPS, SSH, iptables, and firewalls.

Handling load balancing implementations, such as bonding multiple interfaces into a single bond in the event of a LAN overload.

Implementing RAID 0/1/5 for creating logical volumes using VERITAS Volume Manager and Red Hat cluster servers in a SAN storage environment.

Strong proficiency in automating processes using shell scripting with bash and Python.
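
In the spirit of that scripting work, a small Python automation sketch that flags filesystems over a usage threshold; the mount points and threshold are illustrative:

    import shutil

    THRESHOLD = 0.90                 # warn above 90% usage (illustrative value)
    MOUNTS = ["/", "/var", "/home"]  # hypothetical mount points to check

    def check_disks():
        for mount in MOUNTS:
            usage = shutil.disk_usage(mount)
            used_frac = usage.used / usage.total
            if used_frac >= THRESHOLD:
                print(f"WARNING: {mount} is {used_frac:.0%} full")

    if __name__ == "__main__":
        check_disks()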

Building servers with jumpstart using Kickstart on a timely basis to meet corporate requirements.

Extensive knowledge of server administration, kernel upgrades, deployment of patches, and implementing firewall and security policies.

Daily administration of local and remote servers, including troubleshooting and error correction

Experience in RHEL provisioning, upgrades, patching, configuration, and performance tuning using a satellite server.

Knowledge of VDI operations and configuration, including VMWare ESX/ESXi.

Database administration of programs such as MySQL and Oracle.

Performing OS installations, upgrades, and server patching by configuring PXE and DHCP servers using Kickstart and Red Hat Satellite.

Experience with disk partitioning, Logical Volume Manager (LVM), and RAID.

Writing shell scripts in Bash and C shell, scheduled with cron on Linux, to automate tasks.

Environment: Linux (RHEL 5.x/6.x), Solaris 9/10/11, RedHat Satellite Server, Apache Tomcat, Jump Start, Kick Start, Bonding, RAID, WebSphere, SQL, Agile, WebLogic, Subversion, Samba, NFS, FTP, LVM, Tomcat, Apache, Bash, Python.

Education:

Bachelor's in EEE, Nagarjuna University, AP, India, 2012


