AWS Engineer

Location:
Charlotte, NC
Posted:
April 03, 2020

Resume:

Name: Sandeep Degala

Email ID: adcl34@r.postjobfree.com

Contact: 251-***-****

Professional Summary:

Over 6 years of experience in the IT industry spanning AWS/cloud engineering, Linux administration, and cloud implementations.

Experience with Configuration Management tools like Chef and Ansible.

Extensive experience in the design and implementation of Continuous Integration, Continuous Delivery (CI/CD), and Continuous Deployment processes for agile projects.

Experience with programming languages and environments such as Python and JavaScript.

Worked across both private (OpenStack) and public clouds (Amazon AWS).

Managed Linux and Windows virtual servers on AWS EC2 using Chef Server. Configured and monitored multi-platform servers with Chef, set up the Chef Server and workstations to manage and configure nodes, and developed Chef cookbooks to manage system configuration.

Worked on container-based technologies like Docker, OpenShift, and Kubernetes.

Wrote shell (Bash), Ruby, Python, and PowerShell scripts to automate tasks.

Experience implementing organization-wide DevOps strategy across Linux and Windows server environments, along with adopting cloud strategies based on Amazon Web Services.

Experience in core AWS services (S3, EC2, ELB, EBS, Route53, VPC and Auto Scaling), deployment services (Elastic Beanstalk, OpsWorks and CloudFormation), and security practices (IAM, CloudWatch and CloudTrail).

Production experience in large environments using the configuration management tools Chef and Ansible; familiar with server orchestration using MCollective.

Strong experience with Amazon EC2: setting up instances, VPCs, and security groups.

Experience in managing an AWS VPC (Virtual Private Cloud) environment with 2000+ Linux instances, including Ubuntu.

Set up databases in AWS using RDS and storage using S3 buckets, and configured instance backups to S3 buckets.

Ensured data integrity and data security on AWS technology by implementing AWS best practices.

Extensive experience working with Oracle WebLogic and Apache Tomcat application servers.

Achieved zero downtime during database updates by implementing blue-green deployments in OpenShift.

Skills:

Operating Systems : Windows, Linux, Solaris, RHEL, CentOS.

Cloud Services : AWS (EC2, S3, ELB, EBS, IAM, VPC, RDS, SNS, SQS, Glacier, Route53, CloudWatch, CloudFormation, CloudFront, Auto Scaling, ElastiCache, EMR, Redshift).

Virtualization Platform : VirtualBox, Docker, Vagrant, EC2 Container Service (ECS), Microservices.

Deployment Tools : Ansible, Chef, Puppet.

Languages : Perl scripting, UNIX shell (Bash) scripting, Java/J2EE, Python.

Version Control Tools : CVS, SVN, TFS, GIT, GitHub, BitBucket, Nexus, Perforce.

Build Tools : ANT, Maven 2.0, Jenkins, Hudson, Bamboo.

Web Servers : WebLogic, WebSphere, Tomcat, JBoss, NGINX, Apache.

Databases : MySQL, MS SQL, Oracle, MongoDB, AWS RDS.

SDLC : Agile, Scrum, Sprint, Waterfall.

Educational Qualification:

Bachelor of Technology, GITAM University, Visakhapatnam, INDIA. April 2014

CERTIFICATION:

AWS Certified Solutions Architect

PROFESSIONAL EXPERIENCE:

Client: TIAA – Charlotte, NC April 2018 – Present

Role: AWS/Cloud Engineer

Responsibilities:

Managed all aspects of the end-to-end build/release/deployment process for multiple projects.

Used a microservices architecture, with Spring Boot-based services interacting through a combination of REST and MQ, leveraging AWS to build, test, and deploy identity microservices.

Took part in a disaster recovery exercise and was actively involved in creating replicas in other AWS Availability Zones.

Created alarms in the CloudWatch service to monitor server performance, CPU utilization, disk usage, etc.
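
For illustration, a minimal Boto3 sketch of such a CloudWatch alarm; the instance ID and SNS topic ARN are placeholders, not values from this engagement:

  import boto3

  cloudwatch = boto3.client("cloudwatch")

  # Alarm when average CPU on one instance stays above 80% for two 5-minute periods
  cloudwatch.put_metric_alarm(
      AlarmName="high-cpu-web01",
      Namespace="AWS/EC2",
      MetricName="CPUUtilization",
      Dimensions=[{"Name": "InstanceId", "Value": "i-0123456789abcdef0"}],  # placeholder instance
      Statistic="Average",
      Period=300,
      EvaluationPeriods=2,
      Threshold=80.0,
      ComparisonOperator="GreaterThanThreshold",
      AlarmActions=["arn:aws:sns:us-east-1:123456789012:ops-alerts"],       # placeholder SNS topic
  )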

Used Amazon RDS MySQL to perform basic database administration. Set up DynamoDB for other teams' NoSQL data and ran lightweight Docker containers with Elasticsearch for quick indexing.
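
A minimal Boto3 sketch of such a DynamoDB table setup; the table name, key schema, and throughput values are hypothetical:

  import boto3

  dynamodb = boto3.client("dynamodb")

  # Simple NoSQL table keyed on a string hash key
  dynamodb.create_table(
      TableName="team-sessions",                                   # placeholder table name
      AttributeDefinitions=[{"AttributeName": "user_id", "AttributeType": "S"}],
      KeySchema=[{"AttributeName": "user_id", "KeyType": "HASH"}],
      ProvisionedThroughput={"ReadCapacityUnits": 5, "WriteCapacityUnits": 5},
  )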

Worked with the Elasticsearch Shield plugin for end-to-end SSL implementation.

Worked on Ansible modules to deploy Docker services on the Docker cluster for the entire Microservices stack.

Involved in developing custom scripts using Python, Perl & Shell to automate jobs.

Built Cassandra clusters both on physical machines and on AWS.

Used Ansible as an automation tool to automate Cassandra tasks such as new installations, configuration, and basic server-level checks.

Used Python on the Linux platform to design the front-end portion of the plug-in.

Created and updated Chef recipes and cookbooks, profiles and roles using Ruby and JSON scripting.

Enhanced user experience by designing new web features using an MVC framework (AngularJS) to accommodate these advanced, fast technologies.

Developed core product features using Node.js, Java, and Scala.

Managing IAM accounts (with MFA) and IAM policies to meet security audit & compliance requirements.

Provisioned highly available EC2 instances using Terraform and CloudFormation, and wrote new plugins to support new functionality in Terraform.

Worked in an IaaS environment, using Terraform to manage application infrastructure such as storage and networking.

Developed Perl, and Bash scripts to do data verification between Hive and Teradata databases as part of database migration.

Monitored performance of the applications and analyzed log information using ELK (ElasticSearch, Logstash, Kibana)

Configured Angular routing module to configure routes in the application.

Manage and operate the Big Data environment and Web applications.

Implemented Hadoop clusters for processing big data pipelines using Amazon EMR and Cloudera, relying on Apache Spark for fast processing and API integration; managed these resources using Apache Mesos.

Configured Elastic Load Balancers with EC2 Auto scaling groups.

Worked with the team to provide user stories and use cases for various modules of OpenStack data center deployments.

Mainly responsible for developing RESTful APIs using the Spring framework; developed different controllers that return responses in both JSON and XML based on the request type.

Used the Kafka Producer and Consumer APIs to push messages to and read messages from topics.
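
A minimal sketch of that produce/consume pattern using the kafka-python client (the broker address and topic name are placeholders; the resume does not name the specific client library used):

  from kafka import KafkaProducer, KafkaConsumer

  # Push a message to a topic
  producer = KafkaProducer(bootstrap_servers="broker1:9092")
  producer.send("identity-events", b'{"event": "user_created"}')
  producer.flush()

  # Read messages back from the same topic
  consumer = KafkaConsumer("identity-events",
                           bootstrap_servers="broker1:9092",
                           auto_offset_reset="earliest",
                           consumer_timeout_ms=5000)
  for message in consumer:
      print(message.offset, message.value)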

Implemented a continuous delivery framework using Jenkins in Windows and Linux environments.

Developed build and deployment scripts using ANT/Maven as build tools and Jenkins as the automation tool to promote builds from one environment to the next.

Built and deployed Java/J2EE applications to a web application server in an Agile continuous integration environment and automated the whole process.

Used Kubernetes to create new projects and services for load balancing, added them to routes for external access, created pods through new applications, and controlled, scaled, and troubleshot pods through SSH.

Responsible for ensuring systems and network security, maintaining performance, and setting up monitoring using CloudWatch, Nagios, and Zabbix.

Used Jenkins for continuous integration and a Jenkins master/slave architecture to run builds on remote RHEL servers. Integrated Apache Kafka for data ingestion.

Experienced in implementing Chef and Docker.

Worked on Chef for the deployment of servers.

Responsible for configuring apps on OpenShift v3 and containerizing them using Docker.

Worked with Docker Swarm and deployed Spring Boot applications.

Worked on Chef for IaaS configuration by writing cookbooks and recipes to automate actions for virtual and remote resources and nodes.

Responsible for creating branches and resolving the conflicts while merging in GIT.

Experience in administering and maintaining Atlassian products like JIRA, Bamboo, and Confluence.

Implemented a Continuous delivery framework using Bamboo, Ansible, Maven and Oracle in Linux Environment.

Configured Elastic Load Balancers (ELB) for high availability using subnets across multiple Availability Zones, and configured security settings and health checks for the application.
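
A minimal Boto3 sketch of that setup using the elbv2 API; the subnet, security group, and VPC IDs are placeholders:

  import boto3

  elb = boto3.client("elbv2")

  # Load balancer spread across subnets in two availability zones
  elb.create_load_balancer(
      Name="app-alb",
      Subnets=["subnet-aaaa1111", "subnet-bbbb2222"],       # placeholder subnets in different AZs
      SecurityGroups=["sg-0123456789abcdef0"],
      Scheme="internet-facing",
  )

  # Target group with an HTTP health check for the application
  elb.create_target_group(
      Name="app-targets",
      Protocol="HTTP",
      Port=8080,
      VpcId="vpc-0123456789abcdef0",
      HealthCheckProtocol="HTTP",
      HealthCheckPath="/health",
      HealthyThresholdCount=3,
      UnhealthyThresholdCount=2,
  )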

Implemented Atlassian Stash along with Git to host central Git repositories for source code across products, facilitating code reviews and login audits for security compliance.

Configured Splunk add-ons, including DB Connect and Active Directory LDAP, to work with the directory and SQL databases.

Performed configurations from Jenkins to Nexus.

Performed configurations from Apache Tomcat and WebLogic to Jenkins.

Environment: AWS (IAM, EC2, S3, EBS, Glacier, ELB, Cassandra, CloudFormation, CloudWatch, CloudTrail, SNS, SQS, Route53, RDS), OpenStack, Kafka, Git, Chef, Terraform, SDN, Splunk, Bash, Shell, DynamoDB, RHEL 4/5/6, CentOS, Apache Tomcat.

Client: Smart Water Energy – Irvine, CA June 2017 to April 2018

Role: AWS Engineer

Responsibilities:

Designed roles and groups using AWS Identity and Access Management (IAM) and managed the network using security groups and network access control lists alongside the services provided by IAM.

Documented system configurations, instances, operating systems, AMI build practices, backup procedures, and troubleshooting guides, and kept infrastructure and architecture documentation up to date.

Configured and implemented an OpenStack SDN infrastructure to enable massive dynamic scaling of compute and storage resources.

Experience in working with an Agile / Scrum environment and daily standup meetings.

Developed and supported key pieces of the company's AWS cloud infrastructure. Built and managed a large deployment of Ubuntu Linux instances with Opscode Chef.

Debug existing automation code and test to confirm functionality within AWS/EC2.

Support application deployments, building new systems and upgrading and patching existing ones through DevOps methodologies.

Built servers in both cloud-based and physical infrastructure.

Since DynamoDB stores data as key/value pairs, worked out a strategy for converting the MongoDB JSON documents.

Created playbooks for OpenStack deployments and bug fixes with Ansible.

Used Bash and Python (including Boto3) to supplement the automation provided by Ansible and Terraform for tasks such as encrypting the EBS volumes backing AMIs and scheduling Lambda functions for routine AWS tasks.
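
As one example, a hedged Boto3 sketch of encrypting the EBS snapshots behind an AMI by re-copying it; the AMI ID is a placeholder and the key alias assumes the default EBS key rather than the actual key used:

  import boto3

  ec2 = boto3.client("ec2", region_name="us-east-1")

  # Copy an unencrypted AMI so the snapshots backing its EBS volumes are encrypted
  copy = ec2.copy_image(
      SourceImageId="ami-0123456789abcdef0",   # placeholder source AMI
      SourceRegion="us-east-1",
      Name="app-base-encrypted",
      Encrypted=True,
      KmsKeyId="alias/aws/ebs",                # default EBS key; a customer-managed key could be used instead
  )
  print(copy["ImageId"])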

Used the AWS CLI to suspend an AWS Lambda function processing an Amazon Kinesis stream, and then to resume it again.
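
The equivalent operation in Boto3 (rather than the AWS CLI used at the time) toggles the event source mapping; the function name here is a placeholder:

  import boto3

  lam = boto3.client("lambda")

  # The event source mapping ties the Kinesis stream to the function; disabling it
  # pauses processing, and re-enabling resumes from the stored checkpoint.
  mappings = lam.list_event_source_mappings(FunctionName="stream-processor")  # placeholder function
  uuid = mappings["EventSourceMappings"][0]["UUID"]

  lam.update_event_source_mapping(UUID=uuid, Enabled=False)   # suspend
  lam.update_event_source_mapping(UUID=uuid, Enabled=True)    # resume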

Configured alerts for relevant MongoDB metrics.

Provide oversight and guidance for the architecture; develop best practices for application hosting, and infrastructure deployment for each application hosted with AWS and Docker containers.

Utilized the AWS CLI to automate backups of ephemeral data stores to S3 buckets and EBS, and created nightly AMIs of mission-critical production servers as backups.
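
A minimal Boto3 sketch of the nightly AMI step (the instance ID is a placeholder; the actual automation ran through the AWS CLI):

  import boto3
  import datetime

  ec2 = boto3.client("ec2")
  stamp = datetime.date.today().isoformat()

  # Nightly AMI of a production instance, created without rebooting it
  image = ec2.create_image(
      InstanceId="i-0123456789abcdef0",        # placeholder production instance
      Name="prod-web-backup-" + stamp,
      NoReboot=True,
  )
  print(image["ImageId"])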

Experience with CloudTrail, CloudFront, and Glacier services.

Design EC2 instance architecture to meet high availability application architecture and deploying, configuring, and managing servers in AWS.

Support, troubleshooting and problem resolution for the developed Cloud Formation scripts to build on demand EC2 instance formation.

Used Kafka messaging for large-scale message processing applications; Kafka with Apache Storm handled the data pipeline for high-speed filtering and pattern matching.

Continuous deployment, continuous integration, and promoting Enterprise Solution deployment assets to target environments.

Utilized Ansible for configuration management of hosted Instances within AWS.

Integrated existing systems into the AWS/EC2 cloud infrastructure. Built and maintained Ansible playbooks and used them to push out bi-weekly application updates.

Configured Elastic Load Balancer and DNS services with Amazon Route 53.

Manage AWS and install web certificates (SSL, Client Authentication Certificates).

Performed all necessary day-to-day Subversion/GIT support for different projects and Responsible for design and maintenance of the Subversion/GIT Repositories, views, and the access control strategies.

Migrated a production infrastructure into an Amazon Web Services VPC utilizing AWS CloudFormation, EC2, S3, Chef/OpsWorks, CloudWatch, CloudTrail, EBS, Route 53, IAM, etc. This included migrating a number of production MySQL databases into RDS/ElastiCache and rewriting a large set of monolithic recipe-based cookbooks as provider- and attribute-driven wrapper cookbooks.

Set up and maintained an automated environment using Chef recipes and cookbooks within the AWS environment.

Designed, deployed and integrated Splunk Enterprise with the existing system infrastructure and setup configuration parameters for Logging, Monitoring and Alerting.

Support various web services including Apache Tomcat.

Created web services, WSDL, and web methods with annotations in Hibernate, and used the Spring container for the data source and to load the Hibernate-specific classes.

Experienced in writing Chef recipes to automate the build/deployment process and drive overall improvement of any remaining manual processes.

Established Chef best practices and approaches to systems deployment with tools such as Vagrant and Test Kitchen, treating each Chef cookbook as an independently version-controlled unit of software deployment.

Maintained a security-conscious approach to all system administration, development, and tools configuration management.

Coordinate release activities with Project Management, QA, Release Management and Web Development teams to ensure a smooth and trouble-free roll out of releases.

Analyzed the tools and application architecture and implemented it in different environments, making it more user-friendly for the application team.

Environment: AWS (IAM, EC2, S3, EBS, Glacier, ELB, CloudFormation, CloudWatch, CloudTrail, SNS, SQS, Route53, RDS), OpenStack, Git, Chef, Splunk, Terraform, Bash, Shell, DynamoDB, RHEL 4/5/6, CentOS, openSUSE, Apache Tomcat.

Client: DST Systems Inc – Kansas City, MO August 2015 to June 2017

Role: Cloud Engineer

Responsibilities:

Supported 250+ AWS cloud instances and used the AWS Command Line Interface to manage and configure various AWS products. Wrote automation scripts in Ruby and Bash.

Designed highly available, cost-effective, and fault-tolerant systems using multiple EC2 instances, Auto Scaling, Elastic Load Balancing, and AMIs.

Wrote HBase client programs in Java and web services.

Modeled, serialized, and manipulated data in multiple forms (XML).

Utilize EBS to store persistent data and mitigate failure by using snapshots.

Designed AWS CloudFormation templates to create custom-sized VPCs, subnets, and NAT gateways to ensure successful deployment of web applications and database templates.
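
A hedged Boto3 sketch of the same kind of resources the CloudFormation templates provision (the CIDR ranges are illustrative, not the actual template values):

  import boto3

  ec2 = boto3.client("ec2")

  vpc = ec2.create_vpc(CidrBlock="10.0.0.0/16")["Vpc"]
  public = ec2.create_subnet(VpcId=vpc["VpcId"], CidrBlock="10.0.1.0/24")["Subnet"]
  private = ec2.create_subnet(VpcId=vpc["VpcId"], CidrBlock="10.0.2.0/24")["Subnet"]

  # NAT gateway in the public subnet so instances in the private subnet can reach out
  eip = ec2.allocate_address(Domain="vpc")
  ec2.create_nat_gateway(SubnetId=public["SubnetId"], AllocationId=eip["AllocationId"])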

Designed roles and groups for users and resources using AWS Identity and Access Management (IAM), and managed network security using security groups and IAM.

Optimized volumes and EC2 instances and created multi-AZ VPC instances.

Installation, configuration, and administration of MongoDB databases.

Used IAM to create new accounts, roles, and groups.

Automation of Redshift, EC2, RDS, ElastiCache.

Designed and developed features for J2EE-based business activity monitoring and operational dashboard engine, including the rules and alert engine, web app components, recoverability, intelligent workflow features, internationalization, and upgradability.

Provide highly durable and available data by using S3 data store, versioning, lifecycle policies, and create AMIs for mission critical production servers for backup.

Used Chef to configure and manage infrastructure; wrote cookbooks to automate configuration setup.

Installation of Oracle on Linux and Solaris, creating database, creating Oracle users etc.

Worked on user administration setup, account maintenance, and monitoring of system performance using Nagios and Tivoli.

Monitored day-to-day administration and maintenance operations of the company network and systems running on Linux and Solaris.

Configured Struts, Hibernate framework with Spring MVC.

Maintained the business standards and the requirements in EJB and deployed them on to Web Logic Application Server.

Implement and manage Monitoring services with SQS, SNS, CloudWatch, and CloudFormation.

Configured S3 versioning and lifecycle policies to back up files and archive them in Glacier.
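
A minimal Boto3 sketch of that versioning and Glacier lifecycle setup; the bucket name, prefix, and retention periods are placeholders:

  import boto3

  s3 = boto3.client("s3")
  bucket = "example-backup-bucket"             # placeholder bucket name

  s3.put_bucket_versioning(
      Bucket=bucket,
      VersioningConfiguration={"Status": "Enabled"},
  )

  # Move backups to Glacier after 30 days and expire them after a year
  s3.put_bucket_lifecycle_configuration(
      Bucket=bucket,
      LifecycleConfiguration={
          "Rules": [{
              "ID": "archive-backups",
              "Filter": {"Prefix": "backups/"},
              "Status": "Enabled",
              "Transitions": [{"Days": 30, "StorageClass": "GLACIER"}],
              "Expiration": {"Days": 365},
          }]
      },
  )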

Configured S3 to host static web content.

Deployed cloud stack using AWS OpsWorks.

Configured Elastic Load Balancers with EC2 Auto scaling groups.

Configured and monitored distributed and multi-platform servers using Chef. Defined the Chef Server and workstation to manage and configure nodes.

Development of Chef Cookbooks to manage systems configuration.

Supported 2000+ AWS cloud instances and familiar with Amazon cloud command-line management.

Results-driven consultant with good experience in UNIX/Linux system administration.

Utilize DevOps methodologies and best practices to create infrastructure automation and continuous delivery.

Ensure communication between Operations and all Engineering Teams, Product Owners, and Scrum Masters.

Environment: SCM, RHEL, Unix, Windows, AWS Services (EC2, VPC, IAM, S3, RDS, ElasticCache, SQS, SNS, CloudWatch, CloudFormation, OpsWorks), GIT, Subversion, Web Server, WebLogic, Java/J2EE, JBoss, TFS, Chef, Nagios, Ant, Maven, Jenkins.

Client: Enrich IT Technologies – Hyderabad, India April 2014 to Aug 2015

Role: System Engineer

Responsibilities:

Responsible for remote Linux support for more than 150 Servers.

Installation, configuration, patching, administration, troubleshooting, security, backup, recovery and upgrades of Red Hat Enterprise Linux (RHEL) 5/6/7, CentOS, Fedora, Solaris 8/9/10.

Performance tuning of the operating system for better application performance and network performance.

Configuring and troubleshooting of network services like NFS, FTP, SAMBA, NTP, Telnet, SSH.

Used PuTTY and SSH certificate tools to log in and use secure access points.

Managed users, including creating and deleting accounts, granting proper privileges, and managing system security.

Developed Shell/Perl scripts to automate the deployment process.

Worked with Development and QA teams to continuously integrate software development using GIT, Maven, Jenkins.

Coordinated with the Network Team and Oracle database Administrators to resolve issues.

Set up and configured TCP/IP networking on Red Hat Linux, including RPC connectivity for NFS; created mount points for server directories and mounted these directories on Red Hat Linux servers.

Worked on Ticket based problem management.

Excellent working knowledge of implementing LDAP security models using Netscape LDAP and IBM SecureWay (LTPA).

Created VMs for Red Hat Linux on VMware ESX 3.5 and administered them with the VI Client.

Monitored System Performance of Virtual memory, Managed Swap Space, Disk utilization and CPU utilization.

Troubleshooting and resolving of problems related to hardware, operating systems, third party applications and scripts.

ENVIRONMENT: Red Hat Enterprise Linux, Fedora, CentOS, Solaris, Windows 2008 server, VMware vSphere, Shell/Perl Script, TCP/IP, LDAP, NFS.


