Security Agent DevOps Engineer

Location:
Overland Park, KS
Posted:
June 08, 2023

Resume:

Sateesh Kantamani

Sr. AWS DevOps Engineer

LinkedIn URL: https://www.linkedin.com/in/kantamani-sateesh/

804-***-**** adxlg2@r.postjobfree.com

PROFESSIONAL SUMMARY:

Over 9 years of experience as a Cloud/DevOps Engineer and around 15 years in the IT industry, planning, developing, and delivering enterprise and cloud-based software solutions.

Hands-on experience with AWS and Azure cloud solutions and with designing serverless architectures for multi-cloud environments.

Used Terraform, the AWS SDK, CFT, and Azure RM templates to create infrastructure on the AWS and Azure cloud platforms.

Built Terraform modules as reusable code for creating AWS services: EC2, Auto Scaling groups, load balancers, S3, Kinesis streams, API Gateways, CodePipeline, CodeBuild, Lambda functions, Step Functions, AWS Glue ETL jobs, and VPCs.

Built CI/CD multibranch pipelines using Jenkins, GitHub, GitLab, Maven, and Ansible to automate builds and deployments.

Used Kubernetes and Rancher to deploy, scale, load-balance, and manage container orchestration across multiple namespace versions, including life-cycle management of Kubernetes clusters.

Developed AWS CodePipeline pipelines to build and deploy microservices into ECS and EKS clusters across multi-stage environments, using AWS services such as ECR, ECS, CodeBuild, SNS, and SQS.

Architected and administered self-healing, secure cloud-based services and applications for PaaS, IaaS, and SaaS environments using AWS services such as IAM, VPC, EC2, ECS, EBS, RDS, EKS, S3, Lambda, ELB, Auto Scaling, Route 53, CloudFront, CloudWatch, SQS, and SNS.

Configured and installed Datadog in AWS/Azure cloud environments. Developed monitoring in the Datadog account and integrated it with PagerDuty to trigger alerts based on dynamic application workloads.

Integrated Datadog with microservices applications deployed into AWS ECS and EKS clusters.

Built ETL scripts in languages and tools such as PL/SQL, Informatica PS/PE, Hive, Pig, and Spark, with expertise in creating, debugging, scheduling, and monitoring jobs using Airflow and Oozie.
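
For illustration, a minimal Airflow DAG of the kind described above; the DAG id, schedule, and task body are hypothetical placeholders, not taken from any specific project:

```python
from datetime import datetime

from airflow import DAG
from airflow.operators.python import PythonOperator


def extract_and_load():
    # placeholder for the actual ETL step (PL/SQL call, Hive/Spark job, etc.)
    print("running ETL step")


# a daily DAG with a single Python task; names and schedule are illustrative
with DAG(
    dag_id="example_etl",
    start_date=datetime(2023, 1, 1),
    schedule_interval="@daily",
    catchup=False,
) as dag:
    PythonOperator(task_id="extract_and_load", python_callable=extract_and_load)
```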

Experience in building enterprise and distributed applications using REST web services, APIGEE, SOAP, microservices, Camel, JMS, ActiveMQ, EJB, Spark, Kafka, MongoDB, and Cassandra.

Experience in domain-driven design, pub-sub architecture, and single-responsibility principle patterns.

Experience in developing applications with databases such as RDS, Oracle, MySQL, PostgreSQL, and SQL Server.

SKILLS/TOOLS:

Cloud Platforms

AWS, Microsoft Azure

Cloud Environments

AWS Elastic Beanstalk, EBS, ELB, EKS, DynamoDB, Redshift, IAM, Vault, S3, Elastic Cache, Lambda, Glue, CFT, Code Build/Deploy, Cloud Front, Azure Data Lake, GCP

Scripting

Python, Node, Groovy, Shell, Perl

Config. Mgmt.

Ansible, Chef, Jenkins, GitHub, GitLab, Maven, Bitbucket

Technologies

APIGEE, Apache Spark, Kafka, Tomcat, Nginx, WebLogic, WebSphere, SnapLogic, JBoss

Databases

RDS, DynamoDB, MongoDB, NoSQL, Microsoft SQL Server, Oracle, DB2, MySQL

Containerization

ECS, Kubernetes, Docker Swarm

Monitoring

Datadog, Nagios, Splunk, Dynatrace, Elasticsearch, Logstash, Kibana, RUM

DevSecOps

SonarQube, Snyk, Nessus Tenable, GitLab, Veracode, Twistlock, Blackduck, Fortify

Testing Tools

Jasmine, JUnit, Selenium, Postman, SoapUI

Work Experience:

Client: Blue Yonder, Scottsdale, AZ

Role: Sr AWS DevOps Engineer

Duration: March 2022 to date

Responsibilities:

Built AWS CloudFormation templates to create EC2, VPC, NAT, subnet, API Gateway, RDS, S3, and ALB resources, deployed into multiple cloud environments using Jenkins pipelines.

Developed data-streaming pipelines using serverless services in AWS, including Lambda functions, Step Functions, and Kinesis streams, plus AWS Glue for streaming data to different downstream destinations.
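
A minimal sketch of such a Kinesis-triggered Lambda consumer, assuming the standard Kinesis event shape; the routing logic is illustrative only:

```python
import base64
import json


def handler(event, context):
    """Lambda handler for a Kinesis stream trigger: decode each record,
    then hand it to downstream processing (illustrative)."""
    for record in event["Records"]:
        # Kinesis delivers the payload base64-encoded
        payload = json.loads(base64.b64decode(record["kinesis"]["data"]))
        # downstream routing (e.g., Firehose, S3, Glue) would go here
        print(payload)
    # empty list reports that no records in the batch failed
    # (applies when batch item failure reporting is enabled)
    return {"batchItemFailures": []}
```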

Developed Terraform modules for different AWS cloud services, as well as custom Terraform modules for application-specific configurations.

Developed ECS clusters for migrating on-premises applications into microservices on ECS. Designed the Jenkins pipelines for these deployments into AWS and designed the end-to-end monitoring setup using Datadog.

Designed and created a serverless framework comprising AWS Step Functions, API Gateways, AWS Lambda (Python), Kinesis streams, Kinesis Data Firehose, AWS ELK, AWS ECS Fargate, and S3 for onboarding different projects to store data into S3. Ingestion of data events into S3 is fully automated using EventBridge.
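
As a sketch of how one onboarding run might be started programmatically (the state-machine ARN and payload are placeholders; per the bullet above, the production trigger was EventBridge):

```python
import json

import boto3

sfn = boto3.client("stepfunctions")


def start_onboarding(project_name: str, payload: dict) -> str:
    """Start one execution of a project-onboarding state machine (placeholder ARN)."""
    response = sfn.start_execution(
        stateMachineArn="arn:aws:states:us-east-1:123456789012:stateMachine:project-onboarding",
        name=f"{project_name}-onboarding",
        input=json.dumps(payload),
    )
    return response["executionArn"]
```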

Developed Groovy libraries for automatically creating multi-stage Jenkins deployment pipelines, integrated with ServiceNow to create change requests before Prod deployments. This is fully automated and can be used by any team to create Jenkins pipelines.

Designed and developed Helm charts to deploy and manage applications in AWS EKS clusters, and developed Jenkins multibranch pipelines to deploy the Helm charts into multiple AWS environments.

Developed AWS CloudWatch Insights queries and dashboards using AWS observability tooling, and imported custom query results into Datadog to set up monitors.

Created Terraform modules to spin up infrastructure in the AWS environments through Jenkins CI/CD pipelines. Services used in this infrastructure setup are Elastic Kubernetes Service (EKS), AWS CodeBuild, VPCs, subnets, internet and NAT gateways, Lambda functions, API Gateways, S3, and IAM roles.

Implemented Amazon CloudWatch Real User Monitoring (RUM) to monitor and analyze the performance of web applications using AWS observability.

Created pipeline CloudFormation templates and infrastructure-as-code templates for multiple applications as part of the migration to AWS.

Worked on migrating on-premises applications to ECS, reducing infrastructure costs and improving application scalability.

Worked with Docker, Kubernetes, Swarm, and clustering frameworks, managing Docker containers in large environments, and implemented better deployment strategies and testing plans.

Performed infrastructure management plus infrastructure and application custom-metric and performance monitoring using CloudWatch, Datadog, Logstash, and Splunk.

Designed and implemented containerized solutions on Amazon EKS, leveraging Kubernetes to achieve high availability, scalability, and ease of management.

Integrated SNS with AWS Step Functions using Lambda endpoints to send updates to project teams about the progress of infrastructure deployments.

Developed a Windows high-availability solution for DR failover events using AWS Lambda, AWS Step Functions, CloudFormation, AWS CodePipeline, and AWS SNS.

Created AWS API Gateways integrated on the backend with Python Lambda functions to run the logic tier of the application, and integrated with an AWS Aurora database for the data tier of the application architecture.
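
A minimal sketch of the logic-tier Lambda behind such an API Gateway proxy integration, assuming a MySQL-compatible Aurora cluster, a pymysql dependency, and environment variables for connection details (all hypothetical):

```python
import json
import os

import pymysql  # assumed dependency, e.g. packaged as a Lambda layer


def handler(event, context):
    """API Gateway (proxy integration) -> Lambda -> Aurora read path."""
    conn = pymysql.connect(
        host=os.environ["AURORA_HOST"],       # assumed environment variables
        user=os.environ["DB_USER"],
        password=os.environ["DB_PASSWORD"],
        database=os.environ["DB_NAME"],
    )
    try:
        with conn.cursor() as cur:
            cur.execute("SELECT id, name FROM accounts LIMIT 10")  # illustrative query
            rows = cur.fetchall()
    finally:
        conn.close()
    # API Gateway proxy integrations expect this response shape
    return {
        "statusCode": 200,
        "headers": {"Content-Type": "application/json"},
        "body": json.dumps(rows),
    }
```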

Built an Amazon CloudFront distribution to sit in front of the application web server and cache the content stored on the web server, reducing the number of servers needed to handle incoming traffic.

Expertise in WebLogic's architecture and components, including managed servers, administration servers, data sources, connection pools, and messaging resources.

Worked on pipeline as code, infrastructure as code, and configuration as code, and enforced best practices for smooth code-pipeline usage.

Developed a plan to optimize Elasticsearch cluster performance by tuning settings, monitoring cluster health, and optimizing data storage and retrieval.

Built bash scripts to back up and purge older application-server logs and automated the job scheduling via cron. Automated daily workflow jobs such as location sync and cache-store sync via cron jobs.
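
The original jobs were bash scripts scheduled with cron; an equivalent Python sketch of the purge logic, with the log path and retention window as assumptions:

```python
#!/usr/bin/env python3
"""Purge application-server logs older than a retention window.
Scheduled via cron, e.g.: 0 2 * * * /usr/local/bin/purge_logs.py"""
import os
import time

LOG_DIR = "/var/log/appserver"   # assumed log location
RETENTION_DAYS = 14              # assumed retention window

cutoff = time.time() - RETENTION_DAYS * 86400
for name in os.listdir(LOG_DIR):
    path = os.path.join(LOG_DIR, name)
    # delete regular files whose last modification predates the cutoff
    if os.path.isfile(path) and os.path.getmtime(path) < cutoff:
        os.remove(path)
        print(f"purged {path}")
```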

Laid out technical requirements around the functioning of the application and built a highly reliable security architecture to resolve the security concerns, using AWS services such as AWS Firewall Manager, WAF, Bot Control, IP sets, Secrets Manager, EC2, CloudFormation, EMR clusters, IAM, KMS, AWS S3, VPC, Route 53, and AWS Analytics.

Created AWS S3 buckets, applied policies and permissions with least-privilege access to the IAM roles, and encrypted the buckets with a KMS key by customizing the CloudFormation template.
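
The bullet above does this via CloudFormation; the equivalent hardening expressed with boto3, as a sketch (bucket name and key alias are placeholders):

```python
import boto3

s3 = boto3.client("s3")
bucket = "example-data-bucket"  # placeholder bucket name

# default encryption with a customer-managed KMS key (placeholder alias)
s3.put_bucket_encryption(
    Bucket=bucket,
    ServerSideEncryptionConfiguration={
        "Rules": [{
            "ApplyServerSideEncryptionByDefault": {
                "SSEAlgorithm": "aws:kms",
                "KMSMasterKeyID": "alias/example-key",
            }
        }]
    },
)

# block all public access as part of least-privilege hardening
s3.put_public_access_block(
    Bucket=bucket,
    PublicAccessBlockConfiguration={
        "BlockPublicAcls": True,
        "IgnorePublicAcls": True,
        "BlockPublicPolicy": True,
        "RestrictPublicBuckets": True,
    },
)
```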

Automated the process of restoring a snapshot and implemented DB sync using the Aurora snapshot tool.
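
A sketch of automating such a restore with boto3, assuming an aurora-mysql engine and placeholder identifiers:

```python
import time

import boto3

rds = boto3.client("rds")


def restore_cluster(snapshot_id: str, target_cluster_id: str) -> None:
    """Restore an Aurora cluster from a snapshot and wait for availability.
    Note: for provisioned Aurora, DB instances must be added to the
    restored cluster separately."""
    rds.restore_db_cluster_from_snapshot(
        DBClusterIdentifier=target_cluster_id,
        SnapshotIdentifier=snapshot_id,
        Engine="aurora-mysql",  # assumed engine
    )
    # poll until the restored cluster reports itself available
    while True:
        cluster = rds.describe_db_clusters(
            DBClusterIdentifier=target_cluster_id
        )["DBClusters"][0]
        if cluster["Status"] == "available":
            break
        time.sleep(30)
```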

As part of enhancing security, set up proxy servers for users to RDP into and make API calls to the application.

Implemented centralized logging and integration with Datadog using CloudTrail and S3 for major AWS resource changes.

Worked on the installation and configuration of the monitoring tool Datadog and implemented custom dashboards for monitoring infrastructure resources and applications.

Created Datadog monitors and integrated them with PagerDuty for on-call support of business-critical applications.

Client: Assurean, Austin, TX

Role: Sr Cloud SRE Engineer

Duration: Oct 2019 to Feb 2022

Description: Handled day-to-day builds and deployments for more than 20 major applications, coordinating with developers to deliver software applications. Learned the latest build and deployment tools per the bank’s requirements, onboarded current applications onto the new tools, and trained development teams accordingly.

Responsibilities:

Created Terraform modules to set up AWS CloudFront for the API Gateways to expose the services to different project teams.

Developed the entire CI/CD pipeline by integrating Ansible, Docker, and Jenkins. For continuous integration, the required Jenkins environment is launched using Ansible and Vagrant via bootstrapping scripts, and Ansible installs Docker. For CD, packaged Jenkins itself in Docker containers and deployed it with Ansible.

Worked on the smoke-testing Lambda (Python) that tests the endpoint for each new project stack created as part of the AWS Step Functions workflow.

Created CloudWatch custom metrics for the applications deployed in the AWS infrastructure and set up monitoring alarms to trigger AWS SNS notifications to the associated teams when application usage rises above the threshold.
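
A boto3 sketch of that pattern: publish a custom metric, then alarm on it with an SNS action. The namespace, metric, threshold, and topic ARN are all illustrative:

```python
import boto3

cloudwatch = boto3.client("cloudwatch")

# publish a custom application metric (names and value are illustrative)
cloudwatch.put_metric_data(
    Namespace="MyApp",
    MetricData=[{
        "MetricName": "ActiveSessions",
        "Value": 1250,
        "Unit": "Count",
    }],
)

# alarm that notifies an SNS topic when usage crosses the threshold
cloudwatch.put_metric_alarm(
    AlarmName="MyApp-ActiveSessions-High",
    Namespace="MyApp",
    MetricName="ActiveSessions",
    Statistic="Average",
    Period=300,
    EvaluationPeriods=2,
    Threshold=1000,
    ComparisonOperator="GreaterThanThreshold",
    AlarmActions=["arn:aws:sns:us-east-1:123456789012:app-alerts"],  # placeholder ARN
)
```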

Configured the AppDynamics agent for application performance monitoring (APM) of microservices in multi-cloud environments. Also configured stack tracing and monitoring for the applications’ custom logs.

Worked on migrating on-premises legacy applications to cloud infrastructure of around 800 EC2 instances by deploying clusters of web servers and load balancers using Terraform modules and Ansible, and worked with other AWS services such as ECS, deploying Docker images, containerizing, and orchestrating using Docker Swarm and Kubernetes.

Implemented monitoring and observability solutions for EKS clusters using tools such as Prometheus, Grafana, and AWS CloudWatch, ensuring proactive identification and resolution of issues.

Worked on creating custom Docker images, tagging, and pushing the images. Used Docker containers and Docker consoles to manage the application life cycle.

Developed Bash and Python scripts to supplement the automation provided by Ansible and Terraform, for tasks such as encrypting the EBS volumes backing AMIs and scheduling Lambda functions for routine AWS tasks.
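
One way to script the AMI-encryption task with boto3, as a sketch; the identifiers are placeholders. copy_image produces an encrypted copy of the AMI whose backing EBS snapshots are KMS-encrypted:

```python
import boto3

ec2 = boto3.client("ec2")


def encrypt_ami(source_image_id: str, region: str, kms_key: str) -> str:
    """Create an encrypted copy of an AMI so its backing EBS snapshots
    are KMS-encrypted (identifiers are placeholders)."""
    resp = ec2.copy_image(
        Name=f"{source_image_id}-encrypted",
        SourceImageId=source_image_id,
        SourceRegion=region,
        Encrypted=True,
        KmsKeyId=kms_key,  # e.g. a key alias or full key ARN
    )
    return resp["ImageId"]
```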

Implemented build frameworks for new projects using Jenkins and Maven as build tools, integrated Docker builds into the continuous integration process, and deployed a local registry server.

Developed Dev/Test/Prod environments for different applications on AWS by provisioning Kubernetes clusters on EC2 instances using Docker, Bash, and Chef.

Created Azure ARM templates to spin up infrastructure in the Azure cloud for services such as VNets, virtual servers, load balancers, and Azure resource groups.

Created Azure DevOps pipelines integrating Chef to install the Trend Micro security agent and apply patches, creating golden images published in the Azure Shared Image Gallery.

Experienced in utilizing Datadog's anomaly detection capabilities to identify and alert on unusual patterns or deviations from normal performance metrics.

Deployed and configured Elasticsearch, Logstash, and Kibana (ELK) for log analytics, full-text search, and application monitoring, in integration with AWS Lambda and CloudWatch.

Used Ansible to deploy ELK, automating continuous deployment (CD), and configured slave nodes and deployment-failure reporting.

Also worked on Lambda functions that read data from CSV files stored in an S3 bucket and publish it to the RDS database for the KPI dashboard visualizations.
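
A sketch of such an S3-triggered loader, assuming a MySQL-compatible RDS target, a pymysql dependency, and an illustrative two-column table:

```python
import csv
import io
import os

import boto3
import pymysql  # assumed dependency, e.g. packaged as a Lambda layer

s3 = boto3.client("s3")


def handler(event, context):
    """S3-triggered Lambda: load each uploaded CSV into an RDS table
    feeding the KPI dashboard (table and column names are illustrative)."""
    conn = pymysql.connect(
        host=os.environ["RDS_HOST"],          # assumed environment variables
        user=os.environ["DB_USER"],
        password=os.environ["DB_PASSWORD"],
        database=os.environ["DB_NAME"],
    )
    try:
        for rec in event["Records"]:
            bucket = rec["s3"]["bucket"]["name"]
            key = rec["s3"]["object"]["key"]
            body = s3.get_object(Bucket=bucket, Key=key)["Body"].read().decode("utf-8")
            rows = list(csv.reader(io.StringIO(body)))[1:]  # skip header row
            with conn.cursor() as cur:
                cur.executemany(
                    "INSERT INTO kpi_metrics (name, value) VALUES (%s, %s)", rows
                )
        conn.commit()
    finally:
        conn.close()
```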

Configured Datadog to monitor AWS Lambda functions, collecting all metrics emitted by Lambda as well as function logs and performance data, to get a complete picture of the serverless applications. Also enabled tracing on the Lambda functions.

Client: Forescout, San Jose, CA

Role: DevOps/Build and Release Engineer

Duration: April 2015 to Sep 2019

Responsibilities:

Created and provided new release branches for source code in SVN, TFS, and Bitbucket to developers.

Collaborated with the DEV team to run Maven builds using Jenkins pipelines. Set up lower-level environments in Build Forge to build and test code.

Provided day-to-day support for all Tomcat/WebSphere and middleware applications and components, ensuring timely resolution of problems. Modified build scripts as required for various build project requests.

Logged in remotely to virtual machines to troubleshoot, monitor, and deploy applications.

Implemented a continuous deployment pipeline with Jenkins and Jenkins Workflow on Kubernetes.

Developed scripts for build, development, maintenance, and related tasks using Jenkins, Docker, Maven, Groovy, and shell. Worked in Visual Studio Team Confidential to manage a backlog of daily activities.

Used Jira ticketing to create new projects and integrate new components into the application.

Configured, automated, and deployed Chef and Ansible for configuration management of existing infrastructure.

Created automation scripts using Python for testing various applications, as well as the integration of these applications (APIs and UIs) based on REST calls.

Implemented, managed, administered, and troubleshot middleware components on RHEL virtual machines.

Deployed code to various environments on DEV team requests using branching and merging techniques.

Automated build and deployment of application code to various environments from repositories (SVN, Bitbucket, TFS).

Experienced in working with various Python integrated development environments such as IDLE, PyCharm, and Sublime Text.

Involved in developing a test environment on Docker containers and configuring the Docker containers using Kubernetes. Responsible for setting up infrastructure design and automation of JBoss EAP.

Experience in using Nexus and Artifactory Repository Managers for Maven and Ant builds.

Used Ansible and Ansible Tower as configuration management tools to automate repetitive tasks, quickly deploy critical applications, and proactively manage change.

Performed Sonar tests and OAD scans to review and inspect code quality and security and to detect bugs.

Extensively used build automation tools such as Maven and Ant to build deployable artifacts, such as WAR files, from source code.

Good experience with JBoss mod_cluster clustering; managed Docker orchestration and Docker containerization using Kubernetes.

Managed Jenkins pipelines to auto-trigger the build process and publish artifacts to Artifactory.

Responsible for configuring the deployment process to the target application server, troubleshooting deployments, and providing support to deploy source code to PROD.

Used Kubernetes to orchestrate the deployment, scaling and management of Docker Containers.

Continuously improved the build and release process, using Ansible Tower to automate CD tasks and actively manage changes.

Virtualized servers using Docker for test and dev environment needs. Built and maintained Docker container clusters managed by Kubernetes on Linux, using Git and Docker.

Created and built Jenkins jobs, stored the build artifacts in Nexus, and deployed using preconfigured scripts.

Participated in monthly releases by deploying code changes to the production server; provided support for application maintenance and maintained site collections, templates, access permissions, and forms.

Client: ETCC, Richardson, TX

Role: Senior Quality Assurance Analyst

Duration: December 2010 - April 2015.

Description: Electronic Transaction Consultants Corporation (ETCC) is a leading provider of toll-collection solutions and services, delivered through systems integration, consulting, maintenance, and operations. Major clients include NTTA, the Houston tollway (HCTRA), Georgia, and Seattle.

Responsibilities:

Worked on the HCTRA BOS, HCTRA Legacy, and NTTA Core projects.

Reviewed and executed test plans and test cases for the transaction-processing, account-creation, and payment modules.

Developed test procedures that were used for testing and delivered to the client.

Implemented an efficient and effective testing strategy, test conditions, and test scripts.

Executed SQL queries to verify the dataflow from the backend and validated the customer account information.

Generated requests and responses in SoapUI to validate data.

Worked on file processing for fleet accounts.

Worked on Oracle and Crystal Reports.

Generated reports to review the performance of the application, including reconciliation of reports.

Wrote complex SQL joins, unions, and nested queries on various data tables for data validation.

Generated different types of reports using Quality Center.

Executed a keyword-driven testing framework using Selenium WebDriver.
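
A minimal sketch of the keyword-driven pattern in Python with Selenium WebDriver; the keyword table, locators, and URL are illustrative:

```python
from selenium import webdriver
from selenium.webdriver.common.by import By

# keyword table: each row is (keyword, locator, value); data is illustrative
TEST_STEPS = [
    ("open", None, "https://example.com/login"),
    ("type", (By.ID, "username"), "testuser"),
    ("type", (By.ID, "password"), "secret"),
    ("click", (By.ID, "submit"), None),
]


def run_steps(driver, steps):
    """Dispatch each keyword to the matching WebDriver action."""
    for keyword, locator, value in steps:
        if keyword == "open":
            driver.get(value)
        elif keyword == "type":
            driver.find_element(*locator).send_keys(value)
        elif keyword == "click":
            driver.find_element(*locator).click()


driver = webdriver.Chrome()  # assumes a local chromedriver
try:
    run_steps(driver, TEST_STEPS)
finally:
    driver.quit()
```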

Verified event logs in the case of failed test cases.

Performed GUI and functional testing of different modules of the application.

Created data for testing purposes using automated test scripts.

Gathered, analyzed, negotiated, defined, and documented requirements for approval based on project scope and schedule.

Attended meetings with clients to discuss needs and potential solutions.

Based on those discussions, created requirements specifying the scope of projects.

Coordinated with all stakeholders to identify scope, evaluate solution options, estimate cost, and assisted project manager to prepare a proposal.

Managed time to ensure sign-off of deliverables (requirements and functional design documents) per the project schedule.

Assisted and/or guided the development team in understanding requirements.

Supported testing team during creation of test cases and testing stage.

Proactively communicated and collaborated with business leads to analyze information needs and functional requirements, and delivered artifacts such as functional requirement documents, use cases, mock-up screens, and interface designs.

Participated in various meetings and discussed enhancement and modification requests.

Interacted with developers to report and track defects.

Client: Dataquest Entertainment Pvt. Ltd, Hyderabad, India

Role: Business Analyst & Quality Analyst

Duration: July 2008 - November 2010.

Responsibilities:

Obtained a detailed knowledge of current business processes involved in the project environment.

Collaborated extensively with the users and with different levels of management to identify requirements, business events and to develop functional specifications for the proposed system.

Conducted analysis of business processes and requirements and identified technology-enabled improvements. Used an agile software development methodology.

Analyzed key data fields from the existing systems and how they relate to the required conversion data fields on the wmA System.

Translated business requirements into functional specifications for communication to technical team throughout various releases of the project.

Interviewed SMEs (Subject Matter Experts), asking detailed questions and carefully recording the requirements in a format that could be reviewed and understood by both businesspeople and technical people.

Analyzed User Requirement Document, Business Requirement Document (BRD), Technical Requirement Specification and Functional Requirement Specification (FRS).

Involved with all the phases of Software Development Life Cycle (SDLC) methodologies throughout the project life cycle.

Served as a resource for analytical services utilizing SQL Server

Worked on UAT testing and took signoff from the business users.

Documented software defects using a bug-tracking system and reported defects to software developers. Involved in planning, analyzing bugs, execution, and bug tracking.

Involved in creating process flow diagrams, use case diagrams, class diagrams, and interaction diagrams with the team.

Documented functional specifications and test plan.

Involved in creating data-mapping documents to map the data fields, which assist the middleware team in routing and parsing interface messages between different applications.

EDUCATIONAL DETAILS:

B.Com in Computer Science, Andhra University.

MBA from SV University, Tirupati, India


