
Spoorthy Sankuru

DevSecOps Engineer

GitHub: https://github.com/orgs/Spoorthy-Projects/repositories
Location: Texas | Mobile: 1-903-361-6120 | Email: ad13bi@r.postjobfree.com

CAREER OBJECTIVE

A cloud enthusiast with 4+ years of IT industry experience in Software Configuration Management (SCM), Infrastructure Management, IaC, IaaS, PaaS, Packaging, Building and Deploying, Operational Analysis, Release Management, CI/CD, Debugging, Maintaining Storage and Backup, Monitoring, Automation, and Control & Environment Management activities in Web & Enterprise Applications, following Agile methodology and Data and DevOps operations.

SKILLS (on the scale of highest experience to lowest)

CLOUDS: AWS, Microsoft Azure, Google Cloud Platform

CONTAINERIZATION: Docker, Kubernetes, Apache Airflow, EKS, Docker Private Registry, AKS, ECR, GKE, Artifact Registry, OpenShift, Nginx

SCRIPTING LANGUAGES: Bash, Shell, PySpark, Python, YAML, Groovy

WEB TECHNOLOGIES: HTML, CSS, Bootstrap, JavaScript, JSON, Java JEE, jQuery and AJAX

CI/CD: Jenkins CloudBees, Ansible Tower, GitLab, XLR, AWS CodePipeline, RHEL, Azure DevOps, Helm Charts, ArgoCD, GitHub Actions

CONFIGURATION & ORCHESTRATION: Ansible, Ansible Tower, Terraform, AWS CFT, AWS CDK, Chef

BUILD & ARTIFACT: HashiCorp Vault, JFrog, Apache Maven, SonarQube, Nexus, Apache Tomcat, AWS Artifact, Azure Artifacts, Gradle, Veracode

VERSION CONTROL & BUG TRACKING: GitHub, GitLab, Bitbucket, SVN, Azure Repos, JIRA, Azure Boards

MONITORING: CloudWatch, Splunk, Amazon SQS, SNS, Prometheus, Grafana, New Relic, AppDynamics, DataDog, Dynatrace, Nagios, Amazon Connect

OPERATING SYSTEMS & DATABASES: Ubuntu, Unix, Linux, Athena, AWS Glue Studio, Oracle SQL, PL/SQL, Informatica PowerCenter, MySQL, PostgreSQL, RDS, Redshift, Aurora

EDUCATION

MASTER'S IN COMPUTER SCIENCE, 12/2022
Auburn University at Montgomery (AUM), Montgomery, AL, USA

BACHELOR'S IN COMPUTER SCIENCE AND ENGINEERING, 05/2017
CMR Engineering College (JNTUH), Hyderabad, TS, India

PROFESSIONAL EXPERIENCE

Bank of America, Addison, Texas (Hybrid)

Role: DevSecOps Engineer 08/2023 – Present

Responsibilities:

• Worked in Horizon’s GIS (Global Information Security), Application Management Continuous Delivery (AMCD) team with a security-first mindset.

• Onboarded new components under many AITs, building the complete Continuous Delivery pipeline with Ansible Tower to automate repetitive tasks and quickly deploy critical applications, writing Jenkinsfiles as a descriptive way of defining the pipeline, and deploying to the DEV, SIT, UAT, and PROD environments using Horizon XLR release templates.
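As an illustrative sketch of the Tower-driven automation this kind of pipeline relies on, the Python snippet below launches an Ansible Tower job template through Tower's REST API. The host, template ID, token, and target_env variable are hypothetical placeholders, not values from the actual Horizon setup.

```python
import requests

# Hypothetical Tower host, job template id, and token; real values are environment-specific.
TOWER_HOST = "https://tower.example.com"
JOB_TEMPLATE_ID = 42
TOKEN = "REDACTED"  # an Ansible Tower OAuth2 / personal access token

def launch_deploy(env: str) -> int:
    """Launch a Tower job template for the given environment and return the new job id."""
    resp = requests.post(
        f"{TOWER_HOST}/api/v2/job_templates/{JOB_TEMPLATE_ID}/launch/",
        headers={"Authorization": f"Bearer {TOKEN}"},
        # Passing extra_vars assumes the template has "prompt on launch" enabled for variables.
        json={"extra_vars": {"target_env": env}},  # e.g. DEV, SIT, UAT, PROD
        timeout=30,
    )
    resp.raise_for_status()
    return resp.json()["job"]

if __name__ == "__main__":
    print("Launched Tower job", launch_deploy("UAT"))
```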


• Onboarded unprotected accounts from KeePass to HashiCorp Vault, following strategic protection and rotation solutions for interactive and non-interactive service accounts.

• Created namespaces and vaulted credentials into the LDAP, Oracle, and MS SQL engines for all environments using Insomnia.

• Worked on Linux hosts, Windows services, and container applications built with OpenShift as the orchestration tool.

• Based on the CI builds run by developers (input parameters), created automated pipelines using Ansible playbooks and deployed them through Ansible Tower via regular and ad-hoc templates as well as Jenkins CloudBees.

• Created manifest files for maintaining OpenShift containers, wrote Ansible playbooks to add tasks to the automated pipeline, configured Tower inventories, projects, and templates, and created XLR release templates to deploy the components to UAT and PROD.

• Actively involved in on-call support for every Friday’s production release and helped fix PROD issues.

Alephnet LLC, Lubbock, Texas (Remote)

Role: Data & DevOps Engineer 05/2023 – 06/2023

Responsibilities:

• Gained experience in a startup environment by architecting, building, and launching new data models that provide intuitive analytics to our banking customers.

• Hands-on experience setting up workflows from scratch with DAGs in Apache Airflow, ensuring automated scripts execute daily in production.
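A minimal sketch of such a workflow, assuming Airflow 2.x: one DAG scheduled daily that runs a single automated script. The DAG id, script path, and start date are illustrative placeholders.

```python
from datetime import datetime

from airflow import DAG
from airflow.operators.bash import BashOperator

# Hypothetical daily DAG; the script path and dates are placeholders, not production values.
with DAG(
    dag_id="daily_ingest",
    start_date=datetime(2023, 5, 1),
    schedule_interval="@daily",  # guarantee one run per day in production
    catchup=False,
) as dag:
    run_script = BashOperator(
        task_id="run_ingest_script",
        bash_command="python /opt/scripts/ingest.py",
    )
```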

• Designed, built, and launched highly efficient and reliable data pipelines using AWS Glue Studio and PySpark to move data into large data warehouses.

• Developed Python automations for migrating full loads and CDC loads from external sources such as S3 (Parquet and text files) and transformed them with PySpark on our data science platforms.
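The following PySpark sketch illustrates the general full-load/CDC merge pattern described here; the bucket paths, key column (id), and timestamp column (updated_at) are hypothetical, and both sources are assumed to share one schema.

```python
from pyspark.sql import SparkSession, functions as F
from pyspark.sql.window import Window

spark = SparkSession.builder.appName("cdc-migration").getOrCreate()

# Hypothetical S3 locations; real buckets and schemas are environment-specific.
full_load = spark.read.parquet("s3://example-bucket/full/")
cdc = spark.read.parquet("s3://example-bucket/cdc/")

# Keep only the newest CDC row per key, then overlay it on the full load.
w = Window.partitionBy("id").orderBy(F.col("updated_at").desc())
latest_cdc = cdc.withColumn("rn", F.row_number().over(w)).filter("rn = 1").drop("rn")

merged = (
    full_load.join(latest_cdc.select("id"), on="id", how="left_anti")  # untouched rows
    .unionByName(latest_cdc)                                           # plus updated rows
)
merged.write.mode("overwrite").parquet("s3://example-bucket/warehouse/")
```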

• Installed and configured HashiCorp Vault as a Docker container and used it to store all our secrets.
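As a sketch of this setup, assuming a Vault dev-mode container and the hvac Python client, the snippet below writes a secret to the KV v2 engine and reads it back; the address, token, and secret path are placeholders.

```python
import hvac

# Hypothetical dev-mode Vault container, e.g. started with:
#   docker run --cap-add=IPC_LOCK -e VAULT_DEV_ROOT_TOKEN_ID=dev-root-token \
#       -p 8200:8200 hashicorp/vault
client = hvac.Client(url="http://127.0.0.1:8200", token="dev-root-token")

# Write a secret into the KV v2 engine, then read it back.
client.secrets.kv.v2.create_or_update_secret(
    path="app/db", secret={"username": "svc_user", "password": "example"}
)
read = client.secrets.kv.v2.read_secret_version(path="app/db")
print(read["data"]["data"]["username"])
```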

• Built custom Docker images, Dockerfiles, and containers from scratch and maintained them.

• Created infrastructure from scratch using Terraform and automated its configuration using Ansible playbooks.

• Prototyped a CI/CD system with GitLab on AWS, utilizing Docker and EC2 instances as the runtime environment to build, test, and deploy our data science applications.

• Used Azure DevOps services such as Azure Repos, Azure Boards, Azure Test Plans, and Azure Pipelines to plan, develop code, and build and deploy for one of our clients.

Capgemini India Pvt Ltd, India (Onsite)

Role: Software Engineer - DevOps 03/2018 – 08/2021

Responsibilities:

• Created various POCs of CI/CD pipelines by integrating source control tools like GitHub, build tools like Maven, CI tools like Jenkins, and deployment automation tools like Ansible, Terraform, Docker, and EKS clusters.

• Presented a demo to clients on CI/CD pipelines, containerization, and Kubernetes (K8s) clusters to launch DevOps services.

• Worked under Agile methodology, participating in Kanban and Scrum meetings, code reviews, incident and change management, disaster recovery, and Confluence documentation; knowledgeable in GitOps and DevSecOps approaches.

• Set up AWS services with Infrastructure as Code (IaC), using CloudFormation templates and Terraform scripts for the Dev, Test, and Prod environments.

• Handled AWS services and applied version upgrades, instance resizing, tag updates, and other infrastructure changes using CloudFormation (CFT) and Terraform templates written in JSON.
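A hedged sketch of this kind of change, applied with boto3 against CloudFormation: the stack name, template, parameter, and AMI id are hypothetical, and a real stack would carry the full resource definitions.

```python
import json

import boto3

cfn = boto3.client("cloudformation")

# Hypothetical JSON template fragment with a parameterized instance size.
template = {
    "AWSTemplateFormatVersion": "2010-09-09",
    "Parameters": {"InstanceType": {"Type": "String", "Default": "t3.micro"}},
    "Resources": {
        "AppInstance": {
            "Type": "AWS::EC2::Instance",
            "Properties": {
                "InstanceType": {"Ref": "InstanceType"},
                "ImageId": "ami-12345678",  # placeholder AMI id
                "Tags": [{"Key": "env", "Value": "dev"}],
            },
        }
    },
}

# Apply an instance-resize by updating the stack with a new parameter value.
cfn.update_stack(
    StackName="app-stack",
    TemplateBody=json.dumps(template),
    Parameters=[{"ParameterKey": "InstanceType", "ParameterValue": "t3.small"}],
)
```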

• Reduced AWS costs and right-sized services in preparation for containerization, initiating cost-cutting strategies, maintaining workloads in Docker containers, and storing the images in ECR.

• Implemented Jenkins pipelines and Docker-based microservice deployments with CI/CD into Kubernetes (K8s) clusters on EKS.
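As an illustration of the deployment target, here is a minimal sketch using the official Kubernetes Python client to create a Deployment on a cluster such as EKS; the image URI, names, and replica count are assumptions, not details from the actual pipelines.

```python
from kubernetes import client, config

config.load_kube_config()  # or config.load_incluster_config() when running in a pod

# Hypothetical microservice image hosted in ECR.
container = client.V1Container(
    name="orders",
    image="123456789012.dkr.ecr.us-east-1.amazonaws.com/orders:1.0.0",
    ports=[client.V1ContainerPort(container_port=8080)],
)
deployment = client.V1Deployment(
    api_version="apps/v1",
    kind="Deployment",
    metadata=client.V1ObjectMeta(name="orders"),
    spec=client.V1DeploymentSpec(
        replicas=2,
        selector=client.V1LabelSelector(match_labels={"app": "orders"}),
        template=client.V1PodTemplateSpec(
            metadata=client.V1ObjectMeta(labels={"app": "orders"}),
            spec=client.V1PodSpec(containers=[container]),
        ),
    ),
)
client.AppsV1Api().create_namespaced_deployment(namespace="default", body=deployment)
```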

• Involved in bi-weekly on-call support to monitor the database, AWS cloud services, and PROD issues.

CERTIFICATIONS

• CKA: Certified Kubernetes Administrator

• AWS Certified Solutions Architect – Associate

• HashiCorp Certified: Terraform Associate (002)

AWARDS

• “Rising Star, Peer-to-peer” (award given for exceptional performance throughout the BU) in FY 2019-20 Q3.


