
Customer Care Devops Engineer

Location:
Frisco, TX, 75035
Posted:
February 28, 2023


Resume:

Radhika L

425-***-****

advmsa@r.postjobfree.com

Professional Summary

DevOps Engineer with 12+ years in IT, spanning integration, automation, configuration, and deployment in cloud and on-premises environments. Working experience in DevOps, CI/CD pipelines, build and release management, and Agile methodologies.

Experience with DevOps methodologies and tooling: Kubernetes, Docker, Pivotal Cloud Foundry (PCF), AWS, Ansible, CI/CD pipelines, Jenkins, Git, GitLab, Bitbucket, Splunk, etc.

Integrated the build process in Jenkins with GitHub. Built CI/CD pipelines for automated building, deployment, and monitoring using Pipeline as Code in Jenkins.
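As an illustration, a declarative Jenkinsfile for such a Pipeline-as-Code setup might look like the sketch below; the repository URL, stage contents, and script names are placeholders, not an actual project pipeline.

```groovy
// Illustrative Pipeline-as-Code sketch; URL and commands are placeholders.
pipeline {
    agent any
    stages {
        stage('Checkout') {
            steps {
                git url: 'https://github.com/example-org/example-app.git', branch: 'main'
            }
        }
        stage('Build') {
            steps {
                sh 'mvn -B clean package'   // build and unit-test the artifact
            }
        }
        stage('Deploy') {
            steps {
                sh './deploy.sh'            // placeholder deployment step
            }
        }
    }
    post {
        failure {
            echo 'Build failed - notify the team.'
        }
    }
}
```

Keeping the pipeline definition in the repository alongside the code is what lets Jenkins rebuild the full pipeline from source control.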

Knowledge of installing, configuring, and managing monitoring tools such as Splunk and Nagios for continuous monitoring.

Experience creating Dockerfiles, Docker images, and containers, working with Docker Hub, and installing, configuring, and clustering Kubernetes.
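A minimal Dockerfile along these lines, sketched for a generic Java service (the base image, jar name, and port are illustrative assumptions):

```dockerfile
# Illustrative Dockerfile for a Java service; image tag and jar name are placeholders.
FROM eclipse-temurin:17-jre
WORKDIR /app
COPY target/app.jar app.jar
EXPOSE 8080
ENTRYPOINT ["java", "-jar", "app.jar"]
```

Building and publishing the image would then be, for example, `docker build -t example/app:1.0 .` followed by `docker push example/app:1.0` (hypothetical tag).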

Experience deploying to and orchestrating containers (Docker, Kubernetes).

Used Kubernetes to deploy, scale, load-balance, and manage Docker containers with multiple namespaced versions.

Experience installing and configuring Kubernetes, clustering nodes, and managing local deployments in Kubernetes.
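A deployment of this kind is typically described in a Kubernetes manifest like the sketch below; the name, namespace, image tag, and replica count are placeholders.

```yaml
# Illustrative Deployment manifest; all names and values are placeholders.
apiVersion: apps/v1
kind: Deployment
metadata:
  name: example-app
  namespace: example-ns
spec:
  replicas: 3
  selector:
    matchLabels:
      app: example-app
  template:
    metadata:
      labels:
        app: example-app
    spec:
      containers:
      - name: example-app
        image: example/app:1.0
        ports:
        - containerPort: 8080
```

Applying it with `kubectl apply -f deployment.yaml` creates or updates the pods; the label selector is what ties the Deployment to the pods it manages.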

Implemented microservices on the Pivotal Cloud Foundry (PCF) platform built upon Spring Boot services.

Hands-on experience with Pivotal Cloud Foundry deployments and configuration.

Experience deploying microservice applications to the Pivotal Cloud Foundry (PaaS) platform using the CF command-line interface.
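A `cf push` of such a microservice is usually driven by a manifest; a hedged sketch follows, where the app name, route, memory, and buildpack are hypothetical values:

```yaml
# Illustrative PCF manifest.yml; names, route, and sizing are placeholders.
applications:
- name: example-service
  memory: 1G
  instances: 2
  path: target/example-service.jar
  buildpacks:
  - java_buildpack
  routes:
  - route: example-service.apps.example.com
```

With this file in place, `cf push -f manifest.yml` stages the jar with the Java buildpack and maps the route.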

Experience with AWS CodePipeline and creating CloudFormation templates in JSON and YAML.
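A minimal CloudFormation template in YAML might look like this sketch; the parameterized S3 bucket is purely illustrative, not a resource from any actual stack:

```yaml
# Illustrative CloudFormation template; the bucket is a placeholder resource.
AWSTemplateFormatVersion: '2010-09-09'
Description: Minimal example template
Parameters:
  BucketName:
    Type: String
Resources:
  ExampleBucket:
    Type: AWS::S3::Bucket
    Properties:
      BucketName: !Ref BucketName
Outputs:
  BucketArn:
    Value: !GetAtt ExampleBucket.Arn
```

The same structure (Parameters, Resources, Outputs) carries over directly to the JSON form of a template.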

Monitored servers using tools such as Nagios, Splunk, and Grafana, and provided 24x7 on-call support on a rotation basis. Exposed to all aspects of the Software Development Life Cycle (SDLC), including analysis, design, implementation, and testing.

Used REST APIs to build web services that provide interoperability between computer systems on the Internet. RESTful web services allow requesting systems to access and manipulate textual representations of web resources through a uniform, predefined set of stateless operations.

Used POST and GET methods to post information to and retrieve information from REST APIs, exercising them from web applications via the Postman tool.
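The POST/GET interaction described above can be sketched in Python using only the standard library; the endpoint URL and payload here are hypothetical, and Postman exercises the same requests interactively:

```python
import json
import urllib.request

# Hypothetical endpoint; the real service URL would come from the project.
BASE_URL = "https://api.example.com/customers"

def build_get(customer_id):
    """Build a GET request that reads one resource (no network call yet)."""
    return urllib.request.Request(f"{BASE_URL}/{customer_id}", method="GET")

def build_post(payload):
    """Build a POST request that creates a resource from a JSON body."""
    body = json.dumps(payload).encode("utf-8")
    return urllib.request.Request(
        BASE_URL,
        data=body,
        method="POST",
        headers={"Content-Type": "application/json"},
    )

# urllib.request.urlopen(req) would actually send either request.
get_req = build_get(42)
post_req = build_post({"name": "example"})
```

Sending the requests is a single `urlopen` call; keeping request construction separate makes the GET/POST semantics easy to see and test.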

Comprehensive knowledge of the Software Development Life Cycle (SDLC), with a thorough understanding of its phases, including project initiation, planning, executing, controlling, and closure.

Experience working with methodologies such as SDLC, Scrum, Agile, Kanban, sprint planning, and Waterfall.

Good knowledge in using bug tracking and ticketing tools like JIRA, IBM ALM, Remedy and ServiceNow.

Higher Education

Bachelor of Technology in Computer Science & Engineering, KL University, Vijayawada, India.

Technical Abilities

Operating Systems

Redhat Linux (RHEL7), Ubuntu, Windows

Web Technologies

Apache, Jboss and Tomcat, IIS

Cloud Virtualization

AWS (Amazon Web Services), Pivotal Cloud Foundry (PCF), Kubernetes, Docker

Continuous Integration Tools

Jenkins and Hudson

Code Quality Tools

SonarQube

Build Tools

Maven and Ant

Version Control Systems

SVN, Bitbucket, Git, GitLab

Artifactory Management

Artifactory, Nexus

Container Technologies

Docker, ECS

Database Systems

Oracle and MySQL

Scripting Knowledge

Unix Shell Scripting

Monitoring Products

Nagios, Splunk, Kibana, Grafana

Security Products

SSL and tcpdump

Network Services

SSH, DNS, FTP

Project Management

Jira and Rally

Configuration Management Tools

Ansible, Chef

Professional Experience

Client: Verizon Mar 2022 – Dec 2022.

Location: Irving, TX

Role: Production Support

Verizon is an American multinational telecommunications corporation and the largest provider of mobile & fixed telephony services. Doc Monitoring is the team that continuously monitors production on Grafana dashboards and checks Kibana logs, raising alerts when an application crosses threshold values. The microservices cover various APIs such as credit check, payments, billing, customer care, Order and Subscription Mgmt., and Product and Offer Mgmt. We are responsible for creating the dashboards and reporting issues, spikes, etc., whenever customers access the Verizon application.

Performed health checks on code-deployment nights.

Supported the deployment of microservices to production.

Provided 24/7 on call Prod support for the existing applications.

Entered bugs with the required priority and severity using the JIRA tool and assigned them to the development team.

Maintained project-related documents in SharePoint for version control.

Created Grafana dashboards based on requirements and monitored them.

Checking the logs via Kibana.

Created/modified Git branches based on the requirements for the dashboards.

Provided day-to-day support of the environment.

Monitored the applications using the Kibana and Grafana monitoring tools.

If a threshold was crossed, proactively created a CMD and worked with the respective teams to resolve the issue.

Monitored Glassbox to check the customer experience.

Monitored and reported Headspin and Catchpoint alerts.

Worked with hectic schedules, tight deadlines, a fast-paced environment, and changing needs.

Developed SQL queries for accessing Oracle databases for data.

Good understanding of different types of search and reporting commands in Splunk.

Created Regular Expressions for Field Extractions and Field Transformations in Splunk.
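A field extraction of this kind boils down to a named-group regular expression; the small Python sketch below runs against a hypothetical log format, and Splunk field extractions use similar PCRE-style named capture groups:

```python
import re

# Hypothetical log layout; a real Splunk extraction targets the
# application's actual log format.
LOG_PATTERN = re.compile(
    r"status=(?P<status>\d{3})\s+latency_ms=(?P<latency_ms>\d+)"
)

def extract_fields(line):
    """Return a dict of extracted fields, or an empty dict if no match."""
    match = LOG_PATTERN.search(line)
    return match.groupdict() if match else {}

sample = "2022-05-01 12:00:01 host=web01 status=500 latency_ms=932"
fields = extract_fields(sample)
```

The same pattern, dropped into a Splunk field extraction or a `rex` search command, yields `status` and `latency_ms` as searchable fields.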

Used Remedy as ticketing system for raising troubleshooting tickets to Environment Support teams.

Used Jenkins for continuous integration and Maven for continuous deployment to the Tomcat application server.

Worked closely with software developers and DevOps to debug software and system problems.

Provided 24/7 on call QC support for the existing applications running on Apache Server.

Environment: Java, JIRA, Oracle, Tomcat, SQL, Unix, Splunk, Git, Kibana, Grafana.

Wells Fargo

Duration: April 2020– Mar 2022

Role: Kafka Support Engineer.

Platform Engineering is the senior infrastructure engineering team for the company's Digital Technology organization. It provides engineering & application support to our partners across several technologies running on Red Hat Enterprise Linux, including but not limited to Apache HTTPD Webserver, Nginx, Confluent Kafka, Oracle Java, Oracle WebLogic, Apache Tomcat, F5 iRules, Kubernetes, Docker, and AppDynamics. Platform Engineering's mission is to deliver world-class engineering and support for the digital channels and products that serve our customers' online & mobile financial needs. Our subject matter experts identify & mitigate technical problems, motivate & empower our partners, and provide technical solutions that our business relies upon and our customers expect.

●Installed, configured, and supported Confluent and Apache Kafka in large-scale environments.

●Good knowledge of Jira, Git, Artifactory & Jenkins, and scripting experience using Unix shell scripting.

●Extensive experience configuring and deploying Java-based web applications, automating processes & procedures, and tuning & troubleshooting Java applications running on Linux.

●Attended daily scrum meetings; collaborated with team members, business partners & clients; and provided strategy, technical guidance & consultation to developers and pre-production operations teams.

●As an engineer on Platform Engineering supporting the DSPT (Kafka) platform, responsibilities included establishing technology road maps, implementing standards & best practices, providing gold images of binaries, responding to security findings, performing upgrades, automating processes & procedures, identifying & mitigating technical problems, developing documentation & departmental technical procedures, and providing 4th-level support to production and pre-production operations teams.

●Involved in building Continuous Integration and Continuous Deployment/Delivery pipelines for enterprise application builds & deployments of on-prem applications.

●Developed a CI/CD system with Jenkins in a Kubernetes container environment. Created Docker images using a Dockerfile.

●Used Kubernetes to deploy Docker containers into pod clusters on multiple nodes in the Test, Staging, and Production environments.

●Created pods with Kubernetes through YAML scripts and deployed them to Docker containers in the environments.
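On the Kafka side of this platform, broker behavior is driven by `server.properties`; the fragment below is a minimal illustrative sketch, where the broker ID, paths, and hostnames are placeholders rather than any real environment's settings:

```properties
# Illustrative Kafka broker settings; IDs, paths, and hosts are placeholders.
broker.id=1
listeners=PLAINTEXT://:9092
log.dirs=/var/lib/kafka/data
zookeeper.connect=zk1:2181,zk2:2181,zk3:2181
num.partitions=3
default.replication.factor=3
log.retention.hours=168
```

Replication factor and retention are the knobs that matter most at large scale, since they trade durability and disk usage against recovery time.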

Environment: Jira, GitHub, Jenkins, Maven, Artifactory, Kafka, Kubernetes, Docker, Java, Splunk.

Client: Southwest Airlines Duration: Nov 2018 – April 2020

Location: Dallas, TX

DevOps Engineer

Southwest Airlines, being in the travel domain, needs to respond quickly to changes in the industry. DevOps enables Southwest Airlines to make code changes quickly and easily, an advantage for any company that needs to deliver new features to customers rapidly and respond to market behaviour. DevOps also provides the resiliency to recover quickly from service outages, which can be disastrous in the travel industry; issues related to outages and delivery can be identified easily within a DevOps framework. The nature of the project is customization of the DevOps framework across the complete SDLC, delivering the solution and implementing it. Process optimization and automation are achieved by providing proven frameworks, incorporating automated testing within the CI/CD pipelines, and optimizing performance testing within the release lifecycles, as well as identifying the infrastructure required for the specific PaaS.

●Worked on Building Continuous Integration and Continuous Deployment / Delivery Pipelines for Enterprise Application Build & Deployments on On-Prem applications.

●Developed a CI/CD system with Jenkins in a Kubernetes container environment, utilizing Kubernetes and Docker to build, test, and deploy.

●Implemented the Continuous Integration and Continuous Delivery process using Jenkins along with Groovy and shell scripts to automate routine jobs.

●Created Docker images using a Dockerfile.

●Automated deployment of applications using Jenkins pipelines, with Groovy scripts as Pipeline as Code.

●Worked on the Kubernetes platform to orchestrate Docker containers using deployments, scheduling, load balancing, and services for different applications.

●Used Kubernetes to deploy Docker containers into Pod Clusters on multiple Nodes in Test, Staging and Production Environments.

●Made use of Kubernetes to orchestrate the deployment, scaling, and management of Docker containers.

●Created pods with Kubernetes through YAML scripts and deployed them to Docker containers on various nodes in the environments.

●Used Ansible as a configuration management tool to automate repetitive tasks, quickly deploy critical applications, and proactively manage change.
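An Ansible playbook for this kind of repetitive deployment task might look like the following sketch; the host group, package, and file paths are illustrative assumptions, not the project's actual inventory:

```yaml
# Illustrative playbook; host group, package, and paths are placeholders.
- name: Deploy and restart the web tier
  hosts: webservers
  become: true
  tasks:
    - name: Ensure Apache HTTPD is installed
      ansible.builtin.yum:
        name: httpd
        state: present
    - name: Deploy the application archive
      ansible.builtin.copy:
        src: files/app.war
        dest: /opt/app/app.war
    - name: Restart the service
      ansible.builtin.service:
        name: httpd
        state: restarted
```

Running it with `ansible-playbook -i inventory deploy.yml` applies the same steps to every host in the group, which is what makes the task repeatable.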

Environment: Jira, Bitbucket, Jenkins, Maven, Artifactory, ServiceNow, Groovy scripting, Kubernetes, Docker, Java, Splunk.

Client: AT&T Duration: Jun 2014 – Oct 2018

Location: Richardson, TX

DevOps Engineer

Project: CSTEM

Project Description: The project involves a Continuous Integration and Continuous Delivery model for Java microservices, using DevOps to configure, manage, and deploy applications to the middleware infrastructure.

Responsibilities:

●Involved in building Continuous Integration and Continuous Deployment / Delivery Pipelines for Enterprise Application Build & Deployments on On-Prem applications.

●Developed a CI/CD system with Jenkins in a Kubernetes container environment, utilizing Kubernetes and Docker to build, test, and deploy.

●Implemented the Continuous Integration and Continuous Delivery process using Jenkins along with Groovy and shell scripts to automate routine jobs.

●Created Docker images using a Dockerfile.

●Automated deployment of applications using Jenkins pipelines, with Groovy scripts as Pipeline as Code.

●Worked on the Kubernetes platform to orchestrate Docker containers using deployments, scheduling, load balancing, and services for different applications.

●Used Kubernetes to deploy Docker containers into pod clusters on multiple nodes in the Test, Staging, and Production environments.

●Made use of Kubernetes to orchestrate the deployment, scaling, and management of Docker containers.

●Created pods with Kubernetes through YAML scripts and deployed them to Docker containers on various nodes in the environments.

●Used Ansible as a configuration management tool to automate repetitive tasks, quickly deploy critical applications, and proactively manage change.

●Deployed microservice applications to the Pivotal Cloud Foundry (PCF) platform using CI/CD pipelines.

●Created ORGs & Spaces for the respective environments and, from the PCF Marketplace, accessed the pool of services and configured AppDynamics, syslog, Config Server, etc., per project needs.

●Developed a CI/CD system with Jenkins on Pivotal Cloud Foundry (PCF) to build, test, and deploy microservice applications using Blue/Green deployments.

Used Splunk as a monitoring tool to identify and resolve infrastructure problems before they affect critical processes, and to track CI/CD pipeline build status.

Environment: Jira, Bitbucket, Jenkins, Maven, Artifactory, ServiceNow, Groovy scripting, Pivotal Cloud Foundry (PCF), Kubernetes, Docker, Java, Splunk.

Client: AT&T Feb 2011 – March 2014

Location: Bothell, WA

Role: Application Support

AT&T is an American multinational telecommunications corporation and the second-largest provider of mobile & fixed telephony services. CSI (Common Service Interface) is AT&T's enterprise middleware platform for enabling Business to Business (B2B) partners and internal IT applications; the project covers the areas of customer care, Order and Subscription Mgmt., and Product and Offer Mgmt. We are responsible for testing the middleware in conjunction with its users and the client's processes.

Reviewed the Business Requirement document with the Business & Development team to understand the architecture and functionality of the application.

Installed, configured and administered CSI environments in Development, Test and Performance environments and provided support to End Users.

Provided 24/7 on call QC support for the existing applications running on Apache Server.

Entered bugs with the required priority and severity in QC and assigned them to the development team.

Maintained project-related documents in SharePoint for version control.

Configured and deployed applications in various work environments like Development, Test, Performance.

Created domains and environments on WebLogic 10 for the project in Development, Test, and Pre-Production systems.

Installed and configured enterprise applications on BEA WebLogic Application Server 10.3/9.2/8.1 in a Solaris 10 environment.

Deployed various J2EE applications to the clusters and application servers and supported them accordingly.

Configured and tuned SiteScope for monitoring the Staging environment.

Configured JMS connection pools, queues, topics, JDBC resources, and data sources, bound them to the J2EE applications, and configured the connection pools for the various data sources.

Took thread dumps, pstack, and prstat output and analyzed them to find problems in the application.

Developed shell scripts to automate the WebLogic maintenance process and recover the backed-up WebLogic 9.2 configurations.

Created Users, Groups and Roles and assigned users to Groups and Roles.

Managed and monitored JVM performance via WebLogic heap size, garbage collection, and JDBC pools.

Used POST and GET methods to post information to and retrieve information from the REST API from web applications via the Postman tool.

DELETE, GET, and POST are actions that can be performed using REST; the action is passed on the URI.

The POST or PUT method is used to write data to the API.

GET is used only for reading data.

Cluster configuration – horizontal clustering, vertical clustering, and deployment over clusters.

Created JDBC data sources, JMS queues, and connectors for all new builds and branches.

Created LDAP users and groups in WebLogic environments.

Provided day-to-day support of environments and resolved WLST issues.

Low disk space clearance

Data Restoration

Knowledge and good experience on Model to Execution (M2E), AT&T Frameworks and Tools Software Manager (AFTSWM), DATAGATE, Local Resource Manager (LRM), Global Resource Manager (GRM), AT&T Consolidated Framework Service (ACFS), Consolidate Service Manager (CSM), Direct Messaging Engine (DME), CAET, GRID, CASSANDRA.

Worked with developers to find memory leaks.

Responsible for collecting all thread dumps/heap dumps for troubleshooting issues.

Responsible for collecting and Sending JVM Crash data to Development team.

Monitor the Applications and Servers using Wily Introscope monitoring tool.

Periodically monitored logs for optimal performance.

Worked with hectic schedules, tight deadlines, a fast-paced environment, and changing needs.

Developed SQL queries for accessing Oracle databases for data.

Extensively used Quality Center to Report bugs to developers.

Good understanding of different types of search and reporting commands in Splunk.

Setup Splunk Forwarders for new application levels brought into environment.

Created Regular Expressions for Field Extractions and Field Transformations in Splunk.

Used performance-monitoring tools like Wily Introscope.

Reviewed web server and application server performance monitoring data using Wily Introscope.

Used ARS Remedy as ticketing system for raising troubleshooting tickets to Environment Support teams.


Involved in build and deployment processes for Pre-production environments using Jenkins.

Used Subversion (SVN) as the source code repository and for maintaining config changes.

Used Jenkins for continuous integration and Maven for continuous deployment to the Tomcat application server.

Worked closely with software developers and DevOps to debug software and system problems.


Environment: Java, J2EE technologies, Quality Center, Oracle, WebLogic, TOAD, XML, Unix, TIBCO, SOA, Splunk, SVN, shell scripts, SCM, Jenkins, Maven, Tomcat, JIRA, Wily Introscope, REST API, Postman.

Client: AT&T Feb 2008 – Jan 2011

Location: Bothell, WA

Role: Programmer Analyst

AT&T is an American multinational telecommunications corporation and the second-largest provider of mobile & fixed telephony services. CSI (Common Service Interface) is AT&T's enterprise middleware platform for enabling Business to Business (B2B) partners and internal IT applications; the project covers the areas of customer care, Order and Subscription Mgmt., and Product and Offer Mgmt. We are responsible for testing the middleware in conjunction with its users and the client's processes.

Responsibilities:

• Identified the Business Requirements of the project.

• Involved in preparing the Detailed Design document for the project.

• Developed UI using JSP, Tiles, JavaScript, and CSS.

• Created tile definitions, struts-config files, and validation files for the application using Struts framework.

• Created XML formatted output files.

• Performed requirement analysis, design, coding, implementation, and maintenance of the application, following the complete SDLC lifecycle along with the team lead.

• Did core Java coding using JDK 1.3 and the Eclipse Integrated Development Environment (IDE).

• Coded the following modules:

• Implemented Action Classes and Action Forms using Struts framework.

• Used JDBC to connect to the database.

• Involved in Unit testing, System Testing and writing test cases.

• Designed database tables.

• Wrote SQL queries and stored procedures.

• Used SOAP to test the scenarios using the WSDL and namespaces.

• MySQL Server administration.

• Used IBM ClearCase as version control and workspace management.

• Used Ant as the build tool.

• Apache Tomcat server administration

Environment: Core Java, JSP, Struts 1.1, Eclipse, JDBC, J2EE, Apache Tomcat 5, HTML, JavaScript, MySQL, ClearCase, Ant, SOAP.


