
ETL Developer / DevOps Engineer

Location:
Columbus, OH
Salary:
90000
Posted:
June 14, 2023


Resume:

Sushma Koppara

Ph: 309-***-****

Email: adxpr2@r.postjobfree.com

Professional Summary:

8+ years of experience in all phases of software configuration, DevOps infrastructure builds, the Software Development Life Cycle (SDLC), and the bug life cycle, using methodologies such as Waterfall and Agile.

Implemented Single Sign-On (SSO) through WebSEAL for web applications.

Worked on configuring certificates and troubleshooting SSL issues.

Experienced in writing Ansible playbooks and roles and deploying them to servers using Ansible and Jenkins pipelines.

Experienced with AWS cloud servers and in architecting and implementing infrastructure using various DevOps tools.

Experienced in spinning up AWS services such as EC2, Elastic Beanstalk, load balancers, and Auto Scaling groups, creating and updating IAM components, and creating RDS, S3, Elasticsearch, DynamoDB, and other resources using Terraform and CloudFormation templates.

Created Lambda functions, ECS, EKS, and Fargate resources using Terraform modules, including timely upgrades.

Extensive experience creating custom AMIs using Packer with Chef/Ansible, including kernel upgrades, vulnerability fixes, and application version updates rolled into the AMIs.

Developed infrastructure as code using Terragrunt with Terraform, applying a fully modularized approach; built networking infrastructure, EKS (Kubernetes) clusters, and relational databases in AWS as code.

Built Docker images and stored them in AWS ECR and the JFrog Artifactory registry.

Experience implementing Azure services such as Azure Active Directory (AD), Azure storage, Azure cloud services, IIS, Azure Resource Manager (ARM), Azure Blob Storage, Azure VM, SQL Database, Azure Functions, Azure Service Fabric, and Azure Service Bus.

Designed and developed Azure DevOps pipelines to manage resources across multiple Azure subscriptions.

Worked with version control, build and release management, and deployments of solutions to the DEV, QA, and PROD environments, leveraging Azure DevOps/VSTS CI/CD principles and toolsets including Visual Studio, AKS (Azure Kubernetes Service), Application Insights, and Log Analytics.

Strong programming/scripting and troubleshooting skills, background with extensive knowledge of Unix/Linux, and virtualization/cloud concepts.

Actively involved in the DevOps streamlining process through Jenkins CI.

Very good experience with monitoring tools, their configuration, and custom metric creation using New Relic, Splunk, CloudWatch, etc.
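
For illustration, a minimal sketch of publishing a custom CloudWatch metric with boto3; the namespace, metric name, and dimension values are placeholders, not from any client engagement:

    # Publish a custom metric data point to CloudWatch (hypothetical names).
    import boto3

    cloudwatch = boto3.client("cloudwatch", region_name="us-east-1")
    cloudwatch.put_metric_data(
        Namespace="MyApp/Monitoring",          # hypothetical namespace
        MetricData=[{
            "MetricName": "FailedLogins",      # hypothetical metric
            "Value": 3,
            "Unit": "Count",
            "Dimensions": [{"Name": "Environment", "Value": "prod"}],
        }],
    )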

Skilled in using Jenkins as a continuous integration server, configuring it with GitHub and Maven, as well as for testing.

Deployed Docker-containerized applications onto Kubernetes clusters managed by Amazon Elastic Kubernetes Service (EKS).

Designed and implemented data pipelines using GCP services such as Dataflow and Dataproc.

Experience implementing data security and access controls using GCP's Identity and Access Management (IAM) and Cloud Security Command Center.

Participated in code reviews and contributed to developing best practices for data engineering on GCP.

Experienced in working with version control systems like Git, using source code management client tools such as Git Bash, GitHub, and GitLab.

Experienced in Python scripting, Shell Scripting (Bash, ZSH, KSH, etc.), SQL Server, UNIX, and Linux.

Involved in full life-cycle development, including design, ETL strategy, troubleshooting, reporting, and identifying facts.

Prepared the required application design documents based on the required functionality. Designed ETL processes to load data from Oracle, flat files (fixed width), and Excel files into a staging database, and from staging into the target Oracle data warehouse.

Strong experience writing SQL queries and stored procedures in Oracle databases.

Experienced in Waterfall (SDLC), TDD, BDD, and Agile project environments.

Excellent interpersonal skills; a proven team player with an analytical approach to problem solving and delivering in high-stress environments.

TECHNICAL SKILLS:

Programming

Python, Shell scripting, YAML, Loki and Jaeger

Back-End

Elasticsearch, MySQL, PostgreSQL, MongoDB, Aurora DB

Source Control/Versioning

SVN, Git, Code Cloud

Configuration Management

Chef and Ansible

Project Management / Monitoring Tools

Jenkins, Docker, Crontab, SonarQube, JIRA, Confluence, VersionOne

Applications Servers

WebLogic, Tomcat, Apache, JBoss, NodeJS, Java

Cloud Technologies

AWS, Azure, GCP

Operating System

Linux, Debian, Ubuntu, CentOS 6.x/7.x, RedHat 6.x/7.x, Mac OS X, Windows 10/8/7

Monitoring tools

Splunk, AWS CloudWatch, New Relic, Dynatrace

Professional Experience:

Wipro Limited Oct 2022 - present

Client: Apple, Austin, TX

Responsibilities:

Developing Ansible playbooks and modules; administering Ansible infrastructure; performing maintenance and configuration; troubleshooting and resolving issues in the dev, test, and production environments; and updating system and process documentation, user guides, diagrams, and SOPs as needed.

Interacting with the client, users, and relevant stakeholders, gathering business needs, and converting them into technical specifications.

Using Ansible as a configuration management tool to automate repetitive tasks, quickly deploy critical applications, and proactively manage change.

Use Git and Bitbucket for version control of the code repositories.

Wrote Ansible YAML scripts to configure remote servers and Ansible playbooks to set up the continuous delivery pipeline.

Automated configuration management and deployments using Ansible playbooks with YAML for resource declaration, creating roles and updating playbooks to provision servers with Ansible.
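
For illustration, a minimal sketch of driving such a playbook from Python with the ansible-runner library; the directory layout, playbook name, and variables are assumptions:

    # Run a playbook via ansible-runner and report per-task failures.
    import ansible_runner

    result = ansible_runner.run(
        private_data_dir="/opt/ansible/project",  # assumed to contain inventory/ and project/
        playbook="provision.yml",                 # hypothetical playbook
        extravars={"env": "dev"},
    )
    print(result.status, result.rc)               # e.g. "successful", 0
    for event in result.events:
        if event.get("event") == "runner_on_failed":
            print(event["event_data"].get("task"))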

Monitoring transaction logs using Splunk and analyzing them to find errors.

Working with the development team to deploy applications, and with the DBA team to configure data sources for the application and resolve issues.

Participating in the testing and evaluation of database-related software products.

Responsible for database backup, recovery, monitoring and tuning.

Working with the Rio pipeline for continuous integration; deployed artifacts are stored in Artifactory.

Working with the Spinnaker UI and creating deploy pipelines to deploy the application to Dev, IT, and Prod servers.

Environment: PyCharm, Ansible, Oracle PL/SQL, Git, Linux, Rio, Spinnaker UI, GCP, Artifactory and Splunk.

Client: AT&T Corporation, Saint Louis, MO. Oct 2021 – Sep 2022

Role: DevOps Engineer

Responsibilities:

Worked in an Agile environment during the complete project life cycle, taking part in daily SCRUM meetings, technical elaborations, weekly sprint meetings, and sprint retrospective meetings.

Interacted with the client, users, and relevant stakeholders, gathering business needs and converting them into technical specifications.

Translated business requirements for SSO and SAML federation into technical design, development, and integration; performed advanced troubleshooting for SSO web applications, SAML integrations, digital key management, and federated identity partner integration.

Expertise in designing and implementing enterprise Single Sign-On (SSO) and Identity and Access Management solutions, including federated SSO using SAML.

Experienced in designing the overall Virtual Private Cloud (VPC) environment, including server instances, storage instances, subnets, network access controls, security groups, availability zones, ECR, ECS, EKS, etc.

Used Ansible as a configuration management tool to automate repetitive tasks, quickly deploy critical applications, and proactively manage change.

Wrote Python code using the Ansible Python API to automate the cloud deployment process.

Used Python-based GUI components for front-end functionality such as selection criteria, and created a test harness to enable comprehensive testing.

Worked on IBM ISAM, updating junctions, WebSEAL configuration, definitions, clients, and all configuration per client requirements, and deploying them to the ISAM server using Ansible and Jenkins.

Worked with Jenkins server and slave configurations, creating Jenkins jobs to generate reports and deploy plugins and configurations.
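
For illustration, a minimal sketch of triggering such a Jenkins job from Python with the python-jenkins library; the URL, credentials, job name, and parameter are placeholders:

    # Trigger a parameterized Jenkins build and inspect the job state.
    import jenkins

    server = jenkins.Jenkins(
        "https://jenkins.example.com", username="svc-user", password="api-token"
    )
    server.build_job("isam-config-deploy", {"TARGET_ENV": "dev"})  # hypothetical job
    print(server.get_job_info("isam-config-deploy")["lastBuild"])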

Environment: ISAM 9.0, PyCharm, SAML, Ansible, Git, Python, PuTTY, EKS, WinSCP, Jenkins.

Client: Altice USA - Bethpage, NY May 2019 - Oct 2021

Role: AWS Devops Engineer

Responsibilities:

Work in an Agile environment during the complete project life cycle, taking part in daily SCRUM meetings, technical elaborations, weekly sprint meetings, and sprint retrospective meetings.

Interact with product managers to fine-tune user stories, and with the testing team to approve functional specifications and test cases.

Working with clients and the development team to implement ongoing changes in the environment.

Use Maven as the build tool for application builds and deployments.

Creating jobs in Jenkins to deploy applications.

Use Git and Bitbucket for version control of the code repositories.

Monitoring transaction logs using Splunk and analyzing them to find errors.

Working on continuous integration and continuous deployment using Jenkins and SonarQube.

Creating requests to promote applications to higher environments after successful testing in the dev/stage environments, and maintaining the project architecture and project documentation.

Developed infrastructure as code using Terragrunt with Terraform, applying a fully modularized approach; built networking infrastructure, EKS (Kubernetes) clusters, and relational databases in AWS as code.

Managed and maintained existing GCP Data Management implementations for multiple clients.

Implemented data security and access controls using GCP's Identity and Access Management (IAM) and Cloud Security Command Center.

Created AWS CloudFormation templates to create custom-sized VPCs, EC2 instances, and subnets, and worked on tagging standards for proper identification and ownership of EC2 instances and other AWS services such as S3, CloudTrail, CloudWatch, CloudFront, RDS, SNS, and SQS.
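
For illustration, a minimal sketch of creating a tagged stack from an inline CloudFormation template with boto3; the stack name, template contents, and tag are illustrative only:

    # Create a CloudFormation stack containing a single versioned S3 bucket.
    import json
    import boto3

    template = {
        "AWSTemplateFormatVersion": "2010-09-09",
        "Resources": {
            "AppBucket": {
                "Type": "AWS::S3::Bucket",
                "Properties": {"VersioningConfiguration": {"Status": "Enabled"}},
            }
        },
    }

    cf = boto3.client("cloudformation", region_name="us-east-1")
    cf.create_stack(
        StackName="demo-app-stack",                       # hypothetical stack name
        TemplateBody=json.dumps(template),
        Tags=[{"Key": "Owner", "Value": "platform-team"}],
    )
    cf.get_waiter("stack_create_complete").wait(StackName="demo-app-stack")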

Involved in the complete cycle of migrating physical Linux/Windows machines to the AWS cloud, and configured Apache web server in the Linux AWS cloud environment using Chef automation.

Planned and configured network infrastructure within the VPC with public and private subnets, and configured routing tables, internet gateways, and security groups in AWS.

Administered security and configured user access and limits using AWS Identity and Access Management (IAM) by creating new profiles and policies for user management in JSON.
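
For illustration, a minimal sketch of defining such a JSON policy and creating it with boto3; the policy name and bucket ARNs are placeholders:

    # Create a read-only IAM policy for a single S3 bucket (hypothetical names).
    import json
    import boto3

    policy_doc = {
        "Version": "2012-10-17",
        "Statement": [{
            "Effect": "Allow",
            "Action": ["s3:GetObject", "s3:ListBucket"],
            "Resource": [
                "arn:aws:s3:::example-bucket",
                "arn:aws:s3:::example-bucket/*",
            ],
        }],
    }

    iam = boto3.client("iam")
    iam.create_policy(
        PolicyName="ReadOnlyExampleBucket",   # hypothetical policy name
        PolicyDocument=json.dumps(policy_doc),
    )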

Created detailed AWS security groups, which behave as virtual firewalls controlling the traffic allowed to reach one or more AWS EC2 instances.

Created S3 buckets in AWS and stored files; enabled versioning and security for stored files, and implemented and maintained monitoring and alerting of production and corporate servers/costs using AWS CloudWatch.
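
For illustration, a minimal sketch of enabling bucket versioning and a cost alert with boto3; the bucket name, threshold, and SNS topic ARN are placeholders:

    # Enable S3 versioning, then alarm on estimated monthly charges.
    import boto3

    s3 = boto3.client("s3")
    s3.put_bucket_versioning(
        Bucket="example-bucket",
        VersioningConfiguration={"Status": "Enabled"},
    )

    cloudwatch = boto3.client("cloudwatch", region_name="us-east-1")  # billing metrics live in us-east-1
    cloudwatch.put_metric_alarm(
        AlarmName="monthly-cost-alarm",
        Namespace="AWS/Billing",
        MetricName="EstimatedCharges",
        Dimensions=[{"Name": "Currency", "Value": "USD"}],
        Statistic="Maximum",
        Period=21600,
        EvaluationPeriods=1,
        Threshold=500.0,
        ComparisonOperator="GreaterThanThreshold",
        AlarmActions=["arn:aws:sns:us-east-1:123456789012:cost-alerts"],  # hypothetical topic
    )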

Mentored developers in Kubernetes design and custom application implementation.

Designed and developed Dockerfiles and Kubernetes deployment YAML files to run the microservice-based application.
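
For illustration, a minimal sketch of the same kind of Deployment expressed through the official kubernetes Python client instead of raw YAML; the image and labels are placeholders:

    # Create a two-replica Deployment for a containerized microservice.
    from kubernetes import client, config

    config.load_kube_config()  # assumes a kubeconfig for the target cluster

    container = client.V1Container(
        name="web",
        image="example.registry/web:1.0",  # hypothetical image
        ports=[client.V1ContainerPort(container_port=8080)],
    )
    deployment = client.V1Deployment(
        api_version="apps/v1",
        kind="Deployment",
        metadata=client.V1ObjectMeta(name="web"),
        spec=client.V1DeploymentSpec(
            replicas=2,
            selector=client.V1LabelSelector(match_labels={"app": "web"}),
            template=client.V1PodTemplateSpec(
                metadata=client.V1ObjectMeta(labels={"app": "web"}),
                spec=client.V1PodSpec(containers=[container]),
            ),
        ),
    )
    client.AppsV1Api().create_namespaced_deployment(namespace="default", body=deployment)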

Designed and managed API system deployment using a fast HTTP server and Amazon AWS architecture.

Developed tools using Python, shell scripting, and XML to automate menial tasks.

Developed professional web-based UIs using HTML5, CSS3, Bootstrap, JavaScript, jQuery, and AJAX.

Worked with the ElementTree XML API in Python to parse XML documents and load the data into a database. Implemented Docker containers to create images of the applications and dynamically provision slaves for Jenkins CI/CD pipelines.
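
For illustration, a minimal sketch of that ElementTree pattern, with sqlite3 standing in for the actual target database; the XML shape and table are invented:

    # Parse an XML document and load its rows into a database table.
    import sqlite3
    import xml.etree.ElementTree as ET

    xml_doc = """
    <orders>
      <order id="1"><customer>Acme</customer><total>19.99</total></order>
      <order id="2"><customer>Globex</customer><total>5.00</total></order>
    </orders>
    """

    conn = sqlite3.connect(":memory:")
    conn.execute("CREATE TABLE orders (id INTEGER, customer TEXT, total REAL)")
    for order in ET.fromstring(xml_doc).findall("order"):
        conn.execute(
            "INSERT INTO orders VALUES (?, ?, ?)",
            (int(order.get("id")),
             order.findtext("customer"),
             float(order.findtext("total"))),
        )
    conn.commit()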

Created additional Docker slave nodes for Jenkins using custom Docker images, and worked with all major components of Docker, including the Docker daemon, Hub, images, and registry.

Virtualized servers using Docker for test and dev environment needs, and configured automation using Docker containers.

Environment: AWS EC2 instances, Cloud Watch, Linux, Splunk, Python, Chef Automation, EKS, GCP, Apache Webserver, Access Management (IAM), JSON, AWS Security, Kubernetes, Docker, Jenkins, CI/CD pipelines.

Client: Nike Inc., Portland, OR May 2015 - June 2017

Job Title: Azure DevOps Engineer

Responsibilities:

Experience in using ARM templates (JSON) to create Azure services, while ensuring no changes were made to the existing infrastructure.

Experience implementing Azure services such as Azure Active Directory (AD), Azure storage, Azure cloud services, IIS, Azure Resource Manager (ARM), Azure Blob Storage, Azure VM, SQL Database, Azure Functions, Azure Service Fabric, and Azure Service Bus.

Working knowledge of deploying a CI/CD system using Azure DevOps in a Kubernetes container environment; Kubernetes and Docker were utilized as the runtime environment of the CI/CD system to build, test, and deploy.

Designed and automated Azure Infrastructure as a Service (IaaS), Platform as a Service (PaaS), and SaaS capabilities, including virtual machines, container services, virtual networks, and cloud services.

Working experience with Azure Resource Manager (ARM) to deploy, update, and delete multiple Azure resources, as well as migrating on-premises resources to Azure with Azure site recovery (ASR), and Azure backups.

Used Azure DevOps services such as Azure Repos, Azure Boards, and Azure Test Plans to plan work, collaborate on code development, and build and deploy applications. Developed, maintained, and provided the team with various Azure DevOps-related tools such as deployment tools, staged virtual environments, and provisioning scripts.

Used Shared Image Gallery to store the created images and built Azure pipelines in Azure DevOps to implement all these services in Azure.

Configured the cluster autoscaler for Azure Kubernetes Service (AKS) using Terraform, and worked with scheduling, deploying, and managing pods and replicas in AKS.

Terraform was used along with Packer to create custom machine images, and Ansible was used to install the software dependencies once the infrastructure was provisioned.

Developed and maintained Continuous Integration (CI) using tools in Azure DevOps (VSTS) spanning multiple environments, enabling teams to safely deploy code in Azure Kubernetes Services (AKS) using YAML scripts.

Managed Azure Kubernetes Service (AKS) policies, providing access to different Azure resources and developing and improving the workflows that govern access.

Experience with version control tools such as GIT and Bitbucket. Comprehensive knowledge of source controller concepts including branches, tags, and merges.

Developed build and deploy scripts using Maven and triggered them through Jenkins to migrate applications from one environment to another.

Worked with Azure Monitoring tools such as Azure Log Analytics, Azure Network Watcher, and Azure Service Health to diagnose and minimize service degradation.

Experience using monitoring tools such as Azure Monitor and Dynatrace to set up the desired alerts and avoid disruption.

Created ARM templates for deploying resources into Azure using PowerShell, with continuous integration through VSTS.
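
The deployments here used PowerShell; purely as an illustration, an equivalent ARM template deployment through the azure-mgmt-resource Python SDK looks roughly like this (subscription ID, resource group, and template path are placeholders):

    # Deploy an ARM template with the Azure SDK for Python.
    import json
    from azure.identity import DefaultAzureCredential
    from azure.mgmt.resource import ResourceManagementClient

    client = ResourceManagementClient(DefaultAzureCredential(), "<subscription-id>")

    with open("template.json") as f:  # hypothetical ARM template file
        template = json.load(f)

    client.deployments.begin_create_or_update(
        "demo-rg",          # hypothetical resource group
        "demo-deployment",
        {"properties": {"mode": "Incremental", "template": template}},
    ).result()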

Deployed Azure IaaS virtual machines (VMs) and Cloud services (PaaS role instances) into secure VNets and subnets using PowerShell.

Environment: Azure, AD, ARM, CI/CD, AKS, Azure Databricks, Kubernetes, Docker, GIT, Bitbucket, Maven, PowerShell, PaaS, IaaS, shell scripting, Python, Jenkins.

Client: MaterialSoft, Hyderabad, India Aug 2011 - Nov 2013

Job Title: ETL Developer

Responsibilities:

Involved in understanding the business requirements and translating them into technical solutions.

Prepared design documents and interacted with the data modelers to understand the data model and design.

Created new mappings and updated old mappings according to changes in business logic.

Based on the requirements, created functional design documents and technical design specification documents for the ETL process.

Developed PL/SQL procedures for performing ETL operations. Interacted with business users and source system owners, and designed, implemented, and documented ETL processes and projects based entirely on data warehousing best practices and standards.
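
For illustration, a minimal sketch of invoking such a PL/SQL procedure from Python via cx_Oracle; the connection details and procedure name are placeholders:

    # Call a PL/SQL ETL procedure for a given business date.
    import cx_Oracle

    conn = cx_Oracle.connect(user="etl_user", password="secret",
                             dsn="dbhost:1521/ORCLPDB1")
    cur = conn.cursor()
    cur.callproc("etl_pkg.load_staging", ["2011-09-30"])  # hypothetical procedure
    conn.commit()
    conn.close()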

Extensively involved in Informatica Power Center upgrade from version 7.1.3 to version 8.5.1.

Designed and developed Informatica mappings enabling the extraction, transformation, and loading of data into target tables on versions 7.1.3 and 8.5.1.

Involved in performance tuning and fixed bottlenecks in processes already running in production.

Designed and developed a process to handle high volumes of data and heavy data loads within a given load window or load intervals.

Created workflows, worklets, and tasks to schedule loads at the required frequency using Workflow Manager.

Used SQL overrides to perform certain tasks essential to the business.

Environment: ETL, Microsoft Business Suite, Informatica, SQL Server Management Studio.

EDUCATION:

Bachelor’s in Electrical and Electronics Engineering (2010), JNTU, India.


