Abrar Ahmed
Work: Ph: 331-***-****, Email: ***************@*****.***
LinkedIn: https://www.linkedin.com/in/abrar-ansari-36841a245/
PROFESSIONAL EXPERIENCE:
With more than 8 years of diverse IT experience, I specialize in roles such as Kubernetes Platform Engineer, AWS Cloud Automation Engineer, Cloud Migration Engineer, and DevOps Engineer. I have a proven track record in leading Build & Release Projects, overseeing on-premises data centers, and implementing serverless automation with Python Boto3. My expertise also includes designing and scaling NoSQL data structures and implementing advanced Kubernetes strategies. I have a deep understanding of platform virtualization, continuous integration and deployment, and configuration management, all while effectively managing cloud security protocols. My combination of technical proficiency and leadership abilities allows me to deliver cutting-edge solutions that optimize operational efficiency and contribute to business growth.
PROFESSIONAL SUMMARY:
Skilled in designing, deploying, and operating highly available, scalable, and fault-tolerant systems using Amazon Web Services (AWS).
Adept at working on bare-metal, VMware ESXi, OpenShift, AWS, and Azure platforms.
Well-versed in commissioning, installing, and provisioning UNIX systems in greenfield data centers.
Experienced in performance monitoring, security, troubleshooting, backup, disaster recovery, maintenance, and support of UNIX systems.
Specialized in implementing organizational DevOps strategy across Linux server environments alongside AWS cloud strategies.
Proficient in writing CloudFormation templates (CFT) and Terraform code to build AWS services under the Infrastructure as Code paradigm.
Proven experience migrating IBM mainframe-based legacy applications into highly available, reliable CI/CD pipelines.
Administered build servers such as TeamCity, Jenkins, Team Foundation Server, and Bamboo.
Expertise in integrating CephFS and Ceph object storage in OpenStack and OpenShift environments.
Worked on OpenStack infrastructure upgrades, expansion, scaling, troubleshooting, and debugging for clients in the most challenging, complex environments.
Automated OpenStack and AWS deployment using CloudFormation, Ansible, Chef, and Terraform.
Expert in troubleshooting Linux, DNS, DHCP, NAS, SAN, and volume managers.
Hands-on experience in deploying VNFs & CNFs using ETSI-MANO architecture and guidelines.
Experienced with event-driven and scheduled AWS Lambda functions that trigger various AWS services.
Gained practical experience with Continuous Integration/Continuous Delivery tools such as Jenkins and CodeBuild to merge development with testing through pipelines.
In-depth working knowledge of 5G architecture (CU/DU/EMS) and experience deploying 5G telco workloads on AWS and VMware platforms.
Skilled in support activities, analysis, technical design, testing, and troubleshooting for both batch and online mainframe application programs.
Created CI/CD pipelines using Azure DevOps, TeamCity/Octopus, and Jenkins/Octopus.
Carried out automation processes utilizing configuration management tools such as Ansible.
Conducted a proof of concept (POC) using Docker and Podman container platforms to showcase the feasibility and benefits of containerization, including encapsulating code within a file system and enhancing abstraction and automation in our development workflow.
Accomplished in migrating systems from Physical to Virtual environments and evolving virtual services into Microservices architectures.
Expert at re-architecting and refactoring legacy on-premises applications for the public cloud.
Experienced in database and schema migration to public-cloud (AWS, Azure) MySQL servers.
Experienced in writing complex NoSQL queries for CRUD operations on NoSQL databases (DynamoDB).
Experienced with version control and source code management tools such as Git and CodeCommit.
TECHNICAL SKILLS:
Cloud: GCP, AWS, Azure, APIM
Containerization Tools: Docker, Kubernetes, OpenShift, Helm
Configuration Management Tools: Puppet, Ansible
Automation & Deployments: Digital.ai, Buildpack Images, PowerShell
CI/CD Tools: Jenkins, Bamboo, Harness, GitHub Actions, TeamCity
Technology: IBM Mainframes (COBOL, JCL, VSAM, DB2, CICS, IMS); IBM Cognos Powerhouse (QTP, QUIZ, QUICK, COM procedures)
Version Control: Git, GitHub, Bitbucket, SVN
Monitoring Tools: SignalFx, CloudWatch, Grafana, Kibana, Nagios, Dynatrace
Operating Systems: Windows, Unix/Linux, CentOS, Ubuntu, macOS
Language and Scripting: Python, Go, Oracle SQL/PL/SQL, Groovy, Shell Scripting, JavaScript, HTML, CSS
Web Servers: Oracle HTTP Server, Apache Tomcat, Nginx
Ticketing Tools: JIRA, Rally, ServiceNow
Application Servers: Tomcat, WebSphere Application Server (WAS), WebSphere Liberty
Databases: Oracle 9i/10g/11g/12c, MySQL, PostgreSQL, MongoDB, Cassandra, Elasticsearch
Networking: TCP/IP, VPC, Subnets, VPN
Methodologies: SDLC, RUP, Agile (Scrum/Extreme Programming)
Message Brokers: IBM WebSphere MQ, ActiveMQ
WORK EXPERIENCE:
DevOps Engineer
TCS / Amex – Remote April 2023 – Current
DevOps Engineer Roles & Responsibilities:
Design and implement cloud-based solutions, ensuring high availability, security, and scalability for enterprise applications and products.
Breaking down HPOO flows and identifying which can be provisioned with Terraform.
Manage production, QA, and development environments, providing continuous optimization and cost control.
Responsible for core platform engineering of critical applications and databases.
Responsible for the design, installation, and maintenance of Ceph clusters and their integration with the OpenShift environment.
Analyzing application portfolios, identifying dependencies & common infrastructure platform components of Mainframe system, and assessing migration feasibility.
Responsible for configuring the ESXi hypervisor on bare-metal servers and adding the servers to vCenter.
Responsible for creating VMs in vCenter and adding the cluster to the Telco Cloud Automation orchestrator.
Integrated RGW with OpenShift to provide object storage for applications.
Installed and configured OSD nodes and documented the OSD configuration; created monitoring rules for OSD nodes and troubleshot OSD performance alerts.
Configured continuous integration and continuous deployment of over 200 applications to the AWS cloud using TeamCity and Octopus Deploy.
Built a Terragrunt project to keep Terraform configuration DRY while working with multiple Terraform modules; used Terraform templates to automate Azure IaaS virtual machines and deployed virtual machine scale sets in the production environment.
Responsible for delivering the software package to the customer, deploying the application on the Tanzu Kubernetes Platform (VMware), and ensuring the application is up and running in the customer environment.
Configured Longhorn storage on the EKS cluster to provide highly available persistent volumes (PVCs).
Created CephFS and integrated it with OpenShift for applications that require shared filesystems.
Designed and developed security policies and IAM rules to enforce HIPAA-compliant protection of sensitive data.
Create and maintain CI/CD pipelines for Java and mainframe applications using GitLab, JFrog, Jenkins, JUnit, Ansible, UrbanCode Deploy (UCD), and SonarQube.
Manage and conduct SOX and PCI compliance activities, adhering to financial and data-security standards.
Conduct risk assessments and remediation planning, and work with teams to achieve and maintain compliance.
Collaborate with IBM Support to implement latest versions of Integrated Development Frameworks for Mainframe users and ensure 100% availability of the product.
Implemented Dynatrace monitoring for end-to-end monitoring of applications and infrastructure.
Developed Ansible playbooks to install Dynatrace OneAgent across all servers.
Performed cost analysis and optimization, identifying cost-saving opportunities and implementing strategies to reduce expenditure on AWS and Azure while maintaining service quality.
Participate in incident response activities and troubleshoot and resolve issues to minimize service disruptions.
Design and execute migration of three-tier web applications and databases from on-premises infrastructure to AWS.
Led the migration from Virtual Network Functions (VNFs) to Cloud-Native Network Functions (CNFs), enhancing network agility and scalability by leveraging containerization and microservices architecture.
Executed migration of SVN repositories from Team Forge to GitLab.
Played a significant role in successful physical-to-virtual and virtual-to-virtual migrations, ensuring seamless transitions.
Developed a proof of concept for migrating the build/test/deployment process of IBM mainframe infrastructure into a fully automated pipeline using a wide variety of DevOps tools.
Deployed the telco application on AWS EKS.
Installed OpenShift on bare metal and configured the F5 Service Proxy for Kubernetes (F5 SPK) and F5 load balancers.
Configured Astra Trident (NetApp), a storage orchestrator for containers and Kubernetes distributions, on OpenShift (on-premises).
Technical Scope: Dell R740, ESXi, OpenShift, AWS, Telco cloud, 5G (DU/CU), Tanzu Kubernetes, AWS EC2, cost optimization, CloudWatch, Grafana, F5 load balancer, NetApp Astra Trident, vCenter, ECR, Harbor, EKS, OpenStack, Longhorn, S3, Python, Shell, Ansible.
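The OSD monitoring rules above boil down to flagging OSDs whose latency drifts past a threshold. A minimal sketch of that check in Python follows; the input mimics, in simplified form, the JSON emitted by `ceph osd perf`, and the exact field names and threshold are illustrative assumptions, not the production rules.

```python
import json

def slow_osds(perf_json, threshold_ms=50):
    """Return IDs of OSDs whose commit latency exceeds threshold_ms.

    `perf_json` is assumed to resemble the JSON output of
    `ceph osd perf` (structure simplified for illustration).
    """
    stats = json.loads(perf_json)["osd_perf_infos"]
    return [osd["id"] for osd in stats
            if osd["perf_stats"]["commit_latency_ms"] > threshold_ms]

# Sample payload: OSD 3 is slow, OSD 0 is healthy.
sample = json.dumps({"osd_perf_infos": [
    {"id": 0, "perf_stats": {"commit_latency_ms": 12}},
    {"id": 3, "perf_stats": {"commit_latency_ms": 87}},
]})
print(slow_osds(sample))  # → [3]
```

In practice the output of a check like this would feed an alerting rule rather than a print statement.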
Cloud DevOps Engineer
PayPal - San Jose, CA July 2020 - April 2023
DevOps Engineer / Cloud Automation (Serverless) AWS: Roles & Responsibilities:
Developed end-to-end GitOps pipelines using CloudFormation and deployed the services on ECS and EKS for ISV providers.
Used Terraform to write infrastructure as code and created Terraform scripts for EC2 instances, Elastic Load Balancers, and S3 buckets.
Developed the CDK script to provision AWS infrastructure and integrated the CDK into the pipeline.
Created pipeline failure alerts using SNS (via CDK) for ETL data pipelines for ISV providers.
Wrote platform automation scripts using the Python Boto3 module to pull keys from Secrets Manager and Parameter Store for authenticating cross-account services.
Deployed the pipelines across accounts using STS.
Used build tools such as Ant and Maven to produce deployable artifacts (WAR and EAR) from source code, with continuous integration via Jenkins/Hudson, Bamboo, and TeamCity.
Responsible for build and deployment automation using AWS ECS and Kubernetes containers.
Automated continuous integration and deployments using CodeBuild, CodePipeline, Ansible, and AWS CloudFormation templates.
Developed fine-grained IAM policies to grant the required permissions to AWS services when developing the GitOps pipelines.
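Fine-grained IAM policies of the kind described above are JSON documents scoped to specific actions and resources. A minimal sketch, assuming a hypothetical read-only pipeline role (the bucket name and action set here are illustrative, not the project's actual policies):

```python
import json

def s3_read_policy(bucket, prefix="*"):
    """Build a least-privilege IAM policy document granting read-only
    access to one S3 bucket/prefix (illustrative helper)."""
    return {
        "Version": "2012-10-17",
        "Statement": [
            {   # object reads, limited to the given prefix
                "Effect": "Allow",
                "Action": ["s3:GetObject"],
                "Resource": f"arn:aws:s3:::{bucket}/{prefix}",
            },
            {   # listing requires a separate statement on the bucket ARN
                "Effect": "Allow",
                "Action": ["s3:ListBucket"],
                "Resource": f"arn:aws:s3:::{bucket}",
            },
        ],
    }

print(json.dumps(s3_read_policy("gitops-pipeline-artifacts"), indent=2))
```

Splitting object-level and bucket-level permissions into separate statements keeps the policy at the minimum scope the pipeline actually needs.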
AWS Cost Optimization:
Increased visibility into AWS cloud spending and identified opportunities for optimization and cost reduction. Strong understanding of public clouds (AWS, Azure), their product portfolios, and commercial models.
Developed automation scripts to identify orphaned resources that were incurring costs and terminated them after handling their dependencies.
Cost analysis: educated the internal team on monthly costs and spend analysis, and built custom reports using tools like CloudHealth.
Identified and reviewed cost-saving opportunities and secured budget or influenced change in product groups.
Knowledge of cloud cost reporting tools (AWS CUR, CloudHealth, etc.).
Good understanding of billing on cloud platforms, including the impact of usage patterns on TCO.
Experience with capacity forecasting, cloud cost optimization techniques, financial reporting, and workflow management in a large service-operations environment.
Familiarity with monitoring and reporting tools/APIs; able to investigate usage anomalies and trace the source of cost increases.
Developed and maintained cloud infrastructure capacity and demand forecasts, including day-to-day cost monitoring and tracking of cloud capacity and resource management for all services.
Recommended savings opportunities (Reserved Instances, Savings Plans, Spot, Dedicated, hybrid) and tracked utilization of these commitments.
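The orphaned-resource scripts above reduce to a simple filter: find resources billing money while serving nothing. A sketch of that core logic for unattached EBS volumes; in production the inventory would come from Boto3's `ec2.describe_volumes()`, but it is passed in here so the logic stays self-contained and testable.

```python
def find_orphan_volumes(volumes):
    """Return IDs of EBS volumes in the 'available' state, i.e.
    unattached volumes that incur cost without serving any instance.
    `volumes` mirrors the Volumes list from ec2.describe_volumes()."""
    return [v["VolumeId"] for v in volumes if v.get("State") == "available"]

inventory = [
    {"VolumeId": "vol-001", "State": "in-use"},      # attached, keep
    {"VolumeId": "vol-002", "State": "available"},   # orphan candidate
]
print(find_orphan_volumes(inventory))  # → ['vol-002']
```

A real cleanup pass would check snapshots and tags before deleting, since "available" alone does not prove a volume is safe to terminate.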
Technical Scope: CodeBuild, CodeCommit, Parameter Store, SSE-KMS, S3 & lifecycle policies, Kubernetes (EKS), CloudFront, Lambda@Edge, CodeDeploy, SNS, VPC, VPN, API Gateway, CloudFormation, DynamoDB, Secrets Manager, NAT Gateway, AWS Cognito, ECS, EFS, Transit Gateway, CloudWatch.
AWS / Cloud DevOps Engineer
Gerber Life Insurance - Fremont, MI January 2019 - June 2020
AWS / Cloud Engineer: Roles and Responsibilities:
Connected to remote embedded hardware devices to install Raspberry Pi OS and wrote Python scripts on the microcontroller to publish data to AWS using notification topics.
Established connections between AWS IoT Core and Raspberry Pi devices using the MQTT protocol.
Created and registered 'Things' in AWS IoT to gather telemetry data from devices.
Wrote IoT rules and actions to route telemetry data to other AWS services such as Kinesis Firehose, Elasticsearch, Lambda, DynamoDB, and SNS to store and monitor data and predict incidents.
Used the Python Boto3 module to write business logic in Lambda to query and update data.
Used Kinesis Firehose to structure raw inbound data and store it in an S3 bucket.
Created REST APIs in API Gateway to establish secure connections between the React UI and AWS Lambda functions.
Configured CloudWatch Events cron schedules to trigger the Lambda functions at regular intervals.
Deployed the application on Lambda using CodeDeploy and the AWS CodeBuild and CodePipeline services.
Designed and Developed Pipelines in Jenkins to publish container images to cloud container registries and deploy application on Kubernetes cluster managed by EKS.
Responsible for Continuous Integration (CI) and Continuous Delivery (CD) process implementation using Jenkins, along with Python and shell scripts to automate routine jobs.
Created custom Helm charts for deploying a microservices monitoring stack (i.e., Prometheus, Grafana) on Kubernetes clusters (EKS).
Configured CloudFront with Lambda@Edge using SigV4-signed requests to serve the static website from the nearest edge location.
Hosted the 'SmartSense' IoT application as a static website in an S3 bucket.
Provisioned the DynamoDB tables and cloud infrastructure using Cloud Formation.
Configured a Ping Identity provider with SAML authentication for Cognito users to log in to the application.
Established VPN connection between AMAT Corp network and AWS VPC.
Wrote fine-grained IAM policies for AWS services to prevent unauthorized users from accessing the production environment.
Configured SNS topics with AWS IoT Core and Raspberry Pi devices to alert the business lab team to the status of hardware components.
Encrypted the static website and telemetry data in the S3 bucket using SSE-KMS.
Created Cognito user pools and configured rules for every pool accessing the application.
Technical Scope:
Python Boto3, Raspberry Pi OS, CodeBuild, CodeCommit, Secrets Manager, SSE-KMS, S3, Kubernetes (EKS), CloudFront, Lambda@Edge, CodeDeploy, Bitbucket, SNS, IoT Core, IoT Rules & Actions, Kinesis Firehose, Elasticsearch, VPC, VPN, API Gateway, CloudFormation, DynamoDB, NAT Gateway, AWS Cognito, React JS.
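The device-to-cloud telemetry path above starts with each Raspberry Pi publishing a JSON message to an MQTT topic on AWS IoT Core. A minimal sketch of the payload construction; the topic scheme and field names are illustrative assumptions, and the actual publish call (via the AWS IoT Device SDK) is omitted so the logic stays self-contained.

```python
import json
import time

def telemetry_message(device_id, sensor, value, ts=None):
    """Return (mqtt_topic, json_payload) for one sensor reading.

    The topic layout 'devices/<id>/telemetry' is a hypothetical
    convention an IoT rule could subscribe to with a wildcard.
    """
    topic = f"devices/{device_id}/telemetry"
    payload = json.dumps({
        "deviceId": device_id,
        "sensor": sensor,
        "value": value,
        # default to the current epoch time if no timestamp supplied
        "timestamp": ts if ts is not None else time.time(),
    })
    return topic, payload

topic, payload = telemetry_message("raspi-lab-07", "temperature", 21.4, ts=0)
print(topic)  # devices/raspi-lab-07/telemetry
```

An IoT rule with a `SELECT * FROM 'devices/+/telemetry'` style filter could then fan these messages out to Firehose, DynamoDB, or SNS as the section describes.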
DevOps Engineer
Wells Fargo – San Francisco, CA June 2016 – December 2018
DevOps Engineer Roles & Responsibilities:
Analyzed and gathered business requirements from the client, such as existing on-premises infrastructure details, middleware, third-party integration components, and their existing framework's compatibility with the cloud.
Provisioned the high-availability cloud infrastructure using CloudFormation.
Upgraded software to the latest versions and installed IBM MQ Enterprise message queuing, MySQL Enterprise Edition, the .NET Framework, Oracle Enterprise Edition, IIS, and their dependent packages on the app, web, and DB servers.
Configured highly available active-active and active-passive clusters for web, app, and DB servers using server machine keys.
Configured Amazon FSx for synchronous data replication between the active-active DB cluster nodes.
Provisioned a CI/CD pipeline for deploying applications on servers using the CodeBuild, CodeCommit, and CodePipeline services.
Migrated on-premises data in DB servers to AWS RDS via the Schema Conversion Tool and DMS.
Migrated on-premises COTS, Java, and .NET applications to the AWS cloud; created linked DB servers to connect them.
Created a domain controller (LDAP) server in the AWS cloud and allowed the app to connect with users via SAML (Tivoli Identity Federation).
Performed high-availability load tests on the migrated application by stopping one server at a time and failing over to the other.
Configured the AWS Batch service to trigger jobs at regular intervals.
Migrated tasks from the on-premises Windows server to a cloud Windows server and configured Task Scheduler to trigger the jobs at regular intervals.
Configured cross-cloud SSO so that a user logged in to AWS can also log in to Azure without re-authenticating.
MENDIX CLOUD: Developed applications on the Mendix platform, integrating external PostgreSQL relational databases.
Integrated AWS with Mendix so that applications on the Mendix platform could access object data stored in S3 buckets.
Supported cloud instances running Linux and Windows on AWS; experienced with Elastic IP, security groups, and Virtual Private Cloud in AWS.
Extensive experience configuring Amazon EC2, Amazon S3, Elastic Load Balancing, IAM, and security groups in public and private subnets in a VPC, among other AWS services.
Managed network security using load balancers, Auto Scaling, security groups, and NACLs.
Created, assessed, updated, and maintained documentation pertaining to PCF platforms.
Worked extensively on Jenkins CI/CD pipeline jobs for end-to-end automation to build, test, and deliver artifacts, and troubleshot build issues during the Jenkins build process.
Technical Scope:
Kubernetes, Docker, Linux, Quality Center, Java/J2EE, DB2, WebSphere, Windows, LoadRunner, Oracle, SQL, PL/SQL, MS Excel, MS Office, EC2, AWS, ELB, Terraform, CloudFormation, VPC, VPN, S3, EFS, FSx, DMS, SMS, SCT, TGW, SSO, Red Hat Linux, Git, Jenkins, CodeCommit.