Umair Hussain
DEVOPS ARCHITECT
Details
Fairfield, CA, United States
***********@*****.***
Skills
MySQL
PHP
HTML & CSS
Docker
SQL
C++
JavaScript
Git
Java
Python
Linux (RHEL) proficiency
Linux and Windows system administration
AWS
Certifications:
Certified AWS Solutions Architect (Associate)
Certified AWS Solutions Architect (Professional)
Education
BS Computer Science, Allama Iqbal Open University, Islamabad
SEPTEMBER 2001 — JUNE 2003
MBA, Comsats University, Islamabad
SEPTEMBER 2003 — JULY 2005
Employment History
Solutions Architect/DevOps Engineer at nClouds, San Francisco
JANUARY 2018 — MAY 2024
●Designed and deployed scalable infrastructure on AWS, Azure, and GCP.
●Architected secure cloud environments adhering to HITRUST CSF standards, ensuring protection of sensitive data.
●Developed and enforced security policies and procedures to meet HITRUST certification requirements, enhancing data security and compliance.
●End-to-end design and implementation of CI/CD pipelines using Jenkins, CircleCI, AWS CodePipeline, GitLab, and Bamboo.
●Designed and developed scalable data processing applications using PySpark.
●Implemented and optimized PySpark scripts to handle complex data transformations.
●Designed and implemented Azure-based solutions for continuous integration and deployment (CI/CD), leveraging services such as Azure DevOps (formerly VSTS) pipelines, Azure Repos, and Azure Artifacts to automate software delivery processes.
●Migrated on-premises infrastructure to Google Cloud Platform (GCP), implementing Infrastructure as Code (IaC) with Terraform to automate provisioning of compute instances, networking components, and storage resources.
●Developed and maintained TypeScript applications.
●Used TypeScript to build and deploy cloud-native applications.
●Created reusable TypeScript libraries and modules for shared functionality across multiple projects, standardizing development practices and reducing duplication of effort.
●Automated workflows using ActiveBatch to increase efficiency.
●Migrated on-premises VMware workloads to AWS utilizing VMware Cloud on AWS and services including EC2, S3, and RDS.
●Developed and maintained AWS Glue ETL workflows to extract, transform, and load data from S3, Amazon RDS, and Amazon DynamoDB.
●Implemented AWS Glue job scheduling to automate data processing workflows.
●Configured and managed VMware vCenter, ESXi hosts, and VMs, including resource allocation, networking, and storage configuration.
●Integrated VMware NSX with AWS networking services to establish efficient communication between on-premises and cloud environments.
●Designed and implemented scalable GraphQL APIs for multiple microservices-based systems.
●Developed and maintained GraphQL schemas, resolvers, and data models to support complex queries.
●Monitored and optimized GraphQL performance using Prometheus, Grafana, and New Relic.
●Designed and implemented Azure networking solutions, including virtual networks (VNets), Azure ExpressRoute, and Azure VPN Gateway, to establish secure and scalable connectivity between on-premises and cloud environments.
●Designed and implemented highly available and scalable Elasticsearch clusters for log aggregation and search capabilities, ensuring minimal downtime and optimal performance.
●Optimized Elasticsearch configurations and indexing strategies to improve search query performance by 50%.
●Developed and maintained custom ETL pipelines to ingest and transform data from various sources into Elasticsearch, ensuring data accuracy and completeness.
●Installed, configured, and managed RHEL systems, ensuring compliance with security standards and optimizing system performance.
●Configured and managed RHEL clustering and high-availability solutions.
●Configured and managed AWS Landing Zone core services, including AWS Organizations, AWS IAM, Amazon S3, and Amazon VPC.
●Implemented CI/CD pipelines on GCP using Cloud Build, Cloud Source Repositories, and Container Registry to automate build, test, and deployment processes for microservices-based applications hosted on Kubernetes Engine.
●Implemented robust security measures for Elasticsearch, including TLS encryption, role-based access control (RBAC), and integration with AWS IAM.
●Installed and deployed Kafka clusters on AWS using Terraform and Ansible.
●Configured and managed Kafka topics, partitions, and replication factors for high availability.
●Integrated Kafka with other data systems, including Apache Hadoop.
●Troubleshot and resolved Kafka issues using the Kafka console, Kafka GUI tools, and log analysis.
●Implemented Infrastructure as Code (IaC) solutions using tools such as Terraform and AWS CloudFormation for AWS and Azure Resource Manager (ARM) templates for Azure, automating the provisioning and management of cloud infrastructure to improve scalability, reliability, and consistency across multi-cloud environments.
●Implemented and managed Argo Workflows for orchestrating complex container-native workflows on Kubernetes, enhancing automation and efficiency.
●Designed and implemented infrastructure automation using tools such as Chef, Puppet, and Ansible.
●Set up Datadog monitoring across different servers and AWS services.
●Created customized Sentinel policies to address specific security and compliance requirements for cloud infrastructure.
●Developed and maintained SOC-compliant cloud infrastructure, ensuring alignment with SOC 1, SOC 2, and SOC 3 requirements.
●Implemented and managed security controls, including access management, encryption, and logging, to meet SOC compliance.
●Created policies and procedures aligned with the SOC framework, ensuring consistent application of security practices.
●Integrated CloudCheckr, Datadog, and Splunk dashboards with AWS accounts.
●Automated routine tasks using scripting languages such as Bash, Perl, and Python.
●Worked with AWS Secrets Manager to securely store and retrieve secrets, ensuring data privacy.
●Integrated AWS Secrets Manager with other AWS services, including AWS Lambda functions and EC2 instances.
●Deployed containerized apps on ECS, EKS, Azure Container Instances, and AKS.
●Worked on AWS CDK to design, build and deploy scalable, secure and efficient cloud infrastructure.
●Developed custom CDK constructs and modules to extend the functionality of other AWS services.
●Migrated existing CloudFormation templates to AWS CDK, simplifying infrastructure management and improving maintainability.
●Used AWS CDK to build and deploy serverless architectures, including AWS Lambda functions, API Gateway, and DynamoDB tables.
●Designed and implemented automated patch management solutions using AWS Systems Manager and Azure Update Management.
●Designed and implemented disaster recovery solutions of varied complexity per the RTO/RPO requirements of the workload.
●Developed custom scripts and integrations to extend the functionality of Nexus, Tosca, and SonarQube, integrating them into existing toolchains and workflows to streamline automation processes and improve overall efficiency.
●Developed and maintained custom Linux images and packages using YUM and RPM for consistency and security across enterprise environments.
●Orchestrated complex data workflows using Apache Airflow, ensuring reliable and timely execution of ETL processes.
●Developed and deployed data pipelines with Dagster, leveraging its unique abstractions for improved testability and maintainability.
●Designed and implemented Grafana dashboards to provide real-time insights into system performance, infrastructure health, and application metrics, enabling proactive monitoring and troubleshooting.
●Leveraged Grafana's rich visualization capabilities to create intuitive and customizable dashboards, allowing stakeholders to quickly identify trends, anomalies, and areas for optimization within complex environments.
●Built serverless applications using platforms like AWS Lambda, Step Functions, Azure Functions, and Logic Apps.
●Worked on the provisioning and management of Azure infrastructure using tools like Azure Resource Manager (ARM) templates and Azure CLI, applying infrastructure-as-code (IaC) principles for consistency, scalability, and repeatability.
●Performed database migrations using AWS DMS and Percona XtraBackup.
●Set up and administered database management systems such as MySQL, PostgreSQL, MSSQL, Amazon Aurora, MongoDB, DynamoDB, and Redis.
●Implemented Azure Monitor and Application Insights to gain visibility into application performance, monitor system health, and proactively identify and troubleshoot issues, ensuring high availability and reliability of Azure-based services.
●Implemented infrastructure and application monitoring using Datadog, New Relic, Nagios/Icinga, AWS CloudWatch, Azure Monitor, Zabbix, etc.
●Implemented Azure Automation for automating routine tasks and workflows, leveraging PowerShell scripts and Azure Automation Runbooks to streamline operational processes and reduce manual effort.
●Implemented log aggregation, visualization, and analytics using Logz.io, Datadog, Graylog, ELK, etc.
●Wrote AWS Lambda functions in Python and invoked Python scripts for data transformation and analytics on large data sets in EMR clusters and AWS data streams.
●Automated infrastructure activities such as continuous deployments, application server setup, and stack monitoring using Ansible playbooks; integrated Ansible with Rundeck; administered Jenkins and wrote Groovy scripts to make the configuration management system more effective.
●Utilized Elastic Load Balancing with Auto Scaling to scale EC2 instance capacity across multiple Availability Zones in a region, distributing high incoming traffic for the application.
●Set up databases in AWS using RDS, storage using S3 buckets, and configured instance backups to S3.
●Created an Apache Directory Server for local networks and integrated RHEL instances with Active Directory in an AWS VPC.
●Used Docker to containerize custom web applications.
●Maintained and managed LDAP servers; deployed JavaScript to all nodes using Chef.
●Developed Grafana plugins and integrations to extend functionality and integrate with existing monitoring and alerting systems, enhancing observability and streamlining workflow automation processes.
●Developed Puppet manifests and custom modules to securely transfer retrieved secrets from AWS Secrets Manager to Password Manager Pro.
●Utilized Puppet's built-in file transfer capabilities to ensure encrypted and secure transmission of sensitive data to Password Manager Pro.
●Created a log collection stack with ELK (Elasticsearch, Logstash, Kibana) and Grafana; installed Filebeat on all cluster nodes to send log data to Logstash.
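Representative of the Python Lambda data-transformation work described above, here is a minimal, hypothetical sketch; the handler name, event shape, and field names are illustrative only, not taken from any actual project:

```python
import json

def handler(event, context):
    """Hypothetical AWS Lambda handler that normalizes a batch of
    incoming records (event shape and field names are illustrative)."""
    results = []
    for record in event.get("records", []):
        body = record.get("body")
        # Records may arrive as JSON strings or as already-parsed dicts
        payload = json.loads(body) if isinstance(body, str) else record
        results.append({
            "id": payload.get("id"),
            "amount_usd": round(float(payload.get("amount", 0)), 2),
            "source": payload.get("source", "unknown").lower(),
        })
    return {"statusCode": 200, "body": json.dumps(results)}
```

The handler can be invoked locally with a sample event dict for testing before packaging it for Lambda.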
Sr DevOps Engineer at Harosec, San Francisco
OCTOBER 2014 — DECEMBER 2017
●Designed and deployed scalable infrastructure on AWS, Azure, and Rackspace.
●End-to-end design and implementation of CI/CD pipelines using Jenkins, CircleCI, AWS CodePipeline, GitLab, and Bamboo.
●Designed and implemented highly available and scalable Linux-based infrastructure for cloud-native apps.
●Led migration of VMware workloads to AWS using AWS Server Migration Service (SMS) and Migration Hub.
●Responsible for complex provisioning, advanced maintenance, data replication, disaster recovery, data migration, and documentation for open systems storage and backup environments.
●Managed a team of 17 senior and junior DevOps engineers.
●Integrated Opsgenie with Jenkins.
●Integrated Sentry with incident management system.
●Set up the IDP development environment using Spring Tool Suite, using Docker and Kubernetes to create Docker containers and deploy them into Kubernetes clusters.
●Used Sysdig Secure and Sysdig Monitor to continuously monitor performance, availability, and compliance of multiple EKS and Kubernetes clusters.
●Used Sentinel policies to automate security and compliance checks, reducing manual effort and improving the efficiency of cloud operations.
●Integrated Sysdig Image Vision image scanning into the Jenkins-based CI/CD pipeline to automate local scanning of images pre-deployment.
●Automated infrastructure using tools such as Chef, Puppet, and Ansible.
●Set up infrastructure provisioning using Packer, streamlining deployment processes and reducing deployment time; implemented modular configurations, leveraging Packer's templates to create reproducible and consistent machine images across development, testing, and production environments.
●Used Terraform, Ansible, Chef, and CloudFormation extensively for configuration management and infrastructure as code (IaC).
●Diagnosed and troubleshot UNIX and Windows processing problems and applied solutions to increase company efficiency.
●Deployed containerized apps on ECS, EKS, Azure Container Instances, and AKS.
●Worked on serverless applications using AWS Lambda, Step Functions, Azure Functions, and Logic Apps.
●Deployed applications on Azure App Services and on VMs.
●Tracked issues using Jira and maintained documentation in Confluence.
●Managed applications running on Django, PHP, Java, and Tomcat.
●Implemented infrastructure and application monitoring using Datadog, New Relic, Nagios/Icinga, AWS CloudWatch, Azure Monitor, etc.
●Implemented log aggregation, visualization, and analytics using Logz.io, Datadog, ELK, etc.
●Responsible for the design, development, and administration of transactional and analytical data constructs/structures and business reports, from legacy systems to AWS and Snowflake.
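The log aggregation work above can be sketched with a small, hypothetical Python example; the access-log format and the rollup are illustrative only (production pipelines used Logz.io/ELK rather than hand-rolled parsing):

```python
import re
from collections import Counter

# Hypothetical combined-log-style line format (illustrative only)
LOG_PATTERN = re.compile(
    r'(?P<ip>\S+) \S+ \S+ \[(?P<ts>[^\]]+)\] '
    r'"(?P<method>\S+) (?P<path>\S+) \S+" (?P<status>\d{3})'
)

def summarize(lines):
    """Aggregate HTTP status-code counts from raw log lines,
    the kind of rollup a Logstash/Kibana dashboard surfaces."""
    statuses = Counter()
    for line in lines:
        m = LOG_PATTERN.match(line)
        if m:
            statuses[m.group("status")] += 1
    return dict(statuses)
```

In a real deployment this counting happens inside the aggregation stack; the sketch only shows the shape of the transformation.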
DevOps Engineer at Signet Media, San Francisco
OCTOBER 2013 — SEPTEMBER 2014
●Implemented security features for multi-tenant environments built on virtualization technology.
●Managed a small, highly skilled technical team that handled day-to-day activities and assisted in executing the vision provided.
●Communicated with staff and clients to understand specific system requirements.
●Provided advice on project costs, design concepts, and design changes.
●Implemented AWS CodePipeline and created CloudFormation JSON templates and Terraform configurations for infrastructure as code.
●Checked code compatibility, standards, and functionality across Windows and UNIX/Linux environments.
●To support automated builds and deployments, wrote Windows CMD shell scripts and transfer scripts to promote binaries to target systems using scripted WinSCP processes.
●Documented design specifications, installation instructions, and other system-related information.
●Verified stability, interoperability, portability, security, and scalability of system architecture.
●Collaborated with engineers and software developers to select appropriate design solutions and ensure the compatibility of system components.
●Evaluated current and emerging technologies, considering factors such as cost, portability, compatibility, and usability.
●Provided technical guidance and support for the development and troubleshooting of systems.
●Identified system data, hardware, and software components required to meet user needs.
●Deployed containerized apps using Docker.
●Automated infrastructure configuration with Chef and Ansible.
●Implemented infrastructure monitoring with Nagios.
●Built automated machine images using Packer.
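The Nagios monitoring above relies on check plugins that follow Nagios's standard exit-code contract (0=OK, 1=WARNING, 2=CRITICAL, 3=UNKNOWN). A minimal custom check might look like this; the thresholds and monitored path are illustrative, not from any actual deployment:

```python
import shutil

# Standard Nagios plugin exit codes
OK, WARNING, CRITICAL, UNKNOWN = 0, 1, 2, 3

def check_disk(path="/", warn_pct=80.0, crit_pct=90.0):
    """Minimal Nagios-style disk usage check.

    Returns (exit_code, status_message); a wrapper script would
    print the message and sys.exit() with the code so Nagios can
    interpret the result.
    """
    usage = shutil.disk_usage(path)
    used_pct = 100.0 * usage.used / usage.total
    if used_pct >= crit_pct:
        return CRITICAL, f"CRITICAL - disk {path} at {used_pct:.1f}%"
    if used_pct >= warn_pct:
        return WARNING, f"WARNING - disk {path} at {used_pct:.1f}%"
    return OK, f"OK - disk {path} at {used_pct:.1f}%"
```

The first word of the message mirrors the exit code, a common Nagios plugin convention that makes alert output self-describing.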
DevOps Engineer at Design Reactor, San Francisco
APRIL 2010 — SEPTEMBER 2013
SA Engineering and Provisioning
●Installed new and rebuilt existing servers; configured hardware, peripherals, services, settings, directories, storage, etc., in accordance with standards and project/operational requirements.
●Installed and configured systems such as virtualization infrastructure applications and network management applications.
●Developed and maintained installation and configuration procedures.
Operations and Support
●Performed regular security monitoring to identify possible intrusions.
●Performed daily backup operations, ensuring all required file systems and system data were successfully backed up to the appropriate media, recovery tapes or disks were created, and media was recycled and sent off-site as necessary.
●Performed regular file archival and purges as necessary.
Maintenance
●Maintained operational, configuration, and other procedures.
●Produced periodic performance reports to support capacity planning.
●Performed ongoing performance tuning, hardware upgrades, and resource optimization as required; configured CPU, memory, and disk partitions as required.
●Maintained data center environmental and monitoring equipment.
IT Manager at Infospan, Islamabad, Pakistan
MARCH 2006 — MARCH 2010
●Managed and supervised IT teams across multiple geographic locations, with multiple teams at each location performing multiple IT functions.
●Managed IT operations human resource objectives by recruiting, selecting, orienting, training, assigning, scheduling, coaching, counseling, and disciplining employees; communicating job expectations; planning, monitoring, appraising, and reviewing job contributions; planning and reviewing compensation actions; and enforcing policies and procedures.
●Met IT operations operational objectives by maintaining current systems; evaluating, recommending, testing, and installing new technology; contributing information and recommendations to strategic plans and reviews; preparing and completing action plans; implementing production, productivity, quality, and customer-service standards; resolving problems; completing audits; identifying trends; determining system improvements; and implementing change.
●Met IT operations financial objectives by forecasting requirements, preparing an annual budget, scheduling expenditures, analyzing variances, and initiating corrective actions.
●Determined IT operations service requirements by analyzing the needs of users and departments and prioritizing modifications to core system applications.
●Provided information by collecting, analyzing, and summarizing data and trends.
●Designed, developed, and implemented a Linux-based call center CRM and predictive dialer solution utilizing both VoIP and PSTN networks.
●Oversaw design and implementation of Cisco-based core, distribution, and access layer network infrastructure with fully redundant core and distribution layer implementation.