Che Ndengue
https://www.linkedin.com/in/che-ndengue-7b8885257/
612-***-****/ ********@*****.***
PROFESSIONAL SUMMARY
Senior DevOps and SRE Solutions Architect with extensive experience designing, deploying, and operating AWS infrastructure for containerized and data-driven applications. Skilled in Infrastructure-as-Code using Terraform and CloudFormation, and proficient in Kubernetes orchestration on EKS. Proven track record building secure, scalable environments across S3, EC2, and data pipelines, and automating CI/CD workflows for consistent, efficient deployments. Hands-on experience administering and supporting enterprise analytics platforms including Databricks, Tableau Server, and IBM Cognos Analytics across Windows and Linux environments. Strong background in managing infrastructure on Windows Server and Red Hat Enterprise Linux, implementing High Availability architectures, leading Cognos upgrades, tuning repositories, optimizing Spark workloads, and enforcing security through Active Directory, SSO, and RBAC integrations. Edge Security & CDN Architect with hands-on experience migrating Akamai environments to Cloudflare, implementing WAF, Bot Management, Page Shield, and Magic Transit–aligned DDoS protection strategies. Hands-on experience with observability and reliability engineering using Prometheus, Grafana, centralized logging, and SRE best practices to ensure high availability, incident response readiness, and performance optimization. Implemented cost governance and resource optimization using Cloudability and IBM Turbonomic, aligning financial accountability with operational excellence.
TECHNICAL SKILLS
● Cloud Orchestration: AWS Systems Manager, Terraform, AWS CloudFormation, AWS Lambda, Ansible.
● AWS Platform: VPN, VPC, Route53, Route 53 Resolver, EC2, ELB, AWS CloudFormation, AWS Lambda, AWS Systems Manager, S3, RDS, SNS, SQS, SES, Trusted Advisor, CloudFront, AWS Auto Scaling, CloudWatch.
● Identity & Access Management: AWS Organization, AWS IAM, Active Directory, OKTA, AWS Secrets Manager, Vault.
● Container Orchestration: Kubernetes, AKS, EKS, ECS, Docker, GCP Deployment Manager.
● Streaming & Monitoring: Kinesis Firehose, Splunk Cloud, PyTest, Nagios, Splunk, Zabbix, New Relic, PagerDuty.
● Image & Patch Management: AWS SSM Patch Manager, AWS Golden AMI Pipeline, Docker.
● Governance & Compliance: AWS Config Rules, AWS Organization, AWS Control Tower, AWS Trusted Advisor, AWS Well Architected Tool, AWS Budgets.
● Security: Amazon GuardDuty, AWS Shield, Amazon Inspector, AWS SSM Parameter Store, WAF.
● Application Delivery: GitHub, GitLab, Bitbucket, Jenkins, GitHub Actions, AWS CodeDeploy, AWS CodePipeline, AWS CodeCommit.
● Data Processing: PySpark.
● Micro Services: Docker, Kubernetes, EKS, ECS.
● Data Protection: AWS Certificate Manager, AWS KMS, Snapshot Lifecycle Manager, AWS CloudHSM.
● Migration: Database Migration Service, Server Migration Service, Cloud Endure, CART.
● Database: DynamoDB, RDS, MongoDB, MySQL, Postgres, Amazon Aurora.
● Programming: Python, Python Boto3 Modules, Automation.
● Communication: Confluence, Slack, Box.
● Kinesis Firehose: Managed service for delivering real-time streaming data to destinations such as Amazon S3 and Splunk.
● Splunk Cloud: For collecting, indexing, searching, and analyzing machine-generated data in real time.
● PyTest: For automated testing.
● Kubernetes Administration: Optimize node configurations for performance, reliability, and resource utilization.
● Containerization: AWS ECS, AWS EKS, Docker, Kubernetes etc.
● Graph Databases: Aerospike, Neo4j, Timbr.
● Monitoring & Event Management: AWS CloudWatch (Events & Logs), AWS SNS, AWS S3, Splunk, New Relic, PagerDuty, Prometheus.
● Integrated AKS with CI/CD pipelines using tools like Azure DevOps, Jenkins, or GitHub Actions to automate the deployment of containerized applications.
● Optimized AKS clusters for cost efficiency by right-sizing nodes, utilizing spot instances for non-production workloads, and implementing cluster autoscaling based on workload demand.
● Designed AKS clusters for scalability and high availability, leveraging features such as node pools, horizontal pod autoscaling (HPA), and availability zones.
EDUCATION
● Bachelor of Science in Information Technology, 2013.
PROFESSIONAL EXPERIENCE
NTT DATA Texas Sep 2025 – Present
Sr. Cloud/DevSecOps/Platform Solutions Architect
● Implemented cloud cost governance and financial visibility using Cloudability, enabling business-unit chargeback, spend forecasting, anomaly detection, and cost optimization across AWS and Azure environments.
● Leveraged IBM Turbonomic for real-time application resource optimization, rightsizing compute instances, optimizing Kubernetes workloads and reducing over-provisioned CPU/memory usage.
● Set up Go test integration in the CI pipeline.
● Built and maintained Validating and Mutating Admission Webhooks in Kubernetes using Go, enforcing security policies, resource standards, and automated sidecar injection with proper TLS configuration and failure policy management.
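As an illustrative sketch of the validating-webhook pattern described above (the production webhooks were written in Go; this minimal Python version assumes a hypothetical policy that every Pod must carry a "team" label, and omits the TLS-wrapped HTTP server):

```python
# Minimal validating-admission decision: inspect the AdmissionReview
# request object and allow/deny based on a label policy. The "team"
# label requirement is an illustrative assumption, not the actual
# production policy.
def review(admission_review: dict) -> dict:
    request = admission_review["request"]
    labels = request["object"]["metadata"].get("labels", {})
    allowed = "team" in labels
    response = {
        "uid": request["uid"],  # the response must echo the request UID
        "allowed": allowed,
    }
    if not allowed:
        response["status"] = {"message": "Pod is missing required 'team' label"}
    return {
        "apiVersion": "admission.k8s.io/v1",
        "kind": "AdmissionReview",
        "response": response,
    }
```

In a real deployment this function sits behind an HTTPS handler registered via a ValidatingWebhookConfiguration, with `failurePolicy` controlling behavior when the webhook is unreachable.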
● Designed and implemented enterprise-grade container platforms using Red Hat OpenShift and Azure Red Hat OpenShift (ARO) to support public and hybrid cloud digital transformation initiatives.
● Installed and configured Datadog agents on Linux and Windows servers, EC2 instances, and Kubernetes nodes.
● Incorporated AI into pipelines to determine if a deployment is performing poorly and automatically initiate a rollback based on metrics like error rates or response times.
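The rollback gate in such a pipeline reduces to comparing post-deploy metrics against thresholds; a minimal sketch (threshold values and metric names are illustrative assumptions, not the production configuration):

```python
# Decide whether a deployment should be rolled back, based on
# error rate and p95 latency collected during a canary window.
# Thresholds here are illustrative defaults.
def should_rollback(metrics: dict,
                    max_error_rate: float = 0.05,
                    max_p95_latency_ms: float = 500.0) -> bool:
    return (metrics.get("error_rate", 0.0) > max_error_rate
            or metrics.get("p95_latency_ms", 0.0) > max_p95_latency_ms)
```

In practice this check runs as a pipeline stage after deployment, and a True result triggers the platform's rollback mechanism.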
● Deployed AI-driven security tools like CrowdStrike or Azure Security Center to identify and mitigate security threats in real time.
● Deployed Mule applications to CloudHub and containerized environments (Docker/Kubernetes) for scalable cloud-native integration.
● Built and automated infrastructure provisioning using Terraform and Bicep, enabling repeatable deployments of OpenShift clusters across dev, test, and production environments.
● Designed and developed API-led connectivity solutions using MuleSoft Anypoint Platform to enable seamless integration between cloud, on-prem, and third-party systems.
● Designed, developed, and maintained complex workflows using Apache Airflow, creating scalable DAGs to orchestrate ETL/ELT pipelines across cloud and on-prem environments.
● Developed and maintained REST/SOAP API integrations, message queues, and middleware solutions to connect internal systems, third-party platforms, and cloud services.
● Monitored system integrations using centralized logging and observability tools (Splunk, Prometheus, Grafana), troubleshooting API failures, latency issues, and data inconsistencies.
● Designed and documented integration architecture diagrams using Visio and Lucidchart; maintained technical documentation and runbooks in Confluence.
● Integrated MuleSoft applications with AWS, Azure, databases (MySQL/PostgreSQL), and enterprise systems (SAP, Salesforce, etc.).
● Ensured secure integration practices by implementing IAM controls, API authentication (OAuth, JWT), TLS encryption, firewall policies, and vulnerability scanning compliance.
● Integrated GenAI capabilities into DevOps workflows to automate log analysis, incident summarization, and root cause identification, reducing MTTR for production issues.
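The JWT leg of that API authentication can be sketched with the standard library alone (a simplified HS256 signature check; production systems additionally validate claims such as `exp`/`aud`/`iss` and normally use a vetted JWT library):

```python
import base64
import hashlib
import hmac
import json

def _b64url_decode(segment):
    # JWTs use unpadded base64url; restore padding before decoding
    return base64.urlsafe_b64decode(segment + "=" * (-len(segment) % 4))

def verify_hs256(token, secret):
    """Return the claims dict if the HS256 signature checks out, else None.
    Simplified sketch: claim validation (exp, aud, iss) is omitted."""
    try:
        header_b64, payload_b64, sig_b64 = token.split(".")
    except ValueError:
        return None  # not a three-part JWT
    signing_input = (header_b64 + "." + payload_b64).encode()
    expected = hmac.new(secret, signing_input, hashlib.sha256).digest()
    # constant-time comparison to avoid timing side channels
    if not hmac.compare_digest(expected, _b64url_decode(sig_b64)):
        return None
    return json.loads(_b64url_decode(payload_b64))
```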
● Integrated service mesh (Istio/OpenShift Service Mesh) to enable secure service-to-service communication, traffic management, and observability across microservices.
● Designed and implemented end-to-end system integrations between enterprise applications, cloud platforms, and on-prem infrastructure to ensure seamless data flow and interoperability.
● Performed Tableau Server upgrades and patching using TSM CLI, SSL certificate management tools, and backup/restore utilities, ensuring minimal downtime during migrations.
● Applied GenAI for security insights, summarizing SAST/SCA findings and suggesting remediation actions for vulnerabilities in code and infrastructure.
● Built dynamic and parameterized DAGs with task dependencies, branching logic, retries, SLAs, and error handling to ensure reliable data processing in production.
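The retry and error-handling semantics configured on those DAG tasks can be sketched as a generic helper (illustrative only; in Airflow itself this is declared via `retries` and `retry_delay` task arguments rather than written by hand):

```python
import time

def run_with_retries(task, max_retries=3, base_delay=0.01):
    """Run a callable, retrying with exponential backoff on failure.
    Mirrors DAG-task retry semantics; delay values are illustrative."""
    attempt = 0
    while True:
        try:
            return task()
        except Exception:
            attempt += 1
            if attempt > max_retries:
                raise  # exhausted retries: surface the failure (task marked failed)
            time.sleep(base_delay * (2 ** (attempt - 1)))
```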
● Implemented AI-assisted monitoring using Azure OpenAI Service to correlate logs, metrics, and alerts for anomaly detection and predictive insights.
● Designed and deployed containerized microservices (Docker, Kubernetes) integrated with AI-driven pipelines using Hugging Face Transformers for scalable data processing.
● Provided L2/L3 production support for the Databricks platform, troubleshooting cluster failures, job errors, and performance bottlenecks in multi-workspace environments.
AvidXchange North Carolina Sep 2024 – Sep 2025
Sr. Cloud/DevOps/SRE Solutions Architect
● Used HashiCorp tools like Vault and Consul to manage secrets, configuration files, and service discovery.
● Leveraged Dynatrace to monitor Kubernetes clusters, capturing log data, events, and metrics.
● Bootstrapped deployment of Dynatrace with Argo CD.
● Applied Snyk with OpenAI Codex to analyze vulnerabilities and recommend remediation for SAST/SCA findings.
● Set up Go test integration in the CI pipeline.
● Designed and deployed containerized microservices (Docker, Kubernetes) integrated with AI-driven data processing pipelines for scalable workloads.
● Built and maintained Validating and Mutating Admission Webhooks in Kubernetes using Go, enforcing security policies, resource standards, and automated sidecar injection with proper TLS configuration and failure policy management.
● Containerized applications using Docker and deployed workloads to OpenShift using DeploymentConfigs, Operators, Helm charts, and Kubernetes manifests.
● Installed and configured Datadog agents on Linux and Windows servers, EC2 instances, and Kubernetes nodes.
● Incorporated AI into pipelines to determine if a deployment is performing poorly and automatically initiate a rollback based on metrics like error rates or response times.
● Deployed AI-driven security tools like CrowdStrike or Azure Security Center to identify and mitigate security threats in real time.
● Managed UDeploy agents, environment templates, and deployment approvals to support enterprise release governance.
● Supported hybrid cloud deployments, integrating on-premises systems with Azure and OpenShift environments for seamless workload portability.
● Ensured consistent environments by writing idempotent tasks. Applied Ansible linting and role testing with Molecule to maintain quality.
● Implemented and supported service mesh architectures using Istio, configuring traffic routing rules, Virtual Services, Destination Rules, mTLS policies, and observability integrations.
● Managed project dependencies, plugins, and multi-module builds using Maven. Automated builds and packaging for CI/CD pipelines.
● Configured Maven profiles for different environments (dev, test, prod).
● Developed lightweight REST APIs with Flask to expose ML models and backend data services. Integrated PySpark jobs into Airflow DAGs and CI/CD pipelines.
● Wrote comprehensive unit and integration tests using JUnit 5 and Mockito. Applied TDD practices for new feature development.
● Built RESTful APIs and integrations using Mule 4, DataWeave transformations, and RAML specifications following API-first design principles.
● Led deployment of OneAgent, ActiveGate, and Dynatrace Extensions across diverse infrastructure stacks.
● Used Terraform to back up Entra ID policies into state files stored in blob storage.
● Built a resource-tracking spreadsheet mapping resources, tag names, and environments.
● Leveraged Python to automate log deletion in Dynatrace.
● Set up security monitoring with New Relic.
● Troubleshot control-plane components including API server interactions, controller manager reconciliation loops, webhook latency issues, and sidecar injection failures in production environments.
● Cleaned up configuration drift in repositories.
● Implemented GitOps-based deployment strategies (ArgoCD-style workflows) to manage application lifecycle and ensure consistency across clusters.
● Integrated Datadog with AWS, Azure, and GCP to auto-discover resources and collect metrics for services like Lambda, RDS, S3, AKS/EKS, and API Gateway.
● Tuned Elasticsearch clusters for performance and index management.
● Architected end-to-end Dynatrace solutions for enterprise-scale applications across cloud (AWS, Azure), on-prem, and containerized (Kubernetes, OpenShift) environments.
● Designed and implemented enterprise-scale Azure landing zones using ARM templates, Bicep, and Terraform to ensure compliance with security, networking, identity, and governance requirements.
● Applied AI for recommendations on best practices, such as optimizing deployment strategies or enhancing system reliability.
● Configured and managed OpenShift networking components, including routes, ingress controllers, and service exposure for internal and external applications.
● Integrated UDeploy with Jenkins CI pipelines and Artifactory for automated artifact promotion workflows.
● Worked with Azure Active Directory and Okta for user lifecycle management, group-based access control, MFA enforcement, and single sign-on (SSO).
● Used Ansible to bootstrap and configure instances in AWS and Azure. Integrated with CI/CD pipelines (GitLab CI, Jenkins) to automate deployments of infrastructure and applications.
● Developed and maintained microservices using Spring Boot to deliver scalable and resilient backend applications.
● Developed distributed data pipelines using PySpark on AWS EMR and Azure Databricks.
● Applied predictive analytics to forecast resource usage and ensure optimal scaling in cloud environments.
● Used tools like Dynatrace, Datadog, or New Relic, which integrate AI to detect anomalies, predict potential failures, and automatically alert teams before issues impact users.
● Designed and implemented IAM policies to control access to key resources like S3, EC2, Lambda, and RDS using least privilege principles.
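A least-privilege policy of the kind described above can be sketched as a generated JSON document (bucket name and action set are illustrative; actual policies were authored per service):

```python
import json

def least_privilege_policy(bucket):
    """Build a minimal S3 read-only IAM policy document scoped to a
    single bucket, following least-privilege principles. The action
    list and bucket name are illustrative examples."""
    return {
        "Version": "2012-10-17",
        "Statement": [{
            "Effect": "Allow",
            "Action": ["s3:GetObject", "s3:ListBucket"],
            "Resource": [
                "arn:aws:s3:::" + bucket,        # the bucket itself (ListBucket)
                "arn:aws:s3:::" + bucket + "/*", # objects within it (GetObject)
            ],
        }],
    }

policy_json = json.dumps(least_privilege_policy("example-app-logs"), indent=2)
```

Scoping `Resource` to one bucket (rather than `*`) and enumerating only the needed actions is the core of the least-privilege approach.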
● Leveraged AI-driven log aggregation and analysis platforms (e.g., Splunk, ELK Stack with Machine Learning) to identify patterns in logs and proactively resolve errors.
● Set up centralized logging using Filebeat/Logstash to ingest logs from VMs, containers, and cloud services.
● Configured and managed secrets and credentials using HashiCorp Vault.
● Designed component processes and application processes in UDeploy to support versioned artifact deployments and rollback strategies.
● Built and maintained data orchestration workflows using Apache Airflow, designing scalable DAGs for ETL pipelines, infrastructure automation triggers, and cloud cost reporting workflows.
● Tuned and debugged Envoy sidecar configurations for latency optimization, circuit breaking, rate limiting, and header-based routing.
● Integrated firewall logs and security telemetry into Splunk (SIEM) for centralized monitoring, log correlation, alerting, and incident investigation across infrastructure and Kubernetes workloads.
● Integrated Cloudability with enterprise tagging standards and Azure DevOps pipelines to enforce cost attribution models and improve FinOps maturity across multi-cloud environments.
● Maintained detailed technical documentation, runbooks, and architecture decision records in Confluence, improving knowledge sharing and audit readiness.
● Designed secure network and application architecture diagrams using Microsoft Visio and Lucidchart, documenting cloud topology, Kubernetes cluster design, ingress/egress flows, and security zones.
● Leveraged IBM Turbonomic for automated resource optimization, rightsizing EC2/VM workloads, scaling Kubernetes clusters, and eliminating over-provisioned resources to improve performance while reducing cloud spend.
● Designed and published APIs for internal and external consumers. Implemented policies for rate limiting, authentication, and transformation.
● Managed lifecycle and versioning of APIs, integrated with Azure AD for OAuth2.
● Integrated pytest into CI/CD pipelines (Azure DevOps, GitHub Actions) with coverage reports and alerts.
● Used AI-driven tools like Snyk, Veracode, and SonarQube to analyze codebases for vulnerabilities, misconfigurations, or insecure dependencies during CI/CD workflows. Integrated GitHub Copilot into development workflows for AI-powered code suggestions and validation, reducing manual errors in CI/CD pipelines.
Fidelity Investments Westlake, TX Sep 2023 - Sep 2024
Sr. Cloud/DevOps/Database Solutions Architect
● Used HashiCorp tools like Vault and Consul to manage secrets, configuration files, and service discovery.
● Set up CI/CD pipelines in Azure for customer operations on demand (e.g., adding/deleting resources, increasing IOPS, changing instance size or type).
● Built automation workflows using LangChain to generate runbooks, documentation, and troubleshooting steps, improving knowledge sharing across teams.
● Collaborated with cross-functional teams (AI/ML engineers, developers, security teams) to enable scalable deployment of AI/ML workloads on OpenShift platforms.
● Embedded Datadog in CI/CD pipelines for release monitoring.
● Developed scripts and automation tools using Python, Bash, and PowerShell.
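One recurring shape for such automation scripts is a stale-resource sweep; a hedged stand-in using plain Python (the name-to-timestamp mapping stands in for the file or resource listings the real scripts operated on):

```python
from datetime import datetime, timedelta

def stale_items(items, now=None, max_age_days=30):
    """Return the names of items older than max_age_days, sorted.
    `items` maps name -> last-modified datetime; in the real scripts
    this came from a filesystem or cloud API listing."""
    now = now or datetime.utcnow()
    cutoff = now - timedelta(days=max_age_days)
    return sorted(name for name, mtime in items.items() if mtime < cutoff)
```

The caller then archives or deletes the returned items; keeping the selection logic pure makes it easy to unit-test without touching real infrastructure.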
● Configured and managed secrets and credentials using HashiCorp Vault.
● Provisioned, managed, and monitored EC2 instances on AWS.
● Wrote and audited custom IAM policies using JSON to enforce least privilege principles for roles, users, and services.
● Used Dynatrace Query Language (DQL) to build focused metrics views and business KPIs. Implemented custom tags and management zones for granular visibility.
● Implemented distributed tracing and observability within service mesh environments using Prometheus, Grafana, Jaeger, and Kiali for traffic flow visualization and performance monitoring.
● Applied GitOps workflows (ArgoCD/Azure DevOps) to manage Kubernetes manifests, CRDs, webhook configurations, and service mesh policies with version-controlled, automated deployments.
● Implemented security features in Spring Boot applications using Spring Security and OAuth2 for authentication and authorization.
● Administered and configured IBM UrbanCode Deploy (UDeploy) for enterprise-scale automated deployments across Dev, QA, and Production environments.
● Used Azure Functions to implement serverless computing, allowing for the execution of code in response to events.
● Integrated IAM roles with services like EKS and ECS to ensure secure service-to-service communication.
● Enabled auto-monitoring of Kubernetes clusters (AKS, EKS, GKE), pods, and workloads using Dynatrace Operator and cloud integrations.
● Developed cloud-native tooling and automation services using Go (Golang), leveraging Go modules, struct-based data modeling, concurrency patterns (goroutines), and REST API integrations to support Kubernetes platform operations.
● Enabled observability and monitoring by integrating OpenShift monitoring stack with Azure Monitor, Log Analytics, Prometheus, and Grafana for real-time insights into cluster and application performance.
● Enforced system hardening policies and managed secrets using Ansible Vault. Automated user management and SSH key distribution.
● Deployed microservices and backend workloads in AKS using Helm and YAML manifests. Configured Ingress controllers, autoscaling, secrets injection via Key Vault, and integrated with Azure Monitor for logging and metrics
● Collaborated with compliance teams on periodic access reviews and role audits to meet regulatory standards like SOC and ISO 27001.
● Built event-driven architectures to handle storage changes, resource group events, and custom application events.
● Integrated with Function Apps and Logic Apps for downstream processing.
● Built scalable test environments on Azure using containerized test agents and Azure Container Instances, ensuring isolated, repeatable, and parallel test executions.
● Engineered and maintained fine-grained RBAC policies in Azure AD and Azure subscriptions, applying least privilege principles to roles across development, testing, and production environments.
● Designed and implemented custom Kubernetes controllers using Kubernetes controller-runtime and client-go libraries to manage Custom Resource Definitions (CRDs) and automate reconciliation workflows.
● Leveraged the Backbase modular payment framework, which allows banks and financial institutions to offer seamless domestic and international payments, P2P transfers, bill payments, and Open Banking payments.
● Enabled Azure Front Door WAF to protect against common web vulnerabilities such as SQL injection, cross-site scripting (XSS), and other OWASP Top 10 threats.
● Developed integration diagrams to illustrate how different systems and applications are integrated within a solution showing the communication protocols, data formats, and message flows between the integrated systems.
● Used PySpark for big data processing, pandas for data analysis, Flask for building lightweight APIs, and pytest for test automation.
● Deployed Backbase payment microservices in a Kubernetes cluster (AWS EKS, Azure AKS, GKE, or OpenShift).
● Used Backbase to reduce fraud risk and ensure PSD2 and AML compliance.
● Configured Azure Front Door to distribute traffic across multiple regions for high availability.
● Configured SSL/TLS termination at the edge using Azure Front Door, providing secure connections for user traffic.
● Created detailed infrastructure diagrams using tools like Microsoft Visio and Lucidchart to illustrate the architecture of cloud-based solutions.
● Integrated Spring Boot applications with relational databases using Spring Data JPA and optimized query performance.
● Authored reusable and modular Ansible playbooks and roles to automate provisioning, patching, software installation, and system configuration.
● Expertise in using Checkmarx, SonarQube, and Coverity to improve the security and quality of code.
● Designed application architecture diagrams to visualize the structure of complex software applications.
● CDN Implementation: Deployed and configured CDN solutions to optimize content delivery, reduce latency, and improve website performance.
● Utilized Kinesis Firehose to ingest large volumes of streaming data from various sources, including IoT devices, application logs, clickstreams, and social media feeds.
● Architected and developed custom operators using Kubernetes controller-runtime framework and client-go, implementing reconciliation loops to manage stateful and stateless workloads through CRDs.
● Deployed and configured Splunk Cloud instances for various organizations, ranging from small-scale deployments to large enterprise environments.
● Leveraged fixtures, mocking, and parametrized tests to ensure broad coverage and maintainability.
● Utilized pytest to automate the testing of Python applications, libraries, and APIs.
● Infrastructure Design: Designed CDN architectures that align with business requirements, considering factors such as geographic distribution, redundancy, and scalability.
● Performance Optimization: Continuously monitored and optimized CDN performance to ensure fast and reliable content delivery globally.
● Built scalable controllers in Go, leveraging informers, shared caches, work queues, and event handlers to optimize reconciliation efficiency and reduce API server load.
● Implemented resource management strategies to ensure efficient allocation and utilization of CPU, memory, and storage resources across the cluster as a Kubernetes administrator.
● Configured and maintained relational databases using AWS RDS.
● Deployed and managed Kubernetes clusters on AKS for various projects, ranging from small-scale applications to large-scale microservices architectures.
● Leveraged Kinesis Firehose's data transformation capabilities to preprocess and transform streaming data before delivering it to the destination.
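The preprocessing step can be sketched in the shape of a Firehose data-transformation Lambda (base64 payloads in, transformed base64 payloads out, each record tagged with a result; the uppercase normalization and field name are illustrative assumptions):

```python
import base64
import json

def transform_records(records):
    """Firehose-style transform: decode each base64 payload, normalize
    a field, and re-encode. The output shape (recordId/result/data)
    mirrors the Firehose transformation contract; the actual
    transformation applied in production differed."""
    out = []
    for rec in records:
        payload = json.loads(base64.b64decode(rec["data"]))
        payload["event"] = payload.get("event", "").upper()  # illustrative normalization
        out.append({
            "recordId": rec["recordId"],
            "result": "Ok",  # "Dropped"/"ProcessingFailed" are the other outcomes
            "data": base64.b64encode(json.dumps(payload).encode()).decode(),
        })
    return out
```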
● Integrated Kinesis Firehose with other AWS services to build end-to-end data processing pipelines.
● Implemented data access patterns and strategies for efficient and scalable DynamoDB operations.
● Configured and managed security groups and access control rules for network security.
● Implemented the use of SonarQube for code checks and bugs ensuring code quality.
● Set up a blue/green environment for faster releases and updates in Kubernetes clusters with Terraform.
● Integrated Kubernetes clusters with CI/CD pipelines to enable continuous integration and delivery of applications, ensuring fast and reliable deployments.
● Used PySpark extensively to perform data transformation, manipulation, and analysis on large datasets.
● Set up a rehydration process for Neo4j in both AWS and Azure with the CI/CD pipeline.
● Set up a rehydration process for Aerospike in both AWS and Azure with the CI/CD pipeline.
● Set up a CI/CD pipeline for new Timbr releases and updates.
● Deployed the Timbr graph database application with Docker/Docker Compose.
● Wrote test cases following pytest's conventions, including test functions, fixtures, parameterization, and test discovery mechanisms.
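A minimal example of those conventions (the helper under test is hypothetical, named only for illustration):

```python
import pytest

def normalize_region(name):
    # hypothetical helper under test: canonicalize a region string
    return name.strip().lower().replace(" ", "-")

# parameterization: one test function, multiple input/expected pairs,
# each discovered and reported as a separate test case
@pytest.mark.parametrize("raw,expected", [
    ("US East 1", "us-east-1"),
    ("  eu-west-1 ", "eu-west-1"),
])
def test_normalize_region(raw, expected):
    assert normalize_region(raw) == expected
```

Test discovery picks this up automatically from any `test_*.py` file; fixtures would be added as function arguments resolved by pytest.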
● Used Azure Active Directory for managing identities, implementing role-based access control (RBAC).
● Wrote shell scripts using Bash to automate repetitive tasks.
● Deployed, configured, and managed AWS RDS instances.
● Leveraged Amazon Data Lifecycle Manager to create snapshots of EBS volumes on scheduled intervals for backup and defined a retention period as a cost-saving measure.
● Used Azure Logic Apps and Azure Service Bus for building scalable and robust integration solutions.
● Ran pytest suites and generated test reports in CI, enabling early detection of issues and ensuring code quality throughout the development lifecycle.
● Implemented data ingestion pipelines to ingest data from diverse sources into Splunk Cloud for analysis.
● Used pytest's fixture mechanism for setup and teardown of test environments, data preparation, and resource allocation.
● Integrated PySpark seamlessly with machine learning libraries such as MLlib and scikit-learn, enabling the creation of end-to-end machine learning pipelines.
● Provisioned and managed AKS clusters using the Azure Portal, Azure CLI, and Infrastructure as Code (IaC) tools like Terraform and ARM templates.
● Hosted Backbase services in AWS Kubernetes (EKS) and on-prem Kubernetes clusters.
● Integrated pytest into Azure DevOps pipelines to automate the execution of tests as part of Continuous Integration (CI) and Continuous Delivery (CD).
● Handled configuration management for the entire cluster as Kubernetes administrator, including applying updates, managing secrets, and configuring networking policies.
● Security Implementation: Implemented and maintained security measures within the CDN, such as DDoS protection, web application firewalls (WAF), and secure socket layer (SSL) configurations.
● Used AWS Kinesis Firehose for data ingestion, transformation, integration with AWS services, monitoring, scalability, cost optimization, security, and compliance.
Bank of America Brooklyn Center, MN Feb 2021 - Aug 2023
Sr. Cloud/DevOps Solutions Architect
● Designed for high availability and business continuity using self-healing architectures, fail-over routing policies, multi-AZ deployment of EC2 instances, ELB health checks, Auto Scaling, and other disaster recovery models.
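The fail-over routing policy in such a design reduces to choosing the first healthy target in priority order; a hedged sketch (endpoint names and the health map are illustrative):

```python
def pick_endpoint(endpoints, health):
    """Return the first healthy endpoint in priority order, mimicking
    a fail-over routing policy (primary first, then secondaries).
    Returns None when nothing is healthy, signalling a full outage."""
    for ep in endpoints:
        if health.get(ep, False):
            return ep
    return None
```

In Route 53 terms, `endpoints` corresponds to the priority-ordered record set and `health` to the associated health-check results.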
● Automated deployment of Backbase payment modules with zero downtime.
● Used diagramming to convey complex technical concepts and architectures to stakeholders clearly and concisely, facilitating collaboration, decision-making, and alignment among project teams.
● Consumed and authenticated against third-party APIs using OAuth2 tokens and service principals within pipeline automation.
● Integrated UDeploy with Jenkins CI pipelines and JFrog Artifactory for automated artifact promotion and release orchestration.
● Integrated Open Policy Agent (OPA) Gatekeeper alongside custom webhooks for layered policy enforcement and compliance automation.
● Designed zero-trust service communication models using Istio, enabling strict mTLS, authorization policies, and namespace-level isolation.
● Designed reusable declarative and scripted pipelines for build, test, artifact publishing, and Kubernetes deployments.
● Designed security architecture diagrams to illustrate the security measures and controls implemented within the solution.