Real-Time Google Cloud

Location:
Manhattan, NY, 10003
Posted:
October 15, 2024

Alex

Senior Elastic Engineer

*********@*****.***

PROFESSIONAL SUMMARY

Experienced Senior Elastic Engineer with more than six years of expertise in architecting and deploying scalable data solutions across AWS and Google Cloud platforms. Proven ability to enhance real-time monitoring, optimize data processing, and implement robust security measures through the ELK Stack (Elasticsearch, Logstash, Kibana) and other cloud technologies. At Walmart, led the development of Elastic Cloud solutions that improved query performance by 20% and integrated real-time data processing using AWS Lambda and Amazon S3 for efficient log storage.

• Proficient in architecting and deploying Elastic Cloud solutions, leveraging AWS and Google Cloud services such as Amazon S3, Google Cloud Storage, and AWS Lambda for enhanced data management and analytics.

• Expertise in developing and maintaining Kibana dashboards for real-time insights into user behavior, infrastructure performance, and operational metrics, fostering data-driven decision-making.

• Skilled in implementing security protocols, including IAM roles and HashiCorp Vault, ensuring compliance with industry standards and reducing security incident response times by 25%.

• Experienced in utilizing advanced data processing tools, including Apache Kafka and Flink, to support market surveillance and anomaly detection, resulting in a 15% reduction in system downtime.

• Proven track record in automating CI/CD pipelines using Jenkins and Azure DevOps, improving development efficiency by 30% and ensuring reliable deployment processes across environments.

• Hands-on experience with machine learning frameworks like TensorFlow, scikit-learn, and PyTorch to extract insights and drive predictive capabilities from complex datasets.

• Strong background in containerization and orchestration using Docker and Kubernetes, achieving 99.9% uptime of mission-critical services through optimized cluster management.

• Collaborative team player, working closely with cross-functional teams to align on project goals, enhance workflows, and ensure successful integration of data solutions.

• Proficient in implementing monitoring solutions with tools like Splunk, Prometheus, and Grafana to track system performance and ensure optimal infrastructure operations.

PROFESSIONAL EXPERIENCE

Senior Elastic Engineer Walmart Remote Mar 2022 – Present

• Architected and deployed Elastic Cloud solutions on AWS, enhancing real-time monitoring and search capabilities. Integrated Elastic Cloud with Amazon S3 for efficient log storage and leveraged AWS Lambda for real-time data processing and indexing. Improved query performance by 20% through optimized Elasticsearch cluster design, ensuring optimal query speed and reliability. Utilized Amazon VPC and Private Link for secure, private network communication between components.
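
Illustrative sketch of the S3-to-Elasticsearch ingestion path described above: an AWS Lambda handler that reads a newly written log object from S3 and bulk-indexes its lines. The bucket, index, and endpoint names are hypothetical placeholders, not taken from this role.

import json
import os

import boto3
import requests

s3 = boto3.client("s3")
ES_URL = os.environ.get("ES_URL", "https://example-es-endpoint:9243")  # hypothetical endpoint
ES_AUTH = (os.environ.get("ES_USER", "elastic"), os.environ.get("ES_PASS", ""))

def handler(event, context):
    # Triggered by an S3 "object created" event; each record points at one log file
    for record in event.get("Records", []):
        bucket = record["s3"]["bucket"]["name"]
        key = record["s3"]["object"]["key"]
        body = s3.get_object(Bucket=bucket, Key=key)["Body"].read().decode("utf-8")
        bulk_lines = []
        for line in body.splitlines():
            if line.strip():
                bulk_lines.append(json.dumps({"index": {"_index": "app-logs"}}))
                bulk_lines.append(json.dumps({"message": line, "s3_key": key}))
        if bulk_lines:
            # The bulk API expects newline-delimited JSON terminated by a newline
            requests.post(f"{ES_URL}/_bulk", data="\n".join(bulk_lines) + "\n",
                          headers={"Content-Type": "application/x-ndjson"}, auth=ES_AUTH)
    return {"status": "ok"}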

• Developed and maintained 12 Kibana dashboards, delivering real-time insights into user behavior, infrastructure performance, and operational metrics, leading to better business decisions and enhanced customer experience.

• Implemented IAM roles for granular access control, ensuring compliance with security standards and policies across the infrastructure. Utilized Elastic Machine Learning for anomaly detection, enabling proactive identification of unusual patterns in search queries and system load, which reduced downtime by 15% and enhanced proactive management of system health.
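
A minimal sketch of defining an Elastic Machine Learning anomaly detection job of this kind through the REST API; the job name, fields, and endpoint are hypothetical.

import requests

ES_URL = "https://example-es-endpoint:9243"   # hypothetical endpoint
job = {
    "description": "Unusual spikes in search query volume",
    "analysis_config": {
        "bucket_span": "15m",
        "detectors": [{"function": "high_count", "by_field_name": "query_type"}],
    },
    "data_description": {"time_field": "@timestamp"},
}
requests.put(f"{ES_URL}/_ml/anomaly_detectors/query-volume-demo",
             json=job, auth=("elastic", "..."))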

• Implemented Cross-Cluster Search to scale search functionality across multiple AWS regions and clusters, allowing fast, accurate search results globally. Developed and maintained Kibana dashboards to provide actionable insights into key performance indicators.
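
A sketch of the Cross-Cluster Search setup described above: register a remote cluster, then query local and remote indices in one request. Cluster aliases, hosts, and index names are hypothetical.

import requests

ES_URL = "https://example-es-endpoint:9243"   # hypothetical coordinating cluster
AUTH = ("elastic", "...")

# Register a remote cluster under the alias "us_east"
remote = {"persistent": {"cluster": {"remote": {"us_east": {"seeds": ["es-us-east.internal:9300"]}}}}}
requests.put(f"{ES_URL}/_cluster/settings", json=remote, auth=AUTH)

# Search the local index and the remote one in a single query
query = {"query": {"match": {"message": "error"}}}
requests.post(f"{ES_URL}/app-logs,us_east:app-logs/_search", json=query, auth=AUTH)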

• Integrated Logstash for efficient data ingestion and transformation, supporting seamless log processing and enriching data for further analysis within the Elastic Stack. Collaborated with cross-functional teams to ensure the successful integration of Elastic solutions, maintaining close communication with stakeholders to align on project objectives and achieve optimal performance.

• Utilized Apache Kafka for real-time data streaming and processing, ensuring smooth data flow across systems. Employed Apache NiFi to automate data ingestion and integration of diverse sources, and implemented Apache Flink for advanced stream processing and real-time analytics to support market surveillance and anomaly detection.
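
A simplified sketch of the Kafka consumption step (using the kafka-python package); the topic, broker, and threshold are hypothetical stand-ins for the surveillance rules used in practice.

import json

from kafka import KafkaConsumer

consumer = KafkaConsumer(
    "order-events",                           # hypothetical topic
    bootstrap_servers=["localhost:9092"],
    group_id="surveillance-demo",
    auto_offset_reset="latest",
    value_deserializer=lambda v: json.loads(v.decode("utf-8")),
)

THRESHOLD = 10_000   # hypothetical per-event limit

for message in consumer:
    event = message.value
    if event.get("amount", 0) > THRESHOLD:
        print(f"possible anomaly: {event}")   # in practice this would raise an alert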

• Automated CI/CD pipelines using Jenkins and Docker, improving development cycle times by 30%. Orchestrated applications using Kubernetes, ensuring 99.9% uptime of mission-critical services.

• Strengthened data security by integrating HashiCorp Vault for sensitive information management and applying network security monitoring with Zeek, reducing security incident response times by 25%. Integrated OpenSearch and MongoDB to handle diverse data types and support comprehensive data exploration and advanced analytics.
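
A minimal sketch of fetching credentials from HashiCorp Vault with the hvac client; the Vault address, mount path, and secret key are hypothetical.

import hvac

client = hvac.Client(url="https://vault.internal:8200", token="s.example-token")   # hypothetical
secret = client.secrets.kv.v2.read_secret_version(path="elastic/prod-credentials")
es_password = secret["data"]["data"]["password"]   # KV v2 nests the payload under data.data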

• Implemented Splunk for log management and operational intelligence, configuring data connectors to ensure seamless integration across systems, enhancing overall data pipeline efficiency and analytics capabilities.
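
A sketch of sending an event to Splunk through the HTTP Event Collector; the host, token, and index are placeholders.

import json

import requests

HEC_URL = "https://splunk.example.com:8088/services/collector/event"   # hypothetical host
HEC_TOKEN = "00000000-0000-0000-0000-000000000000"                      # placeholder token

payload = {
    "event": {"action": "pipeline_run", "status": "success"},
    "sourcetype": "_json",
    "index": "ops",                                                     # hypothetical index
}
requests.post(HEC_URL, data=json.dumps(payload),
              headers={"Authorization": f"Splunk {HEC_TOKEN}"})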

• Developed and deployed machine learning models to extract insights and generate predictions from complex datasets. Utilized frameworks like scikit-learn and TensorFlow for building and training models and employed PyTorch for advanced deep learning tasks. Implemented AI solutions to enhance data analysis capabilities and support decision-making processes.
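
An illustrative scikit-learn sketch of the training-and-evaluation workflow (synthetic data and hypothetical parameters, not the production models themselves).

from sklearn.datasets import make_classification
from sklearn.ensemble import RandomForestClassifier
from sklearn.metrics import accuracy_score
from sklearn.model_selection import train_test_split

# Synthetic stand-in for a real feature set
X, y = make_classification(n_samples=1000, n_features=20, random_state=42)
X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.2, random_state=42)

model = RandomForestClassifier(n_estimators=200, random_state=42)
model.fit(X_train, y_train)
print("holdout accuracy:", accuracy_score(y_test, model.predict(X_test)))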

• Coordinated with cross-functional teams, including data scientists, software engineers, and business analysts, to align on project objectives and requirements. Facilitated effective communication and information sharing across teams to ensure seamless integration of data solutions. Participated in regular meetings and collaborative sessions to discuss progress, address challenges, and refine project strategies.

Elastic Engineer Cisco Remote Jan 2019 – Feb 2022

• Architected and deployed Elastic Cloud solutions on Google Cloud to deliver highly scalable and AI-driven search experiences. Integrated Elasticsearch with Google Cloud Storage for efficient, distributed data management, and implemented Python-based ETL workflows to automate data ingestion, enabling real-time processing and analytics.
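
A condensed sketch of a Python ETL step from Google Cloud Storage into Elasticsearch; bucket, index, and endpoint names are hypothetical.

import json

import requests
from google.cloud import storage

ES_URL = "https://example-deployment.es.io:9243"   # hypothetical endpoint
AUTH = ("elastic", "...")

client = storage.Client()
bucket = client.bucket("raw-logs-demo")            # hypothetical bucket

def etl_blob(blob_name: str) -> None:
    # Download newline-delimited JSON, enrich each record, and index it
    raw = bucket.blob(blob_name).download_as_text()
    for line in raw.splitlines():
        if not line.strip():
            continue
        doc = json.loads(line)
        doc["pipeline"] = "gcs-etl-demo"           # simple transform step
        requests.post(f"{ES_URL}/app-logs/_doc", json=doc, auth=AUTH)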

• Designed and maintained 7 Kibana dashboards to provide real-time visibility into system performance, user behavior, and operational metrics. Leveraged Elastic Machine Learning for proactive anomaly detection, allowing for early identification of infrastructure issues and improved system reliability.

• Implemented cross-cluster replication and horizontal scaling, ensuring high availability and resilience across Elasticsearch clusters. Utilized Index Lifecycle Management to streamline data storage and archival, reducing costs while maintaining data durability with snapshot-based recovery mechanisms.
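
A sketch of an Index Lifecycle Management policy of the kind described above; the phase timings and policy name are hypothetical.

import requests

ES_URL = "https://example-deployment.es.io:9243"   # hypothetical endpoint
policy = {
    "policy": {
        "phases": {
            "hot": {"actions": {"rollover": {"max_size": "50gb", "max_age": "7d"}}},
            "warm": {"min_age": "30d", "actions": {"shrink": {"number_of_shards": 1}}},
            "delete": {"min_age": "90d", "actions": {"delete": {}}},
        }
    }
}
requests.put(f"{ES_URL}/_ilm/policy/logs-retention-demo", json=policy, auth=("elastic", "..."))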

• Containerized key applications using Docker and deployed them via Elastic Cloud on Kubernetes, ensuring scalable, fault-tolerant infrastructure. Developed interactive and responsive front-end interfaces with React.js for Kibana, allowing teams to visualize data in real-time and make informed decisions.

• Automated monitoring and alerting processes using Python scripts and REST APIs, integrating Elasticsearch DSL for custom querying and automation. This improved response times and facilitated personalized, dynamic search experiences, enhancing system performance and user satisfaction.
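
A minimal sketch of that monitoring-and-alerting automation: count recent 5xx responses with a DSL query and post an alert above a threshold. The index, field names, threshold, and webhook are hypothetical.

import requests

ES_URL = "https://example-deployment.es.io:9243"   # hypothetical endpoint
AUTH = ("elastic", "...")

query = {
    "query": {
        "bool": {
            "filter": [
                {"range": {"@timestamp": {"gte": "now-5m"}}},
                {"range": {"http.response.status_code": {"gte": 500}}},
            ]
        }
    }
}
errors = requests.post(f"{ES_URL}/app-logs/_count", json=query, auth=AUTH).json().get("count", 0)
if errors > 100:   # hypothetical alert threshold
    requests.post("https://hooks.example.com/alerts",
                  json={"text": f"{errors} 5xx responses in the last 5 minutes"})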

• Strengthened security through robust role-based access control (RBAC), encrypted communication protocols, and comprehensive audit logging. Ensured secure cluster communication with Google VPC and IAM, enforcing strict access controls while meeting corporate and regulatory security standards.

• Optimized global search performance using Cross-Cluster Search, enabling seamless access to distributed data across multiple regions. Leveraged Elasticsearch DSL to implement advanced querying, and automated search and data extraction processes with Python, driving improved operational efficiency and enhanced search capabilities.

• Designed and deployed machine learning models to extract actionable insights and drive predictive capabilities from complex datasets. Leveraged frameworks such as scikit-learn and TensorFlow for model development, and utilized PyTorch for advanced deep learning applications to enhance data analysis and support AI-driven decision-making.

• Collaborated with cross-functional teams, including data scientists, software engineers, and business analysts, to ensure alignment on project goals and requirements. Facilitated effective communication and knowledge sharing across teams, ensuring smooth integration of data solutions and optimization of workflows. Actively participated in project meetings to track progress, troubleshoot issues, and refine strategies for continuous improvement.

DevOps Engineer BNY Mellon Onsite Jun 2018 – Dec 2019

• Designed and implemented CI/CD pipelines using Azure DevOps, automating code integration, testing, and deployment processes for multiple applications. Streamlined the deployment cycle, reducing downtime and increasing development efficiency by enabling frequent and reliable releases.

• Managed containerized applications using Docker and orchestrated them with Azure Kubernetes Service (AKS). Ensured high availability, scalability, and fault tolerance of cloud-based applications by optimizing Kubernetes cluster configurations and deploying Helm charts for package management.

• Utilized Terraform for infrastructure as code (IaC) to automate the provisioning and management of Azure resources. Ensured consistent and repeatable deployments across environments, allowing rapid scaling of infrastructure while maintaining compliance with security and operational policies.

• Automated monitoring and alerting using Azure Monitor and Application Insights. Configured custom metrics and alerts for real-time monitoring of system performance and application health, enabling proactive incident detection and resolution.

• Implemented Azure Key Vault for secure management of secrets, certificates, and encryption keys. Integrated it with automated workflows to ensure sensitive information remained protected throughout the CI/CD pipeline and across distributed systems.
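
A minimal sketch of pulling a secret from Azure Key Vault inside a pipeline step; the vault name and secret name are hypothetical.

from azure.identity import DefaultAzureCredential
from azure.keyvault.secrets import SecretClient

vault_url = "https://demo-pipeline-kv.vault.azure.net"                # hypothetical vault
client = SecretClient(vault_url=vault_url, credential=DefaultAzureCredential())

db_password = client.get_secret("db-password").value                  # hypothetical secret name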

• Leveraged Azure Blob Storage for efficient storage of logs and artifacts, enabling secure and scalable data retention. Integrated Azure Data Factory for ETL processes, ensuring smooth data ingestion and transformation between various cloud services, supporting operational analytics.
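
A sketch of pushing a build log to Azure Blob Storage for retention; the connection string, container, and blob path are placeholders.

from azure.storage.blob import BlobServiceClient

conn_str = "DefaultEndpointsProtocol=https;AccountName=demo;AccountKey=...;EndpointSuffix=core.windows.net"  # placeholder
service = BlobServiceClient.from_connection_string(conn_str)
container = service.get_container_client("build-artifacts")           # hypothetical container

with open("app.log", "rb") as data:
    container.upload_blob(name="ci/run-123/app.log", data=data, overwrite=True)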

• Integrated Prometheus and Grafana for real-time system and application monitoring. Developed custom dashboards and set up automated alerts to track resource usage, application performance, and system health, ensuring optimal infrastructure operations.
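
A sketch of exposing custom application metrics for Prometheus to scrape (metric names and the scrape port are hypothetical); Grafana dashboards would then chart these series.

import random
import time

from prometheus_client import Counter, Gauge, start_http_server

REQUESTS = Counter("demo_requests_total", "Total requests processed")
QUEUE_DEPTH = Gauge("demo_queue_depth", "Items currently queued")

start_http_server(8000)        # Prometheus scrapes http://host:8000/metrics

while True:
    REQUESTS.inc()
    QUEUE_DEPTH.set(random.randint(0, 50))   # stand-in for a real measurement
    time.sleep(5)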

• Implemented Azure Policy and Security Center to ensure compliance with financial industry standards and secure cloud operations. Applied role-based access control (RBAC) and identity management using Azure Active Directory (AAD) to enforce strict security policies and protect sensitive data.

• Orchestrated serverless functions with Azure Functions to automate repetitive tasks and event-driven processes. Deployed event-based workflows that handled data transformations, triggered alerts, and improved operational efficiency across different services.
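
A minimal sketch of an HTTP-triggered Azure Function in the v1 Python programming model (the function.json binding file is assumed and omitted); names are hypothetical.

import logging

import azure.functions as func

def main(req: func.HttpRequest) -> func.HttpResponse:
    # Event-driven entry point: log the request and acknowledge it
    name = req.params.get("name", "pipeline")
    logging.info("processing request for %s", name)
    return func.HttpResponse(f"processed: {name}", status_code=200)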

• Optimized cost and resource management by automating scaling policies for compute resources in Azure. Configured auto-scaling for AKS and virtual machines, ensuring resource efficiency without compromising performance during peak demand periods.

• Employed Git for version control and collaboration, ensuring smooth communication and coordination between development, operations, and security teams. Regularly conducted code reviews and integrated testing to maintain high-quality code and reduce errors during production deployments.

• Enhanced infrastructure security by integrating Azure Sentinel for threat detection and incident management. Configured Sentinel to monitor activity logs, detect anomalies, and automate responses to potential security threats, ensuring compliance and maintaining robust security postures.


