Full Name: Vishnu Mannam
Email: ***********@*****.***
Phone: +1-904-***-****
LinkedIn: https://www.linkedin.com/in/vishnumannam/
Title: Senior Data Engineer
Professional Summary
Senior Data Engineer with 8+ years of experience building large-scale data engineering solutions, specializing in cloud-based architectures, streaming data platforms, GraphQL APIs, and distributed systems. Skilled in Python, Node.js, AWS, Snowflake, MongoDB, Kafka, and CI/CD practices, with a strong understanding of domain-driven design and federated service architecture. Proven success in high-transaction environments with strict data governance requirements.
IT Skills
Languages: Python, JavaScript (Node.js), Scala, PL/SQL, SQL
Big Data & ETL: Spark, Hadoop, Hive, Pig, Oozie, Kafka, Flume, Sqoop
Cloud Services: AWS (EC2, S3, EMR, Lambda, Aurora), Azure
Databases: Oracle, Aurora MySQL, Snowflake, MongoDB, DynamoDB, Redis, HBase, Cassandra
Tools: Jenkins, GitHub, Docker, Kubernetes, Tableau, Looker
Frameworks & APIs: GraphQL (Apollo Federation), REST, Flask, Django
CI/CD & DevOps: GitOps, Jenkins, Nexus, Terraform, CloudFormation
Monitoring & Coordination: AWS CloudWatch, Apache ZooKeeper, Apache NiFi
Project Experience
Client: American Airlines
Role: Senior Data Engineer
Duration: April 2023 – Present
Project: Developed a cloud-native data processing pipeline using AWS, Kafka, and Snowflake. Designed and optimized federated GraphQL subgraphs for real-time analytics.
Roles and Responsibilities:
Designed and built GraphQL data APIs integrated across multiple services.
Created CI/CD pipelines with Git and Jenkins, and containerized applications with Docker and Kubernetes.
Developed streaming ingestion solutions using Kafka with Spark stream processing (illustrative sketch after this list).
Delivered real-time reporting in Looker on top of Snowflake.
Built orchestration flows using Apache Airflow and Oozie.
Maintained infrastructure using AWS CloudFormation and IAM for secure access.
Managed large-scale data ingestion using Python and AWS-native tools.
Supported monitoring using AWS CloudWatch and custom observability dashboards.
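A minimal sketch of the Kafka-to-Spark streaming ingestion pattern referenced above. The broker address, topic name, event schema, and S3 paths are illustrative placeholders rather than production values, and the job assumes the spark-sql-kafka connector package is available on the classpath.

# Kafka -> Spark Structured Streaming -> Parquet sketch.
# All names (broker, topic, schema, paths) are hypothetical placeholders.
from pyspark.sql import SparkSession
from pyspark.sql.functions import from_json, col
from pyspark.sql.types import StructType, StringType, TimestampType

spark = (SparkSession.builder
         .appName("kafka-ingest-sketch")
         .getOrCreate())

# Hypothetical event schema for illustration only.
schema = (StructType()
          .add("event_id", StringType())
          .add("event_type", StringType())
          .add("event_ts", TimestampType()))

raw = (spark.readStream
       .format("kafka")
       .option("kafka.bootstrap.servers", "broker:9092")  # placeholder broker
       .option("subscribe", "events")                     # placeholder topic
       .load())

# Kafka values arrive as bytes; cast to string and parse the JSON payload.
events = (raw.selectExpr("CAST(value AS STRING) AS json")
          .select(from_json(col("json"), schema).alias("e"))
          .select("e.*"))

query = (events.writeStream
         .format("parquet")
         .option("path", "s3a://bucket/events/")            # placeholder path
         .option("checkpointLocation", "s3a://bucket/chk/") # placeholder path
         .outputMode("append")
         .start())

query.awaitTermination()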
Client: State of Texas, Department of State Health Services (DSHS)
Role: Big Data Engineer
Duration: June 2021 – March 2023
Project: Built HIPAA-compliant ETL pipelines for healthcare datasets, leveraging Spark, Kafka, and Azure services.
Roles and Responsibilities:
Developed real-time analytics dashboards with Tableau and Kafka.
Integrated unstructured healthcare data into HDFS and NoSQL stores.
Designed ETL pipelines using Scala, Spark, and Hive.
Automated pipeline scheduling and monitoring using ZooKeeper, Airflow, and Jenkins (illustrative DAG sketch after this list).
Worked in agile teams, managing sprints and releases in JIRA.
Designed serverless apps with Azure Functions for scalable ETL.
Implemented data governance policies aligned with HIPAA standards.
Conducted schema evolution and data lake optimizations on Azure Blob Storage.
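A minimal Apache Airflow DAG sketch for the kind of extract-validate-load orchestration described above. The DAG id, schedule, and task callables are hypothetical stand-ins, not the actual production pipeline.

# Minimal Airflow 2.x DAG sketch; all ids and callables are placeholders.
from datetime import datetime
from airflow import DAG
from airflow.operators.python import PythonOperator

def extract():
    # Placeholder: pull the day's healthcare source extracts.
    print("extracting source files")

def validate():
    # Placeholder: row counts, schema checks, PHI field audits.
    print("validating extracts")

def load():
    # Placeholder: write curated data to the lake / Hive tables.
    print("loading curated zone")

with DAG(
    dag_id="healthcare_etl_sketch",
    start_date=datetime(2022, 1, 1),
    schedule_interval="@daily",
    catchup=False,
) as dag:
    t_extract = PythonOperator(task_id="extract", python_callable=extract)
    t_validate = PythonOperator(task_id="validate", python_callable=validate)
    t_load = PythonOperator(task_id="load", python_callable=load)

    # Linear dependency: extract, then validate, then load.
    t_extract >> t_validate >> t_load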
Client: Wells Fargo
Role: Big Data Engineer
Duration: Sep 2019 – May 2021
Project: Led the architecture for cross-platform data replication across Redshift, Hive, S3, and Elasticsearch using Python and Spark.
Roles and Responsibilities:
Wrote MapReduce jobs in Java and transitioned to Spark jobs in Python.
Built custom data connectors for Redshift, Hive, and Elasticsearch.
Developed microservices using Python Flask and containerized them with Docker (minimal sketch after this list).
Integrated Kinesis for event stream collection.
Utilized AWS Glue for serverless transformations and Amazon EMR for cluster-based processing.
Implemented CI/CD automation using Jenkins and Git.
Built monitoring solutions using Grafana and Logstash.
Enabled real-time metrics with Elasticsearch and Kibana.
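A minimal Flask microservice sketch in the style described above. The endpoint, payload shape, and port are illustrative placeholders, and an in-memory dict stands in for the Redshift/Hive/Elasticsearch backends.

# Minimal Flask microservice sketch; routes and storage are hypothetical.
from flask import Flask, jsonify, request

app = Flask(__name__)

# In-memory store standing in for the real data backends.
RECORDS = {}

@app.route("/records/<record_id>", methods=["GET"])
def get_record(record_id):
    record = RECORDS.get(record_id)
    if record is None:
        return jsonify({"error": "not found"}), 404
    return jsonify(record)

@app.route("/records/<record_id>", methods=["PUT"])
def put_record(record_id):
    # Store the JSON payload under the given id.
    RECORDS[record_id] = request.get_json(force=True)
    return jsonify({"status": "stored", "id": record_id}), 201

if __name__ == "__main__":
    app.run(host="0.0.0.0", port=8080)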
Client: TCS (India)
Role: Hadoop Developer
Duration: May 2018 – Aug 2019
Project: Built ETL and analytics pipelines for media and telecom clients on Hadoop/Cassandra stack.
Roles and Responsibilities:
Implemented batch processing with MapReduce, Hive, and Pig.
Automated ETL validations and deployments using Python and SSIS.
Deployed and configured Sqoop, Flume, and Oozie across clusters.
Designed scalable data schemas in Hive and Cassandra.
Stored datasets in Avro and Parquet formats for optimized storage and computation (illustrative conversion sketch after this list).
Ensured log management practices complied with SOX standards.
Created shell scripts for data load orchestration.
Worked on interactive SQL analysis and reporting for content platforms.
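A minimal sketch of writing columnar Parquet with pyarrow, in the spirit of the Avro/Parquet optimization above. The column names and sample records are hypothetical stand-ins for a telecom usage extract.

# Minimal pyarrow Parquet sketch; table contents are illustrative.
import pyarrow as pa
import pyarrow.parquet as pq

# Hypothetical usage records standing in for a Hive/Cassandra extract.
table = pa.table({
    "subscriber_id": ["a1", "a2", "a3"],
    "minutes_used": [120, 45, 300],
    "plan": ["basic", "basic", "premium"],
})

# Write columnar Parquet with snappy compression for efficient scans.
pq.write_table(table, "usage_events.parquet", compression="snappy")

# Quick sanity check: re-read and inspect schema and row count.
check = pq.read_table("usage_events.parquet")
print(check.schema)
print(f"rows: {check.num_rows}")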