
Data Analytics Engineer

Location:
Raleigh, NC
Posted:
February 21, 2025


Resume:

Darshan Ramegowda linkedin.com/in/darshanramegowda

Raleigh-Durham-Chapel Hill Area, NC, USA *********@*****.*** +1-614-***-****

Summary

● 20+ years of total IT experience, including 9+ years as a Solution Architect, 10+ years in Data Engineering, and 3+ years in leadership roles

● 9+ years of experience with cloud products and services such as AWS, GCP, AI/ML, GenAI, RAG, Databricks, Snowflake, and Oracle

● 8+ years of experience with distributed systems in the Big Data space and 10+ years of experience in Data Warehouse/Database work using ETL, Informatica, SSIS, SSRS, SQL, Oracle DB, and polyglot programming (Python, Shell/Bash, .NET)

● Adept in technology areas such as Cloud (AWS & GCP), Databricks, Snowflake, Data Analytics, AI/ML, GenAI, Operations & Observability, Migration & Modernization, ETL/ELT, and Data Warehousing

● Strong agility and openness to learn, perform, and excel with current industry trends and technologies

● Built an Operations and Observability platform driving $4-6 million in operational cost savings; conceptualized data solutions and Well-Architected principles across strategic customers

● Actively seeking assignments in the field of Cloud Computing, Data Analytics and AI/ML.

Skills

Cloud Services: Bedrock • SageMaker • RAG • LLM • Redshift • Athena • Glue • Kinesis • QuickSight • DataProc • Monitoring/Alerting • Cloud KMS • Composer • Terraform • Grafana • Airflow • Load Balancer • Cloud SQL • BigQuery • DataFlow • Pub/Sub • BigTable • Cloud ML • Cloud Spanner • EC2 • S3 • RDS • EMR • Databricks • Snowflake • QlikSense

Big Data Stack: Apache Hadoop • Spark • Scala • Hive • Ranger • Solr • ZooKeeper • Sqoop • HDFS • Flume • Apache NiFi • Apache Atlas • UNIX • Linux

Data Tools: Databricks • Snowflake • Informatica • ETL • Informatica B2B • SSIS • SSRS • Oracle Exadata

Programming: Shell/Bash scripting • Python • Scala • Spark SQL • Spark Streaming • HiveQL • Oracle DB • SQL • PL/SQL • Java • C# • .NET • ASP • COM • JavaScript • VBScript

Management/SC: Asana • Atlassian • JIRA • Confluence • Quality Center • ServiceNow • GitHub • GitLab

S/W Methodologies: Agile/SCRUM • Kanban • Waterfall • Test-Driven Development

Experiences

Lead Solutions Architect (Cloud/AI/ML)

Travelers Insurance Dec 2023 - Present

Worked with RAG techniques, LLMs, and agent frameworks on AWS (SageMaker/Bedrock) to implement Q&A-based systems for TravAI and HR chatbots (see the sketch below)
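
As a minimal, illustrative Python sketch of the Q&A pattern described above, the snippet below calls the Bedrock Converse API with retrieved context; the retrieval step, model ID, and example content are placeholder assumptions rather than the actual TravAI/HR chatbot implementation.

import boto3

bedrock = boto3.client("bedrock-runtime", region_name="us-east-1")

def retrieve_context(question: str) -> str:
    # Placeholder: a real system would query a vector store or Bedrock
    # Knowledge Base and return the top-matching passages.
    return "HR policy excerpt: employees accrue 20 PTO days per year."

def answer(question: str) -> str:
    context = retrieve_context(question)
    prompt = (
        "Answer the question using only the context below.\n\n"
        f"Context:\n{context}\n\nQuestion: {question}"
    )
    response = bedrock.converse(
        modelId="anthropic.claude-3-sonnet-20240229-v1:0",  # example model ID
        messages=[{"role": "user", "content": [{"text": prompt}]}],
        inferenceConfig={"maxTokens": 256, "temperature": 0.2},
    )
    return response["output"]["message"]["content"][0]["text"]

print(answer("How many PTO days do employees get?"))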

Key responsibilities include technical leadership, customer engagement, solution architecture, cloud migration & modernization, security/compliance, resiliency, price/performance, and operations.

Designed comprehensive, end-to-end solutions on AWS, multi-cloud, and hybrid environments that align with business requirements, ensuring scalability, reliability, security, resilience, and performance.

Collaborated closely with clients to understand their business objectives, technical needs, and constraints, and provided expert guidance on cloud capabilities and best practices to help them achieve their goals.

Designed architecture patterns and templates that can be reused across the organization's LOB units.

Senior Solutions Architect

Amazon Web Services (AWS) Jun 2022 - Dec 2023 (1.5 years)

Primarily focused on data analytics platforms, data lakes, enterprise solutions, data processing, data warehousing, Oracle DB, and ETL across business verticals.

Designed and architected Cloud Operations and Observability solutions for monitoring, alerting, visualization, and logging using cloud metrics, Grafana Labs, and Slack/xMatters as communication channels (see the sketch below).
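
As a hedged illustration of the cloud-metrics side of such a solution, the Python snippet below publishes a custom CloudWatch metric and attaches an alarm; the namespace, metric and alarm names, and SNS topic ARN are placeholders, and the Grafana/xMatters wiring is not shown.

import boto3

cloudwatch = boto3.client("cloudwatch", region_name="us-east-1")

# Publish a custom operational metric (e.g. failed pipeline runs).
cloudwatch.put_metric_data(
    Namespace="DataPlatform/Ops",  # placeholder namespace
    MetricData=[{
        "MetricName": "FailedPipelineRuns",
        "Value": 3,
        "Unit": "Count",
        "Dimensions": [{"Name": "Pipeline", "Value": "daily_ingest"}],
    }],
)

# Alert when failures exceed a threshold; the alarm action would typically
# notify an SNS topic that fans out to Slack/xMatters.
cloudwatch.put_metric_alarm(
    AlarmName="daily_ingest_failures_high",
    Namespace="DataPlatform/Ops",
    MetricName="FailedPipelineRuns",
    Statistic="Sum",
    Period=300,
    EvaluationPeriods=1,
    Threshold=5,
    ComparisonOperator="GreaterThanThreshold",
    AlarmActions=["arn:aws:sns:us-east-1:123456789012:ops-alerts"],  # placeholder ARN
)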

Provided support across pre-sales, solution design, customer collaboration, and technical expertise.

Made sound technical and strategic decisions across multi-cloud environments and sustainable Well-Architected principles.

Actively involved in Modernization and Migration solutions.

Google Cloud Consultant - GCP

Deloitte Consulting Jul 2018 - Jun 2022 (4 years)

Designed and architected cloud security solutions for authentication and authorization using Apache Ranger, Solr, ZooKeeper, Kerberos, Centrify, KMS, and Cloud IAM.

Created data movement tools between projects and storage classes for GCS objects and metadata (see the sketch below).
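
An illustrative Python sketch of moving a GCS object between buckets and changing its storage class, assuming placeholder bucket and object names (the actual tooling details are not specified above):

from google.cloud import storage

client = storage.Client()  # uses application-default credentials

src_bucket = client.bucket("source-bucket")           # placeholder name
dst_bucket = client.bucket("archive-bucket")          # placeholder name
blob = src_bucket.blob("exports/2024/sales.parquet")  # placeholder object

# Copy the object into the destination bucket, then rewrite it into a
# colder storage class.
copied = src_bucket.copy_blob(blob, dst_bucket, new_name=blob.name)
copied.update_storage_class("COLDLINE")

print(copied.storage_class, copied.metadata)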

Implemented CI/CD pipelines for data processing using Cloud Composer (Airflow), GCS, and BigQuery (a minimal DAG sketch follows).
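
A minimal Cloud Composer (Airflow) DAG sketch loading GCS exports into BigQuery; the DAG ID, bucket, dataset, and table names are placeholder assumptions:

from datetime import datetime

from airflow import DAG
from airflow.providers.google.cloud.transfers.gcs_to_bigquery import (
    GCSToBigQueryOperator,
)

with DAG(
    dag_id="gcs_to_bigquery_daily",  # placeholder DAG name
    start_date=datetime(2024, 1, 1),
    schedule_interval="@daily",
    catchup=False,
) as dag:
    # Load newline-delimited JSON exports from GCS into a BigQuery table,
    # replacing the table contents on each run.
    load_events = GCSToBigQueryOperator(
        task_id="load_events",
        bucket="raw-events-bucket",  # placeholder bucket
        source_objects=["events/{{ ds }}/*.json"],
        destination_project_dataset_table="analytics.events",  # placeholder table
        source_format="NEWLINE_DELIMITED_JSON",
        write_disposition="WRITE_TRUNCATE",
    )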

Deployed cloud infrastructure using Terraform and built applications and solutions using cloud IaC.

Designed resilient, high-availability (HA) solutions using Cloud Load Balancer for data analytics and operations products.

Implemented a Token Broker bridging the gap between Kerberos and Cloud IAM, allowing users to log in with Kerberos and access GCP resources.

Moved Big Data workloads and security controls from on-premises to Dataproc.

Lead Data Engineer

Infosys Nov 2006 - Jul 2018 (11.9 years)

Created data ingestion processes for importing data from the cloud (file and web services) into the Hadoop file system via Apache Flume, NiFi, Informatica, Oracle DB, and SSIS packages.

Involved in creating data lakes/hubs by transforming the ingested data into application-specific data models.

Used Apache Spark with Scala (Spark SQL and Spark Streaming) to transform and store data in the Hadoop file system (an illustrative sketch of the pattern follows).
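
A minimal PySpark sketch of the same transform-and-store pattern (the original work used Scala); paths, columns, and the aggregation are illustrative assumptions:

from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("ingest-transform").getOrCreate()

# Read raw ingested records from HDFS (placeholder path).
raw = spark.read.json("hdfs:///data/raw/events/")

# Application-specific transformation: filter, derive a date column, and
# aggregate per customer per day.
daily = (
    raw.filter(F.col("status") == "COMPLETED")
       .withColumn("event_date", F.to_date("event_ts"))
       .groupBy("customer_id", "event_date")
       .agg(F.count("*").alias("events"), F.sum("amount").alias("total_amount"))
)

# Store the modeled data back to HDFS as partitioned Parquet.
daily.write.mode("overwrite").partitionBy("event_date").parquet(
    "hdfs:///data/modeled/daily_customer_activity/"
)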

Involved in discussions with business partners and business analysts, helping them define requirements through the elicitation process.

Involved in discussions with technical architects and managers to define and improve system architectures and designs.

Evaluated and transformed BRDs and FRSs into robust, scalable designs by documenting high-level and detailed-level steps. Operated in a highly Agile development environment.

Certifications

AWS Certified Solutions Architect Professional - Amazon Web Services (AWS)

Google Cloud Certified Professional Cloud Architect - GCP

Education

Quantic School of Business and Technology

Master of Business Administration & Technology, 2020-21

Visvesvaraya Technological University

Bachelor of Engineering (BE) in Computer Science, 2002-06


