
Data Engineer Machine Learning

Location:
Bellevue, WA
Posted:
August 08, 2025


Resume:

VENKATA RAMI REDDY MYLA

Contact: +1-425-***-**** Email: **********@*****.***

LinkedIn: MylaVenkataRamiReddy

PROFESSIONAL SUMMARY

Lead Azure Data Architect | Data Engineer | Data Scientist | Analyst | Big Data & DWH Specialist

Versatile and results-driven data professional with over 16 years of experience in the IT industry, specializing in cloud-based data architecture, engineering, analytics, and AI. Extensive expertise in Microsoft Azure and Google Cloud Platform (GCP), with a strong background in big data solutions, data science, artificial intelligence, and machine learning. Proven track record in designing scalable data platforms, developing intelligent solutions, and leading cross-functional teams to drive innovation and deliver measurable business value across diverse domains.

TECHNICAL SKILLS

Cloud Platforms:

●Azure: Microsoft Fabric, Real-Time Intelligence, ADLS, OneLake, Azure SQL Database, Cosmos DB, Data Factory, Synapse, Databricks, Logic Apps, Azure Functions, Batch Accounts, Power BI, Power Apps, Power Automate, DevOps, Eventhouse, Eventstream, Activator, KQL.

●Google Cloud (GCP): Cloud Storage, Cloud SQL, Dataproc, Dataflow, BigQuery, Bigtable, Cloud Composer (Airflow).

●AWS: EC2, S3, RDS, Glue, Redshift.

Big Data Technologies: Hadoop (HDFS), Hive, HBase, Scala, Spark, PySpark, Oozie, Jenkins.

Data Science, Machine Learning & AI: Python, R, NumPy, Pandas, Matplotlib, Plotly, NLP, scikit-learn, TensorFlow, Keras, Rasa, Flask APIs.

DevOps: GitHub, Jenkins, Docker, Kubernetes.

ETL Tools: Informatica, DataStage, ODI (11g/12c), DAC.

BI & Visualization Tools: Power BI, Power Platform, Tableau, OBIEE, Cognos, Business Objects, Google Looker Studio.

Automation Tools & AI: Power Apps, Power Automate.

Tools & Agile: Remedy, ServiceNow, HP QC, Rally, CA Agile, Azure DevOps Boards.

Databases:

●SQL: Oracle, SQL Server, MySQL, Teradata, SQLite, PostgreSQL

●NoSQL: MongoDB, CosmosDB, HBase.

●Graph Databases: TigerGraph.

Data Warehouses:

●Azure Synapse, Snowflake, Redshift, Google BigQuery.

Programming Languages: Python, Scala, R, Java, C++, JavaScript, HTML, CSS.

Operating Systems: Windows, Linux, Mac.

PROFESSIONAL EXPERIENCE

Microsoft | Location: Redmond, Washington, USA | Data Architect / Lead Data Engineer & Analyst | 27-Feb-2025 – Present

●Designed the end-to-end data operations architecture for the Global Demand Center (GDC), a marketing platform.

●Used Cosmos DB, ADF, Synapse, Microsoft Fabric, KQL, the ICM portal, and the ICM data warehouse cluster to manage the data.

●Applied the Pareto Principle (the 80/20 rule) to prioritize and manage problems.

●Provided strategic oversight and leadership in managing critical incidents, service disruptions, and escalations using the Microsoft Incident and Case Management (ICM) portal to ensure minimal business impact and rapid resolution.

●Led and mentored cross-functional teams across engineering, operations, and support to enforce ITSM best practices in incident, problem, and change management processes.

●Oversaw the governance of Major Incident Management (MIM) workflows, ensuring timely escalations, root cause analysis (RCA), and stakeholder communications aligned with SLAs and compliance standards.

●Drove the implementation and optimization of ICM processes, improving incident response times and reducing incident volume through proactive trend analysis and automation.

●Partnered with Microsoft product groups, global support teams, and engineering stakeholders to enhance incident lifecycle management and influence platform-level reliability improvements.

●Championed Continual Service Improvement (CSI) by analyzing incident data, conducting post-mortems, and driving corrective/preventive action plans that fed into strategic risk reduction initiatives.

●Established executive dashboards and reporting mechanisms within ICM to monitor operational KPIs, SLA compliance, and service health across mission-critical platforms.

●Ensured regulatory and audit readiness by maintaining standardized documentation, playbooks, and knowledge repositories for all high-impact incidents and service events. Led operational reviews and incident drills to validate readiness and improve resilience for high-availability services in cloud-based environments.
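The 80/20 triage described above can be sketched in a few lines of Python; the incident causes and counts below are hypothetical, purely to illustrate the cut.

```python
from collections import Counter

def pareto_top_causes(incidents, threshold=0.8):
    """Return the smallest set of causes that accounts for `threshold`
    of all incidents, ordered by frequency (the 80/20 cut)."""
    counts = Counter(incidents)
    total = sum(counts.values())
    top, cumulative = [], 0
    for cause, n in counts.most_common():
        top.append(cause)
        cumulative += n
        if cumulative / total >= threshold:
            break
    return top

# Hypothetical incident log: a few causes dominate the volume.
log = (["pipeline-timeout"] * 50 + ["schema-drift"] * 30 +
       ["auth-failure"] * 10 + ["disk-full"] * 6 + ["dns"] * 4)
print(pareto_top_causes(log))  # the handful of causes worth fixing first
```

In practice the counts would come from an ICM export rather than an inline list, but the prioritization logic is the same.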

Timex Group | Location: Connecticut, USA | Data Scientist / Lead Data Engineer / Analyst | 05-Sep-2022 – 13-Dec-2024

●Designed and implemented enterprise-wide data architecture strategies using Microsoft Fabric.

●Developed cloud-based data integration and analytics solutions using Azure Data Factory and Microsoft Fabric.

●Established data governance policies and optimized pipelines for performance and scalability.

●Developed analytic cubes, reporting, data models, and data pipelines using Microsoft Fabric.

●Built real-time reporting in Microsoft Fabric using Eventhouse, Eventstream, Activator, and KQL.

●Wrote PySpark in Microsoft Fabric notebooks for EDA and data warehousing.

●Delivered business intelligence solutions with Power BI, enhancing decision-making.

●Mentored teams on best practices in data engineering and analytics.

●Streamlined processes through automation using Power Apps and Power Automate.

●Unified and integrated systems seamlessly within a single platform using Power Automate.

●Wrote code to create forecasting models using MLflow in Microsoft Fabric.

●Developed tabular models to handle complex DAX formulas

●Bridged the gap between business processes and technology by implementing innovative solutions.

●Implemented Microsoft Fabric for unified data management and analytics across on-premises and cloud setups, increasing data accessibility and consistency

●Developed functional dashboards with relevant KPIs in Power BI.

●Wrote complex, efficient SQL queries and scripts to extract data.

●Developed visual reports and dashboards with interactive filters.

●Performed EDA and automated data extraction and analysis using Python.

●Designed, developed, and implemented data pipelines using Azure Data Factory (ADF) and Microsoft Fabric to move data between various sources and data warehouses.

●Wrote and optimized SQL queries to extract, manipulate, and analyze data from multiple sources.

●Developed and maintained Python scripts and notebooks for data processing and automation tasks.

●Collaborated with cross-functional teams to understand data requirements and ensure data solutions met business needs.

●Monitored data pipeline performance and troubleshot issues to ensure data accuracy and reliability.

●Stayed current on industry trends and best practices across Azure, AWS, and GCP, including Data Factory, Fabric, and Databricks.
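The Python-based extraction and EDA work described above can be illustrated with a standard-library-only sketch; the inline CSV extract and column names are hypothetical stand-ins for data that would normally arrive via ADF or a Fabric lakehouse.

```python
import csv
import io
import statistics

# Hypothetical sales extract; a real job would read this from a
# lakehouse table or an ADF-landed file rather than an inline string.
raw = """region,units
west,120
west,140
east,90
east,110
east,100
"""

rows = list(csv.DictReader(io.StringIO(raw)))

# Basic EDA: group by region, then report row counts and mean units.
by_region = {}
for row in rows:
    by_region.setdefault(row["region"], []).append(int(row["units"]))

summary = {
    region: {"n": len(values), "mean": statistics.mean(values)}
    for region, values in by_region.items()
}
print(summary)
```

The same pattern (read, group, summarize) scales up directly to Pandas or PySpark once the data no longer fits a dictionary.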

UnitedHealth Group (Optum) | Location: Minnetonka, USA | Senior Big Data Architect / Data Scientist | 05-Mar-2018 – 29-Jun-2022

●Implemented scalable big data solutions using MapR Hadoop, Spark, Hive, and Azure services.

●Developed data processing workflows using Azure Data Factory pipelines, optimizing data movement with activities like Copy for data transfers, Filter for data subset extraction, and For Each for automating iterative tasks, enhancing workflow efficiency and automation.

●Designed and implemented dedicated SQL pools for structured data storage.

●Managed partitioning, indexing, and table distribution strategies.

●Developed Synapse pipelines (similar to ADF) for ELT processing.

●Used Spark pools for big data processing and AI/ML integration.

●Optimized SQL queries, mitigated data skew, and tuned workload management.

●Used materialized views, indexing, and caching for faster queries.

●Employed Azure Data Factory to orchestrate complex end-to-end data workflows, integrating on-premises and cloud-based data services like Azure Databricks and Azure Synapse Analytics, supporting enterprise data modernization

●Developed predictive models for fraud detection and customer behavior analysis.

●Analyzed customer trauma data to develop insights and predictive models, enabling proactive interventions and improved support strategies

●Automated workflows using Airflow and deployed NLP-based chatbot solutions.

●Designed and integrated real-time analytics pipelines with Spark and PySpark.

●Delivered visualization dashboards using Tableau and Power BI.

●Applied data masking and anonymization strategies with Azure Data Factory and Azure Databricks to safeguard privacy while maintaining data utility for analysis

●Designed and developed innovative products, Clinical 360 and Provider 360, to deliver comprehensive insights and enhance decision-making

●Forecasted sales behavior by analyzing trends, seasonality, and noise to deliver accurate and actionable predictions

●Executed data engineering tasks utilizing a robust skill set, including Scala, Big Data, Jenkins, GitHub, and SQL, to build scalable and efficient data pipelines.
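The trend/seasonality/noise forecasting described above can be sketched as a simple additive decomposition in Python; the quarterly figures below are hypothetical, and a production model would also fit a trend term.

```python
import statistics

def decompose(series, period):
    """Additive decomposition: per-slot seasonal effects around the
    overall mean; noise is what remains after removing seasonality."""
    overall = statistics.mean(series)
    seasonal = [statistics.mean(series[slot::period]) - overall
                for slot in range(period)]
    noise = [x - overall - seasonal[i % period]
             for i, x in enumerate(series)]
    return overall, seasonal, noise

def forecast_next(series, period):
    """Naive forecast: overall level plus the seasonal effect for the
    next slot (a real model would add the fitted trend as well)."""
    overall, seasonal, _ = decompose(series, period)
    return overall + seasonal[len(series) % period]

# Hypothetical quarterly sales with a clear repeating pattern.
sales = [100, 120, 80, 140, 100, 120, 80, 140]
print(forecast_next(sales, period=4))  # → 100.0
```

Because the example series is perfectly periodic, the noise component is zero and the forecast simply repeats the seasonal pattern.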

R R ITEC | Location: Telangana, India | Solution Architect | 09-Feb-2015 – 02-Mar-2018

●Designed and developed data pipelines using Azure Data Factory, Azure Databricks, Azure Synapse Analytics, Azure Data Lake Storage Gen2, Azure Key Vault, and other relevant tools, ensuring efficient data ingestion and transformation.

●Designed and implemented scalable data models in Azure Synapse, Azure SQL, and Delta Lake, optimizing data storage, retrieval, and processing for efficient ETL workflows in Azure Data Factory and Databricks

●Conducted hands-on practical sessions to enhance learners' technical skills.

●Assessed students' progress through assignments, quizzes, and project evaluations.

●Participated in faculty meetings and contributed to the development of the training center's goals and strategies.

●Delivered seminars, workshops, and webinars on Azure Cloud to attract potential students.

●Utilized the Azure Synapse Analytics platform to perform advanced data analytics, including complex data transformations, aggregations, and statistical analysis, enabling data-driven decision-making and business insights.

●Assisted in marketing efforts by showcasing the center's capabilities and success stories.

●Collaborated with companies to understand their Azure consulting and training needs and adapted the curriculum accordingly.

Cognizant | Location: Teaneck, NJ, USA | Senior Associate | 24-Mar-2008 – 06-Feb-2015

●Worked with customers such as Amgen, Pfizer, Fox, Abbott, and Health Net.

●Worked with various tools such as OBIEE, OBIA, ODI, Cognos, SAP Business Objects, Informatica, and more.

●Studied business requirement documents, interacted with clients, and prepared BRD, HLD, and LLD documentation.

●Designed and developed data models that supported business requirements, ensuring efficient storage and retrieval of data.

●Ensured seamless integration of data from various sources and implemented ETL (Extract, Transform, Load) processes.

●Enforced data standards, ensured data quality, and maintained data consistency across systems.

●Utilized tools such as Erwin, IBM InfoSphere, and SAP PowerDesigner to create, manage, and update data models.

●Identified and resolved data-related issues, including data integrity and performance problems.
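A minimal sketch of the ETL pattern described above, using Python's standard library with an in-memory SQLite database standing in for the target warehouse; the table and column names are hypothetical, and a real pipeline would use Informatica or ODI against actual sources.

```python
import csv
import io
import sqlite3

# Hypothetical source extract; a real job would pull this from an
# upstream system rather than an inline string.
source = """id,name,amount
1, alice ,250
2, bob ,175
3, carol ,320
"""

# Extract: parse the raw feed into dictionaries.
rows = list(csv.DictReader(io.StringIO(source)))

# Transform: trim names, cast amounts, keep orders over 200.
clean = [(int(r["id"]), r["name"].strip().title(), int(r["amount"]))
         for r in rows if int(r["amount"]) > 200]

# Load: insert into a target table (in-memory SQLite as the warehouse).
con = sqlite3.connect(":memory:")
con.execute("CREATE TABLE orders (id INTEGER, name TEXT, amount INTEGER)")
con.executemany("INSERT INTO orders VALUES (?, ?, ?)", clean)
total = con.execute("SELECT SUM(amount) FROM orders").fetchone()[0]
print(total)  # 570
```

The three stages map one-to-one onto what an ETL tool automates: source connectors (extract), mapping logic (transform), and target loaders (load).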

EDUCATION

●Master of Science, Osmania University, 2004–2006

●Bachelor of Computer Science, Osmania University, 2001–2004

CERTIFICATIONS

●Microsoft Certified: Azure Data Engineer Associate (DP-203)

●Microsoft Certified: Power BI Data Analyst Associate (PL-300)

●Fabric Analytics Engineer Associate (DP-600)

ADDITIONAL INFORMATION

●Proven ability to manage cross-functional teams and deliver high-impact solutions.

●Published technical blogs and conducted workshops on data engineering and data analysis best practices.

●Active participant in industry forums and conferences on emerging technologies.


