
Machine Learning Data Bricks

Location:
Lewisville, TX
Posted:
August 01, 2023

Contact this candidate


MAHESH RACHAMALLA

732-***-****

adyntp@r.postjobfree.com

6+ years of experience in Data Engineering

(12+ years of overall IT experience)

PROFILE SUMMARY

Data Engineer with hands-on experience in state-of-the-art automation technologies for data management and integration, including ETL, cloud, warehousing, and big data. Strong background in development methodologies and solution delivery, with deep technical expertise and broad business exposure for understanding and solving business problems with technology.

Data Ingestion & Processing: Proven background in designing, developing, and deploying end-to-end data pipelines, both on-premises and in cloud environments. Experienced in handling high-volume, high-velocity loads using parallel processing tools and methodologies. Expertise in designing BI solutions using diverse ETL tools and databases.

Reporting and Analytics: Experienced in enabling downstream reporting and analytics applications in both batch and real-time scenarios. Designed data models that ensure proper load balancing between the data and BI layers.

Solution Design and Delivery: Experienced in dealing with complex, large environments across various business verticals. Assisted teams in designing highly scalable, highly concurrent, low-latency data applications. Able to adapt cutting-edge technologies to enterprise requirements, including high availability and disaster recovery. Experienced in working with and evaluating open-source technologies, with a demonstrated ability to make objective choices.

Machine Learning/Data Science: Expertise in various machine learning algorithms and techniques, including supervised and unsupervised learning, regression, classification, clustering, ensemble methods, natural language processing, and text analytics.

SKILL SUMMARY

Highly skilled professional with experience in data engineering, data architecture, and design & implementation in the Healthcare, Oil & Gas, Telecom, and Retail domains

Responsible for deploying machine learning models into different environments (MLOps); worked on packaging models, creating containers, and configuring the necessary infrastructure so that models can be deployed and scaled effectively

Provided architectural design plans (as part of modernization) to migrate existing applications from on-premises to Microsoft Azure Cloud

Created a reusability framework, established best practices, and reduced costs

Experience developing Spark applications using Spark SQL in Databricks for data extraction, transformation, and aggregation from multiple file formats, analyzing and transforming the data to uncover insights into customer usage patterns

Performed Exploratory Data Analysis (EDA) to extract key insights from the data

Experience in Azure MLOps/DevOps and CI/CD implementation

Worked on an NLP intent classification model to classify service requests and incidents from email, and an entity recognition model to extract the required entities from the email request

Created machine learning models using the Azure Machine Learning framework and integrated the models via web services

Good interpersonal skills; committed and result-oriented, with a quest and zeal to learn new technologies; highly adaptable and flexible in working with geographically distributed teams

Good exposure to AWS, Kafka streaming, and the Snowflake database

TECHNICAL SKILLS

Big Data & Cloud Computing: Azure Databricks, Databricks SQL, Unity Catalog, Auto Loader, Spark, PySpark, Spark SQL, Delta Tables, Azure Machine Learning, MLflow, Azure Logic Apps, Azure Synapse, Python, SQL, Azure Data Factory, Delta Lake, Azure Key Vault, Azure Storage (Blob & ADLS Gen2), Azure Purview, Streaming, Snowflake, Kafka, AWS, Power BI, and Tableau

Machine Learning: Azure Machine Learning Services, Data Science, Azure Python SDK, Predictive Modeling, MLOps, MLflow, scikit-learn, NumPy, Pandas, statistics, Python, Matplotlib, Seaborn, Deep Learning, RNN, CNN, Docker, TensorFlow, Keras, PyTorch, NLP, Text Analytics, Artificial Intelligence, Azure Cognitive Services, search index, chatbot, LUIS

BI: SSIS, SQL Server, Data Modeling and Data warehousing

Environment & Software Engineering: Jupyter Notebook, Anaconda, SDLC, Agile, Scrum, Azure DevOps

EXPERIENCE

Client: Georgia Department of Education (GADoE) – USA Feb 2022 – Present

Role: Lead Senior Technology Consultant – Ernst & Young

Leading the DE team: assigning work, coordinating requirements with the team, getting deliverables completed per SLA, ensuring the quality of the team's work, and working closely with stakeholders on requirements

Managed business stakeholders to secure strong engagement for the solution and to ensure that project delivery aligns with longer-term strategic roadmaps

Implemented a content-based recommender system using text analytics to recommend related deals to users

Worked on AutoML for model selection and hyperparameter tuning

Created models using the Azure ML framework and Azure Python SDK; registered, integrated, and deployed the models as web services

Used Databricks Auto Loader to ingest batch and streaming data

Trained fellow colleagues on Databricks and other applications for their data needs

Actively participated in requirement gathering, analysis, OKRs, and workshops

Experience building architecture for data lakes, data warehouses, and data marts

Provided cloud-based architectural and technical design implementation

Created reusable ADF Pipelines for data-driven workflows in cloud for orchestrating and automating data movement and data transformation.

Created Azure Databricks notebooks to load data from various sources into Azure Data Lake (Delta Lake) and SQL Database

Created reusable PySpark notebooks to transform and load data into Azure Databricks Delta tables

Created Python UDFs for reusability, config-based generic PySpark notebooks, and a logging mechanism
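
The config-driven pattern in the bullet above can be sketched in plain Python (all names here are hypothetical illustrations; the actual implementation used PySpark notebooks with Databricks widgets and UDFs):

```python
import logging

logging.basicConfig(level=logging.INFO)
log = logging.getLogger("pipeline")

# Hypothetical per-table config, standing in for widget/JSON parameters.
CONFIG = {
    "source": "sales_raw",
    "rename": {"cust_id": "customer_id"},
    "drop_nulls": ["customer_id"],
}

def run_generic_load(rows, config):
    """Apply rename and null-filter steps driven entirely by config."""
    out = []
    for row in rows:
        row = {config["rename"].get(k, k): v for k, v in row.items()}
        if any(row.get(col) is None for col in config["drop_nulls"]):
            log.warning("dropping row with null key: %s", row)
            continue
        out.append(row)
    log.info("kept %d of %d rows from %s", len(out), len(rows), config["source"])
    return out

rows = [{"cust_id": 1, "amount": 10.0}, {"cust_id": None, "amount": 5.0}]
print(run_generic_load(rows, CONFIG))
```

The same generic notebook then serves every table: only the config changes, not the code.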

Created widgets in Databricks to process dynamic files and dynamic tables

Heavily used Spark SQL for transformation and data analysis

Worked on Unity Catalog for the data governance implementation

Created dynamic views for row-level security (RLS) and entity-level restriction

Created Data Policies for Job role-based data access for Data Governance

Client: Clark County - Las Vegas – USA Sep 2021 – Feb 2022

Role: Senior Technology Consultant– Ernst & Young

Established the platform, technology, best practices, tools, and standards

Focused on transitioning data from legacy systems in scope for the airside, landside, and finance domains

Extended the platform with data warehousing, data analytics, visualization, and self-service reporting

Created an Azure Logic App resource, invoked via a web activity in ADF, to send automated emails to business/project resources acknowledging pipeline successes and failures

Built Azure Databricks notebooks in PySpark, Scala, and Spark SQL to create data warehouse dimensional models in ADLS

Automated the CI/CD pipeline to control development and deployments

Implemented end-to-end governance using Microsoft Azure Purview and Databricks Unity Catalog

Suggested and worked on value-creation ideas to optimize customer tools, resulting in more than $350K in savings every year for the customer

Created widgets in Databricks to process dynamic files and dynamic tables

Implemented code-reusability techniques by creating separate notebooks for variable initialization and PySpark functions, calling them from each dimension and fact table notebook execution
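
The separation described above, one shared notebook for variable initialization, one for reusable functions, and thin per-table notebooks, might look like this in plain Python (module and column names are invented; in Databricks the wiring is typically done with %run):

```python
# Stand-in for the variable-initialization notebook: set once, used everywhere.
ENV = {"storage_root": "/mnt/lake", "batch_date": "2022-01-01"}

# Stand-in for the shared-functions notebook: helpers reused by every
# dimension and fact table notebook.
def add_audit_columns(row, batch_date):
    """Stamp each row with the load date, as a shared helper would."""
    return {**row, "load_date": batch_date}

# Stand-in for one dimension notebook: only table-specific logic lives here.
def load_dim_customer(rows):
    return [add_audit_columns(r, ENV["batch_date"]) for r in rows]

print(load_dim_customer([{"customer_id": 1}]))
```

Each fact or dimension notebook stays short because initialization and helpers live in one place.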

Client: Telstra – Australia July 2019 – Sep 2021

Role: Application Development Sr Analyst – Accenture

Created an NLP intent classification model to classify service requests and incidents from email, and an entity recognition model to extract the required entities from the email request

Developed machine learning models to solve business problems using various algorithms, such as Naïve Bayes, K-Means, Decision Tree, Random Forest, BERT, XGBoost, and cosine similarity

Conducted data analysis, feature engineering, data wrangling, TF-IDF vectorization, Word2Vec, count vectorization, and model training using Python and popular machine learning libraries
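
As a refresher on the TF-IDF vectorization mentioned above, a term's weight can be computed directly with the standard library (in practice a library such as scikit-learn would do this; the smoothing shown is one common variant):

```python
import math
from collections import Counter

# Three tiny tokenized "emails", echoing the service-request use case.
docs = [["reset", "password", "request"],
        ["network", "outage", "incident"],
        ["password", "expired", "incident"]]

def tf_idf(term, doc, docs):
    """Term frequency times (smoothed) inverse document frequency."""
    tf = Counter(doc)[term] / len(doc)
    df = sum(1 for d in docs if term in d)           # document frequency
    idf = math.log(len(docs) / (1 + df)) + 1         # smoothed IDF
    return tf * idf

# "password" appears in 2 of 3 docs, so it weighs less than "reset".
print(tf_idf("reset", docs[0], docs))
print(tf_idf("password", docs[0], docs))
```

Rare terms get boosted and common terms get damped, which is what makes TF-IDF vectors useful features for classifiers like the intent model above.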

Collaborated with cross-functional teams to understand business requirements and translate them into machine learning solutions.

Optimized machine learning models, including hyperparameter tuning, model selection, and performance evaluation

Conducted rigorous testing and performance monitoring to ensure models delivered accurate and reliable results

Created models using the Azure ML framework and integrated them via web services

Responsible for deploying machine learning models into different environments (MLOps); worked on packaging models, creating containers, and configuring the necessary infrastructure so that models can be deployed and scaled effectively

MLOps: set up monitoring and logging systems to track the performance, behavior, and health of deployed machine learning models; configured alerts, dashboards, and metrics to detect anomalies, monitor resource utilization, and ensure the models were functioning as expected; troubleshot issues and optimized models based on monitoring data
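
The alerting side of such monitoring reduces to a small check like the following (threshold values and the metric are invented for illustration; the actual setup used Azure MLOps tooling):

```python
from statistics import mean

def check_model_health(recent_accuracy, baseline=0.90, tolerance=0.05):
    """Flag the model when its rolling accuracy drifts below the
    baseline minus a tolerance band; a dashboard/alert would fire on this."""
    rolling = mean(recent_accuracy)
    alert = rolling < baseline - tolerance
    return {"rolling_accuracy": rolling, "alert": alert}

print(check_model_health([0.91, 0.89, 0.92]))  # within tolerance
print(check_model_health([0.80, 0.78, 0.82]))  # drifted: raises alert
```

The same shape of check extends to latency, resource utilization, or input-distribution drift; only the metric and threshold change.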

Designed and developed machine learning and deep learning systems to improve company products

Ran machine learning tests and experiments to optimize system performance

Deployed the Machine Learning Models on to production using ACI and AKS

Client: Chevron Oil & Gas Apr 2017 – July 2019

Role: Application Development Sr Analyst – Accenture

Actively participated in Requirement analysis, Designing Mapping Toolkit to build dimension and fact tables for data modelling.

Implemented PySpark Databricks notebooks to ingest and transform data from the source (raw) container to the target (output) container by applying business logic

Created external Hive tables with partitions on target Parquet and Delta file format folders

Each dimension and fact table's data ingestion was handled using two notebooks: one to read and transform the input data, and another to load it into the target table

Client: Mars-Chocolate Mar 2016 – Apr 2017

Role: Application Development Sr Analyst – Accenture

Performed exception handling using try and except blocks while running the notebooks
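
The try/except pattern mentioned above, sketched in plain Python (function names are hypothetical; in a Databricks notebook the failure branch would typically re-raise or call dbutils.notebook.exit so the orchestrator sees the failure):

```python
import logging

logging.basicConfig(level=logging.INFO)
log = logging.getLogger("notebook")

def run_step(step_name, fn, *args):
    """Run one notebook step, logging success or failure instead of
    letting an unhandled exception kill the run silently."""
    try:
        result = fn(*args)
        log.info("step %s succeeded", step_name)
        return {"status": "success", "result": result}
    except Exception as exc:
        log.error("step %s failed: %s", step_name, exc)
        return {"status": "failed", "error": str(exc)}

print(run_step("parse", int, "42"))
print(run_step("parse", int, "not-a-number"))
```

Returning a status dict rather than raising lets a driver notebook decide per step whether to continue, retry, or abort.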

Migrated SQL stored procedures to Spark code in the Databricks environment

Extensively worked with different file formats, such as CSV, Parquet, and Delta

Designed and implemented data vault models to extract, transform, and load product data with a history of changes to product attributes

Designed and configured landing tables, staging tables, base objects, hierarchies, foreign-key relationships, lookups, query groups, queries/custom queries, and packages

Client: Blue Cross Blue Shield Association Aug 2015 – Mar 2016

Role: Application Development Sr Analyst – Accenture

Analyzed feature requirements and led the technical activities for design and development, in addition to individual contribution

Worked on development and unit testing

Analyzed customer issues and provided resolutions/workarounds; analyzed features and carried out design and development activities

Client: United Health Group Dec 2010- July 2015

Role: Software Engineer – United Health Group

Implemented a content-based recommender system using text analytics to recommend related deals to users

Implemented BI solution framework for end-to-end business intelligence projects

Worked on various SSIS tasks, such as the Transform Data task, Execute SQL task, Bulk Insert, Foreach Loop, Sequence Container, and File System task

Monitored day-to-day data loads of various SSIS packages

Worked with various Control Flow items (For Loop, Foreach Loop, Sequence, Execute SQL task, File System task) and Data Flow items (Flat File and OLE DB sources and destinations)

Designed and developed SSIS packages using different transformations, such as Conditional Split, Multicast, Union All, Merge, Merge Join, and Derived Column, and different types of data loading: direct, incremental, and SCD Type 2
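
SCD Type 2, listed above, keeps full history by closing the current row and inserting a new version whenever a tracked attribute changes; a stdlib-only Python sketch of the idea (column names are illustrative; the actual implementation used SSIS):

```python
def scd2_upsert(dim, incoming, key, batch_date):
    """Close the open row when an attribute changes, then insert the
    new version; an unchanged row is left alone (SCD Type 2)."""
    current = next((r for r in dim
                    if r[key] == incoming[key] and r["end_date"] is None), None)
    if current is not None:
        if all(current[k] == v for k, v in incoming.items()):
            return dim                     # no change: nothing to do
        current["end_date"] = batch_date   # close the old version
    dim.append({**incoming, "start_date": batch_date, "end_date": None})
    return dim

dim = [{"customer_id": 1, "city": "Dallas",
        "start_date": "2015-01-01", "end_date": None}]
dim = scd2_upsert(dim, {"customer_id": 1, "city": "Lewisville"},
                  "customer_id", "2016-03-01")
print(dim)
```

After the change of city, the table holds two rows for customer 1: the closed Dallas version and the open Lewisville version, so point-in-time queries remain possible.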

Transaction Process Executive – INTEQ BPO Pvt Ltd Services Feb 2009 – Dec 2012

Transaction Process Executive – Infinite BPO Pvt Ltd Services Feb 2007 – Feb 2009

PERSONAL SKILLS

Provide updates to top management and coordinate with the team

Confident, articulate, and professional speaking abilities (and experience)

Innovative professional; delivered and created multiple processes within the team, reducing manual work

Leadership - Managed teams of multiple resources

Playing a dual role of delivery lead as well as an individual contributor


