NIMESH KUINKEL
Lead Data Engineer
San Francisco Bay Area, CA, 94547
******.********@*****.***
LinkedIn: https://www.linkedin.com/in/nimesh-kuinkel-78a622120/
Results-driven data engineering leader with over 12 years of hands-on experience in data engineering, ETL development, and machine learning workflows. Skilled in cloud platforms such as AWS and Azure and in BI stack tools, with a record of delivering large-scale data projects on time and within budget. Passionate about data analytics and committed to delivering excellent results, with a strong track record of leading cross-functional teams and driving business value through data-driven insights and decision making.
Skills
Tech Stack:
Python, SQL, Snowflake, Postgres, PySpark, MySQL, JIRA, Lucidchart, Apache Airflow, Docker, Hadoop, Kafka, Kubernetes, Kubeflow, AWS services (EC2, Glue, Lambda, Kinesis, MSK, MWAA, DynamoDB, Redshift, Flink, EMR, Athena, S3, IAM, Aurora), SQL Server, SSIS, SSRS, Tableau, Power BI, Hightouch Reverse ETL, Segment, Fivetran, dbt, Git, GitHub Actions.
Work History
CAIS (Capital Integration Systems) - Senior Lead Data Engineer 2021-12 – 2023-04
San Francisco, CA
• Successfully led a team of data engineers, BI developers, data scientists, business analysts, and QA engineers to deliver data warehousing and ETL projects for marketing, sales, and other business units.
• Responsible for building and modernizing data architecture, modeling, and pipelines for data analysis, reporting, and ML.
• Converted on-premises data pipeline architectures to cloud-based architectures, primarily using AWS services.
• Delivered large-scale data projects on time and within budget, resulting in improved data-driven decision making across the organization and a 20% increase in revenue.
• Developed and deployed Python scripts in AWS Lambda to extract API data and to clean and analyze large datasets.
• Conducted exploratory data analysis on large datasets using PySpark and Python data science libraries such as NumPy, Pandas, and Matplotlib.
• Modernized the data infrastructure with a microservices architecture, Fivetran, dbt, Snowflake, Segment, Tableau, Hightouch, AWS services, Airflow, Docker, and Kubernetes.
• Implemented data modeling, data governance and security protocols to ensure data quality and compliance with industry standards.
Autodesk - Technical Lead Data Engineer
2020-11 – 2021-11
San Francisco, CA
• Worked on the migration project from on-premises resources to AWS infrastructure.
• Analyzed on-premises resources, proposed a cost-saving solution on AWS, and completed the migration on a tight schedule and under the client's budget.
• Developed ETL pipelines using PySpark to extract data from relational databases, process it using Spark SQL and DataFrame APIs, and load it into a data warehouse for analysis.
• Converted legacy cURL scripts to Python scripts to extract API data and load it into a Snowflake database.
• Organized and participated in project planning sessions with project managers, business analysts, and team members to analyze business requirements and outline proposed solutions.
• Automated the data flow from data acquisition and transformation through loading and visualization.
Bay Alarm Company - Lead BI/Data Engineer
2019-06 – 2020-09
Concord, CA
• Analyzed the business pain points and converted the business requirements into technical specifications.
• Leveraged clustering and logistic regression techniques to build a predictive model pipeline on marketing data, covering data preprocessing, feature selection, hyperparameter tuning, model training, and model validation; accuracy improved by 80% compared to the rule-based model.
• Implemented and migrated the Snowflake database; performed data ingestion between AWS S3 and Snowflake.
• Wrote custom IAM policies using JSON.
• Automated ML and DL production models with Kubeflow and Docker.
Trinchero Family Estates - Sr. BI/Data Engineer
2018-03 - 2019-06
Napa, CA
• Supported team projects with technical expertise across all aspects of solution design and implementation, including requirements definition, data acquisition processes, data modeling, process automation, escalation procedures, construction, and deployment.
• Worked with wine sales and promotional materials data to build a model that predicts the most effective promotional scheme from historical figures, using Python libraries such as scikit-learn, pandas, and NumPy.
• Managed data and data requests to improve the accuracy of data and of the decisions made from data analysis.
• Communicated findings to team members, leadership, and stakeholders to ensure models were well understood and incorporated into business processes.
• Used Apache Airflow to programmatically schedule and monitor workflows.
UCOP - Data Engineer
2017-07 - 2018-03
University of California: Office of the President, Oakland, CA
• Explored and investigated different model types and techniques to improve machine learning performance.
• Participated in feature engineering and in defining data requirements for various machine learning models.
• Improved the accessibility of multi-sourced data across the team by designing and building a high-performance data warehouse on AWS to ingest data from various data sources.
• Worked with data scientists to extract data from AWS Redshift and build prediction models based on requirements using Python libraries.
• Wrote custom scripts to consume web APIs for tasks such as updating database tables based on service responses and processing the data.
• Created side-by-side bar charts, scatter plots, stacked bars, heat maps, filled maps, and symbol maps according to deliverable specifications.
E&J Gallo Winery - Sr. BI/Data Engineer
2016-06 - 2017-05
Modesto, CA
• Involved in finalizing database design, producing logical and physical models using ER/Studio, and implementing database objects as per the data model.
• Involved in implementation of database objects such as tables, indexes, functions, stored procedures, triggers, constraints, and views to support the Matter Management application.
• Involved in analysis and redesign of existing procedures to improve performance and reduce load run times.
Adaptive Dream Solutions - SQL/Data Engineer
San Francisco, CA
2010-10 - 2016-06 (Contractor)
• Worked on various projects across different domains as a contractor for Adaptive Dream Solutions, serving clients such as Navient, Otto Engineering, and Cognizant.
• Responsible for creating packages to validate, extract, transform, and load data into a centralized SQL Server using OLE DB providers from existing, diverse data sources.
• Used SSIS and T-SQL stored procedures to transfer data from OLTP databases to a staging area and finally into the data warehouse.
• Implemented multidimensional cubes with MDX calculations to support analysis needs and fast data retrieval.
• Created high-quality reports, dashboards, and analytics to meet the needs of business clients such as Operations, Sales, Customer Service, Finance, and Executives.
Education
Bachelor: Computer Science
2008 - 2009
University of South Alabama - AL
Information Technology
2009 - 2010
Bishop State Community College - AL
Bachelor of Science: Information Technology
2015 - 2017
Colorado State University Global – Online College
Certifications
• Administering Microsoft SQL Server 2012/2014 Databases
• Implementing a Data Warehouse with Microsoft SQL Server 2012/2014
• Information Systems Management and Architecture
• Network and Information Security Analysis
• Database Management Completion Certificate
• Network Administering and Configuration
• Johns Hopkins University Online courses - Coursera Certificate
• Getting and Cleaning Data with Python - Coursera certificate
• R Programming - Coursera certificate
• The Data Scientist's Toolbox - Coursera certificate
• Self-paced online courses completed:
• Machine Learning with Python course - Udemy
• AWS Solutions Architect course - Udemy
• Data Science with R course on updegree
• MCSA (Microsoft Certified Solutions Associate): SQL Server - Querying Microsoft SQL Server 2012/2014