
Senior Data Engineer with 8+ Years of BI/Data Warehousing Experience

Location:
Winter Garden, FL
Salary:
150000
Posted:
January 25, 2026

Resume:

ISLOMBEK KOBILJONOV

689-***-**** - ******************@*****.*** - Orlando, FL

Senior Data Engineer with 8+ years of experience managing and processing large-scale data systems to deliver Business Intelligence and Data Engineering solutions. Adept at using technologies such as Spark, Python, and SQL to optimize and enhance processes, improve business performance, and provide a seamless user experience. Extensive knowledge of managing and processing data systems such as OLTP, OLAP, ODS, Data Lakes, and Data Warehouses. My goal is to deliver effective Business Intelligence and Data Engineering solutions that meet business requirements, using tools such as Azure Data Factory, Databricks, Azure Synapse, Power BI, Python, and SQL.

TECHNICAL SKILLS

Methodologies: Project Life Cycle (PLC), Systems Development Life Cycle (SDLC), Agile Project Management (SCRUM), Waterfall, Iterative

Database Programming: MySQL, MongoDB, SQL Server, PL/SQL, PostgreSQL, AWS Redshift

Data Modeling: Pivot tables, Snowflake schema, Star schema, VLOOKUP

Technologies/Tools: JIRA, Git, Apache Spark, Apache Kafka, Power BI, Tableau, SSIS, SSRS, SQL, SSAS, Data Modeling, Data Warehousing, Business Intelligence (BI)

Operating Systems: Windows, Linux

SDLC: Agile, Scrum, Waterfall

MS Office: Word, Excel, Outlook, PowerPoint, Visual Studio

Cloud Platforms: Microsoft Azure, Snowflake, Databricks, Azure Data Factory (ADF), AWS Glue, S3

Management/Collaboration Tools: Azure DevOps, SharePoint, Microsoft Teams, Webex, Zoom

Programming Languages: SQL, PL/SQL, T-SQL, Python, PowerShell, C#

WORK EXPERIENCE

SENIOR DATA ENGINEER, ERNST & YOUNG, ORLANDO, FL 07/2021 - PRESENT

• Worked with clients, stakeholders, and business users to implement various data systems like Data Warehouse, Data Mart, ODS, OLTP, and OLAP. Contributed to project scope, requirements gathering, data modeling, effort estimation, ETL design, development, system testing, implementation, and production support.

• Designed, developed, tested, and implemented Business Intelligence solutions using Data Warehouse, Data Mart, ETL, OLAP, and Client/Server applications.

• Created SQL views, stored procedures, functions, and indexes, and executed dynamic SQL queries across different servers based on inputs. Recommended new code development or reuse of existing code.

• Enhanced T-SQL stored procedure performance by adding clustered and non-clustered indexes, partitioning tables, and rewriting queries with CTEs, window functions, and joins.
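
For illustration, a minimal sketch of the kind of rewrite described above, using a CTE and a window function in place of a correlated subquery (server, table, and column names are hypothetical; the Python/pyodbc wrapper is only there to make the snippet runnable):

import pyodbc

# Connection string is a placeholder; adjust driver/server/database as needed.
conn = pyodbc.connect(
    "DRIVER={ODBC Driver 17 for SQL Server};"
    "SERVER=myserver;DATABASE=SalesDW;Trusted_Connection=yes;"
)

# The CTE plus ROW_NUMBER() replaces a per-customer correlated subquery,
# letting the optimizer satisfy the query with one scan of a supporting index.
query = """
WITH RankedOrders AS (
    SELECT CustomerID, OrderID, OrderDate,
           ROW_NUMBER() OVER (PARTITION BY CustomerID
                              ORDER BY OrderDate DESC) AS rn
    FROM dbo.Orders
)
SELECT CustomerID, OrderID, OrderDate
FROM RankedOrders
WHERE rn = 1;
"""

for row in conn.cursor().execute(query):
    print(row.CustomerID, row.OrderID, row.OrderDate)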

• Extensively used Informatica Designer to create a robust end-to-end ETL process with complex transformations such as Source Qualifier, Lookup, Update Strategy, Router, Aggregator, Sequence Generator, Filter, Expression, Stored Procedure, External Procedure, and Transaction Control for efficient data extraction, transformation, and loading to the staging area and then to the Data Mart (Data Warehouse), while verifying complex logic for computing facts and populating dimension tables.

• Implemented SCD Type1 and Type2 mapping to load and maintain Data Warehouse using MD5 and Surrogate keys.
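
For illustration, the same SCD Type 2 pattern expressed as a PySpark/Delta sketch rather than an Informatica mapping (table, column, and key names are hypothetical):

from pyspark.sql import SparkSession, functions as F
from delta.tables import DeltaTable

spark = SparkSession.builder.getOrCreate()

# Hash the tracked attributes so changed rows can be detected cheaply.
updates = (spark.read.table("staging.customer")
           .withColumn("row_hash", F.md5(F.concat_ws("|", "name", "city", "segment"))))

dim = DeltaTable.forName(spark, "dw.dim_customer")

# Step 1 (Type 2): expire the current version of any row whose attributes changed.
(dim.alias("d")
 .merge(updates.alias("u"), "d.customer_id = u.customer_id AND d.is_current = true")
 .whenMatchedUpdate(condition="d.row_hash <> u.row_hash",
                    set={"is_current": "false", "end_date": "current_date()"})
 .execute())

# Step 2: append new versions with a fresh surrogate key;
# monotonically_increasing_id() stands in for a real key generator here.
new_rows = (updates.join(spark.table("dw.dim_customer")
                         .where("is_current = true")
                         .select("customer_id", "row_hash"),
                         ["customer_id", "row_hash"], "left_anti")
            .withColumn("customer_sk", F.monotonically_increasing_id())
            .withColumn("is_current", F.lit(True))
            .withColumn("start_date", F.current_date())
            .withColumn("end_date", F.lit(None).cast("date")))
new_rows.write.format("delta").mode("append").saveAsTable("dw.dim_customer")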

• Worked on multiple data projects involving Data Factory, Logic Apps, Azure Functions Apps, Databricks, and Synapse Data Warehouse. Researched and implemented new technologies and design patterns for data ingestion, data warehousing, and data science.

• Developed ETL Pipelines in Azure Data Factory (ADF) using Linked Services, Datasets, and Pipelines to extract, transform, and load data from various sources such as Blob Storage and Cosmos DB into Azure SQL for further processing.

• Conducted structured stream processing to ingest flat data from JSON, generating data frames into the bronze table for further transformations in Azure Databricks using PySpark.
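
A minimal Structured Streaming sketch of that bronze ingestion step (storage path, schema, and table names are hypothetical):

from pyspark.sql import SparkSession
from pyspark.sql.types import StructType, StructField, StringType, TimestampType, DoubleType

spark = SparkSession.builder.getOrCreate()

# Streaming JSON reads need an explicit schema; it cannot be inferred at runtime.
schema = StructType([
    StructField("event_id", StringType()),
    StructField("event_ts", TimestampType()),
    StructField("amount", DoubleType()),
])

# Pick up raw JSON files as they land and append them to the bronze Delta table.
(spark.readStream
 .schema(schema)
 .json("abfss://landing@mystorageacct.dfs.core.windows.net/events/")
 .writeStream
 .format("delta")
 .option("checkpointLocation", "/delta/_checkpoints/bronze_events")
 .outputMode("append")
 .toTable("bronze.events"))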

• Created numerous pipelines in Azure Data Factory v2 to retrieve data from various source systems, utilizing different Azure Activities like Copy Data, Filter, For Each, and Get Metadata.

• Conducted data flow transformations using the Data Flow activity, and automated jobs using Event, Schedule, and Tumbling Window triggers in ADF.

• Built pipelines to transform the bronze table into the silver table using PySpark in Azure Databricks. Created various partitioned tables based on business requirements and stored them in Delta Lake for easy querying and consumption.
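
A companion sketch of that bronze-to-silver step, with the silver table partitioned for query pruning (names continue the hypothetical example above):

from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.getOrCreate()

# Deduplicate and cleanse the bronze data, deriving the partition column.
silver = (spark.read.table("bronze.events")
          .dropDuplicates(["event_id"])
          .filter(F.col("amount").isNotNull())
          .withColumn("event_date", F.to_date("event_ts")))

# Partitioning by date keeps downstream queries cheap to prune.
(silver.write
 .format("delta")
 .mode("overwrite")
 .partitionBy("event_date")
 .saveAsTable("silver.events"))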

• Led multiple Azure big data and data transformation implementations in industries like banking and financial services, high tech, and utilities.

• Implemented large Lambda architectures using Azure data platform capabilities, including Azure Data Lake, Azure Data Factory, Azure Data Catalog, HDInsight, Azure SQL Server, Azure ML, and Power BI.

• Developed Spark applications using Spark-SQL to extract, transform, and aggregate data from multiple file formats to analyze and uncover insights into customer usage patterns.

• Built Delta Lakes for time travel, enabling data versioning for rollbacks, full historical audit trails, and reproducible machine learning experiments, using Azure Databricks as a collaborative Spark-based platform on Azure.
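
Delta time travel itself is a one-line read option; a small sketch (path, version, and timestamp are hypothetical):

from pyspark.sql import SparkSession

spark = SparkSession.builder.getOrCreate()

# Re-read the table as it existed at an earlier version, e.g. for a rollback
# check or to reproduce the exact training data of an ML experiment.
v0 = spark.read.format("delta").option("versionAsOf", 0).load("/delta/silver/events")

# The same by timestamp:
snap = (spark.read.format("delta")
        .option("timestampAsOf", "2023-01-01")
        .load("/delta/silver/events"))

# Full audit trail of every write to the table:
spark.sql("DESCRIBE HISTORY delta.`/delta/silver/events`").show(truncate=False)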

• Imported data from Excel and SQL Server DB to Power BI to generate reports, created DAX queries to generate computed columns in Power BI, and published Power BI reports for organizational use, making dashboards available on web clients and mobile apps.

• Used Power BI Gateways to keep dashboards and reports up to date, created row-level security in Power BI, and integrated Power BI with the Power BI service portal.

• Developed using emerging Microsoft-related technologies like PowerApps, Flow, and Power BI, and worked on various transformations available in the Power BI query editor.

• Configured scheduled refresh in the Power BI service; used M Query and DAX to create custom measures, columns, and tables; and developed dashboards, analysis reports, and visualizations using DAX table, aggregation, and iteration functions for data manipulation.

• Designed and implemented data warehousing solutions using Snowflake, enabling scalable and efficient data storage and retrieval.

ETL BI DEVELOPER, PANDA EXPRESS, LOS ANGELES, CA 05/2018 - 07/2021

• Participated in requirements gathering, developing logical and physical data models, building fact and dimension tables, configuring ETL workflows, and creating cubes and reports for analysis.

• Developed multiple T-SQL stored procedures, triggers, and views, and made modifications to tables for data loading, transformation, and extraction.

• Composed complex T-SQL queries to validate data and ensure test results aligned with business requirements.

• Employed tools such as Execution Plan, SQL Profiler, and Database Engine Tuning Advisor to optimize queries and enhance database performance.

• Optimized stored procedures, T-SQL, views, and functions for peak performance.

• Designed ADF pipelines using Linked Services, Datasets, and Pipeline components to extract, transform, and load data from sources such as Azure SQL Database, Blob Storage, and Azure SQL Data Warehouse, as well as write-back targets.

• Migrated on-premises databases to Azure Data Lake stores using Azure Data Factory.

• Utilized Azure Key Vault to securely manage secrets and incorporated them into Azure Data Factory and Databricks notebooks.
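
A minimal sketch of that pattern inside a Databricks notebook, where dbutils and spark are provided by the runtime (scope, key, and connection details are hypothetical):

# A Key Vault-backed secret scope keeps the credential out of source control.
jdbc_password = dbutils.secrets.get(scope="kv-backed-scope", key="sql-dw-password")

# Use the secret directly in a JDBC read instead of hard-coding it.
df = (spark.read.format("jdbc")
      .option("url", "jdbc:sqlserver://myserver.database.windows.net:1433;database=SalesDW")
      .option("user", "etl_user")
      .option("password", jdbc_password)
      .option("dbtable", "dbo.Orders")
      .load())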

• Built pipelines incorporating Copy, Get Metadata, and Lookup activities, along with custom Azure Data Factory pipeline activities.

• Imported large and diverse datasets from different source systems into Azure Data Lake Gen2 using Azure Data Factory V2.

• Created ETL packages that handled data from sources like SQL Server, flat files, Excel files, and Oracle, transforming and loading it into target tables using SSIS.

• Managed metadata associated with the ETL processes populating the Data Warehouse.

• Utilized SSIS data transformations such as Slowly Changing Dimension (SCD), Aggregate, Sort, Multicasting, Conditional Split, and Derived Column.

• Incorporated additional SSIS data flow transformations, including Data Conversion, Export Column, Merge Join, Sort, Union All, and Conditional Split, into new and existing packages.

• Scheduled SSIS packages to run AM and PM feeds from multiple servers and resources to development servers.

• Created intricate SQL queries, optimized databases, and tuned long-running SQL queries using SQL Server Profiler and SQL Tuning Advisor.

• Developed and automated SSIS packages for ETL processes, extracting data from sources like flat files, MS Excel, CSV, and MS Access to SQL Server.

• Constructed complex SQL database objects, including stored procedures and User-Defined Functions (UDFs).

• Developed relational and dimensional database models by creating tables, views, indexes, and triggers in MS SQL Server. Wrote SQL scripts to manipulate data and ensure data integrity. Maintained a data warehouse.

• Designed and produced reports in MS SQL environment using SSRS, including parameterized, drill-down, and aggregation reports for daily, weekly, and monthly monitoring by managers. Used sub-reports for complex calculations and managed report scheduling, execution, and delivery using SSRS.

• Designed dynamic row-level security for different user levels with complex DAX functions; created custom measures, columns, and tables using DAX, M Query, and R in Power BI; scheduled dataset refreshes with notifications and alerts for published reports in the Power BI Service; and managed table interactions and relationships, implementing drill-down and drill-through for Power BI reports.

• Produced parameterized and drill-down reports using SSRS and wrote queries for report layouts and drill-down reports.

SSIS DEVELOPER, TUPPERWARE, ORLANDO, FL 05/2017 - 05/2018

• Participated in building a Data Warehouse using Star Schema and Data Marts.

• Cleaned and validated data in the staging area, standardizing formats, handling missing values, and transforming data by splitting or combining columns as specified.

• Utilized T-SQL to create database objects such as stored procedures, user-defined functions, and views in SSMS based on business specifications.

• Developed SSIS ETL packages to import data weekly from the OLTP system to the staging area, subsequently migrating it to the Data Warehouse and Data Marts.

• Executed incremental loading to migrate data from various sources to Data Warehouses and Data Marts, implementing slowly changing dimensions to manage historical data per requirements.

• Created multiple control flows in SSIS, including data flow tasks for migration or conversion, Script Tasks for variable management, Foreach Loop Containers for looping functions, and File Tasks for creating weekly records.

• Applied various transformations in SSIS, such as Data Conversion, Lookup, Conditional Split, Derived Columns, and Merge.

• Implemented end-to-end ETL pipelines using Azure Data Factory to extract, transform, and load data from various sources into Snowflake.

• Built scalable and optimized Snowflake schemas, tables, and views to support complex analytics and reporting needs.
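
For illustration, a sketch of that kind of Snowflake DDL and load using the Python connector (account, object, and stage names are hypothetical):

import snowflake.connector

conn = snowflake.connector.connect(
    account="myaccount", user="etl_user", password="<from-key-vault>",
    warehouse="ETL_WH", database="ANALYTICS", schema="DW",
)
cur = conn.cursor()

# Fact table clustered on the common filter column, plus a reporting view.
cur.execute("""
    CREATE TABLE IF NOT EXISTS fact_sales (
        sale_id NUMBER, customer_sk NUMBER, sale_date DATE, amount NUMBER(12,2)
    ) CLUSTER BY (sale_date)
""")
cur.execute("""
    CREATE OR REPLACE VIEW v_daily_sales AS
    SELECT sale_date, SUM(amount) AS total_amount
    FROM fact_sales
    GROUP BY sale_date
""")

# Bulk-load files staged by the ADF pipeline.
cur.execute("""
    COPY INTO fact_sales
    FROM @azure_stage/sales/
    FILE_FORMAT = (TYPE = CSV SKIP_HEADER = 1)
""")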

• Used Azure Blob Storage for efficient data file storage and retrieval, optimizing costs and security with compression and encryption.

• Integrated Azure Data Factory with Azure Logic Apps for orchestrating complex data workflows and event-based triggers.

• Implemented data replication and synchronization strategies between Snowflake and other platforms using Azure Data Factory and Change Data Capture.

• Developed and deployed Azure Functions for data preprocessing, enrichment, and validation in data pipelines.
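
A small sketch of such a validation step using the Azure Functions Python programming model (field names and rules are hypothetical):

import json
import azure.functions as func

def main(req: func.HttpRequest) -> func.HttpResponse:
    # Reject malformed records before they enter the pipeline.
    record = req.get_json()
    errors = []
    if not record.get("event_id"):
        errors.append("event_id is required")
    if record.get("amount") is not None and record["amount"] < 0:
        errors.append("amount must be non-negative")
    if errors:
        return func.HttpResponse(json.dumps({"errors": errors}), status_code=400)
    return func.HttpResponse(json.dumps({"status": "ok"}), status_code=200)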

• Designed and implemented data ingestion and storage solutions using AWS S3, Redshift, and Glue. Developed ETL workflows with AWS Glue to extract, transform, and load data into Redshift from various sources.
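
A sketch of that Glue-to-Redshift flow (catalog database, table, and connection names are hypothetical):

from pyspark.context import SparkContext
from awsglue.context import GlueContext
from awsglue.dynamicframe import DynamicFrame

glue_context = GlueContext(SparkContext.getOrCreate())

# Read raw S3 files through the Glue Data Catalog.
orders = glue_context.create_dynamic_frame.from_catalog(
    database="raw_db", table_name="orders")

# Light cleanup in Spark, then bulk-load into Redshift via a catalog connection.
cleaned = orders.toDF().dropna(subset=["order_id"])
glue_context.write_dynamic_frame.from_jdbc_conf(
    frame=DynamicFrame.fromDF(cleaned, glue_context, "cleaned_orders"),
    catalog_connection="redshift-conn",
    connection_options={"dbtable": "public.orders", "database": "analytics"},
    redshift_tmp_dir="s3://my-etl-bucket/tmp/",
)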

• Developed data sets, reports, and dashboards focused on end-user objectives and business processes using Tableau, ThoughtSpot, Power BI, and other reporting tools.

• Implemented incremental load and incremental refresh processes from OLTP systems to OLAP data sources for reporting purposes.
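
A minimal high-water-mark sketch of that incremental pattern (control table, source, and column names are hypothetical):

from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.getOrCreate()

# Watermark from the last successful run, kept in a small Delta control table.
last_loaded = (spark.read.table("etl.watermarks")
               .filter("table_name = 'orders'")
               .first()["last_modified"])

# Pull only rows changed since the previous load from the OLTP source.
changed = (spark.read.format("jdbc")
           .option("url", "jdbc:sqlserver://oltp-server;database=Sales")
           .option("query",
                   f"SELECT * FROM dbo.Orders WHERE modified_at > '{last_loaded}'")
           .load())

# Append to the OLAP-facing table and advance the watermark
# (assumes at least one changed row; guard the empty case in real use).
changed.write.format("delta").mode("append").saveAsTable("dw.fact_orders")
new_mark = changed.agg(F.max("modified_at")).first()[0]
spark.sql(f"UPDATE etl.watermarks SET last_modified = '{new_mark}' "
          "WHERE table_name = 'orders'")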


