
Data Warehouse Bricks

Location:
North Beach, FL, 33004
Posted:
May 13, 2024


Resume:

SUMMARY

Highly skilled, results-oriented professional with over fifteen years of experience in systems analysis, data warehouse development and data architecture with enterprise-class technologies, specializing in sophisticated Data Warehouse and BI solutions with complex logical and physical designs in cloud environments.

EDUCATION

MOSCOW INSTITUTE OF ELECTRONICS AND MATHEMATICS.

Master’s degree in Computer Science

TORONTO UNIVERSITY

Master’s degree equivalency

Various vendor courses and seminars, including IBM, Oracle and Microsoft.

Completed Pega Architect training for levels 1 and 2, including exams.

TECHNICAL SKILLS:

Cloud Computing and ML: expert knowledge of AWS, Azure, Python, Databricks

Development and network tools:

AWS S3, COPY utility, Databricks

Private and public setup of Azure services

ADF pipelines and data flows, Logic Apps and Function Apps

Azure networking: VNets and DNS

Databricks notebooks with Python and Scala.

Process integration and Message Streaming:

Confluent Cloud, Spark/Databricks Kafka consumer and producer APIs, Confluent and Spark connectors; message streaming topic performance optimization

ETL/ELT Tools: Python, Azure ADF pipelines and data flows, Matillion, SSIS.

Databases:

Spark Delta tables, Snowflake, Azure Synapse and Hyperscale SQL Servers.

On-prem SQL Server 2008 R2–2019:

Advanced T-SQL; expert knowledge of SQL query design, indexing, partitioning and performance optimization

Snowflake: design and optimization of database objects; data warehouse architecture specific to Snowflake – Data Vault, dimensional models, data lakes.

Oracle 11–12, PL/SQL, SQL tuning, advanced database design, database optimization and conversion

Performance measurement, capacity planning, dynamic SQL, table/index partitioning, materialized views, asynchronous queues

APIs: web services (SOAP/REST), Logic Apps and Function Apps

BI Reporting and Analytical Tools:

MS Power BI/Power Query, Tableau, SSRS, SSAS, Cognos 7/10: Framework Manager, Transformer cubes, Analysis Studio.

Business Process Modeling: Pega BPM, Pega PRPC 5.5–7.2

Data Architecture: Data Vaults, dimensional modeling (star schema), Inmon DW models, mixed models (CIF). Modeling tools: ER/Studio, Erwin.

Project Management Methodologies: Azure DevOps project setup and maintenance, JIRA

PROFESSIONAL EXPERIENCE

Dominion Computer

Consultant/Owner:

Elior North America June 2023 – April 2024

Brand-new design, development and successful implementation of a data warehouse featuring the load of more than a thousand tables and files from different sources, providing data for 70–100 BI dashboards.

ETLs: a framework of parameterized ADF and Logic Apps pipelines driven by a control table, with multiple jobs running in parallel.

Load sources: tables from SQL Server and Oracle; files from Smartsheet and SharePoint.

Designed and developed Logic Apps workflows for API-based data loads.

Back-end Snowflake: architected the load process with a staging merge process supporting full, incremental and time-scaled load modes.

Designed and developed Snowflake SQL stored procedures for the load process and Data Vault load transformations, orchestrated from ADF pipelines (see the illustrative sketch after this section).

Conducted multiple POCs and recorded training sessions for support personnel.

Designed and implemented Azure DevOps CI/CD YAML stage/job/step pipelines to promote ADF pipelines and run SQL scripts against UAT and Prod environments from a GitHub repository.
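As a rough illustration of the control-table-driven load pattern described above, the sketch below shows a Python step calling a Snowflake merge procedure once per control-table row. It is a minimal, hypothetical example, not the actual project code: the connection details, the ETL.LOAD_CONTROL table and the STG.MERGE_INTO_TARGET procedure are placeholder names.

# Illustrative sketch only: control-table-driven calls to a Snowflake merge procedure.
# All object names and connection details are hypothetical placeholders.
import snowflake.connector

conn = snowflake.connector.connect(
    account="my_account",      # assumption: real values would come from a secure config/key vault
    user="etl_user",
    password="***",
    warehouse="LOAD_WH",
    database="EDW",
)
cur = conn.cursor()
# Read the control table that lists which objects to load and in which mode.
cur.execute("SELECT source_table, target_table, load_mode FROM ETL.LOAD_CONTROL WHERE enabled = TRUE")
for source_table, target_table, load_mode in cur.fetchall():
    # Each row calls a stored procedure performing the staging merge
    # (full / incremental / time-scaled), mirroring what an ADF pipeline would orchestrate.
    cur.execute("CALL STG.MERGE_INTO_TARGET(%s, %s, %s)", (source_table, target_table, load_mode))
cur.close()
conn.close()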

Dominion Computer/TekPartners Partnership: Major Projects:

Royal Caribbean June 2022 – March 2023

Senior Consultant

Solo design and development of a Kafka streaming framework for Siebel/Salesforce integration

Complex ADF pipeline design, testing and promotion

Created Spark Kafka consumers and producers to receive XML messages routed through a TIBCO hub

Connected a Confluent Kafka cluster with an Azure Databricks cluster utilizing the Spark streaming API (see the sketch after this section)

Created Python classes for XML parsing logic and access to Spark streaming APIs

Trained RCCL developers to use Kafka streaming and apply pre-landing, Bronze, Silver and Gold layer concepts to the Kafka streaming process

Architected Delta tables to store parsed XML and JSON data containing various customer information

Presented Data Vault architectural concepts to lead developers

Created Python/Databricks/Spark producer classes to interact with Kafka topic queues.

Configured Confluent HTTPS connectors to link consumer topics to the Treasure Data CDP
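The following is a minimal, illustrative PySpark Structured Streaming sketch of the consumer side of such a framework on Databricks, not the actual RCCL implementation; the Confluent endpoint, credentials, topic name, parsed XML field and storage paths are hypothetical placeholders.

# Illustrative sketch only: consume XML messages from a Confluent Kafka topic on Databricks
# and land them in a Bronze Delta table. Connection details and names are assumptions.
from pyspark.sql import SparkSession
from pyspark.sql.functions import col, udf
from pyspark.sql.types import StringType
import xml.etree.ElementTree as ET

spark = SparkSession.builder.appName("kafka-xml-consumer").getOrCreate()

def extract_customer_id(xml_text):
    # Stand-in for the XML parsing classes; a real job would map the full payload.
    return ET.fromstring(xml_text).findtext("CustomerId")

extract_customer_id_udf = udf(extract_customer_id, StringType())

raw = (spark.readStream
       .format("kafka")
       .option("kafka.bootstrap.servers", "pkc-xxxx.confluent.cloud:9092")   # hypothetical endpoint
       .option("kafka.security.protocol", "SASL_SSL")
       .option("kafka.sasl.mechanism", "PLAIN")
       .option("kafka.sasl.jaas.config",
               "org.apache.kafka.common.security.plain.PlainLoginModule required "
               "username='API_KEY' password='API_SECRET';")                  # placeholder credentials
       .option("subscribe", "siebel.customer.xml")                           # hypothetical topic
       .load())

parsed = (raw.selectExpr("CAST(value AS STRING) AS xml_body")
          .withColumn("customer_id", extract_customer_id_udf(col("xml_body"))))

# Append the parsed stream to a Bronze Delta table; Silver and Gold refinements happen downstream.
(parsed.writeStream
 .format("delta")
 .option("checkpointLocation", "/mnt/bronze/_checkpoints/customer_xml")
 .outputMode("append")
 .start("/mnt/bronze/customer_xml"))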

Bayview Mortgage August 2021 – March 2022

Technical Project Lead

Conversion of an on-prem SQL data warehouse to a state-of-the-art Azure cloud-based platform

Synapse DW structure design and performance tuning with optimal clustered indexes and columnstores.

Azure Hyperscale SQL Server setup, performance tuning, and partition exchange

Design and implementation of major ADF pipelines and Data Flows

Designed and developed complex stored procedures used in ETLs and APIs

Optimized performance of existing stored procedures

Designed data retrieval APIs utilizing Azure Logic Apps and Web Apps

Troubleshot Azure services and VNet network connections

Established the entire project in Azure DevOps and led daily stand-ups

Created complex connections from Logic Apps to databases and Blob storage

Designed an ETL scheduling system based on MS schedules and Blob Storage triggers calling ADF pipelines (see the sketch after this section)

Conducted multiple POCs and training sessions for Logic Apps, ADF pipelines, and Databricks notebooks with Python and Scala.
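Below is a minimal, illustrative sketch of a Blob-trigger-driven scheduler of this kind, written as an Azure Functions (Python) handler that starts an ADF pipeline run; the subscription, resource group, data factory and pipeline names are assumptions, not the actual project resources.

# Illustrative sketch only: a Blob Storage trigger that starts a parameterized ADF pipeline.
# Assumes an Azure Functions blob-trigger binding; resource names are placeholders.
import os
import azure.functions as func
from azure.identity import DefaultAzureCredential
from azure.mgmt.datafactory import DataFactoryManagementClient

def main(myblob: func.InputStream):
    # Fires when a new file lands in the monitored container.
    adf = DataFactoryManagementClient(
        credential=DefaultAzureCredential(),
        subscription_id=os.environ["SUBSCRIPTION_ID"],
    )
    # Pass the blob path into a parameterized ADF pipeline (hypothetical names).
    adf.pipelines.create_run(
        resource_group_name="rg-dw",
        factory_name="adf-dw",
        pipeline_name="pl_load_from_blob",
        parameters={"sourcePath": myblob.name},
    )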

MEDNAX

Lead BI Developer/Architect Sep 2020 – August 2021

Designed and developed a new enterprise-class multilevel data warehouse utilizing Azure Synapse, ADF and Databricks, consisting of Source (Staging/Landing), Integration and Reporting levels with optimal data movement mechanisms between the layers

Introduced and implemented the concept of a dynamic master pipeline and Databricks notebooks to load around 500 on-prem tables and data lake files from different production system sources into the Synapse DW (see the sketch after this section).

Architected and designed DB control tables to drive dynamically configurable pipelines and Databricks notebooks for multi-object data movement.

Introduced and implemented the concept of progressive mapping of on-prem to source (staging) to integration DB objects across the different DW levels. This approach proved extremely productive in synchronizing the work of the developer group.

Developed templates for stored procedures and DW tables to streamline the developer group's work

Optimized distribution, clustering and indexing DDL parameters to improve DW performance

Developed Databricks notebooks with Python and Scala.
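The sketch below illustrates the dynamic, control-table-driven load pattern in a Databricks/PySpark notebook; the control table, JDBC connection strings, storage paths and the use of the Databricks Synapse connector are hypothetical stand-ins rather than the actual MEDNAX implementation.

# Illustrative sketch only: control-table-driven moves of on-prem tables into Synapse.
# All names, URLs and credentials below are placeholder assumptions.
from pyspark.sql import SparkSession

spark = SparkSession.builder.getOrCreate()

# Read the control table that lists which source objects to move and where they land.
control = spark.read.table("etl.load_control").filter("enabled = true").collect()

for row in control:
    # Pull one on-prem table over JDBC (real credentials would come from a secret scope).
    source_df = (spark.read.format("jdbc")
                 .option("url", "jdbc:sqlserver://onprem-host;databaseName=ERP")   # assumption
                 .option("dbtable", row["source_table"])
                 .option("user", "etl_user")
                 .option("password", "***")
                 .load())
    # Write to the Synapse DW via the Databricks Synapse connector, staging through ADLS.
    (source_df.write.format("com.databricks.spark.sqldw")
     .option("url", "jdbc:sqlserver://synapse-ws.sql.azuresynapse.net;databaseName=EDW")  # assumption
     .option("tempDir", "abfss://staging@dwlake.dfs.core.windows.net/tmp")
     .option("forwardSparkAzureStorageCredentials", "true")
     .option("dbTable", row["target_table"])
     .mode("overwrite" if row["load_mode"] == "full" else "append")
     .save())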

THERAPEUTICSMD

Lead BI Developer/Architect June 2019 - June 2020

Designed and developed a new enterprise-class data warehouse utilizing the Snowflake cloud data warehouse on the AWS platform.

Introduced the Data Vault design concept for a new Enterprise Data Warehouse to handle complexity and data volatility in the client’s analytical needs and business data. Responsible for the overall data architecture of the data marts and supporting structures.

Designed and developed data flows using Matillion as the ELT tool, with Snowflake and Oracle.

Provided training and guidance to team members on Snowflake data loading best practices using the ELT approach as well as tool-specific training for developers new to the tool.

Developed reports utilizing Tableau to visualize the data for end user insights

Full Time Employee:

OFFICE DEPOT

Lead Developer Apr 2015 - May 2019

Improved and troubleshot SSIS ETL packages transforming and providing product data to third-party customers. Built APIs in C# to invoke internal and external processes and pass parameters. Designed and maintained queries and T-SQL packages for business reporting and analysis of internal and vendor-provided data points and attributes.

Architectural design for data integration of multiple data sources for e-commerce content management, including SQL Server, Oracle and DB2 mainframe. Designed data structures for operational data stores utilizing snowflake and star schemas.

Designed and architected objects in AWS/Snowflake to be loaded from 3rd-party vendors into production systems. Responsible for analysis, design and development of SQL Server and Oracle based high-performance web content management ETLs. Successfully completed a number of projects that improved performance and flexibility, making product downloads 3-4 times faster.

Worked with the business to establish MDM structures, data cleansing, a knowledge repository and rules.

Administered and improved performance of Oracle EDQ software, reducing the cleansing cycle from 2-3 days to 3-4 hours.

Amazon and Salesforce cloud computing: created ETLs to load data from Salesforce to Amazon S3 storage and to Oracle content management systems.

AMERICAN EXPRESS

Lead Data Architect Aug 2010 - Apr 2015

Led a team of on-shore and off-shore database professionals to develop database structures and data marts for reporting/analytical solutions for AMEX World Services portfolios. Working in close partnership with the WSDW application team, we tripled the data volume of the WSDW and doubled its application areas over the three years of my leadership and architectural design. Had a leading role and expertise in Pega PRPC data classes and their interface to DW ETL processes. Was a major architect in moving the World Services DW framework from a two-level model to a Corporate Information Factory architecture, creating WSDW development standards and guidelines. In partnership with physical DBAs, introduced DPF key optimization, MDC (multidimensional clustering) optimization and the use of MQT materialized views, resulting in significant performance improvements for the World Services Data Warehouse and data mart reporting and ETL processes.

On the operational side of AMEX World Services, I led architectural efforts on the major portfolios: World Services Portal, Claims and Disputes, Global Communication Services and the New Accounts Service Portal. I led the architectural part of projects to create and enhance the logical and physical data assets of these portfolios to meet new business requirements for AMEX's major Blue Print initiatives. I introduced and led efforts to improve data architecture for Pega PRPC data structures, providing the development team and management with a better view of the data assets in the Pega development platform to reduce data redundancy and improve data quality.

Other portfolios I was responsible for: Exchange Media, Project Universal, Pega Front End, Global Communication Manager, and Risk and Credit Bureaus.

NCL - NORWEGIAN CRUISE LINE

Manager of Data Warehouse and Business Intelligence Mar 2006 - May 2010

Managed multiple projects to develop and maintain NCL’s data warehouse utilizing project management best-practice processes and techniques. Estimated and planned resources, cost, effort and duration of product deliverables. Provided leadership, technical guidance, training and mentoring to BI developers.

Responsible for architectural design, database management and development of NCL BI/Data Warehouse applications:

Actively involved in day-to-day development activities, database administration, database performance optimization, architectural design, project reviews, troubleshooting and problem solving.

Defined, developed and maintained data warehouse structures at the logical and physical levels utilizing SQL Server and Oracle databases.

Responsible for architectural design and development of data models for BI data delivery utilizing MicroStrategy, Framework Manager and Transformer.

Responsible for supporting interfaces for product/pricing availability data for NCL ecommerce applications.

Developed and maintained multiple ETL processes utilizing PL/SQL, asynchronous queues, SQL Loader and Informatica. Specialized in the design and implementation of multithreaded and queued ETLs.

Developed ETL procedures utilizing ASP.net, C# and SSIS components to maintain complex support/control data tables.

Designed and built multidimensional cubes for finance, revenue and sales analysis utilizing Framework Manager and Transformer. Optimized, deployed and scheduled Cognos cubes and established user interfaces utilizing Cognos Report Studio and MicroStrategy.

Defined data warehouse reporting requirements and project scope with internal customers. Conducted JAD sessions with technical and business stakeholders to determine business information requirements. Owned the data acquisition process and the improvements needed to meet evolving customer needs.

Responsible for small, medium and long-term projects. Established an outsourcing plan for off-shore programming vendors.

Coordinated and was responsible for the off-hours data warehouse production support team (24x7 production support).

Responsible for development and maintenance of the CRM (Siebel, PeopleSoft) and data warehouse interfaces.

Responsible for systems of record change impact analysis and implementation

Managed Scorecard/Dashboard project development of a ship-to-shore data collection system using SQL Server to Oracle server communication, utilizing SSIS, Cognos Metric Studio/Metric Designer and Event Studio.

Established standards and procedures for QA control of development projects. Coordinated data standards, common data definitions and standard data management practices. Reviewed and recommended new technology for data warehouse ETL and data delivery components.

Designed and developed change management procedures. Responsible for SOX regulation compliance.

ASSURANT GROUP, Miami

Senior System Analyst/Data Warehouse 2004 - 2006

One of the largest international insurance and risk management companies.

Responsible for building the insurance DW architecture and BI data delivery reports and cubes for over 1,000 users with over 30 TB of data.

Built an interface to the claims processing system utilizing J2EE technology: Java, WebSphere

Working with actuaries to collect and analyze system requirements for data modeling

Developed data warehouse ETL procedures utilizing PL/SQL, asynchronous queues, Informatica, MicroStrategy and UNIX scripts

Data warehouse analytical table design and development utilizing PL/SQL and UNIX scripts.

Set up Data Warehouse reporting structures utilizing Crystal Enterprise, Hyperion Essbase with multidimensional cubes, Hyperion Analyzer and Cognos tools for data delivery.

Converted Hyperion Essbase Cubes to Cognos Cubes utilizing PowerPlay Transformer.

Work with Hyperion software engineers to set up a direct load process between Oracle and Essbase databases

Design and optimization of aggregate storage for multidimensional OLAP cube

Design and analysis of XML/XSD Messaging Hub utilizing IPEDO software.

Oracle database installation, administration, analysis and tuning.

Troubleshooting Call Center Operation based on Oracle/Crystal Enterprise Reporting system

Personal: US and Canadian citizen.


