Data Engineer – SQL Server

Location: Miami Beach, FL
Salary: 120000
Posted: September 13, 2023

Resume:

Summary

ROBIOLA JORGE ANDRES

Data Engineer – Data Architect – Cloud Data DevOps – Project Manager Professional

Personal Info:

Address: **** ******* *** ****, *****, FL, 33140

Mobile: +1-786-***-****

Email: adzonv@r.postjobfree.com

About Myself:

I am an independent professional with broad experience in the following IT areas: Business Intelligence Solutions, Cloud Architecture and Data Ingestion, Data Engineering, Data Warehousing Projects, DevOps Automation, ETL Development, Hadoop Big Data Ingestion, and Visualization Design and Development.

As a leadership and management professional, I have worked on several DR projects and obtained the Project Management Professional (PMP) certification in 2010, the ITIL Foundation V3 certification in 2011, and an IT Scrum Master course in 2014.

Table of Contents

Summary
Skills Set by Year
Courses and Certifications
Work Experience (last 7 years)
  Company (EY)
  Company (Paypal)
  Company (Enve)
  Company (HTA)
  Company (Prokarma)
  Company (Zenfolio)
  Company (Lighthouse)
  Company (Kestra)
Education
Languages

Skills Set by Year

Environment | Owner | Knowledge Area | App/Service Name | Years
Cloud | Azure | Database | SQL | +4
Cloud | Azure | ETL/ELT | Data Factory | +4
Cloud | Azure | Database | Synapse DW | +4
Cloud | Azure | Storage | Blob Storage / Data Lake | +4
Cloud | Azure | Serverless | Functions | +1
Cloud | Azure | Containerization | AKS | +1
Cloud | Azure | Orchestrator | Logic Apps | +2
Cloud | Azure | ETL/ELT | Synapse Analytics Notebook | +3
Cloud | Azure | Visualization | Power BI | +2
Cloud | Azure | Computing | Virtual Desktop | +2
Cloud | Azure | Database | Cosmos DB | +1
Cloud | Azure | Front-End | PowerApps | +1
Cloud | Amazon | Computing | EC2/EMR | +4
Cloud | Amazon | Storage | S3 Bucket | +4
Cloud | Amazon | Serverless | Lambda | +2
Cloud | Amazon | Messaging | SQS, SNS | +2
Cloud | Amazon | Streaming | Kinesis | +1
Cloud | Amazon | Orchestrator | Glue | +1
Cloud | Amazon | Database | RDS PostgreSQL, Aurora DB | +4
Cloud | Amazon | Database | Redshift | +2
Cloud | Amazon | Database | DynamoDB | +2
Cloud | Amazon | ETL/ELT | Step Functions | +1
Cloud | Amazon | Infrastructure | VPC, IAM, API Gateway | +4
Cloud | Amazon | Containerization | ECS, Fargate, KMS | +1
Cloud | Google | Computing | Virtual Machine | +2
Cloud | Google | Storage | Cloud Storage | +2
Cloud | Google | Database | BigQuery, BigTable | +4
Cloud | Google | Serverless | Cloud Functions | +2
Cloud | Snowflake | DW / MPP | Snowflake | +4
Cloud | Databricks | DW / MPP | Databricks | +3
Cloud | Confluent | Streaming | Kafka | +1
Cloud | Fivetran | ETL/ELT | Fivetran | +1
Cloud | Google | ETL/ELT | Firebase | +1
N/A | N/A | Programming | Python | +5
N/A | N/A | Programming | Spark Scala | +3
N/A | N/A | Programming | C# | +10
N/A | N/A | Programming | JavaScript | +4
N/A | N/A | Programming | PowerShell | +5
N/A | N/A | Programming | Linux Bash | +2
N/A | N/A | Programming | SQL | +20
Cloud | Infrastructure | Provisioning | Terraform Hashicorp | +3
Cloud | Infrastructure | Provisioning | Vagrant Hashicorp | +3
Cloud | Infrastructure | Provisioning | Packer Hashicorp | +3
Hybrid | Infrastructure | Provisioning | Ansible | +2
Hybrid | Infrastructure | Containerization | Docker Swarm, Kubernetes | +3
Hybrid | Apache Hadoop | Big Data | HDFS, Zookeeper, Hive, Pig, Spark Scala, Kafka, Impala, HBase | +3
Hybrid | Integration | ETL/ELT, Orchestration | Jenkins, Airflow | +2
Hybrid | Integration | ETL/ELT, Orchestration | Active Batch | +3
Hybrid | Integration | ETL/ELT, Orchestration | Informatica PowerCenter | +2
Hybrid | Tableau | Visualization | Tableau | +4
Hybrid | Microsoft | Visualization | Power BI | +2
Hybrid | Microsoft | Database | SQL-Server | +20
Hybrid | Qlik | Visualization | Qlik Sense | +2
Cloud | Google | Visualization | Looker | +1
Cloud | N/A | NoSQL Database | MongoDB | +2
Cloud | N/A | NoSQL Database | Neo4J | +1
Hybrid | N/A | Database | PostgreSQL | +4

Courses and Certifications

Period | Course | Institute
2022 | Neo4J Environments | Udemy
2022 | Streaming Kafka Development | Cognitive Class
2021 | Data Engineer for GCP Platform | Udemy
2021 | Python for Databricks | Udemy
2020 | Docker Swarm & Kubernetes | Cognitive Class
2020 | Azure AZ-900 Fundamentals | Microsoft Academy
2020 | AWS Cloud Practitioner | Amazon
2018 | Hadoop Ecosystem Developer | SimpliLearn
2018 | Introduction to Python Data Science | Cognitive Class
2017 | Introduction to Statistical Analysis Using R | Cognitive Class
2016 | Spark Fundamentals I | Cognitive Class
2015 | Microsoft SQL-Server 2012 Administering (70-462) | Microsoft Academy
2014 | Microsoft SQL-Server 2012 Programming (70-461) | Microsoft Academy
2014 | Scrum Master | UTN Argentina
2013 | Introduction to Hadoop Big Data and Hive | SimpliLearn
2012 | QlikView Development and Design | Qlik Community
2011 | ITIL Foundation V3 Best Practices | IBM Corp
2010 | Project Management Professional | PMI
2007 | Microsoft Certified Database Administrator | Microsoft Academy

Work Experience (last 7 years)

Company (EY): EY Inc.
Position and modality: Data Engineer Associate Director (Contractor)
Period: 2021-12 – Current

Description:

Working on the DW Assurance, EY Data Fabric, and Billing Hub projects as both Data Architect and Technical Data Engineer, maintaining an existing DW hosted in Azure and re-designing several web applications backed by Databricks and NoSQL databases.

Skills:

Billing Hub Project:

Azure Synapse DW

Databricks as a cloud-based DW

Databricks Delta Lake for streaming data consumption

MongoDB as a NoSQL database

PySpark (Python + Spark) in Databricks Notebook as programming language for ELT
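
As a rough illustration of the Delta Lake streaming ELT pattern listed above, the sketch below shows how such a PySpark step might look in a Databricks notebook; the table names, columns, and checkpoint path are hypothetical placeholders, not the actual Billing Hub objects.

```python
# Minimal sketch, assuming a raw Delta table of billing events and a curated
# target table; names and paths are illustrative only.
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.getOrCreate()  # provided automatically inside Databricks

# Incrementally read new billing events from a (hypothetical) raw Delta table
raw_events = spark.readStream.format("delta").table("raw.billing_events")

# Light ELT: typing, a load timestamp, and de-duplication on the event key
curated = (
    raw_events
    .withColumn("amount", F.col("amount").cast("decimal(18,2)"))
    .withColumn("load_ts", F.current_timestamp())
    .dropDuplicates(["event_id"])
)

# Append the curated stream into the Delta table consumed by the DW layer
(
    curated.writeStream
    .format("delta")
    .option("checkpointLocation", "/mnt/checkpoints/billing_events")
    .outputMode("append")
    .toTable("curated.billing_events")
)
```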

Web App Solutions:

GitHub Repository

Docker and Kubernetes pods used to automate data ingestion processes

Airflow Orchestrator

Azure Synapse DW

Azure Synapse Analytics Serverless

PySpark (Python + Spark Scala) in Spark Pool

Neo4J as a NoSQL database

DW Assurance Project:

Microsoft Azure as Cloud Infrastructure

Azure Synapse DW

Azure Synapse Analytics Serverless

Azure Data Factory (ADF)

AZ SQL Database

SQL-Server 2019

Azure DevOps

Azure Blob Storage and Data Lake Gen2

Azure Logic Apps

Python 3.x as programming language for ELT

Company (Paypal): PayPal Inc.
Position and modality: Data Engineer, Data Analyst, Tableau and Amplitude Developer (Contractor)
Period: 2021-04 – 2022-12 (working in parallel with EY and Enve Labs, as there was no conflict of interest)

Description:

Working for the PayPal Analytics Payment Checkout department to identify and analyze incidents that occurred during the payroll process in mobile apps, web apps, and 3rd-party vendor applications, generate a data pipeline to automate the data analysis, and finally develop a visualization dashboard.

Skills:

Analytics Payment Checkout Process:

Hadoop Kafka streaming pipelines to ingest data into GCP external BigQuery datasets (see the sketch after this list)

Hadoop Big Data Hive as a NoSQL database for SSOT external tables generated in GCP BigQuery

PySpark (Python + Spark Scala) in GCP

Teradata MPP Database

Airflow Orchestrator

GCP Cloud Storage

GCP BigQuery under Jupyter Notebook as a cloud-based DW

Tableau Server 2021.3

Amplitude Product Analytics

GCP Looker Reporting platform
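
The Kafka-to-BigQuery ingestion mentioned above can be sketched roughly as follows: a PySpark structured-streaming job reads a Kafka topic and lands Parquet files in Cloud Storage, where an external BigQuery dataset picks them up. The broker, topic, bucket, and schema below are assumptions for illustration, not the actual PayPal pipeline.

```python
# Minimal sketch, assuming a JSON-encoded Kafka topic of checkout incidents.
from pyspark.sql import SparkSession, functions as F
from pyspark.sql.types import StructType, StructField, StringType, DoubleType

spark = SparkSession.builder.appName("checkout-incidents-landing").getOrCreate()

incident_schema = StructType([
    StructField("incident_id", StringType()),
    StructField("channel", StringType()),   # mobile / web / 3rd-party vendor
    StructField("amount", DoubleType()),
])

incidents = (
    spark.readStream.format("kafka")
    .option("kafka.bootstrap.servers", "broker:9092")  # placeholder broker
    .option("subscribe", "checkout-incidents")          # placeholder topic
    .load()
    .select(F.from_json(F.col("value").cast("string"), incident_schema).alias("e"))
    .select("e.*")
)

# Land as Parquet in Cloud Storage; an external BigQuery table reads this path
(
    incidents.writeStream.format("parquet")
    .option("path", "gs://example-bucket/checkout_incidents/")
    .option("checkpointLocation", "gs://example-bucket/_chk/checkout_incidents/")
    .start()
)
```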

Company (Enve): Enve Labs LLC.
Position and modality: Data Engineer, Data Architect, Data Cloud DevOps (Freelancer, not FTE)
Period: 2021-01 – Current (working in parallel with PayPal and EY, since the POC projects do not need to be completed within the daily timeframe)

Description:

Working on different projects that require technical skills in both data governance and DevOps infrastructure. The work is performed as a freelancer, attending only weekly meetings with the customer and making presentations for the deliverables.

Skills:

AWS services used for a financial project (15 months) (Mariano Buglione as manager):

Lambda functions to collect API requests, Glue ETL catalogs and crawlers, and Active Batch Cloud as the job orchestrator (a minimal Lambda sketch follows this list).

Streaming pipelines using Kafka Confluent plus Python to pull down real-time intraday data every 5 minutes.

EC2 and EMR to run Python code and Docker Swarm microservices.

CloudWatch to monitor and send notifications.

S3 buckets for a Data Lake Model (RAW, Silver, and Golden Areas)

PySpark (Python 3.X and Spark Scala) to execute Data Ingestion pipelines.

Snowpipe and Snowflake cloud-based databases for data warehousing models

RDS (PostgreSQL, Aurora DB, SQL-server) for relational database support

Athena SQL to read large volumes of files from the S3 bucket data lake model

Redis and DynamoDB for a large-volume, low-latency data warehousing model

AWS Redshift as a cloud-based DW

Python Data Build Tool (dbt) and dbt-expectations, plus Monte Carlo observability software

Tableau and Power BI as the visualization model

ERP NetSuite SCMS (Supply Chain Management Software) as a data source for further visualization and data analysis, plus RapidMiner as a comparative analytics dashboard system
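
The Lambda collectors referenced at the top of this list can be sketched as below: a small Python handler that pulls one API snapshot and drops it, unmodified, into the RAW zone of the S3 data lake. The API URL, bucket, and key layout are placeholders assumed for illustration.

```python
# Minimal sketch of a RAW-zone collector; the real job runs on a 5-minute schedule.
import json
import urllib.request
from datetime import datetime, timezone

import boto3

s3 = boto3.client("s3")
RAW_BUCKET = "example-datalake-raw"           # placeholder bucket
API_URL = "https://api.example.com/intraday"  # placeholder market-data endpoint


def handler(event, context):
    # Pull one intraday snapshot from the upstream API
    with urllib.request.urlopen(API_URL, timeout=30) as resp:
        payload = json.loads(resp.read())

    # Partition the RAW zone by ingestion date and time
    now = datetime.now(timezone.utc)
    key = f"intraday/dt={now:%Y-%m-%d}/{now:%H%M%S}.json"
    s3.put_object(Bucket=RAW_BUCKET, Key=key, Body=json.dumps(payload).encode("utf-8"))

    return {"status": "ok", "object_key": key}
```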

Hygiene and Health project for Reckitt (6 months) (Venkata Sandeep Pabbaraju as manager):

Re-designed a data pipeline in Azure Databricks linked to different DW systems surfaced in Power BI

Databricks Delta Lake for streaming data

Databricks as a cloud-based DW

PySpark (Python + Spark) in Databricks Notebook as programming language for ELT

Azure Logic Apps Orchestrator

Azure Blob Storage and Data Lake Gen2

Azure Functions

Azure Data Factory (ADF)

Azure Power BI as dashboard visualizer

e-Commerce scraping project for Publicis Groupe (6 months) (Kristina Kaganer as manager):

Collect information from the best-known shopping web portals in the US, UK, and EA to compare prices and reviews made by users there.

PySpark (Python + Spark) as programming language for scraping.

GCP Virtual Machine

GCP Cloud Storage

GCP Cloud Functions

Airflow Orchestrator

GCP BigQuery as a cloud-based DW

Company (HTA): High Tower Advisors / Tradehelm Inc.
Position and modality: Data Engineer, Data Architect, ETL Developer (Contractor)
Period: 2018-09 – 2021-02

Description:

Working for HTA, a financial company located in Chicago, IL, as a Data Engineer, Data Architect, and ETL Developer, supporting a DW system migrated from on-premises host servers to a cloud environment hosted in AWS.

Skills:

HTA Data Model Team / Cloud systems

Snowflake as a cloud-based DW

AWS EC2 / EMR

AWS IAM / VPC

Snowpipe / AWS SNS / AWS S3

AWS Glue ETL Orchestrator

AWS Lambda / SNS / S3
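
As a rough sketch of the Snowflake loading path listed above (which Snowpipe automates from S3 event notifications), the snippet below uses the Snowflake Python connector to copy staged S3 files into a table. The account, credentials, stage, and table names are placeholders, not HTA objects.

```python
# Minimal sketch, assuming an external stage that points at the S3 bucket.
import snowflake.connector

conn = snowflake.connector.connect(
    account="example_account",
    user="example_user",
    password="***",
    warehouse="LOAD_WH",
    database="EXAMPLE_DW",
    schema="STAGING",
)

try:
    cur = conn.cursor()
    cur.execute(
        """
        COPY INTO STAGING.TRADES
        FROM @TRADES_STAGE
        FILE_FORMAT = (TYPE = CSV SKIP_HEADER = 1)
        ON_ERROR = 'CONTINUE'
        """
    )
    print(cur.fetchall())  # one status row per loaded file
finally:
    conn.close()
```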

HTA Data Model Team / On-premises

MS SQL-Server 2019

MS SQL-Server SSIS / SSRS

Active Batch (ABAT) as ETL Orchestrator

MS PowerShell as low-level code for FTP connections

C# as programming language for SSIS

VB as programming language for SSRS

Python 3.x as programming language for AWS Lambda and AWS ETL processes

PySpark (Python + Spark Scala) for packages running over AWS EMR

JavaScript as programming language for Snowflake Stored Procedure and Functions

Company (Prokarma): Prokarma Inc.
Position and modality: BI Solutions Architect, Tableau Developer (Freelancer)
Period: 2018-07 – 2019-06

Description:

For a health-care company, replace the existing underlying database and ELT environment (PostgreSQL and IBM DataStage) with the Cloudera Distribution for Hadoop as the backend environment, repointing the existing Tableau Server model to the new solution without changing the original dashboards and keeping the same information.

Skills:

Backend and Visualization Systems

AWS RDS PostgreSQL database

IBM DataStage ETL orchestrator

Cloudera Hadoop Hive for External QL Tables

Cloudera Hadoop Pig for ELT process

Cloudera Impala as MPP database

Hadoop Kafka using Java packages to subscribe consumer pipelines that ingest into Hive tables.

Spark Scala as programming language to design External tables.

Tableau Server as visualization tool
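
The external-table work described above can be illustrated with a small PySpark job (shown in Python rather than the Scala used on the project): it declares an external Hive table over HDFS files and queries it for downstream ELT. The database, columns, and paths are illustrative assumptions.

```python
# Minimal sketch of an external Hive table over raw HDFS files.
from pyspark.sql import SparkSession

spark = (
    SparkSession.builder
    .appName("external-table-sketch")
    .enableHiveSupport()
    .getOrCreate()
)

spark.sql(
    """
    CREATE EXTERNAL TABLE IF NOT EXISTS clinical.claims_raw (
        claim_id STRING,
        member_id STRING,
        amount DOUBLE,
        service_date STRING
    )
    ROW FORMAT DELIMITED FIELDS TERMINATED BY ','
    LOCATION 'hdfs:///data/raw/claims/'
    """
)

# Downstream ELT reads the external table like any other Hive table
claims = spark.table("clinical.claims_raw")
claims.groupBy("member_id").sum("amount").show()
```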

Company (Zenfolio): Zenfolio Inc.
Position and modality: DW Solutions Architect, Sisense Developer (Contractor)
Period: 2017-06 – 2018-05 (working in parallel with Lighthouse Investment Partners, as there was no conflict of interest)

Description:

For a professional photography marketplace, I created a common repository used to extract and transform the information coming from different applications and to load the final DW model into a visualization tool.

Skills:

On-premises servers:

SQL-Server 2016 database

SQL-Server SSIS ETL tool

Oracle Financials ERP System

Cloud Environment:

AWS VPC + IAM

AWS Lambda + S3 + AWS Redshift serverless

Sisense Elasticube for the Visualization platform

Company (Lighthouse): Lighthouse Investment Partners Inc.
Position and modality: ETL Team Leader, Data Analyst (Freelancer)
Period: 2016-11 – 2018-05 (working in parallel with Kestra Financial, as there was no conflict of interest)

Description:

Built a flexible ETL platform to integrate information coming from different risk-management applications (Axioma, Thomson Reuters, Beta, Barra, SGD, among others) and present the principal statistics in an existing reporting platform. Worked with a machine-learning model built in R Analytics, later replaced by a Python ML model and loaded as an automated process.
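
The R-to-Python migration mentioned in the description can be pictured with the small sketch below: fit a model on historical risk statistics, persist it, and have the automated run score the latest extract. The feature names, file paths, and model choice are illustrative assumptions rather than the actual Lighthouse model.

```python
# Minimal sketch of the automated Python ML step, assuming CSV extracts
# produced by the ETL platform.
import pandas as pd
from joblib import dump, load
from sklearn.ensemble import RandomForestRegressor

FEATURES = ["exposure", "beta", "volatility"]   # placeholder feature columns
TARGET = "expected_return"                      # placeholder target column

# Training step: fit on history and persist the model for reuse
history = pd.read_csv("risk_history.csv")
model = RandomForestRegressor(n_estimators=200, random_state=42)
model.fit(history[FEATURES], history[TARGET])
dump(model, "risk_model.joblib")

# Automated scoring step: load the persisted model and score the latest extract
latest = pd.read_csv("risk_latest.csv")
latest["predicted_return"] = load("risk_model.joblib").predict(latest[FEATURES])
latest.to_csv("risk_scored.csv", index=False)
```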

Skills:

On-premises servers:

SQL-Server 2016 database

SQL-Server SSIS ETL tool

Bloomberg API system model

Axioma Web Service

Thomson Reuters Database

R analytics tool for machine learning on stock-exchange data

Python 3.X programming language for ML model

Cloud Environment:

Azure SQL Database as mock

Azure Data Factory as mock

Company (Kestra): Kestra Financial Inc.
Position and modality: DW Architect (Contractor)
Period: 2016-11 – 2017-08

Description:

Created an ETL process and database design used to integrate different CRM and financial management applications (eMoney, ERP NetSuite, Bonaire, Salesforce CRM) into an existing data warehouse model, updating the current reporting platform and studying the migration to Power BI tools mounted on a cloud environment.

Skills:

On-premises servers:

Microsoft SQL-Server 2008

MS SSIS 2008/2012 ETL Tool

MS SSRS 2008/2012

C# as a programming language for SSIS

VB as programming language for SSRS Platform

Power BI Tool as the new Reporting Platform

Education

Period | Study Subject | Place | Status
2008-2010 | Project Management Professional | IBM Inc. / Project Management Institute | Certified
2006-2008 | IT System Engineer | Universidad Abierta Interamericana | 4 subjects remaining to complete the degree
2000-2002 | IT System Analyst | Universidad Abierta Interamericana | Diploma

Languages

Language | Level
English | Advanced (C2), fluent written and verbal
Portuguese | Basic (B1), written and verbal
Spanish | Native


