
Data Engineer Real Time

Location:
Atlanta, GA
Posted:
August 08, 2023


Bernard Bawak

Data Engineer

adyr5q@r.postjobfree.com

PROFILE

Multi-talented software developer with knowledge of several programming languages, data warehouses, and cloud technologies. A true team player with strengths in adaptability and accuracy.

EMPLOYMENT HISTORY

February 2022 — Present Data Engineer, Trace3

Developed end-to-end data pipelines using Snowflake, integrating various data sources, and ensuring smooth data flow into the data warehouse from AWS.

Configured Kafka source connectors for migration of data from Amazon RDS Postgres into Kafka topics.

Cloned production databases for testing.

Created and maintained documentation for data pipelines, ETL processes, and data models using DBT docs, enabling efficient knowledge sharing and onboarding of new team members.

Implemented data security measures, including role-based access control and data encryption, ensuring compliance with data privacy regulations.

Led a team in the transformation of data using DBT in Snowflake to improve profit market analysis.

Partnered with the CTO and Director of Data Engineering to design end-to-end data pipelines for clients.

Designed and developed data pipelines in Snowflake using streams and tasks, enabling stakeholders to make accurate, real-time decisions.

Designed end-to-end data pipelines in Azure Data Factory to extract, transform, and load data from various sources into Azure Data Lake Storage and Azure SQL Data Warehouse.

Developed and maintained ETL processes using Azure Synapse Data Flows, performing data transformations and aggregations on large datasets.

Collaborated with data architects, analysts, and business stakeholders to understand data requirements and translate them into efficient ETL workflows.

Implemented data transformation logic using Azure Data Factory activities, data flows, and transformations, resulting in streamlined data processing and reduced ETL execution time.

Worked with Azure DevOps for version control and CI/CD pipelines to ensure smooth deployment and maintenance of data pipelines.

Designed and developed end-to-end data pipelines in Azure Databricks, extracting data from various sources, transforming it using Spark transformations, and loading it into Azure SQL Data Warehouse for analytics.

Collaborated with data scientists and analysts to understand data requirements and implemented scalable solutions for real-time and batch data processing.

Jun 2019 — Dec 2021 Software Engineer, AYACOV LLC, Atlanta

Transformed semi-structured JSON data into a relational view.
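In Snowflake this kind of flattening is typically done in SQL with LATERAL FLATTEN; a minimal Python equivalent using pandas.json_normalize is sketched below. The sample records and nested field names are hypothetical, purely for illustration.

```python
import pandas as pd

# Hypothetical semi-structured JSON records (illustrative only).
raw_records = [
    {"id": 1, "customer": {"name": "Ada", "city": "Atlanta"}, "total": 42.5},
    {"id": 2, "customer": {"name": "Sam", "city": "Marietta"}, "total": 17.0},
]

# Flatten nested objects into columns, yielding a relational (tabular) view.
df = pd.json_normalize(raw_records, sep="_")
print(sorted(df.columns))
```

Each nested key becomes its own column (e.g. customer_name, customer_city), so the result can be loaded straight into a warehouse table.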

Loaded Parquet files into Snowflake.

Replicated databases across regions and environments for disaster recovery in Snowflake.

Tuned and optimized Snowflake warehouse performance by implementing best practices, resulting in a 25% reduction in query execution time.

Designed and developed an algorithm for extraction and transformation of unstructured data in 28,000 text files into a single structured, tabulated CSV file for ingestion and analysis using Python and pandas.
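A consolidation like this can be sketched in a few lines of Python and pandas. The assumed "key: value" layout of the text files is hypothetical, as the resume does not describe their actual format.

```python
from pathlib import Path
import pandas as pd

def consolidate_text_files(input_dir: str, output_csv: str) -> pd.DataFrame:
    """Parse every .txt file in input_dir (assumed 'key: value' lines)
    into one row each, then write a single consolidated CSV."""
    rows = []
    for path in sorted(Path(input_dir).glob("*.txt")):
        record = {"source_file": path.name}
        for line in path.read_text().splitlines():
            if ":" in line:
                key, _, value = line.partition(":")
                record[key.strip()] = value.strip()
        rows.append(record)
    df = pd.DataFrame(rows)
    df.to_csv(output_csv, index=False)
    return df
```

Sorting the paths keeps the output deterministic, and carrying source_file in each row preserves traceability back to the original file.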

Optimized Snowflake objects for better storage and execution performance.

Secured data sharing in Snowflake for stable data access and governance across the organization.

Designed and developed pipelines in Snowflake by combining streams and tasks to process changed data on a schedule in near real time.

Provisioned Azure storage accounts and containers.

Provisioned and connected to an Azure PostgreSQL database using the Azure CLI.

Configured the Azure Databricks environment for data ingestion.

Saved the company 3% in costs by developing a scripting algorithm for image processing.

Led the development of a web scraping project to find influencers for the marketing department.

Developed a REST API with AWS Lambda and API Gateway.

Packaged and imported custom dependencies for AWS Lambda.

Built data pipelines in GCP, designing a roadmap to move application data into BigQuery using Pub/Sub.

Implemented Cloud Dataflow and Dataproc pipelines in GCP for stream and batch processing.

Created and maintained test plans, cases, and scenarios/scripts using Selenium WebDriver, in compliance with defined test automation standards and methodologies.

Sent emails using SES, Lambda, and Python.

Automated file handling with Python and AWS S3.

Wrote Lambda functions in Python to interact with S3 and DynamoDB.
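A common shape for such a function is an S3-triggered handler that records object metadata in DynamoDB. The sketch below is a hypothetical example of that pattern, not the actual code; the table name "uploaded-objects" and the event-parsing helper are assumptions.

```python
def s3_object_from_event(event: dict) -> tuple:
    """Extract (bucket, key) from the first record of an S3-triggered
    Lambda event."""
    record = event["Records"][0]["s3"]
    return record["bucket"]["name"], record["object"]["key"]

def lambda_handler(event, context):
    """Hypothetical handler: on S3 upload, index the object in DynamoDB."""
    import boto3  # imported here so the helper above stays testable offline
    bucket, key = s3_object_from_event(event)
    s3 = boto3.client("s3")
    size = s3.head_object(Bucket=bucket, Key=key)["ContentLength"]
    # "uploaded-objects" is an assumed table name for illustration.
    table = boto3.resource("dynamodb").Table("uploaded-objects")
    table.put_item(Item={"pk": key, "bucket": bucket, "size_bytes": size})
    return {"statusCode": 200, "indexed": key}
```

Keeping the event parsing in a pure helper makes the handler easy to unit-test without AWS credentials.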

Imported CSV files from S3 into Redshift with AWS Glue

Integrated JavaScript, jQuery, and CSS into the company website.

Built automation tasks using Selenium and Python.

Created a database CRUD application for employees using Node.js, Express, and MongoDB.

Multi-tasked across multiple functions and roles to deliver project results and meet organizational expectations.

Participated in building a React.js chat application with WebSockets, CSS, and a REST API, enabling real-time communication between employees and departments.

Feb 2018 — Feb 2019 Web Developer, Patriots of Blessing Humanitarian Foundation, Houston

Provided front-end website development using Squarespace.

Integrated HTML, CSS, and JavaScript into Squarespace.

Implemented Google-based SEO and ad campaigns to meet budget specifications.

Multi-tasked across multiple functions and roles to generate project results and meet deadlines and organizational expectations.

Worked with the founder to synchronize web presence with brand identity and logo.

EDUCATION

Jan 2019 — Dec 2023 Bachelor of Science in Computer Science, University of West Georgia

Dec 2020 — Dec 2021 Software Engineering, Launch School (Online Coding Bootcamp), Full Stack Development

Aug 2017 — Dec 2019 Associate of Applied Science in Computer Programming, Chattahoochee Technical College, Marietta

SKILLS

Python

Snowflake

Azure

AWS

Confluent Kafka

SQL

DBT Core

ETL

CERTIFICATIONS

Java

Snowflake
