
Senior Data Engineer and BI Specialist

Location:
Milton, MA
Posted:
November 23, 2025

Resume:

David Moses

Phone: 781-***-****

Email: *************@*****.***

Technical Skills

Databases: SQL; SQL Server; Hadoop; Redshift; Vertica; Oracle; PostgreSQL; SAS; Snowflake

Dashboard and Reporting Tools: Tableau; SSRS; Power BI; MicroStrategy

Programming Languages: Python; Ruby; C#; Java; VBA

Amazon Web Services: EC2; S3; EMR; RDS

Education

University of Massachusetts at Lowell

- Ph.D. student, Computer Science

- Part-time Ph.D. student at UMass Lowell, 2012–2013

- Finished all coursework; left as all but dissertation (ABD).

University of Massachusetts at Boston

- Master of Business Administration (MBA)

- Master of Science in Information Technology (2010)

Boston University

- Master of Science in Computer Science (2009)

- Graduate Certificate in Database Management and Business Intelligence (2015)

- Graduate Certificate in Web Application Development

Brandeis University (Waltham, Massachusetts)

- BA, Computer Science and Economics

- Business Minor

UC Berkeley

- Completed the Python Programming and Java Programming courses in December 2018 with a 4.0 GPA

- Enrolled in the UC Berkeley AI and Machine Learning certificate program; expected completion May 2026.

Certifications

- Microsoft Technology Associate: Database Fundamentals (2018)

Professional Experience

Aristotlesoft, Boston, MA June 2023 – Present

Consultant

Employed as a consultant on data engineering projects.

America’s Test Kitchen, Boston, MA December 2019 – June 2023

Senior Data Engineer

Upgraded legacy Tableau dashboards to add quality assurance metrics. Added a CRM report in Tableau using a Snowflake view as the data source.

Led quality assurance controls to maintain daily data updates. Fixed data issues in the E-commerce, Payups, Cancellations, and Emails and Orders Tableau dashboards using Snowflake views as the data source.

Worked with Alteryx on list pulls for the marketing team. Surpassed expectations, completing all tasks accurately and ahead of schedule. Closed twice as many Jira tickets as the next most productive team member.

Wrote Python ETL scripts to transfer data from the Oracle transactional database to the Snowflake data warehouse hosted in AWS.
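
For illustration, a minimal Python sketch of this kind of Oracle-to-Snowflake incremental load (connection details, table names, columns, and the watermark value are placeholders, not the actual production schema):

    # Minimal sketch of an incremental Oracle-to-Snowflake copy.
    # Credentials, table names, and columns are placeholders.
    import oracledb                      # pip install oracledb
    import snowflake.connector           # pip install snowflake-connector-python

    last_watermark = "2023-01-01"        # placeholder; normally read from a control table

    ora = oracledb.connect(user="etl", password="***", dsn="oltp-host/ORCLPDB1")
    sf = snowflake.connector.connect(
        account="my_account", user="etl", password="***",
        warehouse="ETL_WH", database="ANALYTICS", schema="STAGING",
    )

    with ora.cursor() as src, sf.cursor() as dst:
        # Pull only rows changed since the last successful load.
        src.execute(
            "SELECT order_id, customer_id, order_total, updated_at "
            "FROM orders WHERE updated_at > TO_DATE(:since, 'YYYY-MM-DD')",
            since=last_watermark,
        )
        rows = src.fetchall()
        # Bulk insert the extracted rows into the Snowflake staging table.
        dst.executemany(
            "INSERT INTO orders_stg (order_id, customer_id, order_total, updated_at) "
            "VALUES (%s, %s, %s, %s)",
            rows,
        )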

Liberty Mutual Insurance, Boston, MA November 2017 – December 2019

Senior Business Analyst/Data Analyst

Awarded a spotlight award for creating the Production Forecast Tool (PFT Tool), and a second spotlight award for the Excel VBA automation work on the Huddle Board project.

Wrote Excel VBA macros to automate the creation of huddle board Excel files from Excel templates; to let users toggle a range between an Excel formula and a user-entered value; to pull the latest SharePoint data into the huddle board; and to drill down into task-level records by double-clicking a metric in the Huddle Board Excel sheet.

Created Excel-based templates for each business insurance segment for the Huddle Board project, writing formulas for all metrics across 7 Excel templates. The huddle boards my team created were used by the business insurance underwriting group for all team huddle meetings, serving about 1,200 employees.

Created the System Support and Underwriter Quality Assessment Tableau dashboards, both with the ability to drill down into the records underlying the summarized metrics.

Rewrote the SAS code for the PFT Tool to include datasets for average policy size, average account size, and a rolling 12-month hit ratio for underwriters. Automated the PFT SAS project to run daily on a schedule.

Wrote Excel VBA macros to let users pull the latest records, posted to SharePoint daily, into the Excel PFT Tool.

Worked closely with the stakeholders of the PFT Tool to define the metrics and specifications for the tool.

XPO Logistics, Boston, MA October 2014 – November 2017

Business Intelligence Engineer

Wrote or revised over twenty business intelligence, accounting, and billing reports using SQL Server. Coded stored procedures in T-SQL that made use of complex joins, CTEs, windowing functions, and subqueries. SSRS report development included verifying reports under development by comparing spot-checked source data with the data being reported, as well as query tuning the stored procedures used in the reports. Also met with stakeholders to gather report requirements and for user acceptance testing of the reports.
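
For illustration, the style of report query involved, shown here as a T-SQL statement with a CTE and window functions run from Python via pyodbc (server, table, and column names are invented for the example, not the actual XPO schema):

    # Illustrative only: a report-style T-SQL query (CTE + window functions)
    # executed from Python with pyodbc. Server, table, and column names are
    # placeholders, not the actual XPO schema.
    import pyodbc                        # pip install pyodbc

    conn = pyodbc.connect(
        "DRIVER={ODBC Driver 17 for SQL Server};"
        "SERVER=report-db;DATABASE=Billing;Trusted_Connection=yes;"
    )

    REPORT_SQL = """
    WITH invoice_totals AS (
        SELECT customer_id,
               invoice_id,
               invoice_total,
               DATEDIFF(DAY, invoice_date, GETDATE()) AS days_outstanding
        FROM dbo.Invoices
        WHERE paid = 0
    )
    SELECT customer_id,
           invoice_id,
           invoice_total,
           days_outstanding,
           SUM(invoice_total) OVER (PARTITION BY customer_id) AS customer_balance,
           ROW_NUMBER() OVER (PARTITION BY customer_id
                              ORDER BY days_outstanding DESC) AS aging_rank
    FROM invoice_totals
    ORDER BY customer_balance DESC;
    """

    # Print one summary line per unpaid invoice, worst balances first.
    for row in conn.cursor().execute(REPORT_SQL):
        print(row.customer_id, row.invoice_id, row.days_outstanding)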

Created Tableau dashboards for executive management.

Created the Customer Aging and Invoice Aging reports for accounting, which were key reports that XPO’s collectors used daily.

Created KPI reports for the freight brokerage group showing key metrics for sales reps and carrier reps.

Closed the most Jira tickets on the team, twice as many as the next highest closer.

Microsoft, Burlington, MA April 2014 – September 2014

Analytics Developer

As an analytics developer on the data analytics team, I wrote business intelligence reports and performed ETL for a Redshift data warehouse hosted in AWS.

Wrote the Windows Phone, Nokia X, and S40 App Star Rating reports in Redshift. The reports I developed were published in business intelligence dashboards. I performed data quality assurance on the reports before they were published.

Rewrote Windows Phone reports at Nokia using HQL in Hadoop.

Closed the most Jira tickets on the team.

Nokia, Burlington, MA January 2013 – April 2014

Analytics Developer

One of three data engineers on the AWS Redshift data warehouse project. In about five months we created daily automated incremental ETL scripts to copy GZip CSV files into the Redshift data warehouse from our source databases, while also loading about 10 terabytes of full historical data from our various other data sources.

Wrote ETL scripts that extracted structured data from a variety of sources (Hadoop, Salesforce, Oracle, and Postgres) into GNU zipped (GZip) CSV files and sent them to S3 using the S3cmd tool. I used the Pentaho Kettle ETL tool to create the ETL transformations and jobs for the project.

Wrote and deployed Ruby ETL scripts to an EC2 instance, which loaded GZip CSV files from our S3 bucket into Redshift. A cron job on the EC2 instance ran the ETL scripts on a defined schedule.
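
For illustration, the S3-to-Redshift COPY pattern these scripts implemented, sketched here in Python rather than the original Ruby (cluster endpoint, bucket, table, and IAM role are placeholders):

    # Illustration only (the production scripts were Ruby): load one GZip CSV
    # from S3 into Redshift with a COPY command. Cluster endpoint, bucket,
    # table, and IAM role are placeholders.
    import psycopg2                      # pip install psycopg2-binary

    conn = psycopg2.connect(
        host="analytics.abc123.us-east-1.redshift.amazonaws.com",
        port=5439, dbname="analytics", user="etl", password="***",
    )

    COPY_SQL = """
        COPY app_events
        FROM 's3://analytics-staging/app_events/2014-01-15.csv.gz'
        IAM_ROLE 'arn:aws:iam::123456789012:role/redshift-copy'
        CSV GZIP IGNOREHEADER 1 TIMEFORMAT 'auto';
    """

    with conn, conn.cursor() as cur:
        cur.execute(COPY_SQL)            # Redshift pulls and loads the file itself

    # A cron entry such as the following would run the loader nightly:
    #   15 2 * * * /usr/bin/python /opt/etl/load_redshift.py >> /var/log/etl.log 2>&1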

Coursera Courses Completed

- Introduction to Swift Programming

- Java for Android

- Python Data Structures

- Excel VBA for Creative Problem Solving

- Fundamentals of Visualization with Tableau

- Programming in Python

- The Data Scientist's Toolbox

- Using Databases with Python


