
Data Analyst - ETL - Tableau & Python Proficiency

Location:
Washington, DC
Posted:
February 16, 2026


Resume:

Years of Relevant Experience: * Years

Security Clearance: N/A

Current Title: Research Analyst

PMA: DEOMI

Professional Summary

Dedicated and results-oriented Data Analyst with over 4 years of industry experience transforming complex datasets into actionable insights. Proficient in Python, R, and SQL, with expertise in libraries such as Pandas, NumPy, Matplotlib, and Seaborn for efficient data manipulation and analysis. Skilled in creating compelling visualizations and dashboards in Tableau, Power BI, and Excel to support data-driven decision-making. Experienced in managing MySQL and SQL Server databases, and adept at using cloud technologies such as AWS and Azure to design scalable data solutions.

Education, Awards, Training and Certificates

BS Degree, Computer Science, Bahir Dar University, Ethiopia, 2011.

Diploma, Languages, St. Mary’s University College, 2009.

Certificate, Data Analytics, Thinkful (a Chegg service), 2021.

Professional Experience

09/24 – Present: DEOMI – OY2 TO7 Research Analyst, JHT Inc/Precise Systems, Remote

Analyzing and interpreting large datasets from EEO, EOAC, and EOARCC courses to extract actionable insights and identify trends in student performance, demographics, and satisfaction.

Developing interactive dashboards in Tableau to visualize average rates, counts, and demographic distributions for leadership and stakeholders.

Designing and implementing ETL pipelines in IBM DataStage to extract, transform, and load structured data from multiple sources into centralized data warehouses.

Integrating SAS datasets into enterprise ETL workflows using IBM DataStage, ensuring seamless data preparation for reporting and modeling.

Developing and maintaining Java-based ETL components to automate data ingestion, validation, and transformation processes.

Cleaning, transforming, and mapping raw survey data to ensure accuracy and consistency across multiple datasets in support of reliable reporting.

Conducting statistical analyses in SPSS and Excel to generate reports on participant satisfaction, lesson relevance, and open-ended feedback.

Coordinating with team members to validate data models and streamline reporting processes in alignment with project timelines.

Supporting curriculum evaluation efforts by analyzing quantitative and qualitative data using Python libraries such as Pandas, NumPy, and SciPy.

Conducting inferential statistical tests including ANOVA and T-tests to assess program effectiveness and detect significant differences across cohorts or demographic groups.
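
A minimal sketch of the kind of inferential testing described above, using SciPy; the DataFrame survey_df and its cohort/score columns are hypothetical placeholders rather than the actual project data.

import pandas as pd
from scipy import stats

def compare_cohorts(survey_df: pd.DataFrame) -> None:
    """Run a one-way ANOVA across cohorts and a t-test between the first two."""
    # Split scores into one series per cohort, dropping missing values.
    groups = [g["score"].dropna() for _, g in survey_df.groupby("cohort")]

    # One-way ANOVA: are mean scores equal across all cohorts?
    f_stat, p_anova = stats.f_oneway(*groups)
    print(f"ANOVA: F={f_stat:.3f}, p={p_anova:.4f}")

    # Welch's t-test between the first two cohorts (unequal variances assumed).
    t_stat, p_ttest = stats.ttest_ind(groups[0], groups[1], equal_var=False)
    print(f"t-test (cohort 1 vs cohort 2): t={t_stat:.3f}, p={p_ttest:.4f}")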

Performing trend analysis and time series evaluations to identify performance patterns and long-term impact.
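
A brief illustration of the trend analysis mentioned above, assuming a hypothetical DataFrame indexed by course date with a mean_score column.

import pandas as pd

def monthly_trend(scores: pd.DataFrame) -> pd.DataFrame:
    """Resample date-indexed scores to monthly means and add a 3-month rolling average."""
    monthly = scores["mean_score"].resample("M").mean().to_frame("monthly_mean")
    monthly["rolling_3mo"] = monthly["monthly_mean"].rolling(window=3).mean()
    return monthly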

Writing and optimizing SQL queries to extract, filter, and aggregate survey and performance data from internal databases.
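
For illustration, the query pattern below aggregates survey scores by course; it is shown against SQLite with placeholder table and column names, not the actual internal databases.

import sqlite3

AVG_SCORE_BY_COURSE = """
    SELECT course, COUNT(*) AS responses, AVG(score) AS avg_score
    FROM survey_responses
    WHERE survey_year = ?
    GROUP BY course
    ORDER BY avg_score DESC;
"""

def average_scores(conn: sqlite3.Connection, year: int):
    """Return per-course response counts and average scores for one survey year."""
    return conn.execute(AVG_SCORE_BY_COURSE, (year,)).fetchall()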

Collaborating with team leads to interpret statistical results and translate findings into actionable insights for curriculum improvement.

09/22 – 09/24: Data Analyst, SilverSpace Technology, Remote

BNY, Financial Data Analysis

Developed custom Python scripts to extract data from various sources, including PostgreSQL databases and API endpoints, consolidating disparate data sources into a unified data model.
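
A minimal sketch of the extraction-and-consolidation pattern described above; the connection string, table, columns, and API URL are placeholders rather than the client's actual sources.

import pandas as pd
import psycopg2
import requests

def load_unified_view(pg_dsn: str, api_url: str) -> pd.DataFrame:
    """Pull rows from PostgreSQL and an API endpoint, then combine them."""
    # Relational source: read a table into a DataFrame via a cursor.
    with psycopg2.connect(pg_dsn) as conn, conn.cursor() as cur:
        cur.execute("SELECT account_id, balance, as_of FROM accounts")
        columns = [col[0] for col in cur.description]
        db_df = pd.DataFrame(cur.fetchall(), columns=columns)

    # API source: fetch JSON records and normalize them into a DataFrame.
    records = requests.get(api_url, timeout=30).json()
    api_df = pd.json_normalize(records)

    # Consolidate both sources into one model keyed on account_id.
    return db_df.merge(api_df, on="account_id", how="left")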

Crafted custom R scripts and functions to automate repetitive data processing tasks, improving the efficiency and consistency of data workflows.

Collaborated with cross-functional teams to gather data requirements and translate them into data mapping documents and transformation rules.

Created and optimized job sequences in DataStage, improving runtime efficiency and reducing data processing latency by 25%.

Created dynamic Tableau Dashboards with drill-down, filtering, and slicer functionalities, allowing users to interactively explore and analyze data at different levels of granularity and dimensions.

Automated repetitive data processing tasks in Excel using macros and VBA (Visual Basic for Applications) programming, improving the efficiency and reliability of reporting workflows.

Wrote complex SQL queries to extract, filter, and aggregate data from MySQL databases, leveraging advanced SQL features such as stored procedures, triggers, and user-defined functions.

Applied Agile techniques, including user stories, acceptance criteria, and story points, to clearly define, prioritize, and estimate data-related tasks, enabling the team to respond quickly to changing business needs.

03/22 – 08/22: Data Analyst, Tekwissen, Remote

Amazon, AMZL Power Project

Developed Smartsheet dashboards and annual plans covering KPIs, work breakdown structures, RASCI charts, process flows, and databases, establishing a program foundation and a means of tracking program health in an aggressive, schedule-focused program that lacked existing measurement tools.

Utilized a last-mile business intelligence data system to build analytical models informing the safety program strategy and to generate custom deep-dive analyses beyond regular reports.

Supported efforts to rectify data quality issues by identifying root causes and presenting potential solutions.

Presented daily and weekly reports analyzing milestone progress across project phases.

Analyzed data, identified business needs, and defined requirements for AMZL's largest nationwide distributed power and infrastructure expansion, in support of the Climate Pledge. Created dashboards, processes, metrics, and operating plans to drive an annual $100M+ capital program, and directed strategy for annual planning and change management across multiple organizations.

Built and published customized interactive reports and dashboards, and scheduled reports using Power BI.

02/19 – 02/22: Data Analyst, Silverspace, Remote

Paychex, Financial Data Analysis

Leveraged Python's robust data manipulation and analysis capabilities to automate and streamline various data-related tasks, including data extraction, cleaning, transformation, and feature engineering.

Applied NumPy's array manipulation and mathematical functions to perform advanced data processing and statistical analysis on large, complex datasets.

Used Pandas data structures (Series and DataFrames) and data wrangling capabilities to efficiently clean, transform, and combine data from multiple sources, improving data quality and reliability.
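
A brief sketch of the cleaning-and-combining pattern described in these bullets; the file names and columns are placeholders, not Paychex data.

import numpy as np
import pandas as pd

def build_clean_dataset(payroll_csv: str, clients_csv: str) -> pd.DataFrame:
    """Load two sources, standardize keys, derive a feature, and join them."""
    payroll = pd.read_csv(payroll_csv, parse_dates=["pay_date"])
    clients = pd.read_csv(clients_csv)

    # Basic cleaning: drop duplicate records and normalize the join key.
    payroll = payroll.drop_duplicates(subset="client_id")
    payroll["client_id"] = payroll["client_id"].astype(str).str.strip()
    clients["client_id"] = clients["client_id"].astype(str).str.strip()

    # Feature engineering with NumPy: log-scale a skewed amount column.
    payroll["log_amount"] = np.log1p(payroll["gross_amount"])

    return payroll.merge(clients, on="client_id", how="inner")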

Employed Matplotlib and Seaborn for creating highly customizable and informative data visualizations, enabling stakeholders to better understand trends, patterns, and outliers in the data.
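
An illustrative plot in the style described above; df and its region/gross_amount columns are hypothetical placeholders.

import matplotlib.pyplot as plt
import seaborn as sns

def plot_amount_by_region(df) -> None:
    """Box plot of payment amounts per region to surface spread and outliers."""
    fig, ax = plt.subplots(figsize=(8, 4))
    sns.boxplot(data=df, x="region", y="gross_amount", ax=ax)
    ax.set_title("Payment amount distribution by region")
    ax.set_xlabel("Region")
    ax.set_ylabel("Gross amount (USD)")
    fig.tight_layout()
    plt.show()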

Harnessed Tableau's advanced charting and visualization capabilities, including custom calculations, parameters, and geographic mapping, to create impactful and insightful data narratives.

Utilized advanced Excel features such as pivot tables, VLOOKUP, and custom formulas to efficiently clean, organize, and summarize complex datasets.

Designed and implemented custom PostgreSQL database schemas to support the company's data infrastructure, ensuring data integrity, scalability, and performance.


