Qualifications Summary
With over ** years of IT experience in the field of data analysis.
Strong experience in business and data analysis, data profiling, data migration, data integration, and metadata management services
Assisted the organization with the data compiling, mining, and analysis required to assess the given data, collaborating with peers in both business and technical areas to deliver optimal business process solutions in line with corporate priorities.
Developed Power Apps applications with interface forms that allow users to submit data to databases or spreadsheets, and used Power Automate to control workflows with automation scheduling.
Experienced in writing stored procedures and stored functions using PL/SQL.
Good grasp of data warehousing concepts, including metadata and data marts built from an Active Directory source.
Connected data tables in TOAD to fix data anomalies and discrepancies and to run source-to-target and target-to-source data comparisons.
Wrote SAP scripts to speed up redundant steps when transferring data into the SAP online system.
Performed calculations with aggregate functions, dynamic charts and graphics, data sets, connected joins, and combined data sources using Tableau.
Gathered business and technical requirements best suited to the technical architecture development process.
Proficient in advanced Excel and Excel Visual Basic for Applications to automate tasks, especially for multi-step assignments.
Software and Software Operating System Skills
Backend: MSSQL Server, SAP HANA EE
Frontend: Power Apps, SAP
Programming: DOS, JavaScript, MS Access, MS Excel VBA, Python (Pandas, NumPy, Matplotlib), VBScript
RDBMS: MS Access, MSSQL Server, Oracle 11g, SharePoint
Data Modeling: Snowflake Schema, Star Schema
ETL Tools: SSIS, Toad Data Point, Power Automate, Azure Data Factory, Azure Databricks, Data Lake, Informatica
Operating System: Windows 11
Visualization: Power BI, Google Data Studio, Looker, Tableau
Some Knowledge: C++, C#, MS Access VBA, HTML, JavaScript, Cascading Style Sheets, Office Scripts, SAS Base 9, Accounting, QuickBooks
Degrees, Certification & Other Training
Solano College, Fairfield, California: Associate degree in Computer Science (January 2008 - May 2014), GPA: 3.6
Prometric Testing: SAS 9 Programming (Sep 2010); Location: San Francisco, California
Navel Technologies: SAS Training; Location: Online
TechPoint Solutions: Data Warehousing Training (Oracle, SQL/PL/SQL, ETL, Informatica, Crystal Reports); Location: Milpitas, California. Tutorials Point: Tableau Training Course
Accomplishments
SATTT, Detroit, Michigan: July 8, 2024 – December 31, 2025
Data Analyst - Remote:
Performed high-volume data entry and validation across multiple systems, ensuring 99% accuracy in financial, healthcare, and operational datasets.
Designed and maintained ETL pipelines using SQL Server, Informatica, and Python, automating extraction, transformation, and loading of data from diverse sources into centralized repositories.
Conducted data quality audits and implemented cleansing routines that reduced reporting errors by 30%.
Built parameterized, metadata-driven pipelines using dynamic content, lookup activities, and reusable templates to reduce maintenance and accelerate deployment.
Partnered with cross-functional teams to translate business requirements into technical workflows, improving efficiency of reporting cycles by 25%.
Optimized SQL queries and stored procedures for large datasets, cutting query runtime by 40% and enhancing system performance.
Implemented ADF triggers (schedule- and event-based) to automate ingestion workflows and support near-real-time data availability for reporting teams. Supported data migration projects by mapping legacy data structures to modern cloud-based platforms (Azure Data Factory and Azure Databricks).
Developed PySpark and SQL notebooks in Azure Databricks to transform, cleanse, and enrich large scale datasets for analytics and machine learning workloads. Collaborated with stakeholders to deliver ad-hoc analysis on workforce, financial, and operational trends, directly influencing strategic planning initiatives.
Sempra, San Diego, California: July 13, 2022 – May 10, 2024
Data Analyst - Remote:
Migration and automation: developed window functions, user-defined tables, stored procedures, views, and complex queries in SAP HANA EE, and translated business questions into SQL scripts.
Translated scripts from other languages, such as Excel VBA and SAS, into SAP HANA EE PL/SQL scripts.
Used Toad Data Point to automate many department tasks, such as running SQL scripts and uploading and downloading data between MS SQL Server, SAP HANA EE, Microsoft Excel, etc.
Collaborated with other data analysts and research analysts to integrate PL/SQL code within software applications, ensuring seamless database interactions.
Created easy step-by-step documentation for performing certain tasks in Microsoft Word and PowerPoint. Developed Power Apps applications with interface forms that allow Marketing Department users and external customers to submit data to databases or spreadsheets.
Created complex workflows in Power Automate for use with Power Apps, involving automation scheduling, calculations, data updates, validations, arrays, conditions, loops, HTML tables, and cascading style sheets.
Migrated reports from Google Analytics into Tableau; cleansed and formatted data; and created reports, Power BI DAX functions for more complex reports, and key performance indicators for better visuals. Created Tableau visuals that extract data from sources such as Google Analytics, SharePoint lists, MS SQL Server, and SAP HANA.
Designed and implemented ETL pipelines using Azure Data Factory to extract, transform, and load data from diverse sources.
Developed and optimized data workflows, improving performance and efficiency in cloud-based data integration. Scheduled and automated reports to be delivered at a particular date and time, depending on criteria, using Toad Data Point and Power Automate.
Integrated data from multiple sources, including SQL databases, Excel spreadsheets, and cloud services, into Power BI for comprehensive analysis. Implemented robust data cleaning routines using Python’s Pandas module, effectively handling missing values, duplicates, and outliers in large datasets.
Retrieved archived maps for projects needing updates per client requests.
SWITO, Boca Raton, Florida: August 04, 2016 – June 23, 2022
Data Analyst - Remote:
Created tables and imported all personnel information into an Access database.
Developed reports in Tableau; used Excel and Access VBA to complete tasks quickly and efficiently; and developed complex SQL scripts to retrieve data from tables and queries.
Created detailed documentation for various tasks using step-by-step methods.
Self-trained on new software, then conducted live online training for over 30 personnel on how to use it.
Used PostgreSQL to connect to other sources and create financial summary data, tables, and views with complex SQL queries.
Created and maintained complex SQL queries and stored procedures for data extraction, transformation, and loading (ETL) processes.
Discovered zero- or low-cost solutions for completing large tasks, allowing the organization to redirect those funds to other projects.
Addressed missing values, duplicates, and incorrect entries in Excel workbooks using Python's openpyxl module.
Attended refresher and new courses on SSRS, SSIS, Access and Excel VBA, PowerShell, and C# during spare time. Sent out timely mass emails to various news media on behalf of different departments within the organization.
Created Power BI reports from datasets, many of which involved DAX functions; many reports required cleansing the data source with M code in the Advanced Editor before loading the data.
Automated Excel tasks, including data extraction, manipulation, and report generation, using OpenPyXL to enhance productivity and reduce manual effort.
Utilized Pandas for data cleaning, transformation, and analysis, handling large datasets with complex structures.
Provided ongoing support and troubleshooting for Power BI users, resolving any issues related to data accuracy and report functionality.
Extracted data from various online sources to compile contact lists for different groups to send out solicitations.
Worked with upper management and designated special-projects personnel to implement data strategies, tables, keys, and measures; build data flows; and develop conceptual data models.
Pacific Gas & Electric, San Francisco, California: March 30, 2015 – August 4, 2016
Data Analyst:
Migrated the Planned Outage Dashboard process from Microsoft Access macros into SQL Server 2012 stored procedures.
Used Microsoft Visual Studio to create SSIS packages for faster import and export of tables to and from different database formats. Processed various reports and uploaded them to the performance management site for multiple divisions to review.
Enhanced Microsoft Excel spreadsheets with Excel Visual Basic code to improve worksheet performance and cut down on errors and downtime.
Developed procedures for multiple tasks using step-by-step instructions with friendly illustrations.
Audited and corrected data received from different sources before posting the results to the division's performance dashboard.
Forecasted data using Microsoft Access/Excel and PL/SQL to help prevent overspending; applied SQL's MOD function to calculate the total value of extra items to be set aside for the next year's budget, which helped save the department from overspending.
Used SAP accounting software to extract reports from the business warehouse and production systems and to enter data when necessary. Assisted the department's data operations analyst in using Oracle SQL Developer to pull metadata from different tables.
Developed and maintained SAS programs to extract, transform, and load data from various sources into SAS datasets.
Created and maintained SAS macros and other programming tools, resulting in a 15% increase in efficiency.
Generated reports, summaries, and other data outputs using SAS, improving data accuracy by 20%.
Collaborated with cross-functional teams to develop and maintain data security and validation processes, reducing data errors by 25%.
Implemented data quality checks and data warehousing processes, leading to a 30% increase in efficiency.
Conducted data analysis, data mining, and data manipulation utilizing SAS and related tools.
Created metrics, attributes, filters, reports, and dashboards, along with advanced chart types, visualizations, and complex calculations to manipulate data, using Tableau.
Volunteered to be the main point of contact for converting coworkers' questions into SQL queries and providing detailed answers back to them.
Used Linux/Unix pattern-matching commands to extract, filter, and export the necessary data in a readable format instead of manually separating it into columns. Proficient in Linux commands, scripting (Bash, Python), and system management.
By cleaning data with various MS Excel functions and SQL pattern-matching criteria, assisted coworkers with their questions; this allowed other personnel in the group to spend more time completing their own work, kept the department workflow moving at a good pace, and helped minimize backlogs.
Chevron, San Ramon, California: September 29, 2014 — December 23, 2014
Data Analyst:
STRATA Site Benchmarking Project. Performed data analysis, data entry, and cleanup on over 1,000 records from different business units, checking and fixing anomalies, formatting issues, and inconsistencies.
Created complex charts and graphs with interactive filters and drilldowns, allowing managers from various divisions to quickly locate outliers and correct any anomalies.
Gathered requirements from remotely based business users, defining and elaborating on the requirements in meetings with those users.
Analyzed historical documentation, supporting documentation, screen prints, and e-mail conversations; presented to the business; and wrote the business requirements document.
Prepared an auditing analysis to identify trends, patterns, and system issues that may contribute to coding and documentation deficiencies and risk areas. Made recommendations on data metrics to upper management and coworkers before submitting the final metrics to the company's vice president.
Experienced in developing stored procedures, functions, views, triggers, and complex SQL queries using SQL Server, T-SQL, and Oracle PL/SQL. Proficient in integrating various data sources with multiple relational databases, such as Oracle 11g/10g/9i and MS SQL Server, plus relational and flat files, into the staging area, ODS, data warehouse, and data mart.
Involved in performance tuning at the source, target, mapping, session, and system levels.
Converted the technical documentation into plain language so even a non-technical person could understand the analysis.
ACIS Software Solutions Inc, Park Ridge, Illinois: January 8, 2014 – July 31, 2014
SQL/ETL Developer:
Data Warehouse/ETL Developer advanced training. Responsibilities: Informatica 9 PowerCenter client tools (Mapping Designer, Repository Manager, Workflow Manager/Monitor) and server tools (Informatica Server, Repository Server Manager).
Extensive experience in developing stored procedures, functions, views, triggers, and complex SQL queries using SQL Server, T-SQL, and Oracle PL/SQL.
Proficient in integrating various data sources with multiple relational databases, such as Oracle 11g/10g/9i and MS SQL Server, plus relational and flat files, into the staging area, ODS, data warehouse, and data mart. Involved in performance tuning at the source, target, mapping, session, and system levels. Applied Slowly Changing Dimension concepts (Types 1, 2, and 3).
Documented Informatica mappings in Excel spreadsheets. Used various transformations, including Filter, Router, Expression, Lookup (connected and unconnected), Aggregator, Sequence Generator, Update Strategy, Joiner, Normalizer, Sorter, and Union, to develop robust mappings in the Informatica Designer.
Worked on complex Source Qualifier queries and pre- and post-SQL queries in the target.
Worked on different tasks in Workflow Manager, including Sessions, Event Raise, Event Wait, Decision, E-mail, Command, Worklets, Assignment, Timer, and workflow scheduling.
Implemented performance tuning logic on Targets, Sources, Mappings and Sessions to provide maximum efficiency and performance.
Pacific Gas & Electric, Concord, California: February 18, 2011 – November 22, 2013
Data Analyst:
Safety Environment Management System & Maximum Allowable Operating Pressure Project: exported data from the department's database, then used MS SQL Server 2012 SSRS to build charts, summary reports, tablix regions, and lookups.
Proficient in advanced Excel and Excel Visual Basic for Applications to automate tasks, especially for multi-tasking.
Advised SAP developers and project managers on much-needed database enhancements to prevent user errors during development, and accurately entered and maintained data in SAP systems, ensuring data integrity and consistency.
Created system architecture diagrams and design documents to outline technical solutions. Designed and executed unit, integration, and system tests to ensure software quality.
Utilized automated testing tools to streamline the testing process and improve test coverage.
Participated in design reviews and provided feedback to ensure high-quality, scalable designs.
Analyzed workforce diversity data to assess the effectiveness of diversity and inclusion initiatives. Created dashboards to track key metrics such as gender and ethnic diversity, and provided recommendations to enhance diversity within the organization.
Evaluated the effectiveness of training and development programs by analyzing participant feedback and performance data. Provided insights to HR leadership on program improvements and ROI.
Maintained an account of field studies and collected geographical data.
Collected data on job applications, hiring rates, and time-to-fill for positions.