
Data Engineer Solutions

Location:
San Diego, CA, 92101
Salary:
90,000
Posted:
July 31, 2024


Resume:


Rodolfo Mendivil

Data Engineer

With 8+ years of experience building and optimizing data pipelines and architectures. Skilled in designing and implementing scalable solutions for data storage, integration, and management. Committed to driving data-first initiatives and optimizing data delivery for cross-functional teams. Seeking an opportunity to contribute to your organization's mission of becoming a data-driven organization.

****************@*****.***

+526*********

linkedin.com/in/RMendivil-DE-BI-DataScience

WORK EXPERIENCE

Data Engineer

KODEFREE - Data Consulting Firm

08/2023 - Present, Mexico City

Projects: Coca-Cola BI, INTER Insurance Data Migration

Accomplishments:

PySpark ETL Development: Spearheaded the design and implementation of PySpark ETL pipelines to extract, transform, and load large volumes of raw data into a structured format suitable for analysis. Optimized ETL processes to enhance performance and reduce processing time.

Azure Synapse Development: Played a key role in the migration and development of data solutions within the Azure Synapse Analytics platform, ensuring seamless integration with existing systems. Implemented best practices for data warehousing and performance optimization, resulting in a scalable and efficient data processing environment.

Cross-functional Collaboration: Collaborated with cross-functional teams, including data scientists, business analysts, and other stakeholders, to understand business requirements and translate them into actionable data solutions. Acted as a bridge between technical and non-technical teams, ensuring a clear understanding of project objectives and alignment with business goals.

Responsibilities:

ETL Architecture and Implementation: Architected and implemented ETL processes using PySpark, ensuring the efficient extraction, transformation, and loading of data from diverse sources into the data warehouse.

Tasks (minimal sketches of these patterns follow this role):

Data Integration Projects: Used Apache Kafka for real-time data streaming and integration, leveraging its distributed architecture and fault-tolerance features in projects handling large data volumes at low latency. Implemented Kafka producers and consumers in Python using libraries such as confluent-kafka for seamless integration with existing systems.

Used the Apache Spark DataFrame API and Spark SQL for data transformation, manipulation, and aggregation tasks. Developed Spark jobs in Python using PySpark, harnessing its scalability and performance benefits for data integration workflows.

Used Apache Airflow to orchestrate complex data pipelines. Its task scheduling and monitoring features enabled effective automation of data integration processes. Developed custom Airflow DAGs (Directed Acyclic Graphs) in Python to orchestrate data ingestion, transformation, and loading tasks across various data sources and destinations.

Python Data Libraries (Pandas, NumPy): Used Pandas for data manipulation tasks, including data cleaning, merging, and reshaping operations. NumPy has been invaluable for numerical computing, especially when working with multidimensional arrays and mathematical functions.

Database Systems (SQL, SQLAlchemy): Data integration frequently involves interfacing with relational databases. Proficient in writing SQL queries to extract, transform, and load data from databases into target systems. Also utilized SQLAlchemy, a Python SQL toolkit, for database abstraction and ORM (Object-Relational Mapping) tasks, simplifying integration and enhancing code maintainability.

Databricks: Integrated data lakes and data warehouses, leveraging existing data infrastructure investments to perform analytics across disparate data sources without data movement or duplication. Also used Databricks for data ingestion, transformation, and preparation: with its distributed computing framework based on Apache Spark, Databricks handles large-scale data processing efficiently, making it ideal for building data pipelines and ETL (Extract, Transform, Load) workflows. In a few cases, worked with Databricks-managed Spark infrastructure, which eliminates the need for organizations to provision, configure, and manage their own Spark clusters.

Contact: Ivan Fernandez
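A minimal sketch of the kind of PySpark ETL step described above, assuming a simple CSV-to-Parquet flow; the paths and column names are hypothetical, not project values.

```python
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("etl-sketch").getOrCreate()

# Extract: read raw CSV from a hypothetical landing zone.
raw = (spark.read
       .option("header", True)
       .csv("/data/raw/orders.csv"))

# Transform: drop incomplete rows, fix types, parse dates.
clean = (raw
         .dropna(subset=["order_id", "amount"])
         .withColumn("amount", F.col("amount").cast("double"))
         .withColumn("order_date", F.to_date("order_date")))

# Aggregate to a daily grain for downstream analysis.
daily = (clean.groupBy("order_date")
         .agg(F.sum("amount").alias("total_amount"),
              F.count("order_id").alias("order_count")))

# Load: write partitioned Parquet to a hypothetical curated zone.
daily.write.mode("overwrite").partitionBy("order_date") \
     .parquet("/data/curated/daily_orders")
```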
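A condensed sketch of the confluent-kafka producer/consumer pattern mentioned above; the broker address, topic, and consumer group are placeholders.

```python
from confluent_kafka import Producer, Consumer

BROKER = "localhost:9092"   # placeholder broker
TOPIC = "orders"            # placeholder topic

# Producer: asynchronous send with a delivery callback.
producer = Producer({"bootstrap.servers": BROKER})

def on_delivery(err, msg):
    if err is not None:
        print(f"delivery failed: {err}")

producer.produce(TOPIC, value=b'{"order_id": 1}', callback=on_delivery)
producer.flush()  # block until outstanding messages are delivered

# Consumer: subscribe and poll for one message.
consumer = Consumer({
    "bootstrap.servers": BROKER,
    "group.id": "etl-consumers",      # placeholder consumer group
    "auto.offset.reset": "earliest",
})
consumer.subscribe([TOPIC])

msg = consumer.poll(timeout=5.0)
if msg is not None and msg.error() is None:
    print(msg.value())
consumer.close()
```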
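A small Airflow DAG in the shape described above, in Airflow 2.x style; the DAG id, schedule, and task bodies are illustrative only.

```python
from datetime import datetime
from airflow import DAG
from airflow.operators.python import PythonOperator

# Illustrative task bodies; real tasks would call extraction,
# transformation, and loading logic.
def extract():
    print("pull from source")

def transform():
    print("clean and reshape")

def load():
    print("write to warehouse")

with DAG(
    dag_id="etl_sketch",              # hypothetical DAG id
    start_date=datetime(2024, 1, 1),
    schedule_interval="@daily",
    catchup=False,
) as dag:
    t1 = PythonOperator(task_id="extract", python_callable=extract)
    t2 = PythonOperator(task_id="transform", python_callable=transform)
    t3 = PythonOperator(task_id="load", python_callable=load)

    t1 >> t2 >> t3  # linear dependency chain: extract -> transform -> load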

SKILLS

SQL, DAX, Python, PySpark, VBA, Apache Spark, Apache Airflow, Apache Cassandra, Apache Kafka, Snowflake CORE

NoSQL / Postgres, DynamoDB, Redshift

MS Azure / Synapse Analytics, Spark, and Data Factory

Data normalization, Indexes, Constraints

AWS / EC2, EMR, RDS, Redshift, Glue, Kinesis, and Lambda

Agile Environment / Jira

Data Governance, Data Quality, ACID

3NF, Star Schema design

Power BI / Tableau / LookML

CERTIFICATIONS

Microsoft Data Engineer Associate - Certification (06/2023 - 06/2023)

IBM: SQL for Data Science (04/2020 - 04/2024)

Business Intelligence - SDSU (04/2023 - 04/2024)

Microsoft Azure - Certification (07/2021 - 07/2024)

Big Data - SDSU (04/2023 - 04/2024)

Advanced SQL - SDSU (04/2023 - 04/2024)

Applied AI - SDSU (04/2023 - 04/2024)

SnowPro Core - Snowflake Certification (01/2024 - Present, in process)

ThriveX: SQL Professional (04/2023 - 04/2024)


Lead D&A Data Engineer

Thomson Reuters - Elite

09/2021 - 10/2023, Mexico City

Organization specializing in business management technology, helping businesses face the challenges presented by the evolution of information technology.

Activities in Snowflake (a query-optimization sketch follows this section):

Data Modeling: Design and develop data models, schemas, and structures within Snowflake, applying best practices for data organization and storage.

ETL Development: Develop efficient ETL (Extract, Transform, Load) processes to integrate data from various sources into Snowflake, ensuring data accuracy and consistency.

Query Optimization: Optimize SQL queries and workloads for performance, making use of Snowflake features such as clustering, indexing, and partitioning.

Data Integration: Integrate Snowflake with various data sources and tools, ensuring seamless data flow and compatibility between systems.

Security Implementation: Implement robust security measures within Snowflake, including access control, encryption, and authentication, to protect sensitive data.

Data Transformation: Perform data transformations and aggregations within Snowflake using SQL and other transformation techniques to meet business requirements.

Data Quality Management: Implement data quality checks and validation processes to ensure data accuracy, completeness, and consistency within Snowflake.

Other areas: Backup and Recovery, Performance Monitoring, Documentation, Collaboration, Automated Data Pipelines, Capacity Planning, Data Governance, Training and Knowledge Sharing, Performance Tuning, Vendor Management, and Data Migration.

Contact: Jose Alejandro Gutierrez / LaVonne Maliga
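A minimal sketch of the clustering and query-optimization work described above, using the snowflake-connector-python client; the account, credentials, warehouse, and table/column names are placeholders.

```python
import snowflake.connector

# Placeholder connection parameters; real values would come from a secrets store.
conn = snowflake.connector.connect(
    account="my_account",
    user="etl_user",
    password="...",
    warehouse="ETL_WH",
    database="ANALYTICS",
    schema="PUBLIC",
)
cur = conn.cursor()

# Define a clustering key so Snowflake co-locates micro-partitions by date,
# which prunes scans for date-bounded queries (hypothetical table).
cur.execute("ALTER TABLE matter_events CLUSTER BY (event_date)")

# A date-bounded aggregate that benefits from the clustering key.
cur.execute("""
    SELECT event_date, COUNT(*) AS events
    FROM matter_events
    WHERE event_date >= DATEADD(day, -30, CURRENT_DATE)
    GROUP BY event_date
    ORDER BY event_date
""")
for row in cur.fetchall():
    print(row)

cur.close()
conn.close()
```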

Business Intelligence and Data Modeler

Evolve Bank & Trust - B9 Fintech

08/2018 - 09/2021, San Francisco, CA.

Risk Loan BI Analysis; Marketing Leads Efficiency; Customer Service KPIs, SLAs, CSAT, NPS, and CES (a small NPS sketch follows this role).

Data Platform Management: Oversee the development and maintenance of data platforms, identifying trends and patterns in complex datasets to enhance data-driven decision-making.

Data Analysis and Algorithm Development: Carry out statistical research, prototype new systems, and develop algorithms to meet specific business needs, leveraging expertise in data analytics and exploratory data analysis (EDA).

Service Improvement and Innovation: Drive continual service improvement and innovation in productivity, software quality, and reliability, with a focus on meeting or exceeding service-level agreements (SLAs).

IT Standards Implementation: Collaborate with different partners and teams to implement IT standards, including operational and compliance standards, ensuring adherence to industry best practices.

Service Monitoring and Improvement: Monitor, support, and improve services according to incident, change, and problem management processes, aligning with IT/TR and industry standards.

Report and Dashboard Development: Build reports and dashboards using BI tools such as Tableau, Power BI, or Looker, connecting them to Snowflake to visualize data and present it in a user-friendly format.

Contact: Mikail Matveev
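A toy Pandas sketch of the NPS computation behind the customer-service KPIs named above; the survey data is invented for illustration. NPS is the share of promoters (scores 9-10) minus the share of detractors (scores 0-6), quoted on a -100..100 scale.

```python
import pandas as pd

# Hypothetical survey extract: one 0-10 likelihood-to-recommend score per response.
df = pd.DataFrame({"score": [10, 9, 8, 7, 6, 10, 3, 9, 5, 10]})

promoters = (df["score"] >= 9).mean()   # share of 9-10 scores
detractors = (df["score"] <= 6).mean()  # share of 0-6 scores

nps = (promoters - detractors) * 100
print(f"NPS: {nps:.0f}")  # 50% promoters - 30% detractors -> NPS 20
```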

LANGUAGES

English

Full Professional Proficiency

Spanish

Native or Bilingual Proficiency

3E ELITE - Management Application for Law Firms in the USA.


BI Developer

BBP & IT Services

02/2017 - 08/2018, Mexico City

Tupperware BI & DE Projects

BI Solution Design: Collaborate with stakeholders to design business intelligence solutions that address specific data and reporting needs while aligning with service management goals.

Data Visualization: Create interactive and informative dashboards and reports using tools like Tableau, providing actionable insights to support decision-making.

User Training and Support: Provide training and support to end-users to ensure they can effectively utilize BI tools and reports in their day-to-day operations.

Data Integration: Use Snowflake to integrate data from various sources, such as databases, data warehouses, and external systems; design ETL processes to extract, transform, and load data into Snowflake for reporting and analysis.

Data Modeling: Create and maintain data models within Snowflake; design and optimize data structures, including tables, views, and schemas, to support efficient querying and reporting.

SQL Query Development: Write SQL queries in Snowflake to extract and transform data for business intelligence and reporting, building complex queries that generate meaningful insights from the data.

Performance Optimization: Optimize query performance by fine-tuning SQL queries, creating materialized views, and implementing caching strategies, ensuring BI solutions provide responsive and efficient access to data stored in Snowflake (a materialized-view sketch follows this role).

Contact: Julio Santillan
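A brief sketch of the materialized-view pattern mentioned above, again via snowflake-connector-python; the connection values, table, and columns are hypothetical, and materialized views require a Snowflake edition that supports them.

```python
import snowflake.connector

# Placeholder connection parameters.
conn = snowflake.connector.connect(account="my_account", user="bi_user",
                                   password="...", warehouse="BI_WH",
                                   database="SALES", schema="PUBLIC")
cur = conn.cursor()

# Precompute a frequently queried aggregate once; Snowflake keeps the view
# fresh automatically, so dashboards hit the small view, not the fact table.
cur.execute("""
    CREATE OR REPLACE MATERIALIZED VIEW daily_orders_mv AS
    SELECT order_date, SUM(amount) AS total_amount, COUNT(*) AS orders
    FROM orders
    GROUP BY order_date
""")

cur.execute("SELECT * FROM daily_orders_mv ORDER BY order_date DESC LIMIT 7")
print(cur.fetchall())

cur.close()
conn.close()
```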

Operations and Statistics Director

Sagarpa - Federal Government

02/2013 - 02/2017

Data Collection: Gather relevant agricultural data from various sources such as farmers, agricultural organizations, research institutions, and government agencies.

Data Analysis: Use statistical techniques and software tools to analyze the collected data, identify trends, patterns, and correlations, and derive actionable insights (a small trend-analysis sketch follows this role).

Reporting: Prepare comprehensive reports and presentations summarizing the analyzed data, including key findings, trends, and recommendations for policymakers and stakeholders.

Database Management: Maintain and update agricultural databases, ensuring data integrity, accuracy and security.

Contact: Victor Hugo Celaya
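A toy Pandas sketch of the kind of trend analysis described above; the dataset is invented for illustration.

```python
import pandas as pd

# Hypothetical yearly crop-production extract.
df = pd.DataFrame({
    "year":  [2013, 2014, 2015, 2016],
    "state": ["Sonora"] * 4,
    "tons":  [1200, 1350, 1280, 1500],
})

# Year-over-year trend: percentage change in production within each state.
df["yoy_pct"] = (df.sort_values("year")
                   .groupby("state")["tons"]
                   .pct_change()
                   .mul(100)
                   .round(1))
print(df)
```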

TOP 3 PROJECTS

Aeromexico BI Connections and Groups

Optimized Fleet Utilization: By analyzing connections data, we were able to optimize fleet utilization by strategically scheduling flights to maximize the number of connections at key hubs.

Identified Profitable Routes: Through connections analysis, we identified profitable routes with high demand for connecting flights.

Reduced Operational Costs: Connections analysis enabled us to identify opportunities to reduce operational costs by optimizing flight schedules and routing to minimize fuel consumption and other operating expenses.

Increased Ancillary Revenue: Leveraging connections data, we developed targeted strategies to increase ancillary revenue by offering relevant services and amenities to connecting passengers.

Data Integration Tools Used: Azure SQL Server Integration Services (SSIS)

Coca-Cola - Distribution Agreement

Optimized Distribution Network: Through BI analysis of distribution agreements, we identified inefficiencies and bottlenecks in the distribution network.

Improved Inventory Management: Utilizing BI tools, we gained insights into inventory levels, demand forecasts, and stock replenishment patterns.

Enhanced Sales Performance: BI analysis of distribution agreements allowed us to identify sales trends, customer preferences, and market opportunities.

Strengthened Partner Relationships: By analyzing data from distribution agreements, we gained a deeper understanding of partner performance, compliance with contractual terms, and customer service levels.

Data Integration Tools Used: Informatica


Tupperware - BI Orders Fulfillment

Optimized Inventory Levels: Through BI analysis of order fulfillment data, we gained insights into product demand patterns, lead times, and inventory turnover rates.

Reduced Order Processing Time: Utilizing BI tools, we identified inefficiencies in the order fulfillment process, such as bottlenecks in order processing, picking, and shipping.

Enhanced Customer Service Levels: BI analysis allowed us to track order fulfillment metrics, such as order cycle time, on-time delivery performance, and order accuracy rates.

Optimized Distribution Network: Leveraging BI insights, we evaluated the performance of distribution channels, transportation routes, and warehouse operations.

Data-Driven Decision Making: BI analysis empowered Tupperware to make data-driven decisions at every stage of the order fulfillment process.

Data Integration Tool Used: IBM

EDUCATION

Data Science and Analytics - Postgraduate

San Diego State University

06/2022 - 04/2023, San Diego, California, USA

Real Estate Predictor Machine Learning Module - Thesis

Master of Business Administration

University of Arizona

06/2007 - 08/2008, Phoenix, AZ

How to Take Data Insights to Decision Makers.

Bachelor's Degree in Finance and Accounting

University of Sonora

08/1992 - 12/1996, Hermosillo, Sonora

Professional certification, ranked in the top 100 of the country.



