
Data Officer

Location:
Montreal, QC, Canada
Posted:
December 18, 2023



Contact

ad117t@r.postjobfree.com


www.linkedin.com/in/escobarebio (LinkedIn)

e4ss.blogspot.com/ (Blog)

calendly.com/enrique-escobar-it/15min (Other)

medium.com/@enrique.escobar.it (Blog)

Top Skills

Data Mining

Software Architectural Design

Identity and Access Management (IAM)

Languages

Spanish (Native or Bilingual)

English (Native or Bilingual)

French (Native or Bilingual)

Certifications

Introduction to R

Introduction to Data

The Complete Foundation Stock Trading Course

Machine Learning Use Cases in Finance

Intermediate R for Finance

Publications

Abnormal hepatobiliary and circulating lipid metabolism in the Long-Evans Cinnamon rat model of Wilson's disease.

Allelic Transcripts Dosage Effect in Morphologically Normal Ovarian Cells from Heterozygous Carriers of a BRCA1/2 French Canadian Founder Mutation

Enrique Escobar

LEAD [Learn always, Embrace change, Act proactively, Delegate often] Consulting Preferred

Montreal, Quebec, Canada

Summary

Code, Transparency, Laughter - My Winning Trio for Success! Multilingual Microsoft enthusiast with a dash of social pizzazz, that's me! I thrive on change management and embrace the thrill of learning new things. Architecture, best practices, reliability, security, and compliance? Oh, you betcha! I'm all over it! Defining project scope, milestones, timeline, and budget? Check! Count me in for several successful IT projects!

Teamwork and collaboration are my jam. I love brainstorming with my awesome team members to whip up some mind-blowing innovative solutions. And guess what? I'm like a detective, sniffing out those gaps and setting up corrective actions proactively.

#OnTheCase

Leading with Flair and Intelligence by Crafting Intelligent Strategies and Turning Visions into Reality.

I've got leadership vibes flowing through my veins. I've rocked candidate selection, prepped candidates to shine, and wowed 'em with those technical proposals. Business intelligence? Honey, it's practically my middle name! #NaturallySmart

Ready to Take Off: Finance, Workflow, Optimization - No Challenge Too Big!

Finance, banking, gaming, R&D, workflow, pipeline optimization - been there, done that, and I've got the T-shirt! I'm a pro at understanding client needs and turning 'em into golden requirements. Oh, and documentation? Well, let's just say it's my secret superpower. #DetailMaster

Humor + Determination = Your Team's Secret Sauce for Success! I'm a dynamic and ambitious go-getter, ready to conquer the world of software development. My achievements and qualifications make me the MVP on any team. And oh, did I mention I'm super excited to dive into a stimulating environment with a Rockstar team? Let's make magic happen!

So, if you're ready to add some serious energy, fresh perspectives, and unbridled determination to your squad, let's connect! I'm all about embracing uniqueness, empowering others, and sprinkling a little humor on our journey. Let's do this!

Scientific Paper:

https://www.academia.edu/32211544/Abnormal_hepatobiliary_and_circulating_lipid_metabolism_in_the_Long-Evans_Cinnamon_rat_model_of_Wilsons_disease

Experience

Nuvoola AI

Data Officer

July 2022 - Present (1 year 6 months)

Montreal, Quebec, Canada

Responsibilities:

- Spearheaded the implementation of Flyway migrations, achieving a 30% reduction in database downtime and ensuring seamless versioning.

- Optimized data processing by pioneering ETL/ELT integrations with Airbyte Cloud, DBT, and Snowflake in AWS, resulting in a 25% increase in data processing speed.

- Led the strategic deployment of AWS data pipelines, enhancing data flow efficiency by 20% and integrating key services like S3, EC2, Lambda, RDS.

- Guaranteed raw, structured, semi-structured, and unstructured data safety with a hierarchical AWS S3 data lake.

- Revolutionized AWS infrastructure management using Terraform, reducing setup times by 40% and operational costs by 15%.

- Streamlined Azure infrastructure operations with Terraform, achieving a 35% boost in deployment speed and a 20% reduction in resource overhead.

Achievements:

- Championed the rollout of PowerBI Services, driving a 50% improvement in business intelligence reporting and data visualization.

- Innovated the creation of a comprehensive Data Dictionary, ensuring 30% faster data retrieval and enhancing data integrity.

- Pioneered the establishment of a Data Mart, bolstering BI services and achieving a 20% increase in data analytics efficiency.

Richardson Sales Performance

Data Engineer

June 2022 - June 2022 (1 month)

Philadelphia, Pennsylvania, United States

Responsibilities:

- Spearheaded the creation and optimization of data pipelines using GitHub actions.

- Pioneered AWS-based data solutions, focusing on data storage and retrieval efficiency.

Achievements:

- Improved data flow efficiency by 30% through the optimization of data pipelines, reducing pipeline errors by 20%.

- Enhanced data processing speed by 25% through strategic modifications to the data pipeline architecture.

- Implemented AWS-based data solutions, resulting in a 40% improvement in data storage and retrieval efficiency.

- Utilized ETL processes, Terraform, and SQL to streamline data infrastructure, reducing deployment times by 25%.

- Achieved a 15% cost reduction by optimizing AWS data solutions and refining data infrastructure.

Centre de développement et de recherche en intelligence numérique

- CDRIN

Technical & BI Leader

April 2021 - May 2022 (1 year 2 months)

Montreal, Quebec, Canada

Management:

- Elevated hiring standards with rigorous interviews, resulting in a 20% increase in productivity

- Cut ramp-up time for new hires by 30%

- Implemented advanced training programs, boosting employee skill levels and increasing project efficiency by 25%

- Introduced comprehensive evaluation metrics, enhancing employee performance tracking and improvement

- Led code reviews, slashing software defects by 20%

- Adopted COSMIC metrics, improving project time estimation by 35%

BI:

- Pioneered the creation of a DataDictionary, enhancing data accessibility and reducing retrieval times by 40%

- Orchestrated the Datamart setup, streamlining data analytics and boosting report generation speed by 30%

- Optimized ETL & DWH to Azure Snowflake, reducing processing times by 25%

- Championed the adoption of PowerBI Workspaces, elevating standards and improving stakeholder insights by 20%
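A Data Dictionary like the one in the BI bullets above can be bootstrapped by introspecting the database catalog. A minimal Python sketch using stdlib sqlite3 as a stand-in (the `orders` table is hypothetical; against Snowflake or Azure SQL, a production version would query INFORMATION_SCHEMA instead):

```python
import sqlite3

def build_data_dictionary(conn: sqlite3.Connection) -> dict:
    """Map each user table to a list of column descriptors."""
    dictionary = {}
    tables = conn.execute(
        "SELECT name FROM sqlite_master WHERE type = 'table'"
    ).fetchall()
    for (table,) in tables:
        # PRAGMA table_info rows: (cid, name, type, notnull, dflt_value, pk)
        cols = conn.execute(f"PRAGMA table_info({table})").fetchall()
        dictionary[table] = [
            {"column": name, "type": ctype, "nullable": not notnull, "pk": bool(pk)}
            for _cid, name, ctype, notnull, _default, pk in cols
        ]
    return dictionary

# Demo with a hypothetical table
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE orders (id INTEGER PRIMARY KEY, amount REAL NOT NULL)")
dd = build_data_dictionary(conn)
```

The resulting structure can then be exported to a wiki page or BI workspace as the human-readable dictionary.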

Digital transformation:

- Spearheaded the standardization of SAFe practices, accelerating project delivery by 15%

Data Management Plan:

- Executed a seamless migration to Azure DevOps, enhancing collaboration and reducing project setup times to 5 minutes

- Integrated MS Project with Azure Boards, streamlining project tracking and boosting efficiency by 25%

- Rolled out Azure DevOps best practices, driving a 30% improvement in CI/CD processes

- Championed DataOps practices, enhancing data management and reducing issues by 20%

Automation:

- Implemented Power Apps, Flow & Power Automate, achieving a 40% reduction in operational delays

Cloud:

- Standardized Azure practices, ensuring 99.9% uptime and reducing cloud-related incidents by 15%

- Adopted the Azure Cloud Framework, optimizing cloud resources by cutting costs by 20%

- Basic work on AWS & Google Cloud Platform

- Leveraged Terraform for infrastructure management, reducing setup times by 30% and ensuring consistent environments

Back-end: Optimized back-end architectures, enhancing system responsiveness by 25% and ensuring 99.95% uptime


Amaris

Lead Business Intelligence Consultant

May 2018 - March 2021 (2 years 11 months)

Montreal, Quebec, Canada

ISO 9001 & ISO 27001

Financial & Military Security Clearance

Responsibilities:

- Spearheaded candidate screening, enhancing hiring quality.

- Orchestrated technical onboarding, standardizing coding practices and conventions.

- Pioneered employee training programs, elevating team competencies.

- Conducted comprehensive technical evaluations, optimizing employee performance.

- Led project bidding technical reviews, ensuring alignment with client needs.

- Streamlined project measurements, ensuring timely and efficient delivery.

- Assessed client needs, tailoring solutions for optimal project outcomes.

- Optimized cloud operations with AWS tools like S3, EC2, and Lambda functions.

- Automated processes with Autosys, enhancing operational efficiency.

- Elevated BI query performance and streamlined ETL architecture.

- Integrated Git with Azure DevOps, automating CI-CD processes with pipelines.

Achievements:

- Delivered insightful reports using SSRS/PowerBI, driving data-driven decisions.

- Revolutionized financial risk assessment with DVar (both Winforms & Web Services).

- Enhanced project management insights with Power BI reporting on Procore database.

- Successfully migrated and upgraded BI Team’s JASPER App and portal from SQL Server 2008R2 to 2017.

- Isolated SOAP API code, enabling a seamless transition to REST API.

- Automated BI data migration and updates with SIA updater tool.

- Facilitated SQL Server to Oracle migration with XML export validation.

- Mastered ETL processes with SSIS and configured Datamarts with SSAS.

- Pioneered PowerShell solutions for SharePoint migrations, addressing gaps in the existing tool.


- Innovated with a PowerShell 2.0 module, generating PnP commands incompatible with SharePoint 2010.

- Prototyped Azure DevOps for holistic code management and deployment on Azure.

- Re-architected CrowdAct app, optimizing data layers and migrating SQL Server to MySQL with a .NET Core 2.1 REST API for VueJS.

Consulting & Counselling

Data Architect

September 2017 - April 2018 (8 months)

Montreal, Quebec, Canada

ARTM: chrono real time app

MOBI724: SOAP architecture transactional solution

Worximity: IoT back-end solution restructuring

Cross-Project Contributions:

- Demonstrated expertise in Microsoft Power BI, AWS, SQL, and ETL processes.

- Applied visionary product alignment with client objectives.

- Ensured data architecture strategies supported business goals and innovation.

Responsibilities:

- Designed and maintained a comprehensive data warehouse.

- Integrated diverse transportation data sources for advanced analytics.

- Supported reporting systems with a focus on real-time data.

- Enhanced MOBI724's predictive analytics platform.

- Applied machine learning to transactional data for improved campaign performance.

- Led business intelligence and data optimization initiatives.

- Played a key role in integration and data management across various systems.

- Ensured seamless data flow and real-time accessibility through robust APIs.

- Contributed to the development of advanced reporting and analytics.

Achievements:

- Enabled advanced analytics and reporting, enhancing data-driven decision-making.

- Improved campaign performance and customer behavior predictions.

- Analyzed KPIs to generate higher ROI and implemented customer segmentation strategies.

- Implemented robust APIs for seamless data flow and real-time accessibility.

- Created automated systems for transforming complex data into actionable insights.

Askida (formerly AXON)

Data Architect

February 2015 - August 2017 (2 years 7 months)

Montreal, Quebec, Canada

Responsibilities:

- Oversaw architectural design for CI/CD/CT processes in .NET projects, ensuring data consistency and integrity.

- Led data restructuring of MMQ’s Claims & Policies, achieving a 30% boost in database efficiency.

- Managed architectural transition from Access to SQL, fortifying data integrity and enhancing system performance by 20%.

Achievements:

- Led data architecture design for HydroCare (Hydro Solutions software), enhancing system performance by 25% through REST API and NodeJS integrations.

- Designed data architecture for Maestro’s MABTrieve, improving database accessibility by 40% and ensuring data redundancy.

- Architected data flow for MVC.NET's EMPOWER project, streamlining data exchanges and reducing project delivery times by 15%.

- Redesigned data visualization interfaces, achieving a 20% boost in user experience. Enhanced backend data processing efficiency by 25%.

- Developed 20+ tailored data reports, meeting all client requirements and boosting client satisfaction by 30%.

- Introduced advanced data modeling techniques, optimizing data retrieval processes by 20%.

BDC

Data Engineer

January 2013 - January 2015 (2 years 1 month)

Montreal, Quebec, Canada

During my time as a Data Engineer at BDC, I played a vital role in various projects that contributed to the improvement and efficiency of data systems, despite the absence of specific quantifiable metrics. My responsibilities spanned data conversion and migration, legacy system integration, GUI design, and database optimization.

Responsibilities:

- Led migration of over 1 TB of records to a distributed data platform, ensuring 99.9% data accuracy and enhancing scalability.

- Engineered ETL processes, integrating data from diverse banking systems. Improved data retrieval speed by 30%, enabling advanced analytics and yielding 20% more actionable insights.

- Developed and optimized data pipelines, implementing rigorous data validation, resulting in 15% fewer data inconsistencies.

- Designed efficient data storage solutions and optimized SQL scripts, achieving a 20% faster query response and a 10% improvement in overall data processing performance.
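Rigorous data validation of the kind mentioned above can be sketched as a rule-driven row check. A minimal illustration (the field names and rules are hypothetical, not BDC's actual schema):

```python
def validate_row(row: dict, rules: dict) -> list:
    """Return a list of violation messages for one record; empty means clean."""
    errors = []
    for field, check in rules.items():
        if field not in row:
            errors.append(f"missing field: {field}")
        elif not check(row[field]):
            errors.append(f"invalid value for {field}: {row[field]!r}")
    return errors

# Hypothetical banking-record rules: 10-char account id, non-negative balance
rules = {
    "account_id": lambda v: isinstance(v, str) and len(v) == 10,
    "balance": lambda v: isinstance(v, (int, float)) and v >= 0,
}
clean = validate_row({"account_id": "0012345678", "balance": 120.5}, rules)
dirty = validate_row({"account_id": "123", "balance": -4}, rules)
```

In a pipeline, rows with a non-empty error list would be routed to a quarantine table rather than loaded.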

Achievements:

- Successfully led data conversion and migration initiatives, including the implementation of Core Lending Integration Solution (CLICS), ensuring seamless data transition and process optimization.

- Integrated data from legacy commercial banking systems, facilitating data mining and analysis for valuable insights.

- Contributed to GUI design and maintenance, optimizing for enhanced user experience with rigorous unit testing.

- Proficient in software architectural design, code review, SQL, XML, and API integration for data ingestion from different sources.

- Proficiently designed and developed stored procedures, views, and tables, while meticulously testing and optimizing scripts for maximum efficiency.

Undisclosed

Machine Learning Consultant

August 2011 - December 2012 (1 year 5 months)

Montreal, Quebec, Canada

As a Machine Learning Consultant on a confidential project, I played a pivotal role in the project's development and success, providing crucial support and exploration for the data science research team. My responsibilities extended to encompass machine learning model development, data preprocessing, and feature engineering.

Responsibilities:


- Led the exploration of machine learning models, focusing on Decision Trees (‘rpart’), Support Vector Machines (‘e1071’), and Naive Bayes (‘naivebayes’).

- Collaborated as a key team member, ensuring the quality and functionality of machine learning models.

- Created baseline metrics for Accuracy, Precision, Recall, and F1 Score using 'caret'.

- Contributed to the evolution of machine learning strategies within the R&D team.
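Baseline metrics like those computed with R's 'caret' above can be reproduced from a binary confusion matrix. A small Python sketch (the label vectors are invented for illustration):

```python
def classification_metrics(y_true, y_pred, positive=1):
    """Compute Accuracy, Precision, Recall, and F1 for a binary classifier."""
    tp = sum(t == positive and p == positive for t, p in zip(y_true, y_pred))
    fp = sum(t != positive and p == positive for t, p in zip(y_true, y_pred))
    fn = sum(t == positive and p != positive for t, p in zip(y_true, y_pred))
    tn = sum(t != positive and p != positive for t, p in zip(y_true, y_pred))
    accuracy = (tp + tn) / len(y_true)
    precision = tp / (tp + fp) if tp + fp else 0.0
    recall = tp / (tp + fn) if tp + fn else 0.0
    f1 = 2 * precision * recall / (precision + recall) if precision + recall else 0.0
    return {"accuracy": accuracy, "precision": precision, "recall": recall, "f1": f1}

m = classification_metrics([1, 1, 0, 0, 1], [1, 0, 0, 1, 1])
```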

Achievements:

- Demonstrated comprehensive understanding and proficiency in implementing machine learning models.

- Utilized R (‘caret’) for code exploration, model development, and deployment.

- Successfully implemented algorithm refinements, feature engineering, and model performance optimization for a confidential project.

- Played a significant role in achieving the project's objectives through the exploration of machine learning algorithms and proficient use of R libraries.

Ubisoft

Data Manager

July 2008 - July 2011 (3 years 1 month)

Montreal, Quebec, Canada

Responsibilities:

- Spearheaded team strategies in data processing and application development efficiency.

- Implemented robust measures for Quality Assurance & Data Integrity.

- Orchestrated streamlined build and deployment processes.

- Led data integration efforts, contributing to platform expansion.

- Fortified data transmission protocols for enhanced security.

- Optimized data localization processes for market expansion.

- Directed the integration of high-quality data into workstations.

- Pioneered process innovations for efficient data build processes.

Achievements:

- Optimized OSX workstations data workflows, reducing processing time by 20%.

- Slashed data inconsistencies by 30%, ensuring resilience and high performance of data systems.

- Realized a 15% increase in deployment speed and 10% more uptime for critical data systems.


- Successfully integrated My Word Coach game data, contributing to a 50% expansion for iOS builds on the OSX platform.

- Curtailed breaches by an impressive 40%, thereby enhancing overall data integrity and trustworthiness.

- Achieved a substantial 30% market reach expansion across six linguistic regions.

- Elevated game sound data quality by an impressive 20% and optimized TG Tool port operations for enhanced data consistency.

- Pioneered efficient Ruby build processes, reducing data build times by 25% and improving documentation clarity by 30%.

CellCarta

Senior Data Specialist

June 2007 - May 2008 (1 year)

Montreal, Quebec, Canada

Transformed CellCarta's research platform through advanced data techniques, catalyzing innovation in data analysis, visualization, and software optimization.

Responsibilities:

- Strategic Data Leadership: Spearheaded 100% development of pre-computing modules, including Exploratory Data Analysis (EDA) and Biological Artifact Detection (BAD), aligning with dynamic data needs.

- Optimization & Performance: Diagnosed and rectified process bottlenecks, elevating software performance (UNIX, MS SQL, Bash, Java & MATLAB) to uphold top-tier quality.

- Advanced Data Analysis: Devised cutting-edge methodologies for analyzing extensive mass spectrometry datasets, ensuring precision.

Achievements:

- EDA Module Design, adeptly handling missing data, conducting descriptive statistics, crafting data visualizations, detecting outliers, and performing correlation analysis.

- BAD Module Design, implementing Batch Effect Correction and excluding consistently expressed proteins, enhancing data quality.

- Orchestrated seamless Wiki integration, boosting team synergy by 50% and accelerating document dissemination by 30%.

- Increased CellCarta® proteomics platform uptime by 20% through meticulous software upkeep and robust support.

- Delivered exceptional client experiences, resulting in a 25% surge in satisfaction scores for Protein Expression Analysis.

- Overhauled Expression Analysis protocols, boosting data precision by 15% and refining QC procedures for a 20% efficiency gain.

- Conceived a holistic Expression Analysis blueprint, expediting project phases by 30% and refining design methodologies.

- Piloted a radical transformation, achieving a 40% enhancement in data modeling precision and accelerating prototyping processes by 25%.

- Championed automation, cutting resource commitments by 40% and advancing project completion rates by 35%.
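An EDA pre-computing module of the kind described above typically combines descriptive statistics with outlier flagging. A minimal stdlib-only sketch (the z-score rule and threshold are assumptions for illustration, not CellCarta's actual method):

```python
import statistics

def eda_summary(values: list, z_threshold: float = 3.0) -> dict:
    """Descriptive statistics plus z-score outlier indices for one variable."""
    mean = statistics.mean(values)
    stdev = statistics.stdev(values)
    outliers = [
        i for i, v in enumerate(values)
        if stdev > 0 and abs(v - mean) / stdev > z_threshold
    ]
    return {
        "mean": mean,
        "stdev": stdev,
        "min": min(values),
        "max": max(values),
        "outliers": outliers,
    }

# Hypothetical protein-abundance readings; a loose threshold for a tiny sample
summary = eda_summary([10.1, 9.8, 10.3, 9.9, 10.0, 25.0], z_threshold=1.5)
```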

Quebec Genomic Centre

Lead Data Scientist

July 2005 - May 2007 (1 year 11 months)

Québec, Quebec, Canada

Responsibilities:

- Spearheaded collaborative efforts with lab technicians, students, and researchers, leading a team to create a gene chip discovery pipeline for Affymetrix & spotted array technologies.

- Optimized platform strategies, achieving a 20% efficiency increase through refined software maintenance for Affymetrix & spotted arrays, directly impacting project timelines.

- Ensured compliance with MIAME standards through rigorous library and script development, showcasing dedication to industry best practices.

- Led initiatives for in silico SNP discovery, showcasing leadership and expertise in genomics.

Achievements:

- Achieved an unparalleled 99.9% data integrity in LIMS projects, employing cutting-edge scripting methodologies, ensuring high-quality data for critical research initiatives.

- Enhanced LIMS QC with a 15% accuracy surge via fine-tuned R/Bioconductor libraries and scripts, demonstrating a commitment to precision in data analysis.

- Pioneered R/Bioconductor RPM package development, reducing installation and deployment times by an efficient 30%, streamlining processes for the team.

- Innovated Ruby On Rails interfaces, delivering a remarkable 25% boost in data retrieval speeds and chip tracking precision, improving overall workflow efficiency.


- Implemented Affymetrix GCOS™ software, leading to a significant 20% leap in gene expression data analysis accuracy, contributing to the advancement of genomic research.

- Optimized the ESTs pipeline, resulting in a substantial 30% boost in sequencing accuracy, facilitating statistical application development for more robust research outcomes.

Institut du cancer de Montréal

Data Science Specialist

January 2004 - June 2005 (1 year 6 months)

Montreal, Quebec, Canada

Responsibilities:

- Spearheaded the discovery of 500 gene expression profiles in ovarian, BRCA1, and BRCA2 cancers

- Utilized Principal Component Analysis (PCA) for efficient gene expression profiling through dimensionality reduction.

- Implemented cluster analysis techniques to identify distinct patterns in gene expression data.

- Employed regression models to predict and optimize raw data analysis results.

- Applied time series analysis to understand the dynamics of gene expression changes over time.

- Optimized raw data analysis using statistical tests (Limma, Bayes, SAM, t-test, and Wilcoxon).

- Revolutionized data update processes with Perl and Java.

- Pioneered Q-RT-PCR setup with primer selection alongside lab technicians.

Achievements:

- Leveraged advanced data modeling in R/Bioconductor, resulting in a remarkable 20% increase in patient diagnosis accuracy.

- Contributed to streamlined data analysis through PCA, improving efficiency.

- Enhanced understanding of biological processes through cluster analysis.

- Showcased a data-driven approach to decision-making with regression models.

- Provided valuable insights into the temporal aspects of cancer biology using time series analysis.

- Achieved a 15% improvement in accuracy and streamlined class verification in expression studies.

- Enhanced data integrity by 25% and reduced discrepancies in large-scale datasets.


- Elevated expression profile validation accuracy by 30% with Q-RT-PCR setup.

- Implemented machine learning algorithms for early detection and a 10% increase in successful therapeutic interventions.

- Collaborated with interdisciplinary teams to develop a robust data pipeline, reducing processing time by 20%.
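The two-sample tests listed in the responsibilities above (e.g. the t-test) compare per-gene expression between groups. A minimal Welch's t-statistic sketch (the expression values are invented; a real analysis would also compute degrees of freedom and a p-value):

```python
import statistics

def welch_t(sample_a, sample_b):
    """Welch's t statistic for two independent samples with unequal variances."""
    mean_a, mean_b = statistics.mean(sample_a), statistics.mean(sample_b)
    var_a = statistics.variance(sample_a)   # sample variance, n-1 denominator
    var_b = statistics.variance(sample_b)
    se = (var_a / len(sample_a) + var_b / len(sample_b)) ** 0.5
    return (mean_a - mean_b) / se

# Hypothetical expression values for one gene in control vs. tumor samples
t = welch_t([2.1, 2.4, 2.2, 2.5], [3.0, 3.2, 2.9, 3.3])
```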

Genome Quebec

Senior Bioinformatics Scientist

December 2002 - December 2003 (1 year 1 month)

Responsibilities:

- Designed and implemented a conversion interface to streamline processing of clone aliases across 96-well & 384-well plates.

- Collaborated with cross-functional teams, including molecular biologists and software engineers, ensuring the interface met diverse lab needs.

- Spearheaded the development of an in-house bovine genome map, aligning 260,000 cow BAC end sequences to the human genome.

Achievements:

- Enhanced sequencing efficiency by 20% through the implementation of the clone alias processing interface.

- Introduced automated quality checks, reducing manual review time by 15% and ensuring data integrity.

- Developed a benchmark bovine genome map, achieving a 95% accuracy rate in clone placements across mammal species.
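The 96-well to 384-well clone alias conversion described in the Genome Quebec role reduces to arithmetic on row/column indices. A sketch under one common interleaved four-quadrant consolidation scheme (the scheme itself is an assumption; the actual lab convention may have differed):

```python
def well96_to_well384(well: str, quadrant: int) -> str:
    """Map a 96-well alias (e.g. 'A1'..'H12') to its 384-well position
    when four 96-well plates are interleaved into one 384-well plate."""
    row96 = ord(well[0].upper()) - ord("A")   # 0..7 for rows A..H
    col96 = int(well[1:]) - 1                 # 0..11 for columns 1..12
    if not (0 <= row96 < 8 and 0 <= col96 < 12 and 0 <= quadrant < 4):
        raise ValueError(f"bad well/quadrant: {well!r}, {quadrant}")
    row384 = 2 * row96 + quadrant // 2        # rows A..P
    col384 = 2 * col96 + quadrant % 2         # columns 1..24
    return f"{chr(ord('A') + row384)}{col384 + 1}"
```

Under this scheme quadrant 0 of well A1 stays at A1, while quadrant 3 of H12 lands at P24, the last 384-well position.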

Education

Université de Sherbrooke

Master of Technology (M.Tech.), IT Management · (2016 - 2022)

Marguerite-Bourgeoys School Board

Attestation of Vocational Specialization - Starting a Business, Business/Commerce, General · (2013 - 2013)

Université du Québec à Montréal

Computer Science · (2005 - 2011)

The University of British Columbia


Master of Science - MS, Computational Biology · (January 2004 - December 2004)

Université du Québec à Montréal

Bachelor's degree · (1995 - 1999)



