SKILLS SUMMARY
• Sr. Data Engineer Experience
• AWS & Azure Cloud Experience
• Airflow Experience
• Snowflake, Redshift, PostgreSQL Experience
• Python Development Experience: 5 years
• Retail Industry Experience: 12 years
• Financial, Energy and Insurance Experience: 12 years
PROFESSIONAL SUMMARY
Dedicated Sr. Data Engineer and Systems Engineer with extensive experience in designing and implementing new data pipelines and optimizing high-availability database environments within highly secure, enterprise-grade infrastructures. Skilled in collaborating across cross-functional teams, including software engineering and network operations, to deliver scalable, secure, and high-performing data solutions. Proven expertise in database installation, upgrades, performance tuning, and automation, with a strong focus on data security and encryption. Adept at optimizing server and database performance, executing complex schema and code changes, and ensuring seamless 24x7 operations through proactive monitoring, recovery planning, and on-call production support. Committed to continuous learning, clear communication, and contributing to architectural design and strategic growth initiatives.
PROFESSIONAL EXPERIENCE
July 2020 – Present
NielsenIQ – Toronto, Canada
Sr. Data Engineer / Sr. Database Administrator (Big Data and DevOps)
Products:
Pricing Optimization; various data integrations into the core product; weather data integration; Promo Data portal and insights; segmentation (customer, vendor, product); supply chain; Vantage
Projects:
• Designed and developed an end-to-end automated (and manual) Redshift database backup and table restore tool, ensuring seamless operation across one or more data warehouses; a major achievement in terms of company cost savings (a minimal sketch of the restore call follows this list).
• Built an ETL pipeline in Python and PostgreSQL to ingest, transform, and load large volumes of transactional data into analytical tables for reporting.
• Designed and developed Azure Data Factory (ADF) pipelines to ingest, transform, and load data from multiple sources such as SQL Server, Azure Blob Storage, and REST APIs.
• Built an end-to-end universal data migration tool that supports multiple data warehouses, delivering significant efficiency and cost-saving benefits for the organization.
• Integrated Azure Data Factory with Azure Data Lake, Azure SQL Database, and Synapse Analytics for end-to-end data integration solutions.
• Automated PostgreSQL maintenance tasks, including vacuuming, indexing, and statistics updates, reducing manual effort and improving database efficiency.
• Migrated Control-M jobs into Airflow DAGs, maintaining existing schedules and execution frequencies.
• Developed a vendor file split and encryption tool using Databricks and a Python file splitter keyed on vendor number.
• Optimized complex PostgreSQL queries and stored procedures, resulting in a 40% improvement in query performance for mission-critical reporting workloads.
• Developed a database migration tool to move on-premises client data from Netezza appliances to AWS, collaborating closely with the operations team to establish new standards, practices, and processes for data integrity and security.
• Collaborated on various technology evaluation projects, including POCs and assessments of Databricks, Snowflake, and AWS Glue for ETL and analytics workloads.
• Led and contributed to data migration initiatives from Amazon Redshift to Snowflake.
• Deployed and managed containerized applications using CI/CD and GitHub pipelines on AWS and Azure cloud platforms.
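Illustrative sketch of the backup/restore tooling above: the core boto3 call that a Redshift table-restore tool can wrap. All cluster, snapshot, schema, and table names are hypothetical placeholders, and a production version would add snapshot discovery, status polling, and error handling.

    import boto3

    def restore_table(cluster_id: str, snapshot_id: str, database: str,
                      schema: str, table: str, new_table: str) -> str:
        """Restore one table from a Redshift cluster snapshot (illustrative sketch)."""
        redshift = boto3.client("redshift")
        resp = redshift.restore_table_from_cluster_snapshot(
            ClusterIdentifier=cluster_id,
            SnapshotIdentifier=snapshot_id,
            SourceDatabaseName=database,
            SourceSchemaName=schema,
            SourceTableName=table,
            TargetDatabaseName=database,
            TargetSchemaName=schema,
            NewTableName=new_table,  # restore side by side, then verify and swap
        )
        return resp["TableRestoreStatus"]["TableRestoreRequestId"]

    # Hypothetical usage: restore a damaged fact table next to the original.
    # restore_table("analytics-prod", "rs:analytics-prod-2024-01-01", "dw",
    #               "public", "sales_fact", "sales_fact_restored")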
Responsibilities
• Host and onboard new client data, set up data pipelines, and work closely with client technical teams.
• Manage and maintain database retention and backup policies, including Snowflake Time Travel configurations, Redshift snapshot and retention settings, PostgreSQL point-in-time recovery (PITR), and cloud database encryption.
• Utilized Snowflake Cortex to perform in-database AI inference and text analysis, enabling secure, scalable insights without data movement.
• Design and implement Apache Airflow DAGs, migrating legacy Control-M jobs into modern Airflow workflows (an illustrative DAG sketch follows this list).
• Review and tune complex SQL queries, stored procedures, and database architectures for efficiency and scalability across PostgreSQL and cloud data warehouses.
• Act as a subject matter expert for Snowflake and Redshift, focusing on system design, database development, and performance optimization.
• Performed thorough review and verification of labeled datasets to confirm correctness before client sign-off.
• Perform PostgreSQL version upgrades, extensions management, and performance tuning through indexing, vacuuming, and query plan analysis.
• Develop Python and shell script based ETL pipelines, integrating data from AWS S3 and Azure Blob Storage with Redshift and Snowflake.
• Collaborate with cross-functional teams to define project scope, perform analysis, design, coding, and testing (unit and integration).
• Write stored procedures and automation scripts using SnowSQL, Python, Java, VB.NET, and Bash, across AWS, Azure, and on-prem SQL environments.
• Implement automated and manual database backup/retention policies, ensuring data availability, accuracy, and security.
• Designed and implemented a Databricks Notebook workflow to automatically split large vendor files by Vendor ID using PySpark, improving data processing efficiency and downstream system consumption.
• Manage database operations, version migrations, and ensure minimal downtime in cloud and hybrid environments.
• Oversee database security, user access, and deliver training and support to end users.
• Monitor performance and recommend improvements to enhance data reliability, authenticity, and system efficiency.
• Developed and deployed AWS and Azure infrastructure using Terraform, including EC2, S3, and RDS resources.
• Managed AWS IAM users, groups, roles, and policies to enforce least-privilege access across multiple AWS accounts.
• Created and maintained IAM roles for EC2, Lambda, ECS, and cross-account access using trust policies.
• Audited IAM permissions using IAM Access Analyzer and CloudTrail to identify and remediate over-privileged access.
• Managed Azure Active Directory (Entra ID) users, groups, service principals, and managed identities across multiple subscriptions.
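A minimal sketch of the Control-M-to-Airflow migration pattern above, assuming Airflow 2.x: each legacy job becomes a task, the job-stream ordering becomes task dependencies, and the schedule preserves the original cron expression. The DAG id, schedule, and commands are placeholders.

    from datetime import datetime, timedelta

    from airflow import DAG
    from airflow.operators.bash import BashOperator

    default_args = {"owner": "data-eng", "retries": 2, "retry_delay": timedelta(minutes=5)}

    # The schedule mirrors the cron expression of the legacy Control-M job (placeholder).
    with DAG(
        dag_id="nightly_client_load",
        default_args=default_args,
        schedule_interval="0 2 * * *",
        start_date=datetime(2021, 1, 1),
        catchup=False,
    ) as dag:
        extract = BashOperator(task_id="extract", bash_command="python /opt/etl/extract.py")
        transform = BashOperator(task_id="transform", bash_command="python /opt/etl/transform.py")
        load = BashOperator(task_id="load", bash_command="python /opt/etl/load.py")

        # Same ordering the legacy Control-M job stream enforced.
        extract >> transform >> load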
Environment:
AWS Cloud, Azure Cloud, Linux, EC2, Netezza, Airflow, Redshift, Snowflake, Databricks, PostgreSQL, MS SQL Server, MariaDB, DynamoDB, Lambda, Docker, Bash, Python, Windows AMI, AWS S3, Terraform, Azure Blob, and related services.
Clients / Projects Supported
Worked with several major international retail and consumer clients, including:
• US Foods (USA), Morrisons (UK), Walmart (USA), Staples (US & Canada), Dollar General (USA), CVS Pharmacy (USA), Hudson’s Bay (Canada), and Southeastern Grocers (SEG)
March 2014 - July 2020
LoyaltyOne – Toronto, Canada
Data Engineer
Projects:
• Mentored and coached team members for technical and professional growth.
• Identified talent within the team to take on new roles as we began our migration to AWS.
• Played a leadership role in the cloud migration, leading all data migration activities.
• Designed and implemented an end-to-end database extraction process usable across all Precima clients’ on-prem databases (a hedged sketch follows this list).
• Led and collaborated on the successful migration of three clients, providing consultation on the fourth to create opportunities for team members to grow their skill sets.
• Served as technical lead for a data center migration, successfully moving code and processes from one data center to another.
• Led design and development of analytics solutions, e.g., retail pricing, assortment, B2B pricing, B2B targeted marketing, and retail targeted marketing.
• Led design and development of multiple solutions for retail and B2B wholesale foods businesses, e.g., business intelligence such as category insights, self-serve advanced analytics, and B2B customer intelligence.
• Led design and development of short-term loyalty program impact tools, designed to ingest data from different retailers around the globe.
• Collaborated and played key roles in process improvements and tool selection, e.g., a workflow automation tool POC, standard support documentation, and design reviews.
• Led database operations activities, gradually delegating to team members to facilitate associate growth within the organization.
• Designed database solutions to support one-to-one marketing reporting and insights at a leading US food distribution company, working closely with the marketing and reporting teams to create an efficient solution for QlikView reporting and insights.
• Designed and implemented a database solution that streamlined reporting and analytics by combining sales, customer, and product data.
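A hedged sketch of the end-to-end database extraction process referenced in the list above: read a table over ODBC, write it to CSV, and stage it in S3. The DSN, table, bucket, and key names are placeholders, and a real implementation would chunk large tables and pull credentials from a vault.

    import csv

    import boto3
    import pyodbc  # generic ODBC client; the DSN below is a placeholder

    def extract_table_to_s3(dsn: str, table: str, bucket: str, key: str) -> None:
        """Dump one table to CSV and stage it in S3 (illustrative sketch)."""
        conn = pyodbc.connect(f"DSN={dsn}")
        local_path = f"/tmp/{table}.csv"
        with conn, open(local_path, "w", newline="") as fh:
            cur = conn.cursor()
            cur.execute(f"SELECT * FROM {table}")  # table names come from an internal config
            writer = csv.writer(fh)
            writer.writerow([col[0] for col in cur.description])  # header row
            for row in cur:
                writer.writerow(row)
        boto3.client("s3").upload_file(local_path, bucket, key)

    # Hypothetical usage:
    # extract_table_to_s3("CLIENT_DW", "sales_fact", "client-staging", "extracts/sales_fact.csv")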
Other:
• Introduced new data audit steps to support analytics for a major US grocery retailer client.
• Learned the tools used by the technology team (Netezza, Unix scripting, and QlikView) in record time and introduced new tips and tricks.
Tools: Snowflake, IBM Netezza, AWS Redshift, EC2, Lambda functions, HTML, UNIX shell scripting, UNIX, AQT, QlikView, DBeaver, Aginity
Responsibilities:
• Serve as a subject matter expert for Netezza system and database development
• Review SQL queries and database design for performance
• Work as part of a project team to coordinate database development and determine project scope and limitations
• Analysis, Design and Development
• Coding using Linux Bash, .Net, Python
• Testing - unit testing & Integration Testing
• Wrote stored procedures in SQL Server 2005.
• Database backup and database failover.
• Develop standards and best practices for Netezza databases, working closely with application teams to implement them
• Implement database versioning, patching, and other database health monitoring
• On-call pager duty, troubleshooting Redshift WLM (workload management) memory allocation to keep production processes running smoothly
• Monitor Snowflake warehouse usage and resources, tracking usage against financial reports and sharing it with multiple teams to keep them on track against their credit quota allocations (a usage-monitoring sketch follows this list)
• Share data using Snowflake’s Secure Data Sharing feature across multiple Snowflake accounts around the globe
• Implement changes in database operations through a secure change ticketing system, ensuring data is readily available to end-users
• Minimizing downtime of databases and ensuring data provided is accurate
• Managing database access and keeping stored data up to date
• Providing training, feedback, and support to users on the usage of data provided
• Ensuring that the database is adequately backed up so that vital information can be retrieved in case of accidental loss or security breach
• Monitoring performance and providing information on ways to improve the authenticity and security of data
• Maintaining the efficiency of the database by providing regular checks and data updates
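For the Snowflake warehouse usage tracking above, a minimal sketch using the Snowflake Python connector against the standard SNOWFLAKE.ACCOUNT_USAGE.WAREHOUSE_METERING_HISTORY view; the account and credentials shown are placeholders and would come from a secrets manager in practice.

    import snowflake.connector

    # Placeholder credentials; in practice these come from a secrets manager.
    conn = snowflake.connector.connect(
        account="xy12345", user="usage_monitor", password="***", role="ACCOUNTADMIN"
    )
    try:
        cur = conn.cursor()
        # Credits consumed per warehouse over the last 30 days.
        cur.execute(
            """
            SELECT warehouse_name, SUM(credits_used) AS credits_30d
            FROM snowflake.account_usage.warehouse_metering_history
            WHERE start_time >= DATEADD('day', -30, CURRENT_TIMESTAMP())
            GROUP BY warehouse_name
            ORDER BY credits_30d DESC
            """
        )
        for warehouse, credits in cur.fetchall():
            print(f"{warehouse}: {credits:.1f} credits in the last 30 days")
    finally:
        conn.close()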
Environment: Netezza, AWS Cloud, Azure Cloud, Linux, EC2, Snowflake, AWS Redshift, Bash shell scripting, Python, Windows AMI, PostgreSQL, MariaDB, DynamoDB, Lambda functions, cloud infrastructure
July 2012 – July 2014
Allstate Insurance – Toronto, Canada
Programmer / Data Engineer
Projects:
• Developed a classifier reporting tool producing daily and monthly reports by policy type and by claimant to meet business requirements.
• Performed data analysis and reporting to support new regulatory changes in the Ontario insurance market.
• Provided data and analytics to other business units through ad-hoc reports as well as reports built on Reporting Services.
• Provided technical solutions to implement new reports and validations from business requirements documents for the Ontario operations team.
Responsibilities:
• Analysis of the specifications provided by the clients
• Client Interaction
• Analysis, Design and Development
• Coding using Linux Bash, VB 6.0
• Testing - unit testing & Integration Testing
• Wrote stored procedures in SQL Server 2005.
• Database backup and database failover.
• Maintaining database efficiency through regular checks, data updates, and data retention policies
• Implementing change requests (CRs) in database operations and ensuring data is readily available to end-users
• Minimizing downtime of databases and ensuring data provided is accurate
• Managing database access and security policies, and keeping stored data up to date and backed up per company backup policy
• Providing training, feedback, and support to users on database usage
• Ensuring that the database is adequately backed up so that vital information can be retrieved in case of accidental loss or security breach, and archiving data to implement cost-saving measures
• Monitoring performance and providing information on ways to improve the authenticity and security of data
Environment: Visual Basic 6.0, C#, SQL Server 2005/2008/2012, SSIS 2012, Linux, Excel. Collaboration with marketers and a campaign execution team in India; third-party data (e.g., InfoGroup, B2B, Focus Canada); use of segmentation, profiling, and modeling scores for campaign executions.
May 2007 – June 2012
Direct Energy – Toronto, Canada
Programmer
Projects:
• Fulfillment of a new master sales agreement for all D2D vendors (Canada and US): designed and developed the Alberta Renewals forecasting system. The system not only eased the commission calculation process, which uses complex criteria, but also provided vendors and vendor managers with up-to-date reporting at the agent and office level to manage performance. The process was set up to support future analytics needs.
• New channel launch (multi-level marketing, aka network marketing): designed and implemented a system for vendor reporting as well as the commission process.
• Integrated regional customer information systems to support pan-North American reporting.
• Reduced errors and the work of two FTEs by automating gas renewal submissions; the error reductions protected the business from revenue losses and complaints and kept it compliant with regulatory requirements.
• Performed data analysis and reporting to support new regulatory changes in the Ontario energy market.
• Provided data and analytics to other business units through ad-hoc reports as well as reports built on Reporting Services.
• Provided technical solutions to implement new reports and validations from business requirements documents for the Ontario operations team.
Projects:
• Implemented undeliverable email cleanup in the marketing database to turn the DNC (Do Not Contact) metric on the marketing scorecard green.
• Increased the IT decision-maker universe in the marketing database:
• Improved SQL code to increase the universe (from 300K to 850K) for a third-party data augmentation project.
• Validated and audited the results of the third-party data augmentation and worked with the vendor to improve them.
• Identified data and approaches through which these results could be passed to the system.
Other:
• Produced advanced and complex list generation data files.
• Performed changes to internal tools, managed sandbox servers, and collected change requests for automation.
Responsibilities:
• Analysis of the specifications provided by the clients
• Client Interaction
• Analysis, Design and Development
• Coding using Linux Bash, VB 6.0, and C#
• Testing - unit testing & Integration Testing
• Wrote stored procedures in SQL Server 2005.
• Uploaded the complete system with SQL Server configuration to a dedicated web server.
Environment: Visual Basic 6.0, C#, SQL Server 2005/2008/2012, SSIS 2012, Linux, Excel. Collaboration with marketers and a campaign execution team in India; third-party data (e.g., InfoGroup, B2B, Focus Canada); use of segmentation, profiling, and modeling scores for campaign executions.
March 2006 – May 2007
Al-Moayed Telecom – Bahrain
Programmer
Project: International Call Assistance (ICA) application at Saudi Telecom (STC), Jeddah, Saudi Arabia
An International Call Assistance application to support call center agents in guiding international calls.
• Analysis of the specifications provided by the clients
• Client Interaction
• Analysis, Design and Development
• Coding using Linux Bash, VB 6.0, and C#
• Testing - unit testing & Integration Testing
• Wrote stored procedures in SQL Server 2005.
• Uploaded the complete system with SQL Server configuration to a dedicated web server.
Environment: Visual Basic 6.0 (C#), SQL Server 2005, Linux, DOS
March 2006 – May 2007
Al-Moayed Telecom – Bahrain
Programmer
Project: Mediation System at Bahrain Telecom (Batelco) Bahrain
• Responsible for designing, developing, supporting, monitoring daily activity on, and maintaining the 5 TB+ Call Detail Records (CDR) data warehouse.
• Reviewed daily CDRs and identified the need for more efficient methods for business users to analyze available CDR data.
• Defined and distributed daily work to team members, prototyped solutions, obtained senior management approval, and managed system design, coding, testing, and release.
• The data warehouse is now the basis for all business intelligence for the Batelco operator, simplifying advanced analysis, expanding business intelligence capabilities, and enabling market segmentation.
• Additionally, worked on delivering an information distribution portal providing access to analytical data.
Responsibilities:
• Analysis of the specifications provided by the clients
• Client Interaction
• Analysis, Design and Development
• Coding using Linux Bash, VB 6.0, and C#
• Testing - unit testing & Integration Testing
• Wrote stored procedures in SQL Server 2005.
• Uploaded the complete system with SQL Server configuration to a dedicated web server.
March 2006 – May 2007
Al-Moayed Telecom – Bahrain
Programmer
Project: Directory Enquiries System at Bahrain Telecom (Batelco) Bahrain
Automated the Directory Enquiries system to route received calls to support agents based on skill level, e.g., 151 priority calls.
• Analysis of the specifications provided by the clients
• Client Interaction
• Analysis, Design and Development
• Coding using Linux Bash, VB 6.0, and C#
• Testing - unit testing & Integration Testing
• Wrote stored procedures in SQL Server 2005.
• Uploaded the complete system with SQL Server configuration to a dedicated web server.
Environment: Visual Basic 6.0 (C#), SQL Server 2005, Linux
September 2001 – March 2006
Al-Moayed Group – Bahrain
Programmer & Team lead
Project: RYTE ERP Solution
A fully in-house developed solution: a comprehensive ERP system with integrated modules covering all aspects of a financial accounting system, of the class often used by small and mid-size organizations to track their finances. The main modules in the RYTE system are GL, BB, AR, AP, Inventory, HR, PO, SO, and POS, with multiple stores connected to head office for centralized reporting. Features include well-designed backup and restore database operations, transaction management with insert, delete, and restore, housekeeping reports, an availability summary and stay information grid, transaction history, a secure database, and extended help. In the HRMS, all modules are tightly integrated with each other.
Client: Intercol – Consumer Products Division (CPD)
Team lead for client-side implementation and production monitoring.
Client: MEFT – Middle East Food Co.
Team lead for client-side implementation and production monitoring.
Client: DEEKO – Bahrain
Team lead for client-side implementation and production monitoring.
Client: GCMS (General Cleaning and Maintenance Service) – Bahrain
Team lead for client-side implementation and production monitoring.
Client: Al-Ansari Lightings and Fixtures – Exhibition Ave, Bahrain
Team lead for client-side implementation and production monitoring.
Client: Al-Zayani Investment – Exhibition Ave, Bahrain
Team lead for client-side implementation and production monitoring.
Client: Gulf Air (Bahrain) – Cabin crew accommodation and enquiry system
Provided support for the internal enquiry (call) and search system.
Client: Bahrain Jewellery Center (BJC)
Provided support for the internal enquiry (call) and search system.
Similar experience with other valued clients in Bahrain:
Fine Foods – Manama; Universal Foods – Manama; Crystal Palace – Seef, Manama; Next UK – Seef, Manama; IL Marcato – Seef, Manama; Farooghi Pharmacy / Tariq Pharmacy – Manama; Public Securities – Manama
• Designing frontend screens based on customer input and customization
• Client Interaction
• Analysis, Design and Development
• Coding using PowerBuilder and stored procedures
• Testing - unit testing & Integration Testing
• Wrote stored procedures in SQL Server 2000.
• Uploaded daily sales reports to head office for a centralized view of daily sales by store.
Environment: PowerBuilder 6.5, MS-DOS 6.22, Sybase Anywhere, Adaptive Server, and SQL Server 2000
November 1999 – July 2001
Al-Moayed – Bahrain
Programmer
Project: CAM – Central Alarm Monitoring System
Developed an end-to-end frontend to track major, medium, and minor alarms, with an alarm-monitoring popup screen driven by errors reported by the onsite BABX. Also developed end-to-end alarm lifecycle reporting to give senior management a high-level view of overall system health.
• Client Interaction
• Analysis, Design and Development
• Coding using Visual Basic 6.0 and Crystal Reports
• Testing - unit testing & Integration Testing
• Wrote stored procedures in SQL Server 6.5.
Environment: Visual Basic 6.0, DRD reports, PowerBuilder 6.5, SQL Server 6.5, HTML.
EDUCATION
KU University
B.Com.
ICT Ltd.
Advanced Diploma, Software Technology & System Management
York University, Toronto
High Impact Business Communication; Writing: Reports, Proposals, Emails and Letters
CERTIFICATIONS
• Snowflake SnowPro Core
• AWS Certified Solutions Architect – Associate
• Microsoft Azure Administrator Associate
• Ansible Playbook
• MSDBA