Alla Aleyner
* ******* **** ****, *****, ON. 416-***-**** ****.*******@*****.***
Education
Bachelor of Computer Science Degree (York University, Toronto, Canada)
Big Data Analyst Certificate (York University, Toronto, Canada)
Technical Summary
Programming languages:
Python, PySpark, SQL, Unix shell scripting, Java/J2EE, SAS; Unix/Linux, cloud, and Hadoop environments
Databases:
Oracle 8.x/9, SQL Server, MySQL, DB2, MS Access
Operating systems:
Windows 2003/NT, Windows XP, Windows 7, Linux, Sun/Solaris, HP-UX, AIX, Mainframe, AS400, BB10, iOS, Android
Web servers:
BEA WebLogic, Apache/Tomcat, IBM WebSphere
Third party software:
Power BI, JAWS, Selenium, Cognos, Postman, Informatica, Teradata, SSIS, SharePoint, Business Objects, Crystal Reports, TOAD, QTP
Office Tools:
MS Project, Visio, PowerPoint, Excel, UltraEdit
Bug tracking software:
DevOps, Jira, Test Director, AWE, HP ALM, HP QC (Quality Centre)
Ministry of Public and Business Service Delivery
January 2024-Present
Data Validation Engineer/Sr. ETL Tester
Automated Slowly Changing Dimension (SCD) delta load processes in the cloud using shell scripts and SQL.
Contributed to the creation and execution of Test Plans, Test Strategies, Quality Standards, and Test Procedures.
Built ETL automation scripts to ensure data quality and integrity prior to production deployments.
Verified ETL processes, metadata accuracy, referential integrity, and reporting outputs using SQL and PySpark.
Validated data ingestion from Parquet, JSON, Excel, and CSV sources using PySpark.
Utilized PySpark hash joins to validate field-level data consistency across large datasets.
Designed and executed complex nested SQL queries incorporating UNION, MINUS, ranking, and partitioning for data validation.
Developed validation logic to check date formats, null values, and duplicate records using PySpark and SQL.
Tested Power BI reports by validating aggregations and pivoted data using PySpark.
Participated in Agile projects using DevOps, GitHub, JIRA, and Confluence.
Conducted testing activities in alignment with Agile project methodologies.
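The null, duplicate, and date-format checks described above can be sketched as follows. This is an illustrative plain-Python sketch only (the project work itself used PySpark DataFrames); all field and function names here are hypothetical.

```python
from datetime import datetime

def validate_rows(rows, key, date_field, date_format="%Y-%m-%d"):
    """Flag null keys, duplicate keys, and malformed dates in a batch of rows."""
    issues = []
    seen = set()
    for i, row in enumerate(rows):
        k = row.get(key)
        if k is None:
            issues.append((i, "null key"))
        elif k in seen:
            issues.append((i, "duplicate key"))
        else:
            seen.add(k)
        # A date that fails to parse under the expected format is flagged.
        try:
            datetime.strptime(row.get(date_field, ""), date_format)
        except (TypeError, ValueError):
            issues.append((i, "bad date"))
    return issues

rows = [
    {"id": 1, "load_date": "2024-01-15"},
    {"id": 1, "load_date": "2024-01-16"},    # duplicate id
    {"id": None, "load_date": "15/01/2024"}, # null id, wrong date format
]
print(validate_rows(rows, "id", "load_date"))
# → [(1, 'duplicate key'), (2, 'null key'), (2, 'bad date')]
```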
Sunlife
September 2022-December 2023
ETL Tester
Conducted data testing on the cloud using PySpark, Python, and SQL for Workforce datasets.
Developed PySpark test scripts leveraging pivot tables, partitioning, ranking functions, nested queries, and validations for null values and duplicate records.
Validated schemas, data ingestion for newly created tables sourced from Parquet files, and reports using SQL, PySpark, and Python.
Produced data analytics and data lineage documentation through SQL analysis.
Detected and logged ETL defects while managing comprehensive test documentation and automation scripts using Jira, GitHub, and Confluence.
Executed testing activities in alignment with the Waterfall project methodology.
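The schema validation for newly created tables described above amounts to comparing expected against actual column/type mappings. A minimal illustrative sketch (column names and types are hypothetical; in PySpark the actual mapping would come from `DataFrame.dtypes`):

```python
def diff_schema(expected, actual):
    """Compare expected vs actual column:type mappings; return discrepancies."""
    missing = sorted(set(expected) - set(actual))
    extra = sorted(set(actual) - set(expected))
    mismatched = sorted(
        c for c in set(expected) & set(actual) if expected[c] != actual[c]
    )
    return {"missing": missing, "extra": extra, "type_mismatch": mismatched}

expected = {"emp_id": "bigint", "dept": "string", "hire_date": "date"}
actual   = {"emp_id": "bigint", "dept": "string", "hire_dt": "date", "salary": "double"}
print(diff_schema(expected, actual))
# → {'missing': ['hire_date'], 'extra': ['hire_dt', 'salary'], 'type_mismatch': []}
```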
Weston Foods
https://www.westonfoods.com/
September 2019-September 2022
ETL Test Analyst
Developed Informatica ETL scripts to validate data loads from SAP into FACT and DIMENSION tables.
Verified Enterprise Data Warehouse (EDW) data and reports using SQL Server scripts.
Created Data Lineage documentation supporting sales, finance, profitability, and manufacturing reports.
Performed ETL data cleansing to ensure data accuracy and consistency.
Tested reports by writing complex SQL queries involving nested joins and aggregation functions.
Authored and executed test cases to validate data migration from SAP to SQL Server.
Conducted ETL and Cognos root cause analysis to identify and resolve data discrepancies.
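The report testing via complex SQL with joins and aggregation functions described above can be sketched as a check that report-level totals reconcile with detail rows. This illustrative sketch uses Python's built-in sqlite3 with hypothetical table and column names, not the actual SQL Server project code:

```python
import sqlite3

# Verify report totals against the underlying fact rows with a JOIN + GROUP BY.
con = sqlite3.connect(":memory:")
con.executescript("""
    CREATE TABLE sales_fact (region TEXT, amount REAL);
    CREATE TABLE sales_report (region TEXT, total REAL);
    INSERT INTO sales_fact VALUES ('East', 100.0), ('East', 250.0), ('West', 300.0);
    INSERT INTO sales_report VALUES ('East', 350.0), ('West', 999.0);  -- West total is wrong
""")
discrepancies = con.execute("""
    SELECT r.region, r.total, SUM(f.amount) AS computed
    FROM sales_report r JOIN sales_fact f ON f.region = r.region
    GROUP BY r.region, r.total
    HAVING r.total <> SUM(f.amount)
""").fetchall()
print(discrepancies)
# → [('West', 999.0, 300.0)]
```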
SAS
August 2017 – June 2019
https://www.sas.com/
Sr. ETL tester
Authored comprehensive test cases to validate ETL requirements supporting Customer Due Diligence (CDD) initiatives
Performed functional testing for Anti–Money Laundering (AML) applications to meet regulatory and compliance standards.
Verified historical data accuracy for Type 2 Slowly Changing Dimensions (SCD) to ensure proper data versioning.
Managed task updates and defect tracking in HP ALM across the full testing lifecycle.
Built ETL automation scripts to ensure data quality and integrity prior to production deployments.
Executed advanced Oracle SQL queries utilizing CASE statements, INTERSECT, MINUS, subqueries, DECODE, and UNION to validate pay run processes.
Provided daily testing status updates to leadership during Agile/Scrum ceremonies.
Conducted Functional, Regression, and Smoke testing to maintain system reliability and performance.
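The MINUS-based pay run validation described above compares source and target row sets and surfaces any rows that fail to carry over. An illustrative sketch using Python's built-in sqlite3 (whose equivalent of Oracle's MINUS is EXCEPT); table and column names are hypothetical:

```python
import sqlite3

# Source-vs-target comparison in the style of Oracle's MINUS:
# rows present in the source but absent from the target are defects.
con = sqlite3.connect(":memory:")
con.executescript("""
    CREATE TABLE src_pay_run (emp_id INTEGER, amount REAL);
    CREATE TABLE tgt_pay_run (emp_id INTEGER, amount REAL);
    INSERT INTO src_pay_run VALUES (1, 1000.0), (2, 1500.0), (3, 900.0);
    INSERT INTO tgt_pay_run VALUES (1, 1000.0), (2, 1450.0);  -- row 2 mismatched, row 3 missing
""")
missing_in_target = con.execute("""
    SELECT emp_id, amount FROM src_pay_run
    EXCEPT
    SELECT emp_id, amount FROM tgt_pay_run
""").fetchall()
print(sorted(missing_in_target))
# → [(2, 1500.0), (3, 900.0)]
```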
Ontario Ministry of Social Services
August 2015 – October 2016
Billing and Payments Tester
Validated Driver, Vehicle, and Health Services functionality within the ServiceOntario web portal for the IBM Cúram COTS application.
Verified end-to-end data flow across Government Payment Gateways integrated with the IBM Cúram COTS platform.
Created and executed test cases to validate payment transactions between CIBC and Ministry gateways.
Tested and validated payment processing, correspondence (letters), and reporting modules within the IBM Cúram COTS application to ensure accuracy and compliance.
Ontario Ministry of Labour
September 2012 – February 2015
WSIB.on.com
ETL Team Lead
Executed ETL test cases for the Data Warehouse.
CIBC
CIBC.com
September 2011 – May 2012
Quality Assurance Analyst
Authored and executed test cases focused on QE approach and financial risk validation.
Participated in end-to-end testing of lending and credit risk applications across the full SDLC.
Developed Oracle SQL queries to validate and analyze mortgage customer data.