
Data Service

Toronto, Ontario, Canada
January 28, 2018



Taghi Saadat



●15+ years of experience in complex data-driven environments, including Big Data on-premises and in the public cloud, Data Warehouses, Data Marts, and BI

●Distinguished in engineering design, with excellence in building operating models (interfaces/business processes) and technology models (applications/data/infrastructure)

●Hands-on Data Architecture, Data Modeling (BPM, CDM, LDM, PDM), and Data Management (DAMA framework, including Master Data and Metadata Management: MDM/MDDM/RDM/DQ/DG)

●Expert in designing data services such as Data Profiling, Data Security, Data Quality, Data Migration, Data Integration, Data Enrichment, and Data Archival

●Talent for unifying mixed products into a common set of powerful, easy-to-use tools for batch and real-time processing, on-premises or in the cloud

●DevOps team lead for a Service-Oriented Architecture Big Data product in the cloud, built on core Amazon technologies including RDS, EC2, S3, PostgreSQL, and Redshift

●Led multiple design teams in developing preliminary and detailed designs of a manufacturing ERP

●7+ years of experience in enterprise-level technology consulting, project management, and cloud-based solution architecture

●Practiced in designing Enterprise Data Warehouses (EDW) and cloud computing solutions in the retail industry with AWS (IaaS, PaaS, SaaS)

●BI solutions architect capable of designing massive, fully automated data transfers, controllable through Gap-and-Gate checkpoints, for structured and unstructured data

●Expert in dashboard design and KPI configuration

●Diverse industry experience in Banking, Healthcare, Insurance, Retail, Manufacturing, Financial Services, Transportation, and Oil and Gas

●Delivered over 50 software applications from scratch, including an ERP solution for manufacturing and distribution


Architecture: Service-Oriented, Task-Oriented

Data modeling tools: Power Designer, Erwin

Database: PostgreSQL, MS-SQL, Oracle

Scripting: Bash/Linux, Java, HTML, XML, JSON, Python, SQL

BI tools: eVision, Talend, Gentran, SSIS

Reporting tools: SSRS, Crystal Reports

Big Data: AWS Redshift, EMR, RDS, S3 (Amazon technologies); Sqoop, Hive, HBase (Hadoop technologies)


B.S. in Software Engineering from Sharif University of Technology, the top-ranked university in Iran; accredited as a BSc in Computer Science by World Education Services


2016, Big Data tools (Apache and related): HBase, Python, Tomcat, Sqoop, SSAS

2014, Big Data Fundamentals, IBM, AWS

2014, Internet Marketing (SEO) certified, University of Waterloo

2013, Servers Configuration, IBM/Lenovo certified

2012, Crystal Report, Epicor BPM, EDI, SSRS, SOA at EPICOR University

2008, Project Management certified; Oracle DBA, SSIS, Network Administration and Security 'MCSA' certified

2007, Intermediate web design; CCNA 1-4 'Cisco'

2002, OOP programming using Rational Rose

2001, Database design and implementation 'MCSE' certified; Visual Basic programming 'MCSD' certified


MDM Architect, EmblemHealth, Manhattan, NY Aug–Nov '17

●Analysed and designed, for the first time, a DM (data management) structure based on Data Domains within the DAMA framework

●Determined Information Lifecycle Management for all data-movement phases (Landing, Staging, and Loading) in a complex event-processing environment

●Proposed an enterprise-wide, multi-domain Master Data Management (MDM) solution as part of the DG (Data Governance) program

●Dug into data covering all relevant customer technical requirements, processes, procedures, limitations, and constraints

●Created an Enterprise Information Model (EIM) that gave a big-picture view of data movement with onboarding/ongoing execution processes

●Architected a custom MDM solution to capture CDEs (critical data elements) and enforce DG and DQ (Data Quality) disciplines

●Ensured Data Governance policies, regulatory principles (HIPAA), and data security disciplines (FWA: Fraud, Waste and Abuse) were enforced on data sources, checkpoints, and reference data

●Provided a metadata profiling mechanism as MDM functionality for detection, prevention, and correction procedures

Senior Data Modeler, TJX Companies, Marlborough, MA Dec '16–Apr '17

●Monitored massive product ordering for DCs (distribution centers), collected from EDI-transferred information in the retail industry

●Provided BI principles, assessment metrics, and related checklists to decision makers

●Data Modeling (MDDM, BPM, CDM, LDM, and PDM), Master Data Management, Database Design, Database Security, Data Quality and Data Enrichment

AWS Consultant, OnX, ON Sept–Oct '16

●Contributed to requirements gathering and was responsible for the IaaS cloud architecture of an upcoming enterprise solution extended onto AWS

●Provided a highly technical IaaS pilot plan on AWS for the upcoming enterprise solution, covering the move of two data centers of the City of Mississauga

●Gave architectural recommendations spanning 69 AWS products

●Provided a low-cost solution with Elastic Beanstalk at $0.09/month, including a 24-hours/day running instance, 15 GB of inbound/outbound data transfer, and capacity for 300 concurrent requests/month

●Provided a 3-tier auto-scalable web app solution including a load balancer, 2 web servers, 2 app servers, 1 high-availability database server, and 30 GB of storage with 120 GB data-transfer capacity, for under $1,100/month

●Provided a Virtual Private Cloud with connectivity options

●Established easy onboarding onto AWS through IAM policies, standards, AWS instance classification, and practices for large data environments

Big Data Architect, Infosys, WA Feb–Aug '16

●As part of the Data Architect team in Minnetonka, Minnesota, worked closely with Big Data developers in creating a DW platform to support data scientists in the government sector

●Technical team lead for RDBMS-to-Hadoop data movement within a Scrum/Agile methodology

●Data integration with Talend Data Fabric modules; identified ETL load jobs and their current state, along with dependencies and the performance aspects of loading data into Hive

●Imported RDBMS tables into HDFS/Hive using Sqoop with complex options such as compression codecs; queried and filtered rows with WHERE conditions, and utilized --split-by columns and incremental append with --check-column ID and --last-value

●Upstream data profiling of 40 different structured and semi-structured data sources, including intensive US government healthcare data, external insurance systems, and government websites

●Interpreted and delivered initiative plans to specify strategy and improve data integration, data quality, data enrichment, data governance, data classification, data cleansing, and data delivery in support of the Big Data business

●Worked on a data lake to store pre-processed data in staging tables

●Post-processed data in the landing zone with row-level rules and standards

●Provided Metadata Management, including mappers, data-type conversions, links, mapping rules, filters, expressions, lookups, and a configurator to create base-table rows in downstream Hive tables
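The incremental Sqoop import described above can be sketched as a small command builder; the JDBC URL, table, and column names here are hypothetical placeholders, not the actual project values:

```python
# Minimal sketch of assembling an incremental Sqoop import into Hive.
# All connection details and names below are illustrative assumptions.
def build_sqoop_import(jdbc_url, table, check_column, last_value, split_by):
    """Return the argv list for an incremental, compressed Sqoop import."""
    return [
        "sqoop", "import",
        "--connect", jdbc_url,
        "--table", table,
        "--where", "status = 'ACTIVE'",   # filter rows at the source (WHERE condition)
        "--split-by", split_by,           # column used to parallelize the mappers
        "--incremental", "append",        # pull only rows newer than last run
        "--check-column", check_column,
        "--last-value", str(last_value),
        "--compression-codec", "org.apache.hadoop.io.compress.SnappyCodec",
        "--hive-import",
    ]

cmd = build_sqoop_import(
    "jdbc:postgresql://db-host/claims", "claims", "id", 1_000_000, "id")
print(" ".join(cmd))
```

Keeping the import incremental (via --check-column and --last-value) avoids re-copying rows that already landed in Hive on earlier runs.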

Enterprise Software Consultant (contract) Feb '15–Aug '16

Epicor consultant for Unique Sports (Alpharetta, GA); Hornady (Grand Island, NE); Gullco (Newmarket, ON)

●Supported clients with advice on improving their existing BI and data management, and delivered end-to-end EDI (source-to-target data load) solutions

●Developed data transformation and mapping rules with eVision to convert inbound source data into target formats

●Helped the client improve data accuracy and viability by streamlining the data quality and data flow processes used across an enterprise with 500 customers and over 1,900 stores

●Customized the Customer Credit and Customer Maintenance UI with light C# programming and .NET business object calls

●Provided the guidelines, templates, and tool mentors necessary for the entire team to take full advantage of component-based architectures, iterative software development, and requirements management

●Dealt directly with clients while coordinating an Epicor consulting team of software engineers; presented solutions to clients and secured approval from managers at various stages

●Used metadata extracted from MS-SQL Server in generating SSRS reports

●Designed and implemented business intelligence tools for data presentation to stakeholders, helping the business gain a strong advantage of +63% profit in GM% through calculations on 15 critical lines of business

Big Data Architect, Terra Nova Analytic, Toronto, ON Feb–Dec '14

●Joined a very small team, expected to grow it and build out a support team from there

●Architected and adopted a powerful data foundation

●Utilized AWS services for cloud computing (SaaS) in a Big Data platform

●Consulted on and created a multidimensional data warehouse on Redshift

●Worked closely with UI designer to support application development

●Leveraged massively parallel processing, columnar data storage, and columnar data compression

●Built an efficient, fully automated reporting service on AWS S3, well suited to the client's business objectives

●Set up a strong BI capability with scalable data loads/dumps from/to Amazon S3

●Created staging tables on PostgreSQL with Python scripting on JSON files

●Defined IAM service roles and assigned policy-based permissions for AWS resource access (encryption, roles, users, groups, and resources)

●Set up a secure, robust platform with 99.95% uptime via AWS RDS instances for the core back-end engine of the business

●Sped up ingestion and processing to 500,000 POS transactions per second (40 billion transactions per day), reducing the average analysis time from 4 hours to 3 seconds across multiple assorted retail sources
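The JSON-to-staging-table step mentioned above can be sketched in a few lines of Python; the record fields are hypothetical, and sqlite3 stands in for PostgreSQL so the example stays self-contained (the original work used a Python driver against PostgreSQL/Redshift):

```python
import json
import sqlite3

# Illustrative sketch only: parse JSON lines and bulk-insert them into a
# staging table. Field names (store_id, sku, qty) are assumptions.
records = [
    json.loads(line)
    for line in [
        '{"store_id": 12, "sku": "A-100", "qty": 3}',
        '{"store_id": 12, "sku": "B-200", "qty": 1}',
    ]
]

conn = sqlite3.connect(":memory:")  # stand-in for a PostgreSQL connection
conn.execute("CREATE TABLE stg_pos (store_id INTEGER, sku TEXT, qty INTEGER)")
conn.executemany(
    "INSERT INTO stg_pos VALUES (:store_id, :sku, :qty)", records)

total_qty = conn.execute("SELECT SUM(qty) FROM stg_pos").fetchone()[0]
print(total_qty)  # 4
```

Loading raw JSON into a plain staging table first, and validating or reshaping it afterwards, keeps the ingest path simple and replayable.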

Enterprise Software Consultant (contract) Mar–Aug '13

Epicor ERP consultant for MI-Jack (Chicago, IL); Lanco Group of Companies (Chicago, IL); Loxcreen Flooring (Mississauga, ON)

●Provided a special middleware solution for extracting and processing registered cheques

●Provided BPM solution architecture, including back-end integrations with Citibank for the cheque transfer process

●Aligned Epicor ERP to the business in the Accounts Payable module for cheque tracking, including inbound/outbound settings, workflows, and Epicor BO .NET/web service calls

Ask for the full version of my resume.
