
Data Engineer

Location: Vaughan, ON, Canada
Posted: March 23, 2021


Mahesh Gope
** Hartley Crt, Concord, ON L4K 4W4, Canada
Email: adk4u1@r.postjobfree.com
Mobile: 416-***-****

Professional Summary

More than 12 years of IT experience implementing data warehousing projects using Talend, Informatica Power Center and Power Exchange, Informatica BDE, Data Stage, Alteryx, Syncsort DMExpress, Microsoft SQL Server (Azure), Cognos and Oracle PL/SQL.

Experience in data migration and data integration.

Experience working on the Kubernetes platform using Docker, Jenkins and Rancher.

Experience with the Spark platform using Python and Scala.

Used Power Exchange for CDC (capturing the before and after images of records from Oracle tables).

Experience working with complex COBOL copybooks (data maps through Power Exchange).

Experience working as a Data Modeler, creating both logical and physical models using the Erwin data modeling tool.

Experience with Informatica BDE and Hadoop to access data in HDFS and Hive.

Extensive experience as a Team/Tech Lead, leading groups of resources.

Experience using GitHub, Docker images, Bitbucket and Jira boards.

Excellent experience using the Autosys and Tidal scheduling tools.

Technical Competencies

ETL Tools

Talend Studio, Informatica 7.X, 8.X, 9.X, 10.X, Informatica Power Exchange 9.X, Informatica BDE (Big Data Edition), Data Stage, Syncsort DMExpress, SSIS, Alteryx

Reporting Tools

Cognos 8, SSRS

Databases

Oracle 8i/9i/10g/11g, Teradata, Sybase, SQL Server 2008, DB2

Scheduling Tools

Tidal, Autosys, Control M, TWS, Airflow

Functional areas

Banking and Finance, Telecommunications, Healthcare

Programming/Scripting Languages

Scala, Python, Java, SQL and PL/SQL, Unix shell and Windows DOS batch scripting

Tools

Toad for Oracle 9.0.1/10.6.1.3, SQL Developer, IntelliJ

WORK EXPERIENCE

Data Enablement & Architecture/ Automated Data Quality Suite Mar 2017 – Till Date

(Senior Data Engineer - Scotiabank)

ADQS: creating a unified view of the manual DQ issues generated from Collibra and the automated issues created by the business. Data is ingested after implementing various functional requirements, such as applying technical/business rules to an attribute. This unified view is used by the dashboard team for reporting purposes.

-The Talend REST API is used to generate an access token from the Passport URL; this token is then used to authenticate to Collibra.

-Writing Java routines within Talend Studio to implement various business transformations.

-Used various data transformation scripts based on workflow needs and executed these jobs from TAC, Airflow and Eclipse.

-Connecting to the Collibra endpoints for various assets to pull data quality issue attributes (JSON payloads) along with the relationships between multiple DQ issues.

-Creating DQ metrics for each Data Quality Technical Rule, which represents an instance of a business rule applied to a column, e.g. the percentage of passed records versus total available records, or the completeness percentage of the end-of-day balance. Aggregated DQ metrics across all technical rules are also calculated for various assets (a brief sketch follows this list).

-All DQ metrics and DQ dimensions are integrated from ADQS into the UC (Universal Catalog).
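A minimal Python sketch of the token/issue-pull/metric flow described in the bullets above, using the requests library. The URLs, endpoint paths, parameters and field names below are placeholders, not the actual project configuration:

import requests

# Hypothetical endpoints; the real Passport and Collibra URLs, payloads and
# field names used on the project are not shown in this resume.
PASSPORT_TOKEN_URL = "https://passport.example.com/oauth/token"
COLLIBRA_BASE_URL = "https://collibra.example.com/rest/2.0"

def get_access_token(client_id: str, client_secret: str) -> str:
    """Request an access token from the Passport URL (assumed client-credentials grant)."""
    resp = requests.post(
        PASSPORT_TOKEN_URL,
        data={"grant_type": "client_credentials",
              "client_id": client_id,
              "client_secret": client_secret},
        timeout=30,
    )
    resp.raise_for_status()
    return resp.json()["access_token"]

def pull_dq_issues(token: str, asset_type_id: str) -> list:
    """Pull data quality issue assets (JSON payloads) for a given asset type."""
    resp = requests.get(
        f"{COLLIBRA_BASE_URL}/assets",
        params={"typeIds": asset_type_id, "limit": 1000},
        headers={"Authorization": f"Bearer {token}"},
        timeout=60,
    )
    resp.raise_for_status()
    return resp.json().get("results", [])

def dq_pass_rate(passed_records: int, total_records: int) -> float:
    """DQ metric for a technical rule: % of passed records vs total available records."""
    return 0.0 if total_records == 0 else 100.0 * passed_records / total_records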

DEA: creating a common data lake by sourcing data from multiple source systems (e.g. basic banking, mortgage, investment). All departments' data initiatives will leverage this product, which gives them easy access to trusted, well-understood data assets and services. CI/CD pipelines are used to build the package and deploy it as a Docker image.
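The lake ingestion runs on Spark; below is a minimal PySpark sketch of the kind of job this involves, assuming a hypothetical DB2 source table, service account and lake path (the actual source systems and target layout are not shown in this resume).

import os
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

# Hypothetical source connection and lake location.
SOURCE_JDBC_URL = "jdbc:db2://db2host:50000/SRCDB"
LAKE_PATH = "/data/lake/raw/mortgage_accounts"

spark = SparkSession.builder.appName("dea-ingest-mortgage").getOrCreate()

# Read one source system's table over JDBC.
src = (spark.read.format("jdbc")
       .option("url", SOURCE_JDBC_URL)
       .option("dbtable", "MORTGAGE.ACCOUNTS")
       .option("user", "svc_dea")
       .option("password", os.environ.get("DEA_DB2_PASSWORD", ""))
       .load())

# Stamp the load date and land the data in the common lake, partitioned by day.
(src.withColumn("load_dt", F.current_date())
    .write.mode("append")
    .partitionBy("load_dt")
    .parquet(LAKE_PATH))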

Environment/tools : Talend Studio, Spark, Scala, DB2, UNIX, Airflow, Java, Collibra, Python, Azure, Presto

Real Time Customer Insights – Consultant

(Senior Data Engineer - Scotiabank)

Targeting customers for real-time insights campaigns and various nudges for existing and new customers. Each individual campaign is categorized into various LABS. Enterprise Data Lake data is used as the source for all these campaigns.

Designed the solutions for the Data Stage migration project

Worked on multiple campaigns (e.g. Paperless, Pay-no-Pay, Fraud, CRM), analyzing and preparing data for the ESM and DSM teams, who are responsible for launching the campaigns

In addition, worked on various campaign activities, such as Unica and SAS leads campaigns and nudges, which involve interacting with the AS400 system

Responsible for reviewing code, including Scala, Python, Data Stage, Informatica, Hive and Unix scripts

Analyzed campaign data using pandas (a brief sketch follows).
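A minimal pandas sketch of the kind of campaign analysis referred to above; the file name and column names are hypothetical, not the actual campaign data used on the project.

import pandas as pd

# Hypothetical extract of campaign lead/response data.
leads = pd.read_csv("campaign_leads.csv", parse_dates=["contact_date"])

# Response rate by campaign: responded leads vs total leads contacted.
summary = (leads.groupby("campaign_name")
                .agg(total_leads=("customer_id", "count"),
                     responses=("responded", "sum")))
summary["response_rate_pct"] = 100 * summary["responses"] / summary["total_leads"]
print(summary.sort_values("response_rate_pct", ascending=False))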

Environment/tools : Scala, Informatica, Data Stage, HIVE, DB2, Tivoli, Bitbucket, Talend, Python, Azure, Sqoop, UNIX, Alteryx, Pega, Autosys

CIBC – Lead Data Engineer Dec 2013 – 28th Feb 2017

Project # 1 CCT - EDH

Interacting with the business to gather requirements and propose effective and efficient designs, and leading the technical team in coding and design tasks as per CIBC standards.

Creating the Enterprise Data Hub (EDH) to integrate all source system data using Informatica BDE and ingesting the data into Hive tables.

Used Informatica Power Exchange and Syncsort DMX to process complex COBOL mainframe files based on their copybooks.

Responsible for all designs and code (e.g. ETL mappings, DB queries, Autosys job designs).

Environment : Informatica BDE, HIVE, Oracle, Power Exchange, Autosys, UNIX, Syncsort DMExpress

Project # 2 BASEL III (Treasury LOB)

The global financial crisis forced governments and central banks across the world to overhaul the existing Basel guidelines for banks, leading to the drafting of the U.S. Basel III Capital Rules. Consequently, areas such as capital adequacy, liquidity risk, stress testing, trading, consumer protection, and recovery and resolution planning came under the purview of strict regulation.

Environment: Informatica Power center 9.0.1, Oracle 11g, Unix, Autosys Scheduling Tool, SQL*Loader

Project # 3 WAL

Worked under the Finance Technology department (EDSF) to replace the WIL legacy system with the new WAL (Wholesale Automotive Lending) system, improving the existing process with respect to ease of access for end users and other customers such as dealers and suppliers.

Responsibilities included creating the application design document containing the requirements in detail.

Environment: Informatica Power center 9.0.1, Oracle 11g, Unix, Autosys Scheduling Tool, Informatica Power Exchange 9.0.1.

CGI Group Inc. – Tech Lead (Consultant) Jan 2012 – Dec 2013

DRG Conversion / DRG TAHP Extract

Tufts is one of the largest healthcare providers in Massachusetts, USA. DRG (Diagnostic Related Group) was a project dealing with the different diagnostic related groups.

Cigna Provider Extract

This project was to rewrite the existing Oracle application using Informatica in order to improve performance and maintainability. Used Power Exchange for CDC to capture the before and after images of records from the Oracle tables.

IHM (Integrated Health Management)

The primary goal of this project was data integration across the different departments related to Tufts. Requirements also involved generating various bills related to Medical, Pharmacy and Enrollment for customers. The files for each category were generated on a weekly as well as a monthly basis, based on the claims made by customers. Used Java to create the connections between Oracle and the UI.

Environment: Informatica Power center 9.1, Informatica Power Exchange, Oracle 10g, UNIX, Java, Tidal

IBM India Pvt. Ltd. – Application Consultant Aug 2010 – Jan 2012

Bill ODS Consolidation/Bill Presentment

Bill ODS Consolidation deals with merging the wireless and wireline bills, i.e. instead of generating separate bills for wireline and wireless, a single bill is generated.

Environment: Informatica Power Center 8.6.1/9.1.0, Oracle 10g, Control-M, Microsoft Visio, UNIX.

RBS (Royal Bank of Scotland) - Software Associate Dec 2008 – Jul 2010

ETL Re-Architecture

ETL Re-Architecture was an extension of the Country Risk project. The objective was to tune the existing code by splitting the processing into different streams.

Country Risk

After RBS acquired ABN AMRO bank, this project was started to migrate and integrate ABN's data with RBS. Following the merger, ABN was to adopt all the rules defined by RBS.

Roles and responsibilities included creating physical data models from the logical data models and writing various queries to test scenarios, including testing of the data models and database.

Environment: Informatica Power Centre 8.6.0, UNIX, Oracle 10g, Microsoft Visio, Erwin

Capgemini India Pvt. Ltd. – Associate Consultant Feb 2007 – Dec 2008

Liquidnet

This project was for Liquidnet, a trading firm located in the USA. The main objective was data integration of the firms that Liquidnet had bought and was merging into its business. PL/SQL was used for the ETL implementation and Cognos as the reporting tool.

Equities Analysis Datawarehouse

The objective of this project was to define various business parameters along which a tool could be developed to assist stock investors/traders in evaluating a particular company to determine its investment attractiveness.

VCAS Data warehouse maintenance

Created mappings for different source systems related to Lehman Brothers' business in the market. Informatica was used as the ETL tool and Cognos 8 for reporting.

Environment: Informatica 7.1.4, Data Stage, Cognos, Sybase, Perl, UNIX, Autosys, SSIS, SSRS, SSAS, SQL Server, Oracle 9i, PL/SQL


