
Data Warehouse SQL Server

Location: Toronto, ON, Canada
Posted: October 03, 2023


Sharath Chandra Modem

ETL / Informatica Developer

Azure Databricks | Azure Data Factory | Informatica Lead

+1-647-***-**** adz43z@r.postjobfree.com

Toronto, ON

ETL, Data Warehouse, and Data Analyst professional with over 7 years of experience building data pipelines and actionable insights for management. An accomplished software engineer with extensive experience across the full software development and support life cycle.

Worked primarily in the telecom domain, predominantly on the Cisco Enterprise Data Warehouse (EDW) and Platform Operations (EDIO), and on Enterprise Data and Analytics (ED&A) at Rogers Communications.

Strong experience migrating an Oracle data warehouse to Azure using Azure Data Factory and Azure Databricks.

Strong comprehension of ETL principles and architecture, including creating SSIS packages.

Experienced in writing SQL for various databases such as Oracle, SQL Server, Teradata, and Snowflake for cloud data warehousing.

Skilled in developing data warehouses, with a solid understanding of OLAP/OLTP. Capable of creating UNIX shell scripts for SSIS processes and ETL flows.

Working knowledge of programming languages such as Python and SQL.

Extensive working experience with normalization and de-normalization techniques for both OLTP and OLAP systems, and in creating database objects such as tables, constraints, and indexes using DDL, DML, and DCL commands.

Experienced in SQL Server DDL, DML, and transaction queries.

Skilled in utilizing various Informatica PowerCenter transformations such as Source Qualifier, Lookup, Filter, Expression, Router, Joiner, Update Strategy, Rank, Aggregator, Stored Procedure, Sorter, Normalizer, Union, and XML Source Qualifier.

Experienced in automating ETL extraction from Snowflake and SQL Server databases to distribute data to suppliers using scheduling tools.

Practical expertise with Elasticsearch, AWS Step Functions, and Amazon EventBridge.

Practical knowledge of AWS services and subnets, including EC2, S3, CloudFront, and SNS.

Coordinated monthly roadmap releases to push new and enhanced Informatica code to production.

Developed and maintained ETL (extract, transform, load) mappings using Informatica Designer 10.1 to move data from multiple source systems, including Oracle 19c, SQL Server 7.2, and flat files, into the staging area, the EDW, and then the data marts.

Created mappings using different lookup types, such as connected, unconnected, and dynamic lookups, with caches such as the persistent cache.

Hands-on experience with core Python programming.

Good understanding of Python libraries such as NumPy, SciPy, and pandas.

Worked with Informatica Source Analyzer, Mapping Designer, Mapplets, and Transformations.

Performed impact analysis of changes to existing mappings and provided feedback.

Created mappings using reusable components such as worklets and mapplets built from reusable transformations.

Participated in preparing project estimates of development effort for both offshore and onsite teams.

Coordinated and monitored project progress to ensure timely and complete delivery.

Hands-on experience in error handling, debugging code issues, and troubleshooting production problems.

Proficient in DBA tasks, including installing and configuring SQL Server and performing database backups, restores, and recovery. Extremely diligent with good communication skills.

Capable of building ETL architecture and source-to-target mapping to load data into a data warehouse.

Strong Analytical and problem-solving skills. Ability to quickly master new concepts and applications.

Good written communication and presentation skills; a proven team player and team lead with an analytical approach to problem solving and delivering solutions.

Experience

ETL Developer Oct 2022 to present

Tata Consultancy Services

Toronto, Canada

Project Details: Rogers

Description: CABLE IBRO (Migration from Oracle to Azure)

Designed and developed ETL processes to extract data from various sources and load into a data warehouse.

Worked with business analysts and stakeholders to understand data requirements and translate them into ETL processes.

Created and maintained technical documentation, including ETL process flows, data mapping, and data lineage.

Developed and executed test cases to ensure data accuracy, completeness, and integrity.

Worked with database administrators to optimize ETL performance and ensure scalability.

Developed and maintained ETL scheduling and monitoring processes.

The project objective was to migrate the ETL processes and databases to the Azure cloud for a telecom giant: analyze the existing ETL data warehouse and work with architects and project managers to understand the design specification document for the new cloud data warehouse on the Microsoft Azure platform.

Developed SCD Type 1 and SCD Type 2 Databricks notebooks and worked on sequence-generator logic (a minimal SCD Type 2 sketch appears after this project's bullets).

Developed a Databricks notebook for one-time historical loads and performed end-to-end development to migrate on-prem tables to the cloud.

Created schema registry and YAAF scripts for ADF pipelines.

Fixed defects raised by QA and resolved their queries.

Implemented REF cursors and advanced PL/SQL cursors.

Created complex triggers in PL/SQL to enforce business requirements and maintain data integrity.

Prepared STMs (source-to-target mappings) for all Informatica ETLs and SQL scripts.

Interacted with the business, cross-functional teams, and SMEs on a daily basis regarding timelines, requirements, issues, and updates.

Created standards documents covering ETL strategies, naming conventions, and input/output file formats.

Validated data after parallel runs and made the necessary implementation changes in case of failures or data mismatches.

Handled production failures.

Mentored junior team members, providing technical and domain assistance and troubleshooting support.
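The SCD Type 2 notebooks referenced above typically follow a close-out-then-insert pattern against a Delta table. Below is a minimal sketch of that pattern in PySpark with the Delta Lake API; the table names (stg_customer, dim_customer), keys, and tracked columns are illustrative assumptions, not the actual Rogers schema.

```python
# Minimal SCD Type 2 sketch for a Databricks notebook (PySpark + Delta Lake).
# Assumed target: Delta table dim_customer(customer_id, name, city,
# eff_start_dt, eff_end_dt, is_current); assumed source: staged batch
# stg_customer(customer_id, name, city). All names are hypothetical.
from pyspark.sql import SparkSession, functions as F
from delta.tables import DeltaTable

spark = SparkSession.builder.getOrCreate()

updates_df = spark.table("stg_customer")          # incremental batch
dim = DeltaTable.forName(spark, "dim_customer")   # target dimension

# Step 1: close out current rows whose tracked attributes changed.
dim.alias("t").merge(
    updates_df.alias("s"),
    "t.customer_id = s.customer_id AND t.is_current = true"
).whenMatchedUpdate(
    condition="t.name <> s.name OR t.city <> s.city",
    set={"is_current": "false", "eff_end_dt": "current_date()"},
).execute()

# Step 2: insert new versions for changed keys and for brand-new keys
# (both now have no open row, so a left_anti join isolates them).
open_rows = spark.table("dim_customer").where("is_current = true")
new_versions = (
    updates_df.join(open_rows.select("customer_id"), "customer_id", "left_anti")
    .withColumn("eff_start_dt", F.current_date())
    .withColumn("eff_end_dt", F.lit(None).cast("date"))
    .withColumn("is_current", F.lit(True))
    .select("customer_id", "name", "city",
            "eff_start_dt", "eff_end_dt", "is_current")
)
new_versions.write.format("delta").mode("append").saveAsTable("dim_customer")
```

The SCD Type 1 variant is the same merge with a plain whenMatchedUpdate that overwrites the tracked columns in place instead of closing out and re-inserting rows.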

ETL Developer Oct 2018 to Sept 2022

Tata Consultancy Services

Hyderabad, India

Project Details: Cisco EDW Teradata

Description: Developed, optimized, and implemented T-SQL functions, tables, stored procedures, and triggers in SQL Server.

Utilized Informatica PowerCenter to extract data from MS Excel, flat files, and MS Access, and scheduled sessions to load it into the target system.

Provided support and maintenance for Informatica PC ETL, Oracle, and SQL Server operations.

Created stored procedures for various types of SSRS reports, including tabular, drill-through, parameterized, and matrix reports.

Conducted data ingestion from various sources, such as Oracle and Informatica, using Sqoop, and established ETL architecture and source-to-target mapping to load data into the data warehouse.

The project objective was to provide development, maintenance, and enhancement for the Teradata enterprise data warehouse 'EDW – CISCO', which follows a 3NF data model. Different types of organizational data, such as bookings, expense, revenue, and service, are loaded into the warehouse from transactional databases through Oracle in three phases using Informatica ETL plans. First, data from Oracle is extracted into flat files. Second, the data is loaded from the flat files into stage tables using the Teradata utilities MultiLoad, FastLoad, and TPump. Third, the data load process parses, validates, and moves data from the stage tables into the 3NF tables.

Data marts extract data from the published layer and the landing zone layer, which are built on top of the 3NF tables. Data from the 3NF tables in EDW is replicated into 3NF tables in EDW2 using GoldenGate and Data Sync for Cisco business continuity.
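As a rough illustration of the parse-and-validate step in that load process, the plain-Python sketch below reads a pipe-delimited Oracle extract, keeps rows that pass basic checks, and routes rejects to an error file before the stage load. The file layout, column names, and rules are assumptions for the sketch, not the actual EDW validation logic.

```python
# Hypothetical parse-and-validate pass over a flat-file extract before staging.
import csv
from datetime import datetime

REQUIRED = ["booking_id", "customer_id", "booking_dt", "amount"]  # illustrative columns

def is_valid(row: dict) -> bool:
    """Row-level checks: required fields present, date and amount parse."""
    if any(not row.get(col) for col in REQUIRED):
        return False
    try:
        datetime.strptime(row["booking_dt"], "%Y-%m-%d")
        float(row["amount"])
    except ValueError:
        return False
    return True

def validate_extract(src_path: str, good_path: str, bad_path: str) -> None:
    """Split a pipe-delimited extract into a load file and a reject file."""
    with open(src_path, newline="") as src, \
         open(good_path, "w", newline="") as good, \
         open(bad_path, "w", newline="") as bad:
        reader = csv.DictReader(src, delimiter="|")
        good_w = csv.DictWriter(good, fieldnames=reader.fieldnames, delimiter="|")
        bad_w = csv.DictWriter(bad, fieldnames=reader.fieldnames, delimiter="|")
        good_w.writeheader()
        bad_w.writeheader()
        for row in reader:
            (good_w if is_valid(row) else bad_w).writerow(row)

if __name__ == "__main__":
    validate_extract("bookings_extract.dat", "bookings_stage.dat", "bookings_reject.dat")
```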

Roles and Responsibilities

Created different types of ETLs using different transformations in Informatica.

Working experience with job scheduling tools (TES, Control-M), Teradata load utilities, monitoring tools (Viewpoint), and data replication tools (Informatica, GoldenGate, and Data Sync between two Teradata systems via the ARCMAIN utility).

Worked closely with the Teradata DBA team on troubleshooting performance issues.

Implemented different SCD types depending on data volume and requirements.

As a data engineer, maintained a high level of data quality and consistency across different source systems.

Developed data flows for new business requirements and enhanced existing ones.

Used Teradata loader utilities such as FastLoad to load data from flat files into relational tables.

Maintained data quality in job refreshes (single and multiple) and handled Informatica job aborts, applying short-term fixes based on data sensitivity and criticality.

Developed jobs in Tidal Enterprise Scheduler with the required dependencies and alerts.

Migrated the existing ETL jobs from TES scheduler to Control-M.

Developed expertise and knowledge to deliver consistently high-quality services.

Guided teams in product merchandising and inventory management.

Database Developer Sept 2016 to Sept 2018

Tata Consultancy Services

Hyderabad, India

Project Details: PepsiCo, Inc

Description: PepsiCo, Inc. is an American multinational food, snack, and beverage corporation headquartered in Harrison, New York, in the hamlet of Purchase. PepsiCo has interests in the manufacturing, marketing, and distribution of grain-based snack foods, beverages, and other products.

Built a basic ETL pipeline that ingested transactional and event data from a web app with 12,000 daily active users, saving over $85,000 annually in external vendor costs (a minimal ingestion sketch appears after this project's bullets).

Developed mappings that extract, transform, and load source data into the Derived Masters schema using various PowerCenter transformations such as Source Qualifier, Aggregator, Filter, Router, Sequence Generator, Lookup, Rank, Joiner, Expression, Stored Procedure, SQL, and Update Strategy to implement the business logic.

Understood the business flow and requirements and provided solutions. Monitored jobs to keep applications running smoothly. Coordinated with the onsite team to fix high-priority issues.

Analyzed issues and data and provided solutions. Implemented production fixes and developed and enhanced mappings and workflows accordingly. Provided support and recovered systems from failures within predetermined SLAs.

Interacted with business users and third-party vendors for ad-hoc requests and minor change requirements.

Developed ETL components and database scripts.

Performed gap analysis between two different databases.

Analyzed impacts on existing data and reports.

Performed end-to-end unit and system testing on the developed Informatica ETL components.

Performed ad-hoc analysis of production data where necessary to develop solutions to reported incidents.

Developed technical system documentation.

Programming: used programming languages to create complex functions for use in databases.
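A minimal, hypothetical sketch of the kind of event ingestion mentioned in the first bullet of this project: it parses newline-delimited JSON events exported by the web app and appends them to a staging table. SQLite stands in for the actual warehouse target, and the file name and event fields are assumptions made for illustration.

```python
# Hypothetical event-ingestion ETL: newline-delimited JSON -> staging table.
import json
import sqlite3

DDL = """
CREATE TABLE IF NOT EXISTS stg_app_event (
    event_id   TEXT PRIMARY KEY,
    user_id    TEXT,
    event_type TEXT,
    event_ts   TEXT
)
"""

def load_events(events_path: str, db_path: str) -> int:
    """Parse the event export and append rows; returns the number of rows read."""
    rows = []
    with open(events_path) as fh:
        for line in fh:
            if not line.strip():
                continue
            evt = json.loads(line)
            rows.append((evt["event_id"], evt["user_id"],
                         evt["event_type"], evt["event_ts"]))
    with sqlite3.connect(db_path) as conn:
        conn.execute(DDL)
        # INSERT OR IGNORE keeps reruns idempotent on the event_id primary key.
        conn.executemany("INSERT OR IGNORE INTO stg_app_event VALUES (?, ?, ?, ?)", rows)
    return len(rows)

if __name__ == "__main__":
    print(load_events("app_events.jsonl", "staging.db"))
```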

Core Qualifications

Informatica Power Center.

Azure Data Factory, Azure Databricks

GitHub

PySpark

Informatica Intelligent Cloud Services

Oracle 11g/10g

Teradata

Relational databases

Tidal Enterprise Scheduler, ServiceNow Incident Management

XML (Extensible Markup Language)

HTML, CSS

UNIX shell scripting

SQL Server Integration Services (SSIS) packages

Education

Post Graduate Diploma: Cloud Computing for Big Data Sept 2022

Lambton College, Toronto, Canada


