
Azure Data Factory

Location:
Surrey, BC, Canada
Posted:
June 25, 2024


Sindhu Muni Padmanabha 778-***-****

www.linkedin.com/in/sindhu-munipadmanabha-9915aa146 ************@*****.***

TECHNICAL SUMMARY

● Over 6 years of experience as a developer in data warehousing, data integration, data migration, ETL processes, and Business Intelligence projects.

● Designed and developed various data pipelines using ETL tools such as Informatica (both on-premises and cloud), Teradata, SSIS, and Azure Data Factory based on business requirements.

● Expertise working with various databases such as Oracle, Teradata, MySQL, Microsoft SQL Server, Netezza, PostgreSQL, and Snowflake.

● Created various Teradata BTEQ scripts for data loading and improved the performance of existing scripts.

● Created notebooks for data loads using Scala and Python in Azure Databricks.

● Created various reports using Power BI and Tableau based on business needs.

● Hands-on experience with version control tools such as Bitbucket, GitHub, and GitLab, and used Jenkins for the CI/CD process.

● Experienced with PuTTY/WinSCP and Unix commands for storing developed scripts, checking logs for failures, and viewing source files.

● Experienced working in Agile/Scrum methodologies and served as administrator for Jira and Confluence.

● Hands-on experience creating and modifying data pipelines in Azure Data Factory with CDP and GitLab integration.

● Broad experience with workflow management and scheduling tools such as Azure Data Factory, Informatica, and Control-M.

● Extensive practice in troubleshooting, debugging, bug fixing, and performance tuning of slow-running ETL/ELT jobs, using pushdown optimization and partitioning techniques to manage large volumes of data.

● Implemented various academic projects using Python (data mining, data wrangling, data processing, and prediction), Hive, HTML5, CSS3, Java, Android Studio, AWS S3, Google Cloud analytics, and statistics.
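As a generic illustration of the partitioning techniques mentioned above (a minimal sketch, not code from any of the projects listed; all names are hypothetical), a large load can be split into hash buckets so each chunk is processed independently:

```python
from collections import defaultdict

def hash_partition(rows, key, num_partitions):
    """Distribute rows into num_partitions buckets by hashing a key column,
    so each bucket can be loaded or transformed in parallel."""
    buckets = defaultdict(list)
    for row in rows:
        # Rows with the same key always land in the same bucket.
        buckets[hash(row[key]) % num_partitions].append(row)
    return buckets

# Example: split 1,000 rows into up to 8 independently processable chunks.
rows = [{"order_id": i, "amount": i * 1.5} for i in range(1000)]
parts = hash_partition(rows, "order_id", 8)
```

The same idea underlies database-side partitioned loads: work is divided by a key so no two workers touch the same rows.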

Education

Douglas College, New Westminster, BC (December 2020)

Computing and Information Systems - Data Analytics (Post Baccalaureate Diploma)

GPA: 3.58

Sri Venkateswara College of Engineering, India (June 2014)

Electronics and Communication Engineering (Bachelor of Technology)

GPA: 86%

Work Experience

Organization: Infosys Ltd., Vancouver, Canada. May 2021 – Present

Client: T-Mobile

Role: Technology Lead - Data Engineer

Tools: Teradata, Oracle, SSIS, Azure Data Factory, Azure Databricks, Snowflake, SQL Server, WinSCP, PuTTY, Git, Confluence, Scala, Python.

● Created Azure Data Factory pipelines to load data from SQL Server to Snowflake using Scala and Python.

● Created BTEQ scripts to load data from other databases into Teradata based on business requirements.

● Worked on landing and moving data across different containers using Azure Storage Explorer.

● Worked with the data architecture and business analysis teams to provide design and ETL solutions for the requirements.

● Performed unit testing, code reviews, and performance tuning for code developed by teammates.

● Created reports using Power BI based on business needs.

● Used Jira, PIER and ServiceNow tools for incidents and ticket tracking.

● Worked in PuTTY/WinSCP to store developed scripts, check logs for failures, and view source files.

● Used GitHub as a version control tool and Jenkins to deploy code to other environments.

● Created runbooks in Confluence to guide the team on handling failures, along with various other technical documentation.

● Worked as a team lead for a team of 10 members and successfully completed assigned projects on time.

Organization: Great Canadian Gaming Corporation, Vancouver, Canada. February 2021 – May 2021

Role: Data Warehouse Developer

Tools: Informatica Cloud, SSIS, SQL Server, WinSCP, PuTTY, Cherwell, Confluence.

● Providing technical development and second-level support services for designated applications, including associated technical troubleshooting, analysis, and research.

● Developing, updating, and maintaining middleware interfaces using Informatica Cloud and SQL Server.

● Managing and maintaining the customized application deployment lifecycle and all associated development packages.

● Supporting and developing application integration components using a mix of technologies such as Informatica Cloud, SSIS, SQL, VBScript, and FTP.

● Conducting unit and data quality tests.

● Analyzing database models and requirements.

● Working closely with end users, support vendors, and the technical team on troubleshooting and issue resolution.

● Assessing application performance and integrity and developing recommendations for corrections and improvements.

● Creating and updating technical documentation to ensure it is current and accurate.

Organization: Ritchie Bros. Auctioneers, Burnaby, Canada. September 2019 - April 2020

Role: Systems Integrator (Co-op)

Tools: Jira, Confluence, Bitbucket, Putty, Jenkins, Powershell, Python, Salesforce, ServiceNow, SVN, Active Directory, Groovy.

● Administration, maintenance, and upgrades of Atlassian systems: Confluence, Jira, and Bitbucket.

● Salesforce sandbox, DocuSign, Drawloop, and DupeBlocker configuration, and Salesforce access management.

● Writing Python scripts to automate various tasks, such as creating Confluence documentation based on schemas and gathering Atlassian plugin usage statistics.

● Created automated Jira functions using Groovy scripting and worked with Microsoft SQL Server to check ServiceNow ticket statuses.

● Configured InfluxDB on virtual machines to monitor performance using Grafana.

● Building CI/CD pipelines using Jenkins.

● Creation, administration, and maintenance of virtual machines through vSphere.

● Working with ticketing systems and change requests on ServiceNow and Jira.

● Writing PowerShell scripts to automate various tasks in Salesforce.

● Training and supervising a junior co-op student.

Organization: Cognizant Technology Solutions, Chennai, India. Period: November 24, 2014 – June 6, 2018.

Project #1: May 2017 – June 2018

Title: Golden Stay

Client: IHG (InterContinental Hotels Group)

Tools: Informatica 9.6.1, Teradata, WinSCP, PuTTY, SSIS, SSAS, Microsoft SQL Server

Role: BI Data Engineer, Teradata Developer

Description:

InterContinental Hotels Group plc, informally InterContinental Hotels or IHG, is a British multinational hospitality company headquartered in Denham, Buckinghamshire. IHG has nearly 799,923 guest rooms and more than 5,367 hotels across nearly 100 countries. As per the BRDs, the company requires keeping track of every customer and the details of their interactions with the hotel. Our project, Golden Stay, combines stay and booking data into a single database that gives complete information about a customer.

Roles and Responsibilities

● Hands-on experience with Teradata BTEQ scripts; tuned various BTEQ scripts for better performance.

● Coded change requests and defects raised by the testing team by understanding and implementing the logic in existing mappings.

● Involved in analyzing and identifying revenue-gap discrepancies between business and client.

● Created high-level design documents in Confluence for developed mappings.

● Implemented ETL SCD Type 1 and Type 2 for various tables in different projects using SSIS and Informatica.

● Created stored procedures, scheduled SSIS packages using Microsoft SQL Server, and built reports using Power BI.

● Prepared unit test documents for developed mappings and migrated them to other environments, ensuring zero defects.
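The SCD Type 2 pattern used in the projects above can be sketched in plain Python (a minimal illustration with hypothetical column names, not the actual SSIS/Informatica implementation): each dimension row carries effective dates and a current-row flag, and a change to a tracked attribute expires the old version and inserts a new one.

```python
from datetime import date

def scd2_merge(dim_rows, staged_rows, key, tracked_cols, load_date):
    """SCD Type 2 merge: expire the current dimension row when a tracked
    attribute changes, and insert a new current version of the row."""
    current = {r[key]: r for r in dim_rows if r["is_current"]}
    for src in staged_rows:
        old = current.get(src[key])
        if old is None:
            # New key: insert the first version of the row.
            dim_rows.append({**src, "effective_from": load_date,
                             "effective_to": None, "is_current": True})
        elif any(old[c] != src[c] for c in tracked_cols):
            # Change detected: close out the old version, open a new one.
            old["effective_to"] = load_date
            old["is_current"] = False
            dim_rows.append({**src, "effective_from": load_date,
                             "effective_to": None, "is_current": True})
    return dim_rows

# A customer moves city: the old row is expired, a new current row is added.
dim = [{"cust_id": 1, "city": "Surrey", "effective_from": date(2020, 1, 1),
        "effective_to": None, "is_current": True}]
dim = scd2_merge(dim, [{"cust_id": 1, "city": "Burnaby"}],
                 "cust_id", ["city"], date(2024, 6, 1))
```

SCD Type 1, by contrast, would simply overwrite the tracked columns in place with no history row.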

Project #2: Feb 2016 – May 2017

Title: CLIP

Client: Branch Banking and Trust (BB&T)

Tools: Informatica 9.6.1, Aginity (Netezza), WinSCP, PuTTY

Role: Informatica Developer

Description:

Branch Banking and Trust (BB&T) is one of the largest financial services holding companies in the U.S. Deposits, retail banking, mortgage, and CLIP constitute the core business. CLIP, the Commercial Lending Improvement Program, includes details of customers who took out loans for their day-to-day business needs.

Roles and Responsibilities

● Implemented ETL for various facts and dimensions in Informatica PowerCenter.

● Tested functionality by performing various data load validation checks across different layers.

● Tested the implemented logic against all test cases, ensuring zero defects in the development phase.

● Coded change requests by understanding and implementing the logic in existing mappings.

● Hands-on experience with Netezza queries.

Project #3: Nov 2014 – Feb 2016

Title: STARS

Client: Walt Disney Parks & Resorts

Tools: Informatica, Teradata, PuTTY

Role: ETL Developer

Description:

Walt Disney Parks and Resorts Inc., a division of The Walt Disney Company (NYSE: DIS), is one of the world's leading providers of family travel and leisure experiences. It is responsible for conceiving, building, and managing theme parks and vacation resorts, as well as a variety of additional family-oriented leisure enterprises. The Information Management program initiative provides and implements an architectural solution for managing data from various source systems into a Centralized Data Repository (CDR). Data from different source systems was stored and transitioned into a common enterprise-wide form that multiple business areas can use for Business Intelligence, analytical reporting, data enrichment, and data cleansing. By bringing in the Seaware data, the business can analyze guest trends for cruise-line bookings and optimize the overall process to deliver better service to Disney guests.

Roles and Responsibilities:

● Involved in business requirement analysis and raised clarifications.

● Created high-level design documents for the ETL process.

● Designed and developed various data mart ETL mappings in Informatica PowerCenter v9.6.

● Hands-on experience with Teradata BTEQ scripts.

● Tuned various Informatica mappings for better performance by identifying bottlenecks and using techniques such as pushdown optimization (PDO).

● Involved in unit testing and created test reports for various functional and data-versioning scenarios.

● Provided domain knowledge transition to new resources on the project.

Awards

● Received “BEST EMPLOYER” of the month twice.

● Awarded the Cognizant innovation award "EKALAVYA" of the quarter.

Interests and Activities

● Playing outdoor games such as badminton, throwball, and volleyball.

● Interested in playing mobile and computer games.

● Love listening to melodious music every day.


