Pallavi Vibhute

Data Warehouse / Business Intelligence Developer

Contact

Address:

Chandrama Residency, ‘B’ Wing, Flat No. 15, Suncity, Pune

Phone:

+91-721*******

Email:

adma3b@r.postjobfree.com

LinkedIn:

linkedin.com/in/pallavivibhute

Skill Highlights

Data Warehousing

Database Management

Azure / Oracle Cloud Certified

AT&T Client interfacing

Performance Tuning

Query Optimization

Impact / Data Analysis

ETL / ELT

DevOps, Agile, CI/CD

Automation

Conducting Tech Forums

Data Modeling

Complex Problem Solver

Snowflake Hands-On Essentials

Summary

A team- and goal-oriented, dedicated DWH/BI Developer with 4 years and 10 months of experience at Amdocs, and strong SQL, analytical, and problem-solving skills. Looking for an opportunity to work with open-source technologies and application development.

Achievements

2021 Microsoft Azure Cloud Fundamentals certification

2020 Oracle Cloud Developer Associate certification

2019 Winner of ‘Hackathon 2019’ at the organization

2018 Winner of ‘Kaizen Event 2018’, an innovation program

2017 Certificate of Appreciation (Q4 2017) for exceptional contribution to a revenue project

2017 Received 3 Spot Awards for quality performance and collaboration within the team

Experience

Data Warehouse/ BI Developer [Jul 2016 - Present]

Amdocs Development Centre India LLP - Telecom Domain

Client: AT&T US

Experienced in DevOps, Agile, and Scrum methodologies

Involved in Requirement Gathering, Analysis, Planning, Design, Development & Implementation stages of various projects

Expertise in working with Informatica PowerCenter 9.x/10.x

Expertise in working with Teradata 16.0 systems & utilities (BTEQ, TPT, MLOAD, FLOAD, FEXPORT), Unix/Linux shell scripting, and Python

Advanced SQL skills, including derived tables, unions, multi-table inner/outer joins, and query optimization

Responsible for applying business transformations and communicating functional/technical issues with onsite managers

Defining schemas, staging tables, and landing tables; configuring base objects and foreign-key relationships

Contributing technical metadata to the metadata repository

Proficient in database management practices & processes, including user documentation, table-level specifications, table process-flow documents, training, impact analysis, and data analysis

Able to work independently or as part of a team, and to communicate effectively with customers, peers, and management at all levels

Conducting tech forums with stakeholders & customers prior to project deployment

Good knowledge of data and dimensional modeling

Performance tuning, including collecting statistics, analyzing EXPLAIN plans, and determining which tables need statistics; improved performance by 35-40% in some situations (see the sketch after this list)
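
Below is a minimal sketch of the statistics-collection and EXPLAIN-analysis workflow described in the last bullet, using the Python teradatasql driver; the host, credentials, and table/column names are illustrative placeholders rather than details from these projects.

    # Hedged sketch: Teradata statistics refresh plus plan inspection.
    # Connection details and object names below are placeholders.
    import teradatasql

    with teradatasql.connect(host="tdhost", user="dbuser", password="***") as con:
        cur = con.cursor()
        # Refresh optimizer statistics on a commonly joined/filtered column.
        cur.execute("COLLECT STATISTICS ON sales_db.orders COLUMN (order_date)")
        # Read the optimizer plan; "no confidence" row estimates or product
        # joins in the explanation usually mean more tables need statistics.
        cur.execute(
            "EXPLAIN SELECT * FROM sales_db.orders "
            "WHERE order_date = DATE '2021-05-11'"
        )
        for (step,) in cur.fetchall():
            print(step)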

Skill Highlights

Ability to meet critical deadlines

Training new joiners

Effective Team Player

Collaborative

Self-motivated

Ambitious

Strong in SQL

Tools / Technology

Teradata SQL

Informatica PowerCenter

TOAD

Oracle

Unix/Linux Shell Scripting

GIT

Jenkins

SVN

Vertica

Microsoft Office Suite

BTEQ

TPump

FastLoad

MultiLoad

FastExport

Slowly Changing Dimensions

SDLC

Gromit Metadata Repository

JIRA

Python

Database - MySQL

OS - Windows / Linux / Ubuntu

Scheduler - Maestro

Trainings / Seminars

Scaled Agile SAFe 5.0

Python 3.7

Azure Cloud

Oracle Cloud

MicroStrategy Reporting Tool

Teradata Architecture

Scheduling Mechanisms

Data Mining Techniques

NiFi Data Ingestion Framework

Projects:

Merlin

Identified functional requirements, analyzed the system, provided suggestions & design as per requirements

Worked with other team members, including DBAs, Technical Architects, QA, and Business Analysts, to define the best solution

Created Informatica Mappings to build business rules to load data using transformations

Responsible for capturing, reporting & correcting error data

Used Maestro for job scheduling, workload automation, and report generation

Responsible for getting user stories (USs) accepted by the client after successful delivery and validation of the project

Evergent

Worked on integrating customer data from multiple sources to achieve consistency and provide a single source of truth for business decision-making

Implemented error and exception handling by logging failures to error tables and sending alert e-mails to the concerned distribution list (see the sketch after this list)

Designed and developed the data transformations for source system data extraction, data staging, movement, aggregation and data quality handling

Created Informatica Mappings to build business rules to load data using transformations
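
A simplified sketch of that error-handling pattern, assuming a generic DB-API connection; the table names, addresses, and mail host are illustrative, and the original implementation was built in Informatica/Teradata rather than Python.

    # Hedged sketch: log rejected rows to an error table, then alert by e-mail.
    # Table names, addresses, and the mail host are placeholders.
    import smtplib
    from email.message import EmailMessage

    def load_with_error_handling(con, rows):
        cur = con.cursor()
        rejected = []
        for row in rows:
            try:
                cur.execute("INSERT INTO stg.customer (id, name) VALUES (?, ?)", row)
            except Exception as exc:
                rejected.append((row, str(exc)))
                # Keep the failing record and reason for later correction.
                cur.execute(
                    "INSERT INTO stg.customer_err (rec, err_msg) VALUES (?, ?)",
                    (repr(row), str(exc)),
                )
        if rejected:
            msg = EmailMessage()
            msg["Subject"] = f"Load alert: {len(rejected)} rejected rows"
            msg["From"] = "etl-alerts@example.com"
            msg["To"] = "dl-etl-support@example.com"
            msg.set_content("\n".join(f"{r}: {e}" for r, e in rejected))
            with smtplib.SMTP("mailhost.example.com") as smtp:
                smtp.send_message(msg)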

Find

Extensively worked on Mapping Variables, Mapping Parameters and Session Parameters

Applied slowly changing dimensions of Type 1 and Type 2 to handle delta loads effectively (see the sketch after this list)

Used advanced SQL along with regular expressions
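
A hedged sketch of the Type 2 handling referenced above: close out the current version of each changed row, then insert the incoming version as current. The dimension table, staging table, and column names are illustrative.

    # Hedged sketch: SCD Type 2 delta load as two SQL steps.
    # Dimension/staging table and column names are placeholders.
    EXPIRE_CHANGED = """
    UPDATE dim_customer
    SET end_date = CURRENT_DATE, is_current = 'N'
    WHERE is_current = 'Y'
      AND customer_id IN (SELECT customer_id FROM stg_customer_delta)
    """

    INSERT_CURRENT = """
    INSERT INTO dim_customer
        (customer_id, name, start_date, end_date, is_current)
    SELECT customer_id, name, CURRENT_DATE, DATE '9999-12-31', 'Y'
    FROM stg_customer_delta
    """

    def apply_scd_type2(con):
        cur = con.cursor()
        cur.execute(EXPIRE_CHANGED)   # step 1: expire old row versions
        cur.execute(INSERT_CURRENT)   # step 2: insert new current versions
        con.commit()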

Wireline Billing for WTN

Created mappings and sessions to implement technical enhancements to the data warehouse, extracting data from Oracle and delimited flat-file sources (see the sketch after this list)

Closely worked with the reporting team to ensure that correct data is presented in the reports
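
A small sketch of reading a delimited flat-file source like the ones mentioned above; the pipe delimiter and file layout are assumptions, and the original extraction was done through Informatica mappings.

    # Hedged sketch: stream records out of a delimited flat file.
    # The delimiter and path are placeholders.
    import csv

    def read_flat_file(path, delimiter="|"):
        with open(path, newline="") as fh:
            # Yield one parsed record at a time to keep memory use flat.
            for record in csv.reader(fh, delimiter=delimiter):
                yield record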

Automation (Python)

Developed a script that takes SQL queries from the user and converts them into the standardized BTEQ/TPT script format, avoiding syntax errors and saving ~1.5 hours/day of manual effort (sketched below)
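
A minimal sketch of the idea behind this script, assuming a simple template; the standardized format, logon handling, and error checks of the original are not reproduced here.

    # Hedged sketch: wrap a user-supplied SQL query in standard BTEQ
    # boilerplate. The template, host, and credentials are placeholders.
    import textwrap

    BTEQ_TEMPLATE = textwrap.dedent("""\
        .LOGON {host}/{user},{password}
        {sql};
        .IF ERRORCODE <> 0 THEN .QUIT 8
        .LOGOFF
        .QUIT 0
        """)

    def make_bteq(sql, host="tdhost", user="dbuser", password="***"):
        # Strip any trailing semicolon so the template adds exactly one.
        return BTEQ_TEMPLATE.format(
            host=host, user=user, password=password, sql=sql.strip().rstrip(";")
        )

    if __name__ == "__main__":
        print(make_bteq("SELECT COUNT(*) FROM sales_db.orders"))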

Education

Bachelor of Engineering - Computer Engineering

Sinhgad College of Engineering, Vadgaon (Bk), Pune


