
RAMAKOTESWARARAO YAKKALA

Contact: +1-732-***-****, E-Mail: adi5vi@r.postjobfree.com

LinkedIn Profile: https://www.linkedin.com/in/ramakoteswararao-yakkala-b1484b75/

JOB OBJECTIVE

To secure a position with an organization of repute where I can exercise and maximize my ETL, Big Data, DW & BI skills to meet business objectives.

PROFESSIONAL SUMMARY

With 14+ years of overall IT experience, I have worked in various roles across a varied technology stack on integration/migration projects, end-to-end delivery of Big Data customer-journey projects, and data warehousing projects. This experience includes development and operations of applications for Verizon and T-Mobile.

PROFILE SUMMARY

Strong command of data warehousing & business intelligence project management, involving technologies like Informatica, Oracle, and Teradata; concepts like client/server, grid, and cloud; and approaches like dimensional, logical & physical data modeling.

Knowledge of advanced concepts of ETL, ELT, data mining/cleansing/profiling/masking, BI analytics, Big Data, Hadoop, visualization tools, Python, R, etc.

Extensively used ETL methodology for data extraction, transformation, and loading in a corporate-wide ETL solution using Informatica BDM 10.2.2/10.2.1/10.2.0 and Informatica PowerCenter 9.6/9.5.1/9.1.0/8.5/7.1.

Worked on databases like Oracle 11g/10g/9i, Teradata 16/15/14/13, SQL Server 2008.

Extensive Experience using SQL, PL/SQL, SQL*Plus, SQL*Loader, Unix Shell Programming/Scripting, Python, Scala.

Knowledge in SQL, PL/SQL, Visio, Erwin, DataStage, Hadoop, Spark

Experience in loading data into the Teradata database; implemented Teradata Parallel Transporter connections and stored procedures as part of performance tuning.

Involved in Complete ETL code migration to Teradata from Oracle RDBMS.

Interacted with Management to identify key dimensions and Measures for business performance.

Extensively worked on Data migration, Data Cleansing and Data Staging of operational sources using ETL process and providing data mining features for Data warehouses.

Optimized the mappings using various optimization techniques like Pushdown Optimization (PDO).

Implemented performance tuning techniques on Targets, Mappings, Sessions and SQL statements.

Exposure to dimensional data modeling, including star schema and snowflake schema modeling.

Extensive experience in Relational database/data warehouse environment: Slowly changing dimensions (SCD), operational data stores, data marts.

Proficient in all phases of the Software Development Life Cycle (SDLC), including requirements definition, system implementation, system testing and acceptance testing, Production Deployment and Production Support.

Used the job scheduling tools like Control-M, UC4 (Appworx), Cisco Tidal.

Substantial experience in Telecommunications Domain.

Excellent communication, interpersonal, and analytical skills, and a strong ability to perform individually as well as part of a team.

EDUCATION

B.E. in ECE (Electronics and Communication Engineering) from Madras University, graduated in 2002.

TECHNICAL SKILL - SET

ETL Tools

Informatica BDM 10.2.2/10.2.1/10.2.0, Informatica PowerCenter 9.6.1/9.5.1/9.1.0/8.x/7.1 (Designer, Workflow Manager, Workflow Monitor, Repository Manager, Integration Services, Repository Services, Administration Console)

Data Modeling Tools

Microsoft Visio 2007/2000, Erwin 7.0/4.1

RDBMS

Oracle 11g/10g/9i/8i/8.0/7.x, Teradata 16/15/14/13, SQL Server 2008, MS Access

Tools

Toad, SQL Navigator, SQL*Loader, MS Access Reports, Informatica Data Analyzer, Teradata Utilities (MLoad, FLoad, Teradata SQL Assistant)

Languages

SQL, PL/SQL, C, Python, Unix Shell Scripting, Awk/Sed, Scala

Scheduling Tools

Control-M, UC4 (Appworx), Cisco Tidal

Other Tools

Hadoop (HDFS, Pig, Sqoop, Hive, HBase), Spark, MongoDB, Flume, Kafka

Operating Systems

HP-UX, IBM AIX 4.2/4.3, MS-DOS 6.22, Windows 7, Windows NT 4.0, Sun Solaris 5.8, Unix, and Red Hat Linux 7.2.2

CERTIFICATIONS AND TRAININGS SUMMARY

Informatica Certified Developer, Certificate No: 292412

Oracle SQL Developer Certification.

ACHIEVEMENT SUMMARY

Received the “Excellence in Project Execution & Implementation” award for Project North at Verizon.

Received the “On the SPOT” award in 2007, 2008, 2012, 2013, and 2014, and the “Above & Beyond” award in 2009, 2010, and 2014 at Verizon.

Work/Professional Experience

Client : T-Mobile USA

Duration : Jan’2018 – till date

Project Name : Informatica BDE 10.x

Role : Sr. Software Developer

Environment : Informatica Big Data Edition 10.2.x, AWS, Hadoop Distribution: Hortonworks 2.6, Teradata SQL Assistant 15/14, Teradata Utilities (BTEQ, MLoad, FLoad, Fast Export), HP-UX

Project Description:

The project involves developing data-ingestion pipelines for business applications that process Finance/Digital/Marketing data. The objective is to build real-time and batch data-ingestion pipelines that process incoming data from external vendors by refining it, applying validations and transformations, and finally storing it in the data warehouse, where it is used for reporting and analytics. Changing business needs drive changes to existing projects and/or new launches, and migrating existing projects to current technologies improves efficiency, accuracy, throughput, and cost-effectiveness.

Roles & Responsibilities

Developed, implemented, and took responsibility for setting up the Informatica BDM tool from inception through the production phase, using Apache Spark as the execution engine for data ingestion into the data lake.

Worked closely with business users, the Informatica product support group, and Hortonworks teams to onboard Informatica BDE.

Implemented Informatica BDM mappings & workflows for Data ingestion of various data sources into Hadoop Data Lake.

Implemented a robust process through Informatica BDE that helps the business team adapt dynamically to changing business needs.

Involved in setting up the Hadoop configuration (Spark, Hadoop, and Hive connections) for Informatica BDM and testing the various execution engines of Informatica BDE.

Designed & developed BDM mappings in the various execution engines (Hive, Blaze, and Spark) for large volumes of INSERT/UPDATE operations.

Set standards for adoption of the ETL tool.

Created dynamic workflows/mappings that can handle various business models and rules for ingestion of diverse data sources.

Designed and implemented dynamic mappings/workflows for file ingestion and DB ingestion.

Automated Informatica workflows using Control-M scheduling.

Good understanding of the Informatica BDE architecture, its various components, and their usage.

Implemented Sqoop functionality for DB ingestion and for exporting from the data lake to relational sources.

Designed and demonstrated the creation of the schema parameter file and automated generation of parameter files for various workflows.

Used IDQ mapping variables/outputs and captured them into workflow variables.

Involved in the POC phase of the project to benchmark against various ETL tools.

Implemented workflows and mappings on AWS, using S3 buckets to read and write data to and from the data lake.
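For instance, a curated dataset can be copied between the cluster and an S3 bucket with DistCp over the s3a connector (a minimal sketch; the bucket, paths, and host names are hypothetical, and credentials are assumed to come from the cluster configuration):

hadoop distcp \
    hdfs://nn-host:8020/data/lake/finance/proc_date=2018-01-15 \
    s3a://example-ingest-bucket/finance/proc_date=2018-01-15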

Extensively worked on code migration from one environment to another by using Infa CLI commands.

Client : T-Mobile USA

Duration : Jan’2018 – till date

Project Name : IDW (Integrated Data warehouse)

Role : Sr. Software Developer

Environment : AWS, Hadoop Distribution: Hortonworks 2.6, Teradata SQL Assistant 15/14, Teradata Utilities (BTEQ, MLoad, FLoad, Fast Export), HP-UX

Project Description:

Data is ingested into Hadoop through a framework called DMF (Data Movement Framework) onto an edge node. Once the data is available on the edge node, Hadoop jobs are triggered to move it from the edge node into HDFS, and Hive queries are executed to create Hive tables and load the data into them. Transformations are then applied to the ingested data to produce the reports the business needs.
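A minimal shell sketch of the edge-node-to-Hive flow described above (all paths, database, and column names are hypothetical placeholders, not the actual DMF layout):

# Land the DMF file into HDFS under a dated partition directory
hdfs dfs -mkdir -p /data/idw/landing/orders/proc_date=2018-01-15
hdfs dfs -put /edge/dmf/incoming/orders_20180115.csv \
    /data/idw/landing/orders/proc_date=2018-01-15/

# Expose the landed data through a Hive external table and register the new partition
hive -e "
CREATE EXTERNAL TABLE IF NOT EXISTS idw.orders_raw (
    order_id   STRING,
    account_id STRING,
    amount     DECIMAL(12,2)
)
PARTITIONED BY (proc_date STRING)
ROW FORMAT DELIMITED FIELDS TERMINATED BY ','
LOCATION '/data/idw/landing/orders';
ALTER TABLE idw.orders_raw ADD IF NOT EXISTS PARTITION (proc_date='2018-01-15');
"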

Roles & Responsibilities

Extensively worked with partitioned and bucketed tables in Hive, and designed both managed and external tables in Hive to optimize performance.
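A minimal sketch of the DDL involved, with hypothetical database, table, and column names; the external table leaves the ingested files in place, while the bucketed managed table speeds up joins:

hive -e "
-- External table over raw landed files: Hive manages metadata only
CREATE EXTERNAL TABLE IF NOT EXISTS idw.usage_raw (
    account_id STRING,
    usage_mb   BIGINT
)
PARTITIONED BY (proc_date STRING)
STORED AS TEXTFILE
LOCATION '/data/idw/landing/usage';

-- Managed table, partitioned and bucketed for faster joins and sampling
CREATE TABLE IF NOT EXISTS idw.usage_fact (
    account_id STRING,
    usage_mb   BIGINT
)
PARTITIONED BY (proc_date STRING)
CLUSTERED BY (account_id) INTO 32 BUCKETS
STORED AS ORC;
"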

Created and ran Sqoop jobs with full-refresh and incremental loads to populate Hive external tables.
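A hedged sketch of such an incremental Sqoop import (the connection string, credentials file, table, and check column are illustrative assumptions):

sqoop import \
  --connect jdbc:oracle:thin:@//dbhost:1521/ORCL \
  --username etl_user \
  --password-file /user/etl/.oracle.pw \
  --table ORDERS \
  --target-dir /data/idw/landing/orders \
  --incremental append \
  --check-column ORDER_ID \
  --last-value 1000000 \
  --num-mappers 4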

Experience in writing Python scripts and Unix bash scripts.

Worked on Pig for data transformations, event joins, filtering, and pre-aggregations before storing the data in HDFS.

Worked on creating Oozie workflows for daily ingestion jobs.

Developed simple to complex MapReduce jobs using Hive, Sqoop, and Pig.

Designed and developed Pig Latin scripts and Pig command-line transformations for data joins.
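A small Pig Latin sketch of such a join-filter-aggregate step, written to a script file and run with pig -f; the file layouts and field names are assumed for illustration:

cat > orders_by_region.pig <<'EOF'
-- Join orders to accounts, drop bad rows, pre-aggregate by region
orders = LOAD '/data/idw/landing/orders' USING PigStorage(',')
         AS (order_id:chararray, account_id:chararray, amount:double);
accts  = LOAD '/data/idw/landing/accounts' USING PigStorage(',')
         AS (account_id:chararray, region:chararray);
good   = FILTER orders BY amount > 0.0;
joined = JOIN good BY account_id, accts BY account_id;
by_rgn = GROUP joined BY accts::region;
agg    = FOREACH by_rgn GENERATE group AS region,
         SUM(joined.good::amount) AS total_amount;
STORE agg INTO '/data/idw/agg/orders_by_region' USING PigStorage(',');
EOF
pig -f orders_by_region.pig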

Created Hive tables over the imported data for validation and debugging.

Experience in using Sqoop to migrate data to and from HDFS and MySQL, Oracle, and Teradata.

Designed and created Oozie workflows to schedule and manage Hadoop, Hive, Pig, and Sqoop jobs for daily loads.
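A daily coordinator of this kind might be submitted through the Oozie CLI roughly as follows (the hosts, paths, and the coordinator application itself are assumptions for illustration):

cat > job.properties <<'EOF'
nameNode=hdfs://nn-host:8020
jobTracker=rm-host:8032
oozie.coord.application.path=${nameNode}/apps/idw/daily-ingest
start=2018-01-15T02:00Z
end=2019-01-15T02:00Z
EOF

# Submit and start the coordinator against the Oozie server
oozie job -oozie http://oozie-host:11000/oozie -config job.properties -run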

Designed and deployed a big data analytics data-services platform for data ingestion into the Hadoop data lake.

Client : Verizon Data Services INDIA

Duration : June’2016 – Dec’2017

Project Name : VISION5 - Single Biller

Role : Consultant-System Development

Environment : Informatica 9.6.1, Teradata SQL Assistant 15/14, Teradata Utilities (BTEQ, MLoad, FLoad, Fast Export), HP-UX

Project Description: A new instance of the VISION platform, VISION5, will be created to handle all VzT CMB billing. Under the new business process, the FiOS customers in scope for this phase will be billed by VISION5, and the revenue will be booked in vSAP via the FDW/RACE/vSAP path, in place of the current process in which these customers are billed by RIBS/CBSS and the revenue is booked in PeopleSoft.

The objective of the Single Biller project is to move FiOS customer accounts from the RIBS billing systems into a new instance of the VzW VISION platform, referred to as VISION5. aEDW and its sub-applications will make the changes necessary for the new VISION5 instance while removing the complexity of interfacing with five different legacy billing systems.

Challenges:

Eliminating five different billing systems and incorporating their functionality into the single V5 system adopted from Wireless; scope reduction, elimination of waste, and technological consolidation are a few of the challenges.

Architecting, designing, and generating the Vision ID process for new & existing customers, plus a One Bill mechanism in parallel with Bundle Re-Architecture while continuing BAU.

Single Biller: unique customer identification, Vision ID conversion, and sync-up of the existing warehouse (6000+ tables, 200+ TB of data, 40000+ reports) is a very large data cleansing and exploration challenge.

Data replication, configuration, and architectural setup using GoldenGate for sales data and IBM CDC for billing data.

Meeting the ROI, KPI, and PXQ metrics by managing project scope, estimation, costing, release management, configuration management, and change management with best project management practices.

Responsibilities:

Played a key role in project initiation and was responsible for defining phases by service (FiOS, Copper) and the phases for the Vision conversion of 6M customers.

Managed the design & architecture, development, ITO testing, and deployment teams for the aEDW interfaces, QUANTUS, and Vision conversion modules.

Participated in the strategic planning of the MDR migration, project planning, and hardware & software procurement for the Dev & Prod environments. Played a key role in an 85K-hour project with $3.9M in savings by retaining ownership of the E2E solution with Verizon IT rather than a Teradata partnership.

Responsible for the ETL conversion tool and unit testing at the map level to convert the existing mappings to run on the Teradata environment.

Participated in all phases including Client Interaction, Requirements Gathering, Design, Coding, Testing, Release, Support and Documentation.

Interacted with Management to identify key dimensions and Measures for business performance.

Involved in defining the mapping rules and identifying required data sources and fields.

Extensively used RDBMS and Oracle Concepts.

ETL data load using Teradata load utilities like BTEQ, Fast Load & Multiload.
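A minimal BTEQ sketch of such a load step, run from a shell wrapper; the TDPID, credentials, and table names are placeholders:

bteq <<'EOF'
.LOGON tdprod/etl_user,secret;
-- Move staged rows into the target table, failing the script on error
INSERT INTO edw.customer_stg (cust_id, cust_name)
SELECT cust_id, cust_name
FROM   work.customer_ld;
.IF ERRORCODE <> 0 THEN .QUIT 8;
.LOGOFF;
.QUIT 0;
EOF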

Dimensional modelling using STAR schemas (Facts and Dimensions).

Develop ETL Jobs using various transformations to handle the Business Logic.

Preparation of Technical Specification document and Migration Documents.

Develop Informatica Mappings, Sessions and Workflows.

Involved in Performance Tuning.

Created Informatica mappings with PL/SQL procedures/functions to build business rules to load data.

Loading Data to the Interface tables from multiple data sources such as Text files and Excel Spreadsheets as part of ETL jobs.

Used various Informatica Transformations in building the ETL jobs to get the data loaded into Data warehouse.

Involved in data analysis for developing dashboards and other end-user solutions. Worked with business analysts and acted as a liaison between technical and business departments.

Analysed data flow requirements and developed a scalable architecture for staging and loading data and translated business rules and functionality requirements into ETL procedures.

Participated in development and defect resolution on existing ETL processes.

Unit Testing and System Integration testing.

Troubleshooting data and error conditions using the Informatica Debugger.

Client : Verizon Data Services INDIA

Duration : June’2016 – Dec’2017

Project Name : Customer Journey Analytics

Role : Consultant-System Development

Environment : HDP 2.4.1, Hive, Sqoop, Pig, Oozie, Apache Ambari, Spark, Scala, MySQL, Oracle, Teradata 14, SQL Server, RHEL 6

Project Description: CJA is one of the key projects built on the Hadoop platform. In it, we pull all customer touch-point data from different systems and implemented a model that gives better insight into the customer journey, so that we can provide a better customer experience and reduce the churn rate.

Smart Cart provides an omni-channel experience to the customer. Whenever a customer interacts with any engagement channel but cannot complete the order, then even if he goes to some other channel a few days later, we give him the ability to complete the order from the point where the previous order was dropped. We store order information as a Smart Cart save request for customers who have completed address validation and credit check and reach the order page but are unable to place an order. Once a Smart Cart order becomes a sale, the record is deleted from the cart; inactive carts are also removed after 30 days.

To provide a better customer experience and increase RCM, we started the Abandon Cart project. Here we track all existing FiOS customers who come to one of the online ordering channels, Verizon.com, visit the add/upgrade services pages, and then leave the website. We send these customers' details to the Marketing team, which analyzes the customer data from multiple angles and runs campaigns in multiple ways (email communication, ads, D2D campaigns); we also provide offers for those customers based on customer-package segmentation.

Roles and Responsibilities

Pull the RDBMS (Teradata, Oracle, SQL Server) data into HDFS using Sqoop, and pull file feeds using shell scripts.

Develop shell scripts to insert process details into MySQL audit tables once Sqoop completes.
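A hedged shell sketch of that audit step; the host, credentials, and job_audit table layout are assumptions:

#!/bin/bash
# Run an ingestion command, then record its exit status in a MySQL audit table
run_and_audit() {
    local job_name=$1; shift
    "$@"
    local rc=$?
    mysql -h audit-host -u audit_user -p"$AUDIT_PW" audit_db <<SQL
INSERT INTO job_audit (job_name, run_ts, exit_code)
VALUES ('${job_name}', NOW(), ${rc});
SQL
    return "$rc"
}

run_and_audit orders_sqoop \
    sqoop import --connect jdbc:mysql://src-host/sales \
        --username etl_user --password-file /user/etl/.pw \
        --table ORDERS --target-dir /data/cja/landing/orders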

Create the hive tables on top of HDFS data files

Create domain tables partitioned on proc_date and src_system, and load data into them.

Create the key tables, Node table and Fact table and load data using hive scripts

Automate the entire process using the oozie workflow manager and coordinator Jobs

We used the mail action nodes to send the job status details.

Pull the Smart Cart save-request events from the Kafka broker and load them into HDFS using Apache NiFi in real time.

Move the Smart Cart events from the NiFi target location to the stage location in batch mode, and create a Hive table on top of those files.

Apply the XML parsing logic and store the final result in one more Hive table, which is the source for the reporting team.

Automate the complete process using oozie workflow scheduler.

Pull the Data from Teradata to Hadoop environment using Sqoop and create a hive table

Apply the parsing logic to the XML data field of the table and extract the required information using Hive XML functions.
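Hive ships built-in xpath UDFs for exactly this kind of parsing; a minimal sketch against a hypothetical payload column and element names:

hive -e "
SELECT
    xpath_string(xml_payload, '/cart/accountId')      AS account_id,
    xpath_string(xml_payload, '/cart/order/status')   AS order_status,
    xpath_int(xml_payload, 'count(/cart/order/item)') AS item_count
FROM cja.smart_cart_raw;
"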

Apply ETL logic & filters to extract only abandon-cart data from the complete online-customer data set.

Export the final feed to Teradata using Sqoop; this is the golden source for the Marketing team's campaigns.
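A hedged sketch of that Sqoop export (the JDBC URL depends on the Teradata connector in use; the table and directory names are illustrative):

sqoop export \
  --connect jdbc:teradata://td-host/DATABASE=mkt \
  --username etl_user \
  --password-file /user/etl/.tdpw \
  --table ABANDON_CART_FEED \
  --export-dir /data/cja/abandon_cart/final \
  --input-fields-terminated-by '\t' \
  --num-mappers 8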

Automate the entire job and schedule it daily using an Oozie workflow.

We used mail action nodes to send job-status details to the appropriate teams, and also wrote each action node's statistics to a MySQL table for auditing purposes.

Client : Verizon Data Services

Duration : July’2014 – May’2016, India

Project Name : OrBiT MDR – AEDW Migration

Role : Senior Analyst-System Development

Environment : Informatica 9.6.1, Teradata SQL Assistant 14, Teradata Utilities (BTEQ, MLoad, FLoad, Fast Export), HP-UX

Description: The “increase revenues by reducing costs” initiative drove the OrBiT program, realized through consolidation of applications, systems & data centers. As part of this, MDR (Metrics Data Repository), the operational and strategic metrics data warehouse for Verizon Telecom Services, is being migrated to aEDW, the active Enterprise Data Warehouse. This consolidation effort will eliminate the overlap of some existing order and billing data and will also enrich the aEDW data mart with new data it does not currently possess. The strategic benefits are: #) sharing customer-level business intelligence/analytics by having both performance (MDR) and marketing/campaign (aEDW) metrics under one platform; #) enabling cross-domain analysis; #) enhancing speed to market by creating one common source for performance metrics, marketing and sales tracking & compensation.

Challenges:

Identifying the project scope and phases, despite requirements gaps, technological gaps, and architectural gaps between the existing and proposed migration environments, through proper analysis.

Project estimation, costing & budgeting; identifying ROI; and project planning, identifying hardware & software requirements with all stakeholders involved.

Converting the ETL code to be compatible with the Teradata database & planning a 60+ TB data migration.

Setting up the environment to utilize the VDSI Resources to reduce the project costs and identify the optimal balance of onshore & offshore team sizes.

Executing the project and meeting the project timelines with proper risk management, along with other challenges like monitoring and controlling, measuring project progress with proper metrics, and training teams on technologies.

Responsibilities:

Participated in the strategic planning of the MDR migration, project planning, and hardware & software procurement for the Dev & Prod environments. Played a key role in an 85K-hour project with $3.9M in savings by retaining ownership of the E2E solution with Verizon IT rather than a Teradata partnership.

Responsible for the ETL conversion tool and unit testing at the map level to convert the existing 9000+ mappings to run on the Teradata environment.

Participated in all phases including Client Interaction, Requirements Gathering, Design, Coding, Testing, Release, Support and Documentation.

Interacted with Management to identify key dimensions and Measures for business performance.

Involved in defining the mapping rules and identifying required data sources and fields.

Extensively used RDBMS and Oracle Concepts.

ETL data load using TeraData load utilities like BTEQ,FastLoad & MultiLoad.

Dimensional modeling using STAR schemas (Facts and Dimensions).

Develop ETL Jobs using various transformations to handle the Business Logic.

Preparation of Technical Specification document and Migration Documents.

Develop Informatica Mappings, Sessions and Workflows.

Involved in Performance Tuning.

Created Informatica mappings with PL/SQL procedures/functions to build business rules to load data.

Loading Data to the Interface tables from multiple data sources such as Text files and Excel Spreadsheets as part of ETL jobs.

Used various Informatica Transformations in building the ETL jobs to get the data loaded into Data warehouse.

Involved in data analysis for developing dashboards and other end-user solutions. Worked with business analysts and acted as a liaison between technical and business departments.

Analyzed data flow requirements and developed a scalable architecture for staging and loading data and translated business rules and functionality requirements into ETL procedures.

Participated in development and defect resolution on existing ETL processes.

Unit Testing and System Integration testing.

Troubleshooting data and error conditions using the Informatica Debugger.

A few roles & responsibilities for the below MDR enhancement projects (July 2013 – June 2014, India):

#) Consumer Affinity Programs

#) 500 GB DVR Expansions

#) Mass Market Weekly Access Line Walk

#) Verizon Websites - Downgrades

#) Sales Channel in Billing/Ribs

#) Network Walled Garden

#) California Cramming Rules - Compliance

#) FiOS TV New STB Strategy - vBundles

#) TPM Sub-Regional Reporting Restructure (CBST0139 Re-sync process)

#) Live TV Content Expansion (MPEG4)

#) FiOS Temporary Service Suspension Details to <SVC> Billing Data Marts

#) Reports for FDV New Sales vs. Migrates

Acted as a solid team member & part of the change control board.

Responsible for creating the MASKING environment for the ETL dev team, to avoid project closure by the clearance team.

MDR part: involved right from JAD sessions and high-level and low-level documents, through the full project lifecycle (design, ETL & DB development, testing, documentation, implementation by CAs, and post-implementation support until the transition).

Involved in designing the Database and functional specification of the project.

Involved in mapping, session, and workflow design using Informatica 8.6; database tables and objects in Oracle 11g & TOAD; Unix shell scripts; Appworx jobs; Tidal jobs; unit test plans; code reviews; test cases; and release-related documents and activities.

Involved in team management and performance evaluation, besides assisting in the selection and recruitment process, holding knowledge-sharing sessions for the teams, & coordinating with the offshore team and the onsite client coordinator.

Client : Verizon Data Services

Duration : Jan’2012 – June’2013 India

Project Name : PROJECT NORTH – MDR – SPINCO & RETAINCO

Role : Senior Analyst

Environment : Informatica 8.x, Oracle 10g, HP-UX

Description: Verizon sold the West part of its business to Frontier, comprising the FIN, HSI, FTV, and FDV business in 14 states & a few wire centers of CA. Project North is a kind of demerger of the complete existing Verizon software, hardware, data, resources, and more (a state-of-the-art, world's-largest-scale spin-out of a new company from an existing organization under stringent, tightly time-bound deadlines, with figures like 2700+ projects, 2K+ sources, 4K+ resources, $6 billion, 1 year, etc.) into two parts, SPINCO (transferred to Frontier) and RETAINCO (retained by Verizon), adhering to both organizations' regulatory business rules & data security. MDR, being the broadband metrics data warehouse, has to churn out PN-MDR with the same applications and existing functionality, in essence a replica that runs in parallel with the existing BAU.

Challenges:

Develop the methodology and project plan; identify upstream interfaces in & downstream interfaces out.

Identifying the complete stakeholder list: source and target DB systems, inbound feeds, retired/non-PN applications, S/W, H/W servers & configuration, and license procurement details.

Blanket Security clearance, publishing all MDR applications at CR with biometric access to PN resources.

Continue BAU & set up applications and databases; establish the hardware and software configuration in FTW.

Responsibilities:

Identifying the complete stakeholder list: source database systems, target DB systems, inbound DB, inbound XFER, outbound DB, retired/non-PN applications, S/W, H/W servers & config, licenses, and replica S/W and H/W license procurement details.

Blanket Security clearance, publishing all MDR applications at CR with biometric access to PN resources.

Project-related activities (identify hardware requirements, identify software requirements): DB servers, ETL servers, and job scheduler servers setup; Appworx client & server setup; Informatica setup; SFTP setup & config; Oracle & SQL Server 2005 database setup & configuration for the cube-processing database; SQL Server Analysis Services 2005 configuration (memory settings); Microsoft Analysis Services driver installation; and the cube-processing tool.

Develop methodology and work plan, identify upstream interfaces IN & identify downstream interfaces OUT.

Application and database setup, i.e., establishing the hardware and software configuration in FTW.

Replicating the production environment to the PN environment, with access for all VDSI PN resources from CR.

Set up the replication process (FTW DB instances, FTW Informatica instances, etc.).

Extract Frontier data using Data Pump export scripts; export all online data, including history, for the Project North states.
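A minimal expdp sketch of such an export; the schema, directory object, and file names are placeholders:

expdp etl_user/secret@MDRPROD \
    schemas=MDR_NORTH \
    directory=DP_EXPORT_DIR \
    dumpfile=pn_mdr_%U.dmp \
    logfile=pn_mdr_export.log \
    parallel=4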

Testing the application: load/unit tests, inbound feeds, and outbound feeds.

Regression and client acceptance testing

Post production support.

Client : Verizon Data Services

Duration : Jan 2011 – Dec 2011 India

Project Name : MDR – FiOS TV Data mart

Role : Analyst

Environment : Informatica 8.x,Oracle 10g, HP-UX

Project Description:

The project to develop the FiOS-TV data mart is part of the CSG migration. Cable Systems Group (CSG) is a third-party vendor used to meet the initial requirements for launching Verizon's FTTP-based video services. The purpose of the FiOS-TV data mart initiative is to improve efficiency as volume increases and to support a full-scale deployment by migrating the CSG functionality into VZ systems.

The FiOS-TV data mart mainly holds the Video on Demand (VOD) and Pay-Per-View (PPV) data with RIBS. This enables better management of the data by moving from the existing ARBOR billing system to RIBS, and ultimately a better decision-support system.

Responsibilities:

Responsible for Mapping Development, Mapping Reviews

Developed mappings and workflows to pull data from the source system into interim tables, PSA, and the data mart.

Implemented SCD Type 2 mappings.
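The SCD Type 2 pattern behind those mappings, sketched as plain Oracle SQL run from a shell wrapper (the dimension, staging, and sequence names are hypothetical; the actual logic lived in Informatica mappings):

sqlplus -s etl_user/secret@DWPROD <<'SQL'
-- Expire the current row when a tracked attribute changes
UPDATE dim_customer d
   SET d.eff_end_dt   = TRUNC(SYSDATE) - 1,
       d.current_flag = 'N'
 WHERE d.current_flag = 'Y'
   AND EXISTS (SELECT 1 FROM stg_customer s
               WHERE s.cust_id = d.cust_id
                 AND s.cust_segment <> d.cust_segment);

-- Insert a fresh current row for new and changed customers
INSERT INTO dim_customer
    (cust_key, cust_id, cust_segment, eff_start_dt, eff_end_dt, current_flag)
SELECT dim_customer_seq.NEXTVAL, s.cust_id, s.cust_segment,
       TRUNC(SYSDATE), DATE '9999-12-31', 'Y'
  FROM stg_customer s
 WHERE NOT EXISTS (SELECT 1 FROM dim_customer d2
                   WHERE d2.cust_id = s.cust_id
                     AND d2.current_flag = 'Y');
COMMIT;
SQL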

Documented and Tested the mappings and workflows for DataMart.

Testing and Preparing of UTC’s for developed mappings.

Responsible for onshore interaction & getting clarification on development hurdles.

Responsible for all the FiOS-TV deliverables from offshore.

Client : Verizon Data Services

Duration : June’2006 – Dec’2010 USA, India

Project Name : MDR – Matrix Data Reconciliation

Role : Analyst

Environment : Informatica 7.x,Oracle 9i, HP-UX

Description: Verizon Communications is one of the leading providers of communication services in the world and the largest provider of wireline and wireless communications in the USA. MDR, the Metrics Data Repository, is the operational and strategic metrics data warehouse for Verizon Telecom Services. It is the end-to-end operational and executive enterprise data warehouse for all broadband services (HSI and FiOS Internet, FiOS Voice & TV, VoiceWing & VASIP) across the Verizon footprint. The Metrics Data Repository deals with DSL and FTTP sales data and enables top-level management to analyze DSL and FTTP sales. MDR was developed to support a variety of customers by providing a unified metrics database to store business data and generate reporting metrics.

Responsibilities:

Responsible for loading SSP, ARBOR, and REMEDY data into MDR for reporting.

Generating Outbound feeds for different systems like IGO, VERICHECK, NORDIC, AOL, and CCI Provisioning.

Responsible for MDR Enhancements as well as new development requirements for MDR.

Performing ETL operations for Real Time loads, Periodic Loads & Dimensional Loads.

Responsible for resolving Issue Requests (IRs) raised by different Verizon systems: error detection, analysis & resolution.

References & other details will be furnished upon request.


