PRASANNA VIJAY MEDISETTY
Over 15 years of experience in Data Integration, Data Quality, Data Analysis and Metadata Management. Hands-on experience in Informatica PowerCenter, PowerExchange, Data Quality (IDQ) and Master Data Management (MDM). Worked with various ETL tools such as Alteryx, Talend, SSIS and Cast Iron. Scripting experience using UNIX, Python and Windows batch scripts. Business Intelligence experience using reporting tools such as OBIEE, Tableau, Cognos and Business Objects.
7 years of experience in technical project management, leading complex projects.
Experience in providing end-to-end solutions with optimal design and architecture.
Exposure to Big Data and cloud integration with the Hortonworks Hadoop Data Platform using Sqoop, HDFS (Hadoop Distributed File System), Hive, Pig and Amazon S3.
Currently working at Toyota Financial Services on a Data Quality & Metadata project, playing an instrumental role in partnering with business and technical teams to build, implement and maintain the data quality and metadata of the entire enterprise data.
Strong ability to gather an organization's data and provide actionable insights through data analysis, and to build and support enterprise solutions.
Excellent understanding of fundamental data warehousing and data lake concepts and able to translate business requirements into meaningful business/technical specifications and process design.
Knowledge of the financial industry across a diverse set of business areas (Customer Data Management, Order, Membership, Billing and Call Center, Financial Services Marketing and Customer Behavior), combined with a strong understanding of technology, architecture and data modeling.
Excellent problem-solving, communication, leadership, analytical and interpersonal skills. Experienced in coordinating and managing teams, and in providing estimates, project plans and timelines. Works well both independently and as part of a team, and multi-tasks effectively by prioritizing work. Highly effective at communicating with all levels of management, coworkers and other project stakeholders. Outstanding skills in analyzing business processes and end-user needs; detail-oriented and committed to delivering superior-quality work. Frequently interact with users through meetings and discussions to understand their needs and provide technical solutions. Successfully handled onshore-offshore coordination.
A proven, verifiable track record and a strong sense of dedication toward accomplishing challenging goals with persistent commitment. Successfully organized cross-training within the team.
15+ years of Database experience using various databases like Netezza (Aginity Workbench), Oracle 11g/10g/9i/8i, Teradata 13, Sybase IQ 15/13, SQL Server 2008/2005/2000 and DB2.
Good knowledge and understanding of Teradata SMP (Symmetric Multiprocessing) and MPP (Massively Parallel Processing) architectures.
Hands-on experience creating indexes such as UPI/NUPI, USI/NUSI and Join Indexes.
Proficient in implementing solutions using Teradata utilities - BTEQ, FastLoad, MultiLoad, TPump and FastExport.
Wrote BTEQ scripts to extract and transform data.
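As an illustration of this kind of work, the sketch below generates a minimal BTEQ export script from a template. All names here (TDPID, database, table, columns) are hypothetical placeholders, not actual project details.

```python
# Minimal sketch: render a BTEQ export script as text from a template.
# tdpid/user/db/table/column names are illustrative assumptions only.
from string import Template

BTEQ_TEMPLATE = Template("""\
.LOGON $tdpid/$user,$password;
.EXPORT REPORT FILE = $outfile;
SELECT cust_id, cust_name, updated_dt
FROM $db.$table
WHERE updated_dt >= DATE '$since';
.EXPORT RESET;
.LOGOFF;
""")

def build_bteq_script(tdpid, user, password, db, table, since, outfile):
    """Render a BTEQ export script as plain text."""
    return BTEQ_TEMPLATE.substitute(
        tdpid=tdpid, user=user, password=password,
        db=db, table=table, since=since, outfile=outfile,
    )

script = build_bteq_script("tdprod", "etl_user", "****",
                           "EDW", "CUSTOMER", "2016-01-01", "cust.out")
print(script)
```

In practice a generated script like this would be handed to the `bteq` command-line utility (for example from a UNIX wrapper script) rather than executed from Python.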
DATA QUALITY & METADATA
6+ years of Data Quality and Profiling Experience using Informatica Data Quality (IDQ) Tools like Informatica Analyst & Developer.
Design, implement and maintain data quality processes using Informatica Data Quality (IDQ).
Working directly with business users, SMEs and data analysts to understand data quality requirements.
Hands-on experience with Informatica Data Quality Analyst (browser-based), which provides an interface for business users.
Experience with the Informatica Data Quality Developer workbench (Eclipse-based) for building global, reusable data quality rules.
Supporting business users through Data Quality as a Service by implementing business rules and scorecards.
Analyze, troubleshoot, diagnose Data Quality workflow issues; recommend and implement appropriate solutions.
Develop and support real-time web services for address validation.
Develop and improve standards and procedures to support data quality development, testing, production, and operational oversight of the data quality processes.
Experience in Informatica Master Data Management (MDM): data acquisition to capture and maintain asset information and the business glossary, and collection of data definition dictionaries to capture and provide data lineage.
8+ years of data modeling experience using ER Studio and Erwin. In-depth experience in Star/Snowflake schemas and Conceptual, Logical and Physical modeling. Good understanding of the Ralph Kimball and Bill Inmon methodologies.
DATA INTEGRATION & MIGRATION
15+ years of expertise in data integration into enterprise data warehouses from various source systems using Informatica PowerCenter and PowerExchange.
Extensive experience in gathering business requirements, writing technical specifications, and application design, development, testing, implementation and support for data warehousing client/server applications, applying technical, problem-solving and coordination skills to positively impact the company's operations.
Knowledge of Informatica B2B Data Transformation (DT), Data Exchange (DE) and Managed File Transfer (MFT).
Knowledge of Informatica Information Lifecycle Management (ILM) tools such as Data Archive, Test Data Management (TDM) and ILM Nearline.
15+ years of Business Intelligence experience using reporting tools like OBIEE, Cognos and Business Objects.
Developed Universes for multiple subject areas.
Created reports using both Webi and Deski.
Scheduled reports and provided support to business users to run the business.
Toyota Financial Services, Plano, TX 11/16 – Current
Data Quality & Metadata Consultant
TFS is part of the worldwide financial services operations of Toyota Financial Services Corporation (TFSC), a wholly-owned subsidiary of Toyota Motor Corporation (TMC) in Japan. TFS has three regional offices, three customer service centers, and 29 dealer sales and service offices throughout the United States. The Data Quality and Metadata project is a collaborative Business Intelligence & Enterprise Data Management initiative to capture and report the quality of the data and provide visibility into metadata and data lineage.
Data acquisition from Customer, Marketing, Re-Marketing, Sales, Accounting, RISK and Insight marts.
Gathering data quality rules from various data marts and executing the rules against the data to provide insight into its quality.
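A data quality rule of this kind typically reduces to a pass-rate metric reported on a scorecard. The sketch below shows one hypothetical example, a completeness check; the column name and rows are illustrative only.

```python
# Hypothetical sketch of scoring one data-quality rule: the percentage
# of rows passing a completeness check, as a scorecard might report it.
def completeness_score(rows, column):
    """Return the percentage of rows with a non-empty value in `column`."""
    passed = sum(1 for r in rows if r.get(column) not in (None, ""))
    return round(100.0 * passed / len(rows), 1)

rows = [
    {"ssn": "123-45-6789"},
    {"ssn": ""},
    {"ssn": "987-65-4321"},
    {"ssn": None},
]
print(completeness_score(rows, "ssn"))  # -> 50.0
```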
Capturing metadata and business descriptions of all database objects.
Collecting the lineage of data objects to provide end-to-end data process flow.
Data integration of all the data marts, and building reports and dashboards for end users.
Requirements gathering with key business and IT stakeholders by using excellent inquiry and communication skills.
Design, Develop, Test, Deploy and maintain DQR & MDR projects.
Create documentation related to the project throughout Software development life cycle.
Evaluate, plan, manage, track, and provide status on system maintenance, enhancement and support activities.
Create and maintain physical database data models required for the enterprise data quality.
Facilitating discussions with internal/external customers, vendors or others involved cross-functionally to determine appropriate solutions and coordinate prioritization.
Coordinate closely with business owners, solution teams and design teams to ensure that solutions are fit-for-purpose and adequately meet the requirements.
Interface with clients, third-party partners, program management and developers to understand day-to-day needs and the steps needed to correct issues.
Providing data analysis and recommendations for the enhancement, correction, and/or development of the applications.
Providing business/technical leadership within project teams as a subject matter expert and conduct/participate in business requirement, design, and code reviews.
Identifying indicators of potential problems/issues, coordinating and performing root cause analysis, and alert business community or technical team with recommended courses of action.
Continually seeking opportunities to increase customer satisfaction and strengthen working relationships with internal and external clients.
Managing and participating in projects with varying levels of scope and complexity.
Environment: Informatica, Netezza, OBIEE, UNIX, Windows.
Beachbody, Santa Monica, CA 03/15 – 10/16
Data Migration Consultant
Beachbody LLC is an American multinational corporation that uses direct response infomercials and multi-level marketing to sell fitness, weight loss, and muscle building home-exercise DVDs. Team Beachbody is a dynamic online support community to help customers stay motivated with their workout program from start to finish. Team Beachbody is the ultimate customer resource, where trainers, coaches, and experts enhance the experience and results of each individual user. Team Beachbody applications are developed and supported by a third party; the scope of the data migration project is to build an in-house solution using Oracle E-Business Suite and migrate all data from the third-party application databases to the E-Business Suite application databases.
Working in WAVE1 Data Migration Project.
Gathering data migration requirements from application teams to migrate Customer, Member and Order related data from the legacy system to Oracle applications such as EBS, IDM (Oracle Identity Management) and ATG.
Collecting data points and configuration details from the Liferay and ByDesign source systems.
Design, plan and develop data migration tasks.
Setting up data migration environments and process dependencies.
Coordinating between source system and target systems.
Working closely with Product team to gather requirements of target systems.
Gathering details from Business operations team on SKUs, Bill of Materials, Continuity Masters and Modifiers.
Developed Data Migration processes using Informatica mappings.
Developed historical data load processes as well as incremental load processes to capture delta data.
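The incremental loads described above can be sketched as a watermark-based delta extraction. This is a minimal illustration, assuming each source row carries a last-modified timestamp; field names are hypothetical.

```python
# Minimal sketch of a watermark-based incremental (delta) load:
# keep only rows changed since the previous successful run.
from datetime import datetime

def extract_delta(rows, last_run):
    """Return only the rows modified after the last-run watermark."""
    return [r for r in rows if r["last_modified"] > last_run]

source = [
    {"id": 1, "last_modified": datetime(2016, 1, 10)},
    {"id": 2, "last_modified": datetime(2016, 2, 20)},
    {"id": 3, "last_modified": datetime(2016, 3, 5)},
]
last_run = datetime(2016, 2, 1)   # watermark from the previous run
delta = extract_delta(source, last_run)
print([r["id"] for r in delta])   # -> [2, 3]
```

In an ETL tool the same idea is usually expressed as a source-qualifier filter on the audit timestamp, with the watermark persisted in a control table after each run.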
Participating in end-to-end integration testing to make sure data is seeded into the required applications.
Validating and working with QA team to test the front end applications.
Conducting User Acceptance Testing with business users.
Supporting all data migration activities at enterprise level.
Environment: Informatica PowerCenter 9.5, Alteryx, Oracle, SQL Server, OBIEE, Tableau, UNIX, Windows.
Experian, Costa Mesa, CA 08/13 – 03/15
Data Integration Consultant
Experian plc is a global information services group with operations in 40 countries. The company employs 17,000 people, with corporate headquarters in Dublin, Ireland and operational headquarters in Nottingham, United Kingdom; Costa Mesa, California, US; São Paulo, Brazil; and Heredia, Costa Rica. It is listed on the London Stock Exchange and is a constituent of the FTSE 100 Index. Working on the Experian Consumer Services reporting team. To meet business needs in a fast-paced environment and make the right decisions at the right time, I serve as a member of an agile team, using Sprint/Agile methodology to ship finished products in response to users' changing business needs.
Working in Direct2Consumer and Affinity Agile Team.
Participating in Agile team meetings like Grooming, Planning, Daily Standups, Acceptance and Retrospective team meetings. Create/update appropriate project management documentation as and when required.
Communicating and consulting with all parties directly involved or affected by assigned deliverables.
Working on both the legacy Enterprise Data Warehouse, built on stored procedures and SSIS, and the Customer Data Warehouse, built using Informatica.
Extensive experience in Informatica coding, testing and code migration.
Hands-on experience resolving performance issues by tuning time-consuming ETL processes.
Working closely with Data Architect team to Design Conceptual Data Model.
Working with the Business Users to identify the key business rules for Business Intelligence reporting.
Translated Business rules and Functionality requirements into ETL procedures and BI Reports.
Developing Informatica mappings & Optimizing Query Performance, Session Performance.
Worked on Power Center client tools like Source Analyzer, Target Designer, Mapping Designer, Mapplet Designer and Transformations Developer.
Used transformations such as the Source Qualifier, Joiner, Expression, Aggregator, Connected & unconnected Lookups, Router, Sequence Generator, Sorter, Union, Filter and Update Strategy. Created reusable transformations for better code reusability.
Environment: Informatica PowerCenter 9.5, SQL Server2008 R2 Database / SSIS, Red Hat Enterprise LINUX Server release 6.4 and Windows Server 2008 R2 Standard.
NIKE, Inc. Beaverton, OR 04/12 – 08/13
Data Quality Analyst
Nike, Inc. is an American multinational corporation that is engaged in the design, development and worldwide marketing and selling of footwear, apparel, equipment, accessories and services. The company is headquartered near Beaverton, Oregon. It is one of the world's largest suppliers of athletic shoes and apparel and a major manufacturer of sports equipment, with revenue in excess of US$24.1 billion in its fiscal year 2012. The brand alone is valued at $10.7 billion making it the most valuable brand among sports businesses.
Planning Transformation – Data Quality
The SNP (Supply Network Planning) system at NIKE is being upgraded from legacy modules to SAP APO. To do so, existing data needs to be cleansed and processed into master data. The Informatica Analyst and Developer tools were identified as the best fit for cleansing and standardizing the data.
Worked as Data Quality Analyst using IDQ Informatica Data Quality tools.
Profiling SAP SNP data using the Informatica Analyst tool, applying rules to identify value frequencies, patterns and statistics of the data.
Cleansing contact data such as phone, email, gender and SSN.
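Rule-based contact cleansing of this kind can be sketched as a pair of standardize/validate functions. This is an illustration only, with simplified, US-centric patterns, not the actual IDQ rules used on the project.

```python
# Hedged sketch of rule-based contact-data cleansing, similar in spirit
# to IDQ cleanse/standardize rules: normalize phone numbers and flag
# invalid emails. Patterns are simplified, US-centric assumptions.
import re

def standardize_phone(raw):
    """Normalize a 10-digit US phone number; return None for rejects."""
    digits = re.sub(r"\D", "", raw or "")
    if len(digits) == 10:
        return f"({digits[:3]}) {digits[3:6]}-{digits[6:]}"
    return None  # route to exception handling / manual review

EMAIL_RE = re.compile(r"^[\w.+-]+@[\w-]+\.[\w.-]+$")

def is_valid_email(addr):
    """Cheap syntactic email check (not a full RFC 5322 validator)."""
    return bool(EMAIL_RE.match(addr or ""))

print(standardize_phone("310.555.1212"))   # -> (310) 555-1212
print(is_valid_email("user@example.com"))  # -> True
print(is_valid_email("not-an-email"))      # -> False
```

Rows that fail these checks would land in an exception table for review rather than flowing to the target, mirroring the exception handling described below.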
Address validation referencing Informatica address reference data.
Provide Data to Audit screen to give a holistic view of planning master data created & derived in Planning Master Data.
Identifying critical missing-data errors using IDQ (Informatica Data Quality) tools and preventing them from flowing to APO and Teradata.
Report Informational and Critical errors to Business Users by Profiling data and providing Dashboards in Informatica Analyst.
Creating/Applying Rules and Filters in Mapping Specification to validate the Data.
Used Lookup to reference data and Aggregator to summarize the data.
Created Profiles to identify the Patterns and different formats of Data.
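The pattern side of such a profile can be approximated with a simple character-class mapping. This is an illustrative sketch of the idea (digits to 9, letters to X), not the profiler's actual implementation; the sample values are made up.

```python
# Illustrative value-pattern profiler, approximating what a column
# profile reports: map each value to a pattern (digit -> 9,
# letter -> X, other characters kept) and count pattern frequencies.
from collections import Counter

def to_pattern(value):
    """Map a value to its character-class pattern string."""
    out = []
    for ch in str(value):
        if ch.isdigit():
            out.append("9")
        elif ch.isalpha():
            out.append("X")
        else:
            out.append(ch)
    return "".join(out)

values = ["90210", "90210-1234", "ABC12", "10001"]
profile = Counter(to_pattern(v) for v in values)
print(profile.most_common())
# -> [('99999', 2), ('99999-9999', 1), ('XXX99', 1)]
```

Outlier patterns with low frequency are the usual starting point for data-quality investigation.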
Migrated Informatica Analyst Mapping Specifications, Source and Target Objects to Informatica Developer and implemented additional rules in Power Center.
Environment: Informatica Analyst/Developer and PowerCenter 9.5, Teradata 13, Oracle 10g, TOAD, Shell Programming, SQL * Loader, UNIX, Windows 7/NT/XP.
Dun & Bradstreet, Parsippany, NJ 03/07 – 04/12
Business Intelligence Technical Lead/Developer
D&B is the world's leading source of commercial information and insight on businesses, enabling companies to Decide with Confidence® for 169 years. D&B's global commercial database contains more than 177 million business records. The database is enhanced by D&B's proprietary DUNSRight® Quality Process, which provides customers with quality business information. This quality information is the foundation of D&B global solutions that customers rely on to make critical business decisions. Below are the development and support projects worked on at D&B:
Custom Data Loads and BI Reporting:
Hoovers Data Migration
Global Customer Knowledge
Oracle Territory Management
Global Hygiene and Matching Services (GHM)
Requirements gathering and analysis; prepared business/functional requirement and design documents.
Conducted meetings with Internal Teams and gathered information about the requirements.
Interacted with end-users and involved in preparing Technical Design Documentations.
Translated business requirements into system/data requirements.
Analyzed data coming from multiple source systems, with BOLT (Back Office Legacy Transaction System) as the major source for billing and financial transactions and other sources such as flat files, Sybase IQ, Oracle and MS SQL Server relational tables, for creating dimensions and facts.
Built Informatica Data Quality profiles on objects and mapping specifications by applying rules to cleanse the data in Informatica Analyst 9.1 for the business, and developed code in the Informatica Developer thick client for the technology team.
Followed Informatica and Data Warehousing best practices in the Project.
Good Knowledge on Data Quality Lifecycles, Naming Convention, Version Control.
Implemented data quality rules.
Worked on design, development and implementation patterns covering cleanse, parse, standardization, validation and scorecards, and handled exceptions, notifications and reporting in Data Analyst.
Used the XML Source Qualifier transformation to extract data from XML files and the XML Generator transformation to write XML output.
Used XML Parser Transformation to Extract data from Salesforce Objects and load into Database tables.
Used shortcuts to reuse objects without creating multiple objects in the repository and inherit changes made to the source automatically.
Developed shell scripts, PL/SQL procedures for creating/dropping of indexes on tables for performance using pre and post session SQL.
Documentation to describe program development, logic, coding, testing, changes and corrections.
Created and Configured Workflows, Worklets and Sessions using Informatica Workflow Manager. Scheduling of jobs to run in the batch process.
Performance Tuning of sources, targets, mappings, transformations and sessions to optimize session performance (dropping & re-building indexes, partitioning on tables). Used PL/SQL Procedures and Functions to build business rules. Debugging Critical and complex mappings.
Wrote PL/SQL packages, triggers, functions and stored procedures in Oracle SQL Developer
Extensively involved in Code Deployment in QA, UAT and Production.
Created, scheduled & automated Informatica jobs using Control M Scheduler & Unix.
Performed Informatica administration activities including backup and recovery of Informatica repositories, installation and configuration of Informatica server and clients, and management of users, user locks and folder privileges.
Actively participated in data integration of data from different source systems. Worked with the testing team and performed unit, integration and regression testing to validate ETL and BI reports.
Environment: Informatica PowerCenter 8.6, SSIS, Business Objects XI R2, Teradata 13, Sybase IQ 15.3/12.7, Oracle 10g, Erwin 7.2, TOAD, MS SQL Server 2008/2005, PL/SQL, Shell Programming, SQL * Loader, UNIX, Windows 7/NT/XP.
Novartis, East Hanover, NJ 09/06 – 02/07
RR Donnelley 01/06 – 09/06
Mc Graw-Hill Companies 03/05 – 12/05
Consulting 10/02 – 02/05
Data Integration (ETL) Tools
Informatica Power Center 9.5/9.1/8.6/8.1/7.1/6.2, Talend, SQL Server 2008 SSIS, Unix Shell Scripts, Python, SQL Loader
Data Profiling & Data Quality Tools
IDQ: Informatica Analyst/Developer 9.5/9.1
Business Intelligence (Reporting) Tools
Tableau, OBIEE, Business Objects XI R3, Cognos, SSRS (SQL Server 2008)
Databases
Netezza, Teradata 13, Sybase IQ 15.3/13, Oracle 11g/10g/9i/8i, SQL Server 2008 / 2005 and DB2
Data Modeling Tools
ER Studio 9, Erwin 7.2/4.1
Job Scheduling Tools
Control M 6.3, Autosys
Version Control/Issue Tracker
Serena 12.2 / PVCS / Mercury Quality Centre
2016 – Big Data training from Hortonworks.
2012 - Informatica Certified Professional Data Integration Developer from Informatica Corp.
2002 - Master's in Computer Management from Pune University.
1999 - Bachelor of Commerce from Nagarjuna University.