Data Architect Modeler
Location: Exton, PA
Posted: April 11, 2023

PRASAN KUMAR NAVUDURI

Data Architect /Senior Data Modeler

Email: adwhs8@r.postjobfree.com

Summary:

12 years of overall experience in IT with emphasis on Data Modeling and Database engineering using industry-accepted methodologies and procedures.

Over 12 years of experience in designing data models for OLTP and OLAP database systems.

Strong technical skills in data modeling, data integration, and data warehousing.

Collaborated with stakeholders to ensure alignment of data architecture with business objectives and requirements.

Read JSON structures to understand the data elements when designing logical models.

Designed and implemented the organization's enterprise data architecture, including data models, data integration processes, and data warehousing strategies.

Worked on cloud data migration for Teradata, SQL Server, and Oracle.

Experience in Relational Data Modeling, Dimensional Data Modeling, and designing Hadoop tables.

Provided guidance and support to development teams in implementing data solutions in accordance with the data architecture.

Designed data warehouse tables following MDM processes such as identifying, defining, and managing the critical data entities shared across an organization.

Worked on event-driven architecture and designed tables based on JSON structures.

Worked on designing tables in Azure Data Lake.

Worked on creation of external table in Hive.

Worked with producers and consumers to create tables used by Kafka topics.

Worked on Databricks for data profiling.

Thorough understanding of System Development Life Cycle (SDLC) from Requirement analysis through Design, Development and Implementation.

Created external tables and views in Hive and Hive on HBase (an illustrative sketch follows this summary).

Strong expertise in designing data warehouses and data marts using dimensional modeling (star and snowflake) techniques.

Created high-level conceptual models that define the relationships between data entities.

Designed and implemented the organization's data architecture, including the development of data models and the establishment of data integration processes.

Extensive experience in Relational Data Modeling, Dimensional Data Modeling, Logical/Physical Design, ER Diagrams, Forward and Reverse Engineering, publishing Erwin diagrams, analyzing data sources, and creating interface documents; proficient in Relational Database Management Systems (RDBMS).

Strong RDBMS concepts and extensive experience creating database objects such as tables, views, sequences, and triggers.

Followed MDM processes such as data profiling, data cleansing, data integration, and data governance.

Experienced in Documenting Data Models, logic, coding, testing, changes and corrections.

Worked in Agile methodology and was involved in sprint planning and grooming.

Worked in both independent and collaborative work environments.
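
For illustration only, a minimal HiveQL sketch of the kind of external table and reporting view work referenced above; the table, columns, and HDFS path are hypothetical and not taken from any specific project listed here.

-- Hypothetical external table over delimited files already landed in HDFS
CREATE EXTERNAL TABLE IF NOT EXISTS sales_raw (
  order_id     STRING,
  customer_id  STRING,
  order_amount DECIMAL(18,2),
  order_ts     TIMESTAMP
)
PARTITIONED BY (load_dt STRING)
ROW FORMAT DELIMITED FIELDS TERMINATED BY ','
STORED AS TEXTFILE
LOCATION '/data/raw/sales';

-- Reporting view exposing only the columns business users need
CREATE VIEW IF NOT EXISTS sales_rpt AS
SELECT order_id, customer_id, order_amount, load_dt
FROM sales_raw;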

Technical Skills:

Data Modeling: Erwin, ER Studio, Visio

Skills: Data Modeling, Data Integration, Data Warehousing, Data Governance, Business Intelligence, Team Leadership, MDM.

Tools: Informatica PowerCenter 6.x/7.x/8.x, Workflow Manager, Workflow Monitor, Hue web-based query editor, Databricks

Languages: C, C++, SQL and PL/SQL, HiveQL

Databases: Oracle, SQL Server, DB2, Teradata 13.0, Apache Hadoop, Apache Kafka and Data Lake.

Operating Systems: Windows XP/2000/NT/98/95/2007, Sun Solaris, AIX

Experience:

Johnson & Johnson Feb 2022-Mar 2023

Data Architect /Senior Data Modeler

Skillman, NJ

Description: Worked as Data Architect/Data Modeler for the J&J retail supply chain project. Involved in dashboard design used for analysis and statistics on raw material shortages and logistics. Designed tables in Erwin for Azure Data Lake.

Designed the data model in multiple layers, such as ingest, core, and semantic layers, based on functionality, and provided the tables for Tableau reports.

Responsibilities:

Worked with product owners to gather business requirements.

Worked on source SAP tables to understand the source data.

Worked on cloud technologies to design the architecture.

Managed the design and implementation of data analytics solutions, including data visualization and reporting tools.

Worked on data profiling to understand the data and granularity using Databricks.

Design the enterprise logical model structure in multiple layers.

Design physical model based on the J&J standard template.

Created the core data model in Erwin based on business meetings.

Worked with the Data Governance team to update MDM documentation.

Designed the physical model with constraints following J&J standards (an illustrative DDL sketch follows this list).

Design multiple tables for unstructured data in Azure Data Lake.

Generated the DDL and added it to the source tree.

Worked with developers on data mapping.

Generated data model reports and defined transformation logic in mapping documents.

Merge the model to enterprise model mart using model compare.
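
A minimal sketch of the kind of physical-model DDL with constraints described in the list above; the core-layer table, column names, types, and the referenced parent table are hypothetical, not the actual J&J model.

-- Hypothetical core-layer table; names, types, and constraints are illustrative
-- Assumes a parent table core.material already exists
CREATE TABLE core.material_shortage (
  shortage_id   BIGINT        NOT NULL,
  material_id   BIGINT        NOT NULL,
  plant_code    VARCHAR(10)   NOT NULL,
  shortage_qty  DECIMAL(18,3),
  reported_dt   DATE          NOT NULL,
  CONSTRAINT pk_material_shortage PRIMARY KEY (shortage_id),
  CONSTRAINT fk_shortage_material FOREIGN KEY (material_id)
    REFERENCES core.material (material_id)
);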

Environment: Erwin 2021R, SQL Server, Teradata 13.0, SAP, Data Lake, Databricks, Data Factory, Bitbucket.

Vanguard Group, Malvern PA Sep 19-Feb 22

Data Architect /Senior Data Modeler

Description:

The Vanguard Group is an American registered investment advisor and the largest provider of mutual funds.

Vanguard offers brokerage services, educational account services, financial planning, asset management, and trust services. Designed data architecture across multiple teams and multiple databases.

Worked on the Peregrine Systems project, intended to create online installments by pulling data from Vista Equity Partners into DB2 so that the UI can use the data. Supported data modeling for the eCompliance web application.

The eCompliance web application is used by the Office of General Counsel's Legal and Compliance team.

Responsibilities:

Worked with business users to understand the business requirements and source systems.

Analyzed the source systems to capture the data elements.

Created the conceptual model and presented it to business users and the team to explain the data flow.

Worked on the model mart model in Erwin.

Supported the data model for Postgres on AWS.

Create logical data model and capture the metadata for entity and attributes.

Define the Entity relationship diagram.

Design physical model with data standards as per database.

Generate the tables as per the logic.

Run the query report in Erwin.

Create model report for the development team.

Created the source-to-target mapping document.

Created views to handle non-sensitive, sensitive, and highly sensitive data (an illustrative sketch follows this list).

Work with data governance team before the model is published to the model mart.
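
A minimal sketch of the sensitivity-tiered view approach mentioned in the list above; the schema, table, and column names are hypothetical, and the masking rule is only an example (Postgres/DB2-style SUBSTR).

-- Hypothetical view for general users: non-sensitive columns pass through,
-- while the highly sensitive tax_id is exposed only as a masked last-4 value
CREATE VIEW app.v_client_general AS
SELECT
  client_id,
  first_name,
  last_name,
  state_code,
  SUBSTR(tax_id, 6, 4) AS tax_id_last4
FROM app.client;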

Environment and tools: Erwin 2021R, SQL Server, Teradata, DB2, and Postgres.

Barclays US, Wilmington DE Sep 16-Aug 19

Data Architect /Senior Data Modeler

Description: Worked as Data Architect/Data Modeler for the Barclaycard US banking data warehouse. Involved in an integration project and designed data models across multiple databases, such as Oracle and Hadoop, and across multiple teams. Worked on the partner integration project and Marketing 360, where customer information is gathered for marketing campaigns and offers for existing customers. Marketing 360 is intended to understand potential risk on the customer, prevent external fraud, and provide insight into the customer.

Responsibilities:

Work with Business user to understand the business requirements and source system requirements.

Work with domain architect to understand the Events and Pay Load structure before designing the downstream process.

Analyze the source data from source systems to do data mining and data analysis to store the data in Hadoop.

Create Kafka topics for external domain events, capture the data using a Flume interceptor, and create tables or views for the business users.

Create Analytical tables for reporting purpose.

Create logical data object model and mapping which links objects in target model to data sources.

Design and implement data integration solutions.

Import metadata for sources, targets for mapping and describe structure of logical view data.

Run a profile to analyze structure and content to determine quality of data.

Develop mappings to implement data integration tasks.

Create table designs for Hive on HBase based on the nature of the data (an illustrative sketch follows this list).

Create and run workflows to perform a sequence of events, tasks, decisions based on business process requirements.

Create views to handle non-sensitive, sensitive, and highly sensitive data following Data Governance standards.

Create table designs for Hive and create views for the business for reporting.

Utilized Java, Oracle, SQL, PL/SQL, Erwin, Big Data, MongoDB, R, Scala, Hadoop, shell scripting, and Toad.
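
A minimal sketch of a Hive table mapped onto HBase, as referenced in the list above; the Hive and HBase table names and column family are hypothetical, while the HBaseStorageHandler and hbase.columns.mapping syntax are the standard Hive-HBase integration.

-- Hypothetical Hive table layered over an existing HBase table for low-latency lookups
CREATE EXTERNAL TABLE cust_event (
  row_key     STRING,
  event_type  STRING,
  event_ts    STRING,
  payload     STRING
)
STORED BY 'org.apache.hadoop.hive.hbase.HBaseStorageHandler'
WITH SERDEPROPERTIES ('hbase.columns.mapping' = ':key,e:event_type,e:event_ts,e:payload')
TBLPROPERTIES ('hbase.table.name' = 'cust_event');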

Environment and Tools: Erwin 2021R, SQL Server, Teradata 13.0, Jira, Oracle 11g, AQT tool, MongoDB, Hive, Hadoop, Stash.

QVC, West Chester PA Apr 15-Aug 16

Senior Data Modeler/Database design Analyst

Description: Project objective was to provide the capabilities (People, Process and Technology) that will allow QVC to easily start broadcasting / playing videos on new channels by reusing the product clips from previously aired or recorded shows. This project was established to create a low-cost method of creating content that can be distributed wherever needed in support of increasing QVC's presence in the content space. It outlined the capabilities needed to support the planning, creation, and management of playlists using pre-defined business rules, utilizing learnings from the launch and evolution of QVC Plus (QVC channel on the West Coast) in 2013 as well as QVC UK and digital channels: Beauty, Style, and Extra.

Responsibilities:

Participated in agile ceremonies namely Daily Scrums, Sprint Planning, Sprint Review and Sprint retrospectives.

Designed data models for multiple databases.

Worked with Business to gather requirements.

Analyze the source data as a part of requirements.

Design the conceptual data model and discuss with architecture team.

Design logical model based on the requirements and business rules.

Designed 3NF data model.

Create primary key and foreign key relationship.

Define entity, attributes and their relationship.

Assign user-defined properties to all the attributes.

Peer-reviewed the logical data model with the team.

Designed the physical data model.

Assign user-defined and standard properties to the tables and columns based on the database.

Create unique index for the tables and assign database standards.

Design physical model and generate DDL.

Create views on the tables based on business requirement.

Created mapping documents to the EDW.

Define detail level source to target mapping.

Discussed the source-to-target mapping with the ETL team and explained the transformations.

Environment: Erwin r9, SQL Server, Teradata 13.0, Jira, Oracle 10g, AQT tool.

Capital One, Wilmington DE Sept 12-Mar 15

Senior Data Modeler / Database design analyst

Worked as a data modeler for the Capital One banking data warehouse. Involved in an integration project and designed data models across multiple databases and across multiple teams. Worked on the MasterCard Digital Enablement Service project for digital cards.

Project 1: Bank Data Warehouse

Responsibilities:

Worked on the MasterCard Digital Enablement Service project.

Implementation is in Agile Methodology and involved in the sprint planning and grooming.

Involved in gathering business requirements from business analyst.

Involved in profiling the source data before designing the data model.

Based on the requirements, designed the conceptual data model and discussed it with the business analyst for approval.

Designed the logical data model after getting approval from the business analyst.

Designed data model for Data staging layer and banking data warehouse.

Design the 3NF data model if required, based on the requirements.

Create primary key and foreign key relationship.

Define entity, attributes and their relationship.

Assign user defined properties to all the attributes.

Create source-to-target mapping.

Responsible for data mapping and writing transformation rules.

Setup peer review meeting for data model approval across enterprise level.

Get enterprise level approval to the logical model.

Design physical model and generate DDL.

Develop DDL implementation plans.

Support DDL PROD implementation.

Support post-production validation of DDL.

Develop, test, and implement scripts, programs, and related processes to collect, manage, and publish DEV and PROD metadata from data models.

Promote data models into production metadata environments.

Sync the current data model with the packet (source-to-target mapping).

Create Metadata document for the data model.

Merge the subject area model to Master Model with Erwin compare tool.

Merge the data model to Enterprise logical data model.

Conduct review meeting with BSA, Ab Initio team and design document team.

Project 2: Home loans Marketing Campaign

Worked as a Data Modeler for Capital One Home Loans. One of the projects was for a mortgage marketing campaign, part of online advertisement from DoubleClick Search (DoubleClick is a subsidiary of Google that develops and provides Internet ad-serving services), which captures user activity on Capital One websites based on Capital One advertisements on the search and display networks.

Worked as part of a Sprint Team using agile methodology where planning and grooming of business requirements is part of the process.

Responsibilities:

Involved in gathering business requirements and translating them into functional specifications.

Interacted with users, application architects, & enterprise architects to clarify and build a functional data model.

Designed/captured the business processes & mapped them to the conceptual and logical data model.

Created and maintained Logical Data Model (LDM) / Physical Data Modeling. Includes documentation of all entities, attributes, data relationships, primary and foreign key relationships, allowable values, codes, business rules, glossary terms, etc.

Responsible for Source to Target data mapping and writing transformation rules.

Used Erwin to transform data requirements into data models.

Synchronize the current data model with the data model packet (Source to target mapping)

Define business metadata and partner with Data Governance team to define alignment with data definitions between business verticals.

Get enterprise-level approval of the logical model by going through peer reviews.

Design the physical model and generate DDL to hand over to the DBAs.

Merge the subject area model in the model mart into the Master Model within Erwin using the Complete Compare tool.

Update the Enterprise logical data model with the new data models.

Conduct review meetings with the BSA and AbInitio team in addition to working with the ETL developers until the model is ready for deployment.

Project 3: Basel II RWA (Risk-Weighted Assets)

Basel II is the second of the Basel Accords, which are recommendations on banking laws and regulations issued by the Basel Committee on Banking Supervision. Basel II, initially published in June 2004, was intended to create an international standard for banking regulators to control how much capital banks need to put aside to guard against the types of financial and operational risks banks face.

Responsibilities:

Worked on the RWA (risk-weighted assets) calculator.

Gather mapping document from business system analyst.

Analyze mapping document and communicate with BSA for additional information on source system.

Analyze the mapping document before designing the model for the staging area.

Build the third normal form (3NF) logical model based on requirements with the Erwin tool and assign UDPs (user-defined properties).

Generate Logical model in PDF format from packet tool.

Conduct review meeting with multiple teams.

Design physical model and generate DDL.

Forward Engineering to enhance the Data Structures of the Existing Database and Creating New Schemas.

Merge Subject area model to Main model in Model Manager.

Maintain subject areas updates on the model.

Generate data dictionary and maintain metadata.

Perform reverse engineering on the existing model when required.

Maintain versions of the model.

Environment: Erwin r7.3, Teradata 13.0, VersionOne, Oracle 10g, ClearCase, Oracle SQL Developer, Teradata SQL Assistant.

Wells Fargo, IA Jul’11-Aug’12

Senior Data Modeler / Database design analyst

Worked as a Data Modeler/Data Analyst; designed a loan processing system that takes in customer and mortgage loan data. Designed the data model for mortgage loan applications and analyzed the existing model for transformation and mapping.

Responsibilities:

Involved in gathering and translating business requirements.

Interacted with users, application architects, and enterprise architects to gather the requirements.

Involved in Full Project Life Cycle (requirement gathering to implementation phase to closing phase).

Designed/captured the business processes & then mapped them to the conceptual data model.

Created and maintained Logical Data Model (LDM) / Physical Data Modeling for the Order Provisioning system. Includes documentation of all entities, attributes, data relationships, primary and foreign key relationships, allowed values, codes, business rules, glossary terms, etc.

Responsible for designing and introducing new fact and dimension tables into the existing model.

Responsible for collecting information from SRS for data mapping.

Responsible for data mapping and writing transformation rules.

Developed the data model using Sybase PowerDesigner 15 and printed reports to support the data dictionary.

Created logical and physical models for the database and the dimension database in Oracle.

Normalized the data and developed the database in star and snowflake schemas (an illustrative sketch follows this list).

Built Data marts and developed procedures and packages for populating the data from the Data Warehouse.

Actively involved in Database designs discussions, identifying the Facts and Dimensions tables for building the Metadata Manager Data warehouse.

Wrote complex SQL and PL/SQL procedures, functions, and packages to validate data and support the testing process.

Provided help to fellow ETL developers, assisting them with troubleshooting.

Used Informatica PowerCenter for ETL: extraction, transformation, and loading of data from heterogeneous source systems.

Designed staging tables, to load data from various sources.

Good understanding of all the projects supported by the team and played an important role in production support.
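
A minimal Oracle-style sketch of the star-schema pattern referenced in the list above; the dimension, fact, and column names are hypothetical and not the actual Wells Fargo model.

-- Hypothetical star-schema fragment: one dimension and one fact table
CREATE TABLE dim_loan_product (
  loan_product_key  NUMBER         PRIMARY KEY,
  product_code      VARCHAR2(10)   NOT NULL,
  product_name      VARCHAR2(100)
);

CREATE TABLE fact_loan_application (
  application_key   NUMBER         PRIMARY KEY,
  loan_product_key  NUMBER         NOT NULL
    REFERENCES dim_loan_product (loan_product_key),
  application_dt    DATE           NOT NULL,
  requested_amount  NUMBER(18,2)
);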

Environment: Sybase PowerDesigner 15, Teamwork 6 portal, Jira, Toad 10.6, Oracle SQL Developer 1.5.59, Informatica PowerCenter 8.6.1, Source Analyzer, Data Warehouse Designer, Workflow Manager, Workflow Monitor, Oracle, PL/SQL.

ProAg, Eagan, MN Jan’10-Jul’11

Senior Data Modeler / Database design analyst

Worked as a Data Modeler; designed a complex order provisioning system that takes in customer and insurance policy data. Designed the data model for the insurance policy application and quotes application.

Responsibilities:

Developed the data model using ERWIN r8 & printed reports to support the data dictionary.

Developed Data models and ERD diagrams using ERWIN.

Built Physical Data models by Reverse Engineering the existing databases and modifying the Database structures as per business needs.

Forward Engineering to enhance the Data Structures of the Existing Database and Creating New Schemas.

Developed the data mart for the base data in star schema, snowflake schema, and multi-star schema, and was involved in developing the data warehouse for the database.

Conducted Design reviews / Code reviews / Test Plan reviews with other developers and architects throughout project lifecycle.

Developed Procedures and Packages for populating the data from a different Database.

Wrote complex SQL and PL/SQL procedures, functions, and packages to validate data and support the testing process.

Expertise in Data Analysis by writing complex SQL Queries.

Involved in application support and troubleshoot production problems of various applications.

Good understanding of all the projects supported by the team and played an important role in production support.

Normalized the data to 1NF, 2NF, and 3NF, as in the illustrative sketch below.
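
A minimal sketch of the normalization step above; the policy/agent example and all names are hypothetical, not the actual ProAg model.

-- Before (not 3NF): agent attributes repeated on every policy row
--   policy(policy_id, policy_holder, agent_id, agent_name, agent_phone)
-- After: agent attributes depend only on agent_id, so they move to their own table
CREATE TABLE agent (
  agent_id    INT           PRIMARY KEY,
  agent_name  VARCHAR(100)  NOT NULL,
  agent_phone VARCHAR(20)
);

CREATE TABLE policy (
  policy_id      INT           PRIMARY KEY,
  policy_holder  VARCHAR(100)  NOT NULL,
  agent_id       INT           NOT NULL REFERENCES agent (agent_id)
);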

Environment: Erwin 8.x, SQL Server, Visio, Informatica PowerCenter 8.6.1, Source Analyzer, Data Warehouse Designer, Workflow Manager, Workflow Monitor, PL/SQL.

Education and Certification:

Master's in Information Technology, Stratford University, VA

Bachelor of Computer Science, JNTU (Jawaharlal Nehru Technological University)


