
Data Analyst Modeler

Location: Charlotte, NC

Renuka Nalluru

ad438h@r.postjobfree.com, LinkedIn: linkedin.com/in/renuka-nalluru, 201-***-****

Data Governance Reporting Analyst / Data Modeler

I have around 5 years of experience as a Data Modeler/Technical Data Analyst, with strong analytical skills and strength in both traditional and modern cloud data technologies. This unique skill set is what I have achieved in my career to date and what I bring to the table. I am quick to explore and learn new data tools based on project needs.

CAREER SUMMARY

·5 years of experience in analysis, design, development, support, migration, and testing of various data applications in the Healthcare, Service, and Banking/Finance domains.

·Proficient in Software Development Life Cycle (SDLC) methodologies and database design for Online Transactional Processing (OLTP) and Online Analytical Processing (OLAP).

·Experience in BI tools such as MicroStrategy, IBM Cognos, and SAP BO for ad-hoc reporting and data visualization, along with Tableau and Power BI.

·Experience in creating data governance policies, business glossaries, data dictionaries, reference data, metadata, data lineage, and data quality rules.

·Analyzed meter data in MDM, generated reports, identified data mismatches, and proposed data-fix solutions.

·Designed normalized (3NF) conceptual/logical data models for the MDM Hub landing, processing, and publish layers.

·4 years of experience in the IT industry across data integration and data warehousing, using ETL tools such as Informatica PowerCenter 10.2/9.6/9.1/8.6, Informatica PowerExchange 10.2/9.6, and Informatica Intelligent Cloud Services (IICS).

·Managed workflow processes in Collibra via Activiti.

·Experienced data modeler with strong conceptual, logical, and physical data modeling skills; maintained data quality, created data mapping documents, and wrote functional specifications and queries.

·Experience in implementing complex business rules by creating reusable transformations and developing complex Mapplets and Mappings, PL/SQL stored procedures, and triggers.

·Experience in creating ETL design documents; strong experience with complex PL/SQL packages, functions, cursors, indexes, views, and materialized views.

·Experience in designing Star Schema and Snowflake Schema for data warehouse and ODS architectures.

·Well versed in normalization/denormalization techniques for optimum performance in relational and dimensional database environments.

·Experience in relational data modeling (3NF) and dimensional data modeling.

·Hands-on experience in writing T-SQL queries, stored procedures, user-defined functions (UDFs), cursors, derived tables, views, and triggers (a brief sketch follows this summary).

·Experience using ETL tools such as Alteryx, Excel, Power BI, and DataStage, with Tableau for visualizing the expected end result.

·Experience in development, support, and maintenance of ETL (Extract, Transform, Load) processes, writing SQL queries to perform end-to-end ETL validations, and supporting ad-hoc business requests.

·Experienced in data analysis, mapping source and target systems for data migration efforts, and resolving data migration issues in MS Access.

·Experience in creating triggers, tables, stored procedures, functions, views, indexes, and constraints in T-SQL.

·Experienced in extracting, transforming, and loading (ETL) data from spreadsheets, database tables, and other sources.
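
To make the T-SQL bullet above concrete, here is a minimal, hedged sketch of a user-defined function and a stored procedure of the kind described; all object names (dbo.Claims, dbo.Members, and the rest) are hypothetical, invented for illustration only:

    -- Hypothetical schema: dbo.Claims(MemberId, ClaimAmount), dbo.Members(MemberId, MemberName).
    -- Scalar UDF: total claim amount for one member.
    CREATE FUNCTION dbo.ufn_TotalClaimAmount (@MemberId INT)
    RETURNS DECIMAL(12,2)
    AS
    BEGIN
        RETURN (SELECT COALESCE(SUM(ClaimAmount), 0)
                FROM dbo.Claims
                WHERE MemberId = @MemberId);
    END;
    GO

    -- Stored procedure: members whose total claims exceed a threshold.
    CREATE PROCEDURE dbo.usp_HighClaimMembers (@Threshold DECIMAL(12,2))
    AS
    BEGIN
        SET NOCOUNT ON;
        SELECT m.MemberId,
               m.MemberName,
               dbo.ufn_TotalClaimAmount(m.MemberId) AS TotalClaims
        FROM dbo.Members AS m
        WHERE dbo.ufn_TotalClaimAmount(m.MemberId) > @Threshold;
    END;
    GO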

Education:

Bachelor's in Computer Science and Engineering, Karunya University

Master of Computer Science, New York Institute of Technology

Technical Skills:

Databases: Oracle, Snowflake

Big Data Technologies and ETL Tools: Talend, traditional Informatica, IICS, ER/Studio V8.0.1, Erwin r7.1/7.2, Alteryx, SSIS (intermediate)

Tools and Languages: SQL, PL/SQL, PowerShell, Python and R (intermediate); Toad, SQL Developer, Visual Studio, Collibra, Confluence, MDM

Data Visualization & Business Intelligence Tools: Tableau (user), Power BI (developer), SAP BO, IBM Cognos, MicroStrategy, ad-hoc reporting

Operating System: Windows

Functional Areas: Finance/Insurance, Healthcare

Professional Experience:

03/2021 – 01/2024

Senior Data Governance Reporting Analyst / Data Modeler - U.S. Bancorp, NC

Project 1: MAD Data Sprawl Remediation Initiative – Senior Data Governance Reporting Analyst

·Worked with subject matter experts (SMEs) to understand the SOR details and how the data entities are defined.

·Worked with EAA/BL users to determine the MAD entity approvals on the requirements document for various SORs.

·Analyzed the data already present in the Marketing Analytics Datamart (MAD) in order to export the same data entities to a new data mart.

·Wrote complex SQL queries as part of data validation for the mappings, referring to sample datasets from the IT and PROD environments in the Toad data modeling tool.

·Gathered requirements from the client business team and helped the business analyst and data modelers create the Business Requirement Document and design the data models, respectively.

·Implemented data governance using Excel and Collibra.

·Performed an end-to-end data lineage assessment and documentation for select CDEs.

·Developed and configured Collibra workflows based on the MS Data Governance approach and requirements.

·Analyzed data lineage processes to identify vulnerable data points, control gaps, data quality issues, and overall lack of data governance.

·Updated the business requirement documents in the Customer 360 portal so they are accessible to the entire team.

·Designed and created logical and physical data models using Erwin, resulting in improved data organization and accessibility for the organization's data-driven initiatives.

·Used forward engineering to create a physical data model with DDL that best suits the requirements of the logical data model.

·Used Erwin's Model Mart for effective model management, enabling sharing, dividing, and reusing model information and designs to improve productivity.

·Created data sharing between two Snowflake accounts (a brief sketch follows this project).

·Created internal and external stages and transformed data during load; redesigned views in Snowflake to increase performance.

·Good working knowledge of ETL tools (Informatica and SSIS).

·Created Talend mappings to populate the data into dimension and fact tables.

·Analyzed the data present in the Marketing Analytics Datamart and transferred it without gaps from the Enterprise Landing Zone (ELZ) to Analytics Customer360 (AC360).

·Worked with the engineering team to review all fields and transformation logic, query sample data for all source columns, and provide SQL logic using joins for data model generation.
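
The Snowflake bullets above can be illustrated with a short, hedged SQL sketch; the share, database, stage, and account names are hypothetical placeholders, not the actual project objects:

    -- Provider account: share a table with a second Snowflake account.
    CREATE SHARE mad_share;
    GRANT USAGE ON DATABASE mad_db TO SHARE mad_share;
    GRANT USAGE ON SCHEMA mad_db.public TO SHARE mad_share;
    GRANT SELECT ON TABLE mad_db.public.customer_dim TO SHARE mad_share;
    ALTER SHARE mad_share ADD ACCOUNTS = consumer_org.consumer_acct;

    -- Internal stage, then a COPY that transforms data during load.
    CREATE STAGE mad_db.public.landing_stage
      FILE_FORMAT = (TYPE = CSV SKIP_HEADER = 1);
    COPY INTO mad_db.public.customer_dim (customer_id, full_name)
      FROM (SELECT $1, UPPER($2) FROM @mad_db.public.landing_stage)
      PATTERN = '.*customers.*[.]csv';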

Project 2: Union Bank Conversion Risk Compliance – Senior Business Analyst

·Gathered requirements from the Union Bank business line team and conducted reviews to discuss scope and resource allocation for the required timelines.

·Worked with junior business analysts, allocated the needed stories based on the timeline, and signed off on the essential work performed.

·Created the Vault workspace connection in Info Analyzer for data quality checks.

·Created data rule definitions and the required data rules for the identified key business elements of the data assets.

·Validated the test results of the performed validations.

·Obtained PM sign-off for the required Vault intake of Union Bank data.

·Validated restored DB counts against control file counts from the IT environment in SSMS (a brief sketch follows this project).

·Ran complex SQL queries in SSMS to validate between the two sets of entities.

·Created Power BI dashboards for the reporting service.

·Extracted, transformed, and loaded data from various sources into the data warehouse using Microsoft SQL Server Integration Services (SSIS).

·Developed ad-hoc queries and stored procedures to fulfill data requests from business stakeholders.

·Collaborated with the data modeling team to design and implement efficient data structures for reporting and analysis purposes.

·Assisted in the creation of interactive dashboards and visualizations using Tableau and Power BI to present data-driven insights to key decision-makers.

·Conducted data validation and cleansing activities to ensure data accuracy and consistency.

·Used Cognos because it supported U.S. Bank's leading databases and multiple data source options, enabling effective content management and reporting strategies.

·Connected and cleansed the data to ensure better quality and insights.

·Assembled and integrated multi-page reports easily via Cognos.
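
As a hedged illustration of the count validation described above, the following T-SQL compares restored table row counts against expected counts loaded from the conversion control file; the table and column names (dbo.ControlFileCounts, dbo.Accounts, dbo.Transactions) are hypothetical:

    -- Flag tables whose restored row count differs from the control file count.
    SELECT c.table_name,
           c.expected_count,
           t.actual_count,
           t.actual_count - c.expected_count AS variance
    FROM dbo.ControlFileCounts AS c
    LEFT JOIN (
        SELECT 'Accounts' AS table_name, COUNT(*) AS actual_count
        FROM dbo.Accounts
        UNION ALL
        SELECT 'Transactions', COUNT(*)
        FROM dbo.Transactions
    ) AS t
      ON t.table_name = c.table_name
    WHERE t.actual_count IS NULL          -- table missing after restore
       OR t.actual_count <> c.expected_count;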

Project 3: ORM Data Analytics – California Privacy Rights Act (CPRA) – Data Modeler

·Gathered requirements from stakeholders and conducted reviews to discuss scope and resource allocation for the required timelines.

·Created the sample questionnaire template and met with business resources to gather all CPRA-related customer information for the rolling 12 months.

·Created data rule definitions and the required data rules for the identified key business elements of the data assets.

·Created mapping documents with detailed source-to-target transformation logic and source and target data column information.

·Connected to the source database and executed the code to match the questionnaire or the analysis document.

·Designed star and snowflake data models for the EDW using ER/Studio.

·Validated and updated the appropriate LDMs against process mappings, screen designs, and use cases as they evolved and changed.

·Created the conceptual model for the data warehouse using Erwin.

·Designed fact tables and dimension tables for the CPRA data mart to support all the gathered requirements using ER/Studio (a brief DDL sketch follows this project).

·Performed code alignment reviews and created GitLab wiki pages.

·Designed the Kanban boards and followed the Sprint methodology to track all requirements for the resources accordingly.

·Utilized Apache Spark and Scala to process and analyze large-scale datasets, improving data processing efficiency by 20%.

·Conducted data cleaning, transformation, and validation to ensure data quality and accuracy for downstream analysis.

·Developed and maintained complex Spark applications to extract insights from structured and unstructured data sources.

·Collaborated with cross-functional teams to understand business requirements and translate them into actionable data analyses.

·Implemented performance tuning strategies, optimizing Spark jobs for improved processing times and resource utilization.

·Worked in a Linux environment, using shell scripting to automate routine data processing tasks and job scheduling.

·Designed and executed ad-hoc queries and reports, providing timely and relevant insights to business stakeholders.

·Contributed to the development of data visualization dashboards using Tableau, enhancing data-driven decision-making.

·Conducted exploratory data analysis (EDA) to identify trends, patterns, and outliers, facilitating strategic decision-making.

·Regularly updated technical documentation for Spark applications, ensuring knowledge transfer within the team.

·Designed, developed, and implemented ETL processes using IICS Data Integration.

·Created IICS connections using various cloud connectors in the IICS Administrator.

·Installed and configured the Windows Secure Agent and registered it with the IICS org.

·Extensively used performance tuning techniques while loading data into Azure Synapse using IICS.

·Extensively used cloud transformations: Aggregator, Expression, Filter, Joiner, Lookup (connected and unconnected), Rank, Router, Sequence Generator, Sorter, Update Strategy, and Union.
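
For the CPRA fact and dimension design mentioned above, a minimal, hedged star-schema DDL sketch follows; the table and column names are invented for illustration and do not reflect the actual data mart:

    -- One dimension per consumer and per CPRA request type; one fact row per request.
    CREATE TABLE dim_customer (
        customer_key  INT IDENTITY(1,1) PRIMARY KEY,
        customer_id   VARCHAR(20) NOT NULL,
        state_code    CHAR(2)     NOT NULL
    );

    CREATE TABLE dim_request_type (
        request_type_key INT IDENTITY(1,1) PRIMARY KEY,
        request_type     VARCHAR(40) NOT NULL  -- e.g. access, deletion, opt-out
    );

    CREATE TABLE fact_cpra_request (
        request_key      BIGINT IDENTITY(1,1) PRIMARY KEY,
        customer_key     INT  NOT NULL REFERENCES dim_customer (customer_key),
        request_type_key INT  NOT NULL REFERENCES dim_request_type (request_type_key),
        request_date     DATE NOT NULL,
        days_to_close    INT  NULL
    );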

06/2019 – 02/2021

Senior Data Modeler – Healthcare, Blue Cross Blue Shield of Kansas City, MO

·Worked on defects and enhancements of preexisting SSIS ETL jobs.

·Acted as Scrum Master and lead developer for the team whenever needed.

·Built basic Power BI reports on user activity using Azure Databricks as a source.

·Used Alteryx to build basic debugging pipelines.

·Loaded data from different sources, performed data visualization and data quality checks, and used descriptive statistics and graphs such as bar charts, histograms, and pie charts.

·Analyzed various jobs and source files using IBM DataStage Designer, ran various complex SQL queries, and made changes per the business requirements.

·Worked with the Codes and Data Mapping application to perform mapping changes and transformation rule changes.

·Worked with data architects to push and verify the models in the Codes and Data Mapping application in MS Access, and worked with ETL developers to forward the analysis and update the development work needed per the business requirement.

·Worked extensively with tools such as Alteryx, Power BI, and Excel to connect to the data exploration platform, where data assets can be accessed, blended, and cleaned from any source.

·Served as cross-functional liaison for product direction, bug fixes, and enhancements using Visual Studio Team Foundation Server (TFS).

·Analyzed source data to assess data quality using Talend Data Quality.

·Broad design, development, and testing experience with Talend Integration Suite, with knowledge of performance tuning of mappings.

·Developed jobs in Talend Enterprise Edition from stage to source, intermediate, conversion, and target.

·Wrote SQL queries and used joins to access data from Oracle and MySQL.

·Developed a standard ETL framework to enable reusability of similar logic across the board; involved in system documentation of the dataflow and methodology.

·Extensively developed low-level designs (mapping documents) by understanding the different source systems.

·Designed complex mappings, sessions, and workflows in Informatica PowerCenter to interact with the MDC and EDW; designed and developed mappings to implement full/incremental loads from the source system.

·Responsible for ETL requirement gathering and development with end-to-end support.

·Responsible for coordinating the DB changes required for ETL code development.

·Responsible for ETL code migration, DB code changes, and scripting changes to higher environments.

·Responsible for supporting the code in the production and QA environments.

·Developed complex IDQ rules that can be used in batch mode.

·Developed an Address Validator transformation in IDQ to be used in Informatica PowerCenter mappings.

·Extensive experience in integrating Informatica Data Quality (IDQ) with Informatica PowerCenter.

·Worked closely with the MDM team to identify the data requirements for their landing tables and designed the IDQ process accordingly.

·Extensively used transformations such as Router, Lookup, Source Qualifier, Joiner, Expression, Sorter, XML, Update Strategy, Union, Aggregator, Normalizer, and Sequence Generator.

·Recognized subject matter expert in interoperability standards: HL7, CDA (Clinical Document Architecture), and FHIR (Fast Healthcare Interoperability Resources).

·Recognized for combining technical leadership with verbal and written communication skills in creating working systems.

·Worked extensively on report development and maintenance projects in Power BI.

01/2019 – 05/2019

Research Assistant – Data Analytics, New York Institute of Technology

·Performed data profiling and data analysis on different sources.

·Applied data validation around the performed data analysis and gap analysis.

·Wrote extensive SQL queries against the source and target entities, applied multiple joins, and tested the data (a brief sketch follows this section).

·Built basic Tableau dashboards and reports.

·Worked with the data modeling team to create or update the workspace in Erwin, assign naming conventions to the entities, and generate logical or physical data models.
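
As a hedged example of the source-to-target testing above, this query finds rows present in a source table but missing from the target after a load; the src/tgt schemas and the customers table are hypothetical:

    -- Gap analysis: source rows with no matching target row.
    SELECT s.customer_id, s.customer_name
    FROM src.customers AS s
    LEFT JOIN tgt.customers AS t
      ON t.customer_id = s.customer_id
    WHERE t.customer_id IS NULL;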


