
Yashwantej DS

216-***-**** **************@*****.***

https://www.linkedin.com/in/yashwantejdata/

Senior Data Modeler/Analyst

PROFESSIONAL SUMMARY

● Results-oriented, solutions-focused Senior Data Modeler/Analyst with 8+ years of experience in the analysis, design, development, testing, and enhancement (SDLC) of enterprise data models (Erwin) and analytical solutions using Power BI, Tableau, Cognos, QlikView/Qlik Sense, and other BI tools, along with data analytics and data engineering roles. Proven skills in requirement gathering, gap analysis, change management, user acceptance testing, data analysis, business analysis, ETL testing, and BI testing in the healthcare/pharmaceutical, finance, and energy sectors.

● Experience in developing Business Intelligence assets using tools such as Informatica PowerCenter, Informatica Data Quality, OBIEE, Tableau, Oracle, and others.

● Proficient in Business Intelligence, Oracle, database management, Informatica, and SQL Developer.

● Experience working with data modeling tools like Erwin, Power Designer, and ER Studio.

● Experience in designing star and snowflake schemas for data warehouse and ODS architectures.

● Designed the Data Marts in dimensional data modeling using star and snowflake schemas.

● Designed dimensional data models following Kimball Methodology principles to support business reporting and analytics requirements.

● Created comprehensive documentation for data models, data dictionaries, and design specifications.

● Very proficient with Data Analysis, mapping source and target systems for data migration efforts, and resolving issues relating to data migration.

● Experience in Data Scrubbing/Cleansing, Data Quality, Data Mapping, Data Profiling, Data Validation in ETL.

● Experience with medical claims data (claims, member, provider, clinical, etc.).

● Strong understanding and experience in user acceptance testing, smoke testing, regression testing, performance testing, and functional testing.

● Involved in IT business analytics, data analytics, and data profiling tasks with businesses to enhance data accuracy and results.

● Expertise in various software development life cycles, including Waterfall and Agile methodologies.

● Experience with Hadoop stack and big data analytic tools, including Hive, Spark, HDFS, Sqoop, Kafka, and Airflow.

● Extensive experience with analytical and data visualization tools and techniques, including Tableau and Power BI.

● Experience in data management using Collibra for enterprise data governance.

● Good knowledge of recommending and implementing claims policies and procedures.

● Sound knowledge of health insurance products, healthcare claims, and policy administration operations.

● Experienced in modeling OLTP and OLAP systems using tools such as Erwin r9.6/9.1/r8/r7.1/7.2, Sybase PowerDesigner 12.1, and ER Studio.

● Integrated data from diverse source systems into the data warehouse using Kimball's data integration techniques.

● Experience in designing and developing SSIS packages for loading data from Oracle, text files, Excel, and Flat files to SQL Server database.

● Familiarity with ETL (Informatica, DataStage, Talend) and ELT processes, implementing data operations and solutions per the data strategies.

● Experienced in Agile and Waterfall SDLC methodologies, preparing status reports, running status meetings and scrum calls, team management, and offshore/onsite delivery models.

● Good experience in creating semantic layers for data modeling and reporting on AtScale and Dremio acceleration tools to improve the performance of the developed dashboards.

● Experience in data analysis, data profiling, data validation, data cleansing, data verification, data mining, data mapping, and identifying data mismatches.

● Skilled in writing HiveQL and Spark SQL queries for managing and analyzing large datasets.
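
As a purely illustrative sketch of the kind of query described above (the claims table and its columns are hypothetical, not drawn from any client engagement), a Spark SQL/HiveQL aggregation might look like:

    -- Hypothetical claims table: claim volume and average value per provider.
    SELECT provider_id,
           COUNT(*)          AS claim_count,
           AVG(claim_amount) AS avg_claim_amount
    FROM claims
    WHERE service_year = 2023
    GROUP BY provider_id
    ORDER BY claim_count DESC
    LIMIT 20;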

● Experience in creating BI reports using Cognos and Tableau, including dashboards, summary reports, master-detail reports, drill-down reports, and scorecards.

● Developed Tableau dashboards using stacked bars, bar graphs, scatter plots, geographical maps, and Gantt charts.

● Excellent knowledge of Health Insurance Portability and Accountability Act (HIPAA) standards and compliance issues. Collaborated with cross-functional teams to implement fraud detection algorithms and methodologies, improving real-time detection and prevention systems.

● Analyzed large datasets to detect potential fraud patterns and anomalies, leveraging statistical analysis and SQL to enhance fraud detection processes. Applied cybersecurity principles and fraud detection algorithms to design automated workflows, improving the integrity of data in fraud analysis.

● Good command of normalization/denormalization procedures for effective and optimal performance in OLTP and OLAP environments.

● Proficient in System Analysis, ER/Dimensional Data Modeling, Database design, and implementing RDBMS-specific features.

● Good exposure to different business domains, including Banking & Finance, Sales & Retail, and Health & Insurance.

● Experience working with multiple relational databases, including DB schema creation in Oracle 9i/10g, SQL Server, MySQL, and IBM DB2.

● Extensive experience in gathering business/functional user requirements, creating use cases, and developing/designing diagrams such as activity, class, and sequence diagrams, along with creating Business Requirements Documents (BRD).

● Involved in Facets system implementation; electronic claims and benefits configuration setup testing; inbound/outbound interfaces and extensions; and load and extraction programs involving HIPAA and proprietary format files and report development.

● Worked with providers and Medicare/Medicaid entities to validate EDI transaction sets and internet portals.

● Defined AS-IS and TO-BE processes and developed requirements through gap analysis.

● Knowledge of cloud technologies such as Snowflake, Salesforce, and Azure (Cloud Developer & Architect).

TECHNICAL SKILLS

Certification: Google Data Analytics Professional Certificate.

Micro-Certifications: Data Cleaning, Data Mining, Advanced Excel, Predictive Analytics, Data-Driven Decision Making, Data Visualization, Infrastructure Design (including schema design and dimensional data modeling), Data Security, Analytical Thinking, Competitive Advantages, Data Analytics, Data Science, Machine Learning, Impact Assessment, Risk Management, and Solution Development.

DATA MODELING & BI REPORTING TOOLS:

• Erwin (Data Modeler), Alteryx.

• Informatica PowerCenter 9.5/9.1.x/8.6.x/7.x (Designer, Workflow Manager, Workflow Monitor).

• Tableau (Desktop, Server), Tableau Prep, Tableau Public

• Power BI (Desktop, Service), Power Apps, Power Automate, SSRS.

• Cognos (Report/Query/Analysis Studios), Framework Manager (FM), Cognos TM1.

• Jupyter Notebook, Apache Hadoop, Spark, AWS S3, AWS Lambda, Excel, Power Query.

• QlikView (Desktop, QMC), QlikSense.

• SAP BO (BusinessObjects), MicroStrategy, and other BI tools (Looker, Domo, Spotfire, etc.).

DATABASE & ACCELERATOR TOOLS:

• SQL Server 2005/2008, MySQL, IBM DB2, T-SQL, SAP HANA, Hadoop Hive, Oracle 9i/10g/11g, Teradata, Sybase 11.5, PL/SQL.

• PostgreSQL, MongoDB, AWS Redshift, Google BigQuery.

• Dremio, Jethro, AtScale.

• SQL*Loader, SQL*Plus, TOAD, SQL Developer, Erwin, Jira, Trello.

LANGUAGES & APPLICATIONS:

• SQL, PL/SQL, C/C++, Java, Python, NoSQL, T-SQL, Unix shell scripting, R, JavaScript, HTML5, CSS3.

• Confluence, GitHub, Bitbucket, uDeploy, TeamCity, CI/CD pipelines.

• Jira, ServiceNow, Trello.

OPERATING SYSTEMS: • Windows 2000/XP/NT/Vista.

• Unix/Linux, DOS, Windows Server 2003.

ETL TOOLS & CLOUD: • Informatica, DataStage, Talend, Snowflake.

• Salesforce, GCP, Azure (Cloud Developer & Architect).

PROFESSIONAL EXPERIENCE

Client: Johnson & Johnson, Bridgewater, NJ.

Senior Data Modeler September 2024 – Present

Responsibilities:

● Played a key role in the System Development Life Cycle (SDLC) process, including design, gap analysis, business requirements, systems requirements, test criteria, and implementation for projects automating correspondence for insurance policy owners.

● Utilized Agile methodology to implement project life cycles for report design and development.

● Developed the logical data models and physical data models that capture current state/future state data elements and data flows using ER Studio.

● Developed a Conceptual model using Erwin based on requirements analysis.

● Used Erwin for reverse engineering to connect to existing databases and ODS to create graphical representation in the form of Entity Relationships and elicit more information.

● Involved in Data Mapping activities for the data warehouse.

● Produced PL/SQL statements and stored procedures in DB2 for extracting as well as writing data.
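
As a minimal, hypothetical sketch of such a routine (the table and column names below are illustrative only, not from the actual project), a DB2 SQL PL extract procedure could look like:

    -- Copies claim rows on or after a given date into an extract table.
    CREATE PROCEDURE EXTRACT_CLAIMS (IN p_from_date DATE)
    LANGUAGE SQL
    BEGIN
      INSERT INTO CLAIM_EXTRACT (CLAIM_ID, MEMBER_ID, CLAIM_AMOUNT, SERVICE_DATE)
        SELECT CLAIM_ID, MEMBER_ID, CLAIM_AMOUNT, SERVICE_DATE
        FROM CLAIM
        WHERE SERVICE_DATE >= p_from_date;
    END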

● Ensured production data was replicated into the data warehouse without data anomalies from the processing databases.

● Experience working with Azure Blob and Data Lake storage and loading data into Azure Synapse Analytics (DW).

● Worked as OLTP Data Architect & Data Modeler to develop the Logical and Physical 'Entity Relational Data Model' for Patient Services system with entities & attributes and normalized them up to 3rd Normal Form using ER/Studio.

● Worked on data cleaning, data visualization, and data mining using SQL and Python.

● Designed graph data models for serverless architectures using MongoDB, AWS Lambda and Amazon DynamoDB.

● Designed and implemented data models for AWS Data Lakes using services like Amazon S3 and AWS Glue DataBrew.

● The data model was developed with PARTY, ADDRESS, PROVIDER, DRUG, CLAIM & MEMBERSHIP entities.

● Created and maintained documentation for AWS data models, including data dictionaries and metadata.

● Participated in the Data Governance working group sessions to create Data Governance Policies.

● Linked data lineage to data quality and business glossary work within the overall data governance program.

● Experience in extracting, transforming, and loading data from multiple sources into target databases using Azure Databricks, Azure SQL, PostgreSQL, NoSQL stores, and Oracle.

● Loaded the tables from the DWH to Azure Data Lake using the Azure Data Factory integration runtime.

● Developed star and snowflake schema-based dimensional models to build the data warehouse.

● Created Physical Data Model from the Logical Data Model using Compare and Merge Utility in ER/Studio and worked with the naming standards utility.

● Reverse Engineered DB2 databases and then forward engineered them to Teradata using ER Studio.

● Provided source to target mappings to the ETL team to perform initial, full, and incremental loads into the target data mart.

● Responsible for managing and administering Tableau Online/Tableau Server instances in the cloud, including user access, performance monitoring, and scalability.

● Created different chart types/visualizations in Power BI and Tableau, including bar, line, pie, area, treemap, donut, scatter plot, waterfall, and geo map charts, per use-case requirements.

● Migrated the existing BO (LOB) reports to Cognos Report Studio to evaluate the performance of the existing reports, and supported report enhancements and upgrades per client requirements.

● Created and developed data models, reports, and dashboards; used Power Query to clean, transform, and load data and DAX formulas to create calculated columns, measures, and tables; and optimized data models and reports for performance.

● Optimized the performance of data queries within AtScale, focusing on reducing latency and improving efficiency.

● Designed and implemented virtual data cubes within AtScale, ensuring they meet the analytical needs of the business.

● Responsible for data exploration and analysis using the semantic layer provided by AtScale, deriving insights and trends from the data.

● Responsible for Tableau deployments from Dev to UAT to Prod servers via a CI/CD process using GitHub, Eclipse, TeamCity, and UCD (UrbanCode Deploy) for code deployment.

● Involved in all the deployments from one environment to go live as well as Support & Maintenance in the Server.

● Responsible for delivery of assigned modules/components/phases of a project, status reporting, guiding the development team, estimation, planning and execution, knowledge transfer, and arriving at SLAs for steady-state support.

● Responsible for migrating data from on-premises systems to the Snowflake cloud per the data strategy plan.

● Monitored app performance, usage, and adoption, providing reports to management; developed automated workflows using Power Automate to streamline business processes; analyzed existing workflows and optimized them for performance and efficiency; and led the implementation of Power Apps and Power Automate solutions, including customization and integration with other systems.

● Actively participated in Scrum ceremonies (daily stand-ups, retrospectives, backlog grooming, product demos, etc.).

● Responsible for the installation, configuration, and maintenance of the AtScale platform; managed user access, permissions, and security settings within AtScale to protect sensitive data; and integrated AtScale with various data sources and BI tools to create a seamless data environment.

● Designed and implemented data models that AtScale uses to create virtual cubes, ensuring the models are optimized for performance and scalability.

● Responsible for migrating the data and data models from the SQL server environment to the Oracle 10g environment.

● Worked closely with the ETL SSIS developers to explain the complex data transformation logic.

● Extensively worked on the naming standards incorporated into the enterprise data modeling.

● Generated comprehensive analytical reports by running SQL queries against current databases to conduct data analysis.

● Collected business requirements to define data mapping rules for data transfer from source to target systems.

● Utilized AWS services to store and analyze large datasets, reducing query times and enabling efficient data-driven decision-making.

● Worked with specialists to migrate data into DWH and was involved in data modeling.

● Created logical and physical data models using Erwin for relational (OLTP) and dimensional (OLAP) databases with Star schema for fact and dimension tables.

● Gathered requirements and modeled the data warehouse and transactional databases, conducting a gap analysis to identify and validate requirements.

● Built and customized interactive reports and dashboard reports using Tableau.

● Managed Tableau Administration tasks, including configuration, adding users, managing licenses, scheduling tasks, and embedding views. Created SQL logic for aggregate views in Snowflake DB.
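
A minimal sketch of the kind of aggregate-view SQL described above, in Snowflake syntax (the fact table and its columns are hypothetical placeholders), might be:

    -- Monthly claim totals per member, exposed as a view for reporting.
    CREATE OR REPLACE VIEW claims_monthly_agg AS
    SELECT member_id,
           DATE_TRUNC('month', service_date) AS service_month,
           COUNT(*)                          AS claim_count,
           SUM(claim_amount)                 AS total_claim_amount
    FROM claims
    GROUP BY member_id, DATE_TRUNC('month', service_date);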

● Documented data sources and transformation rules for populating and maintaining data warehouse content.

● Designed use cases and process flow models using Visio and Rational Rose.

Environment: Jira (Atlassian), Collibra, Power BI, Snowflake, SQL Server (SSIS, SSAS), MS Excel, GitHub, T-SQL, AWS (EC2, S3), Erwin, Kafka, OLTP, OLAP, Visio, UNIX, Shell Scripting, Windows.

Client: Biogen Inc, RTP, NC

Sr Data Modeler August 2022 – October 2024

Responsibilities:

• Gathered requirements, created custom SQL queries per the reporting requirements, and consolidated all other SQL for agency modernization.

• Collaborated regularly with team members and actively participated in architectural decisions for the BI platform.

• Involved in all phases of SDLC that comprise analysis, design, development, implementation, and execution of enterprise analytical solutions while following established standards and processes.

• Worked with business users and product managers to gather requirements and understand the metrics and KPIs.

• Designed, built, enhanced, and maintained both historical and real-time interactive dashboards and reports used by analysts, applications, and external business partners.

• Created the database mapping sheets and technical specification documents for all use cases; created Jira tasks and assigned them to developers per the sprint plan.

• Worked on high-level models that define the scope of data requirements, often used to communicate with stakeholders.

• Used Erwin to reverse-engineer existing databases into models that represent the current state of the data architecture.

• Performed impact analysis to assess the effect of changes to data models on existing systems.

• Worked closely with business analysts, data scientists, and IT teams to align ETL processes with business goals.

• Created mock data (as-is data per the historical data information) using the client application and created the sample table structures.

• Created different chart types such as bar, line, pie, area, treemap, donut, scatter plot, waterfall, and geo map charts.

• Developed charts using functionality such as filters, calculations, parameters, conditional formatting, and variables, and applied conditional styles.

• Interacted regularly with the onsite team and client coordinators to discuss the requirements and existing report enhancements.

• Developed complex, out-of-the-box, advanced dashboards using conditional formatting with trend indicators in Tableau Desktop.

• Published dashboards to Tableau Server for discussion with end users and scheduled them per user requirements.

• Created row-level security so that end users access data according to their roles and zones, and granted permissions to access the server dashboards.

• Created and developed data models, reports, and dashboards; used Power Query to clean, transform, and load data and DAX formulas to create calculated columns, measures, and tables; and optimized data models and reports for performance.

• Responsible for data exploration and analysis using the semantic layer provided by AtScale, deriving insights and trends from the data.

• Tested and updated the dashboards according to requirements and prepared test cases for the tested reports/dashboards.

• Responsible for Tableau deployments from Dev to UAT to Prod servers via a CI/CD process using GitHub, Eclipse, TeamCity, and UCD (UrbanCode Deploy) for code deployment.

• Responsible for delivery of assigned modules/components/phases of a project, status reporting, guiding the development team, estimation, planning and execution, knowledge transfer, and arriving at SLAs for steady-state support.

Environment: Tableau 2023.x/2021.x (Desktop & Server), Tableau Prep, SAP BO, Power BI, Hadoop Hive, Oracle 12c, SQL & PL/SQL, AtScale, Dremio, Toad, Windows XP, Git, Eclipse, Bitbucket, TeamCity, IBM UCD.

Client: Northern Trust, Chicago, IL

Data Modeler/Business Analyst May 2020 – July 2022

Responsibilities:

● Participated in requirement-gathering sessions to understand expectations and worked with system analysts to understand the format and patterns of the upstream source data.

● Developed logical models per the requirements and validated their alignment with the enterprise logical data model (ELDM).

● Architected and modeled Enterprise Data Hub (EDH) cloud-based solutions using AWS Redshift for the analytical platform and Oracle on AWS RDS for reference and master data.

● Involved in the design and development of the data warehouse environment; liaised between business users and technical teams, gathering requirement specification documents and identifying data sources, targets, and report-generation needs.

● Enforced referential integrity in the OLTP data model for consistent relationships between tables and efficient database design.

● Used different ER Model techniques such as Parent-Child, 'Associative', Sub Type Super Type, Union, Key Value Pair concept, etc. to develop the data model.

● Identified and analyzed various facts from the source systems and business requirements to be used in the data warehouse (Kimball approach).

● Generated DDL scripts for both Redshift and RDS Oracle out of the ADS ER modeler.

● Provided day-to-day data administration and security-related support for the ETL team related to AWS Redshift and AWS Oracle RDS.

● Translated business requirements into detailed system specifications and developed use cases and business process flow diagrams employing unified modeling language (UML).

● Checked for all the modeling standards including naming standards, entity relationships on the model, and comments and history in the model.

● Implemented naming standards when creating indexes, primary keys, and relationships.

● Created reference values, lookup tables, history tables, and staging tables per requests from development teams and worked with them in developing the mappings.

● Worked extensively on legacy models that were reverse engineered from databases to implement object naming standards in the logical model and completed definitions by working with respective application teams.

● Generated ad-hoc SQL queries using joins, database connections, and transformation rules to fetch data from legacy DB2 and SQL Server database systems.
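
An illustrative sketch of such an ad-hoc join (MEMBER and CLAIM are hypothetical stand-ins for the legacy DB2/SQL Server tables) might read:

    -- Claim counts and totals per member since 2021, largest first.
    SELECT m.member_id,
           m.member_name,
           COUNT(c.claim_id)   AS claim_count,
           SUM(c.claim_amount) AS total_amount
    FROM member m
    JOIN claim c
      ON c.member_id = m.member_id
    WHERE c.service_date >= '2021-01-01'
    GROUP BY m.member_id, m.member_name
    ORDER BY total_amount DESC;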

● Developed Data Mapping, Data Governance, Transformation, and cleansing rules for the Master Data Management Architecture involving OLTP, ODS, and OLAP.

● Supported Data Quality Managers in developing tools for data governance and master data management.

● Assisted other teams in integrating multiple data models when the respective systems were migrated from multiple databases into a single Oracle schema.

● Prepared ETL technical mapping documents along with test cases for each mapping for future developments to maintain SDLC.

● Responsible for performing in-depth quantitative data analysis and generating ad-hoc queries per client requirements; fulfilled complex ad-hoc SQL query requests from BAs in both SSMS and Teradata SQL Assistant.

● Assisted the DBA in creating tables, indexes, views, snapshots, and maintaining the security of the database.

● Developed Data Migration and Cleansing rules for the Integration Architecture (OLTP, ODS, DW).

● Defined integrity rules and defaults and was involved in data profiling.

● Helped testing teams with the development of test cases in User Acceptance Testing and production testing, and independently performed exploratory testing.

Environment: SharePoint, MS Excel, Agile, MS Word, MS Access, HP Quality Center, Python, UML, ETL, Informatica PowerCenter 7.x, Oracle 11g, MDM, Erwin, OBIEE 11g, MS Office suite, MS Visio, Oracle SQL Developer.

Client: Baker Hughes, Houston, TX

Data Analyst/SQL Developer Intern April 2019 – April 2020

Responsibilities:

● Hands-on experience in troubleshooting production issues and performance tuning of jobs.

● Participated in development efforts to analyze client-server applications using Visual Basic as the front-end tool and MS SQL Server Database. Designed packages for various data sources, including SQL Server, Flat Files, and XML. Involved in converting PL/SQL stored procedures, triggers, functions, and scripts to T-SQL.

● Designed a custom Reference Data Management system, allowing business data stewards to manage slowly changing master data sets in registered Oracle 19c tables used by other OLTP systems.

● Defined standards and procedures for data modeling, database design, reference data management, and ETL requirements and design.

● Used ER Studio to create logical and physical data models for an enterprise-wide OLAP system (star schema/snowflake schema).

● Wrote ETL development specifications and ran POCs to prove out data transformation, migration, or integration.

● Developed star and snowflake schema-based dimensional models to grow the data warehouse.

● Specialized in transforming data into user-friendly visualizations using Tableau to give business users a complete view of their business.

● Designed, implemented, and maintained OLAP servers and processes to replicate production data to the server.

● Performed data extraction from OLTP systems to OLAP environments for decision-making using SSIS.

● Extracted large volumes of data from different sources and loaded them into target databases by performing various transformations using SQL Server Integration Services (SSIS). Created reports by extracting data from cubes.

● Involved in the installation and configuration of SSRS Reporting Server and the deployment of reports.

● Deployed SSRS reports across multiple environments, including Test, Production, and Supporting environments.

● Created ad-hoc reports for upper-level management using stored procedures and MS SQL Server 2005 Reporting Services (SSRS) based on business requirements. Developed VBA scripts and macros to automate report generation.

Environment: SQL Server, Teradata, SAS, SQL, PL/SQL, T-SQL, OLAP, OLTP, SSIS, XML, SSRS, Oracle, Excel, Jira, VBA, Shell Scripting.

Client: NFS IT Solutions, Hyderabad, India

Data Analyst/Data Modeler July 2015 – July 2017

Responsibilities:

● Interacted with users and business analysts to accumulate requirements.

● Understood the existing data model and documented the expected design changes affecting system performance.

● Initiated and conducted JAD sessions inviting various teams to finalize the necessary data fields in addition to their formats.

● Developed logical and physical data models using Erwin.

● Created the logical data model from the conceptual model and converted it into the physical database design.

● Developed shell scripts to process and manipulate large datasets, perform data validation, and extract relevant information.

● Followed industry best practices for shell scripting, including modularization, code readability, and version control.

● Created FTP connections and database connections for sources and targets.

● Maintained security and data integrity of the database.

● Developed several forms & reports using Crystal Reports.

● Provided maintenance support to customized reports created in Crystal Reports/ASP.

● Mapped the data between Source and Targets. Generated Reports from data models.

● Reviewed data model together with the functional and technical team.

● Assisted the ETL staff, developers, and users in understanding the data model.

● Maintained an alteration log for each data model created.

● Created PL/SQL code from data models and interacted with DBAs to build development, testing, and production databases.
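
As a hypothetical sketch of such model-generated code (all object names are illustrative only), the DDL plus a small PL/SQL validation block could look like:

    -- Table generated from the physical data model.
    CREATE TABLE customer (
        customer_id   NUMBER(10)    PRIMARY KEY,
        customer_name VARCHAR2(100) NOT NULL,
        created_date  DATE          DEFAULT SYSDATE
    );

    -- Anonymous PL/SQL block confirming the table loaded as expected.
    DECLARE
      v_count NUMBER;
    BEGIN
      SELECT COUNT(*) INTO v_count FROM customer;
      DBMS_OUTPUT.PUT_LINE('customer rows: ' || v_count);
    END;
    /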

● Implemented database objects (such as indexes and partitions in the Oracle database) for performance.

● Interacted with DBAs to discuss database design and modeling, index creation, and SQL tuning issues.

Environment: Oracle 8i, SQL Navigator, PL/SQL, Pro DB2, SQL Server 2008, MS SQL Server Analysis Manager, Sybase, Rational RequisitePro, Windows NT, Crystal Reports.

EDUCATION

• Master of Science in Data Science, University of Memphis, Memphis, TN, March 2019

• Bachelor of Computer Science and Engineering, Jawaharlal Nehru Technological University, India, June 2015


