
Michael McKinney

Collierville, TN ad4xb0@r.postjobfree.com

Career Objective

To advance my career in BI Development and Architecture

Certification

Currently taking weekend online classes to prepare for Certification DP-203: Data Engineering on Microsoft Azure

Career Profile

Accomplished BI Developer/Architect offering 22 years in IT, including 12 years of Healthcare industry experience, functioning as a SQL Developer, ETL Developer, DBA, Data Architect, Technical Lead, or a combination of those roles.

12 years as an ETL Developer. 6 years as an ETL Architect. 10 years of ETL Development/Architecture experience within Healthcare industry.

ETL Architect level experience includes Master Data Management schemas and data using Erwin, transactional and dimensional data modeling in Erwin and ER Studio, and version control of software developers' databases, with database versioning and source control for each release.

ETL Development experience includes building data pipelines in either SSIS or Informatica that take data from an ODS (an operational database such as a Cerner or Epic database), flat files, XML files or VSAM files and load it into Staging, Test and Production warehouse databases stored in SQL Azure, SQL Server or Oracle.

Strong knowledge of popular ETL tools (SQL Server Integration Services, Informatica and Oracle Data Integrator).

20 years of experience utilizing SQL Server Management Studio.

BI and Data Visualization tool expertise includes SSIS (Advanced), SSRS (Advanced), SSAS (Intermediate), Power BI (Advanced), Cognos Data Manager (Beginner), Informatica PowerCenter (Advanced), Informatica Data Quality Tool (Advanced), Informatica Data Lineage Tool (Intermediate), Tableau Desktop (Beginner) and Tableau Administrator (Advanced).

Expertise in developing SSIS ETL and SSRS Reports at several large companies since 2008. Strong experience with SQL Server Reporting Services (Report Manager, Scheduler, Publisher, Report Builder, Datasets and Security). Expertise in tuning Reporting Services stored procedures, queries and design structures.

14 years of Data Modeling experience. 10 years of Fact Dimensional Modeling experience. Data Modeling Experience with Erwin, Visio, Oracle SQL Developer and Management Studio Diagrams.

2 years supporting Azure cloud technologies.

Current experience with Epic (currently migrating from Cerner to Epic at St. Jude Children’s Research Hospital). Additional experience with Cerner and MediTech.

TECHNICAL QUALIFICATIONS

SQL Server 2000/2005/2008 R2/2012/2014/2016:

SQL Server Management Studio, Business Intelligence, SQL Server Integration Services (SSIS), Data Transformation Services (DTS), Security, Enterprise Manager, Query Analyzer, PowerShell, T-SQL, DDL, DML, Table-Valued Functions, Scalar-Valued Functions, File Groups, Agent, Jobs, Database Configurations, Database Options, Database Services, Database Growth, Maintenance Plans, OSQL, SQLCMD, DTEXEC, BCP, Bulk Insert, Red Gate (SQL and Data Compare)

Oracle 9i/10g/11g:

OEM, TOAD, vi, SQL*Plus, PL/SQL, Oracle SQL Developer, Toad Data Point

Data Warehousing:

Extract Transform Load (ETL), SQL Server Integration Services (SSIS), Informatica, Oracle Data Integrator (ODI), Dimensional Modeling, SSAS

Programming:

T-SQL, PL/SQL, VB.NET, C++, C#, PowerShell, Shell Programming, Perl, DAX and MDX (expressions), VBA, XML, XSD, ASP, ColdFusion (CFM/CFC), JavaScript, HTML, CSS, DHTML, DOM

Data Modeling:

ER Studio Data Modeling, Erwin Data Modeler r7, Visio, Entity Relationship Diagrams, Dimensional Modeling, Cubes, Table Definitions, Reporting and Transactional Databases

Web / Data Server:

IIS 5.0/6.0, ColdFusion Server, LDAP, Active Directory, NT Security, Server Certificates, Client Certificates, Web Server Ports (80, 443, 1433, 389) and DSN setup

Microsoft Office Programs:

Power Pivot, Outlook, Excel, Access, Word, PowerPoint, MS Project

Software Life Cycle:

Experience working in Agile and Waterfall environments. Experience using source control, bug tracking and time management software such as DevOps, TFS or Subversion

Performance Tuning:

Indexing (Clustered, Non-Clustered, Covering, Statistics), Query Execution Plan, Server Side Trace, Profiler, IO, Page Reads, CPU Time, Duration, DTA, Tuning (Schema, Stored Procedures and Queries), Partitioning Tables

System Monitoring / Capacity Planning:

Perfmon.exe, CPU, Disk, Memory and Network System Counters. Monitoring and capacity planning for VLDB (Very Large Databases). Worked with 6 TB of data and over 9,000 tables.

Operating Systems:

Windows (98, 2000, XP, Vista, 7, 8, 10), Windows Server (2003, 2008 R2, 2012), MS-DOS, Mac OS, 32-bit (x86-based PC), 64-bit (x64-based PC)

Sr. Data Warehouse Developer, Memphis, TN

EPG Insurance, an Assurant Company 11/17/2021 – present

Technology Used: SQL Server, MySQL, Visual Studio, Shell Commands, Notepad++, Power BI, SQL Server Reporting Services, Power BI Web Portal, SQL Agent Jobs, Toad Data Point

My responsibilities include SQL, Power BI, Reporting Services and ETL Development.

ETL Development:

Monitor SQL Agent ETL jobs each morning to verify they ran without errors

Re-run jobs in order if a job step failed, and reload the Power BI dataset

Monitor the Power BI dataset refreshes each morning and re-run any that fail

Monitor Replication and BCP jobs that stage table data each day

Enhance existing ETL Jobs, SSIS Packages and Database Objects

Tune SQL Queries, Stored Procedures, Views, and Indexes used by Reports and ETL Jobs

Reverse engineer ETL jobs when jobs fail or data is missing. Track data lineage by reviewing SSIS packages, the SQL Agent job execution sequence, stored procedures, views, Replication and the BCP process.

Automate loading Excel files into the SQL Server database by scheduling jobs that check a drop-file location; the underwriter just needs to format the Excel file correctly for it to load dynamically, as in the sketch below.
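
A minimal T-SQL sketch of the kind of dynamic Excel load described above, assuming the Microsoft ACE OLE DB provider is installed and ad hoc distributed queries are enabled; the file path, sheet name and table names are hypothetical:

    -- Load a dropped Excel file into a staging table (hypothetical names).
    -- Assumes the ACE OLE DB provider and 'Ad Hoc Distributed Queries' enabled.
    INSERT INTO dbo.UnderwriterStaging (PolicyNumber, Premium, EffectiveDate)
    SELECT PolicyNumber, Premium, EffectiveDate
    FROM OPENROWSET('Microsoft.ACE.OLEDB.12.0',
                    'Excel 12.0;Database=C:\Drop\Underwriter.xlsx;HDR=YES',
                    'SELECT * FROM [Sheet1$]');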

Develop SSIS Packages, Stored Procedures, Views and ETL Jobs to Extract, Transform and load data from Source System into Physical Damage and Extended Services Data Warehouse

Master Data Management:

Configure control tables used to define the new contracts, policies, data groups and entities the Data Warehouse ETL process uses to load the fact and dimension tables

Validate that control table data loads correctly and matches the underwriters’ data definitions

Create custom SQL scripts to insert and update missing data elements in control tables, and validate that report data is correct after configuration changes; see the sketch below.
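
For illustration only, a hedged sketch of the kind of insert/update script such a configuration change might use; the control table and its columns are hypothetical:

    -- Upsert a missing data element into a control table (hypothetical schema).
    MERGE dbo.EtlControl AS tgt
    USING (VALUES ('POL-2024', 'PhysicalDamage', 'Active'))
          AS src (ContractCode, DataGroup, Status)
        ON tgt.ContractCode = src.ContractCode
    WHEN MATCHED THEN
        UPDATE SET tgt.DataGroup = src.DataGroup, tgt.Status = src.Status
    WHEN NOT MATCHED THEN
        INSERT (ContractCode, DataGroup, Status)
        VALUES (src.ContractCode, src.DataGroup, src.Status);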

Reporting:

Monitor Power BI Dashboards and verify Datasets get refreshed each day

Administer Power BI or Reporting Services Reports and Dashboard Security

Develop New Power BI or Reporting Services Reports based on underwriter and VP requirements

Reporting Services reports are operational reports used daily by underwriters and insurance processors. I support over 80 Reporting Services reports.

Power BI reports are used to forecast profit and loss ratios on insurance claims. I support over 40 Power BI dashboards and reports used by upper management, VPs and customers.

Sr Database / System Developer, Memphis, TN

St. Jude Children's Research Hospital

Department of Infectious Disease 06/15/2019 – 02/07/2022

Technology Used: SQL Server 2016, Visual Studio, Shell Commands, Notepad++, Power BI, Hyperion Reporting, SQL Server Reporting Services, Power BI Report Server, SQL Agent Jobs, IIS

Mentoring and Leading:

Train end users on how to use Power BI reports; train scientists on how to use SQL Agent jobs, run SSIS packages and monitor Dropbox files moving from the process folder to the loaded folder. Mentor the IDS IT Director in ETL and SQL Server development best practices. Demo Tableau and Power BI to doctors to show the features of each before purchasing a BI solution.

My responsibilities include SQL Development, Power BI Report Development, Hyperion Report Development, Reporting Services Development, Database and Web Server Administration, Visual Studio Solution Deployments, Database Upgrades, Job Monitoring and ETL Development

Database and Web Server Administration:

Document and Configure Web and Database Server

Monitor SQL Agent jobs and backups

Develop jobs to back up and sync databases.

Windows and Database User Administration for Development and Production Servers

Manage firewall changes for Web and Application Servers

Administer Power BI, Report Services and Hyperion Users

Tune SQL Queries, Stored Procedures, Views and Indexes used by Web App, Reports and ETL Jobs

Database Upgrade

Migrate Development and Production Windows Servers and document migration steps

Migrated Legacy Windows Database Server to Windows Server and SQL Server 2016

Migrated Legacy Windows Server to Windows Server 2016 and converted IIS settings

Documented Database, Windows Server and IIS Configurations

Deploy latest build of Visual Studio Project on New Development and Production Servers

Enhance Web Configuration file to use Integrated Authentication and Service Account

Report Development

Installed and Configured Power BI Report Server on Premium installation

Hyperion Report Bug Fixes and New Development

Developed Power BI (Dashboards, Reports) and Reporting Services Paginated Reports

Sr. Database Developer

RIB Americas (Software Company), Memphis, TN 11/15/2017 – 06/15/2019

Technology Used: SQL Server 2016, Azure Platform as a Service (PaaS), Azure Infrastructure as a Service (IaaS), Visual Studio, XML, Java Script, PowerShell, Shell Commands, Notepad++, Office 365 (Power Apps, Power BI), ER/Studio Data Architect

RIB Americas is a global provider of software solutions for the construction industry. My responsibilities include SQL Development, Power BI Dashboard / Report Development, Data Modeling, Database Administration of on-premises and Azure cloud solutions.

Database and Report Development

Create Power BI Dashboard and Reports for iTWO Software

Design Physical Data Models in ER Studio

Review and Add Developer Schema Changes to Source Control Branches

Develop Database Maintenance Plans using Database Mail, database alerts, stored procedures, SQL Agent jobs, batch files, server-side traces, system Perfmon counters and database data collectors.

Develop Common Table Expressions (CTEs), virtual tables, temp tables, stored procedures, user-defined functions, indexes (unique, clustered, non-clustered), default constraints, primary keys, foreign keys (cascading updates, nulls or deletes), DDL and DML in support of iTWO software; see the sketch below.
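
As a small illustration of a couple of these constructs together, a sketch with hypothetical table names (not iTWO's actual schema):

    -- Foreign key with cascading delete (hypothetical tables).
    ALTER TABLE dbo.ProjectCost
        ADD CONSTRAINT FK_ProjectCost_Project
        FOREIGN KEY (ProjectId) REFERENCES dbo.Project (ProjectId)
        ON DELETE CASCADE;

    -- Common Table Expression feeding a report query.
    WITH CostByProject AS (
        SELECT ProjectId, SUM(Amount) AS TotalCost
        FROM dbo.ProjectCost
        GROUP BY ProjectId
    )
    SELECT p.ProjectName, c.TotalCost
    FROM dbo.Project AS p
    JOIN CostByProject AS c ON c.ProjectId = p.ProjectId;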

Analyze iTWO Application workflow in order to develop meaningful reports

Work with Project Managers, Consultants and Application Managers to gather report requirements

Database Administration:

Monitor SQL Server Instances Disk Utilization, Queries and Performance

Tune database queries, indexes, database settings, file system layout and Windows settings

Configure and run PowerShell scripts used to upgrade database data and schema to the current release.

Standardize volume names for database data, log and backup files across the enterprise

Configure and Monitor Database Maintenance Jobs (Rebuild Index, Backup, Consistency Checks)

Monitor Daily Notification of Jobs Completions, Database Alerts and Disk Utilization Reports

Implement and recommend database server disk volumes, data files, log files, backups, memory, CPU and database settings.

Administer SQL Server Report Server, Power BI Report Server and Power BI Services

Administer IIS for different versions of iTWO Web Application.

Install and Configure SQL Server 2014, 2016 and 2017 on Virtual and Physical Servers

Microsoft Azure

Configure Azure IaaS Recovery Vault database Backup and Azure SQL Server Always-On

Utilize Azure Resource Manager to setup Disk, Networks, Virtual Machines, Database Servers, Application Servers, Azure Active Directory, Power BI Security and Geo-redundancy

BI Developer III

TRUGREEN, Memphis, TN 01/25/2016 – 11/15/2017

Technology Used: SQL Server 2012, Toad Data Point, Cognos Data Manager, MS Business Intelligence (Data Tools, SSIS), Tableau, Visual Studio, Notepad++, FTP, DB2 iSeries, PostgreSQL

Develop, support, and optimize the data warehouse ETL process and data structures to facilitate reporting and analysis of enterprise data. Responsible for analyzing source system data and building processes to provide data cleansing and proper categorization. Translate business processes into logical and physical models in support of business requirements.

Mentoring and Leading:

Function as technical lead for 5 BI Developers in our Data Group. Perform code reviews and build SSIS package templates for mid-level and junior BI Developers. Train BI Developers on how each Control Flow and Data Flow task works and which ones to use for best performance. Explain database settings and the ETL package settings that keep the SQL transaction log file from growing too large and that run ETL jobs faster with fewer blocking and locking errors. Hold weekly lunch-and-learn meetings to train junior developers over time.

Essential Job Functions:

Work with business teams to gather and document requirements and business needs.

Translate requirements into technical and functional design for back-end and front-end solutions.

Support and Enhance Cognos Data Manager ETL Process.

Pull data from various sources such as DB2, PostgreSQL, SQL Server and flat files into Staging, Reporting and Data Warehouse databases.

Setup SSIS Test and Development Servers for Development team.

Install and Configure SQL Server 2012 and Visual Studio 2012 Data Tools on developers’ notebooks.

Install and Configure Cognos Data Manager and iNavigator on new developers’ notebooks to connect to DB2

Train team on best practices for SSIS Package and Solution development.

Tune the ETL process in Cognos Data Manager and SSIS packages. This is a combination of the best configuration for the ETL tool along with index recommendations, considering things such as batch size and transaction log file growth. Also look into query hints to prevent locking (query isolation levels) or control parallel processing (the MAXDOP query hint), and refactor overly complicated queries; see the sketch below.
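
Two of the query-level techniques mentioned above in a minimal sketch; the fact table is hypothetical, and SNAPSHOT isolation assumes the database option ALLOW_SNAPSHOT_ISOLATION is on:

    -- Read without taking shared locks that block the ETL writers.
    SET TRANSACTION ISOLATION LEVEL SNAPSHOT;

    SELECT CustomerId, SUM(Amount) AS TotalAmount
    FROM dbo.FactSales
    GROUP BY CustomerId
    OPTION (MAXDOP 4);  -- cap parallelism so other jobs aren't starved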

Develop, test, implement, and support all aspects of data integration, utilizing Stored Procedures, SQL scripts and SSIS.

Evaluate, analyze and monitor databases to resolve performance issues.

Recommend data warehouse improvements to database settings, database maintenance plans, service packs, ETL development tools, deployment strategies, data structures and process improvements.

Create and maintain documentation of processes, databases, applications and procedures.

Work within the prescribed change control processes, procedures and development methodology.

Work with Data Stewards, external vendors and business users to resolve data questions.

Report project/task status. Adhere to IT development standards and best practices.

Effectively communicate with Business users and customers.

Data Architect 1

Hancock Bank / TEK Systems, Long Beach, Mississippi 06/10/2014 – 01/22/2016

Technology Used: SQL Server 2012, Oracle, SSMS, Oracle SQL Developer, Informatica (Data Validator, Designer, Workflow, Monitor, Repository Manager), TFS, MS Business Intelligence (Data Tools, SSIS, SSRS, Database Projects), Visual Studio Database Projects

Develop, support, and optimize the data warehouse to facilitate reporting and analysis of enterprise data. Responsible for analyzing source system data and building processes to provide data cleansing and proper categorization. Translate business processes into logical and physical models in support of business requirements.

Essential Duties and Responsibilities include the following.

Contribute to the data warehouse development lifecycle (requirements gathering, ETL, database design & development, testing & validation)

Design and build data structures to support analysis and reporting.

Design and build processes to ensure integrity and quality of data.

Troubleshoot and optimize existing queries and databases.

Recommend and implement process improvements.

Maintain data dictionary & documentation.

Maintain knowledge of current BI tools, methods, and best practices.

Acts as an internal consultant providing technical guidance on business systems projects.

Determines and resolves problems with other systems analysts, programmers, and systems users.

Maintains, develops, modifies, and documents processes according to general specifications and guidelines.

Provides guidance to lower-level team members.

Job Tasks:

Manage SSRS Report Manager Server and User Access, Scheduled Reports, Subscriptions and Data Sources.

Develop and gather requirements for the ETL process. Use a combination of Excel mapping documents, SharePoint and ticketing systems to track ETL requirements, migration plans and test plans. Also use Informatica source control and lineage tools to track ETL changes.

Document new and existing ETL mapping business logic and transformation code. These documents are used by the business analyst to keep track of mapping sources fields, transformation logic and Targets.

Follow Agile Methodologies for ETL development

Create Informatica Mappings and Workflows based on requirements and technical best practices.

Build homogeneous and heterogeneous ETL pipelines that contain lookup tables, source tables and source VSAM files. These files can be 6 GB in size and may take an hour to load. Reduce ETL time by loading all VSAM files into the staging area first, then doing the ETL on SQL tables, which makes it easier to debug and load test.

Attend control board meetings for each ETL migration to Production.

Implement Database integrity by creating table constraints to prevent orphaned or duplicate records.

Load tables that pre-aggregate totals into the staging area to improve query read performance; see the sketch below.
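
A hedged sketch of such a pre-aggregation load; the staging and source tables are hypothetical:

    -- Pre-aggregate detail rows so reports read far fewer pages.
    TRUNCATE TABLE stg.AccountDailyTotals;

    INSERT INTO stg.AccountDailyTotals (AccountId, PostingDate, TotalAmount)
    SELECT AccountId, PostingDate, SUM(Amount)
    FROM dbo.TransactionDetail
    GROUP BY AccountId, PostingDate;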


Load data from SQL Server, Flat Files, CSV files and VSAM Mainframe files to SQL Server and Oracle.

Work with tables in Oracle and SQL Server on DEV, TEST and PROD.

Manage Database User and System Accounts.

Implement database security changes based on SOX compliance audit reports.

Help Administer SQL Server Database User and Service account permissions.

Tuned FLARe ETL process so that it runs faster and does not cause the database to crash during peak ETL loads.

Perform data quality checks using data validation table-pair tests to compare data between DEV, TEST and PROD tables.

Coach other Data Services team members on how to easily migrate SQL Server objects between databases, SQL tuning techniques and stored procedure unit testing methods.

SQL Server Database Design Analyst

HP Services / Avista Corp, Spokane, Washington Nov 2013 (11/15/13) - May 2014 (05/17/14)

Technology Used: SQL Developer, Oracle, SQL Server 2012, Windows Server 2012, SSIS, Pervasive Database

Avista is the largest energy company in the Northwest, with customers in Washington, Oregon and Idaho. My job was to develop the SQL Server 2012 Data Warehouse containing weather and electronic meter information from various Avista applications. The Data Warehouse loads data from flat files, Oracle, SQL Server and Pervasive data stores. This was a 6-month contract to build the Data Warehouse supporting Trove’s energy forecasting web application. Trove is a firm of PhDs who specialize in statistics for the energy industry.

The project was initiated to improve energy forecasting models, which in turn would save the company money when buying and selling energy. The existing forecast was based on meter reads and weather data from monthly ETL processes; Trove wanted Avista to build a Data Warehouse refreshed hourly, the idea being that more data points would make the forecast models more accurate. My job was a combination of data discovery, gap analysis, data architecture, ETL development and system requirements. I worked with Oracle DBAs, SQL Developers, a SQL Server DBA, energy traders, accountants, project managers, a system administrator, a security officer and vendors to gather requirements for the project and build the ETL solution. I released an operational, well-documented product in a short time that met Avista’s requirements, and Avista was happy with the Data Warehouse and the Trove forecasting application.

Database Design Analyst Responsibilities

Reviews project requests containing user requirements for the Data Warehouse.

Build Dimensional Data Models

Estimates time and cost required to accomplish project.

Attends specification meetings with project team members to determine scope and limitations of project.

Designs and develops data warehousing strategy / schema implementation.

Develops data model describing data elements and how they are used.

Creates and maintains the Data Dictionary.

Designs logical and physical databases and coordinates database development as part of project team, applying knowledge of database design standards and database management systems.

Implements physical database in accordance with appropriate security designs / constructs to protect company data.

Designs and implements appropriate backup strategy for Data Warehouse.

Database Administrator

Kootenai Health, Coeur d'Alene, Idaho Mar 2012 (03/15/12) - Nov 2013 (11/15/13)

Technology Used: SQL Server 2008 R2, Windows Server 2008 R2, SSRS, SSIS, Red Gate Toolbelt, Meditech Data Repository, Meditech Modules (Emergency Department, Electronic Medical Records, Abstracting, Pharmacy, etc.)

DBA Tasks:

Disk Subsystem:

Monitor transaction log and data file growth.

Allocate enough space to data and log files to meet growth trends in disk usage.

Isolate data and log files on separate disks

Recommend Raid Levels for data and log files.

Security:

Manage request and ticketing documentation for all vendors and employees.

Create security matrix for all users.

Create security T-SQL scripts to quickly restore all users’ access; see the sketch below.
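
One common way to build such a script is to generate GRANT statements from the catalog views; a minimal sketch (save the output and replay it after a restore):

    -- Script out current object-level permissions so user access
    -- can be restored quickly after a migration or restore.
    SELECT 'GRANT ' + perm.permission_name
         + ' ON ' + QUOTENAME(OBJECT_SCHEMA_NAME(perm.major_id))
         + '.' + QUOTENAME(OBJECT_NAME(perm.major_id))
         + ' TO ' + QUOTENAME(prin.name) + ';'
    FROM sys.database_permissions AS perm
    JOIN sys.database_principals AS prin
        ON prin.principal_id = perm.grantee_principal_id
    WHERE perm.state = 'G'      -- plain grants
      AND perm.class = 1        -- object-level
      AND perm.minor_id = 0;    -- skip column-level grants in this sketch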

Manage Security for SSRS, SSIS and SSAS

If you have to grant higher-level permissions, use Policy-Based Management or log default trace data to a repository to see whether users make changes to security or database configurations.

Fragmentation:

Create a clustered index on all tables. A heap’s fragmentation follows the disk’s fragmentation; having a clustered index keeps fragmentation within the database rather than on the disk subsystem.

Only automate log and data file shrinks if there is no other choice. Try to allocate enough disk space to begin with, and manually kick off shrink jobs to prevent fragmentation. Create a log and data file shrink stored procedure that goes through each database on the server and shrinks them during an emergency, for example when you cannot log into a database because the transaction log has taken up the whole disk volume; a sketch follows.
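
An emergency-only shrink loop of the kind described might look like this sketch (system databases are skipped; the target size is hypothetical):

    -- Emergency only: shrink each user database's log file.
    DECLARE @db sysname, @sql nvarchar(max);
    DECLARE dbs CURSOR FOR
        SELECT name FROM sys.databases WHERE database_id > 4;
    OPEN dbs;
    FETCH NEXT FROM dbs INTO @db;
    WHILE @@FETCH_STATUS = 0
    BEGIN
        -- file_id 2 is usually the primary log file
        SET @sql = N'USE ' + QUOTENAME(@db) + N'; DBCC SHRINKFILE (2, 1024);';
        EXEC sys.sp_executesql @sql;
        FETCH NEXT FROM dbs INTO @db;
    END
    CLOSE dbs; DEALLOCATE dbs;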

Index Management:

Run a tried-and-true stored procedure daily that rebuilds or reorganizes indexes based on fragmentation level; see the sketch below.
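
The decision logic in such a procedure is typically driven by sys.dm_db_index_physical_stats; a sketch using the common 5%/30% thresholds:

    -- Emit REORGANIZE or REBUILD per index based on fragmentation.
    SELECT 'ALTER INDEX ' + QUOTENAME(i.name)
         + ' ON ' + QUOTENAME(OBJECT_SCHEMA_NAME(s.object_id))
         + '.' + QUOTENAME(OBJECT_NAME(s.object_id))
         + CASE WHEN s.avg_fragmentation_in_percent >= 30
                THEN ' REBUILD;' ELSE ' REORGANIZE;' END
    FROM sys.dm_db_index_physical_stats(DB_ID(), NULL, NULL, NULL, 'LIMITED') AS s
    JOIN sys.indexes AS i
        ON i.object_id = s.object_id AND i.index_id = s.index_id
    WHERE s.avg_fragmentation_in_percent > 5
      AND i.index_id > 0;  -- skip heaps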

Configure the stored procedure’s index fill factor depending on whether the database is a reporting or transactional database.

Integrity Checks:

Run stored procedure to check database integrity.

SQL Reporting Services:

Install and administer Report Manager. Configure folder security, report subscriptions, report users, data sources and application accounts, and automate report deployments.

Configure Reporting Services performance features, such as caching reports for reuse or scheduling long-running reports so users don’t have to wait for processing.

Tune server memory settings to improve report processing.

Monitoring Database and Server:

Each morning, check the database error logs and Event Viewer logs for unusual activity

Load Balancing:

Helped load balance the Web and Database Servers to handle more concurrent users for the Monitor 24/7 ticketing application.

Performance Code Review:

Used functions to create standard calculations and field formats, while making sure developers understood to use them only on small subsets of data to keep stored procedure durations down.

Unit test stored procedures and make sure no recommended (missing) indexes show up in the query plan editor. Also, on tables with many rows, make sure an index exists covering the columns in the joins, the WHERE clause and the SELECT list.

Use SET STATISTICS IO and SET STATISTICS TIME to manually see which tables in a query have the highest logical page reads; see the sketch below.
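
For example (the tables are hypothetical):

    -- Show per-table logical reads plus CPU and elapsed time.
    SET STATISTICS IO ON;
    SET STATISTICS TIME ON;

    SELECT o.OrderId, SUM(l.Quantity) AS Units
    FROM dbo.Orders AS o
    JOIN dbo.OrderLines AS l ON l.OrderId = o.OrderId
    GROUP BY o.OrderId;

    SET STATISTICS IO OFF;
    SET STATISTICS TIME OFF;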

Treat page reads as more of an indicator of performance issues than duration; duration can go up and down depending on network usage and the current users on the system.

Use Activity Monitor to spot long-running stored procedures. If you spot one, double-click it and open it in Profiler to see the issues

Create Profiler templates filtered to the database activity you want to monitor. These templates help you quickly focus on specific database events, databases, applications, etc.

Monitoring Tool:

Red Gate SQL Monitor helps automate monitoring of database stats.

Supported the Medisolv (http://medisolv.com/) BI solution at Kootenai Health. Implemented the installation of SSAS/SSIS, set up database server permissions, set up service accounts to run SSAS/SSIS and Task Manager jobs, managed database and server contractor accounts, and worked with offshore contractors to build ETL and cubes against the Meditech Data Repository structures. Our installation was more complicated because we supported multiple hospitals with 50 to 300 beds; the Medisolv application code did not support multiple hospitals, so we had to work with the consultants to tweak the code and verify it was correct for each hospital’s dataset.

Administrator / Developer Tasks:

Perform capacity planning / space analysis and performance tuning of SQL 2008 R2 databases.

Upgrade SQL Server 2005 to SQL Server 2008 R2 and create the migration/upgrade plan.

Configure the Windows OS and SQL Server to best utilize memory, CPU and the disk subsystem. Install SQL Server service packs, cumulative updates and Windows updates.

Manage security for Meditech, SQL Server and Reporting Services.

Develop Meaningful Use reports using stored procedures, views, functions (table-valued and scalar-valued), schemas, tables, indexes and Reporting Services. Work with hospital staff (nurses, financial and pharmacy specialists) to gather requirements for Meaningful Use functional and quality measure reports.

Complete Change Control Request forms for every major software release or hardware upgrade.

Monitor Meditech’s data transfer process daily and file tickets to resolve data or schema issues.

Optimize long-running stored procedures and develop and monitor database indexes. Administer 9,000 tables and 30,000 stored procedures used to transfer data from Meditech’s proprietary system to the SQL Server Data Repository. Reduced some long-running stored procedure durations from 2 hours to 8 seconds by refactoring stored procedures and indexing tables.

Use PowerShell, system tasks and the report scheduler to automate printing ER reports each morning.

Develop and monitor SQL Agent jobs used for database administration tasks and data extractions. Configure SQL alerts to email admins if free server disk space has dropped to 10 percent of capacity; see the sketch below.
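
A sketch of the kind of disk-space check a SQL Agent job step could run; the Database Mail profile and recipient are hypothetical:

    -- Email admins when any volume hosting database files drops
    -- below 10 percent free space.
    IF EXISTS (
        SELECT 1
        FROM sys.master_files AS f
        CROSS APPLY sys.dm_os_volume_stats(f.database_id, f.file_id) AS v
        WHERE v.available_bytes * 100.0 / v.total_bytes < 10
    )
        EXEC msdb.dbo.sp_send_dbmail
             @profile_name = 'DBA Mail',
             @recipients   = 'dba@example.com',
             @subject      = 'Disk volume below 10 percent free space';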

SR. SQL DEVELOPER CONSULTANT

PACE Performance Group / Luxsolutions Inc, Portland, Oregon Aug 2011 (08/15/11) - Dec 2011 (12/30/11)

(Consulting for Healthcare and Railroad Industry)

Technology Used: SQL Server 2008 R2, Windows Server 2008, SSRS, SSAS, SSIS, Panorama Business Intelligence Software, Excel, Visio, PowerPoint, MS Project, JIRA, Subversion, Team Foundation Server, SharePoint, Windows 7, MS Live Meeting and Amazon Virtual Server

GREENBRIER COMPANIES (Railroad Car Leasing)

Portland, Oregon

Report Performance:

Whiteboard Cube Concepts for the Customer:

The director and technical lead were .NET gurus but didn’t understand cube or data warehousing concepts.

Created PowerPoint Slides and Data Model of a typical Star Schema.

Explained how the organization could use this to improve their .Net applications reporting features.

Gave multiple vendor and tool options for processing cubes.

Cubes:

Create cubes to help preprocess reports. A lot of the performance issues are caused by the time it takes to process and format report data; cubes can help with this.

Developed Star Schema tables model in SQL, Views and Cube data structures in SSAS.

Recommended Partitioning the Fact Table:

You can help performance by partitioning the fact table in a star schema and dividing it up across many servers to help process the data. The query can then use CPU and memory from multiple servers to process the cube; see the sketch below.
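
A minimal sketch of table partitioning in T-SQL with hypothetical boundaries; note that SQL Server partitions within a single instance, so spreading the load across many servers, as recommended above, would be layered on top of this:

    -- Partition a fact table by year.
    CREATE PARTITION FUNCTION pfYear (int)
        AS RANGE RIGHT FOR VALUES (2010, 2011, 2012);

    CREATE PARTITION SCHEME psYear
        AS PARTITION pfYear ALL TO ([PRIMARY]);

    CREATE TABLE dbo.FactLease (
        LeaseYear int   NOT NULL,
        CarId     int   NOT NULL,
        Amount    money NOT NULL
    ) ON psYear (LeaseYear);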

Develop business intelligence tool used to analyze car leasing agreements.

Gather requirements to build Business Intelligence solution using Microsoft’s BI Stack (Reporting Services, Integration Services and Analysis Services).

Demo Panorama Business Intelligence Software

Help design PowerPoint presentations to communicate concepts to Management and Architects.

Develop Visio Diagram of proposed Star Schema for BI SSAS Cube

Design and present typical Data Warehouse Diagram to Management/Architects

Mississippi State Department of Health (Healthcare Analytics)

Jackson, Mississippi

Profiled the electronic medical records database to build cubes for the BI solution.

Generate a gap analysis Excel report of existing data that could be used to build Business Intelligence reports prior to development.

Communicate with the client’s DBA to better understand the health data and determine whether reports could be built from the existing collected data

Built summary tables for querying


