
Database Administrator SQL Server

Location:
Tampa, FL, 33602
Posted:
May 02, 2025


Resume:

Corie Curcillo

Consultant

Consultant with over *5 years of experience in the Information Technology industry. Client industries include Healthcare, Insurance, Energy, Retail, Banking, Government, Non-profit, and Utilities. Throughout his career, Mr. Curcillo has taken on the roles of data modeler, data architect, database administrator, and database developer across a wide breadth of tools and technologies. He has been a Microsoft Certified Trainer for over 20 years, with experience as both a public and private trainer.

Education

Bowling Green State University – BSBA, triple major:
o Management Information Systems (MIS)

o Finance

o International Business

Certifications (Transcript ID: 661881, Access Code: 12081973)

Microsoft Certified Trainer (MCT)

Azure Data Engineer Associate

Azure Data Fundamentals

Microsoft Certified IT Professional (MCITP) – Database Administrator

Microsoft Certified Technology Specialist (MCTS) – SQL Server

Microsoft Certified Database Administrator (MCDBA)

Microsoft Certified Solutions Developer (MCSD)

Microsoft Certified Professional (Windows Server)

Oracle Certified Associate

Relevant Training

SQL Server Development

SQL Server Administration

Windows AD Administration

MDX

CRM

OLAP Modeling

Master data management (Magnitude Kalido MDM)

Professional Affiliations

Central Ohio SQL Server Users Group (5-year director)

Central Ohio .Net Developers Group (CONDG)

agileLUNCHBOX (Agile Development User Group)

Data Management Association International (Central Ohio Chapter)

Skills

Modeling Tools

ER/Studio Data Architect

SAP Sybase PowerDesigner

Erwin

Agile PM Tools

Rally / CA Agile

Jira Agile

ClickUp

DBMS

SQL Server (Certified Trainer)

AWS – Amazon Aurora PostgreSQL

Oracle (Certified Associate)

Cloudera Hadoop (Hive + Impala)

DB2

Netezza

Azure Synapse

DBMS Tools

DBeaver

Toad

Hive Editor

Impala Editor

Oracle SQL Developer

SQL Server Management Studio (SSMS)

SSDT

Netezza Workbench

SQL Server Analysis Services (SSAS)

Infrastructure

Windows Server

Active Directory

Development Tools

Azure Portal

Alation Data Catalog

Kalido MDM

Collibra Data Governance Center

SQL Server Data Tools (SSDT)

SSIS (Integration Services)

Redgate

Microsoft Dynamics CRM

Imperva Camouflage Data Masking (PII)

Languages

T-SQL

PostgreSQL

Oracle PL/SQL

Hive SQL

Impala SQL

DB2 SQL

Netezza SQL

Visual Basic (Certified)

MDX

ANSI SQL

MySQL

Professional Experience Summary

Companies Worked For | From – To | Roles

Consulting (Gravity IT Resources) – Jackson Health Systems (JHS) / Stax Payments | 2022 – 2025 | Consulting

Centric Consulting | 2021 – 2022 | Centric staff consultant

TEKsystems – PopHealthCare (Nashville), CareSource (Dayton), GE Aviation (Cincinnati), Great American Insurance (Cincinnati), Bank of America (Charlotte), Schaeffer's Investment Research (Cincinnati) | 2013 – 2021 | Contract Consultant

SOGETI – BMW Financial, Ohio Department of Transportation (ODOT), Ohio Attorney General (Columbus, OH) | 2012 – 2013 | Business Intelligence & Data Warehousing Architect, FDIC Data Compliance

Dell / Perot Services – Boy Scouts of America (Dallas/Tampa) | 2011 – 2012 | Data Architect / Data Modeler

IGS Energy (Columbus, OH) | 2003 – 2011 | Data Architect / Database Administrator

Consulting (Columbus, OH) | 1998 – 2002 | Database Administrator, .NET Developer, SQL Server Trainer, BI Data Warehouse Architect, CRM Consultant, ASP Developer

Chase Bank (Columbus, OH) | 1997 – 1998 | Card Services management training program

Professional Experience Details

Consulting – Gravity IT Resources (Jackson Health Systems / Stax Payments) 2022 – 2025

Role Description:

Data Architect: Merchant Deal management, Payment/Refund processing, Card Program management

Accomplishments

Architected Single Source of Truth (SSOT) 3NF data warehouse (Amazon Redshift)

Architected FACT / DIM data mart (Amazon Redshift) to support enterprise level KPI and standardized reference tables.

Erwin - created logical data model (LDM) using Amazon DataZone business term glossary for entity/attribute naming

Erwin – forward engineered physical data model (PDM) into PostgreSQL DDL for deployment to Redshift

Landed AWS Kinesis Data Streams into AWS S3 RAW data layer

Used AWS Glue to move data from the S3 RAW layer to the stage layer to the SSOT layer (Redshift)

Used AWS QuickSight to report data integrity errors

Role Description:

Solutions Architect: CRM, Provider & Patient analytics

Accomplishments

Architected data universe to integrate and normalize data from various LOB applications (Cerner EMR, MD-Staff healthcare credentialing, and NBER (National Bureau of Economic Research)).

Utilized NBER to correct missing and invalid National Provider ID (NPI) & State License Numbers

Provided CRM with the ability to customize aggregations of Diagnosis and Procedure codes

Resolved data anomalies such as missing and duplicate MRN (Medical Record Number) & FIN (financial number)

Data universe design was independent of application specific system IDs and utilized durable business keys (MRN, FIN, NPI, etc.)

Standardized Diagnosis codes to ICD-10 standard

Geocoded patient and facility address to determine proximity of patient care.
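
Proximity checks of this kind are straightforward with the SQL Server geography type; a minimal sketch with illustrative coordinates (STDistance returns meters):

    -- geography::Point(latitude, longitude, SRID)
    DECLARE @patient  geography = geography::Point(27.9506, -82.4572, 4326);
    DECLARE @facility geography = geography::Point(27.9759, -82.5333, 4326);
    SELECT @patient.STDistance(@facility) / 1609.344 AS MilesToCare;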

Created Geography dimension to analyze Patient encounters. Level hierarchy design: Facility > Building > Care Unit > Room > Bed.

Profiled patient utilization of ER care and demographic data to anticipate and recommend patient preventative care (population health care).

Ingested and normalized Press Ganey survey data to analyze against patient master data.

Participated in Snowflake POC pilot to determine data platform replacement capabilities (replacing on-prem SQL Server)

Centric Consulting / Breakthrough T1D (diabetes research, formerly JDRF) 2021 – 2022

Role Description:

Data Engineer (donor formulary logic change)

Accomplishments

Analyzed code logic (SQL stored procedures and Informatica code) to determine the reason for discrepancies between documented business logic and reporting output.

Centric Consulting / Red Oak Sourcing 2021 – 2022

Role Description:

Azure Engineer (CVS Store implementation project)

Accomplishments

Migrated Excel project file into Azure DevOps

Performed system table profiling to determine which objects were potentially impacted by project requirements

Altered business logic of Azure Data Factory (ADF) pipelines feeding Azure Synapse Dedicated Pool to implement new dispensary routing logic.

Migrated several legacy Databricks jobs and datastores to Azure Instance using ADF.

Consolidated store routing logic into a single function call (previously a dozen lines of code copied 50+ places)

Decoupled source to target DDL maintenance. Target tables which were previously static are now created dynamically as per source changes.

Centric Consulting / American Modern Insurance Group (AMIG) 2021 – 2022

Role Description:

Data Architect (Policy Coverage Premium model)

Accomplishments

Extracted data from TYPE-2 Enterprise Data Warehouse into fact dimension data mart

Consolidated fact table attributes into reference keys (along with accompanying dimensions)

Entities included Policy, Policy Term, Policy Risk, Catastrophe, Claim, Coverage, Producer, Producer Agency, Policy Payment, Actuarial group

TEKsystems 2013 – 2021

Role Description: CareSource (Dayton)

Regulatory Reporting, ETL design, data architecture

Accomplishments

Provided gap analysis for missing MDM dimensions along with prototype designs that supported both star and snowflake consumers.

Redesigned Member Churn regulatory reports serving Medicare Advantage/Medicaid/DSNP programs across 5 states (OH, IN, WV, KY, GA). Replaced dynamic time schedule with a standardized date dimension to ensure that zero-activity buckets are explicitly reported and discontinuation-of-coverage periods are accurately reported. Implementing a standard date DIM also eliminated the need for the consumer to explicitly know period-end dates (fiscal month-end, year-end, quarter-end, etc.).

Replaced batch loaded stored procedure design with parameterized table-valued functions (and indexed for performance). This allowed for real-time data reporting and eliminated the risk of batch job failures. One of the exposed input parameters was level of aggregation. This allowed the consumer to choose the aggregation level, reducing both the amount of data transported to the (often remote) client as well as the amount of client-side processing.
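
A minimal T-SQL sketch of this pattern, with hypothetical object and column names; an inline table-valued function exposes the aggregation level as a parameter:

    CREATE FUNCTION rpt.fn_MemberActivity
        (@PeriodStart date, @PeriodEnd date, @AggLevel varchar(10))
    RETURNS TABLE
    AS RETURN
    -- Inline TVFs expand into the calling query, so base-table indexes still
    -- apply; the consumer chooses the aggregation level at call time.
    SELECT CASE @AggLevel WHEN 'STATE' THEN m.StateCode ELSE m.CountyCode END AS GroupKey,
           COUNT_BIG(*) AS MemberCount
    FROM dbo.MemberMonth AS m
    WHERE m.PeriodDate >= @PeriodStart
      AND m.PeriodDate <  @PeriodEnd
    GROUP BY CASE @AggLevel WHEN 'STATE' THEN m.StateCode ELSE m.CountyCode END;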

Documented complex target-to-source data mappings for Authorizations being reported from disparate data sources (Claims data mart, Analytics data mart, SAS code, staging tables, etc.). Provided recommendations for standardized 3NF & Fact / Dim design

Assisted DBA team to troubleshoot object and schema level permissions errors.

Revised framework to provide unknown or late arriving dimension keys thereby eliminating the need to perform costly LEFT JOIN operations.
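
The general shape of this technique, sketched with hypothetical names: seed a reserved "unknown" member and resolve keys once at load time, so report queries can use INNER JOIN:

    -- Reserve key -1 for unknown / late-arriving providers
    SET IDENTITY_INSERT dbo.DimProvider ON;
    INSERT INTO dbo.DimProvider (ProviderKey, NPI, ProviderName)
    VALUES (-1, 'UNKNOWN', 'Unknown / Late Arriving');
    SET IDENTITY_INSERT dbo.DimProvider OFF;

    -- The one-time LEFT JOIN happens in the load, not in every report query
    INSERT INTO dbo.FactClaim (ProviderKey, ClaimAmount)
    SELECT COALESCE(d.ProviderKey, -1), s.ClaimAmount
    FROM stage.Claim AS s
    LEFT JOIN dbo.DimProvider AS d ON d.NPI = s.NPI;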

Performed record deduping via analysis of surrogate key and hashed complex business key values.
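
Roughly this approach, with hypothetical names (CONCAT_WS requires SQL Server 2017+):

    -- Keep the newest row per hashed composite business key
    WITH ranked AS (
        SELECT *,
               ROW_NUMBER() OVER (
                   PARTITION BY HASHBYTES('SHA2_256',
                       CONCAT_WS('|', MemberId, PlanCode, EffectiveDate))
                   ORDER BY LoadTimestamp DESC) AS rn
        FROM stage.Enrollment)
    DELETE FROM ranked WHERE rn > 1;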

Performed reorg of TFS backlog by segmenting stories into epics and features. Implemented task assignment to track sprint progression and accountability.

Implemented XML pivot function to provide tabularized/flattened versions of reporting data (while the presentation team figured out how to do it using the reporting platform).

Created dedicated reference data schema that provided dimensions that are common to multiple lines of business.

Changed default isolation levels on batch loaded databases thereby eliminating the need to specify NOLOCK on every SQL statement.
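
Presumably a change along these lines; read-committed snapshot gives readers the last committed row version instead of blocking (or dirty-reading via NOLOCK):

    ALTER DATABASE BatchLoadedDb   -- hypothetical database name
        SET READ_COMMITTED_SNAPSHOT ON WITH ROLLBACK IMMEDIATE;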

Trained junior staff on how to save their import processes into DTSX files, thereby eliminating the need to manually run the data import wizard every time.

Trained junior staff members on how to perform their own meta-data discovery and data profiling via usage of INFORMATION_SCHEMA objects
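
For example, a typical discovery query of this kind:

    -- Find every column whose name mentions 'member', with type and nullability
    SELECT TABLE_SCHEMA, TABLE_NAME, COLUMN_NAME, DATA_TYPE, IS_NULLABLE
    FROM INFORMATION_SCHEMA.COLUMNS
    WHERE COLUMN_NAME LIKE '%member%'
    ORDER BY TABLE_SCHEMA, TABLE_NAME, ORDINAL_POSITION;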

Role Description: GE Aviation (Cincinnati) – Data Engineer, Engines Modernization Operational Data Store (ODS).

Accomplishments

Architected secure base data layer in which disparate HVR feeds of Covered Defense Information (CDI) were aggregated, homogenized, securely routed to a schema-based security model (Amazon Aurora PostgreSQL), and masked as per requirements. As per new regulations, this replaced the legacy process in which several teams of non-US workers managed data masking and CDI compliance. Source ingest data included vendor products such as PPM (Project Portfolio Management), Oracle ERP, and Kronos Workforce Management, as well as several internal proprietary systems. This 3NF base data layer also provided a single source of truth (SSOT) for future reporting needs.

Assumed release manager responsibilities for code promotion to QA and PROD. Instituted versioned release numbers and release logging. Applied PK to release log table to prevent duplicate release deployments. Tracked technical debt by release version (maintained via Rally Agile backlog). Instituted regression testing policy for changes to base layer objects. Published coding best practices checklist which included standardization of naming prefix/suffix, standardization of datatypes, minimization of run-time string parsing, and standardized reference data lists. Instituted code review and sign-off process.

Applied schema-based security model eliminating the need to micro-manage permissions at the object level.

Replaced hourly batch processing with real-time data through sequencing of trigger-based materialized views.
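
On PostgreSQL the pattern can be sketched as below; object names are hypothetical, and EXECUTE FUNCTION assumes PostgreSQL 11+. A non-concurrent REFRESH is used because CONCURRENTLY cannot run inside the trigger's transaction:

    CREATE MATERIALIZED VIEW ods.mv_order_summary AS
        SELECT work_order_id, SUM(hours) AS total_hours
        FROM ods.labor_detail
        GROUP BY work_order_id;

    CREATE FUNCTION ods.refresh_order_summary() RETURNS trigger AS $$
    BEGIN
        REFRESH MATERIALIZED VIEW ods.mv_order_summary;
        RETURN NULL;
    END;
    $$ LANGUAGE plpgsql;

    CREATE TRIGGER trg_refresh_order_summary
    AFTER INSERT OR UPDATE OR DELETE ON ods.labor_detail
    FOR EACH STATEMENT EXECUTE FUNCTION ods.refresh_order_summary();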

Published data dictionary via Alation articles.

Groomed and organized Rally Agile backlog by organizing stories into Features and Feature Sets.

Role Description: Great American Insurance Group (Cincinnati) – Data Architect / Data Modeler

Provided Primary full stack Data Modeling support for Application DB design, Relational Warehouse design, and various Fact/Dim Data Mart designs. Also acted as primary Administrator for ER Studio Repository and co-managed Master Data (MDM) / Reference Data enterprise architecture to serve as SSOT for consuming data-marts and applications.

Accomplishments

Created Enterprise Information Security Warehouse in Oracle 12c then migrated design to Cloudera Hadoop/Hive/Impala. This involved creating exception record sets of non-compliant data which I explicitly excluded from working set objects (as a substitution for constraint technology missing from Hadoop platform). In the requirements process, analyzed proprietary formatted security data (flat files, XML, etc.) and built consolidated data model focusing on intrusion point (Accounts, Technologies, Hardware, etc.) detection. Vendor sources included Qualys Vulnerability Management, Cofense PhishMe, Microsoft Intune Mobile Device Management, IBM Resilient Incident Response Platform (IRP), Symantec Data Loss Prevention (DLP), etc., as well as several custom-built security solutions.

Constructed and maintained Epics, Features, Stories, and Tasks via Rally CA/Agile. Maintained personal and team story velocity well above team averages.

Upgraded ER Studio Repository from version 9.1 to 16.5. Consolidated legacy (application specific) logical models into a single enterprise LDM. Constructed sub-model perspectives to represent application specific concerns.

Synchronized Collibra Data Governance Center business terms dictionary with ER Studio Domain Dictionary. Carefully associated physical names and logical datatypes to Domains, thereby promoting consistency across the physical layer of divergent DBMS platforms.

Implemented user defined logical data types and rules that could be automatically forward engineered into consistent DBMS specific types and constraints (as defined by business rules defined in the logical model).

Conducted model peer reviews to promote consistent use of ER Studio Data Dictionary objects (domains, UDTs, Rules, etc.).

Architected information (abstraction) layer, which exposed meaningful information to consumers absent of internal columns that should remain private to the source system. This provided several benefits: it simplified the security layer, removed the need for consumers to understand complex normalization designs (or under-normalized designs in many cases), and encouraged consistent use of business keys throughout the enterprise. This was accomplished using functions, indexed schema-bound VIEWs, and aggregate tables (when absolutely required for performance).

Reduced type 2 dependency cycles by modeling only transactional data + reference keys whenever possible. This allowed applications and reports to provide real-time descriptions (and other extended properties) when displaying reference data.

Designed ISO 3166 Dimension to provide consumers the ability to support international postal addresses based on the existing domestic address structure.

Built simple integer-based type 2 key for Enterprise (Insurance) Class Code service that provided persistent point-in-time record values. This provided all insurance lines with the ability to perform internal audits and correct compliance issues before fines were issued. The type 2 integrity was maintained by performing hash comparisons and generating a new key when a meaningful correction occurred in the source.

Project Description: Great American Annuity Group (Cincinnati) Enterprise Data Warehouse 2.0

Initial scope was a rebuild of existing architecture while applying best practices to data ingestion, master data management (MDM), and traditional multi-dimensional data warehouse architecture. Project direction changed 5 months in, opting instead for a gradual phased approach.

Accomplishments

Moved agile team from 3-week to 2-week, and finally 1-week sprints.

Stories, Tasks, and Spikes managed via Jira projects, with careful story dependency for managing shared database objects.

Implemented peer review disposition status in Jira workflow to enforce coding and design standards.

Split existing architecture into discrete schemas for managing Stage data, ODS data (3NF), and Data Mart objects (Fact/Dimension tables), while still managing consistency via database-level development objects (user-defined types, functions, etc.).

Implemented new data staging/cleansing pattern which allowed for ETL developers to quickly stage heterogeneous data sources while maintaining full data lineage to source (as per configurable retention guidelines).

Implemented data replication (SQL MERGE) with source Master Data Management (MDM) platform (Magnitude Kalido), thereby qualifying all ODS data with enforced referential integrity (RI) against the enterprise MDM source.
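
A minimal sketch of the replication step, assuming hypothetical table and column names:

    MERGE ods.Customer AS tgt
    USING mdm.Customer AS src
       ON tgt.CustomerBK = src.CustomerBK          -- durable business key
    WHEN MATCHED AND tgt.RowHash <> src.RowHash THEN
        UPDATE SET tgt.CustomerName = src.CustomerName,
                   tgt.RowHash      = src.RowHash
    WHEN NOT MATCHED BY TARGET THEN
        INSERT (CustomerBK, CustomerName, RowHash)
        VALUES (src.CustomerBK, src.CustomerName, src.RowHash)
    WHEN NOT MATCHED BY SOURCE THEN
        DELETE;                                    -- keep ODS aligned with MDM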

Refactored existing (unusable) type 2 design into type 1, modeled after business activity (not system activity), which allowed for differentiation between record corrections and business event changes.

Worked to eliminate derived attributes by normalizing and strongly typing TABLE structures (in ODS), while insulating downstream consumers from such complexities through use of a semantic information layer (selective schema exposure to data consumers).

Implemented User Defined Types (UDT) to enforce domain integrity and provide column name conventions.

A combination of strongly typed tables and UDTs helped to mitigate the need to manually maintain synchronization between our LDM and PDM Erwin models (which were constantly outdated due to Erwin R9's inability to persist inheritance relationships).

Replaced complex SSIS code with simple source (database VIEW)-to-target design patterns, thereby eliminating the constant need to perform SSIS impact assessments when making base table DDL changes. All fact/dimension/reporting views were schema-bound to prevent DDL changes from unknowingly affecting downstream reporting processes.
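
Schema binding in this pattern looks roughly like the following, with hypothetical names:

    -- WITH SCHEMABINDING blocks DDL on dbo.FactPolicyPremium that would break
    -- this view; two-part names and an explicit column list are required.
    CREATE VIEW dw.v_PolicyPremium WITH SCHEMABINDING AS
    SELECT p.PolicyKey, p.TermKey, p.PremiumAmt
    FROM dbo.FactPolicyPremium AS p;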

Complex source to target ETL performed via native SQL MERGE, reducing the complexity of ETL code.

Implemented state/postal code validation preventing commission payments to wrong territory managers.

Implemented binary HashKey pattern to facilitate quick matching and change detection of large multi-million record datasets. This also had the effect of simplifying and improving the readability of merge code. Additional efforts were made to centralize the generation of shared fact/dimension tables so that surrogate keys were consistent across all data marts, as well as eliminating the need to merge changes to data mart tables (fast kill-n-fill)
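
One plausible shape of the HashKey pattern, with hypothetical names; HASHBYTES is deterministic, so the hash can live in a persisted computed column:

    -- One 32-byte compare replaces column-by-column change detection
    ALTER TABLE dbo.FactPolicy ADD RowHash AS
        CAST(HASHBYTES('SHA2_256',
             CONCAT_WS('|', PolicyNo, TermStart, PremiumAmt)) AS binary(32))
        PERSISTED;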

Preserved lowest possible grain of source data, providing aggregation (often virtual) as per business need. Sometimes legacy data was pre-aggregated and needed to be reverse engineered for migration.

Project Technologies/Products

SQL Common Table Expressions (CTE), SQL Pivot/Un-pivot, SQL Schema binding, SQL Merge, Binary Hashbyte conversion, UNION SQL VIEWS, Erwin Data Modeler R9, Erwin Model Mart, Logical Data Modeling (LDM), Physical Data Modeling (PDM), Entity Inheritance, Domain Data Typing, Data Item standardization, forward/reverse engineering, Reference Data, Master Data Management (Magnitude Kalido MDM), Meta-data management

Project Description: Bank of America (Charlotte, NC) (Global Data Security / Third Party / Vendor Management)

The goal of the GDS “Data as an Asset” project was to create a centralized data store for ingesting security log and intrusion detection events, applying data and domain integrity rules, and associating that data to master reference data. Role on the project was to design the Conceptual Data Model (CDM), Logical Data Model (LDM), and Physical Data Model(s) (PDM). Role on the TPM project was similar, but with handoff of the LDM to a technology partner.

Accomplishments

Interpreted enterprise data standards and assessed compliance within various LOB applications

Installed and Configured Sybase PowerDesigner Repository for model/code sharing, versioning, and design document collaboration.

Performed source data profiling against Cloudera Hadoop staging data via Hive-SQL.

Developed LDM (Sybase PowerDesigner) to standardize several ingest sources/subject areas (Qualys Vulnerability Management API, InTrust Active Directory Event Log, Cisco Secure Access Control Server (ACS), Portfolio Project Management (PPM), etc.). Later reverse engineered to CDM for alignment with Identity Access Management (IAM) Data Warehouse (Oracle Exadata).

Assisted infrastructure support team with configuring Kerberos Authentication (SPN registration & AD trusted for delegation settings).

Architected real-time data integration process between PPM (the GIS platform) and PPRT (the Enterprise platform). This was achieved via Linked Server authentication, which I also configured across all 3 environments (dev/integration/prod). The process was cross reference table driven so that additional data elements could be synchronized via the INSERT of meta-data translation records with zero code change.

Championed Informatica PowerCenter external reference data feature to leverage existing reference data sources (Enterprise and Line of Business), thereby limiting the proliferation of reference data sources within GIS.

Identified missing CHECK CONSTRAINTS, UNIQUE INDEXES, and USER-DEFINED TYPES (UDT), whose absence was causing data and domain integrity issues in the existing PPM data. If unaddressed, this would cause synchronization collisions/mismatches between PPM and the enterprise platform (PPRT).

Assisted Informatica design team with translating data type, inheritance, and normalization rules (defined within the LDM) into Informatica mapplets.

Identified several circular reference issues between the project/portfolio management platforms (PPM & PPRT), initiative funding platform (PCM), base funding platform (Clarity), enumerated risks (STORM), AIT audits (and PSIA – potentially self-identified audits), and General Ledger (GL), that were causing budgeting variances and project/portfolio/risk mitigation misalignment.

Performed business rules audit of existing custom third party management system, while evaluating platform for expansion to support all third parties (non-vendors). Business rule validation included (but not limited to) vendor role management and assignment with proper cardinality, risk portfolio and line of business alignment, proper ingestion of enterprise data sources (such as HR), and proper alignment of partner data with Dun & Bradstreet (DUNS) data.

Performed LDM modeling to provide business stakeholders a detailed yet non-technical visualization of how entity relationships (ER) & attributes relate across both the enterprise and our own line of business. This entailed reverse engineering several RDBMS systems: HR (Oracle), custom vendor mgt. system (SQL Server), reporting data warehouse (Netezza), etc.

Project Technologies/Products

Toad Data Point 3.6, Netezza, SQL Server, Linked Server UNION SQL VIEWS, Hive, Hadoop, Cloudera, Hive-SQL, Sqoop (Java), SAP Sybase PowerDesigner 16.5, PowerDesigner Repository, Canonical/Conceptual Data Modeling (CDM), Logical Data Modeling (LDM), Physical Data Modeling (PDM), Entity Inheritance, Domain Data Typing, Data Item standardization, forward/reverse engineering, Informatica PowerCenter, Informatica Developer, Reference Data, Master Data Management (MDM), Business Requirements Documents (BRD).

Project Description: Schaeffer's Investment Research (Blue Ash, OH) – CRM Data Architect, DBA

SIR utilized Microsoft Dynamics CRM 2011 as its ERP and order processing system. For product and marketing promotion delivery, several delivery platforms were used (CRM, Lyris List Manager, Gammadyne).

Accomplishments

Daily ETL jobs that processed records from CRM to various delivery systems were a mixture of SSIS and .NET applications. This architecture was replaced by a single linked server view and native SQL MERGE code.

Marketing segmentation was previously hard coded in the daily ETL. Implemented a product-to-product relationship hierarchy which allowed marketing personnel to dynamically change market segmentation in real time by changing the product-to-product relationships.
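
The relationship-driven segmentation can be pictured as a self-referencing table walked with a recursive CTE; the schema below is hypothetical and assumes an acyclic hierarchy:

    DECLARE @RootProductId int = 100;  -- segment anchor chosen by marketing
    WITH seg AS (
        SELECT r.RelatedProductId
        FROM crm.ProductRelationship AS r
        WHERE r.ProductId = @RootProductId
        UNION ALL
        SELECT r.RelatedProductId
        FROM crm.ProductRelationship AS r
        JOIN seg ON r.ProductId = seg.RelatedProductId)
    SELECT DISTINCT RelatedProductId FROM seg;  -- the live market segment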

New segmentation referenced linked functions and linked server views, which provided sales and engagement data in real time rather than the overnight batch-processed dataset previously provided.

Unsubscribe (by sales channel) functionality was implemented for email opt-out compliance, which was previously damaging the client’s reputation with the email ISPs (HOTMAIL, GMAIL, YAHOO, etc.)

Legacy code referenced CRM base tables (not best practice), which resulted in UTC time conversion errors and poorly documented, vulnerable code (as it referenced private ID values rather than public code values). Implemented utilization of CRM filtered views, which perform implicit UTC conversion and expose business code values for metadata lookup (best practice). To utilize filtered views, remote queries must use domain account authentication, whereas SQL authentication was previously used; accomplishing this required implementing Kerberos authentication (also more secure).

SQL Performance tuning on CRM advanced finds for CSRs and Sales

Disaster recovery (DR) upon server hardware failure.

Security audit of missing and blank legacy SA passwords

Implemented Domain Group authentication on SQL Servers. No more stand-alone logins.

Implemented least privilege security model for application authentication. Applications were previously authenticated as SA.

Managed offshore resources (Eastern Europe) for production DBA support (backups, workload tracing, legacy server retirement, patching, etc.).

Occasional SSRS support when my report writer went on maternity leave.

Project Technologies/Products

SQL Server Integration Services (SSIS), SQL Server Management Studio (SSMS), Lyris LM (List Manager) Service, Gammadyne, Microsoft Dynamics CRM 2011, SQL Server Reporting Services (SSRS)

Sogeti (various clients) 2012 to 2013

Project Description: Ohio Department of Transportation (ODOT) Speed & Travel Time Data Warehouse / Business Intelligence

ODOT receives near-real-time S&T data from 12,000+ cameras throughout the state of Ohio. This data needs to be aggregated, run through a business rules/confidence assignment process, and written to several LOB and Reporting PerformancePoint Dashboards & Scorecards.

Accomplishments

Re-architected data warehouse STAR schema (and SSIS ETL) to allow for 5-minute data granularity (previously aggregated to the hour).

Made corresponding changes to SSAS data cube and time/calendar dimensions

Implemented Cube Partitioning which decreased hourly cube refresh time by 60%

Placed existing reporting solution (SSIS & SSAS) under source code control (Team Foundation Server)

Implemented SDLC lifecycle process that allowed team to test and promote changes from development to production environment(s)

Created process for automating the generation of future time dimension data.
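
Such generation is typically a set-based insert along these lines; the DimTime columns are hypothetical:

    DECLARE @Start date = '2026-01-01', @End date = '2027-12-31';
    WITH d AS (
        SELECT TOP (DATEDIFF(DAY, @Start, @End) + 1)
               DATEADD(DAY, ROW_NUMBER() OVER (ORDER BY (SELECT NULL)) - 1,
                       @Start) AS FullDate
        FROM sys.all_objects AS a CROSS JOIN sys.all_objects AS b)
    INSERT INTO dbo.DimTime (DateKey, FullDate, CalendarYear, CalendarMonth)
    SELECT YEAR(FullDate) * 10000 + MONTH(FullDate) * 100 + DAY(FullDate),
           FullDate, YEAR(FullDate), MONTH(FullDate)
    FROM d;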

Identified missing levels in Road Segments dimension.

Altered internal data feeds to leverage existing data assets and eliminate reliance on several 3rd party leased data subscriptions.

Identified broken Unicode ETL conversion (Oracle to SQL)

Project Technologies/Products

SQL Server Integration Services (SSIS), SQL Server Analysis Services (SSAS), Team Foundation Server (TFS), SharePoint PerformancePoint, SQL Server Data Tools (SSDT), Business Intelligence Development Studio (BIDS), Oracle

Project Description: BMW Financial Services FDIC Compliance

Subject to FDIC compliance, BMW was required to alter (mask) Personally Identifiable Information (PII) in all non-production systems to protect customers from fraud/identity theft.

Accomplishments

Performed data discovery across 107 data stores throughout BMW North America to identify common data elements and dependencies.

Worked with business stakeholders to identify appropriate data masking techniques which would protect customer anonymity while maintaining usefulness of the data within the non-production system.

Automated discovery result evaluation process so that it could be run incrementally going forward for the purpose of identifying new data elements subject to PII regulation.

Project Technologies/Products

SQL Server 2008, ER Studio, Camouflage (Data Masking Tool)

Consulting (Boy Scouts of America, Tampa/Dallas) 2011 to 2012

Project Description: Boy Scouts of America Service Oriented Application

BSA was replacing its existing Oracle Forms application with the “Boy Scouts of America Service Oriented Architecture (BSASOA)” application. As lead data architect, Mr. Curcillo was responsible for designing the enterprise data model to support lead procurement, youth member / adult volunteer registration, unit management, order management, and organization management. The new solution was hosted in SharePoint 2010 on SQL Server 2008.

Accomplishments

Gathered data requirements from multiple business units (unit leaders, order fulfillment, finance, etc.)

Performed logical and physical modeling (using Erwin) of enterprise data store for Boy Scouts of America's new Service Oriented Architecture (BSASOA) application.

Developed ETL for legacy data migration of youth/adult membership information, member accomplishments, organization structures, etc.

Administered / supported multiple SDLC environments in Dallas, Tampa, Toronto, & India.

Project Technologies/Products

SharePoint, SQL Server, Oracle, Erwin, Redgate SQL Compare, Attunity SSIS CDC

IGS Energy (Columbus, OH) 2002 to 2011

Project Description

The scope was to build a custom CRM (.NET) solution to replace the existing MS Access/SQL/VBA application.

Accomplishments

Performed both logical and physical database design for CRM, Contract Pricing / Product Management, Transaction Management, Prospect Management, and Reporting (Crystal Reports, SSRS, etc.)

Managed 20+ database SDLC environments supporting custom applications and vendor solutions.

Installed / Managed 2-node SQL Failover Cluster for high availability (Production)


