Michael Lucio
Sparta, New Jersey
*********@*********.***
SUMMARY:
Highly accomplished manager with proven success in leading and creating strategies to integrate disparate computer systems, leveraging best practices including CMM and Master Data Management (MDM) to enable enterprise-wide consolidation of source-system data for globally distributed organizations.
Independent consultant with more than 35 years' experience leading people and delivering clear, simple solutions to complex data management challenges, including rescuing at-risk projects for some of the world's largest companies.
Experience Highlights:
Primarily focused on Master Data Management, Data Architecture, Real-Time Analytics and Project Management
International Offshore Management
Multimillion-Dollar Budgets
Gramm-Leach-Bliley Financial Modernization Act
Data Manager Experience: Over 30 years' total experience, focused almost exclusively on leading large systems integration efforts that often required the adoption of formal SDLC project management methodologies.
Master Data Management (MDM) Expertise: Successful deployment of an MDM solution lets an organization realize opportunities not available with traditional integration (e.g. cross-selling and fraud detection). Leading these efforts, however, requires a unique mix of technical, project management and political skills.
Data Analysis: I have extensive data analysis experience across multiple database management systems, including DB2, Oracle, IDMS, VSAM, Sybase, Teradata and Model 204.
Database Design: I have been responsible for system and data architecture, design, logical database model and physical database design.
Project Manager Experience: Over 30 years' total experience, focused almost exclusively on leading large systems integration efforts that often required the adoption of formal SDLC project management methodologies.
Work History:
(02/2024 – 01/2025) Montefiore Medical Center – New York, United States
Data Technical PM
Montefiore Medical Center is a premier academic medical center and the primary teaching hospital of the Albert Einstein College of Medicine, and it is expanding rapidly through acquisitions of smaller providers. Montefiore uses the International Classification of Diseases (ICD) to classify and code health information, patients' social circumstances, and the causes of diseases, injuries and death. ICD-11 replaced ICD-10 in January 2022.
I was hired to lead the project that combined multiple legacy systems from recently acquired healthcare providers into a new single source of truth while concurrently deploying ICD-11, as required by the Electronic Medical Records (EMR) mandate that incentivizes and funds healthcare professionals' use of EMR.
Developed two project plans, one for Master Data Management (MDM) and one for adoption of the ICD-11 revision, which were successfully executed in parallel using a blended onshore/offshore team.
Designed the ETL workflows, which were then deployed using Informatica PowerCenter 10.4.1.
Managed an offshore ETL team of four people in India doing the Informatica PowerCenter coding.
Developed detailed profiling (SQL) scripts and data quality validation workflows for both the MDM project and the ICD-11 upgrade project. Created complex test cases, which were managed using an in-house tool.
Hired, trained and managed 5 quality assurance people to profile the legacy source data and the target MDM data, verify consistency between legacy and MDM, and report data quality issues.
In charge of data validation, ensuring that everything moved from the legacy systems was loaded correctly into the new system.
Verified that the target database design could accommodate the business requirements, modifying it as needed.
Responsible for JIRA implementation
Dynamic Database Systems
Data Architect/Project Manager 12/18 – 09/24
Designed and deployed an Enterprise Information Integration & Management Framework to manage the entire lifecycle of the company's data, from receipt to retirement. The new system, its data governance framework and all required business processes sat atop Oracle 11, Oracle Business Intelligence Enterprise Edition (OBIEE), Oracle Data Integrator (ODI), PHP for GUI screens, Chameleon Metadata as the metadata management layer, Global IDs as the data quality/profiling/matching engine (see http://globalids.com/), and Navicat/Toad as analysis and data modeling tools.
Provided all the capabilities needed to integrate multiple legacy systems around a set of Master Data subject dimensions (e.g. Product, Source, Supplier, Agreement) and a faceted classification look-up, including lineage and cross-referencing of historic legacy data. Designed the logical and physical database to house the new metadata (tools used: Navicat, Toad, heavy SQL).
This project was an exercise in multi-domain MDM platforms: with a single repository for the Master Data, it was easier to ensure standards were met. Our architectural vision was to organize the "old" and "new" features into an integrated, manageable solution, giving developers a way to design scalable applications by decomposing them into services along business functions.
The data feeds came from various parts of the world and various organizations; we had to understand each piece of data and make it available. Informal heat maps were used to track user behavior. Using ETL techniques, we extracted, transformed and loaded 18 feed systems from 18 different platforms and created a 19th system with translated, uniform data for use by customers.
We used the Global IDs software tool and Informatica PowerCenter to match and then enrich the data. Modeled the logical target database, then designed the physical Oracle database; the translated data was loaded into this Oracle database and accessed via Python code. Brought the business and technical teams together to agree on the business needs surrounding the enterprise data, and wrote specifications for the ETL Informatica PowerCenter team to transform the data into usable information for the business.
Merrill Lynch/Bank of America 12/17 – 11/18
Data Manager/Project Manager
I was initially contracted to provide leadership and institute IT governance across a series of application groups, but my assignment charter changed after several months.
I was given a development group of both onshore and offshore resources that supported a series of distributed applications, along with the environments that supported those applications. The UI stack was the .NET Framework, C# and Java; the middleware consisted of TIBCO servers.
My job was to manage the resources, establish process, and upgrade the environments.
As I managed the tasks before me, my responsibility grew. Merrill Lynch/BOA was creating a Centralized Production Support (CPS) team, and I was put in charge of building and managing the organization.
Using a matrix organization plan, the main functions of the CPS consisted of triage, monitoring and reporting, and remediation of defects. As applications moved into PROD, they were turned over to CPS after complying with a list of "onboarding" criteria developed by my teams.
Prudential Financial – Roseland, New Jersey 02/15 – 11/17
Chief Architect / Management Consultant
Retained by Prudential Financial to salvage (delivered under budget within an 18-month project schedule) a 10-year attempt at converting a mission-critical licensing and agent compensation data store from IDMS to DB2.
Required collaborative teamwork with the Legal Department in interpreting and implementing Internet securities offerings exempted under § 201(a) and § 201(c) of the Uniform Securities Act. Because interpretations by the States changed quickly, government regulatory licensing and license delegation through registered hierarchies required particular focus, as did compliance with regulations restricting Agents, Brokers and Broker-Dealers to offering insurance/securities products only when correctly licensed and registered.
Created, implemented and led a global virtual team and delivery software development model to convert the Agencies Database (ADB) system from IDMS to DB2. Built an organization that leveraged the time difference between India, Ireland and Roseland, NJ to allow 24-hour-per-day software development, testing and installation: applications were coded in India, tested in Ireland and installed in the Roseland, NJ data center, all within the Software Engineering Institute's (SEI) Capability Maturity Model (CMM) framework.
Managed a team of DBAs that assumed all DBA functions for the ADB organization. Subsequent responsibilities included training, organizing and assessing the effectiveness of a team of programmers sent from Tata Consultancy Services (TCS) in India to learn to provide database administration services once the DBA function was transitioned.
Managed the team that connected the ADB mainframe DB2 data stores directly to Prudential's award-winning Prudential Xpress website via an MQ Series and LU 6.2 message request and resolution approach, allowing authorized agents to complete most applications, present product illustrations and locate Broker-Dealers authorized to distribute the Prudential Variable Portfolio through a standard internet interface. This was a major improvement over the time lag of requiring agents to gather data in the field and then complete the application process back in their offices.
Led a team of data modelers responsible for the new agents database in DB2, including the physical design.
Responsible for a $15 million budget and a staff of 110 located in India, Ireland and Roseland, NJ.
Provided leadership to United States, European and Asian outsourcing partners, including implementing formal communication and verification procedures to minimize the impact of cultural and communication differences between the groups.
Negotiated mutually agreeable terms with outsourcing partners related to fixed price project contracts, time & materials contracts and their associated staffing requirements.
Resolved inconsistencies, as final arbiter, between global team participant organizations, as the Ireland, India and Roseland, NJ participants each had their own distinct, inconsistent schemes for CMM compliance.
Led effort to convert existing DB2 Agency Systems to Oracle Party Systems.
Led technical teams that reduced DP expense by $5 million over a 2-year period.
AIG
Database Architect/Project Manager 04/13-01/15
Led teams that designed and implemented a Managed Care Data Warehouse; responsible for the data model and physical database.
Led teams that built 'feeds' from various Managed Care Organizations to house their data in our warehouse.
Led team that developed reporting applications to return the data to the users.
Responsible for reviewing system and data architecture, design, logical database model and physical database design. LBMS was used.
Responsible for staffing of teams to accomplish the various projects.
Developed 'outsourcing' vendor partners for programming.
Investigated various repository tools before selecting LBMS for the first phase and Rochade as the long-term tool.
Led JAD sessions with other system and user groups to develop the requirements for this repository.
erwin Data Modeler was used to depict the logical data model.
Prudential Financial 06/00-02/13
Chief Architect / Management Consultant
Retained by Prudential Financial to collaborate with its Legal Department in the interpretation, design, implementation and measurement of compliance with §7702 and §7702A of the Internal Revenue Code, as well as the subsequently adopted Revenue Procedure 99-27 regulating compliant treatment of premium payments.
Created a framework of procedures, implemented them, and had them audited and confirmed at Level 2 by an independent SEPG unit of Prudential in preparation for the Software Engineering Institute (SEI) CMM audit. Pre-audit activities included educating executive management in CMM requirements, soliciting their input and incorporating it into a CMM plan adopted by the enterprise.
Led the redesign of a CICS/DB2 architecture that exclusively supported one product line to exploit MQ Series for use by the entire enterprise for requests from any product/platform.
Assumed responsibility for DB2 / COBOL II / CICS application consisting of OLTP and batch application components for automatic identification of policies used by customers as investment instruments to track taxable implications of changes in policy cash values resulting from customer deposits or withdrawals.
Led the team that successfully reduced pre-existing bottlenecks, allowing the entire batch stream to complete in approximately 33% of the elapsed time measured in pre-tuning runs.
Incorporated changes into the LBMS data model and generated physical database to completely satisfy IRS, state and business community requirements after the first production release went live.
Led team that developed procedures for Database Backup/Recovery and capacity planning.
Responsible for reviewing system and data architecture, design, logical database model and physical database design.
MCS Computing, Inc 02/82-05/00
Senior Consultant
Data Management – Managing data teams of DBAs and Data Analysts
Project Management – Managing various project teams
Data Modeler – Multiple projects required database creation, both logical and physical
Data Analyst – Logical and Physical database design
Database Administrator - IDMS DBA
Project Leader – Leader on various projects
Education:
1986 New Jersey Institute of Technology
MS Program in Computer Science
1973 University of Vermont
BS in Economics
References: Available upon request