John C. Menard
**** ** **** ******, ****** City, MO 64151 816-***-**** ***********@*****.***
SUMMARY:
An Information Technology professional with more than 35 years of experience in all phases of the Software Development Life Cycle, from requirements and design through coding, testing, deployment, and maintenance of applications and systems. This includes designing data structures to support enterprise OLTP and OLAP systems.
A Data Modeler/Data Architect with 20 years of strong hands-on experience creating Conceptual, Logical, and Physical data models for enterprise systems in many DBMS environments. Modeling tools include System Architect, ER/Studio, Erwin, and CoolBiz.
10 years as the Data Warehouse Analyst at DST, designing data warehouses/data marts that enabled our Mutual Fund customers to run their own custom queries for reporting.
20+ years using reverse engineering processes to convert legacy systems to newer data structures. I was assigned to many projects migrating data from companies we acquired; my responsibility was to get the acquired companies' data migrated into our systems.
As a Data Modeler/ Data Architect, I created/updated data models in 3rd Normal Form using modeling tools to support enterprise legacy systems. Also, created/updated many data warehouses and data marts to support corporate reporting requirements.
A working knowledge of ETL processes, including defining source-to-target mappings, creating/defining transformation rules, and defining staging areas for loading data at the target location. I used tools such as MS SQL Server, Oracle's PL/SQL Developer, and Informatica's Data Profiling module to analyze the data and locate source system data.
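The source-to-target mapping with transformation rules described above can be sketched in a few lines of Python. This is an illustrative sketch only; the column names and rules are hypothetical examples, not taken from any system named in this résumé, and real ETL tools such as Informatica or DataStage provide this capability natively.

```python
# Hypothetical source row as it might arrive from a legacy system.
source_row = {"CUST_NM": "  acme corp ", "BAL_AMT": "1250.50", "ST_CD": "mo"}

# Source-to-target mapping: target column -> (source column, transformation rule).
mapping = {
    "customer_name": ("CUST_NM", lambda v: v.strip().title()),
    "balance":       ("BAL_AMT", lambda v: float(v)),
    "state_code":    ("ST_CD",   lambda v: v.upper()),
}

def transform(row, mapping):
    """Apply each transformation rule to build the target (staging) row."""
    return {tgt: rule(row[src]) for tgt, (src, rule) in mapping.items()}

staged = transform(source_row, mapping)
print(staged)
# {'customer_name': 'Acme Corp', 'balance': 1250.5, 'state_code': 'MO'}
```

In practice the mapping document also records data types, nullability, and default values for each target column; the staging row produced here would then be bulk-loaded at the target location.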
Extensive experience writing system specifications, translating user requirements into technical specifications, and creating/maintaining/modifying database design documents, including detailed descriptions of logical entities, relationships, and attributes and their conversion to physical table specifications.
Ability to communicate with technical, non-technical, and senior management audiences. Have been responsible for requirements gathering with internal and external clients on many large projects.
Also taught “Introduction to Data Modeling” for 9 years to IT and business employees, giving them an overview of how data models are created.
Experience:
March 2016 – Present – AT&T Boise, ID.
Working as a Data Analyst/Data Modeler for AT&T on a multiyear project to reverse engineer hundreds of AT&T’s legacy systems. We are creating complete data models with definitions for these legacy systems and feeding them into an enterprise metadata repository that will be used for reporting and impact analysis. Data models are created using the Erwin modeling tool.
June 2015 – December 2015 - Micron (Experis) Boise, ID.
I was the Business Intelligence Analyst on the Procurement Initiative Project, responsible for creating the source-to-target mapping for all data from Micron's source systems into their SAP data warehouse. Created data flow diagrams, data models, and process models using Enterprise Architect. Responsible for coordinating all deployments through the environments, moving changes from development up to the production systems, including SSIS packages and SAP transports. I also collected and documented all business and technical requirements for the project.
June 2014 – June 2015 – National Association of Insurance Commissioners (NAIC) Kansas City MO.
As the Data Modeler/Architect on one of four teams, I was responsible for all data structure design and data migration PL/SQL scripts used to develop and convert the NAIC's legacy State Based Systems to a new version. I worked with other Data Modelers to coordinate and maintain migration scripts from the development platform up through the different environments for the new system. For the data modeling portion of the job, I used the Erwin modeling tool to create logical data models for our team's subject areas. All load and migration scripts were created using PL/SQL Developer.
July 2013 – June 2014 – Blue Cross Blue Shield Kansas City MO.
As the Metadata Analyst, I used Erwin to create data models for inbound/outbound data structures, which provided the targets for information coming into and out of Blue Cross Blue Shield KC. I worked with Business Data Analysts to perform source-to-target mapping and create transformation rules for all data processed through the DataStage ETL tool. Created XML from Erwin to load the metadata into a home-grown mapping documentation tool and DataStage’s ETL developer environment. As part of the Data Governance group, I approved/rejected all project mappings; the approval process checked that standards were followed to enable efficient and effective use and reuse of the company’s data.
November 2012 – July 2013 – ConocoPhillips, Houston, TX.
Data Center of Excellence (DCOE) Information Architect – Accountable for the development of data standards, definitions, and rules. Performed analysis and alignment of data movement across various corporate systems. ConocoPhillips had hundreds of systems containing data that was not kept in sync across the corporation, and purchased the Informatica suite of components to allow the DCOE to analyze and design a Data hub and an MDM hub. In my role as Information Architect, we used some of the components to do data profiling and SOR (System of Record) assessments across multiple systems to determine which systems contained what were called the golden records. After determining the source systems, we used Informatica to develop ETL processes (including transformation rules) to move data from the source systems through our new Data and MDM hubs. We also developed processes to update each system's data to keep all corporate master data in sync. Since Informatica was a new tool to ConocoPhillips, we wrote our own PL/SQL queries to validate the information produced by the tool. Created data flow diagrams and performed data modeling for new and existing ConocoPhillips systems, including some reverse engineering of systems not previously modeled.
December 2011 – September 2012 – Accenture Contractor for Shell Oil, Houston TX.
OGEAP Enterprise Data Architect – Enterprise Data Architect for Shell’s Commercial to Finance (C2F) project. In the Data Architect role, I created Conceptual and Logical models for Shell’s Onshore Gas Enterprise Architecture Program (OGEAP) that included entities across the entire Shell value chain. These data models became part of Shell’s new Data Architecture and were the first steps in creating an enterprise reporting warehouse for Shell. Additional responsibilities included performing technical reviews with team members, business SMEs, and Shell stakeholders to ensure consistency and accuracy of the data models for each business area. Involved in reverse engineering of existing application data structures to be included as part of the C2F program. Provided input for the creation of the Data Landscape Guide, including input for Data Governance to define Data Definition Owners and Data Value Owners for Shell’s master data. Provided the entity list and conducted meetings with SMEs to determine CRUD matrices for the current and to-be processes.
2001-2011 – DST Systems Inc. Kansas City, MO.
· Senior Data Modeler/Analyst - Responsible for gathering data requirements for multiple projects (anywhere from 25-35 at a time) across multiple DST products, including TA2000, TRAC-2000, Voice, FanWeb, and our corporate systems applications. This required meeting with project teams, analyzing business and technical requirements, and building logical and physical data models using ER/Studio. We delivered DDL in 3rd normal form to our DBAs to produce tables for our project teams to meet their business requirements. I also taught “Information Modeling” to DST technical associates.
DST’s enterprise systems run on IBM DB2 platforms. Our data models were maintained and updated using ER/Studio. After project data models went through DST’s review process, they were integrated into the enterprise model for each system. All enterprise data models were available to the business and development community at DST. As a Data Modeler, my responsibility was to create DDL from the tool to be submitted to our Database Services group.
I was also the Data Analyst assigned to support our PowerSelect product. PowerSelect is a data warehouse that allows our clients to write and run their own SQL queries against the PowerSelect tables for their reporting purposes. As part of my role, I was responsible for working with the project teams to identify their data requirements and develop data solutions for the extract, transform, and load procedures used by PowerSelect. I was also responsible for creating all views for the PowerSelect product.
1993-2000 - Haldex Brake Systems Kansas City, MO.
· Project Engineer/Manager, Customer 2000 Data Warehouse - Led a team to define internal and external customer requirements. Designed, developed, and implemented a customer information data warehouse. This cross-functional team consisted of personnel and management from Credit Services, Customer Service, Sales, Marketing, E-Commerce, and Manufacturing. The team discovered 21 separate customer data sources throughout the corporation. We developed a single database and converted the customer data from the 21 sources. This was a multi-tier client/server system, utilizing Windows 95/98 for the client, Windows NT for the middle tier, and SQL Server 7.0 for the server side. The database was designed and maintained using Erwin, and all requirements were maintained using Requisite Pro.
· Direct Connect - An online character-based system that allowed external customers to dial into Midland's order entry system through General Electric Information Systems (GEIS). This online system allowed our customers to place orders, check order status, back-order status of parts, and stock availability.
· Sales Analysis/Sales Reporting Data Warehouse - Created a multi-tier system to store, query, and retrieve corporate sales information. This database was defined through a requirements gathering process with several small cross-functional teams and was used by management and the sales force to analyze sales information for the company. Reports could be generated by customer, date range, part number, product line, market, buying groups, and any combination selected by the Haldex user.
For the systems above, part of the development required “SQL on the fly” to be incorporated into the system to run the users' queries. This took the parameters the end users selected, built the SQL within the application, and submitted it to retrieve the results the users expected.
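The “SQL on the fly” approach above can be sketched as follows. This is a minimal illustration, not the original implementation: the table and column names are hypothetical, and it uses bound parameters with a column whitelist, which is how such dynamic query builders are typically hardened against SQL injection today.

```python
# Columns the user is allowed to filter on (hypothetical names,
# based on the report criteria listed above).
ALLOWED_FILTERS = {"customer", "part_number", "product_line", "market"}

def build_sales_query(filters):
    """Build (sql, params) from whatever filter combination the user selected."""
    clauses, params = [], []
    for column, value in filters.items():
        if column not in ALLOWED_FILTERS:
            raise ValueError(f"unknown filter: {column}")
        clauses.append(f"{column} = ?")   # placeholder, never the raw value
        params.append(value)
    sql = "SELECT * FROM sales"
    if clauses:
        sql += " WHERE " + " AND ".join(clauses)
    return sql, params

sql, params = build_sales_query({"customer": "C100", "market": "OEM"})
print(sql)     # SELECT * FROM sales WHERE customer = ? AND market = ?
print(params)  # ['C100', 'OEM']
```

The application would then submit the generated statement and parameter list through its database driver, so any combination of user-selected criteria yields a valid query without hand-written SQL for each case.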
· EDI Project Manager - Worked for a sister company in McHenry, IL. Developed interface applications to process all EDI transactions for Brake Parts, Inc. Data was fed/extracted from the corporate legacy order entry system and sent to or received from external customers through VANs or FTP.
Education:
· BS Management, Computer Information Systems - Park College
Languages / Tools / DBMS
Delphi, Modula, Pascal, COBOL, VAX BASIC
MS SQL, PL/SQL Developer, SAP BW
System Architect, ER/Studio, Erwin, CoolBiz
Informatica, DataStage
DB2, DB2 UDB, Oracle, MS SQL Server