Data Project

Location: Tampa, FL

OBJECTIVE

IT Technical Lead, Architect and Project Manager with over 18 years of solving complex, mission-critical problems and leading management, development, infrastructure and testing teams.

System Architect and former Computer Science Professor: brings the soft skills, experience and knowledge of management, analysis, design, architecture, development, coding, testing, infrastructure, training and standards setting.

Solution Architect, Innovator and Strategic Planner: creates next-level intelligent solutions and architects/designs solutions for Big Data, CRM, Business Intelligence, Analytics, Data Streaming, Cloud, Governance, the next generation of Cyber Security, web-mobile-cloud services and turnkey DevOps services.

Business Owner: owns, operates and manages IT consulting companies plus a family business.

Technologist: embraces leading-edge technologies and methodologies.

Project Manager: works with clients, investors and prospective projects to develop business plans, proposals, prototypes, management plans, proofs of concept, balance sheets and taxes, and manages project budgets.

Visionary: architects, designs and develops intelligent platforms and frameworks, security, dynamic business rules and the next generation of intelligent data services.

Team Player: inspires teams and brainstorms solutions to harness the team's brainpower.

Excellent oral and written communication skills, with the ability to present ideas and build consensus when working and engaging with clients and teams. Demonstrates exceptional organizational and interpersonal skills, professionalism, resourcefulness and a strong sense of accountability. Grasps issues quickly and makes educated, critical judgments in the absence of complete information.

EDUCATION and PROFESSIONAL DEVELOPMENT

Earned a Master's degree in Computer Science from the University of Illinois.

Earned a Bachelor of Science in Mathematics & Chemistry from the University of Illinois, Chicago, Illinois.

PROFESSIONAL AND BUSINESS EXPERIENCE

June 2018 to Present – Services Map Platform (Tampa) Position: Architect/Analyst/Developer/PM

http://gdprarchitects.com/SMP_ArchitectSummary.html

Global institutions (including financial institutions) handle their growing Big Data and distributed systems by employing multiple data centers, active sites (near-zero downtime), bi-directional replication, security, web services, cloud platforms and web-mobile services. Their major implementation and coordination issues are the use of different technologies, platforms, frameworks, data exchange formats and vendor support. On/offshore infrastructure, development and testing teams are additional constraints. These institutions are not in the software business, and they end up building throwaway software systems. Investing in mapping new and existing services for reuse would reverse this costly trend and provide a reusable platform for future products and services.

Services Map Platform (SMP) is an intelligent platform that harnesses virtualization, cloning, DevOps and the reuse of existing services. It maps the existing structure, new services, business processes and dynamic business rules for maximum reusability.

SMP has vertical and horizontal implementations. Vertical implementation is done by building virtual servers as services; SMP clones services (servers) to load-balance high web traffic. Horizontal implementation is done by mapping existing and new services with intelligent data exchange, security, Dynamic Business Rules and web services.

Data exchange is also a virtual service (server), with parsers and converters to handle any data exchange parsing and formatting.

Security is implemented as hardware and software security layers.

Mapping of services is implemented vertically for each tier and its running virtual servers, regardless of the underlying technologies and frameworks (Java, .NET, Spring, RESTful web services, Oracle PL/SQL, WebSphere, Angular, jQuery, JavaScript, etc.).

SMP's DevOps provides the underlying infrastructure and implements virtualization and cloning by building virtual environments and their tiers (production, testing, database, operating system, batch, bridges, etc.).

SMP's DevOps provides Turnkey DevOps Editors, which any of the institution's departments can use to build and manage their virtual infrastructure environments (production, post-production, testing, batch) dynamically and in real time.

The data exchange encompasses JSON, text, XML, Java Data Access Objects, message queues, C-Tables or any other data format. It also provides parsers and converters to resolve any data exchange and communication issues.

SMP data exchange converters build XML with no schema, regardless of the data's size and complexity.
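A minimal sketch of that idea, assuming the standard javax.xml DOM API (the class, element and field names below are illustrative, not taken from SMP): arbitrary key-value data is written out as XML without any XSD or schema binding.

import java.io.StringWriter;
import java.util.LinkedHashMap;
import java.util.Map;
import javax.xml.parsers.DocumentBuilderFactory;
import javax.xml.transform.OutputKeys;
import javax.xml.transform.Transformer;
import javax.xml.transform.TransformerFactory;
import javax.xml.transform.dom.DOMSource;
import javax.xml.transform.stream.StreamResult;
import org.w3c.dom.Document;
import org.w3c.dom.Element;

// Illustrative converter: turns arbitrary key-value data into schema-less XML.
public class SchemalessXmlConverter {

    // Builds an XML document whose elements mirror the map keys; no XSD is involved.
    // (Keys are assumed to be valid XML element names.)
    public static String toXml(String rootName, Map<String, String> fields) throws Exception {
        Document doc = DocumentBuilderFactory.newInstance().newDocumentBuilder().newDocument();
        Element root = doc.createElement(rootName);
        doc.appendChild(root);
        for (Map.Entry<String, String> field : fields.entrySet()) {
            Element child = doc.createElement(field.getKey());
            child.setTextContent(field.getValue());
            root.appendChild(child);
        }
        Transformer transformer = TransformerFactory.newInstance().newTransformer();
        transformer.setOutputProperty(OutputKeys.INDENT, "yes");
        StringWriter out = new StringWriter();
        transformer.transform(new DOMSource(doc), new StreamResult(out));
        return out.toString();
    }

    public static void main(String[] args) throws Exception {
        Map<String, String> record = new LinkedHashMap<>();
        record.put("account", "12345");
        record.put("currency", "USD");
        System.out.println(toXml("transaction", record));
    }
}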

Automation of services mapping for existing and new services can be achieved by creating Virtual Services Matrices, which contain service categories, networks, IP addresses, protocols (local or remote), data exchanges and exception/error handling.

Virtual servers are built for testing the Virtual Services Matrices.
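One plausible way to represent a single matrix row in code is a plain value object; the fields below are assumptions drawn from the description above, not SMP's actual schema.

// Illustrative row of a Virtual Services Matrix; fields mirror the description above.
public class VirtualServiceEntry {
    private final String category;      // e.g. "data-exchange", "security", "web"
    private final String host;          // network host name or IP address
    private final int port;
    private final String protocol;      // "local", "http", "mq", "sftp", ...
    private final String dataFormat;    // JSON, XML, DAO, message queue, ...
    private final String errorHandler;  // name of the exception/error-handling service

    public VirtualServiceEntry(String category, String host, int port,
                               String protocol, String dataFormat, String errorHandler) {
        this.category = category;
        this.host = host;
        this.port = port;
        this.protocol = protocol;
        this.dataFormat = dataFormat;
        this.errorHandler = errorHandler;
    }

    public String getCategory()     { return category; }
    public String getHost()         { return host; }
    public int    getPort()         { return port; }
    public String getProtocol()     { return protocol; }
    public String getDataFormat()   { return dataFormat; }
    public String getErrorHandler() { return errorHandler; }
}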

SMP's architecture and management require a clear vision of the target goals, ownership and accountability: building effective partnerships across organizational boundaries; coordinating, planning and tracking risks and controls; collaborating with change management and the various lines of business; creating metrics to measure the impact and effectiveness of controls and the user experience; building a central point of contact for coordination and planning; developing processes and procedures to identify efficient performance and innovative approaches; and managing communications, collateral and the coordination of training.

A pilot project is being built as a starting point and learning curve, covering service maps, environments, data exchange and testing.

January 2017 to June 2018 – Intelligent CRM Metadata (Chicago) Position: Architect/Analyst/Developer/PM

http://gdprarchitects.com/ + http://crmmetadata.com/

The General Data Protection Regulation (GDPR) is a regulation in European Union (EU) law on data protection and privacy for all individuals within the EU. Our website (http://gdprarchitects.com/) presents our solution architecture, addressing:

Critical Components: Cookies, Intelligent DAO for Security, Tracking, Dynamic Business Rules, Database Visualizer

End-to-End Solutions: Security, Data Centers, Data Streaming, Decision Support System (DSS) Dashboard, Web Services

Spring Framework Replacement: Business Intelligence Framework (BIF)

Management, Plans, Documentation: Management Knowledge Base (Science of Project Management), Documentation

Testing: Testing Plans, DevOps Editors, Object-Oriented Design (OOD) Testing

Duties include: Search Engine Optimization (SEO) and providing architectural answers to companies complying with GDPR.

Our Intelligent CRM Metadata (ICM) is an independent project and a part of our Post Office (PO) Project. Our approach to Big Data and all its issues is to replace database tables with XML files. Any existing Big Data (structured, unstructured or legacy-system data) is converted to our XML file format (no schema), regardless of its size and complexity.

Our ICM approach is to move data from dumb, stagnant storage to Intelligent Data Services that automate, parse, think in the abstract, make decisions and offer options. Our Intelligent Data Access Object (IDAO) is a memory-resident data object that acts as an intelligent interface between our XML database files and any running application.
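A minimal sketch of the memory-resident idea, under my own illustrative assumptions (the file layout, element name and lookup key are placeholders): the object parses an XML database file once and then answers lookups entirely from memory.

import java.io.File;
import java.util.HashMap;
import java.util.Map;
import javax.xml.parsers.DocumentBuilderFactory;
import org.w3c.dom.Document;
import org.w3c.dom.Element;
import org.w3c.dom.NodeList;

// Illustrative Intelligent DAO: loads an XML database file into memory once,
// then answers lookups from a map with no further file or database access.
public class IntelligentDao {
    private final Map<String, String> records = new HashMap<>();

    public IntelligentDao(File xmlDatabaseFile) throws Exception {
        Document doc = DocumentBuilderFactory.newInstance()
                .newDocumentBuilder().parse(xmlDatabaseFile);
        NodeList nodes = doc.getElementsByTagName("record");   // element name is an assumption
        for (int i = 0; i < nodes.getLength(); i++) {
            Element record = (Element) nodes.item(i);
            records.put(record.getAttribute("id"), record.getTextContent());
        }
    }

    // In-memory lookup on the hot path.
    public String find(String id) {
        return records.get(id);
    }
}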

ICM Big Data Conversion: Big Data conversion is not a simple task; its size and complexity are growing exponentially. Our ICM project's goal is converting Big Data into XML-DAO (using Java and SOAP), which is processed faster and stored more easily. The ICM page http://crmmetadata.com/DAO_dataStructurePage.html presents our approach, which simplifies Big Data conversion.

Duties: analyze, architect and develop Intelligent Big Data parsers and converters using Java, SOAP, XML and design patterns.

We built, documented and posted our first test project on our ICM DAO Data Structure page, which includes the input files (an Excel sheet and a Business Token ID), the First Seed Framework (Java code, packages and classes), plus SOAP-XML and DAO output files.

URLClassLoader, JavaCompiler, javax.xml.soap and Reflection were used to dynamically create .java/.class and XML files.
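A compressed sketch of that compile-and-load pattern, with placeholder class and method names (the actual generated code and packaging are not shown here): a source file is written at runtime, compiled with the JDK's JavaCompiler, loaded via URLClassLoader and invoked through reflection.

import java.io.File;
import java.lang.reflect.Method;
import java.net.URL;
import java.net.URLClassLoader;
import java.nio.file.Files;
import javax.tools.JavaCompiler;
import javax.tools.ToolProvider;

public class DynamicClassDemo {
    public static void main(String[] args) throws Exception {
        // 1. Write a generated .java source file at runtime.
        File dir = Files.createTempDirectory("gen").toFile();
        File source = new File(dir, "Generated.java");
        Files.writeString(source.toPath(),
                "public class Generated { public String hello() { return \"built at runtime\"; } }");

        // 2. Compile it with the in-process compiler (requires a JDK, not just a JRE).
        JavaCompiler compiler = ToolProvider.getSystemJavaCompiler();
        compiler.run(null, null, null, source.getPath());

        // 3. Load the fresh .class file and invoke it through reflection.
        try (URLClassLoader loader =
                     URLClassLoader.newInstance(new URL[] { dir.toURI().toURL() })) {
            Class<?> generated = Class.forName("Generated", true, loader);
            Method hello = generated.getMethod("hello");
            System.out.println(hello.invoke(generated.getDeclaredConstructor().newInstance()));
        }
    }
}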

Implemented our automated Dictionary approach to dynamically handle business rules and buzzwords using token-value text files. These files would be automated to add or audit new and existing parameters and buzzwords.
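A minimal sketch of such a token-value dictionary, assuming a simple token=value line format (the file format and method names are my assumptions, not the project's):

import java.io.IOException;
import java.nio.file.Files;
import java.nio.file.Path;
import java.util.HashMap;
import java.util.Map;

// Illustrative dictionary loader: reads token=value lines so business rules and
// buzzwords can change without recompiling. File name and format are assumptions.
public class BusinessDictionary {
    private final Map<String, String> tokens = new HashMap<>();

    public BusinessDictionary(Path dictionaryFile) throws IOException {
        for (String line : Files.readAllLines(dictionaryFile)) {
            line = line.trim();
            if (line.isEmpty() || line.startsWith("#")) continue;   // allow comments
            int eq = line.indexOf('=');
            if (eq > 0) {
                tokens.put(line.substring(0, eq).trim(), line.substring(eq + 1).trim());
            }
        }
    }

    // Resolves a buzzword/token to its current business value, or a default if unknown.
    public String resolve(String token, String defaultValue) {
        return tokens.getOrDefault(token, defaultValue);
    }
}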

We generated and posted our JavaDoc Reference for this project: http://crmmetadata.com/javadoc/index.html

ETL Intelligent Data Warehouse Framework: analyze and architect a framework for extracting (structured, unstructured and legacy) data, then parsing, converting and structuring it into IDAO-XML storage using SOAP and Dynamic Business Rules. Our aim is to replace Informatica PowerCenter with our ETL Intelligent Data Warehouse Framework.

Test Data: we chose Excel sheets as the project's first input data because Excel sheets are unstructured and have no fixed standards in their creation or format, so they pose the biggest challenges to our parsers and converters. There are thousands, if not hundreds of thousands, of Excel sheets holding critical data, which plays a big part in the Big Data dilemma. Obtaining copies of Excel sheets to test our parsers and converters does not require any special effort, as it would with databases.

ICM Matrix (Indexing and Hashing): ICM has additional intelligent, memory-resident, transactional Java objects composed of index tables and hashing tables. The key value of these tables is fast, efficient and economical lookup of values without any database access. Based on business and product statistics, certain data values are preloaded into these tables.
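A minimal sketch of that lookup idea using plain HashMaps (the table names and preloaded values are invented for illustration):

import java.util.HashMap;
import java.util.Map;

// Illustrative ICM lookup matrix: frequently used values are preloaded into
// memory-resident hash tables so transactions avoid a database round trip.
public class IcmLookupMatrix {
    private final Map<String, String> productIndex = new HashMap<>();
    private final Map<String, String> customerIndex = new HashMap<>();

    // Preload values chosen from business/product statistics (hard-coded here).
    public void preload() {
        productIndex.put("SKU-1001", "Preferred Checking");
        customerIndex.put("C-42", "Gold Tier");
    }

    // O(1) in-memory lookups; a null result means the slow path (database) is needed.
    public String lookupProduct(String sku)  { return productIndex.get(sku); }
    public String lookupCustomer(String id) { return customerIndex.get(id); }
}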

Testing and Our Intelligent DevOps Editor (http://crmmetadata.com/IntelligentDevOpsEditorPage.html): we architected an Intelligent DevOps Editor as a "turnkey" development and testing environment tool that includes all infrastructure components; testing is the key to ICM conversion success. It is critical that we identify what testing is needed and how frequently.

Virtualization: ICM uses virtualization to build the DevOps Editor as a "turnkey" development tool (see the DevOps page).

ICM Framework and Prototypes: our task is to architect, design, develop and test the ICM Seed Framework with running prototypes that will be expanded and modified based on the business processes (see the DAO Data Structure page).

ICM Website: the ICM site is designed and documented using our Project Framework, and it covers the project details to give visitors a view of how we handle our projects. The footer of each page lists our framework's project folders: Analysis, Data Structure, Design-Architect, Development, Testing, Management and Cost. The site content is an ongoing work in progress.

Technology Used: Java, SOAP, XML, Web Services, Eclipse, NetBeans, JavaDoc, Servlets, SQL, MySQL, IIS, Apache, Tomcat, JDBC, Oracle, NAS, IBM MQ, SFTP, NDM, SAN, ETL, Unix, Redhat Linux, Solaris, Windows, Golden Gate.

May 2016 to January 2017 – Bank of America USA (Charlotte) Position: Infrastructure Architect

Technical Engineering & Architecture (TEA) is a branch of Bank of America Global Wholesale Banking Technology (GWBT). TEA is responsible for architecting, building, managing and maintaining the entire Bank of America (BA) software infrastructure (clusters of virtual and bare-metal servers) and data centers. TEA's services include working with Development, Testing, System Infrastructure, Sales Infrastructure, Credit, Client, Treasury and the rest of BA's departments.

Working with other BA departments’ teams to create Technical Requirement Document (TRD) and Non-Technical Requirement Document (NRD), Proof of Concept (POC) and other TEA Design Patterns docs.

Analyzing, Architecting, documenting and tracking the building and execution of the Projects’ Infrastructures.

Working with infrastructure and admin teams to ensure the timely success and performance of all BA's data centers.

Working with other BA teams to service low-level, production and post-production environments.

Duties required understanding of, and hands-on experience with, the concepts, architecture, building, documentation, interfacing and troubleshooting of what are called the Swim Lanes:

Presentation Zone Firewall (DMZ)

o Web Services – IIS, Apache, IHS (IBM HTTP Server)

Secure Zone Firewall (DMZ)

o Application Server – WebLogic (JVM, JRE, JDBC, SAN), WebSphere, Tomcat

Baronet Zone Firewall (DMZ)

o Database – Oracle, EXADATA, Netezza, SQL

o Citrix – virtual desktops and applications delivered to any device over any network from the data centers

o Messaging – IBM MQ, Microsoft Messaging Service

o Mule – a lightweight enterprise service bus (ESB) and integration framework

o File Transfer – SFTP & NDM, SAN, NAS, DTS

o ETL – Informatica

o OS – Unix, Red Hat Linux, Solaris, Windows

o Multi-Node Local High Availability (HA) – multiple data centers (near-zero downtime)

o Golden Gate bi-directional replication

o Active in two sites

o Monitoring

Technology used: Citrix, Web Services, IIS, Apache, Weblogic, JVM, Tomcat, JRE, JDBC, Oracle, EXADATA, Netezza, NAS, IBM MQ, SFTP, NDM, SAN, ETL – Informatica, Unix, Redhat Linux, Solaris, Windows, Golden Gate.

January 2015 to May 2016 – CRM DATA Farm (Chicago) Position: J2EE Architect/Analyst/Developer-PM

CRM Data Farm Project is a Virtual Data Farm and a private enterprise that addresses current issues including Big Data, CRM, Business Intelligence (BI), Analytics, Data Streaming, template-driven front ends, Intelligent DAO, cloud governance, next-generation security, and web, mobile and cloud services – CRM Data Farm: http://crmdatafarm.com.

CRM Data Farm Project implements our new concept, called Post Office (PO), which is an intelligent data warehouse with built-in security. PO data is converted into Intelligent DAO types (Personal, Business, Transactional and Miscellaneous).

The PO Next Generation Security approach uses encryption, compression, indexing, hashing and the Intelligent DAO.
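A minimal sketch of just the compress-then-encrypt portion, under my own assumptions (key management, indexing and hashing are omitted, and a production system would use an authenticated cipher mode):

import java.io.ByteArrayOutputStream;
import java.util.zip.GZIPOutputStream;
import javax.crypto.Cipher;
import javax.crypto.KeyGenerator;
import javax.crypto.SecretKey;

// Illustrative PO-style protection of a record: compress first, then encrypt with AES.
public class CompressThenEncrypt {

    public static byte[] protect(byte[] plainRecord, SecretKey key) throws Exception {
        // 1. Compress the record to cut storage and transfer cost.
        ByteArrayOutputStream compressed = new ByteArrayOutputStream();
        try (GZIPOutputStream gzip = new GZIPOutputStream(compressed)) {
            gzip.write(plainRecord);
        }
        // 2. Encrypt the compressed bytes.
        // (A production system would use an authenticated mode such as AES/GCM.)
        Cipher cipher = Cipher.getInstance("AES");
        cipher.init(Cipher.ENCRYPT_MODE, key);
        return cipher.doFinal(compressed.toByteArray());
    }

    public static void main(String[] args) throws Exception {
        SecretKey key = KeyGenerator.getInstance("AES").generateKey();
        byte[] sealed = protect("customer record".getBytes(), key);
        System.out.println("protected bytes: " + sealed.length);
    }
}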

PO approaches are secure, fast, economical, loosely coupled and easily integrated into any existing systems.

CRM Data Farm's site presents the PO architecture, with over 80 pages of documentation addressing the current issues.

Technology used: Java, J2EE, Java Collections, JSP, JavaScript, Servlets, SQL, NetBeans, XML, HTML, JAXB, RUP.

September 2013 to May 2016 – Halal Farms USA (Chicago) Position: Java-Web Architect/Analyst/Developer - PM

Halal Farms USA is a specialty slaughterhouse grossing over five million dollars in sales. The business process starts with purchasing animals from ranchers and ends with the sale of meat products to retailers.

Webatized the business and employee processes, replacing phone and fax workflows with Excel spreadsheets that were converted to MySQL tables and an e-Commerce system, and trained employees on the e-Commerce system.
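A compressed sketch of that spreadsheet-to-MySQL step, assuming Apache POI for reading .xlsx files and plain JDBC for the inserts (the library choice, table and column names, and connection details are all illustrative assumptions):

import java.io.FileInputStream;
import java.sql.Connection;
import java.sql.DriverManager;
import java.sql.PreparedStatement;
import org.apache.poi.ss.usermodel.Row;
import org.apache.poi.ss.usermodel.Sheet;
import org.apache.poi.xssf.usermodel.XSSFWorkbook;

// Illustrative loader: copies rows of an order spreadsheet into a MySQL table.
public class SpreadsheetLoader {
    public static void main(String[] args) throws Exception {
        try (XSSFWorkbook workbook = new XSSFWorkbook(new FileInputStream("orders.xlsx"));
             Connection db = DriverManager.getConnection(
                     "jdbc:mysql://localhost/halal_farms", "user", "password");
             PreparedStatement insert = db.prepareStatement(
                     "INSERT INTO orders (customer, product, quantity) VALUES (?, ?, ?)")) {
            Sheet sheet = workbook.getSheetAt(0);
            for (Row row : sheet) {
                if (row.getRowNum() == 0) continue;              // skip the header row
                insert.setString(1, row.getCell(0).getStringCellValue());
                insert.setString(2, row.getCell(1).getStringCellValue());
                insert.setInt(3, (int) row.getCell(2).getNumericCellValue());
                insert.executeUpdate();
            }
        }
    }
}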

Analyzed, architected, designed and developed a four-tier J2EE web system using Java which mimics QuickBooks software.

Analyzed, architected and designed a Cloud Computing/CRM system to offer CRM, cloud and e-Commerce services.

Analyzed, architected, designed and developed e-Commerce Home Delivery Products, a business plan, a prototype and CRM services.

Managed and maintained IT services, including hosting and deployment (including Tomcat) of web-mobile portals.

Technology used: Java, J2EE, e-Commerce design, NetBeans, Android/Eclipse, swing, HTML, JavaScript, MySQL.

June 2013 to September 2013 – Blue Cross Blue Shield Association (Chicago) Position: Java-Unix Developer

Blue Cross Blue Shield Association (BCBSA) is a federation of 38 separate health insurance organizations and companies (Members). Blue Health Intelligence (BHI) is a National Data Warehouse (NDW) project which facilitates communication and the exchange of data among the Members.

Coded Java, SQL and Unix shell scripts as services supporting the communication and exchange of data.

Architected, designed and developed a four-tier application for a small in-house timesheet used to track projects' total hours.

Technology used: Java, J2EE, Unix, Eclipse, CSS, JSP, JavaScript, MySQL, DB2.

December 2011 to June 2013 – Abu Dhabi University (Remotely from Chicago) Position: Java Architect-PM

Digital Parasitological Examination Program (DEPP) is an ADU project to create a library for identifying parasitological specimens. It is a multilingual web-mobile service for underdeveloped, unequipped and underprivileged clinics, in which medical staff can use an iPhone/iPad or any digital tool to capture an image of a specimen. The medical staff may upload images to identify the candidate specimen and may receive medical assistance on how to handle such cases.

Worked with ADU professors, gathered all material for DEPP. Performed analysis and created business plan, vision, requirement and project docs. Architected-designed DEPP system and developed a website for promoting DEPP.

Used my personal hosting to build a paperless communication-documentation site for DEPP members.

Architected intelligent software that learns as the case base grows, using search algorithms that match specimens in milliseconds; DEPP would run millions of tests with astonishing results. Hosted patient data will be protected with encryption and with security software and servers. Clients may use mobile devices, computers or internet chat to access DEPP services and get answers from experienced staff.

Technology used: Java, J2EE, e-Commerce design, NetBeans, Android/Eclipse, swing, HTML, JavaScript, MySQL.

October 2010 to September 2011 – Liberty Mutual Insurance (Indianapolis, IN) Position: Java Developer

Liberty Mutual Group ranks 71st on the Fortune 500 list of America's largest companies. Liberty specializes in business insurance and provides web based systems for agents, business and consumers using Enhanced Commercial Line Policy System (ECLPS). ECLPS is a J2EE web system that communicates with a number of backend Legacy systems (Mainframes with DB2) using XML BLOB (known as BLOP).

Performing maintenance (Tickets) on ECLPS which includes fixing existing bugs and adding new features-enhancements.

Tickets may require maintenance to JSP/JSTL and a number of Java service layers (Actions, Screen Setup, Processors, Handlers, Builders, Factories, Packages, Printing, Buffer Builders, Rating, XML Parsers, plus product-specific services) that communicate with the Legacy systems through the XML BLOP.

Using IBM Rational Application Developer (RAD) to perform ticket maintenance and ClearQuest to communicate with analysts in documenting and tracking tickets.

Using RAD to perform unit testing and debugging ECLPS. Buffer dumps are used in tracking values on the BLOP.

Technology used: Java, J2EE, IBM Rational Application Developer (RAD), ClearQuest, HTML, XML, JavaScript and JSTL.

July 2009 to October 2010 – Freelancing (Chicago) Position: J2EE Lead Architect/Analyst/Java Developer-PM

Credit Repair Project (CRP) is an effort to build an intelligent web data warehouse system that helps consumers and commercial companies repair and maintain their creditworthiness and credit standing. CRP's goals are to service 100 million clients and to reduce the credit repair process from the current 90 days to 90 seconds. CRP performs Extract, Transform and Load (ETL), cleansing, validation, certification and audit trail. CRP interfaces with TransUnion, Equifax, Experian and data vendors.

Contacted credit bureaus, gathered all material for CRP. Performed analysis and created business plan, vision, requirement and project docs. Architected-designed CRP system and developed a prototype and key Java components.

The web browser side has 12 different services (Front Stores), an outside vendor call center and CRP employee access portals. Customized reports are created for each Front Store, and clients can also create their own formatted reports.

Tomcat is CRP's servlet engine, where JOSSO and proxy servlets are used as security layers for access and speed.

Architected batch and Intelligent Software Engines (Reports, Alert-Audit, Call Center Services, Business Rules), which included the database, an Intelligent Analysis engine, a Dynamic Business Adapter and Web Services.

Technology used: Java, Java Collections, JSP, JavaScript, Servlets, SQL, NetBeans, XML, HTML, JAXB, RUP.

November 2008 to July 2009 - ADP (Chicago) Position: J2EE Lead Architect/Java Developer

ADP DealerSuite (DS) is a Customer Relationship Management (CRM) system and a Dealer Management System serving dealerships of all sizes. DS is a Java web application with services that run on Apache Tomcat 6, developed using the JBoss framework, Hibernate, JOSSO (for security), log4j and MS SQL Server, with offshore testing.

The Truck Marketing Group is a marketing group whose web pages were static, so every update required recoding those pages.

Performed analysis, architected/designed and developed the DS Truck Back-End Process. Developed an MVC Administrative Tool for Truck Marketing Group to update their marketing web pages in real-time.

The View had Admin tool pages for creating/editing pages and uploading PDF doc, images, wmv files, and links.

Analyzed, architected, designed and developed the Tomcat Controller and Servlets (Model) used to access the page-builder business tiers, plus the BO, Excel sheet parser, page-builder parser, image/video parser and admin command handler (Model).

Technology used: Java, J2EE, e-Commerce design, JBoss, Hibernate, MS SQL Server, JOSSO, Perforce, UML, HTML, XML, JavaScript and JavaServer Faces.

May 2007 to November 2008 - Walgreens (Chicago) Position: J2EE Architect - PM - Java Developer

Walgreens Image Distributed System (IDS) is a Java distributed system centralized on a corporate UNIX system (Oracle DB) with 6000+ stores as clients. Each store has an IBM AS/400 (Tomcat and UNIX) system and a PC (Windows XP or NT) that displays images on the store's Electronic Reader Boards (ERB). IDS uses Java zipping utility code to group the distributed images and reduce their byte size. IDS performs the logistics of communications (pull, push, on demand) in real time, image repository synchronization, audit trail and dynamic Business Rules. Image transport uses Java Apache code to FTP images.

Performed analysis; created vision, requirement and project docs. Architected and designed IDS and led a Java team of four consultants and two Walgreens engineers in developing, integrating and testing IDS.

Walgreens has over 90,000 vendors, and the Walgreens marketing department produced over 12,000 ads, images and messages monthly to be displayed in real time on the stores' ERBs. Created a zipping and FTP tool to zip and ship ads to 6,000 stores.
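A compressed sketch of that zip-and-ship step, assuming java.util.zip for packaging and Apache Commons Net's FTPClient for the transfer (host names, paths and credentials are placeholders, and the actual tool's structure may have differed):

import java.io.File;
import java.io.FileInputStream;
import java.io.FileOutputStream;
import java.util.zip.ZipEntry;
import java.util.zip.ZipOutputStream;
import org.apache.commons.net.ftp.FTP;
import org.apache.commons.net.ftp.FTPClient;

// Illustrative ad shipper: zips a directory of images, then FTPs the archive to a store server.
public class AdShipper {

    static File zipImages(File imageDir, File zipFile) throws Exception {
        try (ZipOutputStream zip = new ZipOutputStream(new FileOutputStream(zipFile))) {
            File[] images = imageDir.listFiles();
            if (images == null) return zipFile;                  // nothing to package
            for (File image : images) {
                zip.putNextEntry(new ZipEntry(image.getName()));
                try (FileInputStream in = new FileInputStream(image)) {
                    in.transferTo(zip);                          // stream copy (Java 9+)
                }
                zip.closeEntry();
            }
        }
        return zipFile;
    }

    static void ship(File zipFile, String storeHost) throws Exception {
        FTPClient ftp = new FTPClient();
        ftp.connect(storeHost);
        ftp.login("erb", "secret");                              // placeholder credentials
        ftp.setFileType(FTP.BINARY_FILE_TYPE);
        try (FileInputStream in = new FileInputStream(zipFile)) {
            ftp.storeFile("/ads/" + zipFile.getName(), in);
        }
        ftp.logout();
        ftp.disconnect();
    }
}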

Architected and developed Control Units for Corporate and for each store for the logistics of communications (pull-push-on demand) in real-time. Corporate Control Unit runs IDS and Store Control Unit gives store manager display options.

Architected, designed and developed communication between Corporate WebSphere and the stores' AS/400s using HTTP-Servlets and FTP.

Architected, designed and developed Reports, Synchronization, Inventory and Scheduler components for Corporate and the stores' AS/400s.

Architected, designed and developed synchronization of ads, images and messages between the Corporate Oracle DB and the stores' AS/400 DBs. The Corporate Control Unit uses a Dynamic Business Rules engine for real-time control over IDS.

Technology used: Java, J2EE, e-Commerce design, IBM WebSphere, Swing, Walgreens SDF, UML, HTML, XML, JavaScript, SAX and JAXB.


