Data Modeling Solution Architect

Location:
Somerset, NJ, 08873
Posted:
April 19, 2024

Contact this candidate

Resume:

Qualification Summary:

Over ** + years of experience as a Data/Solution Architect across domains including Government, Finance, Pharmaceutical, Insurance, and Retail.

Worked closely with stakeholders, SMEs, and staff to document and understand business requirements, functional requirements, and design specifications for new applications as well as enhancements to existing applications.

Worked within SDLC (Software Development Life Cycle) and Agile methodologies under strict deadlines, proactively communicating with stakeholders.

Involved in conceptual, logical, and physical modeling (Erwin data modeling), development, testing, implementation, and production support. Acted as technical consultant and application DBA, and successfully implemented high-risk, hands-on technical projects utilizing enterprise cloud, client/server methodology, Online Transaction Processing (OLTP) systems, OLAP, batch processing systems, and Decision Support Systems (DSS).

More than 15 years of supervisory, lead, and project management experience leading to high ROI; managed onsite and offsite teams of 4 to 15 members.

Familiar with the TOGAF and Zachman architectural frameworks.

Data modeling for OLTP (3NF) and OLAP (dimensional) systems and data warehouses (star schema), covering conceptual/logical/physical models, slowly changing dimension (effective-dating) structures, and rapidly changing dimension structures.
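
The effective-dating pattern mentioned above can be sketched as a Type 2 slowly changing dimension. The snippet below is a minimal illustration only; SQLite stands in for the warehouse, and the table and column names are hypothetical, not from any specific engagement.

```python
import sqlite3

# Minimal sketch of a Type 2 slowly changing dimension with effective dating.
# Table and column names are illustrative, not from any specific project.
conn = sqlite3.connect(":memory:")
conn.execute("""
    CREATE TABLE dim_customer (
        customer_key INTEGER PRIMARY KEY AUTOINCREMENT,
        customer_id  TEXT NOT NULL,      -- natural/business key
        city         TEXT NOT NULL,      -- tracked attribute
        eff_from     TEXT NOT NULL,      -- effective-date range start
        eff_to       TEXT,               -- NULL = open-ended
        is_current   INTEGER NOT NULL
    )""")

def apply_scd2(conn, customer_id, city, as_of):
    """Expire the current row (if the attribute changed) and insert a new version."""
    cur = conn.execute(
        "SELECT customer_key, city FROM dim_customer "
        "WHERE customer_id = ? AND is_current = 1", (customer_id,))
    row = cur.fetchone()
    if row and row[1] == city:
        return  # no change, nothing to do
    if row:
        conn.execute(
            "UPDATE dim_customer SET eff_to = ?, is_current = 0 "
            "WHERE customer_key = ?", (as_of, row[0]))
    conn.execute(
        "INSERT INTO dim_customer (customer_id, city, eff_from, eff_to, is_current) "
        "VALUES (?, ?, ?, NULL, 1)", (customer_id, city, as_of))

apply_scd2(conn, "C100", "Somerset", "2023-01-01")
apply_scd2(conn, "C100", "Edison",   "2024-04-19")   # change -> new version

rows = conn.execute(
    "SELECT city, eff_from, eff_to, is_current FROM dim_customer "
    "ORDER BY customer_key").fetchall()
print(rows)
```

History is preserved: the old row is closed with an end date, and the new row stays open-ended with `is_current = 1`.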

Developed fingerprint application solutions, chronic disease training systems, an incident management system, and the consolidation of smaller applications into reporting and budget-monitoring solutions for NJ-DHS.

Data lineage remedies, logic corrections to mitigate inconsistencies, and data mining, including corrections and associated architectural changes.

Created road maps for current-state and future-state application designs.

Architecture, design, and development of applications and solutions on SQL and NoSQL databases.

Data quality and data governance recommendations and implementations for various organizational issues.

Developed and reviewed BRD/FRD/RFQ documents for various applications; assessed and resolved in-scope/out-of-scope issues.

Evaluated the WBS (Work Breakdown Structure) and handled assignment and follow-up activities in the PMO.

Worked on the evaluation and acceptance of value-driven transactions and data points into the design.

Designed and implemented data warehousing projects: data quality management, user suggestions/sign-offs on data conditions, data governance and assessment, dimensional/data modeling, ETL framework design, BI solution architecture, and report breakdowns to facilitate data modeling and ETL design.

Design and Implementation of Reference Data and Master Data Management solutions.

Knowledge of financial products such as derivatives, liquidity, prime brokerage, asset liabilities, and custodian banks. Completed the PMBOK course.

Experience in on-premises-to-cloud transformation.

Developed GAP analyses of IT systems across organizations to support merger/stay decisions.

Developed applications per the direction of top management and interacted with end users to deliver new applications, bridging business processes between users and technical DBAs/architects. Used Erwin, ER/Studio, Visio, and PowerDesigner for data modeling.

Big Data: HDFS, MapReduce, and interfaces to data lakes.

Exposure to trades, swaps, equities, derivatives, fixed assets, collateral, etc.

Quick fixes via temporary solutions in pre-prod, patched into production until the final resolution is derived and implemented.

Justified and architected solutions across cross-functional teams, collaborating with the Architecture Review Board on design, development, implementation, reusability, and logic consumption.

Developed short-term and long-term solutions to ad hoc problems to keep business as usual (BAU).

Assessed and decommissioned short-term or time-bound solutions as scalable, long-term processes and solutions came online.

Initiated, designed, and developed POCs (Proofs of Concept) where required, as suggested by top management and the ARB.

Worked with the ARB (Architecture Review Board) to assess POCs of other projects and identify conceivable concepts.

U.S. Bank (April 2022 – Present) (Solution Architect / Data Architect)

1. Project Symphony – Lead for Brokerage / Digital Stream

(Bank integration project: the merger and acquisition of Union Bank into U.S. Bank in the wealth management stream is carried out through this project, per government rules, regulations, and compliance requirements.)

Studied the bank's current operations in areas such as Brokerage and Digital – OLB (Online Banking), Trust and Investments, MoneyGuide, etc.

Created project and application architecture deliverables consistent with architecture principles.

Studied the data flows and suggested alternate plans and optimizations.

Developed L0–L1 data flow diagrams; led discussions and reviews of the current state and the upcoming migration approach.

Creating architectural flow and process handling diagrams.

Developed gap analyses, held reviews with stakeholders, and signed off on risks and action items.

Developed ad hoc solutions for gaps with known, agreed-upon approaches, mitigating the gaps for quick integration and BAU efforts.

Current-state and future-state design and implementation (POC, evaluation, and acceptance strategies).

Created conceptual, logical, and physical data models for the project.

Consumed mainframe and DB2 data per business needs while maintaining integrity.

Produced end-to-end data mapping specifications for the project, involving BAs and developers; these served the entire project and ensured consistent logic across all levels of design and development.

DWH and Reporting architecture.

Data feeds from SAP to the data hub and SOR (System of Record) for internal applications.

Consolidated reports, ensuring integrations were possible and information privacy was maintained.

Interacting with ARB (Arch Review Board) for corrective actions.

Involved with ARB for suggestion and recommendations for BAU and other relevant issues.

Used Manta for data lineage and data governance.

Wrote Oracle packages and SQL Server procedures and functions for the project, including utilities and standard logic.

Addressed data quality issues and resolution strategies for data ingestion.

Reference data strategies and centralized hub concepts.

Designed and developed data quality alerts in discussion with data stewards, with business approvals.

MDM steps for short-term, long-term, and immediate resolutions.

Designed and developed APIs for the systems and DB results to be shared with partners.

Evaluated APIs against global security principles and shareable datasets.

Created secure data file transfers using GPG and double-key concepts.

Interaction with infrastructure and other key players of the project for seamless execution.

Architected the ETL for the migration process and streamlined cross-stream and downstream applications along with reporting systems. Reduced island applications by consolidating them into the nearest applications.

Advised SMEs and developed POCs for sequencing the operations and flows.

Built teams for data-handling domains and the standards within those domains.

Ran applications and data consumption in parallel and managed the decommissioning process.

Checked reporting-system consistency and evaluated data models across systems.

Python scripts to load file data into databases, plus data alerts and monitoring.
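
A file-to-database load of the kind described above can be sketched as follows. This is an illustrative example only: SQLite stands in for the target database, and the feed layout, table, and threshold are hypothetical.

```python
import csv
import sqlite3
from io import StringIO

# Illustrative sketch of a file-to-database load with a simple row-count alert.
# SQLite stands in for the target database; names and layout are hypothetical.
RAW = """trade_id,amount
T1,100.50
T2,250.00
T3,75.25
"""

def load_file(conn, fobj, min_rows=1):
    conn.execute("CREATE TABLE IF NOT EXISTS trades (trade_id TEXT, amount REAL)")
    rows = [(r["trade_id"], float(r["amount"])) for r in csv.DictReader(fobj)]
    if len(rows) < min_rows:          # basic data alert: suspiciously small feed
        raise ValueError(f"alert: only {len(rows)} rows in feed")
    conn.executemany("INSERT INTO trades VALUES (?, ?)", rows)
    conn.commit()
    return len(rows)

conn = sqlite3.connect(":memory:")
loaded = load_file(conn, StringIO(RAW))
total = conn.execute("SELECT COUNT(*), SUM(amount) FROM trades").fetchone()
print(loaded, total)   # 3 (3, 425.75)
```

The row-count check is the simplest form of the data alert the bullet refers to; in practice the same hook point can carry null, range, and duplicate checks.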

SQL, Spark, Scala, Spark SQL, and Spark ML scripts for ad hoc queries and analytical POCs where required.

SQL Query tuning and performance improvements for Transaction applications, Reports, Ad hoc queries.

Designed/ customized Power BI data model for optimum use and quick response.

Design and development of a Databricks pipeline for a POC in Azure.

Performed client acceptance and prototyping using Azure compute and Azure SQL instances.

Designed and developed Spark SQL in Databricks for ETL.

Designed PySpark logic in Databricks equivalent to the Informatica transformations.

Unity Catalog and metastore designs in Databricks for the data catalog.

Created and maintained Delta Lake tables in the data lakehouse (Databricks) for the business and enforced PII and GDPR protections per data protection policies (Office of Data Protection).

Environment: MS Visio, Erwin, Azure, Informatica 10.4, Oracle 19c, SSMS, SSIS, SSRS, MongoDB, Python, Power BI, Databricks, Alteryx, Amazon Redshift.

2. GN Infotech: Client – NJ DHS (NJ Department of Human Services) (05/2015 – 04/2022)

The following four projects were for the NJ Dept. of Human Services.

I. Project CDSMP - (Enterprise / Solution Architect)

The application handles chronic disease training/workshops and trainer maintenance provided by the Department. The new version of the application was designed with many business-driven changes from its previous version. Trainers are assigned to training programs and maintained to train in a particular discipline/subject. Various analytical reports requested by end users were designed and developed.

Conceptual, logical, and physical redesign to facilitate activity-related reports and transactions.

Realignment of KPIs to facilitate outgoing data feeds.

Created Data pipelines and maintained DMS for reference.

Developed data quality alerts through Python scripts.
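
The kind of rule-based alert script referred to above can be sketched as follows; the rules and field names here are hypothetical examples, not the actual DHS checks.

```python
# Illustrative sketch of rule-based data quality alerts; the rules and
# field names are hypothetical examples.
def quality_alerts(records, required=("client_id", "visit_date")):
    """Return a list of human-readable alerts for missing values and duplicates."""
    alerts = []
    seen = set()
    for i, rec in enumerate(records):
        for field in required:
            if not rec.get(field):
                alerts.append(f"row {i}: missing {field}")
        key = rec.get("client_id")
        if key in seen:
            alerts.append(f"row {i}: duplicate client_id {key}")
        seen.add(key)
    return alerts

data = [
    {"client_id": "A1", "visit_date": "2021-03-01"},
    {"client_id": "A1", "visit_date": "2021-03-02"},   # duplicate id
    {"client_id": "A2", "visit_date": ""},             # missing date
]
alerts = quality_alerts(data)
print(alerts)
```

In a production pipeline the returned list would feed a notification channel rather than `print`.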

Designed Data Flows for OLTP and OLAP Databases with consistency for future scalability.

Designed and developed back-end sub procedures for Front end screens.

Data quality, data integration, profiling, and governance (used Manta, Collibra, and Alation).

Used Manta / Alation for data lineage and data governance.

Built Snowflake databases and developed Snowpipes for data feeds.

Created transformations, loads, and data consistency checks for the DW.

SQL Server procedures and functions for transactions and ad hoc queries, plus data feeds for SSRS/SSAS reports.

Modified the Pro*C programs for the new data-table logic.

Worked with dynamic SQL and bulk data processing.

Architected and developed Snowpipes.

Delta data loads using Snowflake tasks and streams.

Data migration to Snowflake for analytics and data evaluations.

ii. PIA – Payroll Interface Application (Kronos as vendor) – Facilities and Hospitals payroll project.

Enterprise / Solution Architect.

This is an automation project for the manually processed payroll application at DHS/DOH facilities (hospitals).

Complete product line and solution responsibility. The vendor application processes the scheduling and timecard data, interacts with different agencies, adheres to DHS/DOH facility rules, and processes employee data. This on-prem application was moved to the cloud. This information, with consistency checks, is used to make payments from OMB-CP (Office of Management – Central Payroll).

Regular payments cover normal hours worked in a pay period, overtime, differentials (evening and night shifts), and moonlighting across facilities paid as regular payment. OT, bonuses, and clothing allowances are handled as supplemental payments. Any excess amount paid to employees is recovered through the recoupment process.
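
The pay components described above reduce to simple arithmetic, sketched below. All rates, multipliers, and amounts are illustrative placeholders, not DHS/DOH policy.

```python
# Hypothetical sketch of the pay components described above (regular,
# overtime, shift differential) plus a recoupment deduction. All rates
# and rules are illustrative, not DHS/DOH policy.
def gross_pay(hours, rate, ot_hours=0.0, diff_hours=0.0,
              ot_mult=1.5, diff_premium=0.10, recoup=0.0):
    regular = hours * rate                            # normal hours worked
    overtime = ot_hours * rate * ot_mult              # OT at a multiplier
    differential = diff_hours * rate * diff_premium   # premium on top of base
    return round(regular + overtime + differential - recoup, 2)

# 80 regular hours, 4 OT hours, 16 night-shift hours, $50 recoupment
pay = gross_pay(hours=80, rate=25.0, ot_hours=4, diff_hours=16, recoup=50.0)
print(pay)   # 2140.0
```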

Analyzed various business functions, the interfaces to the business processes, and their essentials.

Developed the gap analysis and brought the stakeholders and agencies (players) to acceptable inputs.

Designed and developed the handling of employee information consumed from the Public Service Commission, used in the application and returned to the commission per guidelines over the employee life cycle.

Analysis of Impact and efficacy of data streams.

Conceptual, Logical and Physical Design of the system.

Used Manta for data lineage and data governance.

Logical and Physical design and reviews with teams and end users.

Interacted with each data-providing/consuming agency for requirements gathering.

Studied the existing manual application and end-user inputs; maintained the action-item list and resolutions.

Presented to the ARB to discuss the merits and demerits of logic/application flows.

Design of Architectural data flow diagram for cross functional teams.

DMS (Data Mapping Specification) for analysts and Developers to consistently migrate data.

Data quality and governance during the transition to the new application.

Involvement with various groups for information security, design, and developmental effort.

Identified services, interoperabilities, and abstractions to encompass in the SOA.

Collected final resolutions and sign-offs from participating groups and stakeholders.

List of issues and resolution follow ups.

Data quality, RMS and MDM implementation for Payroll data.

Migrated reference data and master data with consistency checks.

List of Data anomalies to end users and corrective actions for final load.

Consistency check and list to eliminate based on business functions.

Maintenance of Information resolved from action items applied to Concept list.

Helped with PSAR (Physical System Approval Request) approval to finalize the design.

Proof of concept development and Review.

Modified the Pro*C programs for new data tables and augmented logic.

Data ingestion through Azure data factory and data pipelines.

Design and development of the data warehouse using dimensional modeling.

Reporting system design and consistency.

Agile group development analysis and strategies.

Reference data preparation and MDM study and analysis.

Created Master data hub and populated Legacy data.

Derived/Determined Master records from legacy systems with rules and stake holder knowledge.

Spark, python for analytical report system.

Reworked Power BI reports with data modeling to meet reporting needs.

Hadoop and Spark streaming for online data feeds.

Designed a sample process to address vendor issues.

Created ADF pipelines for data movement.

Azure Data Lake implementation and design.

Azure Databricks and pipelines for consolidated reports.

Azure Synapse and ETL setup for data consolidation from heterogeneous sources.

Business discussion and approvals for legacy data fills in Synapse.

Developed Python scripts for data alerts and datasets.

Document data collections and integration (MongoDB).

iii. Project FARA – Fingerprint Reporting System (Solution Architect)

This application evaluates a given subject's eligibility to work in DC/PC settings after the subject's fingerprinting is done. The SBI data obtained through FPTS is used, and a criminal evaluation plus 16 other evaluations specified by the Fingerprint division of DHS are incorporated to automate certificate printing for subjects. The manual process of printing and mailing certificates was automated so that subjects can print their certificates themselves. The evaluation process is automated, and the subject can log in and print 10 days after fingerprinting, within which time the evaluation is completed.

Examine the current architecture and recommend a new design.

Developed proof of concept.

Used Manta for data lineage and data governance.

Security and Data integrity in HIPAA environment.

Produced the conceptual and logical design and proved the needs were met.

Suggested the need of data alerts where possible for business error detection.

Used Alation for data quality and lineage detection and enforcement.

Developed an automated letter/notice printing system.

Improved the design analytically to reduce SMEs' and functional analysts' manual effort.

Developed and altered data marts and new dimensions in the EDW.

Used Digital backdrop for digital unit to store official signatures for Letters.

End user interaction and logic creation wherever required.

Designed and developed back-end subprocedures for subject evaluation and processing.

Accommodated different groups of Data with different evaluations under one roof.

Made it easy to trace the errors for which a certification was held.

iv. NJ- DHS IDMRS (Incidental Data Management Reporting Systems).

This application is used by the Office of Investigation for incident reporting and the investigation of cases in facilities. It supports starting a case for investigation of incidents and interviewing the suspects/victims. I was responsible for the investigation report, which encompasses all activities and events of a case in report form. It is used to see holistically what has happened in any case and to assess the situation/stage at which a case is pending or completed.

Designed and developed the data flow and process flow for the project including Erwin Modeling.

Oracle / SQL Server packages, procedures, and functions for the project.

Big Data HDFS – consolidation to the data lake.

Azure Purview to manage data governance and secured access.

Azure Synapse setup, data loads, and Tableau, QlikView, and Power BI reports.

Design/development of Synapse pipelines where required, in consultation with the business.

ETL – Informatica Loads to DWH.

I was also responsible for the development of analytical queries for the dashboard.

Big data, Hive, and analytical processes. Developed ML programs in Python to predict MH patient outcomes.

Spark, Python for reporting.

Environment: Oracle 11g/12c/19c, PL/SQL, UNIX, Erwin 8.5/9, Visio, Azure (Data Factory, Data Lake, Synapse, Databricks), AWS (S3), SQL Server 2017/2019, SQL Developer, TOAD, Informatica 9.5/10, Informatica MDM, Power BI, HDFS Big Data, Spark, Kafka, Python, R, Windows NT/2000/XP, QlikView, Tableau, Hive, DynamoDB, Jupyter, Redshift, Snowflake (database evaluations, tasks and streams), SSIS/SSRS/SSAS, T-SQL.

3. NJ CASS (HP Implementation Partner), 6/2014 – 5/2015

(Enterprise Architect / Solution Architect)

Designed and architected the DOL feed using materialized views (MVs) from the online Cúram production database, replacing the trigger-based approach; this gathers core data for the DOL feed from the online production database without the burden of triggers and performs the DOL data extraction resource-efficiently.

Optimized query for faster data processing and extraction to Business Partners.

Modularized the processes using Oracle packages, procedures, and functions to maintain consistency and a single point of change for maintenance, including global DB2 mainframe data consumption.

Developed Materialized Views with logic to process the extraction (output) faster for multiple interface extractions.
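
The materialized-view pattern described above amounts to precomputing a summary that multiple interface extracts read instead of re-running the base query. The sketch below simulates it in Python with SQLite (which has no native MVs, so the "view" is refreshed into a plain table); all table and column names are illustrative.

```python
import sqlite3

# Sketch of the materialized-view pattern: a precomputed summary table that
# multiple interface extracts read instead of re-running the base query.
# SQLite has no native MVs, so the "view" is refreshed into a plain table;
# table and column names are illustrative.
conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE claims (case_id TEXT, amount REAL);
    INSERT INTO claims VALUES ('C1', 100), ('C1', 50), ('C2', 200);
    CREATE TABLE mv_case_totals (case_id TEXT PRIMARY KEY, total REAL);
""")

def refresh_mv(conn):
    """Full refresh: rebuild the summary so extracts read precomputed rows."""
    conn.executescript("""
        DELETE FROM mv_case_totals;
        INSERT INTO mv_case_totals
        SELECT case_id, SUM(amount) FROM claims GROUP BY case_id;
    """)

refresh_mv(conn)
totals = dict(conn.execute("SELECT case_id, total FROM mv_case_totals"))
print(totals)   # {'C1': 150.0, 'C2': 200.0}
```

In Oracle the refresh would be handled by `DBMS_MVIEW` on a schedule or on commit; the point here is only the read-from-precomputed-summary shape.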

Developed analytical queries in Netezza; developed stored procedures and objects.

Created metadata to facilitate the data extraction and Business Partner’s data consistency.

Prepared the system for SIT and UAT; grouped and tested the interfaces.

Involved in logic correction during integration testing, concluding with cross-functional teams.

Improved the code against standards for speed, error handling, and exception handling.

Designed / developed sanity test for data interfaces.

Developed the data mart for reporting systems and Power BI.

Environment: Oracle 11g, PL/SQL, Erwin 8.5/9, TOAD, Netezza, UNIX/Windows NT/2000/XP, AIX/Linux, Oracle OWB, HP Quality Center, SQL Server SSIS/SSRS/SSAS, MS Power BI.

Duration / Organization:

7/2011 – 6/2014: GE Treasury

07/2010 – 7/2011: JP Morgan Chase

06/2009 – 06/2010: Sanofi-Aventis Pharmaceuticals

09/2008 – 09/2009: Novartis - Pharma

07/2006 – 08/2008: Multiplan Inc (Insurance)

May 2005 – June 2006: Unilever NJ

Oct 2004 – May 2005: BASF - Datawarehouse

Jan 2002 – Oct 2004: Warner Chilcott Labs NJ

Jan 2001 – Dec 2001: Systemax (Global Computers Ltd.) - NJ

Jan 2000 – Dec 2000: Knoll Pharmaceutical Co - NJ

Apr 1998 – Dec 1999: Merck Pharma

July 1990 – Apr 1998: Wipro India, ITC India, Mindware India

Education:

Diploma in Electrical Engineering – CPC Polytechnic Mysore, India. (1983-1986)

Bachelor of Engineering in Computer Science, Bangalore University, India, (1986-1990).

Professional Certifications:

Oracle Certification for SQL and Tuning – Oracle.

Six-Sigma Green Belt - TCS.

Project Management Professional (course completed) – PMI.

