
Data Management Warehouse

Location:
Des Moines, IA
Posted:
February 19, 2024


Ashish Bajaj ad3qz8@r.postjobfree.com 415-***-**** (c)

Information and Data Management Leader

Leader in Information and Data Management with proven track record of building robust Enterprise Architectures and Data Management solutions for Global organizations.

Over 20 years of experience in leading teams and building Enterprise Data Architecture, Enterprise Data Warehouse, ETL Architecture, Data Integration and Reporting Architecture, Data Governance, Business Intelligence and Data Analytics solutions within time and budget for various clients.

Recognized for driving results, delivering complex engagements, innovative thinking & leadership.

Provide strategic thinking, solutions, roadmaps and architectural guidance for clients.

Developing Architectural plans and roadmaps supportive of the business objectives.

Establishing a communication strategy to share, present, educate and promote procedures, processes and roadmaps with leadership and various business and architectural teams.

Solid experience leading and managing successful Design, Development and implementations of Data Warehouse and Data Integration projects across various business domains.

Strong experience leading and managing successful implementations of Data Governance, Data Quality, Data Security, Metadata and Master Data Management initiatives across the enterprise.

Strong experience with Guidewire DataHub Architecture and Design to customize solutions.

Very strong Data Warehouse, Data Architecture, BI & Team Leadership and Management skills.

Data Warehouse, ETL and Reporting Architecture and Design (Conceptual, Logical & Physical).

Strong experience with various ETL/Data Integration and BI/Analytics/Reporting tools/platforms.

Strong experience Managing and Leading Data Management solutions across the Enterprise.

Excellent Management, Team Leadership, Communication & Inter-personal skills.

Strong Vendor and Client Management experience.

Education:

Master’s in Computer Applications

Bachelor’s in Statistics

Professional Training/Certification:

Trained and Certified Six-Sigma Green Belt (for TQM – Total Quality Management)

Guidewire DataHub, InfoCenter and ETL Architecture and Design

Industry:

B2B, Energy, Airline, Banking, Government, Healthcare, Reinsurance, Insurance, Finance

Specialty Skills:

Leadership skills for Enterprise Data Architecture, Data Integration and ETL Architecture, Data Warehouse & BI/ Reporting Architecture, Data Analytics, Data Quality, Data Governance, Data Management.

Achievements:

Provide Advisory Services and create high level Data Strategy, Roadmaps and Architecture for Data Integration, Data Migration, Data Management, Data Governance, BI and Reporting solutions.

Successfully led teams through Requirements, Architecture, Design, Development and Implementation.

Led initiatives for Enterprise Data Architecture and Strategy, Enterprise Data Warehouse, Data Management and BI/Reporting solutions from the ground up and successfully implemented them within time and budget for Insurance, Finance and Healthcare companies and State Government clients.

Technical Skills Summary:

Oracle RDBMS 7.x, 8.x, 9i, 10g, 11g, 12c, MS SQL Server 7/2000/2005/2008/2012/2014, Teradata, UDB/DB2, HP Vertica, Java, C++, .Net, ASP, Perl, Java Web Server, IIS, WebLogic, TIBCO, Jaros Analytics 3.0.3, Jaros AMDR 1.19.7, Erwin 9.64, 7.3, 4.1.4, 3.5, Sybase Power Designer 16.5, 15.x, Embarcadero ERStudio 9.5, Altova XMLSpy 2016, Informatica Power Center 5.1, 7.1, 8.1, 9, 9.5.1, 10.1, IBM Data Stage 8, OWB 2.1, 3i, Astera Centerprise Data Integrator 5.x, SAP BODS, SSIS, SSAS, DTS, WTX, EDI, X12, Cognos 8i, Business Objects, Logi Analytics 11.x, Guidewire DataHub 8.2, 8.3, 9, Guidewire InfoCenter, Visio, TOAD 11.0, 10.5, 9.6, 8.5, SQL Navigator, PVCS, VSS, TFS, CA Rally, HP ALM 12.21, JIRA, Azure, AWS

Professional Experience:

Jun 21 – till date Enterprise Data Architect/Lead Piedmont Community Health Plan, Lynchburg, VA

As part of the WINGS project, Piedmont Community Health Plan (PCHP) was implementing HealthRules Payor (HRP), a core administrative processing system from HealthEdge, for configuring Benefit Plan, Provider Contract, Claims Processing and Information Transparency, as it moved off the legacy systems IMPACT and S&S. Worked as Enterprise Data Architect / Data Strategy Lead to implement a new reporting solution delivering data and reporting for PCHP’s business users, with capabilities to interface and exchange data with other systems and external vendors. The reporting solution is implemented using SSRS.

Worked with various business users and stakeholders to identify PCHP’s data needs.

Worked with business users from Provider Relations, Pharmacy, Medical Management, Customer Service, Operations, Underwriting, Actuarial, Sales, Marketing, Finance, Compliance divisions to ensure continuity of business operations along with external servicing partners and vendors.

Identified the data and reporting needs for PCHP.

Defined the to-be architecture to support PCHP’s data integration and reporting needs.

Helping with implementation and rollout of the new reporting solution built using SSRS.

Document Data Flows and Process Flows.

Helped resolve issues while converting Provider data into Symplr, and Member and Group data from the Exchange (Individuals) and Commercial (Small Groups and Large Groups) lines.

The data from HRP flows into the HRDW data warehouse and then into the ODS on a near-real-time basis.

ODS was built by the HRP implementation partner company UST.

Understood and analyzed the data models for the movement of data across various systems

Helped with source-to-target data mapping and resolved issues.

Worked with the report developers to extract/load data from ODS into various reports and data feeds.

Help with establishing SFTP connections and setting up folders with various external vendors.

Processed data files for Eligibility, Enrollment, Member, Provider, Authorizations, Accumulators, Claims, Repricing, Payments and Reconciliation data in X12 834, 837 and NCPDP formats and as text files.

Exchanged data files on daily and monthly bases with vendors such as Aetna (Repricing Claims), FirstHealth (Repricing Claims), Burgess (Claims Editing), Foster Fuels (Enrollment), Quantum, CVS/Caremark (Eligibility, Accumulators), Inovalon, Aerial (Eligibility), CRM (Member Benefit), Medecision (Authorizations), Softheon (Exchange and Small Group), HRP (Large Group), HealthTrio (Member Portal, Provider Portal, Cost Transparency), PNC (Claims Payment), VHI, etc.

Providing APCD Eligibility, Provider, Pharmacy, Claim data feeds to VHI.
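At its simplest, the X12 file handling above amounts to splitting an interchange into segments and elements. A minimal sketch, assuming the common "~" segment terminator and "*" element separator (real files declare their delimiters in the ISA segment); the 834 fragment is illustrative, not one of PCHP’s actual feeds:

```python
# Minimal sketch: splitting a raw X12 interchange into segments and
# elements. Assumes "~" terminates segments and "*" separates elements;
# production code would read these from the ISA header instead.

def parse_x12(raw: str, seg_term: str = "~", elem_sep: str = "*"):
    """Return a list of segments, each a list of elements."""
    segments = []
    for seg in raw.strip().split(seg_term):
        seg = seg.strip()
        if seg:
            segments.append(seg.split(elem_sep))
    return segments

# Hypothetical fragment of an 834 enrollment transaction:
sample = "ST*834*0001~BGN*00*12456*20240101~INS*Y*18*030*XN*A*E~SE*4*0001~"
segs = parse_x12(sample)
# The first element of each segment is its ID, e.g. "ST", "BGN", "INS", "SE".
ids = [s[0] for s in segs]
```

Edits like the Tier1/Tier2 checks mentioned elsewhere in this resume would then operate on these parsed segments rather than on raw text.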

Working with the business team for identifying and resolving issues and defects.

Working closely with the UST Integration team, onshore and offshore, to resolve any issues.

Provide regular updates to business users and internal and external stakeholders to ensure successful delivery of the reports and data feeds.

Working closely with the Program/Project Manager to define the plan, tasks, dependencies and estimates.

Leading the team and directing work to development resources. Issue escalation and conflict resolution.

Ensuring successful delivery of project within time and budget.

Environment: AWS, SQL Server 2018, SSRS, JIRA, Visio

Oct 18 – Jun 21 Enterprise Data Architect/ Data Strategy Lead MN Housing Finance, Saint Paul, MN

Reporting to the CIO of Minnesota Housing Finance Agency (MHFA), worked as Enterprise Data Architect / Data Strategy Lead on Single Family’s BOSS project to implement ELC (Enterprise Lending Center), a new loan origination system by Mortgage Cadence, and to create a centralized data repository supporting a Management of Funds solution for disbursements, reporting, and data exchange with other systems including HDS Master Servicing and US Bank. The new ELC application runs on Microsoft Azure, and the solution is implemented on an Oracle database using Pentaho for ETL.

Worked with business users and stakeholders to identify MHFA’s data needs.

Worked with Single Family, Accounting and Finance divisions of the Agency to ensure continuity of business operations along with external Servicing partners and Lenders.

Defined the Enterprise Data strategy to be implemented by the agency.

Defined the to-be architecture to support MHFA’s data integration and reporting needs.

Helping with implementation and rollout of the new loan origination system ELC (Enterprise Lending Center) built on Microsoft Azure/SQL Server.

Document Data Flows and Process Flows.

Created Data Modeling standards and best practices.

Create Logical and Physical Data Models using ERStudio

Designed Source and Staging layer to land the incoming data before loading it into ODS using Pentaho.

Defined the ETL architecture for the movement of data across various systems

Helped create source-to-target data mappings and resolved issues.

ELC data is sourced multiple times a day from SQL Server database restored from Azure cloud.

Worked with ETL developers to extract/load data from ELC into ODS using Pentaho.

Built ETL workflows for sending data from the ODS to US Bank and HDS Master Servicing and back into the ODS.

Working with QA team for identifying and resolving issues and defects.

Working with developers to provide data needed for Management of Funds (MoF) application.

Working with developers to update the XML file used by the homegrown reporting application FIRM

Creating data visualization dashboards and cubes in Sisense to support business data analytics needs

Exploring Power BI capabilities to support current/future agency data needs

Work closely with DBAs to resolve any issues.

Capacity planning, database security, sizing and growth estimates.

Provide regular updates to business users and stakeholders to ensure successful delivery of the project.

Working very closely with the Program/Project Manager to define the plan, tasks, dependencies and estimates.

Leading the team and directing work to development resources. Issue escalation and conflict resolution.

Ensuring successful delivery of project within time and budget.

Environment: ERStudio Data Architect 18.2, SQL Server 2016, Oracle 12c, Pentaho, Sisense, Power BI

Oct 16 – Oct 18 Chief Data Architect / Data Management Leader FGL, Des Moines, IA

Working as Chief Data Architect/Data Management Leader for Fidelity & Guaranty Life (FGL) Insurance Company as part of the BigDig program for building a data platform to streamline processes and consumption of data from various sources to create timely, accurate, useful, single source of truth for reporting investment data for Investment Accounting/Financial Reporting, FP&A, Treasury, Risk Management and Audit. The solution is being implemented on SQL Server database using Informatica for ETL and Essbase and Tableau for BI and Reporting.

Worked with business users and stakeholders to identify FGL’s data needs as part of the BigDig project.

Established Data Management strategy to share, present, educate and promote procedures, processes and roadmaps with leadership and various business and architectural teams

Created the strategic layout of future Enterprise Architecture and Data Management plan.

Create the roadmap for the future Data Management strategy and the Enterprise Data platform

Facilitate on-going communications to leadership teams, Business leaders, Technology partners.

Conducted analyses of business architecture models and reviewed the IT processes (SDLC), Program/Project Management processes (PMO), Cost control, Governance structures.

Negotiate priorities and resolve conflicts among stakeholders and project team(s).

Played the role of an Evangelist to educate users and socialize solution designs and the value of Data Management and architecture to solve complex problems.

Led and coordinated efforts to write and establish standards, and to gain consensus on managing and governing enterprise data across data governance, data security, data operations, data management, data quality, data processing, and platform/architecture.

Engage with teams to ensure standards can be consumed and leveraged as a standard course of business.

Document the as-is and to-be System and Data Architecture

Defined to-be architecture to support FGL’s data management, sourcing, integration and reporting needs.

Architect and Design end-to-end Solution for the BigDig project to address business data needs.

Defined ETL architecture for the movement of data across various systems.

Defined Reporting Architecture and designed the solution to support FGL reporting needs.

For Release 1, built a platform to source the data from PAM for securities provided by SSKC/SSGS.

Extracted, transformed and loaded data from 50+ PAM Adaptor files for Securities, Forwards and Swaps, including positions, valuations and transactions data, and from PAM Supplemental files including mortgage data.

Applied a set of data quality checks and business rules to the incoming data; identified, measured, analyzed and remediated data quality problems through data analysis and information processing of large datasets.

Created Golden Investment Datasets (GID) for Asset, Income, Transactions, Unapplied Cash, Open Payable/Receivable, CML, etc., via processes that create Bronze (raw) datasets and apply adjustments to produce Silver (adjusted) datasets until they become Golden (approved) during the monthly PreClose, PAM Close and FGL Close.
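The Bronze-to-Silver-to-Golden promotion above can be sketched as a simple staged pipeline. Names, row shapes and the adjustment rule below are illustrative assumptions, not FGL’s actual implementation:

```python
# Sketch of the Bronze -> Silver -> Golden promotion: raw rows land as
# Bronze, keyed adjustments produce Silver, and a close-period approval
# step marks the dataset Golden. All names here are hypothetical.

def to_silver(bronze_rows, adjustments):
    """Apply keyed adjustments (e.g. manual corrections) to raw rows."""
    silver = []
    for row in bronze_rows:
        adj = adjustments.get(row["asset_id"], {})
        silver.append({**row, **adj, "stage": "silver"})
    return silver

def to_golden(silver_rows, approved: bool):
    """Promote only after the month-end close sign-off has happened."""
    if not approved:
        raise ValueError("dataset not approved for Golden promotion")
    return [{**row, "stage": "golden"} for row in silver_rows]

bronze = [{"asset_id": "A1", "market_value": 100.0, "stage": "bronze"}]
silver = to_silver(bronze, {"A1": {"market_value": 102.5}})
golden = to_golden(silver, approved=True)
```

Keeping each stage as a separate dataset preserves the raw feed for the reconciliations against GL data mentioned below.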

Providing a set of reconciliations for data received from PAM with GL data received from Oracle.

For Release 2, worked on the replacement and sunset of the ARD (Asset Reporting Database) by replicating its data and processes in the newly designed EDS.

Designed solution for data exchange with additional data sources Bloomberg, Moodys and Factset

Designed new reporting platform to replace the existing Asset Reporting Database (ARD)

Creating new capability for ARD reporting dataset combining data from various sources including PAM

Managed business users’ and stakeholders’ expectations to ensure successful delivery of the project.

Provided regular updates to upper management and business users.

Working with Program Manager to plan, define tasks/activities, dependencies, resource allocation.

Ensuring successful delivery of project within time and budget.

Environment: Erwin 9.6, SQL Server 2014, Informatica 10.1, Essbase, Tableau, MS Project, Visio

Jul 15 – Oct 16 Enterprise Data Architect / Lead TMNAS, Philadelphia, PA

Worked as Enterprise Data Architect / Lead for Tokio Marine North America Services (TMNAS), providing services to Philadelphia Insurance Company (PHLY). As part of PHLY’s SPIRE transformation program, Guidewire DataHub is implemented as the primary data repository for all core source systems (Rapidsure, OneShield/Philly Bond System, AQS, Guidewire BillingCenter, Guidewire ClaimsCenter, Guidewire PolicyCenter and APPS). The data from all these source systems needs to be integrated into the DataHub, which will be used as the centralized data repository for Reporting, Coverage Verification, Phly.com, SAP GL, ISO STAT, DMV, VIN Lookup and any other outbound integration services. The solution is being implemented on a SQL Server database using Informatica and BODS (Business Objects Data Services) as data integration tools.

Worked with business users and stakeholders to identify PHLY’s data needs.

Facilitate on-going communications to leadership teams, business leaders, Technology partners.

Created the strategic layout of the future EA plan and developed transition plans to reach EA goals.

For Release 1, sourced Rapidsure and Billing Center, including the Personal Auto line, custom LOBs (Collector Vehicle, supported by Personal Auto; Fitness Studio (GL, IM); Fitness Trainer (GL)) and Billing Center data (Commissions, Funds Transfer, Outgoing Payments, Hold, Region, User Information Services).

For Release 2, sourced AQS, OneShield (Philly Bond System) and Policy Decision, for ISO lines (GL, Commercial Auto, Property) and custom LOBs (Surety, Excess Liability, Non-ISO GL, Non-ISO PR, IM, D&O, E&O, EPL, Cyber Security Liability, BOP (ISO-based and Custom), Medical Liability, Crime, Umbrella, Manufacturers Liability).

Defined the to-be architecture to support PHLY’s data integration and reporting needs

Created Logical and Physical DataHub Integration Architecture

Created a Pre-Landing Zone (PLZ), identical to source, and a Landing Zone (LZ) of pipe-delimited files, for landing source data into TRF tables before loading it into the DataHub using BODS and then into InfoCenter.

Defined the ETL architecture for the movement of data across various systems.

Built a Policy Event Handler to log all Policy transactions as XML messages, via a service on the Enterprise Service Bus (ESB), for all LOBs into the PLZ.

Managed business users’ and stakeholders’ expectations to ensure successful delivery of the project.

Provide Guidance to various development teams to resolve issues

Helping the Project Manager with project planning, timelines, task dependencies and estimates.

Environment: ERStudio Data Architect 9.5, SQL Server 2012, Guidewire DataHub 8.3, Guidewire InfoCenter, BODS 4.2, Informatica 9.5, MS Project, Visio

Dec 13 – Jul 15 Enterprise Data Architect / Leader CSC (NYSDOH), Albany, NY

Working for CSC’s client, the New York State Department of Health (NYSDOH), on the Encounters Data Project, as the Affordable Care Act (ACA) requires the State Risk Adjustment and Reinsurance entities to collect claims and encounter data for actuarial analysis. The Office of Quality and Patient Safety (OQPS) defined the submission, unifying the QHP, Medicaid and CHP plan data as an intake for the envisioned Statewide All Payers Database (APD). The solution is being designed using IBM’s WebSphere Transformation Extender (WTX) for EDI, Data Stage, Cognos and Oracle.

Interviewed SMEs and business users.

Conducted JAD sessions to gather the data requirements. Documented Data Flows and Process Flows.

Worked with business users and stakeholders to identify NYSDOH data needs.

Working with users in identifying Tier1 and Tier2 edits to X12 and NCPDP formatted data to increase the quality of data for Post Adjudicated Claims Data Reporting (PACDR).

Architected the end-to-end solution for the Encounters Intake System (EIS).

Established data modeling standards and best practices.

Created Conceptual, Logical and Physical data models using Erwin for EIS from scratch.

Develop new data structures to store the historical PACDR data for all the EDI X12 837’s (298-Professional, 299-Institutional, 300-Dental) and NCPDP data formats for QHP (Qualified Health Plans) and MMC (Medicaid Managed Care).

Defined the ETL architecture for the movement of data across various systems.

Loading various data files including Member Validation data from eMedNY.

Create source to target data mapping for generating the MAEE (Medicaid Analytical Extract) extract from EIS (Encounter Intake System) for MDW (Medicaid Data Warehouse).

Worked with SMEs to identify the data and reporting requirements of the NYSOH.

Working on data management and data security plan for confidential and restricted information and PII.

Leading and Managing the entire development team to ensure successful implementation of Database Architecture and Design, ETL Design and reporting deliverables.

Environment: Erwin 9.5, Oracle 11gR2 (11.2.0.1), IBM InfoSphere Data Architect 8.1, IBM Data Stage 8.1

May 11–Nov 13 Enterprise Data Architect Farmers Mutual Hail Insurance, West Des Moines, IA

Working for Farmers Mutual Hail Insurance Company of Iowa (FMH) as Enterprise Data Architect, designing and implementing the Data Warehouse for their Insurance, P&C and ReInsurance lines of business. FMH is one of the leading providers of Crop Insurance to farmers in many states across the US and has P&C and ReInsurance divisions accepting risks from both domestic and international companies. The Enterprise Data Warehouse (EDW) was designed and implemented using Astera Centerprise Data Integrator and Logi Analytics (formerly LogiXML) to support reporting needs based on CloudNet, the newly developed in-house system replacing the old policy admin system eCrop.

Interviewed SMEs and business users, for both Multi-Peril Crop Insurance (MPCI) and Hail lines of business within Crop Insurance, to gather the data requirements.

Worked with business users and stakeholders to identify FMH’s reporting and data needs

Facilitate the presentations, education and on-going communications to leadership teams, business leaders, Technology partners.

Created the data warehouse architecture and roadmaps and got buy-in from upper management.

Evaluated various tools, ran POCs and performed vendor selection for ETL/Data Integration and Reporting tools.

Defined and documented the requirements using use cases. Documented Data Flows and Process Flows.

Established data modeling standards and best practices and created Conceptual, Logical and Physical data models using Sybase Power Designer.

Created the Bus Matrix for the various business processes across FMH and identified the Dimensions and Facts to be used across these business processes.

Designed Enterprise Data Warehouse including dimensions and fact tables using Kimball methodology

Created STAR schemas for Policy, Agency, Claims, Reserves, Financial data marts.
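A Kimball-style star schema of the kind described above pairs one fact table with conformed dimensions joined by surrogate keys. A minimal sketch; table and column names are hypothetical, not FMH’s actual model:

```python
# Illustrative star-schema shape for a Policy data mart: a fact table
# keyed by surrogate keys into dimension tables, plus a typical
# star-join aggregation. All names and values are made up.

dim_date = {1: {"date": "2013-06-01", "year": 2013}}
dim_agency = {10: {"agency_name": "Main Street Agency", "state": "IA"}}

fact_policy = [
    {"date_key": 1, "agency_key": 10, "written_premium": 2500.0},
]

def premium_by_state(facts, agencies):
    """Roll a fact measure up a dimension attribute (state)."""
    totals = {}
    for f in facts:
        state = agencies[f["agency_key"]]["state"]
        totals[state] = totals.get(state, 0.0) + f["written_premium"]
    return totals
```

In the warehouse itself these would be physical tables, with the surrogate keys assigned during the ETL load rather than hard-coded.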

Defined the ETL architecture for the movement of data across various systems.

EDW also housed some of the historical information from retired systems (TIP, AS400), P&C (Garvin Allen) and ReInsurance (SICS).

Worked with the SMEs of source systems to satisfy the data and reporting requirements of the government body RMA (Risk Management Agency), NCIS and other third parties.

Conducted various POCs and user focus groups with Agents and Adjusters to gather dashboard and reporting needs using LogiXML (Logi Info and Logi Ad Hoc).

Identified the reporting needs from the workflow and document management system ImageRight and its data integration needs with CloudNet.

Worked on reporting needs for Geospatial and GIS information in ESRI with CloudNet

Worked on a variety of old systems to be retired (TIP, AS400), moving their data into the EDW.

Working on the data security plan for confidential and restricted information and PII.

Implemented user level reporting security based on user types, roles and permissions.
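The user-level reporting security above amounts to filtering report rows by the requesting user’s role and entitlements. A small sketch under assumed role names and row shapes (not FMH’s actual scheme):

```python
# Illustrative row-level security filter: agents and adjusters see only
# their own agency's rows, managers see everything. Role names, the
# permissions table and row shapes are all hypothetical.

PERMISSIONS = {
    "agent":    {"scope": "own_agency"},
    "adjuster": {"scope": "own_agency"},
    "manager":  {"scope": "all"},
}

def visible_rows(rows, user):
    """Return only the report rows this user is entitled to see."""
    scope = PERMISSIONS[user["role"]]["scope"]
    if scope == "all":
        return rows
    return [r for r in rows if r["agency_id"] == user["agency_id"]]

rows = [{"agency_id": 1, "premium": 100}, {"agency_id": 2, "premium": 200}]
agent = {"role": "agent", "agency_id": 1}
```

In a BI tool this same rule would typically be pushed into the query layer (e.g. a mandatory WHERE clause per session) rather than applied after fetch.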

Helped with conversion effort from old policy admin system eCrop to CloudNet.

Vendor evaluation and vendor management.

Environment: Sybase Power Designer 16.5, 16.1, Oracle 11gR2 (11.2.0.1), TOAD 11, 10.5, Astera Centerprise Data Integrator 5.1, Logi Analytics (LogiXML) 11.x, ASP.NET, Perforce, MS Project 2007

Mar 10 – May 11 Enterprise Data Architect / Project Lead Wells Fargo, West Des Moines, IA

Working for Wells Fargo at their Home Mortgage headquarters as Enterprise Data Architect/ Project lead for Sourcing work stream, for sourcing and integration of data from 40 Home Mortgage related source systems for MIDE (Mortgage Integrated Data Environment) project. MIDE project is being implemented in Teradata environment for high volume parallel processing of data to support business needs.

Working as Enterprise Data Architect/ Project Lead for Sourcing Workstream leading team of 10 Analysts

Worked with the SMEs of source systems including Loan Origination, Servicing, Correspondent and Wholesale, Institutional Lending, Sales systems, etc.

Met with SMEs of source systems including LIS, LPS, AS400, PCLS, SMS, CORE, C&W Standard, IL Online, MSP/Fidelity, Phoenix, Profiler, Arrow, ACAPS, SalesLogix, CTS, AOW, ECPR, MYBOB, Oracle GL, PeopleSoft, etc.

Met with SMEs and various users to help identify what data should be sourced into MIDE.

Worked on a variety of source system platforms including Mainframes, Oracle, SQL Server, Sybase, extracted data files, user-maintained data, etc.

Set up meetings and worked with SMEs to identify the system of record (SOR) data to be sourced into MIDE from the respective source systems.

Interviewed the SMEs and business users of all 40 source systems to gather business data requirements.

Managing process for loading of metadata into EMR (Enterprise Metadata Repository).

Help in documenting business rules, transformations and source to target data mappings.

Working with source system experts and data stewards to resolve data issues.

Lead and Manage team of Data Modelers and Data Analysts adhering to budget and strict timelines

Defined Project plan detailing tasks, activities, dependencies, resource allocation

Environment: Sybase Power Designer 15, Teradata, AbInitio, MS Project 2007, Visio

Mar 09 – Mar 10 Enterprise Data Architect AVIVA USA, Des Moines, IA

Worked as an Enterprise Data Architect for AVIVA USA on a variety of projects. AVIVA is the sixth largest Insurance group in the world in Life Insurance and Annuities. Followed the Agile methodology and worked as Data Architect / Data Modeler for the Predictive Modeling project to help the business cut costs on ordering labs for Life Insurance applicants. Worked on PII and data security projects and a variety of other projects.

Met with users to identify the requirements for the Predictive Modeling project.

Interviewed the business users and stakeholders to gather the requirements.

Defined and documented the requirements using use cases.

Document Data Flows and Process Flows.

Creating Logical and Physical data models using Erwin.

Supported other application data models on DB2/UDB databases

Metadata creation, management and publishing.

Followed the Agile methodology for the Predictive Modeling Project

Worked with ETL developers to load data from third party vendors.

Working on the PII (Personally Identifiable Information) project, analyzing various applications and identifying as well as controlling user access.

Working on various projects and involved in tool evaluation.

Helping in Project planning, estimates and timelines.

Environment: Erwin 7.1, Oracle 10g, DB2/UDB, Informatica PowerCenter 8.1.1, TOAD 9.6, MS Project

Mar 08 – Mar 09 Data Warehouse Tech Lead MoneyGram, Minneapolis, MN

MoneyGram uses a packaged data warehouse solution provided by Jaros for their source system, Oracle Applications. Working for MoneyGram as Tech Lead for the data warehouse team, doing business analysis and design and delivering reporting needs based on the new AR module of Oracle Applications. Got trained on and used the packaged out-of-the-box (OOB) data warehouse solution by Jaros, capturing near-real-time Change Data Capture (CDC) in the ODS and nightly loads in the Data Marts. Used the various Jaros tools, Jaros Analytics and IDS (Information Directory Services), for Metadata management and for building the Cognos namespaces used by the users. Analyzed and gathered Business Requirements for reporting needs and ad hoc queries as well as ancillary applications. Helped design the ODS tables and star schemas for the custom tables built in Oracle Applications to support the business functionality.

Interviewed the business users and stakeholders.

Identified business data and reporting needs and obtained sign-off from the client on specifications.

Conducting Joint Requirements Development (JRD) sessions.

Worked with the data modelers to design the ODS tables and star schemas for custom tables in Oracle Applications supporting MoneyGram specific business needs.

Create Conceptual, Logical and Physical data models.

Created various tables to support the data needs of the ancillary applications

Designed structures around basic Customer Setup for various business units: Credit Management and Recovery, Legal Fraud & Compliance, Agent Servicing and Operational Support, Settlement Servicing, Product Management & Marketing, Finance and Accounting, etc.

Designed hierarchies for Agent and their relationships over time, Sales Rep, etc.

Helped resolve data issues by interacting with the SMEs and the business users.

Worked with Informatica ETL developers on loading ‘Near Real Time’ into the ODS and ‘Nightly’ into the Data Mart.

Created and maintained control tables for sourcing the data from the Oracle Applications source tables.

Monitored the overall development process and made sure deliverables were on time.

Used Jaros Information Directory Services (IDS) to define relationships between various entities and publish the model in Cognos Framework Manager.

Used Jaros Metadata Manager (MDM) to define and maintain the metadata so it can be published by the Jaros Help System for the business users.

Applied Fixes for enhancement to the OOB Jaros data warehouse solution using the Jaros Fixes Tool

Identify sensitive data elements and implement the information security policies

Decryption of the encrypted data elements in the Cognos Reports and Auditing the information both at record level and the request criteria level for PCI compliance.

Environment: Sybase PowerDesigner 12.5, Oracle 10g, Oracle Applications v11i, Informatica PowerCenter 8.1.1, Cognos 8i, TOAD 8.5, 9.6, PL/SQL Developer 7.1.4, MS Project 2007, Visio, WinCvs 2.0.2.4, HP Project and Portfolio Management (PPM) v 7.1, Jaros Analytics v 3.0.3, Jaros AMDR 1.19.7

July 07 – Mar 08 Enterprise Data Architect EOS Airlines, Purchase, NY

Working for EOS Airlines as an Enterprise Data Architect, creating the Architecture and Design of a new Data Warehouse. Analyzed and gathered Business Requirements. Designed Star and Snowflake schemas for Guest, PNR and VCR data from the primary source system, SABRE. Also worked on creating a guest database to cleanse, merge and create master guest profiles to support reporting on guest behavior and activity, targeting repeat guests, revenue/non-revenue guests, etc.

Interviewed the business users and stakeholders. Defined and documented the requirements.

Doing business


