Resume

Data Sales

Location:
Highland Park, New Jersey, United States
Posted:
December 17, 2018

Resume:

SRINIVASAN SHANMUGAM (SRINI)

917-***-****(C)

E-MAIL: ac7zoy@r.postjobfree.com

Professional Summary

Highly accomplished, results-oriented professional with a distinguished 19-plus-year global career leading the design, development, implementation, maintenance, and management of analytics platforms (on-premises, AWS, and Azure cloud) and application software systems, resulting in increased customer satisfaction and productivity gains. A natural communicator with strong motivational skills and the ability to succeed, with a proven record of helping organizations reduce cost and improve profitability.

Technical Background Summary:

Over 19 years of commercial software experience in systems analysis, design, development, and implementation, including 5 years designing and developing big data technologies for leaders in the financial industry and 14 years developing and administering middleware applications using TIBCO ActiveMatrix products and BPM (iProcess). Additional technical expertise in Sybase and Oracle, and UNIX system administration experience on SCO/HP/Sun/AIX UNIX and NT, with a constant willingness to learn new technologies. Extensive exposure to numerous industries, including finance, insurance, casino, medical, and logistics, developing software for distribution management, freight forwarding, payroll, inventory control, and production management.

Doctorate in Computer Science (2014): academics completed; thesis defense in progress – ABD (all but dissertation).

In my doctoral thesis, I implemented a machine-learning predictive model that allows a system to improve its outputs based on a learning algorithm and training data. Specifically, I used supervised learning on categorized training data to predict the value of any similar valid input.
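
The core idea above – predicting the value of a new input from its similarity to categorized training data – can be illustrated with a minimal nearest-neighbor sketch. This is pure Python for illustration only; the feature values and labels are invented, and the actual thesis model is more elaborate.

```python
# Minimal supervised-learning sketch: classify a new point by the label
# of its nearest categorized training example (1-nearest-neighbor).
# Feature vectors and labels below are made up for illustration.
import math

training_data = [
    ((1.0, 1.0), "low"),
    ((1.2, 0.8), "low"),
    ((8.0, 9.0), "high"),
    ((9.1, 8.4), "high"),
]

def predict(point):
    """Return the label of the training example closest to `point`."""
    _, label = min(
        training_data,
        key=lambda item: math.dist(item[0], point),
    )
    return label
```

A query near the "low" cluster, such as `predict((1.1, 0.9))`, returns the label of its closest categorized neighbor.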

Published Papers:

1. “A Novel Approach to Predictive Graphs using Big Data Technologies”, IEEE International Conference BigDataSecurity 2016 – Columbia University, NY

2. “Aspects of Data Cataloguing for Enterprise Data Platforms”, IEEE International Conference BigDataSecurity 2016 – Columbia University, NY

3. “An Approach to Eliminate Semantic Heterogeneity Using Ontologies in Enterprise Data Integration”
http://csis.pace.edu/~ctappert/srd2013/d2.pdf

4. “Heuristic Approach to Manage Semantic Heterogeneity and Data Inconsistency in Enterprise Data Integration”
http://csis.pace.edu/~ctappert/srd2014/c2.pdf

5. “Training Instructions Pattern Language”
www.hillside.net/plop/2013/

Technical Skills & Expertise:

Architecting Microsoft Azure Solutions (Azure Cloud)

Cloudera Certified Developer for Apache Hadoop (CCDH) - License: 100-013-361

Completed the course - Architecting the Enterprise from TOGAF

Areas of Expertise

Analytics

AI

ML

Application Outsourcing

Business Intelligence

Change Management

Data Visualization

Data Architecture

Data Migration & Integration

Application Architecture

Disaster Recovery

Systems Development

Resource management

Financial Reporting

Project Management

Process Improvement

Systems Integration

Strategic Planning

Big Data Technologies

Product Management

Summary of Technical Skills

Cloudera

QlikView

Qlik Sense

Tableau

TIBCO

SharePoint

Microsoft Office / Office 365

Microsoft SQL

Microsoft Visual Studio

Linux

Oracle 11g

AWS/AZURE

JavaScript

CSS

HTML5

Python

PostgreSQL

Trifacta

Podium

Summary of Organizational Skills

Work closely with cloud vendors and technology partners to understand new products, functionality, and features; work with technical and business teams on proposing and integrating new contributions into the program; and implement the solutions to holistically meet their requirements.

Work with leadership (client/internal) to budget, prioritize solutions, and maintain project engagements, making decisions and providing solutions seamlessly.

Ensure business needs are clearly understood and that the solutions implemented meet the needs and expectations of the business.

Take risks when necessary to get the job done, and, most importantly, learn quickly from mistakes while learning from and sharing with other team members and peers.

Hands-on in all aspects of the solutions engagement, including pre-sales, post-sales, architectural design sessions, driving proof of value/proof of concept, and driving final project engagements.

Lead all aspects of the engagement and solutions implementation, and supervise technical staff.

Provide guidance on the customer's cloud adoption model, and recommendations to overcome blockers.

Identify, validate, and grow opportunities to accelerate consumption in next-gen, high-potential customer accounts, in partnership with the sales team, by driving solution architecture.

Professional Experience:

Employee: Bardess Consulting LLC Jun’18 - Present

Role: Principal Architect & Big Data Practice Lead

Current Roles & Responsibilities

o Providing account leadership and technical consulting team management, and serving as the technical leader representing Bardess Group LLC to big data accounts

o Leading analytics design, specifying and choosing information system solutions

o Weighing functionality, data, security, integration, infrastructure, and performance

o Partnering with the organization to understand the overall business goals and departmental strategy, and producing information system solutions to meet needs

o Launching cloud-agnostic analytics products with partners MS Azure/Cloudera/AWS/Trifacta/Qlik

o Maintaining an awareness of current and past business solutions, particularly in areas of collaboration, that will support Bardess Group LLC in staying at or above industry standards

o Explaining technical concepts to lay audiences (our clients); inspiring and mentoring the technical data and analytics consulting team

o Providing technical sales support and leadership for key opportunities that require a strong and creative technical point of view

o Estimating costs and formulating business cases for IT solutions, considering cloud infrastructure

o Leading the Big Data practice strategy

o Maintaining Big Data solutions and contributing to standard delivery methodologies

o Supporting sales with demos, proofs of concept, webinars, and proposal development

o Assisting the customer experience team with Big Data solutions

o Developed and manage Zero2Hero™

o Partnering with other practice leads to support Bardess’ integrated solutions, including Zero2Hero™

o Designed the Kafka messaging platform; its scalability, data partitioning, low latency, and ability to handle a large number of diverse consumers make it a good fit for data integration use cases

o Applied Kafka to website activity tracking, operational metrics, log aggregation, and stream processing across various POCs

o Guided the team in administration, operations, and security of the Kafka platform, including provisioning, access lists, Kerberos, and SSL configuration

o Designed and led the teams to create topics, set up redundant clusters, and deploy monitoring tools and alerts, applying best practices

o Designed and managed the teams to create stubs for producers, consumers, and consumer groups to help onboard applications from different languages/platforms
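
The data-partitioning behavior mentioned above can be sketched in a few lines of pure Python. This is illustrative only, not the real Kafka client: actual producers hash keys with murmur2, whereas this sketch uses md5 for a stable stdlib hash, and the partition count and event names are made up. The point is that all messages with one key land on one partition, preserving per-key order for consumers.

```python
# Sketch of Kafka-style keyed partitioning (illustrative only).
import hashlib

def partition_for(key: str, num_partitions: int) -> int:
    """Map a message key to a partition deterministically."""
    digest = hashlib.md5(key.encode("utf-8")).digest()
    return int.from_bytes(digest[:4], "big") % num_partitions

# All events for one user land on one partition, preserving per-key order.
events = [("user-42", "login"), ("user-7", "click"), ("user-42", "logout")]
partitions = {}
for key, value in events:
    partitions.setdefault(partition_for(key, 6), []).append((key, value))
```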

o Worked with the data science team to decide between supervised and unsupervised models based on the requirements

o Followed a plan-prepare-execute model to avoid redoing algorithms or changing models because a specific segment was wrongly included or excluded

o Implemented the following libraries in the data science project:

scikit-learn for machine learning and statistical modeling, including classification, regression, clustering, and dimensionality reduction.

statsmodels for exploring data, estimating statistical models, and performing statistical tests; an extensive list of descriptive statistics, statistical tests, plotting functions, and result statistics is available for different types of data and each estimator.
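
What these libraries automate can be seen in miniature with ordinary least squares on a single feature. This is a pure-Python sketch with made-up data; scikit-learn's LinearRegression and statsmodels' OLS generalize the same closed-form idea to many features and add diagnostics.

```python
# Simple linear regression (y = intercept + slope * x) fitted by
# ordinary least squares. Data is invented: roughly y = 2x.
xs = [1.0, 2.0, 3.0, 4.0, 5.0]
ys = [2.1, 4.0, 6.2, 7.9, 10.1]

n = len(xs)
mean_x = sum(xs) / n
mean_y = sum(ys) / n

# Closed-form OLS estimate: slope = cov(x, y) / var(x)
slope = sum((x - mean_x) * (y - mean_y) for x, y in zip(xs, ys)) / \
        sum((x - mean_x) ** 2 for x in xs)
intercept = mean_y - slope * mean_x

def predict(x: float) -> float:
    """Predict y for a new x using the fitted line."""
    return intercept + slope * x
```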

Cloud Agnostic

Manage the AWS/Azure infrastructure and strategic vendor relationships, including development firms

Work with internal teams to create the migration process for legacy systems to the AWS/Azure cloud

Work with business unit managers to understand project scope and suggest possible alternatives

Work with the Security division to design and manage IAM roles for users

Work with several third-party vendors in big data and other areas to support clients’ overall cloud initiatives

Partner with the sales team to formulate and execute a sales strategy that exceeds revenue objectives through the adoption of AWS/Azure

Product developed and launched in the Microsoft Marketplace (Azure Cloud):

https://www.cloudera.com/solutions/gallery/bardess-customer360.html

Managed:

o Served as a Technology Architect at Bank of America (7 years) within the CTO organization’s Emerging Technology Services

o Managed a team of architects, engineers, and analytics platform support staff (L1/L2/L3), including various consultants/contractors for Bank of America in the US and India across three geographically dispersed datacenters (EC/WC & GC)

o Evaluated and architected infrastructure/analytics platform solutions aligned with the overall IT strategy; collaborated with CIO partners, enterprise architects, DBA teams, and UNIX SA teams in the support and deployment of IT solutions

o Enforced proactive monitoring and change management to minimize operational impact

o Clearly defined processes and procedures for the infrastructure team to adhere to and comply with bank regulations and other security policies

Notable Achievements:

o Recognized for consistently delivering solutions with the highest customer satisfaction that met or exceeded the vision

o Implemented enterprise (Bank/Wealth Management) standards, guidelines, and best practices for IaaS, PaaS, DBaaS, and analytics platforms

o Built a shared services platform for the CIO Enterprise Technology Shared Services Capabilities team

o Consolidated multiple co-located datacenters and applications onto one HaaS (Hadoop as a Service) platform; built a data lake for analytics platforms to help fill gaps and provide data lineage

o Managed the team through post-merger challenges, keeping it motivated and focused on the task at hand; migrated critical applications between datacenters to support consolidation efforts

Employee: Bank of America, NJ Jan’13 - May’18

Role: Technology Architect

Client Wellness: This project integrates proactive analytics, tools, and processes into the routines of field leadership, helping field leaders become more effective and proactive in identifying emerging opportunities and mitigating risk, and enabling financial advisors to deliver a better, more consistent client experience in the context of practice performance and enterprise strategies.

Also designed and built a data lake environment for wealth management clients to handle very large amounts of data.

Architectural Responsibilities:

Design and build scalable infrastructure and Big Data analytics platforms based on relevant tools and technologies

Drawing up detailed designs and interpreting business and technical requirements

Performing due diligence and analysis on clients’ large datasets and surfacing previously unknown insights

Leading and mentoring the big data analytics teams as well as the business partners

Bridging and providing solutions for issues arising between the technical (developer) and business (business analyst) teams

Designing reports using Tableau, Microsoft tools, and Cognos, extracting the data from the analytics platform

General Responsibilities:

Articulate the pros and cons of relevant (big data) technologies and platforms

Work with technical and non-technical business leaders on risks, roadmaps, and strategy

In big data projects, identify the use cases, solutions, and recommendations

Coordinate with big data program and project managers in the design, planning, and governance of implementation projects of any kind

Perform detailed analysis of business problems and (big data) technical environments and use this in designing the solution

Work creatively and analytically in a problem-solving environment

Work in teams, as a big data environment is developed by a team of employees with different disciplines

Work in a fast-paced agile development environment

Implement and support big data POC projects and environments

Involved in the architecture design, development, and implementation of Hadoop

Using Sqoop to import/export data into HDFS and Hive

Developed ad-hoc Hive queries for business users/analysts to meet their ad-hoc analysis needs

Developed an interface to validate ingestion data arrival in HDFS before the Hadoop process starts

Identified and solved performance issues in Hive through an understanding of joins, grouping, and aggregation and how they translate to MapReduce jobs
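
The translation mentioned above – a Hive GROUP BY becoming map, shuffle, and reduce phases – can be sketched in plain Python. This is illustrative only; the column names and data are made up.

```python
# How a Hive "SELECT dept, SUM(amount) ... GROUP BY dept" roughly
# translates to MapReduce: map emits (key, value) pairs, shuffle groups
# them by key, reduce aggregates each group.
from collections import defaultdict

rows = [("sales", 100), ("ops", 40), ("sales", 60), ("ops", 10)]

# Map phase: emit (dept, amount) pairs.
mapped = [(dept, amount) for dept, amount in rows]

# Shuffle phase: group all values by key (the expensive network step).
groups = defaultdict(list)
for key, value in mapped:
    groups[key].append(value)

# Reduce phase: aggregate each group.
totals = {key: sum(values) for key, values in groups.items()}
```

Understanding which part of a query lands in the shuffle is what makes join and aggregation tuning possible, since the shuffle dominates job cost.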

Setting up BDR jobs to back up the Hive and HDFS data

Designed and built job flows and data flows

Involved in data architecture, data ingestion pipeline design, Hadoop information architecture, data modeling, machine learning, and advanced data processing

Designed and developed partitioning and bucketing concepts in Hive, and managed and external tables in Hive, for optimized performance
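
Partitioning pays off because queries that filter on the partition column only touch one slice of the data. Here is a toy sketch of the idea in pure Python, not Hive itself; the table layout, column names, and dates are made up.

```python
# Toy model of Hive-style partitioning: rows are stored under their
# partition value, so a filter on the partition column reads only one
# slice instead of scanning the whole table.
from collections import defaultdict

partitions = defaultdict(list)  # partition value -> rows in that slice

def insert(row: dict) -> None:
    """File a row under its partition, like Hive's partition directories."""
    partitions[row["load_date"]].append(row)

insert({"load_date": "2018-01-01", "account": "A", "amount": 10})
insert({"load_date": "2018-01-02", "account": "B", "amount": 20})
insert({"load_date": "2018-01-02", "account": "C", "amount": 30})

# "WHERE load_date = '2018-01-02'" prunes the scan to a single partition.
hits = partitions["2018-01-02"]
```

Bucketing applies the same locality idea within a partition, assigning rows to a fixed number of files by hashing a column.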

Tuning Hive to improve performance

Developed a system for business users to write Hive queries, using Sqoop to preprocess the logs stored in HDFS

Developed a script to import semi-structured file content into HDFS; the files were preprocessed using Sqoop and the processed data imported into HDFS for business analysts to query with Hive

Designed and implemented log analytics and data analytics

Automated shell scripts for business processes and for data loading from different interfaces to HDFS

Designed and developed offloading of data from mainframe sources using Syncsort

Designed and developed Impala as an alternative technology for ad-hoc business requirements, so business analysts can run queries from Impala

Working on migrating Hive (HQL) queries to Spark SQL

Implemented the DR site

Employee: Bank of America, NY Sep’10 - Jan’13

Role: Solution Architect

GWIM T&O provides comprehensive technology solutions plus operations and client service support for the Global Wealth & Investment Management businesses. It achieves service and technology excellence through voice-of-the-client-driven change, innovative solutions, and strong partnerships with business leaders, while effectively managing risk. The GWIM BPM team provides SOA/BPM solutions across the bank.

Leading the gap analysis and infrastructure assessment (network, security, etc.)

Leading and developing the strategy and analytical plans, as well as capacity planning, for each application.

Recommend and negotiate solutions with application teams.

Promote the integration architecture vision to gather support IT- and business-wide.

Create visibility of integration capabilities.

Employee: Guardian Life Insurance, NY Dec’06 – Sep’10

Role: Infrastructure Architect

Joined Guardian Life Insurance as an Infrastructure Architect on the Guardian Integration Team. Implemented one of Guardian's major projects, Customer Experience (CE), beginning in 2007. The goal of the CE projects, implemented in workflow (iProcess) and EAI (BW/adapters), was to improve the customer experience through minimized processing time, improved transparency, and process streamlining. Built the EAI/iProcess servers from scratch and set them up for failover. Because the applications are mature and transaction volumes are very high, worked with application teams on performance tuning. Upgraded Guardian to iProcess 11.x and EAI 5.x.

iProcess

Mainframe

PHENIX

zOS

Lotus Notes

LDAP

BlazeRuleEngine

FileNet

Kofax(Imaging)

Exstream

WebSphere

MQ and MSMQ

Java

UDB DB2

SQL Server

Operational Responsibilities

Develop, deploy, refine, and maintain support processes.

Monitor 24x7 routine break/fix for EAI/iProcess

Perform capacity planning.

Automated the deployment of applications on the EAI/iProcess servers.

Maintain and repair integration components to keep them performing in accordance with technical and functional specifications.

Provided performance requirements for infrastructure

Provided consulting for application teams based on application performance.

Configure and tailor the EAI/iProcess servers and utility scripts that help the applications. (Example: in iProcess, when a BG service works directly with the BG tables, write a script to monitor the table; designed a common service (BW) for all the applications.)

Provide definitions for application backup/storage/recovery.

Recommend and negotiate solutions with application teams.

Promote the integration architecture vision to gather support IT- and business-wide.

Create visibility of integration capabilities.

Designed and deployed a failover plan for the EAI/iProcess servers

Maintaining upgrades and hotfixes across the environments

Support TIBCO server hardware and software

General Responsibilities

24/7 on call Production Support.

Support the developers and clients (end users); worked closely with users and developers to assess problems and to develop and implement resolutions.

Establish and enhance client relations during both the pre- and post-project lifecycle.

Worked on seamless integration with interrelated management tools.

Monitor the EAI/iProcess servers on the AIX and Windows platforms.

First point of observation for deployment and for starting and stopping processes if Hawk failed to start – normally a complicated restart that Hawk could not handle.

Manual sequencing: when processes come up after a period of being down, the administrator is responsible for making sure the correct sequence is followed, and has to coordinate with developers and the business.

Maintaining the log sheet of daily errors and resolutions.

Implemented custom/Hawk Accelerator rule bases.

Use TIBCO Hawk microagents and methods to monitor iProcess server and application health.

Watch the alerts/e-mails, determine why the alerts are happening, and take immediate action. (Example: notify the administrator by alert/e-mail with details of critical errors)

Resolved internal tickets opened in TeamTrack (workflow) by developers.

Worked closely with teams such as Situation Management and the application teams to resolve issues in PROD.

Streamlined the identification, tracking, and resolution of end-user issues with a complete audit trail.

Wrote shell scripts to clean up the production and non-production environments.

The above shell scripts were run via Hawk Schedule or cron jobs.
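
A cleanup job of the kind described above might look like the following. This is a Python sketch with made-up directory and retention values; the actual jobs were shell scripts run from Hawk Schedule or cron.

```python
# Sketch of an environment cleanup job: delete log files older than a
# retention window. The path and 14-day retention are illustrative.
import os
import time

def cleanup(log_dir: str, max_age_days: int = 14) -> list:
    """Remove *.log files older than max_age_days; return removed names."""
    cutoff = time.time() - max_age_days * 86400
    removed = []
    for name in os.listdir(log_dir):
        path = os.path.join(log_dir, name)
        if name.endswith(".log") and os.path.getmtime(path) < cutoff:
            os.remove(path)
            removed.append(name)
    return removed
```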

Provided sequential startup and shutdown instructions for the newly implemented system.

Upgraded TIBCO EAI to the latest version.

Provide end-user support and training for the user community.

Client: Allstate Insurance, Northbrook, IL July’06 – Dec’06

Role: Sr.TIBCO Administrator & Production Support

Allstate Insurance Enterprise Technology Services (ETS) delivers shared services, including code frameworks, integration infrastructure, and collaboration tools, and supports and maintains shared services within the enterprise. ETS is also responsible for managing and maintaining products and common infrastructure where clients can communicate, exchange ideas, and work as a cohesive team (Windows SharePoint Services).

Mainframe

SAP, MQ, and MSMQ Adapters

.NET

Oracle

Responsibilities

24/7 on call Production Support.

Support the developers and clients (end users); worked closely with users and developers to assess problems and to develop and implement resolutions.

Establish and enhance client relations during both the pre- and post-project lifecycle.

Managed both the inward-facing service desk and outward customer service requirements.

Worked on seamless integration with interrelated management tools.

Monitor the servers, including the Sun Solaris and Windows platforms.

First point of observation for deployment and for starting and stopping processes if Hawk failed to start – normally a complicated restart that Hawk could not handle.

Manual sequencing: when processes come up after a period of being down, the administrator is responsible for making sure the correct sequence is followed, and has to coordinate with developers and the business.

Maintaining the log sheet of daily errors and resolutions.

Use TIBCO Hawk microagents and methods to monitor system and application health.

Implemented custom/Hawk Accelerator rule bases.

Set up rules so the TIBCO Hawk agent can manage exception conditions.

Hawk is used to create rule bases that watch system resources and alert the necessary personnel according to the situation.

Watch the alerts/e-mails, determine why the alerts are happening, and take immediate action. (Example: notify the administrator by alert/e-mail with details of critical errors)

Upgraded EMS 4.2 to 4.5 and Hawk 4.2 to 4.6.

Deployment on UNIX/Windows (scripted deployment / TIBCO Admin).

Configured and deployed projects (EARs) in the PROD, pre-PROD, QA, UAT, and DEV environments; conducted and documented testing prior to production launch.

Resolved tickets opened in USD (Unicenter ServicePlus Service Desk) by clients and developers.

Worked closely with teams such as Change Order Management, Issue Management, and Problem Resolution.

Created release-notes documents to provide support personnel with support instructions.

Streamlined the identification, tracking, and resolution of end-user issues with a complete audit trail.

Wrote shell scripts to clean up the production and non-production environments.

The above shell scripts were run via Hawk Schedule or cron jobs.

Provided sequential startup and shutdown instructions for the newly implemented system.

Developed auditing and error-handling schemas for BW processes.

Upgraded TIBCO BusinessWorks 5.2 to 5.3.

Provide end-user support and training for the user community.

Environment:

(TIBCO BusinessWorks 5.3, TIBCO EMS 4.1/4.2, TIB/Hawk 4.6, Oracle 10g, Toad, Windows 2003 Advanced Server, Sun Solaris 5.8, MeasureWare, Reflection/Putty, VSS, USD)

Client: PepsiCo. Chicago (TIBCO PSG) Mar’ 06 – June’06

Role: Sr. TIBCO Administrator & Production Support

Worked on PepsiCo’s SAP supply chain systems, using TIBCO to integrate delivery data with back-end corporate databases: streamlining distribution and delivery, improving planning and forecasting, increasing information transparency, and linking supply chain and inventory processes with customer-facing activities.

Mainframe, AS400

SAP

Oracle & Sybase

24/7 on call Production Support.

Manual sequencing: when processes come up after a period of being down, the administrator is responsible for making sure the correct sequence is followed, and has to coordinate with developers and the business.

Servers include the HP-UX and Windows platforms.

Configured and deployed projects in the PRODUCTION and UAT environments.

Resolved tickets opened in ClearQuest by clients and developers.

Worked closely with the Change Order Management team.

Recycle the ADMIN, HAWKAGENT, and JMS servers.

Using a JMS client to publish and subscribe messages to topics and queues.

Maintaining the log sheet of daily errors and resolutions.

Use TIBCO Hawk microagents and methods to monitor system and application health.

Used HP MeasureWare to collect metrics from the servers – machine-level resources such as CPU, memory, and disk space – and print an Excel report.

Implemented custom/Hawk Accelerator rule bases.

Watch the alerts/e-mails, determine why the alerts are happening, and take immediate action. (Example: notify the administrator by e-mail with details of critical errors)

Converted manual processes, such as queue performance checks and file manipulation, into automated processes in BusinessWorks.

Upgraded TIBCO BusinessWorks 5.1 to BusinessWorks 5.2/5.3.

Upgraded EMS 4.2 to 4.5 and Hawk 4.2 to 4.6.

Involved in TIBCO application migrations.

Involved in DR.

Created landscape Visio diagrams and technical documents.

Checking the status of the JMS server and of the queues: consumer count, pending messages, and durables. If Hawk fails to restart in any of the above-mentioned conditions, the administrator has to take control.

Services (BW processes, adapters): if a service is stopped or suspended or a process is taking too long, monitoring the log files, looking for errors, checking dependencies, and checking directories for file accumulation.

Trained employees in the use of BusinessWorks and JMS so that the knowledge is spread among employees.

Environment:

(TIBCO BusinessWorks 5.3, TIBCO EMS 4.1/4.2, TIB/Hawk 4.6, Oracle 10g, Toad, Windows 2003 Advanced Server, HP-UX, MeasureWare, Reflection/Putty, StarTeam, ClearQuest)

Client: Smart and Final, LA, CA (TIBCO PSG) May’ 05 – Feb 06

Role: TIBCO Administrator & Production Support

Smart & Final has selected TIBCO’s business integration software to integrate a supply-chain system and pricing system that helps maximize business agility. Smart & Final, a leading grocery wholesaler, has been a pioneer of innovative business processes in the grocery industry. The current integration plan includes providing consistent and generic enterprise business processes and operations that involve interactions with the following software systems:

G.O.L.D.

Mainframe

Lawson

Oracle

Responsibilities:

24/7 on call Production Support.

Set up the entire development environment with TIBCO BusinessConnect 5.X, TIBCO BusinessWorks 5.X, and TIBCO EMS 4.X.

Implemented the new TIBCO BusinessConnect version at the client site.

Set up BusinessConnect 5.0 with transports, trading partners, business agreements, and transactions.

Servers include the AIX and Windows platforms; these are monitored using Hawk and manually, with the results displayed on the SL (dashboard) console.

First point of observation for deployment and for starting and stopping processes if Hawk failed to start – normally a complicated restart that Hawk could not handle.

Manual sequencing: when processes come up after a period of being down, the administrator is responsible for making sure the correct sequence is followed, and has to coordinate with developers and the business.

Configured and deployed projects in the PRODUCTION and TESTING environments; conducted and documented testing prior to production launch.

Some situations require that the Hawk rules be taken off for a period and then introduced again – for example, when the database is down, the downstream services have to be shut down manually; otherwise error messages will accumulate.

Recycle the ADMIN, HAWKAGENT, and JMS servers.

Using a JMS client to publish and subscribe messages to topics and queues.

Maintaining the log sheet of daily errors and resolutions.

Use TIBCO Hawk microagents and methods to monitor system and application health.

Set up rules so the TIBCO Hawk agent can manage exception conditions.

Hawk is used to create rule bases that watch system resources and alert the necessary personnel according to the situation.

Considerations include both machine-level resources (CPU, memory, disk space) and application-level resources (such as ledger sizes and database issues).
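
The rule-base pattern described above – a metric source, a threshold test, and an action – can be sketched in a few lines. This is pure Python for illustration only; Hawk expresses the same thing with microagent methods, tests, and actions, and the metric names and thresholds here are made up.

```python
# Sketch of a monitoring rule base: each rule pairs a metric test with
# an action, like a Hawk rule pairing a microagent method with an alert.
alerts = []

def alert(message: str) -> None:
    alerts.append(message)  # stand-in for notifying personnel by e-mail

rules = [
    ("cpu_pct",  lambda v: v > 90, "CPU above 90%"),
    ("disk_pct", lambda v: v > 80, "Disk above 80%"),
    ("mem_pct",  lambda v: v > 95, "Memory above 95%"),
]

def evaluate(metrics: dict) -> None:
    """Fire the action for every rule whose test matches a metric."""
    for name, test, message in rules:
        if name in metrics and test(metrics[name]):
            alert(message)

evaluate({"cpu_pct": 97, "disk_pct": 42, "mem_pct": 60})
```

Keeping rules as data, separate from the evaluation loop, is what makes it easy to take a rule offline temporarily (for example, during a planned database outage) and reintroduce it later.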

Hawk and SL can be used to collect data for performance metrics.

Watch the alerts, determine why the alerts are happening, and take immediate action. (Example: notify the administrator by e-mail with details of critical errors)

Converted manual processes, such as moving and deleting files, into automated processes in BusinessWorks.

Upgraded TIBCO BusinessWorks 5.1 to BusinessWorks 5.2/5.3.

Upgraded EMS 4.2 to 4.5 and Hawk 4.2 to 4.6.

Implemented Business Factor, designed to address business problems.

Checking the status of the JMS server and of the queues: consumer count, pending messages, and durables. If Hawk fails to restart in any of the above-mentioned conditions, the administrator has to take control.

Services (BW processes, adapters): if a service is stopped or suspended or a process is taking too long, monitoring the log files, looking for errors, checking dependencies, and checking directories for file accumulation.

Trained employees in the use of BusinessConnect, BusinessWorks, and JMS so that the knowledge is spread among employees.

Environment:

(TIBCO BusinessWorks 5.2/5.3, TIBCO EMS 4.1/4.2, TIB/Hawk 4.5, BusinessConnect 5.0, Business Factor 5.1, Oracle 9i/10g, Java 1.2, XML, Windows 2003 Advanced Server, AIX, SL)

Client: Wynn Resorts, Las Vegas (TIBCO PSG) Mar’ 05 – May’05

Role: TIBCO Developer

Wynn Resorts has selected TIBCO as its enterprise integration vendor. It plans to use TIBCO software as the basis of the Wynn Integration System (WISE). The current integration plan includes providing consistent and generic enterprise business processes and operations that involve interactions with the following software systems:

Micros Opera PMS

Acres CMS

Blue Martini CRM and Data warehouse

First Logic Address Correction Engine

The above system interfaces include the following items:

EMS Queue and Topic Names

HTTP/SOAP URLs and Ports

WSDL and XSD Schema Files

Responsibilities:

Communication between participating systems occurs using XML formatted messages. Transports include HTTP and JMS-compatible queues.

Most interactions are conducted using Web Services or XML-messaging through JMS queues.

All messages are defined using XML Schemas.

Systems are decoupled from each other, and communicate using the central message transports accessible through HTTP or JMS queues.

TIBCO Business Works handles all message schema translations, message routing, and core enterprise event business processes.

Environment:

TIBCO (TIBCO BusinessWorks 5.2, TIBCO EMS 4.1, TIB/Hawk 4.5), Java 1.2, XML, Windows 2003 Advanced Server.

Client: Fashion Logistics Inc, NJ Mar’ 04 – March


