Enterprises offer a multitude of channels for users to interact with, from traditional methods such as ATMs to modern ones such as digital wallets. By capitalizing on business events in real time, enterprises can unlock a wealth of benefits, to name a few:
- Enhanced Analytics Capabilities
- Improved Business Visibility
- Better Security
By centralizing and analyzing business events from various sources, enterprises can gain a comprehensive understanding of their business applications, enabling them to make more informed decisions and take proactive actions; in other words, to become an event-driven enterprise. However, events are diverse in nature, which is expected given the variety of industry standards and practices.
In this blog, I will explore how IBM App Connect can address this challenge.
IBM App Connect has been transforming and streamlining transactions for businesses worldwide. It offers a wide range of features and capabilities designed to help enterprises automate and optimize their integration requirements, ensuring seamless and efficient communication between various systems and applications. It also has built-in support for messages in multiple domains, such as BLOB, XML, JSON, and DFDL, and for a wide range of industry standards and messaging protocols, such as FIN, SWIFT, EDIFACT, and X12.
Let me demonstrate this by guiding you through an example that will accomplish the following tasks:
- Receive messages from IBM MQ
- Transform messages
- Stream messages to a Kafka cluster
Queuing System Configuration
The streaming queues feature of IBM MQ allows you to configure a queue to put a near-identical copy of every message to a second queue. The feature is useful when you need to create a copy of your messages without impacting ongoing communications.
I have created two local queues, DEV.PAY.IN and DEV.PAY.EVENTS. Messages will be queued in DEV.PAY.IN; simultaneously, a copy of every message will be stored in DEV.PAY.EVENTS.
DEFINE QLOCAL(DEV.PAY.EVENTS)
DEFINE QLOCAL(DEV.PAY.IN) STRMQOS(MUSTDUP) STREAMQ(DEV.PAY.EVENTS)
You can configure streaming queues in one of two modes: best effort and must duplicate. For details of the additional attributes on local and model queues that enable message streaming, see the IBM MQ documentation on configuring streaming queues.
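If an occasional missed duplicate is acceptable, the same pair of queues could instead be defined in best-effort mode; a minimal MQSC sketch, reusing the queue names above:

```
DEFINE QLOCAL(DEV.PAY.EVENTS)
DEFINE QLOCAL(DEV.PAY.IN) STRMQOS(BESTEF) STREAMQ(DEV.PAY.EVENTS)
```

With STRMQOS(BESTEF), a failure to deliver the copy to DEV.PAY.EVENTS does not affect the original put to DEV.PAY.IN; with STRMQOS(MUSTDUP), the original put fails if the copy cannot be made.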
Develop the Integration Flow
In this section, I will use App Connect to address the following three items:
- Receive messages from IBM MQ
- Transform messages
- Stream messages to a Kafka cluster
App Connect supplies built-in nodes that you can use to define your message flows; in this case, reading messages from MQ.
The MQInput node receives messages from a specified queue through a local or client connection to a queue manager. As mentioned earlier, I will be using the DEV.PAY.EVENTS queue.
Let’s explore the type of messages in this queue. The DEV.PAY.IN queue stores pacs.008 ISO 20022 messages. The pacs.008 format is used when banks exchange payments with other banks in payments clearing and settlement scenarios.
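For illustration, a heavily trimmed pacs.008 message might look like the following. The element names follow the ISO 20022 schema; the identifiers, names, and amount are invented:

```xml
<Document xmlns="urn:iso:std:iso:20022:tech:xsd:pacs.008.001.08">
  <FIToFICstmrCdtTrf>
    <GrpHdr>
      <MsgId>ABC123</MsgId>
      <CreDtTm>2023-05-01T10:00:00</CreDtTm>
      <NbOfTxs>1</NbOfTxs>
    </GrpHdr>
    <CdtTrfTxInf>
      <IntrBkSttlmAmt Ccy="USD">1000.00</IntrBkSttlmAmt>
      <Dbtr><Nm>Debtor Bank Customer</Nm></Dbtr>
      <Cdtr><Nm>Creditor Bank Customer</Nm></Cdtr>
    </CdtTrfTxInf>
  </FIToFICstmrCdtTrf>
</Document>
```

A real message carries many more mandatory elements (settlement information, agent identifiers, and so on); this fragment only conveys the general shape.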
Before streaming SWIFT messages to a Kafka cluster, the messages need to be converted to a friendlier structure. App Connect provides multiple options for developing the transformation logic, such as the graphical Mapping node, the ResetContentDescriptor node, the JavaCompute node, and the ESQL Compute node.
Extended Structured Query Language (ESQL) is a programming language available in App Connect to define and manipulate data within a message flow. ESQL is based on Structured Query Language (SQL), which is in common use with relational databases.
This ESQL code snippet converts an XML message into JSON format:
SET OutputRoot.Properties = InputRoot.Properties;
CREATE LASTCHILD OF OutputRoot DOMAIN('JSON');
CREATE FIELD OutputRoot.JSON.Data;
SET OutputRoot.JSON.Data = InputRoot.XMLNSC;
Let me explain the logic a little more:
- The first line, `SET OutputRoot.Properties = InputRoot.Properties;`, sets the properties of the `OutputRoot` object to the same values as those of the `InputRoot` object. This copies the message properties from the input to the output.
- The second line, `CREATE LASTCHILD OF OutputRoot DOMAIN('JSON');`, creates a new child node called `JSON` under the `OutputRoot` object and places it in the JSON domain. This allows me to work with the JSON data type in the following lines of code.
- The third line, `CREATE FIELD OutputRoot.JSON.Data;`, creates a new field called `Data` within the `JSON` child node. This field will hold the actual JSON data.
- The fourth line, `SET OutputRoot.JSON.Data = InputRoot.XMLNSC;`, assigns the `XMLNSC` value of the `InputRoot` object to the `Data` field of the `JSON` child node. This converts the XML data from the `InputRoot` object into JSON format.
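In a message flow, these statements would live inside the Main function of a Compute node's ESQL module; a minimal sketch, with an illustrative module name:

```
CREATE COMPUTE MODULE PaymentFlow_Compute
    CREATE FUNCTION Main() RETURNS BOOLEAN
    BEGIN
        -- Copy the message properties, then build the JSON output tree
        SET OutputRoot.Properties = InputRoot.Properties;
        CREATE LASTCHILD OF OutputRoot DOMAIN('JSON');
        CREATE FIELD OutputRoot.JSON.Data;
        SET OutputRoot.JSON.Data = InputRoot.XMLNSC;
        -- Returning TRUE propagates the message to the Out terminal
        RETURN TRUE;
    END;
END MODULE;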
Now that the messages are converted to the desired structure, I can simply publish them to a Kafka cluster.
You can use the KafkaProducer node to connect to the Apache Kafka messaging system and publish messages from a message flow to a topic, in this case DEV.PAY.EVENTS.
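To sanity-check the end of the pipeline, you can watch the topic with the standard Kafka console consumer; this assumes a broker listening on localhost:9092, which is the default for a local development cluster:

```
bin/kafka-console-consumer.sh --bootstrap-server localhost:9092 --topic DEV.PAY.EVENTS --from-beginning
```

Each message published by the flow should appear as a single JSON document on the console.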