Producing and consuming messages in Kafka from CICS applications

• Messaging and events
• What is Kafka?
• CICS use cases with Kafka
• Code example
Apache Kafka is a distributed event streaming platform typically used for high-performance data pipelines, streaming analytics, data integration, and interconnecting mission-critical applications. This session introduces options for CICS applications to produce and consume messages in Kafka, including example code that uses the Kafka API in CICS.
Messaging is essential for building fully connected, efficient and scalable solutions
• Messages must get through, no questions asked. The system must be secure and distributed.
• Must be frictionless, scalable and lightweight. The system must be simple to exploit, invisible to the application.
• Must maintain a projection of the data for efficient, repeatable processing. The system must scale for large data volumes.
Architectural patterns:
• Messages and events for communication: systems rely on messages and events to communicate between services, not just within single applications, but across organisations and between them.
• Events for data persistence: events represent past state changes; retaining a record enables replay and blurs the line between the role of messaging and data storage.
Message queueing vs event streaming
[Comparison chart contrasting message queuing, which offers transient data persistence, targeted reliable delivery, and request / reply interactions, with event streaming.]
What is Apache Kafka?
Apache Kafka is an open-source distributed event streaming platform used by thousands of companies (more than 80% of Fortune 100 companies) for high-performance data pipelines, streaming analytics, data integration, and mission-critical applications. Source: https://kafka.apache.org/
Getting started

Kafka
• For its use in extending applications, analytics, data lakes, and integration with existing systems, see the IBM Garage Event Driven Reference Architecture. This also discusses options to replicate data at rest, such as in Db2, and operational data, such as z/OS log and SMF data.
• Message queues vs event streams - Making sense of queues and event streams - Session 1BA, today Wed 10 Nov @ 12:00 GMT.
• Getting started, for the key concepts and terminology of event streaming, Kafka features, and APIs.
• Spring for Apache Kafka applies core Spring concepts to the development of Kafka-based messaging solutions. It provides a "template" as a high-level abstraction for sending messages (see the sketch after this list).

CICS
• Get started with Java in CICS – how it works, design choices for Java apps, CICS APIs, trying it out, …
• Java in CICS samples on GitHub.
• CICS and Kafka integration blog with code snippets.
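As an illustration of the Spring approach, the following is a minimal sketch (not code from the session) of sending a message with Spring for Apache Kafka's KafkaTemplate; the broker address, topic name, and payload are placeholders.

import java.util.HashMap;
import java.util.Map;

import org.apache.kafka.clients.producer.ProducerConfig;
import org.apache.kafka.common.serialization.StringSerializer;
import org.springframework.kafka.core.DefaultKafkaProducerFactory;
import org.springframework.kafka.core.KafkaTemplate;
import org.springframework.kafka.core.ProducerFactory;

public class SpringKafkaSketch {
    public static void main(String[] args) {
        // Producer configuration; the broker address is a placeholder
        Map<String, Object> config = new HashMap<>();
        config.put(ProducerConfig.BOOTSTRAP_SERVERS_CONFIG, "mykafka.example.com:9092");
        config.put(ProducerConfig.KEY_SERIALIZER_CLASS_CONFIG, StringSerializer.class);
        config.put(ProducerConfig.VALUE_SERIALIZER_CLASS_CONFIG, StringSerializer.class);

        ProducerFactory<String, String> factory = new DefaultKafkaProducerFactory<>(config);
        KafkaTemplate<String, String> template = new KafkaTemplate<>(factory);

        // The template hides the underlying KafkaProducer behind a simple send call
        template.send("stock.orders", "order-0001", "{\"item\":\"0040\",\"quantity\":2}");
        template.flush();
    }
}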
CICS application options to interact with Kafka

There are many ways to interact with Kafka instances - some of the primary ones include:
1. Develop a Java application using the Kafka APIs – consumer, producer, streams.
   • Or use one of the clients that provide bindings for Node.js, C, and other languages.
2. Use Kafka Connect to copy / replicate data with other systems. This also provides REST APIs.
3. Use the Kafka command line tools. These are particularly useful when experimenting and writing automation scripts.

Main options for CICS applications:
1. Write a CICS application in Java that uses the Kafka API, packaged using OSGi, Jakarta, or the Spring framework.
2. Write a CICS application to call the Kafka REST APIs, for example via z/OS Connect EE, or your own Java REST component (see the sketch after this list).
3. Write a CICS application to get/put messages to IBM MQ, and then replicate the IBM MQ queue with a Kafka topic using either:
   • the Kafka Connect source connector for IBM MQ: this is an open source project and can run on z/OS co-located with IBM MQ, or on another platform, or
   • the Confluent IBM MQ Sink Connector for Confluent Platform.
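To illustrate option 2, here is a minimal sketch assuming a hypothetical Kafka REST proxy endpoint; the URL, topic name, payload structure, and media type are assumptions and would need to match the REST service actually deployed (for example the Confluent REST Proxy or a z/OS Connect EE API).

import java.net.URI;
import java.net.http.HttpClient;
import java.net.http.HttpRequest;
import java.net.http.HttpResponse;

public class KafkaRestSketch {
    public static void main(String[] args) throws Exception {
        // Hypothetical REST endpoint that publishes records to the "stock.orders" topic
        String url = "https://kafka-rest.example.com/topics/stock.orders";
        // Hypothetical payload format; the real format depends on the REST service used
        String body = "{\"records\":[{\"value\":{\"item\":\"0040\",\"quantity\":2}}]}";

        HttpClient client = HttpClient.newHttpClient();
        HttpRequest request = HttpRequest.newBuilder(URI.create(url))
                .header("Content-Type", "application/vnd.kafka.json.v2+json")
                .POST(HttpRequest.BodyPublishers.ofString(body))
                .build();

        HttpResponse<String> response = client.send(request, HttpResponse.BodyHandlers.ofString());
        System.out.println("HTTP status: " + response.statusCode());
    }
}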
CICS application options to interact with Kafka

[Diagram: CICS applications, CICS application events, and CICS applications with IBM MQ queues act as producers to a Kafka cluster; external systems are connected via source and sink connectors and stream processors; consumers read from the cluster.]

CICS use cases with Kafka:
• CICS application - producer, consumer
• CICS application event - producer
• CICS app and IBM MQ on z/OS, with a source connector / sink on z/OS or distributed - producer, consumer
CICS application considerations when interacting with Kafka

1. Transactional scope between Kafka and resources being updated in CICS, Db2, IMS, and MQ.
   • Background reading: Does Apache Kafka do ACID transactions? and Transactions in Apache Kafka.
2. Network latency and capacity to the Kafka instance, which may adversely affect CICS transaction response times.
   • Ensure connection pooling is used, e.g. using a Java object pooling framework (see the sketch after this list).
4. Facilities to format / transform the data appropriately for consumption by consumers of the topic, for example from EBCDIC binary format into JSON.
5. Skills to develop Java applications and set up CICS.
6. Existing use of products such as IBM MQ for z/OS and z/OS Connect EE.
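As one way to avoid creating a new connection per transaction, here is a minimal sketch (an assumption, not code from the session) that reuses a single KafkaProducer across CICS tasks in a JVM server. KafkaProducer is thread-safe, so sharing one instance is a common alternative to a full object pool; the broker address and serializers below are placeholders.

import java.util.Properties;

import org.apache.kafka.clients.producer.KafkaProducer;
import org.apache.kafka.clients.producer.ProducerConfig;

public final class ProducerHolder {
    // One shared producer per JVM server; KafkaProducer is thread-safe
    private static volatile KafkaProducer<String, String> producer;

    private ProducerHolder() {
    }

    public static KafkaProducer<String, String> get() {
        if (producer == null) {
            synchronized (ProducerHolder.class) {
                if (producer == null) {
                    Properties props = new Properties();
                    // Placeholder broker address and serializers
                    props.put(ProducerConfig.BOOTSTRAP_SERVERS_CONFIG, "mykafka.example.com:9092");
                    props.put(ProducerConfig.KEY_SERIALIZER_CLASS_CONFIG,
                            "org.apache.kafka.common.serialization.StringSerializer");
                    props.put(ProducerConfig.VALUE_SERIALIZER_CLASS_CONFIG,
                            "org.apache.kafka.common.serialization.StringSerializer");
                    producer = new KafkaProducer<>(props);
                }
            }
        }
        return producer;
    }
}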
Example CICS use cases with Kafka

Where the CICS app is a producer of messages:
• When a CICS application is processing an insurance claim, capture summary data using a CICS application event, format the data into JSON, and write it to a Kafka topic for consumers to aggregate into a dashboard that plots claims on a map to spot trends.
• When a CICS application updates stock levels, put the new stock levels into a CICS container and link to a program to format them into JSON and write them to a Kafka topic, to be consumed by an AI program that evaluates when parts are likely to need re-ordering.
• When a transaction rate reaches a threshold, use a CICS policy to capture the current rate, format the data into JSON, and write it to a Kafka topic to trigger an early warning on a dashboard.

Where the CICS app is a consumer of messages:
• Consume messages from IoT devices that indicate when stock has arrived in a warehouse, to update the inventory in a VSAM file.
Customer scenario: CICS COBOL core banking app with Java EE message producer to Kafka

[Architecture diagram: on z/OS, a CICS TS 5.6 region runs the core banking COBOL application and a CICS Liberty JVM server. A controller gets the data from the COBOL container (GET CONTAINER), transforms it to JSON, and a Spring Kafka client produces the message to a topic on a non-z/OS Kafka server, which feeds an async cash management platform and corporate portal. Two security options are shown: the Kafka client uses SSL directly, with the CA certificate and client certificate held in RACF, or the JVM server sends without SSL and AT-TLS in the TCP/IP address space provides the SSL connection; the Kafka server holds the CA certificate and server certificate.]
CICS catalog manager code example

1. When a stock item is ordered (LINK to program DFH0XSOD), a CICS application event is used to capture stock order details.
CICS application event
The application capture point is LINK PROGRAM.
CICS application event
Define a CICS transaction definition for KAFP that specifies program KAFKAPRO.
EmitOrder.java
The Java annotation results in the CICS program being installed when the Java program is installed.
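The slide shows the EmitOrder.java source as a screenshot; as a rough sketch only, the class might look something like the following, assuming the JCICS @CICSProgram annotation, a hypothetical container name for the event data, an assumed EBCDIC codepage, and the SimpleProducer helper shown on the next slide.

import java.nio.charset.Charset;

import com.ibm.cics.server.Channel;
import com.ibm.cics.server.Container;
import com.ibm.cics.server.Task;
import com.ibm.cics.server.invocation.CICSProgram;

public class EmitOrder {

    // The annotation causes a CICS PROGRAM resource named KAFKAPRO to be
    // installed when the Java application is installed in the JVM server
    @CICSProgram("KAFKAPRO")
    public void emitOrder() {
        try {
            Task task = Task.getTask();
            Channel channel = task.getCurrentChannel();

            // Hypothetical container name; the event binding determines the real one
            Container container = channel.getContainer("ORDERDATA");
            byte[] data = container.get();

            // Convert the EBCDIC event data to a Java String (codepage is an assumption)
            String order = new String(data, Charset.forName("IBM1047"));

            // Build a simple JSON payload and send it to a Kafka topic (topic name assumed)
            String json = "{\"order\":\"" + order.trim() + "\"}";
            new SimpleProducer().send("stock.orders", null, json);
        } catch (Exception e) {
            e.printStackTrace();
        }
    }
}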
SimpleProducer.java
Create a Kafka producer.
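The SimpleProducer.java code is also shown as a screenshot; below is a minimal sketch of what such a class might look like, assuming it loads its settings from the two properties files named on the next slide and uses the standard Kafka producer API.

import java.io.InputStream;
import java.util.Properties;

import org.apache.kafka.clients.producer.KafkaProducer;
import org.apache.kafka.clients.producer.ProducerRecord;
import org.apache.kafka.clients.producer.RecordMetadata;

public class SimpleProducer {

    private final KafkaProducer<String, String> producer;

    public SimpleProducer() throws Exception {
        // Load connection and producer settings from properties files on the classpath
        // (file names follow the slides; their exact contents are an assumption)
        Properties props = new Properties();
        try (InputStream in = SimpleProducer.class.getResourceAsStream("/kafkaInstance.properties")) {
            props.load(in);
        }
        try (InputStream in = SimpleProducer.class.getResourceAsStream("/kafkaProducer.properties")) {
            props.load(in);
        }
        producer = new KafkaProducer<>(props);
    }

    public void send(String topic, String key, String value) throws Exception {
        ProducerRecord<String, String> record = new ProducerRecord<>(topic, key, value);
        // send() is asynchronous; get() waits for the broker acknowledgement
        RecordMetadata metadata = producer.send(record).get();
        System.out.printf("Sent to %s partition %d offset %d%n",
                metadata.topic(), metadata.partition(), metadata.offset());
    }

    public void close() {
        producer.close();
    }
}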
Kafka properties
kafkaInstance.properties and kafkaProducer.properties
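The slide shows the two properties files; as an illustration only, they might contain settings along these lines. The broker address, keystore paths, and passwords are placeholders, and the exact keys used in the session are not reproduced here.

kafkaInstance.properties (connection and security settings, assumed):
bootstrap.servers=mykafka.example.com:9093
security.protocol=SSL
ssl.truststore.location=/u/cics/kafka/truststore.jks
ssl.truststore.password=changeit
ssl.keystore.location=/u/cics/kafka/keystore.jks
ssl.keystore.password=changeit

kafkaProducer.properties (producer behaviour, assumed):
key.serializer=org.apache.kafka.common.serialization.StringSerializer
value.serializer=org.apache.kafka.common.serialization.StringSerializer
acks=all
client.id=cics-kafka-producer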
Producing and consuming messages in Kafka from CICS applications
• Messaging and events
• What is Kafka?
• CICS use cases with Kafka
• Code example
Also…
GSE UK Virtual Conference 2021
Virtually the best way to learn about Z