Producing and consuming messages in Kafka from CICS applications


GSE UK Virtual Conference 2021

Virtually the best way to learn about Z

Producing and consuming messages in Kafka from CICS applications
Mark Cocker
IBM, CICS Transaction Server Product Manager
mark_cocker@uk.ibm.com
November 2021
Session 4AY

GSE UK Conference 2021 Charity Raffle


• The GSE UK Region team hope that you find this presentation and those that follow useful, and that they help to expand your knowledge of z Systems.
• Please consider showing your appreciation by kindly donating to our charities this year, the Royal National Lifeboat Institution (RNLI) and Guide Dogs for the Blind. Then follow the link on your receipt to enter your receipt number and the amount donated into the GSE Raffle. You will get a raffle entry for every pound donated.
• Follow the link below or scan the QR Code:
http://uk.virginmoneygiving.com/GuideShareEuropeUKRegion
Producing and consuming messages in Kafka from CICS applications

• Messaging and events
• What is Kafka?
• CICS application options and considerations to interact with Kafka
• CICS use cases with Kafka
• Code example
• Questions?
Apache Kafka is a distributed event streaming
platform typically used for high-performance data
pipelines, streaming analytics, data integration, and
interconnecting mission-critical applications. This
session introduces options for CICS applications to
produce and consume messages in Kafka, including
example code of the Kafka API in CICS.
Messaging is essential for building fully connected, efficient and scalable solutions

Application patterns:
• Critical exchange of information from one system to another: messages must get through, no questions asked. The system must be secure and distributed.
• Real-time event notification for microservice scalability: must be frictionless, scalable and lightweight. The system must be simple to exploit, invisible to the application.
• Event streaming for data caching and processing: must maintain a projection of the data for efficient repeatable processing. The system must scale for large data volumes.
Architectural patterns:
• Messages and events for communication: systems rely on messages and events to communicate between services, not just within single applications, but across organisations and between them.
• Events for data persistence: events represent past state changes; retaining a record enables replay, and blurs the line between the role of messaging and data storage.
Message queueing vs event streaming

Message queuing: transient data persistence, targeted request / reply, reliable delivery.

Event streaming: stream history, scalable consumption, immutable data.

Making sense of queues and event streams (application patterns, architectural patterns, differences and similarities): Session 1BA, today Wed 10 Nov @ 12:00 GMT.
What is Apache Kafka?
Apache Kafka is an open-source distributed event streaming platform used by
thousands of companies (more than 80% of Fortune 100 companies) for high-
performance data pipelines, streaming analytics, data integration, and mission-
critical applications. Source https://kafka.apache.org/

[Diagram: producers and consumers exchange messages through a Kafka cluster; source and sink connectors link it to external systems, and stream processors transform data in between.]

Kafka use cases:
• Pub/sub messaging
• Website activity tracking
• Metrics
• Log aggregation
• Stream processing
• Event sourcing
• Commit log
Getting started

Event driven architecture:
• For its use in extending applications, analytics, data lakes, and integration with existing systems, see the IBM Garage Event Driven Reference Architecture. This also discusses options to replicate data at rest, such as in Db2, and operational data, such as z/OS log and SMF data.
• Message queues vs event streams: Making sense of queues and event streams, Session 1BA, today Wed 10 Nov @ 12:00 GMT.

Kafka:
• Getting started covers the key concepts and terminology of event streaming, Kafka features, and APIs.
• Spring for Apache Kafka applies core Spring concepts to the development of Kafka-based messaging solutions. It provides a "template" as a high-level abstraction for sending messages.

CICS:
• Get started with Java in CICS: how it works, design choices for Java apps, CICS APIs, trying it out, …
• Java in CICS samples on GitHub.
• CICS and Kafka integration blog with code snippets.
CICS application options to interact with Kafka

There are many ways to interact with Kafka instances; some of the primary ones include:

1. Develop a Java application using the Kafka APIs (consumer, producer, streams), or use one of the clients that provide bindings for Node.js, C, and other languages.
2. Use Kafka Connect to copy / replicate data with other systems. This also provides REST APIs.
3. Use the Kafka command line tools. These are particularly useful when experimenting and writing automation scripts.

The main options for CICS applications:

1. Write a CICS application in Java that uses the Kafka API, packaged using OSGi, Jakarta, or the Spring framework.
2. Write a CICS application that calls the Kafka REST APIs, for example via z/OS Connect EE or your own Java REST component.
3. Write a CICS application that gets/puts messages to IBM MQ, and then replicate the IBM MQ queue with a Kafka topic using either:
   • the Kafka Connect source connector for IBM MQ, an open source project that can run on z/OS co-located with IBM MQ, or on another platform; or
   • the Confluent IBM MQ Sink Connector for Confluent Platform.
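As an illustration of the command line tools mentioned above, the scripts ship with the Kafka distribution and make it easy to experiment; the topic name and broker address below are placeholders:

```shell
# Create a test topic on a local broker (topic name and address are illustrative)
bin/kafka-topics.sh --create --topic stock-ordered --bootstrap-server localhost:9092

# Type messages on the console and send each line to the topic
bin/kafka-console-producer.sh --topic stock-ordered --bootstrap-server localhost:9092

# Read the topic back from the beginning
bin/kafka-console-consumer.sh --topic stock-ordered --bootstrap-server localhost:9092 --from-beginning
```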
CICS use cases with Kafka

[Diagram: the Kafka cluster, which can run on distributed platforms, zLinux, or z/OS, connecting:]
• a CICS application or CICS application event as a producer
• a CICS application as a consumer
• a CICS application and IBM MQ queue on z/OS, with a source connector / sink connector running on z/OS or distributed, acting as producer and consumer
CICS application considerations when interacting with Kafka

1. Transactional scope between Kafka and the resources being updated in CICS, Db2, IMS, and MQ. Background reading: Does Apache Kafka do ACID transactions? and Transactions in Apache Kafka.
2. Network latency and capacity to the Kafka instance, which may adversely affect CICS transaction response times. Ensure you use connection pooling, e.g. a Java object pooling framework.
3. Network reliability to the Kafka instance, which may adversely affect CICS transaction availability.
4. Facilities to format / transform the data appropriately for consumption by consumers of the topic, for example from EBCDIC binary format into JSON.
5. Skills to develop Java applications and set up CICS.
6. Existing use of products such as IBM MQ for z/OS and z/OS Connect EE.
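On the connection pooling point in (2): the Kafka Java client's KafkaProducer is thread-safe, so rather than a per-task pool it is common to create one producer per JVM server and share it across CICS tasks. A minimal sketch of that pattern, using a generic lazily-initialised holder (the class name and structure are illustrative, not from this deck):

```java
import java.util.function.Supplier;

// Illustrative holder: create one expensive client (e.g. a KafkaProducer)
// lazily, then reuse the same instance for every CICS task in the JVM server.
final class SharedClient<T> {
    private final Supplier<T> factory;
    private volatile T instance;

    SharedClient(Supplier<T> factory) {
        this.factory = factory;
    }

    // Double-checked locking: the factory runs at most once.
    T get() {
        T local = instance;
        if (local == null) {
            synchronized (this) {
                local = instance;
                if (local == null) {
                    instance = local = factory.get();
                }
            }
        }
        return local;
    }
}
```

In a real CICS Java program the factory would build the producer from the connection properties, and every task would call get() instead of constructing its own producer per transaction.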
Example CICS use cases with Kafka

Where the CICS app is a producer of messages:
• When a CICS application is processing an insurance claim, capture summary data using a CICS application event, format the data into JSON, and write it to a Kafka topic for consumers to aggregate into a dashboard that plots claims on a map to spot trends.
• When a CICS application updates stock levels, put the new stock levels into a CICS container and link to a program to format them into JSON and write them to a Kafka topic, to be consumed by an AI program that evaluates when parts are likely to need re-ordering.
• When a transaction rate reaches a threshold, use a CICS policy to capture the current rate, format the data into JSON and write it to a Kafka topic to trigger an early warning on a dashboard.

Where the CICS app is a consumer of messages:
• Consume messages from IoT devices that indicate when stock has arrived in a warehouse, to update the inventory in a VSAM file.
Customer scenario: CICS COBOL core banking app with Java EE message producer to Kafka

[Diagram: on z/OS, the CICS TS 5.6 core banking COBOL application issues an EXEC CICS LINK; a controller in a separate CICS Liberty region runs asynchronously, gets the COBOL data with GET CONTAINER, transforms it to JSON, and uses the Kafka client API to produce a JSON message to the Kafka server off z/OS, which feeds a cash management platform and corporate portal. Exceptions are written to a shared file by an exception handler.]

• Little change to the core banking app
• Separate CICS regions to host the new components
• CICS ASYNC API used to queue work in CICS for a fast response to core banking
• Java Liberty app to transform COBOL copybook data to JSON in Java and call the Kafka client API in Java
• Exception handling via a shared file and periodic replay for failures
• Kafka could run on zLinux or z/OS
Customer use case: secure connections from a CICS Java app using the Spring for Apache Kafka framework to Kafka

Option 1. Use the Liberty profile to set up SSL with RACF: the Spring Kafka client in the JVM server connects over SSL to the Kafka server topic, using the RACF CA certificate and client certificate (the Kafka server holds the CA certificate and server certificate).

Option 2. Use AT-TLS to set up SSL: the Spring Kafka client makes a connection with no SSL, and the AT-TLS address space applies SSL on its behalf using the RACF CA certificate and client certificate.
CICS catalog manager code example

Capture information to send to Kafka when dispatching items from the catalog.

1. When a stock item is ordered (LINK to program DFH0XSOD), a CICS application event is used to capture the stock order details.
   • This means we don't have to change the catalog manager application, can start/stop capturing the event independently of the application, and the application is not affected by the latency or success/failure of writing to Kafka.
   • Alternatively, you could change the application to issue a LINK to the CICS Java program and wait for the success/failure of writing to Kafka.
2. Each CICS application event calls a Java program to convert the order into a JSON payload. It then creates a key (the stock number) and a value (the order details). These, together with connection properties, are used with the Kafka Producer API to write a message to the stock-ordered Kafka topic.
CICS application event

The application capture point is LINK PROGRAM.
CICS application event

Filter to capture the event only when LINK PROGRAM specifies program DFH0XSOD.
CICS application event

Capture this data from the COMMAREA specified on the LINK PROGRAM.
CICS application event

To emit the event, start transaction KAFP. Define a CICS transaction definition for KAFP that specifies program KAFKAPRO.
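The transaction definition described above could be created with CEDA; the group name below is a placeholder:

```
CEDA DEFINE TRANSACTION(KAFP) GROUP(KAFKAGRP) PROGRAM(KAFKAPRO)
CEDA INSTALL TRANSACTION(KAFP) GROUP(KAFKAGRP)
```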
EmitOrder.java

A Java annotation results in a CICS program being installed when the Java program is installed. It can then be called by a CICS application event (or EXEC CICS LINK).

• Get the data from the CICS container and copy it into a Java object
• Create the message key and value pair
• Call a method to write to Kafka
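The deck shows EmitOrder.java only as an annotated screenshot. As a rough illustration of the data-shaping step, the order copied out of the container might be turned into a key and a JSON value as below; the OrderEvent class and its field names are invented for this example, and the container access and Kafka send are only indicated in comments:

```java
// Sketch of the middle step of EmitOrder.java: build the message key
// (the stock number) and a JSON value (the order details). In the real
// CICS program the fields would be copied out of the CICS container via
// the JCICS API, and the key/value pair passed to the Kafka producer.
final class OrderEvent {
    final String stockNumber;   // becomes the Kafka message key
    final int quantity;
    final String userId;

    OrderEvent(String stockNumber, int quantity, String userId) {
        this.stockNumber = stockNumber;
        this.quantity = quantity;
        this.userId = userId;
    }

    // Render the order details as a small JSON document (the value).
    String toJson() {
        return String.format(
            "{\"stockNumber\":\"%s\",\"quantity\":%d,\"userId\":\"%s\"}",
            stockNumber, quantity, userId);
    }
}
```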
SimpleProducer.java

• Set up a properties object with connection details
• Create a KafkaProducer
• Use the producer to send the message to Kafka
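SimpleProducer.java is likewise shown only as a screenshot. A minimal version of the three steps, assuming the standard kafka-clients library and using illustrative topic, key, and server names, might look like:

```java
import java.util.Properties;
import org.apache.kafka.clients.producer.KafkaProducer;
import org.apache.kafka.clients.producer.Producer;
import org.apache.kafka.clients.producer.ProducerRecord;
import org.apache.kafka.common.serialization.StringSerializer;

public class SimpleProducer {
    public static void main(String[] args) {
        // 1. Set up a properties object with connection details
        //    (the bootstrap server address is a placeholder)
        Properties props = new Properties();
        props.put("bootstrap.servers", "kafka.example.com:9092");
        props.put("key.serializer", StringSerializer.class.getName());
        props.put("value.serializer", StringSerializer.class.getName());

        // 2. Create a Kafka producer (in CICS, share one instance across
        //    tasks rather than creating one per transaction)
        try (Producer<String, String> producer = new KafkaProducer<>(props)) {
            // 3. Use the producer to send a message to the topic; the
            //    callback reports asynchronous send failures
            producer.send(
                new ProducerRecord<>("stock-ordered", "0040",
                    "{\"stockNumber\":\"0040\",\"quantity\":3}"),
                (metadata, exception) -> {
                    if (exception != null) {
                        exception.printStackTrace(); // handle send failure
                    }
                });
            producer.flush();
        }
    }
}
```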
Kafka properties

The connection details are split across two files: kafkaInstance.properties and kafkaProducer.properties.
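The two files on this slide are shown only as screenshots; plausible contents, using standard Kafka client property names (the server address, paths, and password are placeholders), could be:

```properties
# kafkaInstance.properties - where and how to reach the Kafka cluster
bootstrap.servers=kafka.example.com:9092
security.protocol=SSL
ssl.truststore.location=/u/cics/kafka.truststore.jks
ssl.truststore.password=changeit
```

```properties
# kafkaProducer.properties - producer behaviour
key.serializer=org.apache.kafka.common.serialization.StringSerializer
value.serializer=org.apache.kafka.common.serialization.StringSerializer
acks=all
client.id=cics-catalog-producer
```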
Producing and consuming messages in Kafka from CICS applications

• Messaging and events
• What is Kafka?
• CICS application options and considerations to interact with Kafka
• CICS use cases with Kafka
• Code example
• Questions?

Also…

• CICS Q and A: Ask the experts (AKA Chump the Chimp), 10:30 Thu 11 Nov
• Making sense of queues and event streams: Session 1BA, today Wed 10 Nov @ 12:00 GMT

Please submit your session feedback!


• Do it online at
https://conferences.gse.org.uk/2021/feedback/4AY

• This session is 4AY



Become a member of GSE UK


• Company or individual membership available
• Benefits include:
• GSE Annual Conference: Receive 5 free places + 2 free places for trainees
• 20% discount on fees for IBM Technical Conferences
• 20% discount on IBM Training Courses in Europe
• 15% discount for IBM STG Technical Conferences in the USA
• 20% discount on the fee for taking the Mainframe Technology Professional
(MTP) exams
• European events – via GSE HQ

• Contact membership@gse.org.uk for details
