Configuring event streams

PegaSys Plus provides an event streaming plugin for Apache Kafka. The plugin is configured using command line options.

Ensure a Kafka system is available to receive streams from PegaSys Plus.

A topic is created for each event domain type, named in the format <stream_prefix><domain_type>, where the domain types are:

  • block
  • transaction
  • smart-contract
  • node
  • log

<stream_prefix> is defined using --plugin-kafka-stream.
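The topic-naming scheme above can be sketched as follows. This is an illustrative sketch, not plugin code; it assumes the prefix and domain type are concatenated directly, as the <stream_prefix><domain_type> format shown above implies.

```python
# Domain types streamed by the plugin, as listed above.
DOMAIN_TYPES = ["block", "transaction", "smart-contract", "node", "log"]

def topic_names(stream_prefix):
    """Return the Kafka topic name for each event domain type,
    formed by direct concatenation per <stream_prefix><domain_type>."""
    return [f"{stream_prefix}{domain}" for domain in DOMAIN_TYPES]

print(topic_names("my-besu-stream-"))
```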

Configuring a Kafka event stream

Configure a Kafka event stream on the command line by enabling the plugin and setting the appropriate options:


besu --plugin-kafka-enabled --plugin-kafka-stream=my-besu-stream --plugin-kafka-url=<host:port> --plugin-kafka-producer-config-override-enabled --plugin-kafka-producer-property=sasl.mechanism=PLAIN

The command line specifies:

  • --plugin-kafka-enabled enables the Kafka event streaming plugin
  • --plugin-kafka-stream sets the prefix used for the Kafka topic names
  • --plugin-kafka-url sets the address of the Kafka broker
  • --plugin-kafka-producer-config-override-enabled allows Kafka producer properties to be overridden
  • --plugin-kafka-producer-property sets a Kafka producer property (here, the SASL mechanism)

If --plugin-kafka-url is not specified, the plugin attempts to connect to a Kafka broker on the local host.
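The --plugin-kafka-producer-property option takes a key=value pair. A repeated option of this shape could be collected into a producer configuration map as sketched below; this mirrors the option syntax shown above and is not the plugin's implementation.

```python
def parse_producer_properties(options):
    """Collect key=value producer property overrides into a dict.

    Split on the first '=' only, because Kafka property values
    (for example sasl.jaas.config) may themselves contain '='.
    """
    config = {}
    for option in options:
        key, _, value = option.partition("=")
        config[key] = value
    return config

overrides = parse_producer_properties([
    "sasl.mechanism=PLAIN",
    "security.protocol=SASL_SSL",
])
print(overrides)
```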

Filtering smart contract event logs

You can configure the event streaming plugin to filter event logs so that only logs emitted by specified smart contracts are streamed. Create the filter using the plugin's log filter CLI options.

To display the filtered events in a more readable format, create a schema file to decode the events. To specify the schema file location, use the --plugin-kafka-log-schema-file option.
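To illustrate what a schema buys you, the sketch below maps an event's first log topic (its signature hash) to a readable name. The JSON layout here is hypothetical; the format actually expected by --plugin-kafka-log-schema-file may differ. The hash used is the standard ERC-20 Transfer event signature.

```python
import json

# Hypothetical schema: maps a log's topic-0 signature hash to an
# event name and field list so raw events become readable.
SCHEMA = json.loads("""
{
  "0xddf252ad1be2c89b69c2b068fc378daa952ba7f163c4a11628f55a4df523b3ef": {
    "name": "Transfer",
    "fields": ["from", "to", "value"]
  }
}
""")

def decode_event(topic0):
    """Look up a raw log's first topic in the schema;
    fall back to the raw hex string if the event is unknown."""
    entry = SCHEMA.get(topic0)
    return entry["name"] if entry else topic0

print(decode_event(
    "0xddf252ad1be2c89b69c2b068fc378daa952ba7f163c4a11628f55a4df523b3ef"))
```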