Configuring Event Streams

PegaSys Plus provides event streaming plugins for Apache Kafka and Amazon Kinesis. The plugins can be used separately or together, and are configured using command line options.

Ensure a Kafka or Kinesis system is available to receive streams from PegaSys Plus.

Multiple topics are created for each event stream in the format <stream_prefix><domain_type>, where the domain types are:

  • block
  • transaction
  • smart-contract
  • node
  • log

<stream_prefix> is defined using --plugin-kafka-stream and --plugin-kinesis-stream.
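
For example, with --plugin-kafka-stream=my-besu-stream, and assuming the prefix and domain type are concatenated exactly as the format above shows, the plugin publishes to topics such as:

  my-besu-streamblock
  my-besu-streamtransaction
  my-besu-streamlog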

Configuring a Kafka Event Stream

Configure a Kafka event stream in the command line by enabling the plugin and setting the appropriate options.

Example

besu --plugin-kafka-enabled \
  --plugin-kafka-stream=my-besu-stream \
  --plugin-kafka-url=127.0.0.1:9090 \
  --plugin-kafka-producer-config-override-enabled \
  --plugin-kafka-producer-property=sasl.mechanism=PLAIN

The command line specifies:

  • The stream prefix, my-besu-stream, using --plugin-kafka-stream
  • The URL of the Kafka broker, 127.0.0.1:9090, using --plugin-kafka-url
  • That producer property overrides are enabled, using --plugin-kafka-producer-config-override-enabled
  • The sasl.mechanism producer property value, PLAIN, using --plugin-kafka-producer-property

Note

If --plugin-kafka-url is not specified, the plugin attempts to connect to a local Kafka broker at 127.0.0.1:9092.
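
To check that events are being published, you can read from one of the generated topics using the console consumer included in the bin directory of the Kafka distribution. A minimal sketch, assuming the example configuration above (so the broker is at 127.0.0.1:9090 and, per the <stream_prefix><domain_type> format, the block topic is my-besu-streamblock):

kafka-console-consumer.sh \
  --bootstrap-server 127.0.0.1:9090 \
  --topic my-besu-streamblock \
  --from-beginning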

Configuring an Amazon Kinesis Event Stream

Configure a Kinesis event stream in the command line by enabling the plugin and setting the appropriate options.

Example

besu --plugin-kinesis-enabled \
  --plugin-kinesis-stream=my-besu-stream \
  --plugin-kinesis-aws-region=us-east-1 \
  --plugin-kinesis-aws-access-key-id=ABCDEF12356XYZ6DENPQ \
  --plugin-kinesis-aws-secret-key=AluX9paN+Ms95SGLUGtuZi8h1P27v9fFq0uP4Rzn

The command line specifies:

  • The stream prefix, my-besu-stream, using --plugin-kinesis-stream
  • The AWS region, us-east-1, using --plugin-kinesis-aws-region
  • The AWS access key ID using --plugin-kinesis-aws-access-key-id
  • The AWS secret access key using --plugin-kinesis-aws-secret-key
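
To check that events are arriving in Kinesis, you can read records with the AWS CLI. A minimal sketch, assuming a block event stream named my-besu-streamblock (per the <stream_prefix><domain_type> format above) with a single shard:

aws kinesis get-shard-iterator \
  --stream-name my-besu-streamblock \
  --shard-id shardId-000000000000 \
  --shard-iterator-type TRIM_HORIZON \
  --region us-east-1

aws kinesis get-records --shard-iterator <shard-iterator-from-previous-command>

The Data field of each returned record is Base64-encoded.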

Filtering Smart Contract Event Logs

You can configure the event streaming plugins to filter event logs from specified smart contracts. To create the filter, use the following CLI options:

  • --plugin-kafka-log-filter-addresses and --plugin-kafka-log-filter-topics for Kafka
  • --plugin-kinesis-log-filter-addresses and --plugin-kinesis-log-filter-topics for Kinesis
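
For example, the following command streams only the logs emitted by a single contract for ERC-20 Transfer events. The address and topic values below are illustrative:

besu --plugin-kafka-enabled \
  --plugin-kafka-stream=my-besu-stream \
  --plugin-kafka-log-filter-addresses=0xfe3b557e8fb62b89f4916b721be55ceb828dbd73 \
  --plugin-kafka-log-filter-topics=0xddf252ad1be2c89b69c2b068fc378daa952ba7f163c4a11628f55a4df523b3ef

The topic value is the keccak-256 hash of the event signature, in this case Transfer(address,address,uint256).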

To display the filtered events in a more readable format, create a schema file to decode the events. To specify the schema file location, use the --plugin-kafka-log-schema-file or --plugin-kinesis-log-schema-file options.
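
For example, to decode filtered Kafka log events using a schema file (the file path below is illustrative):

besu --plugin-kafka-enabled \
  --plugin-kafka-stream=my-besu-stream \
  --plugin-kafka-log-schema-file=/etc/besu/log-schema.json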