Consuming topics can be done:

- from the Kafka Explorer, by right-clicking on a topic and selecting `Start Consumer`.
- from a `.kafka` file, by clicking on the `Start consumer` codelens displayed above a `CONSUMER` block.
- from the `Start Consumer` command, from the command palette.
You can start consuming messages from the Kafka Explorer, by right-clicking on a topic:
Once this command is launched, it creates a consumer group (with an auto-generated id), and opens the Consumer View where you can see the messages being consumed:
In this case, the starting offset can only be configured via the `kafka.consumers.offset` preference.
Define simple consumers in a `.kafka` file, using the following format:

```
CONSUMER consumer-group-id
topic: json-events
partitions: 0
from: 1
```
Click on the `Start consumer` link above the `CONSUMER` line to start the consumer group:
The `CONSUMER` block defines the following properties (see the example after the list):

- `consumer group id`: the consumer group id, declared after `CONSUMER` [required].
- `topic`: the topic id [required].
- `from`: the offset from which the consumer group will start consuming messages. Possible values are: `earliest`, `latest`, or an integer value [optional].
- `partitions` [EXPERIMENTAL option]: the partition number(s), a partition range, or a combination of partition ranges [optional], e.g.:
  - `0`
  - `0,1,2`
  - `0-2`
  - `0,2-3`
- `key-format`: deserializer to use for the key [optional].
- `value-format`: deserializer to use for the value [optional].
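For example, a consumer group could be declared against the `json-events` topic used above, reading partitions 0 through 2 from the earliest offset (the group id here is just a placeholder):

```
CONSUMER example-consumer-group
topic: json-events
partitions: 0-2
from: earliest
```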
The deserializers can have the following values:

- `none`: no deserializer (ignores content).
- `string`: similar deserializer to the Kafka Java client `org.apache.kafka.common.serialization.StringDeserializer`. By default it supports `UTF-8` encoding, but you can specify the encoding as a parameter, like this: `string(base64)`. The valid encoding values are defined in Node.js' buffers and character encodings.
- `double`: similar deserializer to the Kafka Java client `org.apache.kafka.common.serialization.DoubleDeserializer`.
- `float`: similar deserializer to the Kafka Java client `org.apache.kafka.common.serialization.FloatDeserializer`.
- `integer`: similar deserializer to the Kafka Java client `org.apache.kafka.common.serialization.IntegerDeserializer`.
- `long`: similar deserializer to the Kafka Java client `org.apache.kafka.common.serialization.LongDeserializer`.
- `short`: similar deserializer to the Kafka Java client `org.apache.kafka.common.serialization.ShortDeserializer`.
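As a sketch, the optional deserializer properties could be combined like this (the group id is a placeholder); the key is read as a `long` and the value is decoded as a base64-encoded string:

```
CONSUMER deserializer-demo-group
topic: json-events
from: latest
key-format: long
value-format: string(base64)
```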
A codelens is displayed above each `CONSUMER` line, and provides `Start consumer` / `Stop consumer` commands depending on the consumer group status.
Completion snippets can help you quickly bootstrap new `CONSUMER` blocks:
Completion is available for:

- property name
- property value
- string encoding
- topic
Validation will help you write valid consumers in `.kafka` files.
Here is an example of topic validation:
Existing topic validation is done only when the cluster is connected. If the topic doesn't already exist, an error will be reported if the broker configuration is accessible and `auto.create.topics.enable=false`.
Hover for property documentation and topic information is available in `.kafka` files.
Here is an example of hover on a topic:
The `Consumer View` is a read-only editor which shows consumed messages for a given topic:
This editor provides two commands on the top right of the editor:

- `Clear Consumer View`: clears the view.
- `Start/Stop`: stops or (re)starts the consumer.
Consumers are based on virtual documents, available in the VS Code extension API. A consumer will keep running even if you close its document in the editor, so make sure to close the consumer explicitly, either via the command palette, the status bar element, or the start/stop action button. The VS Code API does not support immediately detecting that a virtual document has been closed; instead, the underlying virtual document is automatically closed two minutes after the document is closed in the editor.
You can configure the start offset for new consumers in settings (`earliest`, `latest`).
You can configure whether message headers are printed to the view in settings (default: `false`).
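For reference, here is a minimal `settings.json` sketch for the offset preference mentioned above (the exact key for printing message headers is not listed in this section, so it is omitted):

```jsonc
{
  // Start new consumers from the beginning of the topic instead of the latest offset
  "kafka.consumers.offset": "earliest"
}
```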