27 changes: 12 additions & 15 deletions README.md
Original file line number Diff line number Diff line change
@@ -72,28 +72,25 @@ Kafbat UI wraps major functions of Apache Kafka with an intuitive user interface
![Interface](documentation/images/Interface.gif)

## Topics
Kafbat UI makes it easy for you to create topics in your browser by several clicks,
pasting your own parameters, and viewing topics in the list.
Kafbat UI makes it easy to create topics in your browser with just a few clicks: paste your own parameters and view the topics in the list.

![Create Topic](documentation/images/Create_topic_kafka-ui.gif)

It's possible to jump from connectors view to corresponding topics and from a topic to consumers (back and forth) for more convenient navigation.
connectors, overview topic settings.
You can jump from the connectors view to the corresponding topics, and from a topic to its consumers (back and forth), for more convenient navigation; you can also get an overview of topic settings.

![Connector_Topic_Consumer](documentation/images/Connector_Topic_Consumer.gif)

### Messages
Let's say we want to produce messages for our topic. With the Kafbat UI we can send or write data/messages to the Kafka topics without effort by specifying parameters, and viewing messages in the list.
Suppose you want to produce messages for your topic. With Kafbat UI, you can easily send or write data/messages to Kafka topics by specifying parameters and viewing messages in the list.

![Produce Message](documentation/images/Create_message_kafka-ui.gif)

## Schema registry
There are 3 supported types of schemas: Avro®, JSON Schema, and Protobuf schemas.
There are three supported types of schemas: Avro®, JSON Schema, and Protobuf schemas.

![Create Schema Registry](documentation/images/Create_schema.gif)

Before producing avro/protobuf encoded messages, you have to add a schema for the topic in Schema Registry. Now all these steps are easy to do
with a few clicks in a user-friendly interface.
Before producing Avro/Protobuf encoded messages, you need to add a schema for the topic in the Schema Registry. All these steps are now easy to do with just a few clicks in a user-friendly interface.

![Avro Schema Topic](documentation/images/Schema_Topic.gif)

@@ -111,7 +108,7 @@ docker run -it -p 8080:8080 -e DYNAMIC_CONFIG_ENABLED=true ghcr.io/kafbat/kafka-

Then access the web UI at [http://localhost:8080](http://localhost:8080)

The command is sufficient to try things out. When you're done trying things out, you can proceed with a [persistent installation](https://ui.docs.kafbat.io/quick-start/persistent-start)
This command is sufficient to try things out. When you're done, you can proceed with a [persistent installation](https://ui.docs.kafbat.io/quick-start/persistent-start).

## Persistent installation

@@ -146,24 +143,24 @@ Please refer to our [configuration](https://ui.docs.kafbat.io/configuration/conf

## Building from sources

[Quick start](https://ui.docs.kafbat.io/development/building/prerequisites) with building
[Quick start](https://ui.docs.kafbat.io/development/building/prerequisites) for building from source

## Liveness and readiness probes
Liveliness and readiness endpoint is at `/actuator/health`.<br/>
Info endpoint (build info) is located at `/actuator/info`.
The liveness and readiness endpoint is at `/actuator/health`.<br/>
The info endpoint (build info) is located at `/actuator/info`.

# Configuration options

All the environment variables/config properties could be found [here](https://ui.docs.kafbat.io/configuration/misc-configuration-properties).
All environment variables and configuration properties can be found [here](https://ui.docs.kafbat.io/configuration/misc-configuration-properties).

# Contributing

Please refer to [contributing guide](https://ui.docs.kafbat.io/development/contributing), we'll guide you from there.
Please refer to the [contributing guide](https://ui.docs.kafbat.io/development/contributing); we'll guide you from there.

# Support

As we're fully independent, team members contribute in their free time.
Your support is crucial for us, if you wish to sponsor us, take a look [here](https://github.com/sponsors/kafbat)
Your support is crucial for us; if you wish to sponsor us, take a look [here](https://github.com/sponsors/kafbat).

# Powered by

4 changes: 2 additions & 2 deletions api/src/main/java/io/kafbat/ui/config/McpConfig.java
@@ -44,12 +44,12 @@ public McpAsyncServer mcpServer(WebFluxSseServerTransportProvider transport) {
// Configure server capabilities with resource support
var capabilities = McpSchema.ServerCapabilities.builder()
.resources(false, true)
.tools(true) // Tool support with list changes notifications
.tools(true) // Tools support with list changes notifications
.prompts(false) // Prompt support with list changes notifications
.logging() // Logging support
.build();

// Create the server with both tool and resource capabilities
// Create the server with both tools and resource capabilities
return McpServer.async(transport)
.serverInfo("Kafka UI MCP", "0.0.1")
.capabilities(capabilities)
20 changes: 10 additions & 10 deletions documentation/compose/DOCKER_COMPOSE.md
@@ -1,13 +1,13 @@
# Descriptions of docker-compose configurations (*.yaml)

1. [kafka-ui.yaml](./kafbat-ui.yaml) - Default configuration with 2 kafka clusters with two nodes of Schema Registry, one kafka-connect and a few dummy topics.
2. [kafka-ui-ssl.yml](./kafka-ssl.yml) - Connect to Kafka via TLS/SSL
3. [kafka-cluster-sr-auth.yaml](./cluster-sr-auth.yaml) - Schema registry with authentication.
1. [kafka-ui.yaml](./kafbat-ui.yaml) - Default configuration with 2 Kafka clusters with two nodes of Schema Registry, one Kafka Connect, and a few dummy topics.
2. [kafka-ui-ssl.yml](./kafka-ssl.yml) - Connect to Kafka via TLS/SSL.
3. [kafka-cluster-sr-auth.yaml](./cluster-sr-auth.yaml) - Schema Registry with authentication.
4. [kafka-ui-auth-context.yaml](./auth-context.yaml) - Basic (username/password) authentication with custom path (URL) (issue 861).
5. [e2e-tests.yaml](./e2e-tests.yaml) - Configuration with different connectors (github-source, s3, sink-activities, source-activities) and Ksql functionality.
6. [kafka-ui-jmx-secured.yml](./ui-jmx-secured.yml) - Kafkas JMX with SSL and authentication.
7. [kafka-ui-reverse-proxy.yaml](./nginx-proxy.yaml) - An example for using the app behind a proxy (like nginx).
8. [kafka-ui-sasl.yaml](./ui-sasl.yaml) - SASL auth for Kafka.
9. [kafka-ui-traefik-proxy.yaml](./traefik-proxy.yaml) - Traefik specific proxy configuration.
10. [kafka-ui-with-jmx-exporter.yaml](./ui-with-jmx-exporter.yaml) - A configuration with 2 kafka clusters with enabled prometheus jmx exporters instead of jmx.
11. [kafka-with-zookeeper.yaml](./kafka-zookeeper.yaml) - An example for using kafka with zookeeper
5. [e2e-tests.yaml](./e2e-tests.yaml) - Configuration with different connectors (github-source, s3, sink-activities, source-activities) and KSQL functionality.
6. [kafka-ui-jmx-secured.yml](./ui-jmx-secured.yml) - Kafka's JMX with SSL and authentication.
7. [kafka-ui-reverse-proxy.yaml](./nginx-proxy.yaml) - An example of using the app behind a proxy (like nginx).
8. [kafka-ui-sasl.yaml](./ui-sasl.yaml) - SASL authentication for Kafka.
9. [kafka-ui-traefik-proxy.yaml](./traefik-proxy.yaml) - Traefik-specific proxy configuration.
10. [kafka-ui-with-jmx-exporter.yaml](./ui-with-jmx-exporter.yaml) - A configuration with 2 Kafka clusters with enabled Prometheus JMX exporters instead of JMX.
11. [kafka-with-zookeeper.yaml](./kafka-zookeeper.yaml) - An example of using Kafka with ZooKeeper.
6 changes: 3 additions & 3 deletions e2e-playwright/README.md
@@ -10,7 +10,7 @@ End-to-End UI test automation using **Playwright**, **Cucumber.js**, and **TypeS

```bash
Local run:
Run kafbat (docker compose -f ./documentation/compose/e2e-tests.yaml up -d)
Run Kafbat (docker compose -f ./documentation/compose/e2e-tests.yaml up -d)
npm install
npx playwright install

@@ -24,7 +24,7 @@ npm run debug
npm run test:failed


Gihub action CI example
GitHub Actions CI example
name: CI

on:
@@ -53,4 +53,4 @@ jobs:

- name: 🚀 Run tests with ENV=prod
run: ENV=prod HEAD=false BASEURL=http://localhost:8080 npm run test
```
10 changes: 5 additions & 5 deletions frontend/README.md
@@ -16,7 +16,7 @@ Web UI for managing Apache Kafka clusters

## Getting started

Go to react app folder
Go to the React app folder
```sh
cd ./frontend
```
@@ -42,15 +42,15 @@ Install dependencies
pnpm install
```

Generate API clients from OpenAPI document
Generate API clients from the OpenAPI document
```sh
pnpm gen:sources
```

## Start application
### Proxying API Requests in Development

Create or update existing `.env.local` file with
Create or update the existing `.env.local` file with
```
VITE_DEV_PROXY= https://api.server # your API server
```
@@ -62,14 +62,14 @@ pnpm dev

### Docker way

Have to be run from root directory.
Must be run from the root directory.

Start Kafbat UI with your Kafka clusters:
```sh
docker-compose -f ./documentation/compose/kafbat-ui.yaml up
```

Make sure that none of the `.env*` files contain `DEV_PROXY` variable
Make sure that none of the `.env*` files contain the `DEV_PROXY` variable.

Run the application
```sh
4 changes: 2 additions & 2 deletions frontend/src/lib/hooks/api/ksqlDb.tsx
@@ -41,7 +41,7 @@ export function useExecuteKsqlkDbQueryMutation() {
const getFormattedErrorFromTableData = (
responseValues: KsqlTableResponse['values']
): { title: string; message: string } => {
// We expect someting like that
// We expect something like this
// [[
// "@type",
// "error_code",
@@ -55,7 +55,7 @@
if (!responseValues || !responseValues.length) {
return {
title: 'Unknown error',
message: 'Recieved empty response',
message: 'Received empty response',
};
}

34 changes: 17 additions & 17 deletions frontend/src/lib/permissions.ts
@@ -54,22 +54,22 @@ const valueMatches = (regexp: string | undefined, val: string | undefined) => {
};

/**
* @description it the logic behind depending on the roles whether a certain action
* is permitted or not the philosophy is inspired from Headless UI libraries where
* you separate the logic from the renderer besides the Creation process which is handled by isPermittedToCreate
* @description The logic behind determining whether a certain action
* is permitted or not depending on the roles. The philosophy is inspired by Headless UI libraries where
* you separate the logic from the renderer, besides the Creation process which is handled by isPermittedToCreate.
*
* Algorithm: we Mapped the cluster name and the resource name , because all the actions in them are
* constant and limited and hence faster lookup approach
* Algorithm: We mapped the cluster name and the resource name, because all the actions in them are
* constant and limited, and hence a faster lookup approach.
*
* @example you can use this in the hook format where it used in , or if you want to calculate it dynamically
* you can call this dynamically in your component but the render is on you from that point on
* @example You can use this in the hook format where it is used, or if you want to calculate it dynamically,
* you can call this dynamically in your component, but the render is on you from that point on.
*
* Don't use this anywhere , use the hook version in the component for declarative purposes
* Don't use this anywhere; use the hook version in the component for declarative purposes.
*
* Array action approach bear in mind they should be from the same resource with the same name restrictions, then the logic it
* will try to find every element from the given array inside the permissions data
* Array action approach: bear in mind they should be from the same resource with the same name restrictions; then the logic
* will try to find every element from the given array inside the permissions data.
*
* DON'T use the array approach until it is necessary to do so
* DON'T use the array approach unless it is necessary to do so.
*
* */
export function isPermitted({
@@ -113,15 +113,15 @@

/**
* @description The logic behind permissions for create actions. Since create has extra custom permission logic,
* it is seperated from the others
* it is separated from the others.
*
* Algorithm: we Mapped the cluster name and the resource name , because all the actions in them are
* constant and limited and hence faster lookup approach
* Algorithm: We mapped the cluster name and the resource name, because all the actions in them are
constant and limited, and hence a faster lookup approach.
*
* @example you can use this in the hook format where it used in , or if you want to calculate it dynamically
* you can call this dynamically in your component but the render is on you from that point on
* @example You can use this in the hook format where it is used, or if you want to calculate it dynamically,
* you can call this dynamically in your component, but the render is on you from that point on.
*
* Don't use this anywhere , use the hook version in the component for declarative purposes
* Don't use this anywhere; use the hook version in the component for declarative purposes.
*
* */
export function isPermittedToCreate({
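The map-based lookup the comment describes can be sketched generically. This is a hypothetical Java illustration of the idea (cluster and resource names as map keys, actions checked by containment), not the actual `permissions.ts` implementation; all names here are invented for the sketch.

```java
import java.util.List;
import java.util.Map;
import java.util.Set;

// Hypothetical sketch of the described algorithm: cluster name and resource
// name are map keys, so checking an action is a constant-time lookup followed
// by a containment test. Not the actual permissions.ts implementation.
class PermissionLookup {
  // cluster -> resource -> allowed actions
  private final Map<String, Map<String, Set<String>>> permissions;

  PermissionLookup(Map<String, Map<String, Set<String>>> permissions) {
    this.permissions = permissions;
  }

  boolean isPermitted(String cluster, String resource, List<String> actions) {
    Set<String> allowed = permissions
        .getOrDefault(cluster, Map.of())
        .getOrDefault(resource, Set.of());
    // Array-action approach: every requested action must be present,
    // mirroring "find every element from the given array" above.
    return allowed.containsAll(actions);
  }

  public static void main(String[] args) {
    PermissionLookup lookup = new PermissionLookup(
        Map.of("local", Map.of("topic", Set.of("VIEW", "EDIT"))));
    System.out.println(lookup.isPermitted("local", "topic", List.of("VIEW")));
    System.out.println(lookup.isPermitted("local", "topic", List.of("VIEW", "DELETE")));
  }
}
```

As the doc comment warns, the array form only makes sense when all actions share the same resource and name restrictions.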
@@ -14,11 +14,11 @@ public final class DeserializeResult {
*/
public enum Type {
/**
* Content is the string. Will be shown as is.
* Content is a string. Will be shown as is.
*/
STRING,
/**
* Content is the json object. Will be parsed by Jackson object mapper.
* Content is a JSON object. Will be parsed by the Jackson object mapper.
*/
JSON
;
@@ -42,26 +42,26 @@ public DeserializeResult(String result, Type type, Map<String, Object> additiona
}

/**
* Getters for result.
* Getter for result.
* @return string representation of deserialized binary data, can be null
*/
public String getResult() {
return result;
}

/**
* Will be show as json dictionary in UI (serialized with Jackson object mapper).
* @return additional information about deserialized value.
* Will be shown as a JSON dictionary in the UI (serialized with the Jackson object mapper).
* @return additional information about the deserialized value.
* It is recommended to use primitive types and strings for values.
*/
public Map<String, Object> getAdditionalProperties() {
return additionalProperties;
}

/**
* Type of deserialized result. Will be used as hint for some internal logic
* @return type of deserialized result. Will be used as hint for some internal logic
* (ex. if type==STRING smart filters won't try to parse it as json for further usage)
* Type of deserialized result. Will be used as a hint for some internal logic.
* @return type of deserialized result. Will be used as a hint for some internal logic
(e.g., if type==STRING, smart filters won't try to parse it as JSON for further usage)
*/
public Type getType() {
return type;
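The constructor, getters, and type hint documented above can be exercised with a minimal stand-in. This stub mirrors the documented shape of `DeserializeResult` but is written locally for illustration; it is not the real serde-api class.

```java
import java.util.Map;

// Illustrative stand-in for the serde API's DeserializeResult; names mirror
// the documented constructor and getters, but this stub is sketch-only.
final class DeserializeResult {
  enum Type { STRING, JSON }

  private final String result;
  private final Type type;
  private final Map<String, Object> additionalProperties;

  DeserializeResult(String result, Type type, Map<String, Object> additionalProperties) {
    this.result = result;
    this.type = type;
    this.additionalProperties = additionalProperties;
  }

  String getResult() { return result; }
  Type getType() { return type; }
  Map<String, Object> getAdditionalProperties() { return additionalProperties; }

  public static void main(String[] args) {
    // A serde that decoded a JSON payload reports Type.JSON, hinting the UI to
    // parse the result with Jackson; plain text would report Type.STRING.
    DeserializeResult decoded = new DeserializeResult(
        "{\"id\":1}", Type.JSON, Map.of("keySchemaId", 42));
    System.out.println(decoded.getType());
    System.out.println(decoded.getResult());
    System.out.println(decoded.getAdditionalProperties().get("keySchemaId"));
  }
}
```

Keeping `additionalProperties` to primitives and strings, as recommended, ensures the map serializes cleanly into the UI's JSON dictionary view.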
@@ -6,8 +6,8 @@

/**
* Provides access to configuration properties.
*Actual implementation uses {@code org.springframework.boot.context.properties.bind.Binder} class
* to bind values to target types. Target type params can be custom configs classes, not only simple types and strings.
* Actual implementation uses {@code org.springframework.boot.context.properties.bind.Binder} class
* to bind values to target types. Target type params can be custom config classes, not only simple types and strings.
*
*/
public interface PropertyResolver {
@@ -17,30 +17,30 @@ public interface PropertyResolver {
* @param <T> the type of the property
* @param key property name
* @param targetType type of property value
* @return property value or empty {@code Optional} if property not found
* @return property value or empty {@code Optional} if property is not found
*/
<T> Optional<T> getProperty(String key, Class<T> targetType);


/**
* Get list-property value by name
* Get list property value by name.
*
* @param <T> the type of the item
* @param key list property name
* @param itemType type of list element
* @return list property value or empty {@code Optional} if property not found
* @return list property value or empty {@code Optional} if property is not found
*/
<T> Optional<List<T>> getListProperty(String key, Class<T> itemType);

/**
* Get map-property value by name
* Get map property value by name.
*
* @param key map-property name
* @param key map property name
* @param keyType type of map key
* @param valueType type of map value
* @param <K> the type of the key
* @param <V> the type of the value
* @return map-property value or empty {@code Optional} if property not found
* @return map property value or empty {@code Optional} if property is not found
*/
<K, V> Optional<Map<K, V>> getMapProperty(String key, Class<K> keyType, Class<V> valueType);

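The `Optional`-based contract documented above can be illustrated with a hypothetical in-memory resolver. The real implementation binds values through Spring's `Binder` and also supports list and map properties; this sketch only shows the lookup contract for `getProperty`, and the class and key names are invented.

```java
import java.util.Map;
import java.util.Optional;

// Hypothetical in-memory resolver illustrating the Optional-based contract of
// PropertyResolver.getProperty; the real implementation binds values with
// Spring's Binder and also supports custom config classes as target types.
class MapPropertyResolver {
  private final Map<String, Object> props;

  MapPropertyResolver(Map<String, Object> props) {
    this.props = props;
  }

  public <T> Optional<T> getProperty(String key, Class<T> targetType) {
    Object value = props.get(key);
    // Empty Optional when the key is missing or the value has the wrong type,
    // matching the "empty Optional if property is not found" contract.
    return targetType.isInstance(value)
        ? Optional.of(targetType.cast(value))
        : Optional.empty();
  }

  public static void main(String[] args) {
    MapPropertyResolver resolver =
        new MapPropertyResolver(Map.of("kafka.clusters.0.name", "local"));
    System.out.println(resolver.getProperty("kafka.clusters.0.name", String.class).orElse("<none>"));
    System.out.println(resolver.getProperty("missing.key", String.class).isEmpty());
  }
}
```

Returning `Optional.empty()` rather than `null` or throwing keeps missing configuration a normal, checkable case for callers.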
@@ -12,9 +12,9 @@ public final class SchemaDescription {

/**
* Constructor for {@code SchemaDescription}.
* @param schema schema descriptions.
* If contains json-schema (preferred) UI will use it for validation and sample data generation.
* @param additionalProperties additional properties about schema (may be rendered in UI in the future)
* @param schema schema description.
* If it contains a JSON schema (preferred), the UI will use it for validation and sample data generation.
* @param additionalProperties additional properties about the schema (may be rendered in the UI in the future)
*/
public SchemaDescription(String schema, Map<String, Object> additionalProperties) {
this.schema = schema;
@@ -23,15 +23,15 @@ public SchemaDescription(String schema, Map<String, Object> additionalProperties

/**
* Schema description text. Can be null.
* @return schema description text. Preferably contains json-schema. Can be null.
* @return schema description text. Preferably contains a JSON schema. Can be null.
*/
public String getSchema() {
return schema;
}

/**
* Additional properties about schema.
* @return additional properties about schema
* Additional properties about the schema.
* @return additional properties about the schema
*/
public Map<String, Object> getAdditionalProperties() {
return additionalProperties;
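Constructing a `SchemaDescription` with a JSON schema, as the doc comments recommend, might look like the following. The class is stubbed locally so the sketch is self-contained; field and getter names mirror the documented API, and the schema content is an invented example.

```java
import java.util.Map;

// Illustrative stand-in for SchemaDescription: when the schema text is a JSON
// schema (preferred), the UI can use it for validation and sample data
// generation. The stub mirrors the documented constructor and getters.
final class SchemaDescription {
  private final String schema;
  private final Map<String, Object> additionalProperties;

  SchemaDescription(String schema, Map<String, Object> additionalProperties) {
    this.schema = schema;
    this.additionalProperties = additionalProperties;
  }

  String getSchema() { return schema; }
  Map<String, Object> getAdditionalProperties() { return additionalProperties; }

  public static void main(String[] args) {
    SchemaDescription description = new SchemaDescription(
        "{\"type\":\"object\",\"properties\":{\"id\":{\"type\":\"integer\"}}}",
        Map.of("source", "illustration"));
    System.out.println(description.getSchema().contains("\"type\":\"object\""));
    System.out.println(description.getAdditionalProperties().get("source"));
  }
}
```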