
KafkaHQ


Kafka GUI for topics, topics data, consumers group, schema registry, connect and more…


Features

Quick preview

The quick preview starts a Kafka node, a Zookeeper node, a Schema Registry, a Kafka Connect instance, fills them with some sample data, starts a consumer group and a Kafka Streams application, and finally starts KafkaHQ.

Installation

First, you need a configuration file in order to configure KafkaHQ's connections to the Kafka brokers.

Docker

docker run -d \
    -p 8080:8080 \
    -v /tmp/application.yml:/app/application.yml \
    tchiotludo/kafkahq

Stand Alone

Configuration

By default, the configuration file can be provided as Java properties, YAML, JSON, or Groovy. A YAML configuration example can be found here: application.example.yml

Kafka cluster configuration
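For a plain-text cluster, a minimal connection entry only needs a bootstrap server. In this sketch, the connection name my-cluster-plain-text and the host kafka:9092 are placeholders to replace with your own values:

kafkahq:
  connections:
    my-cluster-plain-text:
      properties:
        bootstrap.servers: "kafka:9092"

Each key under connections is a cluster name that becomes selectable in the UI.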

SSL Kafka Cluster with basic auth

Configuration example for a Kafka cluster secured by SSL from a SaaS provider like Aiven (full HTTPS & basic auth):

You need to generate jks & p12 files from the pem and cert files given by the SaaS provider.

openssl pkcs12 -export -inkey service.key -in service.cert -out client.keystore.p12 -name service_key
keytool -import -file ca.pem -alias CA -keystore client.truststore.jks

The configuration will look like this example:

kafkahq:
  connections:
    ssl-dev:
      properties:
        bootstrap.servers: ".aivencloud.com:12835"
        security.protocol: SSL
        ssl.truststore.location: /avnadmin.truststore.jks
        ssl.truststore.password: 
        ssl.keystore.type: "PKCS12"
        ssl.keystore.location: /avnadmin.keystore.p12
        ssl.keystore.password: 
        ssl.key.password: 
      schema-registry:
        url: "https://.aivencloud.com:12838"
        basic-auth-username: avnadmin
        basic-auth-password: 
      connect:
        url: "https://.aivencloud.com:"
        basic-auth-username: avnadmin
        basic-auth-password: 

KafkaHQ configuration

Pagination
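As a sketch (the pagination key and its page-size sub-key are assumptions based on the example configuration file, not a verified reference), the number of items per page could be tuned like this:

kafkahq:
  pagination:
    page-size: 25 # assumed key: number of topics per page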

Topic List
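A hedged illustration (the default-view key and its possible values are assumptions based on the example configuration file) of choosing which topics the list shows by default:

kafkahq:
  topic:
    default-view: HIDE_INTERNAL # assumed values: ALL, HIDE_INTERNAL, HIDE_INTERNAL_STREAM, HIDE_STREAM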

Topic creation default values

These parameters are the default values used in the topic creation page.
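For illustration (the key names are assumed from the example configuration file and the values are placeholders), the topic-creation defaults might be set like this:

kafkahq:
  topic:
    replication: 3 # assumed key: default replication factor for new topics
    partition: 3 # assumed key: default number of partitions
    retention: 86400000 # assumed key: default retention in ms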

Topic Data
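A sketch, assuming the sort and size keys from the example configuration file (check application.example.yml for the authoritative names):

kafkahq:
  topic-data:
    sort: OLDEST # assumed key: default sort order when browsing records
    size: 50 # assumed key: max number of records fetched per page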

Security

Security & roles are enabled by default, but anonymous users have full access. You can completely disable security with micronaut.security.enabled: false.

If you need a read-only application, simply add this to your configuration file:

kafkahq:
  security:
    default-roles:
      - topic/read
      - node/read
      - topic/data/read
      - group/read
      - registry/read
      - connect/read

Auth

Groups

Groups allow you to limit user access: define groups with specific roles for your users.

Two default groups are available:

Basic Auth

Be aware that basic auth uses a session store in server memory. If your instance is behind a reverse proxy or a load balancer, you will need to forward the session cookie named SESSION and/or use session stickiness.
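A hedged example of declaring a basic-auth user (the exact layout, and the expectation that the password is a SHA-256 hash of the clear-text password, should be verified against application.example.yml; my-user is a placeholder login):

kafkahq:
  security:
    basic-auth:
      my-user: # login of the user
        password: "<sha256-of-password>" # assumed: SHA-256 hash, not the clear-text password
        groups:
          - admin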

LDAP

Configure how LDAP groups are mapped to KafkaHQ groups.

Example using an online LDAP test server:

Configure the LDAP connection in Micronaut:

micronaut:
  security:
    ldap:
      default:
        enabled: true
        context:
          server: 'ldap://ldap.forumsys.com:389'
          managerDn: 'cn=read-only-admin,dc=example,dc=com'
          managerPassword: 'password'
        search:
          base: "dc=example,dc=com"
        groups:
          enabled: true
          base: "dc=example,dc=com"

Configure KafkaHQ groups and LDAP groups:

kafkahq:
  security:
    groups:
      topic-reader: # Group name
        roles:  # roles for the group
          - topic/read
        attributes:
          # Regexp to filter topic available for group
          topics-filter-regexp: "test\\.reader.*"
      topic-writer: 
        roles:
          - topic/read
          - topic/insert
          - topic/delete
          - topic/config/update
        attributes:
          topics-filter-regexp: "test.*"
    ldap:
      group:
        mathematicians:
          groups:
            - topic-reader
        scientists:
          groups:
            - topic-reader
            - topic-writer

Server
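If KafkaHQ runs behind a reverse proxy under a sub-path, a base path can be configured. This is a sketch; the base-path key is assumed from the example configuration file:

kafkahq:
  server:
    base-path: "" # assumed key: path to KafkaHQ when behind a reverse proxy, without a trailing slash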

Kafka admin / producer / consumer default properties
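As a sketch (the clients-defaults key is assumed from the example configuration file), default client properties can be applied across all connections, e.g. for the consumer:

kafkahq:
  clients-defaults:
    consumer:
      properties:
        isolation.level: read_committed # example Kafka consumer property applied by default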

Micronaut configuration

Since KafkaHQ is based on Micronaut, you can customize configuration (server port, SSL, …) through Micronaut configuration. More information can be found in the Micronaut documentation.
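For example, standard Micronaut keys change the HTTP port the server listens on:

micronaut:
  server:
    port: 8085 # serve KafkaHQ on port 8085 instead of the default 8080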

Docker

The KafkaHQ Docker image supports 3 environment variables to handle configuration:

How to mount configuration file

Take care when mounting configuration files not to remove the KafkaHQ files located in /app. You need to explicitly mount /app/application.yml and not the /app directory; mounting the whole directory removes the KafkaHQ binaries and gives you this error: `/usr/local/bin/docker-entrypoint.sh: 9: exec: ./kafkahq: not found`

volumeMounts:
- mountPath: /app/application.yml
  subPath: application.yml
  name: config
  readOnly: true

Monitoring endpoint

Several monitoring endpoints are enabled by default. You can disable them, or restrict access to authenticated users only, following the Micronaut configuration below.
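For example, standard Micronaut endpoint configuration can mark all management endpoints as sensitive so they require authentication (a sketch using Micronaut's endpoints.all shorthand):

endpoints:
  all:
    enabled: true
    sensitive: true # when true, the endpoint requires an authenticated user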

Development Environment

Early dev image

You can access the latest features / bug fixes with the Docker dev image, automatically built on the dev tag:

docker pull tchiotludo/kafkahq:dev

The dev jar is not published on GitHub; you have 2 ways to get it:

Get it from docker image

docker pull tchiotludo/kafkahq:dev
docker run --rm --name=kafkahq -it tchiotludo/kafkahq:dev
docker cp kafkahq:/app/kafkahq.jar . 

Or build it with ./gradlew shadowJar; the jar will be located at build/libs/kafkahq-*.jar

Development Server

A docker-compose file is provided to start a development environment. Just install docker & docker-compose, clone the repository, and issue a simple docker-compose -f docker-compose-dev.yml up to start a dev server. The dev server is a Java server & a webpack-dev-server with live reload.

Who’s using KafkaHQ

Credits

Many thanks to:

Jetbrains

JetBrains for their free OpenSource license.

License

Apache 2.0 © tchiotludo