Commit
Add complete user guide documentation
Joe Schmetzer committed Aug 20, 2020
1 parent 4c1aa67 commit 5687a90
Showing 5 changed files with 395 additions and 35 deletions.
37 changes: 2 additions & 35 deletions README.md
@@ -25,42 +25,9 @@ giving minor performance penalty and reduced LDAPS traffic.

**N.B.** the directory hosting the YAML configuration file must be on the CLASSPATH.

## Kafka configuration examples
## Quickstart

Example of a Kafka `server.properties` using the customized classes for authentication and authorization. The example
focuses on a minimal configuration only (SASL plaintext). A production environment should use PLAIN with TLS.

```properties
# Configure inter-broker communication to use plaintext (Use SSL/TLS in Prod!)
listeners=SASL_PLAINTEXT://localhost:9092
security.inter.broker.protocol=SASL_PLAINTEXT

# Configure brokers to exchange plain text username/password.
sasl.mechanism.inter.broker.protocol=PLAIN
sasl.enabled.mechanisms=PLAIN

# Configure the JAAS context for plain.
# It is also possible to use an external JAAS file instead of this property
listener.name.sasl_plaintext.plain.sasl.jaas.config=\
org.apache.kafka.common.security.plain.PlainLoginModule required \
username="ldap-user" \
password="ldap-password";

# Configure the authentication to use LDAP (verify that client is actually who they say they are)
listener.name.sasl_plaintext.plain.sasl.server.callback.handler.class=\
com.instaclustr.kafka.ldap.authentication.SimpleLDAPAuthentication

# Configure the authorization to use LDAP (verify that client is allowed to perform a specific action)
authorizer.class.name=com.instaclustr.kafka.ldap.authorization.SimpleLDAPAuthorizer
```

## Testing

All test cases use the UnboundID in-memory LDAP server.

Tested on Kafka version 2.x.

See [Apache Kafka](https://kafka.apache.org/) to test locally.
A tutorial with configuration examples is available in the [User Guide](docs/index.md).

## Build

18 changes: 18 additions & 0 deletions docs/index.md
@@ -0,0 +1,18 @@
# User Guide for `kafka-ldap-integration`

This user guide covers the following:
1. How to enable authentication (`SASL_PLAINTEXT`) and authorization for [Apache Kafka](https://kafka.apache.org).
1. Code documentation outlining LDAP bind (authentication verification) and LDAP group membership (authorization) in the Kafka context.
1. How to use Apache Directory Studio to get an LDAP server up and running in a few minutes.

Following these instructions will take approximately 30 minutes. At the end, you should have a working Kafka environment locally.

Two important observations:
* Never use `SASL_PLAINTEXT` in production systems. Configuring TLS is a separate activity beyond the scope of this guide; once TLS is in place, simply substitute `SASL_PLAINTEXT` with `SASL_SSL`.
* This text is a minimal "recipe"; please consult the relevant project documentation for more detail.

## Table of Contents

* [Kafka Broker Setup](kafka.md)
* [LDAP Server Setup](ldap.md)
* [Test Drive Kafka LDAP Integration](test.md)
90 changes: 90 additions & 0 deletions docs/kafka.md
@@ -0,0 +1,90 @@
# Kafka Broker Setup

This is part of the [User Guide for `kafka-ldap-integration`](index.md).

## Download Kafka

Download Kafka from the [Apache Kafka Downloads](http://kafka.apache.org/downloads) page. There are multiple versions of Kafka available. You should download the version that matches the `kafka_version` and `scala_version` properties in [`build.gradle`](../build.gradle).

Extract the downloaded archive into a directory of your choice. For the remainder of this example, the directory containing the Kafka distribution will be referred to as `$KAFKA_HOME`. It's a good idea to set this as an environment variable.
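For example, assuming the archive was extracted into your home directory (the version in the path below is hypothetical; substitute whatever you actually downloaded):

```shell
# Hypothetical path -- substitute the Scala/Kafka versions you downloaded.
export KAFKA_HOME="$HOME/kafka_2.12-2.5.0"

# Confirm the variable before proceeding.
echo "Using Kafka distribution at: $KAFKA_HOME"
```

Add the `export` line to your shell profile if you want it to survive new terminal sessions.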

## Build the Jar and Add to Classpath

From a terminal window or command prompt, run the following commands:

```shell
# Clone the repository locally
git clone [email protected]:instaclustr/kafka-ldap-integration.git

# Change into the directory
cd kafka-ldap-integration

# Build the jar
./gradlew build shadowJar

# Copy the jar into the Kafka distribution libs folder
cp build/libs/*.jar $KAFKA_HOME/libs
```

## Configure the Broker to Use `kafka-ldap-integration`

Open `$KAFKA_HOME/config/server.properties` in your favourite editor, and add the following lines to the bottom:

```properties
# Configure inter-broker communication to use plaintext (Use SSL/TLS in Prod!)
listeners=SASL_PLAINTEXT://localhost:9092
security.inter.broker.protocol=SASL_PLAINTEXT

# Configure brokers to exchange plain text username/password.
sasl.mechanism.inter.broker.protocol=PLAIN
sasl.enabled.mechanisms=PLAIN

# Configure the JAAS context for plain.
# It is also possible to use an external JAAS file instead of this property
listener.name.sasl_plaintext.plain.sasl.jaas.config=\
org.apache.kafka.common.security.plain.PlainLoginModule required \
username="srvkafkabroker" \
password="broker";

# Configure the authentication to use LDAP (verify that client is actually who they say they are)
listener.name.sasl_plaintext.plain.sasl.server.callback.handler.class=\
com.instaclustr.kafka.ldap.authentication.SimpleLDAPAuthentication

# Configure the authorization to use LDAP (verify that client is allowed to perform a specific action)
authorizer.class.name=com.instaclustr.kafka.ldap.authorization.SimpleLDAPAuthorizer

# Configure super users
super.users=User:srvkafkabroker
```

There needs to be a separate file containing the connection and configuration information for LDAP authentication and authorization. This file must be available at the root of the classpath, and must be called `ldapconfig.yaml`.

Open `$KAFKA_HOME/config/ldapconfig.yaml` in your editor, and copy in the following:

```yaml
# Host of the LDAP server
host: localhost
# Port of the LDAP server
port: 10636
# Connection timeout in milliseconds for LDAP
connTimeout: 10000
# Placement of users in LDAP tree
usrBaseDN: ou=users,dc=security,dc=example,dc=com
# User attribute for DN completion
usrUid: uid
# Placement of groups in LDAP tree
grpBaseDN: ou=groups,dc=security,dc=example,dc=com
# Group attribute for DN completion
grpUid: cn
# Group membership attribute name
grpAttrName: uniqueMember
# Lifetime of user entry in cache after cache-write - IN MINUTES
usrCacheExpire: 6
# Lifetime of group entry in cache after cache-write - IN MINUTES
grpCacheExpire: 6
```
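The broker will only find `ldapconfig.yaml` if `$KAFKA_HOME/config` is at the root of the classpath. One way to arrange this is to export `CLASSPATH` before starting the broker — a sketch under the assumption that Kafka's launch scripts append an exported `CLASSPATH` (check `kafka-run-class.sh` in your version):

```shell
# Sketch: make ldapconfig.yaml visible at the root of the classpath.
# Assumes kafka-run-class.sh appends an exported CLASSPATH variable.
export CLASSPATH="$KAFKA_HOME/config"
echo "ldapconfig.yaml expected on classpath at: $CLASSPATH"
```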

## Next Steps

With the above configuration, you won't be able to start the broker until the LDAP directory server is up and running with the correct configuration. Instructions on how to do this can be found here:
* [LDAP Server Setup](ldap.md)
168 changes: 168 additions & 0 deletions docs/ldap.md
@@ -0,0 +1,168 @@
# LDAP Server Setup

This is part of the [User Guide for `kafka-ldap-integration`](index.md).

## Set up a Directory Server

_If you already have a directory server available for testing, you can skip this section._

* Download and install the latest stable version from [ApacheDS](https://directory.apache.org/apacheds/). It's best to use the installer for your specific operating system.
* Start the server. On Linux, the command will be something like `sudo /etc/init.d/apacheds-<version> start`.

A fresh install of ApacheDS will have a default administration DN of "`uid=admin,ou=system`", with a bind password of "`secret`".
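If you have the OpenLDAP client tools installed, you can verify the server is reachable with a quick search against the admin entry. This is an optional sketch: `LDAPTLS_REQCERT=never` skips certificate verification for ApacheDS's self-signed certificate, and the guard around `ldapsearch` lets the snippet degrade gracefully when the tools are missing.

```shell
# Optional connectivity check -- assumes OpenLDAP client tools (ldap-utils).
if command -v ldapsearch >/dev/null 2>&1; then
  LDAPTLS_REQCERT=never ldapsearch -x \
    -H ldaps://localhost:10636 \
    -D "uid=admin,ou=system" -w secret \
    -b "uid=admin,ou=system" dn \
    || echo "ApacheDS not reachable yet -- check that the server is running"
else
  echo "ldapsearch not installed -- use Apache Directory Studio instead"
fi
```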

## Set up Test Data

In your favourite editor, create a new file called `test.ldif`, and copy in the following data:

```ldif
dn: dc=security,dc=example,dc=com
objectClass: top
objectClass: domain
dc: security

dn: ou=users,dc=security,dc=example,dc=com
objectClass: top
objectClass: organizationalUnit
ou: users

dn: uid=srvkafkabroker,ou=users,dc=security,dc=example,dc=com
objectClass: top
objectClass: inetOrgPerson
objectClass: person
objectClass: organizationalPerson
cn: Kafka Broker
sn: Broker
uid: srvkafkabroker
userPassword: broker

dn: uid=srvbinder,ou=users,dc=security,dc=example,dc=com
objectClass: top
objectClass: inetOrgPerson
objectClass: person
objectClass: organizationalPerson
cn: Service Binder
sn: Binder
uid: srvbinder
userPassword: binder

dn: uid=srvkafkasregistry,ou=users,dc=security,dc=example,dc=com
objectClass: top
objectClass: inetOrgPerson
objectClass: person
objectClass: organizationalPerson
cn: Kafka SRegistry
sn: SRegistry
uid: srvkafkasregistry
userPassword: sregistry

dn: uid=srvkafkaproducer,ou=users,dc=security,dc=example,dc=com
objectClass: top
objectClass: inetOrgPerson
objectClass: person
objectClass: organizationalPerson
cn: Kafka Producer
sn: Producer
uid: srvkafkaproducer
userPassword: producer

dn: uid=srvkafkaproducer2,ou=users,dc=security,dc=example,dc=com
objectClass: top
objectClass: inetOrgPerson
objectClass: person
objectClass: organizationalPerson
cn: Kafka Producer2
sn: Producer2
uid: srvkafkaproducer2
userPassword: producer2

dn: uid=srvkafkaproducer3,ou=users,dc=security,dc=example,dc=com
objectClass: top
objectClass: inetOrgPerson
objectClass: person
objectClass: organizationalPerson
cn: Kafka Producer3
sn: Producer3
uid: srvkafkaproducer3
userPassword: producer3

dn: uid=srvkafkaconsumer,ou=users,dc=security,dc=example,dc=com
objectClass: top
objectClass: inetOrgPerson
objectClass: person
objectClass: organizationalPerson
cn: Kafka Consumer
sn: Consumer
uid: srvkafkaconsumer
userPassword: consumer

dn: uid=srvkafkaconsumer2,ou=users,dc=security,dc=example,dc=com
objectClass: top
objectClass: inetOrgPerson
objectClass: person
objectClass: organizationalPerson
cn: Kafka Consumer2
sn: Consumer2
uid: srvkafkaconsumer2
userPassword: consumer2

dn: uid=srvkafkaconsumer3,ou=users,dc=security,dc=example,dc=com
objectClass: top
objectClass: inetOrgPerson
objectClass: person
objectClass: organizationalPerson
cn: Kafka Consumer3
sn: Consumer3
uid: srvkafkaconsumer3
userPassword: consumer3

dn: ou=groups,dc=security,dc=example,dc=com
objectClass: top
objectClass: organizationalUnit
ou: groups

dn: cn=ktconsTest,ou=groups,dc=security,dc=example,dc=com
objectClass: groupOfUniqueNames
objectClass: top
cn: ktacons
cn: ktconstest
uniqueMember: uid=srvkafkaconsumer,ou=users,dc=security,dc=example,dc=com
uniqueMember: uid=srvkafkaconsumer2,ou=users,dc=security,dc=example,dc=com

dn: cn=ktprodTest,ou=groups,dc=security,dc=example,dc=com
objectClass: groupOfUniqueNames
objectClass: top
cn: ktaprod
cn: ktprodtest
uniqueMember: uid=srvkafkaproducer3,ou=users,dc=security,dc=example,dc=com
```

Save the file.

## Import Test Data

You can use any command line tools or GUI specific to your directory server. These instructions assume you are running the [Apache Directory Studio](https://directory.apache.org/studio/) GUI, and connecting to the ApacheDS server described in the previous section.
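As a command-line alternative, the same import can be done with OpenLDAP's `ldapadd`. This is a hedged sketch: it assumes the client tools are installed, the server is running, and `test.ldif` is in the current directory; `LDAPTLS_REQCERT=never` skips verification of the self-signed certificate.

```shell
# Optional CLI import -- equivalent to the Directory Studio wizard below.
if command -v ldapadd >/dev/null 2>&1; then
  LDAPTLS_REQCERT=never ldapadd -x \
    -H ldaps://localhost:10636 \
    -D "uid=admin,ou=system" -w secret \
    -f test.ldif \
    || echo "import failed -- check the server and the path to test.ldif"
else
  echo "ldapadd not installed -- use Apache Directory Studio instead"
fi
```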

**Connect to the Directory Server**:
* From the `LDAP` top level menu, select `New Connection...`
* On the Network Parameter tab:
* Hostname: `localhost`
* Port: `10636`
* Encryption method: `Use SSL encryption (ldaps://)`
* Click "Next"
* On the Authentication tab:
* Authentication method: `Simple Authentication`
* Bind DN or user: `uid=admin,ou=system`
* Bind password: `secret`
* Click "Finish"
* Open the connection

**Import the test data**:
* From the `File` top level menu, select `Import...`
* Select `LDIF into LDAP`, and click "Next"
* Select the LDIF file you saved in the previous section, and click "Next"
* You should now be able to browse the entries in the LDAP Browser window.

## Next Steps

[Test Drive Kafka LDAP Integration](test.md).