From 5687a9070f307d93617b183ed858decc8eed3ca1 Mon Sep 17 00:00:00 2001
From: Joe Schmetzer
Date: Thu, 20 Aug 2020 12:45:27 +1000
Subject: [PATCH] Add complete user guide documentation

---
 README.md     |  37 +----
 docs/index.md |  18 ++++++
 docs/kafka.md |  90 +++++++++++++++++++++++++++
 docs/ldap.md  | 168 ++++++++++++++++++++++++++++++++++++++++++++++++++
 docs/test.md  | 117 +++++++++++++++++++++++++++++++++++
 5 files changed, 395 insertions(+), 35 deletions(-)
 create mode 100644 docs/index.md
 create mode 100644 docs/kafka.md
 create mode 100644 docs/ldap.md
 create mode 100644 docs/test.md

diff --git a/README.md b/README.md
index 65ac989b..2d3e19fc 100644
--- a/README.md
+++ b/README.md
@@ -25,42 +25,9 @@ giving minor performance penalty and reduced LDAPS traffic.
 
 **N.B.** that the directory hosting yaml configuration file must be in CLASSPATH.
 
-## Kafka configuration examples
+## Quickstart
 
-Example of Kafka server.properties for using the customized classes for authentication and authorization. The example
-focus on minimum configuration only (sasl plaintext). A production environment should utilize plain with TLS.
-
-```properties
-# Configure inter-broker communication to use plaintext (Use SSL/TLS in Prod!)
-listeners=SASL_PLAINTEXT://localhost:9092
-security.inter.broker.protocol=SASL_PLAINTEXT
-
-# Configure brokers to exchange plain text username/password.
-sasl.mechanism.inter.broker.protocol=PLAIN
-sasl.enabled.mechanisms=PLAIN
-
-# Configure the JAAS context for plain.
-# It is also possible to use an external JAAS file instead of this property
-listener.name.sasl_plaintext.plain.sasl.jaas.config=\
-    org.apache.kafka.common.security.plain.PlainLoginModule required \
-    username="ldap-user" \
-    password="ldap-password";
-
-# Configure the authentication to use LDAP (verify that client is actually who they say they are)
-listener.name.sasl_plaintext.plain.sasl.server.callback.handler.class=\
-    com.instaclustr.kafka.ldap.authentication.SimpleLDAPAuthentication
-
-# Configure the authorization to use LDAP (verify that client is allowed to perform a specific action)
-authorizer.class.name=com.instaclustr.kafka.ldap.authorization.SimpleLDAPAuthorizer
-```
-
-## Testing
-
-Use of Unboundid in-memory LDAP server for all test cases.
-
-Tested on Kafka version 2.x
-
-See [Apache Kafka](https://kafka.apache.org/) in order to test locally.
+A tutorial with configuration examples is available in the [User Guide](docs/index.md).
 
 ## Build
diff --git a/docs/index.md b/docs/index.md
new file mode 100644
index 00000000..8ec738a5
--- /dev/null
+++ b/docs/index.md
@@ -0,0 +1,18 @@
+# User Guide for `kafka-ldap-integration`
+
+This user guide covers the following:
+1. How to enable authentication (`SASL_PLAINTEXT`) and authorization for [Apache Kafka](https://kafka.apache.org).
+1. Code documentation outlining LDAP bind (authentication verification) and LDAP group membership (authorization) in the Kafka context.
+1. How to use Apache Directory Studio to get an LDAP server up and running in a few minutes.
+
+Following these instructions will take approximately 30 minutes. At the end, you should have a working Kafka environment running locally.
+
+Two important observations:
+* Never use `SASL_PLAINTEXT` in production systems. TLS configuration is an independent activity beyond the scope of this guide; simply substitute `SASL_PLAINTEXT` with `SASL_SSL` once TLS is enabled.
+* This text is a minimalistic "recipe"; please read the relevant product documentation for more depth.
+
+## Table of Contents
+
+* [Kafka Broker Setup](kafka.md)
+* [LDAP Server Setup](ldap.md)
+* [Test Drive Kafka LDAP Integration](test.md)
diff --git a/docs/kafka.md b/docs/kafka.md
new file mode 100644
index 00000000..c4b36fbb
--- /dev/null
+++ b/docs/kafka.md
@@ -0,0 +1,90 @@
+# Kafka Broker Setup
+
+This is part of the [User Guide for `kafka-ldap-integration`](index.md).
+
+## Download Kafka
+
+Download Kafka from the [Apache Kafka Downloads](http://kafka.apache.org/downloads) page. There are multiple versions of Kafka available. You should download the version that matches the `kafka_version` and `scala_version` properties in [`build.gradle`](../build.gradle).
+
+Extract the downloaded archive into a directory of your choice. For the remainder of this example, the directory containing the Kafka distribution will be referred to as `$KAFKA_HOME`. It's a good idea to set this as an environment variable.
+
+## Build the Jar and Add to Classpath
+
+From a terminal window or command prompt, run the following commands:
+
+```shell script
+# Clone the repository locally
+git clone git@github.com:instaclustr/kafka-ldap-integration.git
+
+# Change into the directory
+cd kafka-ldap-integration
+
+# Build the jar
+./gradlew build shadowJar
+
+# Copy the jar into the Kafka distribution libs folder
+cp build/libs/*.jar $KAFKA_HOME/libs
+```
+
+## Configure the Broker to Use `kafka-ldap-integration`
+
+Open `$KAFKA_HOME/config/server.properties` in your favourite editor, and add the following lines to the bottom:
+
+```properties
+# Configure inter-broker communication to use plaintext (Use SSL/TLS in Prod!)
+listeners=SASL_PLAINTEXT://localhost:9092
+security.inter.broker.protocol=SASL_PLAINTEXT
+
+# Configure brokers to exchange plain text username/password.
+sasl.mechanism.inter.broker.protocol=PLAIN
+sasl.enabled.mechanisms=PLAIN
+
+# Configure the JAAS context for plain.
+# It is also possible to use an external JAAS file instead of this property
+listener.name.sasl_plaintext.plain.sasl.jaas.config=\
+    org.apache.kafka.common.security.plain.PlainLoginModule required \
+    username="srvkafkabroker" \
+    password="broker";
+
+# Configure the authentication to use LDAP (verify that client is actually who they say they are)
+listener.name.sasl_plaintext.plain.sasl.server.callback.handler.class=\
+    com.instaclustr.kafka.ldap.authentication.SimpleLDAPAuthentication
+
+# Configure the authorization to use LDAP (verify that client is allowed to perform a specific action)
+authorizer.class.name=com.instaclustr.kafka.ldap.authorization.SimpleLDAPAuthorizer
+
+# Configure super users
+super.users=User:srvkafkabroker
+```
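+
+The comment in the properties above notes that an external JAAS file can be used instead of the inline `sasl.jaas.config` property. The following is a rough sketch of that alternative (the file name and location are arbitrary choices, not requirements):
+
+```shell script
+# Write a minimal JAAS file carrying the same broker credentials as above
+cat > $KAFKA_HOME/config/kafka_server_jaas.conf <<'EOF'
+KafkaServer {
+  org.apache.kafka.common.security.plain.PlainLoginModule required
+  username="srvkafkabroker"
+  password="broker";
+};
+EOF
+
+# Hand the JAAS file to the broker JVM (picked up when kafka-server-start.sh runs)
+export KAFKA_OPTS="-Djava.security.auth.login.config=$KAFKA_HOME/config/kafka_server_jaas.conf"
+```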
+
+There needs to be a separate file containing the connection and configuration information for LDAP authentication and authorization. This file must be available in the root of the classpath, and must be named `ldapconfig.yaml`.
+
+Open `$KAFKA_HOME/config/ldapconfig.yaml` in your editor, and copy in the following:
+
+```yaml
+# host of the LDAP server
+host: localhost
+# port of the LDAP server
+port: 10636
+# connection timeout in milliseconds for LDAP
+connTimeout: 10000
+# Placement of users in LDAP tree
+usrBaseDN: ou=users,dc=security,dc=example,dc=com
+# User attribute for DN completion
+usrUid: uid
+# Placement of groups in LDAP tree
+grpBaseDN: ou=groups,dc=security,dc=example,dc=com
+# Group attribute for DN completion
+grpUid: cn
+# Group membership attribute name
+grpAttrName: uniqueMember
+# Lifetime of user entry in cache after cache-write - IN MINUTES
+usrCacheExpire: 6
+# Lifetime of group entry in cache after cache-write - IN MINUTES
+grpCacheExpire: 6
+```
+
+## Next Steps
+
+With the above configuration in place, the broker won't start successfully until the LDAP directory server is up and running with matching configuration. Instructions on how to do this can be found here:
+* [LDAP Server Setup](ldap.md)
diff --git a/docs/ldap.md b/docs/ldap.md
new file mode 100644
index 00000000..a8279984
--- /dev/null
+++ b/docs/ldap.md
@@ -0,0 +1,168 @@
+# LDAP Server Setup
+
+This is part of the [User Guide for `kafka-ldap-integration`](index.md).
+
+## Set up a Directory Server
+
+_If you already have a directory server available for testing, you can skip this section._
+
+* Download and install the latest stable version from [ApacheDS](https://directory.apache.org/apacheds/). It's best to use the installer for your specific operating system.
+* Start the server. On Linux, the command will be something like "`sudo /etc/init.d/apacheds-<version> start`".
+
+A fresh install of ApacheDS will have a default administration DN of "`uid=admin,ou=system`", with a bind password of "`secret`".
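+
+Before going further, it can be worth confirming that the server accepts LDAPS connections. A quick sketch using the OpenLDAP command-line tools (assumed to be installed; `LDAPTLS_REQCERT=never` is only acceptable here because the test server presents a self-signed certificate):
+
+```shell script
+# Bind as the default admin user over LDAPS; success prints the bound DN
+LDAPTLS_REQCERT=never ldapwhoami -H ldaps://localhost:10636 \
+    -D "uid=admin,ou=system" -w secret
+```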
+
+## Set up Test Data
+
+In your favourite editor, create a new file called `test.ldif`, and copy in the following data:
+
+```ldif
+dn: dc=security,dc=example,dc=com
+objectClass: top
+objectClass: domain
+dc: security
+
+dn: ou=users,dc=security,dc=example,dc=com
+objectClass: top
+objectClass: organizationalUnit
+ou: users
+
+dn: uid=srvkafkabroker,ou=users,dc=security,dc=example,dc=com
+objectClass: top
+objectClass: inetOrgPerson
+objectClass: person
+objectClass: organizationalPerson
+cn: Kafka Broker
+sn: Broker
+uid: srvkafkabroker
+userPassword: broker
+
+dn: uid=srvbinder,ou=users,dc=security,dc=example,dc=com
+objectClass: top
+objectClass: inetOrgPerson
+objectClass: person
+objectClass: organizationalPerson
+cn: Service Binder
+sn: Binder
+uid: srvbinder
+userPassword: binder
+
+dn: uid=srvkafkasregistry,ou=users,dc=security,dc=example,dc=com
+objectClass: top
+objectClass: inetOrgPerson
+objectClass: person
+objectClass: organizationalPerson
+cn: Kafka SRegistry
+sn: SRegistry
+uid: srvkafkasregistry
+userPassword: sregistry
+
+dn: uid=srvkafkaproducer,ou=users,dc=security,dc=example,dc=com
+objectClass: top
+objectClass: inetOrgPerson
+objectClass: person
+objectClass: organizationalPerson
+cn: Kafka Producer
+sn: Producer
+uid: srvkafkaproducer
+userPassword: producer
+
+dn: uid=srvkafkaproducer2,ou=users,dc=security,dc=example,dc=com
+objectClass: top
+objectClass: inetOrgPerson
+objectClass: person
+objectClass: organizationalPerson
+cn: Kafka Producer2
+sn: Producer2
+uid: srvkafkaproducer2
+userPassword: producer2
+
+dn: uid=srvkafkaproducer3,ou=users,dc=security,dc=example,dc=com
+objectClass: top
+objectClass: inetOrgPerson
+objectClass: person
+objectClass: organizationalPerson
+cn: Kafka Producer3
+sn: Producer3
+uid: srvkafkaproducer3
+userPassword: producer3
+
+dn: uid=srvkafkaconsumer,ou=users,dc=security,dc=example,dc=com
+objectClass: top
+objectClass: inetOrgPerson
+objectClass: person
+objectClass: organizationalPerson
+cn: Kafka Consumer
+sn: Consumer
+uid: srvkafkaconsumer
+userPassword: consumer
+
+dn: uid=srvkafkaconsumer2,ou=users,dc=security,dc=example,dc=com
+objectClass: top
+objectClass: inetOrgPerson
+objectClass: person
+objectClass: organizationalPerson
+cn: Kafka Consumer2
+sn: Consumer2
+uid: srvkafkaconsumer2
+userPassword: consumer2
+
+dn: uid=srvkafkaconsumer3,ou=users,dc=security,dc=example,dc=com
+objectClass: top
+objectClass: inetOrgPerson
+objectClass: person
+objectClass: organizationalPerson
+cn: Kafka Consumer3
+sn: Consumer3
+uid: srvkafkaconsumer3
+userPassword: consumer3
+
+dn: ou=groups,dc=security,dc=example,dc=com
+objectClass: top
+objectClass: organizationalUnit
+ou: groups
+
+dn: cn=ktconsTest,ou=groups,dc=security,dc=example,dc=com
+objectClass: groupOfUniqueNames
+objectClass: top
+cn: ktacons
+cn: ktconstest
+uniqueMember: uid=srvkafkaconsumer,ou=users,dc=security,dc=example,dc=com
+uniqueMember: uid=srvkafkaconsumer2,ou=users,dc=security,dc=example,dc=com
+
+dn: cn=ktprodTest,ou=groups,dc=security,dc=example,dc=com
+objectClass: groupOfUniqueNames
+objectClass: top
+cn: ktaprod
+cn: ktprodtest
+uniqueMember: uid=srvkafkaproducer3,ou=users,dc=security,dc=example,dc=com
+```
+
+Save the file.
+
+## Import Test Data
+
+You can use any command-line tool or GUI specific to your directory server. These instructions assume you are running the [Apache Directory Studio](https://directory.apache.org/studio/) GUI, and connecting to the ApacheDS server described in the previous section.
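+
+If you prefer the command line, the same LDIF can be loaded with the OpenLDAP client tools instead of the GUI (a sketch, under the same assumptions as the `ldapwhoami` check earlier); otherwise, follow the GUI steps below:
+
+```shell script
+# Add all entries from the test LDIF, binding as the admin user
+LDAPTLS_REQCERT=never ldapadd -H ldaps://localhost:10636 \
+    -D "uid=admin,ou=system" -w secret -f test.ldif
+```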
+
+**Connect to the Directory Server**:
+* From the `LDAP` top level menu, select `New Connection...`
+* On the Network Parameter tab:
+    * Hostname: `localhost`
+    * Port: `10636`
+    * Encryption method: `Use SSL encryption (ldaps://)`
+    * Click `Next`
+* On the Authentication tab:
+    * Authentication method: `Simple Authentication`
+    * Bind DN or user: `uid=admin,ou=system`
+    * Bind password: `secret`
+    * Click `Finish`
+* Open the connection
+
+**Import the test data**:
+* From the `File` top level menu, select `Import...`
+* Select `LDIF into LDAP`, and click `Next`
+* Select the LDIF file you saved in the previous section, and click `Next`
+* You should now be able to browse the entries in the LDAP Browser window.
+
+## Next Steps
+
+Continue with [Test Drive Kafka LDAP Integration](test.md).
diff --git a/docs/test.md b/docs/test.md
new file mode 100644
index 00000000..b2b4f434
--- /dev/null
+++ b/docs/test.md
@@ -0,0 +1,117 @@
+# Test Drive Kafka LDAP Integration
+
+This is part of the [User Guide for `kafka-ldap-integration`](index.md).
+
+## Prerequisites
+
+This guide assumes that you have already:
+* Completed the [Kafka Broker Setup](kafka.md)
+* Completed the [LDAP Server Setup](ldap.md)
+
+## Start the Kafka Broker
+
+Zookeeper needs to be started first. From a terminal window or command prompt, run the following commands:
+
+```shell script
+cd $KAFKA_HOME
+./bin/zookeeper-server-start.sh config/zookeeper.properties
+```
+
+Once Zookeeper is up and running, you can start the Kafka broker from a new terminal window:
+
+```shell script
+cd $KAFKA_HOME
+export CLASSPATH=$KAFKA_HOME/config # ldapconfig.yaml must be on the classpath
+./bin/kafka-server-start.sh config/server.properties
+```
+
+Check the log messages in the terminal window. If everything is set up and configured correctly, there should be no error messages about authentication.
+
+## Set up a Test Topic with Permissions
+
+Start a new terminal window, and run the following commands:
+
+```shell script
+cd $KAFKA_HOME
+
+# Create a test topic
+./bin/kafka-topics.sh --create \
+    --zookeeper localhost:2181 \
+    --replication-factor 1 \
+    --partitions 1 \
+    --topic test
+
+# Define producer access, all users in LDAP group ktprodTest
+./bin/kafka-acls.sh --authorizer-properties zookeeper.connect=localhost:2181 \
+    --add --topic test \
+    --allow-principal User:ktprodTest \
+    --producer
+
+# Define consumer access, all users in LDAP group ktconsTest
+# (quote the * so the shell does not expand it)
+./bin/kafka-acls.sh --authorizer-properties zookeeper.connect=localhost:2181 \
+    --add --topic test \
+    --allow-principal User:ktconsTest \
+    --consumer \
+    --group '*'
+```
+
+## Start the Consumer
+
+Create a new file called `consumer.properties` in `$KAFKA_HOME/config`:
+
+```properties
+security.protocol=SASL_PLAINTEXT
+sasl.mechanism=PLAIN
+sasl.jaas.config=org.apache.kafka.common.security.plain.PlainLoginModule required \
+    username="srvkafkaconsumer" \
+    password="consumer";
+
+# consumer group id
+group.id=test-srvkafkaconsumer-grp
+client.id=srvkafkaconsumer
+
+# list of brokers used for bootstrapping knowledge about the rest of the cluster
+# format: host1:port1,host2:port2 ...
+bootstrap.servers=localhost:9092
+```
+
+From the `$KAFKA_HOME` directory, run the command:
+
+```shell script
+./bin/kafka-console-consumer.sh --bootstrap-server localhost:9092 \
+    --consumer.config ./config/consumer.properties \
+    --topic test
+```
+
+If configured correctly, you should see no errors in the terminal window. The console consumer will wait for messages to appear in the `test` topic, and print them as they are received.
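+
+If the consumer fails with authorization errors instead, it's worth double-checking the ACLs registered earlier; listing them is standard `kafka-acls.sh` usage:
+
+```shell script
+# Show the ACLs currently attached to the test topic
+./bin/kafka-acls.sh --authorizer-properties zookeeper.connect=localhost:2181 \
+    --list --topic test
+```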
+
+## Start the Producer
+
+Create a new file called `producer.properties` in `$KAFKA_HOME/config`:
+
+```properties
+security.protocol=SASL_PLAINTEXT
+sasl.mechanism=PLAIN
+sasl.jaas.config=org.apache.kafka.common.security.plain.PlainLoginModule required \
+    username="srvkafkaproducer3" \
+    password="producer3";
+
+# list of brokers used for bootstrapping knowledge about the rest of the cluster
+# format: host1:port1,host2:port2 ...
+bootstrap.servers=localhost:9092
+
+# specify the compression codec for all data generated: none, gzip, snappy, lz4
+compression.type=none
+```
+
+From the `$KAFKA_HOME` directory, run the command:
+
+```shell script
+./bin/kafka-console-producer.sh --broker-list localhost:9092 \
+    --producer.config ./config/producer.properties \
+    --topic test
+```
+
+If configured correctly, you should see no errors in the terminal window. The console producer lets you type text into the terminal window. When you press `Enter`, the text will be sent to the broker, and you should see it appear in the terminal of the console consumer.
+
+This concludes the basic test!
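+
+As an optional extra check, you can confirm that the LDAP group lookup is really being enforced by producing as a user who is *not* a member of `ktprodTest`, such as `srvkafkaproducer` from the test LDIF. A sketch follows (the `producer-denied.properties` file name is just an illustrative choice):
+
+```shell script
+cd $KAFKA_HOME
+
+# Hypothetical config for a user outside the ktprodTest group
+cat > ./config/producer-denied.properties <<'EOF'
+security.protocol=SASL_PLAINTEXT
+sasl.mechanism=PLAIN
+sasl.jaas.config=org.apache.kafka.common.security.plain.PlainLoginModule required \
+    username="srvkafkaproducer" \
+    password="producer";
+bootstrap.servers=localhost:9092
+EOF
+
+# Sending a message should now fail with an authorization error
+./bin/kafka-console-producer.sh --broker-list localhost:9092 \
+    --producer.config ./config/producer-denied.properties \
+    --topic test
+```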