Commit
Improve README.md and add other pictures (#18)
AlessandroAlinone authored Oct 22, 2024
1 parent 7b52058 commit 99bcde6
Showing 11 changed files with 213 additions and 170 deletions.
340 changes: 192 additions & 148 deletions README.md

Large diffs are not rendered by default.

8 changes: 4 additions & 4 deletions examples/airport-demo/README.md
@@ -10,7 +10,7 @@ The simulated data, inputted into a [Kafka cluster](https://kafka.apache.org/),
The demo project consists of:
- A web client designed to visualize the airport departure board from a browser.
- A random flight information generator that acts as a message producer for Kafka.
- - Files to configure Kafka Connector according to the needs of the demo.
+ - Files to configure the Kafka Connector according to the needs of the demo.

## The Web Client

@@ -33,7 +33,7 @@ As you can see, items have been expressed in a parameterized format to activate
```

which requires every subscription to include a filtering value for the _bind parameter_ `key`.
- Upon consuming an incoming message, Kafka Connector will then route the record if the subscribed item has specified a filtering value that matches the record key.
+ Upon consuming an incoming message, the Kafka Connector will then route the record if the subscribed item has specified a filtering value that matches the record key.
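
The key-based routing described above is driven by the connector's item-template configuration. As a hedged sketch only (the template and topic names here are illustrative, not necessarily the demo's actual values), the relevant `adapters.xml` entries would look like:

```xml
<!-- Illustrative: expose items of the form "flights-[key=<record key>]",
     where the bind parameter "key" is filled from the Kafka record key -->
<param name="item-template.flights">flights-#{key=KEY}</param>
<param name="map.Flights.to">item-template.flights</param>
```

A client subscribing to the item `flights-[key=LX101]` would then receive only the records whose Kafka key is `LX101` (item and key values hypothetical).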

## The Producer

@@ -44,7 +44,7 @@ The source code of the producer is basically contained in the `producer` package

## Connector Configurations

- In the [`connector`](connector/) folder, we found the configuration files needed to configure Kafka Connector:
+ In the [`connector`](connector/) folder, we found the configuration files needed to configure the Kafka Connector:

- `adapters.xml`: this file configures the parameters the connector needs to consume messages from Kafka and defines the mapping between Kafka cluster topics and the Lightstreamer items that clients subscribe to. In this demo, messages are serialized as JSON objects, so the file also defines how the fields of the received JSON object map to the Lightstreamer item fields sent to clients. In particular, the diff touches the section defining the field mapping, which is collapsed in this view.
@@ -117,7 +117,7 @@ To configure our `Flights` topic to be managed in a compacted manner, the following
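
The field-mapping block itself is collapsed in this diff view. As a hedged illustration only (the field names below are hypothetical, not necessarily the demo's actual ones), a JSON field-mapping section in `adapters.xml` typically looks like this:

```xml
<!-- Hypothetical field mapping: each Lightstreamer field is bound to a
     Data Extraction Language expression over the JSON record value -->
<param name="field.destination">#{VALUE.destination}</param>
<param name="field.departureTime">#{VALUE.departureTime}</param>
<param name="field.status">#{VALUE.status}</param>
```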

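The truncated hunk line above refers to configuring the `Flights` topic for log compaction. As a sketch (the broker address and sizing flags are illustrative), a compacted topic can be created with the standard Kafka tooling:

```shell
# Illustrative: create the Flights topic with log compaction enabled
kafka-topics.sh --bootstrap-server localhost:9092 --create --topic Flights \
  --partitions 1 --replication-factor 1 \
  --config cleanup.policy=compact
```

With compaction, Kafka retains at least the latest record per key, so a late-joining consumer can still recover the current state of every flight.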
- Download Lightstreamer Server version 7.4.2 or later (Lightstreamer Server comes with a free non-expiring demo license for 20 connected users) from the [Lightstreamer Download page](https://lightstreamer.com/download/), and install it, as explained in the `GETTING_STARTED.TXT` file in the installation home directory.
- Make sure that Lightstreamer Server is not running.
- - Deploy a fresh installation of Lightstreamer Kafka Connector following the instructions provided [here](../../README.md#deploy).
+ - Deploy a fresh installation of the Lightstreamer Kafka Connector following the instructions provided [here](../../README.md#deploy).
- Replace the `adapters.xml` file with the one provided by this project and, if necessary, update the settings as discussed in the previous section.
- [Optional] Customize the logging settings in the log4j configuration file `log4j.properties`.
- Launch Lightstreamer Server.
2 changes: 1 addition & 1 deletion examples/airport-demo/connector/adapters.xml
@@ -20,7 +20,7 @@

</metadata_provider>

- <!-- Mandatory. Kafka Connector allows the configuration of different independent connections to different Kafka
+ <!-- Mandatory. The Kafka Connector allows the configuration of different independent connections to different Kafka
broker/clusters.
Every single connection is configured via the definition of its own Lightstreamer Data Adapter. At least one connection
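The comment above explains that each connection is configured as its own Lightstreamer Data Adapter. A hedged sketch of two independent connections (the adapter names and broker addresses are illustrative, and the remaining mandatory parameters are omitted):

```xml
<!-- Illustrative only: two connections to two different Kafka clusters,
     each defined as a separate Data Adapter -->
<data_provider name="ClusterOneConnection">
    <!-- adapter class and other mandatory parameters omitted -->
    <param name="bootstrap.servers">broker-one:9092</param>
</data_provider>
<data_provider name="ClusterTwoConnection">
    <param name="bootstrap.servers">broker-two:9092</param>
</data_provider>
```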
2 changes: 1 addition & 1 deletion examples/docker-kafka-connect/README.md
@@ -1,6 +1,6 @@
# Kafka Connect Lightstreamer Sink Connector Docker Image

- This folder contains the resources required to build a minimal Docker image of Kafka Connect Lightstreamer Sink Connector.
+ This folder contains the resources required to build a minimal Docker image of the Kafka Connect Lightstreamer Sink Connector.

The image is based on the official [Official Confluent Docker Base Image for Kafka Connect](https://hub.docker.com/r/confluentinc/cp-kafka-connect-base). Check out the [`Dockerfile`](./Dockerfile) for more details.
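Purely as a hedged sketch (the tag, directory names, and plugin path below are hypothetical and not taken from the repository's actual `Dockerfile`), deriving such an image from the Confluent base image could look like:

```dockerfile
# Hypothetical sketch: derive from the Confluent Kafka Connect base image
# and add the Lightstreamer sink connector to the Connect plugin path
FROM confluentinc/cp-kafka-connect-base:7.6.0
COPY ./lightstreamer-connect-sink/ /usr/share/java/lightstreamer-connect-sink/
```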

2 changes: 1 addition & 1 deletion examples/docker/README.md
@@ -1,6 +1,6 @@
# Lightstreamer Kafka Connector Docker Image

- This folder contains the resources required to build a minimal Docker image of Lightstreamer Kafka Connector.
+ This folder contains the resources required to build a minimal Docker image of the Lightstreamer Kafka Connector.

The image is built by deriving the official [Lightstreamer Docker image](https://hub.docker.com/_/lightstreamer) with the current version of the Kafka Connector. Check out the [`Dockerfile`](./Dockerfile) for more details.

Empty file.
@@ -20,7 +20,7 @@

</metadata_provider>

- <!-- Mandatory. Kafka Connector allows the configuration of different independent connections to different Kafka
+ <!-- Mandatory. The Kafka Connector allows the configuration of different independent connections to different Kafka
broker/clusters.
Every single connection is configured via the definition of its own Lightstreamer Data Adapter. At least one connection
@@ -184,7 +184,7 @@
-->

<!-- Mandatory if authentication is enabled. The SASL mechanism type.
- Kafka Connector supports the following authentication mechanisms:
+ The Kafka Connector accepts the following authentication mechanisms:
- PLAIN
- SCRAM-SHA-256
@@ -254,11 +254,11 @@
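
The SASL settings themselves are collapsed in this view. As a hedged sketch (parameter names follow the pattern used elsewhere in this configuration file; the mechanism choice and credentials are hypothetical), enabling SCRAM authentication might look like:

```xml
<!-- Illustrative sketch: enable SASL/SCRAM-SHA-256 authentication -->
<param name="authentication.enable">true</param>
<param name="authentication.mechanism">SCRAM-SHA-256</param>
<param name="authentication.username">kafka-user</param>
<param name="authentication.password">kafka-user-password</param>
```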

<!-- Optional. Specifies where to start consuming events from:
- LATEST: start consuming events from the end of the topic partition
- EARLIEST: start consuming events from the beginning of the topic partition
- EARLIEST: start consuming events from the beginning of the topic partition
Default value: LATEST. -->
<param name="record.consume.from">EARLIEST</param>

<!-- Optional. The format to be used to deserialize respectively the key and value of a Kafka record.
Can be one of the following:
- AVRO
@@ -279,15 +279,15 @@
<param name="record.key.evaluator.type">INTEGER</param>
<param name="record.value.evaluator.type">JSON</param>

<!-- Mandatory if evaluator type is AVRO and the Confluent Schema Registry is disabled. The path of the local schema
file relative to the deployment folder (LS_HOME/adapters/lightstreamer-kafka-connector-<version>) for
<!-- Mandatory if evaluator type is AVRO and the Confluent Schema Registry is disabled. The path of the local schema
file relative to the deployment folder (LS_HOME/adapters/lightstreamer-kafka-connector-<version>) for
message validation respectively of the key and the value. -->
<!--
<param name="record.key.evaluator.schema.path">schema/record_key.avsc</param>
<param name="record.value.evaluator.schema.path">schemas/record_value.avsc</param>
-->

<!-- Mandatory if evaluator type is AVRO and no local schema paths are specified. Enable the use of the Confluent Schema Registry for validation respectively of the key and
<!-- Mandatory if evaluator type is AVRO and no local schema paths are specified. Enable the use of the Confluent Schema Registry for validation respectively of the key and
value. Can be one of the following:
- true
- false
@@ -325,8 +325,8 @@
<!-- Multiple and mandatory. Map the Kafka topic <topic> to:
- one or more simple items
- one or more item templates
- any combination of the above
- any combination of the above
At least one mapping must be provided. -->
<!-- Example 1:
<param name="map.<topic>.to">item1,item2,itemN,...</param>
@@ -341,10 +341,10 @@

<!-- ##### RECORD MAPPING SETTINGS ##### -->

<!-- Multiple and Mandatory. Map the value extracted through "extraction_expression" to
<!-- Multiple and Mandatory. Map the value extracted through "extraction_expression" to
field <fieldName>. The expression is written in the Data Extraction Language. See documentation at:
https://github.com/lightstreamer/Lightstreamer-kafka-connector?tab=readme-ov-file#record-mapping-fieldfiledname
At least one mapping must be provided. -->
<!--
<param name="field.<fieldName>">extraction_expression</param>
@@ -371,8 +371,7 @@
<!-- ##### SCHEMA REGISTRY SETTINGS ##### -->

<!-- Mandatory if the Confluent Schema Registry is enabled. The URL of the Confluent Schema Registry.
- An encrypted connection is enabled by specifying the "https" protocol
- -->
+ An encrypted connection is enabled by specifying the "https" protocol. -->
<!--
<param name="schema.registry.url">https://schema-registry:8084</param>
-->
@@ -392,7 +391,7 @@
<param name="schema.registry.basic.authentication.password">authorized-schema-registry-user-password</param>
-->

<!-- The following parameters have the same meaning as the homologous ones defined in
<!-- The following parameters have the same meaning as the homologous ones defined in
the ENCRYPTION SETTINGS section. -->

<!-- Set general encryption settings -->
Binary file added pictures/architecture-full-confluent.png
Binary file added pictures/architecture-full.png
Binary file modified pictures/architecture.png
Binary file added pictures/client-platforms.png
