diff --git a/README.md b/README.md
index 538259e..23a6f2f 100644
--- a/README.md
+++ b/README.md
@@ -42,6 +42,9 @@ Some of the questions it will help answer right away:
 - How to define a Custom Logical type and package it?
 - How do we maintain our schemas?
 - Better way of sharing schemas with other team members?
+- How can we write complex schemas easily?
+- How can schemas be made reusable?
+
 
 ## Schemas Used
 - QueryRecord schema
@@ -50,7 +53,8 @@ Some of the questions it will help answer right away:
 - CustomerObjectModel
 
 My take on a generic Customer Model of a retail store.
 
-## Usage/Examples
+
+## Build project
 - Clone the repo
 - Build project
@@ -68,6 +72,8 @@ Some of the questions it will help answer right away:
 - Run `QueryRecordOutput.java` and verify the logs.
 
 
+## Testing with Kafka
+Read more at [Test with Kafka](./kafka.md)
 
 
 ## Directory Tree
@@ -131,13 +137,16 @@ Some of the questions it will help answer right away:
 
 ## Roadmap
 
-- Add Github workflow
-- Add unit tests for Conversions
-- Publish To Kafka topic
-- Add spotless or checkstyle plugins
-- Fix for fields which are union logicaltype and null
-- Schema evolution
-- Keep the README.md updated
+- [ ] Add GitHub workflow
+- [ ] Add unit tests for Conversions
+- [x] Publish to Kafka topic
+- [ ] Add spotless or checkstyle plugins
+- [ ] Fix for fields that are a union of a logical type and null
+- [ ] Schema evolution
+- [ ] Keep the README.md updated
+
+
+![Complete flow](./docs/ecommerce.png)
 
 
 ## Tech Stack
diff --git a/kafka.md b/kafka.md
new file mode 100644
index 0000000..80db79c
--- /dev/null
+++ b/kafka.md
@@ -0,0 +1,26 @@
+## Project setup
+Install Confluent Kafka on your machine. This is the easiest way to get started, as it comes packaged with all the required services, such as Schema Registry and Kafka Connect.
+
+[Installation instructions](https://docs.confluent.io/platform/current/installation/installing_cp/zip-tar.html)
+
+```
+confluent local services start
+```
+
+- Run `ProducerDemo.java`
+
+- Run `ConsumerDemo.java`
+
+Take a look at the output.
+
+```
+confluent local services stop
+```
+
+## Things to Observe
+
+- The producer uses the Java classes generated from the `.avdl` files.
+- The consumer decrypts the data it reads from the topic.
+- If you use the Kafka console consumer CLI, you will only see encrypted data.
+- The custom logical types, together with the conversions defined for them, make this happen. The only way to decrypt the data is to read it with the generated Java classes.
+- If the message on the topic is intercepted, it cannot be decrypted.
\ No newline at end of file
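
For readers who want a feel for what the `ProducerDemo.java` step in `kafka.md` boils down to, here is a minimal sketch of a producer talking to the `confluent local services start` stack. The topic name `query-topic`, the key, and the field-less builder call are illustrative assumptions rather than the repo's actual code; `QueryRecord` is the generated class for the QueryRecord schema mentioned in the README, and the ports (`9092` for Kafka, `8081` for Schema Registry) are the Confluent Platform local defaults.

```java
import java.util.Properties;

import org.apache.kafka.clients.producer.KafkaProducer;
import org.apache.kafka.clients.producer.ProducerConfig;
import org.apache.kafka.clients.producer.ProducerRecord;
import org.apache.kafka.common.serialization.StringSerializer;

import io.confluent.kafka.serializers.KafkaAvroSerializer;

public class ProducerSketch {
    public static void main(String[] args) {
        Properties props = new Properties();
        // Defaults for a local `confluent local services start` stack.
        props.put(ProducerConfig.BOOTSTRAP_SERVERS_CONFIG, "localhost:9092");
        props.put(ProducerConfig.KEY_SERIALIZER_CLASS_CONFIG, StringSerializer.class);
        // KafkaAvroSerializer registers the record's schema with Schema Registry
        // and writes Avro binary, applying any registered logical-type conversions.
        props.put(ProducerConfig.VALUE_SERIALIZER_CLASS_CONFIG, KafkaAvroSerializer.class);
        props.put("schema.registry.url", "http://localhost:8081");

        // QueryRecord stands in for any class generated from the .avdl files;
        // set its fields through the generated builder before building.
        QueryRecord record = QueryRecord.newBuilder()
                /* .setSomeField(...) — fields depend on the schema */
                .build();

        try (KafkaProducer<String, QueryRecord> producer = new KafkaProducer<>(props)) {
            // "query-topic" and "key-1" are hypothetical placeholders.
            producer.send(new ProducerRecord<>("query-topic", "key-1", record));
            producer.flush();
        }
    }
}
```

On the consumer side, the mirror-image settings are `KafkaAvroDeserializer` plus `specific.avro.reader=true`, which is what makes the consumer hand back instances of the generated classes (and therefore decrypted values) instead of `GenericRecord`s.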
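The last two observations in `kafka.md` are the heart of the demo, so here is a minimal sketch of the mechanism they describe: an Avro custom `LogicalType` plus a registered `Conversion` that transforms string values on every write and read. The `encrypted-string` name, the class names, and the toy XOR "cipher" are all hypothetical stand-ins — the repo defines its own logical types and encryption scheme in its conversion classes.

```java
import org.apache.avro.Conversion;
import org.apache.avro.LogicalType;
import org.apache.avro.LogicalTypes;
import org.apache.avro.Schema;

// Hypothetical logical type annotating a plain string schema.
class EncryptedStringLogicalType extends LogicalType {
    static final String NAME = "encrypted-string";

    EncryptedStringLogicalType() {
        super(NAME);
    }

    @Override
    public void validate(Schema schema) {
        super.validate(schema);
        if (schema.getType() != Schema.Type.STRING) {
            throw new IllegalArgumentException(NAME + " can only back a string type");
        }
    }
}

// The conversion is where the "encryption" happens: Avro invokes it on every
// write and read of a field carrying the logical type. Readers without this
// conversion (e.g. the console consumer) only ever see the converted bytes.
class EncryptedStringConversion extends Conversion<String> {
    private static final char KEY = 0x5A; // toy key — illustration only, not real crypto

    // Register the factory once (e.g. in a static initializer) so that schema
    // parsing recognizes the logical type by name.
    static void register() {
        LogicalTypes.register(EncryptedStringLogicalType.NAME,
                schema -> new EncryptedStringLogicalType());
    }

    @Override
    public Class<String> getConvertedType() {
        return String.class;
    }

    @Override
    public String getLogicalTypeName() {
        return EncryptedStringLogicalType.NAME;
    }

    @Override
    public Schema getRecommendedSchema() {
        return new EncryptedStringLogicalType().addToSchema(Schema.create(Schema.Type.STRING));
    }

    // Writing: the in-memory String is scrambled before it reaches the wire.
    @Override
    public CharSequence toCharSequence(String value, Schema schema, LogicalType type) {
        return xor(value);
    }

    // Reading: only code with this Conversion registered gets the plaintext back.
    @Override
    public String fromCharSequence(CharSequence value, Schema schema, LogicalType type) {
        return xor(value.toString());
    }

    private static String xor(String s) {
        char[] out = s.toCharArray();
        for (int i = 0; i < out.length; i++) {
            out[i] ^= KEY; // XOR is its own inverse, so one helper serves both directions
        }
        return new String(out);
    }
}
```

With the logical type attached to a field in the schema and the conversion wired into the generated classes (the avro-maven-plugin's `customConversions` and `customLogicalTypeFactories` options do this at code-generation time), producers and consumers apply the transformation transparently — which is exactly why the console consumer sees only ciphertext while `ConsumerDemo.java` sees plaintext.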