Updated readme and instructions for testing with kafka
anilkulkarni87 committed Sep 27, 2023
1 parent 1d852f2 commit 3f7130c
Showing 2 changed files with 43 additions and 8 deletions.
25 changes: 17 additions & 8 deletions README.md
@@ -42,6 +42,9 @@ Some of the questions it will help answer right away:
- How to define a Custom Logical type and package it?
- How do we maintain our schemas?
- Better way of sharing schemas with other team members?
- How can we write complex schemas easily?
- How can schemas be made reusable?
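For the last two questions, Avro IDL supports imports and type annotations that make schemas composable and reusable; a hypothetical sketch (the file, protocol, and logical-type names here are illustrative, not taken from this repository):

```avdl
@namespace("com.example.schemas")
protocol CustomerProtocol {
  // Reuse shared record definitions from another IDL file
  import idl "common.avdl";

  record Customer {
    string firstName;
    // A custom logical type attached to a plain string field
    @logicalType("encrypted-string")
    string email;
  }
}
```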

## Schemas Used

- QueryRecord schema
@@ -50,7 +53,8 @@ Some of the questions it will help answer right away:
- CustomerObjectModel

My take on a generic Customer Model of a retail store.
## Usage/Examples

## Build project

- Clone the repo
- Build project
@@ -68,6 +72,8 @@ Some of the questions it will help answer right away:

- Run `QueryRecordOutput.java` and verify the logs.

## Testing with Kafka
Read more at [Test with Kafka](./kafka.md)


## Directory Tree
@@ -131,13 +137,16 @@ Some of the questions it will help answer right away:

## Roadmap

- Add Github workflow
- Add unit tests for Conversions
- Publish To Kafka topic
- Add spotless or checkstyle plugins
- Fix for fields which are union logicaltype and null
- Schema evolution
- Keep the README.md updated
- [ ] Add Github workflow
- [ ] Add unit tests for Conversions
- [x] Publish To Kafka topic
- [ ] Add spotless or checkstyle plugins
- [ ] Fix for fields which are union logicaltype and null
- [ ] Schema evolution
- [ ] Keep the README.md updated


![Complete flow](./docs/ecommerce.png)

## Tech Stack

26 changes: 26 additions & 0 deletions kafka.md
@@ -0,0 +1,26 @@
## Project setup
Install Confluent Kafka on your machine. This is the easiest way to get started, as it comes packaged with services such as Schema Registry and Kafka Connect.

[Installation instructions](https://docs.confluent.io/platform/current/installation/installing_cp/zip-tar.html)

```
confluent local services start
```

- Run the `ProducerDemo.java`

- Run the `ConsumerDemo.java`

Please take a look at the output of both runs.

```
confluent local services stop
```

## Things to Observe

- The producer uses the Java classes generated from the avdl files.
- The consumer decrypts the data it reads from the topic.
- If you use the Kafka consumer CLI, you will see encrypted data.
- The custom logical types, together with the conversions defined for them, make this happen. The only way to decrypt the data is to read it using the generated Java classes.
- If the message on the topic is intercepted, it cannot be decrypted.
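To make the observations above concrete, here is a minimal, self-contained Java sketch of the kind of transformation an encrypting logical-type conversion performs on a string field. The class name, hard-coded key, and cipher choice are illustrative assumptions, not the project's actual code:

```java
import javax.crypto.Cipher;
import javax.crypto.spec.SecretKeySpec;
import java.nio.charset.StandardCharsets;
import java.util.Base64;

// Hypothetical stand-in for what a custom logical-type Conversion might do.
public class EncryptedStringDemo {
    // Hard-coded 128-bit key for illustration only; a real setup would load it securely.
    private static final SecretKeySpec KEY =
            new SecretKeySpec("0123456789abcdef".getBytes(StandardCharsets.UTF_8), "AES");

    static String encrypt(String plain) throws Exception {
        Cipher cipher = Cipher.getInstance("AES/ECB/PKCS5Padding");
        cipher.init(Cipher.ENCRYPT_MODE, KEY);
        return Base64.getEncoder()
                .encodeToString(cipher.doFinal(plain.getBytes(StandardCharsets.UTF_8)));
    }

    static String decrypt(String encoded) throws Exception {
        Cipher cipher = Cipher.getInstance("AES/ECB/PKCS5Padding");
        cipher.init(Cipher.DECRYPT_MODE, KEY);
        return new String(cipher.doFinal(Base64.getDecoder().decode(encoded)),
                StandardCharsets.UTF_8);
    }

    public static void main(String[] args) throws Exception {
        String original = "alice@example.com";
        String onTopic = encrypt(original);   // what a plain CLI consumer would see: Base64 ciphertext
        String decoded = decrypt(onTopic);    // what a conversion-aware consumer recovers
        System.out.println(onTopic);
        System.out.println(decoded);          // prints "alice@example.com"
    }
}
```

A consumer without the conversion (and its key) only ever sees the Base64 ciphertext, which matches what the Kafka consumer CLI shows.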
