The Raspberry Pi Pico W microcontroller board is in charge of reading data over a wired connection from the BME280 IoT sensor, which efficiently provides temperature, humidity, and pressure readings on demand. For this use case, a development kit and a breadboard were used to make it easy to wire the Raspberry Pi Pico W GPIO pins to the component inputs.
Once the data is read by the Raspberry Pi Pico W, it sends bytes of data every 2 seconds (configurable) over a TCP/IP connection to a TCP/IP socket listener hosted within the Docker infrastructure. The listener is also in charge of publishing the received data to a Kafka broker, where it is stored in a Kafka topic for 7 days.
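The listener itself can be quite small. Below is a minimal sketch of the idea using the kafka-python client; the port, broker address, and topic name are illustrative assumptions, not necessarily what the repository uses.

```python
# Hypothetical TCP socket listener that forwards readings to Kafka.
# Port, broker address, and topic name are assumptions for illustration.
import socket

from kafka import KafkaProducer

HOST, PORT = "0.0.0.0", 9999                              # assumed listener address
producer = KafkaProducer(bootstrap_servers="kafka:9092")  # assumed broker

with socket.socket(socket.AF_INET, socket.SOCK_STREAM) as server:
    server.setsockopt(socket.SOL_SOCKET, socket.SO_REUSEADDR, 1)
    server.bind((HOST, PORT))
    server.listen()
    while True:
        conn, _addr = server.accept()
        with conn:
            data = conn.recv(1024)  # one sensor reading per connection
            if data:
                producer.send("iot-sensor-data", data)  # assumed topic name
                producer.flush()
```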
Then, a Python Flask API queries the Kafka topic in real time and opens a stream to the web UI, which displays the real-time data in a dashboard.
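One way such a stream could be implemented is with server-sent events. The hedged sketch below consumes the Kafka topic and relays each record to the browser; the route, topic, and broker address are assumptions.

```python
# Hedged sketch of a Flask endpoint streaming Kafka records to the web UI
# as server-sent events. Route, topic, and broker address are assumptions.
from flask import Flask, Response
from kafka import KafkaConsumer

app = Flask(__name__)

@app.route("/stream")
def stream():
    def events():
        consumer = KafkaConsumer(
            "iot-sensor-data",               # assumed topic name
            bootstrap_servers="kafka:9092",  # assumed broker address
        )
        for message in consumer:
            # Relay each Kafka record as one server-sent event
            yield f"data: {message.value.decode()}\n\n"
    return Response(events(), mimetype="text/event-stream")

if __name__ == "__main__":
    app.run(host="0.0.0.0", port=5000)
```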
In parallel, a crontab service launches a Spark batch application every hour to process chunks of data: cleaning, transforming, and aggregating it, and finally storing the aggregated data in MySQL tables.
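The batch job could look roughly like the PySpark sketch below; the input path, column names, and MySQL connection details are illustrative assumptions. A crontab entry along the lines of `0 * * * * spark-submit batch.py` would launch it hourly (the script name is hypothetical).

```python
# Rough sketch of the hourly Spark batch job. Input path, schema, and
# MySQL connection details are illustrative assumptions.
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("iot-hourly-batch").getOrCreate()

# Read the latest chunk of raw readings (assumed location and format)
raw = spark.read.json("/data/iot/raw/")

aggregated = (
    raw.dropna()                                                          # cleaning
       .withColumn("hour", F.date_trunc("hour", F.to_timestamp("timestamp")))
       .groupBy("hour")                                                   # aggregating
       .agg(
           F.avg("temperature").alias("avg_temperature"),
           F.avg("humidity").alias("avg_humidity"),
           F.avg("pressure").alias("avg_pressure"),
       )
)

# Store the aggregates in MySQL over JDBC (assumed URL and credentials)
(aggregated.write
    .format("jdbc")
    .option("url", "jdbc:mysql://mysql:3306/iot")
    .option("dbtable", "hourly_readings")
    .option("user", "iot")
    .option("password", "iot")
    .mode("append")
    .save())
```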
Once the data is finalized in the MySQL tables, it is shown in a beautiful Streamlit analytical dashboard with data history and filters.
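A minimal sketch of such a dashboard, assuming a hypothetical hourly_readings table and connection string, might look like this:

```python
# Small sketch of a Streamlit dashboard over the MySQL aggregates.
# Table name, columns, and connection string are assumptions.
import pandas as pd
import streamlit as st
from sqlalchemy import create_engine

engine = create_engine("mysql+pymysql://iot:iot@mysql:3306/iot")  # assumed DSN

st.title("IoT Analytical Dashboard")

# Load the aggregated history produced by the Spark batch job
df = pd.read_sql("SELECT * FROM hourly_readings ORDER BY hour", engine)

# Date-range filter over the stored history
start, end = st.date_input("Date range", [df["hour"].min(), df["hour"].max()])
mask = (df["hour"].dt.date >= start) & (df["hour"].dt.date <= end)

st.line_chart(df[mask].set_index("hour")[["avg_temperature", "avg_humidity"]])
```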
Install the latest versions of Docker and Docker Compose.
Run the following commands:
git clone https://github.com/mpavanetti/iot.git
cd iot
# Create the data folder and add permissions
sudo mkdir iot_hub/infrastructure/data && sudo chmod -R 777 iot_hub/infrastructure/data
# Create external docker network
sudo docker network create spark-network
# Standard container configuration (No Jupyter Lab)
docker compose up -d
# Containers plus Jupyter Lab (optional)
docker compose --profile jupyter up -d
Or, if you want to start it later, just run: docker compose up -d jupyter-notebook-pyspark
Access it at http://localhost/ or http://raspberrypi.local/
In this case, I am using a Raspberry Pi 4 (8 GB) as the IoT Center infrastructure host as a matter of convenience.
However, you can use any infrastructure (Linux server) of your choice.
If you decide to use a Raspberry Pi 4 as I did (optional), here are the additional steps I used to set it up.
Notes
Note: the initial Jupyter Lab password is tad
I am using a Raspberry Pi Pico W to interface with the IoT sensor (BME280) in order to capture the data read by the sensor and send it through TCP/IP to the server.
- Go to the official MicroPython download page at https://micropython.org/download/RPI_PICO_W/
- Download the latest .uf2 firmware file.
- Plug your Raspberry Pi Pico W into your PC through the USB port.
- Once it is recognized as an external device, copy the recent .uf2 firmware file to the root of the device. It will then reboot.
- Once it starts up again, it will no longer show up as an external storage device.
- Download an IDE of your choice; in my case, I am using Thonny https://thonny.org/
- If you are using Thonny, go to Tools > Manage packages and install the packages micropython-bme280, micropython_ssd1306, and picozero.
- Once you have installed the required libraries, upload the files data.py and main.py to the Raspberry Pi Pico W (see the sketch after this list for an idea of what main.py does).
- Unplug it from your PC.
- Plug it into any 5V USB port.
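For orientation, here is a hedged MicroPython sketch of what main.py might do: connect to Wi-Fi, read the BME280 over I2C, and push a reading to the TCP listener every 2 seconds. The pin numbers, Wi-Fi credentials, and server address are assumptions; refer to the repository's data.py and main.py for the actual implementation.

```python
# Hedged sketch of main.py on the Pico W. Pins, Wi-Fi credentials, and
# server address are assumptions, not the repository's actual values.
import time
import socket
import network
from machine import Pin, I2C

import bme280  # micropython-bme280, installed via Thonny

# Connect to Wi-Fi (assumed credentials)
wlan = network.WLAN(network.STA_IF)
wlan.active(True)
wlan.connect("YOUR_SSID", "YOUR_PASSWORD")
while not wlan.isconnected():
    time.sleep(0.5)

# BME280 on I2C0 (assumed pins: SDA=GP16, SCL=GP17)
i2c = I2C(0, scl=Pin(17), sda=Pin(16))
sensor = bme280.BME280(i2c=i2c)

# Resolve the TCP listener address (assumed host and port)
addr = socket.getaddrinfo("192.168.1.100", 9999)[0][-1]

while True:
    # sensor.values yields formatted (temperature, pressure, humidity) strings
    reading = ",".join(sensor.values).encode()
    s = socket.socket()
    s.connect(addr)
    s.send(reading)
    s.close()
    time.sleep(2)  # configurable send interval
```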
Webapp home screen.
Real-time data streaming screen.
General Hardware Information screen.
Interactive Jupyter Lab and Spark client.
Spark master with 1 Spark worker running.
In this use case, I am using a Raspberry Pi 4 (8 GB) acting as the Docker server. Any Linux host could be used instead.
Raspberry Pi Pico W
Documentation: Raspberry Pi official documentation
Datasheet: Raspberry Pi official datasheet
Purchased at Amazon: Raspberry Pi Pico W Amazon Canada.
Purchased at Amazon: Pico Breadboard Kit Amazon Canada.
Combined humidity and pressure sensor BME280
Datasheet: Bosch BME280 Datasheet.
Purchased at Amazon: BME280 Amazon Canada.
OLED display, single color, 0.96 inch, SSD1306
Purchased at Amazon: SSD1306 Display Amazon US.