refactor(spark): improve compose spark build (#83)
* feature: build docker spark image within compose

* refactor: removed readme because now implicit
rivamarco authored Jul 9, 2024
1 parent af3ec85 commit 3fb07cc
Showing 2 changed files with 36 additions and 22 deletions.
36 changes: 36 additions & 0 deletions docker-compose.yaml
@@ -35,6 +35,7 @@ services:
      S3_ENDPOINT_URL: "http://minio:9000"
      S3_BUCKET_NAME: "test-bucket"
      KUBECONFIG_FILE_PATH: "/opt/kubeconfig/kubeconfig.yaml"
      SPARK_IMAGE: "radicalbit-spark-py:develop"
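      # Assumption: this is the image tag the api service uses for Spark jobs; it
      # matches the tag built by the docker-client service defined further down.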
    depends_on:
      postgres:
        condition: service_healthy
@@ -142,6 +143,9 @@ services:
    environment:
      K3S_KUBECONFIG_OUTPUT: /output/kubeconfig.yaml
      K3S_KUBECONFIG_MODE: 666
    depends_on:
      docker-client:
        condition: service_completed_successfully
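    # docker-client (added below) builds the Spark image and saves it as a tar under
    # ./docker/k3s_data/images; k3s waits for that job to complete so the tar already
    # exists when the cluster starts and (per the removed spark/README.md) is loaded.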
    volumes:
      - k3s-server:/var/lib/rancher/k3s
      # This is just so that we get the kubeconfig file out
@@ -159,6 +163,36 @@ services:
    ports:
      - 6443:6443

  dind:
    image: docker:dind
    privileged: true
    hostname: dind
    environment:
      DOCKER_TLS_CERTDIR: /certs
    volumes:
      - docker-certs-ca:/certs
      - docker-certs-client:/certs/client
    healthcheck:
      test: nc -w 5 -z localhost 2376
      start_period: 5s
      interval: 10s
      timeout: 5s
      retries: 2

  docker-client:
    image: docker:cli
    command: "/bin/sh -c 'docker build ./spark -t radicalbit-spark-py:develop && docker save radicalbit-spark-py:develop -o /images/radicalbit-spark-py:develop.tar'"
    environment:
      DOCKER_TLS_CERTDIR: /certs
      DOCKER_HOST: tcp://dind:2376
    depends_on:
      dind:
        condition: service_healthy
    volumes:
      - docker-certs-client:/certs/client:ro
      - ./spark:/spark
      - ./docker/k3s_data/images:/images

  k9s:
    profiles: ["k9s"]
    image: quay.io/derailed/k9s:latest
@@ -174,6 +208,8 @@ volumes:
  k3s-server: {}
  radicalbit-data: {}
  minio_storage: {}
  docker-certs-ca: {}
  docker-certs-client: {}
networks:
  default:
    ipam:
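Taken together, these services automate the manual workflow that the removed README below used to describe: dind provides an isolated Docker daemon, docker-client builds `radicalbit-spark-py:develop` from `./spark` and saves it as a tar into `./docker/k3s_data/images`, and, per that README, the k3s cluster loads saved images from there when it starts. A minimal development-loop sketch, under the assumption that the k3s service in this compose file is named `k3s`:

```bash
# Rebuild the Spark image after changing code under ./spark; docker-client
# builds against the dind daemon and exits once the tar has been re-saved.
docker compose up --force-recreate docker-client

# k3s only imports saved image tars when it starts, so restart it to pick up
# the refreshed radicalbit-spark-py:develop image (service name assumed).
docker compose restart k3s
```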
22 changes: 0 additions & 22 deletions spark/README.md
@@ -10,28 +10,6 @@ This is a poetry project that can be used to develop and test the jobs before pu

To create an additional job, add a `.py` file in the `jobs` folder (take `reference_job.py` as an example of the boilerplate) and write unit tests

### End-to-end testing

Before publishing the image, it is possible to test the platform with new developments or improvements made to the Spark image.

From this project folder, run

```bash
docker build . -t radicalbit-spark-py:develop && docker save radicalbit-spark-py:develop -o ../docker/k3s_data/images/radicalbit-spark-py:develop.tar
```

This will build and save the new image in `/docker/k3s_data/images/`.

To use this image in the Radicalbit Platform, the docker compose must be modified by adding the following environment variable to the `api` container:

```
SPARK_IMAGE: "radicalbit-spark-py:develop"
```

When the k3s cluster inside the docker compose starts, it will automatically load the saved image, which can be used to test the code during development.

NB: when a new image is built and saved, the k3s container must be restarted

#### Formatting and linting

