
Add docker release to the full release process for final releases #1004

Merged

30 commits merged into main from config/docker-release on May 21, 2024

Changes from 18 commits

Commits (30)
3186972
add docker release to release pipeline
mikealfare Apr 5, 2024
446bb0c
changelog
mikealfare Apr 5, 2024
43e100c
Merge branch 'refs/heads/main' into config/docker-release
mikealfare Apr 13, 2024
548f7df
update docker release to align with other adapters, add dev docker
mikealfare Apr 13, 2024
8e31cc4
update docker release to align with other adapters, add dev docker
mikealfare Apr 13, 2024
e9420b5
remove defaulted input for docker package, override default for docke…
mikealfare Apr 15, 2024
acdc453
point back to main
mikealfare Apr 15, 2024
fb197f0
remove changie entry
mikealfare Apr 15, 2024
fc7de15
fix docker release dependent steps
mikealfare Apr 16, 2024
d9d27b0
only release docker when not testing, allow to only release to docker
emmyoop Apr 17, 2024
c685f9d
Merge branch 'refs/heads/main' into config/docker-release
mikealfare May 2, 2024
a82838a
remove dev container
mikealfare May 2, 2024
c1f7359
clean up test script
mikealfare May 2, 2024
1d9fe5d
Update docker-release/Dockerfile
mikealfare May 2, 2024
6c544c6
rename the spark Dockerfile to make space for the release Dockerfile
mikealfare May 2, 2024
52ad6cb
move the release Dockerfile into ./docker
mikealfare May 2, 2024
5f3e52d
move the release Dockerfile into ./docker
mikealfare May 2, 2024
2326b2d
move the release Dockerfile into ./docker
mikealfare May 2, 2024
b87999b
move the release Dockerfile into ./docker
mikealfare May 2, 2024
85c9d9f
Merge branch 'main' into config/docker-release
mikealfare May 3, 2024
28b8aff
Merge branch 'main' into config/docker-release
mikealfare May 7, 2024
d49184f
Merge branch 'main' into config/docker-release
mikealfare May 13, 2024
78b9a36
Merge branch 'main' into config/docker-release
mikealfare May 14, 2024
0680722
Merge branch 'main' into config/docker-release
mikealfare May 14, 2024
dcdab2d
Merge branch 'main' into config/docker-release
mikealfare May 14, 2024
29011fd
Merge branch 'main' into config/docker-release
mikealfare May 14, 2024
4674f9f
point to dev branch for now
mikealfare May 14, 2024
74f8c78
Merge remote-tracking branch 'origin/config/docker-release' into conf…
mikealfare May 14, 2024
c7731c3
point back to main
mikealfare May 15, 2024
562bebd
remove unused script
mikealfare May 20, 2024
5 changes: 5 additions & 0 deletions .github/dependabot.yml
@@ -15,3 +15,8 @@ updates:
schedule:
interval: "weekly"
rebase-strategy: "disabled"
- package-ecosystem: "docker"
directory: "/docker-dev"
schedule:
interval: "weekly"
rebase-strategy: "disabled"
64 changes: 33 additions & 31 deletions .github/workflows/release.yml
@@ -13,8 +13,8 @@
# This will only run manually. Run this workflow only after the
# version bump workflow is completed and related changes are reviewed and merged.
#

name: Release to GitHub and PyPI
name: "Release to GitHub, PyPI, and Docker"
run-name: "Release ${{ inputs.version_number }} to GitHub, PyPI, and Docker"

on:
workflow_dispatch:
@@ -56,6 +56,11 @@ on:
type: boolean
default: true
required: false
only_docker:
description: "Only release Docker image, skip GitHub & PyPI"
type: boolean
default: false
required: false
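The new `only_docker` input is exercised via `workflow_dispatch`. As a minimal sketch (assuming the GitHub CLI `gh` is installed and authenticated against this repository, and using an illustrative version number), a Docker-only release could be dispatched like this:

```shell
# Hypothetical helper, not part of the PR: dispatch a Docker-only release.
# The input names match the workflow above; gh and repo access are assumed.
run_docker_only_release() {
  gh workflow run release.yml \
    -f version_number="$1" \
    -f only_docker=true
}

# Example invocation (not executed here): run_docker_only_release 1.9.0
```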

permissions:
contents: write # this is the permission that allows creating a new release
@@ -66,7 +71,7 @@ defaults:

jobs:
log-inputs:
name: Log Inputs
name: "Log Inputs"
runs-on: ubuntu-latest
steps:
- name: "[DEBUG] Print Variables"
@@ -79,6 +84,7 @@ jobs:
echo AWS S3 bucket name: ${{ inputs.s3_bucket_name }}
echo Package test command: ${{ inputs.package_test_command }}
echo Test run: ${{ inputs.test_run }}
echo Only Docker: ${{ inputs.only_docker }}

# The Spark repository uses CircleCI to run integration tests.
# Because of this, the process of version bumps will be manual
@@ -87,40 +93,32 @@
# We are passing `env_setup_script_path` as an empty string
# so that the integration tests stage will be skipped.
audit-version-and-changelog:
name: Bump package version, Generate changelog

name: "Bump package version, Generate changelog"
uses: dbt-labs/dbt-spark/.github/workflows/release-prep.yml@main

with:
sha: ${{ inputs.sha }}
version_number: ${{ inputs.version_number }}
target_branch: ${{ inputs.target_branch }}
env_setup_script_path: ""
test_run: ${{ inputs.test_run }}

secrets: inherit

log-outputs-audit-version-and-changelog:
name: "[Log output] Bump package version, Generate changelog"
if: ${{ !failure() && !cancelled() }}

if: ${{ !failure() && !cancelled() && !inputs.only_docker }}
needs: [audit-version-and-changelog]

runs-on: ubuntu-latest

steps:
- name: Print variables
run: |
echo Final SHA : ${{ needs.audit-version-and-changelog.outputs.final_sha }}
echo Changelog path: ${{ needs.audit-version-and-changelog.outputs.changelog_path }}

build-test-package:
name: Build, Test, Package
if: ${{ !failure() && !cancelled() }}
name: "Build, Test, Package"
if: ${{ !failure() && !cancelled() && !inputs.only_docker }}
needs: [audit-version-and-changelog]

uses: dbt-labs/dbt-release/.github/workflows/build.yml@main

with:
sha: ${{ needs.audit-version-and-changelog.outputs.final_sha }}
version_number: ${{ inputs.version_number }}
@@ -129,55 +127,59 @@ jobs:
s3_bucket_name: ${{ inputs.s3_bucket_name }}
package_test_command: ${{ inputs.package_test_command }}
test_run: ${{ inputs.test_run }}

secrets:
AWS_ACCESS_KEY_ID: ${{ secrets.AWS_ACCESS_KEY_ID }}
AWS_SECRET_ACCESS_KEY: ${{ secrets.AWS_SECRET_ACCESS_KEY }}

github-release:
name: GitHub Release
if: ${{ !failure() && !cancelled() }}

name: "GitHub Release"
if: ${{ !failure() && !cancelled() && !inputs.only_docker }}
needs: [audit-version-and-changelog, build-test-package]

uses: dbt-labs/dbt-release/.github/workflows/github-release.yml@main

with:
sha: ${{ needs.audit-version-and-changelog.outputs.final_sha }}
version_number: ${{ inputs.version_number }}
changelog_path: ${{ needs.audit-version-and-changelog.outputs.changelog_path }}
test_run: ${{ inputs.test_run }}

pypi-release:
name: PyPI Release

name: "PyPI Release"
if: ${{ !failure() && !cancelled() && !inputs.only_docker }}
needs: [github-release]

uses: dbt-labs/dbt-release/.github/workflows/pypi-release.yml@main

with:
version_number: ${{ inputs.version_number }}
test_run: ${{ inputs.test_run }}

secrets:
PYPI_API_TOKEN: ${{ secrets.PYPI_API_TOKEN }}
TEST_PYPI_API_TOKEN: ${{ secrets.TEST_PYPI_API_TOKEN }}

docker-release:
name: "Docker Release"
# We cannot release to docker on a test run because it uses the tag in GitHub as
# what we need to release but draft releases don't actually tag the commit so it
# finds nothing to release
if: ${{ !failure() && !cancelled() && (!inputs.test_run || inputs.only_docker) }}
needs: [github-release]
permissions:
packages: write
uses: dbt-labs/dbt-release/.github/workflows/release-docker.yml@main
with:
version_number: ${{ inputs.version_number }}
dockerfile: "docker-release/Dockerfile"
test_run: ${{ inputs.test_run }}

slack-notification:
name: Slack Notification
if: ${{ failure() && (!inputs.test_run || inputs.nightly_release) }}

needs:
[
audit-version-and-changelog,
build-test-package,
github-release,
pypi-release,
docker-release,
]

uses: dbt-labs/dbt-release/.github/workflows/slack-post-notification.yml@main
with:
status: "failure"

secrets:
SLACK_WEBHOOK_URL: ${{ secrets.SLACK_DEV_CORE_ALERTS }}
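The interaction of `test_run` and `only_docker` across the jobs above can be sketched in plain shell. This is an illustrative reimplementation of the `if:` expressions, not part of the PR, and it ignores the shared `!failure() && !cancelled()` guard:

```shell
# Gating logic from the workflow:
#   github/pypi jobs: run unless inputs.only_docker
#   docker-release:   run when !inputs.test_run || inputs.only_docker
gate() {
  test_run="$1"; only_docker="$2"
  if [ "$only_docker" = "true" ]; then
    echo "github/pypi jobs: skipped"
  else
    echo "github/pypi jobs: run"
  fi
  if [ "$test_run" = "false" ] || [ "$only_docker" = "true" ]; then
    echo "docker-release: runs"
  else
    echo "docker-release: skipped"
  fi
}

# A test run cannot release to Docker: draft releases do not tag the commit.
gate true false
```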
4 changes: 4 additions & 0 deletions Makefile
@@ -61,3 +61,7 @@ help: ## Show this help message.
@echo
@echo 'targets:'
@grep -E '^[a-zA-Z_-]+:.*?## .*$$' $(MAKEFILE_LIST) | awk 'BEGIN {FS = ":.*?## "}; {printf "\033[36m%-30s\033[0m %s\n", $$1, $$2}'

.PHONY: docker-prod
docker-prod:
docker build -f docker/Dockerfile -t dbt-spark .
4 changes: 3 additions & 1 deletion docker-compose.yml
@@ -2,7 +2,9 @@ version: "3.7"
services:

dbt-spark3-thrift:
build: docker/
build:
context: ./docker
dockerfile: spark.Dockerfile
ports:
- "10000:10000"
- "4040:4040"
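The compose change above splits `build:` into an explicit context and Dockerfile name. As a small illustrative sketch (not part of the PR), the equivalent CLI invocation can be composed from those two values:

```shell
# Mirror the compose file's build settings; echo the equivalent command
# rather than invoking Docker here.
context="./docker"
dockerfile="spark.Dockerfile"
echo "equivalent: docker build -f ${context}/${dockerfile} ${context}"
```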
72 changes: 42 additions & 30 deletions docker/Dockerfile
@@ -1,30 +1,42 @@
ARG OPENJDK_VERSION=8
Comment from mikealfare (Contributor, Author): All of this moved whole cloth to spark.Dockerfile.

FROM eclipse-temurin:${OPENJDK_VERSION}-jre

ARG BUILD_DATE
ARG SPARK_VERSION=3.3.2
ARG HADOOP_VERSION=3

LABEL org.label-schema.name="Apache Spark ${SPARK_VERSION}" \
org.label-schema.build-date=$BUILD_DATE \
org.label-schema.version=$SPARK_VERSION

ENV SPARK_HOME /usr/spark
ENV PATH="/usr/spark/bin:/usr/spark/sbin:${PATH}"

RUN apt-get update && \
apt-get install -y wget netcat procps libpostgresql-jdbc-java && \
wget -q "http://archive.apache.org/dist/spark/spark-${SPARK_VERSION}/spark-${SPARK_VERSION}-bin-hadoop${HADOOP_VERSION}.tgz" && \
tar xzf "spark-${SPARK_VERSION}-bin-hadoop${HADOOP_VERSION}.tgz" && \
rm "spark-${SPARK_VERSION}-bin-hadoop${HADOOP_VERSION}.tgz" && \
mv "spark-${SPARK_VERSION}-bin-hadoop${HADOOP_VERSION}" /usr/spark && \
ln -s /usr/share/java/postgresql-jdbc4.jar /usr/spark/jars/postgresql-jdbc4.jar && \
apt-get remove -y wget && \
apt-get autoremove -y && \
apt-get clean

COPY entrypoint.sh /scripts/
RUN chmod +x /scripts/entrypoint.sh

ENTRYPOINT ["/scripts/entrypoint.sh"]
CMD ["--help"]
# this image gets published to GHCR for production use
ARG py_version=3.11.2

FROM python:$py_version-slim-bullseye as base

RUN apt-get update \
&& apt-get dist-upgrade -y \
&& apt-get install -y --no-install-recommends \
build-essential=12.9 \
ca-certificates=20210119 \
gcc=4:10.2.1-1 \
git=1:2.30.2-1+deb11u2 \
libpq-dev=13.14-0+deb11u1 \
libsasl2-dev=2.1.27+dfsg-2.1+deb11u1 \
make=4.3-4.1 \
openssh-client=1:8.4p1-5+deb11u3 \
python-dev-is-python2=2.7.18-9 \
software-properties-common=0.96.20.2-2.1 \
unixodbc-dev=2.3.6-0.1+b1 \
&& apt-get clean \
&& rm -rf \
/var/lib/apt/lists/* \
/tmp/* \
/var/tmp/*

ENV PYTHONIOENCODING=utf-8
ENV LANG=C.UTF-8

RUN python -m pip install --upgrade "pip==24.0" "setuptools==69.2.0" "wheel==0.43.0" --no-cache-dir


FROM base as dbt-spark

[Wiz IaC Scanner check failure on line 32 in docker/Dockerfile]
Missing User Instruction (Severity: High): the Dockerfile contains no 'USER' instruction, so the image will run as root.

ARG commit_ref=main
ARG extras=all

HEALTHCHECK CMD dbt --version || exit 1

WORKDIR /usr/app/dbt/
ENTRYPOINT ["dbt"]

RUN python -m pip install --no-cache-dir "dbt-spark[${extras}] @ git+https://github.com/dbt-labs/dbt-spark@${commit_ref}"

[Wiz IaC Scanner check warning on line 42 in docker/Dockerfile]
Unpinned Package Version in Pip Install (Severity: Medium): the instruction `python -m pip install --no-cache-dir "dbt-spark[${extras}] @ git+https://github.com/dbt-labs/dbt-spark@${commit_ref}"` does not pin the installed package to a version.
70 changes: 70 additions & 0 deletions docker/README.md
@@ -0,0 +1,70 @@
# Docker for dbt
`Dockerfile` is suitable for building dbt Docker images locally or for use with CI/CD to automate populating a container registry.

## Building an image:
This Dockerfile can create images for the following target: `dbt-spark`

In order to build a new image, run the following docker command.
```shell
docker build --tag <your_image_name> --target dbt-spark <path/to/dockerfile>
```
---
> **Note:** Docker must be configured to use [BuildKit](https://docs.docker.com/develop/develop-images/build_enhancements/) in order for images to build properly!

---

By default the image will be populated with the latest version of `dbt-spark` on `main`.
If you need to use a different version you can specify it by git ref using the `--build-arg` flag:
```shell
docker build --tag <your_image_name> \
--target dbt-spark \
--build-arg commit_ref=<commit_ref> \
<path/to/dockerfile>
```

### Examples:
To build an image named "my-dbt" that supports Spark using the latest code on `main`:
```shell
cd dbt-spark/docker
docker build --tag my-dbt --target dbt-spark .
```

To build an image named "my-other-dbt" that supports Spark using the adapter version 1.0.0b1:
```shell
cd dbt-spark/docker
docker build \
--tag my-other-dbt \
--target dbt-spark \
--build-arg commit_ref=v1.0.0b1 \
.
```

## Special cases
There are a few special cases worth noting:
* The `dbt-spark` adapter can be installed with one of three extras: `PyHive`, `ODBC`, or the default `all`.
If you wish to override the default, use the `--build-arg` flag with the value `extras=<extras_name>`.
See the [docs](https://docs.getdbt.com/reference/warehouse-profiles/spark-profile) for more information.
```shell
docker build --tag my_dbt \
--target dbt-spark \
--build-arg commit_ref=v1.0.0b1 \
--build-arg extras=PyHive \
<path/to/dockerfile>
```

## Running an image in a container:
The `ENTRYPOINT` for this Dockerfile is the command `dbt` so you can bind-mount your project to `/usr/app` and use dbt as normal:
```shell
docker run \
--network=host \
--mount type=bind,source=path/to/project,target=/usr/app \
--mount type=bind,source=path/to/profiles.yml,target=/root/.dbt/profiles.yml \
my-dbt \
ls
```
---
**Notes:**
* Bind-mount sources _must_ be an absolute path
* You may need to make adjustments to the docker networking setting depending on the specifics of your data warehouse/database host.

---
30 changes: 30 additions & 0 deletions docker/spark.Dockerfile
@@ -0,0 +1,30 @@
ARG OPENJDK_VERSION=8
FROM eclipse-temurin:${OPENJDK_VERSION}-jre

[Wiz IaC Scanner checks on line 2 in docker/spark.Dockerfile]
- Failure, Missing User Instruction (Severity: High): the Dockerfile contains no 'USER' instruction, so the image will run as root.
- Warning, Apt Get Install Pin Version Not Defined (Severity: Medium): the packages 'wget', 'netcat', 'procps', and 'libpostgresql-jdbc-java' are installed without pinned versions.
- Notice, Healthcheck Instruction Missing (Severity: Low): the Dockerfile contains no 'HEALTHCHECK' instruction.

ARG BUILD_DATE
ARG SPARK_VERSION=3.3.2
ARG HADOOP_VERSION=3

LABEL org.label-schema.name="Apache Spark ${SPARK_VERSION}" \
org.label-schema.build-date=$BUILD_DATE \
org.label-schema.version=$SPARK_VERSION

ENV SPARK_HOME /usr/spark
ENV PATH="/usr/spark/bin:/usr/spark/sbin:${PATH}"

[Wiz IaC Scanner check notice on line 15 in docker/spark.Dockerfile: APT-GET Not Avoiding Additional Packages (Severity: None): the apt-get install does not use the '--no-install-recommends' flag.]

RUN apt-get update && \
apt-get install -y wget netcat procps libpostgresql-jdbc-java && \
wget -q "http://archive.apache.org/dist/spark/spark-${SPARK_VERSION}/spark-${SPARK_VERSION}-bin-hadoop${HADOOP_VERSION}.tgz" && \
tar xzf "spark-${SPARK_VERSION}-bin-hadoop${HADOOP_VERSION}.tgz" && \
rm "spark-${SPARK_VERSION}-bin-hadoop${HADOOP_VERSION}.tgz" && \
mv "spark-${SPARK_VERSION}-bin-hadoop${HADOOP_VERSION}" /usr/spark && \
ln -s /usr/share/java/postgresql-jdbc4.jar /usr/spark/jars/postgresql-jdbc4.jar && \
apt-get remove -y wget && \
apt-get autoremove -y && \
apt-get clean

COPY entrypoint.sh /scripts/
RUN chmod +x /scripts/entrypoint.sh

ENTRYPOINT ["/scripts/entrypoint.sh"]
CMD ["--help"]
19 changes: 19 additions & 0 deletions docker/test.sh
@@ -0,0 +1,19 @@
# - VERY rudimentary test script to run latest + specific branch image builds and test them all by running `--version`
mikealfare marked this conversation as resolved.
# TODO: create a real test suite
set -e

printf "\n\n"
echo "####################################"
echo "##### Testing dbt-spark latest #####"
echo "####################################"

docker build --tag dbt-spark --target dbt-spark docker
docker run dbt-spark --version

printf "\n\n"
echo "#####################################"
echo "##### Testing dbt-spark-1.0.0b1 #####"
echo "#####################################"

docker build --tag dbt-spark-1.0.0b1 --target dbt-spark --build-arg commit_ref=v1.0.0b1 docker
docker run dbt-spark-1.0.0b1 --version
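A fuller suite (the TODO above) might assert on the version output rather than relying only on exit codes. A rough sketch, using a hypothetical sample of `dbt --version` output:

```shell
# Hypothetical sample output; a real test would capture the output of
# `docker run dbt-spark --version` instead of hard-coding it.
version_output="Core:
  - installed: 1.8.0"
case "$version_output" in
  *"installed:"*) echo "version check passed" ;;
  *)              echo "version check failed" ;;
esac
```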