
Add docker release to the full release process for final releases #1004

Merged · 30 commits · May 21, 2024
Latest commit: 562bebd ("remove unused script")

Wiz Inc. (266a8a9c32) / Wiz IaC Scanner completed May 20, 2024 in 2s

🔮 IaC Misconfigurations Detected: 9

0 Critical · 2 High · 5 Medium · 1 Low · 1 Informational

Annotations

Check failure on line 32 in docker/Dockerfile

Missing User Instruction

Rule ID: e54afcf9-dc71-484a-8967-d930e3044062
Severity: High
Resource: FROM={{base as dbt-spark}}

A user should be specified in the Dockerfile; otherwise the image will run as root
Raw output
Expected: The 'Dockerfile' should contain the 'USER' instruction
Found: The 'Dockerfile' does not contain any 'USER' instruction
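
A minimal sketch of the change this rule asks for, assuming the dbt-spark build stage is Debian-based so groupadd/useradd are available; the "dbt" user and group names are illustrative, not taken from the PR:

    FROM base as dbt-spark
    # ...existing install steps continue to run as root...
    # Create an unprivileged user and switch to it for the final image (names are illustrative).
    RUN groupadd --system dbt && useradd --system --gid dbt --create-home dbt
    USER dbt

Any ENTRYPOINT or CMD defined after the USER line then runs unprivileged.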

Check failure on line 2 in docker/spark.Dockerfile

Missing User Instruction

Rule ID: e54afcf9-dc71-484a-8967-d930e3044062
Severity: High
Resource: FROM={{eclipse-temurin:${OPENJDK_VERSION}-jre}}

A user should be specified in the Dockerfile; otherwise the image will run as root
Raw output
Expected: The 'Dockerfile' should contain the 'USER' instruction
Found: The 'Dockerfile' does not contain any 'USER' instruction
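
The same fix applies to the Spark image; a sketch assuming the eclipse-temurin JRE base is Debian-based, with an illustrative "spark" user that is given ownership of the Spark install:

    FROM eclipse-temurin:${OPENJDK_VERSION}-jre
    # ...existing apt-get and Spark download steps run as root...
    # Illustrative unprivileged user; chown so Spark can write under its own install directory.
    RUN useradd --system --create-home spark && chown -R spark /usr/spark
    USER spark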

Check warning on line 42 in docker/Dockerfile

Unpinned Package Version in Pip Install

Rule ID: 1f0d05d7-8caf-4f04-bc60-332d472de5a9
Severity: Medium
Resource: FROM={{base as dbt-spark}}.{{RUN python -m pip install --no-cache-dir "dbt-spark[${extras}] @ git+https://github.com/dbt-labs/dbt-spark@${commit_ref}"}}

Package version pinning reduces the range of versions that can be installed, reducing the chances of failure due to unanticipated changes
Raw output
Expected: RUN instruction with 'pip/pip3 install <package>' should use package pinning form 'pip/pip3 install <package>=<version>'
Found: RUN instruction python -m pip install --no-cache-dir "dbt-spark[all] @ git+https://github.com/dbt-labs/dbt-spark@main" does not use package pinning form
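
A sketch of one way to address this, keeping the build args from the flagged RUN line; the tag value is a placeholder for whatever release tag or commit SHA the build actually targets:

    # Placeholder default: pin commit_ref to a release tag or commit SHA instead of a branch name.
    ARG commit_ref=v1.8.0
    ARG extras=all
    RUN python -m pip install --no-cache-dir "dbt-spark[${extras}] @ git+https://github.com/dbt-labs/dbt-spark@${commit_ref}"

The literal form the check looks for is a PyPI pin such as pip install "dbt-spark[all]==<version>", which satisfies the rule in the same way.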

Check warning on line 2 in docker/spark.Dockerfile

Apt Get Install Pin Version Not Defined

Rule ID: 8dabde7b-ee7e-440a-8b59-73636b0cfda5
Severity: Medium
Resource: FROM={{eclipse-temurin:${OPENJDK_VERSION}-jre}}.RUN={{apt-get update &&     apt-get install -y wget netcat procps libpostgresql-jdbc-java &&     wget -q "http://archive.apache.org/dist/spark/spark-3.3.2/spark-${SPARK_VERSION}-bin-hadoop3.tgz" &&     tar xzf "spark-${SPARK_VERSION}-bin-hadoop${HADOOP_VERSION}.tgz" &&     rm "spark-${SPARK_VERSION}-bin-hadoop${HADOOP_VERSION}.tgz" &&     mv "spark-${SPARK_VERSION}-bin-hadoop${HADOOP_VERSION}" /usr/spark &&     ln -s /usr/share/java/postgresql-jdbc4.jar /usr/spark/jars/postgresql-jdbc4.jar &&     apt-get remove -y wget &&     apt-get autoremove -y &&     apt-get clean}}

When installing a package, its pin version should be defined
Raw output
Expected: Package 'libpostgresql-jdbc-java' has version defined
Found: Package 'libpostgresql-jdbc-java' does not have version defined

Check warning on line 2 in docker/spark.Dockerfile

Apt Get Install Pin Version Not Defined

Rule ID: 8dabde7b-ee7e-440a-8b59-73636b0cfda5
Severity: Medium
Resource: FROM={{eclipse-temurin:${OPENJDK_VERSION}-jre}}.RUN={{apt-get update &&     apt-get install -y wget netcat procps libpostgresql-jdbc-java &&     wget -q "http://archive.apache.org/dist/spark/spark-3.3.2/spark-${SPARK_VERSION}-bin-hadoop3.tgz" &&     tar xzf "spark-${SPARK_VERSION}-bin-hadoop${HADOOP_VERSION}.tgz" &&     rm "spark-${SPARK_VERSION}-bin-hadoop${HADOOP_VERSION}.tgz" &&     mv "spark-${SPARK_VERSION}-bin-hadoop${HADOOP_VERSION}" /usr/spark &&     ln -s /usr/share/java/postgresql-jdbc4.jar /usr/spark/jars/postgresql-jdbc4.jar &&     apt-get remove -y wget &&     apt-get autoremove -y &&     apt-get clean}}

When installing a package, its pin version should be defined
Raw output
Expected: Package 'netcat' has version defined
Found: Package 'netcat' does not have version defined

Check warning on line 2 in docker/spark.Dockerfile

Apt Get Install Pin Version Not Defined

Rule ID: 8dabde7b-ee7e-440a-8b59-73636b0cfda5
Severity: Medium
Resource: FROM={{eclipse-temurin:${OPENJDK_VERSION}-jre}}.RUN={{apt-get update &&     apt-get install -y wget netcat procps libpostgresql-jdbc-java &&     wget -q "http://archive.apache.org/dist/spark/spark-3.3.2/spark-${SPARK_VERSION}-bin-hadoop3.tgz" &&     tar xzf "spark-${SPARK_VERSION}-bin-hadoop${HADOOP_VERSION}.tgz" &&     rm "spark-${SPARK_VERSION}-bin-hadoop${HADOOP_VERSION}.tgz" &&     mv "spark-${SPARK_VERSION}-bin-hadoop${HADOOP_VERSION}" /usr/spark &&     ln -s /usr/share/java/postgresql-jdbc4.jar /usr/spark/jars/postgresql-jdbc4.jar &&     apt-get remove -y wget &&     apt-get autoremove -y &&     apt-get clean}}

When installing a package, its pin version should be defined
Raw output
Expected: Package 'procps' has version defined
Found: Package 'procps' does not have version defined

Check warning on line 2 in docker/spark.Dockerfile

Apt Get Install Pin Version Not Defined

Rule ID: 8dabde7b-ee7e-440a-8b59-73636b0cfda5
Severity: Medium
Resource: FROM={{eclipse-temurin:${OPENJDK_VERSION}-jre}}.RUN={{apt-get update &&     apt-get install -y wget netcat procps libpostgresql-jdbc-java &&     wget -q "http://archive.apache.org/dist/spark/spark-3.3.2/spark-${SPARK_VERSION}-bin-hadoop3.tgz" &&     tar xzf "spark-${SPARK_VERSION}-bin-hadoop${HADOOP_VERSION}.tgz" &&     rm "spark-${SPARK_VERSION}-bin-hadoop${HADOOP_VERSION}.tgz" &&     mv "spark-${SPARK_VERSION}-bin-hadoop${HADOOP_VERSION}" /usr/spark &&     ln -s /usr/share/java/postgresql-jdbc4.jar /usr/spark/jars/postgresql-jdbc4.jar &&     apt-get remove -y wget &&     apt-get autoremove -y &&     apt-get clean}}

When installing a package, its pin version should be defined
Raw output
Expected: Package 'wget' has version defined
Found: Package 'wget' does not have version defined
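
One pinned form covers all four packages flagged above (wget, netcat, procps, libpostgresql-jdbc-java), shown here as a sketch only; the version strings are placeholders, since the real ones depend on the Debian/Ubuntu release underneath the eclipse-temurin base and should be read from apt-cache policy <package>:

    # Placeholder versions below; substitute what `apt-cache policy` reports for this base image.
    RUN apt-get update && \
        apt-get install -y \
            wget=1.21.3-1+deb12u1 \
            netcat=1.10-47 \
            procps=2:4.0.2-3 \
            libpostgresql-jdbc-java=42.5.4-1
    # ...the rest of the original RUN chain (download, untar, symlink, cleanup) stays unchanged...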

Check notice on line 2 in docker/spark.Dockerfile

Healthcheck Instruction Missing

Rule ID: db295f99-0fff-4e7b-9906-ec2a057f384b
Severity: Low
Resource: FROM={{eclipse-temurin:${OPENJDK_VERSION}-jre}}

Ensure that HEALTHCHECK is being used. The HEALTHCHECK instruction tells Docker how to test a container to check that it is still working
Raw output
Expected: Dockerfile should contain instruction 'HEALTHCHECK'
Found: Dockerfile doesn't contain instruction 'HEALTHCHECK'
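
A minimal HEALTHCHECK sketch; it assumes the container exposes the Spark master web UI on port 8080 inside the container (an assumption, not confirmed by the PR) and reuses the netcat binary this Dockerfile already installs:

    # Assumes a Spark master UI listening on port 8080 inside the container.
    HEALTHCHECK --interval=30s --timeout=5s --start-period=30s --retries=3 \
        CMD nc -z localhost 8080 || exit 1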

Check notice on line 15 in docker/spark.Dockerfile

APT-GET Not Avoiding Additional Packages

Rule ID: 0cbafd91-7f35-4000-b40a-bebedb7bb5f8
Severity: None
Resource: FROM={{eclipse-temurin:${OPENJDK_VERSION}-jre}}.{{RUN apt-get update &&     apt-get install -y wget netcat procps libpostgresql-jdbc-java &&     wget -q "http://archive.apache.org/dist/spark/spark-${SPARK_VERSION}/spark-${SPARK_VERSION}-bin-hadoop${HADOOP_VERSION}.tgz" &&     tar xzf "spark-${SPARK_VERSION}-bin-hadoop${HADOOP_VERSION}.tgz" &&     rm "spark-${SPARK_VERSION}-bin-hadoop${HADOOP_VERSION}.tgz" &&     mv "spark-${SPARK_VERSION}-bin-hadoop${HADOOP_VERSION}" /usr/spark &&     ln -s /usr/share/java/postgresql-jdbc4.jar /usr/spark/jars/postgresql-jdbc4.jar &&     apt-get remove -y wget &&     apt-get autoremove -y &&     apt-get clean}}

Check if any apt-get installs don't use '--no-install-recommends' flag to avoid installing additional packages.
Raw output
Expected: 'RUN apt-get update &&     apt-get install -y wget netcat procps libpostgresql-jdbc-java &&     wget -q "http://archive.apache.org/dist/spark/spark-${SPARK_VERSION}/spark-${SPARK_VERSION}-bin-hadoop${HADOOP_VERSION}.tgz" &&     tar xzf "spark-${SPARK_VERSION}-bin-hadoop${HADOOP_VERSION}.tgz" &&     rm "spark-${SPARK_VERSION}-bin-hadoop${HADOOP_VERSION}.tgz" &&     mv "spark-${SPARK_VERSION}-bin-hadoop${HADOOP_VERSION}" /usr/spark &&     ln -s /usr/share/java/postgresql-jdbc4.jar /usr/spark/jars/postgresql-jdbc4.jar &&     apt-get remove -y wget &&     apt-get autoremove -y &&     apt-get clean' uses '--no-install-recommends' flag to avoid installing additional packages
Found: 'RUN apt-get update &&     apt-get install -y wget netcat procps libpostgresql-jdbc-java &&     wget -q "http://archive.apache.org/dist/spark/spark-${SPARK_VERSION}/spark-${SPARK_VERSION}-bin-hadoop${HADOOP_VERSION}.tgz" &&     tar xzf "spark-${SPARK_VERSION}-bin-hadoop${HADOOP_VERSION}.tgz" &&     rm "spark-${SPARK_VERSION}-bin-hadoop${HADOOP_VERSION}.tgz" &&     mv "spark-${SPARK_VERSION}-bin-hadoop${HADOOP_VERSION}" /usr/spark &&     ln -s /usr/share/java/postgresql-jdbc4.jar /usr/spark/jars/postgresql-jdbc4.jar &&     apt-get remove -y wget &&     apt-get autoremove -y &&     apt-get clean' does not use '--no-install-recommends' flag to avoid installing additional packages
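
Only the install step in the flagged RUN chain needs the flag; a sketch of just that fragment, with everything after it left as it is in the Dockerfile:

    # Adding --no-install-recommends keeps apt from pulling in recommended-but-unrequested packages.
    RUN apt-get update && \
        apt-get install -y --no-install-recommends wget netcat procps libpostgresql-jdbc-java
    # ...wget/untar/cleanup steps from the original RUN continue unchanged...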