compatible: Add integration_test_charm.yaml (#75)
Supports LXD; MicroK8s support is planned for the future
carlcsaposs-canonical authored Aug 7, 2023
1 parent d56b5b9 commit 2e84a21
Showing 20 changed files with 487 additions and 168 deletions.
6 changes: 6 additions & 0 deletions .github/actionlint.yaml
@@ -0,0 +1,6 @@
self-hosted-runner:
labels:
- large
- xlarge
- two-xlarge
- jammy
26 changes: 13 additions & 13 deletions .github/workflows/_get_workflow_version.yaml
@@ -54,18 +54,18 @@ jobs:
import os
import re
import subprocess
from pathlib import Path
import yaml
def get_versions_from_workflow_file(caller_workflow_ref_path: str) -> set:
"""
Get versions of reusable workflow used by caller workflow
Args:
caller_workflow_ref_path: Path to caller workflow with version
(e.g. "octocat/hello-world/.github/workflows/my-workflow.yml@refs/heads/my_branch")
Returns:
Set of reusable workflow versions used by caller workflow
"""
@@ -82,7 +82,7 @@ jobs:
and path[2].endswith((".yaml", ".yml"))
), f"Invalid {caller_workflow_ref_path=}"
caller_workflow_file_path = Path(repository_name) / Path("/".join(path))
# Checkout caller repository
try:
os.makedirs(repository_name)
@@ -101,7 +101,7 @@ jobs:
]
for command in commands:
subprocess.check_output(command.split(" "), cwd=repository_name)
jobs = yaml.safe_load(caller_workflow_file_path.read_text())["jobs"]
versions_ = set()
for job in jobs.values():
@@ -119,29 +119,29 @@ jobs:
call = f"{repository_name}/{call[2:]}@{ref}"
versions_.update(get_versions_from_workflow_file(call))
return versions_
# Example: "canonical/data-platform-workflows"
REPOSITORY = "${{ inputs.repository-name }}"
# Example: "build_charms_with_cache.yaml"
REUSABLE_WORKFLOW_FILE_NAME = "${{ inputs.file-name }}"
CALL_PATTERN = re.compile(
f"{REPOSITORY}/.github/workflows/{REUSABLE_WORKFLOW_FILE_NAME}@(.*)"
)
# Example: "octocat/hello-world/.github/workflows/my-workflow.yml@refs/heads/my_branch"
CALLER_WORKFLOW_REF_PATH = "${{ github.workflow_ref }}"
versions = get_versions_from_workflow_file(CALLER_WORKFLOW_REF_PATH)
assert (
len(versions) > 0
), f"`{REUSABLE_WORKFLOW_FILE_NAME}` workflow not found in caller workflow"
assert (
len(versions) == 1
), f"""Caller workflow uses `{REUSABLE_WORKFLOW_FILE_NAME}` workflow on multiple versions {versions}.
Multiple versions not supported."""
output = f"version={versions.pop()}"
print(output)
with open(os.environ["GITHUB_OUTPUT"], "a") as file:
2 changes: 1 addition & 1 deletion .github/workflows/build_charm_without_cache.yaml
@@ -67,7 +67,7 @@ jobs:
# Workaround for Docker & LXD on same machine
sudo iptables -F FORWARD
sudo iptables -P FORWARD ACCEPT
sudo snap install charmcraft --classic ${{ steps.charmcraft-snap-version.outputs.install_flag }}
pipx install tox
pipx install poetry
45 changes: 1 addition & 44 deletions .github/workflows/build_charms_with_cache.md
@@ -1,7 +1,6 @@
Workflow file: [build_charms_with_cache.yaml](build_charms_with_cache.yaml)

## Usage
### Step 1: Create your workflow
```yaml
# Copyright 2023 Canonical Ltd.
# See LICENSE file for licensing details.
@@ -11,52 +10,10 @@ jobs:
uses: canonical/data-platform-workflows/.github/workflows/build_charms_with_cache.yaml@v0.0.0
permissions:
actions: write # Needed to manage GitHub Actions cache

integration-test:
name: Integration tests
needs:
- build
steps:
- name: Checkout
- name: Download packed charm(s)
uses: actions/download-artifact@v0
with:
name: ${{ needs.build.outputs.artifact-name }}
- name: Run integration tests
run: tox run -e integration
```
If any workflows call your workflow (i.e. your workflow includes `on: workflow_call`), recursively add
```yaml
permissions:
actions: write # Needed to manage GitHub Actions cache
```
to every calling workflow job.

### Step 2: Install plugin for pytest-operator (Poetry)
#### Step A
Add
```toml
pytest-operator-cache = {git = "https://github.com/canonical/data-platform-workflows", tag = "v0.0.0", subdirectory = "python/pytest_plugins/pytest_operator_cache"}
```
to your integration test dependencies in `pyproject.toml`.

#### Step B
Disable Poetry's parallel installation for integration test dependencies.

Example `tox.ini`:
```ini
[testenv:integration]
set_env =
{[testenv]set_env}
# Workaround for https://github.com/python-poetry/poetry/issues/6958
POETRY_INSTALLER_PARALLEL = false
```

### Step 3: Pass the CI environment variable
If you're using tox, pass in the `CI` environment variable in `tox.ini`.
```ini
[testenv:integration]
pass_env =
{[testenv]pass_env}
CI
```
2 changes: 1 addition & 1 deletion .github/workflows/build_charms_with_cache.yaml
@@ -92,7 +92,7 @@ jobs:
# Workaround for Docker & LXD on same machine
sudo iptables -F FORWARD
sudo iptables -P FORWARD ACCEPT
sudo snap install charmcraft --classic ${{ steps.charmcraft-snap-version.outputs.install_flag }}
pipx install tox
pipx install poetry
4 changes: 2 additions & 2 deletions .github/workflows/build_snap_without_cache.yaml
@@ -13,7 +13,7 @@ on:
path-to-snap-project-directory:
description: |
Relative path to snap project directory from repository directory
The "snap project directory" is the directory that contains the `snap` directory, not the `snap` directory itself.
default: .
type: string
@@ -39,7 +39,7 @@ jobs:
# Workaround for Docker & LXD on same machine
sudo iptables -F FORWARD
sudo iptables -P FORWARD ACCEPT
sudo snap install snapcraft --classic
- name: Pack snap
id: pack
124 changes: 124 additions & 0 deletions .github/workflows/integration_test_charm.md
@@ -0,0 +1,124 @@
Workflow file: [integration_test_charm.yaml](integration_test_charm.yaml)

## Usage
### Step 1: Create your workflow
```yaml
# Copyright 2023 Canonical Ltd.
# See LICENSE file for licensing details.
jobs:
build:
name: Build charm
uses: canonical/data-platform-workflows/.github/workflows/build_charms_with_cache.yaml@v0.0.0
permissions:
actions: write # Needed to manage GitHub Actions cache

integration-test:
name: Integration test charm
needs:
- build
uses: canonical/data-platform-workflows/.github/workflows/integration_test_charm.yaml@v0.0.0
with:
artifact-name: ${{ needs.build.outputs.artifact-name }}
cloud: lxd
juju-agent-version: 0.0.0
```
### Step 2: Install plugins for pytest-operator (Poetry)
#### Step A
Add
```toml
pytest-operator-cache = {git = "https://github.com/canonical/data-platform-workflows", tag = "v0.0.0", subdirectory = "python/pytest_plugins/pytest_operator_cache"}
pytest-operator-groups = {git = "https://github.com/canonical/data-platform-workflows", tag = "v0.0.0", subdirectory = "python/pytest_plugins/pytest_operator_groups"}
```
to your integration test dependencies in `pyproject.toml`.

#### Step B
Disable Poetry's parallel installation for integration test dependencies.

Example `tox.ini`:
```ini
[testenv:integration]
set_env =
{[testenv]set_env}
# Workaround for https://github.com/python-poetry/poetry/issues/6958
POETRY_INSTALLER_PARALLEL = false
```

#### Step C
If you're using tox, pass in the `CI` and `GITHUB_OUTPUT` environment variables in `tox.ini`.
```ini
[testenv:integration]
pass_env =
{[testenv]pass_env}
CI
GITHUB_OUTPUT
```

### Step 3: Split test functions into groups
Groups allow the tests in a file (Python module) to be split across parallel GitHub runners. Each group in each file gets its own runner.

Add
```python
@pytest.mark.group(1)
```
to every test function. Replace `1` with the group number.
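A minimal sketch with two illustrative test functions (the test names are hypothetical; the `group` mark is registered by the `pytest-operator-groups` plugin, and pytest attaches the mark to the function even before the plugin processes it):

```python
import pytest


@pytest.mark.group(1)
def test_deploy():
    """Runs on the runner assigned to group 1."""


@pytest.mark.group(2)
def test_backup_and_restore():
    """Runs on a separate runner assigned to group 2."""
```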

#### Deciding how to split tests into groups
Take a look at this Discourse post: https://discourse.charmhub.io/t/faster-ci-results-by-running-integration-tests-in-parallel/8816

### (Optional) Step 4: Add secrets
#### Step A
Pass in a string representation of a Python `dict[str, str]` built from multiple GitHub secrets.

Do **not** put the string into a single GitHub secret. Build the string from multiple GitHub secrets so that GitHub is more likely to redact each secret in GitHub Actions logs.
```yaml
jobs:
# ...
integration-test:
# ...
uses: canonical/data-platform-workflows/.github/workflows/integration_test_charm.yaml@v0.0.0
with:
# ...
secrets:
integration-test: |
{
"AWS_ACCESS_KEY_ID": "${{ secrets.AWS_ACCESS_KEY_ID }}",
"AWS_SECRET_ACCESS_KEY": "${{ secrets.AWS_SECRET_ACCESS_KEY }}",
}
```
Python code to verify the string format:
```python
import ast

# Replace the example below with the exact string you pass to the workflow
secrets = ast.literal_eval(
    '{"AWS_ACCESS_KEY_ID": "example", "AWS_SECRET_ACCESS_KEY": "example"}'
)
assert isinstance(secrets, dict)
for key, value in secrets.items():
assert isinstance(key, str) and isinstance(value, str)
```
#### Step B (Poetry)
Add
```toml
pytest-github-secrets = {git = "https://github.com/canonical/data-platform-workflows", tag = "v0.0.0", subdirectory = "python/pytest_plugins/github_secrets"}
```
to your integration test dependencies in `pyproject.toml`.

#### Step C
If you're using tox, pass in the `SECRETS_FROM_GITHUB` environment variable in `tox.ini`.
```ini
[testenv:integration]
pass_env =
{[testenv]pass_env}
# ...
SECRETS_FROM_GITHUB
```

#### Step D
Access the secrets from the `github_secrets` [pytest fixture](https://docs.pytest.org/en/stable/how-to/fixtures.html).
```python
def test_foo(github_secrets):
do_something(
access_key_id=github_secrets["AWS_ACCESS_KEY_ID"],
secret_access_key=github_secrets["AWS_SECRET_ACCESS_KEY"],
)
```