ci: Update fmf plan to add a separate job to prepare managed nodes
* Add a preparation job to run on managed nodes
* Update instructions to run tests locally

Signed-off-by: Sergei Petrosian <[email protected]>
spetrosi committed Jul 26, 2024
1 parent dcfa3e7 commit c79d06d
Showing 2 changed files with 13 additions and 35 deletions.
13 changes: 4 additions & 9 deletions plans/README-plans.md
@@ -21,15 +21,10 @@ You can run tests locally with the `tmt try` cli.

### Running Tests Locally

For now, this functionality requires you to push changes to a PR because the plan only runs from the main branch, or from the PR branch.
So this is WIP.
To run tests locally, in the role repository, enter `tmt try -p plans/general <platform>`.

To run tests locally, in the role repository, enter `tmt run plans --name plans/general <platform>`.
Where `<platform>` is the name of the platform you want to run tests against.
This command identifies the plans/general plan and provisions two local VMs, one used as an Ansible control node and the second used as a managed node.

For example, `tmt run plans --name plans/general Fedora-40`.
`tmt try` is in development and does not identify tests from a URL automatically, so after provisioning the machines, you must type `t`, `p`, `t` at the interactive prompt to identify the tests, run the preparation steps, and run the tests.

This command identifies the plans/general plan and provisions two machines, one used as an Ansible control node and the second used as a managed node.

You can also use `tmt try` to get to an interactive prompt and be able to SSH into the test machines.
You must run `tmt try -p plans/general Fedora-40`, and then in the prompt type `p` to prepare the machines, then `t` to run the tests.
You can modify environment variables in `plans/general.fmf` to, for example, run only specified test playbooks by overriding `SYSTEM_ROLES_ONLY_TESTS`.
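
For reference, a minimal sketch of such a local session, assuming `tmt` is installed, the current directory is a clone of a role repository, and `Fedora-40` is the example platform from above (the optional `SYSTEM_ROLES_ONLY_TESTS` edit is illustrative):

```bash
# Optional: limit which test playbooks run by setting SYSTEM_ROLES_ONLY_TESTS
# in plans/general.fmf, e.g. SYSTEM_ROLES_ONLY_TESTS: "tests_default.yml".

# Provision the control node and managed node and open the interactive prompt.
tmt try -p plans/general Fedora-40

# At the interactive prompt, type:
#   t   - identify (discover) the tests
#   p   - run the prepare steps on the provisioned machines
#   t   - run the tests
```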
35 changes: 9 additions & 26 deletions plans/general.fmf
@@ -13,45 +13,28 @@ environment:
ANSIBLE_VER: 2.17
REPO_NAME: storage
PYTHON_VERSION: 3.12
# e.g. tests_default.yml
SYSTEM_ROLES_ONLY_TESTS: ""
PR_NUM: ""
TEST_LOCAL_CHANGES: true
prepare:
- name: Use vault.centos.org repos (CS 7, 8 EOL workaround)
script: |
if grep -q -e 'CentOS Stream release 8' -e 'CentOS Linux release 7.9' /etc/redhat-release; then
sed -i '/^mirror/d;s/#\(baseurl=http:\/\/\)mirror/\1vault/' /etc/yum.repos.d/*.repo
fi

- name: Enable epel to install beakerlib on all platforms except CS10 and Fedora, where it's not available and not needed
- name: Enable epel to install beakerlib on all platforms except CS10 and Fedora, where epel is not available and not needed
script: |
if ! grep -q -e 'CentOS Stream release 10' -e 'Fedora release' /etc/redhat-release; then
yum install epel-release -y
fi
where: control_node

- name: Additional steps to enable EPEL on EL 7
script: |
if grep -q 'CentOS Linux release 7.9' /etc/redhat-release; then
yum install yum-utils -y
yum-config-manager --enable epel epel-debuginfo epel-source
fi
where: control_node

- name: Install python on managed node when running CS8 with ansible!=2.9
script: |
if [ "$ANSIBLE_VER" != "2.9" ] && grep -q 'CentOS Stream release 8' /etc/redhat-release; then
dnf install -y python"$PYTHON_VERSION"
fi
where: managed_node

- name: Distribute SSH keys when provisioned with how=virtual
script: |
if [ -f ${TMT_TREE%/*}/provision/control_node/id_ecdsa.pub ]; then
cat ${TMT_TREE%/*}/provision/control_node/id_ecdsa.pub >> ~/.ssh/authorized_keys
fi
where: managed_node

discover:
- name: Prepare managed node
how: fmf
url: https://github.com/linux-system-roles/tft-tests
ref: main
where: managed_node
filter: tag:prep_managed_node
- name: Run test playbooks from control_node
how: fmf
url: https://github.com/linux-system-roles/tft-tests
(remaining lines of plans/general.fmf not shown)
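
As a side note, a minimal sketch for inspecting which tests the new "Prepare managed node" discover phase would pull in, assuming `git` is available and your `tmt` version supports `--filter`:

```bash
# Clone the shared test repository referenced by the discover phase
# and list the tests tagged for managed-node preparation.
git clone https://github.com/linux-system-roles/tft-tests
cd tft-tests
tmt tests ls --filter 'tag:prep_managed_node'
```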
