This documentation is a guide that introduces the following aspects:
- The basic idea of different test types in ESP-IDF
- How to apply the pytest framework to the Python test scripts to make sure the apps are working as expected
- The ESP-IDF CI target test process
- How to run ESP-IDF tests with pytest locally
- Tips and tricks on pytest
In ESP-IDF, we use the following plugins by default:
- pytest-embedded with the default services esp,idf
- pytest-rerunfailures
All the introduced concepts and usages are based on the default behavior in ESP-IDF. Not all of them are available in vanilla pytest.
All dependencies can be installed by running the install script with the --enable-pytest argument, e.g., $ install.sh --enable-pytest.
If you're facing an error message like:
configure: error: Package requirements (dbus-1 >= 1.8) were not met:
No package 'dbus-1' found
Consider adjusting the PKG_CONFIG_PATH environment variable if you
installed software in a non-standard prefix.
If you're running on Ubuntu, you may need to run:
sudo apt-get install libdbus-glib-1-dev
or
sudo apt-get install libdbus-1-dev
For other Linux distros, search for the error message online; the issue can usually be solved by installing the missing header files.
If you're facing an error message like:
error: invalid command 'bdist_wheel'
You may need to run:
python -m pip install -U pip
or
python -m pip install wheel
Before running the pip commands, please make sure you're using the IDF Python virtual environment.
Component-based unit tests are our recommended way to test your component. All the test apps should be located under ${IDF_PATH}/components/<COMPONENT_NAME>/test_apps.
For example:
components/
└── my_component/
├── include/
│ └── ...
├── test_apps/
│ ├── test_app_1
│ │ ├── main/
│ │ │ └── ...
│ │ ├── CMakeLists.txt
│ │ └── pytest_my_component_app_1.py
│ ├── test_app_2
│ │ ├── ...
│ │ └── pytest_my_component_app_2.py
│ └── parent_folder
│ ├── test_app_3
│ │ ├── ...
│ │ └── pytest_my_component_app_3.py
│ └── ...
├── my_component.c
└── CMakeLists.txt
Example Tests are tests for examples that are intended to demonstrate parts of the ESP-IDF functionality to our customers.
All the test apps should be located under ${IDF_PATH}/examples. For more information, please refer to the :idf_file:`Examples Readme <examples/README.md>`.
For example:
examples/
└── parent_folder/
└── example_1/
├── main/
│ └── ...
├── CMakeLists.txt
└── pytest_example_1.py
Custom Tests are tests that aim to run some arbitrary test internally. They are not intended to demonstrate the ESP-IDF functionality to our customers in any way.
All the test apps should be located under ${IDF_PATH}/tools/test_apps. For more information, please refer to the :idf_file:`Custom Test Readme <tools/test_apps/README.md>`.
Bootstrapping Phase
Create session-scoped caches:
- port-target cache
- port-app cache
Collection Phase
Test Running Phase
Construct the fixtures. In ESP-IDF, the common fixtures are initialized in this order:
- pexpect_proc: pexpect instance
- app: IdfApp instance. The information of the app, like sdkconfig, flash_files, partition_table, etc., is parsed at this phase.
- serial: IdfSerial instance. The port of the host which is connected to the target type parsed from the app is auto-detected. The flash files are auto-flashed.
- dut: IdfDut instance
Run the real test function.
Deconstruct the fixtures in this order:
- dut: close the serial port; (only for apps with the unity test framework) generate the junit report of the unity test cases
- serial
- app
- pexpect_proc: close the file descriptor
(Only for apps with the unity test framework) Raise AssertionError when a failed unity test case is detected, if you call dut.expect_from_unity_output() in the test function.
Reporting Phase
- Generate junit report of the test functions
- Modify the junit report test case name into ESP-IDF test case ID format:
<target>.<config>.<test function name>
Finalizing Phase (Only for apps with unity test framework)
Combine the junit reports if the junit reports of the unity test cases are generated.
This code example is taken from :idf_file:`pytest_console_basic.py <examples/system/console/basic/pytest_console_basic.py>`.
@pytest.mark.esp32
@pytest.mark.esp32c3
@pytest.mark.generic
@pytest.mark.parametrize('config', [
'history',
'nohistory',
], indirect=True)
def test_console_advanced(config: str, dut: IdfDut) -> None:
if config == 'history':
dut.expect('Command history enabled')
elif config == 'nohistory':
dut.expect('Command history disabled')
Let's go through this simple test case line by line in the following subsections.
@pytest.mark.esp32 # <-- support esp32
@pytest.mark.esp32c3 # <-- support esp32c3
@pytest.mark.generic # <-- test env "generic"
The above lines indicate that this test case supports targets esp32 and esp32c3, and the target board type should be "generic". If you want to know what the "generic" type refers to, you may run pytest --markers to get detailed information about all markers.
Note
If the test case supports all officially supported ESP-IDF targets (you may check the list via idf.py --list-targets), you can use the special marker supported_targets to apply all of them in one line.
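For instance, the two target markers from the example above could be replaced by a single line (a sketch, assuming the test indeed passes on every supported target):
@pytest.mark.supported_targets  # <-- applies all officially supported targets
@pytest.mark.generic
@pytest.mark.parametrize('config', [
    'history',
    'nohistory',
], indirect=True)
def test_console_advanced(config: str, dut: IdfDut) -> None:
    ...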
You can use pytest.mark.parametrize with "config" to apply the same test to apps built with different sdkconfig files. For more information about sdkconfig.ci.xxx files, please refer to the Configuration Files section under :idf_file:`this readme <tools/test_apps/README.md>`.
@pytest.mark.parametrize('config', [
'history', # <-- run with app built by sdkconfig.ci.history
'nohistory', # <-- run with app built by sdkconfig.ci.nohistory
], indirect=True) # <-- `indirect=True` is required
Overall, this test function would be replicated to 4 test cases:
- esp32.history.test_console_advanced
- esp32.nohistory.test_console_advanced
- esp32c3.history.test_console_advanced
- esp32c3.nohistory.test_console_advanced
def test_console_advanced(config: str, dut: IdfDut) -> None: # The value of argument ``config`` is assigned by the parametrization.
if config == 'history':
dut.expect('Command history enabled')
elif config == 'nohistory':
dut.expect('Command history disabled')
When we're using dut.expect(...), the string is first compiled into a regex, which then seeks through the serial output until the compiled regex is matched or a timeout is exceeded. You may have to pay extra attention when the string contains regex keyword characters, like parentheses or square brackets. Actually, using dut.expect_exact(...) here is better, since it seeks until the exact string is matched. For further reading about the different types of expect functions, please refer to the pytest-embedded Expecting documentation.
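A minimal sketch of the difference (the patterns are illustrative):
res = dut.expect(r'Command history (enabled|disabled)')  # compiled into a regex; returns a match object
state = res.group(1)  # the captured group, as bytes
dut.expect_exact('Command history enabled')  # matched literally; no regex escaping needed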
@pytest.mark.esp32s2
@pytest.mark.esp32s3
@pytest.mark.usb_host
@pytest.mark.parametrize('count', [
2,
], indirect=True)
def test_usb_host(dut: Tuple[IdfDut, IdfDut]) -> None:
device = dut[0] # <-- assume the first dut is the device
host = dut[1] # <-- and the second dut is the host
...
After setting the param count to 2, all these fixtures become tuples.
This code example is taken from :idf_file:`pytest_wifi_getting_started.py <examples/wifi/getting_started/pytest_wifi_getting_started.py>`.
@pytest.mark.esp32
@pytest.mark.multi_dut_generic
@pytest.mark.parametrize(
'count, app_path', [
(2,
f'{os.path.join(os.path.dirname(__file__), "softAP")}|{os.path.join(os.path.dirname(__file__), "station")}'),
], indirect=True
)
def test_wifi_getting_started(dut: Tuple[IdfDut, IdfDut]) -> None:
softap = dut[0]
station = dut[1]
...
Here the first dut was flashed with the app :idf_file:`softap <examples/wifi/getting_started/softAP/main/softap_example_main.c>`, and the second dut was flashed with the app :idf_file:`station <examples/wifi/getting_started/station/main/station_example_main.c>`.
Note
Here app_path should be set to an absolute path. The __file__ macro in Python returns the absolute path of the test script itself.
This code example is taken from :idf_file:`pytest_wifi_getting_started.py <examples/wifi/getting_started/pytest_wifi_getting_started.py>`. As the comment says, for now it's not running in the ESP-IDF CI.
@pytest.mark.parametrize(
'count, app_path, target', [
(2,
f'{os.path.join(os.path.dirname(__file__), "softAP")}|{os.path.join(os.path.dirname(__file__), "station")}',
'esp32|esp32s2'),
(2,
f'{os.path.join(os.path.dirname(__file__), "softAP")}|{os.path.join(os.path.dirname(__file__), "station")}',
'esp32s2|esp32'),
],
indirect=True,
)
def test_wifi_getting_started(dut: Tuple[IdfDut, IdfDut]) -> None:
softap = dut[0]
station = dut[1]
...
Overall, this test function would be replicated to 2 test cases:
- softap with esp32 target, and station with esp32s2 target
- softap with esp32s2 target, and station with esp32 target
This code example is taken from :idf_file:`pytest_panic.py <tools/test_apps/system/panic/pytest_panic.py>` as an advanced example.
CONFIGS = [
pytest.param('coredump_flash_bin_crc', marks=[pytest.mark.esp32, pytest.mark.esp32s2]),
pytest.param('coredump_flash_elf_sha', marks=[pytest.mark.esp32]), # sha256 only supported on esp32
pytest.param('coredump_uart_bin_crc', marks=[pytest.mark.esp32, pytest.mark.esp32s2]),
pytest.param('coredump_uart_elf_crc', marks=[pytest.mark.esp32, pytest.mark.esp32s2]),
pytest.param('gdbstub', marks=[pytest.mark.esp32, pytest.mark.esp32s2]),
pytest.param('panic', marks=[pytest.mark.esp32, pytest.mark.esp32s2]),
]
@pytest.mark.parametrize('config', CONFIGS, indirect=True)
...
Usually, you may write a custom class under these conditions:
- Add more reusable functions for a certain number of DUTs
- Add custom setup and teardown functions in the different phases described here
This code example is taken from :idf_file:`panic/conftest.py <tools/test_apps/system/panic/conftest.py>`
class PanicTestDut(IdfDut):
...
@pytest.fixture(scope='module')
def monkeypatch_module(request: FixtureRequest) -> MonkeyPatch:
mp = MonkeyPatch()
request.addfinalizer(mp.undo)
return mp
@pytest.fixture(scope='module', autouse=True)
def replace_dut_class(monkeypatch_module: MonkeyPatch) -> None:
monkeypatch_module.setattr('pytest_embedded_idf.dut.IdfDut', PanicTestDut)
monkeypatch_module provides a module-scoped monkeypatch fixture.
replace_dut_class is a module-scoped autouse fixture. This function replaces the IdfDut class with your custom class.
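With these fixtures in place, test functions in the same module can simply annotate dut with the custom class, for example (hypothetical test name):
def test_panic_case(dut: PanicTestDut) -> None:  # `dut` is now constructed as a PanicTestDut
    ...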
Sometimes our tests rely on Ethernet or Wi-Fi, and the network may make the test flaky. We can mark a single test case as flaky within the code repo.
This code example is taken from :idf_file:`pytest_esp_eth.py <components/esp_eth/test_apps/pytest_esp_eth.py>`
@pytest.mark.flaky(reruns=3, reruns_delay=5)
def test_esp_eth_ip101(dut: IdfDut) -> None:
...
This flaky marker means that if the test function fails, the test case would be rerun a maximum of 3 times, with a 5-second delay between reruns.
Sometimes a test can't pass for one of the following reasons:
- It has a bug
- The success ratio is too low because of an environment issue, such as a network issue, and retrying wouldn't help
In these cases, you may mark the test case with the marker xfail and a user-friendly, readable reason.
This code example is taken from :idf_file:`pytest_panic.py <tools/test_apps/system/panic/pytest_panic.py>`
@pytest.mark.xfail('config.getvalue("target") == "esp32s2"', reason='raised IllegalInstruction instead')
def test_cache_error(dut: PanicTestDut, config: str, test_func_name: str) -> None:
This marker means that the test is a known failure on esp32s2.
Some test cases are only triggered in nightly run pipelines due to a lack of runners.
@pytest.mark.nightly_run
This marker means that the test case would only be run when the env var NIGHTLY_RUN or INCLUDE_NIGHTLY_RUN is set.
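A full marker usage might look like this (a sketch; the test name is invented):
@pytest.mark.esp32
@pytest.mark.generic
@pytest.mark.nightly_run
def test_heavy_scenario(dut: IdfDut) -> None:  # skipped unless NIGHTLY_RUN or INCLUDE_NIGHTLY_RUN is set
    ...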
Some test cases which can pass locally may need to be temporarily disabled in CI due to a lack of runners.
@pytest.mark.temp_skip_ci(targets=['esp32', 'esp32s2'], reason='lack of runners')
This marker means that the test case could still be run locally with pytest --target esp32, but will not run in CI.
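The target markers stay in place, so a sketch of a full usage could look like this (hypothetical test name):
@pytest.mark.esp32
@pytest.mark.esp32s2
@pytest.mark.generic
@pytest.mark.temp_skip_ci(targets=['esp32', 'esp32s2'], reason='lack of runners')
def test_my_feature(dut: IdfDut) -> None:  # `pytest --target esp32` still runs this locally
    ...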
For component-based unit test apps, one line could do the trick to run all single-board test cases, including normal test cases and multi-stage test cases:
def test_component_ut(dut: IdfDut):
dut.run_all_single_board_cases()
It would also skip all the test cases with the [ignore] mark.
If you need to run a group of test cases, you may run:
def test_component_ut(dut: IdfDut):
dut.run_all_single_board_cases(group='psram')
It would trigger all test cases with the module name [psram].
You may also see some test scripts with the following statements, which are deprecated. Please use the suggested approach above instead.
def test_component_ut(dut: IdfDut):
dut.expect_exact('Press ENTER to see the list of tests')
dut.write('*')
dut.expect_unity_test_output()
For further reading about our unit testing in ESP-IDF, please refer to :doc:`our unit testing guide <../api-guides/unit-tests>`.
The workflow in CI is simple: build jobs -> target test jobs.
- Component-based Unit Tests:
build_pytest_components_<target>
- Example Tests:
build_pytest_examples_<target>
- Custom Tests:
build_pytest_test_apps_<target>
The command used by CI to build all the relevant tests is: python $IDF_PATH/tools/ci/ci_build_apps.py <parent_dir> --target <target> -vv --pytest-apps
All apps that support the specified target would be built with all supported sdkconfig files under build_<target>_<config>.
For example, if you run python $IDF_PATH/tools/ci/ci_build_apps.py $IDF_PATH/examples/system/console/basic --target esp32 --pytest-apps, the folder structure would be like this:
basic
├── build_esp32_history/
│ └── ...
├── build_esp32_nohistory/
│ └── ...
├── main/
├── CMakeLists.txt
├── pytest_console_basic.py
└── ...
All the binary folders would be uploaded as artifacts under the same directories.
- Component-based Unit Tests:
component_ut_pytest_<target>_<test_env>
- Example Tests:
example_test_pytest_<target>_<test_env>
- Custom Tests:
test_app_test_pytest_<target>_<test_env>
The command used by CI to run all the relevant tests is: pytest <parent_dir> --target <target> -m <test_env_marker>
All test cases with the specified target marker and the test env marker under the parent folder would be executed.
The binaries in the target test jobs are downloaded from the build jobs; the artifacts would be placed under the same directories.
The local executing process is the same as the CI process.
For example, if you want to run all the esp32 tests under the $IDF_PATH/examples/system/console/basic
folder, you may:
$ cd $IDF_PATH
$ bash install.sh --enable-pytest
$ . ./export.sh
$ cd examples/system/console/basic
$ python $IDF_PATH/tools/ci/ci_build_apps.py . --target esp32 -vv --pytest-apps
$ pytest --target esp32
Filter by target with pytest --target <target>: pytest would run all the test cases that support the specified target.
Filter by sdkconfig file with pytest --sdkconfig <sdkconfig>: if <sdkconfig> is default, pytest would run all the test cases with the sdkconfig file sdkconfig.defaults. In other cases, pytest would run all the test cases with the sdkconfig file sdkconfig.ci.<sdkconfig>.
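For example, to run only the esp32 test cases built with sdkconfig.ci.history from the earlier console example (assuming the apps were built beforehand):
$ pytest --target esp32 --sdkconfig history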
We're using two types of custom markers: target markers, which indicate that the test case should support this target, and env markers, which indicate that the test case should be assigned to runners with these tags in CI.
You can add new markers by adding one line in ${IDF_PATH}/conftest.py. If it's a target marker, it should be added into TARGET_MARKERS. If it's a marker that specifies a type of test environment, it should be added into ENV_MARKERS. The grammar should be: <marker_name>: <marker_description>.
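A hypothetical env marker entry could look like this (a sketch; my_board is an invented name, and the exact layout should follow the existing entries in conftest.py):
ENV_MARKERS = {
    # ... existing markers ...
    'my_board': 'tests that should run on runners tagged with "my_board"',  # hypothetical entry
}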
You can call pytest with --junitxml <filepath>
to generate the JUnit report. In ESP-IDF, the test case name would be unified as "<target>.<config>.<function_name>".
Skipping auto-flash binary every time would be useful when you're debugging your test script.
You can call pytest with --skip-autoflash y to achieve it.
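For example:
$ pytest --target esp32 --skip-autoflash y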
Sometimes you may need to record some statistics while running the tests, like the performance test statistics.
You can use the record_xml_attribute fixture in your test script, and the statistics would be recorded as attributes in the JUnit report.
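A minimal sketch (record_xml_attribute is a built-in pytest fixture; the test name, attribute name, and value are illustrative):
from typing import Callable

def test_my_stats(
    dut: IdfDut,
    record_xml_attribute: Callable[[str, object], None],
) -> None:
    # recorded as an attribute of the <testcase> node in the JUnit report
    record_xml_attribute('flash_size', '4MB')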
Sometimes you may need to add some extra logging lines while running the test cases.
You can use the Python logging module to achieve this.
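For example (a minimal sketch; the test name is invented):
import logging

def test_my_case(dut: IdfDut) -> None:
    logging.info('this line goes into the pytest live log')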
def test_hello_world(
dut: IdfDut,
log_performance: Callable[[str, object], None],
) -> None:
log_performance('test', 1)
The above example would log the performance item with the pre-defined format "[performance][test]: 1" and record it under the properties tag in the junit report if --junitxml <filepath> is specified. The junit test case node would look like:
<testcase classname="examples.get-started.hello_world.pytest_hello_world" file="examples/get-started/hello_world/pytest_hello_world.py" line="13" name="esp32.default.test_hello_world" time="8.389">
<properties>
<property name="test" value="1"/>
</properties>
</testcase>
We provide the C macros TEST_PERFORMANCE_LESS_THAN and TEST_PERFORMANCE_GREATER_THAN to log the performance item and check if the value is within the valid range. Sometimes the performance item value can't be measured in C code, so we also provide a Python function for the same purpose. Please note that using the C macros is the preferred approach, since the Python function can't recognize the threshold values of the same performance item under different ifdef blocks well.
def test_hello_world(
dut: IdfDut,
check_performance: Callable[[str, float, str], None],
) -> None:
check_performance('RSA_2048KEY_PUBLIC_OP', 123, 'esp32')
check_performance('RSA_2048KEY_PUBLIC_OP', 19001, 'esp32')
The above example would first get the threshold values of the performance item RSA_2048KEY_PUBLIC_OP
from :idf_file:`components/idf_test/include/idf_performance.h` and the target-specific one :idf_file:`components/idf_test/include/esp32/idf_performance_target.h`, then check if the value reached the minimum limit or exceeded the maximum limit.
Let's assume the value of IDF_PERFORMANCE_MAX_RSA_2048KEY_PUBLIC_OP is 19000, so the first check_performance line would pass and the second one would fail with the warning: [Performance] RSA_2048KEY_PUBLIC_OP value is 19001, doesn't meet pass standard 19000.0
- pytest documentation: https://docs.pytest.org/en/latest/contents.html
- pytest-embedded documentation: https://docs.espressif.com/projects/pytest-embedded/en/latest/