The test was skipped, although the dependency test succeeded #65
I'm having the same issue. I've used this plugin successfully in the past, but on my current setup it just skips every test that depends on another. Code sample:

```python
import pytest
import requests
import urllib3


@pytest.mark.dependency()
def test_ping_host(host):
    """Pinging the provided host."""
    try:
        response = requests.get(host)
    except (urllib3.exceptions.MaxRetryError, requests.exceptions.ConnectionError):
        raise AssertionError("Host didn't respond.")
    assert response is not None


@pytest.mark.dependency(depends=['test_ping_host'])
def test_authentication(user_auth):
    """Asserting that user authentication in conftest was successful."""
    # Checking for the error key in the authenticate_user object;
    # its presence implies authentication failed
    assert not user_auth.get("error")
    # Checking that a proper token was returned by our auth function in conftest
    # (using != rather than `is not`, which compares identity, not value)
    assert user_auth.get("token") != ""
```
```
(venv) PS C:\Users\wmetr\Documents\xxx\pytest_demo> pytest ../pytest_demo/ --host http://127.0.0.1:5000
================================================================ test session starts =================================================================
platform win32 -- Python 3.8.0, pytest-7.1.1, pluggy-1.0.0
rootdir: C:\Users\wmetr\Documents\xxx\pytest_demo
plugins: dependency-0.5.1
collected 2 items

test_sample.py .s                                                                                                                             [100%]

============================================================ 1 passed, 1 skipped in 0.06s ============================================================
```
I'm also seeing this issue.
Any workaround discovered?
I just had a look at these (as this issue has been mentioned to me elsewhere) and saw that the original issue by @jianchao00 behaves as expected: the tests were only run for the second test module (test_mod_02), and they are skipped because they depend on tests that have not been run. The third example, by @johncuyle, does not contain enough information about the tests and how they were run. The second example, by @wmetryka, is weird though: I tried to reproduce it without success. This is a trivial case that has always worked, so I suspect something is different in the environment, but I have no idea what it is. @wmetryka: have you resolved this? Maybe you can point to a repo where this happens?
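The behavior described above can be illustrated with a toy model (this is an illustration only, not pytest-dependency's actual internals): a dependency that was never run in the current session is indistinguishable from one that failed, so running only test_mod_02 leaves every cross-module dependency unsatisfied. The `resolve` helper and the example session dicts below are hypothetical.

```python
# Toy model of dependency-based skipping: a test runs only if every
# test it depends on has actually run *and* passed in this session.

def resolve(results, depends):
    """results: {test_id: True/False} for tests that actually ran.
    depends: list of test ids the current test requires.
    Returns "run" or "skip"."""
    for dep in depends:
        # A dependency that never ran yields results.get(dep) == None,
        # which is treated exactly like a failure.
        if results.get(dep) is not True:
            return "skip"
    return "run"

# Session where only test_mod_02.py was collected:
session = {"tests/test_mod_02.py::test_a": False}  # xfailed
print(resolve(session, ["tests/test_mod_01.py::test_a"]))  # skip: dep never ran

# Session where both modules ran and test_mod_01's tests passed:
full = {"tests/test_mod_01.py::test_a": True,
        "tests/test_mod_01.py::test_c": True}
print(resolve(full, ["tests/test_mod_01.py::test_a",
                     "tests/test_mod_01.py::test_c"]))  # run
```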
I had the issue of dependent tests always being skipped, but only when their module name sorted alphabetically before their dependencies'. I solved it using the pytest-order plugin with the flag --order-dependencies.
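A sketch of that workaround, assuming pytest-order is installed alongside pytest-dependency (package and flag names taken from the comment above; the `tests/` path is a placeholder):

```shell
# Install the ordering plugin (assumed environment):
pip install pytest-order

# Let pytest-order reorder tests so that dependencies declared via
# pytest-dependency run before the tests that depend on them:
pytest tests/ --order-dependencies
```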
The relevant documentation: https://pytest-dependency.readthedocs.io/en/stable/scope.html#explicitely-specifying-the-scope.
test_mod_01.py

```python
import pytest


@pytest.mark.dependency()
def test_a():
    print("test_a")


@pytest.mark.dependency()
@pytest.mark.xfail(reason="deliberate fail")
def test_b():
    print("test_b")
    assert False


@pytest.mark.dependency(depends=["test_a"])
def test_c():
    print("test_c")


class TestClass(object):
    # Class body not included in the original post; test_mod_02.py
    # depends on tests/test_mod_01.py::TestClass::test_b
    pass
```
test_mod_02.py

```python
import pytest


@pytest.mark.dependency()
@pytest.mark.xfail(reason="deliberate fail")
def test_a():
    print("test_a")
    assert False


@pytest.mark.dependency(
    depends=["tests/test_mod_01.py::test_a", "tests/test_mod_01.py::test_c"],
    scope='session'
)
def test_e():
    print("test_e")


@pytest.mark.dependency(
    depends=["tests/test_mod_01.py::test_b", "tests/test_mod_02.py::test_e"],
    scope='session'
)
def test_f():
    print("test_f")


@pytest.mark.dependency(
    depends=["tests/test_mod_01.py::TestClass::test_b"],
    scope='session'
)
def test_g():
    print("test_g")
```
```
"E:\Program Files\Python37\python.exe" "E:\Program Files\JetBrains\PyCharm 2021.3\plugins\python\helpers\pycharm_jb_pytest_runner.py" --path E:/PycharmProjects/SelniumPOM-master/tests/test_mod_02.py
Testing started at 19:23 ...
Launching pytest with arguments E:/PycharmProjects/SelniumPOM-master/tests/test_mod_02.py --no-header --no-summary -q in E:\PycharmProjects\SelniumPOM-master\tests
============================= test session starts =============================
collecting ... collected 4 items

test_mod_02.py::test_a XFAIL (deliberate fail)                           [ 25%]
test_a

@pytest.mark.dependency()
@pytest.mark.xfail(reason="deliberate fail")
def test_a():
    print("test_a")
E       assert False

test_mod_02.py:7: AssertionError

test_mod_02.py::test_e SKIPPED (test_e depends on tests/test_mod_01....) [ 50%]
Skipped: test_e depends on tests/test_mod_01.py::test_a

test_mod_02.py::test_f SKIPPED (test_f depends on tests/test_mod_01....) [ 75%]
Skipped: test_f depends on tests/test_mod_01.py::test_b

test_mod_02.py::test_g SKIPPED (test_g depends on tests/test_mod_01....) [100%]
Skipped: test_g depends on tests/test_mod_01.py::TestClass::test_b

======================== 3 skipped, 1 xfailed in 0.07s ========================

Process finished with exit code 0
```
The expected result is that test_e and test_g succeed!
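Note that session-scoped dependencies can only be satisfied when the depended-on tests actually run in the same session, and the run above collected only test_mod_02.py. A sketch of an invocation that collects both modules in one session (the `tests/` path is assumed from the output above):

```shell
# Run the whole tests directory so test_mod_01.py and test_mod_02.py
# share one session; cross-module dependencies can then be evaluated:
pytest tests/

# Running only test_mod_02.py (as in the output above) means the
# tests/test_mod_01.py::* dependencies never run, so test_e, test_f,
# and test_g are all skipped.
```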