Pytest explained

Quick ref

  • pytest will execute all files matching test_*.py or *_test.py in the current directory and its subdirectories. Within those files, it collects functions whose names start with test.
  • pytest -v for verbose output
  • pytest -q, --quiet less verbose
  • pytest -l, --showlocals show local variables in tracebacks
  • pytest --tb=short for shorter tracebacks
  • pytest --tb=native Python standard traceback formatting
  • pytest [file_name].py --collect-only dry run: show which tests would be collected, without running them
  • --ignore=lib/foo/bar.py --ignore=lib/hello/ exclude files or directories from collection
  • pytest -q -s [file_name].py::[ClassName] run the tests in a single class; append ::[test_name] to run a single test
  • pytest [file_name].py -v to execute the tests from a specific file
  • To run only a specific set of tests:
    • pytest -k [substring] -v to run tests whose names (function names) match the given substring.
    • To run tests based on the markers applied (see the marker example after this list).

      To apply a marker: @pytest.mark.[markername]

      To run marked tests: pytest -m [markername] -v

      pytest --markers shows the available markers

  • pytest --maxfail=[num] stop after [num] failures
  • To run tests in parallel: install pytest-xdist (pip install pytest-xdist) and use pytest -n [num], where [num] is the number of workers
  • pytest -v --junitxml="[file_name].xml" to save test results as XML (a parsing sketch follows this list). In the report:
    • the testcase tag gives the details of each executed test.
    • the failure tag gives the details of the failed test code.
  • pytest -x stop after first failure
  • pytest -r[char] show extra test summary info as specified by chars: (f)ailed (e.g. pytest -rf), (E)rror, (s)kipped, (x)failed, (X)passed, (w) pytest-warnings, (p)assed, (P)assed with output, (a)ll except passed (p/P).
  • pytest -x --pdb switch to PDB on first failure
  • pytest --durations=[num] show the [num] slowest test durations
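
For example, a minimal marker workflow might look like this (the slow marker name and the file names are only illustrations; custom markers should be registered in pytest.ini to avoid warnings):

# pytest.ini
[pytest]
markers =
    slow: marks tests as slow

# test_markers.py
import pytest

@pytest.mark.slow
def test_big_computation():
    # Deliberately heavier test, tagged so it can be deselected.
    assert sum(range(10**6)) == 499999500000

def test_quick():
    assert 1 + 1 == 2

With this in place, pytest -m slow -v runs only test_big_computation, and pytest -m "not slow" -v runs everything else.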
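
The XML report can be walked with the standard library. A minimal sketch, assuming the report was saved as results.xml (the file name and printed format are illustrative):

import xml.etree.ElementTree as ET

root = ET.parse("results.xml").getroot()

# Each <testcase> element describes one executed test.
for case in root.iter("testcase"):
    failure = case.find("failure")  # present only for failed tests
    status = "FAILED" if failure is not None else "passed"
    print(case.get("classname"), case.get("name"), status)
    if failure is not None:
        # The <failure> element carries the failing test's details.
        print(failure.get("message"))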

Pytest Fixtures

To mark a function as a fixture, decorate it with @pytest.fixture:

import pytest

@pytest.fixture
def input_value():
    # The returned value is injected into any test that declares
    # an argument named input_value.
    return 39

def test_divisible_by_3(input_value):
    assert input_value % 3 == 0

A fixture defined inside a test file is visible only within that file. To share a fixture across multiple test files, define it in conftest.py; pytest discovers that file automatically, so the test files need no import.

# conftest.py
import pytest

@pytest.fixture
def input_value():
    return 39

# test_example.py
# No import needed here: pytest discovers conftest.py automatically
# and injects input_value by argument name.
def test_divisible_by_3(input_value):
    assert input_value % 3 == 0

Parameterization

Use @pytest.mark.parametrize to run a test against multiple sets of inputs.

import pytest

@pytest.mark.parametrize("test_input,expected", [("3+5", 8), ("2+4", 6), ("6*9", 42)])
def test_eval(test_input, expected):
    assert eval(test_input) == expected
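
Note that the third case fails on purpose: eval("6*9") is 54, not 42. Running the example therefore also shows how pytest reports each parametrized case separately: two pass and one fails.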

Pytest Skipping, Failing

@pytest.mark.xfail the test is executed, but it is not counted among the ordinary failed or passed tests. Its details are not printed; it is reported only as xfail (failed as expected) or xpass (unexpectedly passed).

@pytest.mark.skip test will not be executed
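
A minimal sketch of both markers (the test names, bodies, and reason strings are illustrative):

import pytest

@pytest.mark.skip(reason="not implemented yet")
def test_upcoming_feature():
    assert False  # never runs; reported as skipped

@pytest.mark.xfail(reason="known bug")
def test_known_bug():
    assert 1 + 1 == 3  # fails as expected; reported as xfail

If a test marked xfail unexpectedly passes, it is reported as xpass instead.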

Pytest plugins

  • pytest-cov adds coverage reports. pytest -v --cov=. --cov-report html writes a full HTML coverage report; pytest --cov=[myproj] tests/ prints a simple report in the terminal.
  • pytest-sugar changes the default look and feel of pytest, adding a progress bar and showing failing tests instantly. Install with pip install pytest-sugar; disable it for a run with pytest -p no:sugar.
  • pytest-dash is a pytest plugin for Dash that eases automated testing of Dash applications and components. It provides fixtures that start Dash applications and close them in teardown, a declarative behavior-test API in YAML format, and Selenium wait wrappers.

