
I know there are plugins for performance testing and profiling with py.test, but is there a way to generate arbitrary values that are reported or otherwise accessible after a test?

Imagine I have a test like this:

def test_minimum_learning_rate():
    """Make some fancy stuff and generate a learning performance value"""
    learning_rate = fancy_learning_function().rate
    pytest.report("rate", learning_rate)
    assert learning_rate > 0.5

The pytest.report(..) line is what I'd like to have (but it doesn't exist, does it?).

I'd then like something like minimum_learning_rate[rate] to be written to the report along with the actual test results (or at least printed on the screen).

It would also be really nice to have a Jenkins plugin that creates a chart from that data.

Is there a common term for this? I've searched for "KPI", "arbitrary values", and "user-defined values", but without any luck so far.

  • Check out [Hypothesis](https://hypothesis.readthedocs.io/en/latest/index.html) – Amit Jun 06 '19 at 20:19
  • Well - this is just another testing framework, isn't it? My question is about py.test. I'll have a look anyway, but after a quick look into it I didn't find much about writing actual values to reports. – frans Jun 06 '19 at 20:34
  • You can use it with `pytest` to run a test for different inputs [Link](https://hypothesis.readthedocs.io/en/latest/details.html?highlight=pytest#the-hypothesis-pytest-plugin), i.e. Hypothesis can generate inputs for a test based on some strategy (a minimal sketch follows these comments) – Amit Jun 06 '19 at 20:37
  • It's not quite clear how you want to report the values - a simple `print` in the test code and `pytest -s` will already display the data. However, note that `pytest` doesn't output between tests by default; custom output can (and should) be printed after the test execution finishes (there are hooks for that). If you want custom data in the JUnit report, there are fixtures for that as well. – hoefling Jun 07 '19 at 13:24
  • As for plotting the data in Jenkins, `pytest` can't handle this; it has to be done by a Jenkins plugin. You can check out how the [Jenkins JUnit plugin](https://github.com/jenkinsci/junit-plugin) displays test charts and write your own plugin based on that. – hoefling Jun 07 '19 at 13:27
  • @hoefling Can you provide names for those fixtures and hooks and turn your comment into an answer? – frans Jun 07 '19 at 13:30
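
For reference, here is a minimal sketch of the Hypothesis/pytest integration mentioned in the comments above; the test name and the strategy are illustrative only and not taken from the question:

from hypothesis import given, strategies as st

@given(rate=st.floats(min_value=0.0, max_value=1.0))
def test_rate_is_bounded(rate):
    # Hypothesis calls the test repeatedly with values drawn from the strategy
    assert 0.0 <= rate <= 1.0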

1 Answer


If you just want to output some debug values, a print call combined with the -s argument will already suffice:

def test_spam():
    print('debug')
    assert True

Running pytest -s:

collected 1 item                                                                                                                                                                                                  

test_spam.py debug
.

If you are looking for a solution that is better integrated into the pytest execution flow, write custom hooks. The examples below should give you some ideas.

Printing custom lines after each test execution

# conftest.py

def pytest_report_teststatus(report, config):
    if report.when == 'teardown':  # you may e.g. also check the outcome here to filter passed or failed tests only
        rate = getattr(config, '_rate', None)
        if rate is not None:
            terminalreporter = config.pluginmanager.get_plugin('terminalreporter')
            terminalreporter.ensure_newline()
            terminalreporter.write_line(f'test {report.nodeid}, rate: {rate}', red=True, bold=True)

Tests:

def report(rate, request):
    request.config._rate = rate

def test_spam(request):
    report(123, request)

def test_eggs(request):
    report(456, request)

Output:

collected 2 items                                                                                                                                                                                                 

test_spam.py .
test test_spam.py::test_spam, rate: 123

test_spam.py .
test test_spam.py::test_eggs, rate: 456
===================================================== 2 passed in 0.01 seconds =====================================================

Collecting data and printing after test execution

# conftest.py

def pytest_configure(config):
    config._rates = dict()

def pytest_terminal_summary(terminalreporter, exitstatus, config):
    terminalreporter.ensure_newline()
    for testid, rate in config._rates.items():
        terminalreporter.write_line(f'test {testid}, rate: {rate}', yellow=True, bold=True)

Tests:

def report(rate, request):
    request.config._rates[request.node.nodeid] = rate

def test_spam(request):
    report(123, request)

def test_eggs(request):
    report(456, request)

Output:

collected 2 items                                                                                                                  

test_spam.py ..

test test_spam.py::test_spam, rate: 123
test test_spam.py::test_eggs, rate: 456
===================================================== 2 passed in 0.01 seconds =====================================================

Appending data in JUnit XML report

Using the record_property fixture:

def test_spam(record_property):
    record_property('rate', 123)

def test_eggs(record_property):
    record_property('rate', 456)

Resulting report:

$ pytest --junit-xml=report.xml
...
$ xmllint --format report.xml
<testsuite errors="0" failures="0" name="pytest" skipped="0" tests="2" time="0.056">
  <testcase classname="test_spam" file="test_spam.py" line="12" name="test_spam" time="0.001">
    <properties>
      <property name="rate" value="123"/>
    </properties>
  </testcase>
  <testcase classname="test_spam" file="test_spam.py" line="15" name="test_eggs" time="0.001">
    <properties>
      <property name="rate" value="456"/>
    </properties>
  </testcase>
</testsuite>
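
If you want something closer to the pytest.report(...) call from the question, a custom fixture could wrap record_property. A minimal sketch follows; the fixture name report and the test body are illustrative and not part of pytest itself:

# conftest.py

import pytest

@pytest.fixture
def report(record_property):
    def _report(name, value):
        # Delegate to the built-in record_property fixture, so the value
        # ends up as a <property> element in the JUnit XML report.
        record_property(name, value)
    return _report

Test:

def test_minimum_learning_rate(report):
    learning_rate = 0.7  # stand-in for fancy_learning_function().rate
    report('rate', learning_rate)
    assert learning_rate > 0.5
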
  • I'll check this next week - but at least `record_property` sounds quite close to what I'm looking for – frans Jun 07 '19 at 17:46