Add benchmark for Docutils by AA-Turner · Pull Request #216 · python/pyperformance

I have no clue what is causing CI to fail; when I ran the bench_docutils function locally, everything worked fine. The logs are also unhelpful (exit code 1 != 0).


I think the clue might be in here:

Command failed with exit code 1
Traceback (most recent call last):
  File "/home/runner/work/pyperformance/pyperformance/pyperformance/data-files/benchmarks/bm_docutils/run_benchmark.py", line 57, in <module>
    runner.bench_time_func("docutils", bench_docutils, DOC_ROOT)
  File "/home/runner/work/pyperformance/pyperformance/venv/cpython3.10-6b3a2b4e6fa6-compat-c0d88e07feb9/lib/python3.10/site-packages/pyperf/_runner.py", line 462, in bench_time_func
    return self._main(task)
  File "/home/runner/work/pyperformance/pyperformance/venv/cpython3.10-6b3a2b4e6fa6-compat-c0d88e07feb9/lib/python3.10/site-packages/pyperf/_runner.py", line 427, in _main
    bench = self._worker(task)
  File "/home/runner/work/pyperformance/pyperformance/venv/cpython3.10-6b3a2b4e6fa6-compat-c0d88e07feb9/lib/python3.10/site-packages/pyperf/_runner.py", line 401, in _worker
    run = task.create_run()
  File "/home/runner/work/pyperformance/pyperformance/venv/cpython3.10-6b3a2b4e6fa6-compat-c0d88e07feb9/lib/python3.10/site-packages/pyperf/_worker.py", line 284, in create_run
    self.compute()
  File "/home/runner/work/pyperformance/pyperformance/venv/cpython3.10-6b3a2b4e6fa6-compat-c0d88e07feb9/lib/python3.10/site-packages/pyperf/_worker.py", line 348, in compute
    WorkerTask.compute(self)
  File "/home/runner/work/pyperformance/pyperformance/venv/cpython3.10-6b3a2b4e6fa6-compat-c0d88e07feb9/lib/python3.10/site-packages/pyperf/_worker.py", line 273, in compute
    self.compute_warmups_values()
  File "/home/runner/work/pyperformance/pyperformance/venv/cpython3.10-6b3a2b4e6fa6-compat-c0d88e07feb9/lib/python3.10/site-packages/pyperf/_worker.py", line 255, in compute_warmups_values
    self._compute_values(self.values, args.values)
  File "/home/runner/work/pyperformance/pyperformance/venv/cpython3.10-6b3a2b4e6fa6-compat-c0d88e07feb9/lib/python3.10/site-packages/pyperf/_worker.py", line 72, in _compute_values
    raise ValueError("benchmark function returned zero")
ValueError: benchmark function returned zero

At some point during the test run, the new benchmark function is returning an elapsed time of zero.
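For context, pyperf's bench_time_func calls the benchmark callback with a loop count and expects back the *total* elapsed time for all iterations; a return value of 0.0 is what triggers the "benchmark function returned zero" ValueError seen above. Here is a minimal stdlib-only sketch of that contract (bench_docutils_like and its workload are hypothetical stand-ins, not the actual bench_docutils code):

```python
import time

def bench_docutils_like(loops, doc_root=None):
    # Hypothetical stand-in for bench_docutils: pyperf passes in a loop
    # count and expects the total elapsed seconds back as a float.
    # If this returns 0.0 (e.g. the work is skipped, or the timer has
    # too coarse a resolution for a trivial workload), pyperf raises
    # ValueError("benchmark function returned zero").
    t0 = time.perf_counter()
    for _ in range(loops):
        sum(range(1000))  # placeholder workload
    return time.perf_counter() - t0

elapsed = bench_docutils_like(100)
assert elapsed > 0  # a zero here would reproduce the CI failure
```

One thing worth checking in the real benchmark is whether DOC_ROOT resolves correctly in CI: if the document directory is empty or missing there, the loop body could do no work and the measured time could round to zero.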

You could try to reproduce this locally by running:

python -u -m pyperformance.tests