Developer's tips for testing
============================

Matplotlib's testing infrastructure depends on pytest_. The tests are in
:file:`lib/matplotlib/tests`, and customizations to the pytest testing
infrastructure are in :mod:`matplotlib.testing`.

.. _pytest: http://doc.pytest.org/en/latest/
.. _mock: https://docs.python.org/dev/library/unittest.mock.html
.. _Ghostscript: https://www.ghostscript.com/
.. _Inkscape: https://inkscape.org
.. _pytest-cov: https://pytest-cov.readthedocs.io/en/latest/
.. _pytest-pep8: https://pypi.python.org/pypi/pytest-pep8
.. _pytest-xdist: https://pypi.python.org/pypi/pytest-xdist
.. _pytest-timeout: https://pypi.python.org/pypi/pytest-timeout
Requirements
------------

The following software is required to run the tests:

- pytest_, version 3.0.0 or later
- mock_, when running Python versions < 3.3
- Ghostscript_ (to render PDF files)
- Inkscape_ (to render SVG files)

Optionally you can install:

- pytest-cov_ to collect coverage information
- pytest-pep8_ to test coding standards
- pytest-timeout_ to limit runtime in case of stuck tests
- pytest-xdist_ to run tests in parallel
Building matplotlib for image comparison tests
----------------------------------------------
Running the tests
-----------------

Running the tests is simple. Make sure you have pytest installed and run::

    py.test

or::

    python tests.py

in the root directory of the distribution. The script takes a set of
commands, such as:
========================  ===========
``--pep8``                Perform pep8 checks (requires pytest-pep8_)
``-m "not network"``      Disable tests that require network access
========================  ===========
Additional arguments are passed on to pytest. See the pytest documentation for
`supported arguments`_. Some of the more important ones are given here:
============================  ===========
``--verbose``                 Be more verbose
``-n NUM``                    Run tests in parallel over NUM
                              processes (requires pytest-xdist_)
``--timeout=SECONDS``         Set timeout for results from each test
                              process (requires pytest-timeout_)
``--capture=no`` or ``-s``    Do not capture stdout
============================  ===========
To run a single test from the command line, you can provide a file path,
optionally followed by the function separated by two colons, e.g. (tests do
not need to be installed, but Matplotlib should be)::

    py.test lib/matplotlib/tests/test_simplification.py::test_clipping

or, if tests are installed, a dot-separated path to the module, optionally
followed by the function separated by two colons, such as::

    py.test --pyargs matplotlib.tests.test_simplification::test_clipping
If you want to run the full test suite, but want to save wall time, try
running the tests in parallel::

    py.test --verbose -n 5

Depending on your version of Python and pytest-xdist, you may need to set
``PYTHONHASHSEED`` to a fixed value when running in parallel::

    PYTHONHASHSEED=0 py.test --verbose -n 5
An alternative implementation that does not look at command line arguments
and works from within Python is to run the tests from the Matplotlib library
function :func:`matplotlib.test`::

    import matplotlib
    matplotlib.test()
.. _supported arguments: http://doc.pytest.org/en/latest/usage.html
Writing a simple test
---------------------
Many elements of Matplotlib can be tested using standard tests. For
example, here is a test from :mod:`matplotlib.tests.test_basic`::

    def test_simple():
        """
        very simple example test
        """
        assert 1 + 1 == 2

Pytest determines which functions are tests by searching for files whose names
begin with ``"test_"`` and then within those files for functions beginning with
``"test"`` or classes beginning with ``"Test"``.
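As a quick illustration of these collection rules, consider the following
sketch (the file and function names here are invented for illustration and are
not part of Matplotlib's test suite):

```python
# Contents of a hypothetical file named test_example.py.  pytest collects
# the module because its name starts with "test_", then collects
# test_addition() and TestNumbers.test_parity() because of their
# "test"/"Test" prefixes.  The helper at the bottom has no such prefix
# and is therefore ignored by collection.

def test_addition():
    assert 2 + 2 == 4

class TestNumbers:
    def test_parity(self):
        assert 4 % 2 == 0

def helper():  # no "test" prefix, so pytest does not run it as a test
    return 42
```

Plain ``assert`` statements are all that is needed; pytest rewrites them so
that failures report the values involved.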
Some tests have internal side effects that need to be cleaned up after their
execution (such as created figures or modified rc params). The pytest fixture
:func:`~matplotlib.testing.conftest.mpl_test_settings` will automatically clean
these up; there is no need to do anything further.
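For instance, a test like the following sketch (the name is hypothetical; only
the standard pyplot API is assumed) can create a figure and simply leave it
open when run under Matplotlib's suite, since the fixture restores state
afterwards. Here the figure is closed explicitly so the sketch stands alone:

```python
import matplotlib
matplotlib.use('Agg')  # headless backend so the example runs anywhere
import matplotlib.pyplot as plt

def test_create_figure():
    """Hypothetical test whose side effect is an open pyplot figure."""
    fig, ax = plt.subplots()
    ax.plot([1, 2, 3])
    assert len(plt.get_fignums()) == 1
    # Within Matplotlib's suite the mpl_test_settings fixture would clean
    # this up automatically; we close explicitly to keep the sketch
    # self-contained.
    plt.close(fig)

test_create_figure()
```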
Writing an image comparison test
--------------------------------
Known failing tests
-------------------

If you're writing a test, you may mark it as a known failing test with the
:func:`pytest.mark.xfail` decorator. This allows the test to be added to the
test suite and run on the buildbots without causing undue alarm. For example,
although the following test will fail, it is an expected failure::

    import pytest

    @pytest.mark.xfail
    def test_simple_fail():
        '''very simple example test that should fail'''
        assert 1 + 1 == 3
Note that the first argument to the :func:`~pytest.mark.xfail` decorator is a
fail condition, which can be a value such as True or False, or may be a
dynamically evaluated expression. If a condition is supplied, then a reason
must also be supplied with the ``reason='message'`` keyword argument.
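A sketch of the conditional form (the platform condition, test name, and
reason below are invented for illustration):

```python
import sys
import pytest

# Hypothetical conditional xfail: the test is expected to fail only on
# Windows.  When a condition is given, a reason keyword is required.
@pytest.mark.xfail(sys.platform == 'win32',
                   reason='illustrative: exercises a POSIX-only code path')
def test_posix_behaviour():
    assert not sys.platform.startswith('win')
```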
Creating a new module in matplotlib.tests
-----------------------------------------

We try to keep the tests categorized by the primary module they are
testing. For example, the tests related to the ``mathtext.py`` module
are in ``test_mathtext.py``.

Using Travis CI
---------------