Figure equality-based tests. #11408


Merged: 1 commit into matplotlib:master from anntzer:check_figures_equal on Jun 23, 2018

Conversation

@anntzer (Contributor) commented Jun 10, 2018

Implement a `check_figures_equal` decorator, which allows tests where
both the reference and the test image are generated. The idea is to
allow tests of the form "this feature should be equivalent to that
(usually more complex) way of achieving the same thing" without further
bloating the test image directory.

The saved images are properly created in the `result_images` folder, but
cannot be "accepted" or "rejected" using the `triage_tests` UI (as there
is indeed no reference image to be saved in the repo).

Includes an example use case. Other PRs that could use this feature
include #9426 and #11407.

PR Summary

PR Checklist

  • Has Pytest style unit tests
  • Code is PEP 8 compliant
  • New features are documented, with examples if plot related
  • Documentation is sphinx and numpydoc compliant
  • Added an entry to doc/users/next_whats_new/ if major new feature (follow instructions in README.rst there)
  • Documented in doc/api/api_changes.rst if API changed in a backward-incompatible way

@tacaswell (Member)

I like this!

@tacaswell tacaswell added this to the v3.0 milestone Jun 10, 2018
@anntzer anntzer force-pushed the check_figures_equal branch from 2eec7e8 to 91354a1 Compare June 10, 2018 21:02
@jklymak (Member) commented Jun 11, 2018

Seems useful. However, the new test is failing 😉

@anntzer (Contributor, Author) commented Jun 11, 2018

Yes, and I can't reproduce the failure locally... Can anyone else give it a try?

@afvincent (Contributor)

@anntzer I just pulled your PR branch and ran test_axes.py: no failure on my local machine either :/.

Python 3.6.4 (from conda)
Matplotlib master from GitHub (obviously)
Fedora 28

@jklymak (Member) commented Jun 12, 2018

Yeah, sorry, it works on my machine as well (though I skip svg).

@jklymak (Member) commented Jun 14, 2018

Interestingly, my machine is a Mac and the test passes locally, and it also passes the Travis Mac builds, but not the Linux ones. No idea what's going on here.

@anntzer anntzer force-pushed the check_figures_equal branch from 91354a1 to 3e16090 Compare June 15, 2018 21:55
@anntzer anntzer force-pushed the check_figures_equal branch from 3e16090 to 9cd3fbe Compare June 19, 2018 17:00
@anntzer (Contributor, Author) commented Jun 19, 2018

I'm almost certain the failure already existed on svg and pdf; we just didn't test those two formats (only png) before. It is likely due to the old versions of Inkscape and Ghostscript (gs) on Travis, so I made the test check only the png format.

The new failure on Py3.5 is likely unrelated too.

@timhoffm timhoffm merged commit 3a5d8c2 into matplotlib:master Jun 23, 2018
@anntzer anntzer deleted the check_figures_equal branch June 23, 2018 17:54
@anntzer anntzer mentioned this pull request Jul 16, 2018