
Matplotlib git master version fails to pass several pytest tests. #17310


Closed
hongyi-zhao opened this issue May 3, 2020 · 15 comments

@hongyi-zhao

Hi,

The environment is as follows: Ubuntu 19.10, Python 3.8.2 managed by pyenv, and the matplotlib git master version. See the following for the testing steps and results:

$ sudo apt-get build-dep python3-matplotlib python3-matplotlib-venn
$ pip install -e .[dev,extra,all]
$ pytest
========================================== test session starts ==========================================
platform linux -- Python 3.8.2, pytest-5.4.1, py-1.5.4, pluggy-0.13.1
rootdir: /home/werner/Public/hpc/tools/matplotlib.git, inifile: pytest.ini, testpaths: lib
plugins: cov-2.8.1, datadir-1.3.1, regressions-1.0.6, timeout-1.3.4
collected 7591 items     
[...]
================================================ FAILURES =================================================
___________________________________________ test_pdflatex[pdf] ____________________________________________

expected = '/home/werner/Public/hpc/tools/matplotlib.git/result_images/test_backend_pgf/pgf_pdflatex-expected.pdf'
actual = PosixPath('/home/werner/Public/hpc/tools/matplotlib.git/result_images/test_backend_pgf/pgf_pdflatex.pdf')
tol = 0

    def _raise_on_image_difference(expected, actual, tol):
        __tracebackhide__ = True
    
        err = compare_images(expected, actual, tol, in_decorator=True)
        if err:
            for key in ["actual", "expected"]:
                err[key] = os.path.relpath(err[key])
>           raise ImageComparisonFailure(
                'images not close (RMS %(rms).3f):\n\t%(actual)s\n\t%(expected)s '
                 % err)
E           matplotlib.testing.exceptions.ImageComparisonFailure: images not close (RMS 11.669):
E           	result_images/test_backend_pgf/pgf_pdflatex_pdf.png
E           	result_images/test_backend_pgf/pgf_pdflatex-expected_pdf.png

lib/matplotlib/testing/decorators.py:135: ImageComparisonFailure
______________________________________________ test_rcupdate ______________________________________________

    @needs_xelatex
    @needs_pdflatex
    @pytest.mark.skipif(not _has_sfmath(), reason='needs sfmath.sty')
    @pytest.mark.style('default')
    @pytest.mark.backend('pgf')
    def test_rcupdate():
        rc_sets = [{'font.family': 'sans-serif',
                    'font.size': 30,
                    'figure.subplot.left': .2,
                    'lines.markersize': 10,
                    'pgf.rcfonts': False,
                    'pgf.texsystem': 'xelatex'},
                   {'font.family': 'monospace',
                    'font.size': 10,
                    'figure.subplot.left': .1,
                    'lines.markersize': 20,
                    'pgf.rcfonts': False,
                    'pgf.texsystem': 'pdflatex',
                    'pgf.preamble': ('\\usepackage[utf8x]{inputenc}'
                                     '\\usepackage[T1]{fontenc}'
                                     '\\usepackage{sfmath}')}]
        tol = [6, 0]
        for i, rc_set in enumerate(rc_sets):
            with mpl.rc_context(rc_set):
                create_figure()
>               compare_figure('pgf_rcupdate%d.pdf' % (i + 1), tol=tol[i])

lib/matplotlib/tests/test_backend_pgf.py:157: 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

fname = 'pgf_rcupdate2.pdf', savefig_kwargs = {}, tol = 0

    def compare_figure(fname, savefig_kwargs={}, tol=0):
        actual = os.path.join(result_dir, fname)
        plt.savefig(actual, **savefig_kwargs)
    
        expected = os.path.join(result_dir, "expected_%s" % fname)
        shutil.copyfile(os.path.join(baseline_dir, fname), expected)
        err = compare_images(expected, actual, tol=tol)
        if err:
>           raise ImageComparisonFailure(err)
E           matplotlib.testing.exceptions.ImageComparisonFailure: Error: Image files did not match.
E             RMS Value: 13.166575390337284
E             Expected:  
E               /home/werner/Public/hpc/tools/matplotlib.git/result_images/test_backend_pgf/expected_pgf_rcupdate2_pdf.png
E             Actual:    
E               /home/werner/Public/hpc/tools/matplotlib.git/result_images/test_backend_pgf/pgf_rcupdate2_pdf.png
E             Difference:
E               /home/werner/Public/hpc/tools/matplotlib.git/result_images/test_backend_pgf/pgf_rcupdate2_pdf-failed-diff.png
E             Tolerance: 
E               0

lib/matplotlib/tests/test_backend_pgf.py:63: ImageComparisonFailure
____________________________________________ test_usetex[png] _____________________________________________

expected = '/home/werner/Public/hpc/tools/matplotlib.git/result_images/test_usetex/test_usetex-expected.png'
actual = PosixPath('/home/werner/Public/hpc/tools/matplotlib.git/result_images/test_usetex/test_usetex.png')
tol = 0

    def _raise_on_image_difference(expected, actual, tol):
        __tracebackhide__ = True
    
        err = compare_images(expected, actual, tol, in_decorator=True)
        if err:
            for key in ["actual", "expected"]:
                err[key] = os.path.relpath(err[key])
>           raise ImageComparisonFailure(
                'images not close (RMS %(rms).3f):\n\t%(actual)s\n\t%(expected)s '
                 % err)
E           matplotlib.testing.exceptions.ImageComparisonFailure: images not close (RMS 12.877):
E           	result_images/test_usetex/test_usetex.png
E           	result_images/test_usetex/test_usetex-expected.png

lib/matplotlib/testing/decorators.py:135: ImageComparisonFailure
========================================= short test summary info =========================================
FAILED lib/matplotlib/tests/test_backend_pgf.py::test_pdflatex[pdf] - matplotlib.testing.exceptions.Imag...
FAILED lib/matplotlib/tests/test_backend_pgf.py::test_rcupdate - matplotlib.testing.exceptions.ImageComp...
FAILED lib/matplotlib/tests/test_usetex.py::test_usetex[png] - matplotlib.testing.exceptions.ImageCompar...
=================== 3 failed, 7517 passed, 60 skipped, 11 xfailed in 630.55s (0:10:30) ====================

Any hints for solving these problems?

Regards

@timhoffm
Member

timhoffm commented May 3, 2020

Compare the actual and expected images in the result_images folder. That tells you what's unexpected and gives a hint as to what's causing the failure.
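
If you want to re-run a single comparison outside pytest, matplotlib's own helper can be reused directly. A minimal sketch, assuming the paths below point at one of the failing pairs from the traceback (adjust them to your checkout):

from matplotlib.testing.compare import compare_images

# File names taken from the test_pdflatex[pdf] failure above.
expected = "result_images/test_backend_pgf/pgf_pdflatex-expected_pdf.png"
actual = "result_images/test_backend_pgf/pgf_pdflatex_pdf.png"

# compare_images returns None when the images match within tol,
# otherwise a message with the RMS difference and the diff image path.
err = compare_images(expected, actual, tol=0)
print(err or "images match")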

@hongyi-zhao
Author

hongyi-zhao commented May 3, 2020

I found all of the failed-diff files, as follows:

werner@ubuntu-01:~/Public/hpc/tools/matplotlib.git/result_images$ find . -type f -name '*failed-diff*'
./test_compare_images/all127-failed-diff.png
./test_compare_images/all128-failed-diff.png
./test_compare_images/basn3p02-failed-diff.png
./test_backend_pgf/pgf_rcupdate2_pdf-failed-diff.png
./test_backend_pgf/pgf_pdflatex_pdf-failed-diff.png
./test_usetex/test_usetex-failed-diff.png

But some of them are "empty", with no visible content, while others look garbled. I still cannot figure out the reason. See the following for your information:

[Four screenshots of the failed-diff images were attached here.]

@anntzer
Contributor

anntzer commented May 3, 2020

What is your version of dvipng? If it is 1.16, there's a known issue about that (#16476 (comment)). Otherwise, please delete your ~/.cache/matplotlib/tex.cache and try again.
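
If you don't want to hard-code the path, the cache directory can also be located and cleared programmatically. A minimal sketch using matplotlib's public get_cachedir() (the tex.cache subdirectory only exists once usetex has been exercised):

import shutil
from pathlib import Path

import matplotlib

# Resolves to ~/.cache/matplotlib on a typical Linux setup.
tex_cache = Path(matplotlib.get_cachedir()) / "tex.cache"
if tex_cache.is_dir():
    shutil.rmtree(tex_cache)
    print("removed", tex_cache)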

@hongyi-zhao
Author

hongyi-zhao commented May 4, 2020

Now I have updated to dvipng 1.17, shipped with TeX Live 2020; see below:

$ dvipng --version
This is dvipng 1.17 Copyright 2002-2015, 2019 Jan-Ake Larsson
dvipng (TeX Live) 1.17
kpathsea version 6.3.2
Compiled with Freetype 2.10.1
Using libft 2.10.1
Copyright (C) 2002-2015, 2019 Jan-Ake Larsson.
There is NO warranty.  You may redistribute this software
under the terms of the GNU Lesser General Public License
version 3, see the COPYING file in the dvipng distribution
or <http://www.gnu.org/licenses/>.

But pytest still throws errors; see below:


$ pytest
=============================== test session starts ===============================
platform linux -- Python 3.8.2, pytest-5.4.1, py-1.5.4, pluggy-0.13.1
rootdir: /home/werner/Public/hpc/tools/matplotlib.git, inifile: pytest.ini, testpaths: lib
plugins: cov-2.8.1, datadir-1.3.1, regressions-1.0.6, timeout-1.3.4
collected 7593 items                                                  
[...]
==================================== FAILURES =====================================
_______________________________ test_pdflatex[pdf] ________________________________

expected = '/home/werner/Public/hpc/tools/matplotlib.git/result_images/test_backend_pgf/pgf_pdflatex-expected.pdf'
actual = PosixPath('/home/werner/Public/hpc/tools/matplotlib.git/result_images/test_backend_pgf/pgf_pdflatex.pdf')
tol = 0

    def _raise_on_image_difference(expected, actual, tol):
        __tracebackhide__ = True
    
        err = compare_images(expected, actual, tol, in_decorator=True)
        if err:
            for key in ["actual", "expected"]:
                err[key] = os.path.relpath(err[key])
>           raise ImageComparisonFailure(
                'images not close (RMS %(rms).3f):\n\t%(actual)s\n\t%(expected)s '
                 % err)
E           matplotlib.testing.exceptions.ImageComparisonFailure: images not close (RMS 11.669):
E           	result_images/test_backend_pgf/pgf_pdflatex_pdf.png
E           	result_images/test_backend_pgf/pgf_pdflatex-expected_pdf.png

lib/matplotlib/testing/decorators.py:135: ImageComparisonFailure
__________________________________ test_rcupdate __________________________________

    @needs_xelatex
    @needs_pdflatex
    @pytest.mark.skipif(not _has_sfmath(), reason='needs sfmath.sty')
    @pytest.mark.style('default')
    @pytest.mark.backend('pgf')
    def test_rcupdate():
        rc_sets = [{'font.family': 'sans-serif',
                    'font.size': 30,
                    'figure.subplot.left': .2,
                    'lines.markersize': 10,
                    'pgf.rcfonts': False,
                    'pgf.texsystem': 'xelatex'},
                   {'font.family': 'monospace',
                    'font.size': 10,
                    'figure.subplot.left': .1,
                    'lines.markersize': 20,
                    'pgf.rcfonts': False,
                    'pgf.texsystem': 'pdflatex',
                    'pgf.preamble': ('\\usepackage[utf8x]{inputenc}'
                                     '\\usepackage[T1]{fontenc}'
                                     '\\usepackage{sfmath}')}]
        tol = [6, 0]
        for i, rc_set in enumerate(rc_sets):
            with mpl.rc_context(rc_set):
                create_figure()
>               compare_figure('pgf_rcupdate%d.pdf' % (i + 1), tol=tol[i])

lib/matplotlib/tests/test_backend_pgf.py:157: 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

fname = 'pgf_rcupdate2.pdf', savefig_kwargs = {}, tol = 0

    def compare_figure(fname, savefig_kwargs={}, tol=0):
        actual = os.path.join(result_dir, fname)
        plt.savefig(actual, **savefig_kwargs)
    
        expected = os.path.join(result_dir, "expected_%s" % fname)
        shutil.copyfile(os.path.join(baseline_dir, fname), expected)
        err = compare_images(expected, actual, tol=tol)
        if err:
>           raise ImageComparisonFailure(err)
E           matplotlib.testing.exceptions.ImageComparisonFailure: Error: Image files did not match.
E             RMS Value: 13.166575390337284
E             Expected:  
E               /home/werner/Public/hpc/tools/matplotlib.git/result_images/test_backend_pgf/expected_pgf_rcupdate2_pdf.png
E             Actual:    
E               /home/werner/Public/hpc/tools/matplotlib.git/result_images/test_backend_pgf/pgf_rcupdate2_pdf.png
E             Difference:
E               /home/werner/Public/hpc/tools/matplotlib.git/result_images/test_backend_pgf/pgf_rcupdate2_pdf-failed-diff.png
E             Tolerance: 
E               0

lib/matplotlib/tests/test_backend_pgf.py:63: ImageComparisonFailure
============================= short test summary info =============================
FAILED lib/matplotlib/tests/test_backend_pgf.py::test_pdflatex[pdf] - matplotlib...
FAILED lib/matplotlib/tests/test_backend_pgf.py::test_rcupdate - matplotlib.test...
======= 2 failed, 7520 passed, 60 skipped, 11 xfailed in 627.69s (0:10:27) ========

werner@ubuntu-01:~/Public/hpc/tools/matplotlib.git/result_images$ find . -type f -name '*failed-diff*'
./test_compare_images/all127-failed-diff.png
./test_compare_images/all128-failed-diff.png
./test_compare_images/basn3p02-failed-diff.png
./test_backend_pgf/pgf_rcupdate2_pdf-failed-diff.png
./test_backend_pgf/pgf_pdflatex_pdf-failed-diff.png

@tacaswell
Member

But some of them are "empty", with no visible content,

Almost all of our tests are set to zero difference thresholds, so some of the "empty" diff images may have only a few pixels off by a few bits, which is extremely hard to see by eye. That said, the three images in test_compare_images/ are red herrings: they test that the comparison does fail when expected.

@tacaswell tacaswell added this to the v3.3.0 milestone May 4, 2020
@hongyi-zhao
Author

But what's the meaning of this in my testing results:

60 skipped, 11 xfailed

Regards

@tacaswell
Member

Sorry for the very slow response.

60 skipped, 11 xfailed

That means 60 tests were skipped (usually because you are missing an optional dependency), and 11 tests failed that we expected to fail (some of these document known bugs; at least one verifies that the comparison machinery fails when we think it should).
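
For reference, the two outcomes come from pytest markers like these. A minimal, self-contained sketch; the conditions and reasons are illustrative, not taken from matplotlib's suite:

import sys

import pytest

@pytest.mark.skipif(sys.platform == "win32", reason="needs a POSIX shell")
def test_posix_only():
    # Reported as "skipped" on Windows.
    assert True

@pytest.mark.xfail(reason="documents a known, not-yet-fixed bug")
def test_known_bug():
    # Reported as "xfailed" as long as it keeps failing.
    assert 1 == 2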

@hongyi-zhao
Author

hongyi-zhao commented Jun 2, 2020

Sorry for the very slow response.

60 skipped, 11 xfailed

That means 60 tests were skipped (usually because you are missing an optional dependency),

How can I see the full list of these dependencies?

I installed the git master version of matplotlib using the following command in its local repo folder:
$ pip install -e .

and 11 tests failed that we expected to fail (some of these document known bugs; at least one verifies that the comparison machinery fails when we think it should).

@tacaswell
Member

Look at the files in https://github.com/matplotlib/matplotlib/tree/master/requirements/testing for what we install on CI.

I think we have some platform-specific tests, so it may not be possible to run every test on your system (this is why we test on Windows, Linux, and OSX as part of CI).

I suspect you are missing one or more of the GUI frameworks.
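
A quick way to see why each test was skipped on your machine is pytest's skip report; the -rs flag lists the skip reasons in the short test summary, which usually names the missing module or GUI framework:

$ pytest -rs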

@hongyi-zhao
Author

Look at the files in https://github.com/matplotlib/matplotlib/tree/master/requirements/testing for what we install on CI.

That's still not quite clear. There are several files; do you mean I should install all the packages listed in these files in order to run the tests?

I think we have some platform-specific tests, so it may not be possible to run every test on your system (this is why we test on Windows, Linux, and OSX as part of CI).

I suspect you are missing one or more of the GUI frameworks.

@hongyi-zhao
Author

I installed the requirements listed there with the following commands:

$ pip install -r flake8.txt
$ pip install -r travis_all.txt
$ pip install -r travis_extra.txt

Then run pytest and obtain the following results:

===================================== FAILURES ======================================
_____________________________ test_use14corefonts[pdf] ______________________________

expected = '/home/werner/Public/hpc/tools/matplotlib.git/result_images/test_backend_pdf/pdf_use14corefonts-expected.pdf'
actual = PosixPath('/home/werner/Public/hpc/tools/matplotlib.git/result_images/test_backend_pdf/pdf_use14corefonts.pdf')
tol = 0

    def _raise_on_image_difference(expected, actual, tol):
        __tracebackhide__ = True
    
        err = compare_images(expected, actual, tol, in_decorator=True)
        if err:
            for key in ["actual", "expected", "diff"]:
                err[key] = os.path.relpath(err[key])
>           raise ImageComparisonFailure(
                ('images not close (RMS %(rms).3f):'
                    '\n\t%(actual)s\n\t%(expected)s\n\t%(diff)s') % err)
E           matplotlib.testing.exceptions.ImageComparisonFailure: images not close (RMS 3.233):
E           	result_images/test_backend_pdf/pdf_use14corefonts_pdf.png
E           	result_images/test_backend_pdf/pdf_use14corefonts-expected_pdf.png
E           	result_images/test_backend_pdf/pdf_use14corefonts_pdf-failed-diff.png

lib/matplotlib/testing/decorators.py:139: ImageComparisonFailure
_________________________________ test_xelatex[pdf] _________________________________

expected = '/home/werner/Public/hpc/tools/matplotlib.git/result_images/test_backend_pgf/pgf_xelatex-expected.pdf'
actual = PosixPath('/home/werner/Public/hpc/tools/matplotlib.git/result_images/test_backend_pgf/pgf_xelatex.pdf')
tol = 0

    def _raise_on_image_difference(expected, actual, tol):
        __tracebackhide__ = True
    
        err = compare_images(expected, actual, tol, in_decorator=True)
        if err:
            for key in ["actual", "expected", "diff"]:
                err[key] = os.path.relpath(err[key])
>           raise ImageComparisonFailure(
                ('images not close (RMS %(rms).3f):'
                    '\n\t%(actual)s\n\t%(expected)s\n\t%(diff)s') % err)
E           matplotlib.testing.exceptions.ImageComparisonFailure: images not close (RMS 1.043):
E           	result_images/test_backend_pgf/pgf_xelatex_pdf.png
E           	result_images/test_backend_pgf/pgf_xelatex-expected_pdf.png
E           	result_images/test_backend_pgf/pgf_xelatex_pdf-failed-diff.png

lib/matplotlib/testing/decorators.py:139: ImageComparisonFailure
________________________________ test_pdflatex[pdf] _________________________________

expected = '/home/werner/Public/hpc/tools/matplotlib.git/result_images/test_backend_pgf/pgf_pdflatex-expected.pdf'
actual = PosixPath('/home/werner/Public/hpc/tools/matplotlib.git/result_images/test_backend_pgf/pgf_pdflatex.pdf')
tol = 0

    def _raise_on_image_difference(expected, actual, tol):
        __tracebackhide__ = True
    
        err = compare_images(expected, actual, tol, in_decorator=True)
        if err:
            for key in ["actual", "expected", "diff"]:
                err[key] = os.path.relpath(err[key])
>           raise ImageComparisonFailure(
                ('images not close (RMS %(rms).3f):'
                    '\n\t%(actual)s\n\t%(expected)s\n\t%(diff)s') % err)
E           matplotlib.testing.exceptions.ImageComparisonFailure: images not close (RMS 11.704):
E           	result_images/test_backend_pgf/pgf_pdflatex_pdf.png
E           	result_images/test_backend_pgf/pgf_pdflatex-expected_pdf.png
E           	result_images/test_backend_pgf/pgf_pdflatex_pdf-failed-diff.png

lib/matplotlib/testing/decorators.py:139: ImageComparisonFailure
___________________________________ test_rcupdate ___________________________________

    @needs_xelatex
    @needs_pdflatex
    @pytest.mark.skipif(not _has_sfmath(), reason='needs sfmath.sty')
    @pytest.mark.style('default')
    @pytest.mark.backend('pgf')
    def test_rcupdate():
        rc_sets = [{'font.family': 'sans-serif',
                    'font.size': 30,
                    'figure.subplot.left': .2,
                    'lines.markersize': 10,
                    'pgf.rcfonts': False,
                    'pgf.texsystem': 'xelatex'},
                   {'font.family': 'monospace',
                    'font.size': 10,
                    'figure.subplot.left': .1,
                    'lines.markersize': 20,
                    'pgf.rcfonts': False,
                    'pgf.texsystem': 'pdflatex',
                    'pgf.preamble': ('\\usepackage[utf8x]{inputenc}'
                                     '\\usepackage[T1]{fontenc}'
                                     '\\usepackage{sfmath}')}]
        tol = [6, 0]
        for i, rc_set in enumerate(rc_sets):
            with mpl.rc_context(rc_set):
                create_figure()
>               compare_figure('pgf_rcupdate%d.pdf' % (i + 1), tol=tol[i])

lib/matplotlib/tests/test_backend_pgf.py:157: 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

fname = 'pgf_rcupdate2.pdf', savefig_kwargs = {}, tol = 0

    def compare_figure(fname, savefig_kwargs={}, tol=0):
        actual = os.path.join(result_dir, fname)
        plt.savefig(actual, **savefig_kwargs)
    
        expected = os.path.join(result_dir, "expected_%s" % fname)
        shutil.copyfile(os.path.join(baseline_dir, fname), expected)
        err = compare_images(expected, actual, tol=tol)
        if err:
>           raise ImageComparisonFailure(err)
E           matplotlib.testing.exceptions.ImageComparisonFailure: Error: Image files did not match.
E             RMS Value: 13.166575390337284
E             Expected:  
E               /home/werner/Public/hpc/tools/matplotlib.git/result_images/test_backend_pgf/expected_pgf_rcupdate2_pdf.png
E             Actual:    
E               /home/werner/Public/hpc/tools/matplotlib.git/result_images/test_backend_pgf/pgf_rcupdate2_pdf.png
E             Difference:
E               /home/werner/Public/hpc/tools/matplotlib.git/result_images/test_backend_pgf/pgf_rcupdate2_pdf-failed-diff.png
E             Tolerance: 
E               0

lib/matplotlib/tests/test_backend_pgf.py:63: ImageComparisonFailure
________________________________ test_mixedmode[pdf] ________________________________

expected = '/home/werner/Public/hpc/tools/matplotlib.git/result_images/test_backend_pgf/pgf_mixedmode-expected.pdf'
actual = PosixPath('/home/werner/Public/hpc/tools/matplotlib.git/result_images/test_backend_pgf/pgf_mixedmode.pdf')
tol = 1.086

    def _raise_on_image_difference(expected, actual, tol):
        __tracebackhide__ = True
    
        err = compare_images(expected, actual, tol, in_decorator=True)
        if err:
            for key in ["actual", "expected", "diff"]:
                err[key] = os.path.relpath(err[key])
>           raise ImageComparisonFailure(
                ('images not close (RMS %(rms).3f):'
                    '\n\t%(actual)s\n\t%(expected)s\n\t%(diff)s') % err)
E           matplotlib.testing.exceptions.ImageComparisonFailure: images not close (RMS 5.063):
E           	result_images/test_backend_pgf/pgf_mixedmode_pdf.png
E           	result_images/test_backend_pgf/pgf_mixedmode-expected_pdf.png
E           	result_images/test_backend_pgf/pgf_mixedmode_pdf-failed-diff.png

lib/matplotlib/testing/decorators.py:139: ImageComparisonFailure
______________________________ test_font_scaling[pdf] _______________________________

expected = '/home/werner/Public/hpc/tools/matplotlib.git/result_images/test_text/font_scaling-expected.pdf'
actual = PosixPath('/home/werner/Public/hpc/tools/matplotlib.git/result_images/test_text/font_scaling.pdf')
tol = 0

    def _raise_on_image_difference(expected, actual, tol):
        __tracebackhide__ = True
    
        err = compare_images(expected, actual, tol, in_decorator=True)
        if err:
            for key in ["actual", "expected", "diff"]:
                err[key] = os.path.relpath(err[key])
>           raise ImageComparisonFailure(
                ('images not close (RMS %(rms).3f):'
                    '\n\t%(actual)s\n\t%(expected)s\n\t%(diff)s') % err)
E           matplotlib.testing.exceptions.ImageComparisonFailure: images not close (RMS 2.177):
E           	result_images/test_text/font_scaling_pdf.png
E           	result_images/test_text/font_scaling-expected_pdf.png
E           	result_images/test_text/font_scaling_pdf-failed-diff.png

lib/matplotlib/testing/decorators.py:139: ImageComparisonFailure
================================= warnings summary ==================================
lib/matplotlib/widgets.py:1911
  /home/werner/Public/hpc/tools/matplotlib.git/lib/matplotlib/widgets.py:1911: DeprecationWarning: invalid escape sequence \s
    """

-- Docs: https://docs.pytest.org/en/latest/warnings.html
============================== short test summary info ==============================
FAILED lib/matplotlib/tests/test_backend_pdf.py::test_use14corefonts[pdf] - matplo...
FAILED lib/matplotlib/tests/test_backend_pgf.py::test_xelatex[pdf] - matplotlib.te...
FAILED lib/matplotlib/tests/test_backend_pgf.py::test_pdflatex[pdf] - matplotlib.t...
FAILED lib/matplotlib/tests/test_backend_pgf.py::test_rcupdate - matplotlib.testin...
FAILED lib/matplotlib/tests/test_backend_pgf.py::test_mixedmode[pdf] - matplotlib....
FAILED lib/matplotlib/tests/test_text.py::test_font_scaling[pdf] - matplotlib.test...
== 6 failed, 6979 passed, 666 skipped, 11 xfailed, 1 warning in 574.61s (0:09:34) ===

@tacaswell tacaswell modified the milestones: v3.3.0, v3.4.0 Jun 9, 2020
@tacaswell
Member

Moved this to 3.4, as I think the pdf/pgf failures are due to LaTeX changing under us.

@hongyi-zhao
Author

hongyi-zhao commented Jun 9, 2020

@tacaswell Have you tested them?

@QuLogic
Member

QuLogic commented Jun 30, 2020

For test_usetex, I can always reproduce that failure on Fedora 31+ (it used to be okay on 30, I think). For the pgf tests, the failure can be reproduced on Fedora Rawhide only, because it has a newer TeX Live. Somewhere in the LaTeX stack, the degree glyph was switched from Computer Modern to Latin Modern (which matches the rest of the text), so the line is a bit different now.

@QuLogic QuLogic modified the milestones: v3.4.0, v3.5.0 Jan 27, 2021
@QuLogic
Member

QuLogic commented Jun 3, 2021

Those remaining tests have now got updated images, because we no longer run CI on Ubuntu 16.04, which needed the old ones. So I'm closing this.

@QuLogic QuLogic closed this as completed Jun 3, 2021