Enable flake8 and re-enable it everywhere #11477

Merged 7 commits into matplotlib:master on Jul 8, 2018

Conversation

@QuLogic (Member) commented Jun 22, 2018

PR Summary

Due to the removal of the examples symlink in #11141, pytest was no longer seeing the examples and running pep8 on them. This adds that directory back to pytest's search.

It also adds a rebase of #10940 to enable flake8.

PR Checklist

  • [N/A] Has Pytest style unit tests
  • Code is PEP 8 compliant
  • [N/A] New features are documented, with examples if plot related
  • [N/A] Documentation is sphinx and numpydoc compliant
  • [N/A] Added an entry to doc/users/next_whats_new/ if major new feature (follow instructions in README.rst there)
  • [N/A] Documented in doc/api/api_changes.rst if API changed in a backward-incompatible way

@QuLogic QuLogic added this to the v3.0 milestone Jun 22, 2018
@jklymak jklymak added the "Release critical" label Jun 22, 2018
@QuLogic QuLogic mentioned this pull request Jun 22, 2018
@QuLogic (Member, Author) commented Jun 22, 2018

The meaning of the additional flake8 exceptions is described here by @cclauss.

pytest.ini Outdated
@@ -7,8 +7,9 @@ markers =
network: Mark a test that uses the network.
style: Set alternate Matplotlib style temporarily.

pep8ignore =
* E111 E114 E115 E116 E121 E122 E123 E124 E125 E126 E127 E128 E129 E131 E226 E240 E241 E242 E243 E244 E245 E246 E247 E248 E249 E265 E266 E704 W503
flake8-max-line-length = 80
Contributor:

Shouldn't this be left at 79 (the default, i.e. remove this line)?

Member Author:

I got this from @cclauss's PR, but I think you're right, so I'm going to remove it.

Member:

Sorry, off-by-one bug on my part.

@cclauss commented Jun 22, 2018

LOL... #10940 (comment)

@QuLogic (Member, Author) commented Jun 23, 2018

Well, okay, now pyside(2) seems to be broken.

Also, I have no idea why, but flake8 makes the tests ridiculously slow. I ran a non-parallel test locally and it took ~7 minutes, but with the --flake8 flag it had not finished after an hour.

@timhoffm (Member):

Maybe related: tholo/pytest-flake8#19?

@anntzer (Contributor) commented Jul 2, 2018

If #11477 (comment) is still an issue, perhaps replace pytest-flake8 by a direct invocation of flake8? (with e.g. https://github.com/snoack/flake8-per-file-ignores for per-file ignores)
(TBH I don't really see the point of a pytest plugin that just calls another executable (if it is not active by default).)
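For reference, the per-file-ignores style of configuration mentioned here takes roughly this shape; the section name follows flake8 convention, but the paths and error codes below are hypothetical examples, not this PR's actual settings:

```ini
[flake8]
per-file-ignores =
    examples/*: E501
    lib/matplotlib/tests/*: E402
```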

@jklymak (Member) commented Jul 6, 2018

I ran

  • pytest -v --pep8 lib/matplotlib/tests/test_axes.py: 143.09 s
  • pytest -v --flake8 lib/matplotlib/tests/test_axes.py: 131.09 s

Granted, I didn't have the exceptions turned on or anything (the pep8 test failed).

@cclauss commented Jul 6, 2018

Try running pep8 and flake8 WITHOUT passing through pytest.

@QuLogic (Member, Author) commented Jul 7, 2018

Well, I tried conda, virtualenv, and even downloading the Docker image that Travis used, but I cannot reproduce the slow checks anymore. I guess standalone flake8 is the only option; the only annoyance is that the configuration will have to go into a new file, as pytest.ini is not a file flake8 supports.

@cclauss commented Jul 7, 2018

From http://flake8.pycqa.org/en/latest/user/invocation.html

$ flake8 . --config=CONFIG

(CONFIG is the path to the config file that will be the authoritative config source; this causes Flake8 to ignore all other configuration files.)

@tacaswell (Member):

Is the issue that pytest spins up a new flake8 process for each file, which just takes forever on the Travis VMs?

@tacaswell (Member):

Watching a running test, it is not hung; from 91% on it just seems to be taking its time (and then we hit the 50-minute timeout).

@cclauss commented Jul 7, 2018

One minute, 10 seconds to run flake8 over the entire repo seems reasonable to me.

$ time flake8 . --config=pytest.ini # Python 3.7 on Travis CI

real	0m26.769s
user	1m10.204s
sys	0m0.953s

@cclauss left a review:

Running flake8 from inside pytest has proved to be a bad idea when each file in the repo has its own style rules. Running flake8 standalone with --config=pytest.ini seems the best approach.

@QuLogic QuLogic force-pushed the flake8 branch 2 times, most recently from 9d55403 to 6fa1391 Compare July 7, 2018 23:57
@QuLogic QuLogic changed the title Enable flake8 and re-enable pep8 on examples Enable flake8 and re-enable it everywhere Jul 7, 2018
@QuLogic (Member, Author) commented Jul 7, 2018

At some point in the near future, I would like to get rid of the ** exceptions, as they cause confusion when linting a single file (per-file-ignores thinks that the exception is unnecessary and raises its X100 error).

@jklymak (Member) commented Jul 8, 2018

Where does the flake8 output show up in the test? I can't see any evidence that it ran, but maybe I'm looking in the wrong spot...

@QuLogic (Member, Author) commented Jul 8, 2018

That's because it passed 😉 I'll add a message there to say it's okay.

@jklymak previously approved these changes Jul 8, 2018
@cclauss commented Jul 8, 2018

Fixes #11550

.travis.yml Outdated
@@ -160,6 +160,9 @@ before_script: |
script: |
echo "Calling pytest with the following arguments: $PYTEST_ADDOPTS"
python -mpytest
if [[ $RUN_FLAKE8 == 1 ]]; then
Member:

I suspect this is going to have the same issue we discovered in the pytzectomy PR: if the tests fail but flake8 passes, Travis will report success.

Contributor:

Probably set -e is the way to go?
Or add || return 1 at the end of each (relevant) line.

Member Author:

I don't like set -e; it bleeds into Travis' own code and causes spurious errors, so you need to remember to turn it off again. The last time that happened, we ended up just removing it. Instead, I split these into separate entries, which does fail correctly.
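The masking behaviour under discussion can be sketched with a toy script; the commands below are stand-ins, not the real .travis.yml contents:

```shell
# In a single multi-line `script:` block, only the last command's exit
# status determines the build result, so an earlier failure is masked.
run_block() {
  false   # stand-in for a failing `python -mpytest`
  true    # stand-in for a passing `flake8` run
}
run_block
echo "block exit: $?"   # prints "block exit: 0" -- the failure is masked
```

Splitting the commands into separate Travis script entries lets each exit status count on its own, which is the fix described above.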

@@ -25,13 +25,13 @@ def __get__(self, obj, objtype=None):


class TaggedValueMeta(type):
def __init__(cls, name, bases, dict):
Member:

This is used as a metaclass, so I am not sure that this change is correct (I am also not sure off the top of my head what self is in this case and when this __init__ gets called).

Member Author:

It may be that the object is a class/type here, but it's still an instance at this point, so I'm not sure calling it something other than self is warranted.
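A standalone sketch of the pattern (illustrative only, not matplotlib's actual class body) shows what the first argument of a metaclass __init__ is, which is the crux of the cls-vs-self question above:

```python
# Minimal, self-contained sketch of a metaclass __init__; the names
# mirror the diff but the behaviour here is purely illustrative.
class TaggedValueMeta(type):
    def __init__(cls, name, bases, namespace):
        # `cls` is the newly created class object (TaggedValue below).
        # From the metaclass's point of view it is the instance, so the
        # original code called it `self`; flake8 prefers `cls` here.
        super().__init__(name, bases, namespace)
        cls.created_name = name  # attach metadata to the new class


class TaggedValue(metaclass=TaggedValueMeta):
    pass


print(TaggedValue.created_name)  # prints "TaggedValue"
```

So both readings are defensible: the argument is an instance of the metaclass, but that instance happens to be a class, which is why linters conventionally want it named cls.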

@tacaswell (Member) left a review:

I think that the way this is called in travis will mask failures in the test suite from failing the test.

Anyone can clear this if this is fixed or they are sure I am wrong.

@jklymak jklymak mentioned this pull request Jul 8, 2018
@cclauss commented Jul 8, 2018

Do the flake8 tests first and the pytests after?

@jklymak (Member) commented Jul 8, 2018

@cclauss then it won't fail if flake8 fails?

#11597 shows that @tacaswell is indeed correct: a failing test still returns a positive Travis check the way this is set up now.

@jklymak jklymak dismissed their stale review July 8, 2018 16:48

Doesn't catch failing pytest tests anymore

@jklymak (Member) commented Jul 8, 2018

#11597 shows that adding set -e causes a failure when pytest should fail. Now testing whether it fails when there is a flake8 error.

@QuLogic (Member, Author) commented Jul 8, 2018

Splitting the two commands into separate entries should work fine.

@tacaswell tacaswell dismissed their stale review July 8, 2018 18:37

fixed.

@tacaswell tacaswell merged commit 94f41d1 into matplotlib:master Jul 8, 2018
@tacaswell (Member):

Thanks everyone!

Labels: Release critical (bugs that make the library unusable, e.g. segfaults or incorrect plots, and major regressions), topic: testing
6 participants