check azure build #24616

Closed

Conversation

TomAugspurger
Contributor

This should fail the numpy-dev build on Azure.

@TomAugspurger
Contributor Author

I'll try to turn this into a proper test that can actually be merged later.

@TomAugspurger
Contributor Author

https://dev.azure.com/pandas-dev/pandas/_build/results?buildId=6513 failed, so this check seems to be working correctly.

Most likely the old warnings are coming from code that is skipped in the (rather limited) numpy-dev env.

One possible option is to add a filterwarnings rule to all the builds that raises an error on any warning coming from pandas itself. I think pytest supports something like that.
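A minimal sketch of how that could look in setup.cfg, assuming we want pytest to escalate deprecation-type warnings attributed to pandas' own modules into errors (the categories and module pattern here are illustrative, not something the repo currently sets):

    [tool:pytest]
    # sketch: turn warnings attributed to pandas modules into errors
    # (filter format is action:message:category:module:lineno)
    filterwarnings =
        error::DeprecationWarning:pandas
        error::FutureWarning:pandas

The idea being that third-party deprecation warnings would stay as warnings, while anything coming from our own code fails the run.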

@codecov

codecov bot commented Jan 4, 2019

Codecov Report

Merging #24616 into master will increase coverage by <.01%.
The diff coverage is n/a.

Impacted file tree graph

@@            Coverage Diff             @@
##           master   #24616      +/-   ##
==========================================
+ Coverage   92.37%   92.38%   +<.01%     
==========================================
  Files         166      166              
  Lines       52396    52396              
==========================================
+ Hits        48403    48404       +1     
+ Misses       3993     3992       -1
Flag        Coverage Δ
#multiple   90.8% <ø> (ø) ⬆️
#single     43% <ø> (-0.01%) ⬇️

Impacted Files           Coverage Δ
pandas/util/testing.py   88.09% <0%> (+0.09%) ⬆️

Continue to review full report at Codecov.

Legend - Click here to learn more
Δ = absolute <relative> (impact), ø = not affected, ? = missing data
Powered by Codecov. Last update 19f715c...f027368. Read the comment docs.

1 similar comment

@jreback
Contributor

jreback commented Jan 4, 2019

One possible option is to add a filterwarnings rule to all the builds that raises an error on any warning coming from pandas itself. I think pytest supports something like that.

This is what setting PANDAS_TESTING_MODE is for (though we could migrate to doing it explicitly via pytest). Using -W error::DeprecationWarning, however, I could not get things to fail.
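For reference, a sketch of the two usual ways to feed that filter in (neither is what the CI does today, and whether it survives pandas' own warning handling, e.g. whatever PANDAS_TESTING_MODE installs, is exactly the open question):

    # via pytest's own -W option
    pytest pandas -W error::DeprecationWarning

    # or at the interpreter level, e.g. set in the CI job's environment
    PYTHONWARNINGS=error::DeprecationWarning pytest pandas

Filters applied later take precedence, so a filterwarnings entry in setup.cfg, a @pytest.mark.filterwarnings decorator, or a warnings.simplefilter call inside pandas itself can quietly undo the escalation, which would be consistent with not seeing any failures.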

@jorisvandenbossche
Member

Something else: locally, my pytest warning summary includes which test each warning is coming from. Anybody know why this is not the case on Travis / Azure?

@TomAugspurger
Contributor Author

TomAugspurger commented Jan 4, 2019 via email

@jorisvandenbossche
Member

I typically don't use that, but I just ran some of the tests with -n 2, and I still get the name of the test in the warning summary:

(dev) joris@joris-XPS-13-9350:~/scipy/pandas$ pytest pandas/tests/io/test_pytables.py -n 2
=============================================================================================== test session starts ================================================================================================
platform linux -- Python 3.5.5, pytest-3.6.2, py-1.5.4, pluggy-0.6.0
hypothesis profile 'default' -> database=DirectoryBasedExampleDatabase('/home/joris/scipy/pandas/.hypothesis/examples')
rootdir: /home/joris/scipy/pandas, inifile: setup.cfg
plugins: xdist-1.25.0, repeat-0.4.1, forked-0.2, hypothesis-3.70.3
gw0 [182] / gw1 [182]
......................................................................................................x.................................................s.............................                       [100%]
================================================================================================= warnings summary =================================================================================================
pandas/tests/io/test_pytables.py::TestHDFStore::()::test_legacy_table_read
  /home/joris/scipy/pandas/pandas/io/pytables.py:3953: FutureWarning: 
  Panel is deprecated and will be removed in a future version.
  The recommended way to represent these types of 3-dimensional data are with a MultiIndex on a DataFrame, via the Panel.to_frame() method
  Alternatively, you can use the xarray package http://xarray.pydata.org/en/stable/.
  Pandas provides a `.to_xarray()` method to help automate this conversion.
  
    return super(LegacyFrameTable, self).read(*args, **kwargs)['value']
  /home/joris/scipy/pandas/pandas/io/pytables.py:730: FutureWarning: 
  Panel is deprecated and will be removed in a future version.
  The recommended way to represent these types of 3-dimensional data are with a MultiIndex on a DataFrame, via the Panel.to_frame() method
  Alternatively, you can use the xarray package http://xarray.pydata.org/en/stable/.
  Pandas provides a `.to_xarray()` method to help automate this conversion.
  
    columns=columns)

-- Docs: http://doc.pytest.org/en/latest/warnings.html
========================================================================== 180 passed, 1 skipped, 1 xfailed, 2 warnings in 25.33 seconds ===========================================================================

@datapythonista datapythonista added the CI Continuous Integration label Jan 6, 2019
@jorisvandenbossche
Member

Should we consider expanding the installed libraries / test set that is run on this build to catch more of the possible warnings?

@TomAugspurger
Contributor Author

TomAugspurger commented Jan 6, 2019 via email

@TomAugspurger
Contributor Author

I'm probably not getting back to this anytime soon. Closing for now.

@TomAugspurger TomAugspurger deleted the azure-warning-test branch January 17, 2019 15:11