Commit 8355584

lesteve authored and jnothman committed
MNT CI Fix for sphinx-gallery 0.3.1 + 404 errors on Debian Jessie packages (scikit-learn#13527)
* Fix for sphinx-gallery 0.3.1: figure numbering has changed to ignore the matplotlib figure number.
* Use the circleci/python:3.6 image. This Docker image uses Debian stretch and fixes the problems seen with apt-get update on Debian jessie.
* Install additional fonts; this seems to be needed to convert doc/images/iris.svg.
* Fix another missed example using figure number 0.

[doc build]
1 parent 03fe036 · commit 8355584
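The core of the change across the example scripts is dropping explicit matplotlib figure numbers, since (as the commit message notes) sphinx-gallery 0.3.1 ignores the matplotlib figure number when naming captured images. A minimal standalone sketch of the pattern, not part of the commit itself (the `Agg` backend is assumed here so it runs headless, as on CI):

```python
import matplotlib
matplotlib.use("Agg")  # headless backend, assumed for CI-like runs
import matplotlib.pyplot as plt

# Before: plt.figure(0), plt.figure(1), ... tied image names to figure
# numbers, so sphinx-gallery 0.3.1 no longer produces a _000.png.
# After: let matplotlib auto-assign numbers (1, 2, 3, ...) and let
# sphinx-gallery name images by capture order (_001.png, _002.png, ...).
for kernel in ("linear", "rbf", "poly"):
    plt.figure()  # auto-assigned figure number
    plt.title("SVC with %s kernel" % kernel)

print(plt.get_fignums())
```

With no `num` argument, `plt.figure()` picks the next unused integer, which is all the examples need once the gallery stops keying images off those numbers.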

File tree

9 files changed: +28 / -31 lines changed


.circleci/config.yml

Lines changed: 4 additions & 7 deletions
@@ -3,7 +3,7 @@ version: 2
 jobs:
   python3:
     docker:
-      - image: circleci/python:3.6.1
+      - image: circleci/python:3.6
     environment:
       - MINICONDA_PATH: ~/miniconda
       - CONDA_ENV_NAME: testenv
@@ -33,10 +33,7 @@ jobs:
 
   python2:
     docker:
-      # We use the python 3 docker image for simplicity. Python is installed
-      # through conda and the python version actually used is set via the
-      # PYTHON_VERSION environment variable.
-      - image: circleci/python:3.6.1
+      - image: circleci/python:3.6
     environment:
       # Test examples run with minimal dependencies
       - MINICONDA_PATH: ~/miniconda
@@ -66,7 +63,7 @@ jobs:
 
   lint:
     docker:
-      - image: circleci/python:3.6.1
+      - image: circleci/python:3.6
     steps:
      - checkout
      - run: ./build_tools/circle/checkout_merge_commit.sh
@@ -95,7 +92,7 @@ jobs:
 
   deploy:
     docker:
-      - image: circleci/python:3.6.1
+      - image: circleci/python:3.6
     steps:
      - checkout
      - run: ./build_tools/circle/checkout_merge_commit.sh

build_tools/circle/build_doc.sh

Lines changed: 1 addition & 1 deletion
@@ -101,7 +101,7 @@ sudo -E apt-get -yq remove texlive-binaries --purge
 sudo -E apt-get -yq --no-install-suggests --no-install-recommends --force-yes \
     install dvipng texlive-latex-base texlive-latex-extra \
     texlive-latex-recommended texlive-latex-extra texlive-fonts-recommended\
-    latexmk
+    latexmk gsfonts
 
 # deactivate circleci virtualenv and setup a miniconda env instead
 if [[ `type -t deactivate` ]]; then

doc/modules/calibration.rst

Lines changed: 2 additions & 2 deletions
@@ -171,7 +171,7 @@ probability vectors predicted by the same classifier after sigmoid calibration
 on a hold-out validation set. Colors indicate the true class of an instance
 (red: class 1, green: class 2, blue: class 3).
 
-.. figure:: ../auto_examples/calibration/images/sphx_glr_plot_calibration_multiclass_000.png
+.. figure:: ../auto_examples/calibration/images/sphx_glr_plot_calibration_multiclass_001.png
    :target: ../auto_examples/calibration/plot_calibration_multiclass.html
    :align: center
 
@@ -183,7 +183,7 @@ method='sigmoid' on the remaining 200 datapoints reduces the confidence of the
 predictions, i.e., moves the probability vectors from the edges of the simplex
 towards the center:
 
-.. figure:: ../auto_examples/calibration/images/sphx_glr_plot_calibration_multiclass_001.png
+.. figure:: ../auto_examples/calibration/images/sphx_glr_plot_calibration_multiclass_002.png
    :target: ../auto_examples/calibration/plot_calibration_multiclass.html
    :align: center

doc/modules/gaussian_process.rst

Lines changed: 10 additions & 10 deletions
@@ -88,14 +88,14 @@ estimate the noise level of data. An illustration of the
 log-marginal-likelihood (LML) landscape shows that there exist two local
 maxima of LML.
 
-.. figure:: ../auto_examples/gaussian_process/images/sphx_glr_plot_gpr_noisy_000.png
+.. figure:: ../auto_examples/gaussian_process/images/sphx_glr_plot_gpr_noisy_001.png
    :target: ../auto_examples/gaussian_process/plot_gpr_noisy.html
    :align: center
 
 The first corresponds to a model with a high noise level and a
 large length scale, which explains all variations in the data by noise.
 
-.. figure:: ../auto_examples/gaussian_process/images/sphx_glr_plot_gpr_noisy_001.png
+.. figure:: ../auto_examples/gaussian_process/images/sphx_glr_plot_gpr_noisy_002.png
    :target: ../auto_examples/gaussian_process/plot_gpr_noisy.html
    :align: center
 
@@ -106,7 +106,7 @@ hyperparameters, the gradient-based optimization might also converge to the
 high-noise solution. It is thus important to repeat the optimization several
 times for different initializations.
 
-.. figure:: ../auto_examples/gaussian_process/images/sphx_glr_plot_gpr_noisy_002.png
+.. figure:: ../auto_examples/gaussian_process/images/sphx_glr_plot_gpr_noisy_003.png
    :target: ../auto_examples/gaussian_process/plot_gpr_noisy.html
    :align: center
 
@@ -306,11 +306,11 @@ The second figure shows the log-marginal-likelihood for different choices of
 the kernel's hyperparameters, highlighting the two choices of the
 hyperparameters used in the first figure by black dots.
 
-.. figure:: ../auto_examples/gaussian_process/images/sphx_glr_plot_gpc_000.png
+.. figure:: ../auto_examples/gaussian_process/images/sphx_glr_plot_gpc_001.png
    :target: ../auto_examples/gaussian_process/plot_gpc.html
    :align: center
 
-.. figure:: ../auto_examples/gaussian_process/images/sphx_glr_plot_gpc_001.png
+.. figure:: ../auto_examples/gaussian_process/images/sphx_glr_plot_gpc_002.png
    :target: ../auto_examples/gaussian_process/plot_gpc.html
    :align: center
 
@@ -493,7 +493,7 @@ kernel as covariance function have mean square derivatives of all orders, and are
 very smooth. The prior and posterior of a GP resulting from an RBF kernel are shown in
 the following figure:
 
-.. figure:: ../auto_examples/gaussian_process/images/sphx_glr_plot_gpr_prior_posterior_000.png
+.. figure:: ../auto_examples/gaussian_process/images/sphx_glr_plot_gpr_prior_posterior_001.png
    :target: ../auto_examples/gaussian_process/plot_gpr_prior_posterior.html
    :align: center
 
@@ -534,7 +534,7 @@ allows adapting to the properties of the true underlying functional relation.
 The prior and posterior of a GP resulting from a Matérn kernel are shown in
 the following figure:
 
-.. figure:: ../auto_examples/gaussian_process/images/sphx_glr_plot_gpr_prior_posterior_004.png
+.. figure:: ../auto_examples/gaussian_process/images/sphx_glr_plot_gpr_prior_posterior_005.png
    :target: ../auto_examples/gaussian_process/plot_gpr_prior_posterior.html
    :align: center
 
@@ -556,7 +556,7 @@ The kernel is given by:
 The prior and posterior of a GP resulting from a :class:`RationalQuadratic` kernel are shown in
 the following figure:
 
-.. figure:: ../auto_examples/gaussian_process/images/sphx_glr_plot_gpr_prior_posterior_001.png
+.. figure:: ../auto_examples/gaussian_process/images/sphx_glr_plot_gpr_prior_posterior_002.png
    :target: ../auto_examples/gaussian_process/plot_gpr_prior_posterior.html
    :align: center
 
@@ -574,7 +574,7 @@ The kernel is given by:
 The prior and posterior of a GP resulting from an ExpSineSquared kernel are shown in
 the following figure:
 
-.. figure:: ../auto_examples/gaussian_process/images/sphx_glr_plot_gpr_prior_posterior_002.png
+.. figure:: ../auto_examples/gaussian_process/images/sphx_glr_plot_gpr_prior_posterior_003.png
    :target: ../auto_examples/gaussian_process/plot_gpr_prior_posterior.html
    :align: center
 
@@ -594,7 +594,7 @@ is called the homogeneous linear kernel, otherwise it is inhomogeneous. The kernel
 The :class:`DotProduct` kernel is commonly combined with exponentiation. An example with exponent 2 is
 shown in the following figure:
 
-.. figure:: ../auto_examples/gaussian_process/images/sphx_glr_plot_gpr_prior_posterior_003.png
+.. figure:: ../auto_examples/gaussian_process/images/sphx_glr_plot_gpr_prior_posterior_004.png
    :target: ../auto_examples/gaussian_process/plot_gpr_prior_posterior.html
    :align: center

examples/calibration/plot_calibration_multiclass.py

Lines changed: 2 additions & 2 deletions
@@ -64,7 +64,7 @@ class of an instance (red: class 1, green: class 2, blue: class 3).
 sig_score = log_loss(y_test, sig_clf_probs)
 
 # Plot changes in predicted probabilities via arrows
-plt.figure(0)
+plt.figure()
 colors = ["r", "g", "b"]
 for i in range(clf_probs.shape[0]):
     plt.arrow(clf_probs[i, 0], clf_probs[i, 1],
@@ -131,7 +131,7 @@ class of an instance (red: class 1, green: class 2, blue: class 3).
           "200 datapoint: %.3f" % sig_score)
 
 # Illustrate calibrator
-plt.figure(1)
+plt.figure()
 # generate grid over 2-simplex
 p1d = np.linspace(0, 1, 20)
 p0, p1 = np.meshgrid(p1d, p1d)

examples/exercises/plot_iris_exercise.py

Lines changed: 2 additions & 2 deletions
@@ -35,11 +35,11 @@
 y_test = y[int(.9 * n_sample):]
 
 # fit the model
-for fig_num, kernel in enumerate(('linear', 'rbf', 'poly')):
+for kernel in ('linear', 'rbf', 'poly'):
     clf = svm.SVC(kernel=kernel, gamma=10)
     clf.fit(X_train, y_train)
 
-    plt.figure(fig_num)
+    plt.figure()
     plt.clf()
     plt.scatter(X[:, 0], X[:, 1], c=y, zorder=10, cmap=plt.cm.Paired,
                 edgecolor='k', s=20)

examples/gaussian_process/plot_gpc.py

Lines changed: 2 additions & 2 deletions
@@ -63,7 +63,7 @@
 
 
 # Plot posteriors
-plt.figure(0)
+plt.figure()
 plt.scatter(X[:train_size, 0], y[:train_size], c='k', label="Train data",
             edgecolors=(0, 0, 0))
 plt.scatter(X[train_size:, 0], y[train_size:], c='g', label="Test data",
@@ -80,7 +80,7 @@
 plt.legend(loc="best")
 
 # Plot LML landscape
-plt.figure(1)
+plt.figure()
 theta0 = np.logspace(0, 8, 30)
 theta1 = np.logspace(-1, 1, 29)
 Theta0, Theta1 = np.meshgrid(theta0, theta1)

examples/gaussian_process/plot_gpr_noisy.py

Lines changed: 3 additions & 3 deletions
@@ -35,7 +35,7 @@
 y = 0.5 * np.sin(3 * X[:, 0]) + rng.normal(0, 0.5, X.shape[0])
 
 # First run
-plt.figure(0)
+plt.figure()
 kernel = 1.0 * RBF(length_scale=100.0, length_scale_bounds=(1e-2, 1e3)) \
     + WhiteKernel(noise_level=1, noise_level_bounds=(1e-10, 1e+1))
 gp = GaussianProcessRegressor(kernel=kernel,
@@ -54,7 +54,7 @@
 plt.tight_layout()
 
 # Second run
-plt.figure(1)
+plt.figure()
 kernel = 1.0 * RBF(length_scale=1.0, length_scale_bounds=(1e-2, 1e3)) \
     + WhiteKernel(noise_level=1e-5, noise_level_bounds=(1e-10, 1e+1))
 gp = GaussianProcessRegressor(kernel=kernel,
@@ -73,7 +73,7 @@
 plt.tight_layout()
 
 # Plot LML landscape
-plt.figure(2)
+plt.figure()
 theta0 = np.logspace(-2, 3, 49)
 theta1 = np.logspace(-2, 0, 50)
 Theta0, Theta1 = np.meshgrid(theta0, theta1)

examples/gaussian_process/plot_gpr_prior_posterior.py

Lines changed: 2 additions & 2 deletions
@@ -33,12 +33,12 @@
              1.0 * Matern(length_scale=1.0, length_scale_bounds=(1e-1, 10.0),
                           nu=1.5)]
 
-for fig_index, kernel in enumerate(kernels):
+for kernel in kernels:
     # Specify Gaussian Process
     gp = GaussianProcessRegressor(kernel=kernel)
 
     # Plot prior
-    plt.figure(fig_index, figsize=(8, 8))
+    plt.figure(figsize=(8, 8))
     plt.subplot(2, 1, 1)
     X_ = np.linspace(0, 5, 100)
     y_mean, y_std = gp.predict(X_[:, np.newaxis], return_std=True)
