
[MRG] SVR plot updates size and color #12207

Closed
wants to merge 11 commits

Conversation

@ml4713 ml4713 commented Sep 29, 2018

Reference Issues/PRs

relevant PRs: #8367
closes: #8365

What does this implement/fix? Explain your changes.

  • Marker size has been changed to 50.
  • Colors now match between lines and markers.

Any other comments?

@ml4713 ml4713 changed the title "Svr plot chnage" to "SVR plot change" Sep 29, 2018
@ml4713 ml4713 changed the title "SVR plot change" to "SVR plot updates size and color" Sep 29, 2018
plt.scatter(X[svr_poly.support_], y[svr_poly.support_], facecolor="none",
            edgecolor="g", marker='s', label='poly support vectors', s=50)

plt.hold('on')
Member

remove the "hold" line. It doesn't exist any more and causes a test failure.
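
(For context: plt.hold was deprecated in matplotlib 2.0 and later removed; successive plot/scatter calls now always draw onto the current axes, so the line can simply be deleted. A minimal sketch, reusing names assumed from the example script (X, y_poly, svr_poly, lw):)

import matplotlib.pyplot as plt

# same calls as in the diff, with no plt.hold('on') in between;
# matplotlib keeps adding artists to the current axes by default
plt.plot(X, y_poly, color='g', lw=lw, label='Polynomial model')
plt.scatter(X[svr_poly.support_], y[svr_poly.support_], facecolor="none",
            edgecolor="g", marker='s', label='poly support vectors', s=50)
plt.show()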

@ml4713 ml4713 changed the title "SVR plot updates size and color" to "[MRG] SVR plot updates size and color" Sep 29, 2018
@@ -25,7 +25,7 @@
 # Fit regression model
 svr_rbf = SVR(kernel='rbf', C=100, gamma=0.1, epsilon=.1)
 svr_lin = SVR(kernel='linear', C=100)
-svr_poly = SVR(kernel='poly', C=100, degree=3, epsilon=.1, coef0=1)
+svr_poly = SVR(kernel='poly', C=100, degree=3, epsilon=.1, coef0=0)
Member

why did you change that back?

Author

Yeah, that was a typo. I was trying to see what errors it would produce if coef0=0.
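
(For background, not part of the diff: scikit-learn's polynomial kernel is (gamma * <x, x'> + coef0) ** degree, so coef0=0 drops the lower-order terms that coef0=1 keeps, which is presumably why the example had moved to coef0=1. A small sketch on made-up toy data, with hypothetical variable names, just to compare the two settings:)

import numpy as np
from sklearn.svm import SVR

# hypothetical toy data, shaped like the data in plot_svm_regression.py
rng = np.random.RandomState(0)
X = np.sort(5 * rng.rand(40, 1), axis=0)
y = np.sin(X).ravel()

# identical models except for coef0
svr_coef0_0 = SVR(kernel='poly', C=100, degree=3, epsilon=.1, coef0=0)
svr_coef0_1 = SVR(kernel='poly', C=100, degree=3, epsilon=.1, coef0=1)

# training-set R^2 for each; with coef0=0 the cubic kernel has no
# lower-order terms, which tends to give the poorer of the two fits here
print(svr_coef0_0.fit(X, y).score(X, y))
print(svr_coef0_1.fit(X, y).score(X, y))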

Member

@jnothman jnothman left a comment

I don't find these colours very distinctive.

I also think this is quite crowded. I wonder if we can use multiple subplots effectively.

@vasudhathinks

I think it's a good point about the colors not being distinctive, especially since some folks have mentioned that the visuals should be colorblind friendly; we could pick colors from here: http://colorbrewer2.org/

About sub-plots: not sure, but open to trialing.

What do you think, @ml4713?
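
(One possible direction, sketched only; X, y, y_rbf, y_lin, y_poly, the fitted svr_* objects and lw are assumed from the example script: take a qualitative, colorblind-friendlier palette such as matplotlib's ColorBrewer-derived 'Dark2' and reuse one colour per model for both its line and its support-vector markers.)

import matplotlib.pyplot as plt

# 'Dark2' is one of matplotlib's ColorBrewer-derived qualitative palettes
colors = plt.get_cmap('Dark2').colors

for color, y_pred, model, name in zip(
        colors, (y_rbf, y_lin, y_poly), (svr_rbf, svr_lin, svr_poly),
        ('RBF', 'Linear', 'Polynomial')):
    # same colour for the fitted curve and its hollow support-vector markers
    plt.plot(X, y_pred, color=color, lw=lw, label='%s model' % name)
    plt.scatter(X[model.support_], y[model.support_], facecolor='none',
                edgecolor=color, s=50, label='%s support vectors' % name)

plt.xlabel('data')
plt.ylabel('target')
plt.legend()
plt.show()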

@ml4713
Author

ml4713 commented Sep 30, 2018

I don't think changing colors helps here, since we only have very simple and limited data to visualize and it makes sense for RBF, poly and linear to pick largely overlapping support vectors. As long as those vectors are on top of each other, it will look a little messy.

Something like this maybe?

fig, (ax1, ax2, ax3) = plt.subplots(1, 3, sharey=True)
ax1.plot(X, y_rbf, color='m', lw=lw, label='RBF model')
ax1.scatter(X[svr_rbf.support_], y[svr_rbf.support_], facecolor="none",
            edgecolor="m", marker='8', label='rbf support vectors', s=50)
ax1.legend(loc='upper center', bbox_to_anchor=(0.5, 1.05),
          ncol=3, fancybox=True, shadow=True)

ax2.plot(X, y_lin, color='c', lw=lw, label='Linear model')
ax2.scatter(X[svr_lin.support_], y[svr_lin.support_], facecolor="none",
            edgecolor="c", marker='^', label='linear support vectors', s=50)
ax2.legend(loc='upper center', bbox_to_anchor=(0.5, 1.05),
          ncol=3, fancybox=True, shadow=True)

ax3.plot(X, y_poly, color='g', lw=lw, label='Polynomial model')
ax3.scatter(X[svr_poly.support_], y[svr_poly.support_], facecolor="none",
            edgecolor="g", marker='s', label='poly support vectors', s=50)
ax3.legend(loc='upper center', bbox_to_anchor=(0.5, 1.05),
          ncol=3, fancybox=True, shadow=True)

fig.text(0.5, 0.04, 'data', ha='center', va='center')
fig.text(0.06, 0.5, 'target', ha='center', va='center', rotation='vertical')
fig.suptitle("Support Vector Regression", fontsize=14)
plt.show()

[screenshot: three-subplot "Support Vector Regression" figure produced by the code above]
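
(Note: the snippet above assumes the data and fitted predictions from the existing example, i.e. X, y, y_rbf, y_lin, y_poly and the three fitted SVR objects, plus a line width lw that the snippet itself never defines, for instance:)

lw = 2  # line width; any value works, this is just a plausible choice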

@amueller
Member

amueller commented Oct 1, 2018

Hm, I can't see the rendered docs?

@ml4713
Author

ml4713 commented Oct 1, 2018

Strange, I am able to see the plot.
@vasudhathinks are you able to see it?

[screenshot of the rendered plot]

@vasudhathinks

Yes, I can see the plots, similar to Rita's screenshot. I had to define lw = 1.

@amueller
Member

amueller commented Oct 2, 2018

Sorry, I meant I can't find the artifact generated by CircleCI; I haven't tried running locally.

@TomDLT
Member

TomDLT commented Nov 15, 2018

Separating the plots is great. +1
I suggest changing the figure size (something like plt.subplots(1, 3, figsize=(10, 4))) to get something wider.
We also need to plot all the training points in each subplot (maybe with a black dot?).
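
(A rough sketch combining both suggestions, reusing the variable names from the subplot snippet above; plotting every training point as a small black dot is just one reading of the second suggestion:)

import matplotlib.pyplot as plt

fig, axes = plt.subplots(1, 3, figsize=(10, 4), sharey=True)
for ax, y_pred, model, color, name in zip(
        axes, (y_rbf, y_lin, y_poly), (svr_rbf, svr_lin, svr_poly),
        ('m', 'c', 'g'), ('RBF', 'Linear', 'Polynomial')):
    # every training point as a small black dot, support vectors highlighted
    ax.scatter(X, y, color='k', s=10, label='training data')
    ax.plot(X, y_pred, color=color, lw=lw, label='%s model' % name)
    ax.scatter(X[model.support_], y[model.support_], facecolor='none',
               edgecolor=color, s=50, label='%s support vectors' % name)
    ax.legend(loc='upper center', fancybox=True, shadow=True)

fig.text(0.5, 0.04, 'data', ha='center', va='center')
fig.text(0.06, 0.5, 'target', ha='center', va='center', rotation='vertical')
fig.suptitle("Support Vector Regression", fontsize=14)
plt.show()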

Member

@jnothman jnothman left a comment

This is a great improvement!


ax2.plot(X, y_lin, color='c', lw=lw, label='Linear model')
ax2.scatter(X[svr_lin.support_], y[svr_lin.support_], facecolor="none",
            edgecolor="c", marker='^', label='linear support vectors', s=50)
Member

I'm not convinced we need to change both colour and marker shape for the support vectors now.

Author

Yeah, it's kind of redundant now since we separated the graphs.

@reshamas
Member

@ml4713 Checking in to see how this PR is going.

@ml4713
Author

ml4713 commented Dec 18, 2018

@reshamas I just realized that the Travis CI build failed and I'm not sure why. Other than this, I think I am done.

@reshamas
Member

@ml4713
I looked through the Travis test file and the error is:
Failed: The splits for data, are same even when random state is not set
and
/home/travis/build/scikit-learn/scikit-learn/sklearn/model_selection/tests/test_split.py:487: Failed

But I don't know what they mean.
@amueller @jnothman Are you able to explain the error and how to fix it?

@jnothman
Member

I don't know that failure, but we no longer test Python 2.7 in master, so merging in updated master should fix it.

Successfully merging this pull request may close these issues.

SVR example misleading
7 participants