[MRG] SVR plot updates size and color #12207
Conversation
Change coef0 value for better plot of poly kernel. Also improve the support vector marking in the SVR plot. Fixes: scikit-learn#8365
Format code to remove flake8 error from Travis build. Issue: scikit-learn#8365
SVR plot to update colors
examples/svm/plot_svm_regression.py
Outdated
plt.scatter(X[svr_poly.support_], y[svr_poly.support_], facecolor="none",
            edgecolor="g", marker='s', label='poly support vectors', s=50)

plt.hold('on')
remove the "hold" line. It doesn't exist any more and causes a test failure.
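For background: plt.hold was deprecated in Matplotlib 2.0 and removed in 3.0, and successive plotting calls draw on the current axes by default, so the line can simply be deleted. A minimal sketch illustrating this:

import numpy as np
import matplotlib.pyplot as plt

x = np.linspace(0, 5, 40)
plt.plot(x, np.sin(x), label='sin(x)')
# Drawn on the same axes as the line above; no plt.hold call is needed.
plt.scatter(x, np.cos(x), label='cos(x)')
plt.legend()
plt.show()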
examples/svm/plot_svm_regression.py
Outdated
@@ -25,7 +25,7 @@
 # Fit regression model
 svr_rbf = SVR(kernel='rbf', C=100, gamma=0.1, epsilon=.1)
 svr_lin = SVR(kernel='linear', C=100)
-svr_poly = SVR(kernel='poly', C=100, degree=3, epsilon=.1, coef0=1)
+svr_poly = SVR(kernel='poly', C=100, degree=3, epsilon=.1, coef0=0)
why did you change that back?
Yeah, that was a typo. I was trying to see what errors it would produce with coef0=0.
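For context on the value being discussed: SVR's polynomial kernel is (gamma * <x, x'> + coef0) ** degree, and coef0 defaults to 0 in SVR, so coef0=1 shifts the kernel away from the origin. A small sketch (not part of the diff) comparing the two settings with sklearn.metrics.pairwise.polynomial_kernel:

import numpy as np
from sklearn.metrics.pairwise import polynomial_kernel

X = np.array([[1.0], [2.0], [3.0]])
# Polynomial kernel: (gamma * <x, x'> + coef0) ** degree
K_default = polynomial_kernel(X, degree=3, gamma=0.1, coef0=0)  # SVR's default coef0
K_shifted = polynomial_kernel(X, degree=3, gamma=0.1, coef0=1)  # value used in the example
print(K_default)
print(K_shifted)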
I don't find these colours very distinctive.
I also think this is quite crowded. I wonder if we can use multiple subplots effectively.
I think it's a good point about the colors not being distinctive, especially since some folks have mentioned that the visuals should be colorblind friendly -- we could pick colors from http://colorbrewer2.org/. About sub-plots, I'm not sure, but I'm open to trying it. What do you think @ml4713?
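As an aside, Matplotlib already bundles the ColorBrewer qualitative palettes, so colorblind-friendlier colours could be pulled directly from one of them; the palette used below ('Dark2') is only an illustration, not what the PR settled on:

import matplotlib.pyplot as plt

# 'Dark2' is one of the ColorBrewer qualitative palettes shipped with Matplotlib.
cmap = plt.get_cmap('Dark2')
colors = [cmap(i) for i in range(3)]  # three distinct colours for the three models
print(colors)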
I don't think changing colors helps here, since we only have very simple and limited data to visualize, and it makes sense for RBF, poly and linear to pick largely overlapping support vectors. As long as those vectors sit on top of each other, it will look a little messy. Something like this maybe?

fig, (ax1, ax2, ax3) = plt.subplots(1, 3, sharey=True)
ax1.plot(X, y_rbf, color='m', lw=lw, label='RBF model')
ax1.scatter(X[svr_rbf.support_], y[svr_rbf.support_], facecolor="none",
edgecolor="m", marker='8', label='rbf support vectors', s=50)
ax1.legend(loc='upper center', bbox_to_anchor=(0.5, 1.05),
ncol=3, fancybox=True, shadow=True)
ax2.plot(X, y_lin, color='c', lw=lw, label='Linear model')
ax2.scatter(X[svr_lin.support_], y[svr_lin.support_], facecolor="none",
edgecolor="c", marker='^', label='linear support vectors', s=50)
ax2.legend(loc='upper center', bbox_to_anchor=(0.5, 1.05),
ncol=3, fancybox=True, shadow=True)
ax3.plot(X, y_poly, color='g', lw=lw, label='Polynomial model')
ax3.scatter(X[svr_poly.support_], y[svr_poly.support_], facecolor="none",
edgecolor="g", marker='s', label='poly support vectors', s=50)
ax3.legend(loc='upper center', bbox_to_anchor=(0.5, 1.05),
ncol=3, fancybox=True, shadow=True)
fig.text(0.5, 0.04, 'data', ha='center', va='center')
fig.text(0.06, 0.5, 'target', ha='center', va='center', rotation='vertical')
fig.suptitle("Support Vector Regression", fontsize=14)
plt.show()
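For anyone running the snippet above on its own: it relies on objects defined earlier in examples/svm/plot_svm_regression.py (X, y, the fitted SVRs, their predictions, and lw, which the snippet itself never defines). A rough, self-contained preamble along the lines of the existing example:

import numpy as np
import matplotlib.pyplot as plt
from sklearn.svm import SVR

# Toy data in the spirit of the existing example
X = np.sort(5 * np.random.rand(40, 1), axis=0)
y = np.sin(X).ravel()
y[::5] += 3 * (0.5 - np.random.rand(8))  # add noise to every fifth target

svr_rbf = SVR(kernel='rbf', C=100, gamma=0.1, epsilon=.1)
svr_lin = SVR(kernel='linear', C=100)
svr_poly = SVR(kernel='poly', C=100, degree=3, epsilon=.1, coef0=1)

y_rbf = svr_rbf.fit(X, y).predict(X)
y_lin = svr_lin.fit(X, y).predict(X)
y_poly = svr_poly.fit(X, y).predict(X)

lw = 2  # line width used by the plotting calls above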
Hm, I can't see the rendered docs?
Strange, I am able to see the plot.
Yes, I can see the plots, similar to Rita's screenshot. I had to define lw = 1.
Sorry, I meant I can't find the artifact generated by CircleCI; I haven't tried running locally.
Separating the plots is great. +1
This is a great improvement!
examples/svm/plot_svm_regression.py
Outdated
ax2.plot(X, y_lin, color='c', lw=lw, label='Linear model')
ax2.scatter(X[svr_lin.support_], y[svr_lin.support_], facecolor="none",
            edgecolor="c", marker='^', label='linear support vectors', s=50)
I'm not convinced we need to change both colour and marker shape for the support vectors now.
Yeah, it's kind of redundant now since we separated the graphs.
@ml4713 Checking in to see how this PR is going.
@reshamas I just realized that the Travis CI build failed and I'm not sure why. Other than this, I think I am done.
@ml4713 But I don't know what they mean.
I don't know that failure, but we no longer test Python 2.7 in master, so merging in updated master should fix it.
Reference Issues/PRs
relevant PRs: #8367
closes: #8365
What does this implement/fix? Explain your changes.
Any other comments?