Fixed Convergence Warnings On MLP Training Curves #14144
Conversation
@martinoywa tests failing
An example shows users how to use our library, and a ConvergenceWarning is there to protect our users. I think it's appropriate that if you ignore a warning in an example, you leave a comment explaining where those warnings come up and why you might want to ignore them in this context, i.e., say that the user might rather increase the number of iterations. (I also don't know the internals of how these examples are executed by sphinx-gallery, so I can't tell whether there is a chance this warning filter will apply globally to other examples.)
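For background on the scoping concern raised above: a bare warnings.filterwarnings call mutates the interpreter-wide filter list, while warnings.catch_warnings() restores it on exit. A minimal sketch of the difference (standard-library behavior, not code from this PR):

import warnings

from sklearn.exceptions import ConvergenceWarning

# Global filter: mutates the process-wide filter list and stays active
# for the rest of the run, so it could leak into other examples executed
# in the same interpreter (e.g., by sphinx-gallery).
warnings.filterwarnings("ignore", category=ConvergenceWarning)

# Scoped filter: catch_warnings() saves the filter list on entry and
# restores it on exit, so the "ignore" only applies inside the block.
with warnings.catch_warnings():
    warnings.simplefilter("ignore", category=ConvergenceWarning)
    pass  # code that may emit ConvergenceWarning goes here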
import warnings

from sklearn.neural_network import MLPClassifier
from sklearn.preprocessing import MinMaxScaler
from sklearn import datasets
from sklearn.exceptions import ConvergenceWarning

warnings.filterwarnings("ignore", category=ConvergenceWarning,
                        module="sklearn")
Maybe add a comment like:
Some of the parameter combinations will not converge as can be seen on the plots so we ignore them here
Let me get to it
probably a …
for ax, data, name in zip(axes.ravel(), data_sets, ['iris', 'digits',
                                                    'circles', 'moons']):
    # some parameter combinations will not converge as can be seen on the
    # plots so they are ignored here
    with warnings.catch_warnings():
Only mlp.fit needs to be inside the context manager, I think.
Just seeing that now. Thanks.
I've made some changes; is that what you were aiming for?
print("training: %s" % label) | ||
mlp = MLPClassifier(verbose=0, random_state=0, | ||
max_iter=max_iter, **param) | ||
mlp.fit(X, y) |
I think the context manager should just be for this line
I was putting it before the preprocessing step instead of the model fitting. Sorry about that.
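For reference, the pattern the thread converges on looks roughly like the following self-contained sketch, with the filter scoped to just the fit call. The toy data and the max_iter value here are illustrative assumptions, not taken from the example:

import warnings

from sklearn.exceptions import ConvergenceWarning
from sklearn.neural_network import MLPClassifier

X, y = [[0., 0.], [1., 1.]], [0, 1]  # toy data for illustration
mlp = MLPClassifier(random_state=0, max_iter=5)  # too few iterations to converge

# some parameter combinations will not converge as can be seen on the
# plots so they are ignored here
with warnings.catch_warnings():
    warnings.filterwarnings("ignore", category=ConvergenceWarning,
                            module="sklearn")
    mlp.fit(X, y)  # only the fit is silenced; code after the block warns normally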
@@ -64,12 +70,19 @@ def plot_on_dataset(X, y, ax, name):
        print("training: %s" % label)
        mlp = MLPClassifier(verbose=0, random_state=0,
                            max_iter=max_iter, **param)
        mlp.fit(X, y)

This line has trailing whitespace, but that's not really important. LGTM anyway.
Thanks @martinoywa
Thanks @martinoywa :)
Fixed Convergence Warnings On MLP Training Curves (#14144)

* fix convergence warnings
* fix convergence warnings
* PEP8
* PEP8
* Fix Convergence Warning by changing the Optimization Algorithm
* PEP8
* Fixed Future Warnings by explicitly defining n_estimators.
* PEP8
* deleted all
* Fixed Convergence Warnings
* removed changes on unrelated examples
* add comment and with statement
* PEP8
* context manager fix
* fixed indentation
* PEP8
* flake8
Referencing PR / Issue
This addresses #14117 in examples/neural_networks/plot_mlp_training_curves.py.
Note
There were a lot of ConvergenceWarnings raised by the stochastic optimizer. They can be avoided by increasing the maximum number of iterations, but at the cost of longer training time, so the warnings are filtered out (ignored) instead.
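To illustrate that trade-off, here is a hedged sketch on the iris data; the dataset choice and iteration counts are my own assumptions, not from the PR:

import time
import warnings

from sklearn.datasets import load_iris
from sklearn.exceptions import ConvergenceWarning
from sklearn.neural_network import MLPClassifier

X, y = load_iris(return_X_y=True)

# Few iterations: fast, but the stochastic optimizer stops before
# converging and emits a ConvergenceWarning.
with warnings.catch_warnings(record=True) as caught:
    warnings.simplefilter("always")
    MLPClassifier(max_iter=20, random_state=0).fit(X, y)
print(any(issubclass(w.category, ConvergenceWarning) for w in caught))

# Many more iterations usually avoid the warning, at the cost of a
# longer fit (it may still warn for some solver/data combinations).
start = time.time()
MLPClassifier(max_iter=2000, random_state=0).fit(X, y)
print("fit took %.2fs" % (time.time() - start))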
#WiMLDS_Nairobi
cc: @lagvier @mcecily @toopurity @felexkemboi