[MRG] Remove random_seed in fit and use the one in init #224
Conversation
@@ -866,21 +866,6 @@ def test_iris(self):
     csep = class_separation(rca.transform(self.iris_points), self.iris_labels)
     self.assertLess(csep, 0.25)

-  def test_feature_null_variance(self):
While I was at it, I removed this test because it tested pca_comps, which is now deprecated. It failed (when it passed before) probably because of the random_seed, which may change because of this PR.
test/metric_learn_test.py
@@ -861,7 +861,8 @@ def test_deprecation_num_dims_lfda(num_dims):

 class TestRCA(MetricTestCase):
   def test_iris(self):
-    rca = RCA_Supervised(n_components=2, num_chunks=30, chunk_size=2)
+    rca = RCA_Supervised(n_components=2, num_chunks=30, chunk_size=2,
I set the random seed here to pass the test, because the random_seed may be different than before.
LGTM. There are just some conflicts with master to address before we can merge.
# Conflicts:
#   metric_learn/itml.py
#   metric_learn/lsml.py
#   metric_learn/mmc.py
#   metric_learn/rca.py
#   metric_learn/sdml.py
There was a small problem with the merge, but it should be fixed right now; if it's green I think we can merge.
metric_learn/constraints.py
@@ -46,7 +49,8 @@ def positive_negative_pairs(self, num_constraints, same_length=False,
     return a, b, c, d

   def _pairs(self, num_constraints, same_label=True, max_iter=10,
-             random_state=np.random):
+             random_state=None):
+    random_state = check_random_state(random_state)
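For reference, the check_random_state call added in this hunk follows scikit-learn's convention of normalizing None, an integer, or a RandomState instance into a single RandomState. A simplified reimplementation, for illustration only (the real function lives in sklearn.utils), behaves roughly like this:

```python
import numbers
import numpy as np

def check_random_state_sketch(seed):
    """Simplified sketch of sklearn.utils.check_random_state:
    turn seed into a np.random.RandomState instance."""
    if seed is None or seed is np.random:
        return np.random.mtrand._rand          # the global singleton
    if isinstance(seed, numbers.Integral):
        return np.random.RandomState(seed)     # fresh, reproducibly seeded
    if isinstance(seed, np.random.RandomState):
        return seed                            # passed through unchanged
    raise ValueError(f"{seed!r} cannot be used to seed a RandomState")

# The same integer seed yields the same draws, which is why fixing the
# seed makes a randomized test deterministic:
a = check_random_state_sketch(42).randint(0, 100, size=5)
b = check_random_state_sketch(42).randint(0, 100, size=5)
assert np.array_equal(a, b)
```

Because None (and the legacy np.random default) map to the global singleton while an integer maps to a fresh seeded generator, callers that previously passed np.random keep working unchanged.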
It's probably safe to remove this check, as all callers of _pairs should have done it already.
I agree, done
Fixes #171
This PR replaces the random_seed in fit by the random_seed in init. Note that I also updated the functions positive_negative_pairs, _pairs, and adjacency_matrix to be more like scikit-learn (i.e. have the random_state parameter be None by default), but I don't think it's necessary to add a ChangedBehaviour warning or a DeprecationWarning, since those are util functions that were not previously documented (and probably less used than the *_Supervised versions of algorithms). What's more, the previous behaviour will still work (giving np.random or any RandomState). What do you think?
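For context, the scikit-learn convention this PR moves towards is: store random_state untouched in __init__, and only validate and use it inside fit. A minimal hypothetical sketch (ToyEstimator and the inlined check_random_state stand-in below are illustrative, not metric-learn code):

```python
import numbers
import numpy as np

def check_random_state(seed):
    """Minimal stand-in for sklearn.utils.check_random_state."""
    if seed is None or seed is np.random:
        return np.random.mtrand._rand
    if isinstance(seed, numbers.Integral):
        return np.random.RandomState(seed)
    if isinstance(seed, np.random.RandomState):
        return seed
    raise ValueError(f"invalid random_state: {seed!r}")

class ToyEstimator:
    """Hypothetical estimator: random_state is stored as-is in __init__
    and only turned into a RandomState inside fit (scikit-learn style)."""
    def __init__(self, random_state=None):
        self.random_state = random_state   # no validation in __init__

    def fit(self, X):
        rng = check_random_state(self.random_state)
        # e.g. draw a random permutation of the samples
        self.permutation_ = rng.permutation(len(X))
        return self

X = np.zeros((10, 2))
p1 = ToyEstimator(random_state=0).fit(X).permutation_
p2 = ToyEstimator(random_state=0).fit(X).permutation_
assert np.array_equal(p1, p2)  # same seed -> deterministic fit
```

With this layout, fit no longer needs its own random_seed argument, calling fit twice with an integer seed is reproducible, and users who pass np.random or a RandomState instance (the old behaviour) are still accepted.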