
[MRG] Remove random_seed in fit and use the one in init #224


Merged

Conversation

wdevazelhes (Member)

Fixes #171

This PR replaces the random_seed in fit with the one in init.

Note that I also updated the functions positive_negative_pairs, _pairs, and adjacency_matrix to be more like scikit-learn (i.e. have the random_state parameter default to None), but I don't think it's necessary to add a ChangedBehaviorWarning or a DeprecationWarning, since those are util functions that were not previously documented (and are probably less used than the *_Supervised versions of the algorithms). What is more, the previous behaviour will still work (passing np.random or any RandomState). What do you think?
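For context, here is a minimal sketch of the scikit-learn convention this PR moves towards, using an illustrative toy estimator rather than the actual metric-learn classes:

```python
import numpy as np
from sklearn.utils import check_random_state


class ToyEstimator:
    """Illustrative estimator following the scikit-learn random_state convention."""

    def __init__(self, random_state=None):
        # Store the parameter untouched; it can be None, an int seed,
        # or a numpy RandomState instance.
        self.random_state = random_state

    def fit(self, X):
        # Validate only at fit time, so the stored parameter stays as given.
        rng = check_random_state(self.random_state)
        # Use the resulting RandomState for any sampling, e.g. drawing pairs.
        self.sample_idx_ = rng.choice(len(X), size=2, replace=False)
        return self


est = ToyEstimator(random_state=42).fit(np.arange(10).reshape(5, 2))
print(est.sample_idx_)
```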

wdevazelhes requested review from bellet and perimosocordiae and removed the review request for bellet on June 25, 2019 08:01
@@ -866,21 +866,6 @@ def test_iris(self):
csep = class_separation(rca.transform(self.iris_points), self.iris_labels)
self.assertLess(csep, 0.25)

-  def test_feature_null_variance(self):
wdevazelhes (Member Author)

While I was at it, I removed this test because it tested pca_comps, which is now deprecated. It failed (whereas it passed before), probably because the random seed may change with this PR.

@@ -861,7 +861,8 @@ def test_deprecation_num_dims_lfda(num_dims):

class TestRCA(MetricTestCase):
def test_iris(self):
-    rca = RCA_Supervised(n_components=2, num_chunks=30, chunk_size=2)
+    rca = RCA_Supervised(n_components=2, num_chunks=30, chunk_size=2,
wdevazelhes (Member Author) commented Jun 25, 2019

I set the random seed here to make the test pass, because the random seed may be different than before.
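As a standalone illustration of the same idea (the seed value and the exact call below are placeholders, not necessarily what the test uses):

```python
from sklearn.datasets import load_iris
from metric_learn import RCA_Supervised

X, y = load_iris(return_X_y=True)
# Placeholder seed value; the point is that fixing random_state in __init__
# (rather than in fit) makes the chunk sampling of RCA_Supervised
# deterministic across runs.
rca = RCA_Supervised(n_components=2, num_chunks=30, chunk_size=2,
                     random_state=42)
rca.fit(X, y)
```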

wdevazelhes (Member Author)

I added the ChangedBehaviorWarning that we talked about, @bellet @nvauquie. If all tests are green, we should be ready to merge.
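Roughly, the warning pattern looks like the sketch below; this is an illustration assuming a scikit-learn version that still ships ChangedBehaviorWarning, not the exact code merged here:

```python
import warnings

from sklearn.exceptions import ChangedBehaviorWarning


class _BaseSupervised:
    def __init__(self, random_state=None):
        self.random_state = random_state

    def fit(self, X, y, random_state='deprecated'):
        # If a user still passes random_state to fit, tell them the behaviour
        # changed and that the constructor argument is now the one used.
        if random_state != 'deprecated':
            warnings.warn("'random_state' in 'fit' is deprecated and ignored; "
                          "set 'random_state' when constructing the estimator "
                          "instead.", ChangedBehaviorWarning)
        # ... actual fitting would use self.random_state from here on ...
        return self
```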

bellet (Member) commented Jul 3, 2019

LGTM. There are just some conflicts with master to address before we can merge.

William de Vazelhes added 2 commits July 3, 2019 18:05
# Conflicts:
#	metric_learn/itml.py
#	metric_learn/lsml.py
#	metric_learn/mmc.py
#	metric_learn/rca.py
#	metric_learn/sdml.py
wdevazelhes (Member Author)

There was a small problem with the merge, but it should be fixed now. If it's green, I think we can merge.

@@ -46,7 +49,8 @@ def positive_negative_pairs(self, num_constraints, same_length=False,
return a, b, c, d

def _pairs(self, num_constraints, same_label=True, max_iter=10,
-             random_state=np.random):
+             random_state=None):
+    random_state = check_random_state(random_state)
Contributor

It's probably safe to remove this check, as all callers of _pairs should have done it already.

wdevazelhes (Member Author)

I agree, done
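For illustration, the pattern agreed on here looks roughly like this (function names and signatures are simplified, not the actual metric-learn helpers):

```python
import numpy as np
from sklearn.utils import check_random_state


def _pairs(labels, num_constraints, random_state):
    # Private helper: assumes `random_state` is already a RandomState,
    # so it does not re-validate it.
    return random_state.choice(len(labels), size=(num_constraints, 2))


def positive_negative_pairs(labels, num_constraints, random_state=None):
    # Public entry point: validate once here ...
    random_state = check_random_state(random_state)
    # ... and hand the resulting RandomState down to the private helper.
    return _pairs(labels, num_constraints, random_state)


pairs = positive_negative_pairs(np.arange(20), num_constraints=5, random_state=0)
print(pairs.shape)  # (5, 2)
```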

@bellet bellet merged commit 44fd427 into scikit-learn-contrib:master Jul 3, 2019