[MRG+1] Added sample weight parameter to linearsvc.fit, includes tests and documentation. #6939
Conversation
Look at your history, it is still a mess.
@agramfort, what do you recommend to fix it?
Hi, it seems master is getting merged into the branch and messing up the history. AFAIK the following is the best way to revert:
git checkout master
# git remote add upstream git://github.com/scikit-learn/scikit-learn.git
git fetch upstream
git merge upstream/master
# Now the master of your fork and scikit learn are on par
git checkout linearsvc_sample_weight
# rebase on top of master. This may give conflicts
git rebase -i master
# solve the conflict manually if any
# git remote add origin git@github.com:imaculate/scikit-learn.git
# force push onto this branch
git push origin linearsvc_sample_weight -f
Cheers.
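A side note not covered in the comment above: if the rebase stops on a conflict, the usual way to resume after fixing the files is roughly:
# stage the files you resolved
git add <path/to/conflicted_file>
# continue the rebase
git rebase --continue
# or abandon it and return to the pre-rebase state
git rebase --abort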
Thanks @maniteja123. I have done that; after the last push, it says 'Everything up-to-date'.
Oh sorry, I am no expert in this either. But looking at the comparison here, it seems that your master is ahead of the scikit-learn master. Maybe you could first update your local master to be on par with upstream master and then rebase on it.
As you have only one commit, you can cherry-pick it from a freshly downloaded master:
# go to your branch
git checkout linearsvc_sample_weight
# always do a backup
git checkout -b linearsvc_sample_weight_back_up
# delete master
git branch -D master
# and download a fresh new version of master
git fetch upstream master:master
# go to master
git checkout master
# delete your branch (you have a backup)
git branch -D linearsvc_sample_weight
# recreate your branch from fresh master
git checkout -b linearsvc_sample_weight
# apply only your last commit
git cherry-pick 8e403d69df5a9de0daf9344e662677dd7abc45ac
# force push
git push -f origin linearsvc_sample_weight
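A quick sanity check, not in the original comment, that can be run before the force push to confirm the branch now holds exactly one commit on top of the fresh master:
# list commits on the branch that are not on master
git log --oneline master..linearsvc_sample_weight
# should show a single entry: the cherry-picked commit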
Force-pushed the branch from 8e403d6 to 94dc0a6.
Thanks @TomDLT, I have done that.
unit_weight = np.ones(n_samples)
clf = svm.LinearSVC(random_state=0).fit(X, Y)
clf_unitweight = svm.LinearSVC(random_state=0).fit(X, Y,
    sample_weight=unit_weight)
Please follow PEP8 indentation. An easy option is to do the .fit
on a separate line.
You have a testing failure to deal with.
Fixed the above @jnothman
unit_weight = np.ones(n_samples)
clf = svm.LinearSVC(random_state=0).fit(X, Y)
clf_unitweight = svm.LinearSVC(random_state=0).\
    fit(X, Y, sample_weight=unit_weight)
I actually meant just clf_unitweight.fit(...)
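For clarity, a minimal sketch of the pattern being suggested here, reusing the names from the test above (instantiate first, then call fit on its own line):
clf_unitweight = svm.LinearSVC(random_state=0)
clf_unitweight.fit(X, Y, sample_weight=unit_weight)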
But this is okay...
LGTM
@@ -177,6 +177,11 @@ def fit(self, X, y):
y : array-like, shape = [n_samples]
    Target vector relative to X

sample_weight : array-like, shape = [n_samples], optional
Array of weights that are assigned to individual
Keep the same alignment as in the other docstrings: all the lines after the parameter's first line are indented by four spaces.
Done!
Hi, I am afraid that you got this alignment wrong: have a look at the lines right above the ones that you modified; your documentation line should be aligned the same way as the documentation line of the "y" parameter.
sample_weight : array-like, shape = [n_samples], optional
    Array of weights that are assigned to individual
    samples. If not provided,
    then each sample is given unit weight.
This one is correct :).
Aside from the minor comment, this looks good to me. Thanks!
@GaelVaroquaux, Done! Thank you. @lesteve, should I rename the function?
Yes, rename the test and we'll merge.
Thanks!
Thanks. I've updated
Reference Issue
A redo of #6937 from a fresh master. Stems from #6862.
What does this implement/fix? Explain your changes.
Any other comments?
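For illustration only, not part of the original description: a small sketch of how the parameter added by this PR would be used, assuming the fit signature introduced here.
import numpy as np
from sklearn.svm import LinearSVC

X = np.array([[0., 0.], [1., 1.], [2., 2.], [3., 3.]])
y = np.array([0, 0, 1, 1])
# weights default to 1 per sample; here the last two samples are up-weighted
weights = np.array([1., 1., 5., 5.])

clf = LinearSVC(random_state=0)
clf.fit(X, y, sample_weight=weights)
print(clf.predict([[1.5, 1.5]]))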