
Update fbeta_score to accept beta=0 #13231

Merged
rth merged 7 commits into scikit-learn:master from corona10:gh-13224 on Jul 19, 2019

Conversation

@corona10
Contributor

Reference Issues/PRs

Fixes: #13224

What does this implement/fix? Explain your changes.

Update fbeta_score API to accept beta=0

Any other comments?

I am a first-time contributor to this project.
Please let me know if I make any mistakes!
Thanks
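The change under review can be illustrated with a toy version of the textbook F-beta formula. This is a sketch for illustration only, not scikit-learn's implementation; the `fbeta` helper and the sample precision/recall values are hypothetical.

```python
def fbeta(precision, recall, beta):
    """Toy F-beta from the textbook formula (not scikit-learn's code)."""
    b2 = beta * beta
    denom = b2 * precision + recall
    if denom == 0.0:
        return 0.0
    return (1.0 + b2) * precision * recall / denom

# With beta=0 the formula collapses to precision (when recall > 0),
# which is why accepting beta=0 is a meaningful extension of the API:
print(fbeta(0.5, 0.8, beta=0))  # 0.5, i.e. exactly the precision value
print(fbeta(0.5, 0.8, beta=1))  # the usual F1 harmonic mean
```

Plugging beta=0 into the formula gives `1 * P * R / (0 + R) = P`, so the beta=0 case is well defined rather than an error.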

@corona10
Contributor Author

@qinhanmin2014
Hi, can you take a look, please?

IMHO, the current failure is not related to this PR:

>>> from sklearn.feature_extraction import image
>>> # Use the array data from the first image in this dataset:

Thanks a lot!

Member

@qinhanmin2014 qinhanmin2014 left a comment

Please update the doc accordingly, e.g., beta : float >= 0.
Ignore the test failure if it's irrelevant.

@qinhanmin2014
Member

Please add an entry to the change log at doc/whats_new/v*.rst. Like the other entries there, please reference this pull request with :issue: and credit yourself (and other contributors if applicable) with :user:.

@corona10
Contributor Author

@qinhanmin2014 Thanks for the kind review!

Member

@qinhanmin2014 qinhanmin2014 left a comment

LGTM, thanks @corona10

assert_equal(1., precision_score([1, 1], [1, 1]))
assert_equal(1., recall_score([1, 1], [1, 1]))
assert_equal(1., f1_score([1, 1], [1, 1]))
assert_equal(1., fbeta_score([1, 1], [1, 1], 0))
Member

You need to update the comment above: "Test precision, recall and F1 score" -> "Test precision, recall and fscore", or something you like.

Member

> You need to update the comment above: "Test precision, recall and F1 score" -> "Test precision, recall and fscore", or something you like.

Where's this change? You don't need to squash your commits.
Maybe also add assert_equal(XXX, fbeta_score([1, 1], [1, 1], float('inf'))) here.

Contributor Author

@qinhanmin2014 Looks like there was some accident while I was working on another machine.

Member

Maybe also add assert_equal(XXX, fbeta_score([1, 1], [1, 1], float('inf'))) here.

Contributor Author

@qinhanmin2014
aa90c28
I've added the test for that case.
Due to https://bugs.python.org/issue28579, I use the numpy.isnan method for this test.
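Whatever the details of that CPython issue, there is a practical reason an isnan check is needed here: plugging beta=inf into the F-beta formula naively produces NaN (since 0 * inf is NaN), and NaN never compares equal to anything, including itself, so a plain equality assertion cannot detect it. A minimal illustration (the values below are illustrative, not taken from the test):

```python
import math

b2 = float("inf") ** 2    # squaring infinity still gives inf
print(0.0 * b2)           # nan: zero times infinity is undefined

x = float("nan")
print(x == x)             # False: NaN is unequal even to itself
print(math.isnan(x))      # True: isnan is the reliable check
```

This is why `assert_equal(nan, result)` cannot work and an `isnan`-based assertion is used instead.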

Member

assert_true is deprecated, use bare assert instead
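The requested change, sketched on a hypothetical test line (assuming the deprecated helper was wrapping the isnan check mentioned above):

```python
import math

# Stand-in for the real fbeta_score(..., beta=float('inf')) result:
result = float("nan")

# Before (deprecated helper from sklearn.utils.testing):
#     assert_true(math.isnan(result))
# After (plain bare assert, which pytest rewrites with full introspection):
assert math.isnan(result)
```

The bare form needs no import from the deprecated testing module and produces the same failure information under pytest.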

@corona10
Contributor Author

@qinhanmin2014
Done! Thanks!

Member

@jnothman jnothman left a comment


If we're supporting beta=0, we should support beta=inf. Can we please add a test for that?

@corona10
Contributor Author

@jnothman Sure!

@corona10
Contributor Author

@qinhanmin2014 PTAL

@corona10 corona10 changed the title Update fbeta_score to accept beta=0 [WIP] Update fbeta_score to accept beta=0 Feb 28, 2019
@corona10 corona10 changed the title [WIP] Update fbeta_score to accept beta=0 Update fbeta_score to accept beta=0 Feb 28, 2019
@qinhanmin2014
Member

So actually we do not support beta=inf currently (see the new test). Do you think we need to add support for that, @jnothman? I'm +0, so you can make the decision here :)

@jnothman
Member

I'm +0 for beta=0. But if we support beta=0 we should support beta=inf!
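The symmetry Joel is pointing to follows from the limits of the F-beta formula: just as beta=0 reduces the score to precision, the score approaches recall as beta grows without bound. A numeric sketch with a toy formula (illustrative values; not the library's code):

```python
def fbeta(precision, recall, beta):
    """Toy F-beta formula, used only to show the large-beta limit."""
    b2 = beta * beta
    return (1.0 + b2) * precision * recall / (b2 * precision + recall)

p, r = 0.5, 0.8
for beta in (1.0, 10.0, 1000.0):
    print(beta, fbeta(p, r, beta))
# the printed values climb toward r = 0.8 as beta grows
```

So supporting beta=inf amounts to returning recall in that limit, mirroring how beta=0 returns precision.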

@jnothman
Member

jnothman commented May 1, 2019

Note that if you keep working on this, the what's new entry should move to v0.22.rst as v0.21 has been closed.

@rth
Member

rth commented Jul 18, 2019

@corona10 Would you be interested in continuing this PR?

You would need to resolve the merge conflicts and handle the case of beta=inf. I can review once that is done.

@corona10
Contributor Author

@rth Thanks! I will take a look

@corona10 corona10 changed the title Update fbeta_score to accept beta=0 [WIP] Update fbeta_score to accept beta=0 Jul 19, 2019
@corona10 corona10 force-pushed the gh-13224 branch 2 times, most recently from 92f153e to 4663d09 Compare July 19, 2019 04:41
@corona10 corona10 changed the title [WIP] Update fbeta_score to accept beta=0 Update fbeta_score to accept beta=0 Jul 19, 2019
@corona10
Contributor Author

@rth cc @jnothman
Please take a look

Member

@rth rth left a comment

Thanks! A few last comments. Also, please update the beta docstring to say that it needs to be positive.

Otherwise LGTM.

@corona10
Contributor Author

@rth
I've updated the PR; please take a look.

 The `beta` parameter determines the weight of recall in the combined
 score. ``beta < 1`` lends more weight to precision, while ``beta > 1``
-favors recall (``beta -> 0`` considers only precision, ``beta -> inf``
+favors recall (``beta -> 0`` considers only precision, ``beta -> +inf``
Contributor Author

@rth Is this the docstring you mean?
If not, please let me know which docstring should be updated.
Thanks!

Member

I meant the

beta : float
        Weight of precision in harmonic mean.

but never mind, this description is enough.

Member

@rth rth left a comment

Thanks @corona10 !

Merging with a +2, despite Joel's +0. I don't think this is too controversial.

@rth rth merged commit b4c1c4e into scikit-learn:master Jul 19, 2019
@corona10 corona10 deleted the gh-13224 branch July 19, 2019 14:38


Successfully merging this pull request may close these issues.

fbeta_score does not accept beta=0

4 participants