Classification metrics overhaul: accuracy metrics (2/n)#4838

Merged
tchaton merged 136 commits into Lightning-AI:release/1.2-dev from tadejsv:cls_metrics_accuracy
Dec 21, 2020
Conversation

@tadejsv
Contributor

tadejsv commented Nov 24, 2020

This PR is a spin-off from #4835.

What does this PR do?

Metrics (changes and additions)

The metrics below are based on the new input formatting function from #4837

Accuracy

The accuracy metric now gets a new subset_accuracy parameter, to calculate subset accuracy in the case of multi-label or multi-dimensional multi-class inputs. This parameter is set to False by default, which preserves the default behavior of the old metric.
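To illustrate the difference between the two modes, here is a plain-Python sketch of the semantics (illustrative only, not the PR's implementation; the variable names are made up):

```python
# Two multi-label accuracy modes on thresholded predictions.
preds  = [[1, 0, 1], [0, 1, 1]]   # multi-label predictions (after thresholding)
target = [[1, 0, 0], [0, 1, 1]]

# Default (subset_accuracy=False): every individual label counts.
flat = [p == t for pr, tr in zip(preds, target) for p, t in zip(pr, tr)]
label_acc = sum(flat) / len(flat)                                          # 5/6

# subset_accuracy=True: a sample is correct only if ALL its labels match.
subset_acc = sum(pr == tr for pr, tr in zip(preds, target)) / len(preds)   # 1/2
```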

Additionally, a top_k parameter enables it to be used as top-k accuracy for multi-class inputs (with probabilities) - thanks @rohitgr7. The top_k format does not yet have an equivalent in sklearn (it will in 0.24).
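The top-k behaviour can be sketched in a few lines of plain Python (a sketch of the semantics only, not the PR's code; the function name is made up):

```python
# A prediction counts as correct if the target class is among the k classes
# with the highest predicted probability.
def topk_accuracy(probs, target, k):
    hits = 0
    for row, t in zip(probs, target):
        # indices of the k highest-probability classes for this sample
        top = sorted(range(len(row)), key=row.__getitem__, reverse=True)[:k]
        hits += t in top
    return hits / len(target)

probs = [
    [0.1, 0.6, 0.3],   # top-2 classes: 1, 2
    [0.5, 0.3, 0.2],   # top-2 classes: 0, 1
]
target = [2, 1]

topk_accuracy(probs, target, k=1)  # 0.0 -- neither argmax hits the target
topk_accuracy(probs, target, k=2)  # 1.0 -- both targets are in the top 2
```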

HammingDistance (new)

This is equivalent to hamming_loss from sklearn.
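For reference, the quantity being computed is the fraction of individual labels predicted wrongly; a plain-Python sketch (not the library code):

```python
# Hamming distance: fraction of individual labels predicted wrongly,
# the same quantity as sklearn.metrics.hamming_loss.
preds  = [[1, 0, 1], [0, 1, 1]]
target = [[1, 0, 0], [0, 1, 1]]

pairs = [(p, t) for pr, tr in zip(preds, target) for p, t in zip(pr, tr)]
hamming = sum(p != t for p, t in pairs) / len(pairs)   # 1/6: one wrong label out of six
```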

@pep8speaks

pep8speaks commented Nov 24, 2020

Hello @tadejsv! Thanks for updating this PR.

There are currently no PEP 8 issues detected in this Pull Request. Cheers! 🍻

Comment last updated at 2020-12-21 13:07:13 UTC

@tadejsv tadejsv changed the title from "Classification metrics overhaul: accuracy metrics (2a/n)" to "Classification metrics overhaul: accuracy metrics (2/n)" Nov 24, 2020
@SkafteNicki SkafteNicki added feature Is an improvement or enhancement Metrics labels Nov 25, 2020
@SkafteNicki SkafteNicki added this to the 1.1 milestone Nov 25, 2020
@Borda Borda changed the title from "Classification metrics overhaul: accuracy metrics (2/n) [blocked by #4837]" to "Classification metrics overhaul: accuracy metrics (2/n) [wip]" Nov 25, 2020
@Borda Borda requested a review from justusschock December 17, 2020 10:17
@Borda
Collaborator

Borda commented Dec 17, 2020

> @Borda any chance we can get this merged :) ?

Yes, it's ready, let me check why it cannot be merged...
@justusschock @teddykoker mind re-reviewing? (your approvals were reset by the system)

justusschock
justusschock previously approved these changes Dec 17, 2020
@Borda
Collaborator

Borda commented Dec 17, 2020

@tadejsv mind rebasing on release/1.2-dev to remove all the wrong edits?

@SkafteNicki
Collaborator

@Borda approvals still seem to be dismissed by new commits

@tadejsv
Contributor Author

tadejsv commented Dec 17, 2020

@Borda I already rebased (and I did it again just now). The problem is that I rebased onto master in the period between when the 1.2-dev branch was created and when it was announced in the Slack channel. In my opinion, that branch should have been rebased onto master right before it was announced.

It doesn't contain, for example, PR #5134, which is "accidentally" picked up in this PR. If I remove it, the tests here will fail; if I disable the DDP tests, another PR will be needed once 1.2-dev is rebased on master to re-enable them...

@Borda
Collaborator

Borda commented Dec 17, 2020

> @Borda approvals still seem to be dismissed by new commits

I know, I am pushing @williamFalcon as much as I can... @edenlightning @tchaton

@Borda
Collaborator

Borda commented Dec 21, 2020

@tadejsv it seems we resolved the issue with dismissed approvals, but mind resolving the conflicts with release/1.2-dev so we can go... 🐰
@justusschock @teddykoker @ananthsub @SkafteNicki mind having a quick look? I believe this was already approved once :]

@tadejsv
Contributor Author

tadejsv commented Dec 21, 2020

@Borda conflicts resolved, we're ready to go :)

Collaborator

@Borda Borda left a comment


lgtm

Contributor

@rohitgr7 rohitgr7 left a comment


Just one comment. Else LGTM. Nice work!!

Accepts all input types listed in :ref:`metrics:Input types`.

Args:
    threshold:
Contributor

@rohitgr7 rohitgr7 Dec 21, 2020


Just wondering, should threshold be None by default too? It's not used when we provide pred_labels, and we use 0.5 by default when we have pred_probs.

Contributor Author


Well, I guess this could be done, but in a way that if pred_probs are passed in, None would default to 0.5. Otherwise this would be a very disruptive breaking change for people used to using accuracy without extra params.

Contributor

@rohitgr7 rohitgr7 Dec 21, 2020


Yeah, that's what I meant by "use 0.5 by default when we have pred_probs".

Contributor Author


Got it, will add this in the next PR.
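For the record, the behaviour agreed on above could look roughly like this (a hypothetical helper, not code from this PR; the function name is made up):

```python
# Hypothetical resolution of a None threshold, as discussed above:
# default to 0.5 only when the predictions are probabilities.
def resolve_threshold(threshold, preds_are_probs):
    """Return the effective threshold for binarizing predictions."""
    if threshold is None:
        return 0.5 if preds_are_probs else None
    return threshold
```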

Contributor

@tchaton tchaton left a comment


Looks great! Minor comments.

tensor(0.6667)
"""

if top_k is not None and top_k <= 0:
Contributor


We should test that top_k is not a float.
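Something along these lines, perhaps (a sketch of the suggested check; the helper name is hypothetical):

```python
# Reject non-integer (and non-positive) top_k values up front.
def check_top_k(top_k):
    if top_k is not None:
        if not isinstance(top_k, int) or isinstance(top_k, bool) or top_k <= 0:
            raise ValueError(f"`top_k` should be a positive integer, got {top_k!r}")
```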

return class_reduce(tps, sups, sups, class_reduction=class_reduction)


def _confmat_normalize(cm):
Contributor


Should we provide the normalisation dimension?

Contributor Author


For what?

Contributor


Some people like to visualise both matrices.

Contributor Author


Sorry, I don't understand. If you are speaking about the confusion matrix - that is not part of this PR (this one is about the accuracy metric).

Collaborator

@SkafteNicki SkafteNicki left a comment


LGTM, but wait to merge until the release branch gets synced with master tomorrow. Then this PR just needs to be updated so that the fix from #5134 is no longer part of it.

@tchaton tchaton merged commit ccffc34 into Lightning-AI:release/1.2-dev Dec 21, 2020
@tchaton
Contributor

tchaton commented Dec 21, 2020

> LGTM, but wait to merge until the release branch gets synced with master tomorrow. Then this PR just needs to be updated so that the fix from #5134 is no longer part of it.

Oh sorry @SkafteNicki, I hadn't seen your message when I merged. I refreshed the page and only then saw it.

@SkafteNicki
Collaborator

@tchaton no probs. What should we do about it? Resolve the merge conflicts when master is merged into release/1.2-dev, or revert this PR?


Labels

feature (Is an improvement or enhancement), ready to be merged (PRs ready to be merged)
