Commit c35620f
fix doc linking
1 parent 826c937 commit c35620f

14 files changed
Lines changed: 51 additions & 111 deletions

src/torchmetrics/classification/accuracy.py
Lines changed: 3 additions & 10 deletions

@@ -350,8 +350,6 @@ class Accuracy(StatScores):
     changed to subset accuracy (which requires all labels or sub-samples in the sample to
     be correctly predicted) by setting ``subset_accuracy=True``.
 
-    Accepts all input types listed in :ref:`pages/classification:input types`.
-
     Args:
         num_classes:
             Number of classes. Necessary for ``'macro'``, ``'weighted'`` and ``None`` average methods.
@@ -387,11 +385,10 @@ class Accuracy(StatScores):
             - ``'samplewise'``: In this case, the statistics are computed separately for each
               sample on the ``N`` axis, and then averaged over samples.
               The computation for each sample is done by treating the flattened extra axes ``...``
-              (see :ref:`pages/classification:input types`) as the ``N`` dimension within the sample,
+              as the ``N`` dimension within the sample,
               and computing the metric for the sample based on that.
 
             - ``'global'``: In this case the ``N`` and ``...`` dimensions of the inputs
-              (see :ref:`pages/classification:input types`)
               are flattened into a new ``N_X`` sample axis, i.e. the inputs are treated as if they
               were ``(N_X, C)``. From here on the ``average`` parameter applies as usual.
@@ -409,9 +406,7 @@ class Accuracy(StatScores):
 
         multiclass:
             Used only in certain special cases, where you want to treat inputs as a different type
-            than what they appear to be. See the parameter's
-            :ref:`documentation section <pages/classification:using the multiclass parameter>`
-            for a more detailed explanation and examples.
+            than what they appear to be.
 
         subset_accuracy:
             Whether to compute subset accuracy for multi-label and multi-dimensional
@@ -557,9 +552,7 @@ def __init__(
         self.add_state("total", default=tensor(0), dist_reduce_fx="sum")
 
     def update(self, preds: Tensor, target: Tensor) -> None:  # type: ignore
-        """Update state with predictions and targets. See
-        :ref:`pages/classification:input types` for more information on input
-        types.
+        """Update state with predictions and targets.
 
         Args:
             preds: Predictions from model (logits, probabilities, or labels)

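The ``subset_accuracy`` flag described in the docstring above can be illustrated with a small pure-Python sketch. The helper names here are hypothetical and operate on plain lists rather than tensors; this is not the torchmetrics implementation:

```python
def label_accuracy(preds, target):
    """Fraction of individual labels predicted correctly (default behaviour)."""
    flat = [p == t for pr, tg in zip(preds, target) for p, t in zip(pr, tg)]
    return sum(flat) / len(flat)

def subset_accuracy(preds, target):
    """Fraction of samples whose labels are ALL predicted correctly."""
    exact = [pr == tg for pr, tg in zip(preds, target)]
    return sum(exact) / len(exact)

preds = [[1, 0, 1], [0, 1, 1]]
target = [[1, 0, 0], [0, 1, 1]]
# 5 of 6 individual labels match, but only the second sample matches exactly,
# so label_accuracy gives 5/6 while subset_accuracy gives 0.5.
```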
src/torchmetrics/classification/dice.py
Lines changed: 4 additions & 6 deletions

@@ -33,7 +33,7 @@ class Dice(StatScores):
 
     The reduction method (how the precision scores are aggregated) is controlled by the
     ``average`` parameter, and additionally by the ``mdmc_average`` parameter in the
-    multi-dimensional multi-class case. Accepts all inputs listed in :ref:`pages/classification:input types`.
+    multi-dimensional multi-class case.
 
     Args:
         num_classes:
@@ -69,11 +69,11 @@ class Dice(StatScores):
             - ``'samplewise'``: In this case, the statistics are computed separately for each
               sample on the ``N`` axis, and then averaged over samples.
               The computation for each sample is done by treating the flattened extra axes ``...``
-              (see :ref:`pages/classification:input types`) as the ``N`` dimension within the sample,
+              as the ``N`` dimension within the sample,
               and computing the metric for the sample based on that.
 
             - ``'global'``: In this case the ``N`` and ``...`` dimensions of the inputs
-              (see :ref:`pages/classification:input types`) are flattened into a new ``N_X`` sample axis, i.e.
+              are flattened into a new ``N_X`` sample axis, i.e.
               the inputs are treated as if they were ``(N_X, C)``.
               From here on the ``average`` parameter applies as usual.
 
@@ -90,9 +90,7 @@ class Dice(StatScores):
 
         multiclass:
             Used only in certain special cases, where you want to treat inputs as a different type
-            than what they appear to be. See the parameter's
-            :ref:`documentation section <pages/classification:using the multiclass parameter>`
-            for a more detailed explanation and examples.
+            than what they appear to be.
 
         kwargs: Additional keyword arguments, see :ref:`Metric kwargs` for more info.
 

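For reference, the Dice score this class computes reduces per class to ``2*TP / (2*TP + FP + FN)``. A minimal pure-Python sketch for binary labels (hypothetical helper, not the torchmetrics code):

```python
def dice_score(preds, target):
    """Dice = 2*TP / (2*TP + FP + FN) over binary label vectors."""
    tp = sum(p == 1 and t == 1 for p, t in zip(preds, target))
    fp = sum(p == 1 and t == 0 for p, t in zip(preds, target))
    fn = sum(p == 0 and t == 1 for p, t in zip(preds, target))
    return 2 * tp / (2 * tp + fp + fn)

preds = [1, 1, 0, 1]
target = [1, 0, 0, 1]
# tp=2, fp=1, fn=0 -> 4 / 5 = 0.8
```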
src/torchmetrics/classification/f_beta.py
Lines changed: 4 additions & 10 deletions

@@ -773,11 +773,10 @@ class FBetaScore(StatScores):
             - ``'samplewise'``: In this case, the statistics are computed separately for each
               sample on the ``N`` axis, and then averaged over samples.
               The computation for each sample is done by treating the flattened extra axes ``...``
-              (see :ref:`pages/classification:input types`) as the ``N`` dimension within the sample,
+              as the ``N`` dimension within the sample,
               and computing the metric for the sample based on that.
 
             - ``'global'``: In this case the ``N`` and ``...`` dimensions of the inputs
-              (see :ref:`pages/classification:input types`)
               are flattened into a new ``N_X`` sample axis, i.e. the inputs are treated as if they
               were ``(N_X, C)``. From here on the ``average`` parameter applies as usual.
 
@@ -795,9 +794,7 @@ class FBetaScore(StatScores):
 
         multiclass:
             Used only in certain special cases, where you want to treat inputs as a different type
-            than what they appear to be. See the parameter's
-            :ref:`documentation section <pages/classification:using the multiclass parameter>`
-            for a more detailed explanation and examples.
+            than what they appear to be.
 
         kwargs: Additional keyword arguments, see :ref:`Metric kwargs` for more info.
 
@@ -958,11 +955,10 @@ class F1Score(FBetaScore):
             - ``'samplewise'``: In this case, the statistics are computed separately for each
               sample on the ``N`` axis, and then averaged over samples.
               The computation for each sample is done by treating the flattened extra axes ``...``
-              (see :ref:`pages/classification:input types`) as the ``N`` dimension within the sample,
+              as the ``N`` dimension within the sample,
               and computing the metric for the sample based on that.
 
             - ``'global'``: In this case the ``N`` and ``...`` dimensions of the inputs
-              (see :ref:`pages/classification:input types`)
               are flattened into a new ``N_X`` sample axis, i.e. the inputs are treated as if they
               were ``(N_X, C)``. From here on the ``average`` parameter applies as usual.
 
@@ -979,9 +975,7 @@ class F1Score(FBetaScore):
 
         multiclass:
            Used only in certain special cases, where you want to treat inputs as a different type
-            than what they appear to be. See the parameter's
-            :ref:`documentation section <pages/classification:using the multiclass parameter>`
-            for a more detailed explanation and examples.
+            than what they appear to be.
 
         kwargs: Additional keyword arguments, see :ref:`Metric kwargs` for more info.

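The per-class score that FBetaScore and F1Score aggregate is the standard formula ``F_beta = (1 + beta^2) * P * R / (beta^2 * P + R)``, with F1 the special case ``beta=1``. A short illustrative sketch (hypothetical helper):

```python
def fbeta(precision, recall, beta):
    """F-beta from a precision/recall pair; beta < 1 favours precision."""
    b2 = beta * beta
    return (1 + b2) * precision * recall / (b2 * precision + recall)

# With P=0.5 and R=1.0: F1 is 2*0.5*1.0 / (0.5 + 1.0) = 2/3, while F0.5
# weights precision more heavily and comes out lower.
```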
src/torchmetrics/classification/hamming.py
Lines changed: 0 additions & 4 deletions

@@ -340,8 +340,6 @@ class HammingDistance(Metric):
     treats each possible label separately - meaning that, for example, multi-class data is
     treated as if it were multi-label.
 
-    Accepts all input types listed in :ref:`pages/classification:input types`.
-
     Args:
         threshold:
             Threshold for transforming probability or logit predictions to binary ``(0,1)`` predictions, in the case
@@ -423,8 +421,6 @@ def __init__(
     def update(self, preds: Tensor, target: Tensor) -> None:  # type: ignore
         """Update state with predictions and targets.
 
-        See :ref:`pages/classification:input types` for more information on input types.
-
         Args:
             preds: Predictions from model (probabilities, logits or labels)
             target: Ground truth labels

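The Hamming distance described above is simply the fraction of individual labels that are predicted wrongly, treating every label separately. A pure-Python sketch (illustrative, not the torchmetrics implementation):

```python
def hamming_distance(preds, target):
    """Fraction of individual labels predicted wrongly across all samples."""
    pairs = [(p, t) for pr, tg in zip(preds, target) for p, t in zip(pr, tg)]
    return sum(p != t for p, t in pairs) / len(pairs)

preds = [[0, 1], [1, 1]]
target = [[0, 1], [0, 1]]
# one wrong label out of four -> 0.25
```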
src/torchmetrics/classification/precision_recall.py
Lines changed: 7 additions & 12 deletions

@@ -623,7 +623,7 @@ class Precision(StatScores):
 
     The reduction method (how the precision scores are aggregated) is controlled by the
     ``average`` parameter, and additionally by the ``mdmc_average`` parameter in the
-    multi-dimensional multi-class case. Accepts all inputs listed in :ref:`pages/classification:input types`.
+    multi-dimensional multi-class case.
 
     Args:
         num_classes:
@@ -657,11 +657,11 @@ class Precision(StatScores):
             - ``'samplewise'``: In this case, the statistics are computed separately for each
               sample on the ``N`` axis, and then averaged over samples.
               The computation for each sample is done by treating the flattened extra axes ``...``
-              (see :ref:`pages/classification:input types`) as the ``N`` dimension within the sample,
+              as the ``N`` dimension within the sample,
               and computing the metric for the sample based on that.
 
             - ``'global'``: In this case the ``N`` and ``...`` dimensions of the inputs
-              (see :ref:`pages/classification:input types`) are flattened into a new ``N_X`` sample axis, i.e.
+              are flattened into a new ``N_X`` sample axis, i.e.
               the inputs are treated as if they were ``(N_X, C)``.
               From here on the ``average`` parameter applies as usual.
 
@@ -678,9 +678,7 @@ class Precision(StatScores):
 
         multiclass:
             Used only in certain special cases, where you want to treat inputs as a different type
-            than what they appear to be. See the parameter's
-            :ref:`documentation section <pages/classification:using the multiclass parameter>`
-            for a more detailed explanation and examples.
+            than what they appear to be.
 
         kwargs: Additional keyword arguments, see :ref:`Metric kwargs` for more info.
 
@@ -813,7 +811,7 @@ class Recall(StatScores):
 
     The reduction method (how the recall scores are aggregated) is controlled by the
     ``average`` parameter, and additionally by the ``mdmc_average`` parameter in the
-    multi-dimensional multi-class case. Accepts all inputs listed in :ref:`pages/classification:input types`.
+    multi-dimensional multi-class case.
 
     Args:
         num_classes:
@@ -846,11 +844,10 @@ class Recall(StatScores):
             - ``'samplewise'``: In this case, the statistics are computed separately for each
               sample on the ``N`` axis, and then averaged over samples.
               The computation for each sample is done by treating the flattened extra axes ``...``
-              (see :ref:`pages/classification:input types`) as the ``N`` dimension within the sample,
+              as the ``N`` dimension within the sample,
               and computing the metric for the sample based on that.
 
             - ``'global'``: In this case the ``N`` and ``...`` dimensions of the inputs
-              (see :ref:`pages/classification:input types`)
               are flattened into a new ``N_X`` sample axis, i.e. the inputs are treated as if they
               were ``(N_X, C)``. From here on the ``average`` parameter applies as usual.
 
@@ -868,9 +865,7 @@ class Recall(StatScores):
 
         multiclass:
             Used only in certain special cases, where you want to treat inputs as a different type
-            than what they appear to be. See the parameter's
-            :ref:`documentation section <pages/classification:using the multiclass parameter>`
-            for a more detailed explanation and examples.
+            than what they appear to be.
 
         kwargs: Additional keyword arguments, see :ref:`Metric kwargs` for more info.
 

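The quantities these two classes aggregate are ``P = TP / (TP + FP)`` and ``R = TP / (TP + FN)``. A binary pure-Python sketch (hypothetical helper, not the library code):

```python
def precision_recall(preds, target):
    """Return (precision, recall) for binary predictions."""
    tp = sum(p == 1 and t == 1 for p, t in zip(preds, target))
    fp = sum(p == 1 and t == 0 for p, t in zip(preds, target))
    fn = sum(p == 0 and t == 1 for p, t in zip(preds, target))
    return tp / (tp + fp), tp / (tp + fn)

preds = [1, 1, 0, 0, 1]
target = [1, 0, 1, 0, 1]
# tp=2, fp=1, fn=1 -> precision 2/3, recall 2/3
```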
src/torchmetrics/classification/specificity.py
Lines changed: 3 additions & 6 deletions

@@ -314,7 +314,7 @@ class Specificity(StatScores):
 
     The reduction method (how the specificity scores are aggregated) is controlled by the
     ``average`` parameter, and additionally by the ``mdmc_average`` parameter in the
-    multi-dimensional multi-class case. Accepts all inputs listed in :ref:`pages/classification:input types`.
+    multi-dimensional multi-class case.
 
     Args:
         num_classes:
@@ -348,11 +348,10 @@ class Specificity(StatScores):
             - ``'samplewise'``: In this case, the statistics are computed separately for each
               sample on the ``N`` axis, and then averaged over samples.
               The computation for each sample is done by treating the flattened extra axes ``...``
-              (see :ref:`pages/classification:input types`) as the ``N`` dimension within the sample,
+              as the ``N`` dimension within the sample,
               and computing the metric for the sample based on that.
 
             - ``'global'``: In this case the ``N`` and ``...`` dimensions of the inputs
-              (see :ref:`pages/classification:input types`)
               are flattened into a new ``N_X`` sample axis, i.e. the inputs are treated as if they
               were ``(N_X, C)``. From here on the ``average`` parameter applies as usual.
 
@@ -371,9 +370,7 @@ class Specificity(StatScores):
 
         multiclass:
             Used only in certain special cases, where you want to treat inputs as a different type
-            than what they appear to be. See the parameter's
-            :ref:`documentation section <pages/classification:using the multiclass parameter>`
-            for a more detailed explanation and examples.
+            than what they appear to be.
 
         kwargs: Additional keyword arguments, see :ref:`Metric kwargs` for more info.
 

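Specificity is the true-negative rate, ``TN / (TN + FP)``. A binary pure-Python sketch (illustrative only):

```python
def specificity(preds, target):
    """True-negative rate: TN / (TN + FP) for binary predictions."""
    tn = sum(p == 0 and t == 0 for p, t in zip(preds, target))
    fp = sum(p == 1 and t == 0 for p, t in zip(preds, target))
    return tn / (tn + fp)

preds = [1, 0, 0, 1]
target = [0, 0, 1, 1]
# tn=1, fp=1 -> 0.5
```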
src/torchmetrics/classification/stat_scores.py
Lines changed: 3 additions & 7 deletions

@@ -503,7 +503,7 @@ class StatScores(Metric):
     ``reduce`` parameter, and additionally by the ``mdmc_reduce`` parameter in the
     multi-dimensional multi-class case.
 
-    Accepts all inputs listed in :ref:`pages/classification:input types`.
+
 
     Args:
         threshold:
@@ -539,7 +539,7 @@ class StatScores(Metric):
         mdmc_reduce: Defines how the multi-dimensional multi-class inputs are handeled. Should be one of the following:
 
             - ``None`` [default]: Should be left unchanged if your data is not multi-dimensional
-              multi-class (see :ref:`pages/classification:input types` for the definition of input types).
+              multi-class.
 
             - ``'samplewise'``: In this case, the statistics are computed separately for each
               sample on the ``N`` axis, and then the outputs are concatenated together. In each
@@ -553,9 +553,7 @@ class StatScores(Metric):
 
         multiclass:
             Used only in certain special cases, where you want to treat inputs as a different type
-            than what they appear to be. See the parameter's
-            :ref:`documentation section <pages/classification:using the multiclass parameter>`
-            for a more detailed explanation and examples.
+            than what they appear to be.
 
         kwargs: Additional keyword arguments, see :ref:`Metric kwargs` for more info.
 
@@ -690,8 +688,6 @@ def __init__(
     def update(self, preds: Tensor, target: Tensor) -> None:  # type: ignore
         """Update state with predictions and targets.
 
-        See :ref:`pages/classification:input types` for more information on input types.
-
         Args:
             preds: Predictions from model (probabilities, logits or labels)
             target: Ground truth values

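StatScores is the base the other metrics in this commit build on: it accumulates true positives, false positives, true negatives, false negatives, and the support. A binary pure-Python stand-in for the tensor logic (the function name and the trailing support entry are illustrative assumptions, not the exact library output contract):

```python
def stat_scores(preds, target):
    """Return (tp, fp, tn, fn, support) for binary predictions."""
    tp = sum(p == 1 and t == 1 for p, t in zip(preds, target))
    fp = sum(p == 1 and t == 0 for p, t in zip(preds, target))
    tn = sum(p == 0 and t == 0 for p, t in zip(preds, target))
    fn = sum(p == 0 and t == 1 for p, t in zip(preds, target))
    return tp, fp, tn, fn, tp + fn  # support = number of positive targets

preds = [1, 0, 1, 0]
target = [1, 1, 0, 0]
# -> (1, 1, 1, 1, 2)
```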
src/torchmetrics/functional/classification/accuracy.py
Lines changed: 3 additions & 7 deletions

@@ -660,8 +660,6 @@ def accuracy(
     changed to subset accuracy (which requires all labels or sub-samples in the sample to
     be correctly predicted) by setting ``subset_accuracy=True``.
 
-    Accepts all input types listed in :ref:`pages/classification:input types`.
-
     Args:
         preds: Predictions from model (probabilities, logits or labels)
         target: Ground truth labels
@@ -693,11 +691,11 @@ def accuracy(
             - ``'samplewise'``: In this case, the statistics are computed separately for each
               sample on the ``N`` axis, and then averaged over samples.
               The computation for each sample is done by treating the flattened extra axes ``...``
-              (see :ref:`pages/classification:input types`) as the ``N`` dimension within the sample,
+              as the ``N`` dimension within the sample,
               and computing the metric for the sample based on that.
 
             - ``'global'``: In this case the ``N`` and ``...`` dimensions of the inputs
-              (see :ref:`pages/classification:input types`)
+
               are flattened into a new ``N_X`` sample axis, i.e. the inputs are treated as if they
               were ``(N_X, C)``. From here on the ``average`` parameter applies as usual.
 
@@ -715,9 +713,7 @@ def accuracy(
             Should be left at default (``None``) for all other types of inputs.
         multiclass:
             Used only in certain special cases, where you want to treat inputs as a different type
-            than what they appear to be. See the parameter's
-            :ref:`documentation section <pages/classification:using the multiclass parameter>`
-            for a more detailed explanation and examples.
+            than what they appear to be.
         ignore_index:
            Integer specifying a target class to ignore. If given, this class index does not contribute
            to the returned score, regardless of reduction method. If an index is ignored, and ``average=None``

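The ``'samplewise'`` vs ``'global'`` choices that these docstrings describe can be sketched in pure Python, using nested lists to stand in for a sample axis ``N`` with flattened extra axes ``...``. Helper names are hypothetical; this is a sketch of the reduction logic, not the torchmetrics implementation:

```python
def acc(preds, target):
    """Plain accuracy over two flat sequences of labels."""
    pairs = list(zip(preds, target))
    return sum(p == t for p, t in pairs) / len(pairs)

def samplewise(preds, target):
    """Compute the metric per sample over its flattened extra axes, then average."""
    scores = [acc(pr, tg) for pr, tg in zip(preds, target)]
    return sum(scores) / len(scores)

def global_avg(preds, target):
    """Flatten N and the extra axes into one long axis, then compute once."""
    flat_p = [p for pr in preds for p in pr]
    flat_t = [t for tg in target for t in tg]
    return acc(flat_p, flat_t)

preds = [[1, 1, 1, 1], [0, 1]]
target = [[1, 1, 1, 0], [1, 1]]
# samplewise: (0.75 + 0.5) / 2 = 0.625; global: 4 correct of 6 labels = 2/3
```

Real tensor inputs give every sample the same number of sub-elements; unequal lengths are used here only to make the two reductions visibly disagree.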
src/torchmetrics/functional/classification/dice.py
Lines changed: 3 additions & 6 deletions

@@ -178,7 +178,7 @@ def dice(
 
     The reduction method (how the recall scores are aggregated) is controlled by the
     ``average`` parameter, and additionally by the ``mdmc_average`` parameter in the
-    multi-dimensional multi-class case. Accepts all inputs listed in :ref:`pages/classification:input types`.
+    multi-dimensional multi-class case.
 
     Args:
         preds: Predictions from model (probabilities, logits or labels)
@@ -213,11 +213,10 @@ def dice(
             - ``'samplewise'``: In this case, the statistics are computed separately for each
               sample on the ``N`` axis, and then averaged over samples.
               The computation for each sample is done by treating the flattened extra axes ``...``
-              (see :ref:`pages/classification:input types`) as the ``N`` dimension within the sample,
+              as the ``N`` dimension within the sample,
               and computing the metric for the sample based on that.
 
             - ``'global'``: In this case the ``N`` and ``...`` dimensions of the inputs
-              (see :ref:`pages/classification:input types`)
               are flattened into a new ``N_X`` sample axis, i.e. the inputs are treated as if they
               were ``(N_X, C)``. From here on the ``average`` parameter applies as usual.
 
@@ -240,9 +239,7 @@ def dice(
             Should be left at default (``None``) for all other types of inputs.
         multiclass:
             Used only in certain special cases, where you want to treat inputs as a different type
-            than what they appear to be. See the parameter's
-            :ref:`documentation section <pages/classification:using the multiclass parameter>`
-            for a more detailed explanation and examples.
+            than what they appear to be.
 
     Return:
         The shape of the returned tensor depends on the ``average`` parameter

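The ``average`` options these docstrings keep referring to differ in when the per-class statistics are pooled: micro averaging sums the counts over classes before scoring, macro averaging scores each class and then averages. A sketch with hypothetical per-class ``(tp, fp, fn)`` triples and a Dice-style score (illustrative, not the library code):

```python
def dice_from_stats(tp, fp, fn):
    return 2 * tp / (2 * tp + fp + fn)

def micro_macro(stats):
    """stats: list of (tp, fp, fn) tuples, one per class."""
    tp = sum(s[0] for s in stats)
    fp = sum(s[1] for s in stats)
    fn = sum(s[2] for s in stats)
    micro = dice_from_stats(tp, fp, fn)  # pool counts, then score
    macro = sum(dice_from_stats(*s) for s in stats) / len(stats)  # score, then average
    return micro, macro

stats = [(8, 2, 2), (1, 3, 3)]  # a frequent class and a rare class
# micro = 18/28 ~ 0.643 is dominated by the frequent class;
# macro = (0.8 + 0.25) / 2 = 0.525 weights both classes equally.
```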