
DOC Ensures that function passes numpydoc validation: f1_score #22358

Merged
glemaitre merged 3 commits into scikit-learn:main from NumberPiOso:docstring-f1-score
Mar 9, 2022

Conversation

@NumberPiOso
Contributor

Reference Issues/PRs
Addresses #21350.

What does this implement/fix? Explain your changes.

Updates the f1_score docstring to fix numpydoc validation errors.

Member

@thomasjpfan thomasjpfan left a comment

Thanks for the PR @NumberPiOso! There do not appear to be any changes to f1_score's docstring in this PR. You can see the numpydoc failures on the CI here.

@NumberPiOso
Contributor Author

NumberPiOso commented Feb 3, 2022

Thanks for the PR @NumberPiOso! There do not appear to be any changes to f1_score's docstring in this PR. You can see the numpydoc failures on the CI here.

Yes, I do see the errors in the CI:

E           Notes
E           -----
E           When ``true positive + false positive == 0``, precision is undefined.
E           When ``true positive + false negative == 0``, recall is undefined.
E           In such cases, by default the metric will be set to 0, as will f-score,
E           and ``UndefinedMetricWarning`` will be raised. This behavior can be
E           modified with ``zero_division``.
E           
E           # Errors
E           
E            - GL07: Sections are in the wrong order. Correct order is: Parameters, Returns, See Also, Notes, References, Examples
E            - SA04: Missing description for See Also "fbeta_score" reference
E            - SA04: Missing description for See Also "precision_recall_fscore_support" reference
E            - SA04: Missing description for See Also "jaccard_score" reference
E            - SA04: Missing description for See Also "multilabel_confusion_matrix" reference
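The GL07 failure above is about section ordering. As a rough illustration of what that rule enforces (a simplified sketch, not numpydoc's actual implementation), the check amounts to:

```python
# Simplified sketch of numpydoc's GL07 rule: the docstring sections that are
# present must appear in the canonical order. This is illustrative only and
# not numpydoc's real code.
EXPECTED_ORDER = [
    "Parameters", "Returns", "See Also", "Notes", "References", "Examples",
]

def sections_in_order(docstring):
    """Return True if the numpydoc sections present in `docstring`
    appear in the canonical order."""
    positions = [docstring.find(section) for section in EXPECTED_ORDER]
    found = [p for p in positions if p != -1]  # ignore absent sections
    return found == sorted(found)
```

For example, a docstring with Notes before See Also would fail this check, which is what the CI reported for f1_score.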

However, when I run pytest -v sklearn/tests/test_docstrings.py or pytest -v sklearn/tests/test_docstrings.py -k sklearn.metrics._classification.f1_score on my computer, I get

================================================== test session starts ===================================================
platform linux -- Python 3.9.10, pytest-6.2.5, py-1.11.0, pluggy-1.0.0 -- /home/pi/anaconda3/envs/sklearn-env/bin/python3.9
cachedir: .pytest_cache
rootdir: /home/pi/git/scikit-learn, configfile: setup.cfg
plugins: cov-3.0.0
collected 0 items / 2 skipped 

@thomasjpfan Do you know why the error is not shown?

@thomasjpfan
Member

thomasjpfan commented Feb 3, 2022

You should have some tests running with pytest -v sklearn/tests/test_docstrings.py, so if you are getting skipped tests, something is off about the installation. These issues are usually difficult to debug; I recommend starting over from the Contributing guide.

When I pull your branch in locally, I see the same error message as the CI when running:

pytest -v sklearn/tests/test_docstrings.py -k sklearn.metrics._classification.f1_score

@GauravChoudhay
Contributor

@thomasjpfan Can I work on this issue? There seems to have been no activity for a few days. Thanks!

@NumberPiOso
Contributor Author

Let me keep working on this; I just have not been able to set up the environment properly.

@GauravChoudhay
Contributor

okay sure :)

@glemaitre
Member

I pushed the necessary change and I will merge this PR when the CIs turn green.
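For context, the two error classes reported by the CI (GL07 section ordering and the missing SA04 See Also descriptions) could be addressed along these lines. This is an illustrative sketch with placeholder wording, not the exact text merged in the PR:

```python
# Hedged sketch of a numpydoc-compliant docstring: "See Also" entries carry
# one-line descriptions (fixing SA04) and the sections follow the canonical
# order Parameters, Returns, See Also, Notes (fixing GL07). The descriptions
# here are illustrative, not the wording actually merged.
def f1_score_example(y_true, y_pred):
    """Compute the F1 score.

    Parameters
    ----------
    y_true : array-like
        Ground truth (correct) target values.
    y_pred : array-like
        Estimated targets as returned by a classifier.

    Returns
    -------
    f1 : float
        F1 score of the positive class.

    See Also
    --------
    fbeta_score : Compute the F-beta score.
    precision_recall_fscore_support : Compute precision, recall, F-measure
        and support for each class.
    jaccard_score : Compute the Jaccard similarity coefficient score.
    multilabel_confusion_matrix : Compute a confusion matrix for each class
        or sample.

    Notes
    -----
    When ``true positive + false positive == 0``, precision is undefined.
    """
```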

@glemaitre glemaitre merged commit 7645845 into scikit-learn:main Mar 9, 2022
glemaitre added a commit to glemaitre/scikit-learn that referenced this pull request Apr 6, 2022
…t-learn#22358)

Co-authored-by: Guillaume Lemaitre <g.lemaitre58@gmail.com>
