
DOC Add examples of make_scorer usage to fbeta_score docstring #28755

Open
wants to merge 9 commits into main

Conversation

marijavlajic
Contributor

Reference Issues/PRs

Related to #28671

What does this implement/fix? Explain your changes.

This PR adds examples of make_scorer usage to the docstrings of metric functions that cannot be passed to the scoring parameter directly and must instead be wrapped with make_scorer to create callable scorer objects.

Any other comments?

This commit only contains the change for fbeta_score / classification metrics. If the format and content are fine, I will add the four regression metrics functions as well.
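As a rough sketch of the pattern this PR documents (illustrative only; the exact example in the diff may differ), wrapping fbeta_score with make_scorer looks like this:

```python
# fbeta_score takes a required `beta` parameter, so there is no named
# scorer string for it; it must be wrapped with make_scorer before being
# passed to the `scoring` parameter of cross-validation tools.
from sklearn.datasets import make_classification
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import fbeta_score, make_scorer
from sklearn.model_selection import cross_val_score

X, y = make_classification(random_state=0)
ftwo_scorer = make_scorer(fbeta_score, beta=2)
# one F2 score per cross-validation fold
scores = cross_val_score(LogisticRegression(), X, y, scoring=ftwo_scorer, cv=3)
```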


github-actions bot commented Apr 2, 2024

✔️ Linting Passed

All linting checks passed. Your pull request is in excellent shape! ☀️

Generated for commit: 49c1ed3. Link to the linter CI: here

@marijavlajic
Contributor Author

I realized that, in a rush, I messed up and created the new branch from the old branch rather than from main. I'm happy to close this and create a new, cleaner PR if you prefer, so as not to pollute it with other commits.

@glemaitre
Member

I realized that, in a rush, I messed up and created the new branch from the old branch rather than from main. I'm happy to close this and create a new, cleaner PR if you prefer, so as not to pollute it with other commits.

It is fine to revert the changes that are already done in #28750. Don't worry about the commit history because we squash all the history into a single commit and use the title of the PR as the commit message.

Member

@ogrisel ogrisel left a comment


Another pass of feedback.

sklearn/metrics/_classification.py (review thread resolved)
sklearn/metrics/_classification.py (outdated; review thread resolved)
value to minimize, the lower the better. When converting
into a scorer object using :func:`make_scorer`, set
the ``greater_is_better`` parameter to ``False`` (``True`` by default; see the
parameter description below).

Metrics available for various machine learning tasks are detailed in sections
below.

Many metrics are not given names to be used as ``scoring`` values,
sometimes because they require additional parameters, such as
:func:`fbeta_score`. In such cases, you need to generate an appropriate
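The behaviour described in the documentation excerpt above (error metrics needing greater_is_better=False when wrapped with make_scorer) can be sketched as follows; this is a minimal illustration, not part of the PR diff:

```python
# An error metric returns a value to minimize, so make_scorer must be
# called with greater_is_better=False; the resulting scorer negates the
# metric so that a higher scorer output still means a better model.
from sklearn.datasets import make_regression
from sklearn.linear_model import Ridge
from sklearn.metrics import make_scorer, mean_absolute_error
from sklearn.model_selection import cross_val_score

X, y = make_regression(noise=1.0, random_state=0)
neg_mae_scorer = make_scorer(mean_absolute_error, greater_is_better=False)
# every score is <= 0 because the (non-negative) error was negated
scores = cross_val_score(Ridge(), X, y, scoring=neg_mae_scorer, cv=3)
```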
Member


To remove those unwanted changes from this branch, you can try:

git checkout main -- doc/modules/model_evaluation.rst

and commit and push the results.

Member


@marijavlajic have you tried the above command? Did it not work as expected? It would be good to disentangle the contents of this PR from #28750.

@ogrisel
Member

ogrisel commented Apr 3, 2024

Please update the title of the PR to make explicit that this is about fbeta_score. Once this is merged, you or others can open similar PRs for the other metric functions such as mean_pinball_loss and co.

@marijavlajic marijavlajic changed the title DOC Add examples of make_scorer usage to docstrings of individual metrics functions DOC Add examples of make_scorer usage to fbeta_score docstring Apr 4, 2024
Member

@ogrisel ogrisel left a comment


Minor nitpicks but otherwise LGTM!

Comment on lines +1440 to +1443
F-beta score is not implemented as a named scorer that can be
passed to the `scoring` parameter directly; :func:`make_scorer`
needs to be used first instead to create a callable object to then
pass to the `scoring` parameter, see examples for details.
Member


Suggested change
F-beta score is not implemented as a named scorer that can be
passed to the `scoring` parameter directly; :func:`make_scorer`
needs to be used first instead to create a callable object to then
pass to the `scoring` parameter, see examples for details.
F-beta score is not implemented as a named scorer that can be passed to
the `scoring` parameter of cross-validation tools directly: it needs to be
wrapped with :func:`make_scorer` so as to specify the value of `beta`. See
the examples for details.

Comment on lines +1480 to +1481
>>> grid = GridSearchCV(LinearSVC(dual="auto"), param_grid={'C': [1, 10]},
... scoring=ftwo_scorer, cv=5)
Member


Let's follow the black formatting conventions in our docstrings as well (even if that was not always the case in the past).

Suggested change
>>> grid = GridSearchCV(LinearSVC(dual="auto"), param_grid={'C': [1, 10]},
... scoring=ftwo_scorer, cv=5)
>>> grid = GridSearchCV(
... LinearSVC(dual="auto"),
... param_grid={'C': [1, 10]},
... scoring=ftwo_scorer,
... cv=5,
... )

Member


We might want to update the multiline call to fbeta_score in the previous example snippet to follow the same format style.
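For illustration, a black-style multiline call to fbeta_score might look like the sketch below; the input values here are made up and not taken from the PR diff:

```python
from sklearn.metrics import fbeta_score

# Hypothetical toy labels, chosen only to demonstrate the call format.
y_true = [0, 1, 1, 0, 1, 1]
y_pred = [0, 0, 1, 0, 1, 1]
# Each argument on its own line with a trailing comma, per black style.
score = fbeta_score(
    y_true,
    y_pred,
    beta=2.0,
)
```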


3 participants