type promote clamp #77035

Closed · ngimel wants to merge 9 commits from the ngimel/clamp_promo branch

Conversation

@ngimel (Collaborator) commented on May 7, 2022

Fixes #76630
When clamp(Tensor, Tensor) is structured, big parts of this PR won't be needed, but for now let's fix type promotion to make the behavior more regular.
Some special-casing is done in hardtanh to preserve the legacy behavior of not promoting to the boundary values.
Also, relu is explicitly disabled on bool inputs to avoid accidental promotion to an integer type.
BC-breaking note:
Previously, the min and max arguments in clamp did not participate in type promotion, which made clamp inconsistent with the minimum and maximum operations. This PR (together with the PRs porting clamp_min/clamp_max to structured) makes the min and max arguments participate in type promotion.
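
For illustration only (this snippet is not part of the PR diff), here is a minimal Python sketch of the user-visible effect described above; the tensor names are made up, and the printed dtypes assume PyTorch's default type-promotion rules with float32 as the default floating-point dtype:

# Illustrative sketch, not from this PR: once the min/max arguments participate
# in type promotion, clamp's result dtype follows torch.result_type over all
# inputs, matching torch.minimum / torch.maximum.
import torch

x = torch.arange(5)          # int64 input
bound = torch.tensor(2.5)    # float32 zero-dim bound

print(torch.result_type(x, bound))       # torch.float32
print(torch.clamp(x, min=bound).dtype)   # torch.float32 with this change
print(torch.maximum(x, bound).dtype)     # torch.float32; clamp(x, min=b) now agrees with maximum(x, b)

# Per the PR description, relu is also disabled on bool inputs, so a call like
# torch.relu(torch.tensor([True, False])) is expected to raise an error rather
# than silently promoting to an integer dtype.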

@facebook-github-bot (Contributor) commented on May 7, 2022

✅ No Failures (16 Pending)

As of commit e93afa7 (more details on the Dr. CI page):

💚 💚 Looks good so far! There are no failures yet. 💚 💚



@ngimel requested a review from mruberry as a code owner on May 8, 2022 at 05:24.
@@ -12,6 +12,7 @@ struct ResultTypeState {
};

TORCH_API ResultTypeState update_result_type_state(const Tensor& tensor, const ResultTypeState& in_state);
TORCH_API ResultTypeState update_result_type_state(const Scalar& scalar, const ResultTypeState& in_state);
Collaborator

Nice addition

@mruberry (Collaborator) left a comment

Cool! Stamped!

@ngimel (Collaborator, Author) commented on May 9, 2022

@pytorchbot merge this

@github-actions bot commented on May 9, 2022

Hey @ngimel.
You've committed this PR, but it does not have both a 'release notes: ...' and a 'topics: ...' label. Please add one of each to the PR. The 'release notes: ...' label should represent the part of PyTorch that this PR changes (fx, autograd, distributed, etc.) and the 'topics: ...' label should represent the kind of PR it is (not user facing, new feature, bug fix, perf improvement, etc.).
For changes that are 'topic: not user facing' there is no need for a release notes label.

@albanD (Collaborator) left a comment

Should this be marked as BC-breaking?

@ngimel (Collaborator, Author) commented on May 9, 2022

@albanD good point

@ngimel added the 'module: bc-breaking' label (Related to a BC-breaking change) on May 9, 2022.
facebook-github-bot pushed a commit that referenced this pull request May 13, 2022
Summary:
Fixes #76630
When clamp(Tensor, Tensor) is structured, big parts of this PR won't be needed, but for now let's fix type promotion to make behavior more regular.

Pull Request resolved: #77035
Approved by: https://github.com/mruberry

Test Plan: contbuild & OSS CI, see https://hud.pytorch.org/commit/pytorch/pytorch/362525724bdba375defd6405cfe1b46a6ea222d3

Reviewed By: seemethere, osalpekar

Differential Revision: D36250567

Pulled By: malfet

fbshipit-source-id: 43571838f8e2ddb788b6a3230ec5271718450500
github-actions bot deleted the ngimel/clamp_promo branch on February 16, 2024 at 01:53.
Successfully merging this pull request may close these issues: torch.clamp type promotes strangely (#76630)
5 participants