Align rrelu_with_noise schema to reflect noise mutation#8309

Closed
IvanKobzarev wants to merge 1 commit into pytorch:master from IvanKobzarev:rrelu_with_noise

Conversation

@IvanKobzarev
Collaborator

pytorch PR is pytorch/pytorch#138503

rrelu_with_noise actually mutates noise, but this was not reflected in its schema.
As a result, compilation did not capture the mutation.

This PR aligns the xla override for rrelu_with_noise with the updated schema by removing const from the noise argument.
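For context, the PyTorch-side change this aligns with (a sketch of what pytorch/pytorch#138503 does; the exact line lives in PyTorch's native_functions.yaml, and the defaults shown are illustrative) marks noise as mutated in place via the (a!) alias annotation:

```diff
- rrelu_with_noise(Tensor self, Tensor noise, Scalar lower=0.125, Scalar upper=0.3333333333333333, bool training=False, Generator? generator=None) -> Tensor
+ rrelu_with_noise(Tensor self, Tensor(a!) noise, Scalar lower=0.125, Scalar upper=0.3333333333333333, bool training=False, Generator? generator=None) -> Tensor
```

Once the schema carries (a!), codegen emits a non-const at::Tensor& noise parameter in XLANativeFunctions.h, which is what the override in this PR has to match.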

@IvanKobzarev
Collaborator Author

@qihqi @miladm @JackCaoG Could you please take a look at this PR?

@qihqi qihqi self-requested a review November 6, 2024 16:41
@qihqi
Collaborator

qihqi commented Nov 6, 2024

Please fix the build error:

 torch_xla/csrc/aten_xla_type.cpp:3021:12: error: no declaration matches 'at::Tensor torch_xla::XLANativeFunctions::rrelu_with_noise(const at::Tensor&, at::Tensor&, const c10::Scalar&, const c10::Scalar&, bool, std::optional<at::Generator>)'
     3021 | at::Tensor XLANativeFunctions::rrelu_with_noise(
          |            ^~~~~~~~~~~~~~~~~~
    In file included from torch_xla/csrc/aten_xla_type.cpp:25:
    bazel-out/k8-opt/bin/torch_xla/csrc/XLANativeFunctions.h:319:19: note: candidate is: 'static at::Tensor torch_xla::XLANativeFunctions::rrelu_with_noise(const at::Tensor&, const at::Tensor&, const c10::Scalar&, const c10::Scalar&, bool, std::optional<at::Generator>)'
      319 | static at::Tensor rrelu_with_noise(const at::Tensor & self, const at::Tensor & noise, const at::Scalar & lower, const at::Scalar & upper, bool training, ::std::optional<at::Generator> generator);
          |                   ^~~~~~~~~~~~~~~~
    In file included from torch_xla/csrc/aten_xla_type.cpp:25:
    bazel-out/k8-opt/bin/torch_xla/csrc/XLANativeFunctions.h:14:8: note: 'struct torch_xla::XLANativeFunctions' defined here
       14 | struct XLANativeFunctions {
          |        ^~~~~~~~~~~~~~~~~~

You can verify locally whether it builds with python setup.py develop.
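For reference, the conflict in the error above boils down to the following pair of signatures (a sketch reconstructed from the log, not the exact file contents): the updated override drops const on noise, while the header generated from a pytorch checkout that does not yet include the schema change still declares it const.

```cpp
// Declaration generated into XLANativeFunctions.h from pytorch HEAD
// (schema change from pytorch/pytorch#138503 not yet present):
static at::Tensor rrelu_with_noise(
    const at::Tensor& self, const at::Tensor& noise, const at::Scalar& lower,
    const at::Scalar& upper, bool training,
    ::std::optional<at::Generator> generator);

// Override definition in this PR (noise is now mutable):
at::Tensor XLANativeFunctions::rrelu_with_noise(
    const at::Tensor& self, at::Tensor& noise, const at::Scalar& lower,
    const at::Scalar& upper, bool training,
    ::std::optional<at::Generator> generator);

// `const at::Tensor&` and `at::Tensor&` are different parameter types, so the
// definition matches no declaration and the build fails until both repos agree.
```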

Collaborator

@qihqi qihqi left a comment


This is meant to land together with pytorch/pytorch#138503, but because they live in two different repos, it has to be done one PR at a time.
The current plan is to land this one first, then land the torch one together with an xla pin update.

@IvanKobzarev
Collaborator Author

@qihqi Thanks for approving. This build error looks like it is caused by the missing const (which is exactly what we are fixing here).
Is there a way to make the xla PR's CI use a specific revision of pytorch, or does CI always take pytorch HEAD?

@IvanKobzarev
Collaborator Author

Abandoning in favor of #8363 (a PR not from a fork, so it can be used in the xla pin update in pytorch).
