[Fixbug] batch_matmul: move cc checking inside schedule#264

Merged
yaoyaoding merged 2 commits into hidet-org:main from hjjq:mm-script
Jun 2, 2023
Conversation

@hjjq
Collaborator

@hjjq hjjq commented Jun 1, 2023

No description provided.

@hjjq hjjq changed the title [Operator] batch_matmul: move cc checking inside schedule [Fixbug] batch_matmul: move cc checking inside schedule Jun 1, 2023
@yaoyaoding
Member

Thanks @hjjq !

@yaoyaoding yaoyaoding merged commit c1cfef8 into hidet-org:main Jun 2, 2023
vadiklyutiy pushed a commit that referenced this pull request Dec 19, 2024
…_fpn` (#455)

Closes #264 

The error encountered in the linked issues was due to a subtle
difference in type promotion when calling `torch.div` with the argument
`rounding_mode='floor'`: if both operands are of integer type, the
output is also of integer type. This differs from my original
implementation, which first called `truediv` and then `ops.floor`,
producing a `float32` output.
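The dtype difference can be seen directly in PyTorch (a minimal sketch of the promotion behavior described above, using plain `torch` tensors):

```python
import torch

a = torch.tensor([7, 9])
b = torch.tensor([2, 4])

# Integer inputs with rounding_mode='floor' keep an integer dtype:
floored = torch.div(a, b, rounding_mode='floor')

# True division promotes integer inputs to float32; flooring afterwards
# keeps the float dtype, which is the mismatch described above:
via_truediv = torch.floor(torch.div(a, b))

print(floored.dtype)      # torch.int64
print(via_truediv.dtype)  # torch.float32
```

This is why replacing `truediv` + `ops.floor` with a floor-division call changes the output dtype of downstream index computations.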

After fixing this issue, another error was encountered:

```
File "/home/bolin/Desktop/hidet/python/hidet/graph/frontend/torch/interpreter.py", line 70, in __call__
    return self.forward(*args)
           ^^^^^^^^^^^^^^^^^^^
  File "/home/bolin/Desktop/hidet/python/hidet/graph/frontend/torch/interpreter.py", line 237, in forward
    self._raise_exception(e, node.target, exec_func, hidet_args, hidet_kwargs)
  File "/home/bolin/Desktop/hidet/python/hidet/graph/frontend/torch/interpreter.py", line 186, in _raise_exception
    raise RuntimeError('\n'.join(msg))
torch._dynamo.exc.BackendCompilerFailed: backend='hidet' raised:
RuntimeError: Can not interpret torch.nn.functional.batch_norm given arguments:
  torch.nn.functional.batch_norm(tensor(...), tensor(...), tensor(...), tensor(...), tensor(...), training=False, eps=1e-05)
Possible candidates are:
  batch_norm(x: hidet.Tensor, running_mean: Optional[hidet.Tensor], running_var: Optional[hidet.Tensor], weight: Optional[hidet.Tensor], bias: Optional[hidet.Tensor], training: bool, momentum: float, eps: float)
    File "/home/bolin/Desktop/hidet/python/hidet/graph/frontend/torch/register_functions.py", line 302
```

This PR also fixes this error by adding default values to some
parameters of the `batch_norm` function registered for
`torch.nn.functional.batch_norm`, matching the signature in the
[PyTorch
documentation](https://pytorch.org/docs/stable/generated/torch.nn.functional.batch_norm.html).
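The traceback above shows the interpreter rejecting a call that omits `momentum`, because the registered function had no default for it. A minimal sketch of the fix (this `batch_norm` stub stands in for hidet's registered function; the defaults mirror the `torch.nn.functional.batch_norm` documentation):

```python
from typing import Optional

# Hypothetical stand-in for the registered hidet function. Without
# defaults on training/momentum/eps, the torch-side call
# batch_norm(x, m, v, w, b, training=False, eps=1e-05) cannot be
# matched, since `momentum` is never supplied. Defaults that mirror
# torch.nn.functional.batch_norm's signature make the call resolve.
def batch_norm(
    x,
    running_mean: Optional[object],
    running_var: Optional[object],
    weight: Optional[object] = None,
    bias: Optional[object] = None,
    training: bool = False,
    momentum: float = 0.1,
    eps: float = 1e-5,
):
    # The real implementation normalizes x; here we just echo the
    # resolved configuration to show the defaults filling in.
    return {"training": training, "momentum": momentum, "eps": eps}

# The previously failing call pattern from the traceback now resolves,
# with momentum falling back to its default:
cfg = batch_norm(None, None, None, None, None, training=False, eps=1e-05)
print(cfg["momentum"])  # 0.1
```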
vadiklyutiy pushed a commit that referenced this pull request Dec 20, 2024
…_fpn` (#455)
vadiklyutiy pushed a commit that referenced this pull request Dec 26, 2024
…_fpn` (#455)

Development

Successfully merging this pull request may close these issues.

2 participants