[Torch] Add some torch frontend mappings for roberta-base#267

Merged
hjjq merged 2 commits intohidet-org:mainfrom
hjjq:frontend
Jun 3, 2023

Conversation

Collaborator

@hjjq hjjq commented Jun 2, 2023

Add mappings for:

torch.cumsum
torch.ne
torch.Tensor.int
torch.Tensor.long
torch.Tensor.type_as

Also update torch.mean to cast dtype before computing.
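The mappings above follow the usual frontend-dispatch pattern: a registry associates each torch function with a callable that implements it in the backend's own ops. The sketch below is illustrative only; the decorator name, registry, and naive list-based "tensors" are assumptions, not hidet's actual API.

```python
# Hypothetical sketch of a frontend mapping registry. A decorator
# records, per torch function name, the callable that implements it.
REGISTRY = {}

def register_function(torch_fn_name):
    def decorator(impl):
        REGISTRY[torch_fn_name] = impl
        return impl
    return decorator

@register_function('torch.cumsum')
def cumsum(x, dim=0):
    # Naive cumulative sum over a 1-D list, standing in for a tensor op.
    out, total = [], 0
    for v in x:
        total += v
        out.append(total)
    return out

@register_function('torch.ne')
def ne(a, b):
    # Element-wise "not equal" over two 1-D lists.
    return [x != y for x, y in zip(a, b)]
```

With this shape, the interpreter only needs to look up `REGISTRY['torch.cumsum']` when it encounters the corresponding node in the traced graph.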

@hjjq hjjq changed the title [Torch] Add some torch frontend mappings [Torch] Add some torch frontend mappings for roberta-base Jun 2, 2023
@hjjq hjjq merged commit 870a702 into hidet-org:main Jun 3, 2023
@hjjq hjjq deleted the frontend branch July 12, 2023 16:39
vadiklyutiy pushed a commit that referenced this pull request Jul 22, 2024
After disallowing functions unsupported by Hidet as in #317, compilation of the model `vision_maskrcnn` (which previously failed on the unsupported `topk` method, as in #267) failed with a TypeError and the following traceback:

> File "/home/bolin/Desktop/hidet/python/hidet/graph/graph_utils/functors.py", line 75, in visit
>     ret = self.visit_Operator(obj)  # pylint: disable=assignment-from-none
>           ^^^^^^^^^^^^^^^^^^^^^^^^
> File "/home/bolin/Desktop/hidet/python/hidet/graph/graph_utils/functors.py", line 126, in visit_Operator
>     updated_outputs = op.reforward(inputs)
>                       ^^^^^^^^^^^^^^^^^^^^
> File "/home/bolin/Desktop/hidet/python/hidet/graph/operator.py", line 185, in reforward
>     return cls(*inputs, **attributes).outputs
>            ^^^^^^^^^^^^^^^^^^^^^^^^^^
> torch._dynamo.exc.BackendCompilerFailed: backend='hidet' raised:
> TypeError: ClampOp.__init__() missing 2 required positional arguments: 'min_value' and 'max_value'


The cause is that, inside the [`reforward` function](https://github.com/CentML/hidet/blob/da56e48148c5b075f1fba6d1d878a82889c9f731/python/hidet/graph/operator.py#L180-L185), during the call to `cls(*inputs, **attributes)`, where `cls` is `ClampOp`, `inputs` consists only of the input tensor and `attributes` is an empty dictionary, so `min_value` and `max_value` cannot be passed to the initializer. This is because the `attributes` dictionary was not initialized to contain the values of these two parameters [while initializing `ClampOp`](https://github.com/CentML/hidet/blob/da56e48148c5b075f1fba6d1d878a82889c9f731/python/hidet/graph/ops/arithmetic.py#L586-L595).
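To make the failure mode concrete, here is a minimal sketch (illustrative classes, not hidet's real ones) of why every constructor argument must be mirrored in `attributes`: `reforward` rebuilds the operator via `cls(*inputs, **attributes)`, so anything missing from `attributes` becomes a missing positional argument.

```python
# Minimal sketch of the reforward contract. An Op stores its inputs
# and an `attributes` dict; reforward reconstructs the op from them,
# mirroring `cls(*inputs, **attributes)` in operator.py.
class Op:
    def __init__(self, inputs, attributes):
        self.inputs = inputs
        self.attributes = attributes

    def reforward(self, inputs):
        # Rebuild this operator with new inputs and the stored attributes.
        return type(self)(*inputs, **self.attributes)

class ClampOp(Op):
    def __init__(self, x, min_value, max_value):
        # The fix: record min_value/max_value in `attributes` so that
        # reforward can pass them back to __init__. Leaving `attributes`
        # empty reproduces the TypeError from the traceback above.
        super().__init__(inputs=[x],
                         attributes={'min_value': min_value,
                                     'max_value': max_value})
        self.min_value = min_value
        self.max_value = max_value

op = ClampOp([1.0, 5.0], min_value=0.0, max_value=3.0)
op2 = op.reforward([[2.0, 4.0]])  # succeeds: attributes were stored
```

Had `ClampOp.__init__` passed `attributes={}` to its base class, the `reforward` call would raise exactly the `missing 2 required positional arguments` TypeError shown in the traceback.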
vadiklyutiy pushed a commit that referenced this pull request Jul 23, 2024
vadiklyutiy pushed a commit that referenced this pull request Dec 26, 2024