
[Fixbug][Hidet Script] Fix a bug that hidet script does not recognize return type#329

Merged
yaoyaoding merged 2 commits into hidet-org:main from yaoyaoding:fix-hidet-script-bug on Jul 25, 2023
Conversation

@yaoyaoding
Member

Fix the hidet script transpiler so that it recognizes the following code:

```python
import hidet


def test_unroll():
    from hidet.lang import printf, attrs
    from hidet.ir.dtypes import float32x8, float32
    from hidet.ir import primitives
    from hidet.lang import address, cast

    with hidet.script_module() as script_module:

        @hidet.script
        def example() -> float32x8:
            attrs.func_kind = 'cpu_internal'
            return primitives.cpu.avx_f32x8_setzero()

        @hidet.script
        def main():
            attrs.func_kind = 'cpu_kernel'

            a = example()
            a_unpacked = cast(address(a), ~float32)
            for i in range(8):
                printf("%f ", a_unpacked[i])
            printf("\n")

    func = script_module.build()
    func()

    return func
```

Previously, hidet could not recognize the `float32x8` return type.

@yaoyaoding yaoyaoding merged commit b356f3d into hidet-org:main Jul 25, 2023
@yaoyaoding yaoyaoding deleted the fix-hidet-script-bug branch July 25, 2023 15:32
vadiklyutiy pushed a commit that referenced this pull request Jul 22, 2024
After disallowing functions unsupported by Hidet as in #317 , the
compilation of the model `vision_maskrcnn` (previously failed on
unsupported `topk` method, as in #267 ) failed on a TypeError with the
following traceback message:

> File "/home/bolin/Desktop/hidet/python/hidet/graph/graph_utils/functors.py", line 75, in visit
>     ret = self.visit_Operator(obj)  # pylint: disable=assignment-from-none
> File "/home/bolin/Desktop/hidet/python/hidet/graph/graph_utils/functors.py", line 126, in visit_Operator
>     updated_outputs = op.reforward(inputs)
> File "/home/bolin/Desktop/hidet/python/hidet/graph/operator.py", line 185, in reforward
>     return cls(*inputs, **attributes).outputs
> torch._dynamo.exc.BackendCompilerFailed: backend='hidet' raised:
> TypeError: ClampOp.__init__() missing 2 required positional arguments: 'min_value' and 'max_value'


The cause is that, inside the [`reforward`
function](https://github.com/CentML/hidet/blob/da56e48148c5b075f1fba6d1d878a82889c9f731/python/hidet/graph/operator.py#L180-L185),
during the call to `cls(*inputs, **attributes)`, where `cls` is
`ClampOp`, `inputs` only consists of the input tensor and `attributes`
is an empty dictionary, so `min_value` and `max_value` cannot be
passed to the initializer. This is because we did not populate the
`attributes` dictionary with the values of these two parameters
[while initializing
`ClampOp`](https://github.com/CentML/hidet/blob/da56e48148c5b075f1fba6d1d878a82889c9f731/python/hidet/graph/ops/arithmetic.py#L586-L595).
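The failure mode can be illustrated with a minimal, hypothetical sketch (not Hidet's actual classes): `reforward` reconstructs an operator via `cls(*inputs, **self.attributes)`, so any `__init__` parameter that is not recorded in the `attributes` dictionary is lost on reconstruction. The fix is to store the parameters when the op is first built:

```python
class Operator:
    """Toy stand-in for an operator base class with a reforward method."""

    def __init__(self, inputs, attributes):
        self.inputs = inputs
        self.attributes = attributes

    def reforward(self, inputs):
        # Rebuilds the op from new inputs plus the recorded attributes.
        # If attributes is empty, required __init__ arguments are missing
        # and this call raises a TypeError, as seen in the traceback above.
        cls = self.__class__
        return cls(*inputs, **self.attributes)


class ClampOp(Operator):
    def __init__(self, x, min_value, max_value):
        super().__init__(
            inputs=[x],
            # The fix: record min_value/max_value so reforward can pass
            # them back to __init__. The buggy version left this dict empty.
            attributes={'min_value': min_value, 'max_value': max_value},
        )


op = ClampOp('x', min_value=0.0, max_value=1.0)
rebuilt = op.reforward(['y'])
print(rebuilt.attributes)  # {'min_value': 0.0, 'max_value': 1.0}
```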
vadiklyutiy pushed a commit that referenced this pull request Jul 23, 2024
vadiklyutiy pushed a commit that referenced this pull request Dec 26, 2024
