Conversation
yiyixuxu
left a comment
looks good to me, but it's currently failing though
```python
@property
def output_shape(self) -> tuple:
    return (16, 8, 8)
```
Can you check the output shape here? It looks like the model output shape is the same as the input shape (4, 8, 8), so setting output_shape to (16, 8, 8) causes training tests such as TestGlmImageTransformerTraining.test_training to fail with a shape error.
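A minimal sketch of what the fix might look like, assuming the model's output really does match the input shape (the class name below is a hypothetical stand-in; in the PR the property lives on the model tester class):

```python
class GlmImageTransformerTester:
    # Hypothetical holder class for illustration; the exact tester class
    # name in the PR is not shown here.
    @property
    def output_shape(self) -> tuple:
        # The model's output has the same shape as its input, so the
        # expected shape should be (4, 8, 8) rather than (16, 8, 8).
        return (4, 8, 8)

print(GlmImageTransformerTester().output_shape)  # (4, 8, 8)
```

With this, the shape assertion in `test_training` should compare like against like.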
dg845
left a comment
Thanks for the PR! Left one question about the test output shape.
In addition to the tests which failed in the latest CI run (https://github.com/huggingface/diffusers/actions/runs/23590953774/job/68695918619?pr=13344), I also got the following test failures locally:
- `TestGlmImageTransformerCompile.test_torch_compile_repeated_blocks`: I think this can be fixed by setting `_repeated_blocks = ["GlmImageTransformerBlock"]` in `GlmImageTransformer2DModel`.
- `TestGlmImageTransformer.test_model_parallelism`: it looks like some submodules in `GlmImageCombinedTimestepSizeEmbeddings` might end up on different devices, causing a device mismatch error. Could you look into it?
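For the first point, a rough sketch of the suggested change (the block and model classes below are simplified stand-ins, not the real diffusers implementations):

```python
class GlmImageTransformerBlock:
    """Stand-in for the real transformer block (illustration only)."""


class GlmImageTransformer2DModel:
    # Class-level attribute that diffusers' repeated-blocks compile test
    # reads to know which block classes repeat across layers.
    _repeated_blocks = ["GlmImageTransformerBlock"]


print(GlmImageTransformer2DModel._repeated_blocks)  # ['GlmImageTransformerBlock']
```

The attribute just needs to name the block class that is instantiated once per layer, so the compile helper can reuse a single compiled graph for all of them.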
…ffusers into glmimage-refactor
The docs for this PR live here. All of your documentation changes will be reflected on that endpoint. The docs are available until 30 days after the last update.
@dg845 Thanks for flagging. Fixed the issues 👍🏽
* update
* update
* update
* update
What does this PR do?
Fixes # (issue)
Before submitting
Did you make sure to update the documentation with your changes? Here are the
documentation guidelines, and here are tips on formatting docstrings.
Who can review?
Anyone in the community is free to review the PR once the tests have passed. Feel free to tag
members/contributors who may be interested in your PR.