[CP][RFC] Enable FlexCP for llama3 with function wrapper#1696

Closed
fegin wants to merge 1 commit into main from chienchin/flex_cp
Conversation


@fegin fegin commented Sep 10, 2025

This PR requires pytorch/pytorch#162542

@meta-cla meta-cla bot added the CLA Signed This label is managed by the Meta Open Source bot. label Sep 10, 2025
@fegin fegin changed the title [CP][RFC] Enable FlexCP for llama3 [CP][RFC] Enable FlexCP for llama3 with function wrapper Sep 12, 2025
fegin added a commit that referenced this pull request Sep 12, 2025
Similar to #1696, but this PR uses parallel_module similar to TP/SP.

This PR also requires pytorch/pytorch#162542
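To illustrate the design choice being compared, here is a minimal, self-contained sketch of the two integration styles: wrapping the attention function itself (this PR) versus walking the model and rewriting its modules, similar to how TP/SP apply `parallelize_module` (#1707). All names (`flex_cp_wrapper`, `parallelize_blocks`, `Block`) are hypothetical stand-ins, not torchtitan's or PyTorch's actual API, and the wrapper body is a no-op placeholder for the real context-parallel sharding logic.

```python
def attention(q, k, v):
    # Stand-in for an attention computation; real code would use tensors.
    return [qi + ki + vi for qi, ki, vi in zip(q, k, v)]

# Style 1 -- function wrapper (this PR): wrap the callable so
# context-parallel logic runs around every invocation.
def flex_cp_wrapper(fn):
    def wrapped(q, k, v):
        # A real wrapper would shard the sequence dimension across CP
        # ranks and exchange KV between them; this sketch passes through.
        return fn(q, k, v)
    return wrapped

# Style 2 -- module rewrite (#1707): traverse the model once and swap
# each block's attention callable in place, mirroring TP/SP's
# parallelize_module approach.
class Block:
    def __init__(self):
        self.attn = attention

def parallelize_blocks(blocks):
    for b in blocks:
        b.attn = flex_cp_wrapper(b.attn)
    return blocks

blocks = parallelize_blocks([Block(), Block()])
out = blocks[0].attn([1, 2], [3, 4], [5, 6])  # -> [9, 12]
```

The wrapper style keeps the change local to one function, while the module-rewrite style centralizes parallelization in a single pass over the model, which is the consistency argument made in favor of #1707.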

fegin commented Sep 16, 2025

Closing this PR, as people generally prefer #1707.

@fegin fegin closed this Sep 16, 2025