
remove allow-untyped-defs from torch/_higher_order_ops/run_const_graph.py#157847

Closed
bobrenjc93 wants to merge 2 commits into gh/bobrenjc93/523/base from gh/bobrenjc93/523/head

Conversation

[ghstack-poisoned]
@pytorch-bot

pytorch-bot bot commented Jul 8, 2025

🔗 Helpful Links

🧪 See artifacts and rendered test results at hud.pytorch.org/pr/157847

Note: Links to docs will display an error until the docs builds have been completed.

✅ No Failures

As of commit 8b998fd with merge base 8134684:
💚 Looks good so far! There are no failures yet. 💚

This comment was automatically generated by Dr. CI and updates every 15 minutes.

[ghstack-poisoned]
 @run_const_graph.py_impl(FakeTensorMode)
-def run_const_graph_fake_tensor_mode(mode, graph, args):
+def run_const_graph_fake_tensor_mode(
+    mode: FakeTensorMode, graph: torch.fx.GraphModule, args: tuple[object, ...]
Collaborator


Most of these tuple[object] should be a typing_extensions.TypeVarTuple
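A minimal sketch of what the reviewer is suggesting: with a `TypeVarTuple` (PEP 646, available from `typing_extensions`), the individual element types of `args` are preserved through the call instead of being erased to `tuple[object, ...]`. The function and names below are hypothetical illustrations, not the actual PR code.

```python
from typing_extensions import TypeVarTuple, Unpack

Ts = TypeVarTuple("Ts")

# Hypothetical helper: a checker now knows the returned tuple has exactly
# the same element types as the `args` that were passed in.
def call_with_args(graph_name: str, args: tuple[Unpack[Ts]]) -> tuple[Unpack[Ts]]:
    # With tuple[object, ...] the caller would get back an opaque tuple of
    # objects; with Unpack[Ts] the (int, str, float) structure survives.
    return args

result = call_with_args("const_gm", (1, "x", 2.0))
```

Whether this is worth the churn depends on whether callers actually need the per-element types back; for a pass-through like this HOP they might not.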

-def run_const_graph_functional(ctx, graph, args):
+def run_const_graph_functional(
+    ctx: "BaseFunctionalizeAPI", graph: torch.fx.GraphModule, args: tuple[Any, ...]
+) -> Any:
Collaborator


return Any or object?
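The distinction the reviewer is asking about, sketched with hypothetical helpers (not the PR's code): `Any` silences the type checker on every use, while `object` forces an explicit `isinstance` narrow before attribute access. At runtime the two behave identically; the difference is purely static.

```python
from typing import Any

def use_any(x: Any) -> int:
    # Any disables checking: this line type-checks even when x lacks
    # bit_length(), and a bad call only fails at runtime.
    return x.bit_length()

def use_object(x: object) -> int:
    # object is the honest "anything" type: mypy rejects x.bit_length()
    # until we narrow x to int.
    if isinstance(x, int):
        return x.bit_length()
    raise TypeError(f"expected int, got {type(x).__name__}")

print(use_any(8))     # 4
print(use_object(8))  # 4
```

So `object` is the stricter choice for a return type a caller should not poke at blindly, while `Any` is the pragmatic one when the value immediately flows into untyped code.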

super().__init__("run_const_graph")

-def __call__(self, graph, args):
+def __call__(self, graph: torch.fx.GraphModule, args: tuple[object, ...]) -> object:
Collaborator


TypeVarTuple here?

Collaborator


I wish GraphModule had a ParamSpec generic; is it possible to add one? Or would it break serialization?
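A minimal sketch of the idea, assuming a hypothetical wrapper class (`torch.fx.GraphModule` itself is not generic today): a `ParamSpec` (PEP 612) lets a callable wrapper carry its full call signature through to `__call__`, so a checker could validate arguments instead of accepting `tuple[object, ...]`.

```python
from typing import Callable, Generic, TypeVar
from typing_extensions import ParamSpec

P = ParamSpec("P")
R = TypeVar("R")

class TypedGraphModule(Generic[P, R]):
    """Hypothetical GraphModule-like wrapper whose call signature is tracked.

    A checker sees TypedGraphModule[[int, int], int] and rejects wrong
    argument counts or types at the call site.
    """

    def __init__(self, fn: Callable[P, R]) -> None:
        self._fn = fn

    def __call__(self, *args: P.args, **kwargs: P.kwargs) -> R:
        # Forward the call unchanged; the ParamSpec ties *args/**kwargs
        # to the wrapped function's signature.
        return self._fn(*args, **kwargs)

gm = TypedGraphModule(lambda x, y: x + y)
print(gm(2, 3))  # 5
```

The serialization concern is real: type parameters exist only at the annotation level, so they would have to be dropped or re-derived when a GraphModule is pickled and reloaded.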

 ) -> object:
     const_gm, weights = graph, args
-    p_args = pytree.tree_map(mode.tracer.unwrap_proxy, (graph, args))
+    p_args = pytree.tree_map(mode.tracer.unwrap_proxy, (graph, args))  # type: ignore[union-attr]
Collaborator


Should we use typing_utils not_none here?
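The suggestion is to narrow the `Optional` explicitly rather than suppress the error. A minimal sketch of such a helper (PyTorch ships a similar utility; the exact module path is an assumption here, so this defines a standalone version): asserting non-`None` narrows `Optional[T]` to `T` for the checker, replacing the blanket `type: ignore[union-attr]`.

```python
from typing import Optional, TypeVar

T = TypeVar("T")

# Hypothetical standalone not_none, mirroring PyTorch's typing utility:
# after this call the checker knows the value is T, not Optional[T].
def not_none(value: Optional[T]) -> T:
    if value is None:
        raise AssertionError("unexpected None")
    return value

print(not_none("tracer"))  # tracer
```

In the diff above, `not_none(mode.tracer).unwrap_proxy` would both document the non-`None` invariant and fail loudly if it is ever violated, instead of hiding the union behind an ignore comment.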

@bobrenjc93 bobrenjc93 added the topic: not user facing topic category label Jul 9, 2025
@bobrenjc93 bobrenjc93 marked this pull request as ready for review July 9, 2025 22:55
@bobrenjc93 bobrenjc93 requested a review from zou3519 as a code owner July 9, 2025 22:55
@pytorchmergebot
Collaborator

Starting merge as part of PR stack under #157848

@bobrenjc93
Contributor Author

oops sorry i just saw comments, cancelling merge now

@pytorchmergebot
Collaborator

Starting merge as part of PR stack under #157848

pytorchmergebot pushed a commit that referenced this pull request Jul 12, 2025
…c/modules/linear_relu.py (#157848)

Pull Request resolved: #157848
Approved by: https://github.com/Skylion007
ghstack dependencies: #157847
@github-actions github-actions bot deleted the gh/bobrenjc93/523/head branch August 12, 2025 02:18