[feature request] Forward-mode automatic differentiation #10223

Description

@krishnap25

Thanks for the awesome library! It would be great if PyTorch could support forward-mode automatic differentiation. The main use case is to compute a Jacobian-vector product. I tried using this trick that simulates forward-mode autodiff by running reverse-mode twice, but it causes my GPU to run out of memory with AlexNet. HIPS/autograd supports this operation, and it would be really nice if PyTorch could as well. Thanks!
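For reference, the reverse-over-reverse trick mentioned above (computing a Jacobian-vector product by differentiating a vector-Jacobian product with respect to a dummy cotangent) can be sketched as follows. This is a minimal illustration, not PyTorch's eventual forward-mode API; `f`, `x`, and `v` are hypothetical placeholders for the function, primal input, and tangent vector:

```python
import torch

def jvp_via_double_backward(f, x, v):
    """Compute J(x) @ v using two reverse-mode passes.

    The first torch.autograd.grad call builds the graph for the
    vector-Jacobian product u^T J with a dummy cotangent u; the
    second differentiates that result with respect to u, yielding
    J v. Note that create_graph=True retains the intermediate
    graph, which is the source of the memory blow-up reported
    above for large networks like AlexNet.
    """
    x = x.detach().requires_grad_(True)
    y = f(x)
    # Dummy cotangent; its value (zeros) is irrelevant, only its
    # role as a differentiable input to the first backward pass.
    u = torch.zeros_like(y, requires_grad=True)
    (g,) = torch.autograd.grad(y, x, grad_outputs=u, create_graph=True)
    (jvp,) = torch.autograd.grad(g, u, grad_outputs=v)
    return jvp

# Example: for f(x) = x**2 the Jacobian is diag(2x), so J v = 2 * x * v.
x = torch.tensor([1.0, 2.0, 3.0])
v = torch.ones(3)
result = jvp_via_double_backward(lambda t: t ** 2, x, v)
```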

cc @ezyang @gchanan @zou3519 @bdhirsh @jbschlosser @albanD @gqchen @pearu @nikitaved @soulitzer @anjali411 @dylanbespalko @mruberry @ssnl

Metadata

Labels

- complex_autograd
- feature: A request for a proper, new feature.
- high priority
- module: autograd: Related to torch.autograd, and the autograd engine in general
- module: complex: Related to complex number support in PyTorch
- quansight-nack: High-prio issues that have been reviewed by Quansight and are judged to be not actionable.
- triaged: This issue has been looked at by a team member, and triaged and prioritized into an appropriate module
