
We're binding a bunch of crap to 'torch' namespace which shouldn't be there #16322

@ezyang

Description


Example:

>>> import torch
>>> torch.hinge_embedding_loss
<built-in method hinge_embedding_loss of type object at 0x7ff9c8948d20>

The official docs tell you to use torch.nn.functional.hinge_embedding_loss, and for good reason: the hinge_embedding_loss bound to the torch namespace exposes a reduction argument that is an int, not a string.
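For illustration, the mismatch can be sketched without importing PyTorch: the public functional API takes reduction as a string ('none' | 'mean' | 'sum'), while the leaked native binding expects an integer enum. The 0/1/2 mapping below mirrors what torch.nn._reduction.get_enum does internally, but treat the exact values as an assumption:

```python
# Sketch of the reduction-argument mismatch (assumed enum mapping,
# mirroring ATen's Reduction enum: none=0, mean=1, sum=2).
REDUCTION_ENUM = {'none': 0, 'mean': 1, 'sum': 2}

def get_reduction_enum(reduction: str) -> int:
    """Translate the user-facing string into the internal int code
    that the native overload on the torch namespace expects."""
    try:
        return REDUCTION_ENUM[reduction]
    except KeyError:
        raise ValueError(f"{reduction!r} is not a valid value for reduction")

print(get_reduction_enum('mean'))  # 1
```

So a user who calls the torch-namespace version with reduction='mean' hits a type error (or silently wrong behavior), which is exactly why only the string-taking functional variant should be public.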

Metadata


    Labels

    better-engineering: Relatively self-contained tasks for better engineering contributors
    triaged: This issue has been looked at by a team member, and triaged and prioritized into an appropriate module
