Batch normalization needs input dim checks #378

@szagoruyko

Description

Right now PyTorch's batch_norm accepts weight and running_mean tensors of any shape (e.g. 128 and 512) without complaint. The checks at https://github.com/torch/nn/blob/master/BatchNormalization.lua#L78 should probably be moved to the C/CUDA side.
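The kind of validation being proposed could be sketched in pure Python along these lines (check_batch_norm_dims is a hypothetical helper, not PyTorch's actual API; the real fix would live in the C/CUDA kernels):

```python
def check_batch_norm_dims(num_features, weight, running_mean, running_var, bias=None):
    """Hypothetical sketch: verify every per-channel parameter has
    num_features elements, mirroring the checks in torch/nn's
    BatchNormalization.lua before the kernel is invoked."""
    params = [
        ("weight", weight),
        ("bias", bias),
        ("running_mean", running_mean),
        ("running_var", running_var),
    ]
    for name, param in params:
        if param is not None and len(param) != num_features:
            raise ValueError(
                f"{name} has {len(param)} elements, expected {num_features}"
            )
```

With such a check, passing a 128-element running_mean alongside 512-element weights would raise a ValueError instead of silently running with mismatched shapes.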

Labels

    module: dependency bug (Problem is not caused by us, but caused by an upstream library we use)
