
[Needs someone to complete] Reduce sum on many axes#2116

Closed
vlasenkov wants to merge 2 commits into pytorch:master from vlasenkov:reduce-sum

Conversation

@vlasenkov
Contributor

@vlasenkov vlasenkov commented Jul 16, 2017

Resolves #2006

  • agree on the algorithm
  • decide where to expose the feature (as a separate function or as a Tensor/Variable method)
  • implement the feature
  • rename all keepdim to keepdims
  • write docs
  • write tests

Member

@fmassa fmassa left a comment


Thanks for the PR!
I wonder if it would be better to keep the implementation out of cwrap, to avoid conflicts? I think that's better than implementing these ops in cwrap, because we automatically get support for autograd.
Also, what is numpy's behavior for operations like median when multiple axes are passed? Does it perform multiple kernel calls, or a transpose + view + single kernel call? For sum it might not matter, but for other ops it could make a difference.
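The distinction raised here can be checked directly. A small sketch of my own (not from the PR, using numpy since it already supports tuple axes): for a commutative, associative reduction like sum, a single tuple-axis call agrees with repeated single-axis reductions, whereas for an op like median that equivalence does not hold in general.

```python
import numpy as np

# For sum, reducing one axis at a time (highest index first, so the
# remaining axis indices stay valid) matches a single tuple-axis call.
# For median, a "median of medians" would NOT equal the true median
# over the combined axes, which is the crux of the question above.
x = np.arange(24.0).reshape(2, 3, 4)

tuple_sum = x.sum(axis=(0, 2))           # one call, multiple axes

seq_sum = x
for ax in sorted((0, 2), reverse=True):  # one axis at a time
    seq_sum = seq_sum.sum(axis=ax)

print(np.allclose(tuple_sum, seq_sum), tuple_sum.shape)  # True (3,)
```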

from ._utils import _range
from operator import mul
from functools import reduce
import collections

This comment was marked as off-topic.

            input = input.sum(ax, keepdims=True)
    else:
        for ax in sorted(axes, reverse=True):
            input = input.sum(ax)

This comment was marked as off-topic.

def sum(input, axes, keepdims=False, out=None):
    if isinstance(axes, collections.Iterable):
        if a.dim() > 3:
            if keepdims:

This comment was marked as off-topic.

            # permute
            # reduce single dim
    else:
        return torch._C.sum(input, axes, keepdims, out)

This comment was marked as off-topic.
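Pieced together, the review fragments above suggest roughly the following shape. This is a hedged reconstruction, not the PR's actual code: the name `sum_over_axes` and the numpy backend are my assumptions (the PR dispatched the single-axis case to `torch._C.sum`), kept here only so the sketch is self-contained and runnable.

```python
from collections.abc import Iterable
import numpy as np  # stand-in for torch tensors so the sketch runs

def sum_over_axes(input, axes, keepdims=False):
    """Reduce over several axes via repeated single-axis sums (sketch)."""
    if isinstance(axes, Iterable):
        if keepdims:
            # keepdims preserves axis numbering, so iteration order is free
            for ax in axes:
                input = input.sum(axis=ax, keepdims=True)
        else:
            # reduce highest axes first so the remaining indices stay valid
            for ax in sorted(axes, reverse=True):
                input = input.sum(axis=ax)
        return input
    # single-axis case: defer to the underlying reduction
    return input.sum(axis=axes, keepdims=keepdims)

x = np.ones((2, 3, 4))
print(sum_over_axes(x, (0, 2)).shape)                 # (3,)
print(sum_over_axes(x, (0, 2), keepdims=True).shape)  # (1, 3, 1)
```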

@vlasenkov vlasenkov changed the title [WIP] Reduce sum on many axes [Needs someone to complete] Reduce sum on many axes Aug 10, 2017
@bernardohenz

I am trying to perform std on many axes, pretty similar to what you are doing with sum. Is this problem solved?

zou3519 pushed a commit to zou3519/pytorch that referenced this pull request Mar 30, 2018
@karandwivedi42
Contributor

@zou3519 closed in #6152?

@fmassa fmassa closed this Jun 27, 2018
@tstandley

I also want to do var over many axes. Is this solved? (Same question as @bernardohenz, except with var.)

I should note that numpy supports this, and the only way to do this in pytorch currently is to compute the mean, subtract it (using expand), square, and then take the mean; basically doing it manually.
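The manual workaround described here can be sketched as follows (my own illustration, using numpy so it runs; the steps are the same with torch tensors, relying on broadcasting rather than an explicit expand, and matching the default biased/population variance):

```python
import numpy as np

x = np.random.rand(2, 3, 4)
axes = (0, 2)

# mean -> subtract -> square -> mean, done "manually" over several axes
mu = x.mean(axis=axes, keepdims=True)         # keepdims so mu broadcasts back
manual_var = ((x - mu) ** 2).mean(axis=axes)  # average the squared deviations

print(np.allclose(manual_var, x.var(axis=axes)))  # True
```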

@soumith
Collaborator

soumith commented Aug 15, 2018

@tstandley we are working on mean, variance and stdv on multiple axes. @colesbury should put up a PR soon for it.

@anilkeshwani

anilkeshwani commented Nov 1, 2021

For signposting:
Users and developers looking for up-to-date information on reduction operators in general (e.g. sum, min, max, var, etc.) should see the Reductions tracking issue #61417, which lists all of these proposed modifications and the ongoing work.


Successfully merging this pull request may close these issues.

Tensor.sum() over multiple axes
