A few times, VolumetricMaxPooling (in legacy.nn) kept failing on the continuous builds, and I needed to investigate why.
It turned out there was a set of indices within the same pooling window whose values differed by less than the finite-difference epsilon. So the numeric gradient was hitting a boundary condition (max-pooling is discontinuous, of course): perturbing either value by epsilon can flip which element is the max, so the finite-difference estimate no longer matches the analytic gradient.

The obvious fix is to modify the test so the input tensor doesn't contain these boundary conditions.
As an aside, though, I am not sure why the boundary condition is only hit sometimes and not always: the same random numbers are generated every time, and floating-point division may be numerically unstable but it is still deterministic (right? or have I always gotten this wrong?).
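To make the failure mode concrete, here is a minimal sketch (plain NumPy, not the actual test code) of a central-difference numeric gradient applied to `max` when two inputs are closer together than the perturbation epsilon. The analytic gradient of `max` is one-hot at the argmax, but the numeric estimate smears it across both near-equal elements, because the +eps/-eps perturbations flip which element wins:

```python
import numpy as np

def numeric_grad(f, x, eps=1e-4):
    """Central finite-difference gradient of a scalar function f at x."""
    g = np.zeros_like(x)
    for i in range(x.size):
        xp = x.copy(); xp[i] += eps
        xm = x.copy(); xm[i] -= eps
        g[i] = (f(xp) - f(xm)) / (2 * eps)
    return g

# Two values in the same "pooling window" that differ by less than eps.
x = np.array([1.0, 1.0 + 1e-6])

g = numeric_grad(np.max, x)
# Analytic gradient of max is one-hot at the argmax: [0., 1.].
# But the numeric gradient comes out roughly [0.5, 0.5], because
# bumping either element by eps makes *that* element the max.
print(g)
```

Running this prints something close to `[0.5, 0.5]` rather than `[0., 1.]`, which is exactly the kind of mismatch that makes a gradcheck-style test fail. Spacing the test inputs by more than epsilon avoids the discontinuity.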