Until ~June 6, 2019, our memory-delta measurements were byte-accurate, and we could crisply use EXPECT_EQ to track memory bloat in tests.
However, around that time the measurements started to diverge, and the sources of the divergence are not fully understood. One known source is the short-string optimization in libstdc++, which stores short strings (length <= 22 bytes) inline in the std::string object rather than on the heap; that alone does not account for all of the entropy, and the remaining sources are unknown.
Note that we do not yet seem to be seeing allocation-pattern variation due to non-deterministic real-time timing in multi-threaded tests, though that might happen in the future.
In the meantime, as a stopgap, the EXPECT_EQ memory tests were changed to EXPECT_LE in #7208 to resolve #7196. This means we have to allow for the maximum memory used by an operation on any platform where we run memory tests, including all platforms where we use TCMALLOC.
However, this is not ideal: if you are developing on a platform that uses less memory, you may not notice that you've bloated memory until the test runs on another platform, such as Bazel CI or Envoy CI on MacOS.
So the idea here is to run exact comparisons only in CI, which we can convey to the test code with a compile-time define. When running outside of CI we can fall back to some sort of approximate upper bound with EXPECT_LE.
@lizan @mattklein123 @htuch WDYT?