Consider the following log calls:
```
log_color("some/entity", frame_nr=0, [{255, 0, 0, 255}])
log_point("some/entity", frame_nr=1, [{1.0, 1.0}])
log_point("some/entity", frame_nr=2, [{2.0, 2.0}])
log_point("some/entity", frame_nr=3, [{3.0, 3.0}])
log_point("some/entity", frame_nr=4, [{4.0, 4.0}])
log_point("some/entity", frame_nr=5, [{5.0, 5.0}])
```
Querying for `LatestAt("some/entity", ("frame_nr", 5))` will unsurprisingly yield a red point at `(5.0, 5.0)`.
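To make the latest-at semantics concrete, here is a minimal sketch in Python. The dict-based store and the `latest_at` helper are illustrative stand-ins, not the real store or API: each component maps to time-ordered `(time, value)` rows, and a latest-at query returns the most recent value logged at or before the query time.

```python
# Toy store: component name -> time-ordered rows of (frame_nr, value).
# This models the log calls above; it is NOT the real datastore layout.
store = {
    "color": [(0, (255, 0, 0, 255))],
    "point": [(1, (1.0, 1.0)), (2, (2.0, 2.0)), (3, (3.0, 3.0)),
              (4, (4.0, 4.0)), (5, (5.0, 5.0))],
}

def latest_at(store, component, time):
    """Return the most recent value logged at or before `time`, else None."""
    candidates = [v for (t, v) in store.get(component, []) if t <= time]
    return candidates[-1] if candidates else None

print(latest_at(store, "point", 5))  # (5.0, 5.0)
print(latest_at(store, "color", 5))  # (255, 0, 0, 255)
```

Note that the color is found even though it was logged at `frame_nr=0`: latest-at walks back in time until it finds a value for each component.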
Now, consider what happens after running a GC that drops 50% of the data, leaving us with:
```
log_point("some/entity", frame_nr=3, [{3.0, 3.0}])
log_point("some/entity", frame_nr=4, [{4.0, 4.0}])
log_point("some/entity", frame_nr=5, [{5.0, 5.0}])
```
Querying for `LatestAt("some/entity", ("frame_nr", 5))` will now yield a point at `(5.0, 5.0)` with whatever is currently defined as the default color, rather than red. This is just plain wrong.
This happens because the GC blindly drops data rather than doing the correct thing: compacting the dropped data into a latest-at kind of state and keeping that state around for future queries.
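The difference between the two behaviors can be sketched as follows. This is a toy model, not the real implementation: the store is a dict of time-ordered `(frame_nr, value)` rows per component, and `compacting_gc` keeps the last row of the dropped range per component as the compacted latest-at state.

```python
# Toy store modeling the log calls above (not the real datastore layout).
store = {
    "color": [(0, (255, 0, 0, 255))],
    "point": [(1, (1.0, 1.0)), (2, (2.0, 2.0)), (3, (3.0, 3.0)),
              (4, (4.0, 4.0)), (5, (5.0, 5.0))],
}

def latest_at(store, component, time):
    """Most recent value logged at or before `time`, else None."""
    candidates = [v for (t, v) in store.get(component, []) if t <= time]
    return candidates[-1] if candidates else None

def naive_gc(store, cutoff):
    """Blindly drop every row logged before `cutoff`."""
    return {c: [(t, v) for (t, v) in rows if t >= cutoff]
            for c, rows in store.items()}

def compacting_gc(store, cutoff):
    """Drop rows before `cutoff`, but compact the dropped range into a
    single latest-at row per component so future queries still see it."""
    out = {}
    for c, rows in store.items():
        dropped = [(t, v) for (t, v) in rows if t < cutoff]
        kept = [(t, v) for (t, v) in rows if t >= cutoff]
        out[c] = ([dropped[-1]] if dropped else []) + kept
    return out

print(latest_at(naive_gc(store, 3), "color", 5))       # None: red was lost
print(latest_at(compacting_gc(store, 3), "color", 5))  # (255, 0, 0, 255)
```

With the naive GC, the color logged at `frame_nr=0` vanishes and the query falls back to the default color; with compaction, the red color survives as part of the compacted state while the old point rows are still reclaimed.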