
AggregatorTestCase#testSupportedFieldTypes fails with some generated half-floats #55360

@williamrandolph

Description


I got a strange little edge case on a PR check. It reproduces reliably with the test seed in the lines below:

./gradlew ':x-pack:plugin:analytics:test' \
  --tests "org.elasticsearch.xpack.analytics.mapper.HDRPreAggregatedPercentileRanksAggregatorTests.testSupportedFieldTypes" \
  -Dtests.seed=F58749BFAC5AEB11 \
  -Dtests.security.manager=true \
  -Dtests.locale=es-NI \
  -Dtests.timezone=Asia/Harbin \
  -Dcompiler.java=14

The test's error:

java.lang.AssertionError: 
Aggregator [percentile_ranks] supports field type [half_float] but executing against the field threw an exception: [The value Infinity is out of bounds for histogram, current covered range [6.668014432879854E240, 5.3344115463038834E241) cannot be extended any further.
Caused by: java.lang.ArrayIndexOutOfBoundsException: Values above 4.49423283715579E307 cannot be recorded]

But what we really want is that out-of-bounds exception and its stack trace:

Caused by: java.lang.ArrayIndexOutOfBoundsException: 
The value Infinity is out of bounds for histogram, current covered range [6.668014432879854E240, 5.3344115463038834E241) cannot be extended any further.
Caused by: java.lang.ArrayIndexOutOfBoundsException: Values above 4.49423283715579E307 cannot be recorded
at org.HdrHistogram.DoubleHistogram.autoAdjustRangeForValueSlowPath(DoubleHistogram.java:412)
at org.HdrHistogram.DoubleHistogram.autoAdjustRangeForValue(DoubleHistogram.java:378)
at org.HdrHistogram.DoubleHistogram.recordSingleValue(DoubleHistogram.java:345)
at org.HdrHistogram.DoubleHistogram.recordValue(DoubleHistogram.java:290)
at org.elasticsearch.search.aggregations.metrics.AbstractHDRPercentilesAggregator$1.collect(AbstractHDRPercentilesAggregator.java:86)
at org.elasticsearch.search.aggregations.LeafBucketCollector.collect(LeafBucketCollector.java:82)
at org.apache.lucene.search.AssertingLeafCollector.collect(AssertingLeafCollector.java:49)
at org.apache.lucene.search.MatchAllDocsQuery$1$1.score(MatchAllDocsQuery.java:64)
at org.apache.lucene.search.BulkScorer.score(BulkScorer.java:39)
at org.apache.lucene.search.AssertingBulkScorer.score(AssertingBulkScorer.java:71)
at org.apache.lucene.search.IndexSearcher.search(IndexSearcher.java:661)
at org.elasticsearch.search.aggregations.AggregatorTestCase$ShardSearcher.search(AggregatorTestCase.java:540)
at org.elasticsearch.search.aggregations.AggregatorTestCase.searchAndReduce(AggregatorTestCase.java:484)
at org.elasticsearch.search.aggregations.AggregatorTestCase.searchAndReduce(AggregatorTestCase.java:425)
at org.elasticsearch.search.aggregations.AggregatorTestCase.testSupportedFieldTypes(AggregatorTestCase.java:684)

It looks like this might be a bug in how we're generating half floats for this test. Here's AggregatorTestCase lines 724-729:

            if (typeName.equals(NumberFieldMapper.NumberType.HALF_FLOAT.typeName())) {
                // Generate a random float that respects the limits of half float
                float f = Math.abs((randomFloat() * 2 - 1) * 70000);
                v = HalfFloatPoint.halfFloatToSortableShort(f);
                json = "{ \"" + fieldName + "\" : \"" + f + "\" }";
            }

I stepped through this and it looks like we're generating a float of 65968.47. The largest finite half-float value is 65504, so anything above that overflows to Infinity when the float is converted to half-float precision, and that Infinity is what blows up the HDR histogram.
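To see the overflow in isolation, here's a quick Python sketch. This is my own reimplementation of round-to-nearest-even half-float rounding, not the actual Lucene code, but it shows how 65968.47 lands past the rounding cutoff of 65520 and becomes Infinity:

```python
import math

def float_to_half(f):
    """Round f to the nearest IEEE 754 binary16 value (ties to even),
    overflowing to +/-Infinity the way half-float conversion does."""
    if math.isnan(f) or math.isinf(f) or f == 0.0:
        return f
    sign = math.copysign(1.0, f)
    a = abs(f)
    # 65504 is the largest finite half-float; anything at or above the
    # rounding midpoint 65520 rounds up to Infinity.
    if a >= 65520.0:
        return sign * math.inf
    # Spacing between adjacent half-floats at this magnitude
    # (10 mantissa bits; exponents below -14 are subnormal).
    exp = max(math.floor(math.log2(a)), -14)
    ulp = 2.0 ** (exp - 10)
    # Snap to the half-float grid, rounding ties to even.
    q = a / ulp
    r = math.floor(q)
    frac = q - r
    if frac > 0.5 or (frac == 0.5 and r % 2 == 1):
        r += 1
    return sign * r * ulp

print(float_to_half(65503.9))   # 65504.0 -- still representable
print(float_to_half(65968.47))  # inf -- the value from this seed
```

Since the generator multiplies by 70000, any draw whose magnitude lands in [65520, 70000] will overflow like this, which is why the failure is rare but reproducible with the right seed.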

Since this appears to be a rare corner case, as evidenced by Gradle Enterprise having no other record of this failure in the last 90 days, I'm not going to mute the test.
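For what it's worth, one possible fix (just a sketch of the idea in Python, not a tested patch against AggregatorTestCase, and the constant name is my own) would be to bound the generated value by the largest finite half-float, 65504, instead of 70000, so the random float can never round up to Infinity:

```python
import random

HALF_FLOAT_MAX = 65504.0  # largest finite IEEE 754 binary16 value

def random_half_float_safe():
    # Mirror of the test's generator, but bounded by the half-float
    # max instead of 70000, so conversion can never overflow to
    # Infinity. (The real fix would of course be the equivalent
    # change in the Java test code.)
    return abs((random.random() * 2 - 1) * HALF_FLOAT_MAX)
```

Any value at or below 65504 rounds to a finite half-float, so the histogram's auto-ranging never sees Infinity.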
