Description
Currently, the WGSL spec specifies the required accuracy for a variety of floating-point builtins: https://www.w3.org/TR/WGSL/#floating-point-accuracy
Many of these resolve to an accuracy expressed in ULP, which can then be converted into a concrete epsilon: the maximum distance an implementation's result may be from the true value.
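As a rough illustration of that conversion, here is a minimal TypeScript sketch of turning an n-ULP tolerance into a concrete epsilon around an expected f32 value. The names (`ulpF32`, `withinUlp`) are illustrative, not actual CTS APIs, and the bit-increment trick assumes a finite, positive, non-maximal f32 input.

```typescript
// Shared views onto the same 4 bytes, used to reinterpret f32 bits.
const f32 = new Float32Array(1);
const u32 = new Uint32Array(f32.buffer);

// Next representable f32 above x.
// Assumption: x is finite, positive, and not the largest finite f32.
function nextAfterF32(x: number): number {
  f32[0] = x;          // round x to f32
  u32[0] = u32[0] + 1; // step to the adjacent bit pattern
  return f32[0];
}

// Size of one ULP at x (hypothetical helper, not a CTS function).
function ulpF32(x: number): number {
  const fx = Math.fround(x);
  return nextAfterF32(fx) - fx;
}

// Is `actual` within nUlp ULPs of `expected`?
function withinUlp(actual: number, expected: number, nUlp: number): boolean {
  const eps = nUlp * ulpF32(expected);
  return Math.abs(actual - expected) <= eps;
}
```

For example, one ULP at 1.0 in f32 is 2^-23, so `withinUlp(1.0000001, 1.0, 2)` passes while `withinUlp(1.001, 1.0, 1)` does not. Note this sketch still presupposes some source for `expected`, which is exactly the open question.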
The issue becomes, what is the true value?
Is it the JS/Typescript value for that specific operation?
Is it from a table of values generated offline via a precision C math library? Maybe generated as part of the build process for the CTS?
Is there some other source of truth?
Additionally, how many values do we test against: just a set of specific interesting values, all potential values, or maybe a random selection?
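To make the options concrete, a shared test-value strategy might look something like the following TypeScript sketch: a curated list of f32 edge cases plus a seeded, reproducible pseudo-random sample, so every builtin's test draws from the same pool. The names (`INTERESTING_F32`, `sampleF32`) and the LCG constants are illustrative assumptions, not actual CTS code.

```typescript
// A small curated set of "interesting" f32 edge cases.
const INTERESTING_F32: number[] = [
  0, -0, 1, -1,
  2 ** -126,                          // smallest positive normal f32
  2 ** -149,                          // smallest positive subnormal f32
  Math.fround(3.4028234663852886e38), // largest finite f32
  Math.fround(Math.PI),               // pi rounded to f32
];

// Deterministic pseudo-random f32 sample (hypothetical helper).
// Seeded so a failing case can be reproduced exactly across runs.
function sampleF32(count: number, seed: number = 1): number[] {
  let s = seed >>> 0;
  const out: number[] = [];
  for (let i = 0; i < count; i++) {
    s = (Math.imul(s, 1664525) + 1013904223) >>> 0; // classic LCG step
    out.push(Math.fround((s / 2 ** 32) * 2000 - 1000)); // f32 in [-1000, 1000)
  }
  return out;
}
```

A test might then run over `INTERESTING_F32.concat(sampleF32(1000, seed))`; the seed makes the "random selection" option reproducible, which is likely a requirement for any standardized approach.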
Instead of having the tests for each builtin figure this out for themselves, and potentially be inconsistent with each other, there should probably be a standardized way of doing this, along with a doc outlining it.