[release/3.0] Fix the max token size threshold to correctly compute to 125MB for Base64 bytes. #40796
Conversation
…se64 bytes. (dotnet#40792)

* Fix the max token size threshold to correctly compute to 125MB for Base64 bytes.
* Rename constant to fix transpose error: Base46 -> Base64
* Enable the outerloop tests for windows and osx only and update to use platform specific new line.
Since this was approved, merging. cc @ericstj
The limits were actually 2_604_166 (~2.5MB), and with the parens in the correct place the limit is 125_000_000 (125MB).

old: 2_604_166 == ((1_000_000_000 >> 6) / 6) == (1_000_000_000 / 384)

Example:

```csharp
var buffer = new System.Buffers.ArrayBufferWriter<byte>();
var writer = new Utf8JsonWriter(buffer);
writer.WriteStartArray();

// this works
byte[] bytes = Encoding.UTF8.GetBytes(new String('a', 2_604_166));
writer.WriteBase64StringValue(bytes);

// WriteBase64StringValue throws below (before fix)
bytes = Encoding.UTF8.GetBytes(new String('a', 2_604_167));
writer.WriteBase64StringValue(bytes);
```
Yes, of course. Thanks for providing that detail @steveharter and for making it clear. My description focused on precisely what changed and the typo in isolation; it was not attempting to show the result of the whole calculation (it excluded the division by 6 for brevity).
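To make the precedence issue concrete, the two readings of the expression can be compared directly. This is a sketch, not the actual source: the literal values match the numbers in the comment above, and the trailing division by 6 is the extra factor mentioned there (excluded from the short description for brevity).

```csharp
using System;

class PrecedenceDemo
{
    static void Main()
    {
        // C# binds `*` tighter than `>>`, so the unparenthesized expression
        // shifts by 6 instead of shifting by 2 and then multiplying by 3.
        int before = 1_000_000_000 >> 2 * 3;   // parsed as 1_000_000_000 >> (2 * 3)
        int after  = (1_000_000_000 >> 2) * 3; // the intended computation

        Console.WriteLine(before);     // 15_625_000
        Console.WriteLine(after);      // 750_000_000
        Console.WriteLine(before / 6); // 2_604_166   (old effective limit, ~2.5MB)
        Console.WriteLine(after / 6);  // 125_000_000 (fixed limit, 125MB)
    }
}
```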
Ports #40792 to 3.0
Fixes https://github.com/dotnet/corefx/issues/40755
cc @lauxjpn, @steveharter, @scalablecory, @stephentoub, @ericstj
Description
Fix the typo in the threshold calculation where rather than computing `(1_000_000_000 / 2^2) * 3` (== 750_000_000), we were accidentally computing `1_000_000_000 / (2^(2 * 3))` (== 15_625_000) because the parenthesis was misplaced. It turns out that C# gives multiplication higher precedence than bit-shifts, but we want the bit-shift applied first. Also, rename the constant to fix a transposition error: Base46 -> Base64.

Customer Impact
The bug was customer-reported: the user was unable to return a 3.5MB byte[] from a controller's action within an ASP.NET app. Effectively, trying to write a byte[] as a Base64-encoded string that is > 2.5MB wouldn't work, which falls within the "reasonable" size for a binary payload. The actual token size limit was intended to be ~125 MB, which customers have not run up against (this limitation is due to the way `Utf8JsonWriter` is implemented). Direct users of the `Utf8JsonWriter` have a sub-optimal workaround for this bug, where they could explicitly encode the byte[] to a Base64 string themselves and pass that to `Utf8JsonWriter.WriteString()`. However, most folks would hit this via the `JsonSerializer` or from within an ASP.NET app, where they may not be able to modify their object model from byte[] to string (encoding to Base64 up-front themselves), and there is no workaround for them.

Regression?
No.
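For reference, the `Utf8JsonWriter`-only workaround described in the Customer Impact section can be sketched as follows. This is an illustrative sketch, not code from the PR; the property name "data" and the payload size are made up.

```csharp
using System;
using System.Text.Json;

class WorkaroundSketch
{
    static void Main()
    {
        byte[] payload = new byte[4_000_000]; // larger than the old ~2.5MB effective limit

        var buffer = new System.Buffers.ArrayBufferWriter<byte>();
        using var writer = new Utf8JsonWriter(buffer);

        writer.WriteStartObject();
        // Instead of writer.WriteBase64String("data", payload), which threw
        // before the fix, encode the bytes up-front and write a plain string.
        writer.WriteString("data", Convert.ToBase64String(payload));
        writer.WriteEndObject();
        writer.Flush();
    }
}
```

As noted above, this only helps callers who drive the writer directly; serializer and ASP.NET users cannot intercept the write this way.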
Risk
Worst case, `Utf8JsonWriter.WriteBase64String(...)` breaks. Given how isolated the fix is (just a single line - math operation ordering corrected), the risk of this change having unintentional side effects is low. Previously, any caller trying to write large binary data as a Base64 string would have hit an exception (which is fixed now), and most callers wouldn't hit this issue. There is a workaround when using `Utf8JsonWriter` directly. However, since there isn't a workaround for folks using the `JsonSerializer` or returning byte[] within an ASP.NET app, this certainly meets the servicing bar, and given the code change, I believe this should be ported.

Tests run / added