Description
Using StringComparison.InvariantCulture with string.LastIndexOf(string, StringComparison) returns different results in .NET 5 compared to .NET Core 3.1 when looking for the index of a Unicode character.
Witness the following code:
using System.Diagnostics;

string specialChar = "\u007f";
string testString = "hello" + specialChar + "world";
Debug.Assert(testString.LastIndexOf(specialChar, StringComparison.InvariantCulture) == 5);
On .NET Core 3.1 this assertion is true, whereas on .NET 5 it is false. In fact, on .NET 5 the LastIndexOf method returns the index of the end of the string, whereas 3.1 correctly returns 5.
Supposition
I have tried source stepping into the LastIndexOf method but, frankly, I do not understand it well enough to make an informed analysis of why this happens :( My supposition is that it is something to do with InvariantCulture and Unicode, because if I change the comparison mode to StringComparison.Ordinal, the assertion holds in both .NET versions.
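A minimal sketch contrasting the two comparison modes (the Ordinal result is deterministic; the InvariantCulture result is the one that differs between runtimes):

```csharp
using System;

class Program
{
    static void Main()
    {
        string specialChar = "\u007f"; // DEL, a control character
        string testString = "hello" + specialChar + "world";

        // Ordinal comparison matches raw UTF-16 code units, so the
        // character is found at index 5 on both .NET Core 3.1 and .NET 5.
        Console.WriteLine(
            testString.LastIndexOf(specialChar, StringComparison.Ordinal)); // 5

        // Linguistic comparison; this is the result that changed:
        // 5 on .NET Core 3.1, but the end of the string on .NET 5.
        Console.WriteLine(
            testString.LastIndexOf(specialChar, StringComparison.InvariantCulture));
    }
}
```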