If we configure the interpreter with SetDefaultNumberType(DefaultNumberType.Decimal), as in the code below:
interpreter = new Interpreter().SetDefaultNumberType(DefaultNumberType.Decimal);
then the following expression fails to evaluate for a loaded string variable strVariable:
strVariable.Substring(0, 40)
If we instead create the interpreter with interpreter = new Interpreter(), the expression above evaluates correctly.
I have tried adding references for the String and Math types as below, but it still does not work; it only works if we remove the SetDefaultNumberType call.
interpreter = interpreter.Reference(typeof(System.String));
interpreter = interpreter.Reference(typeof(System.Math));
Can someone help with this issue? We need to set the default number type and also keep access to these types for methods such as Substring and for extension methods.
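For reference, here is a minimal self-contained sketch of the behavior described above, assuming the DynamicExpresso.Core package. The literal string value and the Substring arguments are just placeholders. My suspicion (an assumption, not confirmed) is that with DefaultNumberType.Decimal the integer literals 0 and 40 parse as decimal, so overload resolution finds no matching String.Substring(decimal, decimal):

```csharp
// Repro sketch, assuming the DynamicExpresso.Core NuGet package.
using System;
using DynamicExpresso;

class Repro
{
    static void Main()
    {
        var strVariable = "some reasonably long sample string for testing";

        // Works: a plain interpreter resolves Substring(int, int) normally.
        var plain = new Interpreter();
        plain.SetVariable("strVariable", strVariable);
        Console.WriteLine(plain.Eval("strVariable.Substring(0, 10)"));

        // Fails for us: the same expression after setting the default
        // number type to Decimal, even with explicit type references.
        var dec = new Interpreter()
            .SetDefaultNumberType(DefaultNumberType.Decimal)
            .Reference(typeof(System.String))
            .Reference(typeof(System.Math));
        dec.SetVariable("strVariable", strVariable);
        Console.WriteLine(dec.Eval("strVariable.Substring(0, 10)"));
    }
}
```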