
Add helper methods to IApiWebRequest for serializing JSON directly to a Stream #8019

Merged
andrewlock merged 1 commit into master from andrew/optimize-json-post on Jan 9, 2026

Conversation

@andrewlock (Member) commented Dec 30, 2025

Summary of changes

Adds PostAsJson<T> method to IApiWebRequest

Reason for change

Currently, if we want to send JSON, we serialize it to a string locally, convert the string to utf-8 bytes, potentially compress those bytes, and then copy that to the stream. Doing that as efficiently as we can is somewhat tricky, and we haven't always got it right. By creating a central method, and writing directly to the underlying stream where we can, we can potentially see efficiency gains, and can also potentially make it easier to move to a more modern serializer later.
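
As a minimal sketch of the difference (the names `JsonStreamWriter` and `WriteAsJson` are invented for illustration and assume the vendored Newtonsoft.Json API, not the PR's actual implementation), writing directly to the destination stream avoids the intermediate string and byte-array buffers entirely:

```csharp
using System.IO;
using System.IO.Compression;
using System.Text;
using Newtonsoft.Json;

// Sketch only: serialize the payload straight onto the destination stream,
// optionally through a GZipStream, instead of the old pipeline of
// string -> UTF-8 byte[] -> compressed byte[] -> stream.
public static class JsonStreamWriter
{
    public static void WriteAsJson<T>(T payload, Stream destination, bool gzip)
    {
        if (gzip)
        {
            // leaveOpen: true so the caller's stream survives; disposing the
            // GZipStream flushes the gzip footer.
            using var zip = new GZipStream(destination, CompressionMode.Compress, leaveOpen: true);
            WriteJson(payload, zip);
        }
        else
        {
            WriteJson(payload, destination);
        }
    }

    private static void WriteJson<T>(T payload, Stream stream)
    {
        // BOM-less UTF-8, and leaveOpen so disposing the writer doesn't
        // close the underlying (caller-owned or gzip) stream.
        using var writer = new StreamWriter(stream, new UTF8Encoding(false), 1024, leaveOpen: true);
        using var json = new JsonTextWriter(writer);
        JsonSerializer.CreateDefault().Serialize(json, payload);
    }
}
```

With this shape the only allocations per call are the (pooled or small) writer objects, which is consistent with gzip and non-gzip allocating roughly the same.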

Implementation details

  • Added IApiRequest.PostAsJsonAsync<T>(T payload, MultipartCompression compression) (and an overload that accepts json settings)
  • Implemented the method as a "push stream" approach in each of the three implementations we currently have (ApiRequest, HttpClient, HttpStream)
  • Benchmarked the implementation to confirm no regressions (see below)
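
For the HttpClient implementation, the "push stream" shape can be sketched as a custom `HttpContent` whose `SerializeToStreamAsync` writes JSON onto the transport stream as it is sent (`PushJsonContent` is an invented name for this sketch; the PR's real types differ):

```csharp
using System.IO;
using System.Net;
using System.Net.Http;
using System.Text;
using System.Threading.Tasks;
using Newtonsoft.Json;

// Sketch of a push-stream HttpContent: no pre-buffered byte[]; the JSON is
// produced directly into the stream HttpClient hands us.
internal sealed class PushJsonContent<T> : HttpContent
{
    private readonly T _payload;

    public PushJsonContent(T payload)
    {
        _payload = payload;
        Headers.ContentType = new System.Net.Http.Headers.MediaTypeHeaderValue("application/json");
    }

    protected override Task SerializeToStreamAsync(Stream stream, TransportContext context)
    {
        using var writer = new StreamWriter(stream, new UTF8Encoding(false), 1024, leaveOpen: true);
        using var json = new JsonTextWriter(writer);
        JsonSerializer.CreateDefault().Serialize(json, _payload);
        return Task.CompletedTask;
    }

    protected override bool TryComputeLength(out long length)
    {
        // Length is unknown up front, so HttpClient falls back to
        // chunked transfer encoding instead of sending Content-Length.
        length = -1;
        return false;
    }
}
```

Because `TryComputeLength` returns false, requests go out with chunked transfer encoding rather than a `Content-Length` header, which is the usual trade-off of streaming the body.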

Test coverage

Added unit tests that specifically serialize telemetry data and confirm we get the correct results when we deserialize at the other end (telemetry is one of the candidates for using this approach).
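
The round-trip style of those tests can be sketched like this (`FakePayload` is a stand-in DTO for the real `TelemetryData` type, and the serializer settings here are defaults, not the tracer's actual settings):

```csharp
using System.IO;
using System.IO.Compression;
using Newtonsoft.Json;

// Stand-in for the real telemetry payload type.
public sealed class FakePayload
{
    public string RuntimeId { get; set; }
    public long SeqId { get; set; }
}

public static class RoundTripCheck
{
    // Serialize through a gzip stream, then decompress and deserialize
    // "the other end" to confirm the payload survives intact.
    public static FakePayload GzipRoundTrip(FakePayload input)
    {
        using var buffer = new MemoryStream();
        using (var zip = new GZipStream(buffer, CompressionMode.Compress, leaveOpen: true))
        using (var writer = new StreamWriter(zip))
        {
            new JsonSerializer().Serialize(writer, input);
        }

        buffer.Position = 0;
        using var unzip = new GZipStream(buffer, CompressionMode.Decompress);
        using var reader = new StreamReader(unzip);
        return (FakePayload)new JsonSerializer().Deserialize(reader, typeof(FakePayload));
    }
}
```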

When running benchmarks, it became apparent that we had a serious regression in our allocations when we added GZIP-ing of our telemetry 😅 I didn't investigate the root cause, because switching to the new approach (in #8017) will resolve the issue anyway.

Overall conclusion:

  • In general, the new approach allocates slightly less than before
  • We have a big allocation and speed regression in GZip (specifically for telemetry), which the new approach will resolve completely
  • In general, the new approach allocates the same whether you use gzip or not
  • Throughput is roughly the same both before and after

ApiWebRequest (.NET FX)

| Method | Mean | Allocated | Alloc Ratio |
|---|---|---|---|
| ApiWebRequest_Before_Gzip | 6.460 ms | 572.44 KB | 1.00 |
| ApiWebRequest_After_Gzip | 2.037 ms | 20.75 KB | 0.04 |
| ApiWebRequest_Before | 1.949 ms | 22.34 KB | 1.00 |
| ApiWebRequest_After | 1.908 ms | 20.75 KB | 0.93 |

HttpClientRequest (.NET Core 3.1, .NET 6) - had to re-enable keep-alive to avoid connection exhaustion!

| Method | Runtime | Mean | Allocated | Alloc Ratio |
|---|---|---|---|---|
| HttpClient_Before_Gzip | .NET 6.0 | 4,980.1 us | 406.27 KB | 0.98 |
| HttpClient_After_Gzip | .NET 6.0 | 161.5 us | 12.04 KB | 0.03 |
| HttpClient_Before_Gzip | .NET Core 3.1 | 4,847.4 us | 414.43 KB | 1.00 |
| HttpClient_After_Gzip | .NET Core 3.1 | 166.3 us | 12.97 KB | 0.03 |
| HttpClient_Before | .NET 6.0 | 129.2 us | 13.03 KB | 0.91 |
| HttpClient_After | .NET 6.0 | 154.9 us | 12.05 KB | 0.84 |
| HttpClient_Before | .NET Core 3.1 | 162.2 us | 14.27 KB | 1.00 |
| HttpClient_After | .NET Core 3.1 | 189.6 us | 13.2 KB | 0.92 |

HttpStreamRequest over UDS (.NET Core 3.1, .NET 6)

| Method | Runtime | Mean | Allocated | Alloc Ratio |
|---|---|---|---|---|
| HttpStream_Before_Gzip | .NET 6.0 | 5,362.8 us | 440.87 KB | 0.99 |
| HttpStream_After_Gzip | .NET 6.0 | 444.8 us | 42.63 KB | 0.10 |
| HttpStream_Before_Gzip | .NET Core 3.1 | 5,421.3 us | 446.78 KB | 1.00 |
| HttpStream_After_Gzip | .NET Core 3.1 | 462.6 us | 42.77 KB | 0.10 |
| HttpStream_Before | .NET 6.0 | 428.7 us | 43.73 KB | 1.01 |
| HttpStream_After | .NET 6.0 | 433.3 us | 42.3 KB | 0.97 |
| HttpStream_Before | .NET Core 3.1 | 445.3 us | 43.5 KB | 1.00 |
| HttpStream_After | .NET Core 3.1 | 448.5 us | 42.62 KB | 0.98 |

SocketHandlerRequest over UDS (.NET 6)

| Method | Mean | Allocated | Alloc Ratio |
|---|---|---|---|
| SocketHandler_Before_Gzip | 5,070.65 us | 406.26 KB | 1.00 |
| SocketHandler_After_Gzip | 97.66 us | 12.28 KB | 0.03 |
| SocketHandler_Before | 53.95 us | 13.01 KB | 1.00 |
| SocketHandler_After | 87.64 us | 12.28 KB | 0.94 |

Benchmark used (approx)
using System;
using System.IO;
using System.IO.Compression;
using System.Net;
using System.Text;
using System.Threading.Tasks;
using BenchmarkDotNet.Attributes;
using BenchmarkDotNet.Configs;
using Datadog.Trace;
using Datadog.Trace.Agent;
using Datadog.Trace.Agent.StreamFactories;
using Datadog.Trace.Agent.Transports;
using Datadog.Trace.Configuration;
using Datadog.Trace.DogStatsd;
using Datadog.Trace.HttpOverStreams;
using Datadog.Trace.Tagging;
using Datadog.Trace.Telemetry;
using Datadog.Trace.Telemetry.Transports;
using Datadog.Trace.Util;
using Datadog.Trace.Vendors.Newtonsoft.Json;
using Datadog.Trace.Vendors.Newtonsoft.Json.Serialization;

namespace Benchmarks.Trace;

[MemoryDiagnoser]
[GroupBenchmarksBy(BenchmarkLogicalGroupRule.ByCategory)]
[CategoriesColumn]
public class TelemetryHttpClientBenchmark
{
    private const string BaseUrl = "http://localhost:5035";
    private const string Socket = @"C:\repos\temp\temp74\bin\Release\net10.0\test.socket";

    private TelemetryData _telemetryData;
    private ApiWebRequestFactory _apiWebRequestFactory;
    private Uri _apiEndpointUri;

#if NETCOREAPP3_1_OR_GREATER
    private HttpClientRequestFactory _httpClientRequestFactory;
    private Uri _httpClientEndpointUri;

    private HttpStreamRequestFactory _httpStreamRequestFactory;
    private Uri _httpStreamEndpointUri;
#endif
#if NET5_0_OR_GREATER
    private SocketHandlerRequestFactory _socketHandlerRequestFactory;
    private Uri _socketHandlerEndpointUri;
#endif

    [GlobalSetup]
    public void GlobalSetup()
    {
        _telemetryData = GetData();

        var config = TracerHelper.DefaultConfig;
        config.Add(ConfigurationKeys.TraceEnabled, false);
        var settings = TracerSettings.Create(config);

        _apiWebRequestFactory = new ApiWebRequestFactory(new Uri(BaseUrl), AgentHttpHeaderNames.MinimalHeaders);
        _apiEndpointUri = _apiWebRequestFactory.GetEndpoint("/");

#if NETCOREAPP3_1_OR_GREATER
        _httpClientRequestFactory = new HttpClientRequestFactory(new Uri(BaseUrl), AgentHttpHeaderNames.MinimalHeaders);
        _httpClientEndpointUri = _httpClientRequestFactory.GetEndpoint("/");

        _httpStreamRequestFactory = new HttpStreamRequestFactory(
            new UnixDomainSocketStreamFactory(Socket),
            new DatadogHttpClient(new MinimalAgentHeaderHelper()),
            new Uri(BaseUrl));
        _httpStreamEndpointUri = _httpStreamRequestFactory.GetEndpoint("/");
#endif

#if NET5_0_OR_GREATER
        _socketHandlerRequestFactory = new SocketHandlerRequestFactory(
            new UnixDomainSocketStreamFactory(Socket),
            AgentHttpHeaderNames.MinimalHeaders,
            new Uri(BaseUrl));
        _socketHandlerEndpointUri = _socketHandlerRequestFactory.GetEndpoint("/");
#endif
    }

    [GlobalCleanup]
    public void GlobalCleanup()
    {
    }

    [BenchmarkCategory("ApiWebRequest", "Uncompressed"), Benchmark(Baseline = true)]
    public async Task<int> ApiWebRequest_Before()
    {
        var request = _apiWebRequestFactory.Create(_apiEndpointUri);
        var data = SerializeTelemetry(_telemetryData);
        using var response = await request.PostAsync(new ArraySegment<byte>(Encoding.UTF8.GetBytes(data)), "application/json", contentEncoding: null).ConfigureAwait(false);
        return response.StatusCode;
    }

    [BenchmarkCategory("ApiWebRequest", "Gzip"), Benchmark(Baseline = true)]
    public async Task<int> ApiWebRequest_Before_Gzip()
    {
        var request = _apiWebRequestFactory.Create(_apiEndpointUri);
        var data = SerializeTelemetryWithGzip(_telemetryData);
        using var response = await request.PostAsync(new ArraySegment<byte>(data), "application/json", contentEncoding: "gzip").ConfigureAwait(false);
        return response.StatusCode;
    }

    [BenchmarkCategory("ApiWebRequest", "Uncompressed"), Benchmark]
    public async Task<int> ApiWebRequest_After()
    {
        var request = _apiWebRequestFactory.Create(_apiEndpointUri);
        using var response = await request.PostAsJsonAsync(_telemetryData, compression: MultipartCompression.None).ConfigureAwait(false);
        return response.StatusCode;
    }

    [BenchmarkCategory("ApiWebRequest", "Gzip"), Benchmark]
    public async Task<int> ApiWebRequest_After_Gzip()
    {
        var request = _apiWebRequestFactory.Create(_apiEndpointUri);
        using var response = await request.PostAsJsonAsync(_telemetryData, compression: MultipartCompression.GZip).ConfigureAwait(false);
        return response.StatusCode;
    }

#if NETCOREAPP3_1_OR_GREATER

    [BenchmarkCategory("HttpClient", "Uncompressed"), Benchmark(Baseline = true)]
    public async Task<int> HttpClient_Before()
    {
        var request = _httpClientRequestFactory.Create(_httpClientEndpointUri);
        var data = SerializeTelemetry(_telemetryData);
        using var response = await request.PostAsync(new ArraySegment<byte>(Encoding.UTF8.GetBytes(data)), "application/json", contentEncoding: null).ConfigureAwait(false);
        return response.StatusCode;
    }

    [BenchmarkCategory("HttpClient", "Gzip"), Benchmark(Baseline = true)]
    public async Task<int> HttpClient_Before_Gzip()
    {
        var request = _httpClientRequestFactory.Create(_httpClientEndpointUri);
        var data = SerializeTelemetryWithGzip(_telemetryData);
        using var response = await request.PostAsync(new ArraySegment<byte>(data), "application/json", contentEncoding: "gzip").ConfigureAwait(false);
        return response.StatusCode;
    }

    [BenchmarkCategory("HttpClient", "Uncompressed"), Benchmark]
    public async Task<int> HttpClient_After()
    {
        var request = _httpClientRequestFactory.Create(_httpClientEndpointUri);
        using var response = await request.PostAsJsonAsync(_telemetryData, compression: MultipartCompression.None).ConfigureAwait(false);
        return response.StatusCode;
    }

    [BenchmarkCategory("HttpClient", "Gzip"), Benchmark]
    public async Task<int> HttpClient_After_Gzip()
    {
        var request = _httpClientRequestFactory.Create(_httpClientEndpointUri);
        using var response = await request.PostAsJsonAsync(_telemetryData, compression: MultipartCompression.GZip).ConfigureAwait(false);
        return response.StatusCode;
    }
#endif

#if NETCOREAPP3_1_OR_GREATER
    [BenchmarkCategory("HttpStream", "Uncompressed"), Benchmark(Baseline = true)]
    public async Task<int> HttpStream_Before()
    {
        var request = _httpStreamRequestFactory.Create(_httpStreamEndpointUri);
        var data = SerializeTelemetry(_telemetryData);
        using var response = await request.PostAsync(new ArraySegment<byte>(Encoding.UTF8.GetBytes(data)), "application/json", contentEncoding: null).ConfigureAwait(false);
        return response.StatusCode;
    }

    [BenchmarkCategory("HttpStream", "Gzip"), Benchmark(Baseline = true)]
    public async Task<int> HttpStream_Before_Gzip()
    {
        var request = _httpStreamRequestFactory.Create(_httpStreamEndpointUri);
        var data = SerializeTelemetryWithGzip(_telemetryData);
        using var response = await request.PostAsync(new ArraySegment<byte>(data), "application/json", contentEncoding: "gzip").ConfigureAwait(false);
        return response.StatusCode;
    }

    [BenchmarkCategory("HttpStream", "Uncompressed"), Benchmark]
    public async Task<int> HttpStream_After()
    {
        var request = _httpStreamRequestFactory.Create(_httpStreamEndpointUri);
        using var response = await request.PostAsJsonAsync(_telemetryData, compression: MultipartCompression.None).ConfigureAwait(false);
        return response.StatusCode;
    }

    [BenchmarkCategory("HttpStream", "Gzip"), Benchmark]
    public async Task<int> HttpStream_After_Gzip()
    {
        var request = _httpStreamRequestFactory.Create(_httpStreamEndpointUri);
        using var response = await request.PostAsJsonAsync(_telemetryData, compression: MultipartCompression.GZip).ConfigureAwait(false);
        return response.StatusCode;
    }
#endif

#if NET5_0_OR_GREATER
    [BenchmarkCategory("SocketHandler", "Uncompressed"), Benchmark(Baseline = true)]
    public async Task<int> SocketHandler_Before()
    {
        var request = _socketHandlerRequestFactory.Create(_socketHandlerEndpointUri);
        var data = SerializeTelemetry(_telemetryData);
        using var response = await request.PostAsync(new ArraySegment<byte>(Encoding.UTF8.GetBytes(data)), "application/json", contentEncoding: null).ConfigureAwait(false);
        return response.StatusCode;
    }

    [BenchmarkCategory("SocketHandler", "Gzip"), Benchmark(Baseline = true)]
    public async Task<int> SocketHandler_Before_Gzip()
    {
        var request = _socketHandlerRequestFactory.Create(_socketHandlerEndpointUri);
        var data = SerializeTelemetryWithGzip(_telemetryData);
        using var response = await request.PostAsync(new ArraySegment<byte>(data), "application/json", contentEncoding: "gzip").ConfigureAwait(false);
        return response.StatusCode;
    }

    [BenchmarkCategory("SocketHandler", "Uncompressed"), Benchmark]
    public async Task<int> SocketHandler_After()
    {
        var request = _socketHandlerRequestFactory.Create(_socketHandlerEndpointUri);
        using var response = await request.PostAsJsonAsync(_telemetryData, compression: MultipartCompression.None).ConfigureAwait(false);
        return response.StatusCode;
    }

    [BenchmarkCategory("SocketHandler", "Gzip"), Benchmark]
    public async Task<int> SocketHandler_After_Gzip()
    {
        var request = _socketHandlerRequestFactory.Create(_socketHandlerEndpointUri);
        using var response = await request.PostAsJsonAsync(_telemetryData, compression: MultipartCompression.GZip).ConfigureAwait(false);
        return response.StatusCode;
    }
#endif

    internal static string SerializeTelemetry<T>(T data) => JsonConvert.SerializeObject(data, Formatting.None, JsonTelemetryTransport.SerializerSettings);

    internal static byte[] SerializeTelemetryWithGzip<T>(T data)
    {
        using var memStream = new MemoryStream();
        // leaveOpen: true so memStream survives; disposing the GZipStream at
        // the end of this block flushes the gzip footer before ToArray().
        using (var zipStream = new GZipStream(memStream, CompressionMode.Compress, leaveOpen: true))
        {
            using var streamWriter = new StreamWriter(zipStream);
            using var jsonWriter = new JsonTextWriter(streamWriter);
            var serializer = new JsonSerializer
            {
                NullValueHandling = NullValueHandling.Ignore,
                ContractResolver = new DefaultContractResolver { NamingStrategy = new SnakeCaseNamingStrategy() },
                Formatting = Formatting.None,
            };

            serializer.Serialize(jsonWriter, data);
        }

        return memStream.ToArray();
    }

    private TelemetryData GetData() =>
        new TelemetryData(
            requestType: TelemetryRequestTypes.GenerateMetrics,
            runtimeId: "20338dfd-f700-4e5c-b3f6-0d470f054ae8",
            seqId: 5672,
            tracerTime: 1628099086,
            application: new ApplicationTelemetryData(
                serviceName: "myapp",
                env: "prod",
                serviceVersion: "1.2.3",
                tracerVersion: "0.33.1",
                languageName: "node.js",
                languageVersion: "14.16.1",
                runtimeName: "dotnet",
                runtimeVersion: "7.0.3",
                commitSha: "testCommitSha",
                repositoryUrl: "testRepositoryUrl",
                processTags: "entrypoint.basedir:Users,entrypoint.workdir:Downloads"),
            host: new HostTelemetryData(
                hostname: "i-09ecf74c319c49be8",
                os: "GNU/Linux",
                architecture: "x86_64")
            {
                OsVersion = "ubuntu 18.04.5 LTS (Bionic Beaver)",
                KernelName = "Linux",
                KernelRelease = "5.4.0-1037-gcp",
                KernelVersion = "#40~18.04.1-Ubuntu SMP Fri Feb 5 15:41:35 UTC 2021"
            },
            payload: new GenerateMetricsPayload(
                new MetricData[]
                {
                    new(
                        "tracer_init_time",
                        new MetricSeries()
                        {
                            new(1575317847, 2241),
                            new(1575317947, 2352),
                        },
                        common: true,
                        type: MetricTypeConstants.Count)
                    {
                        Tags = new[]
                        {
                            "org_id: 2",
                            "environment:test"
                        }
                    },
                    new(
                        "app_sec_initialization_time",
                        new MetricSeries()
                        {
                            new(1575317447, 254),
                            new(1575317547, 643),
                        },
                        common: false,
                        type: MetricTypeConstants.Gauge)
                    {
                        Namespace = MetricNamespaceConstants.ASM,
                        Interval = 60,
                    },
                }));
}

Other details

Part of a small stack

@dd-trace-dotnet-ci-bot bot commented Dec 30, 2025

Execution-Time Benchmarks Report ⏱️

Execution-time results for samples comparing This PR (8019) and master.

✅ No regressions detected - check the details below

Full Metrics Comparison

FakeDbCommand

| Metric | Master (Mean ± 95% CI) | Current (Mean ± 95% CI) | Change | Status |
|---|---|---|---|---|
| **.NET Framework 4.8 - Baseline** | | | | |
| duration | 68.16 ± (68.18 - 68.39) ms | 68.43 ± (68.47 - 68.65) ms | +0.4% | ✅⬆️ |
| **.NET Framework 4.8 - Bailout** | | | | |
| duration | 72.18 ± (72.02 - 72.30) ms | 72.05 ± (71.98 - 72.19) ms | -0.2% | |
| **.NET Framework 4.8 - CallTarget+Inlining+NGEN** | | | | |
| duration | 1004.61 ± (1008.90 - 1017.71) ms | 1005.02 ± (1011.20 - 1020.56) ms | +0.0% | ✅⬆️ |
| **.NET Core 3.1 - Baseline** | | | | |
| process.internal_duration_ms | 21.97 ± (21.93 - 22.01) ms | 21.96 ± (21.94 - 21.99) ms | -0.0% | |
| process.time_to_main_ms | 78.42 ± (78.26 - 78.57) ms | 78.89 ± (78.72 - 79.07) ms | +0.6% | ✅⬆️ |
| runtime.dotnet.exceptions.count | 0 ± (0 - 0) | 0 ± (0 - 0) | +0.0% | |
| runtime.dotnet.mem.committed | 10.90 ± (10.90 - 10.91) MB | 10.92 ± (10.91 - 10.92) MB | +0.1% | ✅⬆️ |
| runtime.dotnet.threads.count | 12 ± (12 - 12) | 12 ± (12 - 12) | +0.0% | |
| **.NET Core 3.1 - Bailout** | | | | |
| process.internal_duration_ms | 21.92 ± (21.90 - 21.94) ms | 21.91 ± (21.89 - 21.93) ms | -0.1% | |
| process.time_to_main_ms | 79.74 ± (79.65 - 79.84) ms | 79.93 ± (79.82 - 80.04) ms | +0.2% | ✅⬆️ |
| runtime.dotnet.exceptions.count | 0 ± (0 - 0) | 0 ± (0 - 0) | +0.0% | |
| runtime.dotnet.mem.committed | 10.95 ± (10.94 - 10.95) MB | 10.96 ± (10.95 - 10.96) MB | +0.1% | ✅⬆️ |
| runtime.dotnet.threads.count | 13 ± (13 - 13) | 13 ± (13 - 13) | +0.0% | |
| **.NET Core 3.1 - CallTarget+Inlining+NGEN** | | | | |
| process.internal_duration_ms | 241.58 ± (237.45 - 245.71) ms | 251.85 ± (247.74 - 255.95) ms | +4.2% | ✅⬆️ |
| process.time_to_main_ms | 469.82 ± (469.43 - 470.20) ms | 472.26 ± (471.78 - 472.73) ms | +0.5% | ✅⬆️ |
| runtime.dotnet.exceptions.count | 0 ± (0 - 0) | 0 ± (0 - 0) | +0.0% | |
| runtime.dotnet.mem.committed | 48.14 ± (48.12 - 48.16) MB | 48.19 ± (48.16 - 48.21) MB | +0.1% | ✅⬆️ |
| runtime.dotnet.threads.count | 28 ± (28 - 28) | 28 ± (28 - 28) | -0.1% | |
| **.NET 6 - Baseline** | | | | |
| process.internal_duration_ms | 20.61 ± (20.58 - 20.64) ms | 20.55 ± (20.52 - 20.59) ms | -0.3% | |
| process.time_to_main_ms | 67.90 ± (67.78 - 68.02) ms | 68.26 ± (68.14 - 68.38) ms | +0.5% | ✅⬆️ |
| runtime.dotnet.exceptions.count | 0 ± (0 - 0) | 0 ± (0 - 0) | +0.0% | |
| runtime.dotnet.mem.committed | 10.61 ± (10.61 - 10.61) MB | 10.64 ± (10.63 - 10.64) MB | +0.2% | ✅⬆️ |
| runtime.dotnet.threads.count | 10 ± (10 - 10) | 10 ± (10 - 10) | +0.0% | |
| **.NET 6 - Bailout** | | | | |
| process.internal_duration_ms | 20.52 ± (20.48 - 20.55) ms | 20.51 ± (20.48 - 20.53) ms | -0.0% | |
| process.time_to_main_ms | 68.72 ± (68.66 - 68.78) ms | 69.24 ± (69.17 - 69.31) ms | +0.8% | ✅⬆️ |
| runtime.dotnet.exceptions.count | 0 ± (0 - 0) | 0 ± (0 - 0) | +0.0% | |
| runtime.dotnet.mem.committed | 10.66 ± (10.66 - 10.67) MB | 10.75 ± (10.74 - 10.75) MB | +0.8% | ✅⬆️ |
| runtime.dotnet.threads.count | 11 ± (11 - 11) | 11 ± (11 - 11) | +0.0% | |
| **.NET 6 - CallTarget+Inlining+NGEN** | | | | |
| process.internal_duration_ms | 243.74 ± (241.62 - 245.86) ms | 248.51 ± (247.50 - 249.51) ms | +2.0% | ✅⬆️ |
| process.time_to_main_ms | 439.37 ± (438.95 - 439.79) ms | 440.52 ± (440.09 - 440.95) ms | +0.3% | ✅⬆️ |
| runtime.dotnet.exceptions.count | 0 ± (0 - 0) | 0 ± (0 - 0) | +0.0% | |
| runtime.dotnet.mem.committed | 48.69 ± (48.66 - 48.72) MB | 48.63 ± (48.60 - 48.66) MB | -0.1% | |
| runtime.dotnet.threads.count | 28 ± (28 - 28) | 28 ± (28 - 28) | +0.0% | ✅⬆️ |
| **.NET 8 - Baseline** | | | | |
| process.internal_duration_ms | 18.86 ± (18.83 - 18.89) ms | 18.81 ± (18.79 - 18.84) ms | -0.2% | |
| process.time_to_main_ms | 66.96 ± (66.86 - 67.07) ms | 67.32 ± (67.20 - 67.45) ms | +0.5% | ✅⬆️ |
| runtime.dotnet.exceptions.count | 0 ± (0 - 0) | 0 ± (0 - 0) | +0.0% | |
| runtime.dotnet.mem.committed | 7.67 ± (7.66 - 7.68) MB | 7.67 ± (7.66 - 7.67) MB | -0.0% | |
| runtime.dotnet.threads.count | 10 ± (10 - 10) | 10 ± (10 - 10) | +0.0% | |
| **.NET 8 - Bailout** | | | | |
| process.internal_duration_ms | 18.87 ± (18.84 - 18.89) ms | 18.88 ± (18.86 - 18.91) ms | +0.1% | ✅⬆️ |
| process.time_to_main_ms | 68.14 ± (68.09 - 68.20) ms | 68.37 ± (68.31 - 68.42) ms | +0.3% | ✅⬆️ |
| runtime.dotnet.exceptions.count | 0 ± (0 - 0) | 0 ± (0 - 0) | +0.0% | |
| runtime.dotnet.mem.committed | 7.73 ± (7.72 - 7.74) MB | 7.71 ± (7.71 - 7.72) MB | -0.2% | |
| runtime.dotnet.threads.count | 11 ± (11 - 11) | 11 ± (11 - 11) | +0.0% | |
| **.NET 8 - CallTarget+Inlining+NGEN** | | | | |
| process.internal_duration_ms | 179.43 ± (178.40 - 180.47) ms | 179.22 ± (178.26 - 180.18) ms | -0.1% | |
| process.time_to_main_ms | 426.67 ± (426.07 - 427.28) ms | 426.71 ± (426.12 - 427.31) ms | +0.0% | ✅⬆️ |
| runtime.dotnet.exceptions.count | 0 ± (0 - 0) | 0 ± (0 - 0) | +0.0% | |
| runtime.dotnet.mem.committed | 36.33 ± (36.30 - 36.36) MB | 36.35 ± (36.32 - 36.38) MB | +0.1% | ✅⬆️ |
| runtime.dotnet.threads.count | 27 ± (27 - 27) | 27 ± (27 - 27) | +0.1% | ✅⬆️ |

HttpMessageHandler

| Metric | Master (Mean ± 95% CI) | Current (Mean ± 95% CI) | Change | Status |
|---|---|---|---|---|
| **.NET Framework 4.8 - Baseline** | | | | |
| duration | 193.33 ± (193.49 - 194.38) ms | 192.95 ± (192.77 - 193.44) ms | -0.2% | |
| **.NET Framework 4.8 - Bailout** | | | | |
| duration | 196.51 ± (196.65 - 197.38) ms | 197.21 ± (196.99 - 197.54) ms | +0.4% | ✅⬆️ |
| **.NET Framework 4.8 - CallTarget+Inlining+NGEN** | | | | |
| duration | 1105.95 ± (1108.43 - 1116.17) ms | 1111.73 ± (1112.54 - 1120.15) ms | +0.5% | ✅⬆️ |
| **.NET Core 3.1 - Baseline** | | | | |
| process.internal_duration_ms | 187.65 ± (187.23 - 188.07) ms | 189.20 ± (188.83 - 189.58) ms | +0.8% | ✅⬆️ |
| process.time_to_main_ms | 80.43 ± (80.23 - 80.62) ms | 81.40 ± (81.14 - 81.65) ms | +1.2% | ✅⬆️ |
| runtime.dotnet.exceptions.count | 3 ± (3 - 3) | 3 ± (3 - 3) | +0.0% | |
| runtime.dotnet.mem.committed | 16.07 ± (16.04 - 16.10) MB | 16.11 ± (16.08 - 16.13) MB | +0.2% | ✅⬆️ |
| runtime.dotnet.threads.count | 20 ± (20 - 20) | 20 ± (19 - 20) | -0.2% | |
| **.NET Core 3.1 - Bailout** | | | | |
| process.internal_duration_ms | 187.09 ± (186.83 - 187.35) ms | 188.08 ± (187.64 - 188.51) ms | +0.5% | ✅⬆️ |
| process.time_to_main_ms | 81.80 ± (81.68 - 81.93) ms | 82.27 ± (82.12 - 82.43) ms | +0.6% | ✅⬆️ |
| runtime.dotnet.exceptions.count | 3 ± (3 - 3) | 3 ± (3 - 3) | +0.0% | |
| runtime.dotnet.mem.committed | 16.17 ± (16.14 - 16.19) MB | 16.18 ± (16.16 - 16.21) MB | +0.1% | ✅⬆️ |
| runtime.dotnet.threads.count | 21 ± (21 - 21) | 21 ± (21 - 21) | -0.1% | |
| **.NET Core 3.1 - CallTarget+Inlining+NGEN** | | | | |
| process.internal_duration_ms | 423.39 ± (420.21 - 426.57) ms | 427.43 ± (424.17 - 430.68) ms | +1.0% | ✅⬆️ |
| process.time_to_main_ms | 472.31 ± (471.64 - 472.97) ms | 475.70 ± (475.02 - 476.39) ms | +0.7% | ✅⬆️ |
| runtime.dotnet.exceptions.count | 3 ± (3 - 3) | 3 ± (3 - 3) | +0.0% | |
| runtime.dotnet.mem.committed | 58.81 ± (58.70 - 58.92) MB | 58.60 ± (58.48 - 58.72) MB | -0.4% | |
| runtime.dotnet.threads.count | 29 ± (29 - 30) | 29 ± (29 - 30) | -0.0% | |
| **.NET 6 - Baseline** | | | | |
| process.internal_duration_ms | 191.89 ± (191.51 - 192.28) ms | 192.45 ± (192.03 - 192.86) ms | +0.3% | ✅⬆️ |
| process.time_to_main_ms | 69.81 ± (69.67 - 69.95) ms | 69.86 ± (69.69 - 70.04) ms | +0.1% | ✅⬆️ |
| runtime.dotnet.exceptions.count | 4 ± (4 - 4) | 4 ± (4 - 4) | +0.0% | |
| runtime.dotnet.mem.committed | 16.28 ± (16.18 - 16.37) MB | 16.22 ± (16.12 - 16.33) MB | -0.3% | |
| runtime.dotnet.threads.count | 19 ± (19 - 19) | 19 ± (19 - 19) | -0.3% | |
| **.NET 6 - Bailout** | | | | |
| process.internal_duration_ms | 191.02 ± (190.79 - 191.25) ms | 191.06 ± (190.73 - 191.38) ms | +0.0% | ✅⬆️ |
| process.time_to_main_ms | 70.83 ± (70.72 - 70.93) ms | 70.81 ± (70.71 - 70.90) ms | -0.0% | |
| runtime.dotnet.exceptions.count | 4 ± (4 - 4) | 4 ± (4 - 4) | +0.0% | |
| runtime.dotnet.mem.committed | 16.22 ± (16.08 - 16.35) MB | 16.07 ± (15.93 - 16.22) MB | -0.9% | |
| runtime.dotnet.threads.count | 19 ± (19 - 19) | 19 ± (19 - 20) | +0.2% | ✅⬆️ |
| **.NET 6 - CallTarget+Inlining+NGEN** | | | | |
| process.internal_duration_ms | 455.89 ± (454.05 - 457.74) ms | 456.21 ± (454.43 - 457.99) ms | +0.1% | ✅⬆️ |
| process.time_to_main_ms | 444.94 ± (444.42 - 445.46) ms | 445.82 ± (445.32 - 446.32) ms | +0.2% | ✅⬆️ |
| runtime.dotnet.exceptions.count | 4 ± (4 - 4) | 4 ± (4 - 4) | +0.0% | |
| runtime.dotnet.mem.committed | 58.42 ± (58.30 - 58.54) MB | 58.11 ± (58.01 - 58.21) MB | -0.5% | |
| runtime.dotnet.threads.count | 29 ± (29 - 30) | 30 ± (29 - 30) | +0.2% | ✅⬆️ |
| **.NET 8 - Baseline** | | | | |
| process.internal_duration_ms | 190.41 ± (190.01 - 190.81) ms | 190.08 ± (189.77 - 190.38) ms | -0.2% | |
| process.time_to_main_ms | 69.42 ± (69.24 - 69.59) ms | 69.35 ± (69.19 - 69.50) ms | -0.1% | |
| runtime.dotnet.exceptions.count | 4 ± (4 - 4) | 4 ± (4 - 4) | +0.0% | |
| runtime.dotnet.mem.committed | 11.78 ± (11.75 - 11.81) MB | 11.75 ± (11.71 - 11.78) MB | -0.2% | |
| runtime.dotnet.threads.count | 18 ± (18 - 18) | 18 ± (18 - 18) | -0.1% | |
| **.NET 8 - Bailout** | | | | |
| process.internal_duration_ms | 189.67 ± (189.39 - 189.96) ms | 189.52 ± (189.28 - 189.75) ms | -0.1% | |
| process.time_to_main_ms | 70.53 ± (70.39 - 70.67) ms | 70.38 ± (70.30 - 70.47) ms | -0.2% | |
| runtime.dotnet.exceptions.count | 4 ± (4 - 4) | 4 ± (4 - 4) | +0.0% | |
| runtime.dotnet.mem.committed | 11.81 ± (11.78 - 11.84) MB | 11.80 ± (11.77 - 11.83) MB | -0.1% | |
| runtime.dotnet.threads.count | 19 ± (19 - 19) | 19 ± (19 - 19) | +0.1% | ✅⬆️ |
| **.NET 8 - CallTarget+Inlining+NGEN** | | | | |
| process.internal_duration_ms | 362.18 ± (360.73 - 363.63) ms | 364.83 ± (363.26 - 366.39) ms | +0.7% | ✅⬆️ |
| process.time_to_main_ms | 428.11 ± (427.57 - 428.66) ms | 429.50 ± (428.78 - 430.22) ms | +0.3% | ✅⬆️ |
| runtime.dotnet.exceptions.count | 4 ± (4 - 4) | 4 ± (4 - 4) | +0.0% | |
| runtime.dotnet.mem.committed | 47.95 ± (47.92 - 47.98) MB | 47.97 ± (47.93 - 48.00) MB | +0.0% | ✅⬆️ |
| runtime.dotnet.threads.count | 29 ± (29 - 29) | 29 ± (29 - 29) | +0.2% | ✅⬆️ |
Comparison explanation

Execution-time benchmarks measure the whole time it takes to execute a program, and are intended to measure the one-off costs. Cases where the execution time results for the PR are worse than latest master results are highlighted in **red**. The following thresholds were used for comparing the execution times:

  • Welch test with statistical test for significance of 5%
  • Only results indicating a difference greater than 5% and 5 ms are considered.

Note that these results are based on a single point-in-time result for each branch. For full results, see the dashboard.

Graphs show the p99 interval based on the mean and StdDev of the test run, as well as the mean value of the run (shown as a diamond below the graph).

Duration charts
FakeDbCommand (.NET Framework 4.8)
gantt
    title Execution time (ms) FakeDbCommand (.NET Framework 4.8)
    dateFormat  x
    axisFormat %Q
    todayMarker off
    section Baseline
    This PR (8019) - mean (69ms)  : 67, 70
    master - mean (68ms)  : 67, 70

    section Bailout
    This PR (8019) - mean (72ms)  : 71, 73
    master - mean (72ms)  : 71, 74

    section CallTarget+Inlining+NGEN
    This PR (8019) - mean (1,016ms)  : 945, 1087
    master - mean (1,013ms)  : 948, 1078

FakeDbCommand (.NET Core 3.1)
gantt
    title Execution time (ms) FakeDbCommand (.NET Core 3.1)
    dateFormat  x
    axisFormat %Q
    todayMarker off
    section Baseline
    This PR (8019) - mean (106ms)  : 104, 108
    master - mean (105ms)  : 103, 108

    section Bailout
    This PR (8019) - mean (107ms)  : 106, 108
    master - mean (107ms)  : 105, 108

    section CallTarget+Inlining+NGEN
    This PR (8019) - mean (749ms)  : 690, 809
    master - mean (735ms)  : 667, 804

FakeDbCommand (.NET 6)
gantt
    title Execution time (ms) FakeDbCommand (.NET 6)
    dateFormat  x
    axisFormat %Q
    todayMarker off
    section Baseline
    This PR (8019) - mean (94ms)  : 92, 96
    master - mean (93ms)  : 91, 95

    section Bailout
    This PR (8019) - mean (94ms)  : 93, 96
    master - mean (94ms)  : 93, 95

    section CallTarget+Inlining+NGEN
    This PR (8019) - mean (714ms)  : 691, 738
    master - mean (708ms)  : 671, 745

FakeDbCommand (.NET 8)
gantt
    title Execution time (ms) FakeDbCommand (.NET 8)
    dateFormat  x
    axisFormat %Q
    todayMarker off
    section Baseline
    This PR (8019) - mean (92ms)  : 90, 94
    master - mean (92ms)  : 90, 94

    section Bailout
    This PR (8019) - mean (93ms)  : 92, 94
    master - mean (93ms)  : 92, 94

    section CallTarget+Inlining+NGEN
    This PR (8019) - mean (634ms)  : 621, 648
    master - mean (634ms)  : 617, 650

HttpMessageHandler (.NET Framework 4.8)
gantt
    title Execution time (ms) HttpMessageHandler (.NET Framework 4.8)
    dateFormat  x
    axisFormat %Q
    todayMarker off
    section Baseline
    This PR (8019) - mean (193ms)  : 190, 196
    master - mean (194ms)  : 190, 198

    section Bailout
    This PR (8019) - mean (197ms)  : 195, 200
    master - mean (197ms)  : 193, 201

    section CallTarget+Inlining+NGEN
    This PR (8019) - mean (1,116ms)  : 1061, 1171
    master - mean (1,112ms)  : 1054, 1170

HttpMessageHandler (.NET Core 3.1)
gantt
    title Execution time (ms) HttpMessageHandler (.NET Core 3.1)
    dateFormat  x
    axisFormat %Q
    todayMarker off
    section Baseline
    This PR (8019) - mean (279ms)  : 273, 286
    master - mean (276ms)  : 272, 281

    section Bailout
    This PR (8019) - mean (278ms)  : 273, 283
    master - mean (277ms)  : 274, 280

    section CallTarget+Inlining+NGEN
    This PR (8019) - mean (935ms)  : 885, 984
    master - mean (925ms)  : 883, 967

HttpMessageHandler (.NET 6)
gantt
    title Execution time (ms) HttpMessageHandler (.NET 6)
    dateFormat  x
    axisFormat %Q
    todayMarker off
    section Baseline
    This PR (8019) - mean (271ms)  : 263, 278
    master - mean (270ms)  : 266, 274

    section Bailout
    This PR (8019) - mean (270ms)  : 267, 274
    master - mean (270ms)  : 267, 273

    section CallTarget+Inlining+NGEN
    This PR (8019) - mean (930ms)  : 896, 965
    master - mean (931ms)  : 898, 964

HttpMessageHandler (.NET 8)
gantt
    title Execution time (ms) HttpMessageHandler (.NET 8)
    dateFormat  x
    axisFormat %Q
    todayMarker off
    section Baseline
    This PR (8019) - mean (269ms)  : 265, 274
    master - mean (269ms)  : 264, 275

    section Bailout
    This PR (8019) - mean (269ms)  : 266, 272
    master - mean (270ms)  : 266, 274

    section CallTarget+Inlining+NGEN
    This PR (8019) - mean (824ms)  : 803, 845
    master - mean (822ms)  : 807, 837


@andrewlock force-pushed the andrew/optimize-json-post branch from d45ddc7 to cdfa351 on December 30, 2025 11:08
@andrewlock added the `area:tracer` (The core tracer library (Datadog.Trace, does not include OpenTracing, native code, or integrations)) and `type:performance` (Performance, speed, latency, resource usage (CPU, memory)) labels on Dec 30, 2025
@andrewlock andrewlock marked this pull request as ready for review December 30, 2025 11:32
@andrewlock andrewlock requested a review from a team as a code owner December 30, 2025 11:32
@datadog-datadog-prod-us1 bot commented Dec 30, 2025

⚠️ Tests


⚠️ Warnings

❄️ 1 New flaky test detected

IpcClientTest from Datadog.Trace.Tests.Ci.Ipc.IpcTests (Datadog)
Timeout waiting for messages. Values went up to [20, 20]

ℹ️ Info

🧪 All tests passed

This comment will be updated automatically if new data arrives.
🔗 Commit SHA: cdfa351

@pr-commenter bot commented Dec 30, 2025

Benchmarks

Benchmark execution time: 2025-12-30 12:08:12

Comparing candidate commit cdfa351 in PR branch andrew/optimize-json-post with baseline commit 4fb09d3 in branch master.

Found 9 performance improvements and 6 performance regressions! Performance is the same for 161 metrics, 10 unstable metrics.

scenario:Benchmarks.Trace.ActivityBenchmark.StartStopWithChild net6.0

  • 🟥 execution_time [+17.561ms; +23.762ms] or [+9.237%; +12.498%]

scenario:Benchmarks.Trace.Asm.AppSecBodyBenchmark.AllCycleMoreComplexBody net6.0

  • 🟩 execution_time [-17.652ms; -11.977ms] or [-8.218%; -5.576%]

scenario:Benchmarks.Trace.AspNetCoreBenchmark.SendRequest net6.0

  • 🟩 execution_time [-37.876ms; -36.221ms] or [-28.350%; -27.112%]

scenario:Benchmarks.Trace.CIVisibilityProtocolWriterBenchmark.WriteAndFlushEnrichedTraces net472

  • 🟩 execution_time [-61.261ms; -56.183ms] or [-25.960%; -23.809%]
  • 🟩 throughput [+301.169op/s; +327.906op/s] or [+31.719%; +34.535%]

scenario:Benchmarks.Trace.CIVisibilityProtocolWriterBenchmark.WriteAndFlushEnrichedTraces net6.0

  • 🟩 execution_time [-17.513ms; -12.267ms] or [-9.751%; -6.830%]
  • 🟩 throughput [+100.402op/s; +144.209op/s] or [+7.507%; +10.783%]

scenario:Benchmarks.Trace.CIVisibilityProtocolWriterBenchmark.WriteAndFlushEnrichedTraces netcoreapp3.1

  • 🟥 execution_time [+20.473ms; +26.066ms] or [+14.150%; +18.015%]
  • 🟥 throughput [-250.825op/s; -193.615op/s] or [-16.119%; -12.443%]

scenario:Benchmarks.Trace.ElasticsearchBenchmark.CallElasticsearchAsync netcoreapp3.1

  • 🟥 throughput [-45985.892op/s; -38044.203op/s] or [-10.585%; -8.757%]

scenario:Benchmarks.Trace.Iast.StringAspectsBenchmark.StringConcatAspectBenchmark netcoreapp3.1

  • 🟥 throughput [-251.379op/s; -113.368op/s] or [-11.940%; -5.385%]

scenario:Benchmarks.Trace.Log4netBenchmark.EnrichedLog netcoreapp3.1

  • 🟩 execution_time [-33.361ms; -29.147ms] or [-16.527%; -14.440%]

scenario:Benchmarks.Trace.SpanBenchmark.StartFinishScope net6.0

  • 🟩 execution_time [-12.549ms; -11.532ms] or [-5.837%; -5.364%]

scenario:Benchmarks.Trace.SpanBenchmark.StartFinishSpan netcoreapp3.1

  • 🟥 execution_time [+12.029ms; +17.411ms] or [+6.104%; +8.835%]

scenario:Benchmarks.Trace.SpanBenchmark.StartFinishTwoScopes netcoreapp3.1

  • 🟩 execution_time [-19.857ms; -13.900ms] or [-9.378%; -6.564%]

@bouwkast (Collaborator) left a comment:

Nice 👍

}

private TelemetryData GetData() =>
new TelemetryData(
Collaborator:

Where did this telemetry data come from? Just wondering

Member (Author):

I just copy-pasted it from some other tests that use the same thing as a "somewhat representative example"

@zacharycmontoya (Contributor) left a comment:

LGTM

@andrewlock andrewlock merged commit c865b1b into master Jan 9, 2026
153 checks passed
@andrewlock andrewlock deleted the andrew/optimize-json-post branch January 9, 2026 12:10
@github-actions github-actions bot added this to the vNext-v3 milestone Jan 9, 2026
andrewlock added a commit that referenced this pull request Jan 9, 2026
## Summary of changes

Update telemetry and remote config to use the `PostAsJson<T>` method
introduced in #8019

## Reason for change

As shown in #8019, using the `PostAsJson<T>` has some performance
benefits (for telemetry, when using gzip, _significant_ benefits). It
may also make refactoring to other JSON libraries later somewhat easier.

## Implementation details

Update Telemetry and Remote Config to use the new streaming
`PostAsJson<T>` model, and remove the old implementation

## Test coverage

- Update `MockTracerAgent` to handle chunked encoding in the telemetry
and remote config responses (previously these were requiring a
`Content-Length`, but chunked responses don't have one)
- Add additional unit test to verify above testing fix
- Update other tests that require Content-Length to also allow chunked
encoding

## Other details

Part of a small stack

- #8019
- #8017 👈