Status: Closed
Labels: question (Questions that are neither investigations, bugs, nor enhancements)
Description:
All listeners received from LDS are marked as "active", but none of them actually listen() if at least one of them is waiting for an SDS update.
I believe there are really two issues here:
- All listeners are marked as "active" instead of "warming",
- All listeners are blocked on a single listener in a warming state.
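For illustration, here is a tiny Python model (hypothetical names, not Envoy code) of the per-listener semantics I would expect: a listener with an unresolved SDS dependency stays "warming" on its own, while independent listeners go active and start listening.

```python
# Hypothetical model of expected per-listener warming semantics (not Envoy code).

class Listener:
    def __init__(self, name, needs_secret=False):
        self.name = name
        # A listener with no SDS dependency has nothing to wait for.
        self.secret_ready = not needs_secret

    @property
    def state(self):
        # Expected behavior: state is decided per listener, from its own
        # dependencies only -- not from the state of sibling listeners.
        return "active" if self.secret_ready else "warming"

def states(listeners):
    return {l.name: l.state for l in listeners}

http = Listener("http_listener")
https = Listener("https_listener", needs_secret=True)

# Expected: http_listener is active (and listening) even though
# https_listener is still waiting for its SDS secret.
print(states([http, https]))  # {'http_listener': 'active', 'https_listener': 'warming'}
```

The reported behavior is the opposite on both counts: both listeners report "active", and neither listens.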
Repro steps / Admin and Stats Output:
Start a "fake" xDS server that accepts connections but never responds:
$ nc -l -p 9903 &
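If nc is unavailable, an equivalent stand-in can be sketched in Python (hypothetical helper, not part of the original repro): accept TCP connections but never send a byte, so Envoy's SDS gRPC stream connects and then waits forever.

```python
# Stand-in for `nc -l -p 9903`: accept connections, then deliberately stay silent.
import socket
import threading

def silent_server(port=9903):
    srv = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
    srv.setsockopt(socket.SOL_SOCKET, socket.SO_REUSEADDR, 1)
    srv.bind(("127.0.0.1", port))
    srv.listen()
    open_conns = []  # keep accepted sockets alive so peers stay connected

    def accept_loop():
        while True:
            try:
                conn, _ = srv.accept()
            except OSError:  # listening socket was closed
                return
            open_conns.append(conn)  # accept, never respond

    threading.Thread(target=accept_loop, daemon=True).start()
    return srv  # close the returned socket to stop the server

# Usage: silent_server(); then point xds_cluster at 127.0.0.1:9903.
```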
Start Envoy:
$ bazel-bin/source/exe/envoy-static -c /tmp/configs/config_not_warming.yaml -l trace --service-node node --service-cluster cluster
Verify that the listeners from LDS are configured and marked as active rather than warming:
$ curl -s 127.0.0.1:9901/listeners
["0.0.0.0:9080","0.0.0.0:9443"]
$ curl -s 127.0.0.1:9901/config_dump | jq -r ".configs[].dynamic_active_listeners[]? | .listener.address"
{
  "socket_address": {
    "address": "0.0.0.0",
    "port_value": 9080
  }
}
{
  "socket_address": {
    "address": "0.0.0.0",
    "port_value": 9443
  }
}
$ curl -s 127.0.0.1:9901/config_dump | jq -r ".configs[].dynamic_warming_listeners[]? | .listener.address"
<no output>
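To make this check scriptable, something like the following sketch (field names follow the v2 /config_dump shape shown above; adjust for other API versions) splits dynamic listeners into active and warming sets:

```python
def listener_states(config_dump):
    """Split dynamic listeners in a parsed /config_dump payload into
    (active, warming) name lists. Sketch only; assumes the v2 dump shape."""
    active, warming = [], []
    for cfg in config_dump.get("configs", []):
        for l in cfg.get("dynamic_active_listeners", []):
            active.append(l["listener"]["name"])
        for l in cfg.get("dynamic_warming_listeners", []):
            warming.append(l["listener"]["name"])
    return active, warming

# Live check against a running Envoy admin endpoint (hypothetical usage):
#   import json, urllib.request
#   dump = json.load(urllib.request.urlopen("http://127.0.0.1:9901/config_dump"))
#   active, warming = listener_states(dump)
```

Against the repro above, this would report both listeners as active and none as warming, matching the jq output.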
Try to connect to both listeners; neither is listening:
$ curl 127.0.0.1:9080
curl: (7) Failed to connect to 127.0.0.1 port 9080: Connection refused
$ curl 127.0.0.1:9443
curl: (7) Failed to connect to 127.0.0.1 port 9443: Connection refused
Config:
$ cat /tmp/configs/config_not_warming.yaml
admin:
  access_log_path: /dev/null
  address:
    socket_address:
      address: 127.0.0.1
      port_value: 9901
static_resources:
  clusters:
  - name: default_cluster
    connect_timeout: 1s
    type: STATIC
    lb_policy: ROUND_ROBIN
    load_assignment:
      cluster_name: default_cluster
      endpoints:
      - lb_endpoints:
        - endpoint:
            address:
              socket_address:
                address: 127.0.0.1
                port_value: 9902
  - name: xds_cluster
    connect_timeout: 1s
    type: STATIC
    lb_policy: ROUND_ROBIN
    http2_protocol_options: {}
    load_assignment:
      cluster_name: xds_cluster
      endpoints:
      - lb_endpoints:
        - endpoint:
            address:
              socket_address:
                address: 127.0.0.1
                port_value: 9903
dynamic_resources:
  lds_config:
    path: /tmp/configs/listeners_not_warming.yaml
$ cat /tmp/configs/listeners_not_warming.yaml
version_info: "0"
resources:
- '@type': type.googleapis.com/envoy.api.v2.Listener
  name: http_listener
  address:
    socket_address:
      address: 0.0.0.0
      port_value: 9080
  filter_chains:
  - filters:
    - name: envoy.tcp_proxy
      config:
        stat_prefix: stats
        cluster: default_cluster
- '@type': type.googleapis.com/envoy.api.v2.Listener
  name: https_listener
  address:
    socket_address:
      address: 0.0.0.0
      port_value: 9443
  filter_chains:
  - filters:
    - name: envoy.tcp_proxy
      config:
        stat_prefix: stats
        cluster: default_cluster
    tls_context:
      common_tls_context:
        tls_certificate_sds_secret_configs:
        - name: default_secret
          sds_config:
            api_config_source:
              api_type: GRPC
              grpc_services:
                envoy_grpc:
                  cluster_name: xds_cluster
Logs:
[info][main] [source/server/server.cc:203] initializing epoch 0 (hot restart version=10.200.16384.127.options=capacity=16384, num_slots=8209 hash=228984379728933363 size=2654312)
[info][main] [source/server/server.cc:205] statically linked extensions:
[info][main] [source/server/server.cc:207] access_loggers: envoy.file_access_log,envoy.http_grpc_access_log,envoy.wasm_access_log
[info][main] [source/server/server.cc:210] filters.http: envoy.buffer,envoy.cors,envoy.ext_authz,envoy.fault,envoy.filters.http.grpc_http1_reverse_bridge,envoy.filters.http.header_to_metadata,envoy.filters.http.jwt_authn,envoy.filters.http.rbac,envoy.filters.http.tap,envoy.grpc_http1_bridge,envoy.grpc_json_transcoder,envoy.grpc_web,envoy.gzip,envoy.health_check,envoy.http_dynamo_filter,envoy.ip_tagging,envoy.lua,envoy.rate_limit,envoy.router,envoy.squash,envoy.wasm
[info][main] [source/server/server.cc:213] filters.listener: envoy.listener.original_dst,envoy.listener.original_src,envoy.listener.proxy_protocol,envoy.listener.tls_inspector
[info][main] [source/server/server.cc:216] filters.network: envoy.client_ssl_auth,envoy.echo,envoy.ext_authz,envoy.filters.network.dubbo_proxy,envoy.filters.network.mysql_proxy,envoy.filters.network.rbac,envoy.filters.network.sni_cluster,envoy.filters.network.thrift_proxy,envoy.http_connection_manager,envoy.mongo_proxy,envoy.ratelimit,envoy.redis_proxy,envoy.tcp_proxy
[info][main] [source/server/server.cc:218] stat_sinks: envoy.dog_statsd,envoy.metrics_service,envoy.stat_sinks.hystrix,envoy.statsd
[info][main] [source/server/server.cc:220] tracers: envoy.dynamic.ot,envoy.lightstep,envoy.tracers.datadog,envoy.zipkin
[info][main] [source/server/server.cc:223] transport_sockets.downstream: envoy.transport_sockets.alts,envoy.transport_sockets.tap,raw_buffer,tls
[info][main] [source/server/server.cc:226] transport_sockets.upstream: envoy.transport_sockets.alts,envoy.transport_sockets.tap,raw_buffer,tls
[info][main] [source/server/server.cc:228] wasm: envoy.wasm
[info][main] [source/server/server.cc:275] admin address: 127.0.0.1:9901
[debug][main] [source/server/overload_manager_impl.cc:171] No overload action configured for envoy.overload_actions.stop_accepting_connections.
[info][config] [source/server/configuration_impl.cc:50] loading 0 static secret(s)
[info][config] [source/server/configuration_impl.cc:56] loading 2 cluster(s)
[debug][grpc] [source/common/grpc/google_async_client_impl.cc:42] completionThread running
[debug][upstream] [source/common/upstream/cluster_manager_impl.cc:793] adding TLS initial cluster default_cluster
[debug][upstream] [source/common/upstream/cluster_manager_impl.cc:793] adding TLS initial cluster xds_cluster
[trace][upstream] [source/common/upstream/upstream_impl.cc:933] Local locality:
[debug][upstream] [source/common/upstream/upstream_impl.cc:729] initializing secondary cluster default_cluster completed
[debug][upstream] [source/common/upstream/cluster_manager_impl.cc:938] membership update for TLS cluster default_cluster added 1 removed 0
[debug][upstream] [source/common/upstream/cluster_manager_impl.cc:91] cm init: init complete: cluster=default_cluster primary=0 secondary=0
[debug][upstream] [source/common/upstream/cluster_manager_impl.cc:63] cm init: adding: cluster=default_cluster primary=0 secondary=0
[trace][upstream] [source/common/upstream/upstream_impl.cc:933] Local locality:
[debug][upstream] [source/common/upstream/upstream_impl.cc:729] initializing secondary cluster xds_cluster completed
[debug][upstream] [source/common/upstream/cluster_manager_impl.cc:938] membership update for TLS cluster xds_cluster added 1 removed 0
[debug][upstream] [source/common/upstream/cluster_manager_impl.cc:91] cm init: init complete: cluster=xds_cluster primary=0 secondary=0
[debug][upstream] [source/common/upstream/cluster_manager_impl.cc:63] cm init: adding: cluster=xds_cluster primary=0 secondary=0
[info][upstream] [source/common/upstream/cluster_manager_impl.cc:136] cm init: all clusters initialized
[info][config] [source/server/configuration_impl.cc:65] loading 0 listener(s)
[info][config] [source/server/configuration_impl.cc:90] loading tracing configuration
[info][config] [source/server/configuration_impl.cc:110] loading stats sink configuration
[debug][file] [source/common/filesystem/inotify/watcher_impl.cc:49] added watch for directory: '/tmp/configs' file: 'listeners_not_warming.yaml' fd: 1
[info][main] [source/server/server.cc:466] all clusters initialized. initializing init manager
[debug][config] [bazel-out/k8-fastbuild/bin/source/common/config/_virtual_includes/filesystem_subscription_lib/common/config/filesystem_subscription_impl.h:58] Filesystem config refresh for /tmp/configs/listeners_not_warming.yaml
[warning][misc] [source/common/protobuf/utility.cc:129] Using deprecated option 'envoy.api.v2.listener.Filter.config'. This configuration will be removed from Envoy soon. Please see https://github.com/envoyproxy/envoy/blob/master/DEPRECATED.md for details.
[warning][misc] [source/common/protobuf/utility.cc:129] Using deprecated option 'envoy.api.v2.listener.Filter.config'. This configuration will be removed from Envoy soon. Please see https://github.com/envoyproxy/envoy/blob/master/DEPRECATED.md for details.
[debug][config] [source/server/listener_manager_impl.cc:729] begin add/update listener: name=http_listener hash=5551506821223866441
[debug][config] [source/server/listener_manager_impl.cc:55] filter #0:
[debug][config] [source/server/listener_manager_impl.cc:56] name: envoy.tcp_proxy
[debug][config] [source/server/listener_manager_impl.cc:59] config: {"cluster":"default_cluster","stat_prefix":"stats"}
[debug][config] [source/server/listener_manager_impl.cc:617] add active listener: name=http_listener, hash=5551506821223866441, address=0.0.0.0:9080
[info][upstream] [source/server/lds_api.cc:73] lds: add/update listener 'http_listener'
[debug][config] [source/server/listener_manager_impl.cc:729] begin add/update listener: name=https_listener hash=16855263350469956370
[debug][config] [source/common/config/grpc_mux_impl.cc:73] gRPC mux subscribe for type.googleapis.com/envoy.api.v2.auth.Secret
[trace][config] [bazel-out/k8-fastbuild/bin/source/common/config/_virtual_includes/grpc_stream_lib/common/config/grpc_stream.h:111] draining discovery requests 1
[debug][config] [source/common/config/grpc_mux_impl.cc:34] No stream available to sendDiscoveryRequest for type.googleapis.com/envoy.api.v2.auth.Secret
[debug][config] [bazel-out/k8-fastbuild/bin/source/common/config/_virtual_includes/grpc_stream_lib/common/config/grpc_stream.h:53] Establishing new gRPC bidi stream for rpc StreamSecrets(stream .envoy.api.v2.DiscoveryRequest) returns (stream .envoy.api.v2.DiscoveryResponse);
[debug][router] [source/common/router/router.cc:320] [C0][S7940596661014428145] cluster 'xds_cluster' match for URL '/envoy.service.discovery.v2.SecretDiscoveryService/StreamSecrets'
[debug][router] [source/common/router/router.cc:381] [C0][S7940596661014428145] router decoding headers:
':method', 'POST'
':path', '/envoy.service.discovery.v2.SecretDiscoveryService/StreamSecrets'
':authority', 'xds_cluster'
':scheme', 'http'
'te', 'trailers'
'content-type', 'application/grpc'
'x-envoy-internal', 'true'
'x-forwarded-for', '10.138.0.2'
[debug][client] [source/common/http/codec_client.cc:26] [C0] connecting
[debug][connection] [source/common/network/connection_impl.cc:639] [C0] connecting to 127.0.0.1:9903
[debug][connection] [source/common/network/connection_impl.cc:648] [C0] connection in progress
[debug][http2] [source/common/http/http2/codec_impl.cc:721] [C0] setting stream-level initial window size to 268435456
[debug][http2] [source/common/http/http2/codec_impl.cc:743] [C0] updating connection-level initial window size to 268435456
[debug][pool] [source/common/http/conn_pool_base.cc:20] queueing request due to no available connections
[trace][config] [bazel-out/k8-fastbuild/bin/source/common/config/_virtual_includes/grpc_stream_lib/common/config/grpc_stream.h:111] draining discovery requests 1
[trace][config] [source/common/config/grpc_mux_impl.cc:59] Sending DiscoveryRequest for type.googleapis.com/envoy.api.v2.auth.Secret: node {
  id: "node"
  cluster: "cluster"
  build_version: "5f7bf108a93e962bf21dce7bbdfd9294d747cc71/1.10.0-dev/Modified/DEBUG/BoringSSL"
}
resource_names: "default_secret"
type_url: "type.googleapis.com/envoy.api.v2.auth.Secret"
[trace][router] [source/common/router/router.cc:1018] [C0][S7940596661014428145] buffering 162 bytes
[debug][config] [source/server/listener_manager_impl.cc:55] filter #0:
[debug][config] [source/server/listener_manager_impl.cc:56] name: envoy.tcp_proxy
[debug][config] [source/server/listener_manager_impl.cc:59] config: {"cluster":"default_cluster","stat_prefix":"stats"}
[debug][config] [source/server/listener_manager_impl.cc:617] add active listener: name=https_listener, hash=16855263350469956370, address=0.0.0.0:9443
[info][upstream] [source/server/lds_api.cc:73] lds: add/update listener 'https_listener'
[debug][config] [bazel-out/k8-fastbuild/bin/source/common/config/_virtual_includes/filesystem_subscription_lib/common/config/filesystem_subscription_impl.h:70] Filesystem config update accepted for /tmp/configs/listeners_not_warming.yaml: version_info: "0"
resources {
  [type.googleapis.com/envoy.api.v2.Listener] {
    name: "http_listener"
    address {
      socket_address {
        address: "0.0.0.0"
        port_value: 9080
      }
    }
    filter_chains {
      filters {
        name: "envoy.tcp_proxy"
        config {
          fields {
            key: "cluster"
            value {
              string_value: "default_cluster"
            }
          }
          fields {
            key: "stat_prefix"
            value {
              string_value: "stats"
            }
          }
        }
      }
    }
  }
}
resources {
  [type.googleapis.com/envoy.api.v2.Listener] {
    name: "https_listener"
    address {
      socket_address {
        address: "0.0.0.0"
        port_value: 9443
      }
    }
    filter_chains {
      tls_context {
        common_tls_context {
          tls_certificate_sds_secret_configs {
            name: "default_secret"
            sds_config {
              api_config_source {
                api_type: GRPC
                grpc_services {
                  envoy_grpc {
                    cluster_name: "xds_cluster"
                  }
                }
              }
            }
          }
        }
      }
      filters {
        name: "envoy.tcp_proxy"
        config {
          fields {
            key: "cluster"
            value {
              string_value: "default_cluster"
            }
          }
          fields {
            key: "stat_prefix"
            value {
              string_value: "stats"
            }
          }
        }
      }
    }
  }
}
[info][main] [source/server/server.cc:492] starting main dispatch loop
[debug][file] [source/common/filesystem/inotify/watcher_impl.cc:74] notification: fd: 1 mask: 20 file: listeners_not_warming.yaml
[debug][file] [source/common/filesystem/inotify/watcher_impl.cc:74] notification: fd: 1 mask: 1 file: listeners_not_warming.yaml
[debug][file] [source/common/filesystem/inotify/watcher_impl.cc:74] notification: fd: 1 mask: 10 file: listeners_not_warming.yaml
[trace][connection] [source/common/network/connection_impl.cc:440] [C0] socket event: 2
[trace][connection] [source/common/network/connection_impl.cc:508] [C0] write ready
[debug][connection] [source/common/network/connection_impl.cc:517] [C0] connected
[debug][client] [source/common/http/codec_client.cc:64] [C0] connected
[debug][pool] [source/common/http/http2/conn_pool.cc:83] [C0] creating stream
[debug][router] [source/common/router/router.cc:1128] [C0][S7940596661014428145] pool ready
[trace][http2] [source/common/http/http2/codec_impl.cc:554] [C0] send data: bytes=24
[trace][connection] [source/common/network/connection_impl.cc:376] [C0] writing 24 bytes, end_stream false
[trace][http2] [source/common/http/http2/codec_impl.cc:554] [C0] send data: bytes=21
[trace][connection] [source/common/network/connection_impl.cc:376] [C0] writing 21 bytes, end_stream false
[trace][http2] [source/common/http/http2/codec_impl.cc:508] [C0] sent frame type=4
[trace][http2] [source/common/http/http2/codec_impl.cc:554] [C0] send data: bytes=13
[trace][connection] [source/common/network/connection_impl.cc:376] [C0] writing 13 bytes, end_stream false
[trace][http2] [source/common/http/http2/codec_impl.cc:508] [C0] sent frame type=8
[trace][http2] [source/common/http/http2/codec_impl.cc:554] [C0] send data: bytes=132
[trace][connection] [source/common/network/connection_impl.cc:376] [C0] writing 132 bytes, end_stream false
[trace][http2] [source/common/http/http2/codec_impl.cc:508] [C0] sent frame type=1
[trace][connection] [source/common/network/connection_impl.cc:376] [C0] writing 171 bytes, end_stream false
[trace][http2] [source/common/http/http2/codec_impl.cc:508] [C0] sent frame type=0
[trace][connection] [source/common/network/connection_impl.cc:508] [C0] write ready
[trace][connection] [source/common/network/raw_buffer_socket.cc:62] [C0] write returns: 361
[trace][connection] [source/common/network/connection_impl.cc:440] [C0] socket event: 2
[trace][connection] [source/common/network/connection_impl.cc:508] [C0] write ready
[debug][main] [source/server/server.cc:145] flushing stats
cc @mattklein123 @htuch @duderino @costinm @jplevyak @silentdai @philrud @howardjohn @wattli @quanjielin @JimmyCYJ