Hit during backport PR #10401:
https://jenkins.cilium.io/job/Cilium-PR-Ginkgo-Tests-K8s/2911/testReport/junit/Suite-k8s-1/16/K8sServicesTest_External_services_To_Services_first_endpoint_creation/
Stacktrace
/home/jenkins/workspace/Cilium-PR-Ginkgo-Tests-K8s/1.16-gopath/src/github.com/cilium/cilium/test/ginkgo-ext/scopes.go:384
Endpoints are not ready after timeout
Expected
<*helpers.SSHMetaError | 0xc00141a3e0>: Error: context deadline exceeded
Extended info: Cilium Pod: cilium-56pmk Endpoint: 898 Identity: 5 State: ready
Cilium Pod: cilium-56pmk Endpoint: 1479 Identity: 5 State: ready
Cilium Pod: cilium-56pmk Endpoint: 1978 Identity: 4 State: ready
Cilium Pod: cilium-56pmk Endpoint: 2221 Identity: 3334 State: ready
Cilium Pod: cilium-56pmk Endpoint: 3532 Identity: 5 State: ready
Cilium Pod: cilium-56pmk Endpoint: 3844 Identity: 36243 State: ready
Cilium Pod: cilium-x42zq Endpoint: 2079 Identity: 61585 State: ready
Cilium Pod: cilium-x42zq Endpoint: 2816 Identity: 4 State: ready
to be nil
/home/jenkins/workspace/Cilium-PR-Ginkgo-Tests-K8s/1.16-gopath/src/github.com/cilium/cilium/test/k8sT/Services.go:832
Standard Output
Number of "context deadline exceeded" in logs: 0
Number of "level=error" in logs: 0
Number of "level=warning" in logs: 0
Number of "Cilium API handler panicked" in logs: 0
Number of "Goroutine took lock for more than" in logs: 0
No errors/warnings found in logs
Cilium pods: [cilium-56pmk cilium-x42zq]
Netpols loaded:
CiliumNetworkPolicies loaded:
Endpoint Policy Enforcement:
Pod                        Ingress   Egress
toservices                 false     false
coredns-5495c8f48d-2f5nl   false     false
app1-68cb4f68c5-cftnr      true      true
app1-68cb4f68c5-s75bl      true      true
app2-776cf4f9c6-p9rkr      false     false
app3-65f9dc989c-fx756      true      true
⚠️ Cilium agent 'cilium-56pmk': Status: Ok Health: Ok Nodes "" ContinerRuntime: Kubernetes: Ok KVstore: Ok Controllers: Total 33 Failed 3
Failed controllers:
controller resolve-labels-default/app1-68cb4f68c5-cftnr failure 'pod.core "app1-68cb4f68c5-cftnr" not found'
controller resolve-labels-default/app1-68cb4f68c5-s75bl failure 'pod.core "app1-68cb4f68c5-s75bl" not found'
controller resolve-labels-default/app3-65f9dc989c-fx756 failure 'pod.core "app3-65f9dc989c-fx756" not found'
Cilium agent 'cilium-x42zq': Status: Ok Health: Ok Nodes "" ContinerRuntime: Kubernetes: Ok KVstore: Ok Controllers: Total 15 Failed 0
Standard Error
===================== TEST FAILED =====================
cmd: kubectl get pods -o wide --all-namespaces
Exitcode: 0
Stdout:
NAMESPACE NAME READY STATUS RESTARTS AGE IP NODE NOMINATED NODE READINESS GATES
default toservices 1/1 Running 0 4m6s 10.10.1.143 k8s2 <none> <none>
kube-system cilium-56pmk 1/1 Running 0 8m41s 192.168.36.11 k8s1 <none> <none>
kube-system cilium-operator-7db659d4db-frcd6 1/1 Running 0 11m 192.168.36.11 k8s1 <none> <none>
kube-system cilium-x42zq 1/1 Running 0 8m41s 192.168.36.12 k8s2 <none> <none>
kube-system coredns-5495c8f48d-2f5nl 1/1 Running 0 17m 10.10.0.82 k8s1 <none> <none>
kube-system etcd-k8s1 1/1 Running 0 16m 192.168.36.11 k8s1 <none> <none>
kube-system kube-apiserver-k8s1 1/1 Running 0 17m 192.168.36.11 k8s1 <none> <none>
kube-system kube-controller-manager-k8s1 1/1 Running 0 16m 192.168.36.11 k8s1 <none> <none>
kube-system kube-proxy-8z86s 1/1 Running 0 11m 192.168.36.12 k8s2 <none> <none>
kube-system kube-proxy-c5vw7 1/1 Running 0 17m 192.168.36.11 k8s1 <none> <none>
kube-system kube-scheduler-k8s1 1/1 Running 0 16m 192.168.36.11 k8s1 <none> <none>
kube-system log-gatherer-lz67b 1/1 Running 0 11m 192.168.36.12 k8s2 <none> <none>
kube-system log-gatherer-pjlmv 1/1 Running 0 11m 192.168.36.11 k8s1 <none> <none>
kube-system registry-adder-lfqb8 1/1 Running 0 11m 192.168.36.12 k8s2 <none> <none>
kube-system registry-adder-qpcd9 1/1 Running 0 11m 192.168.36.11 k8s1 <none> <none>
Stderr:
Fetching command output from pods [cilium-56pmk cilium-x42zq]
cmd: kubectl exec -n kube-system cilium-56pmk -- cilium service list
Exitcode: 0
Stdout:
ID Frontend Service Type Backend
1 10.96.0.10:53 ClusterIP 1 => 10.10.0.82:53
2 10.96.0.10:9153 ClusterIP 1 => 10.10.0.82:9153
3 10.96.0.1:443 ClusterIP 1 => 192.168.36.11:6443
Stderr:
cmd: kubectl exec -n kube-system cilium-56pmk -- cilium endpoint list
Exitcode: 0
Stdout:
ENDPOINT POLICY (ingress) POLICY (egress) IDENTITY LABELS (source:key[=value]) IPv6 IPv4 STATUS
ENFORCEMENT ENFORCEMENT
898 Enabled Enabled 5 reserved:init f00d::a0a:0:0:af1 10.10.0.151 ready
1479 Enabled Enabled 5 reserved:init f00d::a0a:0:0:969e 10.10.0.149 ready
1978 Disabled Disabled 4 reserved:health f00d::a0a:0:0:6e0a 10.10.0.136 ready
2221 Disabled Disabled 3334 k8s:io.cilium.k8s.policy.cluster=default f00d::a0a:0:0:44fa 10.10.0.82 ready
k8s:io.cilium.k8s.policy.serviceaccount=coredns
k8s:io.kubernetes.pod.namespace=kube-system
k8s:k8s-app=kube-dns
3532 Enabled Enabled 5 reserved:init f00d::a0a:0:0:d858 10.10.0.128 ready
3844 Disabled Disabled 36243 k8s:appSecond=true f00d::a0a:0:0:4f5b 10.10.0.164 ready
k8s:id=app2
k8s:io.cilium.k8s.policy.cluster=default
k8s:io.cilium.k8s.policy.serviceaccount=app2-account
k8s:io.kubernetes.pod.namespace=default
k8s:zgroup=testapp
Stderr:
cmd: kubectl exec -n kube-system cilium-x42zq -- cilium service list
Exitcode: 0
Stdout:
ID Frontend Service Type Backend
1 10.96.0.1:443 ClusterIP 1 => 192.168.36.11:6443
2 10.96.0.10:53 ClusterIP 1 => 10.10.0.82:53
3 10.96.0.10:9153 ClusterIP 1 => 10.10.0.82:9153
Stderr:
cmd: kubectl exec -n kube-system cilium-x42zq -- cilium endpoint list
Exitcode: 0
Stdout:
ENDPOINT POLICY (ingress) POLICY (egress) IDENTITY LABELS (source:key[=value]) IPv6 IPv4 STATUS
ENFORCEMENT ENFORCEMENT
2079 Disabled Disabled 61585 k8s:io.cilium.k8s.policy.cluster=default f00d::a0a:100:0:ff0b 10.10.1.143 ready
k8s:io.cilium.k8s.policy.serviceaccount=default
k8s:io.kubernetes.pod.namespace=default
k8s:test=toservices
k8s:zgroup=external
2816 Disabled Disabled 4 reserved:health f00d::a0a:100:0:9aa2 10.10.1.222 ready
Stderr:
===================== Exiting AfterFailed =====================
af777943_K8sServicesTest_External_services_To_Services_first_endpoint_creation(1).zip