Result: SUCCESS
Tests: 1 failed / 51 succeeded
Started: 2019-11-07 16:32
Elapsed: 2h9m
Work namespace: ci-op-1jlm19c4
Pod: 4.2.0-0.nightly-2019-11-07-162436-metal-serial

Test Failures


openshift-tests Monitor cluster while tests execute (1h20m)

go run hack/e2e.go -v -test --test_args='--ginkgo.focus=openshift\-tests\sMonitor\scluster\swhile\stests\sexecute$'
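For readability, the same reproduction command split across lines (a sketch assuming it is run from the root of an openshift/origin checkout with KUBECONFIG pointing at a live test cluster; neither prerequisite is stated in this report):

    # assumed environment: openshift/origin checkout, KUBECONFIG set to a reachable cluster
    go run hack/e2e.go -v -test \
      --test_args='--ginkgo.focus=openshift\-tests\sMonitor\scluster\swhile\stests\sexecute$'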
19 error level events were detected during this test run:

Nov 07 17:25:14.173 E ns/openshift-marketplace pod/community-operators-85b6d6668c-r4zsq node/worker-1.ci-op-1jlm19c4-a4af7.origin-ci-int-aws.dev.rhcloud.com container=community-operators container exited with code 137 (ContainerStatusUnknown): The container could not be located when the pod was terminated
Nov 07 17:25:14.333 E ns/openshift-marketplace pod/redhat-operators-68f4b646c7-dlvjl node/worker-0.ci-op-1jlm19c4-a4af7.origin-ci-int-aws.dev.rhcloud.com container=redhat-operators container exited with code 2 (Error): 
Nov 07 17:25:15.375 E ns/openshift-image-registry pod/image-registry-846975798-wr8mc node/worker-1.ci-op-1jlm19c4-a4af7.origin-ci-int-aws.dev.rhcloud.com container=registry container exited with code 137 (ContainerStatusUnknown): The container could not be located when the pod was terminated
Nov 07 17:25:15.534 E ns/openshift-monitoring pod/grafana-5d5d6cdf5-dp74q node/worker-0.ci-op-1jlm19c4-a4af7.origin-ci-int-aws.dev.rhcloud.com container=grafana-proxy container exited with code 2 (Error): 
Nov 07 17:25:17.574 E ns/openshift-monitoring pod/prometheus-adapter-6648b8fc66-2df5s node/worker-1.ci-op-1jlm19c4-a4af7.origin-ci-int-aws.dev.rhcloud.com container=prometheus-adapter container exited with code 137 (ContainerStatusUnknown): The container could not be located when the pod was terminated
Nov 07 17:25:18.133 E ns/openshift-monitoring pod/prometheus-adapter-6648b8fc66-w6wpr node/worker-0.ci-op-1jlm19c4-a4af7.origin-ci-int-aws.dev.rhcloud.com container=prometheus-adapter container exited with code 2 (Error): I1107 17:01:51.541800       1 adapter.go:93] successfully using in-cluster auth\nI1107 17:01:51.669083       1 secure_serving.go:116] Serving securely on [::]:6443\n
Nov 07 17:25:18.933 E ns/openshift-ingress pod/router-default-85dc7b7799-l482l node/worker-0.ci-op-1jlm19c4-a4af7.origin-ci-int-aws.dev.rhcloud.com container=router container exited with code 2 (Error): .go:561] Router reloaded:\n - Checking http://localhost:80 ...\n - Health check ok : 0 retry attempt(s).\nI1107 17:05:01.484312       1 router.go:561] Router reloaded:\n - Checking http://localhost:80 ...\n - Health check ok : 0 retry attempt(s).\nI1107 17:05:06.482080       1 router.go:561] Router reloaded:\n - Checking http://localhost:80 ...\n - Health check ok : 0 retry attempt(s).\nI1107 17:05:16.320110       1 router.go:561] Router reloaded:\n - Checking http://localhost:80 ...\n - Health check ok : 0 retry attempt(s).\nI1107 17:05:21.319193       1 router.go:561] Router reloaded:\n - Checking http://localhost:80 ...\n - Health check ok : 0 retry attempt(s).\nI1107 17:05:39.645006       1 router.go:561] Router reloaded:\n - Checking http://localhost:80 ...\n - Health check ok : 0 retry attempt(s).\nI1107 17:06:17.594845       1 router.go:561] Router reloaded:\n - Checking http://localhost:80 ...\n - Health check ok : 0 retry attempt(s).\nI1107 17:06:30.420221       1 router.go:561] Router reloaded:\n - Checking http://localhost:80 ...\n - Health check ok : 0 retry attempt(s).\nI1107 17:06:35.418525       1 router.go:561] Router reloaded:\n - Checking http://localhost:80 ...\n - Health check ok : 0 retry attempt(s).\nI1107 17:06:48.596874       1 router.go:561] Router reloaded:\n - Checking http://localhost:80 ...\n - Health check ok : 0 retry attempt(s).\nI1107 17:07:08.715804       1 router.go:561] Router reloaded:\n - Checking http://localhost:80 ...\n - Health check ok : 0 retry attempt(s).\nI1107 17:07:13.715116       1 router.go:561] Router reloaded:\n - Checking http://localhost:80 ...\n - Health check ok : 0 retry attempt(s).\nW1107 17:15:57.529733       1 reflector.go:341] github.com/openshift/router/pkg/router/controller/factory/factory.go:112: watch of *v1.Route ended with: The resourceVersion for the provided watch is too old.\nW1107 17:25:01.633148       1 reflector.go:341] github.com/openshift/router/pkg/router/controller/factory/factory.go:112: watch of *v1.Route ended with: The resourceVersion for the provided watch is too old.\n
Nov 07 17:25:19.174 E ns/openshift-marketplace pod/certified-operators-7678846796-9bw2g node/worker-1.ci-op-1jlm19c4-a4af7.origin-ci-int-aws.dev.rhcloud.com container=certified-operators container exited with code 2 (Error): 
Nov 07 17:25:20.334 E ns/openshift-marketplace pod/community-operators-85b6d6668c-lq5j7 node/worker-0.ci-op-1jlm19c4-a4af7.origin-ci-int-aws.dev.rhcloud.com container=community-operators container exited with code 2 (Error): 
Nov 07 17:25:20.775 E ns/openshift-ingress pod/router-default-85dc7b7799-27qtn node/worker-1.ci-op-1jlm19c4-a4af7.origin-ci-int-aws.dev.rhcloud.com container=router container exited with code 2 (Error):  and closed the connection; LastStreamID=51, ErrCode=NO_ERROR, debug=""\nE1107 17:06:11.822996       1 streamwatcher.go:109] Unable to decode an event from the watch stream: http2: server sent GOAWAY and closed the connection; LastStreamID=51, ErrCode=NO_ERROR, debug=""\nE1107 17:06:11.822991       1 streamwatcher.go:109] Unable to decode an event from the watch stream: http2: server sent GOAWAY and closed the connection; LastStreamID=51, ErrCode=NO_ERROR, debug=""\nW1107 17:06:11.838700       1 reflector.go:341] github.com/openshift/router/pkg/router/template/service_lookup.go:32: watch of *v1.Service ended with: too old resource version: 11915 (13408)\nI1107 17:06:17.597046       1 router.go:561] Router reloaded:\n - Checking http://localhost:80 ...\n - Health check ok : 0 retry attempt(s).\nI1107 17:06:30.421442       1 router.go:561] Router reloaded:\n - Checking http://localhost:80 ...\n - Health check ok : 0 retry attempt(s).\nI1107 17:06:35.420459       1 router.go:561] Router reloaded:\n - Checking http://localhost:80 ...\n - Health check ok : 0 retry attempt(s).\nI1107 17:06:48.598892       1 router.go:561] Router reloaded:\n - Checking http://localhost:80 ...\n - Health check ok : 0 retry attempt(s).\nI1107 17:07:08.791749       1 router.go:561] Router reloaded:\n - Checking http://localhost:80 ...\n - Health check ok : 0 retry attempt(s).\nI1107 17:07:13.715978       1 router.go:561] Router reloaded:\n - Checking http://localhost:80 ...\n - Health check ok : 0 retry attempt(s).\nW1107 17:13:24.871325       1 reflector.go:341] github.com/openshift/router/pkg/router/controller/factory/factory.go:112: watch of *v1.Route ended with: The resourceVersion for the provided watch is too old.\nW1107 17:23:12.915824       1 reflector.go:341] github.com/openshift/router/pkg/router/controller/factory/factory.go:112: watch of *v1.Route ended with: The resourceVersion for the provided watch is too old.\nI1107 17:25:11.330499       1 router.go:561] Router reloaded:\n - Checking http://localhost:80 ...\n - Health check ok : 0 retry attempt(s).\n
Nov 07 17:25:21.378 E ns/openshift-monitoring pod/prometheus-k8s-0 node/worker-1.ci-op-1jlm19c4-a4af7.origin-ci-int-aws.dev.rhcloud.com container=rules-configmap-reloader container exited with code 2 (Error): 
Nov 07 17:25:21.378 E ns/openshift-monitoring pod/prometheus-k8s-0 node/worker-1.ci-op-1jlm19c4-a4af7.origin-ci-int-aws.dev.rhcloud.com container=prometheus-config-reloader container exited with code 2 (Error): 
Nov 07 17:25:21.378 E ns/openshift-monitoring pod/prometheus-k8s-0 node/worker-1.ci-op-1jlm19c4-a4af7.origin-ci-int-aws.dev.rhcloud.com container=prometheus-proxy container exited with code 2 (Error): 2019/11/07 17:02:11 provider.go:109: Defaulting client-id to system:serviceaccount:openshift-monitoring:prometheus-k8s\n2019/11/07 17:02:11 provider.go:114: Defaulting client-secret to service account token /var/run/secrets/kubernetes.io/serviceaccount/token\n2019/11/07 17:02:11 provider.go:291: Delegation of authentication and authorization to OpenShift is enabled for bearer tokens and client certificates.\n2019/11/07 17:02:11 oauthproxy.go:200: mapping path "/" => upstream "http://localhost:9090/"\n2019/11/07 17:02:11 oauthproxy.go:221: compiled skip-auth-regex => "^/metrics"\n2019/11/07 17:02:11 oauthproxy.go:227: OAuthProxy configured for  Client ID: system:serviceaccount:openshift-monitoring:prometheus-k8s\n2019/11/07 17:02:11 oauthproxy.go:237: Cookie settings: name:_oauth_proxy secure(https):true httponly:true expiry:168h0m0s domain:<default> refresh:disabled\n2019/11/07 17:02:11 main.go:154: using htpasswd file /etc/proxy/htpasswd/auth\n2019/11/07 17:02:11 http.go:96: HTTPS: listening on [::]:9091\n2019/11/07 17:03:11 oauthproxy.go:774: basicauth: 10.129.2.7:59908 Authorization header does not start with 'Basic', skipping basic authentication\n2019/11/07 17:07:42 oauthproxy.go:774: basicauth: 10.129.2.7:33164 Authorization header does not start with 'Basic', skipping basic authentication\n2019/11/07 17:12:12 oauthproxy.go:774: basicauth: 10.129.2.7:33918 Authorization header does not start with 'Basic', skipping basic authentication\n2019/11/07 17:16:43 oauthproxy.go:774: basicauth: 10.129.2.7:34676 Authorization header does not start with 'Basic', skipping basic authentication\n2019/11/07 17:21:13 oauthproxy.go:774: basicauth: 10.129.2.7:35404 Authorization header does not start with 'Basic', skipping basic authentication\n
Nov 07 17:25:22.774 E ns/openshift-monitoring pod/kube-state-metrics-f55c697ff-zp25b node/worker-1.ci-op-1jlm19c4-a4af7.origin-ci-int-aws.dev.rhcloud.com container=kube-state-metrics container exited with code 2 (Error): 
Nov 07 17:25:23.334 E ns/openshift-console pod/downloads-6674c66cf4-2925g node/worker-0.ci-op-1jlm19c4-a4af7.origin-ci-int-aws.dev.rhcloud.com container=download-server container exited with code 137 (Error): 9 17:22:26] "GET / HTTP/1.1" 200 -\n10.128.2.1 - - [07/Nov/2019 17:22:27] "GET / HTTP/1.1" 200 -\n10.128.2.1 - - [07/Nov/2019 17:22:36] "GET / HTTP/1.1" 200 -\n10.128.2.1 - - [07/Nov/2019 17:22:37] "GET / HTTP/1.1" 200 -\n10.128.2.1 - - [07/Nov/2019 17:22:46] "GET / HTTP/1.1" 200 -\n10.128.2.1 - - [07/Nov/2019 17:22:47] "GET / HTTP/1.1" 200 -\n10.128.2.1 - - [07/Nov/2019 17:22:56] "GET / HTTP/1.1" 200 -\n10.128.2.1 - - [07/Nov/2019 17:22:57] "GET / HTTP/1.1" 200 -\n10.128.2.1 - - [07/Nov/2019 17:23:06] "GET / HTTP/1.1" 200 -\n10.128.2.1 - - [07/Nov/2019 17:23:07] "GET / HTTP/1.1" 200 -\n10.128.2.1 - - [07/Nov/2019 17:23:16] "GET / HTTP/1.1" 200 -\n10.128.2.1 - - [07/Nov/2019 17:23:17] "GET / HTTP/1.1" 200 -\n10.128.2.1 - - [07/Nov/2019 17:23:26] "GET / HTTP/1.1" 200 -\n10.128.2.1 - - [07/Nov/2019 17:23:27] "GET / HTTP/1.1" 200 -\n10.128.2.1 - - [07/Nov/2019 17:23:36] "GET / HTTP/1.1" 200 -\n10.128.2.1 - - [07/Nov/2019 17:23:37] "GET / HTTP/1.1" 200 -\n10.128.2.1 - - [07/Nov/2019 17:23:46] "GET / HTTP/1.1" 200 -\n10.128.2.1 - - [07/Nov/2019 17:23:47] "GET / HTTP/1.1" 200 -\n10.128.2.1 - - [07/Nov/2019 17:23:56] "GET / HTTP/1.1" 200 -\n10.128.2.1 - - [07/Nov/2019 17:23:57] "GET / HTTP/1.1" 200 -\n10.128.2.1 - - [07/Nov/2019 17:24:06] "GET / HTTP/1.1" 200 -\n10.128.2.1 - - [07/Nov/2019 17:24:07] "GET / HTTP/1.1" 200 -\n10.128.2.1 - - [07/Nov/2019 17:24:16] "GET / HTTP/1.1" 200 -\n10.128.2.1 - - [07/Nov/2019 17:24:17] "GET / HTTP/1.1" 200 -\n10.128.2.1 - - [07/Nov/2019 17:24:26] "GET / HTTP/1.1" 200 -\n10.128.2.1 - - [07/Nov/2019 17:24:27] "GET / HTTP/1.1" 200 -\n10.128.2.1 - - [07/Nov/2019 17:24:36] "GET / HTTP/1.1" 200 -\n10.128.2.1 - - [07/Nov/2019 17:24:37] "GET / HTTP/1.1" 200 -\n10.128.2.1 - - [07/Nov/2019 17:24:46] "GET / HTTP/1.1" 200 -\n10.128.2.1 - - [07/Nov/2019 17:24:47] "GET / HTTP/1.1" 200 -\n10.128.2.1 - - [07/Nov/2019 17:24:56] "GET / HTTP/1.1" 200 -\n10.128.2.1 - - [07/Nov/2019 17:24:57] "GET / HTTP/1.1" 200 -\n10.128.2.1 - - [07/Nov/2019 17:25:06] "GET / HTTP/1.1" 200 -\n10.128.2.1 - - [07/Nov/2019 17:25:07] "GET / HTTP/1.1" 200 -\n
Nov 07 17:25:23.575 E ns/openshift-console pod/downloads-6674c66cf4-f7gc9 node/worker-1.ci-op-1jlm19c4-a4af7.origin-ci-int-aws.dev.rhcloud.com container=download-server container exited with code 137 (Error): 9 17:22:27] "GET / HTTP/1.1" 200 -\n10.130.0.1 - - [07/Nov/2019 17:22:27] "GET / HTTP/1.1" 200 -\n10.130.0.1 - - [07/Nov/2019 17:22:37] "GET / HTTP/1.1" 200 -\n10.130.0.1 - - [07/Nov/2019 17:22:37] "GET / HTTP/1.1" 200 -\n10.130.0.1 - - [07/Nov/2019 17:22:47] "GET / HTTP/1.1" 200 -\n10.130.0.1 - - [07/Nov/2019 17:22:47] "GET / HTTP/1.1" 200 -\n10.130.0.1 - - [07/Nov/2019 17:22:57] "GET / HTTP/1.1" 200 -\n10.130.0.1 - - [07/Nov/2019 17:22:57] "GET / HTTP/1.1" 200 -\n10.130.0.1 - - [07/Nov/2019 17:23:07] "GET / HTTP/1.1" 200 -\n10.130.0.1 - - [07/Nov/2019 17:23:07] "GET / HTTP/1.1" 200 -\n10.130.0.1 - - [07/Nov/2019 17:23:17] "GET / HTTP/1.1" 200 -\n10.130.0.1 - - [07/Nov/2019 17:23:17] "GET / HTTP/1.1" 200 -\n10.130.0.1 - - [07/Nov/2019 17:23:27] "GET / HTTP/1.1" 200 -\n10.130.0.1 - - [07/Nov/2019 17:23:27] "GET / HTTP/1.1" 200 -\n10.130.0.1 - - [07/Nov/2019 17:23:37] "GET / HTTP/1.1" 200 -\n10.130.0.1 - - [07/Nov/2019 17:23:37] "GET / HTTP/1.1" 200 -\n10.130.0.1 - - [07/Nov/2019 17:23:47] "GET / HTTP/1.1" 200 -\n10.130.0.1 - - [07/Nov/2019 17:23:47] "GET / HTTP/1.1" 200 -\n10.130.0.1 - - [07/Nov/2019 17:23:57] "GET / HTTP/1.1" 200 -\n10.130.0.1 - - [07/Nov/2019 17:23:57] "GET / HTTP/1.1" 200 -\n10.130.0.1 - - [07/Nov/2019 17:24:07] "GET / HTTP/1.1" 200 -\n10.130.0.1 - - [07/Nov/2019 17:24:07] "GET / HTTP/1.1" 200 -\n10.130.0.1 - - [07/Nov/2019 17:24:17] "GET / HTTP/1.1" 200 -\n10.130.0.1 - - [07/Nov/2019 17:24:17] "GET / HTTP/1.1" 200 -\n10.130.0.1 - - [07/Nov/2019 17:24:27] "GET / HTTP/1.1" 200 -\n10.130.0.1 - - [07/Nov/2019 17:24:27] "GET / HTTP/1.1" 200 -\n10.130.0.1 - - [07/Nov/2019 17:24:37] "GET / HTTP/1.1" 200 -\n10.130.0.1 - - [07/Nov/2019 17:24:37] "GET / HTTP/1.1" 200 -\n10.130.0.1 - - [07/Nov/2019 17:24:47] "GET / HTTP/1.1" 200 -\n10.130.0.1 - - [07/Nov/2019 17:24:47] "GET / HTTP/1.1" 200 -\n10.130.0.1 - - [07/Nov/2019 17:24:57] "GET / HTTP/1.1" 200 -\n10.130.0.1 - - [07/Nov/2019 17:24:57] "GET / HTTP/1.1" 200 -\n10.130.0.1 - - [07/Nov/2019 17:25:07] "GET / HTTP/1.1" 200 -\n10.130.0.1 - - [07/Nov/2019 17:25:07] "GET / HTTP/1.1" 200 -\n
Nov 07 17:25:24.175 E ns/openshift-monitoring pod/alertmanager-main-2 node/worker-1.ci-op-1jlm19c4-a4af7.origin-ci-int-aws.dev.rhcloud.com container=config-reloader container exited with code 2 (Error): 
Nov 07 17:25:24.175 E ns/openshift-monitoring pod/alertmanager-main-2 node/worker-1.ci-op-1jlm19c4-a4af7.origin-ci-int-aws.dev.rhcloud.com container=alertmanager-proxy container exited with code 2 (Error): 2019/11/07 17:01:53 provider.go:109: Defaulting client-id to system:serviceaccount:openshift-monitoring:alertmanager-main\n2019/11/07 17:01:53 provider.go:114: Defaulting client-secret to service account token /var/run/secrets/kubernetes.io/serviceaccount/token\n2019/11/07 17:01:53 provider.go:291: Delegation of authentication and authorization to OpenShift is enabled for bearer tokens and client certificates.\n2019/11/07 17:01:53 oauthproxy.go:200: mapping path "/" => upstream "http://localhost:9093/"\n2019/11/07 17:01:53 oauthproxy.go:221: compiled skip-auth-regex => "^/metrics"\n2019/11/07 17:01:53 oauthproxy.go:227: OAuthProxy configured for  Client ID: system:serviceaccount:openshift-monitoring:alertmanager-main\n2019/11/07 17:01:53 oauthproxy.go:237: Cookie settings: name:_oauth_proxy secure(https):true httponly:true expiry:168h0m0s domain:<default> refresh:disabled\n2019/11/07 17:01:53 http.go:96: HTTPS: listening on [::]:9095\n
Nov 07 17:25:34.736 E ns/openshift-monitoring pod/prometheus-k8s-0 node/worker-2.ci-op-1jlm19c4-a4af7.origin-ci-int-aws.dev.rhcloud.com container=prometheus container exited with code 1 (Error): 

Full stdout/stderr: junit_e2e_20191107-183449.xml



Passed tests: 51

Skipped tests: 170