LeaderElectorTest is flaky in GitHub Actions #1480

Closed
brendandburns opened this issue Jan 6, 2021 · 3 comments
@brendandburns (Contributor)

It looks like requests to the Kubernetes API server are timing out; here is a detailed log:

[INFO] Running io.kubernetes.client.e2e.extended.leaderelection.LeaderElectorTest
8601 [controller-reflector-io.kubernetes.client.openapi.models.V1Namespace-1] DEBUG io.kubernetes.client.informer.cache.ReflectorRunnable - class io.kubernetes.client.openapi.models.V1Namespace#Extract resourceVersion 1006 list meta
8605 [controller-reflector-io.kubernetes.client.openapi.models.V1Namespace-1] DEBUG io.kubernetes.client.informer.cache.ReflectorRunnable - class io.kubernetes.client.openapi.models.V1Namespace#Start watching with 1006...
8606 [controller-reflector-io.kubernetes.client.openapi.models.V1Namespace-1] DEBUG io.kubernetes.client.informer.cache.ReflectorRunnable - class io.kubernetes.client.openapi.models.V1Namespace#Start watch with resource version 1006
8623 [controller-reflector-io.kubernetes.client.openapi.models.V1Namespace-1] DEBUG io.kubernetes.client.informer.cache.ReflectorRunnable - class io.kubernetes.client.openapi.models.V1Namespace#Receiving resourceVersion 42
8624 [controller-reflector-io.kubernetes.client.openapi.models.V1Namespace-1] DEBUG io.kubernetes.client.informer.cache.ReflectorRunnable - class io.kubernetes.client.openapi.models.V1Namespace#Receiving resourceVersion 43
8624 [controller-reflector-io.kubernetes.client.openapi.models.V1Namespace-1] DEBUG io.kubernetes.client.informer.cache.ReflectorRunnable - class io.kubernetes.client.openapi.models.V1Namespace#Receiving resourceVersion 155
8625 [controller-reflector-io.kubernetes.client.openapi.models.V1Namespace-1] DEBUG io.kubernetes.client.informer.cache.ReflectorRunnable - class io.kubernetes.client.openapi.models.V1Namespace#Receiving resourceVersion 244
8626 [controller-reflector-io.kubernetes.client.openapi.models.V1Namespace-1] DEBUG io.kubernetes.client.informer.cache.ReflectorRunnable - class io.kubernetes.client.openapi.models.V1Namespace#Receiving resourceVersion 982
8627 [controller-reflector-io.kubernetes.client.openapi.models.V1Namespace-1] DEBUG io.kubernetes.client.informer.cache.ReflectorRunnable - class io.kubernetes.client.openapi.models.V1Namespace#Receiving resourceVersion 1006
8629 [controller-reflector-io.kubernetes.client.openapi.models.V1Namespace-1] DEBUG io.kubernetes.client.informer.cache.ReflectorRunnable - class io.kubernetes.client.openapi.models.V1Namespace#Receiving resourceVersion 13
8657 [candidate-leader-elector-main] INFO io.kubernetes.client.extended.leaderelection.LeaderElector - Start leader election with lock default/leader-election-it
8657 [candidate-leader-elector-main] DEBUG io.kubernetes.client.extended.leaderelection.LeaderElector - Attempting to acquire leader lease...
8665 [leader-elector-lease-worker-1] DEBUG io.kubernetes.client.extended.leaderelection.LeaderElector - Lock not found, try to create it
8674 [leader-elector-scheduled-worker-1] DEBUG io.kubernetes.client.extended.leaderelection.LeaderElector - The tryAcquireOrRenew result is true
11661 [candidate-leader-elector-main] INFO io.kubernetes.client.extended.leaderelection.LeaderElector - Successfully acquired lease, became leader
11662 [candidate-leader-elector-main] DEBUG io.kubernetes.client.extended.leaderelection.LeaderElector - Attempting to renew leader lease...
11669 [leader-elector-lease-worker-1] DEBUG io.kubernetes.client.extended.leaderelection.LeaderElector - Update lock to renew lease
11672 [Time-limited test] INFO io.kubernetes.client.extended.leaderelection.LeaderElector - Closing...
11696 [leader-elector-lease-worker-1] ERROR io.kubernetes.client.extended.leaderelection.resourcelock.ConfigMapLock - received 0 when updating configmap lock
io.kubernetes.client.openapi.ApiException: java.io.InterruptedIOException: interrupted
	at io.kubernetes.client.openapi.ApiClient.execute(ApiClient.java:892)
	at io.kubernetes.client.openapi.apis.CoreV1Api.replaceNamespacedConfigMapWithHttpInfo(CoreV1Api.java:50965)
	at io.kubernetes.client.openapi.apis.CoreV1Api.replaceNamespacedConfigMap(CoreV1Api.java:50925)
	at io.kubernetes.client.extended.leaderelection.resourcelock.ConfigMapLock.update(ConfigMapLock.java:124)
	at io.kubernetes.client.extended.leaderelection.LeaderElector.updateLock(LeaderElector.java:348)
	at io.kubernetes.client.extended.leaderelection.LeaderElector.tryAcquireOrRenew(LeaderElector.java:328)
	at io.kubernetes.client.extended.leaderelection.LeaderElector.lambda$renewLoop$2(LeaderElector.java:208)
	at java.util.concurrent.FutureTask.run(FutureTask.java:266)
	at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
	at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
	at java.lang.Thread.run(Thread.java:748)
Caused by: java.io.InterruptedIOException: interrupted
	at okio.Timeout.throwIfReached(Timeout.java:146)
	at okio.Okio$1.write(Okio.java:76)
	at okio.AsyncTimeout$1.write(AsyncTimeout.java:180)
	at okio.RealBufferedSink.flush(RealBufferedSink.java:224)
	at okhttp3.internal.http1.Http1ExchangeCodec.finishRequest(Http1ExchangeCodec.java:190)
	at okhttp3.internal.connection.Exchange.finishRequest(Exchange.java:101)
	at okhttp3.internal.http.CallServerInterceptor.intercept(CallServerInterceptor.java:86)
	at okhttp3.internal.http.RealInterceptorChain.proceed(RealInterceptorChain.java:142)
	at okhttp3.internal.http.RealInterceptorChain.proceed(RealInterceptorChain.java:117)
	at io.kubernetes.client.openapi.ApiClient$2.intercept(ApiClient.java:1274)
	at okhttp3.internal.http.RealInterceptorChain.proceed(RealInterceptorChain.java:142)
	at okhttp3.internal.connection.ConnectInterceptor.intercept(ConnectInterceptor.java:43)
	at okhttp3.internal.http.RealInterceptorChain.proceed(RealInterceptorChain.java:142)
	at okhttp3.internal.http.RealInterceptorChain.proceed(RealInterceptorChain.java:117)
	at okhttp3.internal.cache.CacheInterceptor.intercept(CacheInterceptor.java:94)
	at okhttp3.internal.http.RealInterceptorChain.proceed(RealInterceptorChain.java:142)
	at okhttp3.internal.http.RealInterceptorChain.proceed(RealInterceptorChain.java:117)
	at okhttp3.internal.http.BridgeInterceptor.intercept(BridgeInterceptor.java:93)
	at okhttp3.internal.http.RealInterceptorChain.proceed(RealInterceptorChain.java:142)
	at okhttp3.internal.http.RetryAndFollowUpInterceptor.intercept(RetryAndFollowUpInterceptor.java:88)
	at okhttp3.internal.http.RealInterceptorChain.proceed(RealInterceptorChain.java:142)
	at okhttp3.internal.http.RealInterceptorChain.proceed(RealInterceptorChain.java:117)
	at okhttp3.RealCall.getResponseWithInterceptorChain(RealCall.java:229)
	at okhttp3.RealCall.execute(RealCall.java:81)
	at io.kubernetes.client.openapi.ApiClient.execute(ApiClient.java:888)
	... 10 more
11698 [candidate-leader-elector-main] INFO io.kubernetes.client.extended.leaderelection.LeaderElector - Failed to renew lease, lose leadership
11698 [Time-limited test] INFO io.kubernetes.client.extended.leaderelection.LeaderElector - Giving up the lock....
11713 [Time-limited test] INFO io.kubernetes.client.extended.leaderelection.LeaderElector - Closed
11757 [candidate-leader-elector-main] INFO io.kubernetes.client.extended.leaderelection.LeaderElector - Start leader election with lock default/leader-election-it
11757 [candidate-leader-elector-main] DEBUG io.kubernetes.client.extended.leaderelection.LeaderElector - Attempting to acquire leader lease...
11768 [leader-elector-lease-worker-1] DEBUG io.kubernetes.client.extended.leaderelection.LeaderElector - Lock not found, try to create it
11776 [leader-elector-scheduled-worker-1] DEBUG io.kubernetes.client.extended.leaderelection.LeaderElector - The tryAcquireOrRenew result is true
13214 [controller-reflector-io.kubernetes.client.openapi.models.V1Namespace-1] DEBUG io.kubernetes.client.informer.cache.ReflectorRunnable - class io.kubernetes.client.openapi.models.V1Namespace#Receiving resourceVersion 1033
13226 [controller-reflector-io.kubernetes.client.openapi.models.V1Namespace-1] DEBUG io.kubernetes.client.informer.cache.ReflectorRunnable - class io.kubernetes.client.openapi.models.V1Namespace#Receiving resourceVersion 1034
13568 [controller-reflector-io.kubernetes.client.openapi.models.V1Namespace-1] DEBUG io.kubernetes.client.informer.cache.ReflectorRunnable - class io.kubernetes.client.openapi.models.V1Namespace#Receiving resourceVersion 1038
13581 [controller-reflector-io.kubernetes.client.openapi.models.V1Namespace-1] DEBUG io.kubernetes.client.informer.cache.ReflectorRunnable - class io.kubernetes.client.openapi.models.V1Namespace#Receiving resourceVersion 1039
14763 [candidate-leader-elector-main] INFO io.kubernetes.client.extended.leaderelection.LeaderElector - Successfully acquired lease, became leader
14764 [candidate-leader-elector-main] DEBUG io.kubernetes.client.extended.leaderelection.LeaderElector - Attempting to renew leader lease...
14764 [Time-limited test] INFO io.kubernetes.client.extended.leaderelection.LeaderElector - Closing...
14766 [leader-elector-lease-worker-1] ERROR io.kubernetes.client.extended.leaderelection.LeaderElector - Error retrieving resource lock default/leader-election-it
io.kubernetes.client.openapi.ApiException: java.io.InterruptedIOException: interrupted
	at io.kubernetes.client.openapi.ApiClient.execute(ApiClient.java:892)
	at io.kubernetes.client.openapi.apis.CoreV1Api.readNamespacedEndpointsWithHttpInfo(CoreV1Api.java:46089)
	at io.kubernetes.client.openapi.apis.CoreV1Api.readNamespacedEndpoints(CoreV1Api.java:46059)
	at io.kubernetes.client.extended.leaderelection.resourcelock.EndpointsLock.get(EndpointsLock.java:61)
	at io.kubernetes.client.extended.leaderelection.LeaderElector.tryAcquireOrRenew(LeaderElector.java:258)
	at io.kubernetes.client.extended.leaderelection.LeaderElector.lambda$renewLoop$2(LeaderElector.java:208)
	at java.util.concurrent.FutureTask.run(FutureTask.java:266)
	at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
	at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
	at java.lang.Thread.run(Thread.java:748)
Caused by: java.io.InterruptedIOException: interrupted
	at okio.Timeout.throwIfReached(Timeout.java:146)
	at okio.Okio$1.write(Okio.java:76)
	at okio.AsyncTimeout$1.write(AsyncTimeout.java:180)
	at okio.RealBufferedSink.flush(RealBufferedSink.java:224)
	at okhttp3.internal.http1.Http1ExchangeCodec.finishRequest(Http1ExchangeCodec.java:190)
	at okhttp3.internal.connection.Exchange.finishRequest(Exchange.java:101)
	at okhttp3.internal.http.CallServerInterceptor.intercept(CallServerInterceptor.java:86)
	at okhttp3.internal.http.RealInterceptorChain.proceed(RealInterceptorChain.java:142)
	at okhttp3.internal.http.RealInterceptorChain.proceed(RealInterceptorChain.java:117)
	at io.kubernetes.client.openapi.ApiClient$2.intercept(ApiClient.java:1274)
	at okhttp3.internal.http.RealInterceptorChain.proceed(RealInterceptorChain.java:142)
	at okhttp3.internal.connection.ConnectInterceptor.intercept(ConnectInterceptor.java:43)
	at okhttp3.internal.http.RealInterceptorChain.proceed(RealInterceptorChain.java:142)
	at okhttp3.internal.http.RealInterceptorChain.proceed(RealInterceptorChain.java:117)
	at okhttp3.internal.cache.CacheInterceptor.intercept(CacheInterceptor.java:94)
	at okhttp3.internal.http.RealInterceptorChain.proceed(RealInterceptorChain.java:142)
	at okhttp3.internal.http.RealInterceptorChain.proceed(RealInterceptorChain.java:117)
	at okhttp3.internal.http.BridgeInterceptor.intercept(BridgeInterceptor.java:93)
	at okhttp3.internal.http.RealInterceptorChain.proceed(RealInterceptorChain.java:142)
	at okhttp3.internal.http.RetryAndFollowUpInterceptor.intercept(RetryAndFollowUpInterceptor.java:88)
	at okhttp3.internal.http.RealInterceptorChain.proceed(RealInterceptorChain.java:142)
	at okhttp3.internal.http.RealInterceptorChain.proceed(RealInterceptorChain.java:117)
	at okhttp3.RealCall.getResponseWithInterceptorChain(RealCall.java:229)
	at okhttp3.RealCall.execute(RealCall.java:81)
	at io.kubernetes.client.openapi.ApiClient.execute(ApiClient.java:888)
	... 9 more
14767 [Time-limited test] INFO io.kubernetes.client.extended.leaderelection.LeaderElector - Giving up the lock....
14767 [candidate-leader-elector-main] INFO io.kubernetes.client.extended.leaderelection.LeaderElector - Failed to renew lease, lose leadership
14798 [Time-limited test] INFO io.kubernetes.client.extended.leaderelection.LeaderElector - Closed
14841 [candidate-leader-elector-main] INFO io.kubernetes.client.extended.leaderelection.LeaderElector - Start leader election with lock default/leader-election-it
14841 [candidate-leader-elector-main] DEBUG io.kubernetes.client.extended.leaderelection.LeaderElector - Attempting to acquire leader lease...
14846 [leader-elector-lease-worker-1] DEBUG io.kubernetes.client.extended.leaderelection.LeaderElector - Lock not found, try to create it
14854 [leader-elector-lease-worker-1] ERROR io.kubernetes.client.extended.leaderelection.resourcelock.LeaseLock - received 400 when creating lease lock
io.kubernetes.client.openapi.ApiException: Bad Request
	at io.kubernetes.client.openapi.ApiClient.handleResponse(ApiClient.java:977)
	at io.kubernetes.client.openapi.ApiClient.execute(ApiClient.java:889)
	at io.kubernetes.client.openapi.apis.CoordinationV1Api.createNamespacedLeaseWithHttpInfo(CoordinationV1Api.java:228)
	at io.kubernetes.client.openapi.apis.CoordinationV1Api.createNamespacedLease(CoordinationV1Api.java:193)
	at io.kubernetes.client.extended.leaderelection.resourcelock.LeaseLock.create(LeaseLock.java:76)
	at io.kubernetes.client.extended.leaderelection.LeaderElector.createLock(LeaderElector.java:338)
	at io.kubernetes.client.extended.leaderelection.LeaderElector.tryAcquireOrRenew(LeaderElector.java:270)
	at java.util.concurrent.FutureTask.run(FutureTask.java:266)
	at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
	at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
	at java.lang.Thread.run(Thread.java:748)
14855 [leader-elector-scheduled-worker-1] DEBUG io.kubernetes.client.extended.leaderelection.LeaderElector - The tryAcquireOrRenew result is false
[The "Lock not found, try to create it" / "received 400 when creating lease lock" cycle, with an ApiException: Bad Request stack trace identical to the one above, repeats five more times at 20156, 25460, 30771, 36077, and 41381, each attempt ending with "The tryAcquireOrRenew result is false".]
43583 [controller-reflector-io.kubernetes.client.openapi.models.V1Namespace-1] INFO io.kubernetes.client.informer.cache.ReflectorRunnable - class io.kubernetes.client.openapi.models.V1Namespace#Read timeout retry list and watch
44584 [controller-reflector-io.kubernetes.client.openapi.models.V1Namespace-1] INFO io.kubernetes.client.informer.cache.ReflectorRunnable - class io.kubernetes.client.openapi.models.V1Namespace#Start listing and watching...
44597 [controller-reflector-io.kubernetes.client.openapi.models.V1Namespace-1] DEBUG io.kubernetes.client.informer.cache.ReflectorRunnable - class io.kubernetes.client.openapi.models.V1Namespace#Extract resourceVersion 1039 list meta
44598 [controller-reflector-io.kubernetes.client.openapi.models.V1Namespace-1] DEBUG io.kubernetes.client.informer.cache.ReflectorRunnable - class io.kubernetes.client.openapi.models.V1Namespace#Start watching with 1039...
44598 [controller-reflector-io.kubernetes.client.openapi.models.V1Namespace-1] DEBUG io.kubernetes.client.informer.cache.ReflectorRunnable - class io.kubernetes.client.openapi.models.V1Namespace#Start watch with resource version 1039
44601 [controller-reflector-io.kubernetes.client.openapi.models.V1Namespace-1] DEBUG io.kubernetes.client.informer.cache.ReflectorRunnable - class io.kubernetes.client.openapi.models.V1Namespace#Receiving resourceVersion 42
44605 [controller-reflector-io.kubernetes.client.openapi.models.V1Namespace-1] DEBUG io.kubernetes.client.informer.cache.ReflectorRunnable - class io.kubernetes.client.openapi.models.V1Namespace#Receiving resourceVersion 43
44605 [controller-reflector-io.kubernetes.client.openapi.models.V1Namespace-1] DEBUG io.kubernetes.client.informer.cache.ReflectorRunnable - class io.kubernetes.client.openapi.models.V1Namespace#Receiving resourceVersion 155
44605 [controller-reflector-io.kubernetes.client.openapi.models.V1Namespace-1] DEBUG io.kubernetes.client.informer.cache.ReflectorRunnable - class io.kubernetes.client.openapi.models.V1Namespace#Receiving resourceVersion 244
44605 [controller-reflector-io.kubernetes.client.openapi.models.V1Namespace-1] DEBUG io.kubernetes.client.informer.cache.ReflectorRunnable - class io.kubernetes.client.openapi.models.V1Namespace#Receiving resourceVersion 13
Error:  Tests run: 3, Failures: 0, Errors: 1, Skipped: 0, Time elapsed: 36.257 s <<< FAILURE! - in io.kubernetes.client.e2e.extended.leaderelection.LeaderElectorTest
Error:  testSingleCandidateLeaderElection[Lease]  Time elapsed: 30.048 s  <<< ERROR!
org.junit.runners.model.TestTimedOutException: test timed out after 30000 milliseconds
	at sun.misc.Unsafe.park(Native Method)
	at java.util.concurrent.locks.LockSupport.park(LockSupport.java:175)
	at java.util.concurrent.locks.AbstractQueuedSynchronizer.parkAndCheckInterrupt(AbstractQueuedSynchronizer.java:836)
	at java.util.concurrent.locks.AbstractQueuedSynchronizer.doAcquireSharedInterruptibly(AbstractQueuedSynchronizer.java:997)
	at java.util.concurrent.locks.AbstractQueuedSynchronizer.acquireSharedInterruptibly(AbstractQueuedSynchronizer.java:1304)
	at java.util.concurrent.CountDownLatch.await(CountDownLatch.java:231)
	at io.kubernetes.client.e2e.extended.leaderelection.LeaderElectorTest.testSingleCandidateLeaderElection(LeaderElectorTest.java:134)
	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
	at java.lang.reflect.Method.invoke(Method.java:498)
	at org.junit.runners.model.FrameworkMethod$1.runReflectiveCall(FrameworkMethod.java:59)
	at org.junit.internal.runners.model.ReflectiveCallable.run(ReflectiveCallable.java:12)
	at org.junit.runners.model.FrameworkMethod.invokeExplosively(FrameworkMethod.java:56)
	at org.junit.internal.runners.statements.InvokeMethod.evaluate(InvokeMethod.java:17)
	at org.junit.internal.runners.statements.FailOnTimeout$CallableStatement.call(FailOnTimeout.java:288)
	at org.junit.internal.runners.statements.FailOnTimeout$CallableStatement.call(FailOnTimeout.java:282)
	at java.util.concurrent.FutureTask.run(FutureTask.java:266)
	at java.lang.Thread.run(Thread.java:748)

[INFO] 
[INFO] Results:
[INFO] 
Error:  Errors: 
Error:    LeaderElectorTest.testSingleCandidateLeaderElection:134 » TestTimedOut test ti...
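
Two distinct failures are visible in the log. First, the ERROR entries at 11696 and 14766 come immediately after "Closing..." on the Time-limited test thread: closing the elector interrupts the in-flight lock update, so those InterruptedIOExceptions are shutdown noise rather than the flake itself. Second, in the [Lease] run every createNamespacedLease call returns 400 Bad Request, so the candidate never re-acquires the lock and the test's CountDownLatch.await (LeaderElectorTest.java:134) outlives the 30-second JUnit timeout. Logging ApiException#getResponseBody() in LeaseLock should show why the apiserver rejects the Lease object.

For context, here is a minimal sketch of the pattern the stack trace implies — a LeaderElector gated on a CountDownLatch. It assumes the standard extended-module API; the lock coordinates, identity, and durations below are illustrative, not the test's actual values:

```java
import java.time.Duration;
import java.util.concurrent.CountDownLatch;
import java.util.concurrent.TimeUnit;

import io.kubernetes.client.extended.leaderelection.LeaderElectionConfig;
import io.kubernetes.client.extended.leaderelection.LeaderElector;
import io.kubernetes.client.extended.leaderelection.resourcelock.LeaseLock;
import io.kubernetes.client.openapi.ApiClient;
import io.kubernetes.client.util.Config;

public class LeaderElectionSketch {
  public static void main(String[] args) throws Exception {
    ApiClient apiClient = Config.defaultClient();

    // Illustrative lock coordinates; the log shows default/leader-election-it.
    LeaseLock lock = new LeaseLock("default", "leader-election-it", "candidate-1", apiClient);

    LeaderElectionConfig config =
        new LeaderElectionConfig(
            lock,
            Duration.ofSeconds(10), // lease duration
            Duration.ofSeconds(8),  // renew deadline
            Duration.ofSeconds(5)); // retry period

    CountDownLatch becameLeader = new CountDownLatch(1);
    LeaderElector elector = new LeaderElector(config);

    // run() blocks, so drive it from a worker thread and signal via the latch.
    Thread worker = new Thread(() -> elector.run(becameLeader::countDown, () -> {}));
    worker.start();

    // If every createNamespacedLease call 400s, this await is what times out.
    if (!becameLeader.await(30, TimeUnit.SECONDS)) {
      System.err.println("never acquired leadership");
    }
    elector.close();
  }
}
```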
@brendandburns (Contributor, Author)

We should consider disabling this test until this is fixed.
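
For reference, since the stack trace shows this suite runs on JUnit 4, disabling it would be a one-annotation change; a minimal sketch (the real test is parameterized over lock types, so this only shows the shape):

```java
import org.junit.Ignore;
import org.junit.Test;

public class LeaderElectorTest {

  // Hypothetical placement; illustrates disabling the flaky case only.
  @Ignore("Flaky in GitHub Actions; see #1480")
  @Test
  public void testSingleCandidateLeaderElection() throws Exception {
    // ... existing test body ...
  }
}
```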

@yue9944882 (Member)

@brendandburns #1477 should fix the failing e2e test.

@brendandburns (Contributor, Author)

I think this is fixed; closing.
