
Failing test: Chrome X-Pack UI Functional Tests.x-pack/test/functional/apps/ml/permissions/full_ml_access·ts - machine learning permissions for user with full ML access with data loaded (ft_ml_poweruser) should display elements on File Data Visualizer page correctly #104042

Closed
kibanamachine opened this issue Jul 1, 2021 · 16 comments
Labels
blocker · failed-test · :ml · skipped-test · v8.0.0

Comments

@kibanamachine (Contributor) commented Jul 1, 2021

A test failed on a tracked branch

TimeoutError: The element [data-test-subj="dataVisualizerPageFileLoading"] was still present when it should have disappeared.
Wait timed out after 10019ms
    at /dev/shm/workspace/parallel/23/kibana/node_modules/selenium-webdriver/lib/webdriver.js:842:17
    at runMicrotasks (<anonymous>)
    at processTicksAndRejections (internal/process/task_queues.js:95:5) {
  remoteStacktrace: ''
}

First failure: Jenkins Build
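For context, this error comes from a polling wait in the functional test timing out. Below is a minimal sketch of that kind of wait, written against plain selenium-webdriver rather than Kibana's actual FTR helpers; the function name and setup are illustrative only, not the real test code.

```ts
// Illustrative sketch: poll until an element identified by a data-test-subj
// attribute disappears, failing with a TimeoutError after ~10s if it is still present.
import { Builder, By } from 'selenium-webdriver';

async function waitForLoadingIndicatorToDisappear() {
  const driver = await new Builder().forBrowser('chrome').build();
  try {
    await driver.wait(
      async () => {
        // The wait resolves once no matching elements remain in the DOM.
        const elements = await driver.findElements(
          By.css('[data-test-subj="dataVisualizerPageFileLoading"]')
        );
        return elements.length === 0;
      },
      10000,
      'The element [data-test-subj="dataVisualizerPageFileLoading"] was still present when it should have disappeared.'
    );
  } finally {
    await driver.quit();
  }
}
```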

kibanamachine added the failed-test label Jul 1, 2021
botelastic bot added the needs-team label Jul 1, 2021
mistic added the :ml label Jul 1, 2021
@elasticmachine (Contributor)

Pinging @elastic/ml-ui (:ml)

botelastic bot removed the needs-team label Jul 1, 2021
@kibanamachine (Contributor, Author)

New failure: Jenkins Build

@kibanamachine (Contributor, Author)

New failure: Jenkins Build

1 similar comment
@kibanamachine (Contributor, Author)

New failure: Jenkins Build

spalger added a commit that referenced this issue Jul 1, 2021
@spalger (Contributor) commented Jul 1, 2021

23 failures in the last couple of hours across master and PRs for tests in the functional/apps/ml/permissions directory. Something might have changed in ES to make these more flaky. Skipped the entire containing suite.


master/8.0: a06c0f1
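For background on what "skipped the entire containing suite" typically looks like: Kibana functional tests are mocha-style, so a whole suite is usually skipped by switching `describe` to `describe.skip` with a comment linking the tracking issue. A minimal sketch under that assumption follows; the suite layout and wording here are illustrative, not the contents of commit a06c0f1.

```ts
// Illustrative sketch of skipping a flaky suite in a mocha-style functional test.
// FLAKY: https://github.com/elastic/kibana/issues/104042
describe.skip('machine learning permissions', function () {
  it('should display elements on File Data Visualizer page correctly', async () => {
    // The test body is unchanged; it simply does not run while the suite is skipped.
  });
});
```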

@kibanamachine (Contributor, Author)

New failure: Jenkins Build

@kibanamachine (Contributor, Author)

New failure: Jenkins Build

1 similar comment
@kibanamachine (Contributor, Author)

New failure: Jenkins Build

@droberts195 (Contributor)

This will be the same problem as elastic/elasticsearch#74810. It should be fixed by elastic/elasticsearch#74814.

Given the increased prominence of file upload in 7.14 I think we should consider this a blocker for 7.14.0. It wouldn't look good to put a feature centre-stage in the add-data workflow, shout about it in the release blog, and then have it repeatedly time out when used.

@spalger (Contributor) commented Jul 1, 2021

I'll trigger a new ES snapshot build so we can validate in a PR that these tests aren't flaky anymore, and get this unskipped to unblock 7.14 as quickly as possible.

@spalger (Contributor) commented Jul 1, 2021

Interestingly, though, these tests were only flaky on master; maybe the 7.14 backport of the problematic change missed the new nightly ES builds.

@droberts195 (Contributor) commented Jul 1, 2021

> Interestingly, though, these tests were only flaky on master; maybe the 7.14 backport of the problematic change missed the new nightly ES builds.

Yes, you're right. As of last night the problematic change was not in 7.14. Ironically the ES test that picked this problem up failed often enough in the 7.14 backport PR to delay merging it. It was eventually merged after adding the performance fix to it - see elastic/elasticsearch#74829 (comment).

The inefficiency that causes timeouts has also been fixed in ES master by elastic/elasticsearch#74814.

I don't think it's urgent to unmute these Kibana tests on master. The tests are still unmuted on 7.14, so as long as they don't start failing on 7.14 when the next ES snapshot is promoted, that will show that the ES fix worked.

@kibanamachine (Contributor, Author)

New failure: Jenkins Build

@spalger (Contributor) commented Jul 1, 2021

Starting to see failures in 7.13? Was the problematic change backported that far back? How about the fix? The 8.0 snapshot has been updated; I'm going to kick off rebuilds of all the snapshots in hopes of keeping this flakiness from spreading too far into the backport branches.

ES snapshot builds for Kibana CI are now running for branches 7.x, 7.14, and 7.13

@droberts195 (Contributor)

7.13 now has the fix.

However, unlike 7.14, there was a period when the 7.13 branch had the breakage but not the fix, so a snapshot promoted from that period will show this problem. Promoting the latest ES 7.13 will fix it.

jgowdyelastic pushed a commit that referenced this issue Jul 6, 2021
jgowdyelastic pushed a commit that referenced this issue Jul 6, 2021
madirey pushed a commit to madirey/kibana that referenced this issue Jul 6, 2021
@pheyos (Member) commented Jul 12, 2021

Fixed with #104387

pheyos closed this as completed Jul 12, 2021