Spark instrumentation tests are flaky #6957

Open
PerfectSlayer opened this issue Apr 26, 2024 · 0 comments
Labels
inst: apache spark Apache Spark instrumentation tag: flaky test Flaky tests

Comments

@PerfectSlayer
Contributor

Test suite

compute a JOIN sql query plan, from Spark32SqlTest.
Marked flaky by CircleCI.

Error

Condition not satisfied:

expected[key] == actual[key]
|       ||    |  |     ||
|       ||    |  |     |nodeId
|       ||    |  |     -178151041
|       ||    |  [node:Exchange, nodeId:-178151041, metrics:[[shuffle records written:1, type:sum], [shuffle write time:615575, type:nsTiming], [records read:1, type:sum], [local bytes read:59, type:size], [fetch wait time:0, type:timing], [local blocks read:1, type:sum], [data size:16, type:size], [shuffle bytes written:59, type:size]]]
|       ||    false
|       |nodeId
|       -178148375
[node:Exchange, nodeId:-178148375, metrics:[[data size:any, type:size], [fetch wait time:any, type:timing], [local blocks read:any, type:sum], [local bytes read:any, type:size], [records read:any, type:sum], [shuffle bytes written:any, type:size], [shuffle records written:any, type:sum], [shuffle write time:any, type:nsTiming]]]

[node:Exchange, nodeId:-178148375, metrics:[[data size:any, type:size], [fetch wait time:any, type:timing], [local blocks read:any, type:sum], [local bytes read:any, type:size], [records read:any, type:sum], [shuffle bytes written:any, type:size], [shuffle records written:any, type:sum], [shuffle write time:any, type:nsTiming]]] does not match [node:Exchange, nodeId:-178151041, metrics:[[shuffle records written:1, type:sum], [shuffle write time:615575, type:nsTiming], [records read:1, type:sum], [local bytes read:59, type:size], [fetch wait time:0, type:timing], [local blocks read:1, type:sum], [data size:16, type:size], [shuffle bytes written:59, type:size]]]

	at datadog.trace.instrumentation.spark.AbstractSpark24SqlTest.assertSQLPlanEquals_closure1(AbstractSpark24SqlTest.groovy:72)
	at groovy.lang.Closure.call(Closure.java:412)
	at groovy.lang.Closure.call(Closure.java:428)
	at datadog.trace.instrumentation.spark.AbstractSpark24SqlTest.assertSQLPlanEquals(AbstractSpark24SqlTest.groovy:66)
	at datadog.trace.instrumentation.spark.AbstractSpark24SqlTest.assertSQLPlanEquals_closure7(AbstractSpark24SqlTest.groovy:109)
	at groovy.lang.Closure.call(Closure.java:412)
	at groovy.lang.Closure.call(Closure.java:428)
	at datadog.trace.instrumentation.spark.AbstractSpark24SqlTest.assertSQLPlanEquals(AbstractSpark24SqlTest.groovy:108)
	at datadog.trace.instrumentation.spark.AbstractSpark24SqlTest.assertSQLPlanEquals_closure7(AbstractSpark24SqlTest.groovy:109)
	at groovy.lang.Closure.call(Closure.java:412)
	at groovy.lang.Closure.call(Closure.java:428)
	at datadog.trace.instrumentation.spark.AbstractSpark24SqlTest.assertSQLPlanEquals(AbstractSpark24SqlTest.groovy:108)
	at datadog.trace.instrumentation.spark.AbstractSpark24SqlTest.assertSQLPlanEquals_closure7(AbstractSpark24SqlTest.groovy:109)
	at groovy.lang.Closure.call(Closure.java:412)
	at groovy.lang.Closure.call(Closure.java:428)
	at datadog.trace.instrumentation.spark.AbstractSpark24SqlTest.assertSQLPlanEquals(AbstractSpark24SqlTest.groovy:108)
	at datadog.trace.instrumentation.spark.AbstractSpark24SqlTest.assertSQLPlanEquals_closure7(AbstractSpark24SqlTest.groovy:109)
	at groovy.lang.Closure.call(Closure.java:412)
	at groovy.lang.Closure.call(Closure.java:428)
	at datadog.trace.instrumentation.spark.AbstractSpark24SqlTest.assertSQLPlanEquals(AbstractSpark24SqlTest.groovy:108)
	at datadog.trace.instrumentation.spark.AbstractSpark24SqlTest.assertStringSQLPlanEquals(AbstractSpark24SqlTest.groovy:39)
	at datadog.trace.instrumentation.spark.AbstractSpark32SqlTest$__spock_feature_2_1_closure2.closure14$_closure18(AbstractSpark32SqlTest.groovy:145)
	at datadog.trace.agent.test.asserts.SpanAssert.assertSpan(SpanAssert.groovy:37)
	at datadog.trace.agent.test.asserts.SpanAssert.assertSpan(SpanAssert.groovy:28)
	at datadog.trace.agent.test.asserts.TraceAssert.span(TraceAssert.groovy:75)
	at datadog.trace.agent.test.asserts.TraceAssert.span(TraceAssert.groovy:63)
	at datadog.trace.instrumentation.spark.AbstractSpark32SqlTest.$spock_feature_2_1_closure2$_closure14(AbstractSpark32SqlTest.groovy:141)
	at datadog.trace.agent.test.asserts.TraceAssert.assertTrace(TraceAssert.groovy:50)
	at datadog.trace.agent.test.asserts.ListWriterAssert.trace(ListWriterAssert.groovy:124)
	at datadog.trace.agent.test.asserts.ListWriterAssert.trace(ListWriterAssert.groovy:109)
	at datadog.trace.instrumentation.spark.AbstractSpark32SqlTest.compute a JOIN sql query plan_closure2(AbstractSpark32SqlTest.groovy:126)
	at datadog.trace.agent.test.asserts.ListWriterAssert.assertTraces(ListWriterAssert.groovy:59)
	at datadog.trace.agent.test.asserts.ListWriterAssert.assertTraces(ListWriterAssert.groovy:41)
	at datadog.trace.agent.test.AgentTestRunner.assertTraces(AgentTestRunner.groovy:600)
	at datadog.trace.agent.test.AgentTestRunner.assertTraces(AgentTestRunner.groovy:589)
	at datadog.trace.instrumentation.spark.AbstractSpark32SqlTest.compute a JOIN sql query plan(AbstractSpark32SqlTest.groovy:125)

Cause

The SQL plan node ids do not match between the expected and actual plans (-178148375 vs -178151041), while the rest of the Exchange node (name and metrics) matches.
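The failure output suggests the node ids differ between runs even though the expected plan already uses "any" wildcards for metric values. A minimal sketch of one possible mitigation, in Java rather than the Groovy test code, using a hypothetical nodesMatch helper (not the actual assertSQLPlanEquals implementation): treat the run-dependent nodeId the same way the metric wildcards are treated.

```java
import java.util.Map;
import java.util.Objects;

public class PlanCompare {
    // Hypothetical helper: compare a plan node's fields, but skip values that
    // are not stable across runs. "any" is the wildcard convention the expected
    // plans in the test already use for metric values; nodeId is skipped
    // because it appears to change from run to run.
    static boolean nodesMatch(Map<String, Object> expected, Map<String, Object> actual) {
        for (Map.Entry<String, Object> e : expected.entrySet()) {
            Object exp = e.getValue();
            Object act = actual.get(e.getKey());
            if ("any".equals(exp)) continue;           // wildcard: accept any value
            if ("nodeId".equals(e.getKey())) continue; // run-dependent, do not compare
            if (!Objects.equals(exp, act)) return false;
        }
        return true;
    }

    public static void main(String[] args) {
        // Simplified versions of the expected/actual Exchange nodes from the failure.
        Map<String, Object> expected =
            Map.of("node", "Exchange", "nodeId", -178148375, "data size", "any");
        Map<String, Object> actual =
            Map.of("node", "Exchange", "nodeId", -178151041, "data size", 16);
        System.out.println(nodesMatch(expected, actual)); // prints true
    }
}
```

With this relaxation, the two Exchange nodes from the failure compare equal, and only genuinely stable fields (node name, metric names/types) can fail the assertion.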

Environment

Reproducible on any Java version and JVM.

Logs

test_spark32.zip

@PerfectSlayer PerfectSlayer added tag: flaky test Flaky tests inst: apache spark Apache Spark instrumentation labels Apr 26, 2024
PerfectSlayer added a commit that referenced this issue Apr 26, 2024