
java.lang.UnsupportedOperationException: newFileChannel #553

Closed
wajda opened this issue Sep 3, 2021 · 9 comments

@wajda

wajda commented Sep 3, 2021

When running under Java 1.8 it works great, but when I attempt to run it under Java 11 (I tried different releases) it throws the following:

io.github.classgraph.ClassGraphException: Uncaught exception during scan
  at io.github.classgraph.ClassGraph.scan(ClassGraph.java:1558)
  at io.github.classgraph.ClassGraph.scan(ClassGraph.java:1575)
  at io.github.classgraph.ClassGraph.scan(ClassGraph.java:1588)
  at za.co.absa.spline.harvester.plugin.registry.AutoDiscoveryPluginRegistry$.$anonfun$PluginClasses$2(AutoDiscoveryPluginRegistry.scala:78)
  at za.co.absa.commons.lang.ARM$.using(ARM.scala:30)
  at za.co.absa.commons.lang.ARM$ResourceWrapper.flatMap(ARM.scala:43)
  at za.co.absa.spline.harvester.plugin.registry.AutoDiscoveryPluginRegistry$.<init>(AutoDiscoveryPluginRegistry.scala:78)
  at za.co.absa.spline.harvester.plugin.registry.AutoDiscoveryPluginRegistry$.<clinit>(AutoDiscoveryPluginRegistry.scala)
  at za.co.absa.spline.harvester.plugin.registry.AutoDiscoveryPluginRegistry.<init>(AutoDiscoveryPluginRegistry.scala:48)
  at za.co.absa.spline.harvester.LineageHarvesterFactory.<init>(LineageHarvesterFactory.scala:40)
  at za.co.absa.spline.harvester.conf.DefaultSplineConfigurer.harvesterFactory(DefaultSplineConfigurer.scala:140)
  at za.co.absa.spline.harvester.conf.DefaultSplineConfigurer.queryExecutionEventHandler(DefaultSplineConfigurer.scala:104)
  at za.co.absa.spline.harvester.QueryExecutionEventHandlerFactory.initEventHandler(QueryExecutionEventHandlerFactory.scala:64)
  at za.co.absa.spline.harvester.QueryExecutionEventHandlerFactory.$anonfun$createEventHandler$6(QueryExecutionEventHandlerFactory.scala:43)
  at za.co.absa.spline.harvester.QueryExecutionEventHandlerFactory.withErrorHandling(QueryExecutionEventHandlerFactory.scala:55)
  at za.co.absa.spline.harvester.QueryExecutionEventHandlerFactory.createEventHandler(QueryExecutionEventHandlerFactory.scala:42)
  at za.co.absa.spline.harvester.listener.SplineQueryExecutionListener$.za$co$absa$spline$harvester$listener$SplineQueryExecutionListener$$constructEventHandler(SplineQueryExecutionListener.scala:67)
  at za.co.absa.spline.harvester.listener.SplineQueryExecutionListener.<init>(SplineQueryExecutionListener.scala:37)
  at java.base/jdk.internal.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
  at java.base/jdk.internal.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:62)
  at java.base/jdk.internal.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
  at java.base/java.lang.reflect.Constructor.newInstance(Constructor.java:490)
  at org.apache.spark.util.Utils$.$anonfun$loadExtensions$1(Utils.scala:2788)
  at scala.collection.TraversableLike.$anonfun$flatMap$1(TraversableLike.scala:245)
  at scala.collection.mutable.ResizableArray.foreach(ResizableArray.scala:62)
  at scala.collection.mutable.ResizableArray.foreach$(ResizableArray.scala:55)
  at scala.collection.mutable.ArrayBuffer.foreach(ArrayBuffer.scala:49)
  at scala.collection.TraversableLike.flatMap(TraversableLike.scala:245)
  at scala.collection.TraversableLike.flatMap$(TraversableLike.scala:242)
  at scala.collection.AbstractTraversable.flatMap(Traversable.scala:108)
  at org.apache.spark.util.Utils$.loadExtensions(Utils.scala:2777)
  at org.apache.spark.sql.util.ExecutionListenerManager.$anonfun$new$2(QueryExecutionListener.scala:88)
  at org.apache.spark.sql.util.ExecutionListenerManager.$anonfun$new$2$adapted(QueryExecutionListener.scala:87)
  at scala.Option.foreach(Option.scala:407)
  at org.apache.spark.sql.util.ExecutionListenerManager.<init>(QueryExecutionListener.scala:87)
  at org.apache.spark.sql.internal.BaseSessionStateBuilder.$anonfun$listenerManager$2(BaseSessionStateBuilder.scala:319)
  at scala.Option.getOrElse(Option.scala:189)
  at org.apache.spark.sql.internal.BaseSessionStateBuilder.listenerManager(BaseSessionStateBuilder.scala:319)
  at org.apache.spark.sql.internal.BaseSessionStateBuilder.build(BaseSessionStateBuilder.scala:346)
  at org.apache.spark.sql.SparkSession$.org$apache$spark$sql$SparkSession$$instantiateSessionState(SparkSession.scala:1145)
  at org.apache.spark.sql.SparkSession.$anonfun$sessionState$2(SparkSession.scala:159)
  at scala.Option.getOrElse(Option.scala:189)
  at org.apache.spark.sql.SparkSession.sessionState$lzycompute(SparkSession.scala:155)
  at org.apache.spark.sql.SparkSession.sessionState(SparkSession.scala:152)
  at org.apache.spark.sql.SparkSession.$anonfun$new$3(SparkSession.scala:112)
  at scala.Option.map(Option.scala:230)
  at org.apache.spark.sql.SparkSession.$anonfun$new$1(SparkSession.scala:112)
  at org.apache.spark.sql.internal.SQLConf$.get(SQLConf.scala:196)
  at org.apache.spark.sql.types.DataType.sameType(DataType.scala:97)
  at org.apache.spark.sql.catalyst.analysis.TypeCoercion$.$anonfun$haveSameType$1(TypeCoercion.scala:291)
  at org.apache.spark.sql.catalyst.analysis.TypeCoercion$.$anonfun$haveSameType$1$adapted(TypeCoercion.scala:291)
  at scala.collection.LinearSeqOptimized.forall(LinearSeqOptimized.scala:85)
  at scala.collection.LinearSeqOptimized.forall$(LinearSeqOptimized.scala:82)
  at scala.collection.immutable.List.forall(List.scala:89)
  at org.apache.spark.sql.catalyst.analysis.TypeCoercion$.haveSameType(TypeCoercion.scala:291)
  at org.apache.spark.sql.catalyst.expressions.ComplexTypeMergingExpression.dataTypeCheck(Expression.scala:1057)
  at org.apache.spark.sql.catalyst.expressions.ComplexTypeMergingExpression.dataTypeCheck$(Expression.scala:1052)
  at org.apache.spark.sql.catalyst.expressions.If.dataTypeCheck(conditionalExpressions.scala:36)
  at org.apache.spark.sql.catalyst.expressions.ComplexTypeMergingExpression.org$apache$spark$sql$catalyst$expressions$ComplexTypeMergingExpression$$internalDataType(Expression.scala:1063)
  at org.apache.spark.sql.catalyst.expressions.ComplexTypeMergingExpression.org$apache$spark$sql$catalyst$expressions$ComplexTypeMergingExpression$$internalDataType$(Expression.scala:1062)
  at org.apache.spark.sql.catalyst.expressions.If.org$apache$spark$sql$catalyst$expressions$ComplexTypeMergingExpression$$internalDataType$lzycompute(conditionalExpressions.scala:36)
  at org.apache.spark.sql.catalyst.expressions.If.org$apache$spark$sql$catalyst$expressions$ComplexTypeMergingExpression$$internalDataType(conditionalExpressions.scala:36)
  at org.apache.spark.sql.catalyst.expressions.ComplexTypeMergingExpression.dataType(Expression.scala:1067)
  at org.apache.spark.sql.catalyst.expressions.ComplexTypeMergingExpression.dataType$(Expression.scala:1067)
  at org.apache.spark.sql.catalyst.expressions.If.dataType(conditionalExpressions.scala:36)
  at org.apache.spark.sql.catalyst.encoders.ExpressionEncoder.isSerializedAsStruct(ExpressionEncoder.scala:309)
  at org.apache.spark.sql.catalyst.encoders.ExpressionEncoder.isSerializedAsStructForTopLevel(ExpressionEncoder.scala:319)
  at org.apache.spark.sql.catalyst.encoders.ExpressionEncoder.<init>(ExpressionEncoder.scala:248)
  at org.apache.spark.sql.catalyst.encoders.ExpressionEncoder$.apply(ExpressionEncoder.scala:61)
  at org.apache.spark.sql.Encoders$.product(Encoders.scala:285)
  at org.apache.spark.sql.LowPrioritySQLImplicits.newProductEncoder(SQLImplicits.scala:251)
  at org.apache.spark.sql.LowPrioritySQLImplicits.newProductEncoder$(SQLImplicits.scala:251)
  at org.apache.spark.sql.SQLImplicits.newProductEncoder(SQLImplicits.scala:32)
  ... 47 elided
Caused by: java.lang.UnsupportedOperationException: newFileChannel
  at java.base/jdk.internal.jrtfs.JrtFileSystem.newFileChannel(JrtFileSystem.java:338)
  at java.base/jdk.internal.jrtfs.JrtPath.newFileChannel(JrtPath.java:702)
  at java.base/jdk.internal.jrtfs.JrtFileSystemProvider.newFileChannel(JrtFileSystemProvider.java:316)
  at java.base/java.nio.channels.FileChannel.open(FileChannel.java:292)
  at java.base/java.nio.channels.FileChannel.open(FileChannel.java:345)
  at nonapi.io.github.classgraph.fileslice.PathSlice.<init>(PathSlice.java:118)
  at nonapi.io.github.classgraph.fileslice.PathSlice.<init>(PathSlice.java:140)
  at io.github.classgraph.ClasspathElementPathDir$1.openClassfile(ClasspathElementPathDir.java:253)
  at io.github.classgraph.Classfile.<init>(Classfile.java:1925)
  at io.github.classgraph.Scanner$ClassfileScannerWorkUnitProcessor.processWorkUnit(Scanner.java:734)
  at io.github.classgraph.Scanner$ClassfileScannerWorkUnitProcessor.processWorkUnit(Scanner.java:657)
  at nonapi.io.github.classgraph.concurrency.WorkQueue.runWorkLoop(WorkQueue.java:246)
  at nonapi.io.github.classgraph.concurrency.WorkQueue.runWorkQueue(WorkQueue.java:161)
  at io.github.classgraph.Scanner.processWorkUnits(Scanner.java:342)
  at io.github.classgraph.Scanner.performScan(Scanner.java:970)
  at io.github.classgraph.Scanner.openClasspathElementsThenScan(Scanner.java:1112)
  at io.github.classgraph.Scanner.call(Scanner.java:1146)
  at io.github.classgraph.Scanner.call(Scanner.java:83)
  at java.base/java.util.concurrent.FutureTask.run(FutureTask.java:264)
  at java.base/java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1128)
  at java.base/java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:628)
  at java.base/java.lang.Thread.run(Thread.java:829)
@lukehutch
Member

Hi, thanks for the report. I have never seen this, and I run on JDK 11+ all the time, so I don't know how to reproduce this.

I was not aware of jrtfs previously, but it looks like it does not support FileChannel, though it does support SeekableByteChannel. However, I need to know what format the jrtfs paths are in, in order to know how to support this.

I checked in a change to stop ClassGraph from dying in this sort of situation. This change would prevent the above stacktrace from occurring, but it would also mean that jrt.jar is skipped (i.e. you won't be able to scan system classes with this change, until the bug is fixed).

Would you please be able to produce a minimal test project that shows the problem?
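For illustration, the FileChannel/SeekableByteChannel asymmetry described above can be reproduced with a small standalone snippet against the jrt filesystem (my own sketch, JDK 9+ only; not ClassGraph code):

```java
import java.net.URI;
import java.nio.channels.FileChannel;
import java.nio.channels.SeekableByteChannel;
import java.nio.file.FileSystem;
import java.nio.file.FileSystems;
import java.nio.file.Files;
import java.nio.file.Path;
import java.nio.file.StandardOpenOption;

public class JrtChannelDemo {
    public static void main(String[] args) throws Exception {
        // The jrt:/ filesystem exposes the runtime image on JDK 9+.
        FileSystem jrtFs = FileSystems.getFileSystem(URI.create("jrt:/"));
        Path objectClass = jrtFs.getPath("/modules/java.base/java/lang/Object.class");

        // Opening a SeekableByteChannel on a jrt path works fine...
        try (SeekableByteChannel ch = Files.newByteChannel(objectClass)) {
            System.out.println("newByteChannel OK, size=" + ch.size());
        }

        // ...but FileChannel.open on the same path throws
        // UnsupportedOperationException, which is the failure in
        // the stack trace above.
        try (FileChannel ch = FileChannel.open(objectClass, StandardOpenOption.READ)) {
            System.out.println("unexpected: FileChannel opened");
        } catch (UnsupportedOperationException e) {
            System.out.println("FileChannel.open failed: " + e.getMessage());
        }
    }
}
```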

@wajda
Author

wajda commented Sep 4, 2021

Thanks Luke, I'll try to do it over the weekend. For now I can say that I run it from inside the Apache Spark shell (we develop a Spark driver extension), so perhaps this will give you a hint.

@wajda
Author

wajda commented Sep 7, 2021

Alright, this is what I have found so far. It has nothing to do with Spark, but with Scala (the Scala REPL, to be precise). The error only occurs when the scanner is executed from inside a Scala REPL, and only with Java 11.
I have tested different scenarios (javac, scalac, Scala REPL, with both Java 1.8 and 11), but that is the only failing combination.

[wajda@alex-xps aaa]$ scala
Welcome to Scala 2.12.12 (OpenJDK 64-Bit Server VM, Java 11.0.12).
Type in expressions for evaluation. Or try :help.

scala> :require /home/wajda/.m2/repository/io/github/classgraph/classgraph/4.8.115/classgraph-4.8.115.jar
Added '/home/wajda/.m2/repository/io/github/classgraph/classgraph/4.8.115/classgraph-4.8.115.jar' to classpath.

scala> import io.github.classgraph.ClassGraph
import io.github.classgraph.ClassGraph

scala> new ClassGraph().enableClassInfo.scan().getAllClasses
io.github.classgraph.ClassGraphException: Uncaught exception during scan
  at io.github.classgraph.ClassGraph.scan(ClassGraph.java:1558)
  at io.github.classgraph.ClassGraph.scan(ClassGraph.java:1575)
  at io.github.classgraph.ClassGraph.scan(ClassGraph.java:1588)
  ... 28 elided
Caused by: java.lang.UnsupportedOperationException: newFileChannel
  at java.base/jdk.internal.jrtfs.JrtFileSystem.newFileChannel(JrtFileSystem.java:338)
  at java.base/jdk.internal.jrtfs.JrtPath.newFileChannel(JrtPath.java:702)
  at java.base/jdk.internal.jrtfs.JrtFileSystemProvider.newFileChannel(JrtFileSystemProvider.java:316)
  at java.base/java.nio.channels.FileChannel.open(FileChannel.java:292)
  at java.base/java.nio.channels.FileChannel.open(FileChannel.java:345)
  at nonapi.io.github.classgraph.fileslice.PathSlice.<init>(PathSlice.java:118)
  at nonapi.io.github.classgraph.fileslice.PathSlice.<init>(PathSlice.java:140)
  at io.github.classgraph.ClasspathElementPathDir$1.openClassfile(ClasspathElementPathDir.java:253)
  at io.github.classgraph.Classfile.<init>(Classfile.java:1925)
  at io.github.classgraph.Scanner$ClassfileScannerWorkUnitProcessor.processWorkUnit(Scanner.java:734)
  at io.github.classgraph.Scanner$ClassfileScannerWorkUnitProcessor.processWorkUnit(Scanner.java:657)
  at nonapi.io.github.classgraph.concurrency.WorkQueue.runWorkLoop(WorkQueue.java:246)
  at nonapi.io.github.classgraph.concurrency.WorkQueue.runWorkQueue(WorkQueue.java:161)
  at io.github.classgraph.Scanner.processWorkUnits(Scanner.java:342)
  at io.github.classgraph.Scanner.performScan(Scanner.java:970)
  at io.github.classgraph.Scanner.openClasspathElementsThenScan(Scanner.java:1112)
  at io.github.classgraph.Scanner.call(Scanner.java:1146)
  at io.github.classgraph.Scanner.call(Scanner.java:83)
  at java.base/java.util.concurrent.FutureTask.run(FutureTask.java:264)
  at java.base/java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1128)
  at java.base/java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:628)
  at java.base/java.lang.Thread.run(Thread.java:829)

@lukehutch
Member

@wajda thanks for looking into this. Is there a way I can connect a remote debugger (Eclipse) to the REPL, to debug commands that are launched? I did a search for this but can't find good info. I need to figure out what is different about the REPL environment, and why.

Are you launching your Spark jobs from the REPL?

@wajda
Author

wajda commented Sep 8, 2021

I've never debugged Scala REPL, but a quick google gave me this:

scala -cp path/to/my/jar/some.jar -J-agentlib:jdwp=transport=dt_socket,server=y,suspend=n,address=9998

Spark jobs can be executed in different ways, e.g.: a) as a standalone Java app using Spark as a library, b) submitted to a Spark cluster via the spark-submit facility, or c) using spark-shell, which is basically a Scala REPL with some extensions.

The library (a Spark agent) that we develop is supposed to be included in the Spark driver classpath and to support whichever execution mode the user chooses, including the REPL.

@lukehutch
Member

Thanks for the pointer. With this info I was able to debug this.

The JrtFS paths are of the form "/modules/java.base/module-info.class". ClassGraph scans modules using the JPMS API, so we can just ignore these paths (or this filesystem) when scanning on JDK 9+.
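For illustration only, one way to recognize and skip such paths is to check the filesystem provider's URI scheme. This is a hedged sketch under my own naming (`isJrtPath` is a hypothetical helper), not necessarily how ClassGraph implements the skip; it needs JDK 11+ for `Path.of`:

```java
import java.net.URI;
import java.nio.file.FileSystems;
import java.nio.file.Path;

public class JrtSkipCheck {
    // Returns true if the path lives in the jrt:/ runtime-image filesystem,
    // i.e. it can be skipped during ordinary classpath scanning because
    // modules are scanned via the JPMS API instead.
    static boolean isJrtPath(Path path) {
        return "jrt".equals(path.getFileSystem().provider().getScheme());
    }

    public static void main(String[] args) {
        Path jrt = FileSystems.getFileSystem(URI.create("jrt:/"))
                .getPath("/modules/java.base/module-info.class");
        Path local = Path.of("module-info.class"); // default (file) filesystem
        System.out.println(isJrtPath(jrt));   // true
        System.out.println(isJrtPath(local)); // false
    }
}
```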

However, supposedly JrtFS was retrofitted all the way back to Java 1.8, though I don't know what it was used for on that version; I have only ever seen rt.jar added to the standard classpath. I tried testing this on Java 1.8 with the Scala REPL; however, my version of Scala was compiled with a newer JDK, and I don't want to mess with figuring out and installing the newest version of Scala that runs on Java 1.8. If you are sufficiently motivated to try this, it would be great if you could test it -- thanks!

Otherwise, I'll work on ignoring this filesystem.

lukehutch added a commit that referenced this issue Sep 9, 2021
@lukehutch
Member

This should be fixed in ClassGraph version 4.8.116 (just released). Please test with 4.8.115 with the Scala REPL on JDK 8, if you're able to do so easily. Otherwise, don't worry about it, it's probably not important, as ClassGraph auto-detects rt.jar in all the standard locations anyway for JDK 7 and 8.

Thanks for the report, and for the helpful info on reproducing in Scala! Sorry that it took longer than my usual turnaround time to fix (I aim for 24 hours -- I have a lot going on right now!).

@wajda
Author

wajda commented Sep 9, 2021

Tested on Scala REPL 2.12 with several different JDK 1.8 builds: both v115 and v116 work.
On Scala REPL with JDK 11: v116 works like a charm!

I confirm the issue is fixed.
Thank you a million, Luke!

@lukehutch
Member

lukehutch commented Sep 9, 2021

Oops, I just remembered that you said in your original report that it worked fine in JDK 8 with 4.8.115. 😊

You're welcome, thank you for confirming! I'm glad the fix was simple in the end!
