java.lang.IllegalAccessError when running with Spark 2.4.0 #59
This error looks strange; these classes live in the same package. I checked the code of Spark 2.4.0, and it finishes successfully.
Exception in thread "main" java.lang.IllegalAccessError: class org.apache.spark.storage.StorageUtils$ (in unnamed module @0x2a265ea9) cannot access class sun.nio.ch.DirectBuffer (in module java.base) because module java.base does not export sun.nio.ch to unnamed module @0x2a265ea9
The above error occurs while running a Spark application.
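The `does not export sun.nio.ch` message is the JVM module system (JDK 9+, enforced strictly since JDK 16) blocking access to an internal package. A commonly reported workaround, sketched below, is to re-export that package to unnamed modules via JVM options; this assumes a JDK that accepts `--add-exports` and a Spark version that otherwise supports that JDK. `your-app.jar` is a placeholder for the actual application jar.

```shell
# Sketch: re-export sun.nio.ch to code on the classpath (unnamed modules)
# for both driver and executors. Placeholder jar name; adjust to your build.
spark-submit \
  --conf "spark.driver.extraJavaOptions=--add-exports=java.base/sun.nio.ch=ALL-UNNAMED" \
  --conf "spark.executor.extraJavaOptions=--add-exports=java.base/sun.nio.ch=ALL-UNNAMED" \
  --master "local[*]" \
  your-app.jar
```

Alternatively, as noted above, pointing JAVA_HOME at a JDK version the Spark release was built against avoids the module restriction entirely.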
@gmcaps I am facing a similar issue as well; were you able to resolve this?
@gmcaps I was able to solve the issue you mentioned by adding JAVA_HOME to the PATH environment variable.
Before building, I changed the Spark version in pom.xml:
<spark.version>2.4.0</spark.version>
spark-submit was run in local[*] mode. It fails with:
java.lang.IllegalAccessError: tried to access class org.apache.spark.shuffle.sort.ShuffleInMemorySorter from class org.apache.spark.shuffle.sort.SplashUnsafeSorter
at org.apache.spark.shuffle.sort.SplashUnsafeSorter.<init>(SplashUnsafeSorter.scala:77)
at org.apache.spark.shuffle.sort.SplashUnsafeShuffleWriter.<init>(SplashUnsafeShuffleWriter.scala:56)
at org.apache.spark.shuffle.SplashShuffleManager.getWriter(SplashShuffleManager.scala:84)
at org.apache.spark.scheduler.ShuffleMapTask.runTask(ShuffleMapTask.scala:98)
at org.apache.spark.scheduler.ShuffleMapTask.runTask(ShuffleMapTask.scala:55)
at org.apache.spark.scheduler.Task.run(Task.scala:121)
at org.apache.spark.executor.Executor$TaskRunner$$anonfun$10.apply(Executor.scala:402)
at org.apache.spark.util.Utils$.tryWithSafeFinally(Utils.scala:1360)
at org.apache.spark.executor.Executor$TaskRunner.run(Executor.scala:408)
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
at java.lang.Thread.run(Thread.java:748)
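For what it's worth, "same package" is not sufficient for package-private access on the JVM: two classes must be in the same *runtime* package, meaning the same package name AND the same defining class loader. If the splash classes end up loaded by a different class loader than Spark's own jars (for example via `--jars` rather than `spark.driver.extraClassPath`), this exact IllegalAccessError can occur. A minimal diagnostic sketch (class names here are illustrative, not from the issue):

```java
// Sketch: package-private access requires the same runtime package,
// i.e. identical package name AND identical defining class loader.
// Comparing getClassLoader() of the two classes in the stack trace
// (ShuffleInMemorySorter vs SplashUnsafeSorter) is a quick way to check.
public class RuntimePackageDemo {
    public static void main(String[] args) {
        // java.lang classes come from the bootstrap loader (reported as null);
        // application classes come from the system/application loader.
        ClassLoader bootstrap = String.class.getClassLoader();
        ClassLoader app = RuntimePackageDemo.class.getClassLoader();
        System.out.println("bootstrap loader: " + bootstrap);
        // Different loaders => different runtime packages, even for the
        // same package name => package-private access would fail.
        System.out.println("same runtime package possible: " + (bootstrap == app));
    }
}
```

If the loaders differ, putting the splash jar on the same classpath entry as the Spark jars (so one loader defines both packages) is the usual remedy.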