Spark Error - Unsupported class file major version
Edit: Spark 3.0 supports Java 11, so you'll need to upgrade:
Spark runs on Java 8/11, Scala 2.12, Python 2.7+/3.4+ and R 3.1+. Java 8 prior to version 8u92 support is deprecated as of Spark 3.0.0
Original answer
Until Spark supports Java 11 or higher (which will hopefully be mentioned in the latest documentation when it does), you have to add a flag to set your Java version to Java 8.
As of Spark 2.4.x
Spark runs on Java 8, Python 2.7+/3.4+ and R 3.1+. For the Scala API, Spark 2.4.4 uses Scala 2.12. You will need to use a compatible Scala version (2.12.x)
On Mac/Unix, see asdf-java for installing different Java versions.
On a Mac, I am able to do this in my .bashrc:
export JAVA_HOME=$(/usr/libexec/java_home -v 1.8)
On Windows, check out Chocolatey, but seriously, just use WSL2 or Docker to run Spark.
You can also set this in spark-env.sh rather than setting the variable for your whole profile.
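As a minimal sketch, a conf/spark-env.sh entry could look like the following (the JDK paths below are assumptions; point them at your own Java 8 installation):

```shell
# conf/spark-env.sh -- sourced by Spark's launch scripts.
# Example path for a typical Debian/Ubuntu OpenJDK 8 install; adjust to your system.
export JAVA_HOME=/usr/lib/jvm/java-8-openjdk-amd64

# On macOS you could instead use:
# export JAVA_HOME=$(/usr/libexec/java_home -v 1.8)
```

This keeps Java 8 scoped to Spark only, so the rest of your shell sessions keep using your default Java.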
And, of course, this all means you'll need to install Java 8 in addition to your existing Java 11.
IllegalArgumentException: 'Unsupported class file major version 55'
That is because the runtime version of your Java is 11.
Spark runs on Java 8, Python 2.7+/3.4+ and R 3.1+. For the Scala API, Spark 2.4.4 uses Scala 2.12. You will need to use a compatible Scala version (2.12.x)
Try installing Java 8 and point your JAVA_HOME to the newly installed Java.
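Assuming a Java 8 JDK is already installed, pointing JAVA_HOME at it for the current shell might look like this (the install path is an assumption for a typical Linux layout; substitute your own):

```shell
# Hypothetical Java 8 location on Debian/Ubuntu; adjust for your system.
export JAVA_HOME=/usr/lib/jvm/java-8-openjdk-amd64
export PATH="$JAVA_HOME/bin:$PATH"

# 'java -version' should now report 1.8.x before you relaunch Spark.
```

Add the export to your shell profile (or spark-env.sh) to make it stick across sessions.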
pyspark 'Unsupported class file major version 55' in PyCharm with Anaconda plugin
The 55 error is caused by an older Java version trying to execute code compiled for Java 11 (in a .jar or .class file). So it seems an older Java version (for example Java 8, i.e. a JVM 8) is being used, but it encounters a part compiled for Java 11 (class file major version 55 corresponds to Java 11).
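The version numbering is mechanical: a class file's major version is the Java release plus 44 (52 = Java 8, 55 = Java 11), and the major version sits in bytes 7–8 of the file. A small sketch to illustrate, using a fabricated 8-byte class-file header rather than a real compiled class:

```shell
# Class file major version N corresponds to Java release N - 44:
# 52 -> Java 8, 53 -> 9, 54 -> 10, 55 -> Java 11.
class_major=55
java_release=$((class_major - 44))
echo "class file major version ${class_major} => Java ${java_release}"

# Bytes 7-8 (big-endian) of a .class file hold the major version.
# Write a minimal fake header (0xCAFEBABE, minor 0, major 55) just to demo the read;
# octal escapes keep printf portable across shells.
printf '\312\376\272\276\000\000\000\067' > Demo.class
major=$(od -An -j6 -N2 -tu1 Demo.class | awk '{print $1 * 256 + $2}')
echo "major version: ${major}"
```

Running the same `od` pipeline on one of your real .class files tells you which Java release it was compiled for.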
Since Java 8 is less well supported, you could try Java 11 or newer (see the Java version history). For example, you could try PySpark 3.0.1 with Python 3.8 and Java 11; that way, you have recent parts that should be able to work together.
These links might also be helpful:
- Specify Java version in a Conda environment
- Spark Error - Unsupported class file major version
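As a sketch of the Conda route mentioned above (the channel, package names, and versions here are assumptions; check what is available in your configured channels):

```shell
# Create an isolated environment with Python 3.8, a Java 11 JDK, and PySpark 3.0.1
# so the Java on PATH always matches what PySpark expects.
conda create -n spark3 -c conda-forge python=3.8 openjdk=11 pyspark=3.0.1
conda activate spark3

# The environment's JDK is put on PATH on activation; verify with:
java -version
```

Pinning the JDK inside the environment avoids the global JAVA_HOME juggling entirely.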
How to fix 'Unsupported class file major version 55' while executing 'org.apache.spark.sql.DataSet.collectAsList()'
The root cause of the issue was a symbolic link pointing at the wrong JDK, and that's why it wasn't working. JAVA_HOME was pointing at a JDK 11, and Eclipse was running with that.
Spark: IllegalArgumentException: 'Unsupported class file major version 55'
I suspect I somehow had multiple Java installations that caused the error. I solved it by running (a bit unorthodox) $ sudo rm -rf /Library/Java/*
and then reinstalling from https://www.oracle.com/technetwork/java/javase/downloads/jdk8-downloads-2133151.html (at the time: 'Mac OS X x64 245.92 MB jdk-8u201-macosx-x64.dmg').
Pyspark throws IllegalArgumentException: 'Unsupported class file major version 55' when trying to use udf
I found a solution: I had to switch the boot JDK of PyCharm (press Shift twice → type "jdk" → select JDK 1.8).