
IntelliJ: Exception in thread "main" java.lang.NoClassDefFoundError: org/apache/spark/sql/types/DataType

  •  1
  • Sparker0i  ·  5 yr ago

    There is a similar question here, but that one is for Maven, while my project uses SBT.

    First, some information:

    • Spark version installed: 2.4.0
    • Scala version installed: 2.11.12

    I want to run this project in IntelliJ IDEA, for which my build.sbt looks like:

    name := "kafka-latest-spark-streaming"
    
    version := "0.1"
    
    scalaVersion := "2.11.12"
    
    libraryDependencies ++= Seq(
        "org.apache.spark" %% "spark-sql" % "2.4.0" % "provided",
        "org.apache.spark" %% "spark-sql-kafka-0-10" % "2.4.0" % "provided",
        "org.apache.kafka" % "kafka-clients" % "0.11.0.1"
    )
    

    The main application code is similar to the tutorial's, apart from a few imports I had to add to make things like $ work. When I try to run the Scala file by right-clicking it and selecting Run 'Main', the following error is thrown:

    /Library/Java/JavaVirtualMachines/jdk1.8.0_181.jdk/Contents/Home/bin/java "-javaagent:/Applications/IntelliJ IDEA.app/Contents/lib/idea_rt.jar=59919:/Applications/IntelliJ IDEA.app/Contents/bin" -Dfile.encoding=UTF-8 -classpath /Library/Java/JavaVirtualMachines/jdk1.8.0_181.jdk/Contents/Home/jre/lib/charsets.jar:/Library/Java/JavaVirtualMachines/jdk1.8.0_181.jdk/Contents/Home/jre/lib/deploy.jar:/Library/Java/JavaVirtualMachines/jdk1.8.0_181.jdk/Contents/Home/jre/lib/ext/cldrdata.jar:/Library/Java/JavaVirtualMachines/jdk1.8.0_181.jdk/Contents/Home/jre/lib/ext/dnsns.jar:/Library/Java/JavaVirtualMachines/jdk1.8.0_181.jdk/Contents/Home/jre/lib/ext/jaccess.jar:/Library/Java/JavaVirtualMachines/jdk1.8.0_181.jdk/Contents/Home/jre/lib/ext/jfxrt.jar:/Library/Java/JavaVirtualMachines/jdk1.8.0_181.jdk/Contents/Home/jre/lib/ext/localedata.jar:/Library/Java/JavaVirtualMachines/jdk1.8.0_181.jdk/Contents/Home/jre/lib/ext/nashorn.jar:/Library/Java/JavaVirtualMachines/jdk1.8.0_181.jdk/Contents/Home/jre/lib/ext/sunec.jar:/Library/Java/JavaVirtualMachines/jdk1.8.0_181.jdk/Contents/Home/jre/lib/ext/sunjce_provider.jar:/Library/Java/JavaVirtualMachines/jdk1.8.0_181.jdk/Contents/Home/jre/lib/ext/sunpkcs11.jar:/Library/Java/JavaVirtualMachines/jdk1.8.0_181.jdk/Contents/Home/jre/lib/ext/zipfs.jar:/Library/Java/JavaVirtualMachines/jdk1.8.0_181.jdk/Contents/Home/jre/lib/javaws.jar:/Library/Java/JavaVirtualMachines/jdk1.8.0_181.jdk/Contents/Home/jre/lib/jce.jar:/Library/Java/JavaVirtualMachines/jdk1.8.0_181.jdk/Contents/Home/jre/lib/jfr.jar:/Library/Java/JavaVirtualMachines/jdk1.8.0_181.jdk/Contents/Home/jre/lib/jfxswt.jar:/Library/Java/JavaVirtualMachines/jdk1.8.0_181.jdk/Contents/Home/jre/lib/jsse.jar:/Library/Java/JavaVirtualMachines/jdk1.8.0_181.jdk/Contents/Home/jre/lib/management-agent.jar:/Library/Java/JavaVirtualMachines/jdk1.8.0_181.jdk/Contents/Home/jre/lib/plugin.jar:/Library/Java/JavaVirtualMachines/jdk1.8.0_181.jdk/Contents/Home/jre/lib/resources.jar:/Library/Java/JavaVirtual
Machines/jdk1.8.0_181.jdk/Contents/Home/jre/lib/rt.jar:/Library/Java/JavaVirtualMachines/jdk1.8.0_181.jdk/Contents/Home/lib/ant-javafx.jar:/Library/Java/JavaVirtualMachines/jdk1.8.0_181.jdk/Contents/Home/lib/dt.jar:/Library/Java/JavaVirtualMachines/jdk1.8.0_181.jdk/Contents/Home/lib/javafx-mx.jar:/Library/Java/JavaVirtualMachines/jdk1.8.0_181.jdk/Contents/Home/lib/jconsole.jar:/Library/Java/JavaVirtualMachines/jdk1.8.0_181.jdk/Contents/Home/lib/packager.jar:/Library/Java/JavaVirtualMachines/jdk1.8.0_181.jdk/Contents/Home/lib/sa-jdi.jar:/Library/Java/JavaVirtualMachines/jdk1.8.0_181.jdk/Contents/Home/lib/tools.jar:/Users/sparker0i/kafka-latest-spark-streaming/target/scala-2.11/classes:/Users/sparker0i/.ivy2/cache/org.scala-lang/scala-library/jars/scala-library-2.11.12.jar:/Users/sparker0i/.ivy2/cache/net.jpountz.lz4/lz4/jars/lz4-1.3.0.jar:/Users/sparker0i/.ivy2/cache/org.apache.kafka/kafka-clients/jars/kafka-clients-0.11.0.1.jar:/Users/sparker0i/.ivy2/cache/org.slf4j/slf4j-api/jars/slf4j-api-1.7.25.jar:/Users/sparker0i/.ivy2/cache/org.xerial.snappy/snappy-java/bundles/snappy-java-1.1.2.6.jar Main
    Exception in thread "main" java.lang.NoClassDefFoundError: org/apache/spark/sql/types/DataType
        at Main.main(Main.scala)
    Caused by: java.lang.ClassNotFoundException: org.apache.spark.sql.types.DataType
        at java.net.URLClassLoader.findClass(URLClassLoader.java:381)
        at java.lang.ClassLoader.loadClass(ClassLoader.java:424)
        at sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:349)
        at java.lang.ClassLoader.loadClass(ClassLoader.java:357)
        ... 1 more
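    The stack trace's root cause is a plain ClassNotFoundException: the Spark jars are simply not on the run classpath (note that the long -classpath above contains scala-library and kafka-clients, but no spark-sql jar). A minimal, Spark-free sketch reproduces the same lookup failure:

    ```scala
    object ClasspathCheck {
      // Probe for the class the stack trace complains about. Without the
      // Spark jar on the classpath, Class.forName throws
      // ClassNotFoundException; for classes referenced by compiled code,
      // the JVM surfaces this as NoClassDefFoundError.
      def probe(name: String): Option[String] =
        try { Class.forName(name); None }
        catch { case e: ClassNotFoundException => Some(e.getMessage) }

      def main(args: Array[String]): Unit =
        println(probe("org.apache.spark.sql.types.DataType"))
    }
    ```

    Running this in any JVM without Spark on the classpath reports the missing class name, which is exactly what happens to Main above.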
    
    1 Answer  |  5 yr ago
  •  3
  •  Krzysztof Atłasik  ·  5 yr ago

    If you are trying to run Spark as a local application from your IDE (with the master set to local[n]), you need to remove provided from your dependency definitions.

    libraryDependencies ++= Seq(
        "org.apache.spark" %% "spark-sql" % "2.4.0", //provided removed
        "org.apache.spark" %% "spark-sql-kafka-0-10" % "2.4.0", //provided removed
        "org.apache.kafka" % "kafka-clients" % "0.11.0.1"
    )
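    For context, a hypothetical minimal Main for this setup might look like the sketch below (the object name, app name, and structure are assumptions based on the question, not the asker's actual code); it needs the Spark jars on the runtime classpath, which is exactly why provided must be dropped for local runs:

    ```scala
    import org.apache.spark.sql.SparkSession

    object Main {
      def main(args: Array[String]): Unit = {
        // local[*] runs Spark inside this JVM, so spark-sql must NOT be "provided"
        val spark = SparkSession.builder()
          .appName("kafka-latest-spark-streaming")
          .master("local[*]")
          .getOrCreate()
        import spark.implicits._ // brings the $ column interpolator into scope
        spark.stop()
      }
    }
    ```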
    

    On the other hand, when you run your application on a Spark cluster, you do want the Spark dependencies marked as provided, since the cluster supplies them at runtime. You then only need extra settings for certain tasks (such as assembly).
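    A common way to get both behaviors (keep provided for packaging, but still be able to run locally) is the well-known sbt pattern below, which puts the full classpath, including provided jars, back on the classpath of the run task. This is a build.sbt fragment, shown with sbt 0.13-style syntax to match the era of the question:

    ```scala
    // Include "provided" dependencies when running via `sbt run`,
    // while still excluding them from the packaged/assembled artifact.
    run in Compile := Defaults.runTask(
      fullClasspath in Compile,
      mainClass in (Compile, run),
      runner in (Compile, run)
    ).evaluated
    ```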

    Another thing you can do is tick the checkbox Include dependencies with "Provided" scope in the run configuration; then, when you run the project from IntelliJ, all provided dependencies will be included.

    (screenshot: the Include dependencies with "Provided" scope checkbox in the IntelliJ run configuration)