
Exception in thread "main" java.lang.NoClassDefFoundError: org/apache/spark/streaming/StreamingContext

  • a.moussa  ·  asked 6 years ago

    Hi everyone. It seems the StreamingContext class cannot be found in the following code:

    import org.apache.spark.streaming.{Seconds, StreamingContext}
    import org.apache.spark.{SparkConf, SparkContext}
    object Exemple {
      def main(args: Array[String]): Unit = {
        val conf = new SparkConf().setMaster("local[*]").setAppName("Exemple")
        val sc = new SparkContext(conf)
        val ssc = new StreamingContext(sc, Seconds(2)) //this line throws error
    
      }
    }
    

    Here is the error:

    Exception in thread "main" java.lang.NoClassDefFoundError: org/apache/spark/streaming/StreamingContext
        at Exemple$.main(Exemple.scala:16)
        at Exemple.main(Exemple.scala)
    Caused by: java.lang.ClassNotFoundException: org.apache.spark.streaming.StreamingContext
        at java.net.URLClassLoader.findClass(URLClassLoader.java:381)
        at java.lang.ClassLoader.loadClass(ClassLoader.java:424)
        at sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:338)
        at java.lang.ClassLoader.loadClass(ClassLoader.java:357)
        ... 2 more
    
    Process finished with exit code 1
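    The key detail in this trace: `NoClassDefFoundError` (caused by `ClassNotFoundException`) means the class was visible at compile time but is missing from the runtime classpath, which is exactly what a `provided`-scoped dependency produces when the application is launched directly. A minimal sketch of that compile-time vs. runtime distinction, using a hypothetical `checkClass` helper (not part of the question's code):

```scala
// Sketch: probe whether a class is actually on the *runtime* classpath.
object ClasspathCheck {
  // Returns true if the named class can be loaded at runtime.
  def checkClass(name: String): Boolean =
    try { Class.forName(name); true }
    catch { case _: ClassNotFoundException => false }

  def main(args: Array[String]): Unit = {
    // Part of the JDK, so always loadable.
    println(checkClass("java.lang.String"))
    // Loadable only if spark-streaming is on the runtime classpath;
    // with the dependency marked "provided", compilation succeeds
    // but this check fails when the app is run directly.
    println(checkClass("org.apache.spark.streaming.StreamingContext"))
  }
}
```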
    

    I am using the following build.sbt file:

    name := "exemple"
    
    version := "1.0.0"
    
    scalaVersion := "2.11.11"
    
    // https://mvnrepository.com/artifact/org.apache.spark/spark-sql
    libraryDependencies += "org.apache.spark" %% "spark-sql" % "2.2.0"
    // https://mvnrepository.com/artifact/org.apache.spark/spark-streaming
    libraryDependencies += "org.apache.spark" %% "spark-streaming" % "2.2.0" % "provided"
    // https://mvnrepository.com/artifact/org.apache.spark/spark-streaming-kafka-0-10
    libraryDependencies += "org.apache.spark" %% "spark-streaming-kafka-0-10" % "2.2.0"
    

    I run the Exemple class with the IntelliJ run button and get this error; in the sbt shell it works fine. The Spark dependencies show up in my module's dependencies, the code compiles in IntelliJ, and I can see the Spark dependencies under External Libraries (in the left-hand Project panel). Do you have any ideas? It doesn't seem like it should be complicated.


    1 Answer  |  6 years ago

  •   mohammad RaoofNia, Ryu_hayabusa  ·  5 years ago

    Please remove the `provided` qualifier from the spark-streaming dependency:

    libraryDependencies += "org.apache.spark" %% "spark-streaming" % "2.2.0" 
    

    If further dependency conflicts remain after this change, exclude the duplicate JARs:

     "org.apache.spark" %% "spark-streaming-kafka-0-10" % "2.2.0" excludeAll(
          ExclusionRule(organization = "org.spark-project.spark", name = "unused"),
          ExclusionRule(organization = "org.apache.spark", name = "spark-streaming"),
          ExclusionRule(organization = "org.apache.hadoop")
        ),
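
    Putting both fixes together, the complete build.sbt would then read as follows (a sketch assembled from the question's file, with the `provided` scope dropped and the exclusions applied; only verified against the versions shown in the question):

```scala
name := "exemple"

version := "1.0.0"

scalaVersion := "2.11.11"

// https://mvnrepository.com/artifact/org.apache.spark/spark-sql
libraryDependencies += "org.apache.spark" %% "spark-sql" % "2.2.0"
// No "provided" scope, so spark-streaming is on the runtime classpath
// when launched from the IDE's run button.
libraryDependencies += "org.apache.spark" %% "spark-streaming" % "2.2.0"
// Exclude transitive duplicates pulled in by the Kafka connector.
libraryDependencies += "org.apache.spark" %% "spark-streaming-kafka-0-10" % "2.2.0" excludeAll(
  ExclusionRule(organization = "org.spark-project.spark", name = "unused"),
  ExclusionRule(organization = "org.apache.spark", name = "spark-streaming"),
  ExclusionRule(organization = "org.apache.hadoop")
)
```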
    

    Hope this helps.

    Thanks, Ravi