
Unable to connect to Apache Spark running in Docker

  Volatil3 · asked 4 years ago

    I am trying to connect from the host system to a Spark cluster running inside Docker. I tried both a Python script and spark-shell; both approaches give the same result:
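    For reference, the Python-side attempt amounts to something like the following minimal sketch (the master URL matches the port published by the master container, shown in the docker ps output further down; the app name is a hypothetical placeholder):

    # Minimal sketch of the Python-side connection attempt.
    from pyspark.sql import SparkSession

    spark = (
        SparkSession.builder
        .master("spark://localhost:7077")  # port published by the master container
        .appName("docker-spark-test")      # hypothetical app name
        .getOrCreate()
    )
    print(spark.version)
    spark.stop()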

    Inside Docker

    spark-master_1  | 20/07/24 10:13:26 ERROR TransportRequestHandler: Error while invoking RpcHandler#receive() for one-way message.
    spark-master_1  | java.io.InvalidClassException: org.apache.spark.deploy.ApplicationDescription; local class incompatible: stream classdesc serialVersionUID = 1574364215946805297, local class serialVersionUID = 6543101073799644159
    spark-master_1  |   at java.io.ObjectStreamClass.initNonProxy(ObjectStreamClass.java:699)
    spark-master_1  |   at java.io.ObjectInputStream.readNonProxyDesc(ObjectInputStream.java:1885)
    spark-master_1  |   at java.io.ObjectInputStream.readClassDesc(ObjectInputStream.java:1751)
    spark-master_1  |   at java.io.ObjectInputStream.readOrdinaryObject(ObjectInputStream.java:2042)
    spark-master_1  |   at java.io.ObjectInputStream.readObject0(ObjectInputStream.java:1573)
    spark-master_1  |   at java.io.ObjectInputStream.defaultReadFields(ObjectInputStream.java:2287)
    spark-master_1  |   at java.io.ObjectInputStream.readSerialData(ObjectInputStream.java:2211)
    spark-master_1  |   at java.io.ObjectInputStream.readOrdinaryObject(ObjectInputStream.java:2069)
    spark-master_1  |   at java.io.ObjectInputStream.readObject0(ObjectInputStream.java:1573)
    spark-master_1  |   at java.io.ObjectInputStream.readObject(ObjectInputStream.java:431)
    spark-master_1  |   at org.apache.spark.serializer.JavaDeserializationStream.readObject(JavaSerializer.scala:75)
    spark-master_1  |   at org.apache.spark.serializer.JavaSerializerInstance.deserialize(JavaSerializer.scala:108)
    spark-master_1  |   at org.apache.spark.rpc.netty.NettyRpcEnv$$anonfun$deserialize$1$$anonfun$apply$1.apply(Nett
    

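    The java.io.InvalidClassException with two different serialVersionUID values means the master deserialized an ApplicationDescription that was serialized by a different Spark release: the client on the host and the Spark inside the image are not the same version. A quick way to see the client's side of that comparison (a sketch; the cluster's version appears in the master's web UI, published on port 9090 per the docker ps output below):

    # Print the locally installed PySpark version and compare it against the
    # version banner in the master's web UI (http://localhost:9090 per the
    # port mapping below). Any difference explains the InvalidClassException.
    import pyspark

    print(pyspark.__version__)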
    Running spark-shell from the host system's command line gives the following error:

    ➜  docker-spark-cluster git:(master) ✗ spark-shell --master spark://localhost:7077
    20/07/24 15:13:17 WARN NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
    Using Spark's default log4j profile: org/apache/spark/log4j-defaults.properties
    Setting default log level to "WARN".
    To adjust logging level use sc.setLogLevel(newLevel). For SparkR, use setLogLevel(newLevel).
    20/07/24 15:14:25 ERROR StandaloneSchedulerBackend: Application has been killed. Reason: All masters are unresponsive! Giving up.
    20/07/24 15:14:25 WARN StandaloneSchedulerBackend: Application ID is not initialized yet.
    20/07/24 15:14:25 WARN StandaloneAppClient$ClientEndpoint: Drop UnregisterApplication(null) because has not yet connected to master
    20/07/24 15:14:26 ERROR SparkContext: Error initializing SparkContext.
    java.lang.IllegalArgumentException: requirement failed: Can only call getServletHandlers on a running MetricsSystem
        at scala.Predef$.require(Predef.scala:281)
        at org.apache.spark.metrics.MetricsSystem.getServletHandlers(MetricsSystem.scala:92)
        at org.apache.spark.SparkContext.<init>(SparkContext.scala:565)
        at org.apache.spark.SparkContext$.getOrCreate(SparkContext.scala:2555)
        at org.apache.spark.sql.SparkSession$Builder.$anonfun$getOrCreate$1(SparkSession.scala:930)
        at scala.Option.getOrElse(Option.scala:189)
        at org.apache.spark.sql.SparkSession$Builder.getOrCreate(SparkSession.scala:921)
        at org.apache.spark.repl.Main$.createSparkSession(Main.scala:106)
        at $line3.$read$$iw$$iw.<init>(<console>:15)
        at $line3.$read$$iw.<init>(<console>:42)
        at $line3.$read.<init>(<console>:44)
        at $line3.$read$.<init>(<console>:48)
        at $line3.$read$.<clinit>(<console>)
        at $line3.$eval$.$print$lzycompute(<console>:7)
        at $line3.$eval$.$print(<console>:6)
        at $line3.$eval.$print(<console>)
        at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
        at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
        at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
        at java.lang.reflect.Method.invoke(Method.java:498)
        at scala.tools.nsc.interpreter.IMain$ReadEvalPrint.call(IMain.scala:745)
        at scala.tools.nsc.interpreter.IMain$Request.loadAndRun(IMain.scala:1021)
        at scala.tools.nsc.interpreter.IMain.$anonfun$interpret$1(IMain.scala:574)
        at scala.reflect.internal.util.ScalaClassLoader.asContext(ScalaClassLoader.scala:41)
        at scala.reflect.internal.util.ScalaClassLoader.asContext$(ScalaClassLoader.scala:37)
    

    Docker containers

    git:(master) ✗ docker ps
    CONTAINER ID        IMAGE                           COMMAND                  CREATED             STATUS              PORTS                                                      NAMES
    dfe3d47790ee        spydernaz/spark-worker:latest   "/bin/bash /start-wo…"   42 hours ago        Up 23 minutes       0.0.0.0:32769->8081/tcp                                    docker-spark-cluster_spark-worker_2
    c5e36b94efdd        spydernaz/spark-worker:latest   "/bin/bash /start-wo…"   42 hours ago        Up 23 minutes       0.0.0.0:32768->8081/tcp                                    docker-spark-cluster_spark-worker_3
    60f3d29e9059        spydernaz/spark-worker:latest   "/bin/bash /start-wo…"   42 hours ago        Up 23 minutes       0.0.0.0:32770->8081/tcp                                    docker-spark-cluster_spark-worker_1
    d11c67d462fb        spydernaz/spark-master:latest   "/bin/bash /start-ma…"   42 hours ago        Up 23 minutes       6066/tcp, 0.0.0.0:7077->7077/tcp, 0.0.0.0:9090->8080/tcp   docker-spark-cluster_spark-master_1
    ➜  docker-spark-cluster git:(master) ✗ 
    

    Spark Shell command

    spark-shell --master spark://localhost:7077
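    Before blaming version skew, a quick reachability check on the published master port can rule out plain networking problems (a sketch; in this case the port was reachable, since the master logged the incoming request, so networking was not the issue):

    # Confirm the master's published RPC port is reachable from the host;
    # rules out firewall or port-mapping problems before looking at versions.
    import socket

    with socket.create_connection(("localhost", 7077), timeout=5):
        print("master port reachable")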

    1 Answer

    Shivam Agrawal · answered 4 years ago

    As @koiralo already mentioned in the comments, this happens because the pySpark versions running locally and on the server are different.

    I had the same error, and it was fixed once the versions in both places matched.
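    For anyone hitting the same thing: after pinning the host-side library to the cluster's release (hypothetically, pip install pyspark==<cluster version>, where the cluster version is whatever the master's web UI reports), a connection like the one in the question should go through. A minimal sanity check:

    # Sanity check after aligning versions on the host and in the image.
    from pyspark.sql import SparkSession

    spark = (
        SparkSession.builder
        .master("spark://localhost:7077")
        .appName("version-check")  # hypothetical app name
        .getOrCreate()
    )
    print(spark.version)  # driver-side Spark version
    # Trivial job; note the executors must also be able to reach the driver
    # on the host for this to complete in a Docker setup.
    print(spark.range(100).count())
    spark.stop()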