
Maven dependency conflict

  • BAE  · Tech Community · 6 years ago

    I'm getting the following error:

    [ERROR] /Users/me/mycom/poc/spark-streaming/src/main/scala/com/mycom/poc/App.scala:15: error: object mycom is not a member of package org.apache.spark.io
    [ERROR] import io.mycom.myproj.schema.event.Event
    

    There is a package called io.mycom.myproj. It conflicts with org.apache.spark.io. How can I fix this? Thanks.

    My pom.xml:

        <dependency>
          <groupId>io.myproj.myproj</groupId>
          <artifactId>myproj-client</artifactId>
          <version>0.0.8-SNAPSHOT</version>
        </dependency>
        <dependency>
          <groupId>io.mycom.myproj</groupId>
          <artifactId>myproj-kafka-security</artifactId>
          <version>0.0.4-SNAPSHOT</version>
        </dependency>
        <dependency>
          <groupId>junit</groupId>
          <artifactId>junit</artifactId>
          <version>3.8.1</version>
          <scope>test</scope>
        </dependency>
        <dependency>
          <groupId>org.scala-lang</groupId>
          <artifactId>scala-library</artifactId>
          <version>${scala.version}</version>
        </dependency>
        <dependency>
          <groupId>org.apache.spark</groupId>
          <artifactId>spark-core_${scala.compactVersion}</artifactId>
          <version>${spark.version}</version>
        </dependency>
        <dependency>
          <groupId>org.apache.spark</groupId>
          <artifactId>spark-sql_${scala.compactVersion}</artifactId>
          <version>${spark.version}</version>
        </dependency>
        <dependency>
          <groupId>org.apache.spark</groupId>
          <artifactId>spark-streaming_${scala.compactVersion}</artifactId>
          <version>${spark.version}</version>
        </dependency>
        <dependency>
          <groupId>org.apache.kafka</groupId>
          <artifactId>kafka-clients</artifactId>
          <version>0.10.0.0-SASL</version>
        </dependency>
    

    import org.apache.spark._
    import io.mycom.myproj.schema.event.Event
    

    UPDATE

    I changed the imports to:

    import org.apache.spark.SparkConf // <= fixed the problem
    import io.mycom.myproj.schema.event.Event
    
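    For reference, this is not a Maven conflict at all but a Scala import-resolution issue: Scala imports are relative by default, so after `import org.apache.spark._` the identifier `io` resolves to `org.apache.spark.io`, shadowing the top-level `io` package. Replacing the wildcard with the specific class (as above) avoids the shadow; alternatively, the `_root_` prefix forces an absolute import while keeping the wildcard. A minimal sketch, assuming the same packages as in the question:

    ```scala
    // Keeping the wildcard import of Spark...
    import org.apache.spark._
    // ...the `_root_` prefix anchors resolution at the root package,
    // so `io` here means the top-level `io`, not org.apache.spark.io:
    import _root_.io.mycom.myproj.schema.event.Event
    ```

    Either approach works; `_root_` is the more general fix if you need the wildcard import elsewhere in the file.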
    0 replies  |  as of 6 years ago