java.lang.NoSuchMethodError: org.yaml.snakeyaml.Yaml.&lt;init&gt;

  •  0
  •  Александр Шаповалов  ·  Tech Community  ·  6 years ago

    Hello everyone. I have developed an application based on SparkLauncher that runs an executable jar containing 5 operations; each operation depends on specific variables. I have a main Hadoop cluster with Spark 2.3.0 and Hadoop 2.6.5, and everything works fine there. Part of my working code:

     private void runSparkJob(String pathToJar, final LocalDate startDate, final LocalDate endDate) {
            if (executionInProgress.get()) {
                LOGGER.warn("Execution already in progress");
                return;
            }
            Process sparkProcess = null;
            try {
                LOGGER.info("Create SparkLauncher. SparkHome: [{}]. JarPath: [{}].", sparkHome, vmJarPath);
                executionInProgress.set(true);
                sparkProcess = new SparkLauncher()
                        .setAppName(activeOperationProfile)
                        .setSparkHome(sparkHome) //sparkHome folder on main cluster
                        .setAppResource(pathToJar) // jar with 5 operation
                        .setConf(SparkLauncher.DRIVER_EXTRA_JAVA_OPTIONS,
                                String.format("-Drunner.operation-profile=%1$s -Doperation.startDate=%2$s -Doperation.endDate=%3$s", activeOperationProfile, startDate,endDate))
                        .setConf(SparkLauncher.DRIVER_MEMORY, "12G")
                        .redirectToLog(LOGGER.getName())
                        .setMaster("yarn")
                        .launch();
    
                sparkProcess.waitFor();
                int exitCode = sparkProcess.exitValue();
                if (exitCode != 0) {
                    throw new RuntimeException("Illegal exit code. Expected: [0]. Actual: [" + exitCode + "]");
                }
    
            } catch (IOException | InterruptedException e) {
                LOGGER.error("Error occurred while running SparkApplication.", e);
                throw new RuntimeException(e);
            } finally {
                if (sparkProcess != null && sparkProcess.isAlive()) {
                    LOGGER.warn("Process still alive. Try to kill");
                    sparkProcess.destroy();
                }
                executionInProgress.set(false);
            }
        }
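
    The operation to execute and its date range are passed to the jar through the -D options in DRIVER_EXTRA_JAVA_OPTIONS above. A minimal sketch of how the launched application could read them (the property names match the launcher call; how the real OperationRunner actually binds them, e.g. through Spring configuration, is not shown in the post, so the class below is only an assumed illustration):

        import java.time.LocalDate;

        public class OperationRunnerSketch {
            public static void main(String[] args) {
                // -Drunner.operation-profile selects which of the 5 operations to run
                String profile = System.getProperty("runner.operation-profile");
                // -Doperation.startDate / -Doperation.endDate bound the processing window (ISO dates)
                LocalDate start = LocalDate.parse(System.getProperty("operation.startDate"));
                LocalDate end = LocalDate.parse(System.getProperty("operation.endDate"));

                System.out.printf("Running operation [%s] for %s..%s%n", profile, start, end);
                // ...dispatch to the operation implementation selected by the profile
            }
        }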
    

    Then I changed the master to .setMaster("local") and added a new profile with the paths to the local sparkHome and the jar with the operations.
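
    A rough sketch of that local-mode variant (assumed from the description above; localSparkHome is a placeholder, not a value from the post, and the surrounding fields mirror the method shown earlier):

        // Hypothetical local-mode launch, mirroring the cluster version above.
        private void runSparkJobLocally(String pathToJar, LocalDate startDate, LocalDate endDate) throws IOException {
            Process sparkProcess = new SparkLauncher()
                    .setAppName(activeOperationProfile)
                    .setSparkHome(localSparkHome)      // Spark distribution available locally (placeholder)
                    .setAppResource(pathToJar)         // same executable jar with the 5 operations
                    .setConf(SparkLauncher.DRIVER_EXTRA_JAVA_OPTIONS,
                            String.format("-Drunner.operation-profile=%1$s -Doperation.startDate=%2$s -Doperation.endDate=%3$s",
                                    activeOperationProfile, startDate, endDate))
                    .setConf(SparkLauncher.DRIVER_MEMORY, "12G")
                    .redirectToLog(LOGGER.getName())
                    .setMaster("local")                // the only functional change: local master instead of yarn
                    .launch();
            // waitFor / exit-code handling as in the original method
        }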

    After launching locally, the run failed with the following output (condensed; in the actual log every line is wrapped by the SparkLauncher OutputRedirector):

        2018-08-06 14:47:53,150 INFO  [n.m.m.b.r...BaseOperationsRunner.runSparkJob] 105: Create SparkLauncher. SparkHome: [...]
        2018-08-06 14:47:54 WARN  NativeCodeLoader:62 - Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
        2018-08-06 14:47:57 ERROR ...:842 - Application run failed
        java.lang.NoSuchMethodError: org.yaml.snakeyaml.Yaml.<init>(Lorg/yaml/snakeyaml/constructor/BaseConstructor;Lorg/yaml/snakeyaml/representer/Representer;Lorg/yaml/snakeyaml/DumperOptions;Lorg/yaml/snakeyaml/LoaderOptions;Lorg/yaml/snakeyaml/resolver/Resolver;)V
            at org.springframework.beans.factory.config.YamlProcessor.process(YamlProcessor.java:139)
            at org.springframework.boot.env.YamlPropertySourceLoader.load(YamlPropertySourceLoader.java:50)
            at org.springframework.boot.context.config.ConfigFileApplicationListener$Loader.loadDocuments(ConfigFileApplicationListener.java:547)
            at org.springframework.boot.context.config.ConfigFileApplicationListener$Loader.load(ConfigFileApplicationListener.java:517)
            at org.springframework.boot.context.config.ConfigFileApplicationListener$Loader.loadForFileExtension(ConfigFileApplicationListener.java:496)
            at org.springframework.boot.context.config.ConfigFileApplicationListener$Loader.load(ConfigFileApplicationListener.java:464)
            at java.lang.Iterable.forEach(Iterable.java:75)
            at org.springframework.boot.context.config.ConfigFileApplicationListener$Loader.lambda$load$7(ConfigFileApplicationListener.java:445)
            at java.lang.Iterable.forEach(Iterable.java:75)
            at org.springframework.boot.context.config.ConfigFileApplicationListener.addPropertySources(ConfigFileApplicationListener.java:212)
            at org.springframework.boot.context.config.ConfigFileApplicationListener.postProcessEnvironment(ConfigFileApplicationListener.java:195)
            at org.springframework.boot.context.config.ConfigFileApplicationListener.onApplicationEvent(ConfigFileApplicationListener.java:168)
            at org.springframework.context.event.SimpleApplicationEventMulticaster.doInvokeListener(SimpleApplicationEventMulticaster.java:172)
            at org.springframework.context.event.SimpleApplicationEventMulticaster.invokeListener(SimpleApplicationEventMulticaster.java:165)
            at org.springframework.context.event.SimpleApplicationEventMulticaster.multicastEvent(SimpleApplicationEventMulticaster.java:139)
            at org.springframework.context.event.SimpleApplicationEventMulticaster.multicastEvent(SimpleApplicationEventMulticaster.java:127)
            at org.springframework.boot.SpringApplicationRunListeners.environmentPrepared(SpringApplicationRunListeners.java:54)
            at org.springframework.boot.SpringApplication.prepareEnvironment(SpringApplication.java:358)
            at org.springframework.boot.SpringApplication.run(SpringApplication.java:317)
            at org.springframework.boot.SpringApplication.run(SpringApplication.java:1255)
            at net.mediascope.multirating.bigdata.operations.OperationRunner.main(OperationRunner.java:21)
            at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
            at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
            at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
            at java.lang.reflect.Method.invoke(Method.java:498)
            at org.springframework.boot.loader.Launcher.launch(Launcher.java:50)
            at org.springframework.boot.loader.JarLauncher.main(JarLauncher.java:51)
            at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
            at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
            at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
            at java.lang.reflect.Method.invoke(Method.java:498)
            at org.apache.spark.deploy.SparkSubmit$.org$apache$spark$deploy$SparkSubmit$$runMain(SparkSubmit.scala:879)
            at org.apache.spark.deploy.SparkSubmit$.doRunMain$1(SparkSubmit.scala:197)
            at org.apache.spark.deploy.SparkSubmit$.submit(SparkSubmit.scala:227)
            at org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:136)
            at org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala)
        2018-08-06 14:47:57 INFO  ShutdownHookManager:54 - Deleting directory ...
        [o.s.b.a.l.ConditionEvaluationReportLoggingListener.logAutoConfigurationReport] 101:
        Error starting ApplicationContext. To display the conditions report re-run your application with 'debug' enabled.

    In my project I have SnakeYAML 1.19, which comes with Spring 5.0, and no other dependencies on it. I don't understand what the problem is; maybe something else besides Spark has to be installed when I put this into the Docker container manually.
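
    The Yaml.&lt;init&gt;(BaseConstructor, Representer, DumperOptions, LoaderOptions, Resolver) constructor in the stack trace only exists in newer SnakeYAML releases, so the error suggests an older snakeyaml jar somewhere on the Spark/Hadoop classpath is shadowing the 1.19 copy packaged into the fat jar. A small diagnostic sketch (an assumed helper for troubleshooting, not part of the original application) that prints which jar actually supplies the Yaml class at runtime:

        import org.yaml.snakeyaml.Yaml;

        public class SnakeYamlLocator {
            public static void main(String[] args) {
                // Prints the jar file the Yaml class was loaded from; run this with the
                // same classpath the Spark driver uses to see which copy wins.
                System.out.println(Yaml.class.getProtectionDomain()
                        .getCodeSource().getLocation());
            }
        }

    If the printed location is an older snakeyaml jar from the Spark or Hadoop installation rather than the shaded application jar, a commonly used mitigation (an assumption here, not necessarily the fix mentioned in the answer below) is to relocate the org.yaml.snakeyaml package with the maven-shade-plugin so the application always uses its own bundled copy. The pom dependencies and build profiles: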

    <dependencies>
            <dependency>
                <groupId>net.mediascope</groupId>
                <artifactId>multirating-bigdata-core</artifactId>
                <version>${project.version}</version>
            </dependency>
            <dependency>
                <groupId>org.springframework.boot</groupId>
                <artifactId>spring-boot-starter-log4j2</artifactId>
            </dependency>
            <!-- Data Base -->
            <dependency>
                <groupId>org.jdbi</groupId>
                <artifactId>jdbi</artifactId>
                <version>2.71</version>
            </dependency>
    
            <dependency>
                <groupId>com.microsoft.sqlserver</groupId>
                <artifactId>sqljdbc42</artifactId>
                <version>4.2</version>
            </dependency>
    
            <dependency>
                <groupId>org.apache.spark</groupId>
                <artifactId>spark-core_2.11</artifactId>
                <exclusions>
                    <exclusion>
                        <groupId>org.slf4j</groupId>
                        <artifactId>slf4j-log4j12</artifactId>
                    </exclusion>
                    <exclusion>
                        <groupId>org.codehaus.janino</groupId>
                        <artifactId>commons-compiler</artifactId>
                    </exclusion>
                </exclusions>
            </dependency>
            <dependency>
                <groupId>org.apache.spark</groupId>
                <artifactId>spark-sql_2.11</artifactId>
            </dependency>
            <dependency>
                <groupId>net.sourceforge.jtds</groupId>
                <artifactId>jtds</artifactId>
                <version>1.3.1</version>
            </dependency>
        </dependencies>
    
        <profiles>
            <profile>
                <id>local</id>
                <build>
                    <plugins>
                        <plugin>
                            <groupId>org.springframework.boot</groupId>
                            <artifactId>spring-boot-maven-plugin</artifactId>
                            <configuration>
                                <profiles>
                                    <profile>${profile.active}</profile>
                                </profiles>
                                <executable>true</executable>
                            </configuration>
                        </plugin>
                    </plugins>
                </build>
            </profile>
            <profile>
                <id>hadoop</id>
                <build>
                    <!-- Required to adapt the Spring Boot application for launching via Spark -->
                    <plugins>
                        <plugin>
                            <groupId>org.apache.maven.plugins</groupId>
                            <artifactId>maven-shade-plugin</artifactId>
                            <version>2.3</version>
                            <executions>
                                <execution>
                                    <phase>package</phase>
                                    <goals>
                                        <goal>shade</goal>
                                    </goals>
                                    <configuration>
                                        <transformers>
                                            <transformer
                                                    implementation="org.apache.maven.plugins.shade.resource.AppendingTransformer">
                                                <resource>META-INF/spring.handlers</resource>
                                            </transformer>
                                            <transformer
                                                    implementation="org.apache.maven.plugins.shade.resource.AppendingTransformer">
                                                <resource>META-INF/spring.schemas</resource>
                                            </transformer>
                                            <transformer
                                                    implementation="org.apache.maven.plugins.shade.resource.AppendingTransformer">
                                                <resource>META-INF/spring.provides</resource>
                                            </transformer>
                                            <transformer
                                                    implementation="org.springframework.boot.maven.PropertiesMergingResourceTransformer">
                                                <resource>META-INF/spring.factories</resource>
                                            </transformer>
                                            <transformer
                                                    implementation="org.apache.maven.plugins.shade.resource.ManifestResourceTransformer">
                                                <mainClass>${start-class}</mainClass>
                                            </transformer>
                                        </transformers>
                                    </configuration>
                                </execution>
                            </executions>
                        </plugin>
                    </plugins>
                </build>
            </profile>
        </profiles>
    
    1 Answer  |  6 years ago
  •  9
  •   Александр Шаповалов  ·  6 years ago

    I found the solution.