
Passing a properties file to an Oozie Java action

Jake Chase  ·  9 years ago

    I have an Oozie Java action workflow set up, which I plan to schedule with an Oozie coordinator. The Java action runs a CamusJob, whose jar and properties configuration file I have placed in the workflow/lib directory. Any ideas on how I should pass the -P argument to this job? Currently I am doing something like this:

    <workflow-app xmlns="uri:oozie:workflow:0.5" name="camus-wf">
        <start to="camusJob"/>
        <action name="camusJob">
            <java>
                <job-tracker>${jobTracker}</job-tracker>
                <name-node>${nameNode}</name-node>
                <configuration>
                    <property>
                        <name>mapred.job.name</name>
                        <value>camusJob</value>
                    </property>
                    <property>
                        <name>mapred.job.queue.name</name>
                        <value>${queueName}</value>
                    </property>
                </configuration>
                <main-class>com.linkedin.camus.etl.kafka.CamusJob</main-class>
                <arg>-P</arg>
                <arg>${camusJobProperties}</arg>
            </java>
            <ok to="end"/>
            <error to="fail"/>
        </action>
        <kill name="fail">
            <message>${wf:errorMessage(wf:lastErrorNode())}</message>
        </kill>
        <end name="end"/>
    </workflow-app>
    

    where camusJobProperties looks like:

    hdfs://10.0.2.15:8020/coordCamusJob/workflowAppPath/lib/config.properties
    

    But the workflow does not seem to run (it is stuck in PREP). Any idea how to fix this?

    Thanks

    EDIT: after correcting my nameNode URL, I can see that I am getting the following error:

    ACTION[0000002-150804091125207-oozie-oozi-W@camusJob] Launcher exception: java.lang.IllegalArgumentException: Wrong FS: hdfs://10.0.2.15:8020/user/root/app/workflow/lib/config.properties, expected: file:///
    org.apache.oozie.action.hadoop.JavaMainException: java.lang.IllegalArgumentException: Wrong FS: hdfs://10.0.2.15:8020/user/root/app/workflow/lib/config.properties, expected: file:///
        at org.apache.oozie.action.hadoop.JavaMain.run(JavaMain.java:58)
        at org.apache.oozie.action.hadoop.LauncherMain.run(LauncherMain.java:39)
        at org.apache.oozie.action.hadoop.JavaMain.main(JavaMain.java:36)
        at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
        at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
        at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
        at java.lang.reflect.Method.invoke(Method.java:606)
        at org.apache.oozie.action.hadoop.LauncherMapper.map(LauncherMapper.java:226)
        at org.apache.hadoop.mapred.MapRunner.run(MapRunner.java:54)
        at org.apache.hadoop.mapred.MapTask.runOldMapper(MapTask.java:450)
        at org.apache.hadoop.mapred.MapTask.run(MapTask.java:343)
        at org.apache.hadoop.mapred.YarnChild$2.run(YarnChild.java:163)
        at java.security.AccessController.doPrivileged(Native Method)
        at javax.security.auth.Subject.doAs(Subject.java:415)
        at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1628)
        at org.apache.hadoop.mapred.YarnChild.main(YarnChild.java:158)
    Caused by: java.lang.IllegalArgumentException: Wrong FS: hdfs://10.0.2.15:8020/user/root/app/workflow/lib/config.properties, expected: file:///
        at org.apache.hadoop.fs.FileSystem.checkPath(FileSystem.java:645)
        at org.apache.hadoop.fs.RawLocalFileSystem.pathToFile(RawLocalFileSystem.java:82)
        at org.apache.hadoop.fs.RawLocalFileSystem.deprecatedGetFileStatus(RawLocalFileSystem.java:603)
        at org.apache.hadoop.fs.RawLocalFileSystem.getFileLinkStatusInternal(RawLocalFileSystem.java:821)
        at org.apache.hadoop.fs.RawLocalFileSystem.getFileStatus(RawLocalFileSystem.java:598)
        at org.apache.hadoop.fs.FilterFileSystem.getFileStatus(FilterFileSystem.java:414)
        at org.apache.hadoop.fs.ChecksumFileSystem$ChecksumFSInputChecker.<init>(ChecksumFileSystem.java:140)
        at org.apache.hadoop.fs.ChecksumFileSystem.open(ChecksumFileSystem.java:341)
        at org.apache.hadoop.fs.FileSystem.open(FileSystem.java:766)
        at com.linkedin.camus.etl.kafka.CamusJob.run(CamusJob.java:679)
        at org.apache.hadoop.util.ToolRunner.run(ToolRunner.java:70)
        at org.apache.hadoop.util.ToolRunner.run(ToolRunner.java:84)
        at com.linkedin.camus.etl.kafka.CamusJob.main(CamusJob.java:646)
        at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
        at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
        at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
        at java.lang.reflect.Method.invoke(Method.java:606)
        at org.apache.oozie.action.hadoop.JavaMain.run(JavaMain.java:55)
        ... 15 more
    

    So essentially my question is: how do I pass the properties file argument when that file lives in HDFS (specifically, in the workflow/lib directory)?
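
    One common workaround (an assumption on my part, not something stated in this thread: it relies on Oozie's standard <file> localization element) is to let Oozie copy the HDFS file into the action's local working directory and hand CamusJob a plain local path:

        <java>
            <job-tracker>${jobTracker}</job-tracker>
            <name-node>${nameNode}</name-node>
            <main-class>com.linkedin.camus.etl.kafka.CamusJob</main-class>
            <arg>-P</arg>
            <arg>config.properties</arg>
            <!-- Oozie copies the HDFS file into the container's working
                 directory (symlinked as config.properties) before the main
                 class runs, so CamusJob can open it as a local file -->
            <file>${camusJobProperties}#config.properties</file>
        </java>

    This sidesteps the "Wrong FS" error because the path passed to -P no longer carries an hdfs:// scheme.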

    1 Answer

    vishnu viswanath  ·  9 years ago

    For the first part of the question: it is probably caused by incorrect URLs for the namenode and jobtracker.

    For the second part: you have to make core-site.xml available to the job; that file sets the property fs.defaultFS to hdfs://host:port/.

    Also, add the path to core-site.xml as a resource on the config object in your Java program.
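
    As a sketch of that second point: a minimal core-site.xml, assuming the host and port from the question's nameNode URL, might look like this:

        <?xml version="1.0"?>
        <!-- Must be visible to the job, e.g. on its classpath or registered
             explicitly via conf.addResource(new Path(".../core-site.xml")),
             before any FileSystem calls are made -->
        <configuration>
            <property>
                <name>fs.defaultFS</name>
                <value>hdfs://10.0.2.15:8020</value>
            </property>
        </configuration>

    With fs.defaultFS set, scheme-less paths (and FileSystem.get(conf)) resolve against HDFS instead of the local file:/// file system.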