
Hadoop Kerberos: Datanode cannot connect to Namenode; jsvc starts the Datanode bound to privileged ports (not using SASL)

  Tran Thien An · asked 7 years ago

    I had set up a working HA Hadoop cluster, but after adding Kerberos authentication the datanode can no longer connect to the namenode.

    I have verified that the Namenode servers start successfully and log no errors. I start all services as the user 'hduser'.

    $ sudo netstat -tuplen
    ...
    tcp        0      0 10.28.94.150:8019       0.0.0.0:*               LISTEN      1001       20218      1518/java         
    tcp        0      0 10.28.94.150:50070      0.0.0.0:*               LISTEN      1001       20207      1447/java         
    tcp        0      0 10.28.94.150:9000       0.0.0.0:*               LISTEN      1001       20235      1447/java         
    

    Datanode

    The datanode is started as root, using jsvc to bind the service to privileged ports (see Secure Datanode ):

    $ sudo -E sbin/hadoop-daemon.sh start datanode
    starting datanode, logging to /opt/hadoop-2.7.3/logs//hadoop-hduser-datanode-STWHDDN01.out
    

    The datanode fails to connect to the namenodes with the following errors:

    ...
    2018-01-08 09:25:40,051 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: dnUserName = hduser
    2018-01-08 09:25:40,052 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: supergroup = supergroup
    2018-01-08 09:25:40,114 INFO org.apache.hadoop.ipc.CallQueueManager: Using callQueue class java.util.concurrent.LinkedBlockingQueue
    2018-01-08 09:25:40,125 INFO org.apache.hadoop.ipc.Server: Starting Socket Reader #1 for port 50020
    2018-01-08 09:25:40,152 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Opened IPC server at /0.0.0.0:50020
    2018-01-08 09:25:40,219 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Refresh request received for nameservices: ha-cluster
    2018-01-08 09:25:41,189 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Starting BPOfferServices for nameservices: ha-cluster
    2018-01-08 09:25:41,226 INFO org.apache.hadoop.ipc.Server: IPC Server Responder: starting
    2018-01-08 09:25:41,227 INFO org.apache.hadoop.ipc.Server: IPC Server listener on 50020: starting
    2018-01-08 09:25:42,297 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: STWHDRM02/10.28.94.151:9000. Already tried 0 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=10, sleepTime=1000 MILLISECONDS)
    2018-01-08 09:25:42,300 INFO org.apache.hadoop.ipc.Client: Retrying connect to server: STWHDRM01/10.28.94.150:9000. Already tried 0 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=10, sleepTime=1000 MILLISECONDS)    
    


    The datanode's hdfs-site.xml (excerpt):

    <property>
      <name>dfs.block.access.token.enable</name>
      <value>true</value>
    </property>
    <property>
      <name>dfs.datanode.keytab.file</name>
      <value>/opt/hadoop/etc/hadoop/hdfs.keytab</value>
    </property>
    <property>
      <name>dfs.datanode.kerberos.principal</name>
      <value>hduser/_HOST@FDATA.COM</value>
    </property>
    <property>
        <name>dfs.datanode.address</name>
        <value>0.0.0.0:1004</value>
    </property>
    <property>
        <name>dfs.datanode.http.address</name>
        <value>0.0.0.0:1006</value>
    </property>
    <property>
        <name>dfs.datanode.data.dir.perm</name>
        <value>700</value>
    </property>
    


    I set HADOOP_SECURE_DN_USER=hduser and JSVC_HOME in hadoop-env.sh.
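
    For reference, the relevant hadoop-env.sh lines might look like the sketch below. The JSVC_HOME path and the PID/log directories are assumed values for illustration, not settings taken from the post; adjust them to your installation.

```shell
# hadoop-env.sh (excerpt) -- a sketch; JSVC_HOME and the PID/log
# directories below are assumed values, adjust to your installation.
export HADOOP_SECURE_DN_USER=hduser
export JSVC_HOME=/usr/bin                  # directory containing the jsvc binary
export HADOOP_SECURE_DN_PID_DIR=/var/run/hadoop
export HADOOP_SECURE_DN_LOG_DIR=/var/log/hadoop
```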


    The hdfs.keytab on the datanode:

    $ klist -ke etc/hadoop/hdfs.keytab
    Keytab name: FILE:etc/hadoop/hdfs.keytab
    KVNO Principal
    ---- --------------------------------------------------------------------------
       1 hduser/stwhddn01@FDATA.COM (aes256-cts-hmac-sha1-96)
       1 hduser/stwhddn01@FDATA.COM (aes128-cts-hmac-sha1-96)
       1 hduser/stwhddn01@FDATA.COM (des3-cbc-sha1)
       1 hduser/stwhddn01@FDATA.COM (arcfour-hmac)
       1 hduser/stwhddn01@FDATA.COM (des-hmac-sha1)
       1 hduser/stwhddn01@FDATA.COM (des-cbc-md5)
       1 HTTP/stwhddn01@FDATA.COM (aes256-cts-hmac-sha1-96)
       1 HTTP/stwhddn01@FDATA.COM (aes128-cts-hmac-sha1-96)
       1 HTTP/stwhddn01@FDATA.COM (des3-cbc-sha1)
       1 HTTP/stwhddn01@FDATA.COM (arcfour-hmac)
       1 HTTP/stwhddn01@FDATA.COM (des-hmac-sha1)
       1 HTTP/stwhddn01@FDATA.COM (des-cbc-md5)  
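
    Note that dfs.datanode.kerberos.principal is configured as hduser/_HOST, and Hadoop expands _HOST to the host's canonical name lowercased, which must then match a keytab entry (hduser/stwhddn01 here). A minimal sketch of that expansion, assuming this simplified lowercase-substitution behaviour:

```shell
# Sketch of Hadoop's _HOST substitution (assumed, simplified): _HOST in a
# configured principal is replaced by the lowercased canonical hostname.
expand_principal() {
  host_lc=$(printf '%s' "$2" | tr 'A-Z' 'a-z')
  printf '%s\n' "$1" | sed "s/_HOST/$host_lc/"
}
expand_principal 'hduser/_HOST@FDATA.COM' 'STWHDDN01'
# prints: hduser/stwhddn01@FDATA.COM -- matching the keytab entries above
```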
    

    OS: CentOS 7
    Hadoop: 2.7.3
    Kerberos: MIT 1.5.1

    When the datanode runs as the user root, it does not authenticate with Kerberos.

    Any ideas?

    1 Reply

    Tran Thien An · answered 7 years ago

    I found the problem: /etc/hosts needed to be changed so that 127.0.0.1 maps only to localhost.

    Before:

    127.0.0.1 STWHDDN01
    127.0.0.1 localhost
    ...
    

    After:

    127.0.0.1 localhost
    ...
    

    I still wonder why the old mapping worked in the environment without Kerberos authentication.
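
    One plausible explanation (an assumption, not something verified in the post): with the old mapping, the datanode's own hostname resolved to 127.0.0.1, so the address it registered and the name it presented pointed at loopback; without Kerberos nothing checks that names and addresses agree, but the Kerberos handshake does. The effect of the two hosts files can be sketched with a small self-contained resolver (the 10.28.94.152 address for STWHDDN01 is an assumed example, not from the post):

```shell
# Resolve a hostname against a given hosts file, to show how the broken
# and fixed /etc/hosts differ. Self-contained demo using sample files.
resolve() {  # usage: resolve HOSTS_FILE NAME -> first matching IP
  awk -v n="$2" '$0 !~ /^#/ { for (i = 2; i <= NF; i++) if ($i == n) { print $1; exit } }' "$1"
}
printf '127.0.0.1 STWHDDN01\n127.0.0.1 localhost\n' > /tmp/hosts.before
printf '127.0.0.1 localhost\n10.28.94.152 STWHDDN01\n' > /tmp/hosts.after   # assumed IP
resolve /tmp/hosts.before STWHDDN01   # prints 127.0.0.1 (node looks like loopback)
resolve /tmp/hosts.after  STWHDDN01   # prints 10.28.94.152 (a real, routable address)
```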