Renewing a connection to Apache Phoenix (using Kerberos) fails after exactly 10 hours

Problem description

I have a Java application that runs some SQL SELECT statements against Apache Phoenix. To do this, I use a principal with a keytab to create the connection. This is the class that supports the connection:

import java.sql.Connection;
import java.sql.DriverManager;
import java.sql.SQLException;

import org.apache.phoenix.jdbc.PhoenixDriver;
import org.joda.time.DateTime;
import org.slf4j.Logger;
import org.slf4j.LoggerFactory;

public class PhoenixDriverConnect {
private static Connection conn;
private static long connTime;   // creation time of the cached connection, in ms
private static final Logger logger = LoggerFactory.getLogger(PhoenixDriverConnect.class);

private PhoenixDriverConnect(String DB_URL) {
    GetProperties getProperties = new GetProperties();
    try {
        // load the JDBC driver class named in the properties file
        Class.forName(getProperties.get("jdbc.driver"));
    } catch (ClassNotFoundException e) {
        logger.error(e.getMessage());
    }
    try {
        DriverManager.deregisterDriver(PhoenixDriver.INSTANCE);
        conn = DriverManager.getConnection(DB_URL, getProperties.getInfo());
        connTime = new DateTime().getMillis();
    } catch (SQLException e) {
        logger.error(e.getMessage());
    }
}

public static synchronized Connection getConnection(String DB_URL) {

    // on the first call conn is still null, so create the connection once
    if (conn == null) {
        logger.info("create new connection....");
        new PhoenixDriverConnect(DB_URL);
        logger.info("create new connection done.");
    }

    return conn;
}
}

Here is the driver code that creates the connection:

public synchronized Connection connect(final String url, final Properties info) throws SQLException {

    String principal = info == null ? null : (String)info.get("DelegationDriver.principal");
    String kt = info == null ? null : (String)info.get("DelegationDriver.keytab.file");
    // paths to core-site.xml and hbase-site.xml passed in via connection properties
    String hadoopConfFile = info == null ? null : (String)info.get("core-site");
    String hbaseConfFile = info == null ? null : (String)info.get("hbase_site");

    Configuration conf = HBaseConfiguration.create();

    if (hadoopConfFile != null) {
        logger.info("Adding conf1: " + hadoopConfFile);
        conf.addResource(new Path(hadoopConfFile));
    } else {
        logger.info("Hadoop core configuration is not provided");
    }
    if (hbaseConfFile != null) {
        logger.info("Adding conf2: " + hbaseConfFile);
        conf.addResource(new Path(hbaseConfFile));
    } else {
        logger.info("HBase configuration is not provided");
    }       

    conf.set("hadoop.security.authentication", "kerberos");
    conf.set("hbase.security.authentication", "kerberos");
    conf.set("hbase.security.authorization", "true");


    logger.info("DelegationDriver - connect - principal : " + principal);
    logger.info("DelegationDriver - connect - keytab file : " + kt);
    logger.info("DelegationDriver - connect - hadoop configuration file : " + hadoopConfFile);
    logger.info("DelegationDriver - connect - hbase configuration file : " + hbaseConfFile);   

    UserGroupInformation.setConfiguration(conf);

    try {
      if (principal != null) {
          logger.info("Trying to login with the principal found in the properties (" + principal + ", keytab=" + kt + ")");
        if (kt == null) {
          throw new IllegalArgumentException("keytab is required, no property found");
        }
        if ((kt = kt.trim()).isEmpty()) {
          throw new IllegalArgumentException("keytab is required, found empty property");
        }
        this.ugi = UserGroupInformation.loginUserFromKeytabAndReturnUGI(principal, kt);

        //this.ugi.getLoginUser().reloginFromKeytab();
        logger.info("Logged by Kerberos with the principal/keytab found in the properties, ugi=" + (Object)this.ugi + ", ticket=" + (Object)this.ugi.getRealAuthenticationMethod());
      } else {
          logger.info("No principal found in the properties (DelegationDriver.principal and DelegationDriver.keytab.file), trying the current user if any");
        this.ugi = UserGroupInformation.getCurrentUser();
      }
    }
    catch (IOException e) {
      logger.warning(e.getMessage());
      throw new RuntimeException("Can't login, principal found was " + principal + ", keytab=" + kt + '\n' + e.getLocalizedMessage());
    }
    logger.info("Going to connect to Phoenix. UGI = " + (Object)this.ugi);
    Connection conn = (Connection)this.runWithSQLException(new PrivilegedSQLExceptionAction<Connection>(){

      @Override
      public Connection run() throws SQLException {
        return DelegationDriver.this.driver.connect(url, info);
      }
    });
    logger.info("Connection to phoenix done");
    return conn;
}

This works perfectly. Note: I start my application at 5 PM, but exactly 10 hours later, at 3 AM, I get this error:

org.apache.zookeeper.KeeperException$SessionExpiredException: KeeperErrorCode = Session expired
    at org.apache.hadoop.hbase.zookeeper.ZooKeeperWatcher.connectionEvent(ZooKeeperWatcher.java:606) [hbase-client-1.1.1.jar!/:1.1.1]
    at org.apache.hadoop.hbase.zookeeper.ZooKeeperWatcher.process(ZooKeeperWatcher.java:517) [hbase-client-1.1.1.jar!/:1.1.1]
    at org.apache.zookeeper.ClientCnxn$EventThread.processEvent(ClientCnxn.java:522) [zookeeper-3.4.6.jar!/:3.4.6-1569965]
    at org.apache.zookeeper.ClientCnxn$EventThread.run(ClientCnxn.java:498) [zookeeper-3.4.6.jar!/:3.4.6-1569965]

When I then try to run a SELECT, I get this error:

2016-11-29 09:48:07.491 ERROR 6352 --- [ared--pool2-t18] o.a.hadoop.hbase.ipc.AbstractRpcClient   : SASL authentication failed. The most likely cause is missing or invalid credentials. Consider 'kinit'.
javax.security.sasl.SaslException: GSS initiate failed
at com.sun.security.sasl.gsskerb.GssKrb5Client.evaluateChallenge(GssKrb5Client.java:211) ~[na:1.8.0_112]
at org.apache.hadoop.hbase.security.HBaseSaslRpcClient.saslConnect(HBaseSaslRpcClient.java:179) ~[hbase-client-1.1.1.jar!/:1.1.0]
at org.apache.hadoop.hbase.ipc.RpcClientImpl$Connection.setupSaslConnection(RpcClientImpl.java:609) [hbase-client-1.1.1.jar!/:1.1.1]
at org.apache.hadoop.hbase.ipc.RpcClientImpl$Connection.access$600(RpcClientImpl.java:154) [hbase-client-1.1.1.jar!/:1.1.1]
at org.apache.hadoop.hbase.ipc.RpcClientImpl$Connection$2.run(RpcClientImpl.java:735) ~[hbase-client-1.1.1.jar!/:1.1.1]
at org.apache.hadoop.hbase.ipc.RpcClientImpl$Connection$2.run(RpcClientImpl.java:732) ~[hbase-client-1.1.1.jar!/:1.1.1]
at java.security.AccessController.doPrivileged(Native Method) ~[na:1.8.0_112]
at javax.security.auth.Subject.doAs(Subject.java:422) ~[na:1.8.0_112]
at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1657) ~[hadoop-common-2.7.1.jar!/:na]
at org.apache.hadoop.hbase.ipc.RpcClientImpl$Connection.setupIOstreams(RpcClientImpl.java:732) [hbase-client-1.1.1.jar!/:1.1.1]
at org.apache.hadoop.hbase.ipc.RpcClientImpl$Connection.writeRequest(RpcClientImpl.java:885) [hbase-client-1.1.1.jar!/:1.1.1]
at org.apache.hadoop.hbase.ipc.RpcClientImpl$Connection.tracedWriteRequest(RpcClientImpl.java:854) [hbase-client-1.1.1.jar!/:1.1.1]
at org.apache.hadoop.hbase.ipc.RpcClientImpl.call(RpcClientImpl.java:1180) [hbase-client-1.1.1.jar!/:1.1.1]
at org.apache.hadoop.hbase.ipc.AbstractRpcClient.callBlockingMethod(AbstractRpcClient.java:213) [hbase-client-1.1.1.jar!/:1.1.1]
at org.apache.hadoop.hbase.ipc.AbstractRpcClient$BlockingRpcChannelImplementation.callBlockingMethod(AbstractRpcClient.java:287) [hbase-client-1.1.1.jar!/:1.1.1]
at org.apache.hadoop.hbase.protobuf.generated.ClientProtos$ClientService$BlockingStub.execService(ClientProtos.java:32675) [hbase-protocol-1.1.0.jar!/:1.1.0]
at org.apache.hadoop.hbase.protobuf.ProtobufUtil.execService(ProtobufUtil.java:1615) [hbase-client-1.1.1.jar!/:1.1.1]
at org.apache.hadoop.hbase.ipc.RegionCoprocessorRpcChannel$1.call(RegionCoprocessorRpcChannel.java:92) [hbase-client-1.1.1.jar!/:1.1.1]
at org.apache.hadoop.hbase.ipc.RegionCoprocessorRpcChannel$1.call(RegionCoprocessorRpcChannel.java:89) [hbase-client-1.1.1.jar!/:1.1.1]
at org.apache.hadoop.hbase.client.RpcRetryingCaller.callWithRetries(RpcRetryingCaller.java:126) [hbase-client-1.1.1.jar!/:1.1.1]
at org.apache.hadoop.hbase.ipc.RegionCoprocessorRpcChannel.callExecService(RegionCoprocessorRpcChannel.java:95) [hbase-client-1.1.1.jar!/:1.1.1]
at org.apache.hadoop.hbase.ipc.CoprocessorRpcChannel.callMethod(CoprocessorRpcChannel.java:56) [hbase-client-1.1.1.jar!/:1.1.1]
at org.apache.phoenix.coprocessor.generated.MetaDataProtos$MetaDataService$Stub.getTable(MetaDataProtos.java:10665) [phoenix-core-4.4.0-HBase-1.1.jar!/:4.4.0-HBase-1.1]
at org.apache.phoenix.query.ConnectionQueryServicesImpl$7.call(ConnectionQueryServicesImpl.java:1290) [phoenix-core-4.4.0-HBase-1.1.jar!/:4.4.0-HBase-1.1]
at org.apache.phoenix.query.ConnectionQueryServicesImpl$7.call(ConnectionQueryServicesImpl.java:1277) [phoenix-core-4.4.0-HBase-1.1.jar!/:4.4.0-HBase-1.1]
at org.apache.hadoop.hbase.client.HTable$16.call(HTable.java:1741) [hbase-client-1.1.1.jar!/:1.1.1]
at java.util.concurrent.FutureTask.run(FutureTask.java:266) [na:1.8.0_112]
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142) [na:1.8.0_112]
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617) [na:1.8.0_112]
at java.lang.Thread.run(Thread.java:745) [na:1.8.0_112]
Caused by: org.ietf.jgss.GSSException: No valid credentials provided (Mechanism level: Attempt to obtain new INITIATE credentials failed! (null))
at sun.security.jgss.krb5.Krb5InitCredential.getTgt(Krb5InitCredential.java:343) ~[na:1.8.0_112]
at sun.security.jgss.krb5.Krb5InitCredential.getInstance(Krb5InitCredential.java:145) ~[na:1.8.0_112]
at sun.security.jgss.krb5.Krb5MechFactory.getCredentialElement(Krb5MechFactory.java:122) ~[na:1.8.0_112]
at sun.security.jgss.krb5.Krb5MechFactory.getMechanismContext(Krb5MechFactory.java:187) ~[na:1.8.0_112]
at sun.security.jgss.GSSManagerImpl.getMechanismContext(GSSManagerImpl.java:224) ~[na:1.8.0_112]
at sun.security.jgss.GSSContextImpl.initSecContext(GSSContextImpl.java:212) ~[na:1.8.0_112]
at sun.security.jgss.GSSContextImpl.initSecContext(GSSContextImpl.java:179) ~[na:1.8.0_112]
at com.sun.security.sasl.gsskerb.GssKrb5Client.evaluateChallenge(GssKrb5Client.java:192) ~[na:1.8.0_112] ... 29 common frames omitted
Caused by: javax.security.auth.login.LoginException: Cannot read from System.in
at com.sun.security.auth.module.Krb5LoginModule.promptForName(Krb5LoginModule.java:865) ~[na:1.8.0_112]
at com.sun.security.auth.module.Krb5LoginModule.attemptAuthentication(Krb5LoginModule.java:704) ~[na:1.8.0_112]
at com.sun.security.auth.module.Krb5LoginModule.login(Krb5LoginModule.java:617) ~[na:1.8.0_112]
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method) ~[na:1.8.0_112]
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62) ~[na:1.8.0_112]
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) ~[na:1.8.0_112]
at java.lang.reflect.Method.invoke(Method.java:498) ~[na:1.8.0_112]
at javax.security.auth.login.LoginContext.invoke(LoginContext.java:755) ~[na:1.8.0_112]
at javax.security.auth.login.LoginContext.access$000(LoginContext.java:195) ~[na:1.8.0_112]
at javax.security.auth.login.LoginContext$4.run(LoginContext.java:682) ~[na:1.8.0_112]
at javax.security.auth.login.LoginContext$4.run(LoginContext.java:680) ~[na:1.8.0_112]
at java.security.AccessController.doPrivileged(Native Method) ~[na:1.8.0_112]
at javax.security.auth.login.LoginContext.invokePriv(LoginContext.java:680) ~[na:1.8.0_112]
at javax.security.auth.login.LoginContext.login(LoginContext.java:587) ~[na:1.8.0_112]
at sun.security.jgss.GSSUtil.login(GSSUtil.java:258) ~[na:1.8.0_112]
at sun.security.jgss.krb5.Krb5Util.getTicket(Krb5Util.java:158) ~[na:1.8.0_112]
at sun.security.jgss.krb5.Krb5InitCredential$1.run(Krb5InitCredential.java:335) ~[na:1.8.0_112]
at sun.security.jgss.krb5.Krb5InitCredential$1.run(Krb5InitCredential.java:331) ~[na:1.8.0_112]
at java.security.AccessController.doPrivileged(Native Method) ~[na:1.8.0_112]
at sun.security.jgss.krb5.Krb5InitCredential.getTgt(Krb5InitCredential.java:330) ~[na:1.8.0_112]
... 36 common frames omitted

Answer

To begin with, Java's support for Kerberos is far from perfect. Quoting Hadoop and Kerberos: The Madness Beyond the Gate: "... the public APIs are too simplistic for the [Hadoop] authentication system ... brittle across versions and JDKs".

One of these limitations is that Java cannot create renewable Kerberos tickets, nor renew an existing renewable ticket (e.g. one created by kinit). Therefore your loginUserFromKeytabAndReturnUGI() call creates a ticket that expires after 10 hours (a typical setting for the maximum ticket lifetime).
For the record, the Hadoop auth library automatically spawns a background thread that tries to renew its UGI ticket, but to no avail, because the ticket is not renewable.

Even if the ticket were renewable, it would eventually reach its end-of-renewable-life after 7 days (again, a typical setting), and you would have to re-create it at some point.
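
For reference, both limits come from the Kerberos configuration rather than from Hadoop. A minimal sketch of the relevant settings (the realm EXAMPLE.COM and the values shown are common defaults for illustration, not taken from this cluster):

# kdc.conf on the KDC -- upper bounds enforced per realm
[realms]
    EXAMPLE.COM = {
        max_life = 10h 0m 0s               # maximum ticket lifetime
        max_renewable_life = 7d 0h 0m 0s   # maximum renewable lifetime
    }

# krb5.conf on the client -- what the client requests
[libdefaults]
    ticket_lifetime = 10h
    renew_lifetime = 7d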

The standard solution is to spawn a background thread that periodically invokes checkTGTAndReloginFromKeytab() -- see that post for a very elaborate explanation by a Hortonworks guru (a colleague of the author of the GitBook about Hadoop & Kerberos quoted above).
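
As a minimal sketch of that approach (the class name, the one-hour interval, and the error handling below are illustrative assumptions, not code from the original post), the relogin thread could look like this:

import java.io.IOException;
import java.util.concurrent.Executors;
import java.util.concurrent.ScheduledExecutorService;
import java.util.concurrent.ThreadFactory;
import java.util.concurrent.TimeUnit;

import org.apache.hadoop.security.UserGroupInformation;

public class KerberosReloginThread {

    /** Periodically re-logs in from the keytab before the TGT expires. */
    public static ScheduledExecutorService start(final UserGroupInformation ugi) {
        ScheduledExecutorService scheduler =
            Executors.newSingleThreadScheduledExecutor(new ThreadFactory() {
                @Override
                public Thread newThread(Runnable r) {
                    Thread t = new Thread(r, "kerberos-relogin");
                    t.setDaemon(true); // do not keep the JVM alive for this
                    return t;
                }
            });
        scheduler.scheduleAtFixedRate(new Runnable() {
            @Override
            public void run() {
                try {
                    // no-op while the TGT is still fresh; re-logs in from
                    // the keytab when the ticket is close to expiring
                    ugi.checkTGTAndReloginFromKeytab();
                } catch (IOException e) {
                    // log and retry on the next tick rather than dying
                    System.err.println("Kerberos relogin failed: " + e.getMessage());
                }
            }
        }, 1, 1, TimeUnit.HOURS); // comfortably inside the 10h ticket lifetime
        return scheduler;
    }
}

The UGI handed to such a thread would be the one obtained from loginUserFromKeytabAndReturnUGI() in the driver code above; checkTGTAndReloginFromKeytab() is cheap when the ticket is still fresh, so an hourly check is harmless.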

See also this post and that post for more context about Kerberos and UGI.
