KERBEROS_HDFS_CONFIG_CHECK
Tests the Kerberos configuration of a Vertica cluster that uses HDFS. This function is a more specific version of KERBEROS_CONFIG_CHECK.
You can call this function with arguments that specify an HDFS configuration to test, or with no arguments. If you call it with no arguments, the function reads the HDFS configuration files; it fails if it cannot find them. See Configuring the hdfs Scheme. If it finds the configuration files, it tests all configured nameservices.
The function performs the following tests, in order:
- Are Kerberos services available?
- Does a keytab file exist and are the Kerberos and HDFS configuration parameters set in the database?
- Can Vertica read and invoke kinit with the keys to authenticate to HDFS and obtain the database Kerberos ticket?
- Can Vertica perform hdfs and webhdfs operations using both the database Kerberos ticket and user-forwardable tickets?
- Can Vertica make unauthenticated WebHCat calls?
If any test fails, the function returns a descriptive error message.
Syntax
KERBEROS_HDFS_CONFIG_CHECK( ['hdfsHost:hdfsPort', 'webhdfsHost:webhdfsPort', 'webhcatHost:webhcatPort'] )
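For example, you can call the function with no arguments, or name all three endpoints explicitly. The host names and ports below are placeholders, not values from your cluster:

```sql
=> SELECT KERBEROS_HDFS_CONFIG_CHECK();

=> SELECT KERBEROS_HDFS_CONFIG_CHECK('nn.example.com:8020',
                                     'nn.example.com:50070',
                                     'hcat.example.com:50111');
```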
Arguments
hdfsHost, hdfsPort
The hostname or IP address and port of the HDFS name node. Vertica uses this server to access data that is specified with hdfs URLs.
webhdfsHost, webhdfsPort
The hostname or IP address and port of the WebHDFS server. Vertica uses this server to access data that is specified with webhdfs URLs.
webhcatHost, webhcatPort
The hostname or IP address and port of the WebHCat server. The HCatalog Connector uses this server to access data through Hive. If the value is ' ', the function skips this part of the check.
Privileges
This function does not require privileges.
Examples
The following example shows the results when the function is called with no arguments and infers the configuration from the HDFS configuration files.
=> SELECT KERBEROS_HDFS_CONFIG_CHECK();
--------Checking basic Kerberos configuration---------
ok: kinit exists
ok: klist exists
ok: krb5 exists at [/etc/krb5.conf]
---------Checking basic Hadoop configuration---------
HadoopFSPrincipal: [mydb/server.example.com@EXAMPLE.COM]
HadoopFSAuthentication (default): [/scratch_b/qa/mydb.keytab]
HadoopFSTokenRefreshFrequency: 0
HadoopFSKinitCommand: export KRB5CCNAME=%tmp%; kinit %principal% -k -t %keytab% -c %tmp%; %cmd%; V_HADOOP_RESULT=$?; rm %tmp%; exit $V_HADOOP_RESULT;
HadoopFSConnectionTimeout: 60
ok: Can read HDFS keys
ok: Can get tickets for hdfs principal
ok: vertica can kinit as HDFS storage
**Inferring HDFS configuration**
Number of HDFS Clusters: 1
Cluster 1 is HA
---------Checking LibHdfs++ configurations---------
**Checking default LibHdfs++ Nameservice**
ok: Vertica can perform LibHdfs++ operations at [hdfs:///]
ok: Can access [hdfs:///] using delegated ticket for [dbuser]
**Checking LibHdfs++ using hdfs configurations found if any**
ok: Vertica can perform LibHdfs++ operations at [hdfs://ns-server1/]
ok: Can access [hdfs://ns-server1/] using delegated ticket for [dbuser]
---------Checking WebHdfs configurations---------
Attempt to pull an unathenticated page from the WebHdfs URL via curl
ok: JMX works unauthenticated
ok: Can make external connection to WebHdfs
ok: Vertica can perform WebHdfs operations at [http://server1.example.com:50070/webhdfs/v1/]
ok: Can access [http://server1.example.com:50070/webhdfs/v1/] using delegated ticket for [dbuser]
---------Checking WebHCat configurations---------
ok: Webhcat works unauthenticated
ok: Can connect to webhcat at [server1.example.com:14433]
(1 row)
The following example uses arguments to specify the three hosts to check. The Kerberos configuration is valid for HDFS and WebHDFS, but not for WebHCat.
=> SELECT KERBEROS_HDFS_CONFIG_CHECK('ns-server1', 'server1.example.com:50070', 'server1.example.com:14433');
--------Checking basic Kerberos configuration---------
ok: kinit exists
ok: klist exists
ok: krb5 exists at [/etc/krb5.conf]
---------Checking basic Hadoop configuration---------
HadoopFSPrincipal: [mydb/server.example.com@EXAMPLE.COM]
HadoopFSAuthentication (default): [/scratch_b/qa/mydb.keytab]
HadoopFSTokenRefreshFrequency: 0
HadoopFSKinitCommand: export KRB5CCNAME=%tmp%; kinit %principal% -k -t %keytab% -c %tmp%; %cmd%; V_HADOOP_RESULT=$?; rm %tmp%; exit $V_HADOOP_RESULT;
HadoopFSConnectionTimeout: 60
ok: Can read HDFS keys
ok: Can get tickets for hdfs principal
ok: vertica can kinit as HDFS storage
---------Checking LibHdfs++ configurations---------
**Checking default LibHdfs++ Nameservice**
ok: Vertica can perform LibHdfs++ operations at [hdfs:///]
ok: Can access [hdfs:///] using delegated ticket for [dbuser]
**Checking LibHdfs++ using hdfs configurations found if any**
ok: Vertica can perform LibHdfs++ operations at [hdfs://ns-server1/]
ok: Can access [hdfs://ns-server1/] using delegated ticket for [dbuser]
---------Checking WebHdfs configurations---------
Attempt to pull an unathenticated page from the WebHdfs URL via curl
ok: JMX works unauthenticated
ok: Can make external connection to WebHdfs
ok: Vertica can perform WebHdfs operations at [http://server1.example.com:50070/webhdfs/v1/]
ok: Can access [http://server1.example.com:50070/webhdfs/v1/] using delegated ticket for [dbuser]
---------Checking WebHCat configurations---------
FAILED: Could not make unauthenticated connection to webhcat [server1.example.com:14433/templeton/v1]
FAILED: Could not make external connection to webhcat at [server1.example.com:14433/templeton/v1/status]
(1 row)