HDFS

To access datasets in the Hadoop Distributed File System (HDFS), do the following:

  • If the Hadoop cluster uses Kerberos authentication, configure the catalogs to work with the following Hadoop cluster services:

    • The Hadoop Distributed File System (HDFS).

    • The Hive Metastore.

    Depending on the type of Metastore used, see the External Hive Metastore with Kerberos or the Embedded Hive Metastore with Kerberos section. A sketch of such a catalog configuration is shown after this list.

  • Execute the kubectl create secret command to store the Hive Metastore database password (an optional verification check is sketched after this list):

    kubectl create secret generic mpp-credentials --from-literal=METASTORE_DB_PASSWORD=hive
    
  • Run the helm install command (a post-install check is sketched after this list):

    helm install prestocluster prestocluster/
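
The catalog configuration mentioned in the first step typically looks like the following. This is a minimal, illustrative sketch assuming a Presto-style Hive connector; the actual property names, principals, keytab paths, and Metastore URI depend on your cluster and on the External or Embedded Hive Metastore with Kerberos section that applies to your deployment.

    # Illustrative sketch only: a Kerberos-enabled Hive catalog file
    # (for example etc/catalog/hive.properties). The principals, keytab
    # paths, and Metastore URI below are placeholders for your environment.
    connector.name=hive-hadoop2
    hive.metastore.uri=thrift://metastore-host:9083
    hive.metastore.authentication.type=KERBEROS
    hive.metastore.service.principal=hive/_HOST@EXAMPLE.COM
    hive.metastore.client.principal=presto@EXAMPLE.COM
    hive.metastore.client.keytab=/etc/security/keytabs/presto.keytab
    hive.hdfs.authentication.type=KERBEROS
    hive.hdfs.presto.principal=presto@EXAMPLE.COM
    hive.hdfs.presto.keytab=/etc/security/keytabs/presto.keytab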
    
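To confirm that the secret was created and contains the expected key, you can inspect it with standard kubectl commands. This is an optional check; the secret and key names match the create command above.

    # Optional check: list the secret and decode the stored password key.
    kubectl get secret mpp-credentials
    kubectl get secret mpp-credentials -o jsonpath='{.data.METASTORE_DB_PASSWORD}' | base64 --decode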
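
After the install, standard Helm and kubectl commands can be used to confirm that the release deployed and that its pods are running. The release name matches the install command above.

    # Optional check: inspect the release and the pods it created.
    helm status prestocluster
    kubectl get pods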