Connecting to Secure Clusters with Kerberos and SSL Enabled Using Spark3 Thrift Server
- Create a Kerberos directory on a local system for the SSL-enabled ODH cluster for Hive.
- Copy spark.service.keytab from the mn0 node (Spark3 Thrift Server node) of the ODH cluster to the Kerberos directory, and then rename it to oac.keytab.
- Copy /etc/krb5.conf from the mn0 node of the ODH cluster to the Kerberos directory, and then rename it to krb5conf.
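  For example, the keytab and krb5.conf might be pulled from the mn0 node with scp and renamed as they're copied. This is only a sketch: the OS user (opc) and the keytab location on the node (/etc/security/keytabs/spark.service.keytab) are assumptions, so substitute the values your cluster uses.

    # Assumed OS user and keytab path on mn0; adjust for your cluster.
    mkdir -p kerberos
    scp opc@<Public IP of mn0>:/etc/security/keytabs/spark.service.keytab kerberos/oac.keytab
    scp opc@<Public IP of mn0>:/etc/krb5.conf kerberos/krb5conf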
- In krb5conf, update the admin_server and kdc entries with the public IP of the cluster's mn0 node instead of the host name.
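  After the edit, the [realms] section of krb5conf would look roughly like the sketch below; the realm name and IP address are placeholders.

    [realms]
      <REALM_NAME> = {
        kdc = <Public IP of mn0>
        admin_server = <Public IP of mn0>
      }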
- Create a file named service_details.json inside the Kerberos directory. For example:

    {
      "Host" : "<Public IP of Spark3 Thrift Server node (mn0)>",
      "Port" : "10000",
      "ServicePrincipalName" : "spark/<FQDN of Spark3 Thrift Server node (mn0)>@<REALM_NAME>"
    }
- Create a zip of the Kerberos directory. For example:

    $ ls -1 kerberos
    krb5conf
    oac.keytab
    service_details.json
    $ zip -r spark3tskerb.zip kerberos/*
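  Optionally, before uploading the archive you can confirm that it contains the three expected files (this check isn't part of the documented steps):

    $ unzip -l spark3tskerb.zip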
- To create a connection for the Kerberos-enabled ODH cluster, open the navigation menu and click Analytics & AI. Under Analytics, click Analytics Cloud.
- To connect to an Oracle Analytics Cloud instance, select the compartment in which you created the instance. If needed, create an instance. See Creating an OAC Instance.
- Click the instance name.
- Click Analytics Home Page.
- Click Create, and then select Connection.
- Select Spark.
- Enter a name for the connection, and then enter the remaining details with the following specifics:
  - Authentication Type - Select Kerberos.
  - Client Credentials - Select spark3tskerb.zip from the local system.
  - Authentication - Select Always use these credentials.
- Click Save.
- To verify the connection, go to the OAC home page and click Connect to Your Data.
- Click the connection you created. If successful, the Apache Hive database tables are listed.
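If the tables don't appear, one way to sanity-check the Spark3 Thrift Server endpoint independently of Oracle Analytics Cloud is a Beeline session from a host that has the keytab, the krb5.conf settings, and the cluster's SSL truststore. This is only a sketch and not part of the documented procedure; it assumes the keytab holds the spark service principal, and the truststore path and password are placeholders you must supply.

    # Obtain a Kerberos ticket with the copied keytab (assumed to contain the spark principal).
    kinit -kt kerberos/oac.keytab spark/<FQDN of mn0>@<REALM_NAME>
    # Connect to the Thrift Server over SSL with Kerberos authentication and list the tables.
    beeline -u "jdbc:hive2://<Public IP of mn0>:10000/default;principal=spark/<FQDN of mn0>@<REALM_NAME>;ssl=true;sslTrustStore=<path to truststore.jks>;trustStorePassword=<truststore password>" -e "show tables;"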