
Connect Denodo to Azure Databricks with the Simba driver

Hi, I am trying to connect to Azure Databricks through the Simba driver, but I am getting an error, possibly due to authentication.

Steps: I downloaded the driver from https://databricks.com/spark/jdbc-drivers-download and put it in Denodo (`DenodoPlatform8.0\lib\extensions\jdbc-drivers-external\bigquery\SimbaSparkJDBC42-2.6.17.1021`). I restarted Denodo and tried to create a new JDBC connection using this **Database URL**:

`jdbc:spark://host:port/default;transportMode=http;ssl=1;httpPath=sql/protocolv1/o/something/something-cooks535;AuthMech=3;UID=token;PWD=<Token generated from Azure Cluster valid 90 days>`

I found the Database URL above in Azure Databricks > Cluster > JDBC/ODBC section. When I tested the connection I got this **error**:

> *Unable to establish connection: [Simba][SparkJDBCDriver](500151) Error setting/closing session: Open Session Error.*

Ques-1: Do I need to pass the username and password even though we pass the authentication token in the URL? What happens in an enterprise deployment on Prod, where we may not have a user/password?

I tried a Kerberos login using my own Azure email ID and password, and still got an **error**:

> Unable to establish connection: javax.security.auth.login.LoginException: No such host is known (NLDN03951PAU)

Ques-2: Is the driver correct and compatible with Denodo?

```
Database Adapter: Spark SQL 2.x Databricks
Driver Class Path: databricks
Driver Class: com.simba.spark.jdbc41.Driver
Transaction isolation: Database Default
```
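As a sanity check before testing inside Denodo, the token-based URL can be assembled programmatically and inspected. This is a minimal Python sketch; the host, port, `httpPath`, and token values are placeholders, not taken from the post:

```python
# Sketch: assemble a Databricks JDBC URL in the shape the Simba driver
# expects for token authentication (AuthMech=3). All connection values
# below are placeholders -- take the real ones from
# Azure Databricks > Cluster > JDBC/ODBC.

def build_databricks_jdbc_url(host, port, http_path, token):
    """Build a jdbc:spark URL using token auth (AuthMech=3).

    With AuthMech=3 the UID is the literal string "token" and
    PWD carries the personal access token itself.
    """
    params = {
        "transportMode": "http",
        "ssl": "1",
        "httpPath": http_path,
        "AuthMech": "3",
        "UID": "token",
        "PWD": token,
    }
    param_str = ";".join(f"{k}={v}" for k, v in params.items())
    return f"jdbc:spark://{host}:{port}/default;{param_str}"

url = build_databricks_jdbc_url(
    "adb-1234.5.azuredatabricks.net", 443,
    "sql/protocolv1/o/1234/5678-cluster", "dapiXXXX")
print(url)
```

Building the URL this way makes it easy to spot a missing or misspelled parameter before pasting the string into the data source dialog.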
user
10-05-2021 18:26:37 -0400

1 Answer

Hi,

To establish a successful connection with Azure Databricks from Virtual DataPort, I would use the default Simba JDBC driver (choose Database Adapter: Spark SQL 2.x Databricks) provided by Denodo. You could refer to the Knowledge Base article [How to connect to Azure Databricks from Denodo](https://community.denodo.com/kb/en/view/document/How%20to%20connect%20to%20Azure%20Databricks%20from%20Denodo) for detailed steps.

When providing authentication details, I would provide the login information either in the JDBC URL or in the Login/Password parameters of the JDBC data source, as both options work fine.

For deployment options, I would use [Solution Manager](https://community.denodo.com/docs/html/browse/8.0/en/solution_manager/administration/introduction), as it is useful for deploying VQL elements from one environment to another using an exported properties file, which captures the connection parameters.

The *Open Session Error* usually occurs when there are issues with SSL certificate validation. I would make sure to follow the steps in the Knowledge Base article [SSL connection from VDP to data sources](https://community.denodo.com/kb/view/document/SSL%20connection%20from%20VDP%20to%20data%20sources?category=Security) to resolve the error.

For the second error, related to the Kerberos login, it could have arisen from an improper Kerberos configuration, so I would check the Kerberos configuration for Azure Databricks and provide the required information, such as the Kerberos login and password, in the JDBC data source. The section [Connecting to a JDBC Source with Kerberos Authentication](https://community.denodo.com/docs/html/browse/8.0/en/vdp/administration/creating_data_sources_and_base_views/jdbc_sources/jdbc_sources#connecting-to-a-jdbc-source-with-kerberos-authentication) provides more information about this.

Hope this helps.
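The point that credentials embedded in the JDBC URL and credentials supplied through the data source's separate Login/Password fields are equivalent can be illustrated with a small sketch. The parsing helper and sample values here are purely illustrative, not Denodo or Simba APIs:

```python
# Sketch: show that a token embedded in the JDBC URL and a token supplied
# separately (as Denodo's Login/Password fields would) end up as the same
# set of connection parameters. The helper below is illustrative only.

def parse_jdbc_url(url):
    """Split a jdbc:spark URL into its base and its key=value parameters."""
    base, _, param_str = url.partition(";")
    params = dict(p.split("=", 1) for p in param_str.split(";") if p)
    return base, params

url_with_creds = ("jdbc:spark://example.azuredatabricks.net:443/default;"
                  "transportMode=http;ssl=1;AuthMech=3;UID=token;PWD=dapiXXXX")
url_without_creds = ("jdbc:spark://example.azuredatabricks.net:443/default;"
                     "transportMode=http;ssl=1;AuthMech=3")

base_a, params_a = parse_jdbc_url(url_with_creds)
base_b, params_b = parse_jdbc_url(url_without_creds)

# Simulate supplying the credentials via the separate Login/Password fields:
params_b.update({"UID": "token", "PWD": "dapiXXXX"})

assert (base_a, params_a) == (base_b, params_b)  # both routes are equivalent
```

Keeping the token out of the URL and in the password field is often preferable operationally, since the URL tends to be copied around and exported with configuration.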
Denodo Team
11-05-2021 08:25:38 -0400