Hi,
To establish a successful connection to Azure Databricks from Virtual DataPort, I would use the default Simba JDBC driver provided by Denodo (choosing the Database adapter "Spark SQL 2.x Databricks"). You could refer to the Knowledge Base article [How to connect to Azure Databricks from Denodo](https://community.denodo.com/kb/en/view/document/How%20to%20connect%20to%20Azure%20Databricks%20from%20Denodo) for detailed steps.
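For reference, a typical Databricks JDBC URL for the Simba Spark driver looks like the sketch below; the workspace hostname, HTTP path and personal access token are placeholders you would take from your own cluster's JDBC/ODBC settings:

```
# Illustrative only: replace <workspace-hostname>, <http-path> and
# <personal-access-token> with the values from your Azure Databricks cluster.
jdbc:spark://<workspace-hostname>:443/default;transportMode=http;ssl=1;httpPath=<http-path>;AuthMech=3;UID=token;PWD=<personal-access-token>
```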
When providing authentication details, I would supply the login information either in the JDBC URL or in the Login/Password parameters of the JDBC data source; both options work fine. For deployment, I would use the [Solution Manager](https://community.denodo.com/docs/html/browse/8.0/en/solution_manager/administration/introduction), as it deploys the VQL elements from one environment to another using an exported properties file that captures the connection parameters, as sketched below.
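As an illustration, the properties file that accompanies the exported VQL could carry entries like the following, so each environment supplies its own connection values. The property names here are hypothetical placeholders; the actual keys are generated when you export the VQL with properties from Virtual DataPort:

```
# Hypothetical property names for illustration; the real keys are generated
# by the "export with properties" option in Virtual DataPort.
DATASOURCE.JDBC.ds_databricks.DATABASEURI=jdbc:spark://<workspace-hostname>:443/default;transportMode=http;ssl=1;httpPath=<http-path>
DATASOURCE.JDBC.ds_databricks.USERNAME=token
DATASOURCE.JDBC.ds_databricks.USERPASSWORD=<personal-access-token>
```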
The *Open session error* usually occurs when there are issues with SSL certificate validation. I would make sure to follow the steps described in the Knowledge Base article [SSL connection from VDP to data sources](https://community.denodo.com/kb/view/document/SSL%20connection%20from%20VDP%20to%20data%20sources?category=Security) to resolve the error.
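If the certificate presented by the Databricks endpoint is not trusted by the JVM running Virtual DataPort, one common step (a sketch assuming the default truststore shipped with Denodo and its default password "changeit") is to import the certificate with keytool and then restart the server:

```
# Sketch assuming the default JVM truststore and password; adjust the
# certificate file name, alias and paths for your installation.
keytool -importcert \
  -alias azuredatabricks \
  -file databricks_endpoint.cer \
  -keystore <DENODO_HOME>/jre/lib/security/cacerts \
  -storepass changeit
```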
For the second error, related to the Kerberos login, it could arise from an improper Kerberos configuration. I would therefore check the Kerberos configuration for Azure Databricks and provide the required information, such as the Kerberos login and password, in the JDBC data source. The section [Connecting to a JDBC Source with Kerberos Authentication](https://community.denodo.com/docs/html/browse/8.0/en/vdp/administration/creating_data_sources_and_base_views/jdbc_sources/jdbc_sources#connecting-to-a-jdbc-source-with-kerberos-authentication) would help you to get more information about it.
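For Kerberos authentication the JVM also needs to locate the correct realm and KDC. A minimal krb5.conf sketch, with placeholder realm and KDC values that you would replace with the ones used in your environment, could look like this:

```
# Minimal sketch with placeholder values; replace EXAMPLE.COM and the
# kdc.example.com hostname with your own realm and KDC.
[libdefaults]
  default_realm = EXAMPLE.COM

[realms]
  EXAMPLE.COM = {
    kdc = kdc.example.com
    admin_server = kdc.example.com
  }
```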
Hope this helps