Latest Activity


Denodo dialect support on Spark 3.3.0

Hi, I'm trying to connect to Denodo and run a VQL query (not a view/table export) using denodo-vdp-jdbcdriver.jar in a Spark cluster on AWS Glue. The job ran fine on Spark 2.4.3 but fails on Spark 3.3.0. I use PySpark to run the job and I wonder how the...
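The excerpt doesn't show the read itself, but a minimal PySpark sketch of running an arbitrary VQL statement through the Denodo JDBC driver might look like the following (host, port, database, and the query are placeholder assumptions; `com.denodo.vdp.jdbc.Driver` is the driver class shipped in denodo-vdp-jdbcdriver.jar). One Spark 2.4 → 3.x difference worth checking is that Spark 3 accepts the statement through the dedicated `query` option, whereas older jobs often wrapped it as a subquery in `dbtable`:

```python
# Sketch only: connection details and the VQL statement are placeholders.
jdbc_url = "jdbc:denodo://denodo-host:9999/my_vdb"  # assumed host/port/VDB

options = {
    "url": jdbc_url,
    "driver": "com.denodo.vdp.jdbc.Driver",  # class inside denodo-vdp-jdbcdriver.jar
    # Spark 3.x: pass the statement through the "query" option...
    "query": "SELECT * FROM my_derived_view",
}
# ...while Spark 2.4-era jobs typically wrapped it in "dbtable" instead:
legacy_dbtable = "(SELECT * FROM my_derived_view) AS t"

# On AWS Glue the driver jar must be attached to the job (Dependent JARs path).
# df = spark.read.format("jdbc").options(**options).load()
```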



Getting py4j.protocol.Py4JJavaError: An error occurred while calling o65.jdbc. : java.sql.SQLException: Unsupported type TIMESTAMP_WITH_TIMEZONE

I am making a JDBC connection to a Denodo database using PySpark. The table I am connecting to contains the "TIMESTAMP_WITH_TIMEZONE" datatype in 2 columns. Since Spark provides built-in JDBC support for only a handful of databases, of which Denodo is not a pa...

JDBC Spark Spark-sql connection
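Spark's generic JDBC dialect has no mapping for `java.sql.Types.TIMESTAMP_WITH_TIMEZONE`, which is what raises the `Unsupported type` error. A common workaround (an assumption here, not something stated in the thread; the view and column names are hypothetical) is to CAST the offending columns to plain TIMESTAMP inside the pushed-down query, so Spark only ever sees types it can map:

```python
# Hypothetical view/column names; replace with the real Denodo objects.
pushdown_query = (
    "SELECT id, "
    "CAST(created_tz AS TIMESTAMP) AS created_tz, "  # strip the time zone...
    "CAST(updated_tz AS TIMESTAMP) AS updated_tz "   # ...before Spark maps the type
    "FROM my_view"
)

options = {
    "url": "jdbc:denodo://denodo-host:9999/my_vdb",  # assumed connection details
    "driver": "com.denodo.vdp.jdbc.Driver",
    "query": pushdown_query,
}
# df = spark.read.format("jdbc").options(**options).load()
```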


Configure the Bulk Data Load API for a Databricks source in Denodo 8.0

Would like to know how to configure the Bulk Data Load API for a Databricks source in Denodo 8.0. I have gone through the Bulk Data Load API docs for Databricks, Spark, and Impala, but I didn't understand how to configure it. Can you please let me know the step b...

bulkload Bulk Data Load databricks Spark


Delegate LIMIT to Spark through JDBC

Hi, we use Denodo on top of Spark through JDBC. We have derived views built on top of some big tables. We would like to limit the number of records a user can query from these views automatically, rather than relying on the user to specify a LIMIT clause. ...

Delegate JDBC Spark
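One way to impose such a cap without relying on users is to bake the limit into the derived view's own VQL with FETCH FIRST, so every query against the view inherits it. This is a hedged sketch under assumed names, not a confirmed recipe from the thread, and whether the row limit is then delegated down to Spark depends on the JDBC adapter's delegation capabilities:

```sql
-- Sketch only: view name and row count are placeholders.
-- Every query against big_table_limited inherits the cap.
CREATE OR REPLACE VIEW big_table_limited AS
SELECT *
FROM big_table
FETCH FIRST 10000 ROWS ONLY;
```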