
Getting py4j.protocol.Py4JJavaError: An error occurred while calling o65.jdbc. : java.sql.SQLException: Unsupported type TIMESTAMP_WITH_TIMEZONE

I am making a JDBC connection to a Denodo database using PySpark. The table I am connecting to contains the "TIMESTAMP_WITH_TIMEZONE" data type for two columns. Since Spark provides built-in JDBC dialects for only a handful of databases, of which Denodo is not one, it does not recognize the "TIMESTAMP_WITH_TIMEZONE" data type and therefore cannot map it to any Spark SQL data type. To overcome this I am providing my own custom schema (`c_schema` here), but this is not working either and I get the same error. Below is the code snippet.

```python
c_schema = "game start date TIMESTAMP,game end date TIMESTAMP"

df = spark.read.jdbc(
    "jdbc_url",
    "schema.table_name",
    properties={
        "user": "user_name",
        "password": "password",
        "customSchema": c_schema,
        "driver": "com.denodo.vdp.jdbc.Driver",
    },
)
```

Need help.
user
05-02-2022 12:43:10 -0500

1 Answer

Hi, I further reviewed your scenario internally and found that in Spark SQL the TIMESTAMP WITH TIME ZONE data type is supported only by the Oracle dialect, so you may want to raise this with your Spark administrator. However, as a workaround, I would suggest using the **CAST** function, which converts data from one data type to another. You can use it to convert the TIMESTAMP WITH TIME ZONE columns to **text**. For further information, refer to the [**Conversion Functions**](https://community.denodo.com/docs/html/browse/8.0/en/vdp/vql/functions/conversion_functions/conversion_functions#cast) section of the Virtual DataPort VQL Guide. Hope this helps!
Denodo Team
07-02-2022 07:08:58 -0500
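For reference, a minimal PySpark sketch of the CAST workaround described in the answer (not code from the thread): it assumes Spark 2.4 or later, which provides the JDBC `query` option, and uses `game_start_date` / `game_end_date` as placeholders for the real column names. The idea is to push the CAST down to Denodo so the two columns arrive as text and Spark's JDBC reader never has to map the TIMESTAMP_WITH_TIMEZONE type.

```python
from pyspark.sql import SparkSession

spark = SparkSession.builder.getOrCreate()

# Placeholder column names; substitute the actual columns of schema.table_name.
# The CASTs run inside Denodo, so Spark only ever sees text columns.
query = """
    SELECT CAST(game_start_date AS TEXT) AS game_start_date,
           CAST(game_end_date   AS TEXT) AS game_end_date
    FROM schema.table_name
"""

df = (
    spark.read.format("jdbc")
    .option("url", "jdbc_url")
    .option("driver", "com.denodo.vdp.jdbc.Driver")
    .option("user", "user_name")
    .option("password", "password")
    # "query" (Spark 2.4+) sends this SELECT, including the CASTs, to Denodo
    # instead of reading the raw table with dbtable.
    .option("query", query)
    .load()
)
```

If timestamp semantics are needed on the Spark side, the text columns can then be parsed back with `pyspark.sql.functions.to_timestamp`.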