
Timestamp error when using Dask

Hi there, I am trying to use the Python library dask to work on a large dataframe. Dask requires an index column to chunk or batch the data being pulled into memory, and it is best practice to use an int or timestamp type field to split the data. When using a timestamp field from Denodo, I get the following error:

```
QUERY [VIRTUAL] [ERROR] QUERY [JDBC WRAPPER] [ERROR] QUERY [JDBC ROUTE] [ERROR] Received exception with message 'Conversion failed when converting
DETAIL: java.sql.SQLException: Error executing query. Total time 0.204 seconds.
QUERY [VIRTUAL] [ERROR] QUERY [JDBC WRAPPER] [ERROR] QUERY [JDBC ROUTE] [ERROR] Received exception with message 'Conversion failed when converting
[SQL: SELECT sales_table.client, sales_table.product, sales_table.sale_amount, sales_table.date_recorded FROM sales_table WHERE sales_table.date_recorded >= %(date_recorded_1)s AND sales_table.date_recorded < %(date_recorded_2)s]
[parameters: {'date_recorded_1': 'i', 'date_recorded_2': 'j'}]
(Background on this error at:
```

It seems SQLAlchemy (which is used under the hood by dask) is having problems converting the timestamp data from Denodo. Do you know why this is happening, or whether there is a fix? I have other tables where a timestamp field is the only candidate for splitting, so getting this working would really help me analyse the data in batches. I can also pull the data from the SQL table directly and it batches the timestamp data just fine, and likewise if I use an int column from the Denodo table / interface, so it seems to be a Denodo-side issue. I am using SQLAlchemy version 1.3.20 with the Denodo dialect. Cheers!
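For context, dask splits the timestamp index into half-open ranges and issues one bounded SELECT per partition, which is where the `>= lo AND < hi` query in the error comes from. A stdlib-only sketch of that splitting (the date range and partition count here are hypothetical, not taken from the thread):

```python
from datetime import datetime


def timestamp_divisions(start, end, npartitions):
    """Split [start, end) into npartitions half-open ranges, roughly the way
    dask.dataframe.read_sql_table partitions a timestamp index column."""
    step = (end - start) / npartitions
    bounds = [start + i * step for i in range(npartitions)] + [end]
    return list(zip(bounds[:-1], bounds[1:]))


# Hypothetical range for sales_table.date_recorded; each (lo, hi) pair
# becomes one query: ... WHERE date_recorded >= lo AND date_recorded < hi
parts = timestamp_divisions(datetime(2021, 1, 1), datetime(2022, 1, 1), 4)
for lo, hi in parts:
    print(lo, "->", hi)
```

Each pair of bounds should arrive at the server as timestamp parameters; the `'i'` / `'j'` string parameters in the traceback suggest the conversion is failing before that happens.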
07-04-2022 15:10:15 -0400

1 Answer

Hi, if I faced such a scenario, I would follow the steps below:

1. Make sure the data types of the fields are compatible in both SQLAlchemy and the Denodo server, as this data conversion error happens when the data types do not match.
2. Try connecting to the Virtual DataPort server with SQLAlchemy alone, without the Dask library, as described in the User Manual - Denodo Dialect for SQLAlchemy (
3. Check the "**vdp.log**" file in the "**<DENODO_HOME>/logs**" folder for more details on the error message.
4. Raise the log level by running the command **`call logcontroller('com.denodo.vdp.requests', 'info')`** in the VQL shell of the Virtual DataPort Administration Tool. Once done, rerun the query from SQLAlchemy, identify the query that was sent to the Virtual DataPort server by examining the "**vdp-requests.log**" file in the **"<DENODO_HOME>/logs/vdp"** folder, and execute the same query in the Virtual DataPort Administration Tool.

Hope this helps!
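As a minimal sketch of step 2 above, the bounded per-partition query that dask generates can be rebuilt by hand with plain SQLAlchemy, binding real `datetime` objects rather than the string placeholders seen in the error (the table and column names come from the traceback; the connection URI in the comment is a placeholder):

```python
from datetime import datetime

import sqlalchemy as sa

# The per-partition query dask generates, rebuilt by hand. Binding real
# datetime objects (not strings) lets you check whether the Denodo
# dialect's timestamp conversion is the problem.
query = sa.text(
    "SELECT client, product, sale_amount, date_recorded "
    "FROM sales_table "
    "WHERE date_recorded >= :lo AND date_recorded < :hi"
).bindparams(
    lo=datetime(2021, 1, 1),
    hi=datetime(2021, 2, 1),
)

compiled = query.compile()
print(compiled)         # SQL text with named placeholders
print(compiled.params)  # bound values: datetime objects, not strings

# To execute against Denodo without dask (placeholder URI):
# engine = sa.create_engine("denodo://user:password@host:9996/database")
# with engine.connect() as conn:
#     rows = conn.execute(query).fetchall()
```

If this direct query succeeds but the dask call fails, the mismatch is likely in how dask hands the partition bounds to the dialect rather than in the dialect itself.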
Denodo Team
11-04-2022 09:14:30 -0400