Hi there,
I am trying to use the Python library Dask to work on a large dataframe. Dask requires an index column to chunk or batch the data being pulled into memory, and it's best practice to use an int or timestamp type field to split the data.
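For context, this is roughly the call I'm making (the connection URI, credentials and host are placeholders, not my real connection details):
```
import dask.dataframe as dd

# Placeholder URI for the Denodo SQLAlchemy dialect (host, port and
# database name below are not my real values)
uri = "denodo://user:password@denodo-host:9996/my_virtual_database"

# Partition on the timestamp column; Dask/SQLAlchemy generate the
# "date_recorded >= ... AND date_recorded < ..." predicates per partition
df = dd.read_sql_table(
    "sales_table",
    uri,
    index_col="date_recorded",
    npartitions=10,
)
```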
When using a timestamp field from Denodo, I get the following error:
```
QUERY [VIRTUAL] [ERROR]
QUERY [JDBC WRAPPER] [ERROR]
QUERY [JDBC ROUTE] [ERROR] Received exception with message 'Conversion failed when converting
DETAIL: java.sql.SQLException: Error executing query. Total time 0.204 seconds.
QUERY [VIRTUAL] [ERROR]
QUERY [JDBC WRAPPER] [ERROR]
QUERY [JDBC ROUTE] [ERROR] Received exception with message 'Conversion failed when converting
[SQL: SELECT sales_table.client, sales_table.product, sales_table.sale_amount, sales_table.date_recorded
FROM sales_table
WHERE sales_table.date_recorded >= %(date_recorded_1)s AND sales_table.date_recorded < %(date_recorded_2)s]
[parameters: {'date_recorded_1': 'i', 'date_recorded_2': 'j'}]
(Background on this error at: http://sqlalche.me/e/13/4xp6)
```
It seems SQLAlchemy (which is used under the hood by Dask) is having problems converting the timestamp data from Denodo. Do you know why this is happening, or if there is a fix for it? I have data in other tables that only have a timestamp column suitable for splitting, so if I could get this working it would really help me analyse the data in batches.
I can also pull the data from the underlying SQL table directly and it batches the timestamp data just fine, and the same is true if I use an int column from the Denodo table/interface, so it seems to be an issue on the Denodo side. I am using SQLAlchemy version 1.3.20 with the Denodo dialect.
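For reference, the equivalent call that does work partitions on an integer column from the same view (sale_id is just a stand-in name for the int column I used):
```
# Same call as above, but partitioned on an int column instead of the timestamp
df = dd.read_sql_table(
    "sales_table",
    uri,
    index_col="sale_id",  # placeholder name for the integer column
    npartitions=10,
)
```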
Cheers!