Long running query timing out even with querytime=0 in context

Denodo 8, update 20230914, SQL Server database. The following query is finishing with an error:

`select * from metrics_dashboards.dv_monitor_datasources CONTEXT('cache_preload'='true', 'cache_wait_for_load'='true', 'cache_return_query_results'='false', 'cache_invalidate'='all_rows', 'cache_atomic_operation'='false', 'SWAP'='ON', 'swapsize'='100', 'queryTimeout'='0') TRACE`

Finished with error:

`Error executing query. Total time 9951.675 seconds. DV_MONITOR_DATASOURCES [VIRTUAL] [THREAD_TIMEOUT] DV_MONITOR_DATASOURCES [JDBC WRAPPER] [THREAD_TIMEOUT]`

Do we need to set `chunkTimeout` or `thread_timeout`? If so, how?
15-03-2024 15:14:25 -0400

1 Answer

Hi,

From what you've shared, you are running a cache load query that is timing out at the thread level. This doesn't look like a case where the query was running exactly as it should and got interrupted by a timeout setting: since your query's execution time is about 2.7 hours, this is really more of an optimization problem.

To tackle the issue, you'll want to examine the execution trace of the cache load, looking for bottlenecks. A great resource on how to do this is the article *Detecting Bottlenecks In A Query*. That article is part 4 in a series; parts 1-3 are great resources as well: *Modeling Big Data And Analytic Use Cases*, *Configuring The Query Optimizer*, and *Caching*.

As you work your way through the articles and find and address bottlenecks, two things to consider are 1) which cache mode is best for this situation and 2) the potential for using bulk data load APIs.

For the appropriate cache mode: if most of the data is the same as what is already cached, but extra rows need to be added to the result set, **partial cache mode** with the **explicit loads** option might be more appropriate than **full cache mode**.

As for bulk data load APIs, you can determine whether the database used as the cache system supports the bulk data load feature by going to `Server Configuration --> Cache --> Read & Write` and checking whether the "Use Bulk Data Load APIs" checkbox exists. Using this feature can increase the performance of the query when huge amounts of data are being loaded into the cache.

Hope this helps!
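For reference, a minimal VQL sketch of switching a view to partial cache with explicit loads and then running an explicit load. The view name is taken from your query; the exact cache keywords are an assumption and may vary by Denodo version, so please verify the syntax in the VQL Guide for your update:

```sql
-- Assumption: this view currently uses full cache mode.
-- CACHE PARTIAL EXPLICIT_LOADS is illustrative; confirm the exact
-- keywords in the Denodo VQL Guide for your version.
ALTER VIEW dv_monitor_datasources
  CACHE PARTIAL EXPLICIT_LOADS;

-- An explicit load then populates the cache without returning rows
-- to the client, similar to the CONTEXT clause you are already using:
SELECT *
FROM metrics_dashboards.dv_monitor_datasources
CONTEXT('cache_preload' = 'true',
        'cache_return_query_results' = 'false');
```

With explicit loads, only the queries you run with `'cache_preload'='true'` populate the cache, so you can load just the missing rows instead of reloading everything.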
Denodo Team
18-03-2024 16:56:14 -0400