Delegate LIMIT to Spark through JDBC

Hi, we use Denodo on top of Spark through JDBC. We have some derived views built on top of some big tables. We would like to automatically limit the number of records a user can query from these views, rather than relying on the user to specify a LIMIT clause. For example, assuming the allowable limit is 100, "select * from derived_view" should return only 100 records, and "select country, count(1) from derived_view" should return counts for 100 different countries. In other words, the LIMIT should be delegated as the last operation, after all the aggregations/transformations/joins are done on Spark. Is there a way to achieve this in Denodo? Cheers
13-05-2016 12:11:41 -0400
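The "LIMIT applied last" semantics asked for above can be illustrated with a small self-contained sketch. This uses an in-memory SQLite table as a stand-in for the Spark-backed derived view (the table name and sample data are made up; the real setup is Denodo querying Spark over JDBC):

```python
import sqlite3

# Toy data standing in for the big table behind the derived view.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE derived_view (country TEXT)")
conn.executemany(
    "INSERT INTO derived_view VALUES (?)",
    [("US",), ("US",), ("FR",), ("FR",), ("FR",), ("DE",)],
)

# LIMIT applied last (the desired behavior): the aggregation runs over
# ALL rows, and only the *result* rows are capped (here, to 2 countries).
limited_agg = conn.execute(
    "SELECT country, COUNT(*) FROM derived_view "
    "GROUP BY country ORDER BY country LIMIT 2"
).fetchall()
print(limited_agg)  # 2 (country, count) rows, with full per-country counts

# LIMIT applied first (what the question wants to avoid): capping the
# base rows BEFORE aggregating distorts the counts.
early_limit = conn.execute(
    "SELECT country, COUNT(*) FROM "
    "(SELECT * FROM derived_view LIMIT 2) sub GROUP BY country"
).fetchall()
print(early_limit)  # counts computed over only 2 base rows
```

The first query returns complete counts for a capped number of groups; the second returns counts computed from a truncated input, which is why where the LIMIT is injected into the delegated query matters.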

1 Answer

Have you checked out custom policies? With a custom policy you can restrict the number of rows returned by a query. You can refer to the section "CUSTOM POLICIES" in the Developer Guide. Hope it helps.
Denodo Team
17-05-2016 15:34:45 -0400