Q&A


Pass-Through Kerberos Authentication - Hive
answered 11-12-2019 02:07:32 -0500

I am looking for information on setting up a new data source using JDBC with pass-through credentials to a Hive metastore that uses Kerberos authentication. Please share setup and configuration details.

Answers: 5

kerberos authentication debug
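For reference, a HiveServer2 JDBC URL for Kerberos typically carries a `principal` parameter; Denodo's pass-through option then forwards the session credentials instead of fixed ones. Below is a minimal sketch of building such a URL; the host, port, database, service name, and realm are all placeholders, not values from this question.

```python
# Sketch: building a Kerberos-enabled HiveServer2 JDBC URL for use in a
# Denodo JDBC data source. All parameter defaults are placeholder assumptions.
def hive_kerberos_jdbc_url(host, port=10000, database="default",
                           service="hive", realm="EXAMPLE.COM"):
    """Return a HiveServer2 JDBC URL with a Kerberos service principal.

    The `principal` parameter tells the Hive JDBC driver which service
    principal to request a ticket for; the driver expands `_HOST` to the
    server's canonical hostname.
    """
    return (f"jdbc:hive2://{host}:{port}/{database};"
            f"principal={service}/_HOST@{realm}")

print(hive_kerberos_jdbc_url("hive.example.com"))
# jdbc:hive2://hive.example.com:10000/default;principal=hive/_HOST@EXAMPLE.COM
```

With pass-through enabled on the data source, the same URL is used but the Kerberos ticket comes from the connecting user's session rather than a stored keytab.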

"No more data to read from socket" Error
answered 10-12-2019 20:27:03 -0500

Hi Denodo team, we are facing the following error while running a job: Initialization errors: No more data to read from socket. This job was executed one month ago and everything was fine. Any idea? Thanks in advance

Answers: 1

Connect to Test Drive instance using ODBC
answered 09-12-2019 20:08:30 -0500

Hello, I registered for the Test Drive and the virtual machine is running. For testing, I'd like to connect to it using ODBC from my machine. Is that possible? What I've done so far: Installed the ODBC driver. In the Windows ODBC manager: Add new ODBC d...

Answers: 1

ODBC Testing
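As a rough illustration, a DSN-less ODBC connection string for a Denodo server can also be built by hand. The driver name and the default ODBC port below are assumptions; check the exact driver string registered in your ODBC manager.

```python
# Sketch: a DSN-less ODBC connection string for the Denodo ODBC driver.
# Driver name and port 9996 are assumptions; verify them in your ODBC manager.
def denodo_odbc_conn_str(host, database, user, password,
                         driver="DenodoODBC Unicode(x64)", port=9996):
    """Return an ODBC connection string targeting a Denodo VDP database."""
    return (f"DRIVER={{{driver}}};SERVER={host};PORT={port};"
            f"DATABASE={database};UID={user};PWD={password}")

conn_str = denodo_odbc_conn_str("testdrive-host", "admin", "user", "secret")
```

A string like this could then be passed to any ODBC client (for example `pyodbc.connect(conn_str)`), provided the driver is installed and the instance's ODBC port is reachable from your machine.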

Create an empty csv file if a query returns any results
answered 09-12-2019 15:10:25 -0500

Hi Denodo Team, I have a job on the Scheduler server running a query and creating a CSV file with the result of the query. But in case a specific field contains NULL values, the file should be generated as empty. What I have done is to sp...

Answers: 3

Importing flat files using FTP Server
answered 09-12-2019 13:26:49 -0500

How can I connect my VDP to an FTP server? I want to import flat files from the server.

Answers: 1

FTP
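One generic approach is to download the flat files from the FTP server to a local folder that a delimited-file data source can read. The host, credentials, and the extension filter below are placeholder assumptions.

```python
from ftplib import FTP

# Sketch: pull flat files from an FTP server into a local folder.
# Host, credentials, and the extension filter are placeholder assumptions.
def is_flat_file(name, extensions=(".csv", ".txt")):
    """Keep only the delimited/flat files we want to import."""
    return name.lower().endswith(extensions)

def download_flat_files(host, user, password, remote_dir, local_dir):
    """Download every matching file from `remote_dir` to `local_dir`."""
    with FTP(host) as ftp:
        ftp.login(user, password)
        ftp.cwd(remote_dir)
        for name in ftp.nlst():
            if is_flat_file(name):
                with open(f"{local_dir}/{name}", "wb") as f:
                    ftp.retrbinary(f"RETR {name}", f.write)
```

Denodo's delimited-file data sources can also be pointed at an FTP route directly, which may remove the need for a local copy; the sketch above is one alternative when a local staging folder is preferred.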

List all interfaces with field names
answered 09-12-2019 12:12:35 -0500

We're looking to get a list of all our interfaces, with field names, and preferably attributes (ex. data type, max length). The end goal is to allow report developers to be able to search for all interfaces which contain specific field names. Is this p...

Answers: 2

names Interface statistics
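Denodo exposes catalog stored procedures that can list views and their columns; a query over them could feed the search the question describes. The procedure name `GET_VIEW_COLUMNS()` and its output column names below are assumptions from the Denodo documentation; verify them against your version. A small helper that builds such a VQL statement:

```python
# Sketch: build a VQL query listing views/interfaces with their fields,
# using Denodo's GET_VIEW_COLUMNS() catalog stored procedure (procedure
# and column names assumed; check your Denodo version's docs).
def interface_fields_vql(field_name=None):
    """Return VQL listing view, column, and type; optionally filter by field name."""
    vql = ("SELECT view_name, column_name, column_vdp_type "
           "FROM GET_VIEW_COLUMNS()")
    if field_name:
        vql += f" WHERE column_name = '{field_name}'"
    return vql
```

Running the resulting statement through any VDP client (JDBC, ODBC, or the VQL shell) would return one row per view/field pair, which report developers could then search by field name.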

Databricks as a Cache store for Denodo
answered 09-12-2019 10:18:26 -0500

Has anyone used Databricks as a Cache store for Denodo? I am able to connect to Databricks cluster successfully but not able to configure it as a Cache.

Answers: 4

How to create self-referential data source?
answered 06-12-2019 15:28:28 -0500

Hello Denodo experts, I want to create a base view from a query on a self-referential data source. How do I create this self-referential data source? Please advise. Thanks!

Answers: 1

Database Baseview DataSource

Updates failing in scheduler
answered 06-12-2019 02:51:05 -0500

I have designed a VDP job in Scheduler. I have three columns in the MSSQL table and all three of them are part of the primary key. In the exporter section of the scheduler job I have set the config as below: Delete table contents to false, update tupl...

Answers: 2

VDP Denodo Scheduler 7.0 Export MSSQL