Latest Activity
I have followed the steps below to configure Databricks as cache and MPP in Denodo 8.0 from this link https://community.denodo.com/kb/en/view/document/Configuring%20Databricks%20as%20MPP%20and%20Cache#h.df1a9s2oqcco Everything is OK, but querying table...
I am testing the bulk load feature now (Hive and HDFS). In our production environment, we use Apache Ranger (LDAP) for authorization in Hive and HDFS. I got an error checking whether the HDFS path exists. The connection between VDP and HDFS is normal. We also check...
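When Ranger denies a path check, it can help to reproduce the access from the shell as the same user VDP runs under. A minimal sketch, assuming a Kerberized cluster; the principal and paths below are placeholders, not taken from the post:

```shell
# Sanity-check HDFS access as the same principal VDP uses
# (principal and paths are illustrative placeholders)
kinit denodo_svc@EXAMPLE.COM                          # only if the cluster is Kerberized
hdfs dfs -test -d /user/denodo/bulkload && echo "path exists"
hdfs dfs -ls /user/denodo/bulkload                    # Ranger must grant read/execute here
hdfs dfs -touchz /user/denodo/bulkload/_perm_check    # and write, for bulk load staging
```

If any of these commands fail with a permission error, the Ranger policy for that path (rather than the VDP configuration) is the likely cause.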
I configured the bulk data load API with Databricks as cache and set up the Databricks CLI (dbfs) on our Denodo Linux server. Testing succeeds: Listing HDFS URI contents ... OK Creating HDFS URI temporary directory ... OK Creating table ... OK...
Hi, I'm testing data file loading using Denodo Express 8.0 for Windows 64-bit. I have an Oracle instance running on the same computer. I ran catldr.sql as sysdba on the Oracle database. I created a data source (dstrcr) and activated "Use Bulk D...
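For reference, the prerequisite steps described above can be sketched as below. The connect string is a placeholder; catldr.sql ships under the standard Oracle admin directory:

```shell
# Create the direct-path load views once, as SYSDBA
# (?/ expands to $ORACLE_HOME inside sqlplus)
sqlplus / as sysdba @?/rdbms/admin/catldr.sql

# Verify sqlldr is reachable from the account that runs Denodo;
# with no arguments it prints its usage banner
which sqlldr
sqlldr
```

Both the catldr.sql step and a working sqlldr on the PATH are needed before Denodo's "Use Bulk Data Load APIs" option can hand files to Oracle.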
I am getting an error when trying to use the bulk data load API for a Databricks source in Denodo 8.0. I configured the Databricks source and can connect successfully. I was able to install the databricks-cli on my server and configured the Databricks token as in [http...
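A quick way to rule out the token setup is to exercise the CLI directly before Denodo uses it. A minimal sketch with the legacy databricks-cli (the workspace URL and token are supplied interactively):

```shell
# Install and configure the Databricks CLI with a personal access token
pip install databricks-cli
databricks configure --token    # prompts for the workspace host URL and token

# Confirm the token and host work before wiring them into the bulk data load API
dbfs ls dbfs:/
```

If `dbfs ls` fails here, the bulk data load API will fail the same way, so fixing the CLI configuration first narrows the problem down.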
Would like to know how to configure the bulk data load API for a Databricks source in Denodo 8.0. I have gone through the bulk data load API docs for Databricks, Spark, and Impala, but I didn't understand how to configure it. Can you please let me know the step b...
I'm using the Denodo Distributed File System Custom Wrapper. On S3, we have a file that gets generated weekly with a full dump of data. Is there a best way to always get the most recent file whenever requested? Also, is there a date range supported using the cust...
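One way to resolve "the most recent file" outside the wrapper is to ask S3 for the newest key under the dump's prefix. A sketch with the AWS CLI, using JMESPath to sort by modification time; the bucket and prefix are placeholders:

```shell
# Return the key of the most recently modified object under a prefix
# (bucket and prefix are hypothetical examples)
aws s3api list-objects-v2 \
  --bucket my-bucket \
  --prefix weekly-dump/ \
  --query 'sort_by(Contents, &LastModified)[-1].Key' \
  --output text
```

The resulting key could then be fed to the wrapper's path parameter; alternatively, if the weekly files follow a date-stamped naming convention, an interpolation variable in the route might avoid the lookup entirely.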
I am sourcing data from Denodo using SSIS and I am getting the error below. I don't have an idea what I could be doing wrong; please assist. [ADO NET Source [2]] Error: System.Data.OleDb.OleDbException (0x80040E14): ERROR: Syntax error: Exception parsi...
Hi, I can have scenarios where the result set of a request is huge, and the consumer does not want paginated REST access. Is it possible to take the request via REST, generate a file for the payload, and send the file to a file store for the consumer to pick up?
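One pattern for this is to have a job call the REST endpoint once, stream the whole payload to a file, and push that file to object storage. A minimal sketch, assuming Denodo's RESTful web service URL layout and placeholder credentials, host, database, view, and bucket names:

```shell
# Pull the full result set as JSON and stage it as a file
# (host, credentials, database, view, and bucket are all placeholders)
curl -s -u vdp_user:vdp_pass \
  "https://vdp-host:9443/denodo-restfulws/mydb/views/myview?%24format=json" \
  -o payload.json

# Hand the file to a store the consumer can poll
aws s3 cp payload.json s3://my-bucket/exports/payload.json
```

A Denodo Scheduler export job can achieve the same end without a custom script, which may be preferable if very large result sets would otherwise time out over HTTP.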
Hi, I am trying to use sqlldr to load the cache on an Oracle database. sqlldr runs fine under the denodo system account; I just had to add export LD_LIBRARY_PATH=/usr/lib/oracle/18.3/client64/lib:$LD_LIBRARY_PATH to the profile. Nevertheless, it does...
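A common gotcha with this setup: a profile export only affects interactive shells, so a Denodo server started as a service may never see LD_LIBRARY_PATH. A sketch of the check, run from the same environment that launches the VDP server:

```shell
# Make the Oracle client libraries visible to the process that launches Denodo,
# not only to interactive login shells (services do not read the profile)
export LD_LIBRARY_PATH=/usr/lib/oracle/18.3/client64/lib:$LD_LIBRARY_PATH

# Confirm sqlldr's shared libraries all resolve in this environment
ldd "$(command -v sqlldr)" | grep -i "not found" || echo "libraries resolved"
```

If Denodo is started via an init script or systemd unit, the export (or an Environment= line) belongs there rather than in the user's profile.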