Denodo community Q&A RSS feed: latest Denodo community answered questions

spark test bulk load with warning (2025-07-14)

Hi,
I tried to create a data source with Spark. The test connection was successful, but when I tried to test the bulk load, the error shown below occurred.
`CREATE TABLE vdb_1752479374923 (expirationDate BIGINT, rowStatus STRING ) USING PARQUET OPTIONS('path'='s3a://poc-sinopac-replication-bucket/Spark/vdb_1752479374923')`
> Error while compiling statement: FAILED: ParseException line 1:74 missing EOF at 'USING' near ')'
> org.apache.hive.service.cli.HiveSQLException: Error while compiling statement: FAILED: ParseException line 1:74 missing EOF at 'USING' near ')'
> at org.apache.hive.jdbc.Utils.verifySuccess(Utils.java:300)
> at org.apache.hive.jdbc.Utils.verifySuccessWithInfo(Utils.java:286)
> at org.apache.hive.jdbc.HiveStatement.runAsyncOnServer(HiveStatement.java:324)
> at org.apache.hive.jdbc.HiveStatement.execute(HiveStatement.java:265)
> at com.denodo.vdb.util.datamovement.TestDataMovementUtil.sql(TestDataMovementUtil.java:38)
> at com.denodo.vdb.util.tablemanagement.sql.HadoopTableManager.getDataMovementAction(HadoopTableManager.java:848)
> at com.denodo.vdb.util.tablemanagement.sql.HadoopTableManager.executeCreateTableToTestBulkDataLoad(HadoopTableManager.java:807)
> at com.denodo.vdb.util.tablemanagement.sql.HadoopTableManager.createTableToTestBulkDataLoad(HadoopTableManager.java:801)
> at com.denodo.vdb.util.tablemanagement.sql.HadoopTableManager.testDataMovement(HadoopTableManager.java:447)
> at com.denodo.vdb.util.introspectionservice.actions.TestHdfsDataMovementAction.execute(TestHdfsDataMovementAction.java:154)
> at com.denodo.vdb.interpreter.execution.TestHdfsDataMovementAction.exec(TestHdfsDataMovementAction.java:87)
> at com.denodo.vdb.interpreter.execution.Action.run(Action.java:565)
> at com.denodo.vdb.interpreter.execution.processor.VDBActionProcessor.runAction(VDBActionProcessor.java:1763)
> at com.denodo.vdb.interpreter.execution.processor.VDBActionProcessor.process(VDBActionProcessor.java:2394)
> at com.denodo.vdb.interpreter.execution.processor.VDBActionProcessor.process(VDBActionProcessor.java:1370)
> at com.denodo.vdb.interpreter.execution.processor.VDBActionProcessor.preProcess(VDBActionProcessor.java:817)
> at com.denodo.vdb.interpreter.execution.Action.start(Action.java:695)
> at com.denodo.vdb.interpreter.execution.ExecutionEngine.execute(ExecutionEngine.java:349)
> at com.denodo.vdb.interpreter.execution.ExecutionEngine.execute(ExecutionEngine.java:239)
> at com.denodo.vdb.interpreter.execution.ExecutionEngine.execute(ExecutionEngine.java:229)
> at com.denodo.vdb.vdbinterface.server.QueryExecutorImpl.doExecuteStatement(QueryExecutorImpl.java:435)
> at com.denodo.vdb.vdbinterface.server.QueryExecutorImpl.executeStatement(QueryExecutorImpl.java:416)
> at com.denodo.vdb.vdbinterface.server.QueryExecutorImpl.executeStatement(QueryExecutorImpl.java:448)
> at jdk.internal.reflect.GeneratedMethodAccessor162.invoke(Unknown Source)
> at java.base/jdk.internal.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
> at java.base/java.lang.reflect.Method.invoke(Method.java:568)
> at com.denodo.internal.o.a.r.server.ObjectRef.invoke(ObjectRef.java:110)
> at com.denodo.internal.o.a.r.netty.RMIServerHandler.dispatch(RMIServerHandler.java:198)
> at com.denodo.internal.o.a.r.netty.RMIServerHandler.channelRead(RMIServerHandler.java:82)
> at com.denodo.internal.i.n.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:444)
> at com.denodo.internal.i.n.channel.AbstractChannelHandlerContext.access$600(AbstractChannelHandlerContext.java:61)
> at com.denodo.internal.i.n.channel.AbstractChannelHandlerContext$7.run(AbstractChannelHandlerContext.java:425)
> at java.base/java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1136)
> at java.base/java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:635)
> at com.denodo.internal.i.n.util.concurrent.FastThreadLocalRunnable.run(FastThreadLocalRunnable.java:30)
> at java.base/java.lang.Thread.run(Thread.java:840)
> Caused by: org.apache.hive.service.cli.HiveSQLException: Error while compiling statement: FAILED: ParseException line 1:74 missing EOF at 'USING' near ')'
> at org.apache.hive.service.cli.operation.Operation.toSQLException(Operation.java:335)
> at org.apache.hive.service.cli.operation.SQLOperation.prepare(SQLOperation.java:199)
> at org.apache.hive.service.cli.operation.SQLOperation.runInternal(SQLOperation.java:260)
> at org.apache.hive.service.cli.operation.Operation.run(Operation.java:247)
> at org.apache.hive.service.cli.session.HiveSessionImpl.executeStatementInternal(HiveSessionImpl.java:541)
> at org.apache.hive.service.cli.session.HiveSessionImpl.executeStatementAsync(HiveSessionImpl.java:527)
> at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
> at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
> at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
> at java.lang.reflect.Method.invoke(Method.java:498)
> at org.apache.hive.service.cli.session.HiveSessionProxy.invoke(HiveSessionProxy.java:78)
> at org.apache.hive.service.cli.session.HiveSessionProxy.access$000(HiveSessionProxy.java:36)
> at org.apache.hive.service.cli.session.HiveSessionProxy$1.run(HiveSessionProxy.java:63)
> at java.security.AccessController.doPrivileged(Native Method)
> at javax.security.auth.Subject.doAs(Subject.java:422)
> at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1899)
> at org.apache.hive.service.cli.session.HiveSessionProxy.invoke(HiveSessionProxy.java:59)
> at com.sun.proxy.$Proxy58.executeStatementAsync(Unknown Source)
> at org.apache.hive.service.cli.CLIService.executeStatementAsync(CLIService.java:312)
> at org.apache.hive.service.cli.thrift.ThriftCLIService.ExecuteStatement(ThriftCLIService.java:562)
> at org.apache.hive.service.rpc.thrift.TCLIService$Processor$ExecuteStatement.getResult(TCLIService.java:1557)
> at org.apache.hive.service.rpc.thrift.TCLIService$Processor$ExecuteStatement.getResult(TCLIService.java:1542)
> at org.apache.thrift.ProcessFunction.process(ProcessFunction.java:39)
> at org.apache.thrift.TBaseProcessor.process(TBaseProcessor.java:39)
> at org.apache.hive.service.auth.TSetIpAddressProcessor.process(TSetIpAddressProcessor.java:56)
> at org.apache.thrift.server.TThreadPoolServer$WorkerProcess.run(TThreadPoolServer.java:286)
> at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
> at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
> at java.lang.Thread.run(Thread.java:750)
> Caused by: java.lang.RuntimeException: org.apache.hadoop.hive.ql.parse.ParseException:line 1:74 missing EOF at 'USING' near ')'
> at org.apache.hadoop.hive.ql.parse.ParseDriver.parse(ParseDriver.java:231)
> at org.apache.hadoop.hive.ql.parse.ParseUtils.parse(ParseUtils.java:74)
> at org.apache.hadoop.hive.ql.parse.ParseUtils.parse(ParseUtils.java:67)
> at org.apache.hadoop.hive.ql.Driver.compile(Driver.java:616)
> at org.apache.hadoop.hive.ql.Driver.compileInternal(Driver.java:1826)
> at org.apache.hadoop.hive.ql.Driver.compileAndRespond(Driver.java:1773)
> at org.apache.hadoop.hive.ql.Driver.compileAndRespond(Driver.java:1768)
> at org.apache.hadoop.hive.ql.reexec.ReExecDriver.compileAndRespond(ReExecDriver.java:126)
> at org.apache.hive.service.cli.operation.SQLOperation.prepare(SQLOperation.java:197)
> ... 27 more
But when I run the same statement in spark-shell, it succeeds, showing only these warnings:
25/07/14 03:37:46 WARN MetricsConfig: Cannot locate configuration: tried hadoop-metrics2-s3a-file-system.properties,hadoop-metrics2.properties
25/07/14 03:37:47 WARN HadoopFSUtils: The directory s3a://poc-sinopac-replication-bucket/Spark/vdb_1752479374923 was not found. Was it deleted very recently?
res0: org.apache.spark.sql.DataFrame = []
How can I debug?
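One hedged reading of the trace: the ParseException is thrown by Hive's own parser (org.apache.hadoop.hive.ql.parse.ParseDriver in the "Caused by" section), while `CREATE TABLE ... USING PARQUET OPTIONS(...)` is Spark SQL data-source syntax that plain HiveQL does not accept. That would suggest the JDBC endpoint Denodo is talking to is a HiveServer2 rather than a Spark Thrift Server. For manual testing only, a sketch of the HiveQL equivalent of the generated DDL (this is not what Denodo itself emits):

```sql
-- Hedged sketch: HiveQL equivalent of the Spark SQL DDL above, useful to
-- confirm which dialect the JDBC endpoint actually parses.
CREATE EXTERNAL TABLE vdb_1752479374923 (expirationDate BIGINT, rowStatus STRING)
STORED AS PARQUET
LOCATION 's3a://poc-sinopac-replication-bucket/Spark/vdb_1752479374923';
```

If this form parses where the `USING` form fails, the endpoint is speaking HiveQL, and the data source's bulk load configuration (or the JDBC URL and port) would be the place to look.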
Denodo metadata into Power BI file (2025-07-10)

Hi,
I have Power BI as the end consumption layer, and it is important to have descriptions added at the field level so that users understand the purpose of each field. Power BI consumes from Denodo, and Denodo does have the descriptions added at the field level; they can be accessed via the Denodo Catalog. Can anyone help me with how to get these descriptions into Power BI? Is there a way to do this without manual work? Any suggestions on alternatives?
Thanks in Advance
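A hedged pointer for anyone with the same need: Denodo exposes view and column metadata through predefined stored procedures, so the descriptions can be queried like ordinary data and published as a (view, field, description) mapping that Power BI can import. A minimal sketch, assuming the predefined GET_VIEW_COLUMNS() procedure; the exact output field names can vary by Denodo version, and 'my_database' is a placeholder:

```sql
-- Hedged sketch: read field-level descriptions from VDP metadata.
-- GET_VIEW_COLUMNS() is a predefined Denodo stored procedure; the name of
-- the description/remarks output field may differ by version.
SELECT view_name,
       column_name,
       column_remarks AS field_description
FROM GET_VIEW_COLUMNS()
WHERE input_database_name = 'my_database';  -- placeholder database name
```

Wiring the descriptions onto the Power BI model fields themselves would still need tooling on the Power BI side, since, as far as we know, the connector does not pull column descriptions across automatically.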
Can PDF files be linked directly to Denodo? (2025-07-04)

Denodo can connect to a variety of data sources, but is it possible to link PDF files in their entirety? For example, can we use a data catalog to view PDF files linked to Denodo?

XML DATASOURCE (2025-07-02)

Hi,
I would like to create an XML data source for a file that has an XLS extension. The file is located at an SFTP route. How can I create the data source and the base view?
Best regards,
Olga

Date Format - Arabic calendar (2025-06-30)

Hi Team,
My Oracle source table has dates in a format like 31-DEC-22. When I bring this value into Denodo it gets converted into a timestamp, which I then converted into a date format like 2022-12-31.
Now, when I try to apply a filter condition like the one below,
`UPPER(formatdate('dd-MMM-yy',to_localdate('yyyy-MM-dd','2022-12-31')))`
it doesn't work, because the expression above gets rendered with the Arabic calendar as something like 31-12-'<some Arabic word>', so the condition fails.
Could you please provide syntax for converting the date 2022-12-31 into 31-DEC-22 so that it doesn't fail in Denodo? (I know the expression above would normally work, but since my server is in the Middle East region, the locale renders it as 31-12-'<some Arabic word>'.) Any suggestions would be appreciated.
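A hedged, locale-independent workaround sketch: instead of the locale-sensitive 'MMM' month pattern, build the English month abbreviation explicitly. This assumes Denodo's formatdate(), getmonth() and concat() built-ins; input_date and my_view are placeholders:

```sql
-- Hedged sketch: produce 31-DEC-22 regardless of the server locale by
-- avoiding the locale-sensitive 'MMM' month pattern.
SELECT concat(
         formatdate('dd', input_date), '-',
         CASE getmonth(input_date)
           WHEN 1  THEN 'JAN' WHEN 2  THEN 'FEB' WHEN 3  THEN 'MAR'
           WHEN 4  THEN 'APR' WHEN 5  THEN 'MAY' WHEN 6  THEN 'JUN'
           WHEN 7  THEN 'JUL' WHEN 8  THEN 'AUG' WHEN 9  THEN 'SEP'
           WHEN 10 THEN 'OCT' WHEN 11 THEN 'NOV' WHEN 12 THEN 'DEC'
         END, '-',
         formatdate('yy', input_date)
       ) AS dd_mon_yy
FROM my_view;  -- placeholder view with a date field input_date
```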
Denodo Storage Costs (2025-06-30)

Hello, what are the storage costs of Denodo? Does anyone have an idea how it works? Thanks.

How to Implement Dynamic Filtering in Denodo Views Based on User's Department - DENODO 9 (2025-06-20)

Hello,
I am looking to implement dynamic filtering in Denodo, where each user only sees data relevant to their own department. The goal is for the view to automatically filter data based on the department associated with the currently logged-in user.
Could you please advise how to link the user session to a department table or view, so that this filtering happens dynamically? If possible, I would really appreciate a small example or sample code to help illustrate this setup.
Thank you very much for your support!
Best regards
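A hedged sketch of one common approach (the view and table names here are hypothetical): join the data against a mapping of users to departments and filter on GETSESSION('user'), Denodo VQL's built-in accessor for the current session's user name:

```sql
-- Hedged sketch: per-user dynamic filtering via a user-to-department
-- mapping view. 'sales' and 'user_department' are hypothetical views.
CREATE OR REPLACE VIEW sales_by_my_department AS
SELECT s.*
FROM sales s
JOIN user_department u
  ON s.department = u.department
WHERE u.username = GETSESSION('user');  -- current logged-in user
```

Denodo also supports row restrictions attached to roles, which may be easier to govern at scale than per-view joins.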
How to Create User-Based Dynamic Views in Denodo Based on Department in DENODO 9 (2025-06-20)

Hello,
I would like to create dynamic views in Denodo where each user only sees the data related to their department. The idea is that, upon logging in, a user should automatically be restricted to the data of their respective department.
Could you please advise how to link the current user session to a department table or view, so that I can filter the data accordingly? If possible, I would also appreciate a small example or sample code demonstrating this setup.
Thank you very much in advance for your help!
Best regards

Which version of the Helm Charts belongs to Denodo 9.2.2? Where can I find that information? (2025-06-17)

We are trying to upgrade from 9.2.1 with charts 2.0.6 to 9.2.2 with charts 2.0.7 (latest and greatest).
Charts 2.0.7 don't seem to match 9.2.2 because of a mount issue with custom-scripts in the SolutionManager.

Informatica AXON to Denodo (2025-06-13)

Hi,
Is it possible to connect Informatica AXON to Denodo, pull the AXON database tables into Denodo, and create reports?
I know it is possible the other way around, but can you guide me on this?