
Load data from Denodo View to Google Cloud Storage

Hi. We have views created in Denodo that we would like to load into Google Cloud Storage (the data will eventually be moved into BigQuery to build a subject layer). Has anyone been able to do this? We are currently in the very early stages of testing Google Cloud Platform capabilities. My understanding is that BigQuery allows table creation from GCS, but only from the following file formats: CSV, JSON, Avro, Parquet, ORC, and Cloud Datastore backup. Any feedback is appreciated. Thank you!
user
25-10-2019 10:31:21 -0400

3 Answers

Hi, you could create REST web services that return the output data as JSON, load the JSON files into Google Cloud Storage, and then create tables in BigQuery. You may refer to the section [Creating REST Web Services](https://community.denodo.com/docs/html/browse/7.0/vdp/vql/publication_of_web_services/creating_rest_web_services/creating_rest_web_services) of the Virtual DataPort VQL Guide for more detailed information. Hope this helps!
Denodo Team
30-10-2019 12:31:08 -0400
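
A minimal sketch of that approach, assuming a REST web service is already published for the view. The Denodo endpoint, credentials, bucket, and object names below are placeholders, and the response is assumed to wrap the rows in an `elements` array; BigQuery loads JSON from GCS as newline-delimited JSON, so the rows are re-serialized one object per line before uploading:

```python
import json

import requests
from google.cloud import storage

# Hypothetical Denodo REST web service endpoint and GCS locations.
DENODO_URL = "https://denodo-host:9090/server/sales/customer_ws/views/customer_view"
GCS_BUCKET = "my-denodo-staging"
GCS_OBJECT = "exports/customer_view.json"

# Fetch the view through the REST web service (basic auth assumed).
resp = requests.get(DENODO_URL, auth=("vdp_user", "vdp_password"))
resp.raise_for_status()
rows = resp.json().get("elements", [])

# BigQuery expects newline-delimited JSON, so write one object per line.
ndjson = "\n".join(json.dumps(row) for row in rows)

# Upload the file to Google Cloud Storage.
blob = storage.Client().bucket(GCS_BUCKET).blob(GCS_OBJECT)
blob.upload_from_string(ndjson, content_type="application/json")
print(f"Uploaded {len(rows)} rows to gs://{GCS_BUCKET}/{GCS_OBJECT}")
```

From there the file can be loaded into a BigQuery table, for example with `bq load --source_format=NEWLINE_DELIMITED_JSON` or the BigQuery client library (see the sketch under the later answer).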
Hi, as opposed to extracting a single record at a time, is there a batch method? Do the REST web services allow batch extraction to JSON? Thank you.
user
20-12-2019 13:59:51 -0500
Hi, at the moment the extraction to JSON has to be done one record at a time. You may also try creating jobs in Denodo Scheduler to export the data into CSV files. More information can be found in the [Example of exporters section](https://community.denodo.com/docs/html/browse/latest/scheduler/administration/creating_and_scheduling_jobs/configuring_new_jobs/general_structure_of_a_job#example-of-exporters-section) of the Scheduler Administration Guide. Hope this helps!
Denodo Team
13-01-2020 04:58:28 -0500
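
A minimal sketch of the final load step, assuming a Scheduler job has already exported the view to CSV and the file has been copied to Cloud Storage; the bucket, object path, project, and dataset/table names are placeholders:

```python
from google.cloud import bigquery

client = bigquery.Client()

# Let BigQuery infer the schema and skip the header row written by the exporter.
job_config = bigquery.LoadJobConfig(
    source_format=bigquery.SourceFormat.CSV,
    skip_leading_rows=1,
    autodetect=True,
)

load_job = client.load_table_from_uri(
    "gs://my-denodo-staging/exports/customer_view.csv",  # placeholder URI
    "my-project.subject_layer.customer_view",            # placeholder table ID
    job_config=job_config,
)
load_job.result()  # wait for the load job to finish
print(f"Loaded {load_job.output_rows} rows into subject_layer.customer_view")
```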