For this scenario, I would suggest one of the following approaches:
* [Mount the Azure Data Lake Storage](https://docs.microsoft.com/en-us/azure/databricks/data/data-sources/azure/azure-datalake#--mount-azure-data-lake-storage-gen1-resource-using-a-service-principal-and-oauth-20) as a local drive, then export the [output of the base view](https://community.denodo.com/docs/html/browse/latest/vdp/administration/creating_derived_views/querying_views/querying_views#:~:text=export%20the%20result%20of%20the%20query%20to%20a%20zip%20file) from Virtual DataPort and place it on that local drive.
* Alternatively, after mounting the Azure Data Lake Storage as a local drive, export the results of the view from a Scheduler job and use the [Denodo SFTP Exported Files Custom Handler](https://community.denodo.com/docs/html/document/denodoconnects/latest/Denodo%20SFTP%20Exported%20Files%20Custom%20Handler%20-%20User%20Manual) to transfer the exported files over SFTP to the mounted drive/directory.
* Besides that, you could create your [own custom handler](https://community.denodo.com/docs/html/browse/latest/scheduler/administration/developer_api/extensions_plugins/handlers#handlers) using the [Denodo4E plugin](https://community.denodo.com/docs/html/browse/latest/denodo4e/index#denodo4eclipse-plugin-guide) that connects to Azure Data Lake. Then, use this custom handler in the Scheduler job so that it picks up the results [exported by the job's execution](https://community.denodo.com/docs/html/browse/latest/scheduler/administration/creating_and_scheduling_jobs/configuring_new_jobs/postprocessing_section_exporters#exporters-section) and places them in the Azure Data Lake Storage.
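For the first approach, once the Data Lake is mounted and visible as a local directory, the export step can be sketched in Python. This is a minimal sketch, not a Denodo-specific utility: it assumes any DB-API cursor (for example, one obtained through an ODBC connection to Virtual DataPort) and a hypothetical destination path under the mount point.

```python
import csv
from pathlib import Path

def export_view_to_mounted_drive(cursor, query, dest_path):
    """Run a query through a DB-API cursor and write the result
    set as CSV into a directory (e.g. the mounted ADLS drive)."""
    cursor.execute(query)
    dest = Path(dest_path)
    # Create the target directory on the mount if it does not exist yet
    dest.parent.mkdir(parents=True, exist_ok=True)
    with dest.open("w", newline="", encoding="utf-8") as f:
        writer = csv.writer(f)
        # Header row from the cursor metadata
        writer.writerow(col[0] for col in cursor.description)
        writer.writerows(cursor.fetchall())
    return dest
```

With a driver such as pyodbc pointed at Virtual DataPort, the call would look like `export_view_to_mounted_drive(conn.cursor(), "SELECT * FROM my_base_view", "/mnt/adls/exports/my_base_view.csv")` (DSN and mount path are placeholders for your environment).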
You could refer to the question [Export denodo views to ADLS (AZURE DATA LAKE)](https://community.denodo.com/answers/question/details?questionId=9064u000000CfAPAA0&title=Export+denodo+views+to+ADLS+%28AZURE+DATA+LAKE%29) for more information.
Hope this helps!