Introduction

This document provides an example of a CI/CD pipeline implemented with Azure DevOps, Git, the Denodo Solution Manager and, optionally, the Denodo Testing Tool. The following diagram serves as a reference for the steps that we will implement.

Prerequisites

This document assumes that you already have your VCS configured. In our example we will use GitHub.

You can find more information about the integration with your VCS in the section Version Control Systems Integration of the Virtual DataPort Administration Guide.

It is also assumed that you already have access to an Azure DevOps account linked to the main repository of your VCS.

Set up Azure Pipelines

A YAML file is essentially a script that Azure DevOps follows to perform a series of tasks, tests, and deployments. It defines:

  • The trigger events that initiate the pipeline run.
  • The environment to run the pipeline, including the type of agent and its configurations.
  • The steps, tasks, or scripts to execute during the pipeline run.
  • Any additional configurations required for the build and deployment process.

The example below will demonstrate the steps to create and configure the YAML file:

  • Navigate to Pipelines and click on New Pipeline.
  • Click on GitHub (YAML).

  • Select your repository.
  • Select Starter Pipeline.
  • Click on Save and run to save the YAML file into your repository in GitHub.

It is possible that we do not yet have resources available for the execution of our pipeline; in that case, we can configure a local agent.

Set up a Local Agent for Azure DevOps CI/CD Pipeline with Denodo

Configuring a local agent in Azure DevOps for CI/CD pipelines, especially when working with Denodo, offers distinct benefits such as local resource access, enhanced control, improved performance, and cost savings.

Install the Azure DevOps Agent

  • Navigate to your Azure DevOps organization.
  • Go to Project settings > Agent Pools.
  • Click on Add pool, select ‘Self-hosted’ as your pool type, enter a name for your agent pool, click on ‘Grant access permission to all pipelines’ and click on ‘Create’.

  • Click on the agent pool created and then go to the Agents section.
  • Click on New agent and select Windows, macOS or Linux (depending on your operating system).
  • Follow the instructions to download and configure the agent.
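For reference, on a Windows machine the configuration typically boils down to two commands, executed from the folder where you extracted the agent package. This is only a minimal sketch: the organization URL, the personal access token (PAT) and the agent name are placeholders that you must replace with your own values.

REM Configure the agent against your Azure DevOps organization and agent pool
REM (the URL, PAT and agent name below are placeholders, not real values)
config.cmd --unattended --url https://dev.azure.com/<your-organization> --auth pat --token <YOUR_PAT> --pool "cicd denodo pipeline" --agent LocalDenodoAgent

REM Start the agent so it can listen for pipeline jobs
run.cmd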

Now we can test whether the agent has been added and whether our pipeline can use it by running a simple example. Remember that we first need to execute the agent's run script so that our project has access to the local resources.

.yml file example:

trigger:
- main

pool:
  name: 'Pool agent name'  # Name of your pool agent.

steps:
- script: echo Hello, world!
  displayName: 'Run a one-line script'

If the agent pool is configured correctly, we can go to the Runs section and see that the execution has completed.


We can even follow the script execution step by step:

For simplicity, let's create a local folder in which we will place some files generated and used during the execution of our pipeline. In our example we will create it in the following path: 'C:/Denodo/DenodoPlatform8.0/pipeline_files'

It is also important to adjust the permissions of the pipeline and its users to the specific use you intend for it; see the Azure Pipelines security documentation for details.

Step by step overview of the reference procedure

  1. Developers work on a private Virtual Database.
  2. Developer pushes changes to the VCS repository.
  3. The push to the VCS notifies Azure DevOps.
  4. Azure DevOps triggers the pipeline to execute the defined jobs and tasks.

Screenshot of the VCS database configuration of the development server

  5. The developer database and the main database both share the same VCS database project1, which allows the main database to pull changes made by the developer(s).
  6. (Optional) Integrate the Denodo Testing Tool, which allows executing a set of automated tests before deploying.
  7. Use the Solution Manager REST API to create a revision from VQL.
  8. Deploy the revision in the target environment.

The next sections will showcase how you could implement the final steps in the Azure DevOps .yml using batch commands and a Python script.

Configuring the Fundamental Components of the Azure DevOps Pipeline

To set the foundation for our Azure DevOps CI/CD pipeline with Denodo, we start by configuring the core components of our script: the trigger, variables, and a job. This base structure will be built upon in subsequent sections of the documentation, where we will add tasks and provide detailed explanations for each part.

Trigger

The trigger specifies when the pipeline should run. In this case, it is set to initiate automatically upon any changes to the main branch.

Trigger configuration example:

trigger:
- main

Variables

Variables are defined to store paths, filenames, and configurations, promoting reusability and easier updates across multiple steps.

Variables configuration example:

variables:
  denodoPath: 'C:/Denodo/DenodoPlatform8.0/bin'
  testingToolPath: 'C:/Denodo/denodo-testing-tool-8.0-20230425'
  pipelineTmpPath: 'C:/Denodo/DenodoPlatform8.0/pipeline_files'
  server: 'localhost:9999/project1'
  vqlFileName: 'vcscommand.vql'
  testingToolConfig: 'configuration.properties'
  testingToolTest: 'test1.denodotest'

Job: AgentJob

This job encompasses various steps, ranging from script execution to artifact publishing, all running on a self-hosted agent specified by our agent pool.

Job configuration example:

jobs:
- job: AgentJob
  pool:
    name: 'cicd denodo pipeline'
  steps:
  # Steps will be defined here in subsequent sections

Now that we have established the basic structure of our job in the Azure DevOps pipeline, it is time to dive into the details and start building out our automation scripts within the steps section. Each step in our pipeline represents an individual task or script that is executed as part of the job.

Azure Pipeline calls the VDP server to pull from the common VCS repository to the main database

To trigger a pull in the main database from Azure, you can use the Denodo import script with a VQL file containing the DVCSPULL command.

However, if your Azure Pipeline server does not run on the same host as your Denodo Platform, you will need to copy the denodo-db-tools folder to the machine where your Azure Pipeline server is running. This specific folder encompasses the import and export scripts along with their required dependencies, enabling the execution of these scripts on remote hosts that do not have the Denodo Platform installed.

You can find the denodo-db-tools under <DENODO_HOME>/tools/db. Make sure to unzip the folder.

  • Create a VQL file containing the DVCSPULL command:

  - script: |
      REM Create VQL file with DVCSPULL command
      echo DVCSPULL project1; > $(pipelineTmpPath)/$(vqlFileName)
    displayName: 'Create DVCSPULL VQL File'

  • Specify the Import Script command:

  - script: |
      REM Use import script to execute PULL
      $(denodoPath)/import.bat --singleuser --file "$(pipelineTmpPath)/$(vqlFileName)" --server $(server)?admin@$(PASSWORD)
    displayName: 'Pull changes to master VDP'

Integrate the Denodo Testing Tool to execute a set of automated tests before deploying (Optional)

By integrating the Denodo Testing Tool into the CI/CD pipeline, you can execute a set of predefined tests that validate your views against their expected results.

This can help you in the testing process by exporting the test results for review by your testers.

You can find extensive documentation in its User Manual. Note that the Denodo Testing Tool is not a requirement for the implementation of a CI/CD pipeline, but it can relieve you of common testing scenarios by automating them.

You can run the Denodo Testing Tool from the .yml file:

Example:

  - script: |
      REM Tests with Denodo Testing Tool
      cd $(testingToolPath)/bin
      REM execute Testing Tool
      denodo-test.bat $(testingToolPath)/conf/$(testingToolConfig) $(testingToolPath)/conf/$(testingToolTest)
    displayName: 'Tests with Denodo Testing Tool'

The configuration.properties file includes the data source connectivity settings as well as the reporter settings. With the reporter setting you can configure how the results of the testing tool will be published.

configuration.properties example:

#...
# --------------------
# Example CSV reporter
# --------------------
reporter.csv=com.denodo.connect.testing.reporter.CSVTestReporter
reporter.csv.pathToOutputFile=C:/Denodo/denodo-testing-tool-8.0-20230425/src/test/resources/CSVReporter.csv
reporter.csv.overwriteOutputFileIfExists=true
#...
# Example VDP data source
vdpdev.driverClassName=com.denodo.vdp.jdbc.Driver
vdpdev.jdbcUrl=jdbc:vdb://localhost:9999/project1
vdpdev.dbAdapter=denodo-8.0.0
vdpdev.username=admin
vdpdev.password=admin
# property required for VDP, as JDBC4's Connection.isValid() is not implemented
vdpdev.connectionTestQuery=select * from dual()
# Example MySQL data source - note the MySQL driver is not included in /lib
# mysql.driverClassName=com.mysql.jdbc.Driver
# mysql.jdbcUrl=jdbc:mysql://localhost/test
# mysql.dbAdapter=mysql-xxx
# mysql.username=test
# mysql.password=test

Example of a simple test file test1.denodotest:

%DESCRIPTION
This test is pretty simple. It will fail if SELECT COUNT(*) FROM iv_test does not return 7450.

#-----------------------------
%EXECUTION[query] {ds:vdpdev}
SELECT COUNT(*) FROM iv_test

#----------------------------
%RESULTS[data]
count
7450

In this example we configured the Testing Tool to export the results to a CSV file. We can parse this file to obtain the test result.

Depending on the result, we can go ahead with the promotion or, if we do not want to promote after a failed test, abort the pipeline execution. One possible way to abort is shown in the sketch below.
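For example, a step like the following could be added after the Testing Tool step to make the pipeline fail when the CSV report contains a FAILED result. This is only a sketch: it assumes the CSV reporter output path configured above and relies on findstr returning success when the string is found.

  - script: |
      REM Abort the pipeline if the Denodo Testing Tool reported a failed test
      cd $(testingToolPath)/src/test/resources
      findstr /C:"FAILED" CSVReporter.csv && exit /b 1
      exit /b 0
    displayName: 'Abort pipeline if a test failed'

Alternatively, the same check can be performed inside the Python script shown later in this document.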

Using the Solution Manager REST API to create a revision from VQL and deploy it

To create a revision from VQL using the Solution Manager REST API we first need the necessary VQL code.

To obtain it, we can use the Denodo export script to create a single VQL file with all the VQL statements of the database. This can be done through a script as part of your CI/CD pipeline. Here's an example:

Export example:

  - script: |
      REM locate in Denodo folder
      cd $(denodoPath)
      REM execute export script
      export.bat -l admin -p $(PASSWORD) -h $(server) -f $(pipelineTmpPath)/$(REVISION_NAME) -P includejars=yes -P includescanners=yes -P includeCustomComponents=yes -P replaceExistingElements=yes
    displayName: 'Export VDP from dev1 to VQL'
    env:
      PASSWORD: $(PASSWORD)
      REVISION: $(REVISION_NAME)

In this script:

  • We navigate to Denodo's bin directory where the export script resides.
  • We then execute the export.bat script with the necessary parameters:
  • -l admin: Specifies the login user, admin in this example.
  • -p $(PASSWORD): Specifies the user's password, securely retrieved from our secret variable.
  • -h $(server): Specifies the Denodo server URL.
  • -f $(pipelineTmpPath)/$(REVISION_NAME): Specifies the path and filename for the exported VQL.
  • Various other parameters to include additional configurations and components in the export.

The PASSWORD is passed as an environment variable to ensure it's securely handled.

This step ensures that the Denodo VDP (Virtual DataPort) is exported to a VQL file, capturing the current state and configurations, ready for versioning or further processes in our CI/CD pipeline.

Configuring Secret Variables for Password and User

To ensure security and maintain best practices, we store sensitive data like the Denodo user's password as secret variables in the pipeline configuration. Azure DevOps encrypts these values and ensures they are not visible in logs or the script.

  • Go to your pipeline settings in Azure DevOps.
  • Navigate to the "Variables" section.
  • Add a new variable named PASSWORD and input the password value. Ensure you mark it as a secret.

  • Add another two variables named USER and REVISION_NAME and set their values.

These variables are now securely stored and can be referenced in the script.
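Keep in mind that Azure DevOps does not expose secret variables to scripts as environment variables automatically; they must be mapped explicitly in each step that needs them, as in this small sketch:

  - script: |
      REM PASSWORD is only visible here because it is mapped in the env section below
      echo Connecting as user $(USER)
    displayName: 'Example step using a secret variable'
    env:
      PASSWORD: $(PASSWORD)  # explicit mapping required for secret variables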

Creating and Publishing Artifacts in Azure DevOps

The creation and publishing of artifacts are crucial steps in a CI/CD pipeline, as they allow for the storage and sharing of results from builds, tests, and other processes between different stages and jobs. In Azure DevOps, this functionality can be achieved using the PublishBuildArtifacts task.

Publishing VQL Artifacts

This step is designed to publish the VQL files generated during the revision creation. These files are stored as artifacts for future use or for review.

Task Configuration:

  - task: PublishBuildArtifacts@1
    inputs:
      PathtoPublish: '$(pipelineTmpPath)/$(REVISION_NAME)'
      ArtifactName: 'VQL for Revision'
      publishLocation: 'Container'

Publishing the CSV File Generated by the Testing Tool

This step publishes the CSV file generated by the Denodo testing tool, providing easy access to the test results.

Task Configuration:

  - task: PublishBuildArtifacts@1
    inputs:
      PathtoPublish: '$(testingToolPath)/src/test/resources/CSVReporter.csv'
      ArtifactName: 'Testing Tool CSV generated file'
      publishLocation: 'Container'

The next step would be to execute the /revisions/loadFromVQL POST request of the Solution Manager REST API.

The request body of this POST request has the following format:

{
    "name": <text>, /* descriptive name for the revision */
    "description": <text>, /* optional. Extensive description about the revision. */
    "content": <text> /* VQL file content as xsd:base64Binary encoded in UTF-8 */
}

NOTE: The field content must be the content of the VQL file xsd:base64Binary encoded in UTF-8.

Finally, the deployment of the revision can be started with the /deployments POST request of the Solution Manager REST API.

The request body of this POST request has the following format:

{
    "revisionIds": [ <number>, ... ],
    "environmentId": <number>,
    "description": <text>
}

One possible way to implement the /revisions/loadFromVQL POST request, the base64 encoding of the VQL file, and the /deployments POST request is to bundle these steps into a single script that takes the required parameters for the POST requests as its input.

In the example below you can see the Python code for a script that bundles this together.

The script also shows how to decide whether the promotion should start at all based on the result of the Denodo Testing Tool test: if the result is FAILED, an error is raised so that the pipeline fails. This check is commented out in the example below; uncomment it if you use the Testing Tool.

import json, argparse
import requests
import base64
import csv  # used by the optional Testing Tool check below

# data for calling the Solution Manager APIs
parser = argparse.ArgumentParser()
parser.add_argument("-s", "--host", type=str)
parser.add_argument("-p", "--port", type=str)
parser.add_argument("-v", "--vqlfile", type=str)  # vql file location
parser.add_argument("-n", "--revname", type=str)  # /revisions/loadFromVQL parameter
parser.add_argument("-d", "--revdescription", type=str)  # /revisions/loadFromVQL parameter
parser.add_argument("-e", "--environmentId", type=int)  # /deployments parameter
parser.add_argument("-c", "--deploymentDescription", type=str)  # /deployments parameter
parser.add_argument("-u", "--username", type=str)
parser.add_argument("-a", "--password", type=str)

# # the next lines of code check if the Denodo Testing Tool succeeded or not
# csvresults = []
# with open('C:/Denodo/denodo-testing-tool-8.0-20230425/conf/CSVReporter.csv', 'r') as csv_file:
#     reader = csv.reader(csv_file)
#     for row in reader:
#         # convert csv to array to get the test result
#         csvresults.append(row)
# # check the result
# if csvresults[1][0] == 'FAILED':
#     # in case the result of the testing tool is FAILED, abort the python script by throwing an error
#     print('The deployment process was aborted because the Denodo Testing Tool Test failed!')
#     raise SystemError('testing tool failed!')
# # in case the testing tool did not fail, the script will go on with the creation of the revision and its deployment

# place script arguments into variables:
args = parser.parse_args()
host = args.host
port = args.port
vqlfile = args.vqlfile
name = args.revname
rev_description = args.revdescription
environment_id = args.environmentId
deployment_description = args.deploymentDescription
username = args.username
password = args.password

deployment_url = "http://" + host + ":" + port + "/deployments"
revision_url = "http://" + host + ":" + port + "/revisions/loadFromVQL"
properties_url = "https://" + host + ":" + port + "/environments/" + str(environment_id) + "/vdpProperties"  # not used in this example

# parsing vql file as binary for input into revision API
try:
    with open(vqlfile, "rb") as file:
        vql = file.read()
except Exception as e:
    print(e)

# strip the UTF-8 Byte Order Mark if present
bom_mark = b'\xef\xbb\xbf'
if vql.startswith(bom_mark):
    vql = vql[len(bom_mark):]
base64_encoded = base64.b64encode(vql).decode('utf-8')

request_body_revision = {"name": name, "description": rev_description, "content": base64_encoded}

# calling create revision url
try:
    headers = {"Content-Type": "application/json"}
    revision_json = requests.post(revision_url, json=request_body_revision, headers=headers, auth=(username, password))
except Exception as e:
    print(e)

print(revision_json.text)  # print for logging in our Pipeline.
revision_json = json.loads(revision_json.text)
revision_id = revision_json['id']  # save revision id as input for deployment request
print(revision_id)  # print for logging

# request body for deployment API
request_body_deployment = {"revisionIds": [revision_id], "environmentId": environment_id, "description": deployment_description}

# deployment request
try:
    deployment_json = requests.post(deployment_url, json=request_body_deployment, auth=(username, password))
except Exception as e:
    print(e)

Note: To run this script in our pipeline, Python must be installed on the agent machine, along with the imported third-party libraries. A dependency-installation step is sketched below.
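Since json, argparse, base64 and csv are part of the Python standard library, requests is the only third-party dependency. A step like the following, added before the script is called, could take care of it (a sketch, assuming pip is available on the agent):

  - script: |
      REM Install the only third-party dependency of the Python script
      python3 -m pip install requests
    displayName: 'Install Python dependencies'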

This Python script is designed to automate the process of managing revisions and deployments in Denodo Solution Manager.

  1. Argument Parsing: It starts by parsing required details such as server connection information, file paths, and credentials from the command-line arguments.

  2. VQL File Processing: The specified VQL file is read, converted to a Base64-encoded string, and prepared for submission.

  3. Revision Creation: It makes an HTTP POST request to create a new revision in Denodo, using the Base64-encoded VQL and other provided details.

  4. Deployment: The script then initiates the deployment of the created revision to the specified environment.

  5. Error Handling and Logging: Throughout the process, any errors encountered are printed to the console, and important information is logged for traceability in the Azure DevOps pipeline.

NOTE: The script includes a step to remove the Byte Order Mark (BOM) before encoding the VQL file into base64. This is important because the BOM, an invisible marker that some editors place at the beginning of UTF-8 files, can interfere with the processing of the base64-encoded content in web APIs. Many web services expect the payload to be clean of such metadata to correctly interpret the data. By stripping the BOM, we ensure that the resulting base64 string matches the expectations of the API endpoint, preventing potential issues with the processing of the VQL content in subsequent requests.

Example of how to call the Python script:

  - script: |
      REM Convert vql file of db to base64 and use Rest Api to create revision
      cd $(pipelineTmpPath)
      python3 base64revision.py -s localhost -p 10090 -v $(REVISION_NAME) -n revision_test -d rev_desc_test -e 7 -c deploy_test -u $(USERNAME) -a $(PASSWORD)
    displayName: 'Create revision with Denodo Rest Api'

Final example of the commands included in the build section (including the Denodo Testing Tool commands):

trigger:
- main  # This indicates that the pipeline will fire automatically when there are changes to the main branch.

variables:
  denodoPath: 'C:/Denodo/DenodoPlatform8.0/bin'
  testingToolPath: 'C:/Denodo/denodo-testing-tool-8.0-20230425'
  pipelineTmpPath: 'C:/Denodo/DenodoPlatform8.0/pipeline_files'
  server: 'localhost:9999/project1'
  vqlFileName: 'vcscommand.vql'
  testingToolConfig: 'configuration.properties'
  testingToolTest: 'test1.denodotest'

jobs:
- job: AgentJob
  pool:
    name: 'cicd denodo pipeline'  # Name of your pool agent.
  steps:
  - script: |
      REM Create VQL file with DVCSPULL command
      echo DVCSPULL project1; > $(pipelineTmpPath)/$(vqlFileName)
    displayName: 'Create DVCSPULL VQL File'
  - script: |
      REM Use import script to execute PULL
      $(denodoPath)/import.bat --singleuser --file "$(pipelineTmpPath)/$(vqlFileName)" --server $(server)?admin@$(PASSWORD)
    displayName: 'Pull changes to master VDP'
  - script: |
      REM Tests with Denodo Testing Tool
      cd $(testingToolPath)/bin
      REM execute Testing Tool
      denodo-test.bat $(testingToolPath)/conf/$(testingToolConfig) $(testingToolPath)/conf/$(testingToolTest)
    displayName: 'Tests with Denodo Testing Tool'
  - script: |
      REM locate in Denodo folder
      cd $(denodoPath)
      REM execute export script
      export.bat -l admin -p $(PASSWORD) -h $(server) -f $(pipelineTmpPath)/$(REVISION_NAME) -P includejars=yes -P includescanners=yes -P includeCustomComponents=yes -P replaceExistingElements=yes -P includeCreateDatabase=yes
    displayName: 'Export VDP from dev1 to VQL'
    env:
      PASSWORD: $(PASSWORD)
      REVISION: $(REVISION_NAME)
  - task: PublishBuildArtifacts@1
    inputs:
      PathtoPublish: '$(pipelineTmpPath)/$(REVISION_NAME)'
      ArtifactName: 'VQL for Revision'
      publishLocation: 'Container'
  - task: PublishBuildArtifacts@1
    inputs:
      PathtoPublish: '$(testingToolPath)/src/test/resources/CSVReporter.csv'
      ArtifactName: 'Testing Tool CSV generated file'
      publishLocation: 'Container'
  - script: |
      REM Convert vql file of db to base64 and use Rest Api to create revision
      cd $(pipelineTmpPath)
      python3 base64revision.py -s localhost -p 10090 -v $(REVISION_NAME) -n revision_test -d rev_desc_test -e 7 -c deploy_test -u $(USERNAME) -a $(PASSWORD)
    displayName: 'Create revision with Denodo Rest Api'
  - script: |
      REM Retry deletion of both temporary VQL file and REVISION_NAME file until they are deleted
      :retryVql
      del "%pipelineTmpPath%\%vqlFileName%" 2> NUL
      if exist "%pipelineTmpPath%\%vqlFileName%" (
          REM Wait for a second before retrying
          timeout /t 1 /nobreak > NUL
          goto retryVql
      )
      :retryRevision
      del "%pipelineTmpPath%\%REVISION_NAME%" 2> NUL
      if exist "%pipelineTmpPath%\%REVISION_NAME%" (
          REM Wait for a second before retrying
          timeout /t 1 /nobreak > NUL
          goto retryRevision
      )
    displayName: 'Retry Delete Temporary Pull VQL and REVISION Files'

In this example we have added a script at the end of our pipeline to delete the temporary files. This is a good practice when we generate temporary files locally, especially when we have multiple projects.

Pipeline configuration in a realistic environment with multiple projects in the repository

Having already configured the pipeline for a branch of our repository, we can now take it to a more realistic environment in which the repository is organized by projects, as shown in the images:

In our repository for this example we are using the ‘main_project1’ and ‘main_project2’ branches.

Now that we have branches configured in our repository linked to Azure DevOps, we can configure a specific YAML file for each project that is triggered when a developer pushes changes to that branch.

It is also important to make sure that we configure our pipeline with the parameters and variables corresponding to the project to be automated, as in the sketch below.
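As an illustration, a pipeline YAML dedicated to project1 could start like this (a sketch: the branch name matches our example repository, and the vcscommand_project1.vql file name is hypothetical):

trigger:
- main_project1  # only pushes to this project's branch fire this pipeline

variables:
  server: 'localhost:9999/project1'       # project-specific VDP database
  vqlFileName: 'vcscommand_project1.vql'  # hypothetical per-project file name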

Managing and Monitoring Azure Pipelines via the Web Interface

Once you have configured and run your Azure Pipeline, Azure DevOps provides a comprehensive web interface to manage, monitor, and analyze your pipeline’s execution. Here are some of the things you can do:

Viewing Pipeline Runs

You can view all the runs of your pipeline, check their status (whether they’ve succeeded, failed, or are still running), and see how long each run took.

Inspecting Individual Runs

By clicking on a specific run, you can inspect it in more detail. This includes:

Logs: View logs generated during the run, which can help you identify what went wrong in case of a failure.

Artifacts: Access the artifacts published during the run, such as the VQL files and the CSV file from the testing tool in our case.

Summary: Get a high-level overview of the run, including the agent used, duration, and other key information.

For comprehensive insights, management, and monitoring of your Azure Pipelines, we highly recommend visiting the official Azure DevOps Pipeline documentation. This resource provides in-depth guidance, best practices, and detailed explanations of various features and functionalities available within Azure Pipelines. Whether you are looking to optimize your existing pipelines, troubleshoot issues, or explore advanced capabilities, this documentation serves as a valuable tool to enhance your CI/CD processes and ensure smooth and efficient pipeline execution.

Disclaimer
The information provided in the Denodo Knowledge Base is intended to assist our users in advanced uses of Denodo. Please note that the results from the application of processes and configurations detailed in these documents may vary depending on your specific environment. Use them at your own discretion.
For an official guide of supported features, please refer to the User Manuals. For questions on critical systems or complex environments we recommend you to contact your Denodo Customer Success Manager.
