

Expert Trails guide Denodo users through all the relevant materials related to a specific topic, including official documentation, KB articles, training, Professional Services offerings, and more. The main goal is to give users a single place with references to all the information they need to become a Denodo expert on any specific topic.

In general, the software industry defines a Software Development Life Cycle (SDLC) for every project so that teams can quickly produce high-quality software. When it comes to Data Virtualization projects, you can leverage the development life cycle to outline, design, develop, test, deploy, and maintain the elements created in the Denodo Platform with greater efficiency and improved quality.

This Expert Trail shows a curated selection of the different resources available to get a bigger picture of the Software Development Life Cycle process to be carried out in Data Virtualization projects.

The Hike

Stage 1: Discover and analyze the business need

Before starting the Hike, it is necessary to know the location of the trails, their characteristics, and what you need to pack.

Similarly, before getting started with a Data Virtualization project, you need to know what has to be built and which approaches can meet the expectations.

Let’s start this activity by discussing with all the stakeholders in the organization to identify the current business problems and gather the needs, requirements, and expectations for building a new system.

The next step is to look out for possible solutions that solve the current problem with the help of Data Virtualization. Calculate the impact and benefits of the proposed solutions and pick the best one. Finally, communicate the chosen solution to key stakeholders to get approval and move further.

To get more insights into common use cases for Data Virtualization, you can go through the Denodo Solutions Overview section, which includes various customer success stories and detailed solutions for each industry and use case.

Stage 2: Architect and Design the Solution

While on a hike, carrying a map keeps you from getting lost and gives you an idea of what to expect: the region's geographical features and points of interest.

To build a Data Virtualization project, the first step is to design the solution according to the business requirements. This architecture design acts as a map: it helps catch problems early, on paper, and helps secure funding and resources.

This involves creating design documents and guidelines, deciding on the tools, and defining the methodology, best practices, and patterns to be used in order to meet the requirement specification and goals defined.

This process is comprised of the following phases:

  1. Decide on the technologies to be used, such as data sources and consuming applications.
  2. Design the architecture of the system.

This includes data models, data access protocols (i.e., how users will connect to the Denodo Platform), the Service Level Agreements of the system, availability of resources, etc.

  3. Select the development methodology that is suitable for your Data Virtualization project.

Data Virtualization projects follow the Software Development Life Cycle; that is, they can also be developed using Waterfall, Incremental, or Agile methodologies.

For example, an Agile methodology follows an iterative development approach, and one of the key benefits of Data Virtualization is the agility to deliver solutions to business requests with greater flexibility and shorter turnaround times. You can take a look at Denodo Development Strategy - Development paradigm, which describes an example of applying a development methodology to a data virtualization project.

  4. Define team roles and responsibilities.

It is important to define who is involved in a Data Virtualization project at any moment of the project life cycle. These are the typical roles in a Data Virtualization project based on the Denodo Platform:

  • Data Virtualization Architect - addresses data diversity and unlocks the full value of the organization's information universe.
  • Denodo Platform Administrator - keeps control of all enterprise data and grows services with data virtualization.
  • Data Virtualization Developer - implements the virtual data model and performs development activities.
  • Business Users - discover day-to-day business insights.

When it comes to the development phase, Denodo proposes a model that allows teams to work together, protects changes against accidental modifications, and keeps track of changes. A development team can be organized as:

  • A distributed team with multiple development servers
  • A centralized team

Let's review the scenarios and recommendations for the different aspects of the development phase and choose the right strategy for the project.

Scenarios and Recommended Uses

  1. Documentation

Specify the project layers and naming conventions, and document the elements created in the Denodo Platform so that they can be found easily.

Take a look at the Expert Trail: Deployment Topology, which provides the necessary knowledge for defining the Data Virtualization architecture.

Stage 3: Build and Test the Solution

The most common environments in any Data Virtualization project are:

  • Development: where the development of new projects will be executed, following the standard development practices and processes for the company.
  • Testing/QA: this environment is used to validate the suitability of each new iteration of a project, covering functional, security, and performance metrics. Only when a project passes all the tests is it considered valid and deployed to production.
  • Production: the final operational system that provides data to consuming applications. This environment will be potentially implemented as a cluster of Denodo Platform servers for horizontal scalability and fault tolerance.

The setup of the different servers/environments is described in the Expert Trail: Operations.


In this stage, you build the solution in the development environment and validate it in the testing environment.

Various optimization techniques can be applied to achieve greater performance, as explained in the Expert Trail: Query Performance Optimization.

Additionally, Version Control System (VCS) support is integrated with the features for managing the metadata of the Denodo installations used at different points of the software deployment life cycle. Denodo proposes a model for working with a VCS in the development environment that allows teams to work together, protects their changes against accidental modifications, and keeps track of changes as versions. The Denodo Platform supports integration with Subversion (SVN), Git, and Microsoft TFS servers. The Scenarios and Recommended Uses section of the official documentation explains the different workflows when working with a VCS in the development environment.

To build and manage the virtual models, Denodo offers development tools such as Virtual DataPort Administration Tool and Web Design Studio.

When working with the Denodo Platform, there are several best practices any developer must bear in mind. The Development section of the Denodo Admin and Development Best Practices Knowledge Base article covers the best practices to be followed during the development phase.


In the testing environment, the testing team can run controlled tests on the virtual model already developed. You can leverage various test levels, such as unit testing, integration testing, system testing, and acceptance testing. The Testing in Denodo presentation describes the different test levels and how to execute them in Denodo.

The Denodo Testing Tool allows Denodo users to easily automate the testing of data virtualization scenarios. With the Denodo Testing Tool, it is also possible to integrate with third-party automation tools like Jenkins to achieve Continuous Integration/Continuous Delivery (CI/CD). The Denodo Testing Tool User Manual provides more information about installation, testing methods, and debugging steps.
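As a rough illustration, a Denodo Testing Tool test is defined in a small text file that pairs a query with its expected results. The section names and layout below are assumptions made for this sketch, not the tool's exact syntax; check the Denodo Testing Tool User Manual for the precise format in your version:

```
# Hypothetical sketch of a testing-tool test definition; the section
# names (%DESCRIPTION, %EXECUTION, %RESULTS) are assumptions.
%DESCRIPTION
Check that dv_sales still returns the expected totals.

%EXECUTION
SELECT region, SUM(amount) AS total FROM dv_sales GROUP BY region

%RESULTS
region | total
US     | 100
EU     | 250
```

A CI job (for example, in Jenkins) can then run the tool against a folder of such tests on every commit, failing the build when a result set drifts from its baseline.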

You can also refer to the Knowledge Base articles Denodo Load Testing with Apache JMeter, which explains how to configure Apache JMeter to simulate workloads on the Denodo Platform and carry out performance tests, and Denodo Load Testing with LoadRunner, which explains how to configure LoadRunner for load/stress tests on the Denodo Platform.


End users and applications will consume the views/services deployed in the production environment.

Stage 4: Deployment

Hurrah! Only a few more miles to reach our goal.

Denodo provides tools to manage large deployments with several environments (development, testing, production), including clustering and geographical distribution. It offers Solution Manager promotions, which are the recommended way to promote metadata between environments. It also offers import/export scripts for backup purposes or to re-create the same metadata in another Virtual DataPort server.

In addition to the graphical support, the Solution Manager includes REST APIs to facilitate the automation of tasks from DevOps tools such as Jenkins (e.g., migrations defined in Denodo can be managed from Jenkins), UDeploy, Chef, Apache Buildr, etc. An introduction to the process with Jenkins is described in Denodo Deployments and Continuous Integration.
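As a sketch of that automation, a CI job can call a Solution Manager REST endpoint to deploy a revision. The endpoint path, port, payload fields, and authentication scheme below are illustrative assumptions, not the documented contract; verify them against the REST API reference of your Solution Manager version.

```python
# Sketch: build the HTTP call a CI job (e.g., Jenkins) could make to a
# Solution Manager REST API to deploy a revision. The endpoint path
# ("/deployments"), the payload fields, and the bearer token are
# illustrative assumptions for this example only.
import json
import urllib.request

def build_deployment_request(base_url, revision_ids, environment_id, token):
    """Build (but do not send) the deployment request."""
    payload = {
        "revisionIds": revision_ids,      # revisions to promote
        "environmentId": environment_id,  # target environment (e.g., PROD)
    }
    return urllib.request.Request(
        url=f"{base_url}/deployments",    # hypothetical endpoint path
        data=json.dumps(payload).encode("utf-8"),
        headers={"Content-Type": "application/json",
                 "Authorization": f"Bearer {token}"},
        method="POST",
    )

req = build_deployment_request(
    "https://solution-manager.example.com:10090", [42], 3, "<api-token>")
print(req.get_method(), req.full_url)
# A real pipeline step would then send it: urllib.request.urlopen(req)
```

Separating request construction from sending keeps this step easy to unit test in the pipeline before it ever touches a live environment.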

Also, starting with Denodo 8.0, the Solution Manager automates the deployment of the Denodo Platform on Amazon Web Services and Azure. The main benefit is that you can instantiate and manage your entire cloud deployment without having to create and configure custom elements and without opening SSH connections to each individual server to configure settings.

The Automated Cloud Mode section of the Solution Manager Administration Guide provides more details on this.

Stage 5: Maintenance

Yes! You have reached your destination. What’s next?

The final phase of this life cycle is maintenance. Its goal is to ensure that needs continue to be met and that the system continues to perform as per the specification defined in the first phase. Maintenance of the Denodo Platform can include monitoring the performance of the Denodo Platform servers and applying software updates and upgrades.

The Expert Trail: Monitoring provides more details on various monitoring methods available in the Denodo Platform.


As for software upgrades, as with any other software, Denodo periodically releases new versions that offer significant changes or major improvements over the current version. You can refer to the Expert Trail: Operations, which provides detailed information about the upgrade process.


Fill up your backpack with additional gear:


Official Documentation

Expert Trails

KB Articles

Additional Resources



Guided Routes

Denodo Training Courses

Denodo training courses provide expert data virtualization training for data professionals, including administrators, architects, and developers.

If you are interested in the development life cycle, you should enroll in the following courses:

  • Denodo Project Management - This course explains the vision of project management with Denodo Platform 8.0 and the roles involved in the project.
  • Connection and Combining Data with Denodo - This course provides data developers with Denodo Platform 8.0 concepts, terminology, and skills needed to develop a data virtualization project.
  • Denodo Deployment Configuration - This course helps data architects and administrators understand the ways in which the Denodo Platform can be configured and deployed in various environments (development, QA, production, etc.) and infrastructures (virtual machines, containers, cloud, etc.). The course also includes some recommendations on sizing and capacity planning of the environments in a Data Virtualization project.

Technical Advisory Sessions

Denodo customers with active subscriptions can request Meet a Technical Advisor sessions.

These are the sessions available related to the Development Lifecycle.

Development Methodology

Development Lifecycle

Assistance in defining the development lifecycle when working on a Denodo Platform project.

Collaborative Environment: Version Control (VCS) Best Practices

- Direction on selecting and defining the best procedures for working in a collaborative environment: what are the different recommended workflows, and which one best suits your case?

- Resolve your questions and possible challenges during the implementation of your model.

- Versioning and Branching best practices.

Code Promotion Strategy

Promotion through Solution Manager:

- Graphical process.

- Solution Manager API for integration with external lifecycle management systems like Jenkins.

- Revisions and rollback.

- Restrictions of isolated environments.

- Management of user privileges.

- Configuration and management of environment-dependent properties (URLs, users, passwords, etc.).

Advice on defining an alternative promotion strategy that does not rely on the Solution Manager due to specific constraints (note that using the Solution Manager is the recommended approach):

- Explain the available methods for export/import and how to integrate them with your promotion strategy.

Developing a Denodo Extension

Assistance in defining a procedure to develop a new Denodo Platform extension, providing you with the resources you need to build custom Denodo Platform extensions:

- Custom Wrappers.

- Custom Functions.

- Custom Procedures.

- Custom Policies.

- Scheduler Extensions (Exporters, Handlers).

Testing Policies

Recommendations on defining testing policies. These will help you determine practical success criteria, e.g., when to move forward with planned enhancements, pivot to the next development cycle, or roll back an update:

- Unit Tests

- Integration Tests

- System Tests: Regression Testing, Load Testing/Performance Testing, Functional Testing, Security Testing

- Acceptance Tests

Adoption Plan

Define CoE & Administration Teams

Guidance on defining the roadmap and strategy for data virtualization adoption.

Guidance on building a methodology to follow for the adoption.

Guidance on defining the roles and capabilities required.

Denodo Project Lifecycle

Project Lifecycle

Guidance on how to define the project lifecycle.

Professional Services

Denodo Professional Services can help you at the start of, or at any point along, your trail. You can find information about the Denodo Professional Services offering in:

Professional Services for Data Virtualization | Denodo

If you are a Denodo customer, you can reach out to your Customer Success Manager for details about any Guided Route that you need.

Big Hike Prep Check

Let’s see if you are ready to start your big trail. Take this four-question quiz to check your readiness for an enjoyable hike.

Read the questions below, think about the answer, and check whether you got it right by looking at the solution. Have you become an expert?

  1. Consider you have one Virtual DataPort development server with two developers working on the same view, and as an administrator you need to make sure that the changes made by the developers do not overlap each other. Which VCS workflow would you pick?

Solution

Centralized workflow with private databases. This workflow provides conflict detection/resolution when several developers work on the same metadata in a single development environment. The document Centralized Workflow with Private Databases provides more details on the VCS workflows.

  2. How will a developer unit test the code? For example, a user has designed a Denodo view ‘dv_sales’ that depends on multiple intermediate views managed by different users. A change in an intermediate view might affect the final view ‘dv_sales’. How can the user be notified, in an automated fashion, that ‘dv_sales’ is not impacted?

Solution

The user can do unit testing with the help of the Denodo Testing Tool. Using this tool, the user can create simple tests to make sure the view results stay the same, for example, comparing a result set against a CSV file to make sure the count is always the same. If the count differs, the unit test fails, notifying the user that a change in an intermediate view has affected the view ‘dv_sales’.
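The check described above can be sketched in plain Python. `fetch_view_rows` is a hypothetical stand-in for a real query against the Virtual DataPort server (e.g., over JDBC/ODBC); here it returns canned rows so the sketch runs on its own:

```python
# Sketch of a CSV-baseline regression check for a view such as 'dv_sales'.
# fetch_view_rows is a hypothetical placeholder for a real query against
# Virtual DataPort; it returns canned rows to keep the example self-contained.
import csv
import io

def fetch_view_rows(view_name):
    # Placeholder: a real test would execute SELECT ... against view_name.
    return [("US", 100), ("EU", 250)]

def rows_match_baseline(rows, baseline_csv_text):
    """True when the query rows match the baseline CSV, count and values."""
    baseline = [tuple(r) for r in csv.reader(io.StringIO(baseline_csv_text))]
    got = [tuple(str(v) for v in row) for row in rows]
    return got == baseline

baseline_csv = "US,100\nEU,250\n"
if rows_match_baseline(fetch_view_rows("dv_sales"), baseline_csv):
    print("dv_sales unchanged")
else:
    print("dv_sales impacted: notify the view owner")
```

Scheduling this comparison (for example, from Jenkins) provides the automated notification the question asks for; the Denodo Testing Tool supports the same pattern declaratively.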

  3. Consider a function "REGEXP_REPLACE" that is supported by a database such as Oracle but has no equivalent predefined function in Virtual DataPort. You want to implement this function in Virtual DataPort in such a way that, whenever the function is executed, the server invokes the function of the database instead of executing it in the virtual layer. What can be done to achieve this use case?

Solution

Develop custom functions that can be delegated to JDBC data sources. That means that, when possible, instead of executing the Java code of the custom function in the virtual layer, the Virtual DataPort server invokes a function of the database. This can be achieved by creating a custom function with annotations.

  4. Assume you have two servers in the PROD environment and you want to deploy the metadata from QA to PROD. What is the best and easiest approach to promote elements between environments?

Solution

Using the Solution Manager, you can create a revision, which is a collection of Virtual DataPort elements and Scheduler jobs that you wish to migrate from one environment to another. Once the revision is created, deploy one or several revisions to the desired target environment. The deployment will execute the changes included in the revisions on every server that belongs to the target environment, based on the deployment strategy you have chosen. You can refer to the Promotions section of the Solution Manager Administration Guide for more information.

