Automating the status check of data sources created in Denodo Virtual DataPort

Applies to: Denodo 8.0
Last modified on: 16 Jun 2021
Tags: Administration Connectivity Error handling Monitoring Scheduler jobs Stored procedures VQL View creation



Goal

This document describes how to check the status (UP, DOWN or TIMEOUT) of the data sources created in Denodo Virtual DataPort using the stored procedures PING_DATA_SOURCE and GET_ELEMENTS, together with Denodo Scheduler. The goal is to automate the data source status check and proactively notify project stakeholders if any of the sources is down for any reason.

Content

Denodo provides many predefined stored procedures out of the box that automate common tasks. In this article, we will explain how to leverage the PING_DATA_SOURCE stored procedure, which checks whether a specified data source is accessible from Virtual DataPort.

CALL PING_DATA_SOURCE (
    database_name : text
    , data_source_type : <ping_data_source_type>
    , data_source_name : text
    , timeout : long
)


<ping_data_source_type> can be any one of the following types:

'JDBC', 'ODBC', 'LDAP', 'OLAP', 'SAPBWBAPI', 'SAPERP', 'SALESFORCE'
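
For example, assuming a JDBC data source named ds_orders exists in a database called sales (both names are illustrative), the procedure can be invoked directly from the VQL Shell:

CALL PING_DATA_SOURCE('sales', 'JDBC', 'ds_orders', 15000);

The procedure returns one row that includes, among other columns, the status of the data source (UP, DOWN or TIMEOUT) and, if it is down, the cause of the failure.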


To automate the data source status check, we will combine PING_DATA_SOURCE with another predefined stored procedure called GET_ELEMENTS.

The GET_ELEMENTS stored procedure lists all the elements of a Virtual DataPort server. We will use it to obtain all the data source elements, which will then be passed as inputs to the PING_DATA_SOURCE stored procedure.

CALL GET_ELEMENTS (
    input_database_name : text
    , input_name : text
    , input_type : element type
    , input_user_creator : text
    , input_last_user_modifier : text
    , input_init_create_date : date
    , input_end_create_date : date
    , input_init_last_modification_date : date
    , input_end_last_modification_date : date
    , input_description : text
)
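
As a minimal sketch, GET_ELEMENTS can be queried like a regular view; the condition below mirrors the one used in the view defined in the next section to keep only data source elements:

SELECT database_name, name, subtype
FROM GET_ELEMENTS()
WHERE input_type = 'datasources'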

VQL to automate the data source status check

You can create a view called data_source_status_check that will show the status of all the data sources in your Virtual DataPort server. The following VQL can be executed to create that data_source_status_check view:

CREATE OR REPLACE VIEW data_source_status_check AS
SELECT
      data_source_status.database_name    AS database_name
    , data_source_status.data_source_name AS data_source_name
    , data_source_status.data_source_type AS data_source_type
    , elements.description                AS description
    , data_source_status.status           AS status
    , data_source_status.timeout          AS timeout
    , data_source_status.start_time       AS start_time
    , data_source_status.duration         AS duration
    , data_source_status.down_cause       AS down_cause
FROM
    get_elements() AS elements
    NESTED INNER JOIN
    ping_data_source() AS data_source_status
    ON (
        elements.database_name = data_source_status.database_name
        AND elements.subtype   = data_source_status.data_source_type
        AND elements.name      = data_source_status.data_source_name
    )
WHERE (
    data_source_status.timeout = 15000
    AND elements.input_type = 'datasources'
    AND elements.subtype in (
        'jdbc',
        'odbc',
        'ldap',
        'olap',
        'sapbwbapi',
        'saperp',
        'salesforce'
    )
)

Tree View representation of the data_source_status_check view


The following steps explain the execution flow of the data_source_status_check view just created:

  1. A NESTED INNER JOIN is performed between the GET_ELEMENTS and PING_DATA_SOURCE stored procedures in order to ensure one execution of PING_DATA_SOURCE per data source returned by GET_ELEMENTS. The join conditions are:
  • elements.database_name = data_source_status.database_name
  • elements.subtype = data_source_status.data_source_type
  • elements.name = data_source_status.data_source_name

  2. In order to keep just the data source types supported by the PING_DATA_SOURCE stored procedure, the following conditions are added on the GET_ELEMENTS output:
  • elements.input_type = 'datasources'
  • elements.subtype in ('jdbc', 'odbc', 'ldap', 'olap', 'sapbwbapi', 'saperp', 'salesforce')

  3. In addition, the following filter sets the time in milliseconds that the PING_DATA_SOURCE stored procedure will wait for an answer from each data source. The default value is 15000 milliseconds; this filter is optional and you can increase or decrease the value according to your needs:
  • data_source_status.timeout = 15000

  4. Below is an example of the results returned by the execution of this view:
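
The view can then be queried like any other Denodo view. For example, to list only the problematic data sources:

SELECT database_name, data_source_name, status, down_cause
FROM data_source_status_check
WHERE status in ('DOWN', 'TIMEOUT')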

Automating the status check using Scheduler

Once we have a view that checks the status of the data sources, we can create a Scheduler job to query the view periodically and send an email notification in case any of the data sources is in DOWN or TIMEOUT status. Log in to the Scheduler Web Administration Tool and follow these steps:

  1. Create a new job of type “VDP”.
  2. Fill in the Extraction section as follows:
     1. In the data source field, select the data source that points to the Virtual DataPort server and database where the data_source_status_check view has been created.
     2. In the “Parameterized query” field, use the following query to get the list of data sources that are down or do not respond within the timeout value defined in the data_source_status_check view:

SELECT database_name, data_source_name, data_source_type, description, status, timeout, start_time, duration, down_cause
FROM operations.data_source_status_check
WHERE status in ('DOWN', 'TIMEOUT')

  3. Go to the Exporters section and add a CSV exporter. Configure it as follows:
     1. Use @{jobName}.csv as the Output file name.
     2. Select UTF-8 as the encoding.
     3. Select “Overwrite old file”.
     4. Use comma (,) as the separator.
     5. Select the “Include header” option.
     6. Leave all the other parameters with their default values.


  4. To receive email notifications that include the CSV file generated in the previous step, a Denodo Connects component called “Denodo Custom Handler Email Exported Files” can be used. This plug-in can be found on the Denodo Support Site under the Resources > Denodo Connects section. Follow the instructions in the Custom Handler Email Exported Files - User Manual to install it.
     In order to send emails, Denodo Scheduler also needs to be properly configured with the values of an email server. You can find more details in the Mail Settings section of the Scheduler Administration Guide.

  5. Once the plug-in is installed, go to the Handlers section of the job and add a new “email-exported-files” handler. Configure it as follows:
     1. In the Emails text box, add the email addresses that will receive the notification, separated by commas.
     2. Configure the Subject, Body and Footer options for the notification emails.
     3. Check the “Write File in Email” option.
     4. Check the “Use HTML in Email content” option.
     5. Leave the other options with their default values.



  6. In the Triggers section, specify the time expression that sets when the job will be executed. It uses a UNIX “cron expression”. You can find more details on how to configure this screen in the Time-based Job Scheduling section of the Scheduler Administration Guide.
  7. Save the job. It can be executed manually by clicking the “Play” icon next to it. If any data source is down or timing out, an email with the list of data sources having issues, similar to the one below, will be sent:
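
As an illustration of the Triggers configuration, a standard five-field UNIX cron expression such as the following would run the job every 30 minutes between 8:00 and 18:00 (the schedule is only an example; adjust it to your monitoring needs):

*/30 8-18 * * *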

References

Defining a Derived View

PING_DATA_SOURCE

GET_ELEMENTS

Mail Settings

Configuring New Jobs

Denodo Email Exported Files Custom Handler - User Manual
