Introduction

You can query any dataset in natural language from ChatGPT by leveraging Denodo’s AI SDK and OpenAI GPTs. This extends the capabilities of OpenAI Agents, enabling them to intelligently connect to, search, and securely query any enterprise data through Denodo.

OpenAI Agents can access Denodo’s AI SDK by wrapping the API calls within functions. Functions are reusable pieces of code that an AI Agent can run whenever a condition is met. By defining a function, our agent can execute code and return the result to the prompt for further reasoning. In our case, the executed code will be a request to the Denodo AI SDK API.

If you are familiar with tool calls in other orchestration frameworks, functions are an iteration of the same idea. Within ChatGPT, we can define a function as a call to the Denodo AI SDK API that is made when certain criteria are met. Although the code is executed on the backend, within ChatGPT we only need to describe the API schema and how the agent should fill in the parameters.

Requirements

- Denodo AI SDK deployed in an address accessible by OpenAI.

- Python (not required, but recommended for generating the HTTP Basic credentials).

- ChatGPT Plus, Pro, or Team.

If you do not already have the Denodo AI SDK set up, you can refer to the GitHub repository and the user manual.

Preparing your Workspace

Accessible Denodo AI SDK

To allow ChatGPT to communicate with the Denodo AI SDK, the AI SDK must be hosted at an address that ChatGPT can reach.

Getting the API definitions from the Denodo AI SDK

Next, we will extract the openAPI.json file from the Denodo AI SDK. This file defines the SDK’s API capabilities following the OpenAPI standard, and it can be found at the /openAPI.json endpoint of your AI SDK server.

<Your AI SDK URL>/openAPI.json

After entering the URL into the address bar of your browser, you will see a page that looks like the following:

Copy the contents of this page and save it as openAPI.json. OpenAI will need this API definition when defining GPT actions.
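If you prefer to script this step, the file can also be downloaded programmatically. The following is a minimal sketch using Python’s requests library; the host, port, and credentials are placeholders, and whether the endpoint requires authentication depends on your AI SDK configuration:

import json

import requests
from requests.auth import HTTPBasicAuth

# Placeholders: replace with your AI SDK host, port and credentials
AI_SDK_URL = "https://<ai_sdk_host>:<port>"

response = requests.get(
    f"{AI_SDK_URL}/openAPI.json",
    auth=HTTPBasicAuth("<username>", "<password>"),  # omit auth if the endpoint is unsecured
)
response.raise_for_status()

# Save the OpenAPI definition locally so it can be edited in the next step
with open("openAPI.json", "w", encoding="utf-8") as f:
    json.dump(response.json(), f, indent=2)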

Formatting the Denodo AI SDK’s API schema

Although the openAPI.json file contains all of the information needed to define the AI SDK to ChatGPT, we will need to modify it slightly to meet ChatGPT’s requirements. First, we will modify the parameters of each API endpoint, limiting the scope to only the parameters necessary for each call. Then we will include the correct URL for accessing the AI SDK.

For example, take the following definition of the /answerQuestion endpoint:

{

  "openapi": "3.1.0",

  "info": {

    "title": "Denodo AI SDK /answerQuestion endpoint",

    "description": "API for asking natural language questions to a VDP database.",

    "version": "1.0.0"

  },

  "servers": [

    {

      "url": <INPUT YOUR URL HERE>

    }

  ],

  "paths": {

    "/answerQuestion": {

      "get": {

        "summary": "Answerquestion",

        "description": "This endpoint processes a natural language question ",

        "operationId": "answerQuestion_answerQuestion_get",

        "parameters": [

          {

            "name": "question",

            "in": "query",

            "required": true,

            "schema": {

              "type": "string",

              "title": "Question"

            }

          }],

         "responses": {

          "200": {

            "description": "Successful Response",

            "content": {

              "application/json": {

                "schema": {

                  "$ref": "#/components/schemas/answerQuestionResponse"

                }

              }

            }

          },

          "422": {

            "description": "Validation Error",

            "content": {

              "application/json": {

                "schema": {

                  "$ref": "#/components/schemas/HTTPValidationError"

             

                }

              }

            }

          }

        },

        "security": [

          {

            "basicAuth": []

          }

        ]

      }

    }

  },

  "components": {

    "securitySchemes": {

      "basicAuth": {

        "type": "http",

        "scheme": "basic"

      }

    },

    "schemas": {

       "answerQuestionResponse": {

        "properties": {

          "answer": {

            "type": "string",

            "title": "Answer"

          },

          "sql_query": {

            "type": "string",

            "title": "Sql Query"

          },

          "query_explanation": {

            "type": "string",

            "title": "Query Explanation"

          },

          "tokens": {

            "type": "object",

            "title": "Tokens"

          },

          "execution_result": {

            "type": "object",

            "title": "Execution Result"

          },

          "related_questions": {

            "items": {

              "type": "string"

            },

            "type": "array",

            "title": "Related Questions"

          },

          "tables_used": {

            "items": {

              "type": "string"

            },

            "type": "array",

            "title": "Tables Used"

          },

          "raw_graph": {

            "type": "string",

            "title": "Raw Graph"

          },

          "sql_execution_time": {

            "type": "number",

            "title": "Sql Execution Time"

          },

          "vector_store_search_time": {

            "type": "number",

            "title": "Vector Store Search Time"

          },

          "llm_time": {

            "type": "number",

            "title": "Llm Time"

          },

          "total_execution_time": {

            "type": "number",

            "title": "Total Execution Time"

          }

        }

      }

    },

    "headers": {

      "Authorization": {

        "description": "Basic authentication header.",

        "schema": {

          "type": "string"

        }

      }

    }

  }

}

In this example, all unnecessary parameters have been removed from the API definition. This is important because ChatGPT treats any exposed parameter as an opportunity for inference: if a parameter is left configurable, ChatGPT may not choose its default value even when one is specified.

As you may notice, the previous definition already includes the URL of the AI SDK server under servers. Without it, our GPT will not know where to reach the AI SDK.

{

  "openapi": "3.1.0",

  "info": {

    "title": "Denodo AI SDK /answerQuestion endpoint",

    "description": "API for asking natural language questions to a VDP database.",

    "version": "1.0.0"

  },

  "servers": [

    {

      "url": "<INPUT YOUR URL HERE>"

    }

  ],

  "paths": {

    …

In the end, we should have a JSON schema definition for each endpoint that exposes only the parameters our AI agent needs. We will need these schemas in the next step, when defining the GPT within ChatGPT.
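If you have several endpoints to prepare, this trimming can also be scripted. Below is a minimal sketch, assuming the spec was saved as openAPI.json and that only the question parameter of /answerQuestion should remain exposed; the URL and file names are placeholders:

import json

# Placeholder: the publicly reachable URL of your AI SDK
AI_SDK_URL = "https://<ai_sdk_host>:<port>"

# Endpoints to expose and, for each one, the parameters the GPT may set
KEEP = {"/answerQuestion": {"question"}}

with open("openAPI.json", "r", encoding="utf-8") as f:
    spec = json.load(f)

# Tell the GPT where to reach the AI SDK
spec["servers"] = [{"url": AI_SDK_URL}]

# Drop every path we do not want to expose
spec["paths"] = {p: item for p, item in spec["paths"].items() if p in KEEP}

# Within the remaining paths, keep only the whitelisted parameters
for path, item in spec["paths"].items():
    for operation in item.values():
        if isinstance(operation, dict) and "parameters" in operation:
            operation["parameters"] = [
                param for param in operation["parameters"]
                if param.get("name") in KEEP[path]
            ]

with open("openAPI_trimmed.json", "w", encoding="utf-8") as f:
    json.dump(spec, f, indent=2)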

Adding the API definition to a ChatGPT GPT

Once you have the API definition for your Denodo AI SDK, the next step is creating a GPT agent within ChatGPT.

First, navigate to the ChatGPT homepage and select “Explore GPTs”.

Then select “Create”.

Select “Configure”.

Select “Create New Action”.

Modify the authentication parameters.

HTTP Basic

For HTTP Basic authentication you will need to select “API Key” as the authentication type and “Basic” as the auth type.

Within the API Key field, we will need to provide a Base64-encoded representation of our AI SDK credentials, that is, the username and password in the form username:password.

To generate the encoded string, you can use the following Python snippet, replacing username and password with your actual credentials.

import base64

username = "username"

password = "password"

# Construct the credentials string

credentials_str = f"{username}:{password}"

# Encode the string to Base64

encoded_credentials = base64.b64encode(credentials_str.encode("utf-8")).decode("utf-8")

print(encoded_credentials)

We will need to provide the value of encoded_credentials in the API Key field of the GPT action. Note that Base64 is an encoding, not an encryption algorithm; this is essentially the same as sending plain text, so make sure credentials are only ever sent to HTTPS endpoints. Also note that you will need to re-enter the API key (the encoded username and password) every time the GPT is edited.
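If you want to verify the encoded value before pasting it into the GPT, you can send it yourself in an Authorization header, which is essentially what ChatGPT will do on each call. A minimal sketch, reusing encoded_credentials from the snippet above and assuming your AI SDK host and port:

import requests

# Placeholder: replace with your AI SDK host and port (must be HTTPS)
AI_SDK_URL = "https://<ai_sdk_host>:<port>"

# The header built from the API key when Basic authentication is selected
headers = {"Authorization": f"Basic {encoded_credentials}"}

response = requests.get(
    f"{AI_SDK_URL}/answerQuestion",
    params={"question": "How many loans do we have?"},
    headers=headers,
)
print(response.status_code)  # 200 means the credentials were accepted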

OAuth

GPTs also work with OAuth. For this, select the OAuth option within the Authentication menu.

Within the menu, we can configure the authentication settings to match the API’s OAuth requirements.

Schema

Finally, we can provide our API schema to the GPT Action.

We will know that ChatGPT can successfully read our API definitions if the functions appear under the “Available Actions” setting, as shown below:

We can then test each endpoint by pressing the “Test” button that appears once the schema has been read successfully. This will document the API call within ChatGPT’s chat window and return the response from the AI SDK.

OpenAI Python SDK With Denodo AI SDK

Functions are also available through OpenAI’s Python SDK. Unlike the ChatGPT interface, the Python SDK gives much greater flexibility in configuring the code that is executed when a tool is called and the conditions that trigger it.

Functions within OpenAI’s Python SDK are defined similarly to tools in other orchestration frameworks: we describe the function with enough context for the LLM to know how to pass parameters to it. After creating the function, we attach it to the LLM within a reasoning loop and call it when the model requests it.

The first step is defining our python function that will call the Denodo AI SDK:

import requests

from requests.auth import HTTPBasicAuth

def call_answer_question_api(question):

    url = "https://<ai_sdk_host>:<port>/answerQuestion"

    params = {

        "question": question,

        "markdown_response": "false",

        "disclaimer": "false"

    }

    # Replace 'admin', 'admin' with your own AI SDK (VDP) credentials
    response = requests.get(url, params=params, auth=HTTPBasicAuth('admin', 'admin'))

    if response.status_code == 200:

        return response.json()

    else:

        response.raise_for_status()

Take the function call_answer_question_api as an example of a possible tool for the OpenAI SDK. It sends a question to the Denodo AI SDK /answerQuestion endpoint running at https://<ai_sdk_host>:<port> (8008 by default). The function accepts a single input, question, which is the natural language query being asked.

Next we will need to add sufficient context to the function for the LLM to know when to use it and how to pass the question parameter.

function_spec = {

    'name': 'call_answer_question_api',

    'description': 'Fetches the answer to a given question from an external API.',

    'parameters': {

        'type': 'object',

        'properties': {

            'question': {

                'type': 'string',

                'description': 'The natural language question on enterprise data the user wants answered.'

            }

        },

        'required': ['question']

    }

}

function_spec defines a JSON schema that tells the LLM agent when to use the function (through the description) and what arguments it accepts (under the parameters section).

Next, we will define a call to our LLM; in the reasoning loop, this is where our agent decides which tool it should use.

messages = [

    {'role': 'system', 'content': 'You are a helpful assistant. Through the call_answer_question_api you have access to enterprise data.'},

    {'role': 'user', 'content': 'How many loans do we have?'}

]

response = openai.chat.completions.create(

    model='gpt-4o-mini',

    messages=messages,

    functions=[function_spec]

)
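At this point the model may respond with a function call rather than a final answer. In that case, we execute call_answer_question_api ourselves and send the result back so the model can formulate its reply. A minimal sketch of that step, following the legacy function-calling pattern of the openai package used above:

import json

message = response.choices[0].message

if message.function_call is not None:
    # The model decided to use the tool; parse the arguments it generated
    args = json.loads(message.function_call.arguments)

    # Run the real request against the Denodo AI SDK
    result = call_answer_question_api(args["question"])

    # Return the tool output to the model so it can produce the final answer
    messages.append({
        "role": "assistant",
        "content": None,
        "function_call": {
            "name": message.function_call.name,
            "arguments": message.function_call.arguments,
        },
    })
    messages.append({
        "role": "function",
        "name": "call_answer_question_api",
        "content": json.dumps(result),
    })

    final = openai.chat.completions.create(model="gpt-4o-mini", messages=messages)
    print(final.choices[0].message.content)
else:
    print(message.content)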

Conclusion

By integrating the Denodo AI SDK’s query and RAG endpoints with ChatGPT, agents built on OpenAI gain seamless access to organizational data and metadata modeled within the Denodo Platform. This synergy not only ensures robust data governance for AI users, but also empowers AI agents to unlock deeper insights from an organization’s data resources.

Disclaimer
The information provided in the Denodo Knowledge Base is intended to assist our users in advanced uses of Denodo. Please note that the results from the application of processes and configurations detailed in these documents may vary depending on your specific environment. Use them at your own discretion.
For an official guide of supported features, please refer to the User Manuals. For questions on critical systems or complex environments we recommend you to contact your Denodo Customer Success Manager.
