
Large Language Model Functions

The large language model functions interact with the configured large language model (LLM) to generate accurate and contextually appropriate responses based on the input provided.

Requirements to use these functions: a large language model must be configured for the environment.

Note

These functions send a request to the large language model for each row they apply to, which may result in high response times. We recommend enabling the cache in the views that use these functions.

Large language model functions:

CLASSIFY_AI

Description

The CLASSIFY_AI function performs text classification by querying the configured large language model. It takes a given text and a classification scale as input, and returns an accurate classification based on the content of the text.

Syntax

classify_ai (<value:text>, <classification scale:array> [, <temperature:double>]): text
  • value. Required. Text to classify.

  • classification scale. Required. An array of categories used as the basis for classification. This array must use entries in the following format: { row(<label: text>[,<description: text>[, <example: text>]]) }

  • temperature. Optional. Controls the randomness of the LLM’s response. Lower values produce more focused and deterministic outputs, while higher values enhance creativity and variability. Must be a double in the range of 0 to 1. Very high values are not recommended, as they are prone to causing hallucinations. By default, the temperature is set to 0.0 if the user does not provide it.

Example

[Screenshot: CLASSIFY_AI classifying text]
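A minimal sketch of how a query using CLASSIFY_AI might look. The view `support_tickets` and its fields are hypothetical; the array literal follows the row format described above.

```sql
-- Classify each support ticket into one of three categories.
-- View and field names are illustrative, not part of the product.
SELECT ticket_id,
       classify_ai(
         ticket_text,
         { row('billing', 'Questions about invoices or payments'),
           row('shipping', 'Delivery status and logistics issues'),
           row('other') },
         0.0
       ) AS category
FROM support_tickets;
```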

ENRICH_AI

Description

The ENRICH_AI function queries the configured large language model with the provided prompt.

Syntax

enrich_ai (<prompt:text>[, <temperature:double>]): text
  • prompt. Required. Text to be sent to the large language model for generating a response.

  • temperature. Optional. Controls the randomness of the LLM’s response. Lower values produce more focused and deterministic outputs, while higher values enhance creativity and variability. Must be a double in the range of 0 to 1. Very high values are not recommended, as they are prone to causing hallucinations. By default, the temperature is set to 0.4 if the user does not provide it.

Example 1

You want information about a car model.

[Screenshot: ENRICH_AI retrieving information about a car model]
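A sketch of this scenario as a query. The view `cars` and the field `car_model` are hypothetical, and the use of `concat` for building the prompt is an assumption.

```sql
-- Ask the LLM for a brief overview of each car model in the view.
SELECT car_model,
       enrich_ai(concat('Give a brief overview of the car model ', car_model)) AS model_info
FROM cars;
```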

Example 2

You are hosting a car auction and you want to make a post on social media about the auctions of each car model.

[Screenshot: ENRICH_AI generating a post about a car auction]

Note that you can provide context to the large language model by concatenating your prompt with the specific field of the view you want to target.
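The concatenation technique described above might look like the following sketch; the view, fields, and temperature value are illustrative assumptions.

```sql
-- Build the prompt per row by concatenating a fixed instruction with a view field.
-- A higher temperature (0.7) is used here to encourage more creative copy.
SELECT car_model,
       enrich_ai(
         concat('Write a short social media post announcing an auction for the ', car_model),
         0.7
       ) AS auction_post
FROM cars;
```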

ENRICH_AI_BINARY

Description

The ENRICH_AI_BINARY function queries the configured large language model with the provided prompt and an image or a PDF document.

Syntax

enrich_ai_binary (<prompt:text>, <binary:blob>, <mime type:text> [, <temperature:double>]): text
  • prompt. Required. Text to be sent to the large language model for generating a response.

  • binary. Required. Binary containing the image or PDF.

  • mime type. Required. Specifies the file type. At the moment, ‘image/jpeg’, ‘image/jpg’, ‘image/png’, ‘image/gif’, ‘image/webp’ and ‘application/pdf’ are supported.

  • temperature. Optional. Controls the randomness of the LLM’s response. Lower values produce more focused and deterministic outputs, while higher values enhance creativity and variability. Must be a double in the range of 0 to 1. Very high values are not recommended, as they are prone to causing hallucinations. By default, the temperature is set to 0.4 if the user does not provide it.

Example 1

You want the description of an image.

[Screenshot: ENRICH_AI_BINARY describing an image]
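A sketch of this scenario; the view `product_photos` and the blob field `photo` are hypothetical names.

```sql
-- Ask the LLM to describe each JPEG image stored in a blob field.
SELECT photo_id,
       enrich_ai_binary('Describe this image', photo, 'image/jpeg') AS description
FROM product_photos;
```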

Example 2

You want to generate a set of 5 questions and answers based on a PDF file.

[Screenshot: ENRICH_AI_BINARY generating questions and answers from a PDF file]
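The PDF scenario might be expressed as follows; the view `manuals` and the blob field `document` are assumptions.

```sql
-- Generate a set of 5 questions and answers from each PDF document.
SELECT manual_id,
       enrich_ai_binary(
         'Generate a set of 5 questions and answers based on this document',
         document,
         'application/pdf',
         0.4
       ) AS qa_pairs
FROM manuals;
```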

EXTRACT_AI

Description

The EXTRACT_AI function identifies and extracts the most relevant occurrence of specified entities from a given text. Given the input text and specified entities, the function utilizes the configured large language model to determine and return the most accurate and contextually appropriate matches.

Syntax

extract_ai (<value:text>, <entityArray:array> [, <temperature:double>] ): register
  • value. Required. Text to analyze.

  • entityArray. Required. Array of concepts to identify and extract from text.

  • temperature. Optional. Controls the randomness of the LLM’s response. Lower values produce more focused and deterministic outputs, while higher values enhance creativity and variability. Must be a double in the range of 0 to 1. Very high values are not recommended, as they are prone to causing hallucinations. By default, the temperature is set to 0.0 if the user does not provide it.

Example

You want to extract information about Denodo events from a text.

[Screenshots: EXTRACT_AI extracting entities from a text, and the register obtained for a row]
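A sketch of an EXTRACT_AI call; the view `news_articles`, its fields, and the exact shape of the entity array entries are illustrative assumptions.

```sql
-- Extract the most relevant occurrence of each entity from the article text.
-- The result is a register whose fields correspond to the requested entities.
SELECT extract_ai(
         article_text,
         { row('event name'), row('event date'), row('location') }
       ) AS extracted_entities
FROM news_articles;
```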

SENTIMENT_AI

Description

The SENTIMENT_AI function analyzes the sentiment of the provided text input by querying the configured large language model to determine whether the sentiment is negative, neutral, mixed or positive.

Additionally, it can receive custom scales, which tailor the sentiment analysis to specific sentiments defined by the user. For each custom sentiment, the user must provide the sentiment name and, optionally, a description and an example. Note that when using custom scales, providing the optional description and example will significantly improve the accuracy of the sentiment analysis.

Syntax

sentiment_ai (<value:text> [, <custom sentiments:array> ] [, <temperature:double>]): text
  • value. Required. Text to analyze.

  • custom sentiments. Optional. An array containing the custom scale for the sentiment analysis. This array must use entries of the following format: { row(<sentiment name: text>[,<sentiment description: text>[, <sentiment example: text>] ]) }

  • temperature. Optional. Controls the randomness of the LLM’s response. Lower values produce more focused and deterministic outputs, while higher values enhance creativity and variability. Must be a double in the range of 0 to 1. Very high values are not recommended, as they are prone to causing hallucinations. By default, the temperature is set to 0.0 if the user does not provide it.

Example 1

[Screenshot: SENTIMENT_AI analyzing the sentiment of a social media post]
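With the default scale, a call might look like this sketch; the view `social_media_posts` and its fields are hypothetical.

```sql
-- Returns negative, neutral, mixed or positive for each post.
SELECT post_id,
       sentiment_ai(post_text) AS sentiment
FROM social_media_posts;
```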

Example 2 with custom scales

[Screenshot: SENTIMENT_AI analyzing a social media post with a custom scale]
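A custom scale following the row format described above might be used as in this sketch; sentiment names, descriptions, and examples are illustrative.

```sql
-- Custom scale with name, description and example per sentiment,
-- which improves the accuracy of the analysis.
SELECT post_id,
       sentiment_ai(
         post_text,
         { row('excited', 'Enthusiastic anticipation', 'I cannot wait for the launch!'),
           row('disappointed', 'Unmet expectations', 'This was not what I hoped for.') }
       ) AS sentiment
FROM social_media_posts;
```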

SUMMARIZE_AI

Description

The SUMMARIZE_AI function generates automatic summaries by querying the configured large language model with the provided text input. The function summarizes the text in its original language and does not translate it into the Denodo Assistant’s configured language.

Syntax

summarize_ai (<value:text> [, <word length:int>] [, <temperature:double>]): text
  • value. Required. Text to summarize.

  • word length. Optional. Approximate word length of the summary.

  • temperature. Optional. Controls the randomness of the LLM’s response. Lower values produce more focused and deterministic outputs, while higher values enhance creativity and variability. Must be a double in the range of 0 to 1. Very high values are not recommended, as they are prone to causing hallucinations. By default, the temperature is set to 0.0 if the user does not provide it.

Example

[Screenshot: SUMMARIZE_AI summarizing text for a quick look at table data]
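A sketch of a SUMMARIZE_AI call; the view `product_catalog` and the target word length are assumptions.

```sql
-- Summarize each product description to roughly 50 words,
-- keeping the original language of the text.
SELECT product_id,
       summarize_ai(description, 50) AS short_description
FROM product_catalog;
```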

TRANSLATE_AI

Description

The TRANSLATE_AI function performs language translation by using a large language model. Given the original text, the target language code, and optionally the original language code, it generates an accurate translation in the specified target language.

Syntax

translate_ai (<value:text>, <target language:text> [, <origin language:text>] [, <temperature:double>]): text
  • value. Required. Text to translate.

  • target language. Required. Target language.

  • origin language. Optional. Original language.

  • temperature. Optional. Controls the randomness of the LLM’s response. Lower values produce more focused and deterministic outputs, while higher values enhance creativity and variability. Must be a double in the range of 0 to 1. Very high values are not recommended, as they are prone to causing hallucinations. By default, the temperature is set to 0.0 if the user does not provide it.

Example

[Screenshot: TRANSLATE_AI translating text]
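A sketch of a TRANSLATE_AI call; the view `reviews` is hypothetical, and the use of ISO-style language codes ('es', 'en') is an assumption.

```sql
-- Translate each review from Spanish to English.
-- The origin language argument is optional.
SELECT review_id,
       translate_ai(review_text, 'en', 'es') AS review_en
FROM reviews;
```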