AIN. - AnythingLLM AI



MiniRobotLanguage (MRL)

 

AIN.GetOpenAIModels
Retrieve Available OpenAI-Compatible Models from AnythingLLM

 

Intention

 

GetOpenAIModels Command: Listing OpenAI-Compatible Models
 
The AIN.GetOpenAIModels command fetches a list of available OpenAI-compatible models from the AnythingLLM API. It’s useful for identifying which models can be used in subsequent OpenAI-compatible operations.

This command is part of the AIN - AnythingLLM AI suite, supporting integration with OpenAI-style endpoints within the AnythingLLM ecosystem.

 

What is the GetOpenAIModels Command?

 

This command queries the AnythingLLM API to retrieve a list of model slugs available in an OpenAI-compatible format. It returns the count of models and an array of slugs, along with raw response data.

The slug list and raw response can be stored in variables, while the model count is pushed to the stack, facilitating model selection for commands like AIN.ExecuteOpenAIChat.
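
For example, a minimal sketch (assuming AIN.SetKey and AIN.SetChatEndpoint have already been called) that uses the short form with no optional variables and reads only the model count from the stack:

AIN.GOM
' Pop the number of available models from the stack
POP.$$CNT
DBP.Model Count: $$CNT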

 

Why Do You Need It?

 

Use AIN.GetOpenAIModels for:

Model Discovery: Identify available models for OpenAI-compatible operations.

Configuration: Choose a specific model to set via AIN.SetModel (a sketch follows this list).

Scripting: Automate workflows by dynamically selecting models based on availability.
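
As a sketch of the Configuration case above: retrieve the slug list first, then pass one slug to AIN.SetModel. The single-parameter AIN.SetModel call and the slug value shown are assumptions modeled on the other AIN.Set* commands; see the AIN.SetModel topic for the authoritative syntax.

' Retrieve the available slugs (comma-separated) into $$MOD
AIN.GetOpenAIModels|$$MOD
DBP.Available Models: $$MOD
' Hypothetical slug - pick one entry from $$MOD on your installation
AIN.SetModel|llama-3.2-3b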

 

How to Use the GetOpenAIModels Command?

 

Call the command with optional variables to store the model slugs and raw response. The model count is always pushed to the stack. Ensure the API key and endpoint are configured beforehand using AIN.SetKey and AIN.SetChatEndpoint.

 

Illustration

 

┌────────────────────────┐
│ Model List             │
├────────────────────────┤
│ [model1, model2, ...]  │
└────────────────────────┘

Fetching available OpenAI-compatible models.

 

Syntax

 

AIN.GetOpenAIModels[|P1][|P2]

AIN.GOM[|P1][|P2]

 

Parameter Explanation

 

P1 - (String, Optional) Variable to receive the model slugs as a comma-separated string.

P2 - (String, Optional) Variable to store the raw JSON response from the API.
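
For instance, a sketch that supplies only P1 to capture the slug list and skips the raw response:

' Only P1 is supplied; the raw JSON response is not stored
AIN.GetOpenAIModels|$$MOD
DBP.Model Slugs: $$MOD

On a typical installation, $$MOD would then hold a comma-separated string such as model-a,model-b (the actual slugs depend on your AnythingLLM setup), while the model count is pushed to the stack.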

 

Example

 

' Configure API access
AIN.SetKey|your_api_key_here
AIN.SetChatEndpoint|http://localhost:3001
' Retrieve available OpenAI models
AIN.GetOpenAIModels|$$MOD|$$RAW
DBP.Model Slugs: $$MOD
DBP.Raw Response: $$RAW
' Pop the model count from the stack
POP.$$CNT
DBP.Model Count: $$CNT
ENR.

 

Remarks

 

- Requires prior configuration of the API key and endpoint.

- The model count is pushed to the stack and can be retrieved with POP.

- If P1 is provided, the slugs are returned as a comma-separated string; otherwise, they are stored internally.

 

Limitations

 

- At most two optional parameters are allowed (0-2); supplying more sets an error.

- Dependent on the AnythingLLM API’s OpenAI compatibility and availability of models.

 

See also:

 

AIN.ExecuteOpenAIChat

AIN.SetModel

AIN.GetModel

Model Configuration

Operations