AID. - Artificial Intelligence DeepSeek Utility

AID.SetModel


MiniRobotLanguage (MRL)

 

AID.Set Model
Set the DeepSeek Model for API Requests

 

Intention

 

SetModel Command: Configure the DeepSeek Model
 
The SetModel command enables you to specify which DeepSeek model will process subsequent API requests, such as those initiated by AID.Ask or AID.AskT. This allows you to customize the AI’s behavior, performance, and cost based on your specific application needs.

By selecting an appropriate model, you can optimize for tasks ranging from general conversation to advanced reasoning, while also managing API usage costs effectively.

It’s an integral part of the AID - DeepSeek API suite, leveraging the power of DeepSeek’s advanced language models.

 

What is the SetModel Command?

 

The SetModel command modifies the global AID_Model variable, which determines the DeepSeek model used for all API interactions within the Smart Package Robot environment.

If no model is specified or an empty string is provided, it defaults to deepseek-chat, a versatile model optimized for general-purpose dialogue. This default is defined in the code with $AID_DefModel="deepseek-chat".
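
A minimal sketch of this fallback behavior, assuming that calling the command with an empty parameter simply restores the default model (the prompt and result variable are illustrative only):

AID.Set Model|
DBP.Model falls back to the default deepseek-chat
AID.Ask|Which tasks are you best suited for?|$$RES
DBP.Result: $$RES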

DeepSeek offers a range of models, each tailored to different use cases, with varying performance levels and costs. As of March 19, 2025, the primary models accessible via the DeepSeek API include:

deepseek-chat (V3): A general-purpose conversational model with a 128K token context length. Costs are $0.27 per million input tokens and $1.10 per million output tokens (post-promotional pricing as of February 2025).

deepseek-reasoner (R1): An advanced reasoning model excelling in math, coding, and logical tasks, also with a 128K context length. Costs are $0.55 per million input tokens (cache miss), $0.14 per million input tokens (cache hit), and $2.19 per million output tokens.

These prices reflect DeepSeek’s updated pricing structure after ending promotional rates in February 2025, driven by high demand and computational costs, yet they remain significantly lower than competitors like OpenAI’s GPT-4o ($5.00/M input, $15.00/M output) or o1 ($15.00/M input, $60.00/M output).
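
To put these rates into perspective, consider a single request to deepseek-chat that sends 100,000 input tokens and receives 20,000 output tokens: the input costs 0.1 × $0.27 = $0.027 and the output costs 0.02 × $1.10 = $0.022, roughly $0.05 in total. The same request to deepseek-reasoner (cache miss) would cost about 0.1 × $0.55 + 0.02 × $2.19 = $0.055 + $0.044, roughly $0.10.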

 

Why Do You Need It?

 

Configuring the model with SetModel is essential for:

Task-Specific Customization: Switch between models like deepseek-chat for casual chats and deepseek-reasoner for complex problem-solving (see the sketch after this list).

Cost Optimization: Choose a model that balances performance and expense—e.g., deepseek-chat is cheaper for high-volume general tasks, while deepseek-reasoner justifies its cost for precision tasks.

Consistency: Ensure predictable behavior across multiple API calls within a script or session.

Scalability: Prepare for future DeepSeek models as they become available, maintaining flexibility in your automation workflows.
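
The following sketch illustrates task-specific switching within one script; the prompts and result variables are illustrative only:

AID.Set Model|deepseek-chat
AID.Ask|Tell me a short joke about robots|$$JOK
DBP.Joke: $$JOK
AID.Set Model|deepseek-reasoner
AID.Ask|What is the derivative of x^3?|$$DRV
DBP.Derivative: $$DRV
ENR.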

 

How to Use the SetModel Command?

 

Invoke the command by providing the exact name of the desired DeepSeek model as a single parameter. The model name is case-sensitive and must match DeepSeek’s API specifications precisely.

Here’s a detailed overview of available DeepSeek models and their pricing (as of March 19, 2025, sourced from DeepSeek API documentation and industry reports):

Model Name               Use Case                         Input Cost (per 1M tokens)                 Output Cost (per 1M tokens)
deepseek-chat (V3)       General conversation, chatbots   $0.27                                      $1.10
deepseek-reasoner (R1)   Reasoning, math, coding          $0.55 (cache miss) / $0.14 (cache hit)     $2.19

Note: Costs may vary with context caching (reducing input costs for repeated queries) and off-peak discounts (e.g., up to 75% off during 1630-0030 GMT, as announced by DeepSeek in February 2025). Always check DeepSeek API docs for the latest pricing.

 

Example Usage

 

AID.Set Model|deepseek-reasoner
DBP.Model set to deepseek-reasoner for advanced reasoning tasks
AID.Ask|What is the integral of x^2?|$$RES
DBP.Result: $$RES

This configures the system to use deepseek-reasoner, ideal for mathematical computations.

 

Illustration

 

┌──────────────────────┐
│ Current Model        │
├──────────────────────┤
│ deepseek-chat        │
├───── SetModel ───────┤
│ deepseek-reasoner    │
└──────────────────────┘

Transitioning from the default model to a specialized one for enhanced capabilities.

 

Syntax

 

AID.SetModel|P1

AID.Set Model|P1

 

Parameter Explanation

 

P1 - The exact name of the DeepSeek model to set (e.g., deepseek-chat or deepseek-reasoner). A missing parameter may trigger a parameter error (%IC_ER_PA), while an empty string falls back to the default deepseek-chat (see Remarks). An invalid model name is not validated locally; it only surfaces as an error when a subsequent API call fails (see Limitations).

 

Example

 

AID.Set Model|deepseek-chat
DBP.Model set to deepseek-chat for cost-effective chatting
AID.Ask|Tell me a story|$$STO
DBP.Story: $$STO
ENR.

 

Remarks

 

- The default model is deepseek-chat if not explicitly set or if P1 is an empty string.

- Model names are validated by the DeepSeek API at runtime; check the DeepSeek API docs for updates, as new models (e.g., future V4 or R2 variants) may be introduced.

- Use AID.Get Model to verify the current setting.
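
A short verification sketch, assuming that AID.Get Model (like AID.Ask) returns its result into the variable passed as its parameter:

AID.Set Model|deepseek-reasoner
AID.Get Model|$$MOD
DBP.Currently configured model: $$MOD
ENR.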

 

Limitations

 

- Limited to models supported by the DeepSeek API; unsupported names won’t raise an error until an API call fails.

- Requires an active internet connection and valid API key for functionality.

 

See also:

 

AID.Get Model

AID.Ask

AID.AskT

Model Configuration