Navigation: 3. Script Language > AI - Artificial Intelligence Commands > AIU. - OpenAI API > !Response Analysis > AIU. - Artificial Intelligence Utility
MiniRobotLanguage (MRL)
AIU.GetPromptTokens
Gets the token count for the prompt from the last response.
Intention
The AIU.GetPromptTokens command (aliased as AIU.Gpt) retrieves the number of tokens that were consumed by the input prompt in the most recent AI API call. This is essential for monitoring and managing API costs, as most models bill based on the number of input (prompt) and output (completion) tokens.
This command should be called after a successful API call (like AIU.Chat) that returns usage statistics.
Illustration
┌─────────────────────────────┐
│  AI API Response (JSON)     │
├─────────────────────────────┤
│  ...                        │
│  "usage": {                 │
│    "prompt_tokens": 25,     │  <-- Extracts this value
│    ...                      │
│  }                          │
└─────────────────────────────┘
Extracting the prompt token count from the API response.
Syntax
AIU.GetPromptTokens[|$$RET]
Parameter Explanation
P1 - $$RET - (Variable, Numeric, Optional)
The variable where the prompt token count will be stored. If this parameter is omitted, the value is placed on the top of the stack (TOS) instead.
Examples
'***********************************
' AIU.GetPromptTokens - Sample 1: Get token count after a chat call
'***********************************
VAR.$$KEY=sk-YourSecretKeyHere
AIU.SetKey|$$KEY
' Make a call to the AI
AIU.Chat|Hello, how are you?|$$RES
' Retrieve the usage statistics
AIU.GetPromptTokens|$$PTK
AIU.GetCompletionTokens|$$CTK
AIU.GetTotalTokens|$$TTK
MBX.Usage Stats:|Prompt: $$PTK tokens$crlf$Completion: $$CTK tokens$crlf$Total: $$TTK tokens
END.
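A second sample, using the short alias AIU.Gpt in place of the full command name. The key value is a placeholder; replace it with your own API key.
'***********************************
' AIU.GetPromptTokens - Sample 2: Using the AIU.Gpt alias
'***********************************
VAR.$$KEY=sk-YourSecretKeyHere
AIU.SetKey|$$KEY
AIU.Chat|Write a haiku about robots.|$$RES
' AIU.Gpt is the short alias for AIU.GetPromptTokens
AIU.Gpt|$$PTK
MBX.Prompt tokens used: $$PTK
END.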
Remarks
- The alias AIU.Gpt can be used for convenience.
- This command returns the value from the last API call. It does not make a new API call itself.
- If the last API response did not include usage statistics (e.g., due to an error or streaming), this command will return -1.
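Because the command returns -1 when no usage statistics are available, scripts that report costs should check for that value first. A minimal sketch, assuming MRL's IVV./ELS./EIF. conditional commands:
'***********************************
' Guard against missing usage statistics (-1)
'***********************************
AIU.Chat|Summarize: AI token billing.|$$RES
AIU.GetPromptTokens|$$PTK
IVV.$$PTK=-1
  MBX.No usage statistics in the last response.
ELS.
  MBX.Prompt tokens: $$PTK
EIF.
END.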
See also:
• AIU.Chat