AIC. - Artificial Intelligence Command




MiniRobotLanguage (MRL)

 

AIC.Set Temperature
Set the Model-Temperature that controls the randomness of the AI's answers

 

[Image clip0685: This is how the AI imagines the Set Temperature Command.]

 

 

Intention

 

The `AIC.Set_Temperature` command is an essential tool for fine-tuning the creativity and randomness of responses generated by OpenAI's large language models (LLMs). It's like adjusting the seasoning in a recipe: a higher temperature spices things up, while a lower temperature keeps things more conservative and focused. Understanding and using this command effectively helps you craft the right responses for your application's needs.

 

**Description**

 

`AIC.Set_Temperature` takes a single parameter: the temperature value, a floating-point number typically ranging from 0.0 to 1.0. This value scales the logits before the final softmax during text generation and thereby controls how random the model's output is (a short sketch after the Behavior notes below illustrates this scaling).

 

**Behavior**

**Lower Temperature Values (e.g., 0.2)**

 

- When the temperature is set to a lower value, the LLM's output becomes more deterministic and focused.

- The model is more likely to choose the most probable word sequences, making the text more coherent and less random.

- This is particularly useful when you need the model to produce factual information or adhere closely to a certain writing style.

 

**Higher Temperature Values (e.g., 0.8)**

 

- Setting a higher temperature makes the model's output more diverse and creative.

- The model is less biased towards the most probable word sequences and is more likely to produce surprising or unexpected responses.

- This is great when you want the model to generate creative content, brainstorm ideas, or be more conversational and less formal.

 

**Extreme Values**

 

- Setting the temperature to 0 makes the model completely deterministic, always choosing the most likely word. This might make the text repetitive or robotic.

- Setting the temperature very high (e.g., above 1) can result in very random and potentially nonsensical text.
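
To make the scaling concrete, here is a minimal Python sketch (not MiniRobotLanguage and not OpenAI's actual implementation; the logits are made-up numbers) of how dividing the logits by the temperature sharpens or flattens the probability distribution over candidate next words:

import math

def probabilities(logits, temperature):
    # Divide the raw logits by the temperature, then apply a softmax
    # to turn them into probabilities that sum to 1.
    scaled = [value / temperature for value in logits]
    largest = max(scaled)                  # subtract the max for numerical stability
    exps = [math.exp(value - largest) for value in scaled]
    total = sum(exps)
    return [round(value / total, 3) for value in exps]

logits = [2.0, 1.0, 0.5]                   # hypothetical scores for three candidate words
for temperature in (0.2, 1.0, 1.5):
    print(temperature, probabilities(logits, temperature))

# At 0.2 the most probable word gets nearly all of the probability mass,
# so the output is focused and almost deterministic.
# At 1.5 the probabilities move much closer together,
# so less likely words are picked more often and the output becomes more random.

A temperature of exactly 0 would divide by zero in this sketch; in practice it is usually treated as pure greedy selection of the most probable word, which matches the deterministic behavior described above.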

 

**Use Cases**

 

1. **Story Generation**: A higher temperature (e.g., 0.7) could be used to generate more imaginative and varied stories.

2. **FAQ Responses**: A lower temperature (e.g., 0.2) would be appropriate for generating straightforward and consistent answers to frequently asked questions.

3. **Idea Brainstorming**: A medium to high temperature (e.g., 0.5 to 0.8) might be ideal for generating a wide range of creative ideas.

 

**Example**

 

' Set a low temperature for focused, factual responses

AIC.Set Temperature|0.2

 

' Set a higher temperature for creative, diverse responses

AIC.Set Temperature|0.8

 

**Example Script**

 

' Set OpenAI API-Key from the saved File

AIC.SetKey|File

' Set Model

AIC.SetModel_Chat|0

 

FOR.$$LEE|0|1|0.33

  $$RET=

 

' Set Model-Temperature

  AIC.Set_Temperature|$$LEE

 

' Set Max-Tokens (possible length of the answer; depending on the Model, up to 2000 Tokens, which is about ~6000 characters)

' The more Tokens you use, the more you pay, but the longer the Input and Output can be.

  AIC.SetMax_Token|100

 

' Ask Question and receive answer to $$RET

  $$QUE=What is the next word after "the winner is ..."?

  AIC.Ask_Chat|$$QUE|$$RET

  CLP.$$RET

  DBP. Temp.: $$LEE $crlf$$$RET

  DBP.-------------------------

NEX.

:enx

ENR.

 

 

**Output of the Script**

Temp.: 0 

The next word after "the winner is ..." depends on the context. It could be the name of the winner or the prize they have won.

 

Temp.: .33 

The next word after "the winner is ..." depends on the context. It could be the name of the winner or the prize they have won.

 

Temp.: .66 

The next word is typically the name of the winner or the prize they have won. For example, "The winner is John Smith" or "The winner is awarded a brand new car."

 

Temp.: .99 

The next word is the name of the winner.

 

 

**Conclusion**

 

The `AIC.Set_Temperature` command is a powerful dial that controls the flavor of your LLM's responses. Like a master chef adjusting the seasoning, you can turn this dial to produce the right dish for any occasion. Whether you need focused, factual content or imaginative, creative prose, `AIC.Set_Temperature` has you covered.

 

 

Here is a more detailed explanation and comparison of the `top_k`, `top_p`, and `temperature` parameters in the context of OpenAI's GPT-3 API.

 

When GPT-3 generates text, it does so word by word. For each word it generates, it calculates a probability for every word in its vocabulary, and then selects the next word based on these probabilities. The `top_k`, `top_p`, and `temperature` parameters are all ways to influence this selection process.

 

1. `top_k`: This parameter limits the number of words that the model considers as the next possible word. If `top_k` is set to 50, for example, the model will only consider the 50 words it thinks are most likely. This can make the output more focused and less random, because it's only choosing from a subset of words. However, it can also make the output less diverse, because it's ignoring a lot of potential words. The `top_k` value can be any non-negative integer, with larger values leading to more randomness and smaller values leading to less randomness. If `top_k` is not set, the model considers all possible words.

 

2. `top_p`: Also known as nucleus sampling, this parameter is a bit more dynamic. Instead of always considering a fixed number of words like `top_k`, `top_p` considers however many words are needed to reach a certain cumulative probability. For example, if `top_p` is set to 0.9, the model will consider the smallest set of words that have a combined probability of 90%. This set of words can be larger or smaller depending on the specific probabilities for each word. Like `top_k`, `top_p` can make the output more focused and less random, but it can also reduce diversity. The `top_p` value is a float between 0 and 1, with larger values leading to more randomness and smaller values leading to less randomness.

 

3. `temperature`: This parameter controls the "sharpness" of the probability distribution. If `temperature` is set to a high value (close to 1), the model's word selection will be more random and less deterministic, even if some words have much higher probabilities than others. If `temperature` is set to a low value (close to 0), the model's word selection will be more deterministic and less random, with the model strongly favoring words that have higher probabilities. In other words, a high `temperature` makes the model more "adventurous" in its word choices, while a low `temperature` makes the model more "conservative".

 

In summary, `top_k` and `top_p` are ways to limit the number of words that the model considers for each step of the generation process, while `temperature` is a way to control the randomness of the model's word selection within those limits. All three parameters can be used together to finely tune the behavior of the model. The optimal values for these parameters can depend on your specific use case and the desired behavior of the model. It's a good idea to experiment with different values to see what works best for your needs.
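
As an illustration of how the three parameters could work together in a single sampling step, here is a minimal Python sketch. It is a simplified model of the process, not OpenAI's actual code, and the candidate words, logits, and the function name `sample_next_word` are made up for the example:

import math
import random

def sample_next_word(words, logits, temperature=1.0, top_k=None, top_p=1.0):
    # 1. temperature: scale the logits, then softmax them into probabilities.
    scaled = [value / temperature for value in logits]
    largest = max(scaled)
    exps = [math.exp(value - largest) for value in scaled]
    probs = [value / sum(exps) for value in exps]

    # Sort the candidates from most to least probable.
    ranked = sorted(zip(words, probs), key=lambda pair: pair[1], reverse=True)

    # 2. top_k: keep only the k most probable words (if top_k is set).
    if top_k is not None:
        ranked = ranked[:top_k]

    # 3. top_p: keep the smallest set of words whose cumulative probability reaches top_p.
    kept, cumulative = [], 0.0
    for word, prob in ranked:
        kept.append((word, prob))
        cumulative += prob
        if cumulative >= top_p:
            break

    # Renormalize the surviving probabilities and draw one word at random.
    total = sum(prob for _, prob in kept)
    return random.choices([word for word, _ in kept],
                          weights=[prob / total for _, prob in kept])[0]

words  = ["Anna", "the", "a", "nobody"]    # hypothetical candidate next words
logits = [2.5, 1.5, 1.0, -1.0]             # hypothetical scores for those words
print(sample_next_word(words, logits, temperature=0.7, top_k=3, top_p=0.9))

In this sketch `top_k` is applied before `top_p`; real implementations can differ in details like that, but the overall idea is the same: restrict the candidate set, then sample from the temperature-scaled probabilities.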

 

 

 

Syntax

 

 

AIC.Set Temperature|P1

AIC.Set_Temperature|P1

 

 

Parameter Explanation

P1 - `<value>`: A floating-point number or variable representing the temperature.
       Commonly used values are between 0.0 and 1.0, but higher values can also be used for more randomness.

 

 

 

Example

 

'*****************************************************

' EXAMPLE 1: AIC.-Commands

' Set Model-Temperature

'*****************************************************

' Set OpenAI API-Key from the saved File

AIC.SetKey|File

' Set Model

AIC.SetModel_Chat|0

 

FOR.$$LEE|0|1|0.33

  $$RET=

 

' Set Model-Temperature

  AIC.Set_Temperature|$$LEE

 

' Set Max-Tokens (possible length of the answer; depending on the Model, up to 2000 Tokens, which is about ~6000 characters)

' The more Tokens you use, the more you pay, but the longer the Input and Output can be.

  AIC.SetMax_Token|100

 

' Ask Question and receive answer to $$RET

  $$QUE=What is the next word after "the winner is ..."?

  AIC.Ask_Chat|$$QUE|$$RET

  CLP.$$RET

  DBP. Temp.: $$LEE $crlf$$$RET

  DBP.-------------------------

NEX.

:enx

ENR.

 

 

 

Remarks

-

 

Limitations:

-

 

 

See also:

 

  Set_Key

  Set_Max_Token