LMS. - LM Studio Interface

LMS.Set NBatch - Set Prompt Batch Size



MiniRobotLanguage (MRL)

 

LMS.Set NBatch Command

Set the Batch Size for Parallel Processing

 

Intention

 

The LMS.Set NBatch command is an advanced performance-tuning option that controls how many tokens are processed in parallel during the prompt-ingestion phase. This value is commonly known as the batch size.

 

Increasing this value can significantly speed up the processing of very long prompts, but it comes at the cost of higher VRAM usage. Setting it too high for your hardware may lead to out-of-memory errors. Most users can leave this at its default value. Calling the command without a parameter resets it to the default.
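Because the optimal value depends on your GPU's available VRAM, a practical approach is to benchmark the same long prompt at a few different batch sizes and keep the fastest setting that runs without errors. The sketch below illustrates that workflow; the specific values (256, 1024) are illustrative assumptions, not measured recommendations.

 

'**********************************************

' Sketch: comparing batch sizes (illustrative values)

'**********************************************

' Try a conservative value first, then a larger one,

' and compare the prompt-processing times yourself.

LMS.Set NBatch|256

' ... send your long prompt and note the processing time ...

LMS.Set NBatch|1024

' ... send the same prompt again and compare ...

' If you hit an out-of-memory error, fall back to a smaller value:

LMS.Set NBatch|512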

 

Syntax

 

LMS.Set NBatch|[P1]

 

Parameter Explanation

 

   P1 - (Optional) An integer for the batch size (e.g., 256, 512, 1024). If omitted, the value is reset to the default of 512.

 

Example

 

'**********************************************

' LMS.Set NBatch - Sample

'**********************************************

'

' For a very long prompt, we might increase the batch size to improve performance.

LMS.Set NBatch|1024

PRT.Batch size set to 1024 for this session.

' ... (LMS.ask commands would now use this setting) ...

'

' Reset the batch size to the default value.

LMS.Set NBatch

PRT.Batch size has been reset to default (512).

ENR.

 

Remarks

 

This is an advanced setting that directly corresponds to the `n_batch` parameter of the underlying llama.cpp engine. Change it only if you are experiencing slow processing of very long prompts and have enough spare VRAM to accommodate a larger batch size; in that case, a larger value will generally improve prompt-processing speed.
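As a rough mental model (an assumption about the llama.cpp engine's behavior, not something this interface guarantees): a prompt of N tokens is ingested in roughly N / NBatch passes, so doubling the batch size halves the number of passes while requiring more VRAM per pass.

 

' Illustrative arithmetic only - no LMS state is changed here.

' A 4096-token prompt with NBatch = 512  needs about 8 ingestion passes.

' The same prompt with NBatch = 1024 needs about 4 passes.

PRT.Doubling NBatch roughly halves the number of prompt-ingestion passes.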

 

See also:

 

    LMS.Set Max Tokens

    LMS. - LM-Studio Commands