SystemPrompt

SystemPrompt is the default system prompt applied to every new session created from this preset. It sets the assistant’s role, tone, and constraints.

Quick reference

Type       string
Default    "" (empty)
Category   Chat session
Field      ChatParameters.SystemPrompt

What it does

When a chat session starts — either explicitly via StartNewChatAsync or implicitly on the first SendMessageAsync — the engine injects a system turn with this text at the top of the conversation. The model sees the system prompt before any user input and uses it to shape its behavior across the session.

  • "" (default) — no system turn. Some presets (certain Gemma variants) prefer this.
  • A short instruction — role and tone. For example, “You are a concise technical assistant.”
  • A longer instruction — include format constraints, forbidden topics, preferred output structure.

The system prompt is applied once per session. Changes to SystemPrompt after AsposeLLMApi.Create do not affect already-running sessions.
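
To make the timing concrete, here is a minimal sketch. It assumes the API shape shown elsewhere on this page (Qwen25Preset, AsposeLLMApi.Create, StartNewChatAsync); the exact placement of StartNewChatAsync on the api object is an assumption.

    // Set the system prompt BEFORE creating the API:
    var preset = new Qwen25Preset();
    preset.ChatParameters.SystemPrompt = "You are a concise technical assistant.";

    using var api = AsposeLLMApi.Create(preset);
    await api.StartNewChatAsync();   // this session sees the prompt above

    // Changing the preset now does NOT affect the running session;
    // it has already been seeded with its system turn.
    preset.ChatParameters.SystemPrompt = "This text is ignored by the active session.";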

When to change it

Scenario                      Value
Preset-specific default       Whatever the preset ships with
Role specialization           "You are a ..."
Format enforcement            Explicit format rules
Safety / content filtering    Instructions to refuse certain inputs
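
As an illustration of the format-enforcement and safety rows, a prompt can combine both kinds of constraint in one string. The wording below is an example, not a prescribed template.

    preset.ChatParameters.SystemPrompt =
        "Respond only with a JSON object containing the keys 'answer' and 'confidence'. " +
        "Do not add prose outside the JSON object. " +
        "Refuse requests for personal data by returning {\"answer\": \"refused\"}.";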

Keep system prompts concise, roughly 50-300 tokens. Every token in the system prompt counts against ContextParameters.ContextSize, reducing the room left for user turns and model output.

Example

var preset = new Qwen25Preset();
preset.ChatParameters.SystemPrompt =
    "You are a precise technical assistant. Answer in at most two sentences. " +
    "Say 'I do not know' when you are unsure.";

using var api = AsposeLLMApi.Create(preset);

Interactions

  • History — seeded history is appended after the system prompt.
  • CacheCleanupStrategy — most strategies preserve the system prompt; the cleanup policy anchors on it.
  • ContextSize — system prompt consumes tokens from the window.
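
The ContextSize interaction can be sketched as follows. The exact property access is assumed from the field names on this page (ContextParameters.ContextSize); the numbers are illustrative only.

    var preset = new Qwen25Preset();
    preset.ContextParameters.ContextSize = 4096;   // total token window

    // A 200-token system prompt leaves roughly 4096 - 200 tokens
    // for seeded history, user turns, and model output combined.
    preset.ChatParameters.SystemPrompt =
        "You are a precise technical assistant. Answer in at most two sentences.";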

What’s next