Hello, world!
This example shows how to create an AsposeLLMApi instance from a preset, send a single message, and print the response.
Prerequisites
- Install the Aspose.LLM NuGet package.
- Apply a license or run in evaluation mode.
- Enough memory and (if needed) disk space for the model used by the preset.
Code
- Create an API instance using a preset (e.g. Qwen25Preset from Aspose.LLM.Abstractions.Parameters.Presets):
using Aspose.LLM;
using Aspose.LLM.Abstractions.Parameters.Presets;
var preset = new Qwen25Preset();
using var api = AsposeLLMApi.Create(preset);
- Send a message. If no session exists, the API starts one automatically:
string response = await api.SendMessageAsync("Hello, say one short sentence about yourself.");
Console.WriteLine(response);
- Dispose the API when done (or use using, as above).
Full example:
using Aspose.LLM;
using Aspose.LLM.Abstractions.Parameters.Presets;
var preset = new Qwen25Preset();
using var api = AsposeLLMApi.Create(preset);
string response = await api.SendMessageAsync("Hello, say one short sentence about yourself.");
Console.WriteLine(response);
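The using declaration in the example above disposes the API automatically at the end of the scope. A sketch of the same flow with explicit disposal instead, for cases where the API's lifetime is managed manually (this assumes AsposeLLMApi implements IDisposable, which the using declaration above implies):

```csharp
using Aspose.LLM;
using Aspose.LLM.Abstractions.Parameters.Presets;

var preset = new Qwen25Preset();
var api = AsposeLLMApi.Create(preset);
try
{
    string response = await api.SendMessageAsync("Hello, say one short sentence about yourself.");
    Console.WriteLine(response);
}
finally
{
    // Release the model and any associated resources even if SendMessageAsync throws.
    api.Dispose();
}
```

Explicit try/finally is equivalent to the using form; prefer using unless the instance must outlive the current scope.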
What’s next
- Multi-turn chat — start a session explicitly and send several messages.
- Save and restore session — persist conversation state to disk.
- Developer’s reference — presets, sessions, and API details.