Hello, world!

This example shows how to create an AsposeLLMApi instance from a preset, send a single message, and print the response.

Prerequisites

  • Install the Aspose.LLM NuGet package.
  • Apply a license or run in evaluation mode.
  • Sufficient memory and (if needed) disk space for the model used by the preset.
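
A license is typically applied once, before any API calls. The class and method names below (License, SetLicense) are assumptions based on the licensing pattern common across Aspose products; check the Aspose.LLM licensing documentation for the exact API. Without a license, the library runs in evaluation mode.

using Aspose.LLM;

// Hypothetical licensing call, following the usual Aspose pattern:
// apply the license file before creating any API instances.
var license = new License();
license.SetLicense("Aspose.LLM.lic");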

Code

  1. Create an API instance using a preset (e.g. Qwen25Preset from Aspose.LLM.Abstractions.Parameters.Presets):
using Aspose.LLM;
using Aspose.LLM.Abstractions.Parameters.Presets;

var preset = new Qwen25Preset();
using var api = AsposeLLMApi.Create(preset);
  2. Send a message. If no session exists, the API starts one automatically:
string response = await api.SendMessageAsync("Hello, say one short sentence about yourself.");
Console.WriteLine(response);
  3. Dispose the API when done (or declare it with using, as shown above).

Full example:

using Aspose.LLM;
using Aspose.LLM.Abstractions.Parameters.Presets;

// Create the API instance from a preset; disposal is handled by the using declaration.
var preset = new Qwen25Preset();
using var api = AsposeLLMApi.Create(preset);

// Send a message; a session is started automatically if none exists.
string response = await api.SendMessageAsync("Hello, say one short sentence about yourself.");
Console.WriteLine(response);

What’s next