ModelService
Namespace: Daisi.Host.Core.Services
The ModelService manages local model files for self-hosted inference.
It queries the Orc for required models, checks which are present on disk, downloads missing ones with progress reporting, and registers them in the host settings.
Apps such as DaisiBot use the ModelService to automatically download required models at startup,
showing a progress dialog while the user waits.
Constructor
ModelService is typically resolved via dependency injection. It requires an
ISettingsService, a ModelClientFactory, and an ILogger.
// DI registration (done at app startup)
builder.Services.AddSingleton<Daisi.Host.Core.Services.Interfaces.ISettingsService, DesktopSettingsService>();
builder.Services.AddSingleton<ModelService>();
Properties
ISettingsService SettingsService { get; set; }
The host settings service, providing access to the model folder path and model configuration.

Settings Settings { get; }
Shortcut to SettingsService.Settings.

ModelClient ModelsClient { get; }
The Orc ModelClient used to query required models.

List<LocalModel> LocalModels { get; }
The list of loaded local models after LoadModels() has been called.

LocalModel? Default { get; }
The default model. Returns the first model marked IsDefault, or the first enabled model, or the first model in the list.
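The Default fallback order described above can be sketched with plain LINQ. The `LocalModel` record below is a hypothetical stand-in with only the members the fallback needs, not the real type:

```csharp
using System.Collections.Generic;
using System.Linq;

// Hypothetical minimal stand-in for LocalModel; the real type has more members.
public record LocalModel(string Name, bool IsDefault, bool Enabled);

public static class DefaultModelSketch
{
    // Sketch of the Default property's selection order: first model marked
    // IsDefault, else first enabled model, else first in the list
    // (null when the list is empty).
    public static LocalModel? PickDefault(List<LocalModel> models) =>
        models.FirstOrDefault(m => m.IsDefault)
        ?? models.FirstOrDefault(m => m.Enabled)
        ?? models.FirstOrDefault();
}
```

With a list containing no IsDefault model, the first enabled model wins; with an empty list, the result is null.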
Methods
void LoadModels()
Loads all models registered in Settings.Model.Models into LocalModels. Models with LoadAtStartup = true are loaded into memory immediately.

Task DownloadRequiredModelsAsync()
Queries the Orc for required models, checks which are missing from the model folder, downloads them, registers them in settings, and saves. This is the all-in-one orchestration method.

Task DownloadRequiredModelAsync(AIModel model, Action<double>? percentageProgress = null)
Downloads a single model file from its URL to the local model folder. The optional percentageProgress callback is invoked with values from 0 to 100 as the download proceeds.

bool IsDownloaded(AIModel model)
Returns true if the model is registered in the local settings.

bool IsDefault(AIModel model)
Returns true if the model is marked as the default.

bool IsEnabled(AIModel model)
Returns true if the model is enabled.
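The percentageProgress contract (callback invoked with values from 0 to 100 as bytes arrive) can be sketched as a stream copy. `CopyWithProgressAsync` is a hypothetical helper that illustrates the callback semantics; it is not part of ModelService:

```csharp
using System;
using System.IO;
using System.Threading.Tasks;

public static class ProgressSketch
{
    // Copies source to destination in chunks, reporting cumulative
    // progress as a percentage (0–100) after each chunk, mirroring the
    // percentageProgress callback described above.
    public static async Task CopyWithProgressAsync(
        Stream source, Stream destination, long totalBytes,
        Action<double>? percentageProgress = null)
    {
        var buffer = new byte[8192];
        long copied = 0;
        int read;
        while ((read = await source.ReadAsync(buffer, 0, buffer.Length)) > 0)
        {
            await destination.WriteAsync(buffer, 0, read);
            copied += read;
            percentageProgress?.Invoke(100.0 * copied / totalBytes);
        }
    }
}
```

The final invocation reports 100 once all bytes are written, which is why a UI can safely close its progress dialog when the returned Task completes.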
Downloading a Single Model with Progress
Use DownloadRequiredModelAsync with a progress callback to show download progress to the user.
The callback receives a percentage value (0–100).
var modelService = host.Services.GetRequiredService<ModelService>();

// Load host settings first
var settingsService = host.Services.GetRequiredService<ISettingsService>();
await settingsService.LoadAsync();

// Get required models from the Orc
var requiredModels = modelService.ModelsClient.GetRequiredModels().Models;

// Check which files are missing (create the folder on first run)
string modelFolder = settingsService.Settings.Model.ModelFolderPath;
Directory.CreateDirectory(modelFolder);
var existingFiles = Directory.GetFiles(modelFolder)
    .Select(Path.GetFileName)
    .ToHashSet(StringComparer.OrdinalIgnoreCase);

foreach (var model in requiredModels)
{
    if (!existingFiles.Contains(model.FileName))
    {
        Console.WriteLine($"Downloading {model.Name}...");
        await modelService.DownloadRequiredModelAsync(model, progress =>
        {
            Console.Write($"\r  {progress:F1}% complete");
        });
        Console.WriteLine($"\n  Done: {model.FileName}");
    }
}

// Load models into memory
modelService.LoadModels();
Auto-Download Pattern (Used by DaisiBot)
DaisiBot's TUI and MAUI apps implement an auto-download pattern at startup. The flow is:
- Check if HostModeEnabled is true in user settings (this is the default).
- Call ILocalInferenceService.GetRequiredDownloadsAsync() to get the list of missing models.
- If models are missing, show a progress dialog.
- For each missing model, call ILocalInferenceService.DownloadModelAsync() with a progress callback that updates the UI.
- After all downloads complete, call ILocalInferenceService.InitializeAsync() to load settings, tools, and models.
- Auto-close the dialog and proceed to the main screen.
Users can skip the download (Esc in TUI, Skip button in MAUI) and switch to DaisiNet cloud mode later via the host mode toggle.
// Simplified startup pattern
var localInference = host.Services.GetRequiredService<ILocalInferenceService>();
var missing = await localInference.GetRequiredDownloadsAsync();

if (missing.Count > 0)
{
    foreach (var model in missing)
    {
        Console.WriteLine($"Downloading {model.Name}...");
        await localInference.DownloadModelAsync(model, progress =>
        {
            Console.Write($"\r  {progress:F1}%");
        });
        Console.WriteLine();
    }
}

await localInference.InitializeAsync();
Console.WriteLine("Local inference ready.");