
Introducing Shiny.AiConversation — AI Conversation


Building an AI-powered app today means stitching together a chat client, speech recognition, text-to-speech, audio playback, message persistence, and state management — across platforms, with proper lifecycle handling. That’s a lot of plumbing before you write your first prompt.

Shiny.AiConversation wraps all of that into a single IAiConversationService interface. Text chat, voice chat, hands-free wake word activation, configurable audio feedback, and persistent chat history — registered with one DI call, consumed through one service.


Every AI chat app ends up building the same infrastructure:

  • An authenticated chat client that handles token refresh
  • Speech-to-text so users can talk instead of type
  • Text-to-speech so the AI can respond out loud
  • Sound effects for state transitions (thinking, responding, error)
  • A wake word listener for hands-free mode
  • Message persistence for chat history
  • State management so the UI knows what’s happening
  • Thread safety so nothing blows up

Each of these is a separate library, a separate abstraction, and a separate set of platform quirks. You spend weeks on infrastructure before you ship a single feature.

// Register your chat client in DI
builder.Services.AddChatClient(new OpenAIClient("your-api-key").GetChatClient("gpt-4o").AsIChatClient());

builder.Services.AddShinyAiConversation(opts =>
{
    opts.SetMessageStore<MyMessageStore>(); // optional
});

That’s it. The service registers IAiConversationService with all the wiring — speech services from Shiny.Speech, chat completions from Microsoft.Extensions.AI, audio playback, time provider, and optional message persistence. The default IChatClientProvider resolves IChatClient straight from DI, so for most apps you just register your chat client and go. For advanced scenarios (on-demand auth, token refresh), you can still implement IChatClientProvider directly.
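For those advanced scenarios, a custom provider might look something like the sketch below. This is an assumption-heavy illustration: the exact shape of IChatClientProvider (method name, return type) may differ from what ships in the library, and MyAuth is a hypothetical helper standing in for your own token-acquisition code.

```csharp
public class CachedTokenChatClientProvider : IChatClientProvider
{
    IChatClient? cached;

    // Assumed signature — check the library source for the real interface shape
    public async Task<IChatClient> GetChatClient(CancellationToken cancellationToken)
    {
        if (this.cached == null)
        {
            // Acquire a fresh access token on first use (MyAuth is hypothetical)
            var token = await MyAuth.GetAccessTokenAsync(cancellationToken);
            this.cached = new OpenAIClient(token)
                .GetChatClient("gpt-4o")
                .AsIChatClient();
        }
        return this.cached;
    }
}

// Registered through the options, per the generic hook mentioned later in this post:
// opts.SetChatClientProvider<CachedTokenChatClientProvider>();
```

Invalidating `cached` when a token expires is left to your auth layer; the point is that the service asks the provider for a client each time, so you control when a rebuild happens.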

The simplest path. Send a message, get a streaming response:

aiService.AiResponded += response =>
{
    if (response.Update.Text is { } text)
        Console.Write(text);

    if (response.IsResponseCompleted)
        Console.WriteLine();
};
await aiService.TalkTo("What is .NET MAUI?", cancellationToken);

The service handles the full lifecycle — acquires the chat client, prepends system prompts, streams the response, stores both messages if a message store is configured, fires the event, and manages state transitions throughout.

One method call captures speech and sends it to the AI:

await aiService.ListenAndTalk(cancellationToken);

The service activates speech-to-text, waits for the user to stop speaking, sends the transcribed text through TalkTo(), and optionally reads the response aloud via text-to-speech.

This is the “Hey Siri” experience:

await aiService.StartWakeWord("Hey Copilot");

The service enters a continuous loop: listen for the wake phrase, capture the utterance that follows, send it to the AI, loop back. The user never touches the screen. Call StopWakeWord() when you’re done.
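In a MAUI page, the start/stop pair maps naturally onto the page lifecycle. A minimal sketch (assuming both methods are awaitable; if StopWakeWord is synchronous, drop the await):

```csharp
public partial class AssistantPage : ContentPage
{
    readonly IAiConversationService aiService;

    public AssistantPage(IAiConversationService aiService)
    {
        this.aiService = aiService;
        this.InitializeComponent();
    }

    protected override async void OnAppearing()
    {
        base.OnAppearing();
        // Begin the hands-free loop while this page is visible
        await this.aiService.StartWakeWord("Hey Copilot");
    }

    protected override async void OnDisappearing()
    {
        base.OnDisappearing();
        // Stop listening when the user navigates away
        await this.aiService.StopWakeWord();
    }
}
```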

Control how the AI delivers responses:

  • None — Silent; text only, delivered via the AiResponded event
  • AudioBlip — Short sound effects at each state transition
  • LessWordy — Text-to-speech with a “be concise” system prompt
  • Full — Full text-to-speech of the complete response
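Switching modes is presumably a one-line assignment. The property and enum names below are hypothetical — consult the library for the actual members:

```csharp
// Hypothetical member names, shown only to illustrate the idea
aiService.ResponseMode = AiResponseMode.LessWordy;
```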

Sound effects are driven by string file names and a SoundResolver callback — the library stays platform-agnostic while you provide the stream:

aiService.SoundResolver = name => FileSystem.OpenAppPackageFileAsync(name);
aiService.ThinkSound = "think.mp3";
aiService.OkSound = "ok.mp3";

Register an IMessageStore and every message is automatically persisted. But the interesting part is the AI chat lookup tool — it’s an AITool that lets the AI search its own conversation history:

“What did we talk about yesterday?” “Find the recipe you gave me last week.”

The tool is registered automatically when you call SetMessageStore(). The AI gets search parameters (text, date range, limit) and queries your store directly.
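A message store can be as simple as an in-memory list. The sketch below assumes IMessageStore exposes roughly a save method and a search method matching the parameters described above (text, date range, limit) — the real interface will differ in its exact member names and types:

```csharp
public class MyMessageStore : IMessageStore
{
    readonly List<(DateTimeOffset Timestamp, string Role, string Text)> messages = new();

    // Assumed member — persist one message as it is sent or received
    public Task Store(string role, string text, CancellationToken cancellationToken)
    {
        lock (this.messages)
            this.messages.Add((DateTimeOffset.UtcNow, role, text));
        return Task.CompletedTask;
    }

    // Assumed member — this is what the AI chat lookup tool would query
    public Task<IReadOnlyList<string>> Search(
        string? text,
        DateTimeOffset? from,
        DateTimeOffset? to,
        int limit,
        CancellationToken cancellationToken)
    {
        lock (this.messages)
        {
            var results = this.messages
                .Where(m => text == null || m.Text.Contains(text, StringComparison.OrdinalIgnoreCase))
                .Where(m => from == null || m.Timestamp >= from)
                .Where(m => to == null || m.Timestamp <= to)
                .Take(limit)
                .Select(m => m.Text)
                .ToList();
            return Task.FromResult<IReadOnlyList<string>>(results);
        }
    }
}
```

A production store would back this with SQLite or similar so history survives app restarts.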

The service exposes its current state and fires events:

aiService.StatusChanged += state =>
{
    // state: Idle, Listening, Thinking, Responding
    UpdateUI(state);
};

This is what powers the “Aura” visualization in our sample app — a pulsing orb that changes color based on what the AI is doing.

The library doesn’t care which AI you use. By default, it resolves IChatClient from DI — just register one and you’re done. For advanced auth scenarios, implement IChatClientProvider to return any IChatClient from Microsoft.Extensions.AI:

  • OpenAI — new OpenAIClient(apiKey).GetChatClient("gpt-4o").AsIChatClient()
  • GitHub Copilot — OAuth device code flow with Copilot API token exchange
  • Azure OpenAI — Managed identity or API key
  • Ollama — Local model, no auth needed
  • Anything else — If it implements IChatClient, it works
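The Ollama case is a nice demonstration of how little is required when there is no auth. A sketch, assuming a package that provides an IChatClient over the Ollama HTTP API (such as the Microsoft.Extensions.AI.Ollama preview package or OllamaSharp — verify which one you have and its constructor shape):

```csharp
// Local Ollama model — no API key required, just the default local endpoint
builder.Services.AddChatClient(
    new OllamaChatClient(new Uri("http://localhost:11434"), "llama3"));

builder.Services.AddShinyAiConversation();
```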

The sample apps include a complete GitHub Copilot implementation with device code flow, token caching, automatic re-authentication, and the custom HTTP headers the Copilot API requires.

The library targets plain net10.0 — no MAUI dependency in the library itself. Shiny.Speech handles the platform abstraction for speech and audio, so the same IAiConversationService works on:

  • MAUI — Android, iOS, Windows, Mac Catalyst
  • Blazor — Server-side and WebAssembly (speech via Web Audio API)

We ship two sample apps that prove it: a full MAUI sample with chat, settings, and an animated aura visualization, plus a Blazor Server sample with the same features translated to Razor components and CSS animations.

The library is built with IsAotCompatible=true. Generic type parameters on SetChatClientProvider<T>() and SetMessageStore<T>() carry [DynamicallyAccessedMembers] attributes so the trimmer knows what to keep. No reflection surprises at runtime.
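For context, this is the standard .NET pattern for making generic factory methods trim-safe. The method body below is a hypothetical reduction, but the attribute usage is the real mechanism the library's hooks would rely on:

```csharp
using System.Diagnostics.CodeAnalysis;

public class AiConversationOptions
{
    // The attribute tells the trimmer to preserve T's public constructors,
    // so DI can still instantiate the type after trimming / AOT compilation
    public void SetMessageStore<
        [DynamicallyAccessedMembers(DynamicallyAccessedMemberTypes.PublicConstructors)] T>()
        where T : class, IMessageStore
    {
        // (sketch) register T with the container here
    }
}
```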

dotnet add package Shiny.AiConversation

The library is MIT licensed and open source. We’d love to hear what you build with it.