The component requires the IAIChatService to be registered in your application; see the demos source for reference. This service supports any OpenAI-compatible endpoint, including OpenAI, Azure OpenAI, Cloudflare AI, and other compatible providers. For WebAssembly applications, you'll need a server-side proxy endpoint, since browsers cannot make direct requests to external APIs due to CORS restrictions; refer to AIChatController.
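A minimal registration sketch for Program.cs. The concrete implementation type `OpenAIChatService` is a placeholder name assumed here; use the service class from the demos source:

```csharp
// Program.cs — register the service the AIChat component resolves via DI.
// OpenAIChatService is a hypothetical name; substitute the implementation
// provided in the Radzen demos source.
builder.Services.AddScoped<IAIChatService, OpenAIChatService>();
```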
Customize the appearance of the AIChat component using the Style, ShowClearButton, Disabled, and ReadOnly properties.
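A sketch of the appearance parameters used together. The `RadzenAIChat` tag name follows Radzen's component naming convention, and the values are illustrative:

```razor
@* Appearance parameters; values are illustrative. *@
<RadzenAIChat Style="width: 100%; height: 500px"
              ShowClearButton="true"
              Disabled="false"
              ReadOnly="@isReadOnly" />

@code {
    bool isReadOnly = false;
}
```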
Create a more compact chat interface suitable for smaller spaces or sidebar implementations.
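A compact variant can be sketched by constraining the component through Style; the dimensions below are illustrative and the `RadzenAIChat` tag name is assumed:

```razor
@* A sidebar-sized chat; pick dimensions to suit your layout. *@
<RadzenAIChat Style="width: 320px; height: 360px" />
```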
Handle chat events such as MessageAdded, MessageSent, ResponseReceived, and ChatCleared to integrate with your application logic.
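A sketch of wiring up the four events. The callback signatures below are assumptions; check the component's API reference for the actual argument types:

```razor
<RadzenAIChat MessageAdded="@OnMessageAdded"
              MessageSent="@OnMessageSent"
              ResponseReceived="@OnResponseReceived"
              ChatCleared="@OnChatCleared" />

@code {
    // Handler signatures are assumptions, not the documented API.
    void OnMessageAdded(string message) => Console.WriteLine($"Added: {message}");
    void OnMessageSent(string message) => Console.WriteLine($"Sent: {message}");
    void OnResponseReceived(string response) => Console.WriteLine($"Received: {response}");
    void OnChatCleared() => Console.WriteLine("Chat cleared");
}
```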
The AIChat component supports conversation memory that remembers previous questions and maintains context across multiple interactions.
Use SessionId to maintain conversation state and SessionIdChanged to track session changes.
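Since SessionId has a matching SessionIdChanged callback, it can be two-way bound. A minimal sketch, assuming the `RadzenAIChat` tag name:

```razor
@* @bind-SessionId uses SessionIdChanged under the hood. *@
<RadzenAIChat @bind-SessionId="@sessionId" />

@code {
    // Persist this value (e.g. in local storage) to resume the conversation later.
    string sessionId = Guid.NewGuid().ToString();
}
```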
Customize AI behavior at runtime by configuring the Model, Temperature, MaxTokens, SystemPrompt, Endpoint, Proxy, ApiKey, and ApiKeyHeader parameters. These parameters can be set on the component or passed to the SendMessage method to override settings per message, enabling dynamic switching between different AI providers and models.
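A sketch setting these parameters on the component. The tag name, endpoint, proxy path, and model are illustrative assumptions, not required values:

```razor
<RadzenAIChat Model="@model"
              Temperature="0.7"
              MaxTokens="1024"
              SystemPrompt="You are a helpful assistant."
              Endpoint="https://api.openai.com/v1"
              Proxy="/api/aichat"
              ApiKey="@apiKey"
              ApiKeyHeader="Authorization" />

@code {
    // Illustrative values; in WebAssembly, prefer routing through the Proxy
    // endpoint rather than exposing an API key in the browser.
    string model = "gpt-4o-mini";
    string apiKey = "";
}
```

Changing `model` at runtime (e.g. from a dropdown) switches providers or models without recreating the component.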
Radzen Blazor Components, © 2018-2025 Radzen. Source code licensed under MIT.