Nuda Kit includes a fully-featured AI chat interface out of the box. It's designed to be beautiful, responsive, and ready for production use—perfect for AI wrappers, assistants, or any conversational AI product.
- Real-time Streaming
- Multiple Models
- Beautiful UI
- Responsive Design
The chat interface comes pre-configured with popular models from three providers:
| Provider | Models |
|---|---|
| OpenAI | GPT-5, GPT-4o |
| Anthropic | Claude 4.5 Sonnet |
| Google | Gemini 2.5 Pro, Gemini 2.5 Flash |
When users open the chat, they're greeted with a personalized message:
> Hello {First Name}
> How can I help you today?
The greeting is rendered with a gradient text effect and a subtle text-reveal animation.
Each AI response includes action buttons:
While the AI is generating a response, users see:
The message input includes:
The chat interface communicates with a single streaming endpoint:
| Method | Endpoint | Description |
|---|---|---|
| POST | /ai/stream | Stream AI responses |
The endpoint accepts:
- `messages`: The conversation history
- `model`: The selected model name (e.g., `gpt-4o`, `claude-4.5-sonnet`)

The backend uses advanced streaming techniques for a polished experience.
The chat page is protected by authentication middleware. Users must be logged in to access it, and the JWT token is automatically included in API requests.
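As a rough sketch of what a client call to this endpoint can look like, assuming a plain `fetch` with the conversation and model in the JSON body and the JWT sent as a bearer token (the exact payload shape, header handling, and helper names here are assumptions, not the kit's actual code):

```ts
// Hypothetical client helper for the streaming endpoint.
// The body shape, auth header, and onChunk callback are illustrative assumptions.
type ChatMessage = { role: 'user' | 'assistant'; content: string }

async function streamChat(
  messages: ChatMessage[],
  model: string,
  token: string, // JWT from your auth layer (assumed)
  onChunk: (text: string) => void
) {
  const res = await fetch('/ai/stream', {
    method: 'POST',
    headers: {
      'Content-Type': 'application/json',
      Authorization: `Bearer ${token}`,
    },
    body: JSON.stringify({ messages, model }),
  })

  if (!res.ok || !res.body) {
    throw new Error(`Stream request failed with status ${res.status}`)
  }

  // Read the response incrementally as chunks arrive from the server.
  const reader = res.body.getReader()
  const decoder = new TextDecoder()
  while (true) {
    const { done, value } = await reader.read()
    if (done) break
    onChunk(decoder.decode(value, { stream: true }))
  }
}
```

In practice the kit's chat page handles this for you; the sketch is only meant to show what travels over the wire.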
You can easily add new AI models to the chat interface. Here's how:
Add your new model to the PROVIDER_MODELS constant in backend/app/services/ai_service.ts:
```ts
export const PROVIDER_MODELS = {
  openai: ['gpt-5', 'gpt-4o', 'gpt-4o-mini'], // Add new OpenAI models
  anthropic: ['claude-4.5-sonnet', 'claude-3-haiku'], // Add new Anthropic models
  google: ['gemini-2.5-flash', 'gemini-2.5-pro'],
} as const
```
The service automatically detects which provider to use based on the model name.
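A minimal sketch of what that detection can look like, assuming a simple lookup against `PROVIDER_MODELS` (the helper name, import path, and error handling are illustrative, not the kit's actual code):

```ts
import { PROVIDER_MODELS } from './ai_service.js' // the constant shown above; import path illustrative

type Provider = keyof typeof PROVIDER_MODELS

// Return the provider whose model list contains the requested model name.
function detectProvider(model: string): Provider {
  const match = (Object.keys(PROVIDER_MODELS) as Provider[]).find((provider) =>
    (PROVIDER_MODELS[provider] as readonly string[]).includes(model)
  )
  if (!match) {
    throw new Error(`Unsupported model: ${model}`)
  }
  return match
}
```

Because the lookup is keyed on the constant, adding a model to `PROVIDER_MODELS` is enough for requests to be routed to the right provider.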
Add the model to the aiModels array in frontend/app/pages/app/ai-chat.vue:
```ts
const aiModels = shallowRef([
  // ... existing models
  markRaw({
    name: 'gpt-4o-mini',
    avatar: GPTLogo,
    avatarFilled: false,
    enabled: true,
  }),
])
```
If you're adding a model from a new provider, add the API key to your backend/.env:
```
OPENAI_API_KEY=sk-...
ANTHROPIC_API_KEY=sk-ant-...
GOOGLE_API_KEY=...
```
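Server-side, these variables are what the AI SDK provider clients authenticate with. A sketch of how that wiring typically looks, following standard AdonisJS and Vercel AI SDK usage (the `env` import path and explicit `apiKey` options are assumptions and may differ slightly in the kit):

```ts
import env from '#start/env' // AdonisJS env helper; exact import path is an assumption
import { createOpenAI } from '@ai-sdk/openai'
import { createAnthropic } from '@ai-sdk/anthropic'
import { createGoogleGenerativeAI } from '@ai-sdk/google'

// Each provider factory is given the matching key from backend/.env.
const openai = createOpenAI({ apiKey: env.get('OPENAI_API_KEY') })
const anthropic = createAnthropic({ apiKey: env.get('ANTHROPIC_API_KEY') })
const google = createGoogleGenerativeAI({ apiKey: env.get('GOOGLE_API_KEY') })
```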
To add an entirely new AI provider (e.g., Mistral, Cohere):
1. Install the Vercel AI SDK package for your provider.
2. Add the new provider and its models to the `PROVIDER_MODELS` constant.
3. Add a new case in the `getModelForProvider` method to handle the new provider (see the sketch after this list).
4. Import the provider's logo SVG and add its models to the frontend selector.
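As an illustration, adding Mistral through the Vercel AI SDK could look roughly like this; the shape of `getModelForProvider` is inferred from the steps above, and the model name and environment variable are placeholders rather than the kit's actual code:

```ts
import { createMistral } from '@ai-sdk/mistral' // npm install @ai-sdk/mistral

// 1. Extend the constant with the new provider and its models (placeholder names).
export const PROVIDER_MODELS = {
  openai: ['gpt-5', 'gpt-4o', 'gpt-4o-mini'],
  anthropic: ['claude-4.5-sonnet', 'claude-3-haiku'],
  google: ['gemini-2.5-flash', 'gemini-2.5-pro'],
  mistral: ['mistral-large-latest'],
} as const

const mistral = createMistral({ apiKey: process.env.MISTRAL_API_KEY })

// 2. Handle the new provider in the model lookup (illustrative shape of getModelForProvider).
function getModelForProvider(provider: keyof typeof PROVIDER_MODELS, model: string) {
  switch (provider) {
    case 'mistral':
      return mistral(model)
    // ...existing cases for openai, anthropic, and google
    default:
      throw new Error(`Unsupported provider: ${provider}`)
  }
}
```

The frontend side then works the same way as adding a model above: import the provider's logo SVG and register its models in the `aiModels` array.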
You can extend the chat interface to fit your product:
The chat interface fully supports dark mode:
On mobile devices: