AI Integration

Chat Interface

A beautiful, production-ready AI chat interface with streaming responses.

Nuda Kit includes a fully-featured AI chat interface out of the box. It's designed to be beautiful, responsive, and ready for production use—perfect for AI wrappers, assistants, or any conversational AI product.

Features

Real-time Streaming

Responses stream in real-time using the Vercel AI SDK, providing instant feedback as the AI generates text.
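
A minimal sketch of how a page can consume that stream, assuming AI SDK 4-style Vue bindings from @ai-sdk/vue (hook names and options are illustrative and may differ from the kit's actual page):

import { useChat } from '@ai-sdk/vue'

// Illustrative only: point the chat composable at the kit's streaming endpoint.
// `body` is merged into the POST payload, which is one way to send the selected model along.
const { messages, input, handleSubmit, isLoading, stop } = useChat({
  api: '/ai/stream',
  body: { model: 'gpt-4o' },
})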

Multiple Models

Switch between OpenAI, Anthropic, and Google AI models with a single click.

Beautiful UI

Gradient glow effects, smooth animations, and a polished design that works in light and dark mode.

Responsive Design

Optimized for desktop, tablet, and mobile devices.

Supported Models

The chat interface comes pre-configured with popular models from three providers:

Provider     Models
OpenAI       GPT-5, GPT-4o
Anthropic    Claude 4.5 Sonnet
Google       Gemini 2.5 Pro, Gemini 2.5 Flash

The model selector includes brand logos for each provider, making it easy for users to identify their preferred model.

User Experience

Personalized Welcome

When users open the chat, they're greeted with a personalized message:

Hello, [First Name]
How can I help you today?

The greeting uses a beautiful gradient text effect and subtle text reveal animation.

Message Actions

Each AI response includes action buttons:

  • Copy — Copy the response to clipboard with visual feedback
  • Regenerate — Generate a new response for the same prompt
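
The copy action can be as small as a clipboard write plus a short-lived "copied" flag for the visual feedback; a hedged sketch (names are illustrative):

import { ref } from 'vue'

const copied = ref(false)

// Copy one response to the clipboard and flash a confirmation state for two seconds.
async function copyMessage(text: string) {
  await navigator.clipboard.writeText(text)
  copied.value = true
  setTimeout(() => (copied.value = false), 2000)
}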

Thinking Indicator

While the AI is generating a response, users see:

  • An animated pulsing dot with gradient colors
  • "Thinking..." text with animated dots
  • The send button transforms into a stop button

Smart Input

The message input includes:

  • Auto-resize — Textarea grows as you type, up to a maximum height
  • Enter to send — Press Enter to send, Shift+Enter for new lines
  • Disabled while streaming — Prevents sending while AI is responding
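
A rough sketch of the Enter-to-send and auto-resize behavior described above (the actual component may differ):

import { ref } from 'vue'

const textarea = ref<HTMLTextAreaElement | null>(null)
const MAX_HEIGHT = 200 // px, illustrative cap

// Grow the textarea with its content, up to the maximum height.
function autoResize() {
  const el = textarea.value
  if (!el) return
  el.style.height = 'auto'
  el.style.height = `${Math.min(el.scrollHeight, MAX_HEIGHT)}px`
}

// Enter sends, Shift+Enter inserts a newline; sending is skipped while a response is streaming.
function onKeydown(event: KeyboardEvent, isStreaming: boolean, send: () => void) {
  if (event.key === 'Enter' && !event.shiftKey) {
    event.preventDefault()
    if (!isStreaming) send()
  }
}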

Interface Components

Header

  • Breadcrumb navigation
  • New Chat button to clear conversation and start fresh

Chat Area

  • Messages displayed in a clean, readable format
  • User messages aligned right with muted background
  • AI responses aligned left with action buttons
  • Auto-scroll to newest messages
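
Auto-scroll is typically a watcher on the message list; a minimal sketch with placeholder state (the kit's page wires this to its real chat state):

import { ref, watch, nextTick } from 'vue'

const chatContainer = ref<HTMLElement | null>(null)
const messages = ref<{ role: string; content: string }[]>([]) // placeholder for the real chat state

// Scroll the container to the newest message whenever the list changes.
watch(
  messages,
  async () => {
    await nextTick()
    chatContainer.value?.scrollTo({ top: chatContainer.value.scrollHeight, behavior: 'smooth' })
  },
  { deep: true }
)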

Input Area

  • Gradient glowing border effect
  • Model selector dropdown with provider logos
  • Send/Stop button that adapts to current state

API Endpoint

The chat interface communicates with a single streaming endpoint:

Method    Endpoint      Description
POST      /ai/stream    Stream AI responses

The endpoint accepts:

  • messages — Conversation history
  • model — Selected model name (e.g., gpt-4o, claude-4.5-sonnet)
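
A hedged example of what a request to this endpoint can look like (the field names come from the list above; the kit's own client sends the JWT for you, as described under Authentication):

declare const token: string // JWT provided by the kit's auth layer

const response = await fetch('/ai/stream', {
  method: 'POST',
  headers: {
    'Content-Type': 'application/json',
    Authorization: `Bearer ${token}`,
  },
  body: JSON.stringify({
    model: 'gpt-4o',
    messages: [{ role: 'user', content: 'Hello!' }],
  }),
})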

Smooth Streaming

The backend uses advanced streaming techniques for a polished experience:

  • Simulated streaming middleware — Ensures consistent streaming behavior across providers
  • Smooth stream transform — Provides natural text flow instead of choppy chunks
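
Both features come from the Vercel AI SDK. A simplified sketch of how they can be combined in the streaming endpoint (the kit's actual service may be structured differently):

import { streamText, smoothStream, wrapLanguageModel, simulateStreamingMiddleware } from 'ai'
import { openai } from '@ai-sdk/openai'

// Wrap the model so even non-streaming results are delivered as a stream,
// then smooth the outgoing chunks into a steady, word-by-word flow.
const model = wrapLanguageModel({
  model: openai('gpt-4o'),
  middleware: simulateStreamingMiddleware(),
})

const result = streamText({
  model,
  messages: [{ role: 'user', content: 'Hello!' }],
  experimental_transform: smoothStream({ delayInMs: 20, chunking: 'word' }),
})
// `result` exposes the stream that the /ai/stream endpoint forwards to the client.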

Authentication

The chat page is protected by authentication middleware. Users must be logged in to access it, and the JWT token is automatically included in API requests.

Adding More Models

You can easily add new AI models to the chat interface. Here's how:

Step 1: Update the Backend

Add your new model to the PROVIDER_MODELS constant in backend/app/services/ai_service.ts:

export const PROVIDER_MODELS = {
  openai: ['gpt-5', 'gpt-4o', 'gpt-4o-mini'],  // Add new OpenAI models
  anthropic: ['claude-4.5-sonnet', 'claude-3-haiku'],  // Add new Anthropic models
  google: ['gemini-2.5-flash', 'gemini-2.5-pro'],
} as const

The service automatically detects which provider to use based on the model name.
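
A simplified sketch of what that detection can look like, built on the PROVIDER_MODELS constant above and the AI SDK provider packages (the real getModelForProvider in ai_service.ts may differ in its details):

import { openai } from '@ai-sdk/openai'
import { anthropic } from '@ai-sdk/anthropic'
import { google } from '@ai-sdk/google'
import type { LanguageModel } from 'ai'

// Map a model name to the provider that serves it.
function resolveModel(modelName: string): LanguageModel {
  if ((PROVIDER_MODELS.openai as readonly string[]).includes(modelName)) return openai(modelName)
  if ((PROVIDER_MODELS.anthropic as readonly string[]).includes(modelName)) return anthropic(modelName)
  if ((PROVIDER_MODELS.google as readonly string[]).includes(modelName)) return google(modelName)
  throw new Error(`Unsupported model: ${modelName}`)
}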

Step 2: Update the Frontend

Add the model to the aiModels array in frontend/app/pages/app/ai-chat.vue:

const aiModels = shallowRef([
  // ... existing models
  markRaw({ 
    name: 'gpt-4o-mini', 
    avatar: GPTLogo, 
    avatarFilled: false, 
    enabled: true 
  }),
])

Step 3: Add API Keys (if needed)

If you're adding a model from a new provider, add the API key to your backend/.env:

OPENAI_API_KEY=sk-...
ANTHROPIC_API_KEY=sk-ant-...
GOOGLE_API_KEY=...

Adding a New Provider

To add an entirely new AI provider (e.g., Mistral, Cohere):

Install the SDK

Install the Vercel AI SDK package for your provider.

Add to Provider Models

Add the new provider and its models to the PROVIDER_MODELS constant.

Add Provider Logic

Add a new case in the getModelForProvider method to handle the new provider.
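
For example, wiring up Mistral might look roughly like this (model names and structure are illustrative; check the @ai-sdk/mistral docs for current identifiers):

import { mistral } from '@ai-sdk/mistral'
import type { LanguageModel } from 'ai'

// Illustrative: PROVIDER_MODELS gains a 'mistral' entry (model names are examples).
export const PROVIDER_MODELS = {
  // ...existing openai / anthropic / google entries
  mistral: ['mistral-large-latest', 'mistral-small-latest'],
} as const

// The resolution logic then gains a matching branch (mirrors the earlier sketch).
// @ai-sdk/mistral reads MISTRAL_API_KEY from the environment by default.
function resolveModel(modelName: string): LanguageModel {
  if ((PROVIDER_MODELS.mistral as readonly string[]).includes(modelName)) return mistral(modelName)
  // ...existing provider branches
  throw new Error(`Unsupported model: ${modelName}`)
}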

Update the Frontend

Import the provider's logo SVG and add the new models to the frontend selector.

The Vercel AI SDK supports many providers including Mistral, Cohere, Perplexity, and more. Check the Vercel AI SDK documentation for available providers.

Customization Ideas

You can extend the chat interface to fit your product:

  • System prompts — Add custom instructions to shape AI behavior
  • Chat history — Persist conversations to the database
  • File uploads — Allow users to upload documents or images
  • Tool calling — Enable the AI to perform actions
  • Usage tracking — Monitor token usage per user
  • Rate limiting — Control API usage based on subscription tier

The chat interface is a starting point. Customize it to match your product's unique value proposition.
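
For instance, a custom system prompt is just one extra parameter on the streaming call; a hedged sketch using the AI SDK:

import { streamText } from 'ai'
import { openai } from '@ai-sdk/openai'

// Illustrative only: shape the assistant's behavior with product-specific instructions.
const result = streamText({
  model: openai('gpt-4o'),
  system: 'You are the in-app assistant for Acme. Be concise and link to the docs when relevant.',
  messages: [{ role: 'user', content: 'How do I export my data?' }],
})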

Dark Mode

The chat interface fully supports dark mode:

  • Gradient colors adapt to the theme
  • Message bubbles use theme-aware backgrounds
  • Model logos switch between filled/outline variants
  • All UI elements respect the user's color preference

Mobile Experience

On mobile devices:

  • Full-width message input
  • Touch-friendly buttons
  • Collapsible header elements
  • Optimized spacing and typography