
AI Assistant

Architecture and implementation of the AI features.

The platform comes with a dedicated AI Assistant application (apps/app) and a backend module (apps/api/src/chat) to power it.

Architecture

The AI integration uses the Vercel AI SDK on both the frontend and backend: the backend streams model output over HTTP, and the frontend renders that stream incrementally.

Frontend (apps/app)

The frontend is a specialized React application built with Vite. It leverages Assistant UI components to provide a rich chat experience.

  • Runtime: Uses useChatRuntime from @assistant-ui/react-ai-sdk.
  • Transport: Connects to the backend via AssistantChatTransport.
  • UI: Features a custom thread list and message view, styled with Tailwind CSS.

Backend (apps/api/src/chat)

The backend implementation relies on the AI SDK Core (ai) to handle streaming responses from LLMs.

  • Model: Currently configured to use Google Gemini (gemini-3-pro-preview) via @ai-sdk/google.
  • Streaming: Utilizes streamText to send real-time tokens to the client.
  • Endpoint: Exposes /chat (POST) to receive messages and return the stream.

Customization

Changing the Model

To switch to a different model (e.g., OpenAI's GPT-4), install @ai-sdk/openai, set the OPENAI_API_KEY environment variable, and update apps/api/src/chat/chat.service.ts:

// apps/api/src/chat/chat.service.ts
import { openai } from '@ai-sdk/openai';

// ...
streamText({
  model: openai('gpt-4'),
  // ...
})

Modifying the UI

The assistant UI is modular. You can customize the look and feel by editing components in apps/app/src/components/assistant-ui/.