Create an AI-powered web app with streaming chat and conversation history
A deployed AI chat application with streaming responses, conversation memory, and user authentication.
Next.js handles the frontend chat UI and API routes. Vercel AI SDK provides the streaming integration with AI providers (OpenAI, Anthropic, etc.). Supabase stores conversation history and handles user auth. Vercel deploys and runs everything on edge functions for low latency.
Create a new Next.js project and install the Vercel AI SDK.
npx create-next-app@latest my-ai-app --typescript --tailwind --app
cd my-ai-app
npm install ai @ai-sdk/openai

Add your OpenAI API key to `.env.local`:

OPENAI_API_KEY=sk-...
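A missing key only surfaces as a failed request at runtime, so it can help to check for it at startup. A minimal sketch: `requireEnv` is a hypothetical helper, not part of the AI SDK.

```typescript
// Hypothetical startup check (not part of the AI SDK): fail fast with a
// clear message when a required environment variable is missing.
function requireEnv(name: string): string {
  const value = process.env[name]
  if (!value) throw new Error(`Missing environment variable: ${name}`)
  return value
}

// e.g. call requireEnv('OPENAI_API_KEY') once before constructing the model.
```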
Create an API route that streams AI responses back to the client.
Create `src/app/api/chat/route.ts` and use `streamText` from the AI SDK to stream responses:

import { openai } from '@ai-sdk/openai'
import { streamText } from 'ai'
export async function POST(req: Request) {
  const { messages } = await req.json()

  const result = streamText({
    model: openai('gpt-4o'),
    system: 'You are a helpful AI assistant.',
    messages
  })

  return result.toDataStreamResponse()
}
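The model id above is hard-coded; it can be chosen by environment instead. A small sketch with a hypothetical helper (`pickModelId` is illustrative, not part of the AI SDK):

```typescript
// Hypothetical helper: choose a model id based on the runtime environment
// so development requests stay cheap.
function pickModelId(env: string | undefined): string {
  return env === 'production' ? 'gpt-4o' : 'gpt-4o-mini'
}

// In the route above: model: openai(pickModelId(process.env.NODE_ENV))
```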
Tip: use `gpt-4o-mini` during development to save API costs. Switch to `gpt-4o` or `claude-3.5-sonnet` for production.

Create a streaming chat UI with the `useChat` hook from the AI SDK.
In `src/app/page.tsx`, use the `useChat` hook from `ai/react` for automatic streaming state management:

'use client'
import { useChat } from 'ai/react'
export default function Chat() {
  const { messages, input, handleInputChange, handleSubmit, isLoading } = useChat()

  return (
    <div className="max-w-2xl mx-auto p-4">
      <div className="space-y-4 mb-4">
        {messages.map(m => (
          <div key={m.id} className={`p-4 rounded-lg ${
            m.role === 'user' ? 'bg-blue-100 ml-12' : 'bg-gray-100 mr-12'
          }`}>
            {m.content}
          </div>
        ))}
      </div>
      <form onSubmit={handleSubmit} className="flex gap-2">
        <input value={input} onChange={handleInputChange}
          placeholder="Ask me anything..."
          className="flex-1 p-3 border rounded-lg" />
        <button type="submit" disabled={isLoading}
          className="px-6 py-3 bg-black text-white rounded-lg">
          Send
        </button>
      </form>
    </div>
  )
}
The `useChat` hook handles streaming, loading states, error handling, and message history automatically. You don't need to manage any of this yourself.

Store conversations in Supabase so users can revisit past chats.
npm install @supabase/supabase-js

Create a `conversations` table (id, user_id, title, created_at) and a `messages` table (id, conversation_id, role, content, created_at), and add your Supabase credentials to `.env.local`:

CREATE TABLE conversations (
  id UUID PRIMARY KEY DEFAULT gen_random_uuid(),
  user_id TEXT NOT NULL,
  title TEXT DEFAULT 'New conversation',
  created_at TIMESTAMPTZ DEFAULT now()
);

CREATE TABLE messages (
  id UUID PRIMARY KEY DEFAULT gen_random_uuid(),
  conversation_id UUID REFERENCES conversations(id) ON DELETE CASCADE,
  role TEXT NOT NULL CHECK (role IN ('user', 'assistant', 'system')),
  content TEXT NOT NULL,
  created_at TIMESTAMPTZ DEFAULT now()
);
Deploy your AI app with edge functions for low-latency streaming.
Add `export const runtime = 'edge'` to your chat route for edge deployment.

The result: a deployed AI chat application with streaming responses, conversation memory, and user authentication.