Quick Start

Get up and running in 2 minutes

Prerequisites: Node 18+, an LLM API key (OpenAI, Anthropic, etc.)

At a Glance

Quick setup: three steps to your AI copilot.

1. Install

Add the SDK packages:

```bash
pnpm add @yourgpt/copilot-sdk @yourgpt/llm-sdk openai
```
2. Backend

Create your API route:

```typescript
import { streamText } from '@yourgpt/llm-sdk';
import { openai } from '@yourgpt/llm-sdk/openai';

export async function POST(req: Request) {
  const { messages } = await req.json();
  const result = await streamText({
    model: openai('gpt-4o-mini'),
    messages,
  });
  return result.toTextStreamResponse();
}
```
3. Frontend

Add the chat component:

```tsx
import { CopilotProvider } from '@yourgpt/copilot-sdk/react';
import { CopilotChat } from '@yourgpt/copilot-sdk/ui';

export default function App() {
  return (
    <CopilotProvider runtimeUrl="/api/chat">
      <CopilotChat />
    </CopilotProvider>
  );
}
```

Ready to launch

Run `pnpm dev` and open http://localhost:3000.
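
The chat component POSTs the conversation to `runtimeUrl` as JSON, and the route destructures the same `messages` key. A sketch of the likely wire shape (the exact format is defined by the SDK; this follows the common role/content convention):

```typescript
// Hypothetical wire shape: what the UI sends and what
// `await req.json()` yields on the server side.
type ChatMessage = { role: 'system' | 'user' | 'assistant'; content: string };

const body: { messages: ChatMessage[] } = {
  messages: [{ role: 'user', content: 'Hello!' }],
};

// Simulate the JSON round trip the request goes through.
const parsed = JSON.parse(JSON.stringify(body));
console.log(parsed.messages[0].role); // 'user'
```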


Detailed Setup

Create New Project

The fastest way to start: one command creates everything.

```bash
pnpm create ai-copilot
# or: npx create-ai-copilot
# or: yarn create ai-copilot
# or: bunx create-ai-copilot
```

The CLI will guide you through:

  1. Project name - Name your app
  2. Framework - Next.js (full-stack) or Vite + React
  3. Provider - OpenAI, Anthropic, Google, or xAI
  4. API Key - Optional, can add to .env later

The CLI creates a complete project with frontend, backend, and styling ready to go.
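
The exact layout depends on the options you pick; for a Next.js project it might look roughly like this (illustrative only, file names not guaranteed):

```
my-app/
├── app/
│   ├── api/chat/route.ts   # backend route
│   ├── providers.tsx       # CopilotProvider wrapper
│   ├── page.tsx            # chat UI
│   └── globals.css         # Tailwind + SDK styles
└── .env                    # API keys
```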

Run Your App

```bash
cd my-app
pnpm dev
```

Open http://localhost:3000 - your AI copilot is ready!
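
To sanity-check the backend without the UI, you can read its streamed response directly. A sketch assuming the route streams plain UTF-8 text (which `toTextStreamResponse()` suggests), demonstrated here against a synthetic Response:

```typescript
// Helper: drain a streamed text Response into a string.
async function readTextStream(res: Response): Promise<string> {
  const reader = res.body!.getReader();
  const decoder = new TextDecoder();
  let text = '';
  for (;;) {
    const { done, value } = await reader.read();
    if (done) break;
    text += decoder.decode(value, { stream: true });
  }
  return text;
}

// Synthetic stand-in for the route's streamed reply.
const stream = new ReadableStream<Uint8Array>({
  start(controller) {
    for (const chunk of ['Hel', 'lo', '!']) {
      controller.enqueue(new TextEncoder().encode(chunk));
    }
    controller.close();
  },
});

const text = await readTextStream(new Response(stream));
console.log(text); // 'Hello!'
```

Against a running dev server, the same helper works on `await fetch('http://localhost:3000/api/chat', { method: 'POST', body: JSON.stringify({ messages }) })`.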


What's Included

The generated project includes:

  • Frontend - React chat UI with @yourgpt/copilot-sdk
  • Backend - API route with @yourgpt/llm-sdk
  • Styling - Tailwind CSS with shadcn/ui theming
  • TypeScript - Full type safety
  • Environment - .env file for API keys


Add to Existing Project

Already have a Next.js, Vite, or React app? Add the SDK manually.


1. Install Dependencies

```bash
pnpm add @yourgpt/copilot-sdk @yourgpt/llm-sdk openai zod
```

The openai package works with OpenAI, Google Gemini, and xAI (all OpenAI-compatible APIs). For Anthropic, install @anthropic-ai/sdk in place of openai:

```bash
pnpm add @yourgpt/copilot-sdk @yourgpt/llm-sdk @anthropic-ai/sdk zod
```
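
"OpenAI-compatible" means these providers accept the same chat-completions request shape; a sketch of the payload ultimately sent over the wire (field names follow the OpenAI chat completions API; the SDK handles the actual serialization and base URL):

```typescript
// Illustrative chat-completions payload. Only the model name and the
// endpoint's base URL change between OpenAI-compatible providers.
const payload = {
  model: 'gpt-4o-mini', // or a Gemini / Grok model name
  stream: true,         // stream tokens back as they are generated
  messages: [
    { role: 'system', content: 'You are a helpful assistant.' },
    { role: 'user', content: 'Hi' },
  ],
};

console.log(Object.keys(payload).sort().join(','));
```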


2. Create API Route

OpenAI:

app/api/chat/route.ts

```typescript
import { streamText } from '@yourgpt/llm-sdk';
import { openai } from '@yourgpt/llm-sdk/openai';

export async function POST(req: Request) {
  const { messages } = await req.json();

  const result = await streamText({
    model: openai('gpt-4o-mini'),
    system: 'You are a helpful assistant.',
    messages,
  });

  return result.toTextStreamResponse();
}
```

.env.local

```
OPENAI_API_KEY=sk-...
```
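
The install command includes zod, which is commonly used to validate request bodies like this one before they reach the model. A dependency-free sketch of the same check (hypothetical helper, not part of the SDK), which the route could use to return a 400 on malformed input:

```typescript
// Minimal runtime check that the parsed body's `messages` field
// is an array of { role, content } objects.
type IncomingMessage = { role: string; content: string };

function isMessageArray(value: unknown): value is IncomingMessage[] {
  return (
    Array.isArray(value) &&
    value.every(
      (m) =>
        typeof m === 'object' && m !== null &&
        typeof (m as any).role === 'string' &&
        typeof (m as any).content === 'string',
    )
  );
}

console.log(isMessageArray([{ role: 'user', content: 'Hi' }])); // true
console.log(isMessageArray([{ role: 'user' }]));                // false
```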

Anthropic: first install the Anthropic SDK:

```bash
pnpm add @anthropic-ai/sdk
```

app/api/chat/route.ts

```typescript
import { streamText } from '@yourgpt/llm-sdk';
import { anthropic } from '@yourgpt/llm-sdk/anthropic';

export async function POST(req: Request) {
  const { messages } = await req.json();

  const result = await streamText({
    model: anthropic('claude-sonnet-4-20250514'),
    system: 'You are a helpful assistant.',
    messages,
  });

  return result.toTextStreamResponse();
}
```

.env.local

```
ANTHROPIC_API_KEY=sk-ant-...
```
Google:

app/api/chat/route.ts

```typescript
import { streamText } from '@yourgpt/llm-sdk';
import { google } from '@yourgpt/llm-sdk/google';

export async function POST(req: Request) {
  const { messages } = await req.json();

  const result = await streamText({
    model: google('gemini-2.0-flash'),
    system: 'You are a helpful assistant.',
    messages,
  });

  return result.toTextStreamResponse();
}
```

.env.local

```
GOOGLE_API_KEY=...
```

Google Gemini exposes an OpenAI-compatible API, so it works through the openai package you already installed.


3. Verify Environment Variable

Make sure your .env.local has the API key for your chosen provider.

See Providers for more providers and configuration options.


4. Add Provider (Frontend)

app/providers.tsx

```tsx
'use client';

import { CopilotProvider } from '@yourgpt/copilot-sdk/react';

export function Providers({ children }: { children: React.ReactNode }) {
  return (
    <CopilotProvider runtimeUrl="/api/chat">
      {children}
    </CopilotProvider>
  );
}
```

Wrap your app with the provider:

app/layout.tsx

```tsx
import { Providers } from './providers';

export default function RootLayout({ children }: { children: React.ReactNode }) {
  return (
    <html>
      <body>
        <Providers>{children}</Providers>
      </body>
    </html>
  );
}
```

5. Add Chat Component

app/page.tsx

```tsx
import { CopilotChat } from '@yourgpt/copilot-sdk/ui';

export default function Home() {
  return (
    <div className="h-screen p-4">
      <CopilotChat className="h-full rounded-xl border" />
    </div>
  );
}
```

6. Configure Styling (Tailwind CSS v4)

app/globals.css

```css
@import "tailwindcss";

/* Include SDK package for Tailwind class detection */
@source "node_modules/@yourgpt/copilot-sdk/src/**/*.{ts,tsx}";

@custom-variant dark (&:is(.dark *));
```

For theming and CSS variables, follow the shadcn/ui theming guide.


Done!

Run `pnpm dev` and you have a working AI chat.

