Basic Usage
This guide covers the basic usage patterns for integrating AI capabilities into your Nuxt application using the Nuxt AI module.
First, decide which model you want to use
The module supports multiple AI providers through the Vercel AI SDK. You can choose the provider you want to use by setting the client option in the configuration.
You can see which providers are supported by the module here.
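For example, a minimal nuxt.config sketch (the module name and the ai options key below are assumptions; check the module's README for the exact names):
export default defineNuxtConfig({
  modules: ['nuxt-ai'], // assumed module name
  ai: {
    // Provider used by the Vercel AI SDK integration
    client: 'openai',
  },
})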
We'll use OpenAI in this guide, so we'll install the OpenAI provider:
pnpm add -D @ai-sdk/openai
yarn add --dev @ai-sdk/openai
npm install --save-dev @ai-sdk/openai
bun add -D @ai-sdk/openai
Using the Chat Interface
The most common use case is implementing a chat interface using the Vercel AI SDK. With Nuxt AI, the SDK's composables are auto-imported.
Creating a Chat Component
<script setup lang="ts">
// useChat is auto-imported by the Nuxt AI module
const { messages, input, handleSubmit } = useChat({
  api: '/api/chat',
});
</script>
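The script alone renders nothing; here is a minimal template to pair with it, showing the message list and wiring up the input (the markup details are illustrative, not part of the module):
<template>
  <div>
    <div v-for="message in messages" :key="message.id">
      <strong>{{ message.role }}:</strong> {{ message.content }}
    </div>
    <form @submit.prevent="handleSubmit">
      <input v-model="input" placeholder="Say something..." />
      <button type="submit">Send</button>
    </form>
  </div>
</template>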
Creating an API Route Handler
Set up an API route to handle the chat requests:
This example uses the useRuntimeConfig object to access your API keys. Learn more about runtime config and environment variables.
import { streamText } from 'ai'
import { createOpenAI } from '@ai-sdk/openai'

export default defineLazyEventHandler(async () => {
  const apiKey = useRuntimeConfig().openaiApiKey
  if (!apiKey) {
    throw new Error('Missing OpenAI API key')
  }

  const openai = createOpenAI({ apiKey })

  // defineLazyEventHandler runs the setup above once, on the first request
  return defineEventHandler(async (event) => {
    const { messages } = await readBody(event)

    const result = streamText({
      model: openai('gpt-4o'),
      messages,
    })

    return result.toDataStreamResponse()
  })
})
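For useRuntimeConfig().openaiApiKey to resolve, declare the key in nuxt.config; Nuxt will then pick up a NUXT_OPENAI_API_KEY environment variable at runtime:
export default defineNuxtConfig({
  runtimeConfig: {
    // Server-only by default; overridden by NUXT_OPENAI_API_KEY
    openaiApiKey: '',
  },
})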
Using Streaming Completion
For simple text completion without a chat interface:
<script setup lang="ts">
const prompt = ref('');

const { completion, complete, isLoading } = useCompletion({
  api: '/api/completion',
});

async function handleSubmit() {
  await complete(prompt.value);
}
</script>

<template>
  <div>
    <form @submit.prevent="handleSubmit">
      <textarea v-model="prompt" placeholder="Enter a prompt..."></textarea>
      <button type="submit" :disabled="isLoading">Generate</button>
    </form>
    <div v-if="completion" class="completion">
      <h3>Generated Text:</h3>
      <p>{{ completion }}</p>
    </div>
  </div>
</template>
And the corresponding API route, using the same streamText pattern as the chat endpoint:
import { streamText } from 'ai'
import { createOpenAI } from '@ai-sdk/openai'

export default defineLazyEventHandler(async () => {
  const apiKey = useRuntimeConfig().openaiApiKey
  if (!apiKey) {
    throw new Error('Missing OpenAI API key')
  }

  const openai = createOpenAI({ apiKey })

  return defineEventHandler(async (event) => {
    const { prompt } = await readBody(event)

    const result = streamText({
      // openai.completion() targets legacy text-completion models
      model: openai.completion('gpt-3.5-turbo-instruct'),
      prompt,
    })

    return result.toDataStreamResponse()
  })
})
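To sanity-check the endpoint without the component, you can POST to it directly from any client-side code; a quick sketch (the prompt text is arbitrary):
const res = await fetch('/api/completion', {
  method: 'POST',
  headers: { 'Content-Type': 'application/json' },
  body: JSON.stringify({ prompt: 'Write a haiku about Nuxt' }),
})
// Logs the raw data-stream protocol; useCompletion parses this for you
console.log(await res.text())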
Using Different AI Providers
Nuxt AI supports multiple AI providers through the Vercel AI SDK. Here's how to use Anthropic's Claude:
pnpm add -D @ai-sdk/anthropic
yarn add --dev @ai-sdk/anthropic
npm install --save-dev @ai-sdk/anthropic
bun add -D @ai-sdk/anthropic
import { streamText } from 'ai'
import { createAnthropic } from '@ai-sdk/anthropic'

export default defineLazyEventHandler(async () => {
  const apiKey = useRuntimeConfig().anthropicApiKey
  if (!apiKey) {
    throw new Error('Missing Anthropic API key')
  }

  const anthropic = createAnthropic({ apiKey })

  return defineEventHandler(async (event) => {
    const { messages } = await readBody(event)

    const result = streamText({
      model: anthropic('claude-3-7-sonnet-20250219'),
      messages,
    })

    return result.toDataStreamResponse()
  })
})
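As with the OpenAI example, the key needs a runtimeConfig entry, which Nuxt maps to a NUXT_ANTHROPIC_API_KEY environment variable at runtime:
export default defineNuxtConfig({
  runtimeConfig: {
    anthropicApiKey: '',
  },
})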
Using MCP Tools
If you've enabled the MCP server in your configuration, you can access its tools in development:
- Navigate to your Nuxt app in development mode (usually at http://localhost:3000)
- Open the built-in MCP tools at the configured path (default is /mcp)
- Use the various development tools provided by the MCP server
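Enabling the server is a configuration flag; a purely hypothetical sketch, since the exact option names depend on your module version (the mcp key and its fields below are assumptions):
export default defineNuxtConfig({
  ai: {
    // Hypothetical option names; check the module docs for the real ones
    mcp: {
      enabled: true,
      path: '/mcp',
    },
  },
})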
MCP tools can help with:
- Generating and managing AI rules
- Documenting your application
- Testing AI integrations
- And more, depending on which MCP servers you've enabled