Getting Started
Basic Usage
Learn how to use Nuxt AI in your Nuxt applications.
This guide covers the basic usage patterns for integrating AI capabilities into your Nuxt application using the Nuxt AI module.
Using the Chat Interface
The most common use case is implementing a chat interface using the Vercel AI SDK. With Nuxt AI, the SDK's composables are auto-imported.
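If auto-imports are disabled in your project, the same composables can be imported explicitly from the SDK's Vue bindings (assuming the module wires up the @ai-sdk/vue package):

import { useChat, useCompletion } from '@ai-sdk/vue'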
Creating a Chat Component
components/AiChat.vue
<script setup lang="ts">
const { messages, input, handleSubmit } = useChat({
  api: '/api/chat',
});
</script>
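useChat also returns the reactive messages array and an input binding, so a minimal template for the same component could look like the following (one possible rendering, not prescribed by the module):

<template>
  <div>
    <div v-for="message in messages" :key="message.id">
      <strong>{{ message.role }}:</strong> {{ message.content }}
    </div>
    <form @submit="handleSubmit">
      <input v-model="input" placeholder="Say something..." />
    </form>
  </div>
</template>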
Creating an API Route Handler
Set up an API route to handle the chat requests:
server/api/chat.ts
import { createOpenAI } from '@ai-sdk/openai'
import { streamText } from 'ai'

export default defineLazyEventHandler(async () => {
  const apiKey = useRuntimeConfig().openaiApiKey
  if (!apiKey) {
    throw new Error('Missing OpenAI API key')
  }

  const openai = createOpenAI({ apiKey })

  return defineEventHandler(async (event) => {
    const { messages } = await readBody(event)

    const result = streamText({
      model: openai('gpt-4o'),
      messages,
    })

    return result.toDataStreamResponse()
  })
})
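The handler reads the key from Nuxt's runtime config, so expose it in your config; Nuxt will also populate it automatically from a NUXT_OPENAI_API_KEY environment variable:

nuxt.config.ts
export default defineNuxtConfig({
  runtimeConfig: {
    // Server-only value, never shipped to the client
    openaiApiKey: process.env.OPENAI_API_KEY ?? '',
  },
})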
Using Streaming Completion
For simple text completion without a chat interface:
components/TextCompletion.vue
<script setup lang="ts">
const prompt = ref('');

const { completion, complete, isLoading } = useCompletion({
  api: '/api/completion',
});

async function handleSubmit() {
  await complete(prompt.value);
}
</script>
<template>
  <div>
    <form @submit.prevent="handleSubmit">
      <textarea v-model="prompt" placeholder="Enter a prompt..."></textarea>
      <button type="submit" :disabled="isLoading">Generate</button>
    </form>
    <div v-if="completion" class="completion">
      <h3>Generated Text:</h3>
      <p>{{ completion }}</p>
    </div>
  </div>
</template>
And the corresponding API route, built on the same streamText helper as the chat example:
server/api/completion.ts
import { createOpenAI } from '@ai-sdk/openai'
import { streamText } from 'ai'

export default defineLazyEventHandler(async () => {
  const apiKey = useRuntimeConfig().openaiApiKey
  if (!apiKey) {
    throw new Error('Missing OpenAI API key')
  }

  const openai = createOpenAI({ apiKey })

  return defineEventHandler(async (event) => {
    const { prompt } = await readBody(event)

    // openai.completion() targets the legacy completions endpoint,
    // which gpt-3.5-turbo-instruct requires
    const result = streamText({
      model: openai.completion('gpt-3.5-turbo-instruct'),
      prompt,
    })

    return result.toDataStreamResponse()
  })
})
Using Different AI Providers
Nuxt AI supports multiple AI providers through the Vercel AI SDK. Here's how to stream responses from Anthropic's Claude, using the @ai-sdk/anthropic provider package:
server/api/claude-chat.ts
import { createAnthropic } from '@ai-sdk/anthropic'
import { streamText } from 'ai'

export default defineLazyEventHandler(async () => {
  // Add anthropicApiKey to runtimeConfig, as with the OpenAI key above
  const apiKey = useRuntimeConfig().anthropicApiKey
  if (!apiKey) {
    throw new Error('Missing Anthropic API key')
  }

  const anthropic = createAnthropic({ apiKey })

  return defineEventHandler(async (event) => {
    const { messages } = await readBody(event)

    const result = streamText({
      model: anthropic('claude-3-opus-20240229'),
      maxTokens: 1000,
      messages,
    })

    return result.toDataStreamResponse()
  })
})
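On the client, nothing else changes: point the composable at the new route and the chat component from earlier works against Claude.

const { messages, input, handleSubmit } = useChat({
  api: '/api/claude-chat',
});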
Using MCP Tools
If you've enabled the MCP server in your configuration, you can access its tools in development:
- Navigate to your Nuxt app in development mode (usually at http://localhost:3000)
- Open the built-in MCP tools at the configured path (default is /mcp)
- Use the various development tools provided by the MCP server
MCP tools can help with:
- Generating and managing AI rules
- Documenting your application
- Testing AI integrations
- And more, depending on which MCP servers you've enabled
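Enabling the MCP server happens in nuxt.config.ts. The module name and option names below are hypothetical, so check the module's configuration reference for the real keys:

nuxt.config.ts
export default defineNuxtConfig({
  modules: ['nuxt-ai'],
  ai: {
    // Hypothetical options shown for illustration only
    mcp: {
      enabled: true,
      path: '/mcp',
    },
  },
})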