# AI & LLMs

Integrate AI functionality into Fumadocs.
## Docs for LLM

You can make your docs site more AI-friendly with dedicated docs content for large language models.

To begin, create a `getLLMText` function that converts pages into static MDX content. Here is an example for Fumadocs MDX:
```ts
// lib/get-llm-text.ts
import { source } from '@/lib/source';
import type { InferPageType } from 'fumadocs-core/source';

export async function getLLMText(page: InferPageType<typeof source>) {
  const processed = await page.data.getText('processed');

  return `# ${page.data.title}
URL: ${page.url}

${processed}`;
}
```
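To see the shape of the output, here is a self-contained sketch that uses a hypothetical page object in place of a real Fumadocs page (a real one comes from `source.getPages()`); it only illustrates the template above:

```ts
// Hypothetical stand-in for a Fumadocs page, for illustration only.
type FakePage = {
  url: string;
  data: {
    title: string;
    getText: (variant: 'processed') => Promise<string>;
  };
};

async function getLLMText(page: FakePage): Promise<string> {
  const processed = await page.data.getText('processed');
  return `# ${page.data.title}\nURL: ${page.url}\n\n${processed}`;
}

const page: FakePage = {
  url: '/docs/getting-started',
  data: {
    title: 'Getting Started',
    getText: async () => 'Welcome to the docs.',
  },
};

// Prints a small Markdown document: title heading, URL line, then the content.
getLLMText(page).then((text) => console.log(text));
```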
The `getText('processed')` call requires `includeProcessedMarkdown` to be enabled in your Fumadocs MDX configuration:
```ts
// source.config.ts
import { defineDocs } from 'fumadocs-mdx/config';

export const docs = defineDocs({
  docs: {
    postprocess: {
      includeProcessedMarkdown: true,
    },
  },
});
```
## llms-full.txt

A single-file version of your docs for AIs to read.
```ts
// app/llms-full.txt/route.ts
import { source } from '@/lib/source';
import { getLLMText } from '@/lib/get-llm-text';

// cached forever
export const revalidate = false;

export async function GET() {
  const scan = source.getPages().map(getLLMText);
  const scanned = await Promise.all(scan);

  return new Response(scanned.join('\n\n'));
}
```
## *.mdx

Allow people to append `.mdx` to a page URL to get its Markdown/MDX content.

On Next.js, you can create a route handler that returns the page content, and redirect users to that route:
```ts
import { type NextRequest, NextResponse } from 'next/server';
import { getLLMText } from '@/lib/get-llm-text';
import { source } from '@/lib/source';
import { notFound } from 'next/navigation';

export const revalidate = false;

export async function GET(
  _req: NextRequest,
  { params }: { params: Promise<{ slug?: string[] }> },
) {
  const { slug } = await params;
  const page = source.getPage(slug);
  if (!page) notFound();

  return new NextResponse(await getLLMText(page));
}

export function generateStaticParams() {
  return source.generateParams();
}
```
## Page Actions

Common page actions for AI. These require the `*.mdx` route above to be implemented first. Install them with the Fumadocs CLI:
```bash
npx @fumadocs/cli add ai/page-actions
```
Use them in your docs page like:
```tsx
<div className="flex flex-row gap-2 items-center border-b pt-2 pb-6">
  <LLMCopyButton markdownUrl={`${page.url}.mdx`} />
  <ViewOptions
    markdownUrl={`${page.url}.mdx`}
    githubUrl={`https://github.com/${owner}/${repo}/blob/dev/apps/docs/content/docs/${page.path}`}
  />
</div>
```
## Ask AI

You can install the AI search dialog using the Fumadocs CLI:

```bash
npx @fumadocs/cli add ai/search
```
You can add the trigger component to your root layout.
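For example, assuming the CLI generated the dialog under `components/ai` and that it exports a trigger component (check the generated files for the actual file and export names), the wiring might look like:

```tsx
// app/layout.tsx — hypothetical wiring; adjust the import to what the CLI generated.
import { AISearchTrigger } from '@/components/ai/search';

export default function RootLayout({ children }: { children: React.ReactNode }) {
  return (
    <html lang="en">
      <body>
        {children}
        {/* Renders the "Ask AI" trigger that opens the search dialog */}
        <AISearchTrigger />
      </body>
    </html>
  );
}
```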
### AI Model

By default, it's configured for Inkeep AI using the Vercel AI SDK. Update the configuration in the `useChat` hook and the `/api/chat` route to connect to your own AI model instead.
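As a sketch of what connecting your own model could look like (assuming the Vercel AI SDK with the OpenAI provider; the model name and the exact response helper depend on your SDK version, and the handler the CLI generated may differ):

```ts
// app/api/chat/route.ts — hypothetical replacement handler.
import { streamText } from 'ai';
import { openai } from '@ai-sdk/openai';

export async function POST(req: Request) {
  const { messages } = await req.json();

  // Stream a completion from your own model instead of Inkeep.
  const result = streamText({
    model: openai('gpt-4o-mini'),
    system: 'You answer questions about this documentation site.',
    messages,
  });

  return result.toDataStreamResponse();
}
```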
Note that Fumadocs doesn't provide the AI model; that part is up to you. Your AI model can use the `llms-full.txt` file generated above, or more diversified sources of information when combined with third-party solutions.
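For instance, one simple approach is to inline the generated `llms-full.txt` content into the system prompt (a hypothetical helper, not a Fumadocs API; large sites would likely use retrieval instead of inlining everything):

```ts
// Hypothetical helper: wrap the llms-full.txt content as a system prompt.
function buildSystemPrompt(docs: string): string {
  return `Answer using only the documentation below.\n\n${docs}`;
}

// In practice, you could load the file from your deployed site, e.g.:
//   const docs = await fetch(new URL('/llms-full.txt', siteUrl)).then((r) => r.text());
```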