# Robots.txt

Set up nextjs-seo-manager to serve a dynamic robots.txt managed from your dashboard.
## Pages Router

- Create `pages/robots.txt.js`.
- Export the `RobotsHelper` component:
```js
// pages/robots.txt.js
import SEOInit from "nextjs-seo-manager/init";
import RobotsHelper from "nextjs-seo-manager/robotshelper";

SEOInit({
  projectId: process.env.SEO_MANAGER_PROJECT_ID,
  secretKey: process.env.SEO_MANAGER_SECRET_KEY,
});

export default RobotsHelper;
```
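Both routers read the project credentials from environment variables. A typical `.env.local` looks like this (placeholder values; use the credentials from your dashboard):

```shell
# .env.local — never commit real credentials to version control
SEO_MANAGER_PROJECT_ID=your-project-id
SEO_MANAGER_SECRET_KEY=your-secret-key
```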
## App Router

For the App Router, use the helper functions from `nextjs-seo-manager/robotsbasichelper` to create a route handler.

- Create `app/robots.txt/route.ts`.
- Export a `GET` handler:
```ts
// app/robots.txt/route.ts
import { NextRequest, NextResponse } from "next/server";
import SEOInit from "nextjs-seo-manager/init";
import {
  generateRobotsTxt,
  createFallbackRobotsTxt,
  extractHeadersFromRequest,
} from "nextjs-seo-manager/robotsbasichelper";

SEOInit({
  projectId: process.env.SEO_MANAGER_PROJECT_ID,
  secretKey: process.env.SEO_MANAGER_SECRET_KEY,
});

export async function GET(request: NextRequest) {
  try {
    const headers = extractHeadersFromRequest(request);
    const robotsTxt = await generateRobotsTxt(headers);
    return new NextResponse(robotsTxt, {
      headers: { "Content-Type": "text/plain" },
    });
  } catch (error) {
    // If the API call fails, serve a basic fallback instead of an error page
    const origin = new URL(request.url).origin;
    const fallback = createFallbackRobotsTxt(origin);
    return new NextResponse(fallback, {
      headers: { "Content-Type": "text/plain" },
    });
  }
}
```
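The handler above forwards only a small allowlist of request headers to the API. As an illustration of that idea, here is a hypothetical sketch of such an extractor (the real `extractHeadersFromRequest` ships with the library; `sketchExtractHeaders` and its behavior are assumptions for illustration only):

```typescript
// Hypothetical sketch — NOT the library's implementation. It illustrates
// forwarding an allowlist of request headers and dropping everything else.
const FORWARDED_HEADERS = [
  "user-agent",
  "accept",
  "accept-language",
  "cache-control",
];

function sketchExtractHeaders(request: Request): Record<string, string> {
  const out: Record<string, string> = {};
  for (const name of FORWARDED_HEADERS) {
    const value = request.headers.get(name); // null when the header is absent
    if (value !== null) out[name] = value;
  }
  return out;
}
```

Forwarding only a fixed allowlist keeps incidental headers (cookies, authorization) out of the upstream API call.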
## Helper Functions

- `generateRobotsTxt(headers)` — calls the SEO Manager API and returns the robots.txt content as a string.
- `createFallbackRobotsTxt(origin)` — returns a basic robots.txt with a sitemap reference, useful as an error fallback.
- `extractHeadersFromRequest(request)` — extracts `user-agent`, `accept`, `accept-language`, and `cache-control` from the Next.js request.
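For a sense of the shape of the fallback output, here is a hypothetical sketch of a robots.txt with a sitemap reference (the actual content returned by `createFallbackRobotsTxt` is defined by the library; `sketchFallbackRobotsTxt` is an assumption for illustration only):

```typescript
// Hypothetical sketch — NOT the library's implementation. It shows what a
// minimal robots.txt with a sitemap reference typically looks like.
function sketchFallbackRobotsTxt(origin: string): string {
  return [
    "User-agent: *", // apply to all crawlers
    "Allow: /",
    "",
    `Sitemap: ${origin}/sitemap.xml`, // sitemap reference, as noted above
  ].join("\n");
}
```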
To edit your robots.txt rules, visit your dashboard.