Robots.txt

Control crawler access from one place

Define crawler rules and manage your robots.txt directly from the dashboard. Control which pages search engines can crawl without touching your codebase.

How it works

Simple, visual management

Step 1
Add directives

Add User-agent, Allow, Disallow, or Sitemap lines through the dashboard form.
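
Conceptually, each line in the form maps to one directive record. A minimal sketch in TypeScript (the type and field names are illustrative assumptions, not the product's actual schema):

// One row in the dashboard form; position determines its place in the file.
// Hypothetical shape, not the product's actual schema.
type RobotsDirective = {
  field: 'User-agent' | 'Allow' | 'Disallow' | 'Sitemap';
  value: string;    // e.g. '*', '/admin', or a sitemap URL
  position: number; // drag-to-reorder updates this
};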

Step 2
Reorder & preview

Drag to reorder directives. Preview the final robots.txt before saving.
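
The preview is just the saved directives serialized in order. A hedged sketch of that serialization, reusing the hypothetical RobotsDirective type from Step 1:

// Render directives into robots.txt text, in dashboard order.
// Illustrative only; the product's actual renderer may differ.
function renderRobotsTxt(directives: RobotsDirective[]): string {
  return directives
    .slice() // don't mutate the caller's array
    .sort((a, b) => a.position - b.position)
    .map((d) => `${d.field}: ${d.value}`)
    .join('\n');
}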

Step 3
Save & go live

Save changes. Your robots.txt is served dynamically at /robots.txt — no deploy needed.

Step 4
Sitemaps auto-added

Sitemap references are automatically appended so crawlers discover your sitemaps.
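
Steps 3 and 4 together amount to a dynamic route: the saved directives are rendered and sitemap references appended on each request, which is why no deploy or rebuild is needed. A sketch of that idea as a plain Node HTTP server, building on the hypothetical helpers above (the stubbed loaders stand in for the dashboard's saved state):

import { createServer } from 'node:http';

// Stubbed lookups standing in for the dashboard's saved state.
async function loadDirectives(): Promise<RobotsDirective[]> {
  return [
    { field: 'User-agent', value: '*', position: 0 },
    { field: 'Allow', value: '/', position: 1 },
    { field: 'Disallow', value: '/admin', position: 2 },
  ];
}
async function loadSitemapUrls(): Promise<string[]> {
  return ['https://yoursite.com/sitemaps/seomanager-sitemap-0'];
}

createServer(async (req, res) => {
  if (req.url === '/robots.txt') {
    const body = renderRobotsTxt(await loadDirectives());
    // Step 4: sitemap references are appended automatically.
    const sitemaps = (await loadSitemapUrls())
      .map((url) => `Sitemap: ${url}`)
      .join('\n');
    res.writeHead(200, { 'Content-Type': 'text/plain' });
    // State is read per request, so saved changes go live instantly.
    res.end(sitemaps ? `${body}\n\n${sitemaps}\n` : `${body}\n`);
    return;
  }
  res.writeHead(404).end();
}).listen(3000);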

In the dashboard

Manage directives visually

[Screenshot: Robots.txt dashboard]
Key capabilities

Everything you need

User-agent, Allow, Disallow, and Sitemap directives

Add, reorder, and remove directives from the dashboard

Live preview before publishing

Automatic sitemap references appended to robots.txt

Records request User-Agents to distinguish bots from humans, so you can see which bots visit and how often (see the sketch after this list)

Changes go live instantly — no rebuilds required
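
The bot-vs-human distinction above typically comes down to matching each request's User-Agent header against known crawler signatures and tallying the hits. A minimal sketch (the pattern list and in-memory counter are illustrative assumptions, not the product's detection logic):

// Known-crawler substrings; illustrative and far from exhaustive.
const BOT_PATTERNS = [/Googlebot/i, /Bingbot/i, /DuckDuckBot/i, /GPTBot/i];

// Per-bot visit counts, keyed by User-Agent string.
const botHits = new Map<string, number>();

// Record one request's User-Agent; bots are tallied, humans ignored.
function recordUserAgent(userAgent: string): void {
  if (BOT_PATTERNS.some((p) => p.test(userAgent))) {
    botHits.set(userAgent, (botHits.get(userAgent) ?? 0) + 1);
  }
}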

Example output

Your robots.txt, served dynamically

User-agent: *
Allow: /
Disallow: /admin
Disallow: /api

Sitemap: https://yoursite.com/sitemaps/seomanager-sitemap-0
Explore

Other features

Sitemap Management

Auto-generate XML sitemaps as your pages change.