# robots-builder

Zero-dependency TypeScript library and CLI for generating, validating, and parsing robots.txt files programmatically.
## The Problem It Solves

Every website needs a robots.txt file at its root. It's the standard protocol that tells search engine crawlers — Googlebot, Bingbot, GPTBot, and hundreds of others — which parts of your site they're allowed to visit. Writing it by hand is riskier than it looks:

- **High risk.** Block too much and Google stops indexing your site; block too little and AI scrapers take everything.
- **No feedback.** Hand-written files get no linting, no validation, and no immediate error feedback.
- **Complex rules.** The format has non-obvious rules, such as required empty `Disallow:` lines and specific directive ordering.
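For example, the conventional "allow everyone" record is counterintuitive: per the robots.txt standard it is written as a `Disallow` directive with an empty value, not as an `Allow` line:

```txt
User-agent: *
Disallow:
```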
## Installation

```shell
npm install robots-builder
```

## Quick Start

Get up and running in seconds with full TypeScript support.
```typescript
import { build } from 'robots-builder';

const txt = build({
  rules: [
    {
      userAgent: '*',
      allow: ['/'],
      disallow: ['/admin', '/api', '/private'],
      crawlDelay: 5,
    },
    {
      userAgent: 'GPTBot',
      disallow: ['/'],
    },
  ],
  sitemaps: ['https://example.com/sitemap.xml'],
  host: 'https://example.com',
});

console.log(txt);
```

## API Reference
### build(config)
Generates a robots.txt string. Throws a `RobotsBuilderError` if the config is invalid.
```typescript
import fs from 'node:fs';
import { build, RobotsBuilderError } from 'robots-builder';

try {
  const txt = build(config);
  fs.writeFileSync('public/robots.txt', txt);
} catch (err) {
  if (err instanceof RobotsBuilderError) {
    console.error(err.errors); // ValidationError[]
  }
}
```

### validate(config)
Returns `{ valid: boolean, errors: ValidationError[] }`. Use it to check a config before building.
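The library's actual validation rules aren't listed here, so as an illustration only, here is a self-contained sketch of the *kind* of check a validator with this return shape might perform (field names `userAgent`, `disallow`, and `crawlDelay` follow the Quick Start config; the specific rules are assumptions, not the library's):

```typescript
// Illustrative sketch of robots.txt config validation.
// NOT the library's implementation; the rules below are examples.
interface ValidationError {
  field: string;
  message: string;
}

function checkRule(rule: {
  userAgent?: string;
  disallow?: string[];
  crawlDelay?: number;
}): ValidationError[] {
  const errors: ValidationError[] = [];

  // Every rule group needs a user agent to apply to.
  if (!rule.userAgent) {
    errors.push({ field: 'userAgent', message: 'userAgent is required' });
  }

  // Path patterns should be root-relative.
  for (const path of rule.disallow ?? []) {
    if (path && !path.startsWith('/')) {
      errors.push({
        field: 'disallow',
        message: `path "${path}" must start with "/"`,
      });
    }
  }

  // A negative crawl delay is meaningless.
  if (rule.crawlDelay !== undefined && rule.crawlDelay < 0) {
    errors.push({ field: 'crawlDelay', message: 'crawlDelay must be non-negative' });
  }

  return errors;
}
```

Collecting errors into an array rather than throwing on the first problem is what lets a `validate()`-style API report every issue at once.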
### parse(content)
Parses an existing robots.txt string back into a typed config object.
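To make "parsing back into a typed config object" concrete, here is a minimal, self-contained parser of roughly that shape — a sketch of the technique, not the library's implementation (the `Rule` and `RobotsConfig` types are assumed from the Quick Start config, and edge cases like shared user-agent groups are skipped):

```typescript
// Minimal robots.txt parser sketch: field-by-field line scanning
// into a typed config. NOT the library's actual parser.
interface Rule {
  userAgent: string;
  allow: string[];
  disallow: string[];
}

interface RobotsConfig {
  rules: Rule[];
  sitemaps: string[];
}

function parseRobots(content: string): RobotsConfig {
  const config: RobotsConfig = { rules: [], sitemaps: [] };
  let current: Rule | null = null;

  for (const rawLine of content.split('\n')) {
    // Strip comments (everything after '#') and surrounding whitespace.
    const line = rawLine.split('#')[0].trim();
    if (!line) continue;

    // Each directive is "Field: value"; field names are case-insensitive.
    const idx = line.indexOf(':');
    if (idx === -1) continue;
    const field = line.slice(0, idx).trim().toLowerCase();
    const value = line.slice(idx + 1).trim();

    switch (field) {
      case 'user-agent':
        current = { userAgent: value, allow: [], disallow: [] };
        config.rules.push(current);
        break;
      case 'allow':
        current?.allow.push(value);
        break;
      case 'disallow':
        if (value) current?.disallow.push(value); // empty Disallow: = allow all
        break;
      case 'sitemap':
        config.sitemaps.push(value); // sitemap lines are group-independent
        break;
    }
  }
  return config;
}
```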
## Ready-made Presets

- `allowAll`: allows all crawlers with no restrictions.
- `blockAll`: blocks all crawlers from the entire site.
- `standard`: allows Google/Bing, blocks AI crawlers and internal paths.
- `blockAiCrawlers`: shields your content from GPTBot, CCBot, etc.
- `nextjs`: optimized for Next.js internal routes.
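The framework examples below spread a preset into a config, which suggests each preset is a plain config object you can extend before building. A hypothetical sketch of that pattern — the actual contents of the library's presets may differ:

```typescript
// Hypothetical shape of a preset: a plain config object.
// The real preset contents are not documented here.
const blockAiCrawlers = {
  rules: [
    { userAgent: 'GPTBot', disallow: ['/'] },
    { userAgent: 'CCBot', disallow: ['/'] },
  ],
};

// Spread the preset, then extend it: keep its AI-blocking rules,
// add a catch-all rule and a sitemap of your own.
const config = {
  ...blockAiCrawlers,
  rules: [
    ...blockAiCrawlers.rules,
    { userAgent: '*', allow: ['/'] },
  ],
  sitemaps: ['https://example.com/sitemap.xml'],
};
```

Because the spread is shallow, overriding `rules` replaces the array entirely; re-spreading `blockAiCrawlers.rules` is what keeps the preset's entries.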
## Framework Integration

### Next.js (App Router)
```typescript
import { build, presets } from 'robots-builder';

export default function robots() {
  return build({
    ...presets.standard,
    sitemaps: [`${process.env.URL}/sitemap.xml`],
  });
}
```

### Astro
```typescript
import { build, presets } from 'robots-builder';

export const GET = () =>
  new Response(build(presets.standard), {
    headers: { 'Content-Type': 'text/plain' },
  });
```

## Powerful CLI
```shell
# Generate robots.txt with a preset
robots-builder --preset standard > public/robots.txt

# Validate an existing file
robots-builder --validate public/robots.txt

# Parse an existing file into JSON
robots-builder --parse public/robots.txt > config.json

# Use a custom config file
robots-builder --config robots.config.js --comment
```