robots.txt Generator
Build a robots.txt file with crawler rules, sitemaps, and crawl-delay directives.
Quick Presets
Block all crawlers:
User-agent: *
Disallow: /
How to Use the robots.txt Generator
1. Set crawl rules: choose which pages or directories to allow or disallow.
2. Add user agents: specify rules for Googlebot, Bingbot, or all crawlers.
3. Add sitemap URL: include your sitemap location in the robots.txt file.
4. Copy the file: click Copy to grab the robots.txt content for your server.
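After copying the file to your server, you can sanity-check it with Python's standard-library `urllib.robotparser`. The sketch below uses a hypothetical robots.txt (the domain, paths, and crawl-delay value are made up for illustration) and confirms that crawlers would interpret the rules as intended.

```python
from urllib.robotparser import RobotFileParser

# Hypothetical output from the generator: one rule set for all
# crawlers, a crawl-delay directive, and a sitemap location.
robots_txt = """\
User-agent: *
Disallow: /admin/
Crawl-delay: 10

Sitemap: https://example.com/sitemap.xml
"""

parser = RobotFileParser()
parser.parse(robots_txt.splitlines())

# /admin/ is disallowed, everything else is allowed.
print(parser.can_fetch("*", "https://example.com/admin/login"))  # False
print(parser.can_fetch("*", "https://example.com/blog/post"))    # True
print(parser.crawl_delay("*"))                                   # 10
```

Note that `Crawl-delay` is honored by some crawlers (e.g. Bingbot) but ignored by Googlebot, so treat it as a hint rather than a guarantee.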
Related Tools
File Size Calculator & Converter
Convert between bytes, KB, MB, GB, and TB with precise binary and decimal calculations.
Image EXIF / Metadata Viewer
View and inspect EXIF data, GPS coordinates, camera settings, and metadata from images.
CSV Viewer & Editor
Open, view, and edit CSV files in a spreadsheet-style table with sorting and filtering.
JSON Tree Viewer
Visualize JSON data as a collapsible tree with syntax highlighting and search.
Barcode Generator
Generate Code 128, EAN, UPC, and other barcode formats with customizable size and labels.
Data URL / Base64 Image Converter
Convert images to Base64 data URLs and decode Base64 strings back to images.