How It Works
Everything you need to know about BotCheck scans, scoring and AI crawler visibility.
What does BotCheck actually check?
BotCheck runs 7 checks against your URL:
1. robots.txt rules for 30+ AI crawlers
2. Meta directives such as noai and noindex
3. HTTP headers such as X-Robots-Tag
4. AI discovery files such as llms.txt and ai.txt
5. Response speed
6. Paywall and login-wall detection
7. HTML content structure quality
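As an illustration of the first check, here is a minimal sketch using Python's standard library robots.txt parser. GPTBot and Google-Extended are real crawler user-agent tokens, but the rules shown are an invented example, not any particular site's policy:

```python
from urllib.robotparser import RobotFileParser

# Hypothetical robots.txt content for illustration only.
ROBOTS_TXT = """\
User-agent: GPTBot
Disallow: /

User-agent: Google-Extended
Disallow: /private/

User-agent: *
Allow: /
"""

def is_allowed(user_agent: str, path: str) -> bool:
    """Check whether `user_agent` may fetch `path` under ROBOTS_TXT."""
    parser = RobotFileParser()
    parser.parse(ROBOTS_TXT.splitlines())
    return parser.can_fetch(user_agent, path)

print(is_allowed("GPTBot", "/"))              # GPTBot is fully disallowed
print(is_allowed("Google-Extended", "/about"))  # allowed outside /private/
```

A scanner like BotCheck applies this kind of test once per known crawler token to build the robots.txt portion of the report.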
How is the score calculated?
Each check receives a score from 0 to 100, and the scores are combined as a weighted average. The weights depend on your selected mode: in "Be Found" mode, robots.txt counts for 30% and response speed for 15%; in "Block" mode, robots.txt counts for 35% and meta directives for 20%. All weights are shown in your results.
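The weighted average works as in this short sketch. Only the robots.txt, speed, and meta weights above come from this page; the remaining weight and the check names are illustrative assumptions:

```python
# Mode-specific weights; "other" stands in for the remaining checks.
BE_FOUND_WEIGHTS = {"robots_txt": 0.30, "speed": 0.15, "other": 0.55}
BLOCK_WEIGHTS = {"robots_txt": 0.35, "meta": 0.20, "other": 0.45}

def overall_score(check_scores: dict, weights: dict) -> float:
    """Combine 0-100 per-check scores into one weighted 0-100 score."""
    return sum(check_scores[name] * w for name, w in weights.items())

scores = {"robots_txt": 100, "speed": 40, "other": 70}
print(round(overall_score(scores, BE_FOUND_WEIGHTS), 1))  # → 74.5
```

Because the weights sum to 1.0, a perfect 100 on every check always yields an overall score of 100.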
Is robots.txt enough to block AI?
Not on its own. robots.txt is advisory: reputable crawlers from OpenAI, Anthropic, and Google honor it, but compliance is voluntary, not enforced. For stronger protection, combine robots.txt with meta directives, X-Robots-Tag headers, and access controls. BotCheck checks all of these.
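Layering those signals might look like the fragment below. Note that noindex is a widely supported directive, while noai is an emerging convention that not all crawlers recognize; the rule shown is a sketch, not a recommended policy:

```
# robots.txt (advisory)
User-agent: GPTBot
Disallow: /

<!-- per-page meta directive in the HTML <head> -->
<meta name="robots" content="noai, noindex">

# HTTP response header (set by the server or CDN)
X-Robots-Tag: noai, noindex
```

Each layer reaches a different class of client, which is why a scan that checks all three gives a fuller picture than robots.txt alone.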
What are AI discovery files?
Emerging standards like llms.txt, ai.txt and ai-plugin.json help AI systems understand your site. llms.txt provides an LLM-friendly description of your content. These are optional but increasingly useful for AI visibility.
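A minimal llms.txt, following the proposed format (an H1 title, a blockquote summary, and sections of annotated links), might look like this; the site name and URLs are placeholders:

```
# Example Site
> A short, LLM-friendly summary of what this site offers and who it is for.

## Docs
- [Getting started](https://example.com/docs/start): Setup and first steps
- [API reference](https://example.com/docs/api): Endpoints and parameters
```

The file lives at the site root (/llms.txt), alongside robots.txt.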
Does BotCheck store my URL or data?
No. BotCheck is completely stateless. There is no database, no user accounts and no logging. Your scan history is stored only in your browser and never leaves your device.
Why does my score change between modes?
The same configuration means different things depending on your goal: blocking a major crawler, for example, is ideal in "Block" mode but harmful in "Be Found" mode. The weights also shift between modes, because response speed matters more for visibility than for blocking.
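The inversion can be sketched as follows. The hard 0/100 mapping is an assumption for illustration; BotCheck's actual per-check scoring may be more granular:

```python
# Illustrative only: one observation scored oppositely depending on goal.
def robots_check_score(blocks_major_crawlers: bool, mode: str) -> int:
    """Score the robots.txt check for the chosen goal ("block" or "be_found")."""
    if mode == "block":
        return 100 if blocks_major_crawlers else 0
    return 0 if blocks_major_crawlers else 100

print(robots_check_score(True, "block"))     # → 100
print(robots_check_score(True, "be_found"))  # → 0
```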
No login. No tracking. Free to use.