AI Visibility Guide

Make your site visible to LLMs with llms.txt, robots-ai.txt and markdown conversion. Three tools to optimize your content for AI consumption.

What is AI Visibility?

AI Visibility means optimizing your website so that Large Language Models can efficiently discover, access, and understand your content. dat2ai provides three complementary tools that work together to make your site fully visible to AI.

llms.txt

Tell AI models what content matters most on your site

robots-ai.txt

Control which AI crawlers can access your content

Markdown

Serve pages as clean markdown with ~80% fewer tokens

llms.txt

What is llms.txt?

llms.txt is a standard from llmstxt.org that provides a structured file telling AI models about your site's important content. Think of it as a sitemap specifically designed for LLMs -- it helps AI models quickly understand what your site offers and where to find key pages.

Editing Modes

  • Structured Editor: Build your llms.txt using a form-based interface. Define a title and description, and organize content into sections with titled links and descriptions.
  • Raw Mode: Paste or write your own llms.txt content directly, for full control over the output format.

Where is it served?

WordPress: yoursite.com/llms.txt

Script tag / other: dat2ai.com/api/llms-txt/{siteKey}

Example Output

# My Website

> A brief description of your website for AI models.

## Main Content
- [Home](https://example.com): The homepage
- [About](https://example.com/about): About the company

## Documentation
- [Getting Started](https://example.com/docs): Quick start guide
- [API Reference](https://example.com/docs/api): REST API docs

robots-ai.txt

What is robots-ai.txt?

robots-ai.txt lets you control which AI crawlers can access your site. It works like robots.txt but is specifically designed for AI bots. You can allow or block individual crawlers with a simple toggle interface.

Default Crawlers

dat2ai includes 8 pre-configured AI crawlers with per-crawler allow/block toggles:

  • GPTBot (OpenAI)
  • Google-Extended (Google Gemini)
  • ClaudeBot (Anthropic)
  • Bytespider (ByteDance)
  • CCBot (Common Crawl)
  • PerplexityBot (Perplexity)
  • FacebookBot (Meta AI)
  • Applebot-Extended (Apple)
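
Example Output

The generated file uses standard robots.txt syntax, with one User-agent block per crawler. As a hypothetical illustration (the exact directives dat2ai emits may differ), allowing GPTBot while blocking Bytespider would yield something like:

```
User-agent: GPTBot
Allow: /

User-agent: Bytespider
Disallow: /
```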

Custom Rules

For advanced use cases, add custom robots.txt directives for crawlers not in the default list. Custom rules are appended to the generated output.
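
The toggle-and-append behavior can be sketched in a few lines of Python. This is an illustration of the documented behavior, not dat2ai's actual code; the crawler names come from the default list above.

```python
# Sketch: render one User-agent block per crawler toggle, then append
# any custom robots.txt directives verbatim at the end.

def build_robots_ai(toggles: dict[str, bool], custom_rules: str = "") -> str:
    """Allow: / when a crawler's toggle is on, Disallow: / when it is off."""
    blocks = []
    for crawler, allowed in toggles.items():
        rule = "Allow: /" if allowed else "Disallow: /"
        blocks.append(f"User-agent: {crawler}\n{rule}")
    output = "\n\n".join(blocks)
    if custom_rules:
        # Custom rules are appended after the generated blocks.
        output += "\n\n" + custom_rules.strip()
    return output

toggles = {"GPTBot": True, "Bytespider": False}
print(build_robots_ai(toggles, "User-agent: MyBot\nDisallow: /private/"))
```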

Where is it served?

WordPress: yoursite.com/robots-ai.txt (also automatically appended to your standard robots.txt)

Script tag / other: dat2ai.com/api/robots-ai/{siteKey}

Markdown Conversion

What does it do?

Markdown conversion takes any page on your site and converts it to clean markdown. LLMs process markdown far more efficiently than raw HTML -- typically using ~80% fewer tokens. This means faster responses, lower costs, and better comprehension by AI models.
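
To see where the token savings come from, compare the same content in both forms. Below is a deliberately tiny converter handling only a few tags, purely for illustration -- it is not the converter dat2ai uses.

```python
# Toy HTML-to-markdown converter: just enough tags (h1, p, a, li) to show
# how much markup weight disappears in the conversion.
from html.parser import HTMLParser

class TinyMarkdown(HTMLParser):
    """Convert a small subset of HTML to markdown."""
    def __init__(self):
        super().__init__()
        self.out = []
        self.href = ""

    def handle_starttag(self, tag, attrs):
        if tag == "h1":
            self.out.append("# ")
        elif tag == "li":
            self.out.append("- ")
        elif tag == "a":
            self.href = dict(attrs).get("href", "")
            self.out.append("[")

    def handle_endtag(self, tag):
        if tag == "a":
            self.out.append(f"]({self.href})")
        elif tag in ("h1", "p", "li"):
            self.out.append("\n")

    def handle_data(self, data):
        self.out.append(data)

    def to_markdown(self) -> str:
        return "".join(self.out)

sample = '<h1>Docs</h1><p>See <a href="/start">Getting Started</a>.</p>'
converter = TinyMarkdown()
converter.feed(sample)
print(converter.to_markdown())  # markedly shorter than the source HTML
```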

Configuration

  • Allowed Paths: Specify which URL paths can be converted (e.g., /blog/*, /docs/*). Leave empty to allow all paths.
  • Excluded Selectors: CSS selectors for elements to strip before conversion (e.g., .sidebar, .comments, .advertisement).
  • Metadata Toggle: Optionally include YAML frontmatter with the page title, source URL, and conversion date.
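
The allowed-paths check can be pictured as simple glob matching. A minimal sketch, assuming shell-style patterns as in the examples above (dat2ai's actual matching rules may differ):

```python
# Sketch: decide whether a request path falls under the configured
# "Allowed Paths" patterns, e.g. /blog/* or /docs/*.
from fnmatch import fnmatch

def path_allowed(path: str, allowed_patterns: list[str]) -> bool:
    """An empty pattern list means every path may be converted."""
    if not allowed_patterns:
        return True
    return any(fnmatch(path, pattern) for pattern in allowed_patterns)

print(path_allowed("/blog/hello-world", ["/blog/*", "/docs/*"]))  # True
print(path_allowed("/pricing", ["/blog/*", "/docs/*"]))           # False
```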

API Endpoint

API: dat2ai.com/api/markdown/{siteKey}/{path}

WordPress: Add ?format=md to any page URL on your WordPress site.
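
With the metadata toggle enabled, a converted page begins with YAML frontmatter. A hypothetical result for the Getting Started page might look like this (the field names and date here are illustrative; the real keys may differ):

```markdown
---
title: Getting Started
source: https://example.com/docs
converted: 2024-01-15
---

# Getting Started

Quick start guide content, converted to markdown...
```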

Rate Limits & Caching

Markdown conversion is rate limited to 10 requests per minute per site to prevent abuse. Converted pages are cached for 5 minutes to reduce server load and speed up repeated requests.
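
Both protections are easy to reason about. A minimal sketch, assuming a sliding 60-second window and an in-memory cache keyed by site and path -- this only mirrors the documented numbers, not dat2ai's actual implementation:

```python
# Sketch of the two server-side protections: a 10-requests-per-minute
# limiter and a 5-minute (300 s) result cache, both tracked per site.
RATE_LIMIT = 10   # requests per window
WINDOW = 60       # seconds
CACHE_TTL = 300   # 5 minutes

_requests: dict[str, list[float]] = {}  # site_key -> request timestamps
_cache: dict[tuple, tuple] = {}         # (site_key, path) -> (expires_at, markdown)

def allow_request(site_key: str, now: float) -> bool:
    """Sliding-window limiter: count only requests from the last 60 s."""
    window = [t for t in _requests.get(site_key, []) if now - t < WINDOW]
    if len(window) >= RATE_LIMIT:
        _requests[site_key] = window
        return False
    window.append(now)
    _requests[site_key] = window
    return True

def cached_convert(site_key: str, path: str, convert, now: float) -> str:
    """Serve from cache while fresh; otherwise convert and store for 5 min."""
    key = (site_key, path)
    entry = _cache.get(key)
    if entry and entry[0] > now:
        return entry[1]
    markdown = convert(path)
    _cache[key] = (now + CACHE_TTL, markdown)
    return markdown
```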

WordPress vs Script Tag

Feature          WordPress                                              Script Tag
llms.txt         Served natively at yoursite.com/llms.txt               Served via dat2ai.com/api/llms-txt/{siteKey}
robots-ai.txt    Served natively + appended to robots.txt               Served via dat2ai.com/api/robots-ai/{siteKey}
Markdown         Add ?format=md to any page URL                         Use API endpoint with path
Configuration    Plugin settings + dashboard sync                       Dashboard only