
How to Set Up llms.txt for Your Website

llms.txt is a proposed standard that gives AI models a structured summary of your website so they do not have to crawl every page to understand it. This guide covers everything you need to set up the file correctly: the format, what to include, what to leave out, and how it connects to AI visibility. Creating and deploying the file takes under an hour, and it is one of the clearest signals you can send to language models about who you are and what you do.

By PromptEden Team

What Is llms.txt and Why It Exists

llms.txt is a plain-text file you place at the root of your domain, at yourdomain.com/llms.txt. It gives AI language models a structured, human-readable summary of what your website is about, what your key pages are, and how you want your content used.

The format was proposed by Jeremy Howard, co-founder of fast.ai, in late 2024. The idea drew on the logic of robots.txt, which tells crawlers which pages to avoid, and sitemap.xml, which lists pages for indexing. Neither of those files was designed with AI in mind. A sitemap can contain thousands of URLs with no indication of which pages matter most. A robots.txt file is a blocklist, not a description. llms.txt fills a different role: it helps an AI model quickly understand a site without having to crawl and parse every page.

Think of it as a table of contents combined with a brief product description. A well-written llms.txt might take under an hour to create, but it gives any language model or AI agent a useful orientation to your site in seconds.

How It Differs from robots.txt and sitemap.xml

The three files serve distinct purposes and work together:

  • robots.txt controls crawler access. It tells bots which pages they are allowed or not allowed to fetch. It says nothing about what your site does.
  • sitemap.xml lists URLs for search engine indexing. It is comprehensive by design, but it treats every URL as equal and carries no descriptive context.
  • llms.txt is descriptive and selective. It tells AI models what your site is, highlights the most important pages, and provides enough context for the model to reason about your brand without crawling everything.

None of them conflict with each other. You can block certain AI crawlers in robots.txt while still providing an llms.txt for the models that do access your content.

The Current State of the Standard

At the time of writing, llms.txt is a proposed standard, not an official RFC or W3C specification. It was published at llmstxt.org and has gained adoption across a range of companies and projects. Several AI-focused tools and frameworks have started referencing and consuming the format. Adoption is growing, but it is not yet universal.

That said, the cost of implementing it is low. A basic llms.txt file is a few dozen lines of text. If AI models do use it to better understand your site, the benefit to your visibility is real. If a model ignores it, you have lost nothing.

PromptEden LLM monitoring dashboard showing brand mentions across AI platforms

The llms.txt File Format Explained

The llms.txt format uses Markdown-style syntax. It is a structured plain-text file, not HTML, JSON, or XML. Here is the basic structure:

# Your Brand or Site Name

> A one-paragraph description of what your site or product does.

Optional: Any additional context about the site's purpose, audience, or how AI should use the content.

## Section Name

- [Page Title](https://yourdomain.com/page/): Brief description of what this page covers.
- [Another Page](https://yourdomain.com/other/): What a reader or AI agent will find here.

## Another Section

- [Page Title](https://yourdomain.com/docs/): Description.

The components break down like this:

The H1: Your Site Name

The first line is a single # heading with your brand or site name. Keep it short. This is the primary label an AI model will use to identify your site.

The Blockquote: Your Description

Immediately after the H1, include a > blockquote with a concise description of your site. One to three sentences works well. This is the most important field in the file. It is the first thing a model reads and the clearest signal you can give about what your site does.

Write this description as if you are explaining your site to someone who has never heard of you. Include:

  • What your site or product does
  • Who it is for
  • What category it belongs to

Avoid vague mission statements. "Helping businesses grow through technology" means nothing. "PromptEden monitors how AI platforms mention and recommend your brand, with a Visibility Score from 0 to 100" is useful.

Optional Context

After the blockquote, you can include additional plain text paragraphs. This is the right place for:

  • Licensing or usage notes (e.g., "Content may be used for training purposes" or "All rights reserved")
  • Language or region information
  • Notes about how content is organized
  • Any caveats about data freshness or accuracy

This section is optional. Many implementations skip it entirely.

H2 Sections: Page Categories

Use ## headings to group pages by category. Common groupings include:

  • Documentation
  • Blog or Resources
  • About / Company
  • Products or Features
  • Pricing
  • Tools

You do not need to list every page. The goal is selective curation, not exhaustive enumeration. Pick the pages that best represent each category.

Page Entries: Title, URL, Description

Under each section, list pages using Markdown link syntax followed by a colon and a brief description:

- [Page Title](https://yourdomain.com/page/): What this page covers in one sentence.

The description should be a single sentence. It tells the AI model what a reader will find on that page, which helps the model decide whether to reference it when constructing a response.
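The entry shape is regular enough to machine-parse, which is part of why the one-sentence description matters. As an illustration only (Python is our choice here; the proposal does not mandate any parser), a consumer might pull the three fields out of a line like this:

```python
import re

# Pattern for one llms.txt page entry: "- [Title](URL): description"
ENTRY = re.compile(r"^-\s*\[(?P<title>[^\]]+)\]\((?P<url>[^)]+)\):\s*(?P<desc>.+)$")

def parse_entry(line: str):
    """Return (title, url, description) for one entry line, or None if it doesn't match."""
    m = ENTRY.match(line.strip())
    return (m.group("title"), m.group("url"), m.group("desc")) if m else None

print(parse_entry("- [API Reference](https://yourdomain.com/docs/api/): Full REST API docs."))
```

If your descriptions are empty or missing the colon, an entry like this simply fails to parse, and a model gets a URL with no context.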

An llms.txt Example

Here is a minimal but complete example:

# Acme Analytics

> Acme Analytics is a web analytics platform for e-commerce businesses.
> It tracks conversion rates, cart abandonment, and customer lifetime value
> without requiring cookies or JavaScript trackers.

This site includes product documentation, pricing, a blog covering analytics
best practices, and a free ROI calculator tool.

## Docs

- [Getting Started](https://acmeanalytics.com/docs/start/): Installation guide and first dashboard setup.
- [Tracking Setup](https://acmeanalytics.com/docs/tracking/): How to add Acme's tracking snippet to your store.
- [API Reference](https://acmeanalytics.com/docs/api/): Full REST API documentation with request examples.

## Product

- [Features](https://acmeanalytics.com/features/): Complete list of analytics features and integrations.
- [Pricing](https://acmeanalytics.com/pricing/): Plan comparison and pricing tiers.

## Resources

- [E-Commerce Analytics Guide](https://acmeanalytics.com/blog/ecommerce-analytics-guide/): A practical guide to measuring store performance.
- [About Acme](https://acmeanalytics.com/about/): Company background and mission.

That file is concise and gives an AI model a clear picture of what the site is and where to find key information.

What to Include (and What to Leave Out)

One of the most common questions about llms.txt is how selective to be. The instinct is to include everything. Resist it. The value of llms.txt is that it is curated, not comprehensive. A file with hundreds of URLs and no descriptions defeats its own purpose.

What to Include

Your most important pages. These are the pages that define your brand, explain your product, or answer the questions most relevant to your audience. For a SaaS company, that typically means the homepage, features page, pricing page, and two or three primary use case or solution pages.

Documentation that AI agents need. If your product has a public API, an SDK, or integration guides, include those. AI coding agents actively look for this kind of structured reference material.

Content that earns citations. If you have published original research, comprehensive guides, or data-backed resources that other sites reference, those belong in your llms.txt. AI models are more likely to cite content they can trace back to a clearly identified source.

An accurate description of each page. The one-sentence description under each URL is not decoration. It is how the model decides whether your page is relevant to a given query. Be specific. "Overview of our product features and integrations" is better than "Learn more."

What to Leave Out

Pages behind login walls. If an AI model cannot access the page anyway, listing it provides no benefit.

Thin or low-value pages. Tag pages, pagination results, author archives, and search results pages add noise without context. Leave them out.

Every blog post you have ever written. List your three or four most authoritative content pieces, not the full archive. If AI models want more of your blog, they can follow links or discover it through sitemaps.

Outdated pages. If a page is stale or has been replaced by newer content, leave it out. Directing a model to an outdated resource does not help anyone.

Pages you want to keep private. llms.txt is a public file. Do not include internal URLs, staging environments, or pages you have not yet published.

How Long Should It Be?

There is no strict rule. A focused file for a small site might be a few dozen entries. A large software company with extensive documentation might reasonably include many more. The key question is whether each entry adds meaningful information. If you are adding a URL just to have more entries, do not add it.

How to Create and Deploy Your llms.txt File

Creating the file does not require a developer. Deploying it does require access to your web server, hosting platform, or CMS. Here is the full process.

Step 1: Generate or Write the File

You have two options. You can write the file manually using any text editor, or you can use a generator to produce a first draft.

PromptEden's free llms.txt Generator walks you through the key fields and outputs a properly formatted file. It is a good starting point if you want a structured template to work from rather than starting from a blank document.

If you write the file manually, start with these four things:

  1. Your brand name as an H1 heading
  2. A one-to-three sentence description as a blockquote
  3. Two or three H2 sections grouping your page types
  4. Your five to fifteen most important pages with one-sentence descriptions

That gives you a functional file you can expand later.
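If you maintain the file alongside a site config, you can also generate it from data. Here is a minimal sketch: the output structure (H1, blockquote, H2 sections, link entries) follows the proposed format, but the helper name and input shape are our own invention, not part of any spec.

```python
# Sketch: build a minimal llms.txt from a few fields.
def build_llms_txt(name, description, sections):
    lines = [f"# {name}", ""]
    lines += [f"> {line}" for line in description.splitlines()]
    lines.append("")
    for heading, pages in sections.items():
        lines.append(f"## {heading}")
        lines.append("")
        for title, url, desc in pages:
            lines.append(f"- [{title}]({url}): {desc}")
        lines.append("")
    return "\n".join(lines).rstrip() + "\n"

text = build_llms_txt(
    "Acme Analytics",
    "Acme Analytics is a web analytics platform for e-commerce businesses.",
    {"Docs": [("Getting Started", "https://acmeanalytics.com/docs/start/",
               "Installation guide and first dashboard setup.")]},
)
print(text)
```

Generating the file this way keeps it in sync with the same page list you use elsewhere, which helps with the "keep it current" step later.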

Step 2: Validate the Format

Before deploying, check that:

  • The file is saved as plain text with a .txt extension
  • The H1 heading uses a single # character
  • The blockquote uses > at the start of each line
  • URLs in page entries are absolute (include https://)
  • No encoding issues have crept in (smart quotes, special characters)

A quick way to check: paste the file into any Markdown previewer. If it renders cleanly, the format is correct.
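The checklist above can also be scripted. This is a heuristic sketch, not an official validator (the proposed spec does not define one), but it catches the common failure modes before you deploy:

```python
# Heuristic pre-deploy checks for an llms.txt file.
def validate_llms_txt(text: str) -> list[str]:
    problems = []
    lines = text.splitlines()
    # Exactly one H1 with a single '#'.
    if len([l for l in lines if l.startswith("# ")]) != 1:
        problems.append("expected exactly one '# ' H1 heading")
    # A '>' blockquote description should be present.
    if not any(l.startswith("> ") for l in lines):
        problems.append("missing '> ' blockquote description")
    # Page entries should use absolute https URLs.
    for l in lines:
        if l.lstrip().startswith("- [") and "](" in l:
            url = l.split("](", 1)[1].split(")", 1)[0]
            if not url.startswith("https://"):
                problems.append(f"relative or non-https URL: {url}")
    # Smart quotes are a sign of encoding issues creeping in.
    for ch in ("\u201c", "\u201d", "\u2018", "\u2019"):
        if ch in text:
            problems.append("smart quotes found; use plain ASCII quotes")
            break
    return problems
```

An empty list means the file passed these checks; anything returned is worth fixing before you deploy.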

Step 3: Deploy to Your Web Root

The file must be accessible at https://yourdomain.com/llms.txt. How you do this depends on your setup:

Static sites (Netlify, Vercel, Cloudflare Pages, GitHub Pages): Add llms.txt to your project's public directory or static folder, then deploy. Most static site generators will pass through any file in the public folder without modification.

WordPress: Add llms.txt to the root directory of your WordPress installation via FTP, SFTP, or your hosting provider's file manager. Alternatively, some SEO plugins may add support for this format.

Custom CMS or server: Place the file in your web root directory (the same level as robots.txt). Make sure your server is configured to serve .txt files with Content-Type: text/plain.

Next.js or similar frameworks: Add the file to your public/ directory. Next.js serves all files in public/ at the root path, so public/llms.txt becomes yourdomain.com/llms.txt.
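For the static-site and Next.js cases, the whole deploy step is a file copy. A sketch, assuming a project where everything in public/ is served at the site root (a stub llms.txt is created here so the commands are self-contained):

```shell
# Create a stub llms.txt in the repo root (replace with your real file).
printf '# Acme Analytics\n\n> Web analytics for e-commerce stores.\n' > llms.txt

# Copy it into the directory your framework serves at the root path.
mkdir -p public
cp llms.txt public/llms.txt
head -n 1 public/llms.txt    # should print the H1 line

# After deploying, verify from any machine (uncomment to run):
# curl -I https://yourdomain.com/llms.txt   # expect 200 and Content-Type: text/plain
```

The curl check at the end doubles as the Step 4 verification below: a 200 status with a text/plain content type means the file is live where models expect it.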

Step 4: Verify It Is Accessible

After deploying, visit https://yourdomain.com/llms.txt in a browser. You should see plain text. If you see an error page or a page-not-found response, the file is not in the right location or your server is not serving it correctly.

Also check that your robots.txt does not block access to the file. Some broad wildcard rules can accidentally block .txt files. Use PromptEden's free AI Robots.txt Checker to confirm AI crawlers can reach your site.

Step 5: Keep It Current

llms.txt is not a one-time task. When you add major new pages, update the product, or change your positioning, update the file too. An llms.txt that describes an outdated version of your site is worse than no file at all. Set a reminder to review it quarterly.

PromptEden Visibility Score showing Presence, Prominence, Ranking, and Recommendation components

Common Mistakes and How to Avoid Them

The format is simple, but there are several ways to undermine the file's value without realizing it.

Writing a Vague Description

The blockquote description is the single most important field. A description like "A website about technology and business" tells an AI model nothing. Be specific about what you do, who you serve, and what category you belong to. If someone read only that one sentence, would they know what your site is?

Poor example: "We help companies succeed with innovative solutions."

Better example: "Acme Analytics helps e-commerce stores track conversion rates, cart abandonment, and customer lifetime value without cookies or JavaScript tracking."

Including Too Many Pages

Length is not quality. A bloated llms.txt that lists every blog post and tag page is harder to parse than a concise file with clear section structure. AI models use the file to orient themselves quickly. If the file itself requires analysis to navigate, it loses much of its value.

Using Relative URLs

Every URL in your llms.txt should be an absolute URL, including the full domain. /docs/api/ is a relative URL. https://yourdomain.com/docs/api/ is an absolute URL. Relative URLs only make sense within a browser context, and AI models parsing the file do not have that context.
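The absolute-URL rule is easy to check mechanically. A small sketch using the standard library (the domain in the example is a placeholder):

```python
from urllib.parse import urlparse

# An llms.txt URL should carry both a scheme and a host.
def is_absolute(url: str) -> bool:
    parts = urlparse(url)
    return parts.scheme in ("http", "https") and bool(parts.netloc)

print(is_absolute("https://yourdomain.com/docs/api/"))  # absolute: scheme + host
print(is_absolute("/docs/api/"))                        # relative: rejected
```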

Forgetting to Update It

A stale llms.txt is a real problem. If your file lists pages that have moved, been deleted, or been replaced, you are pointing AI models at broken or outdated resources. When those models follow up by attempting to access those URLs, they hit an error. That is not a helpful signal.

Blocking AI Crawlers in robots.txt

There is no point in creating an llms.txt if your robots.txt blocks the crawlers that would read it. Common user agents for major AI platforms include GPTBot (OpenAI), ClaudeBot (Anthropic), PerplexityBot (Perplexity), and Googlebot-Extended (Google). If any of these are blocked with a Disallow: / rule, those models cannot reach your site or your llms.txt.

Check your robots.txt before you invest time in llms.txt. You can do this with the free AI Robots.txt Checker.
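You can also run this check offline with Python's standard-library robots.txt parser: feed it your robots.txt text and ask whether each AI user agent may fetch /llms.txt. The robots.txt content below is a deliberately broken example that blocks GPTBot; the user-agent names are the ones listed above, and each vendor's documentation has the current list.

```python
from urllib.robotparser import RobotFileParser

# Example robots.txt that accidentally blocks one AI crawler.
robots_txt = """\
User-agent: GPTBot
Disallow: /

User-agent: *
Allow: /
"""

rp = RobotFileParser()
rp.parse(robots_txt.splitlines())

for agent in ("GPTBot", "ClaudeBot", "PerplexityBot", "Googlebot-Extended"):
    allowed = rp.can_fetch(agent, "https://yourdomain.com/llms.txt")
    print(f"{agent}: {'allowed' if allowed else 'BLOCKED'}")
```

Running this against your real robots.txt before publishing llms.txt takes seconds and rules out the most self-defeating mistake in this list.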

Treating It as a Substitute for Good Content

llms.txt is a signaling file, not a content file. It helps AI models find and understand your pages. It does not replace the work of creating clear, well-structured content on those pages. A model that reads your llms.txt and follows a link to a thin, poorly organized page will still have an inaccurate picture of your brand. The file and the content it points to need to work together.

PromptEden Citation Intelligence showing which sources AI models cite for brand mentions

How llms.txt Connects to AI Visibility Outcomes

Setting up llms.txt is a technical step, but the reason to care about it is strategic. AI visibility depends on whether models have an accurate, accessible picture of your brand. llms.txt is one mechanism for providing that picture.

What It Can and Cannot Do

llms.txt can help an AI model correctly identify what your company does, which pages are most relevant to common questions, and how to categorize your brand within a product space. That accuracy matters when a model is constructing a response that includes you.

What llms.txt cannot do is manufacture authority you have not built. If your site is not referenced anywhere outside your own domain, if your content is thin, or if your brand is not mentioned in training data, llms.txt will not fix those problems. It is a useful orientation file, not a substitute for the broader work of building AI visibility.

Think of the relationship this way: llms.txt helps models understand your site correctly. Content quality and third-party coverage determine whether models recommend your brand favorably.

The Relationship to AEO

Answer Engine Optimization (AEO) is the practice of improving how your brand appears in AI-generated responses. llms.txt is one component of a broader AEO strategy. It sits alongside:

  • Structured content that AI models can parse and quote from
  • robots.txt hygiene that ensures AI crawlers can access your pages
  • Schema markup that provides structured metadata about your content
  • Third-party citations from independent sources that give AI models corroborating signals
  • Prompt monitoring to track how AI actually responds to queries about your brand

Learn more about the full picture at /use-cases/seo-for-ai/.

Measuring Whether It's Working

You cannot directly measure whether a given AI model has read and used your llms.txt. What you can measure is your overall AI visibility: how often AI models mention your brand, how accurately they describe what you do, and whether those descriptions match what your llms.txt says.

If you publish an llms.txt that clearly describes your product as an analytics platform, and AI models continue to describe you as a CRM, that is a signal that either the file is not being consumed or other content signals are overriding it. Both outcomes tell you something useful about where to focus next.

PromptEden monitors brand mentions across 9 AI platforms and tracks your Visibility Score (0-100), which combines four components: Presence (does AI mention you?), Prominence (how featured are you?), Ranking (where do you appear in lists?), and Recommendation (does AI actively suggest you?). Tracking these metrics before and after publishing your llms.txt gives you a baseline for evaluating its impact alongside your other changes.

The free plan supports 10 tracked prompts with weekly refresh. The Starter plan ($49/month) expands to 100 prompts with daily monitoring. If you want to track visibility changes systematically after updating your llms.txt, the features page has the full breakdown.



Frequently Asked Questions

Is llms.txt an official standard?

Not yet. llms.txt is a proposed standard published at llmstxt.org by Jeremy Howard in late 2024. It has gained adoption among many companies and AI tools, but it is not an official W3C specification or RFC. The cost of implementing it is low, and the potential benefit to AI visibility makes it worth adding even before the standard is formalized.

Will AI models actually read my llms.txt file?

Adoption among AI systems varies. Some crawlers and agent frameworks are beginning to support the format, while others do not yet actively seek out the file. That said, any model or agent that does look for it will find yours if it is correctly placed at your domain root. Implementing it now positions you ahead of broader adoption.

How often should I update my llms.txt file?

Review it at least once per quarter. Update it any time you add major new pages, change your product positioning, remove pages that are listed in the file, or rebrand. A stale llms.txt that points to moved or deleted pages is worse than one that is simply less detailed.

Does llms.txt replace my sitemap.xml?

No. They serve different purposes. Your sitemap.xml tells search engines about all your indexable pages and is consumed by crawlers during indexing. llms.txt is a curated, descriptive file for AI models. You should have both. They complement each other without overlapping.

What is the difference between llms.txt and llms-full.txt?

The llms.txt specification describes two optional variants. llms.txt is the standard summary file with page listings. llms-full.txt is an extended version that can include the full text content of your key pages, giving models even more context without requiring them to fetch each page individually. Most sites start with the standard llms.txt, and llms-full.txt is a more advanced implementation.

Can llms.txt hurt my site if it is wrong?

A poorly written llms.txt is unlikely to actively damage your rankings, but it can give AI models an inaccurate description of your site. If your description is misleading or your listed pages are broken, you are sending confusing signals. The fix is keeping the file accurate and up to date.

Do I need a developer to create and deploy llms.txt?

Not necessarily. Writing the file requires only a text editor. Deploying it requires placing a file at your domain root, which is a standard file hosting task. For most hosting platforms, WordPress installations, and static site frameworks, this does not require code changes. If your CMS or server setup is unusual, a developer may be needed for the deployment step.

How does llms.txt connect to AI visibility monitoring?

llms.txt helps AI models understand your site. Monitoring tools like PromptEden track whether AI models are actually mentioning and accurately describing your brand. You use them together: publish an accurate llms.txt, then monitor your Visibility Score over time to see whether AI models are describing you correctly and including you in relevant responses.

Where do I put llms.txt if I use a subdomain for my product?

Put it at your primary marketing domain root. If your marketing site is at yourdomain.com, the file belongs at yourdomain.com/llms.txt. If your product runs at app.yourdomain.com and you want to describe that too, you can also add a separate llms.txt at app.yourdomain.com/llms.txt. The two files can cross-reference each other in their descriptions.

Check your AI visibility after you publish

Track how AI platforms mention your brand across 9 models. Measure your Visibility Score and see whether your llms.txt and content changes are moving the needle.