# llms.txt Monthly Maintenance Playbook

Use this playbook to create, publish, and maintain an llms.txt file so AI systems can find your highest-signal pages instead of guessing across your whole site.

## Setup

You need your current list of high-value URLs, access to your site root, and your current robots.txt rules. Plan a recurring 30-minute review once per month.

## Steps

### 1. Inventory your highest-value answer pages
List 10 to 20 URLs that directly answer buyer questions. Prioritize pages like start-here, shipping, returns, setup guides, and core product education pages.
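
The fastest starting point is usually your XML sitemap. The sketch below pulls candidate URLs from it and filters by path keywords; the sitemap location and keyword list are assumptions to adjust for your own site, and the shortlist still needs a human pass before it goes in the file.

```python
# Sketch: filter sitemap URLs down to likely answer pages.
# SITEMAP_URL and KEYWORDS are placeholders -- adjust both for your site.
import xml.etree.ElementTree as ET
from urllib.request import urlopen

SITEMAP_URL = "https://example.com/sitemap.xml"  # assumed location
KEYWORDS = ("start-here", "shipping", "returns", "setup", "guide")

with urlopen(SITEMAP_URL) as resp:
    tree = ET.parse(resp)

ns = {"sm": "http://www.sitemaps.org/schemas/sitemap/0.9"}
candidates = [
    loc.text.strip()
    for loc in tree.findall(".//sm:loc", ns)
    if loc.text and any(k in loc.text for k in KEYWORDS)
]

for url in candidates[:20]:  # step 1 targets a 10-to-20 URL shortlist
    print(url)
```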

### 2. Draft a concise llms.txt summary
Write a root-level markdown file with a clear title, a short context line, and two link groups: Priority Pages and Optional Pages. Give each priority link a short note so an AI system knows what a page covers before fetching it.

Example structure:

```markdown
# Example Store Knowledge Base

> This file lists the best pages for product policies, shipping, returns, and setup guides.

## Priority Pages
- [Start here](https://example.com/start-here): orientation for new customers
- [Shipping policy](https://example.com/shipping-policy): carriers, costs, and delivery windows
- [Returns](https://example.com/returns): return window and refund steps
- [Product setup guide](https://example.com/product-setup-guide): step-by-step installation

## Optional Pages
- [Blog](https://example.com/blog)
- [Changelog](https://example.com/changelog)
```
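
To keep the file consistent from month to month, you can generate it from a small data structure instead of editing markdown by hand. The sketch below is one possible generator; every title, URL, and note is a placeholder, and the `Last reviewed` comment is a convention this playbook adopts for step 5, not part of any spec.

```python
# Sketch: generate llms.txt from a dict of groups -> (title, url, note).
# All values here are illustrative placeholders.
from datetime import date

TITLE = "Example Store Knowledge Base"
SUMMARY = ("This file lists the best pages for product policies, "
           "shipping, returns, and setup guides.")
GROUPS = {
    "Priority Pages": [
        ("Start here", "https://example.com/start-here",
         "orientation for new customers"),
        ("Returns", "https://example.com/returns",
         "return window and refund steps"),
    ],
    "Optional Pages": [
        ("Blog", "https://example.com/blog", None),
    ],
}

lines = [f"# {TITLE}", "", f"> {SUMMARY}", ""]
for group, links in GROUPS.items():
    lines.append(f"## {group}")
    for title, url, note in links:
        lines.append(f"- [{title}]({url})" + (f": {note}" if note else ""))
    lines.append("")
# Review marker used by the monthly validator in step 5 (our convention).
lines.append(f"<!-- Last reviewed: {date.today().isoformat()} -->")

with open("llms.txt", "w", encoding="utf-8") as f:
    f.write("\n".join(lines) + "\n")
```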

### 3. Publish at the site root and verify direct access
Upload the file as /llms.txt. Open it directly in a browser and confirm it returns HTTP 200 with no login wall and no redirect loop; a 404 means the file is not actually published at the root.
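
You can also automate this check. The sketch below assumes the `requests` package is installed and uses a placeholder domain; it separates out the failure modes named above.

```python
# Sketch: verify /llms.txt is served directly, with redirects surfaced
# rather than silently followed. Domain is a placeholder.
import requests

URL = "https://example.com/llms.txt"
resp = requests.get(URL, allow_redirects=False, timeout=10)

if resp.status_code == 200:
    print("OK: served directly with HTTP 200")
elif resp.status_code in (301, 302, 307, 308):
    print(f"Redirects to {resp.headers.get('Location')} -- follow it and check for loops")
elif resp.status_code in (401, 403):
    print("Blocked: a login wall or access rule sits in front of the file")
else:
    print(f"Unexpected status {resp.status_code} (404 means the file is not at the root)")
```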

### 4. Check alignment with robots.txt
Review your crawl rules and resolve any contradictions. Do not list URLs in llms.txt that robots.txt intentionally blocks; pointing AI systems at pages their crawlers cannot fetch sends mixed signals.
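
The standard library's `urllib.robotparser` can automate this comparison. The sketch below is a minimal version with a placeholder domain and a generic user agent; swap in the specific crawler tokens you care about.

```python
# Sketch: flag llms.txt URLs that robots.txt disallows.
# Uses only the standard library; SITE is a placeholder.
import re
import urllib.robotparser
from urllib.request import urlopen

SITE = "https://example.com"
USER_AGENT = "*"  # or a specific crawler token such as "GPTBot"

rp = urllib.robotparser.RobotFileParser(SITE + "/robots.txt")
rp.read()

llms_txt = urlopen(SITE + "/llms.txt").read().decode("utf-8")
urls = re.findall(r"https?://[^\s)]+", llms_txt)

for url in urls:
    if not rp.can_fetch(USER_AGENT, url):
        print(f"CONFLICT: {url} is disallowed by robots.txt")
```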

### 5. Run a monthly validator pass
Check four items each month: direct accessibility, stale links, robots alignment, and the freshness date. Replace or remove retired links immediately and record the review date.
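
A minimal validator covering all four items might look like the sketch below. It assumes `requests` is installed, uses a placeholder domain, and relies on the `Last reviewed` comment convention from step 2, which is this playbook's own convention rather than any standard.

```python
# Sketch: monthly validator for the four checks.
import re
import urllib.robotparser
import requests

SITE = "https://example.com"  # placeholder domain
USER_AGENT = "*"

# 1. Accessibility: llms.txt itself must return HTTP 200 without redirecting.
resp = requests.get(SITE + "/llms.txt", allow_redirects=False, timeout=10)
body = resp.text if resp.status_code == 200 else ""
urls = re.findall(r"https?://[^\s)]+", body)

# 2. Stale links: every listed URL should still resolve. Some servers
# mishandle HEAD requests; fall back to GET if you see false positives.
stale = [
    u for u in urls
    if requests.head(u, allow_redirects=True, timeout=10).status_code >= 400
]

# 3. Robots alignment: no listed URL may be disallowed for your target agents.
rp = urllib.robotparser.RobotFileParser(SITE + "/robots.txt")
rp.read()
blocked = [u for u in urls if not rp.can_fetch(USER_AGENT, u)]

# 4. Freshness: look for the "Last reviewed" marker written in step 2.
checks = [
    ("accessible at root", resp.status_code == 200),
    ("no stale links", not stale),
    ("robots aligned", not blocked),
    ("freshness date present", "Last reviewed" in body),
]
for name, ok in checks:
    print(f"{'PASS' if ok else 'FAIL'}: {name}")
for u in stale:
    print(f"  stale: {u}")
for u in blocked:
    print(f"  blocked: {u}")
```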

## Done when

Your llms.txt is live at root, contains only current high-signal URLs, has no conflicts with robots rules, and is reviewed on a monthly schedule.