AI-powered systems like ChatGPT, Google Gemini, Claude, and Perplexity are changing how we get information online. They aim to process complex web content to provide faster and more accurate answers, but they often struggle to do so. These models aren't human readers; they work from a website's underlying code, where complex HTML structures, navigation menus, and scripts can obscure the core information. To address this challenge, a new web standard has been proposed: llms.txt.
It's a simple text file placed in a site's root directory to act as a direct guide for AI. Instead of forcing a model to parse a visually complex webpage, it provides a concise summary and direct links to clean, markdown (.md) versions of the site’s most important content. It creates a clear, predictable pathway to the data that matters, in a format built for AI.
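Under the proposal, the file itself is plain markdown: a top-level heading naming the site, a blockquote summary, and sections of annotated links. The sketch below is purely illustrative (the site name, URLs, and descriptions are hypothetical, not taken from any real llms.txt file):

```markdown
# Example Corp

> Example Corp builds project-management software. This file lists our
> most important pages in a format intended for LLMs.

## Docs

- [Getting started](https://example.com/docs/start.md): installation and first steps
- [API reference](https://example.com/docs/api.md): endpoints and parameters

## Optional

- [Company history](https://example.com/about.md): background on the team
```

Because it is just markdown at a predictable path (`/llms.txt`), any crawler can fetch and read it without executing scripts or rendering a page.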
The Thinking Behind the llms.txt Proposal
The llms.txt proposal is based on a simple, practical idea: AI models process information more accurately when it's provided in a clear, structured format. This is why the standard uses markdown—it's a clean, text-based language that LLMs can easily understand, free from the clutter of code.
The thinking behind the standard suggests a few potential benefits for the web:
- For Marketers: It offers a way to directly signal which content is most important, potentially leading to more accurate representation of your brand or information in AI-generated answers.
- For AI Systems: It provides a clear, efficient path to high-quality information. Instead of trying to parse a standard webpage—with its complex HTML, CSS, and JavaScript—the model gets a direct feed of the core content, removing guesswork and potential for error.
Essentially, it aims to create a more direct line of communication between a website's core content and the models that consume it.
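To make that "direct line of communication" concrete, here is a minimal sketch of how a consuming system might turn an llms.txt file into structured link entries. This is an illustrative parser, not any real crawler's implementation; the regex assumes the proposal's `- [title](url): description` link format:

```python
import re

# Matches a markdown list item of the form:
#   - [title](url): optional description
LINK = re.compile(
    r"^-\s*\[(?P<title>[^\]]+)\]\((?P<url>[^)]+)\)(?::\s*(?P<desc>.*))?$"
)

def parse_llms_txt(text):
    """Return a list of {title, url, description} entries from an llms.txt document."""
    entries = []
    for line in text.splitlines():
        m = LINK.match(line.strip())
        if m:
            entries.append({
                "title": m.group("title"),
                "url": m.group("url"),
                "description": m.group("desc") or "",
            })
    return entries

# Hypothetical file content for demonstration only.
sample = """# Example Corp
> A short summary for LLMs.

## Docs
- [Getting started](https://example.com/start.md): installation basics
- [API reference](https://example.com/api.md)
"""

entries = parse_llms_txt(sample)
```

The crawler skips the heading and summary lines and is left with a clean list of URLs it can fetch directly, each paired with a human-written hint about what it will find there.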
Why we added an llms.txt to our site
As AI systems become a primary source of information for users, we want to ensure that the answers they generate about Monogram—our mission, our services, and our work—are accurate, current, and sourced directly from us. By providing a clean, machine-readable guide to our content, we take control of our narrative and reduce the risk of AI models misinterpreting complex layouts or outdated information.
Our Approach: A Look at Monogram's Live llms.txt
While the full proposal emphasizes linking to clean .md versions of pages, we've taken a strategic first step. Our initial implementation focuses on providing a high-level, curated guide for AI. Instead of pointing to .md files, we provide direct links to our key pages, each accompanied by a concise, human-written description.
This approach provides immediate value by acting as an expert-curated sitemap. It tells an AI, "Don't just crawl randomly; start with these important pages, and here is exactly what you will find on each one." Here is the content of our live llms.txt file.
The Growing Case for llms.txt
Our reason for implementing llms.txt is straightforward: we want to make it easier for AI systems to find and accurately understand our most important information. While the standard is still a proposal, it’s one that’s quickly gaining traction. We’re not alone in seeing its potential. A growing number of major players in AI and web infrastructure have also implemented llms.txt files, primarily for their documentation sites:
This growing adoption suggests the approach has real practical value. By offering a direct path to key content, the file provides a clear signal of intent to any system designed to look for it. For Monogram, and for these other early adopters, it's about taking a simple, proactive step to provide clarity to AI.