Documentation bundles for AI tools
If you use Cursor, GitHub Copilot, Claude Code, or a similar assistant while you learn or change Adobe Commerce Storefront code, this page shows how to load this site’s documentation into your editor or agent so that its answers track the same topics you see in the browser. These bundles support learning and building while you read the documentation, not instead of it.
General-purpose AI models often miss the newest storefront APIs, drop-in versions, and Edge Delivery Services-specific guidance, because they learn from older training data rather than the latest published pages. For that reason, this documentation site generates plain-text bundles at build time so editors and agents can load the same storefront topics the browser shows.
Context files
Start with llms.txt. It is the index that lists where to find every bundle. Full exports load the whole documentation set into one file. Files under _llms-txt/ cover one topic area each, in smaller files for tools that should not pull the entire site. The table lists each filename and a one-line description of what it contains.
| File | Purpose |
|---|---|
| llms.txt | Index file: overview and links to all documentation sets (full, abridged, and topic bundles under _llms-txt). |
| llms-full.txt | Complete documentation in one file. |
| llms-small.txt | Abridged bundle (same content as the full export except the long-form release changelog page). |
| _llms-txt/*.txt | Smaller topical exports (for example, drop-ins reference, tutorials, blocks). All bundle URLs are listed from llms.txt. |
Some of these files are large. Loading a whole file sends a lot of text into the model at once. They are rebuilt every time the site is published, so treat them as a fallback when an AI tool cannot search or fetch the live docs. Whenever possible, browse or search this storefront documentation for the latest edits, navigation, and diagrams.
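Because llms.txt follows the llms.txt convention of markdown link lists, a short script can pull the bundle URLs out of the index before deciding which one to load. A minimal sketch; the index text and URLs below are illustrative samples, not the live index:

```python
import re

# Sample index in the llms.txt style: markdown link lists.
# These entries are illustrative, not the live Storefront index.
SAMPLE_INDEX = """# Adobe Commerce Storefront

## Bundles

- [Full docs](https://example.com/llms-full.txt): complete documentation
- [Small docs](https://example.com/llms-small.txt): abridged bundle
- [Drop-ins](https://example.com/_llms-txt/dropins.txt): drop-ins reference
"""

# Matches "- [name](url): description" lines, description optional.
LINK_RE = re.compile(
    r"^- \[(?P<name>[^\]]+)\]\((?P<url>[^)]+)\)(?:: (?P<desc>.*))?$", re.M
)

def list_bundles(index_text):
    """Return (name, url, description) tuples for each linked bundle."""
    return [(m["name"], m["url"], m["desc"] or "") for m in LINK_RE.finditer(index_text)]

for name, url, desc in list_bundles(SAMPLE_INDEX):
    print(f"{name}: {url}  ({desc})")
```

Pointing the same parser at the real index text lets a script pick out only the topic bundles it needs instead of the full export.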
How to use context files
Each tool below supports either the public llms.txt URL, pasted bundle text, or a built-in documentation index when the feature exists. For agent-based tools, adding the llms.txt URL to a project context file such as AGENTS.md or CLAUDE.md is the most reliable method — the agent fetches the index on demand rather than relying on a pre-built index that may lag recent changes.
Cursor
Create or update AGENTS.md at your project root and add a line pointing to the documentation index:
See https://experienceleague.adobe.com/developer/commerce/storefront/llms.txt for Adobe Commerce Storefront documentation.

Cursor agent mode reads AGENTS.md and fetches the index on demand. You can also add this reference to a file under .cursor/rules/ if you prefer a Cursor-specific location.
If you still want to use Cursor’s built-in indexer, go to Cursor → Settings → Cursor Settings → Indexing & Docs → Add Doc and paste the llms.txt URL. Keep in mind that the indexed copy may not reflect the latest published changes. Optional: paste llms-small.txt or a specific _llms-txt/*.txt bundle directly into chat when you need a narrow scope.
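If you maintain context files across several projects, a small script can add the documentation reference only when it is missing, so repeated runs stay idempotent. A sketch; the ensure_docs_line helper is hypothetical and not part of Cursor or any other tool:

```python
from pathlib import Path

DOCS_LINE = (
    "See https://experienceleague.adobe.com/developer/commerce/storefront/llms.txt "
    "for Adobe Commerce Storefront documentation."
)

def ensure_docs_line(context_file="AGENTS.md"):
    """Append the docs reference to the context file unless it is already there."""
    path = Path(context_file)
    existing = path.read_text() if path.exists() else ""
    if DOCS_LINE in existing:
        return False  # already wired in; nothing to do
    with path.open("a") as fh:
        # Keep the reference on its own line even if the file
        # does not end with a newline.
        if existing and not existing.endswith("\n"):
            fh.write("\n")
        fh.write(DOCS_LINE + "\n")
    return True
```

The same helper covers Claude Code by passing context_file="CLAUDE.md".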
Claude Code (CLI)
Create or update CLAUDE.md at your project root and add a line pointing to the documentation index:
See https://experienceleague.adobe.com/developer/commerce/storefront/llms.txt for Adobe Commerce Storefront documentation.

Claude Code will fetch the index and linked bundles when the documentation is relevant. In a new chat, ask a question this storefront documentation would answer (for example, about a drop-in). If the reply cites or aligns with those topics, the URL is wired in.
Other AI tools
Other assistants often accept a documentation URL in settings or in the system prompt.
For tools that support custom context URLs or “docs” sources (GitHub Copilot Chat, Windsurf, and similar): provide the full URL below. Tools that implement the llms.txt spec will follow the index to the relevant bundles automatically.
https://experienceleague.adobe.com/developer/commerce/storefront/llms.txt

For tools that do not support URL-based context, paste the content of llms-small.txt (or a relevant _llms-txt/*.txt topic bundle) directly into the chat or system prompt.
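When pasting a bundle, it can exceed a chat’s context limit, so you may want to trim the text to a rough token budget first. A sketch using the common four-characters-per-token heuristic; the budget, the heuristic, and the trim_to_budget name are assumptions for illustration, not properties of these bundles:

```python
CHARS_PER_TOKEN = 4  # rough heuristic; real tokenizers vary by model

def trim_to_budget(bundle_text, max_tokens=8000):
    """Trim bundle text to a rough token budget, cutting at a line boundary."""
    max_chars = max_tokens * CHARS_PER_TOKEN
    if len(bundle_text) <= max_chars:
        return bundle_text
    # Cut at the last newline inside the budget so no line is split mid-way.
    cut = bundle_text.rfind("\n", 0, max_chars)
    return bundle_text[: cut if cut != -1 else max_chars]

# Fetching the bundle itself requires network access, for example:
# from urllib.request import urlopen
# url = "https://experienceleague.adobe.com/developer/commerce/storefront/llms-small.txt"
# print(trim_to_budget(urlopen(url).read().decode("utf-8")))
```

Trimming this way keeps the start of the bundle, which in these exports carries the overview material; adjust the budget to your tool’s limits.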
Example prompt
After this documentation is wired into your assistant, try a concrete question in chat. The sample below is one illustration you can paste or rewrite. In this sample, PDP means product detail page.
Use the Storefront docs. I'm building a PDP page and I want to add a custom 'Add to Wishlist' button in the product details drop-in, below the existing Add to Cart button. I don't want to fork the drop-in. How do I use slots to do this?

Adapt drop-in names or UI labels if your project differs, then verify the reply against the live product drop-in topics before you ship a change.
- Many paths, blocks, and drop-in examples in this documentation match the folder layout of the Commerce boilerplate on GitHub rather than any particular custom storefront. Open your clone of that repository in your editor while you read, so the file lists and paths the assistant suggests line up with what you see on disk. If you use a different project, say so in the chat so the assistant does not assume boilerplate folders.
- After llms.txt or a bundle connects this documentation to your assistant, you can paste the web address of the page you are reading into the same chat when you need the reply scoped to that topic. That paste narrows one reply to the page you mean and leaves your llms.txt or bundle connection unchanged.