Using LLMs
Using AI to integrate Turnkey
Turnkey documentation is now AI-enhanced. Whether you’re using ChatGPT, Claude, or a custom LLM integration, we’ve made it easy to feed Turnkey docs directly into your models—and even easier to surface relevant answers programmatically or in your dev tools.
LLM Feed Files
To help LLMs stay current on how Turnkey works, we expose two continuously updated files for ingestion:
- llms.txt: a concise, high-signal list of top-level docs pages, great for smaller models or quick context building.
- llms-full.txt: a more exhaustive listing that includes nearly all pages, ideal for full-context indexing.
You can regularly ingest these URLs into your custom GPTs or other LLM apps to ensure Turnkey-specific questions are grounded in accurate technical detail.
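As a rough illustration, the sketch below pulls both feed files and assembles them into a single context string for your own LLM app. The URLs assume the standard Mintlify paths on docs.turnkey.com; verify them against the links above before relying on them.

```ts
// Minimal sketch: fetch Turnkey's LLM feed files and build a context string
// you can hand to a custom GPT or other LLM integration.
// Assumes Node 18+ (built-in fetch) and the standard Mintlify feed paths.
const FEED_URLS = [
  "https://docs.turnkey.com/llms.txt",      // concise, high-signal index
  "https://docs.turnkey.com/llms-full.txt", // near-complete page listing
];

async function buildTurnkeyContext(): Promise<string> {
  const parts = await Promise.all(
    FEED_URLS.map(async (url) => {
      const res = await fetch(url);
      if (!res.ok) throw new Error(`Failed to fetch ${url}: ${res.status}`);
      return `# Source: ${url}\n${await res.text()}`;
    })
  );
  return parts.join("\n\n");
}

// Re-run on a schedule so your model's grounding stays current.
buildTurnkeyContext().then((ctx) =>
  console.log(`Loaded ${ctx.length} characters of Turnkey docs context`)
);
```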
Search Turnkey Docs with Mintlify MCP
You can integrate our documentation with any Mintlify MCP-compatible client to perform contextual searches without leaving your development environment.
To install and register the Turnkey MCP server, run:
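The command looks roughly like the following. The package name and docs domain are assumptions based on Mintlify's standard MCP installer; use the exact command published for Turnkey if it differs.

```bash
# Hypothetical invocation of Mintlify's MCP installer for the Turnkey docs.
npx mint-mcp add docs.turnkey.com
```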
On execution, the installer will prompt you to select which MCP clients to configure.
Supported MCP Clients:
- Cursor
- Windsurf
- Claude Desktop
- All
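Beyond these GUI clients, any MCP-compatible tool can query the server programmatically. The sketch below uses the @modelcontextprotocol/sdk TypeScript client; the server command, install path, and the "search" tool name are placeholders, not confirmed Turnkey values, so substitute whatever the installer registered for you.

```ts
// Sketch: query a locally installed Turnkey MCP server from code.
// Assumes @modelcontextprotocol/sdk is installed; command/args and the
// tool name below are hypothetical placeholders.
import { Client } from "@modelcontextprotocol/sdk/client/index.js";
import { StdioClientTransport } from "@modelcontextprotocol/sdk/client/stdio.js";

async function searchTurnkeyDocs(query: string) {
  const transport = new StdioClientTransport({
    command: "node",
    args: ["/path/to/turnkey-mcp/index.js"], // hypothetical install path
  });

  const client = new Client(
    { name: "turnkey-docs-search", version: "1.0.0" },
    { capabilities: {} }
  );
  await client.connect(transport);

  // Inspect what the server exposes, then call its search tool (name assumed).
  const { tools } = await client.listTools();
  console.log("Available tools:", tools.map((t) => t.name));

  const result = await client.callTool({ name: "search", arguments: { query } });
  console.log(result.content);

  await client.close();
}

searchTurnkeyDocs("How do I create a sub-organization?");
```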
Chat With Our Docs (Contextual Deep Links)
We’ve enabled Mintlify’s contextual menu feature across Turnkey’s docs:
- You can copy any Turnkey docs page as Markdown for reuse or embedding.
- Even better: you can launch a chat session with Claude or ChatGPT preloaded with that specific page’s context.
This is perfect for troubleshooting, code generation, or just diving deeper into a topic with AI assistance.
To enable this, we’re using Mintlify’s contextual param:
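In docs.json, that configuration looks roughly like the snippet below. The exact option list is an assumption; Mintlify supports additional targets beyond the ones shown here.

```json
{
  "contextual": {
    "options": ["copy", "view", "chatgpt", "claude"]
  }
}
```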
Learn more here.