Using AI to integrate Turnkey

Turnkey documentation is now AI-enhanced. Whether you’re using ChatGPT, Claude, or a custom LLM integration, we’ve made it easy to feed Turnkey docs directly into your models—and even easier to surface relevant answers programmatically or in your dev tools.

LLM Feed Files

To help LLMs stay current on how Turnkey works, we expose two continuously updated files for ingestion:

  • llms.txt - A concise, high-signal list of top-level docs pages, great for smaller models or quick context building.
  • llms-full.txt - A more exhaustive listing that includes nearly all pages, ideal for full-context indexing.

Re-ingest these files regularly in your custom GPTs or other LLM apps so that Turnkey-specific questions are grounded in accurate, up-to-date technical detail.
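
For example, a script can pull the feed at request time and prepend it to a system prompt. The sketch below is illustrative only: it assumes the feeds are served from the docs site root (for example, https://docs.turnkey.com/llms.txt) and that you are running Node 18+ with the global fetch API; swap in your own LLM client where noted.

// Sketch: load Turnkey's llms.txt feed and use it as grounding context.
// The URL below assumes the feed lives at the docs site root; adjust if needed.
const LLMS_TXT_URL = "https://docs.turnkey.com/llms.txt";

async function buildTurnkeyContext(): Promise<string> {
  const res = await fetch(LLMS_TXT_URL);
  if (!res.ok) {
    throw new Error(`Failed to fetch llms.txt: ${res.status}`);
  }
  return res.text();
}

async function main() {
  const docsIndex = await buildTurnkeyContext();
  // Prepend the feed to a system prompt so the model can point at
  // current Turnkey docs pages when answering questions.
  const systemPrompt =
    "You are a Turnkey integration assistant. Ground your answers in the " +
    "docs index below:\n\n" + docsIndex;
  console.log(systemPrompt.slice(0, 500)); // hand systemPrompt to your LLM client of choice
}

main().catch(console.error);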

Search Turnkey Docs with Mintlify MCP

Turnkey’s docs ship with a Mintlify-generated MCP server, so you can perform contextual searches from any MCP-compatible client without leaving your development environment.

To install and register the Turnkey MCP server, run:

npx @mintlify/mcp@latest add turnkey-0e7c1f5b

On execution, you will see output similar to:

🛠  Installing Tools:

   search
   Search across the Turnkey documentation to fetch relevant context for a given query

✔  Created new MCP server at ~/.mcp/turnkey-0e7c1f5b

?  Select MCP client: › Use arrow keys to select. Return to submit.
❯   Cursor
    Windsurf
    Claude Desktop
    All

To start the server, run:
node ~/.mcp/turnkey-0e7c1f5b/src/index.js

Supported MCP Clients:

  • Cursor
  • Windsurf
  • Claude Desktop

Choosing All during installation registers the server with every supported client at once.
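
Because the generated server speaks standard MCP over stdio, you can also drive it from a script rather than a GUI client. The sketch below uses the @modelcontextprotocol/sdk TypeScript package and is not an official example: the tool name search comes from the installer output above, but the argument name query is an assumption, so check the schema returned by listTools() before relying on it.

// Sketch: call the Turnkey docs MCP server from a script (unofficial example).
// Assumes the @modelcontextprotocol/sdk package is installed.
import { Client } from "@modelcontextprotocol/sdk/client/index.js";
import { StdioClientTransport } from "@modelcontextprotocol/sdk/client/stdio.js";
import os from "node:os";
import path from "node:path";

async function main() {
  // Launch the locally installed server over stdio ("~" does not expand here,
  // so build the absolute path explicitly).
  const serverPath = path.join(os.homedir(), ".mcp/turnkey-0e7c1f5b/src/index.js");
  const transport = new StdioClientTransport({ command: "node", args: [serverPath] });

  const client = new Client({ name: "turnkey-docs-demo", version: "0.0.1" });
  await client.connect(transport);

  // Inspect the advertised tools to confirm the search tool's input schema.
  const tools = await client.listTools();
  console.log(tools);

  // "query" is an assumed argument name; use whatever the schema above reports.
  const result = await client.callTool({
    name: "search",
    arguments: { query: "How do I sign a transaction with Turnkey?" },
  });
  console.log(result);

  await client.close();
}

main().catch(console.error);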

Contextual Docs Features

We’ve enabled Mintlify’s contextual feature across Turnkey’s docs:

  • You can copy any Turnkey docs page as Markdown for reuse or embedding.
  • Even better: you can launch a chat session with Claude or ChatGPT preloaded with that specific page’s context.

This is perfect for troubleshooting, code generation, or just diving deeper into a topic with AI assistance.
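
If you want to script the Markdown export rather than use the copy button, one common Mintlify convention is that a page’s Markdown is served at the page URL with a .md suffix. Treat both that convention and the example path below as assumptions to verify against a live Turnkey docs page.

// Sketch: pull a single docs page as Markdown for reuse, embedding, or RAG ingestion.
// ASSUMPTION: the page's Markdown is available at "<page-url>.md"; verify against
// a real Turnkey docs page (the contextual menu's copy button gives the same content).
const PAGE_URL = "https://docs.turnkey.com/getting-started/quickstart"; // example path, may differ

async function fetchPageMarkdown(pageUrl: string): Promise<string> {
  const res = await fetch(`${pageUrl}.md`);
  if (!res.ok) {
    throw new Error(`Could not fetch Markdown for ${pageUrl}: ${res.status}`);
  }
  return res.text();
}

fetchPageMarkdown(PAGE_URL)
  .then((md) => console.log(md.slice(0, 300)))
  .catch(console.error);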

To enable this, we’re using Mintlify’s contextual param. Learn more in Mintlify’s documentation.