You can use large language models (LLMs) to assist in building Invopop integrations. We provide a set of tools and best practices for using LLMs during development.

Plain Text Docs

You can access all of our documentation as plain text markdown files by adding .md to the end of any URL. For example, you can find the plain text version of this page itself at https://docs.invopop.com/building-with-llms.md. This helps AI tools and agents consume our content, and allows you to copy and paste the entire contents of a doc into an LLM. This format is preferable to scraping or copying from our HTML and JavaScript-rendered pages because:
  • Plain text contains fewer formatting tokens.
  • Content that isn’t rendered in the default view of a page (for example, content hidden in a tab) is included in the plain text version.
  • LLMs can parse and understand markdown hierarchy.
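If you point an agent at our docs programmatically, appending the .md suffix is easy to automate. The helper below is a sketch of our own (the function name is illustrative, not part of any Invopop tooling):

```python
def to_markdown_url(page_url: str) -> str:
    """Return the plain-text markdown URL for a docs page."""
    # Drop any trailing slash, then append the .md suffix.
    return page_url.rstrip("/") + ".md"

print(to_markdown_url("https://docs.invopop.com/building-with-llms"))
# → https://docs.invopop.com/building-with-llms.md
```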
We also host an /llms.txt file that tells AI tools and agents how to retrieve the plain text versions of our pages. The /llms.txt file is an emerging standard for making websites and content more accessible to LLMs.
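Because llms.txt files are plain markdown, tooling can pull the page links out of ours with a short parser. This sketch assumes the file uses the standard markdown-link format of the llms.txt proposal; the sample content is illustrative, not a copy of our actual file:

```python
import re

def extract_links(llms_txt: str) -> list[str]:
    # llms.txt is markdown; pull the URLs out of [title](url) links.
    return re.findall(r"\[[^\]]*\]\((\S+?)\)", llms_txt)

sample = (
    "# Invopop Docs\n\n"
    "- [Building with LLMs](https://docs.invopop.com/building-with-llms.md): guide\n"
)
print(extract_links(sample))
# → ['https://docs.invopop.com/building-with-llms.md']
```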

Model Context Protocol (MCP) Server

You can connect to our docs MCP server from clients such as Claude, Claude Code, Cursor, and VS Code. For example, to connect in Claude:
  1. Navigate to the Connectors page in the Claude settings.
  2. Select Add custom connector.
  3. Add the Invopop Docs MCP server name (such as invopop-docs) and URL (https://docs.invopop.com/mcp).
  4. Add the GOBL Docs MCP server name (such as gobl-docs) and URL (https://docs.gobl.org/mcp).
  5. Select Add.
  6. When using Claude, select the attachments button (the + icon).
  7. Select your MCP server.
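For clients that register MCP servers through a JSON configuration file (such as Cursor or VS Code), the same two servers can be declared directly. The exact file location and schema vary by client, so treat this as a sketch rather than a definitive configuration:

```json
{
  "mcpServers": {
    "invopop-docs": { "url": "https://docs.invopop.com/mcp" },
    "gobl-docs": { "url": "https://docs.gobl.org/mcp" }
  }
}
```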
See the Model Context Protocol documentation for more details.