llms.txt

Webull OpenAPI documentation is published in machine-readable formats to support AI-assisted development. Large Language Models can reference the documentation directly — helping you generate more accurate integration code, troubleshoot faster, and explore the API with AI tools.

Machine-Readable Documentation (llms.txt)

The documentation follows the llms.txt standard — a lightweight format that gives LLMs a complete, structured reference of the API.

Index file:

https://developer.webull.com/apis/llms.txt
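Per the llms.txt convention, the index is plain Markdown whose sections list links in the form `- [Title](url): description`. A minimal sketch of fetching and parsing it with the Python standard library (the function names and the assumption that Webull's index follows this exact link format are mine):

```python
import re
import urllib.request

LLMS_TXT_URL = "https://developer.webull.com/apis/llms.txt"

# Matches the llms.txt link convention: "- [Title](url): optional description"
LINK_RE = re.compile(r"-\s*\[(?P<title>[^\]]+)\]\((?P<url>[^)]+)\)(?::\s*(?P<desc>.*))?")

def parse_llms_txt(text: str) -> list[dict]:
    """Extract title/url/description entries from llms.txt content."""
    entries = []
    for m in LINK_RE.finditer(text):
        entries.append({
            "title": m.group("title"),
            "url": m.group("url"),
            "description": m.group("desc") or "",
        })
    return entries

def fetch_index(url: str = LLMS_TXT_URL) -> list[dict]:
    """Download the index and return its parsed link entries."""
    with urllib.request.urlopen(url) as resp:
        return parse_llms_txt(resp.read().decode("utf-8"))
```

The parsed entries can then be fed to a crawler or embedded page by page.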

Markdown Access

Every documentation page has a Markdown variant. Append .md to any page URL to get the raw Markdown content, which is ideal for feeding into LLMs, RAG pipelines, or documentation crawlers.

Page URL                                                  | Markdown URL
https://developer.webull.com/apis/docs/about              | https://developer.webull.com/apis/docs/about.md
https://developer.webull.com/apis/docs/trade-api/overview | https://developer.webull.com/apis/docs/trade-api/overview.md

Fetch Markdown content from the command line:

curl https://developer.webull.com/apis/docs/about.md
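Because the Markdown URL is simply the page URL with .md appended, the conversion is trivial to script. A small helper (the function name is mine):

```python
def markdown_url(page_url: str) -> str:
    """Convert a docs page URL to its Markdown variant by appending .md.

    A trailing slash is stripped first so the suffix attaches cleanly.
    """
    return page_url.rstrip("/") + ".md"
```

For example, `markdown_url("https://developer.webull.com/apis/docs/about")` yields the URL used in the curl command above.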

Integration with AI Tools

Claude

In Claude, you can add the Webull OpenAPI documentation as a Project Knowledge source:

  1. Create or open a Project in Claude.
  2. In the Project settings, click Add Content → Add from URL.
  3. Paste the llms.txt URL:
    https://developer.webull.com/apis/llms.txt
  4. Claude will use the documentation as context for all conversations within that project.

Cursor

In Cursor, you can add the Webull OpenAPI documentation as custom docs:

  1. Open the command palette (Command + Shift + P) and select Add New Custom Docs.
  2. Paste the llms.txt URL:
    https://developer.webull.com/apis/llms.txt
  3. In any AI conversation, use @Docs (via the Add Context menu) to attach the Webull OpenAPI reference. The AI will use it as context for code generation and Q&A.

Kiro

In Kiro, you can add the llms.txt as a steering file or reference it directly in chat using #URL to give the AI full context of the Webull OpenAPI documentation.

ChatGPT

In ChatGPT, you can provide the documentation via file upload or custom instructions:

  1. Download the llms.txt file or copy its content.
  2. In a conversation, click the attachment icon and upload the file — or paste the content directly into the chat.
  3. For persistent access, add the llms.txt URL to your Custom Instructions or use it within a GPT as a knowledge source.

Gemini

In Google Gemini, you can reference the documentation directly:

  1. Upload the llms.txt file as an attachment in a conversation.
  2. Alternatively, paste the raw content or specific sections into the chat for targeted questions.

More AI Models / Tools

The llms.txt URL works with any tool that accepts external documentation as context — including RAG pipelines, MCP servers, and general-purpose AI coding assistants.
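For a RAG pipeline, a common pattern is to fetch each page's Markdown variant and split it into heading-delimited chunks before embedding. A minimal chunking sketch (the strategy and function name are illustrative, not prescribed by the Webull docs):

```python
import re

def chunk_markdown(md: str) -> list[str]:
    """Split a Markdown document into chunks at H1/H2 headings.

    Uses a zero-width lookahead so each chunk keeps its heading line.
    """
    parts = re.split(r"(?m)^(?=#{1,2}\s)", md)
    return [p.strip() for p in parts if p.strip()]
```

Each chunk can then be embedded and indexed alongside its source URL so answers can cite the originating page.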

https://developer.webull.com/apis/llms.txt