You can also use locally running Ollama models. Installation instructions for Ollama can be found here. Once Ollama is installed, you can start a local LLM by executing `ollama run <modelname>`.
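For example, assuming the `llama3.2` model (a hypothetical choice; substitute any model from the Ollama library) has been pulled, you can start it and then query Ollama's local HTTP API, which listens on port 11434 by default:

```shell
# Start the model interactively (downloads it on first use).
# "llama3.2" is an example model name, not a requirement.
ollama run llama3.2

# Alternatively, query the local Ollama HTTP API directly.
# "stream": false returns a single JSON response instead of a stream.
curl http://localhost:11434/api/generate -d '{
  "model": "llama3.2",
  "prompt": "Why is the sky blue?",
  "stream": false
}'
```

Tools that support Ollama typically connect to this same local endpoint (`http://localhost:11434`), so no API key is needed.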

