

You can also use locally running Ollama models. Installation instructions for Ollama are available on the Ollama website. Once Ollama is installed, you can start a local LLM by executing `ollama run <modelname>`, for example `ollama run llama3`.

Install Ollama addon

```bash
npm install @llm-tools/embedjs-ollama
```

Usage

```ts
import { RAGApplicationBuilder } from '@llm-tools/embedjs';
import { Ollama } from '@llm-tools/embedjs-ollama';

// Use the local Ollama server (default address http://localhost:11434) as the LLM
const app = await new RAGApplicationBuilder()
    .setModel(new Ollama({
        modelName: "llama3",
        baseUrl: 'http://localhost:11434'
    }))
    .build();
```
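Once the application is built, you can ingest content and query it against the local model. The following is a minimal sketch, assuming `TextLoader` is exported from the core `@llm-tools/embedjs` package and that your embedJs version exposes the usual `addLoader` and `query` methods; depending on the version, the builder may also require an embedding model and vector database to be configured before `build()` succeeds.

```ts
import { TextLoader } from '@llm-tools/embedjs';

// Ingest a small piece of text into the RAG application built above
await app.addLoader(new TextLoader({ text: 'Ollama runs large language models locally.' }));

// Ask a question; the answer is generated by the local llama3 model
const result = await app.query('Where does Ollama run models?');
console.log(result); // the response object includes the generated answer and its sources
```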