## Recent Changes (May 18, 2025)

### Ollama Integration for AI Text Generation
An integration with Ollama has been added as a new provider for the AI text generator feature, allowing users to leverage local large language models.
Key Changes:
- New Provider: "Ollama" is now available in the AI generator's model/provider selection.
- Model Name as Key: When Ollama is selected, the "API Key" input field is repurposed to accept the Ollama model name (e.g., `llama3`, `mistral`) instead of a traditional API key.
- Local Endpoint: The integration communicates with a local Ollama instance; configuration details are below.
- Streaming Support: Responses from Ollama are streamed into the text area.
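
The streaming flow described above could be sketched as follows. This is an illustrative snippet, not the actual `ai-generator.js` code; `extractOllamaText` and `generateWithOllamaSketch` are hypothetical names, and a production version should also buffer partial NDJSON lines across chunk boundaries.

```javascript
// Ollama's /api/generate endpoint streams newline-delimited JSON objects,
// each carrying a "response" text fragment and a "done" flag.

// Parse one chunk of the NDJSON stream into its concatenated text fragments.
function extractOllamaText(chunk) {
  return chunk
    .split("\n")
    .filter((line) => line.trim() !== "")
    .map((line) => JSON.parse(line).response)
    .join("");
}

// Hypothetical streaming helper: onToken is called as text arrives.
async function generateWithOllamaSketch(model, prompt, onToken) {
  const response = await fetch("http://localhost:11434/api/generate", {
    method: "POST",
    headers: {"Content-Type": "application/json"},
    body: JSON.stringify({model, prompt, stream: true})
  });
  const reader = response.body.getReader();
  const decoder = new TextDecoder();
  while (true) {
    const {done, value} = await reader.read();
    if (done) break;
    // Note: a chunk boundary can split a JSON line; real code should buffer.
    onToken(extractOllamaText(decoder.decode(value, {stream: true})));
  }
}
```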
### Ollama Setup and Configuration
To use Ollama with Fantasy Map Generator, you need to ensure Ollama is correctly running and configured on your machine.
1. Install Ollama:
- Download and install Ollama from ollama.com.
- Download the desired models (e.g., `ollama run llama3`).
2. Configure Ollama for Network Access (Crucial Step):
By default, Ollama might only listen for connections from the same machine (localhost or 127.0.0.1). For Fantasy Map Generator to access Ollama, especially from other devices on your local network, you must configure Ollama to listen on all network interfaces and allow cross-origin requests.
   - Set the `OLLAMA_HOST` environment variable:
     - This variable tells Ollama which network interfaces to listen on.
     - Action: Set `OLLAMA_HOST` to `0.0.0.0`.
     - How to set (Windows, permanent):
       - Search for "Edit the system environment variables" in the Windows search bar.
       - Click "Environment Variables...".
       - In the "System variables" section (bottom pane), click "New..." (or "Edit..." if the variable already exists).
       - Variable name: `OLLAMA_HOST`
       - Variable value: `0.0.0.0`
       - Click "OK" on all dialogs.
       - Restart your PC so the change takes effect for all processes.
     - How to set (Linux/macOS, per session or persistent):
       - Per session: in your terminal, before running `ollama serve`, run `export OLLAMA_HOST="0.0.0.0"`.
       - Persistent: add `export OLLAMA_HOST="0.0.0.0"` to your shell profile file (e.g., `~/.bashrc`, `~/.zshrc`), then `source` the file or restart your terminal.
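
If you prefer the command line on Windows as well, the `OLLAMA_HOST` setting above can be applied as in the following sketch (`setx` persists the value for the current user; already-running Ollama processes must still be restarted):

```shell
# Windows (Command Prompt): persist OLLAMA_HOST for the current user.
setx OLLAMA_HOST "0.0.0.0"

# Linux/macOS: apply for the current session, then start the server.
export OLLAMA_HOST="0.0.0.0"
ollama serve
```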
   - Set the `OLLAMA_ORIGINS` environment variable (CORS configuration):
     - This variable is essential for browsers to allow JavaScript served from one origin (Fantasy Map Generator on port 8000) to communicate with Ollama on a different port (11434).
     - Action: Set `OLLAMA_ORIGINS` to allow your Fantasy Map Generator's origin.
     - How to set (Windows, permanent): follow the same steps as for `OLLAMA_HOST`, but use:
       - Variable name: `OLLAMA_ORIGINS`
       - Variable value: `http://<YOUR_PC_IP_ADDRESS>:8000` (e.g., `http://192.168.178.46:8000`)
       - For development (easiest): you can use `*` as the value (`OLLAMA_ORIGINS=*`) to allow all origins. This is less secure for production but simplifies testing.
       - Restart your PC after setting the variable.
     - How to set (Linux/macOS, per session or persistent):
       - Per session: `export OLLAMA_ORIGINS="http://<YOUR_PC_IP_ADDRESS>:8000"` or `export OLLAMA_ORIGINS="*"`
       - Persistent: add the `export` line to your shell profile file.
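
The `OLLAMA_ORIGINS` setting can likewise be applied from the command line; this is a sketch, and the IP address shown is only an example of your own machine's address:

```shell
# Windows (Command Prompt): persist OLLAMA_ORIGINS for the current user.
setx OLLAMA_ORIGINS "http://192.168.178.46:8000"

# Linux/macOS, development-only wide-open CORS (less secure):
export OLLAMA_ORIGINS="*"
ollama serve
```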
   - Firewall configuration:
     - Ensure your PC's firewall (e.g., Windows Defender Firewall) is not blocking incoming connections to Ollama's default port, `11434`.
     - Action: Create an inbound rule to allow TCP traffic on port `11434`.
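
On Windows, the inbound rule described above can also be created from an elevated Command Prompt with `netsh`; the rule name is arbitrary:

```shell
netsh advfirewall firewall add rule name="Ollama 11434" dir=in action=allow protocol=TCP localport=11434
```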
3. Configure Fantasy Map Generator's `ai-generator.js`:
   The `ai-generator.js` file needs to point to the correct Ollama endpoint.
   - Scenario A: Using only on the same machine (`localhost`):
     - Ensure the `fetch` call in the `generateWithOllama` function (inside `modules/ui/ai-generator.js`) points to `http://localhost:11434/api/generate`. This is usually the default.
   - Scenario B: Using from other machines on the local network:
     - You must change the `fetch` call in the `generateWithOllama` function (inside `modules/ui/ai-generator.js`) to use the actual local IP address of the machine where Ollama is running.
     - Example:

       ```js
       // Inside modules/ui/ai-generator.js, within the generateWithOllama function:
       const response = await fetch("http://192.168.178.46:11434/api/generate", /* ... */); // Replace with your actual PC's IP
       ```

     - How to find your PC's IP:
       - Windows: Open Command Prompt (`cmd`) and type `ipconfig`. Look for "IPv4 Address" under your active network adapter.
       - Linux/macOS: Open Terminal and type `ip addr show` or `ifconfig`.
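
To avoid editing the URL in more than one place when switching between the two scenarios, the endpoint could be built by a small helper; `ollamaEndpoint` is a hypothetical function, not part of the current `ai-generator.js`:

```javascript
// Hypothetical helper: build the Ollama generate endpoint from a
// configurable host, so switching between localhost (Scenario A) and a
// LAN IP (Scenario B) is a one-line change.
function ollamaEndpoint(host = "localhost", port = 11434) {
  return `http://${host}:${port}/api/generate`;
}

// Scenario A: ollamaEndpoint()
// Scenario B: ollamaEndpoint("192.168.178.46")  // your PC's actual IP
```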