How to run Elephas offline with Ollama

Running the Elephas knowledge assistant 100% offline with Ollama

Elephas offers a privacy-friendly offline mode that runs 100% locally.

Here are the steps:


Install Ollama on your Mac
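If you prefer the command line, one common way to install Ollama on a Mac is via Homebrew (this assumes Homebrew is installed; you can also download the app directly from ollama.com):

```shell
# Install Ollama via Homebrew (alternative: download the app from ollama.com)
brew install ollama

# Start the Ollama server so models can be downloaded and served
ollama serve
```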

Install the AI Deck client to manage Ollama models

This step is optional; use AI Deck if you prefer a UI for downloading models from Ollama.

Download a Chat model

There are many decent local models; some recommended ones are:

Llama 3.2

Mistral

DeepSeek
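With Ollama running, any of the models above can be pulled from the terminal. The tags below are the ones Ollama uses for these models at the time of writing; check the Ollama model library if a tag has changed:

```shell
# Pull one (or more) of the recommended chat models
ollama pull llama3.2
ollama pull mistral
ollama pull deepseek-r1

# Confirm the models were downloaded
ollama list
```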

Download an Embedding model

Embedding is the process of converting text into numerical vectors so that pieces of text can be compared easily.

nomic-embed-text-1.5 is a popular local embedding model.
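On Ollama, this model is published under the tag nomic-embed-text (which corresponds to version 1.5), so it can be pulled the same way as the chat models:

```shell
# Pull the embedding model used to convert text into vectors
ollama pull nomic-embed-text
```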


Using local models in Elephas

Now, go to Elephas Settings → AI Providers → Offline AI

By default, Elephas will pick up the local models available in Ollama.
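If no models show up, a quick way to confirm that Ollama is running and to see which models it exposes is to query its local API (Ollama listens on port 11434 by default):

```shell
# List the models Ollama is serving locally;
# Elephas detects models from this same local endpoint
curl http://localhost:11434/api/tags
```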
