r/LocalLLaMA - Reddit
General-purpose models
1.1B: TinyDolphin 2.8 1.1B. Takes about 700MB of RAM; tested on my Pi 4 with 2GB of RAM. Hallucinates a lot, but works for basic conversation.
2.7B: Dolphin 2.6 Phi-2. Takes a bit over 2GB of RAM; tested on my 3GB 32-bit phone via llama.cpp on Termux.
7B: Nous Hermes Mistral 7B DPO. Takes about 4-5GB of RAM depending on context
...
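The RAM figures above roughly track parameter count times bytes per weight for the quantization, plus some runtime overhead. A minimal sketch of that estimate (the bits-per-weight and overhead values are illustrative assumptions, not exact llama.cpp numbers):

```python
def approx_ram_gb(params_billion, bits_per_weight, overhead_gb=0.2):
    """Rough RAM estimate for a quantized model:
    parameters * (bits / 8) bytes for the weights, plus a fixed
    overhead for context buffers and the runtime (illustrative)."""
    weights_gb = params_billion * bits_per_weight / 8
    return weights_gb + overhead_gb

# 1.1B at ~4.5 bits/weight (Q4-ish) comes out near the ~700MB figure
print(round(approx_ram_gb(1.1, 4.5), 2))  # ≈ 0.82 GB
```

Larger context windows add to the KV cache on top of this, which is why the 7B figure varies with context length.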