Show HN: Gerbil – an open source desktop app for running LLMs locally (github.com)
Gerbil is an open source app that I've been working on for the last couple of months. Development is now largely done and I'm unlikely to add any more major features. Instead I'm focusing on bug fixes, small QoL features and dependency upgrades.
Under the hood it runs llama.cpp backends (via koboldcpp) and allows easy integration with popular frontends like Open WebUI, SillyTavern, ComfyUI, StableUI (built-in) and KoboldAI Lite (built-in).
Why did I create this? I wanted a simple all-in-one solution for local text and image generation. I got fed up with needing to manage multiple tools for the various LLM backends and frontends. In addition, as a Linux Wayland user I needed something that would work and look great on my system.
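For anyone wondering what the frontend integration looks like in practice: koboldcpp exposes an OpenAI-compatible HTTP API (port 5001 by default), so any client that speaks that protocol can talk to whatever model the backend has loaded. A minimal sketch in Python, assuming the default port and that a model is already running (the URL and the throwaway model name here are assumptions, not anything Gerbil-specific):

    import requests

    # Query a locally running koboldcpp backend through its
    # OpenAI-compatible chat endpoint (default port assumed to be 5001).
    resp = requests.post(
        "http://localhost:5001/v1/chat/completions",
        json={
            "model": "local",  # most local backends ignore this field
            "messages": [
                {"role": "user", "content": "Summarise llama.cpp in one sentence."}
            ],
            "max_tokens": 128,
        },
        timeout=120,
    )
    resp.raise_for_status()
    print(resp.json()["choices"][0]["message"]["content"])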
The big feature I would like to see is a way to easily interact with the contents of the local filesystem. I have a prompt for renaming scans based on parsing their content which I've been using in Copilot, but recent changes require that I:
- launch Copilot
- enter a prompt to get it into Copilot Pages mode
- click a button to actually get into that mode
- paste in the prompt
- drag in 20 files
- wait for them to upload
- click the button to process the prompt on the uploaded files
- quit Copilot, relaunch it, delete the conversation, quit, launch Copilot again and have it not start, which then lets me repeat from the beginning
It would be much easier if I could just paste in the prompt specifying a folder full of files for it to run on, then clear that folder out for the next day's files and repeat.
Would that be something which your front-end could do? If not, is there one which could do that now? (apparently jan.ai has something like this on their roadmap for 0.8)
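For what it's worth, once a backend is exposing an OpenAI-compatible endpoint, that folder workflow can be scripted without any frontend at all. A rough sketch, assuming a local server on port 5001, already-OCR'd text files, and a hypothetical "scans_inbox" folder name:

    import pathlib
    import requests

    API_URL = "http://localhost:5001/v1/chat/completions"  # assumed local backend
    PROMPT = "Suggest a short descriptive filename (no extension) for this scanned document:"

    # Ask the local model to name each file in the folder, then rename it in place.
    for path in sorted(pathlib.Path("scans_inbox").glob("*.txt")):
        text = path.read_text(errors="ignore")[:4000]  # keep the request small
        resp = requests.post(
            API_URL,
            json={
                "messages": [{"role": "user", "content": f"{PROMPT}\n\n{text}"}],
                "max_tokens": 32,
            },
            timeout=300,
        )
        resp.raise_for_status()
        suggestion = resp.json()["choices"][0]["message"]["content"].strip()
        # Keep only filesystem-safe characters; fall back to the original stem.
        safe = "".join(c for c in suggestion if c.isalnum() or c in " -_").strip() or path.stem
        path.rename(path.with_name(f"{safe}{path.suffix}"))

Clearing the folder out for the next day's files then just becomes part of the same script.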
Serious question, not a "what's the point of this" shitpost... My experience with local LLMs is limited.
Just installed LM Studio on a new machine today (2025 Asus ROG Flow Z13, 96GB VRAM, running Linux). Haven't had the time to test it out yet.
Is there a reason for me to choose Gerbil instead? Or something else entirely?
Not OP, but I am running ollama as a testing ground for various projects (separately from my GPT sub).
<< Is there a reason for me to choose Gerbil instead? Or something else entirely?
My initial reaction is positive, because it seems to integrate everything without sacrificing the ability to customize it further if need be. That said, I haven't tested it yet, but now I will.