Who is using Ollama day-to-day?
2 points by brauhaus 18 hours ago | hide | past | favorite | 3 comments
I maintain a Chrome extension that adds AI support to any input field online.

One of my favorite reviews is a 2-star review asking for “local model support”.

My first reaction was: who installs a 100KB Chrome extension to talk to a 10GB model running locally?

But it did make me curious. Are people actually running Ollama or LM Studio as part of their daily workflow?




Yes, I do. It is quite helpful. For my main coding tasks I use Claude, but if my credits run out, or I have enough time to wait for an answer, I use Ollama extensively. I would recommend that developers maintain their own local AI pipeline as well.

For anything dealing with personal data, like browser inputs, I would exclusively use local models too. Probably still niche, but non-local AI would be a deal breaker for me for both browsers and OS.


I'm curious what the workflow actually looks like for people running Ollama day-to-day.

Do you mostly use it through the terminal, a UI like Open WebUI, or via integrations with other tools?

I’m trying to understand where a browser integration would actually fit, if at all.
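For what it's worth, Ollama serves a REST API on localhost (port 11434 by default), so a browser extension could talk to it directly. A minimal sketch, assuming Ollama is running and a model like "llama3.2" is pulled (both the model name and the helper names here are just for illustration):

```javascript
// Default endpoint of the local Ollama server.
const OLLAMA_URL = "http://localhost:11434/api/generate";

// Build the fetch options for a non-streaming generate request.
function buildRequest(prompt, model = "llama3.2") {
  return {
    method: "POST",
    headers: { "Content-Type": "application/json" },
    body: JSON.stringify({ model, prompt, stream: false }),
  };
}

// Send a prompt and return the completion text.
// /api/generate returns a JSON object with a "response" field.
async function complete(prompt) {
  const res = await fetch(OLLAMA_URL, buildRequest(prompt));
  const data = await res.json();
  return data.response;
}
```

In a real extension you'd also need host permissions for localhost in the manifest, and CORS may require routing the call through the background service worker rather than a content script.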


I am running it, but it is only useful for very easy inference tasks; otherwise it needs very high computing power. I'm currently running it on a 32GB Mac Studio M1, and I mostly use it for generating commit messages.


