
Yes. If the AI is not integrated with the IDE, it's not as helpful. If there were an IDE plugin that let you use a local model, perhaps that would be an option, but I haven't seen one (GitHub Copilot allows selecting different models, but I haven't checked whether that includes a local one; does anyone know?).


> (GitHub Copilot allows selecting different models, but I haven't checked whether that includes a local one; does anyone know?)

To my knowledge, it doesn't.

On Emacs there's gptel, which integrates different LLMs quite nicely inside Emacs, including a local Ollama instance.

> gptel is a simple Large Language Model chat client for Emacs, with support for multiple models and backends. It works in the spirit of Emacs, available at any time and uniformly in any buffer.

https://github.com/karthink/gptel
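
Under the hood, gptel's Ollama backend (like the IDE plugins mentioned elsewhere in this thread) just talks to Ollama's local HTTP API. A minimal Python sketch, assuming a default Ollama install on port 11434; the model name and prompt are placeholders:

    # Minimal sketch of what an Ollama-backed editor integration does:
    # POST a chat request to the local Ollama server (default port 11434).
    import requests

    resp = requests.post(
        "http://localhost:11434/api/chat",
        json={
            "model": "codellama",  # placeholder: any model you've pulled locally
            "messages": [
                {"role": "user", "content": "Explain what this function does."}
            ],
            "stream": False,  # ask for one complete JSON response
        },
        timeout=120,
    )
    resp.raise_for_status()
    print(resp.json()["message"]["content"])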


This can use Ollama: https://www.continue.dev/


It’s doable; it’s the setup I use for experimenting.

Ollama + the CodeGPT IntelliJ plugin, which lets you point at a local instance.
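
Before pointing CodeGPT (or Continue) at a local instance, it's easy to sanity-check that Ollama is reachable. A quick Python sketch, assuming the default base URL; /api/tags lists the models you've pulled:

    import requests

    # List locally pulled models to confirm the server is up
    # before configuring an IDE plugin against it.
    tags = requests.get("http://localhost:11434/api/tags", timeout=5).json()
    print([m["name"] for m in tags.get("models", [])])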


I also use Ollama for coding. I have a 32GB M2 Mac, and the models I can run locally are very useful for coding and debugging, as well as data munging. That said, sometimes I also use Claude 3.5 Sonnet and o1. (BTW, I just published an Ollama book yesterday, so I am a little biased towards local models.)
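
For that kind of ad-hoc coding and debugging use, the official ollama Python client works too. A sketch, assuming you've pip-installed the ollama package and pulled the (placeholder) model named below:

    import ollama

    # Ask a local model to explain a traceback; the model name is a
    # placeholder -- substitute whatever fits in your machine's RAM.
    response = ollama.chat(
        model="qwen2.5-coder",
        messages=[{"role": "user", "content": "Explain this Python traceback: ..."}],
    )
    print(response["message"]["content"])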


Thanks for the book!


> If there were an IDE plugin that let you use a local model

TabbyML




