
Off the shelf, LLMs can’t just look at docs and give you the answer.

But if you properly pre-process the documents and build a RAG-style system (which uses embeddings to find semantically similar docs before inserting them into the LLM's context), then it actually works quite well.
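The retrieval step can be sketched roughly like this. This is a toy illustration only: it uses bag-of-words vectors with cosine similarity in place of a real neural embedding model, and all the names and sample docs here are made up.

```python
import math
import re
from collections import Counter


def embed(text):
    # Toy "embedding": a bag-of-words count vector. A real RAG system
    # would call an embedding model here instead.
    return Counter(re.findall(r"[a-z0-9]+", text.lower()))


def cosine(a, b):
    # Cosine similarity between two sparse count vectors.
    dot = sum(a[t] * b[t] for t in a)
    na = math.sqrt(sum(v * v for v in a.values()))
    nb = math.sqrt(sum(v * v for v in b.values()))
    return dot / (na * nb) if na and nb else 0.0


def retrieve(query, docs, k=2):
    # Rank docs by similarity to the query and keep the top k.
    q = embed(query)
    ranked = sorted(docs, key=lambda d: cosine(q, embed(d)), reverse=True)
    return ranked[:k]


def build_prompt(query, docs, k=2):
    # Stuff the retrieved docs into the LLM context ahead of the question.
    context = "\n---\n".join(retrieve(query, docs, k))
    return f"Answer using only this context:\n{context}\n\nQuestion: {query}"


# Hypothetical internal-wiki snippets:
docs = [
    "Expense reports must be filed within 30 days of travel.",
    "The VPN requires two-factor authentication to connect.",
    "Conference rooms are booked through the facilities portal.",
]
prompt = build_prompt("How do I connect to the VPN?", docs, k=1)
```

The resulting prompt contains only the VPN doc, which is the whole trick: the model answers from the handful of relevant snippets instead of the entire wiki.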

It’s good for big organizations with internal wikis, I’ve found.

It also works well for ingesting articles from online publications.



ChatGPT helped me set up a Hugo site from scratch, templating and all.

It did so by just reading the Hugo documentation, without any RAG.

And it was 10x better than the huge documentation itself.


Well, that’s because Hugo is very simple to set up, there are lots of tutorials about it online, it hasn’t changed drastically in the past year, and it’s mostly the same as other static site generators.

You won’t find that kind of ease of use with, say, webgpu + winit for building a small renderer in Rust.



