It was cool to see subreddit simulators evolve alongside progress in text generation, from Markov chains, to GPT-2, to this. But as they made huge leaps in coherence, a wonderful sort of chaos was lost. (NB: the original sub is now being written by a generic foundation LLM.)
Cool! I like the way this kind of demo breaks the fourth wall of its medium. I'm actually surprised that I haven't really seen this kind of thing in the demoscene, where it's always just OpenGL craziness within the confines of a window, which doesn't really "take advantage" of living in a windowing operating system.
Yeah, they confirm that at the bottom of the linked page:
> Furthermore, by leveraging tools like MapAnything to generate metric points, ShapeR can even produce metric 3D shapes from monocular images without retraining.
ELI5 has meant friendly, simplified explanations (not responses aimed at literal five-year-olds) since forever, at least on the subreddit where the concept originated.
Now, perhaps referring to differentiability isn't layperson-accessible, but this is HN after all. Personally, I found it to be the perfect degree of simplification.
I hate to sound like a webdev stereotype, but surely the selector parsing in querySelector, which the engine caches, is not slow enough to warrant maintaining such a build step.
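And if parsing ever did show up in a profile, a few lines at runtime would cover it. A minimal sketch, assuming a made-up `$` helper: memoize resolved elements in a Map so repeated lookups skip the DOM walk entirely (note this caches results, which go stale if the DOM changes, on top of the engine's own selector-parse caching):

    // Hypothetical `$` helper: memoize resolved elements so repeated
    // lookups with the same selector string skip the DOM walk.
    // Note: this caches *results*, which go stale if the DOM changes;
    // the engine's internal selector-parse cache is separate from this.
    const cache = new Map<string, Element | null>();

    function $(selector: string): Element | null {
      let el = cache.get(selector);
      if (el === undefined) {
        el = document.querySelector(selector);
        cache.set(selector, el);
      }
      return el;
    }

    const nav = $(".site-nav");  // one DOM walk
    const same = $(".site-nav"); // served from the cache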
It's not the searching that's infeasible. Efficient algorithms for massive-scale full-text search are available.
The infeasibility is searching for the (unknown) set of translations that the LLM would put that data through. Even if you posit that the weights encode only basic symbolic LUT mappings (they don't), there's no good way to enumerate them anyway. The model might as well be a learned hash function that maintains semantic identity while utterly eradicating literal symbolic equivalence.
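To make that concrete, here's a toy sketch (invented strings, crude tokenizer) of how a meaning-preserving rewrite defeats literal search:

    // Toy illustration (invented strings): literal full-text search
    // finds verbatim copies, but a meaning-preserving rewrite shares
    // essentially no searchable surface with its source.
    const source = "The patient was discharged after a full recovery.";
    const rewrite = "Following complete recuperation, they were sent home.";

    // Substring search: fast at any scale, blind to the rewrite.
    console.log(source.includes(rewrite)); // false

    // Token overlap is empty despite identical meaning.
    const tokens = (s: string) =>
      new Set(s.toLowerCase().match(/[a-z]+/g) ?? []);
    const src = tokens(source);
    const shared = [...tokens(rewrite)].filter((t) => src.has(t));
    console.log(shared); // []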
A lot of medical devices still run XP as well, unfortunately, because the expensive equipment depends on old proprietary software that no longer receives updates.
I don't even know if the golden ratio itself is that magical, but I do see a lot of value in picking one ratio and sticking to it everywhere.
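As a sketch of what "one ratio everywhere" looks like in practice (the base and ratio values here are arbitrary):

    // Sketch: derive every size from one base and one ratio.
    // The golden ratio is used here, but the point is consistency,
    // not the particular value — 1.5 or 1.333 work the same way.
    const RATIO = 1.618;
    const BASE = 16; // base font size in px

    const step = (n: number): number =>
      Math.round(BASE * RATIO ** n * 100) / 100;

    const scale = {
      small: step(-1), // 9.89
      body:  step(0),  // 16
      h3:    step(1),  // 25.89
      h2:    step(2),  // 41.89
      h1:    step(3),  // 67.77
    };

Change RATIO once and the whole design rescales together, which is most of the benefit.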