
"Drivers may not implement their lexers etc. quite well, but that's not the failing of the specification."

Well, it could be, if the language specification required a lot of work in the lexer, but that's probably not the case here.



GLSL isn't anywhere near as hard to parse as C. Besides, the bytecode discussion is a complete red herring. Source-level compilation has been done for ages on the web too; if shaders are slow to compile, it's either sheer incompetence on the part of the driver writers, or parsing isn't the reason.


It's a bit of an understatement to say that hardware vendors aren't known for being good at software. NVIDIA and AMD aren't so bad, but look at how awful the mobile GPU vendors are: https://dolphin-emu.org/blog/2013/09/26/dolphin-emulator-and...

That's what made compiling the entire shader in the driver a bad decision. The vendors are incompetent, and the ARB picked a design that exacerbated that incompetence instead of working around it. Users don't care why their OpenGL implementation has bugs; they just care that it does.


The 'lore', if you will, around this decision is that 3dfx pushed for GLSL being handled in the driver instead of a bytecode model because they had the best driver developers and so would have a clear advantage. Who knows whether that was actually their reasoning; perhaps they just wanted the flexibility?


Yeah, that's in line with my understanding/expectations. My point was just that this is an "in practice" thing, not "in principle".



