Maybe this is a good place to note that with the release of Safari 15, every major browser now has WebGL 2 enabled. It's finally possible to rely on WebGL 2 without excluding iOS users. For GLSL, that means you can upgrade to a new major version of the language (GLSL ES 3.00): some limitations on loops have been lifted, bitwise integer operations are available, and there's plenty more: https://webgl2fundamentals.org/webgl/lessons/webgl2-whats-ne...
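For a taste of what that looks like, here is a rough sketch of a GLSL ES 3.00 (WebGL 2) fragment shader; the variable and uniform names are made up for illustration:

    #version 300 es
    // GLSL ES 3.00: 'in'/'out' replace 'attribute'/'varying', and you declare
    // your own color output instead of writing to gl_FragColor.
    precision highp float;
    precision highp int;

    in vec2 vUv;          // interpolated from the vertex shader
    out vec4 fragColor;   // user-declared fragment output

    uniform uint uMask;   // unsigned integers and bitwise ops are available now

    void main() {
        // Loop bounds no longer have to be compile-time constants.
        int taps = int(uMask & 0xFu);   // bitwise AND
        float sum = 0.0;
        for (int i = 0; i < taps; ++i) {
            sum += float(i);
        }
        fragColor = vec4(vUv, fract(sum * 0.1), 1.0);
    }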
I must be misunderstanding your comment, but WebGPU has not shipped yet except behind flags and/or origin trials. The spec is still changing and not finalized.
It's also pretty broken for WebAssembly (which is where a low-level API like WebGPU is going to have the biggest benefit), since Rust and C/C++ WebGPU bindings tend to fall out of API compatibility with the browser implementation as Chrome and Firefox release new nightlies.
For anyone looking to pick up shaders for the first time, I recommend not beginning with The Book of Shaders. It's an excellent introduction to a particularly heady slice of shader programming that, in my opinion, is a blind alley for folks new to shaders. I think focusing on fragment shaders rendered as full-screen quads makes the process feel more arcane and mind-shattering than it does when you consider the fragment shader's role in a 3D rendering pipeline. In the latter context, the simultaneous invocation of every fragment doesn't feel like a puzzle to solve but a reasonable part of getting pixels onscreen as part of a 3D render. For that reason I recommend newbies first work with fragment shaders in the context of materials, alongside vertex and potentially compute shaders that manipulate geometry, to replicate things like Phong shading or a shader-based particle system.
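To make that concrete, this is roughly what a fragment shader looks like in that ordinary role: shading one pixel of a lit mesh using values the vertex shader wrote out. It's only a sketch, and the uniform names are illustrative rather than from any particular engine:

    #version 300 es
    precision highp float;

    // Interpolated per-vertex values produced by the vertex shader.
    in vec3 vNormal;      // world-space normal
    in vec3 vWorldPos;    // world-space position of this fragment

    // Illustrative material and light parameters.
    uniform vec3 uLightPos;
    uniform vec3 uLightColor;
    uniform vec3 uAlbedo;

    out vec4 fragColor;

    void main() {
        vec3 n = normalize(vNormal);
        vec3 l = normalize(uLightPos - vWorldPos);
        float ndotl = max(dot(n, l), 0.0);   // Lambertian diffuse term
        fragColor = vec4(uAlbedo * uLightColor * ndotl, 1.0);
    }

Seen this way, "this code runs once for every pixel the triangle covers" is just bookkeeping done by the rasterizer, not a riddle you have to solve yourself.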
It hasn't really been added to in a long time, either. I would be surprised at this point if they ever finish it; it feels like I've been checking for over a year now, and starting out on an in-progress book that isn't even 50% of the way there sure sounds like a great exercise in disappointment. I'd love to see it get there and cover things more widely, including setting up a development environment as well as everything you mentioned.
As an outsider, it's really hard to discern what I might choose to learn amongst graphics tech. I've tried to invest my learning in subjects and technologies that aren't going anywhere and where my knowledge will pay off over decades — computer science fundamentals, math, SQL, C, vanilla JavaScript, HTML, CSS, LaTeX, Git, Python, etc. It looks like everybody's still fighting it out in graphics though.
Honestly, if you learn HLSL or GLSL, it's not much work to learn the other.
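Most of the differences are mechanical renames. A rough sketch in GLSL, with the HLSL equivalents noted in the comments (the texture and uniform names are made up):

    #version 330 core

    in vec2 vUv;               // HLSL: a float2 field in the input struct
    uniform sampler2D uTex;    // HLSL: Texture2D plus a SamplerState
    out vec4 fragColor;        // HLSL: return value with an SV_Target semantic

    void main() {              // HLSL: float4 main(PSInput i) : SV_Target
        vec3 a = vec3(1.0, 0.5, 0.25);     // HLSL: float3
        vec3 b = texture(uTex, vUv).rgb;   // HLSL: uTex.Sample(uSmp, i.uv).rgb
        vec3 c = mix(a, b, 0.5);           // HLSL: lerp(a, b, 0.5)
        fragColor = vec4(c, 1.0);          // HLSL: return float4(c, 1.0);
    }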
I originally learned OpenGL in school, and after a few years of working with it I transitioned to a job where we primarily used Direct3D. I guess I had passively picked up enough bits and pieces along the way, because I never really thought about the shift.
I briefly looked at Vulkan when it first came out, but I haven't had the time to get into it. That's a whole different beast.
This. I wrestled with the same topic a couple years ago. Not only which API to use, but even once you choose a “common” one (say OpenGL), actually getting pixels into a window is still different across platforms (hence things like SDL and GLFW).
For me I settled on: 1) Learn WebGL in the browser. It’s a pretty good subset of OpenGL [ES] and an easy environment to experiment in. Then 2) learn a more “native” approach with good tooling support, such as Direct3D 11. (I wouldn’t get into D3D12 or Vulkan until you know the basics pretty well.)
As others have mentioned, while the landscape is confusing, the concepts across them are pretty similar once you get a foothold.
Not quite right: the Wii uses a GL-based API similar to GL 3.3.
Not all Nintendo consoles do so.
And the Switch even has GL 4.5, Vulkan, and NVN to choose from.
As anyone with some cross-platform graphics programming experience knows, even among GL variants, being a variant is enough for shaders not to compile the same way and to require code rewrites.
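The usual symptom is shader source sprouting preprocessor guards. Something like this sketch, assuming the loader prepends the appropriate #version line the way most engines do:

    #ifdef GL_ES
        // GL ES requires explicit precision qualifiers.
        precision mediump float;
    #endif

    #if __VERSION__ >= 130
        // Modern desktop GLSL and GLSL ES 3.00: user-declared in/out.
        in vec2 vUv;
        out vec4 fragColor;
        #define TEX(s, uv) texture(s, uv)
    #else
        // Legacy desktop GLSL and GLSL ES 1.00: varyings and gl_FragColor.
        varying vec2 vUv;
        #define fragColor gl_FragColor
        #define TEX(s, uv) texture2D(s, uv)
    #endif

    uniform sampler2D uTex;

    void main() {
        fragColor = TEX(uTex, vUv);
    }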
What? The Wii uses the GX API from the GameCube [0], which is not similar at all to GL 3.3; it was developed in 1999. It does not have any form of programmable shaders; it uses TexEnv-style combinators.
The Wii U has a custom GX2 API, where the official shader language is GLSL (again, see the Mario Kart 8 kiosk demo).
The Switch has NVN, GL, and Vulkan, where the official shader language for all three is GLSL (based on NVIDIA's GLSL compiler).
Well, first of all, when I wrote Wii, I wasn't even thinking about the distinction between the Wii and the Wii U.
Secondly, the Cafe SDK documentation clearly refers to OpenGL 3.3, and GX2 is basically OpenGL in spirit, reusing its naming and some utility functions, which fits "a GL-based API similar to GL 3.3".
I assume you also have access to Cafe documentation.
PlayStation uses PSSL, which is basically HLSL with slightly different semantics. It's possible to compile HLSL as PSSL with a single (small) header file containing some preprocessor defines. PSSL and HLSL are not developed in lockstep, though, so not all valid PSSL is valid HLSL and vice versa, even setting the defines aside. It's not possible to actually get into details without breaking NDA, and I haven't actively done Sony development since the PS4 days (5-6 years ago).
Vulkan supports both officially, although GLSL support is better. To be fair, Vulkan kind of assumes the game engine's shader compiler does most of the heavy lifting, and using either GLSL or HLSL directly in a small Vulkan application is a pain compared to OpenGL or DirectX.
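For example, the usual Vulkan flow is to write Vulkan-flavored GLSL with explicit descriptor bindings and compile it offline to SPIR-V (e.g. with glslangValidator -V shader.frag -o shader.frag.spv) instead of handing the driver source text at runtime. A minimal sketch:

    #version 450
    // Vulkan-style GLSL: resources carry explicit set/binding decorations,
    // and shader inputs/outputs carry explicit locations.
    layout(set = 0, binding = 0) uniform sampler2D uAlbedo;

    layout(location = 0) in vec2 vUv;
    layout(location = 0) out vec4 fragColor;

    void main() {
        fragColor = texture(uAlbedo, vUv);
    }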
HLSL is proprietary, and GLSL is open. It would be more odd if they had somehow unified. We still have plenty of proprietary vs open choices in computer languages - it doesn’t seem odd C++ and C# haven’t unified, does it?
Yep, not much has changed over the last few years in that regard. Well, except that running complex 3D scenes in a modern browser is mostly fine these days. Plenty of performance.
I still remember Macromedia/Adobe showing off their Flash 3D engine. Nothing was accelerated, just software rendering. Ugly and slow, but there wasn't anything better at the time. It went nowhere, I think.
Of course it went nowhere; we have spent 10 years catching up with 2011 after killing Flash.
See Unreal Engine 3 demo on Flash.
And with Web 3D in 2021 reflecting 2012 hardware, it's no wonder that the answer for state-of-the-art 3D and good debugging tools is native on mobile, or server-side rendering with streaming.
> we have spent 10 years catching up with 2011, after killing Flash.
Lol, Flash 3D was built on OpenGL and DirectX, which are still here and have progressed since then. What are we catching up with? Adobe chose to kill Flash because of WebGL.
Hahaha, Flash abuse by advertisers and spammers is a big reason plugins met their well-deserved end. I really am sorry that you lost your favorite tools; Flash was truly awesome for animators and developers. What you’re talking about is separate and independent from GL APIs, which don’t attempt to compete in the artist-facing tooling space at all.
The irony is that native game development and server side rendering is where all the 3D programming fun is nowadays.
Even when WebGPU eventually lands in stable Chrome, it will again take 10 years to reach similar adoption, and it won't be anything more than a plain 1.0 MVP, short of many things that Vulkan, Metal, and DirectX 12 Ultimate are capable of in 2021.
Safari, made by Apple, was the lone holdout for WebGL 2. You’re familiar with Apple’s relationship to Flash. No, I won’t be surprised at all if Safari holds out on WebGPU, and it will be precisely because it offers functionality that Metal already has.
> Enjoy your pyrrhic victory.
I don’t understand why you’re repeatedly spending time on WebGL threads sharing sour grapes about Flash. They don’t compete. At all. WebGL is not an authoring tool, and Flash was. Your sentiment is just laughably misplaced. Adobe killed Flash; Adobe alone is where your ire should be aimed, for not making an ecosystem that could withstand abusive content creators.
Not sure about other browsers, but this is incorrect for Chrome and Firefox. They both have flags to disable WebGL and WASM. Please be careful not to spread misinformation; we owe it to ourselves to research our posts well.
I don't think the regular user even knows what WebGL and WASM are; they will just wonder why their browser game isn't working and then google the cryptic setting that they have to change to get it to work.
Thinking about designing a scene in a fragment shader is already mind boggling. Writing a new shader from scratch live in front of an audience within a 25 minute deadline is insane. The demoscene calls this a "Shader Showdown" competition, and they are amazing to watch.
A good example is the Revision 2021 semi-finals[1]. The entire video is worth watching, but I especially recommend watching the last half of evvvil-vs-flopine (from about 43:00); the last few minutes (from ~56:00) fundamentally changed my understanding of what is possible in a fragment shader. Ray marching into a hand-crafted signed distance field is such a bizarre way of thinking about a scene.
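If you haven't seen the technique before, the core of it is surprisingly small. Here is a minimal sketch of a ray-marched signed distance field (just a single lit sphere; the resolution uniform is illustrative):

    #version 300 es
    precision highp float;

    uniform vec2 uResolution;   // viewport size in pixels
    out vec4 fragColor;

    // Signed distance from point p to a unit sphere at the origin.
    float sdScene(vec3 p) {
        return length(p) - 1.0;
    }

    void main() {
        // Map this pixel to a ray through a simple pinhole camera.
        vec2 uv = (2.0 * gl_FragCoord.xy - uResolution) / uResolution.y;
        vec3 ro = vec3(0.0, 0.0, 3.0);         // ray origin (camera)
        vec3 rd = normalize(vec3(uv, -1.5));   // ray direction

        // Sphere tracing: the SDF tells us how far we can safely step.
        float t = 0.0;
        for (int i = 0; i < 100; ++i) {
            float d = sdScene(ro + rd * t);
            if (d < 0.001 || t > 20.0) break;
            t += d;
        }

        // Shade hits using a normal estimated from the SDF gradient.
        vec3 col = vec3(0.05);                 // background
        if (t < 20.0) {
            vec3 p = ro + rd * t;
            vec2 e = vec2(0.001, 0.0);
            vec3 n = normalize(vec3(
                sdScene(p + e.xyy) - sdScene(p - e.xyy),
                sdScene(p + e.yxy) - sdScene(p - e.yxy),
                sdScene(p + e.yyx) - sdScene(p - e.yyx)));
            col = vec3(max(dot(n, normalize(vec3(1.0))), 0.0));
        }
        fragColor = vec4(col, 1.0);
    }

Showdown entries typically build the whole scene as one big SDF like this and then layer lighting, materials, animation, and camera moves on top, all inside a single fragment shader.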