Comment by pjmlp 10 months ago

43 replies

No surprise here, given the extent to which HLSL is already the de facto shading language for Vulkan.

Khronos has already mentioned at a couple of conferences that there will be no further work on improving GLSL, and given DirectX's weight in the industry, HLSL has kind of taken over.

Additionally, for the NVidia fans, it might be that Slang also gets a place in the Vulkan ecosystem; discussions are ongoing, as revealed in SIGGRAPH sessions.

TillE 10 months ago

My understanding was that dxc lacked support for compiling various HLSL features to SPIR-V (hence SM7 now), so there are still a bunch of Vulkan-focused projects like Godot which only support GLSL.

But yes, the games industry has been almost entirely HLSL since forever, and this is going to help remove the final obstacles.

  • minraws 10 months ago

    Yep, especially DXC's HLSL-to-SPIR-V path was a big issue when it came to supporting new features from Vulkan.

    Though I would still like to see whether Slang can succeed, and I am always a bit afraid of Microsoft just dropping the ball somewhere.

Simran-B 10 months ago

What about WGSL though, the shader language of WebGPU? WebGPU is kind of Vulkan-lite, but unlike with Vulkan, Apple is on board and is actually the reason why WGSL exists as yet another shading language.

  • jsheard 10 months ago

    What about it? Nobody wanted WGSL; it's just an artifact of having to appease Apple during WebGPU's development, as you say. I don't see why it would be adopted for anything else.

    The old WebGPU meeting notes have some choice quotes from (IIRC) Unity and Adobe engineers literally begging the committee not to invent a new shader language.

    • fngjdflmdflg 10 months ago

      >The old WebGPU meeting notes have some choice quotes from (IIRC) Unity and Adobe engineers literally begging the committee not to invent a new shader language.

      This was an interesting tidbit, so I tried to find the source for it. While I did not find it, I did find the December 2019 minutes[0], which contain a related point:

      >Apple is not comfortable working under Khronos IP framework, because of dispute between Apple Legal & Khronos which is private. Can’t talk about the substance of this dispute. Can’t make any statement for Apple to agree to Khronos IP framework. So we’re discussing, what if we don’t fork? We can’t say whether we’re (Apple) happy with that.

      I found this link via lobste.rs[1], which I found after reading this blog post:[2]

      >Vulkan used a bytecode, called SPIR-V, so you could target it from any shader language you wanted. WebGPU was going to use SPIR-V, but then Apple said no

      The lobsters thread also links to a relevant HN post:[3]

      >I know, I was there. I also think that objection to SPIR-V wasn't completely unfounded. SPIR-V is a nice binary representation of shaders, but it has problems in the context of WebGPU adoption: It's so low level [...] It has a lot of instructions [...] Friction in the features we need, vs features Khronos needs. [...] there is no single well specified and tested textual shading language. HLSL doesn't have a spec.

      The linked blog post from lobsters was also discussed on HN, which you also commented in.[4]

      It would be great if you could find that Unity/Adobe discussion as I would be interested to read it.

      [0] https://docs.google.com/document/d/1F6ns6I3zs-2JL_dT9hOkX_25...

      [1] https://lobste.rs/s/q4ment/i_want_talk_about_webgpu

      [2] https://cohost.org/mcc/post/1406157-i-want-to-talk-about-web...

      [3] https://news.ycombinator.com/item?id=23089745

      [4] https://news.ycombinator.com/item?id=35800988

      • jsheard 10 months ago

        > It would be great if you could find that Unity/Adobe discussion as I would be interested to read it.

        https://github.com/gpuweb/gpuweb/wiki/Minutes-2019-09-24

        Corentin: Web Shading Language — A high-level shading language made by Apple for WebGPU.

        <room in general grimaces>

        [...]

        Jesse B (Unity): We care about HLSL

        Eric B (Adobe): Creating a new high level language is a cardinal sin. Don’t. Do. That. Don’t want to rewrite all my shaders AGAIN.

        Jesse B: If we can transcode to HLSL to whatever you need, great. If we can’t, we may not support your platform at all.

        Eric B: Would really not like even to write another transcoder. If there’s an existing tool to get to an intermediate representation, that’s good. Would suggest SPIRV is an EXCELLENT existing intermediate representation.

        Note that the WSL language made by Apple, which sparked that discussion, is unrelated to the WGSL language they ended up shipping, but the sentiment that the ISV representatives just wanted them to use HLSL or SPIR-V stands.

        • fngjdflmdflg 10 months ago

          >WSL

          Ah, that explains part of why I couldn't find it. I was searching mainly for WGSL, something like 'WEBGPU minutes "Unity" "HLSL" "WGSL"'. There was also WHLSL, also from Apple, at one point, but it was later dropped in favor of WSL.[0][1]

          >A few months ago we discussed a proposal for a new shading language called Web High-Level Shading Language, and began implementation as a proof of concept. Since then, we’ve shifted our approach to this new language, which I will discuss a little later in this post.

          >[...]

          >Because of community feedback, our approach toward designing the language has evolved. Previously, we designed the language to be source-compatible with HLSL, but we realized this compatibility was not what the community was asking for. Many wanted the shading language to be as simple and as low-level as possible. That would make it an easier compile target for whichever language their development shop uses.

          >[...]

          >So, we decided to make the language more simple, low-level, and fast to compile, and renamed the language to Web Shading Language to match this pursuit.

          The "we designed the language to be source-compatible with HLSL, but we realized this compatibility was not what the community was asking for" comment is funny, because Unity's "We care about HLSL" comment seems to directly contradict it.

          In any case, this is really a disappointing move from Apple. Just another example of them ignoring developers – even large developers like Adobe and Unity – over completely petty disputes and severe NIH.

          The craziest line in the post is probably "[WSL] would make it an easier compile target for whichever language their development shop uses." It's like they knew people wanted SPIR-V, but they wouldn't adopt it due to some petty legal drama that Apple invented, and then chose literally the worst of all worlds by creating yet another compile target, instead of at least choosing the next best thing: something compatible with HLSL.

          [0] https://github.com/w3c/strategy/issues/153

          [1] https://webkit.org/blog/9528/webgpu-and-wsl-in-safari/

    • troupo 10 months ago

      > it's just an artifact of having to appease Apple during WebGPUs development

      To appease Google, most likely. WebGPU is based on original work by Apple and Mozilla, who based it on Metal.

      I doubt Apple would be against whatever Metal uses for its shader language.

      • jsheard 10 months ago

        The choice was between using or adapting SPIR-V, which is what basically everyone doing multi-platform development wanted, or using anything else and pissing everyone off by making them support another shader language. Apple stonewalled using SPIR-V or any other Khronos IP on unspecified legal grounds, so they effectively forced the decision to roll a new shader language. Post-hoc rationalizations were given (e.g. human-readable formats being more in the spirit of the web, despite WebAssembly already existing at that point), but the technical merits were irrelevant when one of the biggest stakeholders was never going to accept the alternative for non-technical reasons.

        https://docs.google.com/document/d/1F6ns6I3zs-2JL_dT9hOkX_25...

        > Apple is not comfortable working under Khronos IP framework, because of dispute between Apple Legal & Khronos which is private. Can’t talk about the substance of this dispute. Can’t make any statement for Apple to agree to Khronos IP framework. So we’re discussing, what if we don’t fork? We can’t say whether we’re (Apple) happy with that.

      • jchw 10 months ago

        I don't understand why people say things that are kind of trivial to disprove, but here's the document with the notes where Apple refuses to use SPIR-V.

        https://docs.google.com/document/d/1F6ns6I3zs-2JL_dT9hOkX_25...

        > MS: Apple is not comfortable working under Khronos IP framework, because of dispute between Apple Legal & Khronos which is private. Can’t talk about the substance of this dispute. Can’t make any statement for Apple to agree to Khronos IP framework. So we’re discussing, what if we don’t fork? We can’t say whether we’re (Apple) happy with that.

        Reading between the lines, it seems like Apple mainly doesn't want to implement SPIR-V because engaging with the "Khronos IP framework" would prevent them from suing other Khronos members over patent disputes.

  • pjmlp 10 months ago

    WebGPU, like WebGL, is a decade behind the native APIs it is based on.

    No one asked for a new Rust-like shading language that they have to rewrite their shaders in.

    Also, contrary to FOSS circles, most studios don't really care about Web 3D, which is why streaming is such a thing for them.

    There have been HLSL-to-SPIR-V compilers for several years now; this is Microsoft's own official compiler getting a SPIR-V backend as well.
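
    For illustration, the SPIR-V backend in dxc is driven by a single flag (file names here are hypothetical):

    ```sh
    # Compile an HLSL pixel shader straight to SPIR-V with Microsoft's dxc
    dxc -T ps_6_0 -E main -spirv shader.hlsl -Fo shader.spv
    ```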

    • torginus 10 months ago

      Because WebGL, just like WebAssembly (with its hacky thread support and compilation issues), is a giant kludge.

      WebGL still has the fundamental issue of not supporting anything resembling a modern OpenGL feature set (with "modern" meaning 2010s-era stuff like compute shaders and multi-draw indirect) even in theory. And in practice, macOS doesn't support WebGL2, meaning no multiple render targets (which are necessary for deferred rendering), so it's almost impossible to make a modernish game that runs well in a browser.

      Imo the problem isn't that WebGPU/Wasm is a decade (or however many years) behind, but that we cannot expect a feature set that existed on typical mid-2000s PCs to reliably work in the browser across all platforms (which is the whole point of the web).

      • Ygg2 10 months ago

        It's almost as if some fruit-based company is sabotaging the effort to keep its walled garden intact.

      • flohofwoe 10 months ago

        > macOS doesn't support WebGL2

        WebGL2 has been fully supported in Safari for quite a while now. In fact, it uses the same rendering backend as Chrome and Firefox (ANGLE), and AFAIK Google and Apple engineers worked together to create (or at least improve?) the ANGLE Metal backend and integrate ANGLE into Safari.

      • adrian17 10 months ago

        Safari has supported WebGL2 since version 15 - unless you meant something else by macOS lacking support?

        (I agree with your general point though.)

  • flohofwoe 10 months ago

    The native WebGPU libraries accept SPIR-V as input, and they offer libraries to convert WGSL to SPIR-V and back. WGSL is only really needed when running WebGPU in browsers, but even there it can be code-generated from other shading languages by going through SPIR-V (but tbh, I actually like WGSL; it's simple and straightforward).
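
    As a taste of that simplicity, here is a minimal WGSL fragment shader (the entry point name is arbitrary):

    ```wgsl
    // Returns a constant orange color for every fragment
    @fragment
    fn fs_main() -> @location(0) vec4<f32> {
        return vec4<f32>(1.0, 0.5, 0.2, 1.0);
    }
    ```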

    • MindSpunk 10 months ago

      Except that the conversion to WGSL is a complete waste of compute resources, engineering effort and the time of everyone involved. WebGPU is a _web_ API after all, even if people realized the runtimes could be used outside the browser.

      Converting your SPIR-V to WGSL just to convert it back to SPIR-V to feed to a Vulkan driver, or running an entire language frontend just to emit DXIL or Metal IR, is absurd. We learned 15 years ago that textual shader languages at the GPU API interface are a mistake, but we're forced to relearn the same mistakes because Apple wouldn't play ball. What a joke.

  • WhereIsTheTruth 10 months ago

    WGSL was a mistake, and hopefully they get rid of it. It negatively impacts WebGPU's adoption; at least it did for me. The syntax is one of the worst ever created, just horrible.

  • kvark 10 months ago

    WGSL could be good for Khronos. It’s a modern language with an actual specification. It’s gaining users every day.

hgs3 10 months ago

> Khronos already mentioned in a couple of conferences that there will be no further work improving GLSL

Unfortunately, HLSL isn’t an open standard like GLSL. Is it Khronos's intention to focus solely on SPIR-V moving forward, leaving the choice of higher-level shader languages up to application developers?

  • ferbivore 10 months ago

    There's likely to be very little funding for GLSL moving forward, and I would expect no major spec updates ever again, but vendors will probably keep publishing extensions for new GPU features and fixing things up. GLSL still has a fairly large user base. Whether SPIR-V is going to be the only Khronos shading language (or whatever you want to call it) moving forward, that's hard to say. Nvidia is pushing for Slang as a Khronos standard at the moment. Not sure if anyone's biting.

  • pjmlp 10 months ago

    Yes, they officially stated at Vulkanised and SIGGRAPH, among other places, that there is no budget for GLSL improvements, and also that they aren't programming-language experts anyway.

    It is up to the community to come up with an alternative, and the game development community is mostly on HLSL.

gigatexal 10 months ago

Will this help games be more compatible with the Proton layer on Linux, or is this not related?

camel-cdr 10 months ago

I haven't used either in a while, what is missing from GLSL?

  • pjmlp 10 months ago

    It's C-based, has no support for modular programming (everything needs to be a giant include), and no one is adding features to it, as Khronos hasn't assigned any budget to it.

    HLSL has evolved to be C++-like, including lightweight templates, mesh shaders, and work graphs; it has module support via libraries and is continuously improved with each DirectX release.
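
    A sketch of what those lightweight templates look like (requires HLSL 2021 via dxc; the helper name is made up for illustration):

    ```hlsl
    // A "lightweight template" in HLSL 2021; GLSL has no equivalent
    template<typename T>
    T blend(T a, T b, float t)
    {
        return lerp(a, b, saturate(t));
    }

    float4 main(float4 c0 : COLOR0, float4 c1 : COLOR1) : SV_Target
    {
        return blend(c0, c1, 0.5); // T deduced as float4
    }
    ```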

    • flohofwoe 10 months ago

      I'm not a fan of GLSL either, but adding C++-like baggage to shading languages, as HLSL and especially MSL do (MSL is C++), is a massive mistake IMHO. I'd prefer WGSL over that sort of pointless language complexity any day.

      • pjmlp 10 months ago

        Long term, shading languages will be a transition phase, and most GPUs will turn into general-purpose compute devices where we can write code like in the old days of software rendering, except it will be hardware-accelerated anyway.

        We already see this with rendering engines that are using CUDA instead, or as shown at Vulkanised sessions.

        I do agree that, given the extent to which C++ has grown, and its security issues, something else would be preferable; maybe NVidia will have some luck with their Slang adoption proposal.

      • NotGMan 10 months ago

        At some point you have to stop working in assembly and graduate to a high-level language and beyond.

        Modern GPU stuff is getting too complex to be practical without higher-level language features.

        • flohofwoe 10 months ago

          From the pov of assembly, C and any other high-level language are basically the same. That doesn't mean that climbing even higher up the abstraction ladder is a good thing though (especially for performance).