No Graphics API

(sebastianaaltonen.com)

184 points | by ryandrake 2 hours ago

14 comments

  • vblanco 1 hour ago
    This is a fantastic article that demonstrates how many parts of Vulkan and DX12 are no longer needed.

    I hope the IHVs have a look at it, because current DX12 seems semi-abandoned: it still doesn't support buffer pointers even though every GPU made in the last 10 (or more!) years can do pointers just fine. Meanwhile Vulkan won't do a 2.0 release that cleans things up, so it carries a lot of baggage and, especially, tons of drivers that don't implement the extensions that really improve things.
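
    For reference, Vulkan already exposes this via buffer device addresses; the host side is roughly the following sketch, with hypothetical handles buf, device, cmd and pipelineLayout, assuming VK_KHR_buffer_device_address / Vulkan 1.2:

      // Sketch: hand a raw GPU pointer to a shader via push constants.
      // Assumes buf was created with VK_BUFFER_USAGE_SHADER_DEVICE_ADDRESS_BIT.
      VkBufferDeviceAddressInfo addrInfo = {};
      addrInfo.sType  = VK_STRUCTURE_TYPE_BUFFER_DEVICE_ADDRESS_INFO;
      addrInfo.buffer = buf;
      VkDeviceAddress addr = vkGetBufferDeviceAddress(device, &addrInfo);

      // The shader can then dereference the 64-bit address directly
      // (e.g. via GL_EXT_buffer_reference in GLSL).
      vkCmdPushConstants(cmd, pipelineLayout, VK_SHADER_STAGE_ALL,
                         0, sizeof(addr), &addr);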

    If this API existed, you could emulate OpenGL on top of it faster than the current OpenGL-to-Vulkan layers, and something like SDL3 GPU would get a 3x/4x boost too.

    • pjmlp 1 hour ago
      DirectX documentation is in a bad state currently: you have Frank Luna's books, which don't cover the latest improvements, and beyond that it's hunting through Learn, GitHub samples and the reference docs.

      Vulkan is another mess. Even if there were a 2.0, how are devs supposed to actually use it, especially on Android, the biggest consumer Vulkan platform?

    • tadfisher 1 hour ago
      Isn't this all because PCI resizable BAR is not required to run any GPU besides Intel Arc? As in, maybe it's mostly down to Microsoft/Intel mandating reBAR in UEFI so we can start using stuff like bindless textures without thousands of support tickets and negative reviews.

      I think this puts a floor on supported hardware though, like Nvidia 30xx and Radeon 5xxx. And of course motherboard support was a crapshoot until 2020 or so.

      • vblanco 1 hour ago
        This is not really about resizable BAR directly; you could do mostly the same API without it. Resizable BAR simplifies it a little because you skip manual transfer operations, but it's not strictly required, as you can write things to a CPU-writeable buffer and then begin your frame with a transfer command.
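
        In Vulkan terms, that non-ReBAR path is just a copy recorded at the start of the frame, along the lines of this sketch (hypothetical handles; stagingBuf is host-visible and already filled by the CPU, gpuBuf is device-local):

          // Sketch: replay the CPU-side writes into GPU-local memory.
          VkBufferCopy region = {};
          region.srcOffset = 0;
          region.dstOffset = 0;
          region.size      = uploadSize;
          vkCmdCopyBuffer(cmd, stagingBuf, gpuBuf, 1, &region);
          // (a barrier before the first use of gpuBuf would follow here)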

        Bindless textures never needed any kind of resizable BAR; they have been usable since the early 2010s in OpenGL through an extension. Buffer pointers have never needed it either.
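
        The OpenGL extension in question is presumably ARB_bindless_texture; the host side is roughly this sketch (tex is an existing texture object):

          // Sketch: turn a texture into a 64-bit bindless handle.
          GLuint64 handle = glGetTextureHandleARB(tex);
          glMakeTextureHandleResidentARB(handle);
          // The handle can now be written into a UBO/SSBO and used as a
          // sampler in the shader without occupying any binding slot.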

  • opminion 1 hour ago
    The article is missing this motivation paragraph, taken from the blog index:

    > Graphics APIs and shader languages have significantly increased in complexity over the past decade. It’s time to start discussing how to strip down the abstractions to simplify development, improve performance, and prepare for future GPU workloads.

    • alberth 1 hour ago
      Would this be analogous to NVMe?

      Meaning ... SSDs initially reused IDE/SATA interfaces, which had inherent bottlenecks because those standards were designed for spinning disks.

      To fully realize SSD performance, a new transport had to be built from the ground up, one that eliminated those legacy assumptions, constraints and complexities.

      • rnewme 46 minutes ago
        ...and introduced new ones.
  • pjmlp 1 hour ago
    I have followed Sebastian Aaltonen's work for quite a while now, so maybe I am a bit biased, but this is a great article.

    I also think that the way forward is to go back to software rendering, though this time around, as he points out, those algorithms and data structures are actually hardware accelerated.

    Note that this is already an ongoing trend in the VFX industry; about 5 years ago OTOY ported their OctaneRender to CUDA as its main rendering API.

    • mrec 43 minutes ago
      Isn't this already happening to some degree? E.g. UE's Nanite uses a software rasterizer for small triangles, albeit running on the GPU via a compute shader.
      • jsheard 13 minutes ago
        Things are kind of heading in two opposite directions at the moment. Rasterization was traditionally done in fixed-function hardware, but has steadily incorporated more and more software elements, culminating in Nanite, which is 99% software.

        Meanwhile GPU raytracing was a purely software affair until quite recently when fixed-function raytracing hardware arrived. It's fast but unfortunately very opaque, only exposed via high level driver interfaces, so you have to just let Jensen take the wheel.

  • aarroyoc 1 hour ago
    Impressive post, so many details. I could only understand some parts of it, but I think this article will probably be a reference for future graphics APIs.

    I think it's fair to say that for most gamers Vulkan/DX12 hasn't really been a net positive: the PSO problem affected many popular games, and while Vulkan has been trying to improve, WebGPU is tricky as it has its roots in the first versions of Vulkan.

    Perhaps it was a bad idea to go all in on a low-level API that exposes so many details when the hardware underneath is evolving so fast. Maybe CUDA, as the post suggests in places, with its more generic computing support, is the right way after all.

  • greggman65 6 minutes ago
    This seems tangentially related?

    https://github.com/google/toucan

  • reactordev 1 hour ago
    I miss Mantle. It had its quirks, but you felt as if you were literally programming the hardware using a pretty straightforward API. The most fun I’ve had programming was for the Xbox 360.
  • blakepelton 48 minutes ago
    Great post; it brings back a lot of memories. Two additional factors that designers of these APIs consider are:

    * GPU virtualization (e.g., the D3D residency APIs, sketched after this list), to allow many applications to share GPU resources (e.g., HBM).

    * Undefined behavior: how easy is it for applications to accidentally or intentionally take a dependency on undefined behavior? This can make it harder to translate this new API to an even newer API in the future.
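
    For the residency point, the D3D12 API boils down to calls like these (a rough sketch; heap is an existing ID3D12Heap and device an ID3D12Device):

      // Sketch: explicitly page a heap in and out of GPU memory.
      ID3D12Pageable* pageables[] = { heap };
      device->MakeResident(1, pageables);   // before the GPU needs it
      // ... execute work that uses the heap ...
      device->Evict(1, pageables);          // under memory pressure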

  • Bengalilol 15 minutes ago
    After reading this article, I feel like I've witnessed a historic moment.
  • ksec 1 hour ago
    I wonder why M$ stopped putting out new DirectX versions? DirectX 12 Ultimate, 12.1 and 12.2 are largely the same as DirectX 12.

    Or has the use of middleware like Unreal Engine largely made them irrelevant? Or should Epic put out a new graphics API proposal?

    • pjmlp 1 hour ago
      That has always been the case; it is mostly FOSS circles that argue about APIs.

      Game developers create an RHI (rendering hardware interface) like the one discussed in the article, and get on with game development.

      Besides, the greatest innovations thus far have been ray tracing and mesh shaders, and they are still largely ignored, so why keep pushing forward?

    • reactordev 1 hour ago
      Both-ish.

      Yes, the centralization of engines around Unreal, Unity, etc. means there's less interest in pushing the boundaries; they are still pushed, just on the GPU side.

      From a CPU API perspective, it's very close to just plain old buffer mapping and go. We would need a hardware shift that adds something more to the pipeline than what we currently do, like when tessellation shaders came about from geometry shader practices.

  • MaximilianEmel 1 hour ago
    I wonder if Valve might put out their own graphics API for SteamOS.
    • m-schuetz 1 hour ago
      Valve seems to be substantially responsible for the mess that is Vulkan. They were one of its pioneers from what I heard in chats with Vulkan people.
      • jsheard 1 hour ago
        There's plenty of blame to go around, but if any one faction is responsible for the Vulkan mess it's the mobile GPU vendors and Khronos' willingness to compromise for their sake at every turn. A huge amount of API surface was dedicated to addressing problems that only existed on mobile, and earlier versions of Vulkan insisted on doing things the mobile way even if you knew your software was only ever going to run on desktop.

        Thankfully later versions have added escape hatches which cut out much of the mobile-specific bureaucracy, but it was grim for a while, and all that early API cruft is still there to confuse newcomers.

      • pjmlp 1 hour ago
        Samsung and Google also have their share; see who gives most of the Vulkanised talks.
  • thescriptkiddie 1 hour ago
    The article talks a lot about PSOs but never defines the term.
    • flohofwoe 1 hour ago
      "Pipeline State Objects" (immutable state objects which define most of the rendering state needed for a draw/dispatch call). Tbf, it's a very common term in rendering since around 2015 when the modern 3D APIs showed up.
    • CrossVR 1 hour ago
      PSOs are Pipeline State Objects; they encapsulate the entire state of the rendering pipeline.
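
      For anyone who hasn't touched the modern APIs: creating one in D3D12 looks roughly like this trimmed sketch (assuming compiled shader bytecode, a root signature and the various state descs already exist):

        // Sketch: a graphics PSO bakes the shaders plus most fixed-function
        // state into one immutable object, created up front.
        D3D12_GRAPHICS_PIPELINE_STATE_DESC desc = {};
        desc.pRootSignature        = rootSignature;
        desc.VS                    = { vsBytecode, vsSize };
        desc.PS                    = { psBytecode, psSize };
        desc.BlendState            = blendDesc;       // per-render-target blending
        desc.SampleMask            = UINT_MAX;
        desc.RasterizerState       = rasterDesc;      // cull mode, fill mode, ...
        desc.DepthStencilState     = depthDesc;       // depth test/write
        desc.InputLayout           = { inputElements, inputElementCount };
        desc.PrimitiveTopologyType = D3D12_PRIMITIVE_TOPOLOGY_TYPE_TRIANGLE;
        desc.NumRenderTargets      = 1;
        desc.RTVFormats[0]         = DXGI_FORMAT_R8G8B8A8_UNORM;
        desc.SampleDesc.Count      = 1;
        device->CreateGraphicsPipelineState(&desc, IID_PPV_ARGS(&pso));

      Change any of that state (blend mode, vertex layout, render target format...) and you need a different PSO, which is a big part of the shader-compilation pain mentioned elsewhere in the thread.
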
  • henning 1 hour ago
    At first glance this looks very similar to the SDL3 GPU API and other RHI libraries that have been created.
  • yieldcrv 1 hour ago
    What level of performance improvement would this represent?
    • modeless 8 minutes ago
      It would likely reduce or eliminate the "compiling shaders" step many games now have on first run after an update, and the stutters many games have as new objects or effects come on screen for the first time.

      It would be especially nice for game developers, since they face long shader compile times more often, and it would dramatically reduce the complexity of the low-level rendering code while improving flexibility.

    • vblanco 1 hour ago
      There is no implementation of it, but this is how I see it, at least compared with how things work in fully extensioned Vulkan, which uses a few similar mechanics.

      Per-drawcall cost goes to the nanosecond scale (assuming you do drawcalls at all, of course). It also makes bindless and indirect rendering a bit easier, so you could drop CPU cost to near zero in a renderer.
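
      For context, the fully extensioned Vulkan version of that is already a single CPU call per batch, with the GPU supplying the draw parameters; a sketch with hypothetical handles (VK_KHR_draw_indirect_count / Vulkan 1.2):

        // Sketch: one CPU call issues up to maxDraws GPU-generated draws.
        // drawBuf holds an array of VkDrawIndexedIndirectCommand structs,
        // countBuf holds a GPU-written uint32_t draw count.
        vkCmdDrawIndexedIndirectCount(cmd,
            drawBuf,  0,              // indirect commands buffer + offset
            countBuf, 0,              // count buffer + offset
            maxDraws,
            sizeof(VkDrawIndexedIndirectCommand));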

      It would also strongly mitigate shader compilation hitches, thanks to having a split pipeline instead of a monolithic one.

      The simplification of barriers could improve performance by a significant amount: currently, most engines that deal with Vulkan and DX12 need to keep track of individual texture layouts and transitions, and this removes that entirely.

    • flohofwoe 1 hour ago
      It's mostly not about performance, but about getting rid of legacy cruft that still exists in modern 3D APIs to support older GPU architectures.
    • m-schuetz 1 hour ago
      Probably mostly about quality of life. Legacy graphics APIs like Vulkan have abysmal developer UX for no good reason.
  • ginko 17 minutes ago
    I mean sure, this should be nice and easy.

    But then game/engine devs want to pair a vertex shader that produces a UV coordinate and a normal with a pixel shader that only reads the UV coordinate (or neither, for shadow mapping), and they don't want to pay for the bandwidth of the unused vertex outputs (or the cost of calculating them).

    Or they want to be able to randomly enable any other pipeline stage, like tessellation or geometry, and have the same shader just work without any performance overhead.