
Windows ML is an abstraction for running local LLM models across CPU, GPU, or NPU, making the code independent of the underlying hardware.

It is the evolution of DirectX for ML, previously known as DirectML.



It’s kind of a bummer because this is the exact same playbook as DirectX, which ended up being a giant headache for the games industry, and now everyone is falling for it again.


I would be curious to see whether it's a common opinion that DirectX was a bad thing for the games industry. It was preceded by a patchwork of messy graphics/audio/input APIs, many of them proprietary, and when it started to gain prominence, Linux gaming was mostly a mirage.

A lot of people still choose to build games on Direct3D 11 or even 9 for convenience, and now thanks to Proton games built that way run fine on Linux and Steam Deck. Plus technologies like shadercross and mojoshader mean that those HLSL shaders are fairly portable, though that comes at the cost of a pile of weird hacks.

One good thing is that one of the console vendors now supports Vulkan, so building your game around Vulkan gives you a head start on console and means your game will run on Windows, Linux and Mac (though the last one requires some effort via something like MoltenVK) - but this is a relatively new thing. It's great to see either way, since in the past the consoles all used bespoke graphics APIs (except XBox, which used customized DirectX).

An OpenGL-based renderer would have historically been even more of an albatross when porting to consoles than DX, since (aside from some short-lived, semi-broken support on PS3) native high-performance OpenGL has never been a feature on anything other than Linux and Mac. In comparison DirectX has been native on XBox since the beginning, and that was a boon in the XBox 360 era when it was the dominant console.

IMO historically picking a graphics API has always been about tradeoffs, and realities favored DirectX until at least the end of the XBox 360 era, if not longer than that.


While Switch supports Vulkan, if you really want to take advantage of Switch hardware, NVN is the way to go, or make use of the Nintendo Vulkan extensions that are only available on the Switch.

So it isn't as portable as people think.


Anyone who thinks DirectX was bad for the games industry needs to go back and review the history of graphics APIs.


Usually it is an opinion held by folks without a background in the industry.

Back in my "want to do games" phase, and also during my Demoscene days, hanging out on Gamedev.net, Flipcode, and the IGDA forums, or attending GDCE, this was never something fellow coders complained about.

Rather, the talk was about how to do cool stuff with specific hardware, or about gameplay ideas; mastering the various systems was itself seen as a skill.


DirectX carried the games industry forward because there weren't alternatives. OpenGL was lagging, and Vulkan didn't exist yet. I hope everyone moves to Vulkan, but DX was ultimately a net positive.


There were others; there is this urban myth that game consoles used OpenGL and only Windows was the outlier.

Even Mac OS only adopted OpenGL after the OS X reboot; before that it was QuickDraw 3D, and the Amiga used Warp3D during its last days.


In the last 25 years I have never gotten this vibe from devs; DirectX likely enabled a ton of games that would never have seen the light of day without it.


It is FOSS folks complaining about proprietary APIs, because there is a dissonance between the communities.

Game developers care about IP, how to take it beyond games, getting a publisher deal, and gameplay; the proprietary APIs are just a set of plugins in a middleware engine, in-house or external, and that's that.

There is also a whole set of companies whose main business is porting games, which is how several studios got their foot in the door before coming up with their own ideas, as a way to gain experience and recognition in the industry; they are thankful each platform is something different.

Finally, anyone claiming Khronos APIs are portable has never had the pleasure of using extensions or dealing with driver and shader-compiler bugs.


It is only a headache for FOSS folks; the games industry embraces proprietary APIs. It isn't the elephant-sized problem FOSS culture makes it out to be, as anyone who has ever attended game development conferences or Demoscene parties can tell you.


Yeah, DirectX ended up being a giant headache, but there were times in its history when it was the easiest API to use and very high performance. DirectX came about because the alternatives at the time were, frankly, awful.


OpenGL (the main competition to DirectX) really wasn't that bad in the fixed-function days. Everything fell apart when nVidia / AMD came up with their own standards for GPU programming.

DirectX was nice in that the documentation and example/sample code were excellent.


The fixed-function version of OpenGL was not thread safe and relied on global state, which made for some super fun bugs when different libraries set different flags and then assumed they knew what state the OpenGL runtime was in the next time they tried to render something.
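The failure mode being described can be shown with a small stdlib-only Python sketch; `GLState` and the two "libraries" here are hypothetical stand-ins for the implicit OpenGL context and the code sharing it, not real OpenGL calls:

```python
# Minimal sketch of the shared-global-state hazard described above.
# "GLState" stands in for the implicit OpenGL context; both "library"
# functions are illustrative only.

class GLState:
    """One mutable, global state record shared by everyone."""
    blend_enabled = False

def library_a_draw_overlay():
    # Library A enables blending for its translucent overlay...
    GLState.blend_enabled = True
    # ...and forgets to restore the previous value afterwards.

def library_b_draw_model():
    # Library B was written assuming the default state (no blending),
    # so it silently renders wrong once A has run.
    return "translucent" if GLState.blend_enabled else "opaque"

print(library_b_draw_model())   # "opaque" -- correct on first use
library_a_draw_overlay()
print(library_b_draw_model())   # "translucent" -- B's assumption broken
```

Modern APIs sidestep this by making state explicit (command buffers and pipeline objects in Vulkan/D3D12), so no two libraries share one hidden state machine.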


What's stopping you from using ONNX models on other platforms? A hardware agnostic abstraction to make it easier for consumers to actually use their inference capable hardware seems like a good idea, and exactly the kind of stuff I think an operating system should provide.

> Call the Windows ML APIs to initialize EPs [Execution Providers], and then load any ONNX model and start inferencing in just a few lines of code.
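The EP initialization the quote mentions boils down to a priority list with a CPU fallback. A stdlib-only sketch of that selection logic follows; the function and provider names are illustrative, not the actual Windows ML API (in ONNX Runtime the equivalent idea is the `providers=` list passed to `InferenceSession`):

```python
# Hypothetical sketch of execution-provider (EP) selection with
# fallback; names are illustrative, not the Windows ML API itself.

PREFERRED_ORDER = [
    "NPUExecutionProvider",
    "GPUExecutionProvider",
    "CPUExecutionProvider",   # CPU is the always-available fallback
]

def pick_provider(available, preferred=PREFERRED_ORDER):
    """Return the first preferred EP that this machine actually has."""
    for ep in preferred:
        if ep in available:
            return ep
    raise RuntimeError("no usable execution provider")

# A GPU-only desktop picks the GPU EP; an NPU laptop would pick the NPU EP.
print(pick_provider({"GPUExecutionProvider", "CPUExecutionProvider"}))
# -> GPUExecutionProvider
```

The point of the abstraction is exactly that the model-loading and inference code above the selection step never changes, whichever EP wins.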


I exclusively use ONNX models across platforms for CPU inference; it's usually the fastest option on CPU. Hacking on ONNX graphs is super easy, too... I make my own uint8-output ONNX embedding models.
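The uint8-output trick is linear quantization of the embedding vector. A hedged, stdlib-only sketch of one common scale/zero-point scheme follows (this is not necessarily what the commenter's graphs do, and the actual graph surgery with the `onnx` package is not shown):

```python
# Sketch of quantizing a float embedding to uint8 -- the kind of output
# a modified ONNX graph could emit directly to shrink storage 4x.

def quantize_uint8(vec):
    """Map floats linearly onto 0..255; return values plus (scale, min)."""
    lo, hi = min(vec), max(vec)
    scale = (hi - lo) / 255 or 1.0          # guard against a flat vector
    q = [round((x - lo) / scale) for x in vec]
    return q, scale, lo

def dequantize(q, scale, lo):
    """Approximate reconstruction of the original floats."""
    return [x * scale + lo for x in q]

emb = [-0.12, 0.0, 0.34, 0.9]
q, scale, lo = quantize_uint8(emb)
print(q)                         # ints in 0..255
print(dequantize(q, scale, lo))  # roughly the original floats
```

Each element is recoverable to within half a quantization step, which is usually plenty for cosine-similarity search over embeddings.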


Exactly; any work you do on top of this becomes hostage to Windows.


People mainly think of Direct3D when referring to DirectX.

The other components were very good as well: DirectInput, etc.


I think this is very neat. So many possibilities.



