Adobe | Rendering Software Engineer in Test | On-site or remote, US or Western Europe
We’re a close-knit, remote-first team of engineers building next-generation, interactive rendering technology that powers a growing portfolio of high-profile creative tools, including the Substance suite of products.
We’re looking for a software engineer with a passion for computer graphics —real-time or offline— to lead and evolve our quality engineering efforts. This role combines strong software engineering skills with a deep focus on rendering quality and infrastructure. As part of the core rendering team, you’ll contribute C++ code, build tools and tests, and play a key role in ensuring the renderer stays fast, stable, and visually correct across diverse hardware.
Feel free to reach out if you have any questions; my email is in my profile.
I think the author pretty clearly means "standard" in the sense of "providing functionality at the same level as the C++ standard library".
For example, if I were to say "I'm going to write my own Dune", you'd probably understand that to mean my own broadly-interpreted sci-fi epic, and not literally a clone of Dune with Paul Atreides, the Bene Gesserit, etc.
I don't think they specifically understood the idea of a "game engine" as the core product at the time. But if you Google a bit, there are plenty of references saying that Quake was designed for modders because of the popularity of DOOM mods - so developer experience was absolutely taken into account from the start.
They had already done licensing deals for the DOOM engine at that point, including the greatest game of all time, "Chex Quest."
Yeah, this is why Quake's logic for a lot of game things - monsters, weapons, moving platforms - is written in a byte-code interpreted language (QuakeC). The idea was to separate it from the engine code so modders could easily make new games without needing access to the full engine source.
(And QuakeC was supposed to be a simpler language than C, which it... is, but at the cost of weird compromises like having a single number type (float), which is also used to store bitfields by directly manipulating power-of-two values. Which works, of course, until your power of 2 is big enough to force the precision to drop below 1...)
Ended up investigating the issue with AMD for several months, and was generously compensated by AMD for all the trouble (sending motherboards and CPUs back and forth, a real PITA), but the outcome is that I've been running ever since on a custom BIOS image provided by AMD. In the end, I think the fault was on Gigabyte's side.
Supermicro gave us the same kind of assistance when the then-new bifurcation feature did not work correctly. Without it, an enterprise telecommunications peripheral that costs 10x more than a 4-socket Xeon motherboard couldn't run at nominal speed, and it was running on real lines, not test data.
They sent us custom BIOSes until things stabilized, and said they'd fold the patch into subsequent BIOS releases.
The thing is, neither Intel nor AMD nor Supermicro can test edge cases at maximum load in niche environments without spending money, but they would really love to be able to claim, with evidence to back it up, that their hardware can be integrated into such solutions. If Intel wants to test stuff in space for free, it has to cooperate with NASA; the alternative is an in-house launch.
NASA has super-elaborate testbeds and simulators.
Maybe manufacturers could provide formats/interfaces/simulators to users; users would write test cases against them and hand those back to the manufacturers to run in-house.
When you’re not only helping them debug their own hardware but are also spending money on their ridiculously overpriced HEDT platform, it probably makes them want to keep you happy.