Hacker News | Jtsummers's comments

What's the highly editorialized title based on?

In case it changes, the submitted title is currently:

  ArsTechnica seemingly using AI to write an article about AI impersonation
The actual title is:

  After a routine code rejection, an AI agent published a hit piece on someone by name
And the article is not about AI impersonation of anyone, which makes AdmiralAsshat's title all the stranger.

My submission was actually to a comment, in which the article's subject (i.e., the source article's author) showed up and declared that half the quotes attributed to him in the article did not exist in the referenced piece he wrote, suggesting that Ars Technica's editors had used an AI to write the article, which hallucinated details.

The article has now been pulled by Ars.


That makes a lot more sense, but just as a note for the future: HN replaces submission URLs with their canonical URL for most pages (this is disabled for some sites whose canonical URL is reliably wrong). So when you submitted a link to the comment section, HN stripped that out and replaced it with a link to the article itself. It's done this for years.

I get a (childish) 404.


He links to those in the first two sentences. How is that not "on the top"?

> What I'm missing is certainly what the hell the algorithm even is and what is its complexity.

https://arxiv.org/pdf/2504.17033 - Linked from the second sentence of the submission, not hard to track down. And the complexity (by which I presume you mean algorithmic complexity) is stated in the submission and in the PDF that the submission links to, which I just shared with you.


I did eventually find that, yes, after sifting through the rest of the useless links: the Quanta Magazine article that says jack shit, the link to the ACM symposium call for submissions (lmao). Like come on, why label that "underlying research"?

And all of that was wasted time since it seems that this just isn't at all applicable to A* heuristics the way Dijkstra's is. It's only an improvement in a very specific case.


It's like the common claim that data-oriented programming came out of game development. It's ahistorical, but a common belief. People can't see past their heroes (Casey Muratori, Jonathan Blow) or the past decade or two of work.

I partly agree, but I think you're overcorrecting. Game developers didn't invent data-oriented design or performance-first thinking. But there's a reason the loudest voices advocating for them in the 2020s come from games: we work in one of the few domains where you literally cannot ship if you ignore cache lines and data layout. Our users notice a 5ms frame hitch, while web developers can add another React wrapper and still ship.

Computing left game development behind. Whilst the rest of the industry built shared abstractions, we worked in isolation with closed tooling. We stayed close to the metal because there was nothing else.

When Casey and Jon advocate for these principles, they're reintroducing ideas the broader industry genuinely forgot, because for two decades those ideas weren't economically necessary elsewhere. We didn't preserve sacred knowledge. We just never had the luxury of forgetting performance mattered, whilst the rest of computing spent 20 years learning it didn't.
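
To make the cache-line point concrete, the textbook illustration is array-of-structs versus struct-of-arrays. A toy sketch in C with made-up field names (nobody's actual engine code), just to show what "data layout" buys you:

  /* Array-of-structs: each entity's fields sit together, so a pass
     that only needs positions still drags velocities, health, and
     flags through the cache with it. */
  typedef struct {
      float pos_x, pos_y, pos_z;
      float vel_x, vel_y, vel_z;
      int   health;
      int   flags;
  } EntityAoS;

  /* Struct-of-arrays: each field is contiguous, so a position-only
     pass touches only the cache lines it actually needs. */
  typedef struct {
      float *pos_x, *pos_y, *pos_z;
      float *vel_x, *vel_y, *vel_z;
      int   *health;
      int   *flags;
  } EntitiesSoA;

  /* Same logical update; the SoA version streams through two packed
     arrays instead of striding across fat structs. */
  void integrate_aos(EntityAoS *e, int n, float dt)
  {
      for (int i = 0; i < n; i++)
          e[i].pos_x += e[i].vel_x * dt;
  }

  void integrate_soa(EntitiesSoA *e, int n, float dt)
  {
      for (int i = 0; i < n; i++)
          e->pos_x[i] += e->vel_x[i] * dt;
  }

Same work either way; the difference is purely how much of each cache line the loop actually uses.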


> I think you're overcorrecting.

I don't understand this part of your comment, it seems like you're replying to some other comment or something not in my comment. How am I overcorrecting? A statement of fact, that game developers didn't invent these things even though that's a common belief, is not an overcorrection. It's just a correction.


Ah, I read your comment as "game devs get too much credit for this stuff and people are glorifying Casey and Jon" and ran with that, but you were just correcting the historical record.

My bad. I think we're aligned on the history; I was making a point about why they're prominent advocates today (and why people are attributing invention to them) even though they didn't invent the concepts.


I don't really like this line of discourse because few domains are as ignorant of computing advances as game development. Which makes sense, they have real deadlines and different goals. But I often roll my eyes at some of the conference talks and twitter flame wars that come from game devs, because the rest of computing has more money resting on performance than most game companies will ever make in sales. Not to mention, we have to design things that don't crash.

It seems like much of the shade is tossed at web front end like it's the only other domain of computing besides game dev.


I mean... fair point? I'm not claiming games are uniquely performance-critical.

You're right that HFT, large-scale backend, and real-time systems care deeply about performance, often with far more money at stake.

But those domains are rare. The vast majority of software development today can genuinely throw hardware or money at problems (even HFT and large backend systems). Backends are usually designed to scale horizontally, data science rents bigger GPUs, embedded gets more powerful SoCs every year. Most developers never have to think about cache lines because their users have fast machines and tolerant expectations.

Games are one of the few consumer-facing domains that can't do this. We can't mandate hardware (and attempts at doing so cost sales and attract community disgust), we can't hide latency behind async, and our users immediately notice a 5ms hitch. That creates different pressures: we're optimising for the worst case on hardware we don't control, whilst most of the industry optimises for the common case on hardware they choose.

You're absolutely right that we're often ignorant of advances elsewhere. But the economic constraint is real, and it's increasingly unusual.


I think we as software developers are resting on the shoulders of giants. It's amazing how fast and economical stuff like Redis, nginx, memcached, and other 'old' software is: written decades ago, mostly in C, by people who really understood what made it run fast (in a slightly different way to games: less about caches and data layout, more about how the OS handles low-level primitives).

A browser like Chrome also rests on a rendering engine like Skia, which has been optimized to the gills, so performance can at least theoretically be good.

Then one tries to host static files on an Express web server and is surprised to find that a powerful computer can only serve files at 40MB/s with the CPU at 100%.
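
Part of the gap is that those 'old' servers lean on the kernel's zero-copy path instead of shuttling every byte through a userspace buffer. A rough Linux-only sketch of the idea (not any particular server's code; client_fd is assumed to be an already-connected socket, and HTTP headers are skipped entirely):

  /* Serve one file over an already-connected socket using sendfile(2),
     so the data moves kernel-to-kernel and never crosses into userspace.
     Error handling is minimal; this is an illustration, not a server. */
  #include <fcntl.h>
  #include <stdio.h>
  #include <sys/sendfile.h>
  #include <sys/stat.h>
  #include <unistd.h>

  static int send_static_file(int client_fd, const char *path)
  {
      int fd = open(path, O_RDONLY);
      if (fd < 0) { perror("open"); return -1; }

      struct stat st;
      if (fstat(fd, &st) < 0) { perror("fstat"); close(fd); return -1; }

      off_t offset = 0;
      while (offset < st.st_size) {
          ssize_t sent = sendfile(client_fd, fd, &offset,
                                  (size_t)(st.st_size - offset));
          if (sent <= 0) { perror("sendfile"); close(fd); return -1; }
      }

      close(fd);
      return 0;
  }

A read()/write() loop through userspace buffers does the same job with extra copies and syscall overhead per chunk, which is at least part of why the Express case above burns so much CPU.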

I would like to think that a 'Faustian deal' in terms of performance exists: you give up 10, 50, or 90% of your performance in exchange for convenience.

But unfortunately experience shows there's no such thing: arbitrarily powerful hardware can be arbitrarily slow.

And as you contrast gamedev with other domains that get to hide latency: I don't think it's OK that a simple 3-column gallery page takes more than a second to load; people merely tolerate this, they don't enjoy it.

And ironically I find that a lot of folks end up spending more effort optimizing their React layouts than it would have cost to render naively with a more efficient toolkit.

I am also not sure what advances game dev is missing out on. I guess devs are somewhat more reluctant to write awful code in the name of performance nowadays, but I'd love to hear what gamedev could learn from the broader software world.

The TL;DR of what I wanted to say is that I wish there were a linear performance-convenience scale, where we could pick a point, use techniques conforming to it, and trade, say, two thirds of the max speed for developer experience, knowing our performance targets allow for that.

But unfortunately that's not how it works: if you choose convenience over performance, your code is going to be slow enough that users will complain, no matter what hardware you have.


It clearly didn't come out of game dev. Many people doing high performance work on either embedded or "big silicon" (amd64) in that era were fully aware of the importance of locality, branch prediction, etc.

But game dev, in particular Mike Acton, did an amazing job of making it more broadly known. His CppCon talk from 2014 [0] is IMO one of the most digestible ways to start thinking about performance in high throughput systems.

In terms of heroes, I’d place Mike Acton, Fabian Giesen [1], and Bruce Dawson [2] at the top of the list. All solid performance-oriented people who’ve taken real time to explain how they think and how you can think that way as well.

I miss being able to listen in on gamedev Twitter circa 2013 before all hell broke loose.

[0] https://youtu.be/rX0ItVEVjHc?si=v8QJfAl9dPjeL6BI

[1] https://fgiesen.wordpress.com/

[2] https://randomascii.wordpress.com/


Unless you are actually the author of the page (and that person has another account here on HN, so if so, why use a new account and not that one?), it's strange to copy/paste the contents of a page and present the material as if you wrote it yourself.

I am the author :) I got shadowbanned, I don't know why.

You used the other account almost entirely for self-promotion. That goes against the guidelines and often ends up in things like bans.

I’ve been a passive reader for years, but you’re absolutely right, the posts I was sharing were indeed self-promotion. Thanks for pointing that out. I will be more active in the community this time.

Why the shouting? Someone already posted the archive link, no need to write IN ALL CAPS like a racist uncle.

https://news.ycombinator.com/item?id=46993858

And maybe check out the guidelines and FAQ (which covers paywall submissions and how to emphasize text):

https://news.ycombinator.com/newsguidelines.html

https://news.ycombinator.com/newsfaq.html

> Please don't use uppercase for emphasis. If you want to emphasize a word or phrase, put *asterisks* around it and it will get italicized.



We held elections during the US Civil War. We'd have to have an event that makes the Civil War look like a walk in the park to even have a chance of justifying skipping elections in 2026 or 2028.

> not a UBI

Who said it is a UBI, such that this "rebuttal" even makes sense here? The Irish government isn't calling it a UBI. The article doesn't call it a UBI. Even the FAQ for the program says it is not a UBI:

>> Why this is not a Universal Basic Income

>> It is important to note that that the Basic Income for the Arts Pilot is not a Universal Basic Income. This is a sectoral intervention to support practicing artists and creative arts workers to focus on their creative practice. This policy is separate to the Universal Basic income as outlined in the Programme for Government.

https://www.gov.ie/en/department-of-culture-communications-a... - C-f for "universal"


Basic Income and UBI are colloquial synonyms; people use them interchangeably, and the Irish government are almost certainly using the term to endear themselves to supporters of UBI and to get more coverage for their policy than the media would give them if they just called it a grant.

This happens all the time. For example, in the UK there was a push for a "living wage" in the 2010s, which the government responded to by rebranding the minimum wage the "National Living Wage" and bumping it a little for over-25s.

This seems to be the same thing.


The first word of UBI is universal. The entire concept relies on that characteristic.

> Do they have to be unemployed during the grant period?

No, they're allowed to have other work or earn money from their art. The intent is to subsidize their income, not be their exclusive income for those three years.

