
You may be mistaking some AI dev for non-AI, because it doesn't have telltale signs.

Many AI generated web sites have a “look” and it’s not just all the emojis.

And no true scotsman puts sugar in his porridge

Yes, that was the reference!

Possibly too obscure. I can't tell whether I'm being downvoted by optimists who missed the joke, or by pessimists who got it.


Most sarcasm worsens discussion. And the comment guidelines say "Don't be snarky."[1]

[1] https://news.ycombinator.com/newsguidelines.html


Haha, I downvoted you from the first category (until I read this comment).

Same. I have a lot of ideas I like to explore that people find boring or tedious. I used to just read, but it's pleasant to have the option to play with those thoughts more.

I love pitching scifi premise/half book ideas at ChatGPT and having it write short stories that end the way I want them to, dammit.

Just like rockets landing themselves

No, rockets landing themselves is just controlling the same mechanism you use to make them take off, and builds on thrust vectoring technology from 1970s jet fighters, based on sound physics.

Figuring out how to radiate a lot of waste heat into a vacuum is fighting physics. Ordinarily we use a void on earth as a very effective _insulator_ to keep our hot drinks hot.


This is a classic case of listing all the problems but none of the benefits. If you had horses and someone told you they had a Tesla, you'd be complaining that a Tesla requires you to dig up minerals, whereas a horse can just be born!

> Figuring out how to radiate a lot of waste heat into a vacuum is fighting physics.

Radiators should work pretty well, and large solar panels can do double duty as radiators.

Also, curiously, newer GPUs are developed to require significantly less cooling than previous generations. Perhaps not so coincidentally?


Well, therein lies the rub: solar panels already need their own thermal radiators when used in space ...

Great, so you seem to agree the technology exists for this and it is a matter of deploying more of it?

It's a matter of deploying it for cheaper or with fewer downsides than what can be done on earth. Launching things to space is expensive even with reusable rockets, and a single server blade would need a lot of accompanying tech to power it, cool it, and connect to other satellites and earth.

Right now the only upsides of an expensive satellite acting as a server node would be physical security and avoiding various local environmental laws and effects.


> Right now only upsides ...

You are missing some pretty important upsides.

Lower latency is a major one. And not having to buy land and water to power/cool it. Both are fairly limited as far as resources go, and get exponentially more expensive with competition.

The major downside is, of course, cost. In my opinion, this has never really stopped humans from building and scaling up things until the economies of scale work out.

> connect to other satellites and earth

If only there was a large number of satellites in low earth orbit and a company with expertise building these ;)


> And not having to buy land and water to power/cool it.

It's interesting that you bring that up as a benefit. If waterless cooling (i.e. a closed cooling system) works in space, wouldn't it work even better on Earth?


I mostly agree with you, but I don't understand the latency argument. Latency to where?

These satellites will be in a sun-synchronous orbit, so only close to any given location on Earth for a fraction of the day.


You need to understand more about basic physics and thermodynamics. Fighting thermodynamics is a losing race by every measure of what we understand of the physical world.

> Fighting thermodynamics is a losing race

The great thing about your argument is that it can be used in any circumstance!

Cooling car batteries, nope can't possibly work! Thermodynamics!

Refrigerator, are you crazy? You're fighting thermodynamics!

Heat pump! Haah thermodynamics got you.


Actually all of those things agree with the same laws that dictate why data centers can't work in space.

Your examples prove our case. You just must not understand how they work


I guess you _really_ don't understand how thermodynamics works. Call me back when you think you can get better efficiency than a Carnot engine.

1kW TDP chips need LESS cooling?

Yes, Rubin reportedly can deal with running significantly hotter.

That makes radiating a much more practical approach to cooling it.
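
Rough intuition, assuming the usual one-sided Stefan-Boltzmann picture (the temperatures below are illustrative guesses, not actual chip or radiator specs):

    P/A = \varepsilon \sigma T^4 \quad\Rightarrow\quad (400\,\mathrm{K} / 330\,\mathrm{K})^4 \approx 2.2

i.e. a radiator you can run ~70 K hotter sheds the same heat with roughly half the area.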


I see what you’re saying - higher design temp radiates better despite more energy overall to dissipate.

> I see what you’re saying - higher design temp radiates better despite more energy overall to dissipate.

Yes, running hotter will cause more energy to be radiated.

but

These parts are not at all designed to radiate heat - just look at the surface area of the package with respect to the amount of power they consume.


I think OP was saying hotter part -> hotter radiator attached to the part, not that the part itself will radiate significantly.

> I think OP was saying hotter part -> hotter radiator attached to the part, not that the part itself will radiate significantly.

Hmm, surely the radiator can run at arbitrary temperatures w.r.t. the objects being cooled? I'm assuming heat pump etc is already part of the design.
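
It can, though pumping the heat up to a hotter radiator costs work, and that work is bounded by the Carnot COP. With made-up numbers (chip coolant loop at 360 K, radiator at 450 K):

    \mathrm{COP}_{\max} = \frac{T_c}{T_h - T_c} = \frac{360}{450 - 360} = 4

so for every 4 W pulled off the chips you add at least 1 W of pump work, and the radiator ends up rejecting all 5 W.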


Figuring out how to radiate a lot of waste heat into a vacuum is just building very large radiators.

From what I understand, very, very large radiators every few racks. Almost as many solar panels every few racks. Radiation shielding to avoid transient errors or damage to the hardware. Then some form of propulsion for orbital corrections, I suppose. Then hauling all of this stuff to space (in a high orbit, otherwise they'd be in shade at night), where no maintenance whatsoever is possible. Then watching your hardware progressively fail and/or become obsolete every few years and having to rebuild everything from scratch again.
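
For a sense of scale, here is a one-sided Stefan-Boltzmann estimate in Python (my own assumed numbers: ~100 kW per rack, a 350 K radiator, emissivity 0.9, absorbed sunlight and plumbing ignored):

    # Back-of-envelope radiator sizing; all figures are illustrative assumptions.
    SIGMA = 5.67e-8  # Stefan-Boltzmann constant, W / (m^2 K^4)

    def radiator_area_m2(power_w, temp_k, emissivity=0.9):
        """Area (m^2) needed to radiate power_w watts at radiator temperature temp_k."""
        return power_w / (emissivity * SIGMA * temp_k ** 4)

    print(radiator_area_m2(100_000, 350))  # ~130 m^2 of radiator per 100 kW rack

Even with generous assumptions, that's on the order of a hundred square meters of radiator per rack.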

His point is that everyone said landing and reusing rockets was impossible and made fun of Elon and SpaceX for years for attempting it.

The difference is that it was mostly clueless people like Thunderf00t who said it was impossible, and nobody took them seriously. I don't remember basically all relevant experts claiming it was near impossible with current technology, which is the situation now.

There's also a fairly clear distinction between the first plans he laid out for Tesla and SpaceX and how insane his plans have become now. He has clearly become a megalomaniac.

Funnily enough, some of the things people said about Tesla are coming true, because Elon simply got bored of making cars. It's now plausible that Tesla may die as a car company, which I would not have imagined a few years ago. They're arguably not even winning the self-driving and robotics race.


No, people made fun of Elon for years because he kept attempting it unsafely, skirting regulations and rules, and failing repeatedly in very public ways.

The idea itself was proven by NASA with the DC-X, but the project was canceled due to funding. Now, instead of having NASA run it, we pay SpaceX more than we'd ever have paid NASA for the same thing.

DC-X test flight: https://www.youtube.com/watch?v=gE7XJ5HYQW4

It's awesome that Falcon 9 exists and it is great technology but this guy really isn't the one anyone should want in charge of it.


>Now, instead of having NASA run it, we pay SpaceX more than we'd ever have paid NASA for the same thing.

This doesn't pass the smell test, given that the cost of launch with SpaceX is lower than it ever was under ULA.

NASA has never been about cheap launches, just novel technology. Look at the costs of Saturn and SLS to see what happens when they do launch.


SpaceX is heavily subsidized and has extremely lucrative contracts with the US government. Not to mention they get to rely on the public research NASA produces.

He also said he could save the US a trillion dollars per year with DOGE, and basically just caused a lot of data exfiltration and killed hundreds of thousands of people, without saving any money at all.

Elon Musk killed hundreds of thousands of people?

Yes. Mostly kids, because of the DOGE ransacking of USAID

https://healthpolicy-watch.news/the-human-cost-one-year-afte...


Not to be crass, but as much as I dislike Musk, US taxpayers are not responsible for the lives of children half a world away. Why is the US the only country held to this standard? No one ever complains that Turkey is killing thousands of children by not funding healthcare initiatives in Africa.

Not crass, it's a fair point.

It is our money and we're not obligated to give it away if we think it's needed for something else. I'd note, though, that in terms of the budget, USAID was like change in the couch cushions, and nothing else in the world was even close in terms of lives saved per dollar. Why the man tasked with saving the government trillions of dollars went there at all was nonsensical to begin with.

Nevertheless, it is fully within our rights to pull back aid if we (collectively) decide it's the best thing to do. But the only legal way to do that is through the democratic process: elected legislators can take up the issue, have their debates, and vote.

If Congress had canceled these programs through the democratic process, there almost certainly would've been a gradual drawdown. Notice and time would have been given for other organizations to step in and provide continuity where they could.

And since our aid programs had been so reliable and trusted, in many cases they became a logistics backbone for all sorts of other aid programs and charities. Shutting it all down so abruptly caused widespread disruption far beyond our own aid programs. Food rotting in warehouses as people starved. Medications sitting in warehouses while people who needed them urgently died. The absolute waste of life and resources caused by the sudden disruption of the aid is a true atrocity.

Neither Elon nor Trump had legal authority to unilaterally destroy those programs outside of the democratic process the way they did, so they are most directly morally responsible for the resulting deaths.

To add insult to injury, Elon was all over Twitter justifying all of it with utterly deranged, insane conspiracy theories. He was either lying cynically or is so far gone mentally that he believed them. I'm not sure which is worse.


> landing and reusing rockets

Currently SpaceX have managed to land the booster only, not the rocket itself, if you are thinking about Starship. And reusability of said rocket is also missing (collecting blown up pieces from the bottom of the ocean doesn't count!).


He said it was impossible; this was done recently, by SpaceX themselves.

Pure speculation, nobody can say for sure, but my guess is 2-3 years.


The latest Nano Banana is pretty good, but it's not perfect yet. And many font use cases demand perfect.

Maybe the next major update will be able to do it.


Fonts are not generated as bitmaps; anyone who doesn't see how AI can and will be good at font generation is a fool.

It wasn't long ago that we thought creativity and programming were safe from AI. Fonts are entirely within the realm of possibilities within 12 months.


Nobody really knows the future. What were originally consumer graphics expansion cards turned out useful in delivering more compute than traditional CPUs.

Now that compute is being used for transformers and machine learning, but we really don't know what it'll be used for in 10 years.

It might all be for naught, or maybe transformers will become more useful, or maybe something else.

'no way' is very absolute. Unlikely, perhaps.


> What were originally consumer graphics expansion cards turned out useful in delivering more compute than traditional CPUs.

Graphics cards were relatively inexpensive. When one got old, you tossed it out and moved on to the new hotness.

Here when you have spent $1 trillion on AI graphics cards and a new hotness comes around that renders your current hardware obsolete, what do you do?

Either people are failing to do simple math here or are expecting, nay hoping, that trillions of $$$ in value can be extracted out of the current hardware, before the new hotness comes along.

This would be a bad bet even if the likes of OpenAI were actually making money today. It is an exceptionally bad bet when they are losing money on everything they sell, by a lot. And the state of competition is such that they cannot raise prices. Nobody has a real moat. AI has become a commodity. And competition is only getting stronger with each passing day.


You can likely still play the hottest games with the best graphics on an H200 in 5 years.


Open question,

If you're on a VPN and using Firefox containers, is the only way to identify me to look at my mouse movement and correlate it?


The older I get the more I see that RMS was right about so many things.

When I was young I used to think of him as that eccentric, pedantic MIT guy, but now I see him as a true warrior for freedom.


Oh yeah. He's been telling us for decades how technology will be used to oppress people. I guess he had the experience of how things turned out with UNIX, and knew first-hand how hard he had to work to even have a chance at undermining them. What he did at the time was build something from scratch which was compatible with the UNIX interface. These days I would call that a lost battle.

Imagine if you said: I'm going to undermine facebook by building another social network which will be Free software, and will be compatible with facebook. I'll federate facebook whether they like it or not, and I'll do that by reverse engineering how facebook servers talk to each other. That wouldn't work because it takes you huge effort to pull off, and it takes facebook zero effort to change the interface in a tiny way that breaks everything for you. (Ok the analogy isn't perfect, but hopefully you get the idea of diminishing something's value by forcefully opening it up)

But he hugely contributed to winning a battle like this in the late '80s; then Linus Torvalds came in and finished the job in 1991 or so. RMS doesn't get the credit or even the appreciation he deserves. I think he's one of the most tragic figures in the history of computers.


I'm actually impressed with Windows 11. I've been using Windows since 3.11: 95, 98, 98SE, 2000, XP (the dream), 7, and 10.

And after a lifetime of Windows, 11 finally got me to switch my desktop OS to Ubuntu!

So 11 achieved something for me that no other Win release could: get me to abandon Microsoft OSs. Though, I admit to having a W10 VM instance with GPU passthrough.


I went through a 10 year Linux phase. Built a new PC about 5 years ago and decided I should get Windows 10 Pro, and use it for work and gaming. When EOL hit, Minecraft wouldn't update, so I switched to 11. The game installs just fine, but will not run unless you log into the Microsoft Store. Since I just wiped my drive anyway, figured I'd just do it one more time with Arch instead. OK, maybe it took 3 more wipes to get it right, but it was well worth it. MC runs perfectly, and so do all the games I care about in Steam.


