Hacker News | new | past | comments | ask | show | jobs | submit | troyvit's comments | login

I'll pile on with the Desiderata: https://www.desiderata.com/desiderata.html

"And whether or not it is clear to you, no doubt the universe is unfolding as it should. Therefore be at peace with God, whatever you conceive Him to be. And whatever your labors and aspirations, in the noisy confusion of life, keep peace in your soul. With all its sham, drudgery and broken dreams, it is still a beautiful world. Be cheerful. Strive to be happy."


Soap comes from everywhere. It's in the grocery store, drug stores. Hell it's in every hotel you stay in. Just grab it before you go and you've got a few weeks' supply.

I would guess that the median American can count the number of times they've ever been to a hotel on their fingers. Possibly even the average American.

Eh it's the same with motels, which a lot more people go to for one reason or another. It's not even worth cherry picking when there are so many other sources of soap that don't come from the back of a van.

I'll add to the chorus of those who ditched Amazon years ago because of their predatory practices. I do recognize, though, that I'm a relatively rich American so I can afford to; but if everybody who could, did, the market might look different.

That said, how much of that $3k/year is spent on things they need vs things they bought through Amazon's upselling algorithms? I drive past the giant warehouses and I wonder, how much useful stuff is actually in there? Because when I do find myself on amazon.com most of what I see is just trash wrapped in plastic.

And it proves a point: Things are still available at retail. Sometimes it is a box store but just as often it's a smaller shop. Does it take more time? Sure! But seriously, what is everybody using all that time they saved by shopping at Amazon for? From what I see it's more shopping online.


It's a test designed to cause cognitive dissonance. The LLM assumes a human has a logical reason to walk to the car wash. The prompt never says the car isn't already at the car wash (and that the user has a second car). The issue isn't that LLMs can't solve a simple logic problem. It's that they assume people aren't idiots.

Maybe not normal market dynamics, but typical human behavior: https://en.wikipedia.org/wiki/Vincent_Kosuga#Cornering_the_o...

When I do this for family I buy a used pixel. Then no dollar goes directly back to Google.

By ensuring that Pixels have significant resale value, you are encouraging consumers to buy Pixel phones.

Still, you are stopping the extraction of analytics, which probably brings Google much more revenue over the long term, and which is not possible to disable on regular Android phones.

Remember that on every certified Google Android phone, Google Play Services runs with system-level privileges. On GrapheneOS, it is sandboxed like pretty much any other app (if you choose to install Play Services) and you can make it 'blind' by revoking most privileges.

Same for Pixel Camera, etc., I just block network access.


Right? We need a "You won capitalism!" award where everybody in the org gets a huge bonus and then the company is split into small pieces and then they start over. On top of it we do what you describe and enforce the split so they can't collude.

So they create a new chip for every model they want to support, is that right? Looking at that from 2026, when new large models come out every week, that seems troubling, but it's also a surface take. As many people here know better than I do, a lot of the new models the big guys release are just incremental changes with little optimization going into how they're used, so maybe there's plenty of room for a model-as-hardware approach.

Which brings me to my second thing. We mostly pitch the AI wars as OpenAI vs Meta vs Claude vs Google vs etc. But another take is the war between open, locally run models and SaaS models, which really is about the war for general computing. Maybe a business model like this is a great tool to help keep general computing in the fight.


We’re reaching a saturation threshold where older models are good enough for many tasks, certainly at 100x faster inference speeds. Llama 3.1 8B might be a little too old to be directly useful for e.g. coding, but it certainly gets the gears turning about what you could do with one Opus orchestrator and a few of these blazing fast minions to spit out boilerplate…
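To make the orchestrator/minion idea concrete, here's a minimal routing sketch. The model names, the dispatch heuristic, and the keyword list are all invented for illustration; a real setup would route by cost/latency budget rather than keywords, and would call actual inference endpoints.

```python
def route(task: str) -> str:
    """Decide whether a task goes to a cheap fast local model or the
    expensive orchestrator. The marker list is a toy heuristic."""
    boilerplate_markers = ("getter", "setter", "scaffold", "stub", "dockerfile")
    if any(m in task.lower() for m in boilerplate_markers):
        return "local-minion"    # e.g. an older small model baked into fast silicon
    return "orchestrator"        # e.g. a frontier model like Opus

tasks = [
    "Design the plugin architecture",
    "Generate getter/setter stubs for the User class",
    "Write a Dockerfile scaffold for the service",
]
assignments = {t: route(t) for t in tasks}
```

The point is only that the orchestrator's expensive tokens get spent on planning, while boilerplate generation fans out to the fast minions.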

One of these things, however old, coupled with robust tool calling is a chip that could remain useful for decades. Baking in incremental updates of world knowledge isn't all that useful. It's kinda horrifying if you think about it, this chip among other things contains knowledge of Donald Trump encoded in silicon. I think this is a way cooler legacy for Melania than the movie haha.
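A hedged sketch of why tool calling keeps a frozen model useful: the model's baked-in world knowledge goes stale, but it can emit tool calls and let a live data source answer. The stub "model", the tool name, and the JSON protocol here are all made up for illustration.

```python
import json

# Assumed live data source the frozen model can consult.
TOOLS = {"current_year": lambda: 2026}

def stub_model(prompt: str) -> str:
    """Stand-in for a frozen model: emits a JSON tool call when it
    recognizes a question its baked-in knowledge can't answer."""
    if "year" in prompt:
        return json.dumps({"tool": "current_year", "args": {}})
    return "I don't know."

def run(prompt: str):
    out = stub_model(prompt)
    try:
        call = json.loads(out)       # did the model request a tool?
    except json.JSONDecodeError:
        return out                   # plain text answer, pass it through
    return TOOLS[call["tool"]](**call["args"])
```

The silicon never changes; only the tool registry does.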

What if you did like The Guardian does, but writ small, and just suggested how much the article was worth after a person reads it? Often when I read something really good I have a bit of a high at the end of it, real gratitude for the writer(s) and others who put it together. I wouldn't mind if that feeling was monetized.

I think I've made two good decisions in my life. The first was switching entirely to Linux around '05, even though it was a giant pain in the ass that was constantly behind the competition in terms of stability and hardware support. It took a while, but wow, no regrets.

The second appears to be hitching my wagon to Mistral even though it's apparently nowhere as powerful or featureful as the big guys. But do you know how many times they've screwed me over? Not once.

Maybe it's my use cases that make this possible. I definitely modified my behavior to accommodate Linux.


Maybe they're just too small to screw you over. But at least you've got more time until they do.
