
> Then Iranians will be reminded how peaceful and prosperous the most other Muslim countries are.

This is factually incorrect. Top 10 majority-Muslim countries, sorted by population:

Indonesia, Pakistan, Egypt, Turkey, Algeria, Sudan, Iraq, Afghanistan, Morocco, Saudi Arabia

Now, most of those have problems whose seeds lie in Western imperialism, but the point is (a) the majority of them have problems, and (b) Iran's problems also have seeds in US interventions.

The gap between how peaceful and educated most people are, and how bad governments are, is a phenomenon almost unique here. Figuring out how to bridge that gap is the major challenge. The trick would be establishing a collective caliphate -- where the caliph isn't an individual but an institution -- and which spans the Muslim world.


Really?

When did this come up?

I know tons of people who run Windows unactivated. The key difference is there's a watermark. Otherwise, it seems to work fine.


The problem is not activation, it's the login requirement.


Personalization options like wallpaper and color settings are disabled until activation, though I'm sure there are workarounds.


Unless they've fixed it, you used to be able to change that stuff really quickly after install, before activation enforcement locked it down.


For future ML developers: A post like this should include system requirements.

It's not clear from the blog post, the GitHub page, or most other places whether this will run on, even in big-O:

* CPU

* 16GB GPU

* 240GB server (of the type most businesses can afford)

* Meta/Google/OpenAI/Anthropic-style data center
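A post could answer this at the big-O level with nothing but parameter count and precision. A minimal sketch (my own illustrative helper, not from any project; real usage adds activations, KV cache, and framework overhead on top):

```python
def weight_footprint_gb(params_billion: float, bytes_per_param: float = 2.0) -> float:
    """Lower bound: memory needed to hold the weights alone
    (FP16 = 2 bytes/param). Activations, KV cache, and framework
    overhead all come on top of this."""
    return params_billion * bytes_per_param

# Illustrative sizes checked against the hardware tiers above:
for params in (7, 70, 200):
    print(f"{params}B params -> at least "
          f"{weight_footprint_gb(params):.0f} GB just for FP16 weights")
```

A model card stating just the parameter count and weight precision lets a reader do this arithmetic before cloning anything.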


Indeed. I tried to run it locally but couldn't get it running on my measly gaming-spec workstation.

It seems you need lots of RAM and VRAM. Reading the issues on GitHub[1], it doesn't seem many others have had success using this effectively:

- someone with a 96 GB VRAM RTX 6000 Pro had CUDA OOM issues

- someone somehow made it work on an RTX 4090, but the RTF processing time was 12...

- someone with an RTX 5090 managed to use it, but only with clips no longer than 20s

It seems the utility of the model for hobbyists with consumer-grade cards will be low.

[1]: https://github.com/facebookresearch/sam-audio/issues/24


It really depends on your runtime environment, but I agree it would be nice to have some references with commonly used setups.


It does, but my comment was "even in big-O."

Environments might mean the difference between e.g. 16GB and 24GB, but not 16GB and 160GB.


I, respectfully, disagree with this analysis.

Prototyping platforms have tiny markets, but lead to downstream sales. Many a company was brought down by more developer-friendly platforms after ignoring the "tiny" userbase of people who want to do unconventional things.

Most IC vendors provide free samples and support because of this. That's a market size of close to zero -- electronic engineers -- but leads to a market size of "massive." I can get an application engineer to visit my office for free to help me develop if I want.

Arguably, iPhone and Android won by supporting the tiny market of developers, who went on to build an ecosystem of applications, some long-tail, and some unexpected successes.

And arguably, x86 won for the same reason.

Atmel had shipped 500 million AVR flash microcontrollers, due in large part to the ecosystem created by Arduino.

Ballmer said "Developers! Developers! Developers!" Visual Studio was not a major revenue driver for Microsoft; what was developed in it was.


> Prototyping platforms have tiny markets, but lead to downstream sales. Many a company was brought down by more developer-friendly platforms after ignoring the "tiny" userbase of people who want to do unconventional things.

Qualcomm doesn't even make small/cheap MCUs so they aren't going to win over that market by buying Arduino. Their first board post-acquisition is a mashup of a Linux SBC with an MCU devkit, and while the Linux SOC is from QCOM, the MCU is from ST Micro.


>Atmel had shipped 500 million AVR flash microcontrollers, due in large part to the ecosystem created by Arduino.

How do you know the 500 million sales is due to the Arduino ecosystem?

I worked in embedded for 10+ years, and in the 4 companies I've worked at so far, none of the products ever featured AVR microcontrollers. The microcontroller of choice for production was always based on the feature/cost ratio for each application, never on the "is it part of the Arduino ecosystem?" question.

Tinkering with Arduino at home, and building products for mass production, have widely different considerations.


If they sold 500 million microcontrollers and your workplaces never bought any, then your experience doesn't tell us anything about why the people that did buy them, bought them.


All of the products that I've been involved with that included AVR microcontrollers are from before the Arduino platform existed. The STMicro ARM Cortex-M3 chips are more capable and cheaper than the 8-bit AVRs; the Arduino IDE never factored into the decision, even at the height of its popularity.


FWIW: I've used Arduinos, but never with their IDE.

AVR was super-developer-friendly well before the Arduino. It replaced the PIC for a lot of hobbyist projects.

To the points in the thread, on major product development, these things don't matter. On the long tail of smaller products, as well as on unexpected successes, they do.


That is the downside: you can prototype with one chip and, when the concept works, switch. I've worked on many projects over the years where that was done. Sometimes an intern proved it worked with an Arduino -- which was cheap enough to buy without needing supply management -- but then we did the project with 'good code' on our internal controllers. Other times we bought a competitor and, again, the first thing we did was switch them to our controllers. (Our controllers are designed for harsh environments, which means millions of dollars spent designing the case and connectors.)


"Fun" isn't the same thing as "functional."

I remember having great fun in QuickBASIC. And my son enjoys Scratch.

Django code is much more fun to work with than Node, but in 2025 I can't imagine developing something in it competitive with what I'm developing in Node. Node is a pain in the butt, but at the end of the day, competitiveness is about what you deliver to the user, not how much fun you have along the way.

* I think the most fundamental problems are developer-base/libraries and being able to use the same code client-side and server-side.

* Django was also written around the concept of views and templates and similar, rather than client-side web apps, and the structure reflects that.

* While it supports async and web sockets, those aren't as deep in the DNA as for most Node (or even aiohttp) apps.

* Everything I do now is reactive. That's just a better way to work than compiling a page with templates.
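To make the template-vs-reactive contrast concrete, a stdlib-only sketch (stand-ins, not actual Django or Node APIs): the first style ships finished HTML per request; the second ships only data and lets a client-side app own rendering:

```python
import json
from string import Template

items = [{"name": "widget", "price": 9.99}]

# 2005-style server-side templating: the server emits finished HTML.
# ($$ escapes a literal dollar sign in string.Template.)
html = Template("<li>$name: $$$price</li>").substitute(items[0])

# Reactive style: the server emits data only; a client-side app
# (React/Vue/etc.) renders it and re-renders on state changes.
payload = json.dumps({"items": items})

print(html)     # <li>widget: $9.99</li>
print(payload)  # {"items": [{"name": "widget", "price": 9.99}]}
```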

I won't even mention mobile. But how you add that is a big difference too.

It's very batteries-included, but many of the batteries (e.g. the server-side templating language) are 2005-era nickel cadmium rather than 2025-era lithium ion.

I would love to see a modern Node framework as pleasant to work with, thought-out, engineered, documented, supported, designed, etc. as well as Django, but we're nowhere close to there yet.


You spell out a lot of examples, but all of them are purely technical. What is it that you can deliver to the user using Node that you cannot deliver using Django? This is a genuine question.


There is nothing you can't do, given a Turing-complete language.

That doesn't make it reasonable or convenient to do so, though.


You must not be very imaginative.

Plenty of Django businesses making tens of millions. Some in the billions.

I know a solopreneur making around $2m a year, and all he uses is Django.


Man, the only true part is the async/web socket part (and that's mostly because of Python, not Django itself) ... you can do a lot, and by a lot I mean almost 99% of websites/apps out there, with Django and its 2005-era nickel cadmium features.


The lithium-ion battery analogy seems fitting: When we're not careful about sourcing those modern batteries from a trustworthy supply-chain, they tend to explode and injure the user.


It is, and intentionally so.

NiCd batteries could also sit on a shelf forever, holding their charge. They had virtually no self-discharge, which was super-convenient.

They came in standard form factors (AA, AAA, 9V, etc.).

I really liked NiCd batteries.

But realistically, you couldn't sell a phone or laptop in 2025 which ran on them.


It's double-return-fraud.

Amazon shouldn't sell returned products as "new," but as "open box."

The other way it happens is co-mingling: some vendor sends an "open box" or fake product to Amazon as new, and Amazon ships it out even on "sold by Amazon" orders, since it considers identical goods fungible.

I stopped buying anything which goes in my body from eBay, Amazon, and similar after receiving a premium food product with very clearly fake packaging.


Man, I don't think this co-mingling thing was big or existed when I moved to a country that Amazon doesn't ship to directly almost 6 years ago.

Reading about it on HN makes me feel fortunate. I can't recall ever running into something like this back then.


Amazon broke in 2020, when most shopping went online. It never recovered.

I doubt it ever will. Trust takes a long time to earn, and a little bit of time to break. I had four or five incidents on Amazon, cancelled Prime, and I doubt it will ever make business sense for Amazon to get me back.

I do think there's a place for a competitor to Amazon right now which looks more like the old Amazon.

Starting one would be super-capital-intensive. It's not a lean startup. There's only a handful of organizations with the capital to do that, and I doubt any of them will, in fact, do it.


> Amazon broke in 2020

nah. jassy taking the reins in 2021 is when the nosedive began.


He was representative, not causative. The rot had been setting in for years.


>I do think there's a place for a competitor to Amazon right now which looks more like the old Amazon.

If Walmart plays their cards right, they can do it (I mean, they did acquire Jet). Unfortunately, they also seem to be OK with becoming a dropship frontend for AliExpress.


I think modern Walmart understands the problem better than modern Amazon.

Walmart just needs the volume to be able to compete with Amazon's logistics, so is getting volume where it can.


There is some good news. After years of customer and company complaints, Amazon finally ended commingling recently.


The cycle has been:

Piracy -> Friendly ways to buy -> Unfriendly ways to buy -> Piracy -> ...

Unfortunately, getting money flowing back to writers involves passing through piracy first. At that point, a new, consumer-friendly service will sprout up. Everyone will use it.

Over time, the service will want to profit-maximize, and will adopt anti-consumer techniques. Leading people to go to Pirate Bay. Leading to friendly services.

Rinse, repeat.


How many times has this happened, such that it can be called a cycle?

There are other possibilities, such as people simply not writing as much anymore, or higher-quality writers exiting the market due to lack of sufficient return.


Bad DRM led to Napster, which led to Netflix, which led to a fragmentation of services, which led to a resurgence of piracy.

A similar thing happened with music, only rather than piracy, it landed on legal/free (e.g. YouTube). YouTube is just starting to do the consumer-unfriendly thing (but it has a long way to go before piracy comes out competitive).

Similar in books.

I'll mention: A lot of these are consumer-unfriendly in some ways (e.g. Netflix DRM), but friendly in others. $20/month for all the movies you can watch beats piracy.


It’s happened to some degree with music, movies, and TV shows.


Four of these together should, in the abstract, let you run 200GB models, which is where things get very, very interesting.

The biggest DeepSeek V2 models would just fit, as would some of the giant Meta open-source models. Those have rather pleasant performance.

In theory, how feasible is that?

I feel like the software stack might be like a Jenga tower. And PCIe limitations might hit pretty hard.
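The "would just fit" arithmetic can be sketched like this (my own back-of-the-envelope helper; the card size and quantization in the example are hypothetical, and real tensor-parallel deployments lose further capacity to KV cache and uneven sharding):

```python
def fits_in_aggregate_vram(params_billion: float, n_cards: int,
                           vram_per_card_gb: float, bits_per_param: int = 8,
                           headroom: float = 0.9) -> bool:
    """Do the quantized weights fit across the cards, keeping ~10% headroom?
    Ignores KV cache, activations, and imperfect sharding."""
    weights_gb = params_billion * bits_per_param / 8
    return weights_gb <= n_cards * vram_per_card_gb * headroom

# Hypothetical example: a 236B-parameter model across four 64GB cards.
print(fits_in_aggregate_vram(236, 4, 64, bits_per_param=8))  # 236 GB vs ~230 GB usable
print(fits_in_aggregate_vram(236, 4, 64, bits_per_param=4))  # 118 GB, fits comfortably
```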


Well, no. It doesn't. The comparison is to the A1000.

Toss a 5060 Ti into the comparison table, and we're on an entirely different playing field.

There are reasons to buy the workstation NVidia cards over the consumer ones, but those mostly go away when looking at something like the new Intel card. Unless one is in an exceptionally power-constrained environment yet has room for a full-sized card (not SFF or laptop), I can't see a case where the B50 would even be in the running against a 5060 Ti, 4060 Ti, or even a 3060 Ti.


> There are reasons to buy the workstation NVidia cards over the consumer ones

I seem to recall certain esoteric OpenGL things, like fast line rendering, being an NVIDIA workstation marketing differentiator, as only certain CAD packages or similar cared about that. Is this still the case, or has that software segment moved on now?


I don't know the full set of differentiators.

For me (not quite at the A1000 level, but just above -- still in the prosumer price range), a major one is ECC.

Thermals and size are a bit better too, but I don't see that as $500 better. I actually don't see (m)any meaningful reasons to step up to an Ax000 series if you don't need ECC, but I'd love to hear otherwise.


It's competition. Poison increases in toxicity over time.

I could generate subtly wrong information on the internet that LLMs would continue to swallow up.


> I could generate subtly wrong information on the internet

There’s already a website for that. It’s called Reddit.


Yes, but so would people, so what's the point of this unless you just dislike everybody (and if so, that's fair too I suppose)?


The whole design was based on making pages not discoverable except to sketchy AI companies who violate web norms.

