
Indeed. I've tried to run it locally but couldn't get it running on my measly gaming-spec workstation.

It seems you need lots of RAM and VRAM. Reading the issues on GitHub[1], it does not seem many others have had success in using this effectively:

- someone with a 96 GB VRAM RTX 6000 Pro had CUDA OOM issues

- someone somehow made it work on an RTX 4090, but RTF processing time was 12...

- someone with an RTX 5090 managed to use it, but with clips no longer than 20s

It seems the utility of the model for hobbyists with consumer-grade cards will be low.

[1]: https://github.com/facebookresearch/sam-audio/issues/24


Miracle Max gave us a clear definition, if I recall: you die when you are "all dead"; as long as you are only mostly dead, you are slightly alive...

I'll let myself out now.


“If we were not perfectly convinced that Hamlet's Father died before the play began, there would be nothing more remarkable in his taking a stroll at night, in an easterly wind, upon his own ramparts, than there would be in any other middle-aged gentleman rashly turning out after dark.”


In his last blog post, Sam Altman also revealed how much power the average ChatGPT query uses, and it's in the same ballpark.

> People are often curious about how much energy a ChatGPT query uses; the average query uses about 0.34 watt-hours, about what an oven would use in a little over one second, or a high-efficiency lightbulb would use in a couple of minutes. It also uses about 0.000085 gallons of water; roughly one fifteenth of a teaspoon.

https://blog.samaltman.com/the-gentle-singularity
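
A quick back-of-the-envelope check of those figures (the ~1 kW oven and ~10 W LED wattages are my own assumptions, not from the post):

    query_wh = 0.34                  # Wh per query, per the post
    oven_w, led_w = 1000, 10         # assumed draws: oven ~1 kW, LED bulb ~10 W
    print(query_wh / oven_w * 3600)  # ~1.2 -> about a second of oven use
    print(query_wh / led_w * 60)     # ~2.0 -> about two minutes of LED use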


Gallons and teaspoons... the units of measurement of the 21st century!


Damn, I would not have guessed that Men In Black was actually a documentary...


I thought the universe they were saving in that one was some kind of "fish bowl" universe (galaxy?)


Maybe they want to compile the Apollo Guidance Computer source code...

https://www.softwareheritage.org/wp-content/uploads/2019/07/...


If it's not a joke, I think it was already digitized: https://github.com/chrislgarry/Apollo-11


The Metabase "backend" is written in Clojure.

The web frontend is written in TypeScript/React.


Interesting point about PCs going digital-only, as Nintendo is a fascinating counter-example.

While they offer digital downloads on the eShop, their pricing actively discourages it.

Case in point: I just bought my kid a new first-party Switch game. The physical copy on Amazon was ~25% cheaper than the identical digital version on Nintendo's own eShop. Even my 9-year-old noted how illogical it seems: the physical version requires manufacturing, shipping, and retail markup, yet costs significantly less than the digital bits that have near-zero marginal cost.

It strongly suggests Nintendo wants the physical retail channel to thrive, or values the perceived permanence/resale value of cartridges.

This context makes the Switch 2 "gamekey" cartridges (physical auth token, digital download) fit their pattern of valuing a physical artifact and retail presence, even if the data delivery shifts.


And physical, at least to date, retains resell value as well. If you want to play an expensive Nintendo release that effectively never goes down in retail value, it's reasonably safe to buy it, play it, and resell it if you don't want it indefinitely. Nintendo never lowering their prices helps anchor the value high even in the resell market most of the time.

I haven't read enough about this to know if the gamekey will kill this, but it's certainly only a matter of time before they are all coded and bindable to only one account. Technically this has obviously been possible for a long time; they just haven't dared to pull that trigger yet. They clearly want to.


> And physical, at least to date, retains resell value as well.

That stopped being true as soon as the DS line started and they switched to flash memory that degrades over time when it isn't powered. People's DS games are already failing. The same will happen to Switch games. Only a few hardcore collectors are going to pay money for a cartridge that doesn't let you play the game anymore.


That's more like antique value. Resale as meant here occurs within the first few months to maybe years after a game releases. Degradation that happens on a time scale of a decade or more will not be a significant issue for ordinary resale.


I believe it’s to ensure retailers want to sell the games.

When I go to the Virgin Megastore, there's a large section of Nintendo games, but nothing for any PC games.

If it was cheaper to buy online, why would I buy physically, unless I was really into physical media?

And that segment isn’t big enough to cause retailers to put up a section in the mall, which not only works as a point of sale but also as a giant advertising banner.


It's not only Nintendo, but all consoles that still have physical versions of games.

I never buy digitally from Sony, for example. The discs get discounted far earlier than the occasional digital sale.

Plus since we're talking about preservation, I don't trust Sony to make my digital purchases available indefinitely.

The only content you really own is the cracked version from The Pirate Bay...


Price discrimination. Digital is more convenient for the consumer, hence they naturally prefer it. Consumers who are more price-sensitive can instead choose to put up with the inconvenience of a physical purchase in exchange for a cheaper price.


Nintendo has raised the physical prices for Switch 2 games more than the digital editions. E.g. Mario Kart will cost 90 USD/EUR physically and 80 USD/EUR digitally. Thus, their retail-friendliness has diminished with the new generation.


If I understand it correctly, that's a valid concern, but the way structured generation libraries like outlines[1] work is that they can generate multiple variants of the inference (which they call beam search).

One beam could be "This is a way to solv-", with no obvious "good" next token. Another beam could be "This way is solv-", with "ing" as the obvious next token.

It will select the best beam for the output.

[1]: https://github.com/dottxt-ai/outlines
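
For intuition, here is a toy Python sketch of that idea; it is not the outlines implementation, and the vocabulary and scoring function are made up. Each step expands every beam with only the tokens the constraint allows (here: no letter 'e'), then keeps the highest-scoring beams:

    import heapq

    VOCAB = ["solv", "ing", "tion", " is", " way", "a", "e"]  # toy vocabulary

    def log_prob(prefix: str, token: str) -> float:
        # Stand-in for a real LM score; mildly favors longer tokens.
        return -1.0 / (len(token) + 1)

    def allowed(token: str) -> bool:
        # The structural constraint: forbid the letter 'e'.
        return "e" not in token

    def beam_search(start: str, steps: int = 3, width: int = 2) -> str:
        beams = [(0.0, start)]  # (cumulative log-prob, text)
        for _ in range(steps):
            candidates = [
                (score + log_prob(text, tok), text + tok)
                for score, text in beams
                for tok in VOCAB
                if allowed(tok)  # mask disallowed tokens before scoring
            ]
            beams = heapq.nlargest(width, candidates)  # keep the best beams
        return max(beams)[1]

    print(beam_search("This way is"))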


I can’t say for certain, but I’d guess that writing without the letter “e” is slightly more difficult in French than in English. For one, “e” is a bit more common in French (around 15% of all letters, versus about 12% in English). But more importantly, French grammar adds extra challenges—like gender agreement, where feminine forms often require an “e”, and the frequent use of articles like le and les, which become unusable.

That said, I think the most impressive achievement is the English translation of the French novel. Writing an original constrained novel is hard enough, but translating one means you can’t just steer the story wherever you like. You have to preserve the plot, tone, and themes of the original, all while respecting a completely different set of linguistic limitations. That’s a remarkable balancing act.


I was going to point that out.

What I will add is that constrained generation is supported by the major inference engines like llama.cpp, vLLM and the like, so what you are describing is actually trivial on locally hosted models: you just have to provide a regex that prevents them from using the letter 'e' in the output.
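
A rough sketch of that idea using Hugging Face transformers (my choice for a self-contained example; llama.cpp and vLLM expose the same capability through GBNF grammars or guided decoding, with their own syntax). The brute-force "ban every token containing the letter" processor below is illustrative, not how regex-guided decoding is actually implemented:

    from transformers import (AutoModelForCausalLM, AutoTokenizer,
                              LogitsProcessor, LogitsProcessorList)

    class BanLetter(LogitsProcessor):
        def __init__(self, tokenizer, letter: str = "e"):
            # Precompute IDs of every token whose text contains the letter.
            # (A real setup would whitelist special tokens such as EOS.)
            self.banned = [
                tid for tid in range(len(tokenizer))
                if letter in tokenizer.decode([tid]).lower()
            ]

        def __call__(self, input_ids, scores):
            scores[:, self.banned] = float("-inf")  # banned tokens can't win
            return scores

    tok = AutoTokenizer.from_pretrained("gpt2")  # small model, for illustration
    model = AutoModelForCausalLM.from_pretrained("gpt2")

    out = model.generate(
        **tok("A lipogram is", return_tensors="pt"),
        max_new_tokens=30,
        logits_processor=LogitsProcessorList([BanLetter(tok)]),
    )
    print(tok.decode(out[0]))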


You can do this more properly with the antislop sampler, and we are working on a follow-up paper to our previous work on this exact problem.

https://github.com/sam-paech/antislop-sampler

https://arxiv.org/abs/2306.15926

