The idea of getting a refund for mischaracterized tariffs is actually fairly common (it's called a duty drawback and there's a cottage industry around this). It's generally used when an importer incorrectly categorized their import under an HS code that has a higher duty than the correctly categorized HS code.
The difference this time is the scale is orders of magnitude larger. Will be interesting to see how they (importers and CBP) work through this.
A regular importer who routinely pays customs duties is now owed money by Customs and Border Protection. Can they now set off future duties against the balance owed them? Normally, reciprocal debt cancellation is legal.
The U.S. Treasury has a whole system for this, but in the other direction. If the government owes you money, and you owe the government money, the Treasury will deduct what you owe from whatever they are paying out.[1] But they're not set up for that in the other direction.
From what I understand, they make money on per-token billing. Not enough to cover what it costs to train, and not accounting for marketing, subscription services, and research for new models, but the more the models are used, the less money they lose.
Finance 101 tldr explanation:
The contribution margin (= price per token − variable cost per token) is positive.
Profit = contribution margin × quantity − fixed costs.
If 100% of the money they spend were on inference priced by tokens (they don't say anything about subscriptions, so I assume they lose money there), then yes, they make money. But their expenses are far higher than inference alone.
So they can cover the GPU cost if they sell tokens, but in reality that isn't the whole picture, because they also have to constantly train new models, plus subscription marketing, R&D, and overhead.
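The two formulas above can be sketched in a few lines (all numbers here are made up purely for illustration; they are not real per-token prices or costs):

```python
def contribution_margin(price_per_token: float, variable_cost_per_token: float) -> float:
    """Per-unit profit before fixed costs."""
    return price_per_token - variable_cost_per_token

def profit(price_per_token: float, variable_cost_per_token: float,
           quantity: float, fixed_costs: float) -> float:
    """Profit = contribution margin x quantity - fixed costs."""
    return contribution_margin(price_per_token, variable_cost_per_token) * quantity - fixed_costs

# Illustrative numbers: $10 per unit sold, $4 per unit of GPU (inference) cost.
cm = contribution_margin(10.0, 4.0)  # positive: each token sold "makes money"
# But fixed costs (training, R&D, marketing, overhead) can still swamp it:
print(profit(10.0, 4.0, quantity=1_000_000, fixed_costs=10_000_000_000))
```

With a positive contribution margin, selling more tokens shrinks the loss, but the company is only profitable once quantity × margin exceeds the fixed costs.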
Anthropic in general has lost far less money than its competitors.
I'd take these numbers with a grain of salt, particularly the projected break-even, but googling says the following.
Gross margin in this case is roughly how much money they make relative to the GPU (inference) cost.
"
Gross Margins: Projected to swing from negative 94% last year to as much as 50% this year, and 77% by 2028.
Projected Break-even: The company expects to be cash flow positive by 2027 or 2028. "
I won't be so bullish as to say they won't collapse (I have zero idea how much real debt and commitments they have, whether spending falls sharply after the bubble pops, or whether there's a new DeepSeek moment), but this sounds like a good trajectory, all things considered. I heavily doubt the $380 billion valuation, though.
"this is how much is spendeed in developers
between $659 billion and $737 billion. The United States is the largest driver of this spending, accounting for more than half of the global total ($368.5 billion in 2024)"
So it's like saying that about 2% of all developer salaries in the world would be absorbed as profit at the current 33.3 ratio, which is quite high given the amount of risk in the company.
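A quick back-of-envelope check of that "2%" figure (assuming, as my reading of the comment, that the 33.3 ratio is a valuation-to-revenue multiple; that interpretation is mine, not stated above):

```python
# Back-of-envelope: what revenue does a $380B valuation at a 33.3x
# valuation-to-revenue multiple imply, and what share of global
# developer spending would it represent?
valuation = 380e9            # the $380B valuation from the comment
ratio = 33.3                 # assumed valuation-to-revenue multiple
implied_revenue = valuation / ratio

# Global developer spending range quoted above:
dev_spend_low, dev_spend_high = 659e9, 737e9

share_low = implied_revenue / dev_spend_high
share_high = implied_revenue / dev_spend_low
print(f"${implied_revenue/1e9:.1f}B revenue = "
      f"{share_low:.1%}-{share_high:.1%} of global dev spend")
```

This comes out to roughly $11.4B of implied revenue, or about 1.5-1.7% of the quoted developer-spending range, which is in the ballpark of the 2% claimed above.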
For the most part, you can replicate any B2B SaaS product in a spreadsheet. The same reasons why spreadsheets didn't kill B2B SaaS apply to "in house vibe coded SaaS replacements." The original in house apps are (and continue to be) spreadsheets.
Idk about this news specifically, but Oracle CDS prices are moving. The link below says 30k layoffs may hit Oracle, which feels a bit hyperbolic, so this article may not be grounded in reality.
In my experience, it’s basically impossible to accurately measure productivity of knowledge work. Whenever I see a stat associated with productivity gain/loss, I get skeptical.
If you go the pure subjective route, I’ve found that people conflate “speed” or “productivity” with “ease.”
The METR study has a unique approach to measuring productivity. It’s the only one I’ve seen that holds water.
I don’t think I can do the approach justice here, but the short version is that they have the developer estimate how long a change will take, then randomly assign that task to be completed with AI or normally and measure how long it actually takes. Afterwards, they compare the differences in the ratios of estimates to actuals.
This gets around the problem of developer estimates being inaccurate by comparing ratios.
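A toy illustration of that ratio comparison (this is my sketch of the idea, with invented numbers; it is not METR's actual analysis code or data):

```python
# (estimate_hours, actual_hours) per task, split by random assignment:
with_ai = [(4.0, 5.5), (2.0, 2.6), (8.0, 10.0)]
without_ai = [(4.0, 4.4), (3.0, 3.0), (6.0, 6.3)]

def mean_ratio(tasks):
    """Average of actual/estimate; >1 means tasks ran over their estimates."""
    return sum(actual / est for est, actual in tasks) / len(tasks)

# Comparing ratios (rather than raw hours) cancels out a developer's
# systematic over- or under-estimation, since it affects both groups.
print(f"AI: {mean_ratio(with_ai):.2f}x  no-AI: {mean_ratio(without_ai):.2f}x")
```

In this made-up data, the AI-assigned tasks overrun their estimates by more than the control tasks, which is the kind of comparison the ratio method makes visible even when the raw estimates themselves are inaccurate.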
They want the Trump Jr. and Chamath connections to be forgotten by the time the media and YouTubers get active again after New Year. It's a standard media tactic.
Honestly, it's basically an accepted fact that the best thing to do is to never respond to anything the media/people say. If you respond, the conversation continues. If you say nothing, they'll move on to the next topic, because everyone needs to keep their audience engaged. With no response, their audience loses interest too, so they have to move on to something else.
A lazy, easy cheap shot. But do you deny that these aspects from the article are coming? Or that they'll still be here in 5 years?
- Addition of more—and faster—memory.
- Consolidation of memory.
- Combination of chips on the same silicon.
All of these are also happening for non AI reasons. The move to SoC that really started with the M1 wasn't because of AI, but unified memory being the default is something we will see in 5 years. Unlike 3D TV.
We just had a series of articles and sysadmin outcry that major vendors were bringing 8 GB laptops back as standard models because of RAM prices. In the short term, we're seeing a reduction.
In terms of demand, anecdotally speaking, I can certainly see this influencing some decisions when other circumstances permit. Many people I know are both excited for new and better games, and equally excited about running LLM/SD/etc. models locally with Comfy, LM Studio, and the like.
- People wanting more memory is not a novel feature. I am excited to find out how many people immediately want to disable the AI nonsense to free up memory for things they actually want to do.
- Same answer.
- I think the drive towards SOCs has been happening already. Apple's M-series utterly demolishes every PC chip apart from the absolute bleeding-edge available, includes dedicated memory and processors for ML tasks, and it's mature technology. Been there for years. To the extent PC makers are chasing this, I would say it's far more in response to that than anything to do with AI.
The move to SoC happened long before the M1, it was the state of things in the ARM space for over a decade, and most x86 laptops have been SoCs for quite some time.
The line between betting, investing, and insurance is blurring. For example, if I want to hedge my exposure to tariffs on copper wire, I can now "bet" that new tariffs will be implemented on metals.
That's a good thing? But it opens up a Pandora's box of bad things, too.