TL;DW: The difference in tax revenue between a surface parking lot and a business with subterranean parking is so vast that cities can justify using the projected increase in tax revenue to underwrite the loans developers need to do the work (called "Tax Increment Financing"). This model is proving extremely successful in cities that try it.
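Roughly, the mechanics look like this (a toy sketch with made-up assessed values and tax rates, just to show how the increment is what services the debt):

```python
# Hypothetical numbers purely to illustrate Tax Increment Financing (TIF).
property_tax_rate = 0.02                    # assumed 2% annual property tax

assessed_value_parking_lot = 1_000_000      # surface lot (hypothetical)
assessed_value_development = 20_000_000     # building w/ underground parking (hypothetical)

baseline_tax = assessed_value_parking_lot * property_tax_rate   # $20,000/yr
new_tax = assessed_value_development * property_tax_rate        # $400,000/yr
increment = new_tax - baseline_tax                               # $380,000/yr

# Under TIF, the city pledges that increment toward the loan/bond that financed
# the development, instead of it flowing to the general fund.
capacity_over_20_years = increment * 20     # ~$7.6M of debt service, ignoring interest
print(baseline_tax, new_tax, increment, capacity_over_20_years)
```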
This means Apple is literally going to take nearly 3x the fees from Patreon's customers that Patreon itself takes from those same customers.
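For rough numbers (Patreon's own platform fee is roughly 8-12% depending on plan; treat these as approximations, which is presumably where the "nearly 3x" comes from):

```python
apple_cut = 0.30
patreon_cut = 0.10                  # approximate; Patreon's fee is roughly 8-12% by plan

pledge = 10.00                      # a $10/month pledge made through the iOS app
apple_fee = pledge * apple_cut      # $3.00 to Apple
patreon_fee = pledge * patreon_cut  # ~$1.00 to Patreon
print(apple_fee / patreon_fee)      # ~3x
```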
My understanding is that the reason the number 30% is so magical is a historical anomaly. When software was physically distributed back in the day, 15% of the MSRP was reserved for the distributor and another 15% for the retailer. When these digital marketplaces were set up, the companies just said "well, we're the distributor and the retailer, so we'll keep both" - forgetting that the cost to distribute and retail software is now literally pennies on the dollar of what it used to be.
I think the irony in this case is that this is a greed problem of their own making. When Steve Jobs announced that apps on the original iPhone would only be $1-$3, he set off the first enshittification crisis in the software industry. In 2008, Bejeweled cost $19.99 if you wanted to buy it on the PC. On the iPhone it was $0.99! This artificially low anchor price is what kicked off the adoption of ad and subscription driven software models in the first place.
My understanding was that the retailer margin was 50% and the distributor margin was 10%. So Apple/Steam/etc went "half of 60% is a great deal".
Of course the retailer margin is never actually 50%. That's theoretical if 100% of product is sold at MSRP. Actual retail margins are about 25% because of sales, write-offs, et cetera.
OTOH when there's a sale in Steam, they still get their full cut (of the reduced price).
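Working through the numbers in this thread (the 15%+15% and 50%+10% splits, the ~25% realized margin, and Steam's cut of a discounted price) with a hypothetical $50 boxed game:

```python
msrp = 50.00

# The "15% distributor + 15% retailer" framing:
split_15_15 = msrp * (0.15 + 0.15)     # $15 kept by the channel, $35 to the publisher

# The "50% retailer + 10% distributor" framing:
split_50_10 = msrp * (0.50 + 0.10)     # $30 kept by the channel, $20 to the publisher
digital_cut = msrp * 0.30              # "half of 60%" is how a 30% store cut looks like a deal

# Realized retail margin is closer to 25% once sales and write-offs are included:
realized_retail = msrp * 0.25          # $12.50

# On Steam, a 50%-off sale still pays the full 30% cut, just on the lower price:
sale_price = msrp * 0.50
steam_cut_on_sale = sale_price * 0.30  # $7.50; the publisher keeps $17.50
print(split_15_15, split_50_10, digital_cut, realized_retail, steam_cut_on_sale)
```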
I remember writing apps for PalmOS (a long time ago). Distributors like PalmGear took over 60% from international developers like me, plus they held your earnings until you hit a minimum payout threshold. Add bank fees on top of that, and it was basically not worth developing for the platform. 30% felt like a godsend in comparison. (I'm not defending the Apple / Google tax)
From what I could find, it does seem that major retailers back in the day (CompUSA, Circuit City, etc) were only making 15% margin on software sales. This is much lower than other product categories - but also software didn't take up much floor space.
It's the agency model vs. the retail model. Recall that Amazon hated the agency model, where the publisher sets the price (and a 30% cut goes to the app store - Jobs sold this as an amazing deal). In the retail model the retailer sets the price, and the publisher is guaranteed the wholesale price. Amazon preferred the latter because they competed on dynamic price setting. This was so long ago we forget.
It coupled small floor space with high prices and an overall ease of handling (low weight, resistance to minor impacts, easy stacking, etc.).
So that margin not only had to cover low handling costs and a small opportunity cost on floor space, it was also applied to a high unit price.
Had no idea about the history and the 15%/15% split but when the topic comes up I just remember how good the 30% seemed back in, what, 2008?
It made perfect sense that this shiny new iOS platform would take 30% of a cheap app to ensure that it matches the high quality of iOS. These were little productivity apps and games at the time.
This however - I just don't understand what the need is for an app at all for Patreon. Isn't this a website/platform kind of thing? Wouldn't an app just be an additional window into the Patreon platform?
What's next - 30% of my pizza price goes to Apple because I ordered it on my phone?
> What's next - 30% of my pizza price goes to Apple because I ordered it on my phone?
You joke but this already happens with places like DoorDash. They take 30% of the order from the store owner, on top of adding their own fees to the order that customers pay.
Someone I know owns a pizza store and his prices are 30% higher on DoorDash, but some people still pay. The big difference is it's not a monopoly. He offers regular delivery at normal store prices and 95% of his deliveries go through that.
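Worth noting that the 30% markup doesn't even make the owner whole (illustrative numbers, ignoring DoorDash's separate customer-side fees):

```python
menu_price = 20.00                   # in-store price for a pizza (hypothetical)
doordash_price = menu_price * 1.30   # listed 30% higher on DoorDash
doordash_cut = 0.30                  # platform takes 30% of the order from the owner

owner_receives = doordash_price * (1 - doordash_cut)   # $18.20
print(owner_receives / menu_price)   # ~0.91: even with the markup, the owner nets ~9% less
```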
I was working for a small software company at the time and we thought it was outrageous. We were selling our software online direct through our own web site and the cost was far lower. A few percent for credit card processing fees, and the server/bandwidth cost was inconsequential.
>This however - I just don't understand what the need is for an app at all for Patreon. Isn't this a website/platform kind of thing? Wouldn't an app just be an additional window into the Patreon platform?
That's the other part of the proxy war happening with mobile. The web was unregulated and hard to profit off of, so Jobs took great strides to push the "there's an app for that" mentality that overtook that age. This had the nifty side effect of killing off Flash, but it's clear the ambitions didn't stop there. Not to mention all the other web-hostile actions taken on iOS to make it do only the bare minimum required to not piss off customers.
It very much could just be a website with no reliance on iOS as a dependency. But Apple clearly doesn't want that.
>What's next - 30% of my pizza price goes to Apple because I ordered it on my phone?
I'm pretty sure Apple has discussed things exactly like this.
Their upper management really does tend to think that 30% of any monetary transaction on an Apple platform belongs to them. Too bad our government is too busy being run by billionaires to do anything about these abuses from billionaires.
Really hope the 2nd wave of Sherman Act enforcement hits these big tech companies hard if/when this regime inevitably falls. I just hope there's something left of America when it happens.
Steve Jobs never announced a price ceiling for apps on the App Store. The well-known I Am Rich app for iPhone retailed for $999, the actual price ceiling.
30% might be fair when you have a choice of either marketing and selling your app yourself, or just using an app store to do everything for you. But when you are forced to use the app store, things get really stupid really fast.
Apple still insists that the app store "provides value" for developers. They simply can't comprehend the harsh reality that these days, for most developers, the app store isn't the godsend service that helps their app get discovered, but instead an asinine bureaucratic obstacle they have to clear, and then regularly attend to, to have an iOS app at all.
The Mac app store, being optional for developers, is a good example of how much people actually want something like this.
> Apple still insists that the app store "provides value" for developers. They simply can't comprehend the harsh reality that these days, for most developers, the app store isn't the godsend service that helps their app get discovered, but instead an asinine bureaucratic obstacle they have to clear, and then regularly attend to, to have an iOS app at all.
Oh, no, they can comprehend, they just don't care. Apple controls access to a valuable pool of business, and they are going to extract as much value as possible from people wanting access to that pool. And, of course, they are going to try to burnish it with marketing speak, but that doesn't mean they believe their own marketing.
Counterpoint: "Low wage underclass" has described most humans for all of human existence. It is only in very recent history that we have had a large middle class. And while the middle class is slipping, I don't think the message is that we need to "redesign society".
Society has been working well recently (on the grand scale) - we just need to tweak some of the settings so that we don't backslide into aristocracy and feudalism.
"All of existence" is pretty wildly wrong. "All of civilization" might be a bit more on track. I wouldn't describe hunter-gatherer societies as particularly hierarchical. There may have been individuals in charge, but the concept of a leisure nobility class didn't seem to form until we developed agriculture.
I was specifically talking about quality of life, I wasn't even thinking of hierarchy.
Although, that difference speaks a lot about how one might see this argument. I suppose some people may be willing to have a worse life if the tradeoff was a more egalitarian world.
I am unclear how you got that out of my statement at all. I am arguing that we have made huge leaps of progress that we should not be willing to give up.
A good illustration of this is in Hans Rosling's books. We're making unprecedented bounds on metrics that matter - like childhood poverty, disease, illiteracy, hunger, child labor, violent crime, lead usage, etc.
Some of these things we are at risk of backsliding on, but for even the poorest person in America the quality of life is so much better today than it was even 50 years ago.
But that isn't nearly good enough, and it's much worse for people in many other countries.
And that is largely despite many structural aspects of our society. There have been some improvements to social structures, but almost all quality of life improvements have been from technology gains.
The social structures are fundamentally based on elitism and exploitation. The prevailing counterviews seem to be basically 1950s style centralized planning.
I'm not saying we should throw the baby out, but we need a more fair, refined, and technologically sound foundational worldview.
I don't think we should abandon money or centralize things. But we do need, for example, protocols and/or protocol registries enforced by government for sharing information effectively, such as about energy and resources. We also need the monetary systems to be integrated into truly democratic government in such a way that resources and power are distributed in a sane way.
> But Amodei argues that AI increasingly matches the full range of human cognitive abilities, so it will take away jobs drafting memos, reviewing contracts, and analyzing data that might otherwise emerge. A customer service rep who retrains as a paralegal would find AI waiting there, too. "AI isn't a substitute for specific human jobs but rather a general labor substitute for humans," he wrote.
A) This is a bunch of marketing hype disguised as performative commiseration. "Our product is so good we need to think about changing laws". Maybe that's true, but let's not get ahead of ourselves. We don't even know the true costs associated with AI or how much better it's even going to get.
B) All of these previous inventions were also touted as "general labor substitutes" and none of them ended up being true.
Let me make an alternate case - coffee in this country used to be an entirely automated process. Everyone had a dedicated coffee robot in their house. But our tastes have shifted so much that now the average person gets multiple cups of handmade coffee a week. An entirely new job category called "barista" was introduced, and today we have over half a million of them. They are not high wage jobs, but they are comparable to something like the customer service rep job that Amodei is apparently worried about.
Even if AI were to take away massive swathes of white collar jobs (I'm still skeptical), the historical expectation is that new, unforeseen labor categories open up. Just like nobody inventing a computer thought QA tester or video game streamer or spreadsheet technician was going to be a job in the future.
It's like an inverse Baumol's Cost disease - if AI does tank the value of all of these services, all of the services that require, I dunno - physical hands, go up in value. All of the niche things AI can't do suddenly become all the more valuable.
I've also heard this referred to as the Jevons paradox.
I use AI to build AI systems that are being used and paid for by real users. I'm in the trenches doing customer discovery, designing UX and producing product. Everything I do in my work either leverages AI or is built on AI capabilities.
The LLMs we have today are not capable of replacing most human knowledge worker jobs, and it's not clear to me they will be anytime soon. They are powerful tools but they are not remotely as generally intelligent as people.
The applications being built are more like power tools. The power drill and power saw didn't replace carpenters, it just enabled them to build more and faster.
People see a language calculator doing things that seem super human and assume it will replace human knowledge work, but without realizing the actual range of human intelligence and how it contrasts to LLMs.
There were people with the job title "calculator", and we don't have those narrow jobs any more, but there are more people using electronic calculators/computers to do calculations in their work than ever before.
There are at least three important misconceptions in thinking that AI is going to replace all human labor anytime soon:
- underestimating human intelligence
- overestimating current AI and its rate of progress
- underestimating current unmet and future demand for getting stuff done
Even if AI makes knowledge work 99% more efficient, it doesn't mean there will be 99% fewer knowledge workers... More likely it means that there will be 100x or more demand for knowledge work, and the demand for the human components will stay the same or even grow.
A good example of this happening was the introduction of the ATM. ATMs actually increased the need for bank tellers, because so many more people ended up opening or using a bank account once getting money in and out became more convenient.
(Of course, the prestige and pay of being a teller did take a hit as well).
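The arithmetic behind that: headcount only falls if demand grows slower than productivity (toy numbers, reading "99% more efficient" as doing the same work with 1% of the effort, i.e. ~100x productivity):

```python
# Toy model: headcount = demand / productivity_per_worker
baseline_demand = 1_000              # units of knowledge work demanded (arbitrary units)
baseline_productivity = 1.0
baseline_headcount = baseline_demand / baseline_productivity   # 1000 workers

new_productivity = 100.0             # "99% more efficient" read as ~100x output per worker

for demand_multiplier in (1, 100, 200):
    new_headcount = (baseline_demand * demand_multiplier) / new_productivity
    print(demand_multiplier, new_headcount)
# 1x demand   ->   10 workers (the doomer scenario)
# 100x demand -> 1000 workers (headcount unchanged)
# 200x demand -> 2000 workers (headcount grows, as with ATMs and tellers)
```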
The point you're making is important - these geeks have no clue where things will land. The only person I've seen involved in technology who was pretty good at predicting was Steve Jobs. But he wasn't really a geek, more of a hippie - meaning he had an understanding of people and society. Amodei definitely does not.
It's pretty telling that when the administration deployed troops across America, nobody was shot (remember that?). The troops had no unpopular mandate to enforce, were not under the direct tactical command of the administration, and frankly were better trained than this ICE paramilitary force.
If you watch the videos of what these raids are like, they are unbelievably chaotic and uncoordinated. Their "arrests" are wild and uncontrolled. They're also doing incredibly stupid things like stepping in front of cars or chasing down protesters.
It's pretty telling how many police agencies have issued statements saying ICE has been uncooperative - they are using dangerous tactics (for both bystanders and themselves) and they have no training or interest in de-escalation. They would be much more successful working with local PD but they clearly don't want anything to do with even vague accountability.
We watched some games last year on the all-22 (because it was the only way we could watch it on ESPN+).
You definitely lose a lot by not having the close-ups, the slow motion replay, etc. That said, you actually get to see many more of the little things that are kind of cool - what teams do to set up for a play, what coaches are doing between plays, how players and officials interact, etc.
It's often pointed out that the ball is only live for about 18 minutes of every game. But what makes football so fascinating is that for every play there are 22 different jobs being executed at the same time. And the jobs change every play.
For something like baseball, you can basically see everything happening in frame the whole time. But for football, the game is so information dense that you can spend hours unpacking it afterwards to see what was going on. That's why replays and highlights are so much more satisfying. And that's what makes it fun to analyze and/or watch videos during the week - you can find all sorts of unique or interesting aspects just watching the same play again and analyzing a different personnel group.
It also explains why cameras are everywhere (besides them being just flat out cheaper for high school games, etc). Film study is a crucial part of the game for players - more than in any other sport.
It's interesting how quickly the quest for the "Everything AI" has shifted. It's much more efficient to build use-case specific LLMs that can solve a limited set of problems much more deeply than one that tries to do everything well.
I've noticed this already with Claude. Claude is so good at code and technical questions... but frankly it's unimpressive at nearly anything else I have asked it to do. Anthropic would probably be better off putting all of their eggs in that one basket that they are good at.
All the more reason that the quest for AGI is a pipe dream. The future is going to be very divergent AI/LLM applications - each marketed and developed around a specific target audience, and priced according to the value it delivers.
In my lab, we have been struggling with automated image segmentation for years. 3 years ago, I started learning ML, and since the task is pretty standard, there are a lot of solutions out there.
In 3 months, I managed to get a working solution, which only took a lot of sweat annotating images first.
I think this is where tools like OpenCode really shine, because they unlock the potential for any user to generate a solution to their specific problem.
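For anyone curious what "pretty standard" looks like in practice, here's a minimal sketch of a typical fine-tuning loop, assuming a binary segmentation task and the segmentation_models_pytorch library (not necessarily what the parent used); the toy dataset stands in for real annotated image/mask pairs:

```python
import torch
from torch.utils.data import Dataset, DataLoader
import segmentation_models_pytorch as smp

class ToySegDataset(Dataset):
    """Stand-in for a real annotated dataset: random images with random binary masks."""
    def __len__(self):
        return 32
    def __getitem__(self, idx):
        image = torch.rand(3, 256, 256)                    # (C, H, W) float image
        mask = (torch.rand(1, 256, 256) > 0.5).float()     # (1, H, W) binary mask
        return image, mask

# U-Net with a pretrained ImageNet backbone is a common starting point.
model = smp.Unet(encoder_name="resnet34", encoder_weights="imagenet",
                 in_channels=3, classes=1)
loss_fn = smp.losses.DiceLoss(mode="binary")
optimizer = torch.optim.Adam(model.parameters(), lr=1e-4)
loader = DataLoader(ToySegDataset(), batch_size=4, shuffle=True)

device = "cuda" if torch.cuda.is_available() else "cpu"
model.to(device)
model.train()
for epoch in range(5):
    for images, masks in loader:
        images, masks = images.to(device), masks.to(device)
        optimizer.zero_grad()
        loss = loss_fn(model(images), masks)   # Dice loss on predicted logits vs. masks
        loss.backward()
        optimizer.step()
```

The real sweat, as the parent says, is producing the annotated masks, not the training loop itself.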
I don't get this argument. Our nervous system is also heterogeneous, so why wouldn't AGI be based on an "executive functions" AI that manages per-function AIs?
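As a concrete (and entirely hypothetical) sketch of that idea: a cheap "executive" step classifies the request and dispatches to whichever specialist model handles that function; every name here is a placeholder, not a real API:

```python
# Hypothetical sketch of an "executive functions" layer routing to specialist models.
from typing import Callable

def classify_intent(query: str) -> str:
    """Executive step: a cheap classifier (could itself be a small LLM) picks a specialty."""
    q = query.lower()
    if any(w in q for w in ("def ", "compile", "stack trace", "bug")):
        return "code"
    if any(w in q for w in ("integral", "prove", "solve")):
        return "math"
    return "general"

def code_specialist(query: str) -> str:
    return "handled by the code-specialized model"      # placeholder

def math_specialist(query: str) -> str:
    return "handled by the math-specialized model"      # placeholder

def general_specialist(query: str) -> str:
    return "handled by the general-purpose model"       # placeholder

SPECIALISTS: dict[str, Callable[[str], str]] = {
    "code": code_specialist,
    "math": math_specialist,
    "general": general_specialist,
}

def executive(query: str) -> str:
    # The executive delegates; each specialist can be a narrow, use-case-specific
    # model rather than one model that tries to do everything well.
    return SPECIALISTS[classify_intent(query)](query)
```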