Regarding the alleged performance benefits of using a public CDN: if there ever was a cache hit because a user visited one site then yours that coincidentally used the same CDN and jQuery version (pro tip: never happened), and in a short enough time to not suffer a cache eviction, the user did not notice.
When the CDN was slow, every user noticed and thought your website was slow.
You gave away free analytics and made your website worse; there wasn't even a trade-off.
There definitely was a long, long time on the web when everyone used one of a half-dozen versions of jQuery.
There are probably a dozen versions of React that make up a healthy 20% of the web.
> and in a short enough time to not suffer a cache eviction
These assets are cached by version. They can be basically immutable forever, with maximum expiration.
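For example, a URL like https://code.jquery.com/jquery-3.6.0.min.js never changes its contents, so a CDN can serve it with an effectively permanent cache policy, something along the lines of:

    Cache-Control: public, max-age=31536000, immutable

(A sketch of a response header; the exact directives vary by CDN, but a one-year max-age plus immutable is the common pattern for versioned assets.)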
As commented elsewhere, you don't even have to give away free analytics. Just set a referrerPolicy on your <script> tags. And add subresource integrity to protect from the CDN being compromised.
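Concretely, that looks something like this (the integrity value below is a placeholder; the real hash is generated from the exact file you reference):

    <script src="https://code.jquery.com/jquery-3.6.0.min.js"
            integrity="sha384-REPLACE_WITH_REAL_HASH"
            crossorigin="anonymous"
            referrerpolicy="no-referrer"></script>

The referrerpolicy keeps your page URLs out of the CDN's logs, crossorigin="anonymous" is required for SRI to work on a cross-origin script, and the integrity attribute makes the browser refuse the file if it doesn't match the expected hash.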
I get that the storage isolation needs forced us to blow up the real awesomeness of CDNs. I'm OK with it. But it feels wild to me to say that the cross-site caching wasn't worth anything. It was a super excellent path to fast loading and was enormously helpful, especially for an age when very few people had reliably low-latency, high-bandwidth broadband.
Exactly this… even before cache partitioning, the benefits of public CDNs were overstated because of the connection overhead and the range of library versions sites used.
It won't work today, but was this really so improbable in the past? There was a time when jQuery was so ubiquitous as to be almost a "de-facto standard", and it also had a "canonical URL" to load from (per version, at least). Why would it have been so improbable for a user to visit two or more sites that used the same jQuery version and also included it from the canonical URL?
There were multiple free public CDNs, there were multiple versions of jQuery, cache evictions happened much faster than you're thinking, and users had less RAM and disk in the past.
I can only imagine "naturally" producing this scenario by browsing HN in 2010, clicking every link on the front page, and hoping at least two were indie blogs or startups that don't take security seriously; and even then, if they weren't consecutive, the site in between would likely have evicted the cached copy.
The website is also slow if the browser requests a resource from an origin that is very far away. Nothing is free. The typical benefit of a large-scale CDN is TTFB latency reduction, which users tend to feel rather acutely, plus the cache improvements on follow-up requests. A big part of this is optimizing the TCP handshake (major providers have warmed backbones that skip the 3-way handshake across long paths, so users don't pay for 3xRTT across the Atlantic). Nobody on this website ever mentions this, because nobody here actually tests their site under conditions like "the user has bad internet and is really far away from DigitalOcean NYC-3".
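Back-of-envelope, assuming roughly 80 ms RTT across the Atlantic and roughly 15 ms RTT to a nearby edge, with a classic TCP + TLS 1.2 setup (1 RTT + 2 RTT) plus 1 RTT for the request itself:

    far-away origin:  4 x 80 ms = 320 ms before the first byte
    nearby CDN edge:  4 x 15 ms =  60 ms before the first byte

The RTT numbers are illustrative, but the multiplier is the point: every round trip the connection setup costs gets multiplied by the distance.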
> if there ever was a cache hit because a user visited one site then yours that coincidentally used the same CDN and jQuery version
This is not how modern browsers work anymore, and the phrasing makes me think it isn't a rhetorical statement?
> the user did not notice.
... That's kind of the whole point; they didn't think the website was slow. They didn't think anything. The website opened and then they went on using it. You do understand this, right?
> This is not how modern browsers work anymore, and the phrasing makes me think it isn't a rhetorical statement?
It's in response to the "you should totally direct link your jQuery from this CDN" propaganda.
> That's kind of the whole point; they didn't think the website was slow. They didn't think anything. The website opened and then they went on using it. You do understand this, right?
When the CDN was slow, every user noticed and thought your website was slow.
You gave away free analytics and made your website worse; there wasn't even a trade-off.