
Web 3.0 is pretty much that. Load 10MB of JavaScript libraries sequentially, then every element on the page needs a new loader, 30 HTTP requests, and a web socket.
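Roughly that pattern, as a minimal sketch (the widget names, script URLs, and socket endpoint are all made up for illustration):

    // each widget on the page pulls in its own loader script, one after
    // another, and then something opens a WebSocket on top of it all
    const widgets = ["carousel", "chat", "comments", "recommendations"];

    function loadScript(src) {
      return new Promise((resolve, reject) => {
        const s = document.createElement("script");
        s.src = src;
        s.onload = resolve;
        s.onerror = reject;
        document.head.appendChild(s);
      });
    }

    (async () => {
      for (const w of widgets) {
        // sequential: each request waits for the previous one to finish
        await loadScript(`/static/${w}-loader.js`);
        await loadScript(`/static/${w}-bundle.js`);
      }
      new WebSocket("wss://example.com/live");
    })();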


And that's after a 10-minute build process using some node-thing, which, just to minify and spit out static assets, needs to pull in 14,000 files of dependencies.
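For illustration, a minimal sketch of such a build step, assuming terser 5.x as the minifier (the file names are hypothetical). Even this much drags in a dependency tree, and real setups stack bundlers, transpilers, and CSS tooling on top:

    // read a source file, minify it, write a static asset to dist/
    const fs = require("fs");
    const { minify } = require("terser");

    async function build() {
      const source = fs.readFileSync("src/app.js", "utf8");
      const result = await minify(source); // resolves to an object holding the minified code
      fs.mkdirSync("dist", { recursive: true });
      fs.writeFileSync("dist/app.min.js", result.code);
    }

    build();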


Is the modern web really that bad, and is that why just visiting some popular websites overwhelms my brand-new CPU?


Slightly exaggerated, but that sort of thing absolutely happens.


The 14,000-file figure is an actual measurement of the node_modules folder for the build of a static site I've seen. npm is inane.
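For what it's worth, one way to reproduce that kind of count (a sketch, not necessarily the original methodology) is to walk node_modules after an npm install and tally every file:

    const fs = require("fs");
    const path = require("path");

    function countFiles(dir) {
      let n = 0;
      for (const entry of fs.readdirSync(dir, { withFileTypes: true })) {
        const p = path.join(dir, entry.name);
        if (entry.isDirectory()) n += countFiles(p);
        else n += 1;
      }
      return n;
    }

    console.log(countFiles("node_modules"));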


Straw man. This is actually an EBKAC. Obviously, because I can do the same asinine thing in any language offering the ability to pull in dependencies.


How else are you going to create the "side effect" of allowing the page owner, the hosting company, several CDNs, 10 social media "partners", an ad-network or three, and google[1] to each log page views?

[1] gotta feed the crack^Wanalytics addiction
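A sketch of how that "side effect" usually materializes (every endpoint here is invented; real pages embed each vendor's own tag, which does roughly this plus a lot more):

    // one beacon per interested party, each logging the same page view
    const trackers = [
      "https://cdn.example-analytics.invalid/collect",
      "https://social-partner.invalid/pixel",
      "https://adnetwork.invalid/view",
    ];

    for (const endpoint of trackers) {
      navigator.sendBeacon(endpoint, JSON.stringify({
        url: location.href,
        referrer: document.referrer,
      }));
    }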



