I'm still happy with the configuration we have, where my map is a little slower than maybe I'd like it to be (though honestly, I just loaded maps.google.com and moused around randomly and... it's fine? Certainly not so slow I'm bothered by it), but the mapping app also can't crash my computer, thanks to the three layers of abstraction it's running on top of. Because that would suck.
If you're curious where the time goes, btw... Most of the visible delay in Google Maps can be seen by popping the browser inspector and watching the network tab. Maps fetches many thin slices of data (over 1,000 requests in my test), which is a sub-optimal way to do networking that adds a ton of overhead. So if they wanted to improve Maps significantly, switching to one of the other protocols Google has that allow batching over a single long-lived connection, and changing the client and server logic to batch more intelligently, could do it. I doubt they will, because most users are fine with the sub-three-second load times (and engineering time not spent on solving a problem most users don't care about is time spent on solving problems users do care about). You're seeking perfection in a realm where users don't care and claiming the engineers who don't pursue it "suck"; I'd say those engineers are just busy solving the right problem, and you're more interested in the wrong problem. By all means, make your mapping application perfect, as long as you understand why the one put out by a company with a thousand irons in the fire didn't.
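To make the batching point concrete, here's a toy latency model. Every constant is invented for illustration (these are not measurements of Google Maps, and real browsers parallelize requests, so wall-clock time differs), but the fixed per-request overhead term is exactly what batching over one long-lived connection eliminates:

```python
# Back-of-the-envelope sketch: fetching N map tiles as N tiny requests
# vs. batching them into fewer, larger requests.
# All numbers are made-up assumptions for illustration only.

PER_REQUEST_OVERHEAD_MS = 30  # assumed fixed cost per request (headers, queueing, round trip)
PER_TILE_MS = 1               # assumed transfer time per small tile

def total_ms(num_tiles: int, batch_size: int) -> int:
    """Total work under a simple model: each request pays a fixed
    overhead, plus transfer time proportional to the tiles it carries."""
    num_requests = -(-num_tiles // batch_size)  # ceiling division
    return num_requests * PER_REQUEST_OVERHEAD_MS + num_tiles * PER_TILE_MS

unbatched = total_ms(1000, 1)    # one tile per request, like 1,000 thin slices
batched = total_ms(1000, 100)    # 100 tiles per request over one connection

print(f"unbatched: {unbatched} ms of work")  # 1000*30 + 1000 = 31000
print(f"batched:   {batched} ms of work")    # 10*30 + 1000 = 1300
```

The transfer term is identical in both cases; only the overhead term shrinks, which is why the win comes from the protocol and the batching logic rather than from moving fewer bytes.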
Also, I think the analogy was great, but you reached the wrong conclusion. ;) That is how large-scale engineering works. Scheduling becomes the dominant phenomenon in end-to-end performance. Games have huge advantages in constraining the scheduling problem; general-purpose apps do not. Hell, to see this in action in a game: Second Life's performance is crap because the whole world is hyper-malleable, so the game engine cannot predict or pre-schedule anything.
Since it is software, nothing is set in stone; everything can be changed, and we know how to do it really, really fast and really, really efficiently.
People have done incredible things to improve almost everything.
To me this means all of the performance loss is there for no reason. All we need is for people to stop making excuses. I, for one, know how to get out of the way when better men are trying to get work done.
You are the engineer, if not the artist; impress me! Impress the hardware people! Impress the people who know your field. Some attention to consumers' wishes is good, but Gustave Eiffel didn't build his tower because consumers wanted it from him.
Why would you even have tap water if the well is down the street? A horse and carriage is just fine; it is good enough for what people need, no? What if our doctors measured their effort in "good enoughs" and lived up to consumer expectations only?
The hardware folk built a warp-capable starship, and we are using it to do shopping at the corner store because that was what mum wanted. Of course there is no need to even go to Mars. It's missing the point entirely, you see?