
Huh. I'd say the examples are perfectly good and on-point. While dealing with autoexec.bat and random BSODs wasn't fun, it's entirely orthogonal to the fact that a DOS-era POS still offers orders of magnitude better UX than current-era browser POSes, or than most web apps for that matter.

It also doesn't change the fact that Google Maps is very bad at being a map. Its entire UI flow is oriented toward giving turn-by-turn directions to people who know where they are and where they are going; it gives almost no affordances for exploration and cross-referencing.

> Remember when building a computer required work and research and took hours.

As someone who builds their own PC every couple of years: it still does. It's actually worse now, due to the number of products on the market and the price segregation involved. Two PCs ago, I didn't have to use two parts compatibility tools, several benchmarking sites and a couple of friends, and didn't have to write CSS hacks for electronics stores, just to be able to assemble a cost-effective PC.

> But I also don't need to spend an entire weekend setting up a PC and a printer for my mom anymore either.

You don't? Printer drivers are only slightly less garbage than they were, but now there are also fewer knobs to turn if things go wrong. When my Windows 10 doesn't want to talk to a printer or a Bluetooth headset, all I get to see is a stuck progress bar.

Bottom line: I agree 100% with the author that one of the primary functions of a computer is enabling easy cross-referencing of information. This ability has been degrading over the past decades (arguably for business reasons: the easier it is for people to make sense of information, the harder it is for your sales tactics to work).

> These posts always have a slight stench of elitism disguised as disappointment.

That I don't get. Is it "elitist" now to point out that the (tech) "elite" can actually handle all this bullshit, but it's the regular Joes and Janes that get the short end of the stick?



> It also doesn't change the fact that Google Maps is very bad at being a map. Its entire UI flow is oriented toward giving turn-by-turn directions to people who know where they are and where they are going;

As it turns out, that is probably the most popular use case for maps in the world.

Note also that for most smartphone users of Google Maps the use-case is actually much broader than that. The UI flow also totally accounts for users who only know where they are going—thanks to GPS and Google Maps, knowing where you are often isn't necessary.

I'm confused by the complaint that the "Maps" app only caters to the 90-percentile use case for maps, but doesn't cover the other use-cases well.

> I agree 100% with the author that one of the primary functions of a computer is enabling easy cross-referencing of information. This ability has been degrading over the past decades

I just find this not the case at all. For expert-users the tools that existed decades ago are still there and still usable. Or you can craft your own!

For non-expert users the information in the world is orders of magnitude more accessible than it used to be.


> As it turns out, that is probably the most popular use case for maps in the world.

There's a very subtle point the Twitter thread was making here. This use case may be most popular not because it's what the people want, but because it's all that they can easily do. The tools you use shape how you work, and what you can work on.

FWIW, I learned to pay attention when the machine doesn't help me do what I want (it's a good source of ideas for side projects), so I've noticed that I do want a map that works like a map - something I can explore and annotate. I do sometimes resort to screenshotting GMaps, or photographing paper maps in the past, just to have a map on my phone. I've seen non-tech people do that as well. So I can be confident that it's not just me and the Twitter thread's author that want this.

> For expert-users the tools that existed decades ago are still there and still usable. Or you can craft your own!

The Twitter thread's point (and mine as well) is that expert users can and do work around this degradation. It's a frustrating chore, but it's not all that difficult if you can code a bit and have some spare time. It's the experience for the non-expert users that has degraded in a way they can't fix for themselves.

> For non-expert users the information in the world is orders of magnitude more accessible than it used to be.

The way it's accessible, it's almost as if it wasn't. Sure, you can easily Google random trivia. But good luck trying to compare things. That's always a pain, and it usually involves hoping that someone else made a dedicated tool for similar comparisons on the topic you're interested in, and that the information hardcoded in that tool is current and accurate. Notably, the tools you use for searching have no support for comparing.


> so I've noticed that I do want a map that works like a map - something I can explore and annotate.

I don't doubt that there are use cases for a map that works this way. Even if Google Maps covers 80-90% of the use-cases for mapping, mapping is an absolutely massive domain. 10-20% of use-cases still represents a huge volume.

But it doesn't have to be Google Maps. It actually seems worse to me for one "maps" app to try to handle all possible use-cases for a map.

Why isn't there a separate different tool that handles the use-case you describe?

I guess, going back to the original thesis: what would the "1983" version be that replicates what Google Maps does, but faster? Or, what would the "1983" version of the mapping behavior you wanted look like?

In the thread they say:

> in 1998 if you were planning a trip you might have gotten out a paper road map and put marks on it for interesting locations along the way

I'd argue that this use-case still exists. Paper road maps haven't gone away, so this is still an option. People largely don't use them, and prefer Google Maps or other digital mapping tools for most of their problems. Why? If you gave me both the 1998 tools and the 2020 tools, for 95% of my problems I'm going to use the digital tools, because they let me solve those problems faster and more easily. I know this because I have easy access to paper maps and I never touch them. Because they're largely worse at the job.

> There's a very subtle point the Twitter thread was making here. This use case may be most popular not because it's what the people want, but because it's all that they can easily do. The tools you use shape how you work, and what you can work on.

Ultimately, my point above is my response to that. None of the old tools are gone. Paper maps are still available. And yet they have been largely abandoned by the large majority of the population. I agree that there are limitations to our current digital tools, and I hope in 2030 we have tools that do what the article describes. But the 1983 version of the tools are worse for solving problems than the current tools, for most people.


Pretty much all games in the early 80s had [so-called] pixel-perfect scrolling. Each frame showed exactly what was required.

Today it is entirely acceptable for a map to be a jerky, stuttering pile of crap. The same goes for infinite-scroll implementations. It's preposterous to start loading things only after they are needed.

There is a good analogy with making things in the physical world. The professional doesn't start a job before he has everything he needs to do it; the amateur obtains what he needs after he needs it.


Games have huge advantages of constraint of application that mapping applications don't. You can get pixel-perfect scrolling when you constrain the max rate at which the user can pass through the dataset, when you deny them random access into the dataset, when your dataset isn't trying to represent a space the volume of planet Earth, etc.

There's a huge gulf between the use cases you're comparing here, and I don't believe for one second that loading the Google Maps dataset into Commander Keen's engine would make for a better experience.

(Also, not to be overly pedantic, but "The professional doesn't start a job before he has everything he needs to do it" pretty much classifies all building construction as unprofessional. The professional doesn't magically have everything on-hand, especially bulky or expensive resources; they have a plan for acquiring them at reasonable rates of input and mitigation strategies if that plan can't be followed)


I'll ignore the pedantic part since it was just an analogy; if it doesn't work for you, there is little to talk about.

> Games have huge advantages of ....

I have thoughts like that, but I consider them "making up excuses". You don't have to see it that way, but I can't see someone fixing a problem by making up excuses for it to exist. For me it is just like how you can always come up with an excuse not to do something.

8 gigabytes of memory / 64 kilobytes of memory = 125 000 times as much memory.

14 gigahertz (4 cores x 3.5 GHz) / 1.023 MHz = 13 685 times as much processor power.

4108 gigahertz (2560 CUDA cores x 1605 MHz) / 2 MHz = 2 054 000 times as much video power.

Can I just call 2 MHz memory bandwidth 16 Mbit/s?

If so, 500 Mbit / 16 Mbit = 31.25 fold the bandwidth

We are not rendering many layers of colorful animated game content. A map is just a bunch of boring lines. The modern screen, however, is a lot bigger. I see a glimmer of hope for an excuse!

320x200 px = 64000 px

1920x1080 px = 2073600 px

2073600 / 64000 = 32.4 times the screen size

meh?
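
If anyone wants to re-run the arithmetic, here's a tiny Python sketch of the same back-of-the-envelope comparison (decimal units, same rough figures as above; nothing precise):

    # Back-of-the-envelope ratios between a rough early-80s home computer
    # and the modern desktop assumed above.
    old = {
        "ram_bytes": 64e3,         # 64 KB
        "cpu_hz": 1.023e6,         # ~1 MHz 8-bit CPU
        "gpu_hz": 2e6,             # ~2 MHz video chip
        "net_bps": 16e6,           # 2 MHz memory bandwidth read as 16 Mbit/s
        "pixels": 320 * 200,
    }
    new = {
        "ram_bytes": 8e9,          # 8 GB
        "cpu_hz": 4 * 3.5e9,       # 4 cores x 3.5 GHz
        "gpu_hz": 2560 * 1.605e9,  # 2560 CUDA cores x 1605 MHz
        "net_bps": 500e6,          # 500 Mbit/s
        "pixels": 1920 * 1080,
    }
    for key in old:
        print(key, round(new[key] / old[key], 2))
    # ram ~125,000x, cpu ~13,685x, gpu ~2,054,400x, net 31.25x, pixels 32.4x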

We must applaud everyone involved in making all this hardware progress. It truly blows the mind and defies belief. No one could have imagined this.

Then came weee the software people and... and.....

I'm cringing too hard to continue writing this post.

The numbers don't lie, we suck. Let's leave it at that.


I'm still happy with the configuration we have where my map is a little slower than maybe I'd like it to be (though honestly, I just loaded maps.google.com and moused around randomly and... it's fine? Certainly not so slow I'm bothered by it) but the mapping app also can't crash my computer due to the three layers of abstraction it's running on top of. Because that would suck.

If you're curious where the time goes, btw... Most of the visible delay in Google Maps can be seen by popping the browser inspector and watching the network tab. Maps fetches many thin slices of data (over 1,000 in my test), which is a sub-optimal way to do networking that adds a ton of overhead. So if they wanted to improve maps significantly, switching out for one of the other protocols Google has that allows batching over a single long-lived connection and changing the client and server logic to batch more intelligently could do it. I doubt they will because most users are fine with the sub-three-second load times (and engineering time not spent on solving a problem most users don't care about is time spent on solving problems users do care about). You're seeking perfection in a realm where users don't care and claiming the engineers who don't pursue it "suck;" I'd say those engineers are just busy solving the right problem and you're more interested in the wrong problem. By all means, make your mapping application perfect, as long as you understand why the one put out by a company with a thousand irons in the fire didn't.
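
To make the batching point concrete, here's a toy Python sketch of the difference between fetching a thousand thin slices and asking for them in a handful of batched requests. The endpoints and payload shapes are made up for illustration; this isn't Google's actual API.

    import requests

    # Hypothetical tile service, purely for illustration.
    BASE = "https://tiles.example.com"

    def fetch_one_by_one(tile_ids):
        # One HTTP round trip per tile: ~1,000 requests, each paying
        # header/TLS/round-trip overhead on its own.
        return [requests.get(f"{BASE}/tile/{tid}").content for tid in tile_ids]

    def fetch_batched(tile_ids, batch_size=100):
        # A handful of requests, each carrying many tile IDs, so the
        # per-request overhead is amortised across the whole batch.
        results = []
        for i in range(0, len(tile_ids), batch_size):
            batch = tile_ids[i:i + batch_size]
            resp = requests.post(f"{BASE}/tiles:batchGet", json={"ids": batch})
            results.extend(resp.json()["tiles"])
        return results

The second shape is roughly what a long-lived batching protocol buys you; whether it's worth the client and server rework for a sub-three-second load is exactly the prioritisation question above.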

Also, I think the analogy was great, but you reached the wrong conclusion. ;) That is how large-scale engineering works. Scheduling becomes the dominant phenomenon in end-to-end performance. Games have huge advantages in constraining the scheduling problem. General-purpose apps do not. Hell, to see this in action in a game: Second Life's performance is crap because the whole world is hyper-malleable, so the game engine cannot predict or pre-schedule anything.


> If you're curious where the time goes, btw...

Since it is software, nothing is set in stone; everything can be changed, and we know how to do it really, really fast and really, really efficiently.

People did incredible things to improve almost everything.

To me this means all of the performance loss is there for no reason. All we need is for people to stop making excuses. I for one know how to get out of the way when better men are trying to get work done.

You are the engineer, if not the artist: impress me! Impress the hardware people! Impress the people who know your field. Some attention to consumers' wishes is good, but Gustave Eiffel didn't build his tower because consumers wanted that from him.

Why would you even have tap water if the well is down the street? A horse and carriage is just fine, it is good enough for what people need, no? What if our doctors measured their effort in "good enough"s and in living up to consumer expectations only?

The hardware folk built a warp-capable starship and we are using it to do shopping at the corner store because that was what mum wanted. Of course there is no need to even go to Mars. It's missing the point entirely, you see?


> Pretty much all games in the early 80s had [so-called] pixel-perfect scrolling. Each frame showed exactly what was required.

> Today it is entirely acceptable for a map to be a jerky, stuttering pile of crap. The same goes for infinite-scroll implementations. It's preposterous to start loading things only after they are needed.

This doesn't make any sense to me. In which game from the early 80's could I view accurate map data for any region on the earth, and quickly scroll across the planet without any loading artifacts?

Of course you can manage pixel-perfect scrolling if all of your data is local and fits in memory. That's not anywhere close to the same domain as maps.


> ... I do want a map that works like a map - something I can explore and annotate.

You can do that with Google maps, if you're logged in. You can create a custom map, with multiple places marked, and with annotations. You get a URL for it. And looking at it later, you can zoom in as needed.


>FWIW, I learned to pay attention when the machine doesn't help me do what I want (it's a good source of ideas for side projects), so I've noticed that I do want a map that works like a map - something I can explore and annotate. I do sometimes resort to screenshotting GMaps, or photographing paper maps in the past, just to have a map on my phone.

https://mymaps.google.com has that, and I think google has a 'mymaps' app too:

https://support.google.com/mymaps/answer/3433053?co=GENIE.Pl...


How is a world where search is convenient and automated but comparison isn't less accessible than a world where neither search nor comparison are convenient and automated?


> It's the experience for the non-expert users that has degraded in a way they can't fix for themselves.

Why not?


Because it's not practical to write a Google Maps (or whatever) replacement most of the time? It's a ton of work. Sure, we can smooth over rough edges, but actually implementing lots of missing non-trivial things? Or making things faster when they rely on some 3rd-party back end? Usually either not an option, or not an option without recreating the entire front end.


As it turns out, that is probably the most popular use case for maps in the world.

Your sentence starts out like you're stating a fact, but then peters out with "probably."

Do you have data on this?

I'd posit the opposite: That exploration is far more used in online maps.

Aside from Uber drivers, SV types, and wannabe road warrior squinters, nobody uses maps for their daily activities. People know where they're going and they go there without consulting technology. That's why we have traffic jams.


I think this isn't true at all. The vast majority of people I know use maps solely to find something like an ATM/gas station/coffee shop/etc. and then figure out how to get there or how long it would take to get there.

We don't have data, but the only people that do are Google and they have designed their UX around this use case. And it has become one of the most used tools on Earth. If we are going to play the 'your comment is bad because you don't have data' game, I think the onus is really on you to prove that the company with all the data is getting it wrong.


> Aside from Uber drivers, SV types, and wannabe road warrior squinters, nobody uses maps for their daily activities. People know where they're going and they go there without consulting technology. That's why we have traffic jams.

I might know where I'm going but I don't always know how to get there so I use Google Maps all the time. I don't use it for my daily commute but if I'm going to a friend's or something I'll usually use it.

When I sit in my friends' cars we also use it all the time. Often we're going to a restaurant or some other location that we don't often go to.

For exploration my friends pretty much just use Yelp or Google Search. I sometimes, but rarely, use Google Maps for this because I find that the reviews are usually much lower quality and Google Maps is too slow (I have an old Pixel 2).


I think it depends where you are driving. I live in a large city and there are way more people using maps for their daily activities than you might think. In my parents' neighborhood in the suburbs, and in rural areas, I think you're probably right. If there are only a few ways to get somewhere, you probably aren't using maps.


> Your sentence starts out like you're stating a fact, but then peters out with "probably."

Fair enough. I'd argue that I put the word probably in the first half of the sentence, so I don't see it as "petering out", but fair enough.

I will agree with the critique that I'm making an assumption about mapping use cases, and don't have hard data. I'm happy to be corrected by any real data on the topic.


People use maps for daily activities all the time. Not to find where they're trying to get to, but for how to get somewhere more quickly - i.e. public transport directions, or whether it's quicker to walk than to go by bus/train/tram.


> As it turns out, that is probably the most popular use case for maps in the world.

I don't think fully deciding an entire path was ever the major use of paper maps. But well, now that apps only allow for this usage, I am pretty sure it's the most popular one.

Personally, I very rarely use maps this way. Which makes the Google shit useless for me (worse than useless, actually, because it insists on opening by default, people have now decided they can send a Google marker instead of an address, and Google makes it impossible to retrieve the data from a marker).

Most people I know, faced with the option of turn by turn directions or nothing choose nothing nearly all the time.


Personally, I never use turn-by-turn even to walk around in unknown places, because it loses context. Instead, I stop at the screen that shows the route options; I don't press "Start" to activate the nav. Besides being more useful when walking around, it doesn't eat the ridiculous amounts of battery power that the turn-by-turn nav does for some reason.


That's the major use case for mobile GMaps, because it absolutely sucks for any other use case, and thus people can't really use it for any other purpose. Except maybe looking up timetables for a previously known public transport stop (only in certain countries; some have much superior ways to plan public transport travel); planning a trip between two stops when you're not standing at one of them right now probably requires a PhD.

Also, if your mapping app is good, people will use it as little as possible. That's certainly my desire as a tourist. So maybe perverse incentives are at play, if the wrong usage metrics are used to improve the app.

Anyway, I'm used to paper maps from childhood, so the ergonomics of mobile phone apps are really irritating, given how much better paper maps work for me most of the time.


"As it turns out, that is probably the most popular use case for maps in the world."

...

"I'm confused by the complaint that the "Maps" app only caters to the 90-percentile use case for maps, but doesn't cover the other uses-cases well."

Seems like there is an awful lot of room between "probably the most popular use case" and "90-percentile use case", no?


I think you’re measuring use-cases by volume (i.e. Monthly Active Users) instead of by mass (i.e. number of man-hours spent engaged with the app’s UI by those users.)

Certainly, a lot of people use Google Maps for turn-by-turn directions. This means that they interact with the actual UI and visible map tiles once, at the beginning of the trip; and then from there on are given timely voice directions every few minutes, which they can maybe contextualize by looking at the map on the screen. Even if you count the time they spend hearing those directions as time spent “interacting with the UI” of Google Maps, it adds up to fewer man-hours than you’d think.

Meanwhile, I believe that there are a much larger number of collective man-hours spent staring at the Google Maps UI—actually poking and prodding at it—by pedestrians navigating unfamiliar cities, or unfamiliar places in their city. Tourists, people with new jobs, people told to meet their friends for dinner somewhere; etc.

And the Google Maps UI (especially the map tiles themselves) is horrible for pedestrians. Half the time you can’t even figure out the name of the road/street you’re standing on; names of arterial roads (like main streets that happen to also be technically highways) only show up at low zoom levels, while names of small streets barely show up at the highest zoom level. And asking Maps to give you a pedestrian or public-transit route to a particular place doesn’t fix this, because GMaps just doesn’t understand what can or cannot be walked through. It thinks public parks are solid obstacles (no roads!) while happily routing you along maintenance paths for subway systems, rail lines, and even airfields. (One time it guided me to walk down the side of an above-grade freeway, outside the concrete side-barriers, squeezing between the barriers and a forest.) And, of course, it still assumes the “entrances” to an address are the car entrances—so, for example, it routes pedestrians to the back alleys behind apartment buildings (because that’s more often where the parking-garage entrance is) rather than the front, where the door is. I don’t live here, Google; I can’t even get into the garage!

The thing is, these are such distinct workflows that there’s no reason for Google Maps to be optimizing for one use-case over the other in the first place. It’s immediately apparent which one you’re attempting by your actions upon opening the app; so why not just offer one experience (and set of map tiles) for people attempting car navigation, and a different experience (and set of map tiles) for people attempting pedestrian wayfinding?

Or, someone could just come out with a wayfinding app for pedestrians that does its own map rendering. There’s already a Transit app with a UI (and map tiles) optimized for transit-takers; why not a Walkthere app with a UI (and map tiles) optimized for pedestrians? :)


> Or, someone could just come out with a wayfinding app for pedestrians that does its own map rendering. There’s already a Transit app with a UI (and map tiles) optimized for transit-takers; why not a Walkthere app with a UI (and map tiles) optimized for pedestrians? :)

This wouldn't really make me happy. It makes more sense to integrate the walking instructions into the Transit app and be good at giving directions for multimodal transport. I need to know if I should get off the bus here, and walk through the park, or wait till three stops later, which leaves me closer as the crow flies but further away overall. The car app doesn't need to work multimodally since it's not normal to drive somewhere, walk 10 minutes, then drive somewhere else.

Google maps is still the best general purpose multimodal transport app I've used, but it could be so much better. I'm in Austria right now and it doesn't know about the Austrian buses. There's an app (OEBB Scotty) from the Austrian rail operator which I assume everyone uses instead.


Honestly, the two use cases have an obvious intersection: a planning/wayfinding session in which I want Google to compute me a route that I want to then inspect, perhaps modify, and then save into my planning session.


The only thing I'd like is for maps to not give directions for the parts that are obvious to you, since you've done them a million times, e.g. "from your home, turn right to get to the freeway".


> As it turns out, that is probably the most popular use case for maps in the world.

Popularity is not a measure of quality. Also, the snarky tone is unnecessary.


> Huh. I'd say the examples are perfectly good and on-point. While dealing with autoexec.bat and random BSODs wasn't fun, it's entirely orthogonal to the fact that a DOS-era POS still offers orders of magnitude better UX than current-era browser POSes, or than most web apps for that matter.

I know of an ERP system that somehow manages to take about 15 seconds to search an inventory of ~100k items. If you export all those items to CSV, with all their attributes (most of which are not searched), the resulting file is about 15 MB.

It is boggling how they managed to implement search this slowly (in a C++ application using MS SQL as the backend). 3.5 GHz computers, performing plain text string search at about 1 MB/s.

It is even more surprising that users feel this is not a completely unreasonable speed.

(They managed to pull this stunt off by completely not using the SQL database in the intended way, i.e. all tables are essentially (id, blob) tuples, where the blob is a zlib-compressed piece of custom TLV-encoded data. All data access goes through a bunch of stored procedures, which return data in accordance with a sort of "data extraction string". Search works by re-implementing inverted indices in tables of (word, offset, blob), where blob contains a zlib-compressed list of matching IDs; again processed by stored procedures. The client then is wisely implemented using MS SQL's flavour of LIMIT queries, which effectively causes a classic quadratic slowdown because the database engine literally has no way to fetch result rows n...m except by constructing the entire result set up to m.

Unsurprisingly the developers of this abomination claim to be competent. They also invented a funny data exchange format involving fixed field lengths and ASCII separators - some time in the 2010s.)
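
Here's a toy Python sketch (SQLite standing in for MS SQL; table and column names made up) of why OFFSET-style paging does quadratic work when you walk a whole result set, and what the keyset alternative looks like:

    import sqlite3

    # Toy model of the paging problem described above. OFFSET-based paging
    # forces the engine to construct and skip all earlier rows on every page,
    # so reading the whole table page by page does O(n^2) work in total.
    conn = sqlite3.connect(":memory:")
    conn.execute("CREATE TABLE items (id INTEGER PRIMARY KEY, name TEXT)")
    conn.executemany("INSERT INTO items (name) VALUES (?)",
                     [(f"item {i}",) for i in range(100_000)])

    PAGE = 100

    def pages_with_offset():
        offset = 0
        while True:
            rows = conn.execute(
                "SELECT id, name FROM items ORDER BY id LIMIT ? OFFSET ?",
                (PAGE, offset)).fetchall()
            if not rows:
                break
            yield rows
            offset += PAGE          # every later page re-walks all earlier rows

    def pages_with_keyset():
        last_id = 0
        while True:
            rows = conn.execute(
                "SELECT id, name FROM items WHERE id > ? ORDER BY id LIMIT ?",
                (last_id, PAGE)).fetchall()
            if not rows:
                break
            yield rows
            last_id = rows[-1][0]   # each page seeks straight to where it left off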


I see you work in enterprise! I’m sure that all of this absolutely cannot be changed because all the functionality is needed for ‘something’.


Tell me about it! A company I recently did maintenance for pays several thousand € each year for possibly the worst, most broken accounting software, despite not actually having an in-house accountant. The reason: one (1!) person in the company uses the software's (barely functioning) inventory feature and refuses to use anything else.

They're currently considering my offer to develop a custom, pixel-identical system and just replace it without telling her. I could probably do it in a week because she only ever interacts with like 4 out of the at least 100 views the app has, but I suspect she'll catch wind of this and stop it. I don't actually know what she does there besides inventory, but she seems to have more power than anyone else below C-level.


There was a time when I would have been interested in how such a system came to be, but now I think I have been around long enough to guess that someone was protecting their job, and then either got fired anyway or became technical-architect-for-life.


What makes Google Maps bad? My computer in 1983 didn't have a map application at all, how can Google Maps possibly be bad compared to that?

And if I had had a map application (I'm sure they existed) it would have taken several minutes to load from cassette tape. That's not faster than Google Maps, either perceptually or objectively.


The article explains all the ways that make Google Maps' UI bad. Your paper map or driver's atlas in 1983 offered better xref functionality. As they do in 2020, if you can still find them.

Sure, hardware progressed in the past 40 years. CPUs are faster, storage is larger, we have a global network. But it's all irrelevant to the point that Google Maps UI is optimized for being a point-by-point nav/ad delivery tool, not a map. That's an absolute statement, not relative to last century's technology.


I have several atlases and IMO they have considerable advantages over Google Maps. As does Google Maps over them. But none of that is relevant to the matter at hand: "Almost everything on computers is perceptually slower than in 1983".


Thus far in this thread nobody has offered any counterpoint to the speed argument.

The first PC we bought at home was a 133MHz Pentium.

My current box is a Ryzen 3700 at 16*3.6GHz.

It doesn't feel like I have that much more power at my disposal doing everyday things. Web browsers aren't 400 times faster today than Internet Explorer was in 1995.

They should be. Even if you account for all the extra stuff that's going on today things should be at least 50 times faster. Why aren't they?


There are other bottlenecks.

RAM bandwidth and speed, network latency, and the display sound like the most important. If that 133MHz Pentium rendered a web page, it did so at 640×400 pixels, right? 16 colours? Or just in text? So it had to process about 4k (if text) or 128k (if graphics). Your current display involves a little more data.
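
Roughly, assuming 80x25 text with an attribute byte per character, 4-bit colour for the old graphics mode, and 32-bit colour now:

    # Rough framebuffer sizes (assumed modes, not measurements).
    text_1995 = 80 * 25 * 2       # ~4 KB: one character byte + one attribute byte
    gfx_1995  = 640 * 400 // 2    # 4 bits per pixel -> ~128 KB
    gfx_now   = 1920 * 1080 * 4   # 32 bits per pixel -> ~8.3 MB
    print(text_1995, gfx_1995, gfx_now)   # 4000 128000 8294400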

RAM access takes about 10ns now, it took longer back then but not very much longer. Your sixteen cores can do an awful lot as long as they don't need to access RAM, and I doubt that you need sixteen cores to render a web page. The cores are fast, but their speed just removes them further from being a bottleneck, it doesn't really speed up display of text like this page.

And then there's the latency — ping times are a little closer to the speed of light, but haven't shrunk by anything close to a factor of 400.


My current display does include a lot more data.

It's also driven by a graphics card with more RAM than I had HDD space back in the day with a dedicated CPU that's also a whole lot faster than 133MHz.

Every piece of hardware is better but software has bloated up to remove those speed gains, except when it comes to things like AAA games where they're still pushing the envelope. That's the only place you can actually tell you've got hot new hardware, because they're the only ones caring enough about performance.


The increase in video RAM requires more of the CPU and GPU. Downloading and displaying a JPEG at today's resolution requires more of everything, not just video RAM.

Anyway, if you come over to the server side you can see other code that performs very differently than it could have back in 1983. Sometimes unimaginably differently — how would a service like tineye.com have been implemented a few decades ago?


The point I'm making is that my desktop PC sitting in my office now does have more of everything compared to my 133MHz PC from 1995. Not everything has scaled up at the same pace, sure, but literally every piece of hardware is better now.

People talk about the difference in resolution and color depth? 640x480x16 isn't that much less than 1920x1080x32. My current resolution has 13 times more data than my 1995 one, and my HW can handle refreshing it 120 times per second and filling it with millions of beautiful anti-aliased polygons all interacting with each other with simulated physics, dozens of shaders applied, calculating AI behaviour, path finding, thousands of RNG rolls, streaming data to and from disk, and syncing everything over the network, which is still limited by the speed of light. As long as I play Path of Exile, that is.

Opening desktop software is perceptually the same as in the 90s. From launching to usable state is about the same amount of time, and it's not doing so much more that it can explain why current software takes so long.

If I can play Path of Exile at 120fps it's obviously not an issue of HW scaling or not being able to achieve performance.


Is it OS and SDK bloat? Where do you think most of the cruft is coming from?

Let's say I had a given LOB application written in vb6, c#.net winforms, heck, maybe even WPF. All single-threaded.

If I re-wrote the same application (features, UI, still single-threaded) in native Win32, would that improve the latency?


Who knows? My hunch is there are two main factors influencing this. The first is that constraints breed creativity. If you know you only have 133MHz on a single CPU, you squeeze as much as possible out of every cycle; on modern CPUs, what's a few thousand cycles between friends?

The second is SDK/framework/etc. bloat, which is probably influenced by the first. With excess cycles you don't care if your tools start to bloat.

I think it's primarily an issue of attitude. If you want to write fast software you'll do it, regardless of the circumstances. It all starts with wanting it.


Think harder.

I worked on a framework in the nineties and did such things as render letters to pixels. Here are some of the optimisations we did then, compared to now:

We used much lower output resolution.

We used integer math instead of floating point, reducing legibility. How paragraphs were wrapped depended on whether we rendered it on this monitor or that, or printed it.

We used prescaled fonts instead of freely scalable fonts for the most important sizes, and font formats that were designed for quick scaling rather than high-quality results. When users bought a new, better monitor they could get worse text appearance, because no longer was there a hand-optimised prescaled font for their most-used font size.

We used fonts with small repertoires. No emoji, often not even € or —, and many users had to make up their minds whether they wanted the ability to type ö or ø long before they started writing.

Those optimisations (and the others — that list is far from complete) cost a lot of time for the people who spent time writing code or manually scaling fonts, and led to worse results for the users.

I think you're the kind of person who wouldn't dream of actually using anything other than antialiased text with freely scalable fonts and subpixel interletter space. You just complain that today's frameworks don't provide the old fast code that you wouldn't use, and think developers are somehow to blame for not wanting to write that code.


Try running Google Maps on your 133MHz Pentium.


Google Maps runs like crap on my current desktop. That's precisely the problem!


Is your current desktop a 133MHz Pentium? It runs perfectly well on my 2015 MacBook Pro!


Perfectly well? Really? Scrolling around the map or zooming causes you no kind of rendering delays or artefacts? It feels consistently snappy no matter what you do?


I am sure that Microsoft Autoroute was available in that era. It was probably also faster than Google Maps is today.


Apparently Microsoft Autoroute was first released in 1988, covered several dozen countries, and could be obtained by ordering it. Thus using it for the first time would involve a delay of at least a day in order to order, receive and install the program. After that, starting it should be quick, but I can't tell whether it required inserting the CD into the drive. Even if the application is already installed on the PC and not copy-protected, looking something up doesn't sound obviously faster than opening a web page, typing the name of the location into the search box, and waiting for the result.


It's not the "looking up a thing" part that would be faster. It's the "looking up the next thing", the "and the next after that" parts that would be.


And you had to wait for new CDs to arrive by mail whenever roads changed. And I'm not talking "2 day delivery Amazon with UPS updates" here, I'm talking you send an envelope + check into the mail and maybe a month from now you get a CD back.

It didn't actually work that well. The internet is the only real way you can get a Google Maps-like autoroute feature with reasonable update times. Constantly buying new CDs and DVDs on a subscription basis is a no-go for sure. I don't think anyone's 56k modem was fast enough to update the map data.

Even if you bought the CDs for the updated map data, it was only updated every year IIRC. So there were plenty of roads that simply were wrong. It's been a long time since I used it, but Google Maps is better at the actual core feature: having up-to-date maps and up-to-date route information.

Hint: Microsoft wasn't sending around Google cars to build up its database of maps in the 90s. Nor were there public satellite images released by the government to serve as a starting point for map data. (Satellite imagery was pure spycraft. We knew governments could do it, but normal people did NOT have access to that data yet). The maps were simply not as accurate as what we have today, not by a long shot.

--------

Has anyone here criticizing new software actually lived through the 90s? Like, there's a reason that typical people didn't use Microsoft Autoroute and other map programs. Not only was it super expensive, it required some computer know-how that wasn't really common yet. And even when you got everything lined up just right, it still had warts.

The only things from the 90s that were unambiguously better than today's stuff were like... Microsoft Encarta, Chip's Challenge and Space Cadet Pinball. Almost everything else is better today.


With browsers the network latency caps the speed ultimately, no matter how fast a CPU you have. Also HDD/SSDs are very slow compared to the CPU caches. Granted PCs of the previous era also had the same limitation but their processors were not fast enough to run a browser 400 times faster if only the HDD wasn't there.

But other, simpler programs should be very much faster. That they perceptually aren't is because (IMO) code size and data size have increased almost exponentially while CPU caches and main memory speeds haven't kept up.


Main memory speed hasn't increased like CPU speed has, but it's nowhere close to where it was in 1995. You can get CPUs today with a larger cache than I had RAM back then, as well.

I know that CPU speed isn't everything and so a 400x speedup is not reasonable to expect. That's why I hedged and said 50x.

Every part of my computer is a lot faster than it was back then and I can barely tell unless I'm using software originating from that era because they wrote their code to run on constrained hardware which means it's flying right now.

It's like we've all gone from driving 80 MPH in an old beater Datsun to driving 30 MPH in a Porsche and just shrug because this is what driving is like now.


There was the BBC Domesday Project, which let you scroll around annotated maps on a BBC Micro. They loaded in near-realtime from laserdisc. It was fantastically expensive, and also so tightly tied to its technology that it was impossible to emulate for years.

I believe around 2010 the BBC managed to get it on the web, but this seems to have died again.


I did have map applications in 1993.


It's worse than Google Earth.


> Huh. I'd say the examples are perfectly good and on-point. While dealing with autoexec.bat and random BSODs wasn't fun, it's entirely orthogonal to the fact that a DOS-era POS still offers orders of magnitude better UX than current-era browser POSes, or than most web apps for that matter.

What are you talking about? How in the world is a DOS that can only run a single app at a time better than a system that can run dozens of apps at once? How is non-multitasking a better experience? I remember DOS pretty well. I remember trying to configure Expanded Memory vs Extended Memory. Having to wait for dial-up to literally dial up the target machine.

Edit: I didn't realize the poster was talking about Point-of-Sale devices. So the above rant is aimed incorrectly.

> It also doesn't change the fact that Google Maps is very bad at being a map. Its entire UI flow is oriented toward giving turn-by-turn directions to people who know where they are and where they are going;

That's exactly what it's made for.

> As someone who builds their own PC every couple years: it still does. It's actually worse now,

No way. Things just work nowadays. Operating systems have generic drivers that work well. It's so much easier now to build a machine than it was years ago. I remember taking days to get something up and running, but now it's minutes. Maybe an hour?

I really hate these "good old days" posts, no matter what the subject. The days of the past weren't better, there were just fewer choices.


> What are you talking about? How in the world is a DOS that can only run a single app at a time better than a system that can run dozens of apps at once? How is non-multitasking a better experience?

The person you were replying to was specifically talking about POS (Point of Sale) systems.

Retail workers use these systems to do repetitive tasks as quickly as possible. Legacy systems tend to be a lot faster (and more keyboard accessible) than modern web-based systems.

It is not uncommon for retail workers to have a standard Windows workstation these days so they can reference things on the company website, but then also shell into a proper POS system.


My mistake. I thought they meant Piece-Of-Shit.


Well, the acronym works for both. And I've never seen a Point-Of-Sale system that wasn't (in several ways) a Piece-Of-Shit.


> How in the world is a DOS that can only run a single app at a time better than a system that can run dozens of apps at once?

> How is non-multitasking a better experience?

If you only use the device for one task, such as a point-of-sale system, or menu display system, then multitasking support is not needed.


Besides, you could multitask using software like DESQview.


In no way shape or form does DOS-era UX beat current-era UX (there may be specific programs that do so, but writ large this is incorrect). Getting away from command lines is one of the core reasons that computing exploded. Getting farther away from abstraction with touch is another reason that computing is exploding further.

Command-line systems simply do not map well to a lot of users' mental models. In particular, they suffer from very poor discoverability.

It is true that if you are trained up on a command-line system you can often do specific actions and commands quickly but without training and documentation, it is often quite hard to know what to do and why. Command-line systems also provide feedback that is hard for many people to understand.

Here are some guidelines for thoughtful product designs. DOS-era systems often violate quite a few of these: https://uxdesign.cc/guidelines-for-thoughtful-product-design...

Yes, it is true that many current-era systems violate these guidelines as well. This is because UX has become too visual design focused in recent years, but that's a symptom of execs not understanding the true value of design.


> Command-line systems simply do not map well to a lot of users' mental models. In particular, they suffer from very poor discoverability.

I don't want to push too hard into one extreme here, but I believe modern design makes a mistake of being in the other extreme. Namely, it assumes as an axiom that software needs to be fully understandable and discoverable by a random person from the street in 5 minutes. But that only makes sense for the most trivial, toy-like software. "Click to show a small selection of items, click to add it to order, click to pay for order" thing. Anything you want to use to actually produce things requires some mental effort to understand; the more powerful a tool is, the more learning is needed - conversely, the less learning is needed, the less powerful the tool.

It's not like random people can't learn things. Two big examples: videogames and workplace software. If you look at videogames, particularly pretty niche ones (like roguelikes), you'll see people happily learning to use highly optimized and non-discoverable UIs. Some of the stuff you do in 4x or roguelike games rivals the stuff you'd do in an ERP system, except the game UI tends to be much faster and more pleasant to use - because it's optimized for maximum efficiency. As for workplace software, people learn all kinds of garbage UIs because they have no choice but to do so. This is not an argument for garbage UIs, but it's an argument for focusing less on dumbing UIs down, and more on making them efficient (after all, for software used regularly at a job, an inefficient UI is literally wasting people's lives and companies' money at the same time).


> While dealing with autoexec.bat and random BSODs wasn't fun, it's entirely orthogonal to the fact that a DOS-era POS still offers orders of magnitude better UX than current-era browser POSes, or than most web apps for that matter.

These aren't orthogonal in any way. A significant chunk of the modern performance hit relative to older Von Neumann architectures is the cross-validation, security models, encapsulation, abstraction, and standardization that have made BSODs the extremely unexpected event, indicative of your hardware physically malfunctioning, that they have become (as opposed to "Oops, Adobe reverse-engineered a standard API call and botched forcing bytes directly into it when bypassing the call library's sanity-checking; time to hard-lock the whole machine because that's the only way we can deal with common errors in this shared-memory, shared-resource computing architecture!").


Yes, but the flip side of it is that it takes more time now to restart an Adobe application than it took to reboot the machine in the 1980s.


The time that matters isn't how long it takes to restart the app; it's how many hours of changes just got eaten because the app crashed and the data was either resident in memory only or the crash corrupted the save file (the latter scenario, again, being more common in the past where correctly shunting the right bytes to disk was a dance done between "application" code and "OS toolkit" code, not the responsibility of an isolated kernel that has safeguards against interference in mission-critical tasks).


OTOH, lower runtime performance of modern apps eats into how much people can produce with them - both directly and indirectly, by slowing the feedback loop ever so slightly.

While there are a couple of extra layers of abstraction on our systems that make them safer and more stable, hardware has accelerated far more than enough to compensate. Software today need not be as slow as it is.


In general, people will trade a high-variance cost for a fixed, predictable one, so even if the slower tools are nibbling at our productivity, it's preferable to moving fast and breaking things.

I'm not claiming there's no room for optimization, but 90% of the things that make optimization challenging make the system reliable.


> a DOS-era POS still offers orders of magnitude better UX than current-era browser POSes, or than most web apps for that matter.

As someone who helped with the transition from crappy text-only DOS interfaces on POSes to graphical interfaces on POSes, I have to disagree. Learning those interfaces was terrible. I don't know where the author gets the idea that even a beginner had no trouble learning them. I worked at NCR during the switchover from the old system to the new one that used graphical representations of things and the new system was way easier to use and for beginners to learn. And that's not just something we felt, we measured it. (Interestingly, it was still DOS-based in its first incarnation, but it was entirely graphical.)


For software that's used professionally, it does not matter how easy it is for a beginner to learn. That's an up-front investment that needs to be made once. What matters is the ongoing efficiency and ergonomics of use.


You can never really make a map for "exploring", because what people want to explore is domain-specific. Do you want to explore mountains? Museums? Beaches?

There is too much stuff in the world to present on a map without absolutely overwhelming people, in the same way that the internet is too vast to "explore". You can't explore the internet from Google Search; you've got to have some vague starting point as a minimum.


> There is too much stuff in the world to present on a map without absolutely overwhelming people

Nevertheless, I think that Google Maps et al. typically show far too little on the screen. My Dad tells me that he still prefers paper maps because he doesn't have to fiddle around with the zoom level to get things to show up. While I'm sure that Google has more data on most places than it could fit in a small window, when I look at my location in Google Maps, it looks very barren: in fact, it seems to prioritize showing me little 3d models of buildings over things that I care about, like street and place names. Paper maps are typically much denser, but I don't think that people 30 years ago were constantly "overwhelmed" by them.


In a world where you can plot a pin of yourself on the map via GPS, the need for street names, building names, etc. at higher zoom levels just doesn't matter, because you don't need them to figure out where you actually are; with paper maps you did.

If adding that information to the map serves no need anymore but clutters it up, why do it?


Google Maps' refusal to show the names of major(!) streets is infuriating. I don't need that information to determine where I am, I need it to figure out what street signs I need to be looking for so that I can get to where I am going. And the really infuriating thing is that it shows one random useless street somewhere until I zoom in so far that the street I want is the only thing on the screen.

I fired Google Maps and now use Apple Maps unless I'm looking for a business by name that I think is not mainstream enough to be on Apple Maps.


> offers orders of magnitude better UX than current-era browser

And they also did orders of magnitude less. If you've ever done UX, you know that it gets exponentially harder as you add more features. People love to complain, but not a single person would realistically give up all the new features they've got for slightly better UX in certain edge cases.


> and didn't have to write CSS hacks for electronics stores

could you explain that?


Sure. Most popular on-line stores seem to adopt the "material design" pattern, in which an item on a list looks like this:

  +---------+   ITEM NAME          [-20% OFF][RECOMMENDED]
  | A photo |
  | of the  |   Some minimal description.       PRICE (FAKE SALE)
  | item.   |   Sometimes something extra.      ACTUAL PRICE
  |         |   
  +---------+   <some> <store-specific> <icons>   [ADD TO CART]
I started to write userstyles that turn them all into:

  ITEM NAME Some minimal description ACTUAL PRICE [ADD TO CART]
  Sometimes something extra (with a much smaller font).
The userstyles get rid of the photos and the bullshit salesy disinformation, and reduce margins, paddings and font sizes. This reduces the size of a line item 5-6x while preserving all the important information; it means I can now fit 20-30 line items on my screen, where originally I was able to fit 4-5. With refined enough search criteria, I can fit all the results on the screen and compare them without scrolling.


If it truly were worse at being a map, people would still use physical maps, at least in some scenarios. I have never met a person who still uses a physical map for anything more than wall decoration.


> That I don't get. Is it "elitist" now to point out that the (tech) "elite" can actually handle all this bullshit

It's easy to critique and complain, and at the same time doing so puts someone in a position of superiority. Saying "all the effort and knowledge out there is bullshit because my 1983 terminal searched text faster" in a way says that the person writing this knows better, and thus is elite. OP says it's disguised as disappointment, which I agree with, because the author describes (and disguises) the situation from a frustration point of view.

But I also think that "elitism" could be swapped for "snobbery" in this case.



