> I've believed since college that math is the worst taught of all academic subjects. I never had a math professor that engaged the class, and practical applications were never mentioned.
Nearly all math textbooks and lectures mention practical applications. The problem rather seems to be that you have/had a different understanding of "practical" than your math professor.
- written exams where students are distributed over a larger area (e.g. the university rents a warehouse for the exam period) so that the risk of spreading COVID-19 is nevertheless kept very small.
This was done during my last exam phase and it worked well. They cranked the AC to 11, which made it quite an unpleasant environment, but I would rather wear a jacket than take online exams.
I can remember taking many exams in the athletic field house at my university back in the 1990s. It was pretty standard when you were in a large class.
The desks were so far apart that the current "social distancing" standards would be met. I remember that many courses had multiple variations of the same exam given to students to further reduce cheating.
I'm taking an online master's right now and have taken a few (pre-pandemic) proctored exams. The nearby university offers to proctor any exam for $10. They have a bunch of rooms with desks, and it's no big deal at all.
This proctoring technology thing has gone too far and too fast. It was a knee-jerk reaction, and with a lot of complaints I bet that a lot of it gets dropped.
There is also "32 bit realmode", which is not mentioned in the official documentation but simply a combination of existing states. Ditto for real mode paging and the like --- which finds more applications as emulator acid-tests than anything else.
I thought the 80286 had 32-bit protected mode, but it was badly implemented (the only way to get back to real mode was a reboot), so they fixed it with the 80386. Or are you referring to “unreal mode”?
No, most people forget about it (I had to be reminded by the above comment), but the 80286 did have a 16-bit protected mode. Quoting Wikipedia (https://en.wikipedia.org/wiki/Protected_mode): "[...] Acceptance was additionally hampered by the fact that the 286 only allowed memory access in 16 bit segments via each of four segment registers, meaning only 4*2^16 bytes, equivalent to 256 kilobytes, could be accessed at a time. [...]"
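To make the arithmetic in that quote concrete, here is a minimal C sketch (the real_mode_linear helper and the 0xB800 example value are mine, purely for illustration; the segment*16 + offset formula and the four 64 KiB segments are the standard behaviour the quote describes):

```c
#include <stdint.h>
#include <stdio.h>

/* Real mode: the CPU forms a 20-bit linear address as segment*16 + offset,
   so each segment register exposes a 64 KiB window into the 1 MiB space. */
static uint32_t real_mode_linear(uint16_t segment, uint16_t offset) {
    return ((uint32_t)segment << 4) + offset;
}

int main(void) {
    /* 286 protected mode: each of the four segment registers (CS, DS, ES, SS)
       still maps at most a 64 KiB segment, so without reloading a segment
       register only 4 * 2^16 bytes are reachable at once. */
    printf("reachable at once: %u bytes (%u KiB)\n",
           4u * (1u << 16), (4u * (1u << 16)) / 1024u);
    /* Classic real-mode example: the text-mode video buffer at 0xB800:0x0000. */
    printf("0xB800:0x0000 -> linear 0x%05X\n",
           (unsigned)real_mode_linear(0xB800, 0x0000));
    return 0;
}
```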
> A big "Yes" to proportional scroll bars. I didn't have an Amiga, but rather an ST. The ST also had proportional scroll bars and for years, I could not understand why the major platforms (Windows, MacOS) did not. It was a pet peeve that really bothered me when I would sit down to use someone's Mac or PC of the time.
TFA explains that it has it now but didn't during the heyday of the Amiga.
My limited experience from a long time ago is that it also takes quite a bit of code to implement this on Win32 relative to other platforms. Your link could be used as evidence.
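For a sense of the plumbing involved, here is a minimal Win32 sketch (the update_vscroll helper name and its parameters are mine, just for illustration; SCROLLINFO and SetScrollInfo are the actual Win32 calls). The proportional thumb comes from the nPage field, which the window has to keep consistent with its content and client sizes itself:

```c
#include <windows.h>

/* Illustrative helper: make the vertical scrollbar thumb proportional by
   telling Windows both the total content height and the visible page height.
   The thumb size is then nPage relative to the range (nMax - nMin + 1). */
static void update_vscroll(HWND hwnd, int contentHeight, int clientHeight, int pos)
{
    SCROLLINFO si;
    si.cbSize = sizeof(si);
    si.fMask  = SIF_RANGE | SIF_PAGE | SIF_POS;
    si.nMin   = 0;
    si.nMax   = contentHeight - 1;   /* range is inclusive */
    si.nPage  = (UINT)clientHeight;  /* visible portion -> thumb size */
    si.nPos   = pos;
    SetScrollInfo(hwnd, SB_VERT, &si, TRUE);
}
```

And that is only half of it: the window procedure still has to handle WM_VSCROLL (SB_LINEUP, SB_THUMBTRACK, etc.) and scroll/repaint its own content, which is where most of the extra code relative to other toolkits tends to end up.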
You're right, there is something called a "scrollbar", but many people don't see it because they are using macOS which for some reason hides the scrollbars by default.
Also, on phones (and with touchpads) you don't see scrollbars, I think.
macOS does indeed show the scrollbar at all times when an input device other than a touchpad is attached.
My desktop Mac never hides the scrollbar, as it uses a normal mouse. However, when I was using a Magic Trackpad, the scrollbar would indeed hide unless I was actively scrolling.
> I do think one core aspect of the practice of mathematics has been in increasing human understanding of it, not merely compute things with symbols.
The road towards better (in the sense that they can be "more trusted") computer-checked proofs of the four-color theorem has also led to a better (human) understanding of the four-color theorem.
Hobby Lobby refused to pay for any health insurance plan that covered contraception, because they believed some forms of contraception to be abortifacients and contrary to the Christian values of the corporation.
This is rather an argument that the employee, not the employer, should pay for the health coverage. As they say in Germany:
"Wer zahlt, schafft an."
("who pays, commands", where the verb "anschaffen" (which I translate with "command" here) has the undertone of "giving sexual orders to a prostitute that she has to follow")
A lot of what Blow says is not entirely accurate. For example, he presents a simple picture of declining software quality over time, but anyone who was around at the time knows that both desktop OSes and desktop applications (including web browsers) were certainly much more crashy, and probably more buggy in general, than they are now. Likely quality has started to decline again over the past decade, but it's still not remotely back to where it was. It's hard not to suspect that Blow passes over this because it tends to contradict his "higher-level languages and more infrastructure → declining quality" argument.

Section 7.4, "Programming Environments Matter" http://philip.greenspun.com/research/tr1408/lessons-learned.... of Phil Greenspun's, apparently, 1993 SITE CONTROLLER dissertation https://dspace.mit.edu/handle/1721.1/7048 makes the same "we don't expect software to work any more" lament which Blow delivers at 22:17 https://youtu.be/ZSRHeXYDLko?t=1337 :
> Another reason the "horde of C hackers" approach has worked remarkably well is that, although the resultant software is rife with bugs, most users have no experience with anything better. When an MBA's Macintosh Quadra crashes due to a C programmer's error in Microsoft Excel, he doesn't say "I remember back in 1978 that my VAX 11/780, with only one tenth the processing power of this machine, had memory protection between processes so that a bug in one program couldn't corrupt the operating system or other applications." Rather, he is likely to say, "Well, it is still easier than using pencil and paper."
but places the blame on a switch to lower-level languages and runtime systems. The improvements on the desktop over about the '00s seem to be attributable to (not an expert) the mainstreaming of, and continued development of, the WinNT and OS X platforms, increasing use of memory-managed languages and/or more recent versions of C++ in applications, and adoption of online crash-reporting infrastructure (though probably also increasing use of increasingly effective error-detection tools, which I assume Blow is fine with as they don't create a runtime dependency). So it certainly seems that Greenspun is more correct than Blow, which is certainly not to say that adding more layers of infrastructure has always been an unqualified good.
Also, Blow's talk has a very '90s focus on crashers, error messages, and the like, but many of the worst regressions in software over the last 10 or 20 years don't manifest as crashers or other straightforward bugs at all; and when they do manifest as bugs the bugginess is often intertwined with architectural issues in a way that makes a bug-hunting mentality relatively ineffective. For example, the pinnacle of WYSIWYG rich text editing was probably about Word 4 for Macintosh, which was a slightly awkward but workable mating of stylesheets to the WYSIWYG UI. Unfortunately it was something of a local optimum: further progress on the problem largely requires serious developer thought and/or further user education. So everyone more or less decided to instead pretend that rich text is a solved problem, and things have largely been gently regressing since then. Which is probably part of the deep background to the GMail rich-text jank Blow complains about at 23:47 https://youtu.be/ZSRHeXYDLko?t=1427 . “We can not solve our problems with the same level of thinking that created them”, as Lincoln said. ;)