Well, you could just look at things from an interoperability and standards viewpoint.
Lots of tech companies and organizations have created artificial barriers to entry.
For example, most people own a computer (their phone) that they cannot control. It will play media under the control of other organizations.
The whole top-to-bottom infrastructure of DRM was put into place by Hollywood, and it is now used by every other program to control and restrict what people do.
This article went much deeper than I was expecting. Wow. I always wondered what native peoples' alphabets looked like, since the Latin alphabet was imposed on them by colonialists. Fascinating.
There were no alphabets in the Americas before European contact. The Maya had written mathematics and a hieroglyphic script, and some Quechua-speaking peoples kept knotted strings (quipu) with some symbolic, mathematical representation (I don't know whether they supported arithmetic or were just record keeping).
Sequoyah developed the Cherokee syllabary (where symbols represent syllables rather than individual vowels and consonants) in the early 1800s after watching white men read and figuring out what they were doing (he spoke little English and could not read it). This was arguably the first writing system created by an indigenous person for an indigenous language of the Americas.
The Skeena characters shown here are obviously derived from European characters, as was the Cherokee syllabary. I think most written forms of native languages in the Americas are similar.
The Cree have a syllabic script that looks nothing like European characters but was nonetheless developed for them by a missionary in the 1800s. The Inuit have since adapted it for their language.
I don't know much about indigenous languages in the rest of the world.
In most cases, there was simply no native script to begin with. If you look at some examples of non-Latin-based scripts for native American languages (e.g. Canadian Aboriginal syllabics, Cherokee syllabary etc), they are all derived from newly introduced scripts. Mi'kmaw hieroglyphs are an interesting exception in that the glyphs themselves are indigenous, but their use as a full script was introduced from outside.
The Latin-based alphabets discussed in the article have mostly been introduced in the 20th century to facilitate the revival of those languages. Although I find that Salishan languages in particular got a very lazy treatment: if you look at some of the examples in the article like "ʔaʔjɛčχʷot" or "ʔayʔaǰuθəm", that's pretty much Americanist phonetic notation (https://en.wikipedia.org/wiki/Americanist_phonetic_notation) taken as-is, without much consideration for ease of use or typographic concerns (SENĆOŦEN is a notable exception to this). Kind of ironic, since many of the typographic issues the article addresses stem from this original decision.
I'm impressed. It runs my dev blog quite well. Some of the CSS alignment is off and it doesn't load web fonts, but it looks basically the same as Chrome. Even the syntax highlighted code snippets work.
I don’t hate the C language. I hate the C compiler (and the rest of the toolchain). Anything that helps me not interact with the C compiler is a huge win.
I’m currently working on a browser targeting the T-Deck in pure Rust. It’s effectively a text-mode command-line browser, good for reading pages with links and nothing else. There just isn’t the RAM for anything more. Interestingly, the slowest part is actually the SSL connections.
Doing everything onboard, you could maybe, just maybe, parse basic HTML/CSS and images. But most pages would of course fail anyway without full support for every modern feature.
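For the curious: a "links and nothing else" browser core can be surprisingly small. This is a hypothetical, dependency-free sketch (not the actual project's code) of pulling `href` targets out of raw HTML with plain string scanning in std-only Rust, the kind of shortcut a RAM-constrained device might force; a real implementation would also need a tokenizer that handles single quotes, unquoted attributes, and relative-URL resolution.

```rust
// Minimal sketch: extract href="..." targets from raw HTML using only
// the standard library. Assumes double-quoted attributes; this is an
// illustration, not a robust HTML parser.
fn extract_links(html: &str) -> Vec<String> {
    let mut links = Vec::new();
    let mut rest = html;
    // Scan forward for each `href="` occurrence.
    while let Some(i) = rest.find("href=\"") {
        rest = &rest[i + 6..]; // skip past `href="` (6 bytes)
        match rest.find('"') {
            Some(j) => {
                links.push(rest[..j].to_string());
                rest = &rest[j + 1..]; // continue after the closing quote
            }
            None => break, // unterminated attribute; stop scanning
        }
    }
    links
}

fn main() {
    let html = r#"<p><a href="https://example.com">one</a> <a href="/two">two</a></p>"#;
    println!("{:?}", extract_links(html));
}
```

Because it never builds a DOM, memory use stays proportional to the link list rather than the page, which is the whole point on a microcontroller-class device.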