Hacker News | jhasse's comments

You can lock the bootloader again with GrapheneOS and many banking apps work.

You won't pass Google Play hardware attestation that way, and you won't find a bank in Europe or the UK that doesn't require that to log on to their website within five years.

That's a prediction I would disagree with. Firstly, there are application developers who specifically add support for GrapheneOS if asked nicely. Secondly, there is a chance that Play Integrity will have to change due to anti-trust regulation.

My bank works fine after relocking (in NL, Europe). And last time I checked all Dutch banks work. My VISA credit card app (from ICS) also works. Same for the government identification app, the government message app, our insurance app. In fact, I haven't encountered anything outside of Google Pay that didn't work.

(I don't deny that there are apps that won't work. Best to check before switching full-time.)


You pass basic integrity, but not device or strong integrity. This is purely Google's fault: an artificial limitation that calls for regulatory intervention.

> A similar spec Android costs about the same but usually has a flimsy plastic build and will become unsupported in 2-4 years

Maybe 10 years ago, but this couldn't be further from the truth today.


Shouldn't sales reduce inflation because they increase supply?

Selling things increases money velocity.

A wealth tax would be like 5% of the $100k, nothing to do with the gains.

Yikes. So even if I store my wealth in cash, you want it to deflate by 5% annually?

How do you handle your neighbor who discovers he has a $2m Pokémon card in his closet? Is he forced to sell it to pay the 5% if he doesn’t have the cash on hand to pay the tax?

It’s a messy proposition. I’ve yet to hear a clear proposal that doesn’t have sticky edge cases.


> So even if I store my wealth in cash, you want it to deflate by 5% annually?

Generally speaking, that's the point. A wealth tax is trying to combat wealth inequality, and the only way for such a policy to be effective is if the wealth of those with considerable assets decreases over time.

> How do you handle your neighbor who discovers he has a $2m Pokémon card in his closet?

Usually that's handled by having a minimum asset threshold before the wealth tax kicks in; $100M is what I've seen. It'd be a pretty easy tax to make progressive.

> It’s a messy proposition. I’ve yet to hear a clear proposal that doesn’t have sticky edge cases.

I've given the proposal I've seen in a different comment. Perhaps you didn't see it? But in any case, taxes are always messy. It's not as if you can't refine them with more and more amendments to address different scenarios as they come up. I don't think the "messiness" should be what keeps us from adopting such a tax system. There will almost certainly be a game of cat and mouse between the regulators and the wealthy regardless of the proposal.

Don't let perfect be the enemy of the good.


Switzerland has a wealth tax while people like you wring their hands and the wealthiest see their wealth increase far beyond anyone else's.

From 1965 to 1995, the richest man in the world had about $30-40b in today's money. This was more than in the 1945-1965 era, but far less than the pre-war excesses, thanks to aggressive action to limit wealth.

Today the richest man in the world has $300b, Rockefeller levels before the 1929 crash.


We don't know how much money the richest person has because many assets are not publicly traded or disclosed.

Even if you go only by Elon's TSLA shares, he has north of 200B net worth.

> Today the richest man in the world has $300b, Rockefeller levels before the 1929 crash.

I think it's more like 800B right now.


I'm using KDE with Wayland and 2 non-standard DPI monitors (one at 100%, the other at 150% scale). No workarounds needed, nothing is blurry. I think your experience comes from GNOME, which lags behind in this regard.


FWIW, I can do the same with KDE on Xorg with Gentoo Linux.

Since the introduction of the XSETTINGS protocol around 2003-2005 to provide a common cross-toolkit mechanism for communicating system settings, the absence of "non-integer" scaling support has always been the fault of the GUI toolkits.

> I think your experience comes from GNOME which lacks behind in this regard.

When doesn't GNOME lag behind? Honestly, most of Wayland's problems stem from the project depending on protocol implementers and extenders cooperating to make it work, and from it setting that expectation while knowing that GNOME would be one of the parties whose cooperation was required.


Mint/cinnamon here at 150%, X11, not blurry. It’s FUD.


The issue with X11 is that it's not dynamic. Think using a laptop, which you sometimes connect to a screen on which you require a different scale. X11 won't handle different scales, and it also won't switch from one to the other without restarting it.


> The issue with X11 is that it's not dynamic.

No, it is. Maybe you're using an ancient (or misconfigured) Xorg? Or maybe you've never used a GTK program? One prereq is that you have a daemon running that speaks the ~20 year old XSETTINGS protocol (such as 'xsettingsd'). Another prereq is that you have a DE and GUI toolkit new enough to know how to react to scaling changes. [0]

Also, for some damn reason, Qt and FLTK programs need to be restarted in order to render with the new screen scaling ratio, while GTK programs pick up the changes immediately. Based on my investigation, this is a deficiency in how Qt and FLTK react to the information they're being provided with.

At least on my system, the KDE settings dialog that lets you adjust screen scaling only exposes a single slider that applies to the entire screen. However, I've bothered to look at (and play with) what's actually going on under the hood, and the underlying systems totally expose per-display scaling factors... but for some reason the KDE control widget doesn't bother to let you use them. Go figure.

[0] I don't know where the cutoff point is, but I know folks have reported to me that their Debian-delivered Xorg installs totally failed to do "non-integer" scaling (dynamic or otherwise), but I've been able to do this on my Gentoo Linux machines for quite some time.
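
For reference, a minimal xsettingsd config sketch for 150% scaling on a 96 DPI baseline (values are illustrative; note that XSETTINGS stores Xft/DPI premultiplied by 1024, so 144 DPI becomes 147456):

```
# ~/.xsettingsd (illustrative)
Xft/DPI 147456
Gdk/UnscaledDPI 98304
Gdk/WindowScalingFactor 1
```

Restarting or SIGHUP-ing xsettingsd broadcasts the new values, which is how toolkits like GTK pick up scaling changes at runtime.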


I use whatever package is shipped by arch, so I think I'm fairly up to date.

I did look a bit into this at one point, but I've found that it's mostly Qt apps which work fine with different scaling (Telegram comes to mind). GTK apps never did, but I admit I never went too deep down the rabbit hole. I didn't know there was supposed to be some kind of daemon handling this. I do run xsettingsd, but for unrelated reasons. I'll have a look at whether it can update things.

In any case, except for work, I always used everything at 100% and just scaled the text as needed, which worked well enough.

> I've bothered to look at (and play with) what's actually going on under the hood, and the underlying systems totally expose per-display scaling factors...

Would you care to go into some details? What systems are those and how do you notify them there's been a change?


Maybe I got lucky but my external screen is at the same scale. 4k monitors are not expensive any longer, mine is perhaps ten years old.


That's also my setup, the external 4k monitor is roughly the same scale as my FHD laptop. I did this on purpose, to avoid having to mess around with scaling and just run everything at 100%.

But more and more laptops are shipping with higher resolutions nowadays, and running my external screen at more than 100% scale would be a waste, while a 3-4k 14" laptop at 100% would be unusable.


Signal is so much worse than WhatsApp from a UX perspective. Backup sync forces you to allow background permissions (WhatsApp doesn't), you have to set a PIN and get nagged to enter it every few weeks (WhatsApp doesn't), there's no transcription for audio messages (WhatsApp has that for some languages), the desktop app loses its connection if you don't open it every few weeks (WhatsApp works fine), etc.

If you want people to switch, recommend Telegram.


>If you want people to switch, recommend Telegram.

Why would people switch from always-end-to-end encrypted group chats to never-end-to-end encrypted group chats?


Because they don't even know what e2e encryption is.


Yes. Let's switch to an app with Russian connections that has actively refused to implement E2EE for over a decade now.


The Russian connection is FUD, and Telegram does have E2EE, just not by default.


Said E2EE is mobile only and completely unavailable in group chats


You are moving the goalposts. But you're right: Signal's E2EE is miles better than Telegram's. I was just trying to relay my experience in getting people to switch; most of the time they have different priorities.


My circle switched to Signal because we are concerned about tech bros and a fascist America.

Boosting Russia is not the solution.


Telegram is not Russian. In fact Putin hates Pavel Durov.


That's where the standard should come in and say something like "starting with C++26 char is always 1 byte and signed. std::string is always UTF-8" Done, fixed unicode in C++.

But instead we get this mess. I guess it's because there's too much Microsoft influence in the standard, and they're the only ones who still don't use UTF-8 everywhere in Windows.


char is always 1 byte. What it's not always is 1 octet.


You're right. What I meant was that it should always be 8 bits, too.


std::string is not UTF-8 and can't be made UTF-8. It's encoding agnostic, its API is in terms of bytes not codepoints.


Of course it can be made UTF-8. Just add a codepoints_size() method and other helpers.

But it isn't really needed anyway: I'm using it for UTF-8 (with helper functions for the 1% cases where I need codepoints) and it works fine. But starting with C++20 it's starting to get annoying because I have to reinterpret_cast to the useless u8 versions.


First, because of existing constraints like mutability through direct buffer access, a hypothetical codepoints_size() would require recomputation on each call, which would be prohibitively expensive, in particular because std::string is virtually unbounded.

Second, there is also no way to be able to guarantee that a string encodes valid UTF-8, it could just be whatever.

You can still just use std::string to store validly encoded UTF-8, you just have to be a little bit careful. And functions like codepoints_size() are pretty fringe -- unless you're doing specialized Unicode transformations, it's more typical to just treat strings as opaque byte slices in a typical C++ application.


Perfect is the enemy of good. Or do you think the current mess is better?


std::string _cannot_ be made "always UTF-8". Is that really so contentious?

You can still use it to contain UTF-8 data. It is commonly done.


I never said always. Just add some new methods for which it has to be UTF-8. All current functions that need an encoding (e.g. text IO) also switch to UTF-8. Of course you could still save arbitrary binary data in it.


> That's where the standard should come in and say something like "starting with C++26 char is always 1 byte and signed. std::string is always UTF-8" Done, fixed unicode in C++.

> I never said always

Yes you totally did. And regarding "add some new methods for which it has to be UTF-8", there is no need at all to add UTF-8 methods to std::string. It would be a bad idea. UTF-8 is not bound to a particular type (or C++ type). It works on _any_ byte sequence.


You're right, sorry.

Good point, so maybe the standard should just add some functions that take std::string_view. Definitely not add a whole new class like std::u8string ...
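
As a sketch of what such a std::string_view-based function could look like (the name is made up; this only checks byte structure, not overlong forms or surrogates):

```cpp
#include <cstddef>
#include <string_view>

// Hypothetical free function over std::string_view, in the spirit of the
// suggestion above: UTF-8 handling without a dedicated string class.
// Returns true if the bytes form structurally well-formed UTF-8
// (lead byte patterns plus the right number of continuation bytes).
bool is_utf8_structurally_valid(std::string_view s) {
    std::size_t i = 0;
    while (i < s.size()) {
        unsigned char c = s[i];
        std::size_t len = (c < 0x80)       ? 1   // ASCII
                        : (c >> 5) == 0x06 ? 2   // 110xxxxx
                        : (c >> 4) == 0x0E ? 3   // 1110xxxx
                        : (c >> 3) == 0x1E ? 4   // 11110xxx
                        : 0;                     // stray continuation / invalid
        if (len == 0 || i + len > s.size())
            return false;
        for (std::size_t j = 1; j < len; ++j)
            if ((static_cast<unsigned char>(s[i + j]) & 0xC0) != 0x80)
                return false;
        i += len;
    }
    return true;
}
```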


Visual Studio is mostly written in C# btw.


Back in 2005 it was mostly in C++ and it was blazing fast. IMHO VS 2005 was the most performant edition. I never liked VS 2003, felt bloated in comparison.


Let's also not forget one big reason VSCode took over and Sublime lost: VSCode is gratis and (mostly) open-source, while Sublime is proprietary.


On macOS too. On both operating systems, 99% of apps do, though. Maybe it's 99.9% on macOS vs 99.8% on Windows. But I'm using HiDPI on both, and it was a long time ago that I last encountered an app that didn't support it.

