Shouldn't the logical conclusion be that if it's too much/too hard to teach these people how to operate a device safely, they operate the device in an unsafe way, bear the cost of it by being scammed, learn from the experience that it's not safe for them to operate the device for certain use cases, and tell others about it (and it ends up in the media) -> people who don't feel confident operating such a device securely are scared away from using it by the potential consequences they've heard about -> problem solved (from a banking security perspective)?
(except now the bank needs more staff behind the counter)
Not 100% sure if you mean this genuinely or are joking around a bit. Will assume the former.
Well, I think just letting knowledge of user failure spread organically is definitely a method of deterrence, and some amount of this is probably going to happen to some users. But to me it seems like a question of what percentage of your user base would be exposed to being scammed. Of course you'd want this to be zero, but if it's significant, you probably should put measures in place to reduce the amount of scamming. Even on a purely practical level, it's bad for the reputation of your product...
...I'm thinking that, since there is so much resistance to locking down Android, one problem might be that it was initially billed as a more open OS that tech people could enhance in whatever way they wanted. But times have changed: it's now a product used by the masses, and I'm guessing the masses are now its most important users. Not saying this is right or wrong, but it's probably why there is so much pushback compared to, say, if iOS did the same thing (which they may have already done).