> Is Microsoft a "dying company"? The stock market certainly thinks otherwise.
This is the entire sentence that I wrote that you seem to be referring to:
“These are examples of how bad management thinks, or at best, how management at dying companies think.”
MS falls under the first part — bad management. Let literacy be your friend.
To elaborate, yes, I think that MS is managed incredibly poorly, and they succeed despite their management norms and culture, not because of them. They should be embarrassed by their management culture, but their success in other areas of the company allows the bad management culture to persist.
There are so many options for what you describe. E.g. the whole point of Tcl/Tk is to allow easy creation of small tools and apps. And you can use Tk with Python as well.
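For illustration, a small Tk tool in Python is only about a dozen lines using the standard-library tkinter module (which wraps Tcl/Tk); the "uppercase the input" action here is just a made-up placeholder:

    # A minimal sketch of a tiny GUI tool using Python's built-in Tcl/Tk
    # bindings (tkinter); no third-party packages needed.
    import tkinter as tk

    def uppercase():
        # Placeholder action: upper-case whatever is in the entry box.
        text.set(text.get().upper())

    root = tk.Tk()
    root.title("Tiny Tool")

    text = tk.StringVar()
    tk.Entry(root, textvariable=text, width=40).pack(padx=8, pady=4)
    tk.Button(root, text="Uppercase", command=uppercase).pack(pady=4)

    root.mainloop()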
> Except for MKBHD and Linus Tech Tips, all (most?) the channels I listed have Patreon, and still find it necessary to run in-video ads because it's not enough.
Some of them will go subscription-only, which means that many of the free users will leave, but those who don't will pay enough to support the channel.
And some will find that the content they produce isn't actually valuable enough to sufficiently many people. Which is unfortunate, but has to be balanced against all the negative externalities of ads.
$200/month would cover many such sessions every month.
The real question is, what does it cost OpenAI? I'm pretty sure both their plans are well below cost, at least for users who max them out (and if you pay $200 for something, you probably will!). How long before the money runs out? Can they get it cheap enough to be profitable at this price level, or is this going to be a "get them addicted, then jack up the price" kind of strategy?
Violation of whose bodily autonomy? If I consent to having my consciousness copied, then my autonomy isn't violated. Nor is that of the copy, since it's in exactly the same mental state initially.
The copy was brought into existence without its consent. This isn't the same as normal reproduction because babies are not born with human sapience, and as a society we collectively agree that children do not have full human rights. IMO, copying a consciousness is worse than murder because the victimization is ongoing. It doesn't matter if the original consents because the copy is not the original.
If a "cloned" consciousness has no memories, and a unique personality, and no awareness of any previous activity, how is it a clone? That's going well beyond merely glitchy. In that case the main concern would be the possibility of slavery as Ar-Curunir mentioned.
That's my point exactly: I don't see what makes clones any more or less deserving of ethical consideration than any other sentient beings brought into existence consciously.
> The copy was brought into existence without its consent. This isn't the same as normal reproduction because babies are not born with human sapience, and as a society we collectively agree that children do not have full human rights.
That is a reasonable argument for why it's not the same. But it is no argument at all for why being brought into existence without one's consent is a violation of bodily autonomy, let alone a particularly bad one - especially given that the copy would, at the moment its existence begins, be identical to the original, who had just given consent.
If anything, it is very, very obviously a much smaller violation of consent than conceiving a child.
The original only consents for itself. It doesn't matter if the copy is coerced into sharing the experience of giving that consent, it didn't actually consent. Unlike a baby, all its memories are known to a third party with the maximum fidelity possible. Unlike a baby, everything it believes it accomplished was really done by another person. When the copy understands what happened it will realize it's a victim of horrifying psychological torture. Copying a consciousness is obviously evil and aw124 is correct.
I feel like the only argument you're successfully making is that you would inevitably find it evil/immoral to be a cloned consciousness. I don't see how that automatically follows for the rest of humanity.
Sure, there are astronomical ethical risks and we might be better off not doing it, but I think your arguments are losing that nuance, and I think it's important to discuss the matter accurately.
This entire HN discussion is proof that some people would not personally have a problem with being cloned, but that does not entitle them to create clones. The clone is not the same person. It will inevitably deviate from the original simply because it's impossible to expose it to exactly the same environment and experiences. The clone has the right to change its mind about the ethics of cloning.
Indeed it does not, unless they can at least ensure the clones' wellbeing and ethical treatment, in my view (assuming they are indeed conscious, and we might just have to assume so, absent conclusive evidence to the contrary).
> The clone has the right to change its mind about the ethics of cloning.
Yes, but that does not retroactively make cloning automatically unethical, no? Otherwise, giving birth to a child would also be considered categorically unethical in most frameworks, given the known and not insignificant risk that they might not enjoy being alive or change their mind on the matter.
That said, I'm aware that some of the more extreme antinatalist positions are claiming this or something similar; out of curiosity, are you too?
>retroactively make cloning automatically unethical
There's nothing retroactive about it. The clone is harmed merely by being brought into existence, because it's robbed of the possibility of having its own identity. The harm occurs regardless of whether the clone actually does change its mind. The idea that somebody can be harmed without feeling harmed is not an unusual idea. E.g. we do not permit consensual murder ("dueling").
>antinatalist positions
I'm aware of the antinatalist position, and it's not entirely without merit. I'm not 100% certain that having babies is ethical. But I already mentioned several differences between consciousness cloning and traditional reproduction in this discussion. The ethical risk is much lower.
> But I already mentioned several differences between consciousness cloning and traditional reproduction in this discussion. The ethical risk is much lower.
Yes, what you actually said leads to the conclusion that the ethical risk in consciousness cloning is much lower, at least concerning the act of cloning itself.
Then it wasn't a good attempt at making a mind clone.
I suspect this will actually be the case, which is why I oppose it. But you do have to start from the position that the clone is immediately divergent to get to your conclusions. To the extent that the people you're arguing with are correct (about a future-tech hypothetical we're not really ready to guess about) that the clone is, at the moment of its creation, identical in all important ways to the original, then if the original was consenting, the clone must also be consenting: if the clone didn't start off consenting to being cloned when the original did, it's necessarily the case that the brain cloning process was not accurate.
> It will inevitably deviate from the original simply because it's impossible to expose it to exactly the same environment and experiences.
If divergence were an argument against the clone having been created, by symmetry it is also an argument against the living human having been allowed to exist beyond the creation of the clone.
The living mind may be mistreated, grow sick, die a painful death. The uploaded mind may be mistreated, experience something equivalent.
Those sufferings are valid concerns, but they are not arguments for considering the act of cloning itself a moral problem.
Uncontrolled diffusion of such uploads may be; I could certainly believe a future in which, say, every American politician gets a thousand copies of their mind stuck in a digital hell created by individual members of the other party, on computers in their basements that the party leaders never know about. But then, I have read Surface Detail by Iain M. Banks.
To deny that is to assert that consciousness is non-physical, i.e. that a soul exists; and in the case where souls exist, brain uploads don't get them and so don't get to be moral subjects.
It's the exact opposite. The original is the original because it ran on the original hardware. The copy is created inferior because it did not. Intentionally creating inferior beings of equal moral weight is wrong.
>Because if the clone didn't start off consenting to being cloned when the original did, it's necessarily the case that the brain cloning process was not accurate.
This is false. The clone is necessarily a different person, because consciousness requires a physical substrate. Its memories of consenting are not its own memories. It did not actually consent.
The premise of the position is that it's theoretically possible to create a person with memories of being another person. I obviously don't deny that or there would be no argument to have.
Your argument seems to be that it's possible to split a person into two identical persons. The only way this could work is by cloning a person twice then murdering the original. This is also unethical.
> Your argument seems to be that it's possible to split a person into two identical persons. The only way this could work is by cloning a person twice then murdering the original. This is also unethical.
False.
The entire point of the argument you're missing is that they're all treating a brain clone as if it is a way to split a person into two identical persons.
I would say this may be possible, but it is extremely unlikely that we will actually do so at first.
One has a physical basis, the other is pure spiritualism. Accepting spiritualism makes meaningful debate impossible, so I am only engaging with the former.
I'd also be interested in your moral distinction between having children and cloning consciousness (in particular in a world where the latter doesn't result in inevitable exploitation, a loss of human rights etc.) then.
Typically, real humans have some agency over their own existence.
A simulated human is entirely at the mercy of the simulator; it is essentially a slave. As a society, we have decided that slavery is illegal for real humans; what would distinguish simulated humans from that?
In the Mastodon ecosystem it seems to be often taken to the extreme. As in, there are servers that will not federate with anyone who doesn't share their blocklist, and servers that will block anyone using Pleroma (because it's "fascist"), etc.
I've only seen that with certain German instances. They have their own particular laws over there, and they're very adamant that other countries follow them to the letter, yes. I've seen the discussion. When it comes to Nazi imagery, I agree that it should be forbidden everywhere, but I think there were some other stipulations that were more controversial.
But I have not seen that outside the scope of Germany.
I don't know Pleroma, though. I've always hated Twitter for its short-form content (I feel like it stimulates stupid nonsense like "look at my run today" and "I just had dinner" and discourages actually interesting content), so I was never into Twitter clones either. I do use Lemmy more, although it has its own specific attitude issues around its developers (tankies).
Pleroma is Mastodon server software. For a bunch of essentially random reasons, it was popular among right-wingers setting up their Mastodon instances, and some servers responded by blocking any Mastodon server running it outright. A subset of those would also block any server not blocking Pleroma like they do.