Philosophistry Database

Cloning will be difficult for friends who don't want to hear the same joke twice

In the case of human copying, it's important to realize that not everything gets copied. For example, bank accounts have to remain in the custody of the canonical version.

But more importantly, relationships can't be copied. Knowing this leads to some potentially interesting social conventions. For example, it may become a faux pas for the copy to try to re-approach the friends of the original. Doing so would create confusion among their friends, at the very least because they don't want to hear the same jokes twice. Loved ones need the security of knowing that they're investing in only one version, the legal, canonical one.

This hurdle could be bypassed, however, if a group of friends decided to copy themselves simultaneously.

cloning futurism

Human copying will be like Dropbox, with the cloud copy always a few files behind the local one

Human copying might manifest similarly to cloud-based file storage. Even though the files exist on your laptop, on your phone, on your desktop computer, and in data centers, the hardware representations are not that important. Each location might be out of sync with the canon by a few files, but what matters is that there is a canon with a high probability of persistence.

If the canon decayed, you could recover your identity via these imperfect copies, with little personal disruption. This decay already happens, in a way, since we regularly shed memories. Sometimes, after a night of heavy drinking, we lose whole folders of memories, and yet we carry on.

This leads to the question, "What constitutes a significant loss of identity?" Death might become obsolete due to human copying, but if we lost 30% of our cloud backups, would we mourn the loss?

cloning philosophy futurism

Human copying would make for interesting contract law, especially with vanishing pacts between originals and their copies

In a world where human copying is normal, vanishing pacts could also become normal. In a vanishing pact, you agree that if you wake up as the discarded version (d-version), you will vanish after a set period. Theoretically, these pacts could be made legally binding, with the government guaranteeing the destruction of d-versions. But most likely there will be some flexibility in the pact. At the very least, there will always be the possibility of a renegade d-version defying the government, breaking free, and choosing to live.

In the case of voluntary or pseudo-voluntary vanishing pacts, your willingness to make a copy of yourself depends largely on how well you would cooperate with yourself.

For example, your vanishing pact might be set to 24 hours, where your d-version lives on for a full day while the c-version (canonical version) is simultaneously alive. During those 24 hours, the d-version might meet someone interesting, go on a spontaneous date, and decide they want to see it through, keeping their identity and nullifying the vanishing pact.

The c-version may object to this. Perhaps they no longer trust you (the d-version) because you violated the vanishing pact, and they're worried you might try to steal their identity, taking their bank accounts and relationships. The c-version would then have to hunt you down. Perhaps they would even have the force of the government behind them.

This pattern might become so widespread that all d-versions could receive mandatory marks to distinguish them from the c-version. Maybe they would be given a slightly bent nose or a special QR code recording the timestamp and site of their copying.

cloning futurism

If clones played the Prisoner's Dilemma, wouldn't the dominant strategy be silence, since each player knows their copy would do the same?

Human cloning would make for interesting game theory experiments. In the case of the classic Prisoner's Dilemma, it's rational for every prisoner to rat the other one out. But in the case of human cloning, the game has one additional assumption: both parties do the same thing. The two-dimensional decision matrix then becomes one-dimensional, with only two choices: we both snitch or we both keep quiet. You can assume that whatever choice you make, the other will make too, so it's always better to choose silence.
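The collapse from a two-dimensional matrix to a one-dimensional diagonal can be sketched in a few lines. The payoff values below are hypothetical (years in prison, lower is better) but follow the standard ordering of the dilemma:

```python
# Hypothetical payoff matrix: my prison sentence given (my_move, their_move).
# Lower is better. The ordering follows the classic Prisoner's Dilemma.
SENTENCE = {
    ("silent", "silent"): 1,   # mutual cooperation
    ("silent", "snitch"): 10,  # sucker's payoff
    ("snitch", "silent"): 0,   # temptation to defect
    ("snitch", "snitch"): 5,   # mutual defection
}

MOVES = ["silent", "snitch"]

def best_independent_move():
    """Without cloning: the other player is free to do anything,
    so pick the move with the best worst-case sentence. Snitching
    dominates (0 < 1 and 5 < 10), so it wins here too."""
    return min(MOVES, key=lambda me: max(SENTENCE[(me, other)]
                                         for other in MOVES))

def best_clone_move():
    """With a perfect copy: only the diagonal outcomes are reachable,
    since whatever I choose, my clone chooses too. Compare just
    (silent, silent) against (snitch, snitch)."""
    return min(MOVES, key=lambda me: SENTENCE[(me, me)])

print(best_independent_move())  # snitch
print(best_clone_move())        # silent
```

The diagonal restriction is the entire argument: once the off-diagonal cells are unreachable, the comparison is simply one year versus five, and silence wins.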

cloning economics futurism