
Setting

Humanity has spent centuries investigating consciousness, brain physiology, AI, and other technologies. We've finally figured out how to "upload" a "consciousness" (these are quoted because I don't really know what this means at this point).

Personality upload is non-destructive, so the original biological entity lives on after the upload.

The Question

How do we protect the rights of both uploaded and non-uploaded personalities?

Some abuses these entities might face:

  • Copying them against their will and using them in roles they do not want (e.g. as guidance AI in warheads).
  • Not permitting them representation in government.
  • Them copying themselves to gain disproportionate representation in government.

What legal and moral obligations do the two entities have towards each other?

Considering that the "you" walking into the personality upload facility is fully aware that it stands a 50% chance of being the biological entity or the uploaded entity when it's done, does the law require a 50% split of assets, or does it allow the one walking in to dictate the terms for the two walking out? Perhaps it requires a minimum level of support for each entity.

Jim2B
  • Are the upload and the original human linked in any way? Also, in what jurisdiction does this happen? Answers will vary by country, or even at lower levels of government. – HDE 226868 Oct 28 '15 at 21:51
  • I don't think there is just one answer to this question. In fact, I think this may be one of the least agreed-upon classes of legal questions in existence. I'm tempted to vote it as too broad, but I'm holding off, and instead wondering if anyone can come up with an answer to which I cannot trivially argue a diametrically opposed answer that is "just as good." So much of the answer depends on social/cultural details that do not appear in the question (and likely cannot, due to limits of modern language) – Cort Ammon Oct 28 '15 at 21:54
  • As an example, consider the works of Asimov, many of which dealt with the countless permutations of the ways conscious AIs could be treated in the world, and not one of his books handles them exactly the same way as another, even when constrained by the famous Three Laws of Robotics. – Cort Ammon Oct 28 '15 at 21:55
  • Can we turn them off and then on? What right would they claim? – Kii Oct 28 '15 at 22:06
  • @CortAmmon, I agree that this is a broad question. I'm trying to narrow them down and if anyone has suggestions please feel free to contribute. I find this topic fascinating and I do not think there's one right answer (which suggests it is too broad :( ). I can see the beginnings of several answers to several of my questions but I don't have them fully fleshed out. – Jim2B Oct 28 '15 at 22:10
  • And also, what would their needs be besides energy to power their computer? – Kii Oct 28 '15 at 22:11
  • The uploaded personality's costs would include physical maintenance on that computer, software maintenance, ISP costs, Netflix subscription, cost for maintaining a controlled environment for the computer equipment, etc. – Jim2B Oct 28 '15 at 22:15
  • One of the most impactful lines I've been able to draw while exploring the problem is whether the uploaded personalities are copyable as data, or if they have their own "existence" from the moment they are brought online. Anything copyable is replaceable and often destroyable, so such an AI would have few rights. However, there are many ways the AI could be hooked into systems that are not reasonable to disconnect in a controlled manner, rendering them uncopyable, and if they shut down, that's "the end." – Cort Ammon Oct 28 '15 at 22:34
  • Another impactful question is how the copy process works. You mention this process is non-destructive... 100% non-destructive? Could I torture information out of someone by cloning them, torturing their clone to death, then cloning them again and again without ever harming a hair on the original's head? How much is conveyed? Perfect memories, all the way down to the first kiss? Or is it just the personality in a vat, scared because all of its warm memories have been stripped? – Cort Ammon Oct 28 '15 at 22:37
  • The copy process is important because almost everything I have found on this topic suggests the answers are not path independent. We cannot define a Before, define an After, and presume that all the details we need are found in those two endpoints. The way we get from Before to After will affect a great deal. – Cort Ammon Oct 28 '15 at 22:38
  • Consider a copy process that requires the subject to focus their willpower on the success of the clone; any failure on their part will result in an imperfect clone (though others might not notice). In such a system, perhaps only a select few have the willpower to actually create a perfect clone; everyone else makes do with degraded copies (which, if they get lucky in the way they are degraded, might be "better" than the original!) – Cort Ammon Oct 28 '15 at 22:40
  • @CortAmmon, I hadn't fully thought it out but my general idea was that by this time most people would be using implanted electronics (embedded cpu, RAM, wireless receivers, etc.). At some point the implanted electronics would have access to much of the human personality and I was expecting the upload to make use of these electronics for at least part of the process. Perhaps some mapping of neurons and their branching could be used too. So in my imagination, it'd be a perfect replication of accessible memories. Perhaps some replication of subconscious memories. – Jim2B Oct 28 '15 at 22:41
  • Then it is worth noting that you have taken a hard physicalist line of philosophical debate. You have declared that consciousness is nothing but an illusion, and we can reproduce it any time we like, even against its will. (If you can't tell, I've spent quite a bit of time playing with what it means for a computer to be conscious, especially if I do not want to flat-out invalidate dualism as a philosophy. That's a popular philosophy to try to fight. Readers like to believe their "self" is actually a thing.) – Cort Ammon Oct 28 '15 at 22:44
  • If it fits well with the world you wish to build, the version I have found least conflicting philosophically is one where the electronics do not clone a mind; they permit a mind to clone itself. The mind must use willpower to create the clone, or the cloning process fails. If the mind exerts partial willpower, both self and clone will be able to distinguish which is the clone. If the mind is fully exerted, neither party will be able to distinguish one from the other (generating all sorts of interesting bits). Once cloned, each clone should have at least one analog electronic timing... – Cort Ammon Oct 28 '15 at 23:01
  • ... as part of its identity, which inherently prevents naive replays which would clone the data from the consciousness and try to make a new one. Also, a clone should be able to use the techniques on themselves, or else they are incomplete clones, but obviously any degeneration due to lack of willpower will stack. (This construction also sets the stage for the legal questions, as it does a better job of permitting self-identity to clones. One gives rights to other beings. One never gives rights to furniture. The clones have to be able to clear the gap and rise above "furniture" status.) – Cort Ammon Oct 28 '15 at 23:02

2 Answers


The digital copy would have no rights.

Certainly not at first. Humanity has not had a good history of recognizing rights, even (or especially) of other humans.

It would take quite a long time before digital people would be considered people. Even then, they would almost certainly be seen as offspring of the flesh-person, not as the same person twice. This would grant them personhood rights, but not entitlement to the possessions/assets of the original (though they may be considered the de facto heir).

The original person might even be legally required to support their copy until it can support itself. For humans in the US that period is defined as 18 years; time would tell what period would be required when spawning a fully developed person.

Samuel
  • That's one of the directions my own thoughts were trending. After the law started recognizing them as legal entities, then it would require a certain level of support for a certain period of time. How long it would take for the politics to catch up with the technology is a good question. How long it would take an uploaded personality to find a means of support is probably highly dependent upon that personality. How would your uploaded personality like to be someone's personal assistant for a century, lol? – Jim2B Oct 28 '15 at 22:13
  • @Jim2B Even more thoughts come to mind. I've asked here. – Samuel Oct 28 '15 at 22:40

Whatever bad traits are in their nature, they must be closely guarded on that front. The real question is: why in the heck would someone 'upload' a personality with bad traits at all? On the second front, the 'new guy' shouldn't have any of the original's possessions unless it has the original's memories, too. This would be so awkward in general that it would get the whole thing banned before it could start.

  • This sounds more like a comment than an answer. – HDE 226868 Oct 28 '15 at 21:43
  • The answer is: you would have to treat the one like the other, but it would get weird if one didn't have the other's memories, because "why am I having psychology sessions? I didn't do anything!" – Oct 28 '15 at 21:45