
Conference noted::sf

Title:Arcana Caelestia
Notice:Directory listings are in topic 2
Moderator:NETRIX::thomas
Created:Thu Dec 08 1983
Last Modified:Thu Jun 05 1997
Last Successful Update:Fri Jun 06 1997
Number of topics:1300
Total number of notes:18728

412.0. "RQ #2: Human Qualities" by ROCK::REDFORD (On a pure caffeine high) Wed Nov 19 1986 17:39

Each of the following is in some way marginally human:

1) A chimpanzee who can use American Sign Language.

2) A sentient computer - one able to pass the Turing test for at 
least, say, an hour.

3) A dolphin who speaks no human language, but can be understood 
through translators.

I think we would all agree that these beings should have legal rights
similar to those of children, idiots, or lunatics.  The question is: what
additional rights should they have?  To be specific, can they:

1) Vote or hold office? 
2) Bring legal charges against human beings?

And the flip side is:

1) Should they pay taxes?
2) Be susceptible to criminal or civil suits?

Opinions?

/jlr
412.1. "All Sentience is created equal, and they are endowed..." by CACHE::MARSHALL (hunting the snark) Thu Nov 20 1986 08:46
    re .0:
    
    first a minor semantic objection.
    
> 2) A sentient computer - one able to pass the Turing test for at 
>    least, say, an hour.
   
    A SENTIENT computer would pass the Turing test forever. I would
    not consider sentient a computer that could only pass it for a
    relatively short time.
    
    However, I know what you mean, and they are very good questions.
    I think that I shall have to mull it over more but my initial response
    is Yes....No....ummm....well....very tough question.
    
                                                   
                  /
                 (  ___
                  ) ///
                 /
    
412.2. "Human is as Human Does" by PROSE::WAJENBERG Thu Nov 20 1986 14:39
    Re .0
    
    I would say that we should accord them rights in proportion to the
    responsibility they exhibit.  That's just a very rough guiding
    principle, of course.  Perhaps one could administer something like
    the sanity tests used to establish the legal competence of
    standard-issue human beings.  (Of course, I have heard those tests
    loudly criticized, too.)
    
    Re .1
    
    To pitch the nit back to you, a computer might be *sentient* and
    be very poor at the Turing test.  "Sentient" just means "aware"
    or "conscious."  The quality of the sentience's emotions and thoughts
    might not be human-like at all.  The Turing test, as I recall, demanded
    that the computer act just the same as a human being at the other
    end of a teletype.  (In fact, a notesfile would be a grand arena
    in which to run a Turing test.)  A computer might be conscious and
    volitional but still have a very non-human personality.  HAL 9000
    might be a good fictional example.  He was responsive and intelligent
    in his actions, but was depicted as having a manner and delivery
    (and a depth psychology) markedly unhuman.
    
    Earl Wajenberg
412.3. "More Questions" by DRUMS::FEHSKENS Thu Nov 20 1986 15:19
    re .2 re .0 - this approach would deny rights to a large number
    of people.  Many people are seriously irresponsible, but we grant
    them "standard-issue" rights almost solely because they are "packaged"
    the right way (i.e., live in a human body).  There is a certain
    amount of "species chauvinism" here.  
    
    In particular, given much recent discussion of abortion,
    it raises some interesting questions about the rights of fetuses,
    which exhibit less responsibility than many dogs I have known.
    (Please note that this statement is clearly true, and says nothing
    whatsoever about my position on abortion, so let's not go off track
    by getting into a discussion about abortion or my position thereon).
    This is an enormously difficult question, one that people are not
    predisposed to think rationally about (perhaps because "rationality"
    is not the relevant issue).  All I'm trying to call attention to
    is the futility of "simple" answers.
    
    What does "sentient" mean?  Is there some clearly defined threshold?
    It seems to me that's part of the problem - you can't just draw
    a line and say "on this side, your 'life' is unimportant, but on
    that side it is".  What does "aware" mean?  Is a thermostat "unaware"
    of temperature?  Animals are clearly "conscious", because you can
    anesthetize them and render them unconscious, and there's a dramatic
    behavioural difference between the two states.  Do you mean self-aware
    or self-conscious?  What about the experiments with chimps
    and mirrors?  Or is the issue, as I have heard argued, awareness
    of death and its meaning to the individual?
    
    And what does "important" mean?  There are rights,
    and privileges, and responsibilities, both of and to the individual
    in question.  We could debate this issue forever and only succeed
    in aggravating one another.
    
    len.
    
412.4. "Non-binary Coding" by PROSE::WAJENBERG Thu Nov 20 1986 17:12
    Re .3
    
    I agree simple answers are no good.  If I have seemed over-simple
    it is because of limited typing time.  One over-simple thing to
    avoid is the binary allocation of rights or no rights.  Ideally,
    rights should be awarded according to many different factors.  For
    instance, the right to be unmolested should be in some way proportional
    to the subject's sensitivity.  The right to exercise a given power
    should presumably entail a responsibility to use that power with
    a regard to the rights of others.  And so forth.
    
    Human legal practice recognizes this in a rough and ready way. 
    For instance, minors have some rights but not all the rights of
    an adult citizen.
    
    Earl Wajenberg
412.5. "Turing Test" by CACHE::MARSHALL (hunting the snark) Thu Nov 20 1986 17:21
    re .2:
    
    The Turing Test is a test of sentience, not humanness. 
    A human may try to act like a computer to fool you. HAL would pass
    the Turing Test (I think). A computer may be programmed to appear
    to be spontaneous. 
    
    re .3:
    
    The points you bring up are exactly why I ducked the question. The
    questions are loaded; there is no simple answer. The being will
    have whatever rights it declares that it has, as long as it presents
    a reason. Rights are not given. Rights are claimed, or they are
    recognized. To possess rights does not require responsibility; to
    *exercise* those rights does. We do not "grant" rights to people
    because they are human; they possess rights because they are human.
    
    This question I think should be in PHILOSOPHY.
                                                   
                  /
                 (  ___
                  ) ///
                 /
    
412.6. "AI Aliens" by ROCK::REDFORD (On a pure caffeine high) Thu Nov 20 1986 18:10
One other category they could fall into is foreigners.  Foreigners do 
not have privileges and duties with regard to the government of the 
land, but they are entitled to some legal protection.  This status 
recognizes that they are halfway between societies.  Immigration 
could be permitted from Dolphinia and Netland, but that's more of a 
political question.
/jlr
412.7. "First reply in SF" by COMET::HUNTER (Nine o'Clock Meetings, A Real winner) Thu Nov 20 1986 18:12
     Very interesting discussion; I have to think about it a while longer
    before tackling it.
412.8. "More Turing Test" by PROSE::WAJENBERG Fri Nov 21 1986 09:45
    Re .5
    
    The Turing Test is testing sentience (sapience, actually, but that
    includes sentience), but the test is the ability to fake humanness.
    At least, that is how Turing first wrote it up.  If you want to
    propose a modified Turing Test with the simple goal "persuade me
    there is something conscious on the other end of this link," that's
    fine too.
    
    Earl Wajenberg
412.9. "just trying to understand" by CACHE::MARSHALL (hunting the snark) Fri Nov 21 1986 10:02
    re .8:
    
    Earl, really, I'm not trying to argue with you. Maybe my understanding
    of the test is off; I have not read Turing's actual paper. I'm mostly
    familiar with the test through secondary sources, the major one being
    _Godel,_Escher,_Bach_.
                      
    As I understand it, the subject "passes" if the tester cannot
    decide whether the subject is a computer or a human. I do
    not think the tester must declare the subject to be a human in order
    to pass; the undecidability is sufficient.
    
    Actually, I think the set-up for the test is to have three parties
    involved: the tester, the subject, and (for lack of a better word)
    a reference. The tester and the reference are human. The subject
    is the program or alien or whatever whose sentience we are trying
    to determine.
    
    All three are put before teletypes, and the tester is now periodically
    switched between the subject and the reference. If the tester cannot
    distinguish between the two, then the subject passes.
    
    Is this correct?
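
    (A rough sketch of the set-up just described, in Python; the tester,
    subject and reference objects and their ask/reply/judge methods are
    invented stand-ins, not anything specified by Turing.)

import random

def run_imitation_game(tester, subject, reference, rounds=20):
    """Return the fraction of rounds in which the tester identified
    its conversational partner correctly."""
    correct = 0
    for _ in range(rounds):
        # The tester is periodically switched between the subject and the
        # reference without being told which is on the other teletype.
        partner, label = random.choice([(subject, "subject"),
                                        (reference, "reference")])
        question = tester.ask()
        answer = partner.reply(question)
        guess = tester.judge(question, answer)  # "subject" or "reference"
        if guess == label:
            correct += 1
    return correct / rounds

def subject_passes(hit_rate, tolerance=0.1):
    # Undecidability is sufficient: the subject passes if the tester's
    # guesses are no better than chance (a hit rate near 0.5).
    return abs(hit_rate - 0.5) <= tolerance

    (On this reading, passing is indistinguishability, a hit rate near
    chance, rather than being judged human outright.)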
                                                   
                  /
                 (  ___
                  ) ///
                 /
    
412.10. "Parlor Games" by PROSE::WAJENBERG Fri Nov 21 1986 10:23
    Yes, that is my understanding of the Turing Test.  I have heard
    that Turing developed it from a Victorian parlor game in which the
    tester communicated with a man and a woman by passing notes under
    a door or some such.  Either the man pretended to be a woman or
    vice versa and the tester had to determine their real sexes.
    
    Earl Wajenberg
412.11. by INK::KALLIS (Support Hallowe'en) Fri Nov 21 1986 15:35
    Well, let's get back to the questions.
    
    Interestingly, in the 19th and early 20th centuries, there was a belief
    among the population of tropical Africa that gorillas and chimpanzees
    were just as smart as anyone, but they stayed in the jungle to avoid
    having to go to work to earn money!  Sort of a "noble savage" concept,
    one imagines. 
    
    On cetaceans, etc.  They're intelligent animals, and ought to be
    _respected_.  "Legal rights," one assumes, must be based on the
    presumption that they _want_ human rights.
    
    On intelligent computers, etc.  There was a marvelous Henry Kuttner
    story where the hero was talking to a robot that did the analog
    of "drinking" by sticking a metal finger in an active light socket.
    Slightly paraphrased:
    
    	There was a sharp crack and a flash of blue spark.  "Hmm," said
    the robot.  "DC.  Tasty."
        "You're not dead," said [the hero].
        "I'm not even alive," said the robot.
    
    Steve Kallis, Jr.