Monday, October 11, 2010

Uncanny Valley

...it's not a location in Maine but a hypothesis in robotics: just before a robot reaches total likeness to human appearance and behavior, we experience strong revulsion. Thanks to Nick Guetti for reminding me about this.



It proves Jackson's point another way actually. We are weirded out because we glimpse that WE are a form of artificial intelligence. Of course the scientists will tell you it's our mating instinct or something.

6 comments:

  1. Wow, that's Lovecraftian! I'm reminded of that story "The Rats in the Walls", in which the narrator--who loathes rats--investigates an elusive infestation of them in his ancestral mansion & discovers that his whole family is descended from them.

  2. An old 1976 Doctor Who story (or did DW steal it from Isaac Asimov?) coined an idea for a futuristic neurosis called "Robophobia". In a robot-dependent society, you're around robots all the time. These robots are designed to appear humanish for aesthetic reasons, but they give no nonverbal signals (body language), which undermines certain types of personality. Some patients experience panic and total breakdowns, but a few become maniacs who believe that they themselves are robots and must wipe out the enslaving humans (and reprogram the actual robots to think the same). Is this akin to certain deep ecologists becoming convinced that we are all animals and attacking human infrastructure? There are certainly plenty of eco-fantasy stories about Gaia-worshippers attaining magical powers to command avenging armies of fuzzy woodland creatures. Somebody should have told them there are no such things as robots or animals or people.

    interesting hypothesis but i don't buy that it's a universal response in the least. fairly essentialist.

    and are you sure the Lovecraft story about rats isn't really a story about his deep-seated racist ideology? maybe it was his expanded version of his On the Creation of Niggers poem, where the gap between "animal and man" is bridged. what a champion (of shit).

  4. Reminds me of Steven Pinker's answer to John Searle's 'Chinese Room' problem. He says the difference between returning correct answers by rote and understanding as such - the elusive 'wetness' of consciousness in us biologicals - can be accounted for by speed of interchange.

  5. My girlfriend told me about this a few months ago - apparently CG animators use it to decide how realistic to make their characters. Remember that movie The Polar Express? The characters were so realistic that they really bothered a lot of people.

    I think the problem is not so much in how realistic the robot/character is, but in what it lacks. You can approach an exact replica of a human, but you can never fully get there. And it's this gap between actual humans and fake humans that causes revulsion. There's always a sense that something is missing.
    Also, I agree with Meow - I'd like to know just how universal this is.

  6. AI is only true insofar as we are dumbing ourselves down to believe that machines can think.

    It's not that we can't make them jump through the hoop and be intelligent - it's more that we've reduced our notion of what is intelligent: we've lowered the hoop.
