“Was not their mistake once more bred of the life of slavery that they had been living?—a life which was always looking upon everything, except mankind, animate and inanimate—‘nature,’ as people used to call it—as one thing, and mankind as another. It was natural to people thinking in this way, that they should try to make ‘nature’ their slave, since they thought ‘nature’ was something outside them” — William Morris


Friday, June 3, 2011

On Just Trying Not to Fall: Mind Isn't What It's Cracked up to Be


On this, computational biologist Mike Paulin and OOO me agree. We had a nice conversation in Dunedin, New Zealand, in which it emerged that we both thought:

—mind is overrated by both AI and anti-AI theory, and consciousness, whatever it is, comes far more as a default for lifeforms than we think

—mind is extended, or in my terms interobjective (see Levi's many posts on this).

So Mike has this fabulous series of pages in which he generates algorithms for walking and other movements. In other words, when I walk across a surface, am I holding a picture in my mind, a representation of myself walking or of the terrain, or am I just trying not to fall? (There is a correct answer.)
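
A crude way to put the difference in code (a sketch of my own, not one of Mike's actual algorithms): a controller that only tries not to fall needs no picture of the walker or the terrain at all, just a reflex tied to how far it currently is from vertical.

# A minimal "just try not to fall" controller: an inverted pendulum kept
# upright by a reflex on tilt and tilt rate. There is no internal picture
# of the walker or the terrain, only the current error from vertical.
# (Illustrative only; the gains and dynamics are invented, not Paulin's.)

import math

GRAVITY = 9.81      # m/s^2
LENGTH = 1.0        # pendulum length, m
DT = 0.01           # timestep, s
KP, KD = 30.0, 8.0  # reflex gains on tilt and tilt rate

def step(angle, angular_velocity):
    """Advance one timestep; 'angle' is radians away from vertical."""
    # The reflex: push back in proportion to how wrong the posture is.
    torque = -KP * angle - KD * angular_velocity
    # Crude pendulum dynamics: gravity tips it over, the reflex corrects.
    angular_accel = (GRAVITY / LENGTH) * math.sin(angle) + torque
    angular_velocity += angular_accel * DT
    angle += angular_velocity * DT
    return angle, angular_velocity

if __name__ == "__main__":
    angle, velocity = 0.3, 0.0  # start mid-stumble, 0.3 rad off vertical
    for _ in range(500):        # five simulated seconds
        angle, velocity = step(angle, velocity)
    print(f"after 5 s: angle = {angle:.4f} rad")  # settles near zero

The point of the sketch is what is missing: no map, no plan, no self-representation, just an error signal and a push back toward upright.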

1 comment:

Bill Benzon said...

No time to really elaborate, but you need to look up Rodney Brooks. He made great strides in robotic AI (in the early-to-mid 80s?) when he jettisoned a lot of the symbolic, inferential "mind" baggage of first-generation AI and created insect-like bots that coped with their world by reacting to it as they knocked about in it, rather than by modeling it. I believe that a six-legged creature he called "Genghis" was a breakthrough.
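
Brooks's trick, the "subsumption architecture," was to stack simple reactive behaviors so that higher-priority ones can suppress lower ones, with no central world model anywhere. A toy sketch of the layering idea (an illustration only, not Brooks's actual code; the behaviors and sensor names are invented):

# Toy sketch of subsumption-style layering: simple behaviors stacked so
# that a higher-priority one can take over from the ones below it, and
# no world model anywhere. Behavior and sensor names are invented.

def wander(sensors):
    # Lowest layer: always proposes some forward motion.
    return {"forward": 1.0, "turn": 0.0}

def avoid(sensors):
    # Higher layer: if an obstacle is close, turn away; otherwise defer.
    if sensors.get("obstacle_distance", float("inf")) < 0.5:
        return {"forward": 0.0, "turn": 1.0}
    return None  # no opinion: let a lower layer act

LAYERS = [avoid, wander]  # ordered from highest priority to lowest

def act(sensors):
    for behavior in LAYERS:
        command = behavior(sensors)
        if command is not None:
            return command  # this layer subsumes everything below it

print(act({"obstacle_distance": 2.0}))  # clear path -> wander
print(act({"obstacle_distance": 0.2}))  # obstacle  -> avoid takes over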

Oh, and there's this fantastic TED video about a (Stanford?) roboticist/biologist building bots that mimic biological movement. Really cool stuff. Alas, I've forgotten the guy's name and have no URL.