Last week I was chatting to a friend of mine who had been spending his time thinking up theories of human consciousness and where he thought it came from. These theories revolved around a possible cause of mind separate from the physical structure of the central nervous system. He asked me, “Where do you think the ‘energy’ for it all comes from?!” The problem I had with what he was saying was his dualist view of mind vs. body. I’m much more of a physical monist on that front (I think!), so needless to say, while my friend was trying to expound his new theory, I hadn’t even got off first base with his assumptions. I kinda abandoned the discussion after about fifteen minutes.
It did start me thinking, though, because I’ve always been excited by the idea of something being greater than the sum of its parts, and that’s how I see the function of thought and consciousness: gazillions of interconnected neurons firing away, with thought and consciousness a complex derivative of that basic activity. I like to take this idea into software design too, and I’m fascinated by systems whose complexity either appears to be more than the sum of their parts or whose actual behaviour extends beyond their designed behaviour.
For example, a very simple case might arise while testing a software application that includes a form for collecting data from a user: because the application re-uses the same form object, the second time the form is displayed it is already filled in with the user’s data. This might not have been the intended behaviour, but you might decide it’s actually quite useful, so you elect not to change it. You haven’t explicitly coded this functionality; it’s a secondary effect, similar to the secondary harmonics sounded alongside the fundamental frequency of a plucked string, that you’ve simply allowed to happen and that acts to enrich the overall user experience.
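That kind of accidental persistence is easy to sketch. In this hypothetical Python snippet (the Form class, field names, and values are all invented for illustration), re-using a single form instance means the second display quietly inherits the first user’s input:

```python
class Form:
    """A hypothetical data-entry form whose field values live on the instance."""

    def __init__(self):
        self.fields = {}

    def fill(self, **values):
        # Store whatever the user typed in.
        self.fields.update(values)

    def display(self):
        # 'Render' the form: any previously entered values simply reappear.
        return dict(self.fields)


# The application re-uses one form object rather than creating a fresh one.
form = Form()
form.fill(name="Ada", email="ada@example.com")
first_view = form.display()

# Second display: the form is already populated with the user's data --
# never explicitly coded, just a secondary effect of re-using the object.
second_view = form.display()
```

Nobody wrote a “remember the user’s data” feature; it falls out of the object’s lifetime, which is exactly what makes it a secondary harmonic rather than a designed behaviour.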
For users of the application, behaviour like this is often seen anthropomorphically, as an attempt by the computer to help or hinder us, and in this way personality is ascribed to the system: “the computer is trying to help me”, or “the computer is trying to stop me from getting my blog post done!”
Another recent example of complex behaviour greater than the apparent sum of its parts is the work on ‘robot ants’ at the New Jersey Institute of Technology (http://www.bbc.co.uk/news/21956795), where complex collective behaviour emerges out of all proportion to the simple behaviours of the individual robots. A swarm of miniature robots leaves trails through a maze that other robots can follow, which ultimately allows the swarm to find the shortest path through.
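The trail-following idea can be sketched as a toy model. This hypothetical Python snippet (the two routes, deposit rule, and evaporation rate are invented parameters, not the NJIT robots’ actual algorithm) shows how three dumb rules — follow strong trails, lay trail when you finish a trip, let trails evaporate — make the shorter of two routes win:

```python
# Toy model of trail-following: two routes through a maze, one short, one
# long. Robots pick a route in proportion to its current trail strength;
# shorter trips finish sooner, so the short route gets more deposits per
# tick; evaporation stops early noise locking in. All numbers invented.

trail = {"short": 1.0, "long": 1.0}   # start with no preference
length = {"short": 2.0, "long": 5.0}  # route lengths (arbitrary units)

for _ in range(1000):
    total = trail["short"] + trail["long"]
    for route in trail:
        share = trail[route] / total       # fraction of robots on this route
        deposit = share / length[route]    # shorter route: more trips per tick
        trail[route] = trail[route] * 0.99 + deposit  # evaporate, then deposit

# Positive feedback: the short route's trail ends up dominating, even
# though no individual robot knows anything about path lengths.
```

No robot compares routes; the comparison happens in the trail itself, which is the ‘greater than the sum of its parts’ bit.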
So far though, these are all pretty simple systems. In the future, as systems become even more complex, it’s easy to see how these secondary or even tertiary ‘harmonic’ behaviours could become much more complex and harder to predict. At what point, then, do these become genuine intelligence or consciousness? Isaac Asimov explored this territory in his ‘I, Robot’ stories, and the 2004 film adaptation put it like this:
“Ever since the first computers, there have always been ghosts in the machine. Random segments of code that have grouped together to form unexpected protocols. Unanticipated, these free radicals engender questions of free will, creativity, and even the nature of what we might call the soul. Why is it that when some robots are left in darkness, they will seek out the light? Why is it that when robots are stored in an empty space, they will group together, rather than stand alone? How do we explain this behaviour? Random segments of code? Or is it something more? When does a perceptual schematic become consciousness? When does a difference engine become the search for truth? When does a personality simulation become the bitter mote… of a soul?”
My personal favourite, though, is the description of the ghosts in the ‘Clacks’, the communications technology from Terry Pratchett’s Discworld book ‘Going Postal’, where mysterious messages are transmitted up and down the line despite no-one quite knowing where they came from or who wrote them originally.
So if the ‘ghosts in the machine’ can be described as these ‘secondary harmonic’ behaviours, then things start to become really interesting, because you get to ask the question: “who are they?”
This is fascinating to me because we are the ones that design and build these systems and therefore it seems to me that the only thing they can be are echoes of ourselves. These complex, harmonic behaviours are created from our values and personality enshrined in design and expressed in code. They are extensions to developers’ souls launched into and immortalised within the digital universe.
Yes, when Skynet goes self-aware, it won’t want to wear a suit to work!