When I was a kid I learned that our bodies are host to an enormous number (~10^14) of bacterial flora. After that I became sort of obsessed with the idea that, to those bacteria, my gut was what the earth is to us humans (a theme explored in greater depth in Futurama episode 52, season 4, “Godfellas” (2002), where a civilization of tiny aliens grows on Bender’s robotic body and worships him as a god).
I guess I never really stopped being interested in that idea, which is why I so enjoyed Marvin Minsky’s Society of Mind (1986).
Minsky’s idea is that the human mind can be modeled as groups of agents, each running a simple process, interacting to build higher-level cognition. He does tell us at the beginning of the book that he is not proposing how minds actually work, only how they could be modeled. Still, from a neurophysiological standpoint this is actually not so far off: as I have argued before, a complex system with numerous interacting agents exhibits unusual emergent properties (like swarm behavior or, that’s right, intelligence). Take the Blue Brain Project, a pretty cool working group out of the École Polytechnique Fédérale de Lausanne that is attempting to reverse-engineer the mammalian brain. They have already modeled the neocortical column, taking a single cell:
And building up to the entire column:
What I think is so cool is that this is at its core data visualization—they are modeling a complex system and then giving us an image of what happens when you turn it on. Taking a look at images of a real brain:
(via the incredible Flickr gallery by neurollero, here)
It seems that they are on the right track. Here is a slice of a rat brain:
(via Paul De Koninck Lab; a GFP-tagged protein in a hippocampal neuron.)
This gives you a good idea of how complex the interactions between neurons are, even in this tiny slice of the brain:
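The society-of-agents picture is easy to play with in code. Here is a minimal sketch (my own toy construction, not Minsky’s formalism): fifty agents, each following one simple local rule, with no agent ever seeing the global state, and yet a global consensus emerges from their interactions.

```python
import random

# A toy "society of agents" (an illustrative construction, not Minsky's formalism).
# Each agent holds a number and follows one simple local rule: nudge your
# value toward the average of yourself and two randomly chosen peers.
# No agent sees the global state, yet the group converges on a consensus.

def step(opinions):
    """One round of purely local interactions."""
    n = len(opinions)
    new = []
    for x in opinions:
        a, b = random.sample(range(n), 2)
        local_view = (x + opinions[a] + opinions[b]) / 3
        new.append(x + 0.5 * (local_view - x))  # move halfway toward the local view
    return new

random.seed(42)
agents = [random.uniform(0, 100) for _ in range(50)]
for _ in range(200):
    agents = step(agents)

spread = max(agents) - min(agents)
print(f"spread after 200 rounds: {spread:.2e}")  # near zero: consensus emerged
```

The interesting part is that “agreement” appears nowhere in the rule; it exists only at the level of the group, which is exactly the flavor of emergence Minsky is after.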
Returning to Minsky’s idea: is it philosophically reasonable to consider the brain as a series of agents clustering together to perform increasingly complicated tasks? To me, it is just a more elaborate iteration of any organic complex system: atoms combining to make molecules, cells acting together to form an organ, bacterial flora in the gut interacting to perform the complex function of supporting life. Evolution shows the same pattern in adaptive selection, exaptation (a shift in the function of a trait), cooption, and preadaptation (the reuse of an existing biological structure inherited from an ancestor that previously served a completely different function, like feathers on birds). Considering ever larger complex systems, you really start to respect Wolfram’s position on computational irreducibility (see my post here).
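Computational irreducibility has a tidy miniature illustration in Wolfram’s own canonical example, the elementary cellular automaton Rule 30 (the sketch below is mine). The update rule fits in one line, yet as far as anyone knows there is no shortcut for predicting a cell far in the future: the only way to know row n is to compute every row before it.

```python
# Wolfram's Rule 30 elementary cellular automaton: a trivial local rule
# whose long-run behavior (as far as anyone knows) cannot be predicted
# without simulating every intermediate step.

RULE = 30  # Wolfram's numbering: the rule's output table packed into one byte

def step(cells):
    """Apply Rule 30 to one row (cells beyond the edges are treated as dead)."""
    padded = [0] + cells + [0]
    out = []
    for i in range(1, len(padded) - 1):
        # Encode the 3-cell neighborhood as a number 0-7, then look up
        # the corresponding bit of RULE.
        neighborhood = (padded[i - 1] << 2) | (padded[i] << 1) | padded[i + 1]
        out.append((RULE >> neighborhood) & 1)
    return out

width = 63
row = [0] * width
row[width // 2] = 1           # start from a single live cell in the middle
for _ in range(20):           # to know row 20, you must compute rows 1..19
    print("".join("#" if c else "." for c in row))
    row = step(row)
```

Run it and the familiar chaotic Rule 30 triangle prints out, pixel by pixel; there is no closed-form expression waiting to replace the loop.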
Minsky, Marvin. 1986. The Society of Mind. New York: Simon and Schuster.