
Mind

I want to compare the human mind to a computer program for the sake of this question. In a computer program, if a circumstance occurs that the machine cannot process due to a fault in the code, a lack of processing power, or any number of reasons, the program will error out. It can have many symptoms: a frozen program interface, the dreaded blue screen of death, or a simple restart. But either way the program ceases to function. (Of course, there are nifty programmers out there who protect against simple errors by allowing a tolerated number of them to go unnoticed if they don't impede the overall abilities of the program.) What I want to know is how our mind deals with these errors. What stops us from running infinite loops that stall out our minds and render us slobbering piles of useless flesh? When we are confronted with something that our brain cannot understand or grasp or comprehend, how do we cope? Or is there a limit beyond which we cease to function?
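
For readers who want a concrete picture of the error tolerance mentioned in the question, here is a minimal sketch in Python. It is purely illustrative (none of the names come from the original exchange): a task runner that catches a bounded number of faults and only gives up once that budget is exhausted.

    # Hypothetical sketch: a task runner that tolerates a limited number of
    # errors before giving up, rather than crashing on the first fault.

    MAX_TOLERATED_ERRORS = 3  # assumed error budget, chosen for illustration

    def run_tasks(tasks):
        errors = 0
        for task in tasks:
            try:
                task()
            except Exception as exc:
                errors += 1
                print(f"ignored error #{errors}: {exc}")
                if errors > MAX_TOLERATED_ERRORS:
                    # Too many faults: the program "ceases to function".
                    raise RuntimeError("error budget exhausted") from exc

    # Two tasks fail here, but the run continues because the budget allows it.
    run_tasks([lambda: 1 / 1,
               lambda: 1 / 0,
               lambda: [][0],
               lambda: print("still running")])
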
Accepted:
August 14, 2007

Comments

Allen Stairs
August 23, 2007

An intriguing puzzle. The first point is that insofar as it's a question about how our minds actually work, it's an empirical matter, and the answer depends on the facts. But there's a design-level issue here (which I'm hoping my better-informed colleagues might chime in on). Suppose we have a complicated program that's broken up into sub-programs, or modules. And suppose that there is one module whose job it is to monitor what's going on in various other modules and stop them if they appear to be running amok. Perhaps, for example, this monitor module will kill a process if it has cycled through a million iterations without halting.
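
To make that design-level idea concrete, here is a minimal sketch in Python. It is my own illustration rather than anything from the reply itself: a "monitor" that drives a module step by step and kills it once it has cycled through a million iterations without halting.

    # Hypothetical sketch of a monitor module: it drives a step function and
    # aborts the "process" if it exceeds an iteration budget without halting.

    ITERATION_LIMIT = 1_000_000  # the million-iteration threshold from the reply

    class ProcessKilled(Exception):
        """Raised by the monitor when a module appears to be running amok."""

    def monitored_run(step, is_done):
        """Call step() repeatedly; stop when is_done() is true, or abort
        once the iteration budget is exhausted."""
        for iteration in range(ITERATION_LIMIT):
            if is_done():
                return iteration  # the module halted on its own
            step()
        raise ProcessKilled("iteration budget exceeded; monitor killed the module")

    # A well-behaved module halts after ten steps; a runaway one never would.
    state = {"count": 0}
    monitored_run(lambda: state.update(count=state["count"] + 1),
                  lambda: state["count"] >= 10)
    # monitored_run(lambda: None, lambda: False)  # would raise ProcessKilled

Nothing in an actual brain is likely to be this tidy, of course; the sketch only illustrates the watchdog idea described above.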

You no doubt get the idea (and may well have thought of it yourself). If a system is modular enough, and if it has enough safeguards, redundancies, monitoring modules and so on built in, then the chance that it will just go nuts might be small. And so if the mind is essentially a computer, it may be that millions of years of evolution have built it in this sort of way. The idea that the mind is highly modular is one that a good many people take seriously. It's a design principle that we can imagine evolution favoring exactly because it provides relatively simple ways of avoiding catastrophe.

But we can also ask whether the way actual minds work leaves them susceptible to the wetware equivalent of the blue screen of death. Once again, that's an empirical question. We might wonder: would it make sense to understand certain forms of mental illness along these lines? The idea of a program getting caught in a loop might be a useful way of thinking about certain forms of obsessive-compulsive disorder, for example. But then it might not. It may be that the mechanism underlying OCD is very different; sorting that out is a job for psychology and neuroscience.

If you're looking for interesting things to read that bear on this question, you might try looking at Chapter 4 (and other parts) of Daniel Dennett's Brainstorms, where he discusses a creature -- the Sphex Wasp -- that's clearly capable of getting caught in a loop. Dennett suggests that in principle, the same could be true of us. But you might also want to look at Richard Double's "Fear of Sphexishness" in the journal Analysis, v. 48 no. 1, 1988.
