Some psychologists believe, based on empirical research, that people tend first to make a decision intuitively and then afterwards find a way to provide logical justification for why it was a good decision. I think they use the term "heuristic" to describe an analog process in which we use experience, memory, and pattern recognition as tools with which to make that initial intuitive decision. If this description of how we decide reflects how our minds actually work, what are the implications for philosophy, which seems to assume that our decision-making process is rational? Isn't the "rational" part of our brain a fairly late evolutionary development, grafted on top of an older nervous system?

If the evidence favors the view that we don't always make decisions by reasoning, then philosophy needn't disagree. If the truth of the matter were that all of our decisions—including decisions about which views are more plausible—amounted to post-hoc "rationalizations," then it's hard to see how philosophy as we usually understand it would be possible. But the evidence doesn't come close to showing that. Anyone seriously engaged in doing philosophy implicitly assumes that s/he is capable of giving reasons and being swayed by them. But that's different from assuming that we always exercise that capacity or that it never misfires.

A related thought: even if the reasons we give are often after-the-fact rationalizations, it wouldn't follow that our decision or our belief is unreasonable. The underlying mechanism that brought us to our decision or belief may be well-tuned and suited to the task it was performing, even if we have little or no conscious access to how the mechanism really works. Being reasonable doesn't require being able to give an explicit, articulate account of one's reasons. Indeed, it's not at all unusual for someone to have sound judgment about one sort of thing or another and yet not be good at putting the basis for the judgment into words. (Shopworn example: being good at telling whether something is grammatical is one thing; being able to explain or defend the judgment is another.)

Still, it's tempting to assume that when we're doing philosophy, conscious reasoning isn't just an incidental part of the process but is the most important part of the story. And so there's an interesting meta-level question here. If we're just as prone to after-the-fact rationalization when we're doing philosophy as we are in other circumstances, how should this affect our conception of what doing philosophy amounts to? I think you're onto an interesting issue, and I'd be wary of people who offer glib answers. That said, the question isn't really just about philosophy. It seems equally important for science, and for a good many other activities. In the case of science, one common reply is that what's important is not so much the rationality of individual scientists but of the overall enterprise. On this view, science is essentially a social activity and knowledge emerges from something like the wisdom of the group. We're a little less inclined to think of philosophy that way, but maybe we should.

Read another response by Allen Stairs