Cornell humanists strive to understand the mind

Is the mind distinct from the brain? Is there, in fact, a soul directing our thoughts, or are they determined entirely by our biology? Before there was cognitive science, before there was neurobiology – before there was even biology – humanists wrestled with these questions.

On Feb. 22, the College of Arts and Sciences brought together faculty members working on philosophy of mind in a Big Ideas panel, part of the New Century for the Humanities celebration. The event was held in the Groos Family Atrium in Klarman Hall and featured Karen Bennett, professor of philosophy; Morten Christiansen, professor of psychology and co-director of the cognitive science program; and Laurent Dubreuil, professor of Romance studies and comparative literature.

As Bennett noted at the event, scholars of philosophy of mind in the West have traditionally fallen on one side or the other of the mind-body question.

Dualists say that there is something special about the mind – it’s not just an incredibly interesting and complicated machine. Trees and tables and billiard balls can be explained by physics and biology, but you need to add something extra, some nonphysical property, to explain human consciousness. Dualists would say the mind would function just as it does whether or not it has a body.

On the other side are physicalists. Most philosophers today find the physicalist explanation more compelling: that mental phenomena have a physiological or neurophysiological basis. Even that simple statement, though, raises more questions. For example, if physicalism holds that everything is physical, what does “physical” mean? It can’t mean tangible, since gravity is a physical force but it can’t be touched; rather, says Bennett, it means that “everything is accounted for, or generated by, the kinds of things physicists talk about.” Physicists might not discuss chairs very often, but chairs are composed of the kinds of things physics talks about, like atoms and molecules.

But how do you get belief and thought out of a lump of brain?

“This is a particularly hard version of the question of how ordinary middle-size objects arise from the interaction of atoms and molecules,” explains Bennett. “People can see that if you were smart enough and had all the relevant information about atoms and electrons, you could understand how they become a car. But people find it particularly perplexing to see how the physical goings-on really could explain and generate conscious experience.”

But Dubreuil says, “Whatever your position about dualism or physicalism, a mind is more than a brain. Not only are mental operations ‘extended’ beyond the nervous system and outsourced to books, objects or computers – but they also occur ‘out there,’ especially when we share language.”

Consciousness is a topic in philosophy of mind that has attracted considerable attention in recent years. It’s a familiar phenomenon, the most intimate thing we experience – everything looks a certain way and feels a certain way to us. Yet ever since Sigmund Freud, it’s become common to believe there’s a great deal of subconscious or unconscious processing going on beneath our awareness. “For example, you might not be conscious of your anger toward your father, but it’s still having an effect,” says Derk Pereboom, the Susan Linn Sage Professor in Philosophy and Ethics and the Stanford H. Taylor ’50 Chair of the Sage School.

Cognitive scientists approach these problems by viewing the mind as similar to a computer. But viewing the mind this way also raises issues. As Will Starr, assistant professor of philosophy, notes, there’s very little work possible in artificial intelligence or psychology that doesn’t connect with big, hard philosophical questions, which may explain why his Mind & Machines course is so popular. Since its inception, the course has grown from 15 students to 70; this semester, students from every college at the university have enrolled.

The course explores questions like whether it is plausible that brain cells compute and whether a computer could ever have a mind, beliefs, emotions and conscious experiences. “One thing that makes the class fun to teach is the continuing technological innovation that brings about machines and software that previous scholars couldn’t have envisioned,” says Starr.

Linda B. Glaser is a staff writer for the College of Arts and Sciences.
