Friday, February 22, 2008

Languages shape the nuts and bolts of our perception.

The debate over whether language nudges the way we actually see the world is being resolved, and the prevailing dogma - that basic parts of perception are too low-level, too hard-wired, too constrained by the constants of physics and physiology to be affected by language - is breaking down. Lera Boroditsky at Stanford comments:

I used to think that languages and cultures shape the ways we think. I suspected they shaped the ways we reason and interpret information. But I didn't think languages could shape the nuts and bolts of perception, the way we actually see the world. That part of cognition seemed too low-level, too hard-wired, too constrained by the constants of physics and physiology to be affected by language.
Then studies started coming out claiming to find cross-linguistic differences in color memory. For example, it was shown that if your language makes a distinction between blue and green (as English does), then you're less likely to confuse a blue color chip with a green one in memory. In a study like this, you would see a color chip; it would then be taken away, and after a delay you would have to decide whether another color chip was identical to the one you had seen.
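To make the paradigm concrete, here is a minimal sketch of one such trial. The color values, timing, and prompt are my own illustrative assumptions, not details taken from the actual studies:

```python
# A minimal sketch of one delayed match-to-sample color trial, as described
# above. Colors, timing, and prompt are illustrative assumptions.
import random
import time

# Two nearby hues that are easy to confuse in memory (illustrative RGB values).
COLOR_CHIPS = [("green", (0, 153, 76)), ("blue", (0, 102, 204))]

def run_trial(delay_s=5.0):
    """Study a chip, remove it, then judge whether a probe chip is identical."""
    _, study_rgb = random.choice(COLOR_CHIPS)
    print(f"Study chip: RGB {study_rgb}")   # participant views the chip
    time.sleep(delay_s)                     # retention interval; the chip is gone
    _, probe_rgb = random.choice(COLOR_CHIPS)
    print(f"Probe chip: RGB {probe_rgb}")
    answer = input("Identical to the first chip? (y/n): ").strip().lower()
    return (answer == "y") == (study_rgb == probe_rgb)  # True if judged correctly

if __name__ == "__main__":
    print("Correct!" if run_trial() else "Wrong.")
```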
Of course, showing that language plays a role in memory is different from showing that it plays a role in perception. Things often get confused in memory, and it's not surprising that people may fall back on information available in language as a second resort. But that doesn't mean speakers of different languages actually see the colors differently as they are looking at them. I thought that if you designed a task in which people could see all the colors as they were making their decisions, then there wouldn't be any cross-linguistic differences.
I was so sure that language couldn't shape perception that I went ahead and designed a set of experiments to demonstrate this. In my lab we jokingly referred to this line of work as "Operation Perceptual Freedom." Our mission: to free perception from the corrupting influences of language.
We did one experiment after another, and each time, to my surprise and annoyance, we found consistent cross-linguistic differences. They were there even when people could see all the colors at the same time while making their decisions. They were there even when people had to make objective perceptual judgments. They were there when no language was involved or necessary in the task at all. They were there when people had to respond very quickly. We just kept seeing them over and over again, and the only way to get the cross-linguistic differences to go away was to disrupt the language system. If we stopped people from being able to fluently access their language, then the cross-linguistic differences in perception went away.
I set out to show that language didn't affect perception, but I found exactly the opposite. It turns out that languages meddle in very low-level aspects of perception and, without our knowledge or consent, shape the very nuts and bolts of how we see the world.
Comments:
"If we stopped people from being able to fluently access their language, then the cross-linguistic differences in perception went away."
Setting aside the question of how you stop people from doing this, I have to ask:
If you can remove a person's access to language, doesn't this still support the idea of hard-wired perceptions? Doesn't this simply make language a filter on our cognition, but not our perceptions?
I've got a big problem with that "experiment". Color, like sound, is something that most people don't have "perfect pitch" for as adults.
ReplyDelete"perfect pitch" with sounds and colors *can* be taught. The tonal languages of the Chinese have shown that they don't vary but 1/8th of a tone from day to day (or even years) for the same tonal word. They have perfect pitch.
The same can be done with light: exact wavelengths can be memorized just like exact pitches, if we practice enough.
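For a sense of scale, here is a quick back-of-the-envelope check of that "eighth of a tone" figure (my own calculation, assuming equal temperament, where a whole tone equals two semitones):

```python
# What does an eighth of a tone mean in frequency terms?
# Assumes equal temperament: one semitone = a ratio of 2**(1/12),
# and a whole tone = two semitones.
SEMITONE = 2 ** (1 / 12)            # frequency ratio of one semitone
EIGHTH_TONE = SEMITONE ** (2 / 8)   # 1/8 tone = 1/4 semitone = 2**(1/48)

print(f"ratio: {EIGHTH_TONE:.5f}")                     # ~1.01454
print(f"deviation: {(EIGHTH_TONE - 1) * 100:.2f}%")    # ~1.45%
print(f"at A440: ~{440 * (EIGHTH_TONE - 1):.1f} Hz")   # ~6.4 Hz
```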
But really... who in the world do you know that practices perfect frequency color matching? Not many people.
So using blue and green isn't a fair test. They're too close even for languages that *make* the distinction. In fact, how our eyes even *see* a color depends largely on the context in which we view it. There have been experiments at UW where a square looks yellow in one image and gray in another, yet it is actually the same shade of green in both.
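A small sketch of that kind of context effect (hypothetical stimuli in the spirit of those experiments, not the actual UW images): the same fixed RGB patch is drawn on two different surrounds and tends to look like two different colors.

```python
# Simultaneous-contrast demonstration: one fixed color patch on two
# different backgrounds. The specific colors are illustrative choices.
import matplotlib.pyplot as plt
import matplotlib.patches as patches

PATCH = (0.55, 0.60, 0.35)  # the same "shade of green" in both panels

fig, axes = plt.subplots(1, 2, figsize=(6, 3))
surrounds = [((0.95, 0.85, 0.20), "Yellowish surround"),
             ((0.45, 0.45, 0.50), "Gray-blue surround")]
for ax, (bg, title) in zip(axes, surrounds):
    ax.set_facecolor(bg)                     # background differs per panel
    ax.add_patch(patches.Rectangle((0.35, 0.35), 0.3, 0.3, color=PATCH))
    ax.set_title(title)
    ax.set_xticks([])
    ax.set_yticks([])
plt.show()
```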
I therefore can't agree with the conclusion.