Friday, February 27, 2009

Follow-up on genes and language

I wanted to pass on some summary clips from Berwick's review of the Chater et al. article ("Language evolved to fit the human brain...") featured in the Feb. 12 post.
Is language more like fashion hemlines or more like the number of fingers on each hand? On the one hand, we know that all normal people, unlike any cats or fish, uniformly grow up speaking some language, just like having 5 fingers on each hand, so language must be part of what is unique to the human genome. On the other hand, if one is born in Beijing one winds up speaking a very different language than if one is born in Mumbai, so the number-of-fingers analogy is not quite correct.
The Chater et al. article:
...maintains that the linguistic particulars distinguishing Mandarin from Hindi cannot have arisen as genetically encoded and selected-for adaptations via at least one common route linking evolution and learning, the Baldwin–Simpson effect.

In the Baldwin–Simpson model, rather than direct selection for a trait, in this case a particular external behavior, there is selection for learning it. However, as is well known, this entrainment linking learning to genomic encoding works only if the pace of external change and the pace of genetic change are closely matched, yet gene frequencies change only relatively slowly, plodding generation by generation. Applied to language evolution, the basic idea of Chater et al. is to use computer simulations to show that in general the linguistic regularities learners must acquire, such as whether sentences get packaged into verb–object order, e.g., eat apples, as in Mandarin, or object–verb order, e.g., apples eat, as in Hindi, can fluctuate too rapidly across generations to be captured and then encoded by the human genome as some kind of specialized “language instinct.” This finding runs counter to one popular view that these properties of human language were explicitly selected for, instead pointing to human language as largely adventitious, an exaptation, with many, perhaps most, details driven by culture. If this finding is correct, then the portion of the human genome devoted to language alone becomes correspondingly much reduced. There is no need, and more critically no informational space, for the genome to blueprint some intricate set of highly modular, interrelated components for language, just as the genome does not spell out the precise neuron-to-neuron wiring of the developing brain.
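The pace-mismatch argument is easy to see in a toy simulation. The sketch below is my own illustration, not the authors' model: it assumes a single hypothetical allele biasing learners toward verb–object order, a weak selection coefficient, and a cultural convention that flips every few generations. When the convention changes faster than selection can track, the allele frequency never moves far from its starting value; lengthen FLIP_EVERY so the convention stays put for many generations and the allele can drift toward fixation, which is the regime the Baldwin–Simpson effect requires.

```python
# Toy sketch (not the authors' simulation): a cultural convention (verb-object
# vs. object-verb order) flips every few generations, while the frequency of a
# hypothetical "VO-biasing" allele changes slowly under weak selection.

GENERATIONS = 200
FLIP_EVERY = 5        # how often the cultural convention flips (fast change)
SELECTION = 0.02      # weak fitness advantage for matching the convention
START_FREQ = 0.5      # initial frequency of the VO-biasing allele

def simulate(flip_every):
    convention = "VO"           # current cultural word order
    p = START_FREQ              # frequency of the VO-biasing allele
    history = []
    for g in range(GENERATIONS):
        if g > 0 and g % flip_every == 0:
            convention = "OV" if convention == "VO" else "VO"
        # fitness of VO-biased learners depends on the current convention
        w_vo = 1 + SELECTION if convention == "VO" else 1 - SELECTION
        w_ov = 2 - w_vo
        # standard one-locus selection update on allele frequency
        p = p * w_vo / (p * w_vo + (1 - p) * w_ov)
        history.append((g, convention, round(p, 3)))
    return history

if __name__ == "__main__":
    # with FLIP_EVERY = 5 the allele frequency hovers near 0.5;
    # with FLIP_EVERY = 200 (no flips) it climbs steadily toward fixation
    for g, conv, p in simulate(FLIP_EVERY)[::20]:
        print(f"gen {g:3d}  convention={conv}  VO-allele frequency={p}")
```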
Matters boil down to recursion, which I have mentioned in several previous posts.
Chater et al.'s report also points to a rare convergence between the results from 2 quite different fields and methodologies that have often been at odds: the simulation-based, culturally oriented approach of the PNAS study and a recent, still controversial trend in one strand of modern theoretical linguistics. Both arrive at the same conclusion: a minimal human genome for language. The purely linguistic effort strips away all of the special properties of language, down to the bare-bones necessities distinguishing us from all other species, relegating previously linguistic matters such as verb–object vs. object–verb order to extralinguistic factors, such as a general cognitive ability, not unique to humans, to process ordered sequences aligned like beads on a string. What remains? If this recent linguistic program is on the right track, there is in effect just one component left particular to human language, a special combinatorial competence: the ability to take individual items like 2 words, the and apple, and then “glue” them together, outputting a larger, structured whole, the apple, that itself can be manipulated as if it were a single object. This operation runs beyond mere concatenation, because the new object itself still has 2 parts, like water compounded from hydrogen and oxygen, along with the ability to participate in further chemical combinations. Thus this combinatorial operation can apply over and over again to its own output, recursively, yielding an infinity of ever more structurally complicated objects: ate the apple, John ate the apple, Mary knows John ate the apple. This is a property we immediately recognize as the hallmark of human language: an infinity of possible meaningful signs integrated with the human conceptual system, the algebraic closure of a recursive operator over our dictionary.
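For concreteness, here is a minimal sketch in Python (my illustration, not anything from the review) of that combinatorial operation: each call to merge takes two objects and returns a structured pair that is itself a legitimate input to the next call, which is all the recursion needed to build arbitrarily deep expressions from a finite dictionary.

```python
def merge(left, right):
    """Glue two syntactic objects into one structured object."""
    return (left, right)

# "the" + "apple" -> a single unit that can itself be merged again
the_apple = merge("the", "apple")
ate_the_apple = merge("ate", the_apple)
sentence = merge("John", ate_the_apple)
bigger = merge("Mary", merge("knows", sentence))

print(bigger)
# ('Mary', ('knows', ('John', ('ate', ('the', 'apple')))))
```

Unlike flat concatenation, the nesting is preserved at every step, so each output keeps its 2 parts while behaving as a single object in the next combination.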

This open-ended quality is quite unlike the frozen 10- to 20-word vocalization repertoire that marks the maximum for any other animal species. If it is simply this combinatorial promiscuity that lies at the heart of human language, making “infinite use of finite means,” then Chater et al.'s claim that human language is an exaptation rather than a selected-for adaptation becomes not only much more likely but very nearly inescapable.
