I want to pass on three recent articles I found interesting. The first is a
piece by Kawakami in the New York Times on why we like sad music, summarizing
his article in Frontiers in Psychology. He and his collaborators found that when listening to sad music
...felt emotion did not correspond exactly to perceived emotion. Although the sad music was both perceived and felt as “tragic” (e.g., gloomy, meditative and miserable), the listeners did not actually feel the tragic emotion as much as they perceived it. Likewise, when listening to sad music, the listeners felt more “romantic” emotion (e.g., fascinated, dear and in love) and “blithe” emotion (e.g., merry, animated and feel like dancing) than they perceived. (Glinka's "La Séparation" is one of the pieces used).
They suggest this may have something to do with vicarious emotions:
...when we listen to sad music (or watch a sad movie, or read a sad novel), we are inoculated from any real threat or danger that the music (or movie or novel) represents...If this is true, what we experience when we listen to sad music might be thought of as “vicarious emotions.” Here, there is no object or situation that induces emotion directly, as in regular life. Instead, the vicarious emotions are free from the essential unpleasantness of their genuine counterparts, while still drawing force from the similarity between the two.
The second article,
by Leman et al., examines how music can entrain the speed of beat synchronized walking. Subjects walked to the rhythm of different musical pieces all having a tempo of 130 beats per minute and a meter of 4 beats. Some music was "activating" in that it increased stride length and distance covered, while "relaxing music" had the opposite effect. They suggest that recurrent patterns of fluctuation affecting the binary meter strength of the music may entrain the vigor of the movement, a relationship between entrainment and expressiveness that might lead to applications in sports and physical rehabilitation.
Finally,
Koelsch et al. offer an interesting examination of the processing of hierarchical syntactic structure in music:
Hierarchical structure with nested nonlocal dependencies is a key feature of human language and can be identified theoretically in most pieces of tonal music. However, previous studies have argued against the perception of such structures in music. Here, we show processing of nonlocal dependencies in music. We presented chorales by J. S. Bach and modified versions in which the hierarchical structure was rendered irregular whereas the local structure was kept intact. Brain electric responses differed between regular and irregular hierarchical structures, in both musicians and nonmusicians. This finding indicates that, when listening to music, humans apply cognitive processes that are capable of dealing with long-distance dependencies resulting from hierarchically organized syntactic structures. Our results reveal that a brain mechanism fundamental for syntactic processing is engaged during the perception of music, indicating that processing of hierarchical structure with nested nonlocal dependencies is not just a key component of human language, but a multidomain capacity of human cognition.