Wednesday, July 23, 2014

Why is melody in the high notes and rhythm in the bass?

Hove et al. examine the extent to which musical convention might be shaped by basic human auditory physiology.
Across cultures, polyphonic music most often conveys melody in higher-pitched sounds and rhythm in lower-pitched sounds. They show that, when two streams of tones are presented simultaneously, the brain better detects timing deviations in the lower-pitched than in the higher-pitched stream and that tapping synchronization to the tones is more influenced by the lower-pitched stream. Furthermore, their modeling reveals that, with simultaneous sounds, superior encoding of timing for lower sounds and of pitch for higher sounds arises early in the auditory pathway in the cochlea of the inner ear. Thus, these musical conventions likely arise from very basic auditory physiology.
The abstract:
The auditory environment typically contains several sound sources that overlap in time, and the auditory system parses the complex sound wave into streams or voices that represent the various sound sources. Music is also often polyphonic. Interestingly, the main melody (spectral/pitch information) is most often carried by the highest-pitched voice, and the rhythm (temporal foundation) is most often laid down by the lowest-pitched voice. Previous work using electroencephalography (EEG) demonstrated that the auditory cortex encodes pitch more robustly in the higher of two simultaneous tones or melodies, and modeling work indicated that this high-voice superiority for pitch originates in the sensory periphery. Here, we investigated the neural basis of carrying rhythmic timing information in lower-pitched voices. We presented simultaneous high-pitched and low-pitched tones in an isochronous stream and occasionally presented either the higher or the lower tone 50 ms earlier than expected, while leaving the other tone at the expected time. EEG recordings revealed that mismatch negativity responses were larger for timing deviants of the lower tones, indicating better timing encoding for lower-pitched compared with higher-pitched tones at the level of auditory cortex. A behavioral motor task revealed that tapping synchronization was more influenced by the lower-pitched stream. Results from a biologically plausible model of the auditory periphery suggest that nonlinear cochlear dynamics contribute to the observed effect. The low-voice superiority effect for encoding timing explains the widespread musical practice of carrying rhythm in bass-ranged instruments and complements previously established high-voice superiority effects for pitch and melody.
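The timing-deviant paradigm described above is easy to picture in code. The sketch below generates a simplified version of the stimulus: an isochronous stream of simultaneous low and high sine tones in which one low tone arrives 50 ms early while the high tone stays on time. All parameter values (tone frequencies, inter-onset interval, durations, sample rate) are illustrative assumptions, not the exact stimuli used by Hove et al.

```python
import numpy as np

def make_stimulus(n_beats=16, ioi=0.5, deviant_idx=8, shift=0.05,
                  f_low=220.0, f_high=440.0, tone_dur=0.1, sr=8000):
    """Build an isochronous two-stream tone sequence with one timing deviant.

    The low tone at beat `deviant_idx` is presented `shift` seconds early;
    the high tone at that beat stays at its expected time (as in the
    low-deviant condition of the paradigm). Returns the mixed waveform
    plus the actual onset times of each stream.
    """
    signal = np.zeros(int(sr * (n_beats * ioi + 1)))
    t = np.arange(int(sr * tone_dur)) / sr
    low_tone = 0.5 * np.sin(2 * np.pi * f_low * t)
    high_tone = 0.5 * np.sin(2 * np.pi * f_high * t)

    onsets_low, onsets_high = [], []
    for i in range(n_beats):
        expected = i * ioi
        t_low = expected - shift if i == deviant_idx else expected
        t_high = expected  # the other stream remains on the beat
        for onset, tone, record in ((t_low, low_tone, onsets_low),
                                    (t_high, high_tone, onsets_high)):
            start = int(sr * onset)
            signal[start:start + tone.size] += tone  # mix streams additively
            record.append(onset)
    return signal, onsets_low, onsets_high

signal, onsets_low, onsets_high = make_stimulus()
```

In the paper's design the same manipulation is also applied to the higher tone in separate trials; swapping the roles of `t_low` and `t_high` in the loop would produce that condition.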
