Time and Perception
This is part 3 of 5 in the series Brain - Time - Music - Computing.
Previous: The Brain in Middle World
Next: Time and the Brain
Scales of time in MW play a crucial role in how the brain recognizes and interprets temporal patterns as symbolic relationships such as causality and synchrony.
Events that are perceived as shortly following each other in time tend to be interpreted as causally related. Brains learn the range of latencies that may separate an action from the perception of its effect in MW. The quantitative characterization of acceptable latencies is crucial to the understanding of interaction. Human-computer interaction researchers [4][1] categorize acceptable time delays into three orders of magnitude, which coincide with Newell’s cognitive band in his time scale of human actions [5]: the 0.1s (100ms) scale characterizes perceptual processing and perceived instantaneous reaction; the 1s scale characterizes immediate response and continuous flow of thought (consistent with the notion of the psychological present [3]); and the 10s scale characterizes unit tasks and continued, sustained attention. These orders of magnitude define relatively narrow ranges of applicability for different levels of cognitive activity, especially when compared with the longer time ranges that characterize other activities in MW: the rational band (100-10,000s, i.e. minutes to hours), the social band (100,000-10,000,000s, i.e. days to months) and the historical band (100,000,000-10,000,000,000s, i.e. years to millennia).
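The band boundaries above are simple enough to state as code. The following Python sketch (an illustration of the taxonomy, not part of the cited work; the function name and band labels are my own) maps a duration in seconds to the band it falls in:

```python
# Classify a duration (in seconds) into the time bands discussed above:
# Newell's cognitive band (0.1s / 1s / 10s scales), then the rational,
# social, and historical bands. Boundaries are the orders of magnitude
# quoted in the text.

def time_band(delay_s: float) -> str:
    """Map a duration in seconds to its approximate band."""
    if delay_s < 1:           # ~0.1s scale: perceptual processing
        return "perceptual (instantaneous reaction)"
    if delay_s < 10:          # ~1s scale: immediate response
        return "immediate response (psychological present)"
    if delay_s < 100:         # ~10s scale: unit tasks
        return "unit task (sustained attention)"
    if delay_s < 1e4:         # minutes to hours
        return "rational band"
    if delay_s < 1e7:         # days to months
        return "social band"
    return "historical band"  # years to millennia

print(time_band(0.08))   # an 80ms user-interface response
print(time_band(3600))   # one hour
```

Note that the cognitive band spans only three orders of magnitude, while the historical band alone spans more than two.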
Ensemble musical performance requires both interaction and synchronization. Synchrony, defined as the exact co-occurrence of several observations, is a perceptual abstraction. The different, and finite, speeds at which light and sound travel imply that events whose percepts occur in absolute synchrony would hardly ever have occurred synchronously in MW, or would never have occurred naturally at all. For example, explosions in movies usually play as if sound and light traveled through air at the same speed. In MW, sound travels quite slowly, about 33cm in 1ms, whereas light travels so fast that its travel times are negligible. Events that do occur simultaneously cause in an observer a variety of percepts that are not received synchronously, but that exhibit specific - and predictable - temporal patterns, which brains learn to understand as signs of a common cause, and thus of a synchronous origin in MWT. This is especially relevant to making (and enjoying) music, due to the relatively low speed of sound in air.
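The light-sound asymmetry is easy to quantify. A minimal sketch, using the approximate figures above (sound at ~330 m/s, i.e. ~33cm per 1ms; light effectively instantaneous at MW scales) and a function name of my own:

```python
# Arrival-time offset between the light and sound percepts of a single
# event at distance d: both percepts share a cause, yet reach the
# observer at different times.

SPEED_OF_SOUND = 330.0   # m/s in air, approximate
SPEED_OF_LIGHT = 3.0e8   # m/s

def percept_offset_ms(distance_m: float) -> float:
    """Milliseconds by which the sound percept lags the light percept."""
    sound_delay = distance_m / SPEED_OF_SOUND
    light_delay = distance_m / SPEED_OF_LIGHT  # negligible in MW
    return (sound_delay - light_delay) * 1000.0

# An explosion 100m away: the flash arrives "now", the bang ~300ms later.
print(round(percept_offset_ms(100.0)))
```

Despite the ~300ms lag, an observer perceives one event with a single, synchronous origin: the brain has learned the pattern.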
Fraisse offers a comprehensive analysis of psychophysical experiments that aim to characterize the perceptual limits of time properties: event succession and duration [3]. Exact numbers depend on many factors, including task modality, and stimulus type, intensity, and duration. Pierce places the time resolution of the ear on the order of 1ms [6], which leads him to question whether humans’ acute time resolution is actually of any use in music. Certainly the brain perceives as simultaneous auditory events that the ear detects as distinct. Pierce cites, among others, an experiment by Rasch, which revealed that synchronization in performed small-ensemble music is only accurate to 30-50ms [7]. Incidentally, this range also characterizes the travel time of sound between opposite ends of an orchestral stage. Each individual musician in the orchestra therefore experiences the performance in a necessarily specific and unique way. Yet the musicians are collectively capable, under the direction of the conductor, of producing an expert and consistent ensemble rendering of a musical piece. Experiments by Chew et al. [2] on sound latency in ensemble performance have shown that, under favorable conditions, professional musicians can deliver a meaningful musical performance while experiencing delays as high as 65ms. This number recalls the experimental fact, also reported by Pierce, that humans perceive no echo when a strong reflected sound arrives within 60-70ms of the direct sound.
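The remark about the orchestral stage is a back-of-envelope calculation worth making explicit. A sketch (my own, assuming sound at ~330 m/s as quoted earlier) of the distance sound covers in Rasch's 30-50ms window:

```python
# Distance sound travels in a given time: the 30-50ms ensemble
# asynchrony measured by Rasch corresponds to roughly 10-17m of air,
# about the span of an orchestral stage.

SPEED_OF_SOUND = 330.0  # m/s in air, approximate

def distance_for_delay(delay_ms: float) -> float:
    """Distance in meters that sound covers in delay_ms milliseconds."""
    return SPEED_OF_SOUND * delay_ms / 1000.0

print(distance_for_delay(30))  # lower bound of Rasch's range
print(distance_for_delay(50))  # upper bound
```

In other words, musicians at opposite ends of a stage hear each other with delays of the same order as the asynchrony of their playing.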
These experimental results illustrate the impressive plasticity of brain processes with respect to the perception of MWT. Abstract temporal concepts, such as synchrony, bear limited relevance to the modeling of the MW perceptual and cognitive phenomena that inspired their invention.
References
[1] Stuart K. Card, George G. Robertson, and Jock D. Mackinlay. The information visualizer, an information workspace. In Proceedings of the SIGCHI Conference on Human Factors in Computing Systems (CHI), pages 181–186, 1991.
[2] E. Chew, A.A. Sawchuk, R. Zimmermann, V. Stoyanova, I. Tosheff, C. Kyriakakis, C. Papadopoulos, A.R.J. Francois, and A. Volk. Distributed Immersive Performance. In Proceedings of the 2004 Annual NASM Meeting, San Diego, CA, USA, November 2004.
[3] Paul Fraisse. Perception and estimation of time. Annual Review of Psychology, 35:1–36, 1984.
[4] Robert B. Miller. Response time in man-computer conversational transactions. In Proceedings of the AFIPS Fall Joint Computer Conference, volume 33, pages 267–277, 1968.
[5] Allen Newell. Unified Theories of Cognition. Harvard University Press, 1990.
[6] John R. Pierce. The nature of musical sound. In Diana Deutsch, editor, The Psychology of Music, Second Edition (Cognition and Perception), pages 1–24. Academic Press, 1998.
[7] R. A. Rasch. Synchronization in performed ensemble music. Acustica, 43:121–131, 1979.