Sensory cue


In perceptual psychology, a sensory cue is a statistic or signal that a perceiver can extract from sensory input and that indicates the state of some property of the world the perceiver is interested in perceiving.

A cue is some organization of the data present in the signal that allows for meaningful extrapolation. Examples include visual cues, auditory cues, haptic cues, olfactory cues and environmental cues. Sensory cues are a fundamental part of theories of perception, especially theories of appearance (how things look).

Concept


Two primary sets of theories describe the role of sensory cues in perception. One set is based on the Constructivist theory of perception, while the other is based on the Ecological theory.

Basing his views on the Constructivist theory of perception, Helmholtz (1821–1894) held that the visual system constructs visual percepts through a process of unconscious inference, in which cues are used to make probabilistic inferences about the state of the world. These inferences are based on prior experience, assuming that the most commonly correct interpretation of a cue will continue to hold true.[1] A visual percept is the final manifestation of this process. Brunswik (1903–1955) later formalized these concepts with the lens model, which breaks the system's use of a cue into two parts: the ecological validity of the cue, which is its likelihood of correlating with a property of the world, and the system's utilization of the cue.[2] In these theories, accurate perception requires both that cues with sufficiently high ecological validity exist to make inference possible, and that the system actually utilizes these cues appropriately during the construction of percepts.
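
Brunswik's lens model is often quantified with correlations: the ecological validity of a cue is its correlation with the property of the world it signals, and cue utilization is its correlation with the perceiver's judgment. The following minimal sketch computes those two statistics over invented example data; the numbers are purely illustrative and not taken from the cited sources.

```python
# Sketch of the two lens-model statistics described above, computed as
# simple correlations over invented example data.
import numpy as np

# Hypothetical trials: the true state of the world, the value of one cue,
# and the perceiver's judgment on each trial.
world    = np.array([1.0, 2.0, 3.0, 4.0, 5.0])
cue      = np.array([1.1, 1.9, 3.2, 3.8, 5.1])
judgment = np.array([1.0, 2.5, 2.8, 4.2, 4.9])

ecological_validity = np.corrcoef(cue, world)[0, 1]     # how well the cue tracks the world
cue_utilization     = np.corrcoef(cue, judgment)[0, 1]  # how strongly the judgment follows the cue

print(ecological_validity, cue_utilization)
```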

A second set of theories was posited by Gibson (1904–1979), based on the Ecological theory of perception. These theories hold that no inferences are necessary for accurate perception. Rather, the visual system takes in sufficient cues about objects and their surroundings that a one-to-one mapping between the incoming cues and the environment they represent can be made. These mappings are shaped by certain computational constraints: traits known to be common in an organism's environment.[3] The ultimate result is the same: a visual percept is manifested by the process.

Cue combination is an active area of research in perception that seeks to understand how the brain combines information from multiple sources into a single perceptual experience or response. Recent cue recruitment experiments have shown that the adult human visual system can learn to utilize new cues through classical (Pavlovian) conditioning.
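
A standard formal model of cue combination (a common account in the literature, not necessarily the one used in the cue recruitment experiments above) is reliability-weighted averaging: each cue's estimate is weighted by the inverse of its variance, and the combined estimate is at least as reliable as the best single cue. A minimal sketch, with hypothetical cue values and variances:

```python
# Minimal sketch of reliability-weighted (maximum-likelihood) cue combination.
# Cue estimates and variances below are hypothetical.

def combine_cues(estimates, variances):
    """Combine independent cue estimates, weighting each by 1/variance."""
    weights = [1.0 / v for v in variances]
    total = sum(weights)
    combined_estimate = sum(w * e for w, e in zip(weights, estimates)) / total
    combined_variance = 1.0 / total  # never larger than the best single cue's variance
    return combined_estimate, combined_variance

# Hypothetical depth estimates (in cm) from a visual cue and a haptic cue:
estimate, variance = combine_cues(estimates=[10.0, 12.0], variances=[1.0, 4.0])
print(estimate, variance)  # 10.4 cm, combined variance 0.8
```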

Visual cues


Visual cues are sensory cues received by the eye in the form of light and processed by the visual system during visual perception. Since the visual system is dominant in many species, especially humans, visual cues are a large source of information in how the world is perceived.[4]

Types of cues


Depth


The ability to perceive the world in three dimensions and to estimate the size of and distance to an object depends heavily on depth cues. The two major depth cues, stereopsis and motion parallax, both rely on parallax, the difference between the perceived positions of an object seen from two different viewpoints. In stereopsis the distance between the eyes provides the two viewpoints, resulting in binocular disparity. Motion parallax relies on head and body movement to produce the necessary viewpoints.[5]
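
In a simplified pinhole model of stereopsis (a textbook approximation, not a formula from the cited source), the depth of a point is inversely proportional to its binocular disparity. The sketch below illustrates this relation; the baseline, focal length, and disparity values are assumed for illustration.

```python
# Sketch of depth from binocular disparity under a simple pinhole stereo model.
# Baseline, focal length, and disparity values are illustrative assumptions.

def depth_from_disparity(baseline_m, focal_px, disparity_px):
    """Depth is inversely proportional to disparity: Z = f * B / d."""
    return focal_px * baseline_m / disparity_px

# An object shifted 20 px between the two views, given a 6.3 cm baseline
# (roughly an interocular distance) and an 800 px focal length:
print(depth_from_disparity(baseline_m=0.063, focal_px=800, disparity_px=20))
# ~2.5 m; halving the disparity doubles the estimated depth.
```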

Motion


The visual system can detect motion both through a simple mechanism based on information from multiple clusters of neurons and by integrating multiple cues, including contrast, form, and texture. One major source of visual information for determining self-motion is optic flow, which indicates not only whether an agent is moving but also in which direction and at what relative speed.
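
One quantity that can be read directly from optic flow is time to contact: for an object approaching at constant speed, the ratio of its current angular size to the rate at which that angle expands approximates the time remaining before impact. The sketch below is illustrative only, with invented numbers; it is not a model of the neural mechanisms described above.

```python
# Sketch of the time-to-contact ("tau") estimate available from optic flow:
# tau = angular size / rate of angular expansion. Values are hypothetical.

def time_to_contact(angular_size_rad, expansion_rate_rad_per_s):
    """Seconds until contact for an object approaching at constant speed."""
    return angular_size_rad / expansion_rate_rad_per_s

# An approaching object currently subtends 0.05 rad and expands at 0.025 rad/s:
print(time_to_contact(0.05, 0.025))  # 2.0 seconds to contact
```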

Biological motion

Humans have evolved a particularly keen ability to detect whether motion is generated by a biological source, even in point-light displays where dots represent only the joints of an animal.[6] Recent research suggests that this mechanism can also reveal the gender, emotional state, and action of a human represented by a point-light display.[7]

Color


The ability to distinguish between colors allows an organism to quickly recognize danger, since many brightly colored plants and animals pose a threat, often harboring a toxin. Color also serves as an inferential cue that can prime both motor action[8] and the interpretation of a persuasive message.[9]

Contrast

Contrast, or the difference in luminance and/or color that helps make an object distinguishable, is important in edge detection and serves as a cue.
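
Two common ways of quantifying contrast (standard definitions, not taken from this article's sources) are Michelson contrast, defined from the highest and lowest luminances in a pattern, and Weber contrast, defined for a target against a uniform background. A minimal sketch with illustrative luminance values:

```python
# Sketch of two standard contrast measures; luminance values are illustrative.

def michelson_contrast(l_max, l_min):
    """(Lmax - Lmin) / (Lmax + Lmin): suited to periodic patterns such as gratings."""
    return (l_max - l_min) / (l_max + l_min)

def weber_contrast(l_target, l_background):
    """(Lt - Lb) / Lb: suited to a small target on a uniform background."""
    return (l_target - l_background) / l_background

print(michelson_contrast(100.0, 20.0))  # ~0.67
print(weber_contrast(120.0, 100.0))     # 0.2
```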

Auditory cues


An auditory cue is a sound signal received through the ears that the brain processes into what is heard. The results of receiving and processing these cues are collectively known as the sense of hearing and are the subject of research within the fields of psychology, cognitive science, and neurobiology.

Auditory system


The auditory system of humans and animals allows individuals to assimilate information from the surroundings, represented as sound waves. Sound waves first pass through the pinnae and the auditory canal, the parts of the ear that comprise the outer ear. Sound then reaches the tympanic membrane (also known as the eardrum) in the middle ear, which sets the malleus, incus, and stapes into vibration. The stapes transmits these vibrations to the inner ear by pushing on the membrane covering the oval window, which separates the middle and inner ear. The inner ear contains the cochlea, a liquid-filled structure containing the hair cells. These cells transform the incoming vibrations into electrical signals, which can then be transmitted to the brain. The auditory nerve carries the signals generated by the hair cells away from the inner ear; they then travel through fibers to several subcortical structures and on to the primary auditory receiving area in the temporal lobe.[10]

Cues for locating sound


Humans use several cues to determine the location of a given stimulus, mainly differences between the two ears in the timing and level of the sound. These cues allow individuals to identify both the elevation, the height of the stimulus relative to the listener, and the azimuth, the angle of the sound relative to the direction the listener is facing.

Interaural time and level difference


Unless a sound is directly in front of or behind the individual, it has a slightly different distance to travel to reach each ear. This difference in distance causes a slight delay between the times at which the signal reaches the two ears, and the magnitude of this interaural time difference grows the more the signal comes from the side of the head. The time delay therefore allows humans to accurately estimate the location of incoming sound cues. Interaural level difference is caused by the difference in sound pressure level reaching the two ears: the head blocks the sound waves travelling to the farther ear, so a less intense sound reaches it. This level difference also allows humans to accurately estimate the azimuth of an auditory signal, but only for high-frequency sounds.[11]
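
The interaural time difference can be approximated with a spherical-head model (Woodworth's formula), in which the extra path length to the far ear for a distant source at azimuth θ is roughly the head radius times (θ + sin θ). This is a textbook approximation rather than a result from the cited study; the head radius and speed of sound below are assumed typical values.

```python
# Sketch of the interaural time difference under a spherical-head (Woodworth)
# approximation. Head radius and speed of sound are assumed typical values.
import math

HEAD_RADIUS_M = 0.0875   # assumed average head radius in metres
SPEED_OF_SOUND = 343.0   # metres per second in air at about 20 °C

def interaural_time_difference(azimuth_deg):
    """Extra travel time (s) to the far ear for a distant source at the given azimuth."""
    theta = math.radians(azimuth_deg)
    extra_path = HEAD_RADIUS_M * (theta + math.sin(theta))
    return extra_path / SPEED_OF_SOUND

for azimuth in (0, 30, 90):
    print(azimuth, round(interaural_time_difference(azimuth) * 1e6), "microseconds")
# 0° -> 0 µs, 30° -> ~261 µs, 90° -> ~656 µs (largest when the sound is directly to the side)
```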

Spectral cue


A spectral cue is a monaural (single-ear) cue for locating incoming sounds based on the frequency distribution (spectrum) of the incoming signal. Differences in the spectrum of the sound are caused by interactions of the sound with the head and the outer ear before it enters the ear canal.[12]

Principles of auditory cue grouping


The auditory system uses several heuristics to make sense of incoming cues, based on the properties of auditory stimuli that usually occur in the environment. Cue grouping refers to how humans naturally perceive incoming stimuli as organized patterns, based on certain rules.

Onset time


If two sounds start at different times, they are likely to have originated from different sources. Sounds that occur simultaneously likely originate from the same source.

Location


Cues originating at the same or slowly changing positions usually have the same source. When two sounds are separated in space, the cue of location (see: sound localization) helps an individual to separate them perceptually. If a sound is moving, it will move continuously; a sound that jumps erratically between locations is unlikely to come from a single source.

Similarity of timbre


Timbre is the tone quality or tone character of a sound, independent of pitch. This helps us distinguish between musical instruments playing the same notes. When hearing multiple sounds, the timbre of each sound will be unchanging (regardless of pitch), and thus we can differentiate between sounds from different sources over time.[13]

Similarity of pitch


Pitch refers to the frequency of the sound wave reaching us. Although a single object could produce a variety of pitches over time, it is more likely that it would produce sounds in a similar range.[14] Erratic changes in pitch are more likely to be perceived as originating from different sources.

Auditory continuity


Similar to the Gestalt principle of good continuation (see: principles of grouping), sounds that change smoothly or remain constant are often produced by the same source. Sound with the same frequency, even when interrupted by other noise, is perceived as continuous. Highly variable sound that is interrupted is perceived as separate.[15]

Factors affecting auditory cue perception


The precedence effect


When one sound is presented a long interval before the introduction of a second sound originating from a different location, individuals hear them as two distinct sounds, each originating from its correct location. However, when the delay between the onset of the first and second sound is shortened, listeners are unable to distinguish between the two sounds; instead, they perceive both as coming from the location of the leading sound. This effect counteracts the small disparity in perceived location caused by the difference in distance between each ear and the source of the auditory stimulus.[16]

The interaction between auditory and visual cues


There are strong interactions between visual and auditory stimuli. Since both auditory and visual cues provide accurate information about the location of an object, there is usually minimal discrepancy between the two. However, it is possible for the two sets of cues to disagree. An example of visual capture is the ventriloquism effect, which occurs when an individual's visual system locates the source of an auditory stimulus at a different position than the auditory system does. When this happens, the visual cues override the auditory ones, and the individual perceives the sound as coming from the location where the object is seen. Audition can also affect visual perception. Research has demonstrated this effect by showing two objects on a screen, one moving diagonally from top-right to bottom-left and the other from top-left to bottom-right, intersecting in the middle. The paths of these identical objects could be interpreted either as crossing over each other or as bouncing off each other. Without any auditory cue, a vast majority of subjects saw the objects crossing paths and continuing on their original trajectories. With the addition of a small "click" sound, however, a majority of subjects perceived the objects as bouncing off each other. In this case, auditory cues help interpret visual cues.[17]

Haptic cues


A haptic cue is either a tactile sensation that represents an incoming signal received by the somatosensory system, or a relationship between tactile sensations that can be used to infer higher-level information.[18] The results of receiving and processing these cues are collectively known as the sense of touch, and are the subject of research in the fields of psychology, cognitive science, and neurobiology.

The word "haptic" can refer explicitly to active exploration of an environment (particularly in experimental psychology and physiology), but it is often used to refer to the whole of the somesthetic experience.[19]

Somatosensory system


The somatosensory system assimilates many kinds of information from the environment: temperature, texture, pressure, proprioception, and pain. The signals vary for each of these perceptions, and the receptor systems reflect this: thermoreceptors, mechanoreceptors, nociceptors, and chemoreceptors.

Haptic cues in research


The interaction between haptic and visual cues


In addition to the interplay of haptic communication and nonverbal communication, haptic cues have been investigated as primers for decreasing the reaction time to identify a visual stimulus.[20] Subjects were placed in a chair fitted with a back that provided haptic cues indicating where the stimulus would appear on a screen. Valid haptic cues significantly decreased reaction time, while invalid cues increased it.[20]

Use in technology for the visually impaired


Haptic cues are frequently used to give people with impaired vision access to a greater wealth of information. Braille is a tactile written language read by brushing the fingers over raised patterns. Braille technology attempts to extend Braille to digital media, and new tools for reading web pages and other electronic content often combine haptic and auditory cues.[21]

A major issue that technologies in this area attempt to overcome is sensory overload. The amount of information that can be conveyed quickly via touch is less than via vision and is limited by current technology. As a result, multi-modal approaches, which convert the visual information into both haptic and auditory outputs, often give the best results. For example, an electronic pen can be drawn across a tablet mapped to the screen and produce different vibrations and sounds depending on what is at that location.[21]
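
The pen-and-tablet example can be pictured as a simple lookup from screen position to element type to a combined haptic and auditory response. The sketch below is purely hypothetical; the element names and feedback values are invented to illustrate the multi-modal mapping, not drawn from any particular system in the cited work.

```python
# Hypothetical sketch of a multi-modal mapping: a pen position on a tablet is
# mapped to the screen element under it, which is rendered as both a vibration
# pattern and a tone. All names and values are invented for illustration.

FEEDBACK = {
    "text":  {"vibration_hz": 50,  "tone_hz": 440},
    "link":  {"vibration_hz": 120, "tone_hz": 660},
    "image": {"vibration_hz": 200, "tone_hz": 880},
    "empty": {"vibration_hz": 0,   "tone_hz": 0},
}

def feedback_for(screen, x, y):
    """Return the haptic and auditory feedback for the element at (x, y)."""
    element = screen.get((x, y), "empty")
    return FEEDBACK[element]

# A toy "screen" mapping coordinates to element types:
screen = {(10, 10): "text", (10, 20): "link", (30, 5): "image"}
print(feedback_for(screen, 10, 20))  # {'vibration_hz': 120, 'tone_hz': 660}
```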

Olfactory cues


An olfactory cue is a chemical signal received through the nose and processed by the olfactory system, allowing humans and animals to smell the chemicals given off by physical objects. Olfactory cues are extremely important for sexual reproduction, triggering mating behavior in many species, as well as for maternal bonding and survival behaviors such as detecting spoiled food. The results of receiving and processing this information are known as the sense of smell.

Olfactory system


The process of smelling begins when chemical molecules enter the nose and reach the olfactory mucosa, a dime-sized region in the nasal cavity that contains the olfactory receptor neurons. There are about 350 types of olfactory receptors, each sensitive to a narrow range of odorants. These neurons send signals to the glomeruli within the olfactory bulb, with each glomerulus collecting information from olfactory receptor neurons of a particular type. The olfactory signal is then conducted to the piriform cortex and the amygdala, and then to the orbitofrontal cortex, where higher-level processing of the odor occurs.

Olfactory memory


Olfactory memory is the recollection of a given smell. Research has found that odor memory is highly persistent and resistant to interference, meaning these memories remain with an individual for long periods despite possible interference from other olfactory memories. These memories are mostly explicit, though implicit forms of odor memory also exist. Mammalian olfactory cues play an important role in coordinating the mother-infant bond and the subsequent normal development of the offspring, and olfactory memory is especially important for maternal behavior. Studies have shown that the fetus becomes familiar with olfactory cues in the uterus: newborns respond positively to the smell of their own amniotic fluid, suggesting that fetuses learn these cues in the womb.[22]

Environmental cues


Environmental cues are all of the sensory cues that exist in the environment.

With directed attention, an environmental cue becomes an attended cue.[18] However, most environmental cues are assimilated subconsciously, as in visual contextual cueing.

Environmental cues serve as the primary context that shapes how the world is perceived and as such they can prime prior experience to influence memory recall[23] and decision making.[24] This has applied use in marketing as there is evidence to suggest a store's atmosphere and layout can influence purchasing behavior.[25]

Environmental cues play a direct role in mediating the behavior of both plants[26] and animals. For example, environmental cues, such as temperature change or food availability, affect the spawning behavior of fish. In addition to cues generated by the environment itself, cues generated by other agents, such as ant pheromone trails, can influence behavior to indirectly coordinate actions between those agents.

In the study of perception, environmental cues play a large role in experimental design, since perceptual mechanisms evolved within natural environments;[27] this motivates the study of natural scene statistics and the use of naturalistic scenes as stimuli. If the experimental environment is too artificial, external validity can suffer, for example in an ideal observer experiment that relies on natural scene statistics.

Cueing in Parkinson's disease


Among the many problems associated with Parkinson's disease are disturbances of gait, or issues related to walking. One example is freezing of gait, in which a person with Parkinson's disease abruptly stops walking and is briefly unable to step forward. Research has shown that auditory cues associated with walking, such as the sound of footsteps on gravel, can reduce gait disturbances in people with Parkinson's disease. Specifically, cue continuity (pace) and action relevance (sounds commonly associated with walking) together can help reduce gait variability.[28]

The use of sensory cues has also helped improve motor function in people with Parkinson's disease. Research has indicated that sensory cues are beneficial in helping people with Parkinson's disease complete activities of daily living (ADLs). Although these individuals still did not meet standard expectations for motor function, and post-evaluations revealed a slight relapse in motor impairment, the overall results indicate that sensory cues are a beneficial resource in physical therapy for improving motor function and managing Parkinson's disease symptoms.[29]


References

  1. ^ Epstein, William; Rogers, Sheena, eds. (1995). Perception of space and motion. San Diego: Academic Press. pp. 3–5. ISBN 978-0080538617.
  2. ^ Epstein, William; Rogers, Sheena, eds. (1995). Perception of space and motion. San Diego: Academic Press. pp. 5–7. ISBN 978-0080538617.
  3. ^ Epstein, William; Rogers, Sheena, eds. (1995). Perception of space and motion. San Diego: Academic Press. pp. 7–9. ISBN 978-0080538617.
  4. ^ Posner, Michael I.; Nissen, Mary J.; Klein, Raymond M. (March 1976). "Visual dominance: An information-processing account of its origins and significance". Psychological Review. 83 (2): 157–171. doi:10.1037/0033-295X.83.2.157. PMID 769017.
  5. ^ Steinman, Scott B.; Garzia, Ralph Philip (2000). Foundations of Binocular Vision: A Clinical perspective. McGraw-Hill Professional. pp. 2–5. ISBN 978-0-8385-2670-5.
  6. ^ G. Johansson (1973). "Visual perception of biological motion and a model for its analysis". Percept. Psychophys. 14 (2): 201–211. doi:10.3758/BF03212378.
  7. ^ Alaerts, Kaat; Nackaerts, Evelien; Meyns, Pieter; Swinnen, Stephan P.; Wenderoth, Nicole; Valdes-Sosa, Mitchell (June 9, 2011). "Action and Emotion Recognition from Point Light Displays: An Investigation of Gender Differences". PLOS ONE. 6 (6): e20989. Bibcode:2011PLoSO...620989A. doi:10.1371/journal.pone.0020989. PMC 3111458. PMID 21695266.
  8. ^ Schmidt, T. (2002). "The finger in flight: Real-time motor control by visually masked color stimuli". Psychological Science. 13: 112–118.
  9. ^ Gerend, Mary A.; Sias, Tricia (July 2009). "Message framing and color priming: How subtle threat cues affect persuasion". Journal of Experimental Social Psychology. 45 (4): 999–1002. doi:10.1016/j.jesp.2009.04.002.
  10. ^ Gray, Lincoln (1997). Chapter 12: Auditory System: Structure and Function. McGovern Medical School at UTHealth.
  11. ^ Hartmann, William M.; Macauley, Eric J. (February 28, 2014). "Anatomical limits on interaural time differences: an ecological perspective". Frontiers in Neuroscience. 8: 34. doi:10.3389/fnins.2014.00034. PMC 3937989. PMID 24592209. S2CID 7032767.
  12. ^ Voss, Patrice; Lepore, Franco; Gougoux, Frédéric; Zatorre, Robert J. (March 28, 2011). "Relevance of spectral cues for auditory spatial processing in the occipital cortex of the blind". Frontiers in Psychology. 2: 48. doi:10.3389/fpsyg.2011.00048. PMC 3110881. PMID 21716600. S2CID 5393985.
  13. ^ Bregman, Albert (1971). "Primary Auditory Stream Segregation and Perception of Order in Rapid Sequences of Tones". Journal of Experimental Psychology. 89 (2): 244–249. CiteSeerX 10.1.1.615.7744. doi:10.1037/h0031163. PMID 5567132.
  14. ^ Sergeant, Desmond (1969). "Experimental Investigation of Absolute Pitch". Journal of Research in Music Education. 17 (1): 135–143. doi:10.2307/3344200. ISSN 0022-4294. JSTOR 3344200. S2CID 144294536.
  15. ^ Warren, R. M.; Obusek, C. J.; Ackroff, J. M. (9 June 1972). "Auditory Induction: Perceptual Synthesis of Absent Sounds". Science. 176 (4039): 1149–1151. Bibcode:1972Sci...176.1149W. doi:10.1126/science.176.4039.1149. PMID 5035477. S2CID 25072184.
  16. ^ Brown, Andrew D.; Stecker, G. Christopher; Tollin, Daniel J. (December 6, 2014). "The Precedence Effect in Sound Localization". Journal of the Association for Research in Otolaryngology. 16 (1): 1–28. doi:10.1007/s10162-014-0496-2. PMC 4310855. PMID 25479823.
  17. ^ Sekuler, Robert; Sekuler, Allison B.; Lau, Renee (1997). "Sound alters visual motion perception". Nature. 385 (6614): 308. Bibcode:1997Natur.385..308S. doi:10.1038/385308a0. PMID 9002513. S2CID 27165422.
  18. ^ a b Goldstein, Bruce E. (2007). Sensation and Perception. Cengage Learning. pp. 5–6. ISBN 978-0-495-60149-4.
  19. ^ Robles-De-La-Torre, G. (1 July 2006). "The Importance of the Sense of Touch in Virtual and Real Environments". IEEE MultiMedia. 13 (3): 24–30. doi:10.1109/MMUL.2006.69. S2CID 16153497.
  20. ^ a b Young, J.J.; Tan, H.Z.; Gray, R. (2003). "Validity of Haptic Cues and Its Effect on Priming Visual Spatial Attention" (PDF). 11th Symposium on Haptic Interfaces for Virtual Environment and Teleoperator Systems, 2003. HAPTICS 2003. Proceedings. pp. 166–170. CiteSeerX 10.1.1.130.7119. doi:10.1109/HAPTIC.2003.1191265. ISBN 978-0-7695-1890-9. S2CID 5246376.
  21. ^ a b Jay, Caroline; Stevens, Robert; Hubbold, Roger; Glencross, Mashhuda (1 May 2008). "Using haptic cues to aid nonvisual structure recognition" (PDF). ACM Transactions on Applied Perception. 5 (2): 1–14. doi:10.1145/1279920.1279922. S2CID 13924748.
  22. ^ Varendi, H; Porter, RH; Winberg, J (1 September 1997). "Natural odour preferences of newborn infants change over time". Acta Paediatrica. 86 (9): 985–990. doi:10.1111/j.1651-2227.1997.tb15184.x. PMID 9343280. S2CID 28213494.
  23. ^ Godden, D; Baddeley, A. (1975). "Context dependent memory in two natural environments". British Journal of Psychology. 66 (3): 325–331. doi:10.1111/j.2044-8295.1975.tb01468.x. S2CID 10699186.
  24. ^ Elder, Ryan S.; Krishna, Aradhna (2010). "The Effects of Advertising Copy on Sensory Thoughts and Perceived Taste". Journal of Consumer Research. 36 (5): 748–56. CiteSeerX 10.1.1.497.1394. doi:10.1086/605327.
  25. ^ Baker, Julie; Parasuraman, A.; Grewal, Dhruv; Voss, Glenn B. (1 April 2002). "The Influence of Multiple Store Environment Cues on Perceived Merchandise Value and Patronage Intentions". Journal of Marketing. 66 (2): 120–141. doi:10.1509/jmkg.66.2.120.18470. S2CID 167436934.
  26. ^ Berger, Jonah A.; Heath, Chip (March–April 2010). "Idea Habitats: How the Prevalence of Environmental Cues Influences the Success of Ideas". Cognitive Science. 29 (2): 195–221. doi:10.1207/s15516709cog0000_10. PMID 21702772. S2CID 10493169.
  27. ^ Geisler, W. S.; Diehl, R. L. (2003). "A Bayesian approach to the evolution of perceptual and cognitive systems". Cognitive Science. 27 (3): 379–402. doi:10.1016/s0364-0213(03)00009-0.
  28. ^ Young, William R.; Shreve, Lauren; Quinn, Emma Jane; Craig, Cathy; Bronte-Stewart, Helen (April 28, 2016). "Auditory cueing in Parkinson's patients with freezing of gait. What matters most: Action-relevance or cue-continuity?" (PDF). Neuropsychologia. 87: 54–62. doi:10.1016/j.neuropsychologia.2016.04.034. PMID 27163397. S2CID 18971434.
  29. ^ Marchese, R.; Diverio, M.; Zucchi, F.; Lentino, C.; Abbruzzese, G. (2000). "The role of sensory cues in the rehabilitation of parkinsonian patients: a comparison of two physical therapy protocols". Mov Disord. 15 (5): 879–883. doi:10.1002/1531-8257(200009)15:5<879::aid-mds1018>3.0.co;2-9. PMID 11009194. S2CID 34222531.