PYC1501 Unit 2 Sensation and Perception OER
2.1 Introduction
People are in constant interaction with the environment, either influencing or being
influenced by it. The mind is bombarded with information that takes many forms, from the
electromagnetic energy of the sun to vibrations in the air, to molecules dissolved in the saliva on the
tongue. Our sensory systems allow us to explore, understand and respond to a multitude of stimuli in our
environments, giving each of us a unique experience of the world around us. This unit deals with the ways
we receive information from the environment and make it meaningful. You will learn about the process
of sensation and perception and how it helps us to survive in our environments.
PYC1501/Sensation and perception/OER/2024
The bottom-up and top-down processing approaches will be elaborated in detail in this section.
The goal of sensation is detection, while the goal of perception is to create useful information about our
environment. Sensation and perception work seamlessly together to allow us to experience the world
through our senses but also to combine what we are currently learning from the environment with what
we already know about it to make judgments and to choose appropriate behaviours.
There are several basic principles that influence the way our sense organs work. One is our ability to detect an external stimulus. Each sensory organ (e.g. the eye) requires a minimal amount of stimulation to detect a stimulus. This minimum is called the absolute threshold and is measured by means of signal detection - the process by which a stimulus signal of a particular intensity is detected by the sense organs. Table 2.1 below shows some examples of absolute thresholds for each of the five senses.
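For readers comfortable with a little code, the absolute threshold can be pictured as a cut-off intensity below which a stimulus goes undetected. The Python sketch below is a deliberate simplification (real signal detection also involves noise and response bias); the function name and the threshold value are illustrative assumptions, not figures from this unit.

```python
ABSOLUTE_THRESHOLD = 10.0  # illustrative units, not a real measured value

def detected(stimulus_intensity, threshold=ABSOLUTE_THRESHOLD):
    """A stimulus is sensed only if its intensity reaches the absolute threshold."""
    return stimulus_intensity >= threshold

print(detected(9.9))   # False: below the threshold, not sensed
print(detected(10.0))  # True: at the threshold, sensed
```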
The principle of the differential threshold (or just noticeable difference - jnd) relates to our ability to
detect the difference between two stimuli of different intensities. Unlike the absolute threshold, the
differential threshold changes depending on the stimulus intensity. That is, the amount of difference
between two stimuli that can be detected depends on the size of the stimuli being compared. As stimuli
get larger, the differences must also become larger to be detected. For example, it is easier to detect the
difference between a 5kg weight and a 10kg weight, than between a 100kg weight and a 105kg weight,
even though the absolute difference in weight is the same in both cases (Cacioppo & Freberg, 2013).
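The idea that the detectable difference grows with the baseline stimulus is often summarised as Weber's law: the just noticeable difference is a roughly constant fraction of the stimulus intensity. The short Python sketch below illustrates this; the `weber_jnd` name and the 2% Weber fraction are illustrative assumptions introduced here, not values given in this unit.

```python
def weber_jnd(intensity, weber_fraction=0.02):
    """Smallest detectable change (jnd) for a given baseline intensity,
    under Weber's law: jnd = k * intensity. The 2% fraction is illustrative."""
    return weber_fraction * intensity

# The jnd grows with the baseline: a change of about 0.1 kg is detectable
# against 5 kg, but about 2 kg is needed against 100 kg.
print(weber_jnd(5))    # about 0.1 (kg)
print(weber_jnd(100))  # about 2.0 (kg)
```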
Our experiences influence how the brain processes information. You have tasted food that you like and
food that you do not like. There are some food brands that you enjoy and others you cannot stand.
However, the first time you eat something or hear about a brand, you process the stimuli using
bottom-up processing. That is, the individual elements are taken in, one by one, and pieced together to
get the entire structure. Sometimes our past experiences influence how we process new stimuli. This is top-down processing, in which perception begins with the most general and moves
toward the more specific. These perceptions are heavily influenced by our expectations and prior
knowledge. Put simply, the brain applies what it knows to fill in the blanks and anticipate what is next.
It should be noted that when we experience a sensory stimulus that does not change, we stop paying
attention to it, and this is called sensory adaptation. This explains, for example, why you do not feel the
weight of a wristwatch on your arm, or the weight of a hat on your head. This occurs because if a stimulus
does not change, our receptors stop responding to it. Watch the video on sensation and perception at
https://www.youtube.com/watch?v=unWnZvXJH2o&list=RDLVunWnZvXJH2o&start_radio=1&rv=unWnZvXJH2o&t= (Note: Video not for assessment purposes).
2.3 Sensory systems
There are five sensory systems that provide us with the information necessary to survive. These systems
are discussed below.
2.3.1 Hearing
This system allows us to hear. A sound wave is a pressure wave caused by vibration in a medium that transfers energy, such as air. Sound waves enter the outer ear and are transmitted through the auditory canal
to the eardrum. The resulting vibrations are transmitted by the three small ossicles (the malleus (hammer), incus (anvil), and stapes (stirrup)) into the cochlea, where they are detected by hair cells and sent to the auditory nerve (van Deventer & Mojapelo-Batka, 2013). The human ear is sensitive to a
wide range of sounds. For example, when we pick up the phone, we quickly recognise a familiar voice. In
a fraction of a second, our auditory system receives the sound waves, transmits them to the auditory
cortex, compares them to stored knowledge of other voices, and identifies the caller.
For us to sense sound waves from the environment, they must reach the inner ear. Initially, sound waves
are channelled by the pinna (the external part of an ear) into the auditory or ear canal (see figure 2.2
below). The auditory canal connects the pinna to the tympanic membrane - a thin, stretched membrane
in the ear that vibrates in response to sound waves. When reaching the eardrum, the sound waves vibrate
against the three small ossicles. Both the tympanic membrane and the ossicles amplify the sound waves
before they enter the fluid-filled cochlea - a snail-shell-like bone structure containing auditory hair cells
arranged on the basilar membrane according to the frequency they respond to (called tonotopic
organisation). Auditory hair cells are the receptors in the cochlea that convert sound into electrical signals.
After being processed by auditory hair cells, electrical signals are sent through the cochlear nerve to the
thalamus, and then the primary auditory cortex - an area of the cortex involved in processing auditory
stimuli (Romani et al., 1982).
2.3.2 Smell
Small hair-like extensions from the olfactory receptor cells serve as the sites where odour molecules dissolved in the mucus interact with chemical receptors located on these extensions. Once an odour molecule has bound to a given receptor, chemical changes within the cell result in signals being sent to the olfactory
bulb where the olfactory nerves begin. From the olfactory bulb, information is sent to regions of the limbic
system and to the primary olfactory cortex, which is located very near the gustatory cortex (Lodovichi &
Belluscio, 2012; Spors et al., 2013).
Figure 2.4: Structure of taste buds in the tongue. (a) Taste buds are composed of a number of individual taste receptor cells that transmit information to nerves. (b) This micrograph shows a close-up view of the tongue’s
surface. https://openstax.org/books/psychology/pages/5-5-the-other-senses
2.3.4 Touch
The somaesthetic system provides information about the environment immediately outside the skin,
relating to touch, pressure, heat, and pain (Van Deventer & Mojapelo-Batka, 2013). The energy detected
by the sense of touch is physical pressure on tissue, usually on the skin hairs. The receptors that transduce
pressure into neural activity are in, or just below the skin (Bernstein et al., 2012). The skin provides us
with all sorts of information, such as whether something is smooth or bumpy, hot or cold, or even painful.
Somatosensation (which includes our ability to sense touch, temperature, and pain) transduces physical
stimuli into electrical potentials that can be processed by the brain. Tactile stimuli (stimuli associated with
texture) are transduced by special receptors in the skin called mechanoreceptors that allow for the
conversion of one kind of energy into a form that the brain can understand.
The thousands of nerve endings in the skin respond to four basic sensations: pressure, hot, cold, and pain,
but only the sensation of pressure has its own specialised receptors. The other sensations are created by combinations of these four. For instance, the feeling of itchiness is caused by repeated stimulation of
pain receptors, whereas the feeling of wetness is caused by repeated stimulation of cold and pressure
receptors. After tactile stimuli are converted by mechanoreceptors, information is sent through the
thalamus to the primary somatosensory cortex for further processing. Various areas of the skin, such as
lips and fingertips, are more sensitive than others, such as shoulders or ankles.
The skin is important not only in providing information about touch and temperature but also in
proprioception - the ability to sense the position and movement of body parts. Proprioception is
accomplished by specialised neurons located in the skin, joints, bones, ears, and tendons, which send
messages about the compression and the contraction of muscles throughout the body.
2.3.5 Vision
Whereas other animals rely primarily on hearing and smell to understand the world around them, human beings rely in large part on vision. The visual system allows us to see. Light enters the eye through the
transparent cornea, passing through the pupil at the centre of the iris (see figure 2.5 below). The lens
adjusts to focus the light on the retina, where it appears upside down and backward. Receptor cells on
the retina send information via the optic nerve to the primary visual cortex (van Deventer & Mojapelo-
Batka, 2013). Sensation of light depends on two physical dimensions of light waves: (i) light intensity -
refers to how much energy the light contains. It determines the brightness of light and (ii) light
wavelength - the distance between two successive crests or troughs of the light wave. The colour we see depends mainly on light wavelength. At a given intensity, different wavelengths produce sensations of different colours. For example, 440-nanometre light appears violet-blue, and 700-nanometre light appears
orangish red (Bernstein et al., 2012). Also see section 2.5 below.
In the retina, light is converted into electrical signals by specialised cells called photoreceptors. The retina contains two main kinds of photoreceptors: rods and cones. Rods are primarily responsible for our
ability to see in dim light conditions, such as during the night. Cones enable us to see colours and fine
details in bright light. Rods and cones differ in their distribution across the retina, with the highest
concentration of cones found in the fovea (the central region of focus), and rods dominating the
periphery. This difference in distribution explains why looking directly at a dim star in the sky makes it seem to disappear: the star’s image falls on the fovea, where there are not enough rods to process the dim light.
Damage to the primary visual cortex can potentially result in agnosia - inability to perceive visual stimuli,
and prosopagnosia - an inability to recognise faces. The specialised regions for visual recognition comprise
the ventral pathway - the ‘what’ pathway. Other areas involved in processing location and movement
make up the dorsal pathway - the ‘where’ pathway. Together, these pathways process a large amount of
information about visual stimuli (Goodale & Milner, 1992).
Normal-sighted people have three different types of cones that mediate colour vision. Each of these cone
types is maximally sensitive to a slightly different wavelength of light. According to the trichromatic theory
of colour vision, all colours in the spectrum can be produced by combining red, green, and blue colours.
Our colour perception is therefore, based on the mixing of these three light wave colours. This theory was
applied in the creation of colour television screens, which contain microscopic elements of red, blue, and
green (Bernstein et al., 2012).
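The additive mixing of the three primaries described above can be sketched in a few lines of code. The Python below is a simplified illustration of the trichromatic idea, not a model of actual cone responses; the `mix_rgb` function and the colour tuples are hypothetical names introduced here.

```python
def mix_rgb(*colors):
    """Additively mix lights given as (red, green, blue) intensities in
    [0, 1], clipping each channel at full intensity. A toy illustration
    of the trichromatic idea that three primaries span perceived colours."""
    total = [0.0, 0.0, 0.0]
    for r, g, b in colors:
        total[0] += r
        total[1] += g
        total[2] += b
    return tuple(min(1.0, channel) for channel in total)

RED = (1.0, 0.0, 0.0)
GREEN = (0.0, 1.0, 0.0)
BLUE = (0.0, 0.0, 1.0)

print(mix_rgb(RED, GREEN))        # (1.0, 1.0, 0.0), perceived as yellow
print(mix_rgb(RED, GREEN, BLUE))  # (1.0, 1.0, 1.0), perceived as white
```

This is the same principle a colour television screen exploits: three microscopic light sources per point, mixed additively.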
The opponent-process theory states that colour is coded in opponent pairs: black-white, yellow-blue, and
green-red. The basic idea is that some cells of the visual system are excited by one of the opponents’
colours and inhibited by others. So, a cell that was excited by wavelengths associated with green would
be inhibited by wavelengths associated with red, and vice versa. The theory asserts that people do not
experience greenish-reds or yellowish blues as colours. This leads to the experience of negative
afterimages - the continuation of a visual sensation after the removal of the stimulus. For example, when
you stare briefly at the sun and then look away from it, you may still perceive a spot of light although the
stimulus (the sun) is no longer in sight.
Depth perception
It refers to the ability to perceive spatial relationships in three-dimensional (3-D) space. With depth
perception, we can describe things as being in front, behind, above, below, or to the side of other things.
The world is three-dimensional, and people use a variety of cues in a visual scene to establish a sense of depth. Binocular vision is the ability to perceive three-dimensional objects because of the differences between the images on each eye’s retina. Some of these cues are binocular cues, which rely on the use of both eyes. One example of a binocular depth cue is binocular disparity - the slightly different view of the world that each eye receives. To experience binocular disparity, try this simple exercise:
extend your arm fully and extend one of your fingers and focus on that finger. Now, close your left eye
without moving your head, then open your left eye and close your right eye without moving your head.
You will notice that your finger seems to shift as you alternate between the two eyes because of the
slightly different view each eye has of your finger.
We can also perceive depth in two-dimensional images. Generally, people pick up on depth in photographic images even though the visual stimulus is two-dimensional. When we do this, we are relying on several monocular cues - cues that require the use of only one eye. Monocular cues allow us to perceive depth with a single eye, which is why we do not bump into objects when walking with one eye closed. An example of a monocular cue
is linear perspective. Linear perspective refers to the fact that we perceive depth when we see two parallel
lines that seem to converge in an image. Some other monocular depth cues are interposition, the partial
overlap of objects, and the relative size and closeness of images to the horizon.
Before you continue with this unit, please do the following activities:
Activity 1 Write down the differences between the concepts of sensation and perception.
Answer Your answer should indicate that sensation refers to the process of receiving
information from the environment, translating and transmitting it to the brain, and
perception is the process of interpreting the information and forming meaningful images
of the world.
Activity 2 Sarah was born blind. Does this mean that perception is impossible for her? Write down
your thoughts on this question.
Answer If your answer is that Sarah is not capable of perception, then you need to revise this
section very carefully. Perception relates to all the sensory systems (sight, hearing, smell,
taste, and touch). If someone is blind, only one sensory system (visual) is affected.
Perception can still take place through the other sensory systems.
2.4 Perceptual grouping
Perceptual grouping refers to the tendency to group stimuli in a pattern or shape in a way that allows for
meaningful interpretation. In the early 20th century, theorists such as Max Wertheimer, Wolfgang Köhler
and Kurt Koffka believed that perception involved more than simply combining sensory stimuli. This belief
led to a new movement in the field of psychology known as Gestalt psychology. The word gestalt is used
to reflect the idea that the whole is different from the sum of its parts. In other words, the brain creates
a perception that is more than simply the sum of available sensory inputs, and it does so in predictable
ways. Gestalt psychologists translated these predictable ways into principles by which we organise
sensory information (Rock & Palmer, 1990). These principles are presented in figure 2.6 below.
According to Gestalt theorists, our perceptions are based on perceptual hypotheses that we make while
interpreting sensory information. These hypotheses are informed by a number of factors, including our
personalities, experiences, and expectations. We use these hypotheses to generate our perceptual set
(Goolkasian & Woodbury, 2010).
Before you continue with this unit, please do the following activity:
When you look at this picture, you see rows of O and rows of Q. Which principle of
perceptual grouping accounts for this?
A. Continuation
B. Proximity
C. Similarity
D. Closure
Answer This is an example of the perceptual grouping principle of similarity: objects with similar shapes are seen as belonging together. Therefore, option C is the correct answer. Option B (proximity) is incorrect because the rows are not grouped in terms of closeness. Option A (continuation) is incorrect because there is no intersection between the rows of Os and the rows of Qs. Option D (closure) is incorrect because closure involves the tendency to spontaneously complete an incomplete figure so that it forms a meaningful whole.
2.5 Waves and wavelengths
Visual and auditory stimuli both occur in the form of waves. Although the two stimuli are very different in
terms of composition, waves share similar characteristics that are especially important to our visual and
auditory perceptions. In this section, we describe the physical properties of the waves as well as the
perceptual experiences associated with them.
Two physical characteristics of a wave are wave amplitude and wavelength. The amplitude of a wave is
the height of a wave as measured from the highest point (peak) to the lowest point on the wave (trough).
Wavelength refers to the length of a wave from one peak to the next.
Wavelength is inversely related to the frequency of a given waveform. Frequency refers to the number of
waves that pass a given point in a given period and is often expressed in hertz (Hz), or cycles per second.
Longer wavelengths will have lower frequencies, and shorter wavelengths will have higher frequencies.
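This inverse relationship follows from frequency = wave speed / wavelength. The sketch below applies it to sound in air; the 343 m/s speed of sound is an approximate textbook value assumed here, and the function name is introduced for illustration only.

```python
SPEED_OF_SOUND_M_S = 343.0  # approximate speed of sound in air at about 20 °C

def frequency_hz(wavelength_m, speed=SPEED_OF_SOUND_M_S):
    """Frequency (Hz) of a wave: speed divided by wavelength.
    Longer wavelengths give lower frequencies, and vice versa."""
    return speed / wavelength_m

print(frequency_hz(1.0))  # 343.0 Hz
print(frequency_hz(0.5))  # 686.0 Hz: halving the wavelength doubles the frequency
```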
Figure 2.8: Different wavelengths and frequencies. At the top of the figure, the red wave has a long wavelength and a low frequency. Moving towards the bottom, the wavelengths decrease and the frequencies increase.
https://openstax.org/books/psychology/pages/5-2-waves-and-wavelengths
The electromagnetic spectrum encompasses all the electromagnetic radiation that occurs in our
environment and includes gamma rays, x-rays, ultraviolet light, visible light, infrared light, microwaves,
and radio waves. The visible spectrum in humans is associated with wavelengths that range from 380 to
740 nm - a very small distance, since a nanometre (nm) is one billionth of a meter. Other species can
detect other portions of the electromagnetic spectrum. For instance, honeybees can see light in the
ultraviolet range (Wakakuwa et al., 2007), and some snakes can detect infrared radiation in addition to
more traditional visual light cues (Chen et al., 2012). In humans, light wavelength is associated with
perception of colour. Within the visible spectrum, our experience of red is associated with longer
wavelengths, greens are intermediate, and blues and violets are shorter in wavelength. The amplitude of
light waves is associated with our experience of brightness or intensity of colour, with larger amplitudes
appearing brighter.
Sound waves are pressure waves caused by vibration in a medium that transfers energy, such as air. As with light waves, the physical properties of sound waves are associated with various aspects of our
perception of sound. The frequency of a sound wave is associated with our perception of that sound’s
pitch. High-frequency sound waves are perceived as high-pitched sounds, while low-frequency sound
waves are perceived as low-pitched sounds. The audible range of sound frequencies is between 20 and
20000 hertz, with greatest sensitivity to those frequencies that fall in the middle of this range (Strain,
2003). The loudness of a given sound is closely associated with the amplitude of the sound wave. Higher
amplitudes are associated with louder sounds. Loudness is measured in decibels (dB), a logarithmic unit
of sound intensity. A typical conversation corresponds to about 60 dB, and a rock concert might be as loud
as 120 dB (see figure 2.9 below). However, there is the potential for hearing damage from about 80 dB to
130 dB (Dunkle, 1982).
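Because decibels are logarithmic, equal steps in dB correspond to multiplicative steps in intensity. The snippet below makes the conversation-versus-concert comparison concrete; the `intensity_ratio` function name is introduced here for illustration.

```python
def intensity_ratio(db_difference):
    """How many times more intense one sound is than another, given their
    difference in decibels: every 10 dB step is a tenfold intensity increase."""
    return 10.0 ** (db_difference / 10.0)

# A 120 dB rock concert versus a 60 dB conversation: 60 dB apart,
# so the concert carries about a million times the sound intensity.
print(intensity_ratio(120 - 60))
```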
Although wave amplitude is generally associated with loudness, there is some interaction between
frequency and amplitude in our perception of loudness within the audible range. For example, a 10 Hz
sound wave is inaudible no matter the amplitude of the wave. A 1000 Hz sound wave, on the other hand,
would vary dramatically in terms of perceived loudness as the amplitude of the wave increased. Different
musical instruments can play the same musical note at the same level of loudness, yet they still sound
quite different. This is known as the timbre of a sound. Timbre refers to a sound’s purity, and it is affected
by the complex interplay of frequency, amplitude, and timing of sound waves. Watch the video at
https://www.youtube.com/watch?v=TsQL-sXZOLc (Note: Video not for assessment purposes).
GROUP ACTIVITY
In a discussion forum on myUnisa with your e-tutor and peers, reflect on wave amplitude and wavelength, and provide examples where possible.
NB: Please note that it is compulsory for you to have this discussion.
2.6 Summary
In this unit you learned how to differentiate between sensation and perception. You familiarised yourself
with the different concepts that make up the section on sensation and perception. You were also
introduced to a comprehensive section that discusses the different types of sensory systems and their
function. You learned about the Gestalt principles of perception and concluded the unit with a brief
explanation on waves and wavelengths.
2.7 Glossary
Absolute threshold: the smallest amount of stimulation needed for detection by a sense.
Afterimage: is the continuation of a visual sensation after the removal of the stimulus.
Auditory canal: a tube-like structure running from the outer ear to the middle ear.
Auditory hair cells: the receptors in the cochlea that convert sound into electrical signals.
Binocular disparity: difference in images processed by the left and right eyes.
Binocular vision: the ability to perceive three-dimensional objects because of the differences between the images on each eye’s retina.
Bottom-up processing: refers to building up to perceptual experience from individual pieces of sensory
input.
Chemical senses: part of sensory system that allows people to process the environmental stimuli of
smell and taste.
Cochlea: a spiral, bone-like structure in the inner ear containing auditory hair cells.
Cones: a type of photoreceptor in the eye that enables us to see colours and fine details in bright light.
Depth perception: the ability to perceive spatial relationships in three-dimensional (3-D) space.
Differential threshold: the smallest change in stimulation that a person can detect. (also known as just
noticeable difference - jnd).
Frequency: the number of waves that pass a given point in a given period.
Gustation (taste): describes the detection of taste by chemoreceptors in the oral cavity, predominantly
on the tongue.
Light intensity: refers to how much energy the light contains and it determines the brightness of the
light.
Light wavelength: refers to the distance between the two successive crests or troughs of the light wave.
The color we see depends mainly on light wavelength.
Mechanoreceptors: receptors in the skin that allow for the conversion of one kind of energy into a form that the brain can understand.
Olfactory receptor cells: are found in a mucous membrane at the top of the nose and are topped with
tentacle-like protrusions that contain receptor proteins.
Opponent-process theory: a theory proposing that colour vision is influenced by cells responsive to pairs of colours.
Ossicles: a collection of three small bones in the middle ear that transmit vibrations from the tympanic membrane to the cochlea.
Perception: refers to the way sensory information is organised, interpreted, and consciously
experienced.
Perceptual grouping: refers to the tendency to group stimuli in a pattern or shape that allows for
meaningful interpretation.
Photoreceptors: specialised cells in the retina that convert light into electrical signals.
Primary auditory cortex: an area of the cortex involved in processing auditory stimuli.
Primary somatosensory cortex: an area of the cortex involved in processing somatosensory stimuli.
Primary visual cortex: an area of the cortex involved in processing visual stimuli.
Proprioception: the ability to sense the position and movement of body parts.
Rods: a type of photoreceptor primarily responsible for our ability to see in dim light conditions.
Signal detection: the process by which a stimulus signal of a particular intensity is detected by the sense organs.
Sound waves: pressure waves caused by vibration in a medium that transfers energy, such as air.
Timbre: is the purity of sound, affected by frequency, amplitude, and timing of sound waves.
Top-down processing: perceptual processing that begins with the most general information and moves toward the more specific, guided by expectations and prior knowledge.
Trichromatic theory: a theory proposing that colour vision is influenced by different cones responding preferentially to red, green, and blue colours.
Tympanic membrane (eardrum): a thin, stretched membrane in the ear that vibrates in response to
sound waves.
Wave amplitude: is the height of a wave as measured from the highest point (peak) to the lowest point
on the wave (trough).
Wavelength: refers to the length of a wave from one peak to the next.
2.8 References
Bernstein, D.A., Clarke-Stewart, A., Penner, L.A. & Roy, E.J. (2012). Psychology (9th ed.). Wadsworth.
Cacioppo, J.T. & Freberg, L.A. (2013). Discovering psychology: The science of mind. Wadsworth.
Chen, Q., Deng, H., Brauth, S. E., Ding, L., & Tang, Y. (2012). Reduced performance of prey targeting in pit
vipers with contralaterally occluded infrared and visual senses. PloS ONE, 7(5), e34989.
doi:10.1371/journal.pone.0034989
Goodale, M. A., & Milner, A. D. (1992). Separate visual pathways for perception and action. Trends in
Neurosciences, 15(1), 20-25.
Goolkasian, P. & Woodbury, C. (2010). Priming effects with ambiguous figures. Attention, Perception &
Psychophysics, 72, 168–178
Kinnamon, S. C., & Vandenbeuch, A. (2009). Receptors and transduction of umami taste stimuli. Annals of
the New York Academy of Sciences, 1170, 55–59.
Lodovichi, C., & Belluscio, L. (2012). Odorant receptors in the formation of olfactory bulb circuitry.
Physiology, 27, 200–212.
Maffei, A., Haley, M., & Fontanini, A. (2012). Neural processing of gustatory information in insular circuits.
Current Opinion in Neurobiology, 22, 709–716.
Privitera, A. J. (2021). Sensation and perception. In R. Biswas-Diener & E. Diener (Eds), Noba textbook
series: Psychology. Champaign, IL: DEF publishers. Retrieved from http://noba.to/xgk3ajhy
Rock, I., & Palmer, S. (1990). The legacy of Gestalt psychology. Scientific American, 262, 84–90.
Romani, G. L., Williamson, S. J., & Kaufman, L. (1982). Tonotopic organization of the human auditory
cortex. Science, 216(4552), 1339-1340.
Roper, S. D. (2013). Taste buds as peripheral chemosensory receptors. Seminars in Cell & Developmental
Biology, 24, 71–79.
Schiffman, H. R. (1990). Sensation and perception: An integrated approach (3rd ed.). John Wiley & Sons.
Spors, H., Albeanu, D. F., Murthy, V. N., Rinberg, D., Uchida, N., Wachowiak, M., & Friedrich, R. W. (2013).
Illuminating vertebrate olfactory processing. Journal of Neuroscience, 32, 14102–14108.
Strain, G. M. (2003). How well do dogs and other animals hear? Retrieved from
http://www.lsu.edu/deafness/HearingRange.html
Van Deventer, V., & Mojapelo-Batka, M. (2013). A Student’s A-Z of Psychology. Juta.
Wakakuwa, M., Stavenga, D. G., & Arikawa, K. (2007). Spectral organization of ommatidia in flower-visiting insects. Photochemistry and Photobiology, 83, 27–34.