
Volume 9, Issue 4, April – 2024 International Journal of Innovative Science and Research Technology

ISSN No:-2456-2165 https://doi.org/10.38124/ijisrt/IJISRT24APR313

Human Perception of Emotions from Canis familiaris Barks: An Auditory-Perceptual Study

Varun Singh 1, Jim Saroj Winston 2*
1 Assistant Professor, Dr. Shakuntala Misra National Rehabilitation University, India.
2 Assistant Professor, Nitte Deemed to be University, India.
* Corresponding Author: Jim Saroj Winston

Abstract:- This study investigates how experience and gender influence the perception of emotions in dog barks. Drawing from Morton's structural-motivational rule and previous research on mammalian vocalizations, we aimed to discern whether humans, especially those experienced with dogs, can accurately identify emotions such as stranger, anger, lonely, and play in dog barks. Using recordings of Indian Lesser Spitz barks in various contexts, we conducted auditory-perceptual experiments with two groups: experienced listeners (with more than 2-3 years of pet dog experience) and non-experienced listeners. Participants listened to bark sequences and identified the corresponding emotions. Results revealed that experienced listeners consistently outperformed non-experienced ones in identifying emotions, except for 'play,' where no significant difference was observed. Gender did not significantly affect emotion perception. Interestingly, 'anger' was most accurately identified across both groups, followed by 'stranger,' 'play,' and 'lonely.' Analyzing open-ended responses, we found that acoustic cues such as pitch and inter-bark intervals strongly influenced emotional perception. 'Stranger' barks were described as low-pitched, while 'anger' barks had shorter inter-bark intervals. 'Lonely' barks were characterized by higher pitch and longer inter-bark intervals than 'play' barks. These findings suggest that experience plays a crucial role in accurately perceiving emotions in dog barks, aligning with the concept of a common mammalian heritage in emotional communication. Gender differences were negligible in this context. Understanding the acoustic cues underlying emotional expression in dogs enhances our comprehension of canine behavior and has implications for fields like animal welfare and neuroscience. Further research could delve deeper into the mechanisms underlying emotional perception in non-verbal communication across mammalian species.

Keywords:- Emotions, Canis familiaris, Human Perception, Emotion Perception.

I. INTRODUCTION

Emotions reflect the internal state of an organism and help another individual evaluate the internal state of a fellow organism. Emotions can be expressed and comprehended through variations in acoustic parameters of vocal signals, such as frequency (Tonality, Noise, Mean Pitch, and Frequency Modulation), amplitude (Relative Amplitude and Abrupt Onset), and duration (Pulse Duration and Pulse Repetition), as reported by Lord, Feinstein, and Coppinger (2009). According to Waldman (1972), an animal's emotional state can be analogous to human emotions. Morton (1977), based on bird and mammalian vocalizations, hypothesized that low-pitch (atonal) vocalizations signal aggressive intentions, in contrast to high-pitch (tonal) vocalizations signaling friendly and submissive intentions.

Dogs (Canis familiaris) are integral to many households and are aptly referred to as man's best friend. Earlier studies have shown that dogs emit acoustically different barks in different situations, suggesting that emotional changes in dogs are reflected in their barking vocalizations (Feddersen-Petersen, 2000; Yin, 2002; Pongracz et al., 2005). A similar study on the vocalization system of wolves revealed that acoustic features of vocalizations vary with intentions or internal state (Schassburger, 1993). Further, Linnankoski et al. (1994) reported that humans could identify the emotions of a macaque from its vocalizations. Pongracz et al. (2005) investigated the effect of acoustic parameters of dog barks on human listeners, and results revealed that barks recorded in different situations had distinctive acoustic patterns regarding harmonic-to-noise ratio, fundamental and peak frequency, and inter-bark intervals, conveying emotional information to human listeners.

Pongracz et al. (2005) proposed the possibility of a common mammalian heritage in the acoustic communication of emotions. The common mammalian heritage theory postulates that during the evolution of species from lower-order to higher-order organisms, the basic principles of acoustic processing schemes might also have been transferred.

All people (including children) make everyday judgments of emotions when listening to their fellow beings, even though they may be inexperienced. Humans may be using the same perceptual schemes for perceiving emotion intensity in non-human vocalizations (e.g., dog barks) that they use for perceiving emotion in human infant cries and in adult speech.

The present study conducted an auditory-perceptual experiment to explore humans' ability to identify emotions from dog barks. Under the perceptual analysis of emotions, the aim of the present study was threefold: (i) The primary aim was to investigate the influence of experience and gender in
the perception of four target emotions: stranger, anger, lonely, and play (happy). (ii) The secondary aim of the study was to find the order of identification of emotions, i.e., which emotions were perceived most and least accurately. (iii) The tertiary aim of the study was to determine the possible implication of the "common mammalian heritage" model in the perception of emotion using the present findings.

II. METHODS

A. Participants
Two groups of individuals participated in the study. Group 1 included twenty experienced listeners, ten males and ten females in the age range of 18-24 years (mean age = 21.0 years), who had a minimum of 2-3 years of experience with pet dogs of any breed. Group 2 included twenty non-experienced listeners, ten males and ten females in the age range of 18-24 years (mean age = 20.6 years), who did not have any experience with dogs.

B. Recording of Stimulus
A three-year-old female dog of the Indian Lesser Spitz (Canis familiaris) variety participated in the study. This breed's characteristic feature is that it is a watchdog known for excessive barking. Bark recordings were collected in four contexts at the owner's home. All barks were recorded directly by the experimenters. The four contexts were:
• Stranger: The experimenter, a stranger to the dog, arrived at the house gate in the owner's absence. The owner was asked to stay inside the house during bark recordings. Dog barking was elicited and recorded during the experimenter's appearance at the house's gate for 1-2 minutes.
• Lonely: The owner tied the leash of the dog to a tree at one corner of the house and walked away, out of sight of the dog. The recorder was kept at a distance of 0.5-1 meter, and dog barks were recorded for 1-2 minutes.
• Play: The owner was asked to play a usual game with the dog, such as catching a ball. The experimenter recorded the elicited barks at a distance of 0.5-1 meter for 1-2 minutes.
• Anger: The owner was asked to stand near the dog, and one of the experimenters pretended to hit the owner. The elicited barks were recorded at a distance of 0.5-1 meter for 1-2 minutes.

Recordings were made directly by the experimenters with the built-in omnidirectional microphone (frequency range 16 Hz-12,000 Hz) of a Sony Walkman digital media player (NWZ-E443, Sony Corp, China). During the recording of the barks, the distance between the dog and the recorder was constantly maintained at 0.5-1 meter, and barks emitted during each context were recorded for 1-2 minutes. Two visits were made on two different days, and two sets of recordings were made for each of the four contexts.

C. Auditory-Perceptual Experiment
Barks of 15 seconds that accurately represented the four target emotions (anger, stranger, lonely, and play) were selected perceptually by three experimenters of this study. Stimuli were assembled using Adobe Audition software version 3.0 (Adobe Systems, Incorporated) by adding 30 seconds of silence after each 15 seconds of bark as response time for the human listeners. Three bark sequences were made, each consisting of barks from the four different contexts arranged randomly.

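The sequence structure described above can be sketched in a few lines of code. The snippet below is only an illustration of that assembly logic (a 15-second bark excerpt followed by 30 seconds of silence, with the four contexts shuffled within each sequence); the study assembled the stimuli manually in Adobe Audition, and the Python libraries, file names, and sample rate used here are assumptions for illustration rather than part of the original procedure.

```python
# Illustration only: builds one 3-minute bark sequence per loop iteration,
# concatenating a 15 s excerpt from each context with 30 s of response
# silence, in random order. File names, sample rate, and the soundfile/
# numpy libraries are assumptions; the study did this manually in Audition.
import random
import numpy as np
import soundfile as sf

CONTEXTS = ["stranger", "lonely", "play", "anger"]
EXCERPT_S = 15      # seconds of bark kept per context
SILENCE_S = 30      # response window inserted after each excerpt

def build_sequence(seq_id: int, sr: int = 44100) -> None:
    """Write one randomized four-context bark sequence to a WAV file."""
    parts = []
    silence = np.zeros(SILENCE_S * sr, dtype="float32")
    for ctx in random.sample(CONTEXTS, k=len(CONTEXTS)):   # shuffle contexts
        bark, file_sr = sf.read(f"{ctx}_bark.wav", dtype="float32")
        if bark.ndim > 1:                 # mix any stereo recording to mono
            bark = bark.mean(axis=1)
        assert file_sr == sr, "excerpts are assumed to share one sample rate"
        parts.append(bark[: EXCERPT_S * sr])
        parts.append(silence)
    sf.write(f"bark_sequence_{seq_id}.wav", np.concatenate(parts), sr)

for i in range(1, 4):                     # three sequences, as described above
    build_sequence(i)
```
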
The bark sets were copied onto a compact disc (CD) and played on a computer via Philips SHP2000 headphones (frequency response: 15-22,000 Hz) to the participants. The listeners had to listen to each bark set and identify the context in which it was recorded. The listeners' responses were collected in a four-alternative forced-choice condition in which they had to put a tick mark (√) in the box corresponding to the context. The experimenter handled the player. The bark sequence sets were played back individually to the listeners, who were allowed to listen to each set only once. However, a particular context bark from a bark set could be played back repeatedly at a listener's request. After listening to the 3 bark sets, the listeners were asked to mention, in an open-ended manner, the unique features that distinguished one emotion from the others.

III. RESULTS

The results of the present study are discussed under four headings: (a) Experienced versus non-experienced listeners, (b) Experienced males versus experienced females, (c) Non-experienced males versus non-experienced females, and (d) Order of identification of emotions.

A. Experienced versus Non-Experienced Listeners
Experienced listeners identified all four emotions more correctly than non-experienced listeners. The emotion 'anger' was identified correctly by almost all experienced listeners, followed by the emotions 'stranger' and 'lonely.' The emotion 'play' was identified least often by the experienced listeners, though it was still correctly identified by 75% of the participants. Table 1 shows the percentage of correct identification of emotions by experienced versus non-experienced listeners.

Table 1 Percentage of Experienced and Non-Experienced Listeners Who Gave >50% of Correct Identification of Emotions
Emotions    Group 1 (%)   Group 2 (%)   Z        p
Stranger    90            30            3.8730   0.0001*
Lonely      80            40            2.5830   0.0098*
Play        75            70            0.3541   0.7233
Anger       100           80            2.1082   0.0350*

A test of equality of proportions was conducted to assess the statistical significance of the difference in correct identification of emotions (>50%) between the two groups. The statistical analysis revealed that the experienced listeners identified the emotions more accurately than non-experienced listeners. This was significant at the 0.05 significance level for the emotions 'stranger,' 'lonely,' and 'anger.' There was no statistically significant difference between the experienced and non-experienced listeners for the emotion 'play.'

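The Z and p values in Table 1 (and in Tables 2 and 3 below) appear to be consistent with a pooled two-sample test of equality of proportions applied to the number of listeners above the 50% criterion (out of 20 per group, or 10 per gender subgroup). The sketch below is a minimal illustration of that test, not the authors' analysis code; the function name and the use of scipy for the normal tail probability are assumptions.

```python
# Illustrative sketch: pooled two-proportion z-test of the kind reported in
# Tables 1-3. The counts below are the 'stranger' row of Table 1: 18/20
# experienced vs. 6/20 non-experienced listeners above 50% correct.
from math import sqrt
from scipy.stats import norm  # normal distribution for the two-sided p-value

def proportions_ztest_pooled(x1: int, n1: int, x2: int, n2: int):
    """Two-sided z-test for equality of two proportions (pooled variance)."""
    p1, p2 = x1 / n1, x2 / n2
    p_pool = (x1 + x2) / (n1 + n2)                   # common proportion under H0
    se = sqrt(p_pool * (1 - p_pool) * (1 / n1 + 1 / n2))
    z = (p1 - p2) / se
    p_value = 2 * norm.sf(abs(z))                    # two-sided p-value
    return z, p_value

z, p = proportions_ztest_pooled(18, 20, 6, 20)       # 'stranger': 90% vs 30%
print(f"z = {z:.4f}, p = {p:.4f}")                   # ~3.8730, ~0.0001
```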

B. Gender Effect on Experienced Listeners
Experienced female listeners identified two of the target emotions, 'stranger' and 'play,' more correctly than experienced male listeners. On the other hand, the emotions 'lonely' and 'anger' were identified similarly by experienced males and experienced females. The equality of proportions test results revealed no statistically significant difference between experienced males and females in correctly identifying the four target emotions. Table 2 shows the percentage of experienced male and female listeners who correctly identified the four target emotions.

Table 2 Percentage of Experienced Male and Female Listeners Who Gave >50% of Correct Identification of Emotions
Emotion     Males (%)   Females (%)   Z        p
Stranger    80          100           1.4907   0.1360
Lonely      80          80            0        1
Play        70          80            0.5164   0.6056
Anger       100         100           -        -

C. Gender Effect in Non-Experienced Listeners
Non-experienced female listeners correctly identified two of the target emotions, 'stranger' and 'lonely,' unlike non-experienced male listeners. On the other hand, the emotion 'play' was identified correctly by more non-experienced male listeners than non-experienced female listeners. The equality of proportions test results revealed no statistically significant difference between non-experienced males and females in correctly identifying the four target emotions. Table 3 shows the percentage of non-experienced male and female listeners who correctly identified the four target emotions.

Table 3 Percentage of Non-Experienced Male and Female Listeners Who Gave >50% of Correct Identification of Emotions
Emotion     Males (%)   Females (%)   Z        p
Stranger    20          40            0.9759   0.3291
Lonely      20          60            1.8257   0.0679
Play        80          60            0.9759   0.3291
Anger       80          80            0        1

D. Order of Identification of Emotions
The order in which the emotions were identified, from most to least accurately, was evaluated using the non-parametric Friedman test among experienced and non-experienced listeners. The Friedman test revealed a significant difference in the ranks of identification of the emotions for both groups. Table 4 shows the results of the Friedman test for the order of identification of emotions (mean ranks are given in parentheses).

Table 4 Friedman Test Results for Rank of Identification of the Emotions
Ranks                        1              2                3               4                p
Experienced listeners        Anger (3.08)   Stranger (2.50)  Play (2.23)     Lonely (2.20)    0.001*
Non-experienced listeners    Anger (3.35)   Play (2.63)      Lonely (2.13)   Stranger (1.90)  0.000*

Table 4 indicates that the order of identification of emotions from best to least was anger, stranger, play, and lonely for experienced listeners, and anger, play, lonely, and stranger for non-experienced listeners. The Wilcoxon signed rank test was also done to find out which emotion pairs were identified better.

Table 5 shows the Wilcoxon signed rank test results for the pair-wise comparison of emotions for experienced and non-experienced listeners. The Wilcoxon signed rank test revealed that the emotion pairs 'anger-stranger,' 'anger-lonely,' and 'anger-play' differed significantly (p<0.05) among experienced listeners. The pairs 'play-stranger,' 'anger-stranger,' 'anger-lonely,' and 'anger-play' showed significant identification differences among non-experienced listeners (p<0.05).

Table 5 Results of Wilcoxon Signed Rank Test for Pair-Wise Comparison
                     Experienced listeners    Non-experienced listeners
Emotion pairs        Z         p              Z         p
lonely - stranger    -1.633    .102           -1.026    .305
play - stranger      -.682     .495           -2.553    .011*
anger - stranger     -2.232    .026*          -3.625    .000*
play - lonely        -.736     .461           -1.586    .113
anger - lonely       -2.716    .007*          -2.819    .005*
anger - play         -2.558    .011*          -2.252    .024*

Listeners were asked to describe, in an open-ended fashion, the perceptual characteristics of each bark that cued them to differentiate one type of emotion from the other. The listeners described the "stranger" bark as a low-pitch bark and the "anger" bark as having short inter-bark intervals. The "play" and "lonely" barks were described as high-pitch barks, of which the "lonely" bark had longer inter-bark intervals and higher tonality than the "play" bark.

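For readers who want to reproduce this style of analysis, the sketch below shows how a Friedman test across the four emotions (as in Table 4) and a pair-wise Wilcoxon signed-rank test (as in Table 5) can be run with scipy. The per-listener scores in the sketch are fabricated placeholders, and the scoring scheme (correct identifications per emotion across the three bark sequences) is an assumption for illustration rather than a detail taken from the paper.

```python
# Illustrative sketch: Friedman test across the four emotions and a
# Wilcoxon signed-rank test for one emotion pair, in the spirit of
# Tables 4 and 5. The scores below are fabricated per-listener correct
# identification counts (0-3, one per bark sequence), not the study data.
import numpy as np
from scipy.stats import friedmanchisquare, wilcoxon

rng = np.random.default_rng(0)
n_listeners = 20
# Hypothetical scores: rows = listeners; columns = anger, stranger, play, lonely
scores = np.column_stack([
    rng.integers(2, 4, n_listeners),   # anger: mostly identified
    rng.integers(1, 4, n_listeners),   # stranger
    rng.integers(1, 3, n_listeners),   # play
    rng.integers(0, 3, n_listeners),   # lonely
])

# Friedman test: do identification scores differ across the four emotions?
chi2, p_friedman = friedmanchisquare(*scores.T)
print(f"Friedman chi-square = {chi2:.3f}, p = {p_friedman:.3f}")

# Wilcoxon signed-rank test for one pair, e.g. anger vs. play
# (with tied/zero differences, scipy falls back to a normal approximation)
stat, p_pair = wilcoxon(scores[:, 0], scores[:, 2])
print(f"anger vs. play: W = {stat:.1f}, p = {p_pair:.3f}")
```
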

IV. DISCUSSION

The results of the present study indicated that experienced listeners could identify the emotions 'anger,' 'stranger,' and 'lonely' (negative emotions) better than non-experienced listeners. This finding is in agreement with the results of the Pongracz et al. (2005) study, where people with different experiences with dogs were asked to describe the emotional content of several artificially assembled bark sequences based on five emotional states (aggressiveness, fear, despair, playfulness, and happiness). Pongracz et al. reported that experienced dog owners could identify the target emotions more correctly than the non-experienced. Also, the findings of the present study are in consonance with earlier findings of behavioral and neuro-imaging studies on parents' and non-parents' perception of emotions from human infant cries, the results of which reveal that parents perform better than non-parents (Green et al., 1987; Seifritz et al., 2003).

The identification of the emotion 'play' (a positive emotion) was unaffected by the listeners' experience. According to the most widely held view, the right hemisphere dominates the left hemisphere in the perception and expression of emotions (Strauss & Moscovitch, 1981; Campbell, 1978; Chaurasia & Goswami, 1975; Safer, 1981). Seifritz et al. (2003) found that in the right amygdala and interconnected limbic regions, non-parents showed relatively stronger activation for positive emotions, and parents showed relatively stronger activation for negative emotions.

Among the experienced and non-experienced listeners, regardless of gender, 'anger' was the best-identified emotion. This may be better understood because 'anger' is an expression signaling the hostile intentions of the perceived person (Hortsmann, 2003). Therefore, it is possible that in human evolutionary history, perceiving anger and preparing for a possible attack was more profitable than underestimating signals of potential danger and not anticipating the attack. In contrast, underestimating an expression of happiness would not have such negative consequences (Biele et al., 2006).

Statistically significant gender differences were not evident in the identification of emotions in the present study. Similar findings were reported by Westbrook (1976) in the perception of emotions by 49 males and 51 females, with no gender difference. Leger et al. (1996) reported no gender difference in human infant cry perception in adults. It should be admitted, however, that in some studies on human emotion perception, gender differences were observed, suggesting that gender effects are dependent upon procedural variables that can influence subjects' performance (Biele et al., 2006).

Earlier studies on human infant cry perception in adults have demonstrated that adults can discriminate between cries emitted in widely discrepant circumstances (e.g., birth and pain cries) (Wasz-Hockert et al., 1968). However, Gustafson and Harris (1990) reported poor performance in perceiving more closely related ones (i.e., hunger vs. pain cries). Similar findings were found in the present study, where the listeners could discriminate between barks with discrepant characteristics but not between those with closely related characteristics.

Results of the present study hint that humans can perceive emotions from the vocalizations of mammals based on perceptual-acoustic characteristics. This was evident from the earlier studies on infant cry perception. Vocalizations of mammals (human infants, dogs, wolves, and macaques) produced in different circumstances have been shown in earlier studies to have different acoustic characteristics. All these may indicate that we might use the same 'perceptual processing schemes' for perceiving emotions in various vocalizations. Also, the role of a "common mammalian heritage" in acoustic communication, as mentioned by Pongracz et al. (2005), is evident, especially in the perception of emotions.

The unique cues described by both the experienced and non-experienced listeners were in accordance with Morton's structural-motivational rules (1977): pitch, inter-bark intervals, and tonality were strong cues for the perception of emotions in dog barks by human listeners.

V. CONCLUSION

The results showed that communication between humans and dogs is based on the basic principles of mammalian communication (perceptual processing schemes) and follows Morton's structural-motivational rules. Interestingly, the results generally aligned with earlier studies on infant cry perception and animal communication, emphasizing the role of a "common mammalian heritage" in the acoustic communication of emotions.

ACKNOWLEDGMENT

The authors would like to thank the management of their parent institutes and all the participants for facilitating this study at all junctures.

REFERENCES

[1]. Biele, C., & Grabowska, A. (2006). Sex differences in perception of emotion intensity in dynamic facial expression. Experimental Brain Research, 171(1), 1-6.
[2]. Campbell, R. (1978). Asymmetries in interpreting and expressing a posed facial expression. Cortex, 19, 327-342.
[3]. Chaurasia, B. D., & Goswami, H. K. (1975). Functional asymmetry in the face. Acta Anatomica, 91, 154-160.
[4]. Feddersen-Petersen, D. (2000). Vocalization of European wolves (Canis lupus lupus L.) and various dog breeds (Canis lupus f. familiaris). Archiv für Tierzucht/Archives Animal Breeding, 43(4), 387-397.
[5]. Green, J. A., Jones, L. E., & Gustafson, G. E. (1987). Perception of cries by parents and non-parents: Relation to cry acoustics. Developmental Psychology, 23, 370-382.
[6]. Gustafson, G. E., & Harris, K. L. (1990). Women's responses to young infants' cries. Developmental Psychology, 26, 144-152.
[7]. Hopkins, B. (2000). Development of crying in normal infants: Method, theory and some speculations. In: Crying as a sign, a symptom, & a signal: Clinical emotional and developmental aspects of infant and toddler crying (Barr, R. G., Hopkins, B., et al., Eds.), pp. 176-209. New York, NY, US: Cambridge University Press.
[8]. Hortsmann, G. (2003). What do facial expressions convey: Feeling states, behavioral intentions or action requests. Emotion, 3(2), 150-166.
[9]. Leger, D. W., Thompson, R. A., Merritt, J. A., & Benz, J. J. (1996). Adult perception of emotion intensity in human infant cries: Effects of infant age and cry acoustics. Child Development, 67, 3238-3249.
[10]. Linnankoski, L., Laakso, M., Aulanko, R., & Leinonen, L. (1994). Recognition of emotions in macaque vocalizations by children and adults. Language and Communication, 14, 183-192.
[11]. Lord, K., Feinstein, K., & Coppinger, R. (2009). Barking and mobbing. Behavioral Processes, 81, 358-368.
[12]. Safer, M. A. (1981). Sex and hemisphere differences in access to codes for processing emotional expressions and faces. Journal of Experimental Psychology: General, 110, 86-100.
[13]. Schassburger, R. M. (1993). Vocal communication in the Timber Wolf, Canis lupus Linnaeus: Structure, Motivation, and Ontogeny. Berlin, Germany: Paul Parey Scientific Publishers.
[14]. Scheiner, E., Hammerschmidt, K., Jurgens, U., & Zwirner, P. (2002). Acoustic analysis of developmental changes and emotional expression in the pre-verbal vocalizations of infants. Journal of Voice, 16(4), 509-529.
[15]. Seifritz, E., Esposito, F., Neuhoff, J. F., Luthi, A., Mustovic, H., Dommann, G., Bardeleben, U. V., Radue, E. W., Cirillo, S., Tedeshi, G., & Di Salle, F. (2003). Differential sex-independent amygdala response to infant crying and laughing in parents versus non-parents. Biological Psychiatry, 54, 1367-1375.
[16]. Strauss, E., & Moscovitch, M. (1981). Perceptual asymmetries in processing facial expression and facial identity. Brain and Language, 13, 308-322.
[17]. Simonov, V. P. (1986). The Emotional Brain: Physiology, Neuroanatomy, Psychology & Emotion. New York: Plenum Press.
[18]. Leont'ev and Sudakov (1978). The Great Soviet Encyclopedia (3rd ed.).
[19]. Morton, E. S. (1977). On the occurrence and significance of motivation-structural rules in some bird and mammal sounds. American Naturalist, 111, 855-869.
[20]. Pongrácz, P., Molnár, C. S., Miklósi, A., & Csányi, V. (2005). Human listeners are able to classify dog barks recorded in different situations. Journal of Comparative Psychology, 119, 136-144.
[21]. Waldman, R. (1972). Cited in V. P. Simonov (1986). The Emotional Brain: Physiology, Neuroanatomy, Psychology & Emotion. New York: Plenum Press.
[22]. Wasz-Höckert, O., Lind, J., Vuorenkoski, V., Partanen, T., & Valanné, E. (1968). The Infant Cry: A Spectrographic and Auditory Analysis. Clinics in Developmental Medicine No. 29. London: Spastics International Medical Publications.
[23]. Westbrook, M. (1976). Sex differences in the perception of emotion. Journal of Consulting and Clinical Psychology, 26(2), 139-146.
[24]. Yin, S. (2002). A new perspective on barking in dogs. Journal of Comparative Psychology, 116, 189-193.
