Color-Emotion Associations in Art: Fuzzy Approach


Muragul Muratbekova∗, Pakizar Shamoi†
School of Information Technology and Engineering
Kazakh-British Technical University
Almaty, Kazakhstan
Email: ∗mu muratbekova@kbtu.kz, †p.shamoi@kbtu.kz

arXiv:2311.18518v1 [cs.CV] 30 Nov 2023

Abstract—Art objects can evoke certain emotions. Color is a fundamental element of visual art and plays a significant role in how art is perceived. This paper introduces a novel approach to classifying emotions in art using fuzzy sets. We employ a fuzzy approach because it aligns well with the imprecise and subjective nature of human judgments. Extensive fuzzy colors (n=120) and a broad emotional spectrum (n=10) allow for a more human-consistent and context-aware exploration of the emotions inherent in paintings. First, we introduce the fuzzy color representation model. Then, at the fuzzification stage, we process the WikiArt Dataset of paintings tagged with emotions, extracting fuzzy dominant colors linked to specific emotions. This results in fuzzy color distributions for ten emotions. Finally, we convert them back to a crisp domain, obtaining a knowledge base of color-emotion associations in primary colors. Our findings reveal strong associations between specific emotions and colors; for instance, gratitude strongly correlates with green, brown, and orange. Other noteworthy associations include brown with anger, orange with shame, yellow with happiness, and gray with fear. Using these associations and Jaccard similarity, we can find the emotions in an arbitrary untagged image. We conducted a 2AFC experiment involving human subjects to evaluate the proposed method. The average hit rate of 0.77 indicates a significant correlation between the method's predictions and human perception. The proposed method is simple to adapt to art painting retrieval systems. The study contributes to the theoretical understanding of color-emotion associations in art, offering valuable insights for practical applications beyond art, such as marketing, design, and psychology.

Index Terms—Fuzzy sets, emotions in art, color palette, classification, color-emotion model, art image analysis, color perception, image processing, emotion detection.

I. INTRODUCTION

Nowadays, an increasing amount of affective information is distributed worldwide in all formats [1]. The elements that contribute to the sensory experience of audio and visual content include acoustic quality, facial expressions and gestures, background music, ambient noise, color schemes, and image filters. It is crucial to assess incoming data, including emotional reactions, objectively [2], [3].

Color is a fundamental element of visual art and plays a major role in the emotional responses induced by an art image. It is important to classify art not just in terms of age or style but also in terms of the emotions it creates in a viewer. This can facilitate the identification of art expressing similar emotions and has a potential application in context-based image retrieval systems.

Emotion detection aims to determine the presence and intensity of emotion as close as possible to human perception. We need a machine that can objectively perceive objects despite subjective color perception [4].

The relationship between color and emotion has been studied in various areas, ranging from psychology to art theory. The colors we see can greatly impact our emotions. Researchers such as J.H. Xin have found that the lightness and chroma of colors are more influential than the hue itself [5]. Other studies, including one conducted by Banu Manav [6], have also shown that our emotional response to colors can vary based on their lightness and saturation. In a related experiment, the researchers in [7] explored how changes in valence and arousal affected emotional response. They were able to accurately map four different emotions to five parameters (lightness, chroma, hue, valence, and arousal) with an accuracy rate of 80%.

Currently, there are several gaps in research on this topic. Technically, there is a limitation in recognizing and comprehending pictures. Aesthetically, there is no apparent correlation between the information conveyed by a picture and the response to it.

The interaction between different emotions is still poorly understood, and the studies conducted so far have been limited to a handful of emotions (usually 4 to 6), because emotions are difficult to represent using a specific metric. Moreover, it is important to shift the focus from the content of paintings to their forms. However, it seems impossible to achieve universal calculations in this regard.

It is crucial to acknowledge that various factors influence perception and should be evaluated in different contexts. In this paper, our focus is on art, and we present a technique to extract the emotional color palette from art objects by using fuzzy sets. This method is particularly useful because color ambiguity and perception are both subjective and imprecise by nature. Traditional color-emotion studies have primarily relied on basic color categories, which may not fully capture the emotional experiences evoked by art.

The current paper proposes an emotion classification approach for art objects using a fuzzy sets and logic approach (see Fig. 1). It matches colors in the art domain to the spectrum of human emotions. The fuzzy approach is well-suited for color and emotion detection: how people perceive color and emotion is inherently subjective and differs from person to person. The fuzzy approach can effectively represent this imprecision in the perception of color-emotion associations in a way that classical binary logic might struggle to capture. As can be seen in Fig. 1, we aim to bridge the gap
between low-level features, specifically colors, and high-level emotion classifications (e.g., happy, fearful, etc.), utilizing a comprehensive set of bridging tools, including fuzzy sets and logic, the Russell emotion model, Bundesen's theory of visual attention, and color theory.

Fig. 1. Bridging the semantic gap between low-level features in art objects and high-level semantic concepts of emotions. Lyonel Feininger, "Carnival in Arcueil" painting.

The contributions of this paper are the following:
• Enhanced emotional spectrum. We consider a broad range of emotions (N=10) with contextual relevance in art.
• Increased granularity. Basic colors provide a general understanding of color-emotion relationships, but by using a wider array of fuzzy colors (N=120), our model imitates human perception of colors. This enables a richer exploration of emotions and their connections to specific color variations and combinations.
• Rich color expressiveness of emotions. We employ 10-color palettes to better depict the complexity inherent in art images, and we also consider color proportion. Most studies concentrate on emotions within two- or three-color palettes.
• Proof of concept. We also provide a method for automatically tagging art objects in art gallery systems for emotion-based image retrieval.

The paper has the following structure. Section I is this Introduction. Section II presents a review of prior research on color-emotion associations. Section III details the various methods we use in this study, including fuzzy sets and logic and the theory of visual attention, along with the data collection and the details of the proposed approach. Section IV showcases the sample application and experimental results. Next, the Discussion is presented in Section V. Finally, Section VI concludes the paper and offers recommendations for future enhancements to the methodology.

Fig. 2. Lee J. Emotional classification of color images [8].

II. RELATED WORK

A. Color-Emotion Association Studies

This section presents an overview of literature related to emotion detection in art. Several studies have examined color-emotion associations in various contexts, including art. Prior work can be divided into three main directions: context dependency investigation, machine learning algorithms, and the fuzzy approach.

The earliest studies were highly interested in the subjectivity of human perception. Color perception depends on factors such as sex, nationality, culture, etc. The authors of [9] set out to determine to what extent human perception is influenced by cultural background. They compared the emotional responses of people in 7 different regions of Europe and Asia and found that, regardless of national differences, people of different cultures reacted to colors quite similarly. So, cultural context has very limited influence on human perception; rather, the human response to color is mainly influenced by lightness, chroma, and hue.
Fig. 3. Kang's study color image scale [14].

Several studies have shown that the aesthetics of art objects depends on their texture, cultural context, and the viewer's emotional state [10], [11]. The work proposed in [11] uses Kansei engineering to translate feelings into specific parameters for frequently encountered subjects in aesthetics and emotion. The paper [12] conducted several psychological experiments to evaluate how perception differs for single-color and multiple-color paintings. The authors used different scale types and concluded that the emotional response to color is more universal. Two more critical factors influence the outcome of emotional perception: order and learning. The first factor involves components like symmetry, rhythm, repetition, and contrast. The second factor, learning, highlights that individuals tend to perceive something as aesthetically pleasing when it aligns with their previous experiences [13].

From the perspective of the proposed approach, previous works can be divided into machine learning algorithms and fuzzy logic approaches.

Advanced search engines are mostly based on text, keywords, or face recognition algorithms. The authors of [15], [16] proposed to use color as a new metric for image retrieval and classification. The work of [17] extended emotion recognition methods by adding the relationship between neighboring emotions. Their algorithm considers that emotion and transitions between emotion regions may influence human perception.

The study [8] provided several classification methods for emotions, such as decision trees, support vector machines, and Naive Bayes. Additionally, it stressed the importance of selecting emotional adjectives based on four criteria: universality, distinctiveness, utility, and comprehensiveness (see Fig. 2). From the viewpoint of affective computing, it is currently understood that emotional states can only be described explicitly and not implicitly. The primary objective of affective computing is to identify and classify emotions accordingly. Several approaches have been reported in the literature: knowledge-based, statistical, and hybrid [1]. Some of them lack semantic strength.

In their theory, Kobayashi created a graph to identify the combination of most similar colors and determine the leading emotion from it [18]. The work proposed in [14] uses Kobayashi's color theories to extract complex emotions from paintings by adding a grayish/clear factor for improved accuracy. However, with many different emotions present in a picture, the probability of error increases. Figure 3 illustrates this approach.

Fig. 4. Russell emotional model [19].

The authors of another study designed a machine-learning algorithm for emotion classification in images, utilizing color information and categorizing emotions into three distinct classes: negative, positive, and neutral [20]. Another study proposed the L-EEP method, utilizing Culture Technology, to automatically extract eight emotions from Iranian-Islamic paintings through color analysis and the automated Luscher test [21]. Recent work proposed a network for recognizing emotions in images, utilizing a weakly supervised approach to learn emotion intensity [22].

Fuzzy logic has been used to address these concerns, specifically targeting diverse contexts such as gender, age, ethnic group, and environmental factors when computing the outcome of an emotional assessment. Several works stressed the importance of the fuzzy sets approach for imitating human perception of high-level color concepts:
• Researchers introduced fuzzy color spaces defined using RGB and Euclidean distance, with examples from the well-known ISCC-NBS color naming system [23].
• Other works introduced fuzzy granular colors constructed by consolidating fuzzy colors that share semantic relationships within a specific color category [24], and a method centered on images for constructing fuzzy color spaces within defined contexts [25].
• Nachtegael introduced a formula for calculating the similarity of fuzzy color histograms [26].
• In another work, the connection between colors and emotions was investigated by employing fuzzy feature vectors derived from fuzzy color histograms and emotion vectors obtained through surveys [4].
• Another model employed fuzzy logic and the L*a*b* color space for emotional semantic queries. The generation of semantic terms was facilitated by applying a fuzzy clustering algorithm [27].
• Recent research by [?] used fuzzy logic to explore the universality of color palettes concerning specific associations. The study revealed that color harmony demonstrates a universal aspect within specific contexts. Generally, certain impressions maintain universal color palettes across diverse settings, while others are context-dependent. There are universally accepted color palettes noted for their considerable aesthetic appeal and several universally perceived color associations. The perception of aesthetics is partly general and partly domain-specific.

As we can see from the above, the level of interest in this field appears to be substantial. However, there are also some technical and semantic limitations in this research. One important constraint on most works discussed in this area is the limited number of colors and their crisp nature, which is in contrast to human color perception. Some of the experiments were conducted without a validated dataset, and psychological experiments were conducted without control of participants' conditions.

TABLE I
DISTRIBUTION OF EMOTIONS IN DATASET

Emotion   | Dataset | Image count
happiness | 50%     | 773
love      | 50%     | 160
anger     | 50%     | 21
sadness   | 50%     | 159
gratitude | 50%     | 8
fear      | 50%     | 202
shame     | 50%     | 5
surprise  | 50%     | 532
shyness   | 30%     | 6
trust     | 50%     | 358

B. Models of Emotion

Several emotion models have been proposed to describe and categorize human emotions. Here are a few well-known examples: Russell's Circumplex Model [19], Ekman's Six Basic Emotions [28], Plutchik's Wheel of Emotions, the James-Lange Theory [29], the Cannon-Bard Theory [30], [31], Barrett's Conceptual Act Theory [32], the Geneva Emotion Wheel [33], the Schachter-Singer Two-Factor Theory [34], and Mehrabian's PAD Model [35].

In our research, we use Russell's emotional model. Russell's Circumplex Model is a two-dimensional model organizing emotions along two primary dimensions: valence (pleasure-displeasure axis) and arousal (activation-deactivation axis) [19]. According to Russell's approach, emotions do not exist in discrete categories but along continuous dimensions. This is consistent with the fuzzy approach, which allows for the fuzziness and transitionality of emotional borders. Color and emotion relationships in art may be difficult to categorize using rigid boundaries. Thus, any emotion can be, to some extent, pleasant/unpleasant and active/passive.

The model was developed based on subjective feelings [19]. Participants were asked to group 28 emotion words based on perceived similarity, and a statistical technique was used to group related words.

As you can see in Figure 4, a set of emotions is distributed on the wheel. We use some of them as tags for art paintings.

III. METHODS

In this paper, we propose a classification system that takes an art piece and predicts the emotional response it can generate. We aim to enhance current image emotion classification and retrieval algorithms by utilizing fuzzy logic. Our approach comprises several stages. First, we identify the most relevant emotions within the art domain and acquire a relevant dataset. We then process the images by smoothing and normalizing them. Subsequently, we retrieve color palettes from the prepared data and map them to image emotion tags. To increase the precision of the mapping process, we convert the RGB color palettes into fuzzy palettes.

A. Dataset

To proceed, we must first locate a set of paintings that have been previously tagged with specific emotions. This step can be divided into two parts: training and testing. We use the WikiArt Emotions Dataset. This resource contains works from 195 artists, with 42,129 images for training and 10,628 images for testing [36]. The dataset classifies 20 emotions into three categories: positive, negative, and other/mixed.

The images in this dataset were manually tagged with the emotions they evoke in humans. At least ten annotators annotated each work of art. There are three sets of annotations, namely, based on the title, the image, and both combined (image and title). For our study, we selected the annotations based on the image only (without a title), because our analysis is based on colors in the image, ignoring any associated text, including author, title, and genre.

The labeled dataset is divided into three variants according to aggregation thresholds of 30%, 40%, and 50%, respectively: a label is selected if at least that share of the responses (e.g., three out of every ten respondents at the 30% threshold) indicates that a particular emotion is applicable. The authors consider 40% a generous aggregation criterion; nonetheless, we selected the 50% variant for nine emotions and the 30% variant for the emotion of shyness, which contains only one image in the 50% variant (see Table I).

Using this dataset and comparing it with Russell's emotional model, we chose the main emotions for our model. As you can see in Table I, these are the ten most relevant emotional words and their distribution in the dataset. Sample art images from the WikiArt Emotions Dataset are presented in Fig. 5.

The dataset already has some annotations. As mentioned before, we can compare against them to determine how closely the proposed method corresponds to human emotion recognition.

Fig. 5. WikiArt Emotions Dataset [36]. Sample images.

B. Theory of Visual Attention

Color perception involves categorizing colors through visual recognition and attentional selection [37], [38].

Let us go over the main ideas from the theory of visual attention. Bundesen introduces three types of perceptual categories
in his work: color, shape, and location categories [38]. Our model is primarily based on color analysis. Using Bundesen's notation, colors represent emotions as perceptual categories.

According to Bundesen's theory of visual attention (TVA), perceptual categorization involves identifying whether an input element x belongs to a certain category i. This is expressed as E(x, i), with E(x, i) ∈ [0, 1] representing the certainty of x belonging to i. The set of all perceptual units is denoted by S, and the set of all categories is denoted by R.

The concept of saliency refers to the strength of sensory evidence that suggests an element belongs to a perceptual category. This is denoted by η(x, i), where x is the element and i is the perceptual category. Additionally, each category is assigned a pertinence value, denoted by π_i, representing its importance for a specific task T. We can calculate the attentional weight of x using pertinence and saliency values:

ω_x = Σ_{i∈R} η(x, i) π_i    (1)

It is important to note that in Equation (1), only categories that have positive pertinence values have an impact on ω_x. When it comes to visual attention, two mechanisms are often used: filtering and pigeonholing. Filtering refers to the process of selecting an item x from a set of items S that belongs to a particular category i. For example, a client might be searching for a dark and dramatic art object, and filtering can be used to narrow down the art catalog to show only those objects that fit this description. Pigeonholing, on the other hand, refers to the process of identifying a suitable category i for a given item x from a set of categories R. In other words, if we have an item, we can use pigeonholing to determine its category. For example, a manager of an art gallery may collect artists' or experts' judgments on some art object and then categorize it as being, e.g., happy, romantic, or fearful.

C. Fuzzy Sets Theory

Emotion ambiguity and the small number of emotion classes compared to the richness of human feelings are among the difficulties in distinguishing emotions [40]. The fuzzy approach represents a valuable method for dealing with imprecision, and it has been utilized in research to represent emotions and their intensity in audio, speech, and video [41]–[44], color image [?], [4], [8], [27], [45], [46], text [47]–[50], face [42], [43], [51], [52], and product [53], [54] emotion analysis.

Fuzzy colors are suitable for contextual relevance: they are often associated with specific contexts or cultural domains, which greatly influence emotional responses. Incorporating a fuzzy sets approach that accounts for cultural, contextual, and individual variations in color-emotion associations leads to a more comprehensive analysis. The next section presents fundamental concepts from fuzzy theory utilized in our study.

1) Fuzzy Sets: Fuzzy sets, first introduced by Zadeh [55], allow membership degrees to be indicated by a number between 0 and 1. In Boolean logic, we only deal with the pair of values {0, 1}. However, when it comes to fuzzy sets, we consider all the numbers within the range [0, 1]. Membership in a fuzzy set A is expressed by a membership function (MF), denoted µ_A(x).

MFs are mathematical techniques for modeling the meaning of symbols by indicating flexible membership in a set. We can use them to represent uncertain concepts like age, performance, building height, etc. Therefore, an MF's key function is to convert a crisp value to a membership level in a fuzzy set. The triangular and trapezoidal MFs used in our model are defined in Section III-E.

2) Linguistic Variables: Zadeh [56] defined a linguistic variable as a variable whose values are words or sentences in a natural or artificial language rather than numbers. For instance, the label 'deep' can be considered a linguistic value of the variable 'Color Intensity,' similar to a number but with less precision. A linguistic variable's set of all linguistic values is called a term set.

3) Fuzzy Hedges: There are two types of modifiers, or hedges: reinforcing and weakening modifiers. These modifiers are used to either strengthen or weaken a statement. The hedge "very" represents the reinforcing modifier:

t_very(u) = u²    (2)

Weakening modifiers are the second type. For example, "more-or-less" is a hedge that indicates a degree of uncertainty:

t_more-or-less(u) = √u    (3)

Additionally, the hedge "not" can be expressed as:

t_not(u) = 1 − u    (4)

Hedges can be applied several times. For example, "not very good performance" is an example of a combined hedge consisting of the two atomic hedges not and very.

4) Fuzzy Operations: In fuzzy logic, the α-cut (alpha cut) is a crisp set that contains all the members of a given fuzzy subset f whose membership values are greater than or equal to a certain threshold α (which ranges from 0 to 1). Mathematically, we can represent the α-cut of the fuzzy subset f as:

f_α = {x : µ_f(x) ≥ α}

We can use α-cuts to perform set operations on fuzzy sets. For instance, given two fuzzy sets A and B, we can obtain their union and intersection by taking the union and intersection of their respective α-cuts. That is:

(A ∪ B)_α = A_α ∪ B_α,  (A ∩ B)_α = A_α ∩ B_α

Fuzzy logic is suitable for coping with subjective metrics, such as emotions, and broad concepts, like color, because it is consistent with human perception [?]. Initially, we get the color palette in RGB format; then we translate it into HSI and fuzzy HSI (FHSI).

Fig. 6. Illustration of the HSI color space [39].

D. HSI Color Model

Because of the high correlation between RGB color space attributes and their failure to replicate human color perception, other color spaces, such as HS* (HSL, HSI, HSV), are favored for modeling human color perception [37], [57].

Three attributes, hue (H, such as yellow, cyan, or blue), intensity (I, pale vs. dark), and saturation (S, saturated vs. dull), are used in the HSI paradigm to express colors (see Fig. 6). As a result, it is more in line with how colors are perceived by the human visual system, which is predicated on three interpretations of color: brightness, purity, and category. All three interpretations correspond directly to HSI.

The hue describes the color in terms of an angle between 0 and 360 degrees. The saturation describes how much the color is mixed with light, with a range between 0 and 100. The intensity component ranges from 0 to 255, where 0 represents black and 255 represents white.

Fig. 7. Proposed approach: fuzzy color palette extraction.

E. Proposed Approach

The main idea of the proposed approach can be seen in Fig. 7. First, we process a dataset comprising art images. We extract fuzzy dominant colors and their frequencies associated with certain emotions for each image in the dataset. As a result, we get a fuzzy color distribution for each of the ten emotions. Finally, we convert them back to a crisp domain, obtaining color-emotion associations in the form of basic colors.

We employ a fuzzy approach because it enables us to evaluate emotions in a way that is consistent with human judgment. We obtain a more logical and human-consistent output by partitioning the set of potential emotions into subsets corresponding to linguistic tags [58]. Similarly, fuzzy sets make it simple to show how an emotion gradually changes from one label to another [56].

In the theory of visual attention, an art image is represented by the variable x, while S represents a database of all images. The variable i represents an emotion category, such as fear or happiness, which can be composite (e.g., love and anger). The collection of all emotion categories is represented by R.
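To make this mapping concrete, the attentional weight of Eq. (1) can be sketched in a few lines of Python. The saliency and pertinence values below are hypothetical, chosen only to illustrate the computation; they are not values from this study.

```python
# Eq. (1): the attentional weight of an image x is the sum, over emotion
# categories i in R, of the saliency eta(x, i) weighted by the pertinence pi_i.
def attentional_weight(saliency_x, pertinence):
    return sum(eta * pertinence.get(i, 0.0) for i, eta in saliency_x.items())

# Hypothetical values: sensory evidence that an art image belongs to each
# emotion category, and task-specific pertinence. Categories with zero
# pertinence do not contribute to the weight.
saliency_x = {"fear": 0.7, "sadness": 0.5, "happiness": 0.1}
pertinence = {"fear": 1.0, "sadness": 0.6, "happiness": 0.0}

w = attentional_weight(saliency_x, pertinence)  # 0.7*1.0 + 0.5*0.6 + 0.1*0.0 = 1.0
```

Raising the pertinence of a category (e.g., when a client filters for "fearful" art) directly raises the attentional weight of images with strong sensory evidence for that category.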
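The fuzzy hedges of Eqs. (2)-(4) and the α-cut operations defined above can also be sketched directly in Python. The fuzzy sets A and B below are toy examples over color labels, not values from this study.

```python
import math

# Fuzzy hedges, Eqs. (2)-(4): modifiers applied to a membership degree u in [0, 1].
def very(u):           # reinforcing hedge: t_very(u) = u^2
    return u ** 2

def more_or_less(u):   # weakening hedge: t_more-or-less(u) = sqrt(u)
    return math.sqrt(u)

def not_(u):           # negation hedge: t_not(u) = 1 - u
    return 1.0 - u

# Hedges compose; "not very" applies "very" first, then "not".
def not_very(u):
    return not_(very(u))

# Alpha-cut of a discrete fuzzy set {element: membership}: the crisp set of
# elements whose membership is at least alpha.
def alpha_cut(fuzzy_set, alpha):
    return {x for x, mu in fuzzy_set.items() if mu >= alpha}

# (A ∪ B)_α = A_α ∪ B_α and (A ∩ B)_α = A_α ∩ B_α on toy fuzzy color sets.
A = {"red": 0.9, "orange": 0.4, "yellow": 0.1}
B = {"red": 0.6, "orange": 0.8, "yellow": 0.2}
union_cut = alpha_cut(A, 0.5) | alpha_cut(B, 0.5)  # {'red', 'orange'}
inter_cut = alpha_cut(A, 0.5) & alpha_cut(B, 0.5)  # {'red'}
```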
TABLE II
FUZZY ATTRIBUTES OF THE FUZZY COLOR MODEL. ADAPTED FROM [54], [37].

Fuzzy Variable | Term set                                                        | Domain
Hue            | T = { Red, Orange, Yellow, Green, Cyan, Blue, Violet, Magenta } | X = [0, 360]
Saturation     | T = { Low, Medium, High }                                       | X = [0, 100]
Intensity      | T = { Dark, Deep, Medium, Pale, Light }                         | X = [0, 255]

Fig. 8. Fuzzy sets for the Hue, Saturation, and Intensity attributes.

Fig. 9. Fuzzy color representation model, which incorporates 120 fuzzy colors, each characterized by three linguistic terms: hue, saturation, and intensity.
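As a sketch of how such term sets can be realized in code, the following defines the two membership-function shapes used by the model and an illustrative partition of Saturation into Low/Medium/High over X = [0, 100]. The breakpoints here are assumptions made for illustration only; the actual partitions of this model are the ones shown in Fig. 8.

```python
def triangular(x, a, b, c):
    """Triangular MF: rises on [a, b], peaks at b, falls on [b, c]."""
    if x <= a or x >= c:
        return 0.0
    if x <= b:
        return (x - a) / (b - a)
    return (c - x) / (c - b)

def trapezoidal(x, a, b, c, d):
    """Trapezoidal MF: rises on [a, b], stays at 1 on [b, c], falls on [c, d]."""
    if x <= a or x >= d:
        return 0.0
    if x <= b:
        return (x - a) / (b - a)
    if x <= c:
        return 1.0
    return (d - x) / (d - c)

# Illustrative partition of Saturation (domain X = [0, 100], Table II) into the
# three linguistic terms Low / Medium / High. Breakpoints are assumed values.
SATURATION_TERMS = {
    "Low":    lambda s: trapezoidal(s, -1, 0, 20, 45),
    "Medium": lambda s: triangular(s, 20, 50, 80),
    "High":   lambda s: trapezoidal(s, 55, 80, 100, 101),
}

def fuzzify_saturation(s):
    """Membership degree of a crisp saturation value in each linguistic term."""
    return {term: mf(s) for term, mf in SATURATION_TERMS.items()}
```

With this partition, a crisp saturation of 35 belongs partially to "Low" and partially to "Medium", which is exactly the graded membership the model relies on.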

1) Fuzzification of HSI Colors: Triangular Membership Fuzzy partitions were adapted from [54], [37], [57], [58],
Function:  but the changes were done to the ”Low” Saturation because
0,
 x≤a we don’t have ”Any” Category. These partitions were obtained
 x−a , a ≤ x ≤ b

based on a survey on human color categorization.
µ(x) = b−a
c−x
 c−b

 , b≤x≤c Table II shows the information about term sets and domains

0, x≥c of each fuzzy variable in our color model (H, S, I). Figure 8
presents the membership functions for fuzzy H, S, I variables.
Trapezoidal Membership Function: Figure 9 presents the fuzzy colors of the proposed representa-
tion model, which incorporates 120 distinct fuzzy colors. We

0,
 x≤a

x−a have divided the intensity into five fuzzy sets, namely ”Low,”
 b−a , a ≤ x ≤ b


 ”Medium,” and ”High,” with the domain X = [0, 100] and the
µ(x) = 1, b≤x≤c universal set U = 0, 1, 2,..., 99, 100. Similarly, the attribute of
d−x

 d−c , c ≤ x ≤ d hue has been divided into eight fuzzy sets, including ”Red,”



”Orange,” ”Yellow,” ”Green,” ”Cyan,” ”Blue,” ”Violet,” and

0, x≥d
8

Fig. 10. Fuzzy colors examples from the proposed fuzzy color model.

”Magenta,” with the domain X = [0, 360] and the universal set Data: image M1 in RGB format
U = 0, 1, 2,..., 359, 360. Lastly, the attribute of saturation has Result: The histogram that represents the most
been divided into three fuzzy sets, namely ”Dark,” ”Deep,” prominent color in image M1
”Medium,” ”Pale,” and ”Light,” with the domains X = [0, /* The image’s dominant color
255], and the respective universal set U = 0, 1, 2,..., 254, 255. histogram, CH (image) is a vector
By doing this, we have created a comprehensive and accurate (hC1 , ..., hCn ), where each element hCi
spectrum of fuzzy colors. represents the frequency of color
Some color representatives can be seen in Fig 10. On the Ci in the image. */
figure, we have a table like Hue: red, Intensity: deep, etc., and Function FindFuzzyDomColors (image)
an explanation that a fuzzy color is a region, not a point. FuzzyColors ← an empty dictionary;
FuzzyDomColors ← an empty array;
A fuzzy color is a subset of points within a crisp color
/* initialize the frequency of each
space, which in this case is the HSI space, as mentioned in
fuzzy color to 0 */
[37], [54], [59]. We define the domains of the attributes H, S,
while not at end of image do
and I as DH , DS , and DI , respectively.
read current pixel;
Definition 1: Fuzzy color C is a linguistic label whose se- process current pixel;
mantic is represented in crisp HSI color space by a normalized /* convert a pixel from RGB to
fuzzy subset of DH × DS × DI . HSI, then fuzzify - convert
It can be inferred from Definition 1 that there is always at least one crisp color that fully belongs to any given fuzzy color C. This prompts us to extend the concept of a fuzzy color to the notion of a fuzzy color space, following the fundamental idea of a fuzzy color space [59]:

Definition 2: A fuzzy color space is a set of fuzzy colors that define a partition of DH × DS × DI.

Definition 3: A fuzzy color palette is a combination of several fuzzy colors [45].

Each element in a fuzzy color palette is a fuzzy color (an area) rather than a sharp color (a point) [45]. Let us take the color Salmon as an example (see Fig. 10). For the fuzzification process, we convert crisp inputs into fuzzy sets. For example, if the color is in RGB format (Salmon: R=255, G=160, B=122), we first convert it into the HSI model (H = 17, S = 32%, I = 179) and then to the fuzzy color model (H = Red, S = Medium, I = Pale). Hue, in this case, is partially 'Red' and partially 'Orange', while Saturation is partially 'Medium' and partially 'Low'.

2) Selected Emotions: This research aims to explore the connection between color palettes and human emotions. As emotions are highly subjective and influenced by various factors such as gender, race, mood, and life experiences [60], we have chosen to focus on the context of art to obtain a more objective assessment. To achieve this, we examined various online galleries, articles, and datasets to identify which emotions are most commonly associated with paintings. Additionally, Kang et al. have already researched color and emotion pairings, as detailed in their article [14].

... to fuzzy HSI */
f c ← computed fuzzy color;
FuzzyColors[f c]++;
end
/* Return the five most frequent fuzzy colors. The number of colors returned depends on the context of the application. */
FuzzyDomColors ← 5 keys from FuzzyColors with max frequency;
return FuzzyDomColors;
Algorithm 1: Finding the image dominant colors [37], [54].

We have conducted extensive research on this topic and analyzed various psychological articles, including [61], [62]. Our goal was to select a comprehensive list of emotions based on previous studies, which is crucial for comparing our model's accuracy with existing results.

The selected dominant emotions in an art context are gratitude, happiness, anger, love, trust, shyness, fear, surprise, sadness, and shame. We conduct further research on these ten emotions.

3) Image Preprocessing: The original art images came in various sizes; they were normalized to 200×200 pixels and converted to RGB mode. Next, we transformed each pixel to HSI and then to fuzzy HSI. The algorithm for fuzzy dominant
color extraction (or fuzzy color histogram) can be seen in Algorithm 1.

Data: Dataset of images M1, ..., Mn of some emotion E
Result: Fuzzy color palette FP for emotion E
/* This function calculates the dominant color histogram of the given emotion, denoted by EH(image). The resulting output is a vector (hC1, ..., hCn) where each element hCi represents the frequency of the fuzzy color Ci in the emotion images. */
Function GetEmotionFuzzyColorsPalette(emotion)
    emotionFuzzyColors ← an empty dictionary;
    emotionFuzzyDomColors ← an empty array;
    /* Set the frequency count for each fuzzy color to zero. */
    for each Mi in dataset do
        FCi ← FindFuzzyDomColors(Mi);
        /* Calculate the frequency of each fuzzy color for the emotion. */
        for each fc in FCi do
            emotionFuzzyColors[fc]++;
        end
    end
    /* Return the 15 most frequently occurring fuzzy colors. */
    emotionFuzzyDomColors ← 15 keys from emotionFuzzyColors with max frequency;
    return emotionFuzzyDomColors;
Algorithm 2: Receiving the emotion fuzzy colors palette.
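The fuzzification step and the two palette-extraction procedures can be sketched in Python. This is an illustrative sketch, not the paper's implementation: `rgb_to_hsi` reuses the hexcone hue from `colorsys` (which reproduces the Salmon example, H ≈ 17), the triangular partitions are placeholder boundaries rather than the actual FHSI partitions of [58], and `pixel_to_fuzzy_color` reduces each pixel to the label triple with the highest membership.

```python
import colorsys
from collections import Counter

def rgb_to_hsi(r, g, b):
    """Crisp RGB (0-255) -> (hue in degrees, saturation in %, intensity 0-255).
    For Salmon (255, 160, 122) this gives roughly (17, 32, 179), as in the text."""
    i = (r + g + b) / 3.0
    s = 0.0 if i == 0 else (1.0 - min(r, g, b) / i) * 100.0
    h, _, _ = colorsys.rgb_to_hsv(r / 255.0, g / 255.0, b / 255.0)  # hexcone hue
    return h * 360.0, s, i

def tri(x, a, b, c):
    """Triangular membership function with peak at b and support (a, c)."""
    if x <= a or x >= c:
        return 0.0
    return (x - a) / (b - a) if x < b else (c - x) / (c - b)

# Placeholder linguistic terms; the paper's actual partitions come from FHSI [58].
HUE_TERMS = {'red': (-60, 0, 60), 'orange': (0, 30, 60), 'yellow': (30, 60, 120),
             'green': (60, 120, 180), 'cyan': (120, 180, 240), 'blue': (180, 240, 300)}
SAT_TERMS = {'low': (-1, 0, 40), 'medium': (10, 45, 80), 'high': (50, 100, 151)}
INT_TERMS = {'dark': (-1, 0, 110), 'medium': (60, 128, 200), 'pale': (140, 255, 256)}

def best(terms, x):
    """Label with the highest membership for a crisp attribute value."""
    return max(terms, key=lambda name: tri(x, *terms[name]))

def pixel_to_fuzzy_color(pixel):
    """Reduce a pixel to the fuzzy-color label triple with the highest membership."""
    h, s, i = rgb_to_hsi(*pixel)
    return (best(HUE_TERMS, h), best(SAT_TERMS, s), best(INT_TERMS, i))

def find_fuzzy_dom_colors(pixels, top=5):
    """Algorithm 1 (sketch): fuzzy-color histogram of one image, top 5 kept."""
    counts = Counter(pixel_to_fuzzy_color(p) for p in pixels)
    return [color for color, _ in counts.most_common(top)]

def get_emotion_fuzzy_colors_palette(images, top=15):
    """Algorithm 2 (sketch): each image votes once for each of its dominant colors."""
    votes = Counter()
    for pixels in images:
        votes.update(find_fuzzy_dom_colors(pixels))
    return [color for color, _ in votes.most_common(top)]
```

With these placeholder partitions, `pixel_to_fuzzy_color((255, 160, 122))` indeed yields `('red', 'medium', 'pale')`, matching the Salmon walkthrough above.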
4) Detection of Emotion Color Palette: Algorithm 1 aims to compute the dominant color palette of an image represented in RGB format. This process involves converting the RGB image to the HSI (Hue, Saturation, Intensity) color space, fuzzifying these colors, and then generating a fuzzy dominant color palette based on the frequency of the fuzzy colors.

Algorithm 2 is designed to generate a fuzzy color histogram specific to an emotion E based on a labeled dataset of images representing that emotion. For each image Mi in this dataset, the function extracts the dominant fuzzy colors present within that specific image. This extraction is performed by the method FindFuzzyDomColors(Mi), which returns the top 5 most frequent fuzzy colors of the image, denoted FCi.

Next, we increment the frequency of each color in FCi for the given emotion palette. After processing every image in the dataset, the algorithm identifies the 15 most frequently occurring fuzzy colors within the emotion dictionary. These top 15 fuzzy colors, determined by their respective frequencies, are considered the dominant representations of this specific emotion.

Note that some of the obtained fuzzy palettes contain fewer than 15 fuzzy colors. This is because we also considered the proportion of colors; specifically, we filtered out those in the top 15 fuzzy colors that constituted less than 3.5% of the image.

As an alternative color representation, and for a comparative evaluation of the obtained color-emotion associations against the results of other researchers, we mapped the fuzzy dominant colors for each emotion to basic crisp colors, akin to defuzzification. The set of basic color categories we employed included 'red', 'orange', 'yellow', 'green', 'cyan', 'blue', 'black', 'brown', 'beige', 'purple', and 'gray'. Fuzzy colors with 'dark' intensity were aligned with 'black', while colors with 'low' saturation are shades of gray, so they were associated with the basic color 'gray'. Considering the contextual relevance and popularity of beige and brown in art paintings, we separated them from the fuzzy colors with 'red', 'orange', and 'yellow' hues. Fuzzy colors with 'violet' and 'magenta' hues were mapped to 'purple'. The remaining fuzzy colors, with hues 'red', 'orange', 'yellow', 'green', 'cyan', and 'blue', were mapped to their respective crisp colors.

Fig. 11. Extracting the fuzzy dominant colors from art images and mapping them to basic crisp colors.

Figure 11 illustrates the fuzzification process and the subsequent mapping of fuzzy dominant colors to basic crisp

colors. The dominant colors are identified using a fuzzy color representation model in the fuzzification phase. These fuzzy colors, indicative of emotional responses, are then systematically mapped to a set of basic crisp colors.

IV. EXPERIMENTAL RESULTS

A. Color-emotion Associations Results

Fig. 12. Distribution of HSI attributes across certain emotions.

Figure 12 illustrates the distribution of the hue, saturation, and intensity attributes across certain emotions. One can note the obvious difference between the color palettes for shame and surprise: the two palettes evoke distinct emotional responses. The first palette, rich in reds, oranges, and yellows, exudes a vibrant and energetic aura with its medium saturation levels and deep, dark intensity. In contrast, the second palette contains a wider range of hues, featuring shades of yellow, green, cyan, and blue, and offers a softer appearance due to its lower saturation and blend of medium to pale intensities. The surprise palette provides cooler tones compared to the intense warmth of the shame palette.

Our findings suggest that specific emotions exhibit strong associations with certain colors; for example, the 'gratitude' emotion is strongly associated with green, brown, and orange. Table III shows the association of all studied emotions with basic colors. Figures 15 and 16 provide visual support for this association.

The associations between emotions and colors reveal interesting patterns. Happiness and love share similar color associations, prominently featuring brown, green, and orange. Gratitude is distinctly linked to brown, green, and orange. Surprise stands out as the sole emotion associated with light colors, with an absence of black and a prevalence of beige. Shyness is unique, incorporating purple and cyan as basic colors. Shame exhibits a more limited palette, with brown dominating. Anger is characterized by brown, gray, black, and red. Gray is notably intertwined with fear, and fear shares similar color associations with sadness. For trust, brown and gray take the lead in color associations. Brown consistently appears as the most frequent color in these emotional associations.

Fig. 13. Heatmap of color-emotion association.

Table III illustrates our experimental results. It shows the association between emotions and corresponding basic colors, accompanied by estimated percentages denoting the prevalence or intensity of each color within the representation of those emotions.

Based on our results, the most prominent color-emotion associations were brown and gratitude, brown and anger, orange and shame, yellow and happiness, and gray and fear. The full spectrum of color associations and emotions can be seen on the heatmap
(Fig. 13).

Fig. 14. Examples of Jaccard similarity calculation using the happy, shy, and shame emotion palettes and the John William Godward painting Under the Blossom that Hangs on the Bough, 1917.

B. Performance Evaluation

In this section, we present the performance evaluation results of the model.

1) The Jaccard Similarity: The Jaccard similarity is used to measure the similarity between two palettes, namely the Emotion Palette (E) and the Image Palette (I). It calculates the similarity between two sets by comparing their intersection and union. In this case, the intersection is the set of common colors that appear in both palettes, while the union is the set of unique colors in both palettes. The Jaccard similarity is calculated by dividing the size of the intersection by the size of the union.

The Jaccard similarity value ranges from 0 (completely dissimilar) to 1 (completely similar). A higher Jaccard similarity indicates a greater association between the image and emotion color palettes. It is given by Equation (5):

J(E, I) = |E ∩ I| / |E ∪ I|    (5)

where |E ∩ I| is the number of common colors that appear in both the image palette I and the emotion palette E, and |E ∪ I| is the total number of unique colors in the two palettes.

2) Two-alternative Forced Choice:

a) Method description: The experiment involved a total of 177 subjects, bachelor and master students of Kazakh-British Technical University. All subjects who provided answers passed the Ishihara color test. Two students' results were excluded because they did not prove to have normal vision.

The experiment utilized a well-known behavioral measure, two-alternative forced choice (2AFC) [63], [64]. The primary idea is to compare individuals' real emotion choices, based on their evaluations, with the predicted emotions.

In one trial of a standard 2AFC experiment, participants indicate which of two visual choices, displayed simultaneously, they find more appealing or appropriate for a certain category. This process is repeated for every possible pair. Therefore, n(n − 1)/2 trials are needed for a 2AFC to measure choices for n stimuli. Presenting a participant with only two mutually exclusive stimuli makes the decision job easier for them. We use the Spearman-Karber method to calculate the 2AFC measures [63].

b) Procedure: The experiment corresponds to the filtering mechanism of the theory of visual attention [38], choosing an item x ∈ S for a target category i.
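The Jaccard measure of Eq. (5) is straightforward to compute over palettes treated as sets of color labels. A minimal sketch, with hypothetical palettes for illustration (not the paper's measured ones):

```python
def jaccard(emotion_palette, image_palette):
    """Jaccard similarity of Eq. (5) between two color palettes."""
    e, i = set(emotion_palette), set(image_palette)
    if not (e | i):
        return 0.0  # convention chosen here: two empty palettes count as dissimilar
    return len(e & i) / len(e | i)

# Hypothetical palettes: 2 shared colors out of 5 unique ones -> 0.4.
print(jaccard(['brown', 'green', 'orange'], ['green', 'orange', 'gray', 'blue']))  # 0.4
```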

TABLE III
BASIC COLORS DISTRIBUTION ACROSS EMOTIONS (MEASURES IN PERCENT)

Emotion    Red   Orange  Yellow  Green  Cyan  Blue  Black  Brown  Beige  Purple  Gray
Gratitude  8.7   17.4    0       17.4   0     0     13.0   43.5   0      0       0
Anger      9.5   14.3    7.9     0      0     0     12.8   31.7   4.8    0       19.0
Shyness    0     0       11.1    22.2   11.1  22.2  0      11.1   0      11.1    11.1
Shame      13.3  26.7    0       0      0     0     20.0   40.0   0      0       0
Sadness    4.1   9.3     18.3    5.8    0     4.3   12.7   20.3   0      0       25.2
Love       6.1   12.9    14.5    4.4    0     4.2   13.2   28.9   5.3    0       10.5
Fear       6.4   8.0     10.7    5.9    0     5.2   8.6    18.2   5.2    0       31.8
Surprise   0     8.6     14.5    5.0    0     8.6   0      12.4   26.4   0       24.4
Trust      6.3   9.9     11.4    0      0     5.3   14.1   31.7   4.0    0       17.3
Happiness  4.1   10.6    20.9    8.8    0     3.8   5.6    20.3   5.7    0       20.2

Fig. 15. Color-emotion association results: fuzzy color palettes.

Fig. 16. Color-emotion association results: crisp, basic color palettes.
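The fuzzy-to-basic mapping behind Table III ('dark' intensity to black, 'low' saturation to gray, violet/magenta hues to purple, remaining hues to themselves, as described in the methodology) can be sketched as a small rule function. Two caveats: the beige/brown carve-out for the warm hues is deliberately omitted, since the paper does not spell out its exact condition, and the rule precedence below is an assumption.

```python
def to_basic_color(hue, saturation, intensity):
    """Map a fuzzy color, given as linguistic labels, to a basic crisp category.
    NOTE: the beige/brown separation for 'red'/'orange'/'yellow' hues described
    in the paper is omitted because its exact condition is not specified."""
    if intensity == 'dark':          # dark fuzzy colors align with black
        return 'black'
    if saturation == 'low':          # desaturated colors are shades of gray
        return 'gray'
    if hue in ('violet', 'magenta'):
        return 'purple'
    return hue  # 'red', 'orange', 'yellow', 'green', 'cyan', 'blue' pass through
```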

Fig. 17. 2AFC experiment on perceptual emotion categorization of art paintings. Art images in the figure: Interior (Lorica), Luchian's last painting; Lady in a Turkish Costume, Felicita Sartori; Brothel scene, Brunswick Monogrammist; Landscape, Amadeo de Souza-Cardoso.

For each subject, there is one session with 10 questions corresponding to categorization tasks. According to Bundesen's theory of visual attention, perceptual categorization takes the form "x belongs to i" (represented as E(x, i), where E(x, i) is a value between 0 and 1). In this context, x refers to a perceptual unit and i refers to a perceptual category. The collection of all units is denoted as S, while the collection of all categories is denoted as R. Each question has one perceptual category (e.g., happy) and two perceptual units (art paintings). S contains 5 randomly chosen images. For each predicted emotion, values were calculated. R corresponds to the emotions {anger, shyness, happiness, sadness, gratitude, shame, fear, trust, love, surprise}.

On each trial, responders choose the art piece that corresponds more closely to the target emotion. The stimulus pairs differ in predicted emotion intensity. Observers are allowed to select only one item. After collecting the data, we can compare it to the predicted preferences, calculate the "Hits" percentage, and conduct further analyses.

c) Data Interpretation: Equation (6) gives the mean of the psychometric function [63]:

µ̂_2AFC = (1/2) · Σ_{i=1}^{k+1} (p̂_i − p̂_{i−1})(x_i + x_{i−1})    (6)

where x_1 < x_2 < ... < x_k are k monotonically increasing stimulus values, and p̂_i (i = 1, ..., k) are the observed response probabilities associated with each stimulus value. To calculate µ̂_2AFC, we choose values for x_0 and x_{k+1} such that we can assume p̂_0 = 0 and p̂_{k+1} = 1. Note that x_0 and x_{k+1} are not part of the experimental design and are used only for the calculation of µ̂_2AFC. In our experiment, the calculated µ̂_2AFC ≈ 0.183.

The mean µ̂_2AFC in a 2AFC is also referred to as the point of subjective equality (PSE) [64]. In the context of psychophysics and 2AFC tasks, the PSE represents the stimulus level at which an observer is equally likely to respond "yes" or "no" (in our case, equally likely to choose the first or the second art image for a target emotion). From Fig. 19 we can see that some of the lowest hit rates correspond to sadness and trust, which have stimulus levels of 0.13 and 0.12, both less than the PSE ≈ 0.183.

For the Spearman-Karber method, we need transformed probability values [63]. Therefore, the set of observed correct-response probabilities ĝ_i (i = 1, ..., k) in a 2AFC is transformed into the corresponding probability estimates p̂_i (see Eq. (7)):

p̂_i = 2 · ĝ_i − 1    (7)

The experiment's findings are depicted in a psychometric function (Fig. 18), which displays the likelihood of the subject's decision based on the stimulus difference.

Fig. 18. Psychometric function in a choice task. µ̂_2AFC ≈ 0.183, SE(µ̂_2AFC) ≈ 0.013.

For fitting, the logistic function is used. The figure's abscissa shows the difference between the two predicted emotion intensities, which lies between 0 and 1. The y-axis shows the proportion of correct responses, ranging from 0.5 to 1.0 depending on how distinguishable the two stimuli are for the user.

The average percentage of accurate answers, or hit rate, can be used to evaluate overall performance (see Fig. 19). The average hit rate is 0.77.
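The Spearman-Karber computations of Eqs. (6) and (7), along with the standard-error estimate reported in Fig. 18, can be coded directly from the observed proportions. The stimulus levels and counts below are made-up illustrative values, not the experiment's data, and each n_i must exceed 1 for the error term to be defined.

```python
import math

def sk_mean(xs, g):
    """Spearman-Karber threshold (Eq. 6) from correct-response proportions g_1..g_k.
    xs is the extended stimulus axis x_0..x_{k+1}; Eq. (7) maps g_i to p_i = 2*g_i - 1,
    and p_0 = 0, p_{k+1} = 1 are assumed at the added endpoints."""
    p = [0.0] + [2 * gi - 1 for gi in g] + [1.0]
    return 0.5 * sum((p[i] - p[i - 1]) * (xs[i] + xs[i - 1]) for i in range(1, len(xs)))

def sk_se(xs, g, n):
    """Standard error ('perceptual noise') of the threshold estimate,
    with n_i observations at each interior stimulus level (n_i > 1)."""
    var = sum(g[i] * (1 - g[i]) / (n[i] - 1) * (xs[i + 2] - xs[i]) ** 2
              for i in range(len(g)))
    return math.sqrt(var)

# Illustrative numbers only (k = 2 interior levels):
xs = [0.0, 0.2, 0.6, 1.0]        # x_0, x_1, x_2, x_3 = x_{k+1}
g = [0.5, 1.0]                   # observed proportions correct at x_1, x_2
print(round(sk_mean(xs, g), 3))  # 0.4
```

With these toy values the threshold lands at 0.4, i.e., between the level where the observer is at chance (g = 0.5) and the level where responses are always correct.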
TABLE IV
2AFC EXPERIMENT RESULTS

                                   Anger  Shyness  Happiness  Sadness  Gratitude  Shame  Fear  Trust  Love  Surprise
# hits                             165    163      148        97       146        146    156   81     166   70
Hit rate, %                        0.95   0.94     0.86       0.56     0.84       0.84   0.90  0.47   0.96  0.40
Difference in emotion predictions  0.76   0.37     0.05       0.13     0.20       0.38   0.37  0.12   0.05  0.27
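As a quick consistency check on Table IV: dividing the hit counts by 173 respondents (which is 177 minus the four excluded, and is the divisor that reproduces the table's percentage row, so it is presumably the analyzed-sample size) and averaging the per-emotion rates recovers the reported overall hit rate of 0.77.

```python
# Hit counts per emotion from Table IV.
hits = {'anger': 165, 'shyness': 163, 'happiness': 148, 'sadness': 97,
        'gratitude': 146, 'shame': 146, 'fear': 156, 'trust': 81,
        'love': 166, 'surprise': 70}

# Assumption: 173 analyzed respondents (177 minus the four excluded); this
# divisor reproduces every entry of the table's percentage row.
rates = {e: round(n / 173, 2) for e, n in hits.items()}
print(rates['anger'], rates['trust'])              # 0.95 0.47

# Mean of the per-emotion hit rates matches the reported average.
print(round(sum(rates.values()) / len(rates), 2))  # 0.77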

Figure 19 illustrates the hit rate for each emotion, namely the proportion of participants that could identify an emotion in images representing that particular emotion according to our algorithm.

Table IV presents the analysis of the two-alternative forced choice experiment results. The hit rate refers to the number of participants correctly identifying an emotion in line with our model's classification; we also report it as a percentage of the total respondents. The difference in emotion intensity predictions signifies the numerical variance between the levels of a particular emotion found in the two images of a pair.

Fig. 19. Experiment hit rates distribution across emotions.

Next, let us find the standard error, which is also referred to as "perceptual noise" [63] (see Eq. (8)):

SE(µ̂_2AFC) = sqrt( Σ_{i=1}^{k} [ ĝ_i · (1 − ĝ_i) / (n_i − 1) ] · (x_{i+1} − x_{i−1})² )    (8)

where n_i represents the number of observations at stimulus level i. The standard error associated with the threshold estimate µ̂_2AFC is approximately 0.013.

C. Sample Application (Proof of Concept)

The emotion classification method we proposed is simple to adapt for emotion-based retrieval of art paintings, which may be advantageous for retrieving art images from big multimedia archives. The method can easily be adapted for a matching engine that retrieves art objects with similar emotions. For an arbitrary art image, we get the fuzzy dominant color palette (Algorithm 1). Next, we find the emotions relevant to the given art object using the color knowledge base we previously created (see Fig. 22).

Figure 21 demonstrates the interface of a prototype system for this task. It can potentially accept fuzzy natural queries, retrieve emotion labels (e.g., trust) and their desired intensity (e.g., very high), and use fuzzy sets and fuzzy hedges to fetch the corresponding items. The proposed approach can be used in various applications, e.g., in a matching engine that uses emotion similarity measurement to implement retrieval.

Our method manages uncertainty and imprecision in data. Thus, the system has the potential to assist in recognizing emotions in content, suggesting recommendations, and improving user interactions on interactive platforms.
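The matching step just described can be sketched as a ranking over the emotion knowledge base. This is an assumption-laden sketch: the Jaccard measure of Eq. (5) is used as the similarity (re-declared here for self-containment), and the palettes in `kb` are hypothetical examples, not the knowledge base learned in this work.

```python
def jaccard(a, b):
    """Jaccard similarity of Eq. (5) over palettes treated as label sets."""
    a, b = set(a), set(b)
    return len(a & b) / len(a | b) if (a or b) else 0.0

def rank_emotions(image_palette, knowledge_base):
    """Rank emotions by similarity between the image's dominant palette and
    each emotion's palette in the knowledge base (most relevant first)."""
    return sorted(knowledge_base,
                  key=lambda emotion: jaccard(image_palette, knowledge_base[emotion]),
                  reverse=True)

# Hypothetical knowledge base for illustration only:
kb = {'gratitude': ['green', 'brown', 'orange'],
      'fear': ['gray', 'brown', 'yellow'],
      'surprise': ['beige', 'gray', 'yellow']}
print(rank_emotions(['green', 'brown', 'orange', 'yellow'], kb)[0])  # gratitude
```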

As can be seen from Fig. 20, we compared participants' responses against the most voted answers and highlighted some outliers. Following categorical data analysis, we consider a respondent an outlier if fewer than 50% of their responses match the other participants' most selected choice. Two outliers were identified. We also did not analyze the answers of the two respondents who did not pass the Ishihara color blindness test (IDs 107 and 155).

Fig. 20. Survey participants' hit rate distribution. Outliers are marked in orange; their responses were excluded from the analysis.

V. DISCUSSION

Several recent studies have explored the emotional effects of art objects on humans [65], [4], [20], [22]. Conceptually similar works were completed by psychologists via surveys and case studies among real people. By comparing our experimental findings with established psychological studies on color-emotion associations, we identified certain patterns and differences across various emotions.

Yellow emerged as a dominant color associated with happiness in both the experimental findings and established psychological studies [61], [62]. Several studies [61], [66], [67] matched our results regarding anger, identifying it with red and black. Fear is expressed with gray [66] and shame with red in most studies [62], [67]. Generally, green appeared to be associated mostly with positive emotions, and
shyness in the experiment was displayed with blue and cyan as dominant colors. Moreover, both sets of studies showed a strong correlation between sadness and gray and black. The results of [61] also matched ours in associating green with gratitude, although the experimental findings additionally noted a similarity with brown.

Conversely, our findings did not align with psychological results regarding love-associated colors, with no dominant color evident in this emotional context. Also, surprise primarily featured light tones like beige and yellow, deviating from the typical association with black in psychological studies.

Common colors such as gray, brown, and black appeared dominantly in both sets of studies. However, blue did not dominate in the experimental findings as it did in the psychological studies, highlighting a slight disparity.

The authors of [4] also used a fuzzy approach to find a relationship between colors and emotion. When comparing our findings with their results, we can observe specific universal rules governing certain emotions. For example, sadness is represented mostly with black and brown, and blue is relevant to fear. Similar results were also obtained in study [20], where the authors proposed a machine learning model. One of the most significant findings is that happiness is undoubtedly associated with yellow and orange, as indicated in both the model prediction outcomes and psychological studies. Also, in that study, sadness was correlated with gray, which is dominant in our sadness palette too. Generally, anger is expressed with red, black, brown, and gray in all the mentioned studies, psychological experiments, and our results.

VI. CONCLUSION

In this study, we introduced a novel approach for emotion retrieval using color-based features and fuzzy sets, providing valuable insights into the intersection of computational methods and human emotional perception in visual art analysis.

The system processed a diverse art dataset to generate color palettes associated with ten distinct emotions. We experimentally validated our approach with a 2AFC experiment and discovered that it has good predictive power for color-emotion associations (average hit rate of 0.77). This suggests a substantial alignment between the emotions inferred by the system and those perceived by human observers, validating the system's ability to approximate emotional states accurately.

Adopting the fuzzy sets and logic approach is advantageous due to its consistency with human perception. Our method enhances algorithm interpretability and adaptability in human-like applications. This system could aid in content-based emotion recognition, recommendation systems, and enhancing user experiences on interactive platforms.

Our research contributes to a more comprehensive understanding of color-emotion associations, offering valuable insights for practical applications beyond art, such as marketing, design, and psychology.

As for the limitations, the system's performance may vary across different art styles, cultural contexts, and individual perceptions. Next, detecting the closest emotion to an image based solely on a color's presence does not account for varying color frequencies. Colors, texture, and composition can all evoke emotions [68], creating additional dependencies. We concentrated on color features only, ignoring the objects in the art image, although they also contribute to the overall emotional association. Color can have an impact not only when it is abundant but also when it contrasts sharply with the main color scheme of the background.

Future research could focus on expanding the dataset to encompass more diverse artwork, incorporating contextual information, and refining the fuzzy logic model to enhance accuracy. We see great potential for personalized color-emotion applications: understanding the emotional nuances of various colors leads to personalized color schemes that evoke specific emotional responses tailored to individual preferences or target audiences. We also plan to improve the model with additional dependencies, such as pixel importance.

Fig. 21. Prototype Art Painting Retrieval System.

REFERENCES

[1] E. Cambria, "Affective computing and sentiment analysis," IEEE Intelligent Systems, vol. 31, 2016.
[2] K. Ivanova, P. Stanchev, and K. Vanhoof, "Automatic tagging of art images with color harmonies and contrasts characteristics in art image collections." [Online]. Available: www.iaria.org
[3] L. C. Ou, M. R. Luo, A. Woodcock, and A. Wright, "A study of colour emotion and colour preference. Part I: Colour emotions for single colours," Color Research and Application, vol. 29, pp. 232–240, 6 2004.
[4] C. F. Hibadullah, A. W.-C. Liew, and J. Jo, "Color-emotion association study on abstract art painting," International Conference on Machine Learning and Cybernetics, pp. 488–493, 2015.
[5] J. H. Xin, K. M. Cheng, G. Taylor, T. Sato, and A. Hansuebsai, "Cross-regional comparison of colour emotions. Part I: Quantitative analysis," Color Research and Application, vol. 29, pp. 451–457, 12 2004.
[6] B. Manav, "Color-emotion associations and color preferences: A case study for residences," Color Research and Application, vol. 32, pp. 144–150, 4 2007.
Fig. 22. Fetching emotions from an art image.

[7] S. Wang, R. Ding, Y. Hu, and H. Wang, "Analysis of relationships between color and emotion by classification based on associations," vol. 1, 2008.
[8] J. Lee and E. Park, "Fuzzy similarity-based emotional classification of color images," IEEE Transactions on Multimedia, vol. 13, 2011.
[9] X. P. Gao, J. H. Xin, T. Sato, A. Hansuebsai, M. Scalzo, K. Kajiwara, S. S. Guan, J. Valldeperas, M. J. Lis, and M. Billger, "Analysis of cross-cultural color emotion," Color Research and Application, vol. 32, pp. 223–229, 6 2007.
[10] X. Lu, Identifying Emotions Aroused from Paintings, G. Hua and H. Jégou, Eds. Springer International Publishing, 2016, vol. 9913. [Online]. Available: http://link.springer.com/10.1007/978-3-319-46604-0
[11] D. Joshi, R. Datta, E. Fedorovskaya, Q. T. Luong, J. Z. Wang, J. Li, and J. Luo, "Aesthetics and emotions in images," IEEE Signal Processing Magazine, vol. 28, pp. 94–115, 2011.
[12] M. Solli and R. Lenz, "Color emotions for multi-colored images," Color Research and Application, vol. 36, pp. 210–221, 6 2011.
[13] F. Hoenig, "Defining computational aesthetics."
[14] D. Kang, H. Shim, and K. Yoon, "A method for extracting emotion using colors comprise the painting image," Multimedia Tools and Applications, vol. 77, 2018.
[15] M. Solli and R. Lenz, "Color emotions for image classification and retrieval," 2008. [Online]. Available: http://urn.kb.se/resolve?urn=urn:nbn:se:liu:diva-42582
[16] R. Lenz and M. Solli, "Color semantics for image indexing," 2010. [Online]. Available: http://diameter.itn.liu.se/colse/
[17] M. Solli and R. Lenz, "Color based bags-of-emotions," vol. 5702 LNCS, 2009, pp. 573–580.
[18] S. Kobayashi, "The aim and method of the color image scale."
[19] J. A. Russell, "A circumplex model of affect," Journal of Personality and Social Psychology, vol. 39, pp. 1161–1178, 12 1980.
[20] A. Wedolowska, D. Weber, and B. Kostek, "Predicting emotion from color present in images and video excerpts by machine learning," IEEE Access, vol. 11, pp. 66357–66373, 2023.
[21] B. Ranjgar, M. Khoshlahjeh Azar, A. Sadeghi-Niaraki, and S.-M. Choi, "A novel method for emotion extraction from paintings based on Luscher's psychological color test: Case study Iranian-Islamic paintings," IEEE Access, vol. 7, pp. 120857–120871, 2019.
[22] H. Zhang and M. Xu, "Weakly supervised emotion intensity prediction for recognition of emotions in images," IEEE Transactions on Multimedia, vol. 23, pp. 2033–2044, 2021.
[23] J. Chamorro-Martínez, J. M. Soto-Hidalgo, P. M. Martínez-Jiménez, and D. Sánchez, "Fuzzy color spaces: A conceptual approach to color vision," IEEE Transactions on Fuzzy Systems, vol. 25, no. 5, pp. 1264–1280, 2017.
[24] J. Chamorro-Martínez and J. M. Keller, "Granular modeling of fuzzy color categories," IEEE Transactions on Fuzzy Systems, vol. 28, no. 9, pp. 1897–1909, 2020.
[25] M. Mengíbar-Rodríguez and J. Chamorro-Martínez, "An image-based approach for building fuzzy color spaces," Information Sciences, vol. 616, pp. 577–592, 2022.
[26] M. Nachtegael, D. V. D. Weken, V. D. Witte, S. Schulte, T. Mélange, and E. E. Kerre, "Color image retrieval using fuzzy similarity measures and fuzzy partitions."
[27] W.-N. Wang and Y.-L. Yu, "Image emotional semantic query based on color semantic description," in 2005 International Conference on Machine Learning and Cybernetics, vol. 7, 2005, pp. 4571–4576.
[28] P. Ekman, "An argument for basic emotions," Cognition & Emotion, vol. 6, no. 3-4, pp. 169–200, 1992.
[29] W. James, "What is an emotion?" Mind, vol. IX, pp. 188–205, 1884.
[30] W. Cannon, "Bodily changes in pain, hunger, fear and rage," The American Journal of Psychology, vol. 27, no. 3, pp. 469–485, 1915.
[31] ——, "Emotion and conduct," Prentice-Hall, 1929.
[32] L. F. Barrett, "The theory of constructed emotion: an active inference account of interoception and categorization," Social Cognitive and Affective Neuroscience, vol. 12, no. 1, pp. 1–23, 2017.
[33] K. R. Scherer, "Emotion classification: A review of methods and their applications," Social Science Information, vol. 39, no. 3, pp. 433–468, 2000.
[34] S. Schachter and J. Singer, "A cognitive-attributional analysis of emotion," Psychological Review, vol. 69, no. 5, pp. 379–399, 1962.
[35] A. Mehrabian, "A semantic space for nonverbal behavior," Journal of Consulting and Clinical Psychology, vol. 37, no. 1, p. 109, 1989.
[36] S. M. Mohammad and S. Kiritchenko, "WikiArt emotions: An annotated dataset of emotions evoked by art." [Online]. Available: https://en.wikipedia.org/wiki/List
[37] P. Shamoi, "Fuzzy model for human color perception and its application in e-commerce - apparel color coordination," PhD thesis, Mar. 2019.
[38] C. Bundesen, "Theory of visual attention," Psychological Review, vol. 97, pp. 523–547, 11 1990.
[39] P. Shamoi, D. Sansyzbayev, and N. Abiley, "Comparative overview of color models for content-based image retrieval," in 2022 International Conference on Smart Information Systems and Technologies (SIST), 2022, pp. 1–6.
[40] A. Ualibekova and P. Shamoi, "Music emotion recognition using k-nearest neighbors algorithm," in 2022 International Conference on Smart Information Systems and Technologies (SIST), 2022, pp. 1–6.
[41] P. Kozlov, A. Akram, and P. Shamoi, "Fuzzy approach for audio-video emotion recognition in computer games for children," 2023.
[42] L. Chen, W. Su, M. Wu, W. Pedrycz, and K. Hirota, "A fuzzy deep neural network with sparse autoencoder for emotional intention understanding in human–robot interaction," IEEE Transactions on Fuzzy Systems, vol. 28, no. 7, pp. 1252–1264, 2020.
[43] L. Chen, M. Zhou, M. Wu, J. She, Z. Liu, F. Dong, and K. Hirota, "Three-layer weighted fuzzy support vector regression for emotional intention understanding in human–robot interaction," IEEE Transactions on Fuzzy Systems, vol. 26, no. 5, pp. 2524–2538, 2018.
[44] S. Vashishtha and S. Susan, "Unsupervised fuzzy inference system for speech emotion recognition using audio and text cues (workshop paper)," in 2020 IEEE Sixth International Conference on Multimedia Big Data (BigMM), 2020, pp. 394–403.
[45] P. Shamoi, M. Muratbekova, A. Izbassar, A. Inoue, and H. Kawanaka, "Towards a universal understanding of color harmony: Fuzzy approach," 2023.
[46] J. Yang, J. Li, X. Wang, Y. Ding, and X. Gao, "Stimuli-aware visual emotion analysis," IEEE Transactions on Image Processing, vol. 30, pp. 7432–7445, 2021.
17

[47] E. Shamoi, A. Turdybay, P. Shamoi, I. Akhmetov, A. Jaxylykova, and Muragul Muratbekova received a B.S. degree in
A. Pak, “Sentiment analysis of vegan related tweets using mutual information systems from the Kazakh-British Tech-
information for feature selection,” PeerJ Computer Science, vol. 8, p. nical University, Almaty, Kazakhstan, in 2022. She
e1149, Dec. 2022. [Online]. Available: https://doi.org/10.7717/peerj-cs. is currently pursuing an M.S. degree in IT manage-
1149 ment at the same university and works as a senior
[48] S. B. Sadkhan and A. D. Radhi, “Fuzzy logic used in textual emo- software developer in a leading telecommunication
tion detection,” in 2017 Second Al-Sadiq International Conference on company in Kazakhstan. She participated in a num-
Multidisciplinary in IT and Communication Science and Applications ber of conferences (EUSPN 2023, FSDM 2023)
(AIC-MITCSA), 2017, pp. 242–245. and one grant project. Her research interests include
[49] F. Es-Sabery, I. Es-Sabery, A. Hair, B. Sainz-De-Abajo, and B. Garcia- visual perception computing, emotion engineering,
Zapirain, “Emotion processing by applying a fuzzy-based vader lexicon image processing.
and a parallel deep belief network over massive data,” IEEE Access,
vol. 10, pp. 87 870–87 899, 2022.
[50] Y. M. Tashtoush and D. A. Al Aziz Orabi, “Tweets emotion prediction
by using fuzzy logic system,” in 2019 Sixth International Conference
on Social Networks Analysis, Management and Security (SNAMS), 2019,
pp. 83–90.
[51] Y.-D. Zhang, Z.-J. Yang, H.-M. Lu, X.-X. Zhou, P. Phillips, Q.-M. Liu,
and S.-H. Wang, “Facial emotion recognition based on biorthogonal
wavelet entropy, fuzzy support vector machine, and stratified cross
validation,” IEEE Access, vol. 4, pp. 8375–8385, 2016.
[52] D.-J. Kim and Z. Bien, “Design of “personalized” classifier using soft
computing techniques for “personalized” facial expression recognition,”
IEEE Transactions on Fuzzy Systems, vol. 16, no. 4, pp. 874–885, 2008.
[53] K. Y. Chan and U. Engelke, “Varying spread fuzzy regression for
affective quality estimation,” IEEE Transactions on Fuzzy Systems,
vol. 25, no. 3, pp. 594–613, 2017.
[54] P. Shamoi, A. Inoue, and H. Kawanaka, “Modeling aesthetic preferences:
Color coordination and fuzzy sets,” Fuzzy Sets Syst., vol. 395, pp. 217–
234, 2020.
[55] L. A. Zadeh, “Fuzzy sets,” Information and Control, vol. 8, pp.
338–353, 1965. [Online]. Available: http://www-bisc.cs.berkeley.edu/
Zadeh-1965.pdf
[56] L. Zadeh, “The concept of a linguistic variable and its application to
approximate reasoning-iii,” Inf. Sciences, vol. 9, no. 1, pp. 43–80, 1975.
[57] P. Shamoi, A. Inoue, and H. Kawanaka, “Fuzzy color space for apparel coordination,” Open Journal of Information Systems (OJIS), pp. 20–28, 2014.
[58] P. Shamoi, A. Inoue, and H. Kawanaka, “FHSI: Toward more human-consistent color representation,” Journal of Advanced Computational Intelligence and Intelligent Informatics, vol. 20, no. 3, 2016.
[59] J. Chamorro-Martínez, D. Sánchez, J. M. Soto-Hidalgo, and P. Martínez-Jiménez, “Histograms for fuzzy color spaces,” in Eurofuse 2011. Berlin, Heidelberg: Springer Berlin Heidelberg, 2012, pp. 339–350.
[60] T. Clarke and A. Costall, “The emotional connotations of color: A qualitative investigation,” Color Research and Application, vol. 33, pp. 406–410, Oct. 2008.
[61] J. M. B. Fugate and C. L. Franco, “What color is your anger? Assessing color-emotion pairings in English speakers,” Frontiers in Psychology, vol. 10, 2019.
[62] D. Jonauskaite, C. A. Parraga, M. Quiblier, and C. Mohr, “Feeling blue or seeing red? Similar patterns of emotion associations with colour patches and colour terms,” i-Perception, vol. 11, Jan. 2020.
[63] R. Ulrich and J. Miller, “Threshold estimation in two-alternative forced-choice (2AFC) tasks: The Spearman-Kärber method,” Perception & Psychophysics, vol. 66, pp. 517–533, 2004.
[64] M. Jogan and A. Stocker, “A new two-alternative forced choice method for the unbiased characterization of perceptual bias and discriminability,” Journal of Vision, vol. 14, no. 3, p. 20, 2014.
[65] P. Goel, M. Goyal, and R. R. Shah, “Arten-net: An emotion classification system for art (student consortium),” in 2020 IEEE Sixth International Conference on Multimedia Big Data (BigMM), 2020, pp. 302–306.
[66] C. Damiano, P. Gayen, M. Rezanejad, A. Banerjee, G. Banik, P. Patnaik, J. Wagemans, and D. Walther, “Anger is red, sadness is blue: Emotion depictions in abstract visual art by artists and non-artists,” Journal of Vision, vol. 23, p. 1, Apr. 2023.
[67] D. Jonauskaite, A. Abu-Akel, N. Dael, D. Oberfeld, A. Abdel-Khalek, A. Al-rasheed, J.-P. Antonietti, V. Bogushevskaya, A. Chamseddine, E. Chkonia, V. Corona, E. Fonseca-Pedrero, Y. Griber, G. Grimshaw, A. Hasan, J. Havelka, M. Hirnstein, B. Karlsson, E. Laurent, and C. Mohr, “Universal patterns in color-emotion associations are further shaped by linguistic and geographic proximity,” Psychological Science, vol. 31, pp. 1245–1260, Oct. 2020.
[68] M. Lucassen, T. Gevers, and A. Gijsenij, “Texture affects color emotion,” Color Research and Application, vol. 36, pp. 426–436, Dec. 2011.

Pakizar Shamoi received the B.S. and M.S. degrees in information systems from the Kazakh-British Technical University, Almaty, Kazakhstan, in 2011 and 2013, and the Ph.D. degree in engineering from Mie University, Tsu, Japan, in 2019. She has held various teaching and research positions at Kazakh-British Technical University, where she has served as a professor in the School of Information Technology and Engineering since August 2020. She is the author of 1 book and more than 28 scientific publications and has received five best-paper awards at conferences. Her research interests include artificial intelligence and machine learning in general, focusing on fuzzy sets and logic, soft computing, representing and processing colors in computer systems, natural language processing, computational aesthetics, and human-friendly computing and systems. She has served on the organizing committees (as session chair and special-session organizer) of several international conferences: IFSA-SCIS 2017, Otsu, Japan; SCIS-ISIS 2022, Mie, Japan; and EUSPN 2023, Almaty, Kazakhstan. She has served as a reviewer at several international conferences, including IEEE SIST 2023, SMC 2022, SCIS-ISIS 2022, SMC 2020, ICIEV-IVPR 2019, and ICIEV-IVPR 2018.