The Visual Brain Is Tuned to Having Feelings About the World

Imagine walking through an art gallery or taking a stroll in nature. You see a colorful painting or a serene landscape, and almost instantly, you feel something, maybe good or bad. This dimension of feeling from negative to positive, known as valence, is a fundamental way our brain processes the world. In our latest study, we explore an intriguing question: can our visual system decode feelings directly from the environment, without higher cognitive processes analyzing and interpreting objects or scenes to tell us how to feel?
A New Perspective on Feelings
Traditionally, feelings and emotions have been thought to emerge from a mix of body sensations and memories. Our study challenges this view by introducing the concept of Visual Valence (VV)—a form of feeling that we perceive directly from visual features, like colors, shapes, and patterns, without any need for deeper cognitive processing. To test this, we developed the Visual Valence Model (VVM), a machine learning model trained on nearly 8,000 emotionally charged photographs. Our model analyzed basic visual elements, such as color proportions and composition, to predict whether an image feels positive or negative.
What’s groundbreaking here is that the VVM did not rely on recognizing objects (like a cute puppy or a scary face) but instead used abstract features that are present in many kinds of images. This suggests our brain might have a more fundamental and objective way of perceiving emotion—through simple visual cues.
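To make the approach concrete, here is a minimal sketch of a VVM-style pipeline: extract simple, object-free features from images (color proportions, brightness, contrast, a crude composition cue) and fit a regularized linear model to valence ratings. The specific features, shapes, and stand-in data below are illustrative assumptions, not the published model.

```python
# Minimal sketch of a "Visual Valence Model"-style pipeline: predict
# valence from simple visual features, with no object recognition.
# Features and data here are illustrative stand-ins, not the real study.
import numpy as np
from sklearn.linear_model import RidgeCV
from sklearn.model_selection import cross_val_score

def simple_visual_features(img):
    """img: H x W x 3 float array in [0, 1]. Returns object-free features."""
    r, g, b = img[..., 0], img[..., 1], img[..., 2]
    total = img.sum() + 1e-8
    color_props = np.array([r.sum(), g.sum(), b.sum()]) / total  # color proportions
    brightness = img.mean()
    contrast = img.std()
    # Crude composition cue: how much "mass" sits in the upper vs. lower half.
    h = img.shape[0] // 2
    vertical_balance = img[:h].mean() - img[h:].mean()
    return np.concatenate([color_props, [brightness, contrast, vertical_balance]])

# Hypothetical dataset: images paired with mean human valence ratings.
rng = np.random.default_rng(0)
images = rng.random((200, 64, 64, 3))   # stand-in for ~8,000 photographs
valence = rng.uniform(-1, 1, size=200)  # stand-in for human ratings

X = np.stack([simple_visual_features(im) for im in images])
model = RidgeCV(alphas=np.logspace(-3, 3, 13))

# Cross-validated fit: how well do low-level features alone predict valence?
scores = cross_val_score(model, X, valence, cv=5, scoring="r2")
print(f"mean cross-validated R^2: {scores.mean():.3f}")
```

With random stand-in data the score will hover around zero; the point of the sketch is only the shape of the approach, in which no object labels ever enter the model.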

Emotions Beyond Reality: Predicting Feelings from Abstract Art
To push the boundaries, we tested our model on abstract paintings: artworks with no clear subjects or objects. Even though the VVM was trained on real-world photos, it predicted the emotional reactions people had to these abstract pieces. Strikingly, it predicted feelings about abstract art even more accurately than feelings about realistic images. This discovery suggests that part of our feeling of valence isn’t tied to recognizing specific things but rather to visual patterns, regardless of whether the image shows something real or imaginary.
Seeing with Our Feelings
Given that visual feelings are stronger for abstract art, we wanted to find out under what conditions feelings about realistic objects and scenes also rely more on simple visual features. When the brain has limited resources for processing the content of an image (much like our VVM, which only sees simple input features), it might perceive the image more abstractly. In one experiment, participants were shown realistic images very briefly (for only 100 milliseconds) and upside down, to prevent them from fully recognizing the content. Under this limited viewing, participants’ feelings aligned much more closely with the VVM’s predictions than when they had more time to look at and interpret the images. This suggests that, under quick or limited viewing conditions, our brains experience feelings differently, relying heavily on these basic visual cues, independent of what the image depicts.
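In analysis terms, this comparison boils down to correlating the model's predictions with participants' ratings separately for each viewing condition. A minimal sketch, with hypothetical arrays standing in for the real behavioral data:

```python
# Sketch of the condition comparison: does the model align better with
# ratings made under brief, inverted viewing than under free viewing?
# The arrays below are hypothetical placeholders, not real data.
import numpy as np
from scipy.stats import pearsonr

rng = np.random.default_rng(1)
n_images = 150
vvm_pred = rng.uniform(-1, 1, n_images)  # model valence predictions per image

# Simulated ratings: brief/inverted viewing tracks the model more closely.
brief_ratings = vvm_pred + rng.normal(0, 0.4, n_images)        # 100 ms, upside down
free_ratings = 0.3 * vvm_pred + rng.normal(0, 0.8, n_images)   # unrestricted viewing

r_brief, _ = pearsonr(vvm_pred, brief_ratings)
r_free, _ = pearsonr(vvm_pred, free_ratings)
print(f"model-behavior correlation, brief viewing: r = {r_brief:.2f}")
print(f"model-behavior correlation, free viewing:  r = {r_free:.2f}")
```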
The Brain Behind Visual Valence: A Look Inside
To find out how visual-emotional connections arise in the brain, and specifically whether the visual brain itself creates visual valence, we used fMRI scanning. We discovered that visual valence is associated almost exclusively with activity in visual regions of the brain, particularly areas involved in processing simple visual features. On the other hand, the emotions tied to interpreting objects or concepts (like recognizing a friendly dog or a human face) engage higher-level brain areas. This division implies that our brains have a built-in system that processes visual valence separately from more complex, thought- or content-driven emotional responses.
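At a high level, asking "which regions carry valence information" can be framed as a decoding problem: fit a cross-validated decoder on each region's voxel patterns and see where valence is readable. The sketch below illustrates that logic with simulated voxel data and assumed shapes; it is not the study's actual analysis code.

```python
# Sketch of ROI-based valence decoding: fit a cross-validated linear
# decoder on voxel patterns from each region of interest (ROI) and
# compare where valence is readable. All data shapes are assumptions.
import numpy as np
from sklearn.linear_model import RidgeCV
from sklearn.model_selection import cross_val_predict
from scipy.stats import pearsonr

rng = np.random.default_rng(2)
n_images, n_voxels = 300, 500
valence = rng.uniform(-1, 1, n_images)

# Hypothetical voxel responses: the "early visual" ROI encodes valence,
# the control ROI carries only noise.
rois = {
    "early_visual": np.outer(valence, rng.normal(size=n_voxels))
                    + rng.normal(0, 1.0, (n_images, n_voxels)),
    "control_roi":  rng.normal(0, 1.0, (n_images, n_voxels)),
}

for name, X in rois.items():
    decoder = RidgeCV(alphas=np.logspace(-2, 4, 13))
    pred = cross_val_predict(decoder, X, valence, cv=5)  # held-out predictions
    r, _ = pearsonr(valence, pred)
    print(f"{name}: decoded-vs-actual valence r = {r:.2f}")
```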
What Do These Feelings Look Like? Generating Images from Visual Brain Activity
One fascinating part of our research delves into generating images directly from brain activity, letting us see what the visual brain sees when it creates feelings. Using a generative model, NeuroGen, created by our collaborators at Cornell University, we synthesized images based on patterns in the brain’s visual regions that are linked to positive or negative emotional tones. These generated images matched the feelings predicted by our Visual Valence Model (VVM) and were recognizable as “pleasant” or “unpleasant” by human observers as well. This demonstrates that our visual system itself contains emotional information that can generate or “imagine” new ways to make us feel.
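NeuroGen couples a deep image generator with a voxel-wise encoding model; the toy below conveys only the core idea, which is to adjust the generator's latent input so the predicted response of a valence-linked region increases. Both the linear "generator" and the linear "encoding model" here are deliberately simplified assumptions, not NeuroGen's actual implementation.

```python
# Toy illustration of encoding-model-guided image synthesis (the idea
# behind tools like NeuroGen, not its real architecture): optimize a
# generator's latent code so the predicted "valence signal" of a brain
# region increases. Both models here are linear toys.
import numpy as np

rng = np.random.default_rng(3)
latent_dim, n_pixels = 16, 256

G = rng.normal(size=(n_pixels, latent_dim))  # toy linear "generator": z -> image
w = rng.normal(size=n_pixels)                # toy linear encoding model: image -> signal

def valence_signal(z):
    return w @ (G @ z)  # predicted region response to the generated image

# Gradient ascent on the latent code. For this linear toy, the gradient
# of w^T G z with respect to z is simply G^T w.
z = rng.normal(size=latent_dim)
grad = G.T @ w
for step in range(100):
    z += 0.01 * grad / np.linalg.norm(grad)
    z /= max(1.0, np.linalg.norm(z) / 3.0)   # keep the latent in a plausible range

print(f"optimized valence signal: {valence_signal(z):.2f}")
```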
Additionally, we found that the brain’s visual system responds most positively to natural scenes and living organisms, while human-made objects tend to evoke more negative responses. This supports the concept of biophilia—the idea that we have an innate comfort and preference for natural environments. Our visual system may be evolutionarily tuned to interpret certain environmental features as emotionally meaningful, while modern human-made objects often diverge from what feels visually positive to us.
New Doorways to How We Feel the World
Our study contributes to the growing understanding that perception involves more than just recognizing objects or recalling memories; it also includes sensing fundamental qualities in our environment that carry emotional weight. While emotional responses have long been embedded in perception for proximal, or close, senses like smell and touch, we now extend this concept to vision, a distal sense.
The brain dedicates significant resources to visual perception—not just for identifying objects, but also for encoding emotional aspects. Just as texture reveals whether something is rough or smooth, visual valence tells us if something feels positive or negative. By decoding these basic visual elements, our findings reveal how deeply our emotions are intertwined with the fabric of visual perception itself.
This opens up exciting possibilities for future research, such as understanding how and when visual emotions form in development, and their distinct contributions to how we perceive, remember, and make choices.