

AI-Generated Music Evokes Stronger Emotions than Human Compositions

Editorial


Generative artificial intelligence (Gen AI) is making waves in the creative sector, particularly in music. A recent study published in PLOS One examined whether AI-generated music can elicit emotional responses comparable to those produced by human composers. The research was conducted by the Neuro-Com Research Group at the Autonomous University of Barcelona (UAB), in collaboration with the RTVE Institute in Barcelona and the University of Ljubljana in Slovenia.

The study involved 88 participants who viewed audiovisual clips featuring identical visuals but accompanied by three distinct sound conditions: music composed by humans, AI-generated music crafted from a complex prompt, and AI-generated music created from a simpler prompt. The researchers measured participants’ physiological responses, including pupil dilation, blinking, and galvanic skin response, alongside their self-reported emotional reactions.

The findings revealed that AI-generated music triggered significantly greater pupil dilation, an indicator of heightened emotional arousal among listeners. Music produced from the more complex prompt also led to more blinking and greater changes in galvanic skin response, suggesting a higher cognitive load. Interestingly, while participants perceived AI-generated music as more exciting, they found human-composed music more familiar.

Nikolaj Fišer, the lead author of the study, noted that “both types of AI-generated music led to greater pupil dilation and were perceived as more emotionally stimulating compared to human-created music.” This physiological response is closely associated with elevated emotional arousal levels. Fišer added that “our findings suggest that decoding the emotional information in AI-generated music may require greater cognitive effort,” highlighting a potential divergence in how human brains process AI-generated versus human-composed music.

These insights hold significant implications for the future of audiovisual production. Tailoring music to match a visual narrative could make creative workflows more resource-efficient, and the ability to fine-tune emotional impact with automated tools opens new avenues for creators seeking to evoke specific responses from their audiences.

This research not only expands our understanding of emotional responses to sound stimuli but also poses challenges for designing effective sensory experiences in audiovisual media. With AI’s growing role in creative fields, understanding how this technology influences emotional engagement is crucial.

For more detailed insights, refer to the study by Nikolaj Fišer et al., titled “Emotional impact of AI-generated vs. human-composed music in audiovisual media: A biometric and self-report study,” published in PLOS One on July 24, 2025. The study can be accessed through DOI: 10.1371/journal.pone.0326498.


