
A Theory-Based Explainable Deep Learning Architecture for Music Emotion

Published: 2025
Author(s): H. Fong, V. Kumar, and K. Sudhir

Abstract

This paper develops a theory-based, explainable deep learning convolutional neural network (CNN) classifier to predict the time-varying emotional response to music. We design novel CNN filters that leverage the frequency harmonics structure from acoustic physics known to impact the perception of musical features. Our theory-based model is more parsimonious yet provides predictive performance comparable to that of atheoretical deep learning models, while performing better than models using handcrafted features. Our model can be complemented with handcrafted features, but the performance improvement is marginal. Importantly, the harmonics-based structure placed on the CNN filters provides better explainability for how the model predicts emotional response (valence and arousal), because emotion is closely related to consonance—a perceptual feature defined by the alignment of harmonics. Finally, we illustrate the utility of our model with an application involving digital advertising. Motivated by YouTube’s midroll ads, we conduct a laboratory experiment in which we exogenously insert ads at different times within videos. We find that ads placed in emotionally similar contexts increase ad engagement (lower skip rates and higher brand recall rates). Ad insertion based on emotional similarity metrics predicted by our theory-based, explainable model produces comparable or better engagement relative to atheoretical models.
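To give a rough sense of the idea behind harmonics-structured filters, the sketch below builds a frequency-domain template whose peaks sit at the harmonics of a fundamental on a log-frequency (constant-Q-style) axis, then scores a spectral frame by its alignment with that template. This is a minimal illustration of the general principle, not the paper's actual architecture; all names, bin counts, and weighting choices (e.g., the 1/k harmonic decay and Gaussian peak width) are assumptions for the example.

```python
import numpy as np

def harmonic_filter(n_bins, f0_bin, n_harmonics=4, bins_per_octave=12, width=1.0):
    """Build a 1-D frequency filter with Gaussian peaks at the harmonics
    of a fundamental. On a log-frequency axis, the k-th harmonic of a
    fundamental at bin f0_bin sits at f0_bin + bins_per_octave * log2(k).
    Harmonic weights decay as 1/k (an illustrative assumption)."""
    filt = np.zeros(n_bins)
    bins = np.arange(n_bins)
    for k in range(1, n_harmonics + 1):
        hk = f0_bin + bins_per_octave * np.log2(k)
        filt += np.exp(-0.5 * ((bins - hk) / width) ** 2) / k
    return filt / np.linalg.norm(filt)

# Synthetic spectral frame: a harmonic tone with fundamental at bin 24.
frame = np.zeros(96)
for k in (1, 2, 3, 4):
    frame[int(round(24 + 12 * np.log2(k)))] = 1.0 / k

# A matched template responds far more strongly than a mismatched one,
# which is the intuition linking harmonic alignment to consonance.
score_match = harmonic_filter(96, 24) @ frame   # template at the true fundamental
score_miss = harmonic_filter(96, 31) @ frame    # template shifted away
print(score_match > score_miss)  # → True
```

In a CNN, filters of this form would replace (or constrain) freely learned convolution kernels over the spectrogram's frequency axis, which is what makes the learned representation both parsimonious and interpretable in terms of harmonic alignment.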

Topics: Marketing
Journal: Marketing Science
Volume: 44
Issue: 1
Pages: 1-246