Bimonthly Since 1986
ISSN 1004-9037
Publication Details
Edited by: Editorial Board of Journal of Data Acquisition and Processing
P.O. Box 2704, Beijing 100190, P.R. China
Sponsored by: Institute of Computing Technology, CAS & China Computer Federation
Undertaken by: Institute of Computing Technology, CAS
Published by: SCIENCE PRESS, BEIJING, CHINA
Distributed by:
China: All Local Post Offices
Abstract
Music plays a vital role in our everyday life; it can change our mood, and we listen to it whatever that mood may be: while working, driving, travelling, or even while reading a comic or a story. Music can induce a clear emotional response in its listeners, as the pitch and rhythm of music are processed in the areas of the brain that deal with emotions and mood. Thus, music plays an important role in enhancing our mood. As elders have said, "the face is the index of the mind": a person's mood can be read from their face. The aim of this project is to build an automated system that creates playlists and plays songs according to the user's mood by directly discerning the user's facial emotions. The system uses a camera to capture the user's face, recognizes the mood with a convolutional neural network (CNN), and then recommends a playlist based on the discerned mood. This removes the tedious, monotonous task of manually sorting songs into separate lists and helps create a playlist suited to a person's emotional state. Hence, the proposed system can be used to build a music recommendation system based on the facial emotion gestures of the user.
Keywords
Mood, Music, CNN, Facial emotion gestures
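The pipeline described in the abstract (capture face, classify mood with a CNN, map mood to a playlist) can be sketched as follows. This is a minimal illustration, not the authors' implementation: `classify_emotion` is a hypothetical stub standing in for a trained CNN, and the playlist catalogue is invented for the example.

```python
# Minimal sketch of the mood-based playlist recommender described above.
# Assumptions: a trained CNN would replace the stubbed classifier below,
# and the mood -> playlist mapping is purely illustrative.

EMOTIONS = ["happy", "sad", "angry", "neutral"]

# Hypothetical mood -> playlist catalogue.
PLAYLISTS = {
    "happy": ["Upbeat Pop Mix", "Feel-Good Classics"],
    "sad": ["Mellow Acoustic", "Rainy Day Ballads"],
    "angry": ["Hard Rock Energy", "Workout Metal"],
    "neutral": ["Lo-fi Focus", "Ambient Chill"],
}


def classify_emotion(face_image):
    """Stub for a CNN emotion classifier.

    A real implementation would run a trained CNN on a face crop
    (commonly a 48x48 grayscale image) and return the argmax label
    from EMOTIONS. Here we always return "neutral" for the sketch.
    """
    return "neutral"


def recommend_playlist(face_image):
    """Map the detected facial emotion to a playlist."""
    mood = classify_emotion(face_image)
    # Fall back to the neutral playlist for any unrecognized label.
    return PLAYLISTS.get(mood, PLAYLISTS["neutral"])
```

In a full system, the input to `recommend_playlist` would be a frame grabbed from the camera, and the CNN stub would be replaced by inference with the trained emotion model.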