Live facial expression recognition

In conversation, facial expressions play an essential role in conveying emotions, unspoken messages, and unconscious thoughts. By reading others' facial expressions, we can better understand how they feel about events and relate ourselves...


Bibliographic Details
Main Author: Tan, Wei Mun
Format: Final Year Project / Dissertation / Thesis
Published: 2022
Subjects:
Online Access:http://eprints.utar.edu.my/4669/1/fyp_CS_2022_TWM.pdf
http://eprints.utar.edu.my/4669/
_version_ 1833428671290933248
author Tan, Wei Mun
author_facet Tan, Wei Mun
author_sort Tan, Wei Mun
building UTAR Library
collection Institutional Repository
content_provider Universiti Tunku Abdul Rahman
content_source UTAR Institutional Repository
continent Asia
country Malaysia
description In conversation, facial expressions play an essential role in conveying emotions, unspoken messages, and unconscious thoughts. By reading others' facial expressions, we can better understand how they feel about events, relate ourselves to what they say, or discover information the speaker does not intend to reveal. However, people affected by conditions such as autism or visual impairment can have a hard time interpreting facial expressions, and so miss information that would otherwise help them communicate with others. Hence, this project is motivated by the goal of helping people with such disabilities recognize others' facial expressions and thereby communicate more effectively. The first draft design integrates an online facial expression recognition (FER) API into a mobile application to help the target audience recognize others' emotions. However, the research identified several issues: the high number of queries to a paid FER service, the large amount of data transmitted when every camera frame is sent to the online FER service, and the relatively long overall processing time when recognizing expressions from the full-size images captured by the smartphone. Therefore, this paper proposes three stages of potential optimization: temporal downsampling to reduce the number of camera frames processed, spatial trimming so that only the face region of each image is sent to the FER service, and caching to reduce the number of paid FER queries. Finally, the potential optimizations are evaluated to determine the best implementation: a FER procedure that sends only the appropriate frames for recognition, reduces the amount of data transmitted per FER query, and ideally improves the overall processing time from acquiring the input to delivering the FER result. The chosen implementation is then realized by building the mobile application, improving the portability of FER.
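The three optimization stages described in the abstract could be sketched roughly as follows. This is a minimal illustrative sketch, not the author's implementation: the `FERPipeline` class, the byte-string frames, the slicing in `crop_face`, and the `query_fer_service` stub are all hypothetical stand-ins for the camera input, the face detector, and the paid online FER API.

```python
import hashlib
from typing import Dict, Optional, Tuple


class FERPipeline:
    """Hypothetical sketch of the three optimizations: downsample, trim, cache."""

    def __init__(self, sample_every: int = 5):
        self.sample_every = sample_every   # stage 1: keep only 1 in N camera frames
        self.cache: Dict[str, str] = {}    # stage 3: frame fingerprint -> FER label
        self.frame_count = 0
        self.queries_sent = 0              # counts calls to the (paid) FER service

    def crop_face(self, frame: bytes, box: Tuple[int, int]) -> bytes:
        # Stage 2 (spatial trimming): send only the face region.
        # A byte slice stands in for a 2-D crop around a detected face box.
        start, end = box
        return frame[start:end]

    def query_fer_service(self, face: bytes) -> str:
        # Stand-in for the online FER API call; a real client would upload
        # the cropped image and parse the returned emotion label.
        self.queries_sent += 1
        return "happy" if sum(face) % 2 == 0 else "neutral"

    def process(self, frame: bytes, box: Tuple[int, int]) -> Optional[str]:
        self.frame_count += 1
        if self.frame_count % self.sample_every != 0:
            return None                            # stage 1: frame dropped
        face = self.crop_face(frame, box)          # stage 2: trim to face region
        key = hashlib.sha1(face).hexdigest()       # stage 3: cache lookup
        if key not in self.cache:
            self.cache[key] = self.query_fer_service(face)
        return self.cache[key]


pipeline = FERPipeline(sample_every=5)
# 20 identical frames: only 4 survive downsampling, and the cache
# collapses those into a single paid FER query.
results = [pipeline.process(b"\x01" * 100, (10, 60)) for _ in range(20)]
```

In this sketch the cache key is an exact hash of the cropped bytes, so only identical crops hit the cache; a real implementation would likely use a similarity measure (e.g. a perceptual hash) so that near-identical consecutive frames also avoid a query.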
format Final Year Project / Dissertation / Thesis
id my-utar-eprints.4669
institution Universiti Tunku Abdul Rahman
publishDate 2022
record_format eprints
spelling my-utar-eprints.4669 2023-01-15T13:37:34Z Live facial expression recognition Tan, Wei Mun Q Science (General) T Technology (General) In conversation, facial expressions play an essential role in conveying emotions, unspoken messages, and unconscious thoughts. By reading others' facial expressions, we can better understand how they feel about events, relate ourselves to what they say, or discover information the speaker does not intend to reveal. However, people affected by conditions such as autism or visual impairment can have a hard time interpreting facial expressions, and so miss information that would otherwise help them communicate with others. Hence, this project is motivated by the goal of helping people with such disabilities recognize others' facial expressions and thereby communicate more effectively. The first draft design integrates an online facial expression recognition (FER) API into a mobile application to help the target audience recognize others' emotions. However, the research identified several issues: the high number of queries to a paid FER service, the large amount of data transmitted when every camera frame is sent to the online FER service, and the relatively long overall processing time when recognizing expressions from the full-size images captured by the smartphone. Therefore, this paper proposes three stages of potential optimization: temporal downsampling to reduce the number of camera frames processed, spatial trimming so that only the face region of each image is sent to the FER service, and caching to reduce the number of paid FER queries.
Finally, the potential optimizations are evaluated to determine the best implementation: a FER procedure that sends only the appropriate frames for recognition, reduces the amount of data transmitted per FER query, and ideally improves the overall processing time from acquiring the input to delivering the FER result. The chosen implementation is then realized by building the mobile application, improving the portability of FER. 2022-04-21 Final Year Project / Dissertation / Thesis NonPeerReviewed application/pdf http://eprints.utar.edu.my/4669/1/fyp_CS_2022_TWM.pdf Tan, Wei Mun (2022) Live facial expression recognition. Final Year Project, UTAR. http://eprints.utar.edu.my/4669/
spellingShingle Q Science (General)
T Technology (General)
Tan, Wei Mun
Live facial expression recognition
title Live facial expression recognition
title_full Live facial expression recognition
title_fullStr Live facial expression recognition
title_full_unstemmed Live facial expression recognition
title_short Live facial expression recognition
title_sort live facial expression recognition
topic Q Science (General)
T Technology (General)
url http://eprints.utar.edu.my/4669/1/fyp_CS_2022_TWM.pdf
http://eprints.utar.edu.my/4669/
url_provider http://eprints.utar.edu.my