Although facial attractiveness is data-driven and not dependent on any single perceiver, traditional statistical methods cannot properly identify relationships between facial geometry and its visual impression. Similarly, classifying facial images by emotion is also challenging, since the classification must account for the fact that the overall facial impression always depends on the currently expressed facial emotion.
To address these problems, both profile and portrait facial images of the patients (n = 42) were preprocessed, landmarked, and analyzed in the R language. Multivariate regression was carried out to detect indicators that increase facial attractiveness after rhinoplasty.
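The regression step above can be sketched as ordinary least squares over geometric predictors. This is a minimal illustration only: the predictor names (changes in the nasolabial and nasofrontal angles) follow the abstract's findings, but the toy data and helper function are invented for the sketch and are not the study's actual measurements or code.

```python
# Hypothetical sketch: model an attractiveness rating from two angle
# changes measured before/after rhinoplasty. Pure-stdlib OLS via the
# normal equations (X^T X) b = X^T y, solved by Gaussian elimination.

def fit_ols(X, y):
    """X rows include a leading 1 for the intercept; returns coefficients."""
    n, p = len(X), len(X[0])
    # Build the normal-equation system A b = c with A = X^T X, c = X^T y.
    A = [[sum(X[r][i] * X[r][j] for r in range(n)) for j in range(p)]
         for i in range(p)]
    c = [sum(X[r][i] * y[r] for r in range(n)) for i in range(p)]
    # Forward elimination with partial pivoting.
    for col in range(p):
        piv = max(range(col, p), key=lambda r: abs(A[r][col]))
        A[col], A[piv] = A[piv], A[col]
        c[col], c[piv] = c[piv], c[col]
        for r in range(col + 1, p):
            f = A[r][col] / A[col][col]
            for k in range(col, p):
                A[r][k] -= f * A[col][k]
            c[r] -= f * c[col]
    # Back substitution.
    coef = [0.0] * p
    for i in range(p - 1, -1, -1):
        coef[i] = (c[i] - sum(A[i][j] * coef[j]
                              for j in range(i + 1, p))) / A[i][i]
    return coef

# Synthetic rows: [1 (intercept), Δ nasolabial angle, Δ nasofrontal angle];
# ratings generated as exactly 1 + 0.5*x1 + 1.0*x2 so the fit is checkable.
X = [[1, 2.0, 1.0], [1, 4.0, 2.0], [1, 6.0, 2.5], [1, 1.0, 0.5], [1, 5.0, 3.0]]
y = [3.0, 5.0, 6.5, 2.0, 6.5]
print(fit_ols(X, y))  # ≈ [1.0, 0.5, 1.0] for this exactly-linear toy data
```

In the study itself this would be done with R's built-in regression tooling; the point of the sketch is only the shape of the computation: geometric deltas in, attractiveness coefficients out.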
Naive Bayes classifiers, decision trees (CART), and neural networks were built to classify a new facial image into one of the facial emotions defined by the Ekman–Friesen Facial Action Coding System (FACS). Enlargement of the nasolabial and nasofrontal angles during rhinoplasty increases facial attractiveness (p < 0.05).
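Of the three classifier families, the naive Bayes approach is the simplest to sketch. The following is a minimal Gaussian naive Bayes over landmark-derived geometric features; the two feature names, the two emotion labels, and the toy samples are all illustrative assumptions, not the study's data or feature set.

```python
# Illustrative Gaussian naive Bayes: classify a face into an emotion from
# hand-crafted geometric features (here: mouth-corner lift, eyebrow raise).
import math
from collections import defaultdict

def train(samples):
    """samples: list of (features, label). Returns, per class, the prior
    and per-feature Gaussian (mean, variance) estimates."""
    by_label = defaultdict(list)
    for feats, label in samples:
        by_label[label].append(feats)
    model, total = {}, len(samples)
    for label, rows in by_label.items():
        prior = len(rows) / total
        stats = []
        for j in range(len(rows[0])):
            col = [r[j] for r in rows]
            mean = sum(col) / len(col)
            var = sum((v - mean) ** 2 for v in col) / len(col) + 1e-9
            stats.append((mean, var))
        model[label] = (prior, stats)
    return model

def classify(model, feats):
    """Pick the label maximizing log prior + sum of Gaussian log-likelihoods."""
    def log_post(prior, stats):
        lp = math.log(prior)
        for x, (m, v) in zip(feats, stats):
            lp += -0.5 * math.log(2 * math.pi * v) - (x - m) ** 2 / (2 * v)
        return lp
    return max(model, key=lambda lab: log_post(*model[lab]))

# Toy training set: [mouth-corner lift, eyebrow raise] -> emotion label.
data = [([0.9, 0.2], "happiness"), ([0.8, 0.1], "happiness"),
        ([0.1, 0.9], "surprise"), ([0.2, 0.8], "surprise")]
model = train(data)
print(classify(model, [0.85, 0.15]))  # → happiness
```

CART and neural-network classifiers would consume the same feature vectors; the study compared all three and found neural networks most accurate.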
Decision trees showed that mouth geometry, followed by the eyebrows and finally the eyes, had the strongest influence, in this descending order, on the classified emotion. Neural networks returned the highest classification accuracy.
The machine-learning analyses performed identified which facial features affect facial attractiveness the most and should therefore be targeted by plastic surgery procedures. The classification of facial images into emotions shows possible associations between facial geometry and facial emotions.