Author
Wu, Yue
Other Contributors
Ji, Qiang, 1963-; Sanderson, A. C. (Arthur C.); Radke, Richard J., 1974-; Braasch, Jonas
Date Issued
2016-12
Subject
Electrical, computer, and systems engineering
Degree
PhD
Terms of Use
This electronic version is a licensed copy owned by Rensselaer Polytechnic Institute, Troy, NY. Copyright of original work retained by author.
Abstract
The face plays an important role in human communication. It conveys identity, communicates emotion, and indicates intent. Automatic analysis of facial behavior allows machines to understand and interpret a person's intents and needs for natural interaction. This thesis focuses on developing advanced computer vision techniques to process and analyze facial images for the recognition of various facial behaviors.

Specifically, this research consists of two parts: automatic facial landmark detection and tracking, and facial behavior analysis and recognition using the tracked facial landmark points. Facial landmark detection and tracking involves detecting and tracking fiducial facial points located around the major facial components, such as the eyes and mouth. These points encode critical information about the muscle movements underlying various facial behaviors. However, robust facial landmark detection and tracking are challenging due to variations in facial expression, head pose, and illumination, as well as facial occlusions. To address these challenges, we develop several techniques. First, to handle facial expression and head pose variations, we introduce a hierarchical probabilistic face shape model and a discriminative deep face shape model that capture the spatial relationships among facial landmark points under different facial expressions and face poses. Our methods leverage the captured relationships to significantly improve facial landmark detection under varying face poses and expressions. Second, to handle facial occlusion, we improve upon the effective cascade regression framework by explicitly predicting not only the landmark locations but also their occlusion probabilities, and by using these probabilities to weight the contribution of each landmark's local appearance to the prediction of the new landmark positions.

The second part of this research applies our facial landmark detection and tracking algorithms to facial behavior analysis, including facial action recognition and head pose estimation. For facial action recognition, we introduce a novel regression framework for joint facial landmark detection and facial action recognition. By exploiting the interactions between facial landmark positions and facial actions, as well as the superior performance of the regression framework for landmark detection, our method improves both facial landmark detection and facial action recognition. For head pose estimation, we propose two methods. The first is a unified framework for simultaneous facial landmark detection, head pose estimation, and facial deformation analysis on facial images that is robust to facial occlusion. Following a cascade procedure augmented with model-based head pose estimation, we iteratively update the facial landmark locations, facial occlusion, head pose, and facial deformation until convergence. The second is a video-based head pose estimation method that jointly performs pose estimation over multiple frames. Experimental results on benchmark databases demonstrate the effectiveness of the proposed facial behavior analysis methods.

Experimental evaluations of our methods on benchmark databases demonstrate their improved performance on facial images under varying facial expressions, head poses, and facial occlusions.
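To make the occlusion-handling idea above concrete, the sketch below shows one stage of a generic cascade-regression update in which each landmark's local appearance is down-weighted by its predicted occlusion probability before regressing the shape and occlusion updates. This is a hedged, simplified illustration, not the thesis implementation: the helper extract_local_patch, the linear regressors W_shape and W_occlusion, the patch size, and the 68-point layout are all illustrative assumptions.

```python
# Hedged sketch of one cascade-regression stage with occlusion-weighted
# local appearance features (illustrative only; not the author's code).
import numpy as np

def extract_local_patch(image, point, size=8):
    """Crop a (2*size x 2*size) grayscale patch centered on a landmark."""
    x, y = int(point[0]), int(point[1])
    padded = np.pad(image, size, mode="constant")   # guard against borders
    return padded[y:y + 2 * size, x:x + 2 * size].ravel()

def cascade_step(image, landmarks, occlusion_prob, W_shape, W_occlusion):
    """One cascade stage: weight each landmark's features by its visibility
    (1 - occlusion probability), then regress shape and occlusion updates."""
    features = []
    for point, p_occ in zip(landmarks, occlusion_prob):
        patch = extract_local_patch(image, point).astype(float)
        features.append((1.0 - p_occ) * patch)      # down-weight occluded landmarks
    phi = np.concatenate(features)

    landmarks = landmarks + (W_shape @ phi).reshape(-1, 2)              # shape update
    occlusion_prob = np.clip(occlusion_prob + W_occlusion @ phi, 0, 1)  # occlusion update
    return landmarks, occlusion_prob

# Toy usage with random data and untrained (random) regressors.
rng = np.random.default_rng(0)
image = rng.random((128, 128))
landmarks = rng.uniform(20, 100, size=(68, 2))      # hypothetical 68-point face model
occlusion_prob = np.full(68, 0.1)
feat_dim = 68 * (2 * 8) ** 2
W_shape = rng.normal(scale=1e-4, size=(68 * 2, feat_dim))
W_occlusion = rng.normal(scale=1e-4, size=(68, feat_dim))
landmarks, occlusion_prob = cascade_step(image, landmarks, occlusion_prob, W_shape, W_occlusion)
```

In practice the regressors would be learned stage by stage from training data, and the stage would be repeated for a fixed number of iterations; the sketch only shows how predicted occlusion probabilities can modulate the appearance features used for the next shape update.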
Description
December 2016; School of Engineering
Department
Dept. of Electrical, Computer, and Systems Engineering
Publisher
Rensselaer Polytechnic Institute, Troy, NY
Relationships
Rensselaer Theses and Dissertations Online Collection
Access
Restricted to current Rensselaer faculty, staff, and students. Access inquiries may be directed to the Rensselaer Libraries.