In this work, a variant of the recurrent neural network is adopted to accomplish several tasks in the non-verbal expression of emotions. This work was sponsored by the EU ALIZ-E project.
Using a Recurrent Neural Network with Parametric Bias units (RNNPB), we trained on a selection of the data on expressive human movement collected with an inertial motion capture system in the first year and analyzed subsequently. We found that the additional PB variables of the RNNPB act as bifurcation parameters for its non-linear dynamics. The PB units therefore constitute a low-dimensional space that reduces the features and represents slow-changing profiles (such as emotion) of the features of body movements. Furthermore, in agreement with our above-mentioned study on the analysis of human movement and with the work presented in the previous subsection, this behavioral expression space should not be restricted to the basic emotions but should also be continuous.
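The generative role of the PB units can be illustrated with a minimal sketch: a small Elman-style recurrent network in which a low-dimensional PB vector is appended to the input at every step, so that a single shared weight set produces different movement dynamics for different PB values. All dimensions and weights below are illustrative placeholders, not the configuration used in the study.

```python
import numpy as np

# Minimal illustrative sketch of an RNN with Parametric Bias (PB) units.
# The PB vector is concatenated to the input at each step; with the
# weights fixed, changing the PB value changes the generated dynamics,
# which is how the PB space can encode slow-changing profiles such as
# emotion. Sizes and random weights are assumptions for illustration.

rng = np.random.default_rng(0)

IN, PB, HID, OUT = 3, 2, 8, 3   # feature dim, PB dim, hidden dim, output dim

W_in  = rng.normal(scale=0.5, size=(HID, IN + PB))   # (input + PB) -> hidden
W_rec = rng.normal(scale=0.5, size=(HID, HID))       # hidden -> hidden
W_out = rng.normal(scale=0.5, size=(OUT, HID))       # hidden -> output

def generate(pb, steps=20):
    """Closed-loop generation: the prediction is fed back as the next input."""
    x = np.zeros(IN)
    h = np.zeros(HID)
    traj = []
    for _ in range(steps):
        h = np.tanh(W_in @ np.concatenate([x, pb]) + W_rec @ h)
        x = np.tanh(W_out @ h)
        traj.append(x.copy())
    return np.array(traj)

# Two PB values act like bifurcation parameters: identical weights,
# different generated trajectories.
t_a = generate(np.array([ 0.8, -0.8]))
t_b = generate(np.array([-0.8,  0.8]))
print(np.abs(t_a - t_b).max())
```

In training, the weights are shared across all sequences while one PB vector is learned per sequence, which is what compresses the sequence-level (e.g. emotional) variation into the small PB space.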
With Kinect input and coordinate mapping, in the second part of this work we successfully used the RNNPB to identify personalised emotions from users' non-verbal behaviours. In this way, our lab colleagues brought together affect recognition, expression, and the internal parameters of the emotion model within the embodied cognitive architecture of the NAO robot to generate child-like behaviour with various emotion expressions: in this case, affective behavioral expression and recognition are directly linked to the affective space modeling the (continuous) internal affective states of the robot and their dynamics.
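The recognition step can be sketched in the same spirit: with the network weights held fixed, the PB value of an observed movement sequence is recovered by minimizing the reconstruction error over the PB space, and the recovered PB position is then read as the (continuous) emotion estimate. The original approach backpropagates this error through the network; the sketch below substitutes a finite-difference gradient to stay short, and all sizes, weights, and the "true" PB value are illustrative assumptions.

```python
import numpy as np

# Hedged sketch of PB-based recognition: weights stay fixed and the PB
# vector is optimized so that the network's generated trajectory matches
# the observed one. A finite-difference gradient stands in for the
# backpropagated error used in the actual RNNPB scheme.

rng = np.random.default_rng(1)
IN, PB, HID = 3, 2, 8                     # feature dim, PB dim, hidden dim
W_in  = rng.normal(scale=0.5, size=(HID, IN + PB))
W_rec = rng.normal(scale=0.5, size=(HID, HID))
W_out = rng.normal(scale=0.5, size=(IN, HID))

def generate(pb, steps=15):
    """Closed-loop generation of a movement-feature trajectory."""
    x, h, traj = np.zeros(IN), np.zeros(HID), []
    for _ in range(steps):
        h = np.tanh(W_in @ np.concatenate([x, pb]) + W_rec @ h)
        x = np.tanh(W_out @ h)
        traj.append(x.copy())
    return np.array(traj)

def loss(pb, target):
    return np.mean((generate(pb, len(target)) - target) ** 2)

true_pb = np.array([0.6, -0.4])           # stands in for one emotion profile
observed = generate(true_pb)              # the "captured" movement sequence

pb = np.zeros(PB)                         # recognition: search the PB space
best_pb, best_loss = pb.copy(), loss(pb, observed)
for _ in range(300):
    grad = np.zeros(PB)
    for i in range(PB):                   # finite-difference gradient
        e = np.zeros(PB)
        e[i] = 1e-4
        grad[i] = (loss(pb + e, observed) - loss(pb - e, observed)) / 2e-4
    pb -= 0.1 * grad
    if loss(pb, observed) < best_loss:
        best_pb, best_loss = pb.copy(), loss(pb, observed)

print(best_pb, best_loss)
```

Because generation and recognition share the same PB space, the robot's expressed and recognised affect are tied to one continuous internal representation, which is the linkage the architecture above exploits.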