Getting the Butterflies to Fly in Formation: Developing an Individual Flow Zone Profile to Maximise Learning Outcomes

Location: Joondalup, WA

Duration: 3 months

Project Background

Cinglevue aims to make learning tangible and realisable through the development of innovative solutions, delivered and tracked through our Virtuoso platform. One of the platform’s primary research focus areas is the enhancement of learning outcomes, contextualised in terms of identifying an individual’s flow zone profile – a relationship between emotional quality and academic achievement that is moderated by emotional intensity. However, it is difficult for teachers to recognise this profile via summative assessments. Without a more intuitive, quantified, cognitive approach, the Virtuoso platform cannot currently offer a comprehensive emotional view of the student, which is likely to limit the platform’s ability to maximise learning outcomes.

Consequently, Cinglevue is developing a commercially viable solution to identify individual flow zone profiles through static and dynamic human facial expression measures. The focus of this internship will be on the manipulation of static human and emoticon faces. To date, most approaches (and their associated software packages), including FACSGen 2.0 (Krumhuber, Tamarit, Roesch, & Scherer, 2012), HAPFACS 3.0 (Amini, Lisetti, & Ruiz, 2015), and FACSHuman (Gilbert, Demarchi, & Urdapilleta, 2018), only offer the option of manipulating virtual faces. Moreover, researchers who have successfully manipulated static human facial expressions find it difficult to maintain authentic image identity (e.g., Averbuch-Elor, Cohen-Or, Kopf, & Cohen, 2017). However, promising algorithmic approaches exist that can (a) change the quality and intensity of human facial expressions, and (b) preserve the identity of the individual in the image (e.g., Kim et al., 2018; Tripathy, Kannala, & Rahtu, 2019). The latter approaches are of interest to the current project. Therefore, a PhD student with expertise in the field of computer vision (particularly in algorithm development to manipulate facial expression quality and intensity in human and emoticon faces) is required to complete the first stage of the current project.

Research to be Conducted

The overall goal of this project is to test a model aimed at identifying whether an individual flow zone profile exists for every student. However, this is a working model (i.e., the relationship between emotional quality and emotional intensity in predicting academic achievement) and its proposed principles require further empirical research and replication. This would be the first model to incorporate emotional quality and emotional intensity to develop an individual flow zone profile contextualised to the classroom setting. To test this model, however, there are specific prerequisites that need attention. One prerequisite is to confirm the factor structure of a proposed latent variable, ‘emotional quality’. To achieve this, we are aiming to develop and validate a test that assesses the ability to recognise emotional quality (academic emotions) through facial expressions. Developing the facial stimuli for this assessment will be the major aim of this internship. In particular, the objectives of the internship are:

  • To be involved in manipulating static human faces to be used in a series of initial pilot studies;
  • Based on the results of the pilot studies, to develop a usable algorithm (and associated tool) that is able to manipulate uploaded facial images on the basis of the Facial Action Coding System (FACS; Ekman, Friesen, & Hager, 2002). With this tool, users will be able to convert any image to exhibit specific emotional qualities (e.g., enjoyment, hope, pride) at three different emotional intensity levels (e.g., low, medium, and high enjoyment; low, medium, and high hope; etc.). This algorithm should be able to manipulate both human and non-human faces (including emoticons), while maintaining the identity of the person in the image. The AU specifications, along with the intensity level specifications, will be provided to the intern. The principles behind this tool can be modelled on previous software, for example, the FACSGen Animation Software. A minimal illustrative sketch of how such a tool’s inputs might be organised follows this list.
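For illustration only, the sketch below shows one way the tool’s inputs could be organised: emotional qualities mapped to FACS Action Units, each applied at one of three intensity levels, with the image manipulation itself left as a stub. The AU numbers for hope and pride and the intensity weights are placeholder assumptions (the actual AU and intensity specifications will be provided to the intern, as noted above), and the function and file names are hypothetical.

```python
from dataclasses import dataclass
from enum import Enum
from typing import Dict, Tuple


class Intensity(Enum):
    LOW = 1
    MEDIUM = 2
    HIGH = 3


@dataclass(frozen=True)
class ExpressionTarget:
    """A target expression: an emotional quality at a given intensity."""
    emotion: str
    intensity: Intensity


# Placeholder mapping from emotional quality to FACS Action Units.
# AU6 + AU12 is the widely cited combination for enjoyment/happiness;
# the entries for hope and pride are illustrative only -- the real AU
# specifications will be supplied by the project team.
AU_MAP: Dict[str, Tuple[int, ...]] = {
    "enjoyment": (6, 12),
    "hope": (1, 2),      # placeholder
    "pride": (12, 53),   # placeholder
}


def au_activation(target: ExpressionTarget) -> Dict[int, float]:
    """Convert a target expression into per-AU activation weights in [0, 1]."""
    weight = {Intensity.LOW: 0.33, Intensity.MEDIUM: 0.66, Intensity.HIGH: 1.0}[target.intensity]
    return {au: weight for au in AU_MAP[target.emotion]}


def manipulate_face(image_path: str, target: ExpressionTarget) -> str:
    """
    Stub for the manipulation step: an identity-preserving model
    (e.g., in the spirit of Kim et al., 2018) would take the AU
    activations and re-render the input face.
    """
    activations = au_activation(target)
    out_path = f"{image_path.rsplit('.', 1)[0]}_{target.emotion}_{target.intensity.name.lower()}.png"
    # ... model inference would go here ...
    print(f"Would apply AUs {activations} to {image_path} -> {out_path}")
    return out_path


if __name__ == "__main__":
    manipulate_face("student_001.png", ExpressionTarget("enjoyment", Intensity.HIGH))
```

In a full solution, the stub in manipulate_face would be replaced by an identity-preserving generative model of the kind cited in the Project Background; the data model above simply shows how emotional quality and intensity could be specified independently of that model.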

Skills Required

We are looking for a PhD student with the following:

ESSENTIAL

  • Experience with facial manipulation/transformation/morphing
  • Experience in image generation and synthesis
  • Experience in machine learning and development of algorithms
  • Experience with using C#
  • Knowledge and experience with software/application development

DESIRABLE

  • Knowledge and experience of using facial manipulation programs (e.g., Fantamorph, Morpheus and Psychomorph)
  • Knowledge of facial action unit and emotion research
  • Experience developing reusable code

Expected Outcomes

In the first instance, it is expected that the APR intern will produce a series of manipulated human faces that will be used in a small-scale pilot experiment. Following the results of this experiment, it is then expected that the intern will lead the creation of a commercially viable solution that allows users to automatically change the facial expression of static faces. Finally, at the end of the internship, the intern is encouraged to deliver a short presentation on their approach and on what was learnt throughout the course of the internship.

Additional Details

The intern will receive $3,000 per month of the internship, usually in the form of stipend payments.

It is expected that the intern will primarily undertake this research project during regular business hours, spending at least 80% of their time on-site with the industry partner. The intern will be expected to maintain contact with their academic mentor throughout the internship either through face-to-face or phone meetings as appropriate.

The intern and their academic mentor will have the opportunity to negotiate the project’s scope, milestones and timeline during the project planning stage.

Applications Close

7 February 2020

Reference

APR – 1211