======WP9 meeting at UH======

Things to look into:
  * SAIBA [[http://www.mindmakers.org/projects/SAIBA]] Multimodal Behavior Generation Framework
  * Bielefeld XML (Behavior Markup Language) [[http://wiki.mindmakers.org/projects:bml:main]] (a minimal sketch of a BML block appears at the end of this page)

=====Demos for milestone 4.2=====

====INESC-ID====

Use the iCat to:
  * Play chess as normal with someone
  * But this time remember players from previous games

This needs player identification. The plan was to use RFID, but that is turning out not so well due to its limited range. Another possibility is face identification, but there is no current solution for this; something simple like different coloured hats could also work. The fallback is a Wizard of Oz ([[wp>Wizard_of_Oz_experiment]]) solution.

Needs: Possibly face tracking from QM

====UH====

A scenario in the robot house: find and locate a user as a mobile robot, then migrate to another place to communicate with a second user (upstairs) in fixed graphical form.
  * Finding and locating the user could use motion tracking or face finding (see the face-finding sketch at the end of this page).

Needs: Possibly face tracking from QM, ION

====HW====

  * Greta on a fixed system - use face tracking to make the eyes follow the user.
  * Mobile robot - move in relation to an interaction partner.
  * Handheld - demonstrate moving FAtiMA data from one platform to another.

Needs: Face finding from QM, ION and the 2D Greta face
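
For the BML item above, here is a minimal, hypothetical sketch of the kind of behaviour block the markup describes, built with Python's standard library so the structure stays explicit. The element and attribute names (''bml'', ''gaze'', ''speech'', the ''gaze1:end'' sync reference) follow the public BML draft and should be checked against the spec on mindmakers.

<code python>
import xml.etree.ElementTree as ET

# Hypothetical minimal BML block: gaze at a user, then greet them once the gaze ends.
# Element/attribute names follow the public BML draft; verify against the spec.
bml = ET.Element("bml", id="bml1")
ET.SubElement(bml, "gaze", id="gaze1", target="user1")
speech = ET.SubElement(bml, "speech", id="speech1", start="gaze1:end")
ET.SubElement(speech, "text").text = "Hello!"

# Prints roughly:
# <bml id="bml1"><gaze id="gaze1" target="user1" /><speech id="speech1" start="gaze1:end"><text>Hello!</text></speech></bml>
print(ET.tostring(bml, encoding="unicode"))
</code>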
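
The face tracking / face finding needed by the demos could be prototyped with OpenCV's stock Haar cascade detector while waiting on the QM tracker. The sketch below is only an illustrative stand-in under that assumption; the camera index and cascade file are placeholders.

<code python>
import cv2

# Prototype face finding with OpenCV's bundled frontal-face Haar cascade.
# Illustrative stand-in only, not the QM face tracker.
cascade = cv2.CascadeClassifier(
    cv2.data.haarcascades + "haarcascade_frontalface_default.xml")
cap = cv2.VideoCapture(0)  # assumed camera index

while True:
    ok, frame = cap.read()
    if not ok:
        break
    gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
    # Detect faces; returns (x, y, w, h) rectangles in pixel coordinates.
    faces = cascade.detectMultiScale(gray, scaleFactor=1.1, minNeighbors=5)
    for (x, y, w, h) in faces:
        cv2.rectangle(frame, (x, y), (x + w, y + h), (0, 255, 0), 2)
    cv2.imshow("face finding", frame)
    if cv2.waitKey(1) & 0xFF == ord("q"):  # press q to quit
        break

cap.release()
cv2.destroyAllWindows()
</code>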