As robots enter our daily lives, their social compatibility gains importance. To interact meaningfully with humans, robots must develop an advanced real-world social intelligence that includes novel perceptual, behavioural, emotional, motivational and cognitive capabilities.
We propose a biomimetic, brain-inspired approach centred on game play that also encompasses music composition. To realize game-like interactions with a social machine, we use the humanoid iCub robot and adopt a mixed-reality interaction paradigm built around the Reactable, a table-top tangible interface system that is itself a musical instrument.
Bio-inspired mechanisms enable the robot to perceive, interact with, and learn from human partners in the context of increasingly complex table-top games on the musical tangible display. Using biomimetic models of perception, emotion and cognition, the iCub will locate and identify a single human partner with whom to play games and compose music.
In this context, we present an interactive synthetic music composition system, RoBoser 2.0, in which an embodied synthetic DJ, the DJborg, performs and interacts with human performers and the audience to create novel and complex musical compositions.
Building on the interactive synthetic composition system RoBoser 2.0, the DJborg will be able to create coherent musical structures without pre-established macro-level rules. The resulting composition is based exclusively on the interaction between human and machine, who share the experience of composing a piece of music together. There are no rules given in advance, only a time matrix and a set of pre-defined pitch patterns. Musical forms emerge from this interaction.
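As an illustration of the kind of time matrix and pitch-pattern mechanism described above, the following is a minimal sketch in which interaction events bias which pre-defined pattern fires at each time step, so that the musical form emerges from the interaction history rather than from macro-level rules. The names, data structures and weighting scheme here are our own assumptions for illustration, not the actual RoBoser 2.0 implementation.

```python
import random

# Rows: pre-defined pitch patterns (MIDI note numbers); columns of the
# emergent score: discrete time steps. These patterns are illustrative.
PITCH_PATTERNS = [
    [60, 62, 64],   # pattern 0: C-D-E
    [67, 65, 64],   # pattern 1: G-F-E
    [72, 71, 69],   # pattern 2: C'-B-A
]

def compose(interaction_events, steps=8, seed=0):
    """Pick one pattern per time step, biased by past interaction.

    interaction_events: a sequence of integers standing in for human
    actions (e.g. touches on the tangible table); each event reinforces
    one pattern, so repeated interaction shapes the emerging form.
    """
    rng = random.Random(seed)
    weights = [1.0] * len(PITCH_PATTERNS)
    for ev in interaction_events:
        weights[ev % len(PITCH_PATTERNS)] += 1.0
    score = []
    for _ in range(steps):
        idx = rng.choices(range(len(PITCH_PATTERNS)), weights=weights)[0]
        score.append(PITCH_PATTERNS[idx])
    return score

# Simulated session: the human reinforces patterns 1 and 2.
melody = compose(interaction_events=[1, 1, 2], steps=4)
print(melody)
```

With no pre-set sequence, two sessions with different interaction histories yield different scores, which is the sense in which form emerges from the human-machine exchange.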
The DJborg is implemented using the iCub humanoid robot (IT) and the Distributed Adaptive Control theory of mind and brain (SPECS).
Other video links for the Experimental Functional Android Assistant (efAA):