4i: Immersive Interaction Design for Indie Developers with Interactive Machine Learning

This project aims to support independent developers and artists in designing movement- and body-based interaction for Virtual Reality and immersive media. It builds tools that let creators design by moving, using Interactive Machine Learning. Better tools and working processes will enable developers to create richer movement interaction for players, audiences and end-users.

The project has two major, intertwined elements: developing interaction design tools based on interactive machine learning, and testing these tools through creative, artistic work. The creative work will inform the design of the tools, and the tools will enable the creative work. The tool will be a plugin for a development platform such as Unity or Unreal Engine, supporting three movement-sensing technologies: the controllers supplied with standard VR systems; optical motion capture for more accurate, full-body interaction; and more experimental sensors built around physical computing. A sketch of the interactive machine learning workflow underpinning the tools is given below.
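To illustrate the design-by-moving workflow, here is a minimal sketch of the interactive machine learning loop such tools are built around: a designer demonstrates a few labelled examples of a movement, a lightweight model is trained on those demonstrations, and the model then classifies live input in real time. This is a standalone Python sketch, not the project's plugin code; the feature layout, gesture names and example values are illustrative assumptions.

```python
import math
from collections import Counter

# Each training example is a feature vector sampled from a movement
# demonstration (e.g. controller positions flattened to floats) plus a
# label chosen by the designer. The layout here is an assumption.

def distance(a, b):
    """Euclidean distance between two feature vectors."""
    return math.sqrt(sum((x - y) ** 2 for x, y in zip(a, b)))

class GestureRecogniser:
    """Tiny k-nearest-neighbour classifier: the designer adds labelled
    demonstrations, then live frames are classified against them."""

    def __init__(self, k=3):
        self.k = k
        self.examples = []  # list of (feature_vector, label)

    def add_example(self, features, label):
        # "Record" phase: the designer moves, and each sampled frame is
        # stored with the gesture label currently selected in the tool.
        self.examples.append((list(features), label))

    def classify(self, features):
        # "Run" phase: find the k closest demonstrations and vote.
        if not self.examples:
            return None
        nearest = sorted(self.examples,
                         key=lambda ex: distance(ex[0], features))[: self.k]
        votes = Counter(label for _, label in nearest)
        return votes.most_common(1)[0][0]

if __name__ == "__main__":
    recogniser = GestureRecogniser(k=1)
    # Hypothetical demonstrations: left/right controller heights for two poses.
    recogniser.add_example([0.9, 0.9], "arms_raised")
    recogniser.add_example([0.2, 0.2], "arms_lowered")
    # A live frame from the controllers (assumed data).
    print(recogniser.classify([0.85, 0.95]))  # -> "arms_raised"
```

In the actual tools this record-train-run cycle would run inside the engine plugin against live sensor data, and a more capable model would typically replace the toy classifier, but the loop itself is the core of designing by moving rather than by writing code.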

Over the full two years of the project, Gibson/Martelli will develop a movement-based VR experience for public exhibition using the final prototype tools; this work will combine arts practice-as-research and qualitative research. Practice-as-research will be conducted by Gibson and will centre on studio-based experimentation with HCI systems. This work will develop gestures, forms of movement interaction, avatars and custom-built virtual environments for art galleries and performative contexts. During the project, a series of hackathons will expose the tools to indie game developers and artists.

We are pleased to be working with a team led by Marco Gillies, together with Ruth Gibson, Phoenix Parry and Rebecca Fiebrink.

The project is commissioned by the EPSRC and supported by the Centre for Dance Research (C-DaRE) at Coventry University and the Department of Computing at Goldsmiths University.