This is a short (as yet incomplete) report about the first of the Reality Remix Salons, themed around Memory, that took place recently in the Vision Building in Dundee, home of the Biome Collective. The Reality Remix project is an AHRC/EPSRC funded research project in which a group of interdisciplinary researchers, artists and academics come together to develop immersive projects. At this stage we are all developing proofs of concept. There will be three physical Salons with a final sharing event, open to the public, at Ravensbourne and Siobhan Davies Studios in the summer. We are going to be blogging about the project here and also putting up some of the useful techniques we discover along the way. The project is led by our own Ruth Gibson with her Coventry University hat on, together with Joseph Delappe and Darshanna Jayemanne from Abertay University in Dundee.
What I’m working on with Ruth concerns performance in VR: attempting to beam live performance into cyberspace using motion capture, to see if we can make a kind of live virtual performance. This builds on the Augmented Reality experiments we have been developing with our MAN A project. The difference here is that we want to involve a live performer, rather than the pre-recorded animations we have been using up until now.
This doesn’t look like much, but finally we were able to beam the mocap into the virtual environment – it’s coming from the Perception Neuron suit into Unreal via a plugin. For some irritating reason I can’t get the suit to work wirelessly – I think it’s due to the network speed in this room in Vision being unutterably dire. Instead I’ve got it attached via the USB cable. There’s a whole ton of jiggery-pokery that needs to take place regarding what’s known as retargeting – making sure the bone names on the character’s skeleton match the bone names in the incoming mocap data. Luckily Unreal allows retargeting without too much pain. In the demo file here the skeleton is already set up. I was able to get it so that I could see, in VR, my movement driving the robot character in real time. (And of course I didn’t get a picture.) The good news is that the latency is super low, at least over USB. In future I’m going to use my own characters and experiment with performers. And then, as a side effect, experiment with the suit driving a body that the user inhabits (so you can look down at your own virtual body).
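At its core, the retargeting step is just a mapping from the bone names the suit streams out to the bone names on the character rig – in Unreal you normally set this up through the editor (the Skeleton asset’s retargeting tools) rather than in code. Here is a minimal C++ sketch of the idea; the bone names on both sides are illustrative placeholders, not the actual names your Perception Neuron export or your character rig will use:

```cpp
#include <cassert>
#include <map>
#include <string>

// Hypothetical name map: mocap-style bone names (left) to the names a
// game character rig might use (right). Real names depend entirely on
// your suit's export settings and your character's skeleton.
std::map<std::string, std::string> MakeRetargetMap() {
    return {
        {"Hips",         "pelvis"},
        {"Spine",        "spine_01"},
        {"Head",         "head"},
        {"LeftUpLeg",    "thigh_l"},
        {"LeftLeg",      "calf_l"},
        {"LeftFoot",     "foot_l"},
        {"RightUpLeg",   "thigh_r"},
        {"RightLeg",     "calf_r"},
        {"RightFoot",    "foot_r"},
        {"LeftArm",      "upperarm_l"},
        {"LeftForeArm",  "lowerarm_l"},
        {"LeftHand",     "hand_l"},
        {"RightArm",     "upperarm_r"},
        {"RightForeArm", "lowerarm_r"},
        {"RightHand",    "hand_r"},
    };
}

// Look up the rig bone that should receive data for an incoming mocap
// bone; returns an empty string if the bone has no mapping (in which
// case that channel of the mocap stream is simply ignored).
std::string RetargetBone(const std::map<std::string, std::string>& boneMap,
                         const std::string& sourceBone) {
    auto it = boneMap.find(sourceBone);
    return it == boneMap.end() ? std::string{} : it->second;
}
```

Every frame of mocap data is then a list of (bone name, transform) pairs: look up each name in the map and apply the transform to the corresponding rig bone, skipping anything unmapped. Unreal’s editor tooling effectively builds and applies this table for you.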