Reality Remix Salon MEMORY

This is a short (as yet incomplete) report about the first of the Reality Remix Salons, themed around Memory, which took place recently in the Vision Building in Dundee, home of the Biome Collective. The Reality Remix project is an AHRC/EPSRC-funded research project in which a group of interdisciplinary researchers, artists and academics comes together to develop immersive projects. At this stage we are all developing proofs of concept. There will be three physical Salons, followed by a final sharing event open to the public at Ravensbourne and Siobhan Davies Studios in the summer. We are going to be blogging about the project here and also putting up some of the useful techniques we discover along the way. The project is led by our own Ruth Gibson, with her Coventry University hat on, together with Joseph DeLappe and Darshana Jayemanne from Abertay University in Dundee.

Testing out the shared virtual environment of Facebook ‘Spaces’ for VR. It scrapes your feed and builds a tweakable avatar based on photos of you.
Mixed reality testing – this is a screenshot from my phone, which is running a custom app developed in the Unity game engine. You can see the avatars overlaid onto the view, and the effect is really convincing. The phone is tracking the floor, so you can move all around and through the avatars, and they consistently remain rooted in the world.
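For anyone wondering why the avatars stay put, the trick is just a fixed world-space anchor: each avatar's pose is stored once against the tracked floor, and every frame the phone's tracking reports where the camera is, so the avatar's position relative to the camera can be recomputed from that unchanged anchor. Here's a minimal sketch of the arithmetic in plain numpy – the poses and numbers are made up for illustration, and the real work happens inside Unity's AR tracking rather than in code like this:

```python
import numpy as np

def pose(translation, yaw_deg=0.0):
    """Build a 4x4 world-space pose from a translation and a yaw rotation."""
    t = np.eye(4)
    yaw = np.radians(yaw_deg)
    t[:3, :3] = [[np.cos(yaw), 0, np.sin(yaw)],
                 [0, 1, 0],
                 [-np.sin(yaw), 0, np.cos(yaw)]]
    t[:3, 3] = translation
    return t

# The avatar is anchored once, in world space, on the tracked floor plane.
T_world_avatar = pose([1.0, 0.0, 2.0])

# Each frame the tracking reports where the phone camera is in the world.
for T_world_camera in [pose([0.0, 1.6, 0.0]),
                       pose([0.5, 1.6, 0.3], yaw_deg=20)]:
    # What the renderer needs: the avatar expressed in camera space.
    T_camera_avatar = np.linalg.inv(T_world_camera) @ T_world_avatar
    print(np.round(T_camera_avatar[:3, 3], 2))
```

Because the anchor never moves, only the camera does, the avatar looks nailed to the floor no matter how you walk around it.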

What I’m working on with Ruth concerns performance in VR: using motion capture to beam a live performer into cyberspace and see whether we can make a kind of live virtual performance. This builds on the Augmented Reality experiments we have been developing for our MAN A project. The difference here is that we want to involve a live performer, rather than the prerecorded animations we have been using up until now.

The collaborators and leads were all equipped with matching Oculus Rift VR headsets and suitable PCs. Like Xmas morning on steroids…
Collaborator Dustin Freeman getting into the Google Daydream headset
Lead Darshana Jayemanne checking out the Rift
Principal investigator Ruth Gibson wearing augmented clothing made by collaborator Alexa Pollmann
Testing out real-time mocap into the Unreal Engine, using the (unworn) Perception Neuron mocap suit being waved about by project partner Alex Woolner

This doesn’t look like much, but finally we were able to beam the mocap into the virtual environment – it’s coming from the Perception Neuron suit into Unreal via a plugin. For some irritating reason I can’t get the suit to work wirelessly – I think it’s due to the internet speed in this room in Vision being unutterably dire – so instead I’ve got it attached via the USB cable. There’s a whole ton of jiggery-pokery that needs to take place around what’s known as retargeting, so that the bone names on the character’s skeleton match the bone names in the incoming mocap data. Luckily Unreal allows retargeting without too much pain, and in the demo file here the skeleton is already set up. I was able to get it so that, in VR, I could see my movement driving the robot character in real time. (And of course I didn’t get a picture.) The good news is that the latency is super low, at least over USB. In the future I’m going to use my own characters and experiment with performers, and then, as a side effect, experiment with the suit driving a body that the user inhabits (so you can look down at your own virtual body).
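To give a flavour of what the retargeting jiggery-pokery actually is: every bone name arriving from the suit has to be matched to the corresponding bone on the character’s skeleton, and anything the target rig doesn’t have gets dropped. Unreal’s retarget tools handle this in the editor, so the little Python sketch below is purely illustrative – the incoming names are the BVH-style ones Perception Neuron streams, but the target-side names and rotation values are made up:

```python
# Illustrative retargeting sketch: rename incoming mocap bones onto the
# target character's skeleton. Unreal does the real work in the editor;
# the bone names and rotations below are examples, not the actual rigs.

# Perception Neuron streams BVH-style bone names like these:
incoming_frame = {
    "Hips":         (0.0, 0.0, 5.0),   # per-bone rotation (degrees), made up
    "Spine":        (2.0, 0.0, 0.0),
    "RightArm":     (0.0, 45.0, 0.0),
    "RightForeArm": (0.0, 30.0, 0.0),
}

# The robot character's skeleton uses its own naming convention (hypothetical).
bone_map = {
    "Hips":         "pelvis",
    "Spine":        "spine_01",
    "RightArm":     "upperarm_r",
    "RightForeArm": "lowerarm_r",
}

def retarget(frame, bone_map):
    """Rename each mocap bone to its target-skeleton equivalent,
    dropping any bone the target rig doesn't have."""
    return {bone_map[name]: rot for name, rot in frame.items() if name in bone_map}

print(retarget(incoming_frame, bone_map))
# {'pelvis': (0.0, 0.0, 5.0), 'spine_01': (2.0, 0.0, 0.0), ...}
```

In practice there is more to it than renaming – differences in bone orientation and proportions between the two rigs also have to be compensated for – which is exactly the pain Unreal’s built-in retargeting saves you.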

Robotics at the Vision Building
Darshana looking suitably Augmented
Dustin Freeman demonstrating his virtual village murder mystery machinations whilst Scott Kildall looks on

This is the ginormous room we had at the Vision Building – 7 Oculus VR setups, all facing out in different directions from the central table square.