Baby steps with Marco’s demo files for Unreal using the InteractML plugin for Machine Learning for the 4i project. More info about the 4i project here and info about InteractML here.
This guide is not about Machine Learning per se; it’s about getting going with the plugin, and also introducing people to Unreal – so for some of you these steps may be completely obvious.
Goals: get a test level set up in Unreal for VR using the InteractML plugin; have a pickupable object we can use for training; add hotkey controls to make it somewhat easier to use in VR.
01. Get Set up in VR
Marco set up a totally basic project using the plugin, with a basic scene ‘Minimal_Default’.
It would be convenient to add VR support, and Unreal has starter templates that do this automagically; however, Marco didn’t start from one – so we must add it in.
The first step is to look in the Content Browser for the Add New button.
Hit that and choose Add Feature or Content Pack from the list.
Now choose Virtual Reality from the selection and hit Add to Project.
Then in the Content Browser look under VirtualRealityBP/Maps for MotionControllerMap and open it. You can run it to test whether your VR setup is working: look for the Play button; the little arrow next to it opens a drop-down where you can swap the play mode to VR Preview. If VR Preview is greyed out, the headset is not plugged in or Oculus/Steam is not running. (Unreal wants the appropriate platform running first so it can detect the headset, so you may have to open Oculus or SteamVR and then open Unreal.)
If it’s all good then after testing, stop play, and in the World Outliner list look for VRPawn. Right-click it and choose Edit/Copy, then open the original map ‘Minimal_Default’, right-click in the viewport and choose Edit/Paste.
Now delete the Player Start, then rotate the VRPawn if necessary to set the start orientation. You can do this in the Details panel, either by entering numbers or by dragging over the number fields. (Tip: hitting the little yellow back arrow resets a value to its default – useful for setting objects to the world origin.)
Or move the VRPawn in the viewport by selecting and manipulating it – the SPACE bar toggles between translate, rotate and scale. The VRPawn should stay at Z=0, as it takes its height from the headset.
For an Oculus headset, the Pawn should be level with the floor – note that the floor mesh is thick, so floor level is not at Z=0.
For quick testing, you can set the Pawn to Auto Possess the Player under the Pawn details:
(SideQuest: Because the GameMode sets which Default Pawn the player gets, it’s actually possible to delete the VRPawn from the level and replace it with the more abstract PlayerStart – you might do this when converting an existing map to VR. Once again the PlayerStart will need rotating and dropping down so its centre is on floor level. This throws a ‘Bad Size’ warning which you can safely ignore. However, for this example it’s convenient to keep the VRPawn in the level, because we can quickly edit its blueprint. If you do go with PlayerStart it looks like this:)
(Tip: when you are running in VR Preview you can swap to the Unreal editor by Alt-Tabbing.)
However, the controllers, while showing you a hand model, won’t do anything.
(Note: if the hand models aren’t showing up, make sure the SteamVR plugin is enabled – the error is tied to the Steam chaperone boundaries that this VRPawn needs. I will have to write something about this later. – BRUNO) That’s because the input mapping isn’t set up: since we added the VR support afterwards, we did not start from a VR template. Anyway, it’s not too much bother in this version of Unreal, 4.22.
In the World Outliner, if you double-click Edit MotionControll… it will open the VRPawn blueprint. Look in the Event Graph tab – some of the nodes have yellow underneath, indicating an error.
So to fix this we have to set up the input mapping. This lives in the top menu under Edit/Project Settings; in Project Settings, look for Input.
The blueprint is looking for InputActions, which in the Input section are called Action Mappings, named: GrabLeft, GrabRight, TeleportLeft, TeleportRight.
To make a new association, click the + next to Action Mappings and type in the name, e.g. GrabLeft, then select the drop-down and start typing to search, which brings up the respective controller button list.
This system allows you to set up something you want to happen, e.g. fire the gun, and then map a whole bunch of different buttons from various controllers to actually activate that firing action. (This will change/simplify in a later Unreal release – 4.25, I believe – to a more unified system with OpenXR.) Anyway, you need:
GrabLeft = MotionController (L) Grip1
GrabRight = MotionController (R) Grip1
TeleportLeft = MotionController (L) Thumbstick
TeleportRight = MotionController (R) Thumbstick
Axis mappings are handled the same way. You will need
MotionControllerThumbLeft_X = MotionController (L) Thumbstick X
MotionControllerThumbLeft_Y = MotionController (L) Thumbstick Y (set scale to -1.0)
and make similar mappings for ThumbRight X and Y.
Here’s my mapping:
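Under the hood, the editor saves these settings to Config/DefaultInput.ini in your project folder. A rough sketch of what the mappings above look like there (key names are from UE 4.22; the exact flags and ordering in your file may differ):

```ini
; Sketch of Config/DefaultInput.ini after adding the mappings above
; (illustrative only - your generated file will contain extra flags)
[/Script/Engine.InputSettings]
+ActionMappings=(ActionName="GrabLeft",Key=MotionController_Left_Grip1)
+ActionMappings=(ActionName="GrabRight",Key=MotionController_Right_Grip1)
+ActionMappings=(ActionName="TeleportLeft",Key=MotionController_Left_Thumbstick)
+ActionMappings=(ActionName="TeleportRight",Key=MotionController_Right_Thumbstick)
+AxisMappings=(AxisName="MotionControllerThumbLeft_X",Scale=1.000000,Key=MotionController_Left_Thumbstick_X)
+AxisMappings=(AxisName="MotionControllerThumbLeft_Y",Scale=-1.000000,Key=MotionController_Left_Thumbstick_Y)
+AxisMappings=(AxisName="MotionControllerThumbRight_X",Scale=1.000000,Key=MotionController_Right_Thumbstick_X)
+AxisMappings=(AxisName="MotionControllerThumbRight_Y",Scale=-1.000000,Key=MotionController_Right_Thumbstick_Y)
```

If the drop-down search in the editor gets fiddly, you can also edit this file directly with the editor closed and the settings will be picked up on restart.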
(The naming, e.g. GrabLeft, is only important in that it lines up with the blueprint setup; if we had set things up the other way around – inputs first, then the blueprint – we could use any old names.)
Back in the VRPawn blueprint, if it’s all good, hitting the Compile button makes the yellow errors go away.
Now the controllers should be working – so what we need is something to interact with. Look in VirtualRealityBP/Blueprints for BP_PickupCube. Drag a few into the level, and when it’s playing you will be able to pick them up with the controller grip button. They are physics objects and will drop to the ground, so it might be handy to add a shelf to put them on so they don’t end up on the floor.
Look in the Modes panel on the left and, under Geometry, drag in a Box.
(This kind of object is a BSP, not an imported 3d mesh. )
Use the translate, scale and rotate controls to move and shape the box. There are controls for manipulating the vertices etc., but that’s for another time. At the end, you need to Build the thing to set its shape: in the toolbar, under the tiny drop-down arrow next to Build, select Build Geometry. This menu will also build the lighting and the Reflection Captures it might be showing a warning about.
Nowadays Unreal does a great job of realtime lighting, so we should set the lighting to be dynamic – this means it won’t ask you to rebuild every time you edit anything. To do this, look in the World Outliner, select the Light Source and SkyLight, and in the Details panel set their Mobility to Movable instead of Static. That’s that.
Now, in the default VRTemplate map you can teleport around – we can’t do it here yet because we need to set up a navigation mesh. Go back to the Modes tab, type Volumes into the search bar, select Nav Mesh Bounds Volume and drag it into the level.
You can’t see the navmesh yet, so in the left-hand viewport controls select Show and tick Navigation (or press P).
This will make a green shape on the ground where the volume overlaps. Use the scale controls to stretch it out over the floor and maybe make the floor a bit bigger too. You have to build it by choosing Build Paths from the Build dropdown.
So now we have the map set up for VR, let’s get to the machine learning.
Oh yeah, just before that, give your editor that pro game developer touch by editing the highlight colours away from the bilious Unreal yellow: this is in Edit/Editor Preferences, under Selection colour and Pressed Selection colour.
Now Select the BP_PickupCube
Under Details, click Add Component and add MLTest.
That’s it – you’re done!
02. Get Set up with Keyboard controls
If we open MLTest in the Blueprint editor, we can rearrange nodes to neaten things up, and select groups of nodes and press C to make a comment box around each section, like so:
Back in the Details panel we can see the exposed variables:
These can also be seen in the blueprint. I think it’s a good idea to be able to control collecting data and running the model from keyboard commands, so we will change the exposed variables to do that.
(This is because I can’t access the Unreal editor while VR is running, as I’m using a Windows Mixed Reality headset.)
The plan, to start with, is to make my own ‘control’ variables in the VRPawn and then send them to MLTest.
So to set this up we need to:
Set up a Game Mode that sets the player pawn to our MotionControllerPawn (it’s called VRPawn in the level), and reference a Game Instance.
The game instance is a handy way to hold ‘global’ values that can pass from the VRPawn to the MLTest blueprint.
Open Edit/Project Settings and under Maps & Modes click the + next to Default GameMode to make a new GameMode, and choose where to save it. Underneath that, set the Default Pawn Class to MotionControllerPawn. Under Game Instance, click the + to make a Game Instance and save it too; give it a useful name – mine is IMLgameInstanceBruno. Then go File/Save All Levels. Here’s mine:
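These Maps & Modes settings are saved into Config/DefaultEngine.ini. As a sketch (the asset names and paths below are mine as examples – yours depend on what you called the assets and where you saved them):

```ini
; Sketch of Config/DefaultEngine.ini after the steps above
; (paths are illustrative - substitute your own GameMode/GameInstance assets)
[/Script/EngineSettings.GameMapsSettings]
GlobalDefaultGameMode=/Game/IMLGameMode.IMLGameMode_C
GameInstanceClass=/Game/IMLgameInstanceBruno.IMLgameInstanceBruno_C
```

The `_C` suffix is how Unreal refers to the generated class of a Blueprint asset, as opposed to the asset itself.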
Now close this, and back in the Content Browser open <YourGameInstance>. Add variables here – I’m adding some booleans and floats by clicking the + next to Variables on the left and editing the Details on the right; you change the variable type by choosing from the drop-down. Then Compile & Save.
You can get and set these variables from any other blueprint by casting, like this method here which I’m using in MotionControllerPawn:
To make the above, right-click in the Blueprint (BP) event graph to bring up this menu, then type to search. If what you want doesn’t show up, untick Context Sensitive.
<YourGameInstanceName> should show up here, along with the variables you added to it.
Here’s what my MotionControllerPawn looks like (well, my additions…):
First I make sure I get the player input:
Right Arrow to Run the ML
So this setup sends data to the GameInstance.
And in the MLTest BP we read the data back, like so:
The setters outside the comment box are setting the actual values in MLTest.
Here’s the whole MLTest. Anyhoo, this works for testing.
So now, if you have all that working, it might be helpful to allow easier control customisation, so we can set other buttons – or the VR controllers – to collect data and run the machine learning. This is easy: open Project Settings, look under Input and add some Action Mappings. I’m making StartStop DATA collect, Run IML and Stop IML, like this, mapping them to Space Bar, Right and Left:
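As with the grab/teleport actions earlier, these new mappings land in Config/DefaultInput.ini; roughly (action names follow mine – use whatever you called yours):

```ini
; Sketch of the extra Action Mappings in Config/DefaultInput.ini
; (illustrative - names must match what you typed in Project Settings)
[/Script/Engine.InputSettings]
+ActionMappings=(ActionName="StartStop DATA collect",Key=SpaceBar)
+ActionMappings=(ActionName="Run IML",Key=Right)
+ActionMappings=(ActionName="Stop IML",Key=Left)
```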
Over in the MotionControllerPawn I can now insert my newly created InputAction nodes and rewire like so (and I can delete the Space Bar, Right and Left key nodes – no longer needed):
The advantage of abstracting to InputActions is that I can map any input button to control the action from Project Settings, without touching the code. For example, if I want to Run IML using my Oculus Touch controller’s right trigger, I just add it as a MotionController input like so (and obviously I can have lots of different buttons all triggering the same action):