Making of Future Boy HoloLens Game



Download

https://www.microsoft.com/en-us/store/p/future-boy/9NBLGGH52SH3

Concept

The original Future Boy artwork illustrated a boy in the attic wearing HoloLens while hanging out with his holographic robot friend and going through boxes of his parents' old stuff. He pulled an unknown dusty object out of the Kitchen box – an iPad. It had been misplaced in the Kitchen box because of the apple on the back. The artwork is a statement – Apple heading into obsolescence, with the Windows logo shining in the center.

Future HoloLens Boy

The artwork started as a collage but ended up as a 3D scene. I created it in 3ds Max and rendered it with V-Ray. That made it easy to export to Unity and make something out of it. The first AR version was made for Android for an art exhibition in July. That version simply let you explore the scene through a phone's AR camera.



3D Software

I used Unity's light baking on only one project; the result was poor and baking was very slow, so I decided to do it in 3ds Max instead. 3ds Max has Render to Texture, which is great for baking shadows and global illumination into a single texture. I had to unwrap the UVs first. Unwrapping is a tedious process, and it is not any faster than my first unwrap in 2004. I used V-Ray as the rendering engine. All shadows were baked, the FBX was exported with media checked, and in Unity I set Unlit materials on all objects and turned off all shadow casting and receiving.

The boy is a free child model from Daz3D, which I posed and then exported to ZBrush. I converted it to DynaMesh, which creates a watertight model without self-intersecting polygons, and then decimated it. Then I moved it back to 3ds Max for a clean Unwrap UVW using Pelt mapping, then back to ZBrush, with ZAppLink a few times back and forth to Photoshop to paste in a few photos and warp them nicely with Liquify. This may sound complicated, but it is actually the fastest process. I practiced it for 10+ years, culminating in this glorious moment when I can save a few minutes today.

Morphing between the flat image and the 3D scene was done across several programs. I dug up a program I developed around 2008 for drawing reliefs over 3D scans; the only thing I needed from it was automatic Delaunay triangulation of hand-drawn vertices. I drew as few vertices as I could, and the program did the triangulation. Then I imported the mesh into 3ds Max, slapped on the texture, and manually moved the vertices to their positions in 3D space using 3D snap. I did it about seven times before I was satisfied; it was easy to get lost even with this low number of faces. The screen recording is from the first attempt, which was not very successful, but you will get the idea.
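For readers curious what the triangulation step looks like, here is a minimal sketch in Python (assuming SciPy is available; the sample points are hypothetical stand-ins for vertices drawn over the artwork, not data from my 2008 tool):

import numpy as np
from scipy.spatial import Delaunay

points = np.array([
    [0.0, 0.0], [1.0, 0.0], [1.0, 1.0],  # corners of the image
    [0.0, 1.0], [0.5, 0.5],              # plus one interior vertex
])
tri = Delaunay(points)

# Each row of tri.simplices is one triangle, given as three indices
# into the points array.
print(len(tri.simplices))  # 4 triangles for this layout

Each triangle then becomes one face of the mesh that later gets warped into 3D.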




I hoped to do the morph animation with the Morph modifier in 3ds Max, but Unity does not like Max's Skin modifier, which is what Morph becomes in the exported FBX. So I exported both the original triangulated mesh and the distorted one, did the morph in Blender instead, and both Unity and I were happy.
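A morph between two meshes with identical topology boils down to a per-vertex linear interpolation. A minimal sketch in NumPy, outside any 3D package (the two tiny vertex arrays are hypothetical stand-ins for the flat and distorted meshes):

import numpy as np

def morph(flat, distorted, t):
    """Blend vertex positions: t=0 gives the flat mesh, t=1 the
    distorted one. Both (N, 3) arrays must share vertex order."""
    return (1.0 - t) * flat + t * distorted

flat = np.array([[0.0, 0.0, 0.0], [1.0, 0.0, 0.0]])
distorted = np.array([[0.0, 0.0, 1.0], [1.0, 1.0, 1.0]])
print(morph(flat, distorted, 0.5))  # halfway between the two shapes

This is essentially what a blend shape does each frame, which is why exporting both meshes and blending in Blender plays nicely with Unity.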

You may have noticed that the robot in the print and the one in the game are different. I did not think that far ahead when I made the first image. After searching for a robot 3D model under some kind of Creative Commons license for longer than it would have taken to model one from scratch, I decided to call the wasted time research, and then modeled the robot myself using all the ideas from that research. Then I went to mixamo.com, uploaded my model, and Mixamo did the rigging. The only thing worse in 3D than unwrapping a UV map is rigging, and these guys made a great automatic tool for it. I grabbed a few motion captures on the same site and downloaded my robot with a Unity-ready rig and the chosen animations, for free. How convenient is that?!


Vuforia

Vuforia for HoloLens is still only available as an “Early Release”. I am not sure where that puts it relative to Alpha or Beta in the release cycle (the link path says beta), but I will guess it is not ready for prime time, so be aware that some things may not work 100% as you expect.

Legal considerations for using Vuforia for Unity and HoloLens

Read the license agreement (remember Human Centipad!), but in short: you cannot publish an app that uses the Vuforia library to the Store while it is in “Early Release”. Once it is officially out of early release, read the agreement again; but if the model is the same as for mobile apps, you will be able to use it for free if you are an indie developer, have no more than 1,000 total downloads, and your app does not help sell anything. I am not a lawyer, and this comes from a verbal explanation in an unrecorded phone call with a Vuforia salesperson.

Design considerations

One thing we have all experienced, but sometimes forget while immersed in HoloLens, is that holograms are not 100% opaque. That means that if you are not just placing objects ON the printed target, but instead cutting a hole in the wall behind it as in the Future Boy game, the printed target will remain slightly visible. Make sure your users' HoloLens is at high brightness and that the environment lighting can be controlled; otherwise the print will show through the holograms that are supposed to be behind it, and that will ruin the experience. On a phone this does not matter, as rendered 3D objects there are 100% opaque.

Choose your Image Target

Vuforia has target design guidelines for best results, and you should read them. What they do not mention is the tension in this use case: the image target must have good enough contrast and a high number of features for tracking, but the more contrast it has, the more it will show through the attached holograms.
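To make that trade-off concrete, here is a rough sketch in Python (NumPy only; the gradient-based score is my own crude proxy for illustration, not Vuforia's actual feature metric): the more high-contrast detail an image has, the better it tracks — and the more it shines through the holograms.

import numpy as np

def feature_proxy(gray, threshold=0.2):
    """Fraction of pixels with strong local gradients — a crude
    stand-in for 'number of trackable features' in a target image."""
    gy, gx = np.gradient(gray.astype(float))
    mag = np.hypot(gx, gy)
    if mag.max() == 0:
        return 0.0
    return float((mag > threshold * mag.max()).mean())

flat = np.ones((64, 64))                              # featureless target
detailed = np.random.default_rng(0).random((64, 64))  # busy, high-contrast target

print(feature_proxy(flat))      # 0.0 — nothing to track
print(feature_proxy(detailed))  # much higher — tracks well, but shows through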

How to use Vuforia for HoloLens

If you go to https://developer.vuforia.com/downloads/beta you can download the Unity package, AND you should download the documentation, which covers the following steps in more detail (I have a feeling this post will get long anyway):

1. Import the package in Unity, find the AR Camera in your newly imported folder, and drag it into the Hierarchy. Set up the camera according to the documentation.
2. Go to developer.vuforia.com, then Develop > License Manager > Add License Key > Development.
3. Choose Digital Eyewear and copy the license key into the appropriate field of the AR Camera.
4. Go to Develop > Target Manager > Add Database > Device > (name it) > Upload target.
5. Measure your printed target to millimeter precision and enter that size in meters.
6. Click Download Database, choose Unity Editor, then download and import it into Unity.
7. In Unity, find the ImageTarget prefab, drag it into the Hierarchy, and choose your image.

Perfect match

Make sure the printed target's size perfectly matches the one you downloaded from the Target Manager and set in the scene; otherwise the AR objects' positions will be offset away from or toward the camera. That was my first mistake. I had the print laminated onto foamcore board and had to trim a few millimeters on each side to get rid of white edges and a few bad cuts, so the Unity Image Target matched the print as it was BEFORE the trimming, months earlier.
Size is not important if you use Vuforia for AR on phones, but for a stereo camera it is.
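The depth error from a size mismatch is easy to reason about with a pinhole-camera sketch (plain Python; the numbers are made up for illustration): the tracker observes an angular size and solves for distance using the physical size it was configured with, so a trimmed print reads as farther away than it really is.

def estimated_distance(configured_m, printed_m, true_distance_m):
    """If the print is smaller than the configured target size, the
    same angular size implies a proportionally larger distance."""
    return configured_m / printed_m * true_distance_m

# Target configured as 0.50 m wide, print trimmed down to 0.49 m,
# viewed from 2 m away:
print(estimated_distance(0.50, 0.49, 2.0))  # ~2.04 m — a 4 cm depth offset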

Parenting

The objects you are augmenting the user's reality with must be children of the Image Target. Here is a gotcha: I expected the Image Target's (or its children's) active state to be toggled depending on whether the Image Target has been found by the camera. Instead, ALL components on all children are enabled when the Image Target is recognized — even those I had disabled in the editor. That also means that if you detach the Image Target's children at runtime and give them up for adoption to the root or another GameObject, you will need to iterate through all children and re-enable their components if you want to see them again. Vuforia's Image Target is a bad parent, rendering all her children invisible once they leave her house. But she just wants some recognition. That was a horrible joke; I am not telling that one to my students.

Recording

I am amazed that HoloLens can record these apps at all, since both Vuforia and Cortana (I think of her as doing the recording) use the camera stream at the same time. You already know that HoloLens drops the framerate to 30 fps when recording, and you may know the resolution. What I noticed is that the resulting videos are at 24 fps and much lower resolution than the default. I am not sure whether Vuforia is also forced to use a smaller resolution and framerate for tracking because of this, but the quality of the experience noticeably suffers while recording.

Be warned that as soon as Vuforia starts tracking, the recording will stop. You will have to call Cortana to start it again.
Something else happens while recording that I have not figured out: the recorded hologram's position and size are wrong, or slowly drift away. Even though the AR overlay is perfect from the user's viewpoint, the recording and the user's experience will look very different (you can see this in the video if you pay attention to the right and bottom edges). It may have to do with the HoloLens camera sitting about 5 cm in front of the user's eyes. If I get a reply on Vuforia's support forum, I will update this text. Several hours spent deliberately ruining the user's perfect AR overlay, trying to brute-force a perfect position in the recording instead, were without success. Compiling takes a very long time, so investing more was not justified for what would only be a temporary workaround for a bug.

Bug

I also experienced weird bugs. During my first game tests I noticed that nothing was in the Kitchen box in front of the boy. I worked on more important issues, and after a few hours I went to the kitchen to get a glass of water — and found the tiny utensils from that box hovering in the middle of the kitchen (my physical kitchen). Is that a bug or some cool feature?! I had imported those utensils as a separate FBX and made them a child of that box in the original scene, but it just would not stick. Later I merged the scenes in 3ds Max and reimported everything as one FBX, and that solved the problem (or broke the feature, not sure).

Warning

DO NOT move the printed target while Vuforia is tracking it. I did, and when I closed the app the whole world was rotated and translated. All the holograms around the apartment, pinned apps, everything was rotated by the amount I had moved the printed target while playing the game. To make things worse, I had a hologram of a cat on my sofa, and while air-tapping to see what had happened to the room tracking mesh, I somehow activated the cat hologram in this alternate dimension; the poor thing kept on meowing and I could not find it. I have seen in the movies that people in the US call firefighters in these situations, but after a few minutes I just asked Cortana to restart. I am thankful I did not click the monkey, as I did not have $3K to replace the HoloLens. Boss: “What happened to the HoloLens?” Me: “Well, I moved the print and the room started spinning, and then I accidentally tapped a monkey and it started slurping…” Then I would have needed to find some drugs, because passing a drug test would look worse than failing it.

Code

The code is based on the Holographic Academy tutorial Holograms 230. To make Vuforia work, in PlaySpaceManager I check this condition instead:

if (!imageFound || (limitScanningByTime && ((Time.time - SpatialMappingManager.Instance.StartTime) < scanTime)))

To check whether the image has been found, I have this on the ImageTarget:

using UnityEngine;
using Vuforia;

public class MyImageTargetBehaviour : MonoBehaviour, ITrackableEventHandler
{
    private TrackableBehaviour mTrackableBehaviour;
    public bool IsPaintingFound = false;

    void Start()
    {
        mTrackableBehaviour = GetComponent<TrackableBehaviour>();
        if (mTrackableBehaviour)
        {
            mTrackableBehaviour.RegisterTrackableEventHandler(this);
        }
    }

    // Called by Vuforia whenever the trackable's status changes
    public void OnTrackableStateChanged(
        TrackableBehaviour.Status previousStatus,
        TrackableBehaviour.Status newStatus)
    {
        if (newStatus == TrackableBehaviour.Status.DETECTED ||
            newStatus == TrackableBehaviour.Status.TRACKED)
        {
            IsPaintingFound = true;
        }
    }
}

And if the image (target) has been found, I stop the observer and detach MyScene from the ImageTarget so that I can turn off Vuforia tracking. The moment Vuforia finds the image, it sets the proper position and rotation; since we neither need nor want the target to be moved (remember the warning), we can detach the scene and it keeps its proper world coordinates.

This is part of the ChildServices() function:

myScene.transform.SetParent(null);
// Re-enable everything Vuforia disabled while the target was lost
Renderer[] rendererComponents = myScene.GetComponentsInChildren<Renderer>(true);
foreach (Renderer component in rendererComponents)
{
    component.enabled = true;
}
Collider[] colliderComponents = myScene.GetComponentsInChildren<Collider>(true);
foreach (Collider component in colliderComponents)
{
    component.enabled = true;
}

If the target is not found, I use Placeable.cs from the same tutorial, and in OnPlacementStop I simply call the ChildServices() above and then:

myScene.transform.position = targetPosition;
myScene.transform.rotation = transform.rotation;

Sound is my least favorite part of every project. I had an idea to improvise: make all the sounds with my mouth while playing the animations, then put an effect on top to make them sound like a robot servo or an explosion. If I were that guy from Police Academy it would have been a good idea; this way I just spent too much time on it and it was not worth it. So I settled for attaching a regular servo sound to a few joints of the robot.
When the direction of the rotation changes, I rewind the sound:

// Rewind the servo sound whenever the joint reverses direction,
// and pitch it by the rotation speed
direction1 = rotation > lastRotation;
if (direction0 != direction1)
    audio.time = 0;
audio.pitch = Mathf.Abs(rotation - lastRotation) + .5f;
direction0 = direction1;
lastRotation = rotation;

In the demo I have it only on the head and the upper arm. Since I had read a warning about spatial sound being expensive, I did not want to push it.

Publishing

I wrongly assumed that if the game works in Release mode, it will also run compiled as “Master”. I used DOTween for some animations, but in Master builds it crashes if I start an animation and then remove the object before the animation finishes. I had it on the laser object, where I animate the scale as the laser beam travels; when it hits something I call Destroy, and DOTween does not like that. If you use DOTween, make sure you call DOTween.Kill(gameObject); before Destroy.

I submitted the first release on Sunday evening, and it took until Thursday to go through certification and show the Store listing. Then I noticed the crash mentioned above, fixed it, and resubmitted on Thursday evening, and it took until Saturday to arrive in the Store. They need to speed this part up 10x.

Play

So it is in the Store without the Vuforia part (it does not recognize the print) because of their terms. But let's be honest: even if I could include Vuforia in the Store version, printing a 50x32 cm image would be too much effort just to test a silly little game.

I uploaded it to the store:

https://www.microsoft.com/en-us/store/p/future-boy/9NBLGGH52SH3
