This year our Lead XR Prototyper Todd Little made it down to GDC 2018 - here are his thoughts on the experience.
I’m very happy that I went to GDC this year. I came away super inspired by the people I met and the talks I attended. These last few months I had felt stuck in a routine and not as creative as I could be. The number one thing I came away with is that I need to slow down and draw out my ideas instead of going straight to development. I also felt challenged to tackle the problems facing accessibility in VR.
What follows are descriptions of, and insights from, the talks I attended at GDC 2018, split between two tracks: Unity talks and VR talks.
Stella Cannefax (Unity, @computerpupper), Jono Forbes (Unity, @jonoforbes), Andrew Maneri (Unity, @yarharhar), Matt Schoen (Unity, @mtschoen)
Stella likened AR devices to babies: they have a very limited understanding of the world. Unity Labs is researching ways to improve this understanding. How do we teach a device what a ‘floor’ is? What a ‘chair’ is? What a ‘door’ is? We could define a door as something that sits on a vertical plane, has a door knob, and can be opened. But the technology currently only supports plane detection and image recognition, so how do we know whether a door can be opened?
This gets into affordances: our relationship with objects and our semantic understanding of the world. How do we teach devices about those kinds of relationships? One promising avenue is machine learning. Unity has been pushing to get these tools into developers’ hands, so that they can experiment and let AI learn from trial and error.
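To make the trial-and-error idea concrete, here is a rough agent skeleton using Unity’s ML-Agents package (written against the package’s current class layout, which differs from the 2018 preview; the door scenario, scene objects, and reward are entirely hypothetical, my own illustration rather than anything Unity Labs showed):

```csharp
using Unity.MLAgents;
using Unity.MLAgents.Actuators;
using Unity.MLAgents.Sensors;
using UnityEngine;

// Hypothetical agent that learns by trial and error whether pulling a
// handle opens a door. The objects and reward here are illustrative.
public class DoorAgent : Agent
{
    [SerializeField] Transform door;    // hypothetical door object
    [SerializeField] Transform handle;  // hypothetical handle object

    Quaternion closedRotation;

    public override void Initialize() => closedRotation = door.rotation;

    public override void CollectObservations(VectorSensor sensor)
    {
        // Observe where the handle is relative to the agent.
        sensor.AddObservation(transform.InverseTransformPoint(handle.position));
    }

    public override void OnActionReceived(ActionBuffers actions)
    {
        float pull = actions.ContinuousActions[0];
        // ...apply the pull to the handle's joint here...

        // Reward the agent only when the door has actually swung open.
        if (Quaternion.Angle(closedRotation, door.rotation) > 45f)
        {
            SetReward(1f);
            EndEpisode();
        }
    }
}
```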
They pointed out that computer vision researchers have had recent success in semantic understanding experiments, which you can read more about here: “Semantic Scene Completion from a Single Depth Image.”
Jimmy Alamparambil (Unity, @jimmy_jam_jam), Tim Mowrer (Unity)
Currently, it’s difficult to do cross-platform AR development because ARKit and ARCore are separate services. Unity is rectifying this by creating a layer of abstraction: we develop once, and the new architecture, Unity’s AR Subsystems, talks to the specific provider (e.g. ARKit, ARCore, Vuforia). They are also decoupling their release schedule from the providers’ release schedules by introducing a new concept in 2018: Packages. So if Apple releases a new version of ARKit, we don’t have to wait until Unity ships a new editor release - we can just update that package.
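As a hedged sketch of what provider-agnostic code looks like, here is plane detection written against AR Foundation, the package layered on top of these AR Subsystems (component and event names are from the current package; the logging is just for illustration):

```csharp
using UnityEngine;
using UnityEngine.XR.ARFoundation;

// Provider-agnostic plane detection: the same script runs on ARKit or
// ARCore, because ARPlaneManager talks to whichever subsystem is present.
public class PlaneLogger : MonoBehaviour
{
    [SerializeField] ARPlaneManager planeManager;

    void OnEnable()  => planeManager.planesChanged += OnPlanesChanged;
    void OnDisable() => planeManager.planesChanged -= OnPlanesChanged;

    void OnPlanesChanged(ARPlanesChangedEventArgs args)
    {
        foreach (ARPlane plane in args.added)
            Debug.Log($"New plane detected: {plane.trackableId}");
    }
}
```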
Timoni West (Unity, @timoni)
Timoni walked us through the history of computing and how physical hardware changes our ideas of what is possible in software, and vice versa (from an early duck-hunting light-gun game from 1936, to the mouse, to modern game controllers).
Unity’s focus this year is on improving graphics rendering and workflows. The Scriptable Render Pipeline (SRP) lets us create our own rendering pipeline, and Unity is shipping two pre-made pipelines: High Definition (HD) and Lightweight (LW). The Book of the Dead demo shows off what is possible with the HD pipeline, while the LW pipeline strips out rendering features that lower-end devices don’t need.
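To give a feel for what “create our own rendering pipeline” means, here is a minimal custom pipeline sketch (written against the modern form of the SRP API, which has shifted since the 2018 preview; the class names are my own):

```csharp
using UnityEngine;
using UnityEngine.Rendering;

// Minimal custom pipeline: clear each camera and draw the skybox.
public class MinimalPipeline : RenderPipeline
{
    protected override void Render(ScriptableRenderContext context, Camera[] cameras)
    {
        foreach (Camera camera in cameras)
        {
            context.SetupCameraProperties(camera);

            var cmd = new CommandBuffer { name = "Clear" };
            cmd.ClearRenderTarget(true, true, camera.backgroundColor);
            context.ExecuteCommandBuffer(cmd);
            cmd.Release();

            context.DrawSkybox(camera);
            context.Submit();
        }
    }
}

// The asset you assign in Graphics Settings; in a real project each
// class lives in its own file.
[CreateAssetMenu(menuName = "Rendering/Minimal Pipeline")]
public class MinimalPipelineAsset : RenderPipelineAsset
{
    protected override RenderPipeline CreatePipeline() => new MinimalPipeline();
}
```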
Tim Cooper (Unity, @stramit), Natalie Burke (Unity)
They demonstrated how to get started with the Scriptable Render Pipeline. When creating a project in 2018, you can select HD or LW as a template. Something to keep in mind is that materials will need to be adjusted when switching pipelines, since each pipeline uses its own shaders (for example, the Standard shader has separate HD and LW versions).
They also demonstrated how to use and blend between Post-Processing Volumes (the new Post-Processing V2 stack).
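Local volumes blend automatically based on trigger colliders and blend distance; as a rough illustration of driving a blend by hand, here is a hypothetical cross-fade between two global volumes using the stack’s weight property (the field names and duration are my own):

```csharp
using UnityEngine;
using UnityEngine.Rendering.PostProcessing; // Post-Processing V2 package

// Illustrative sketch: cross-fade between two global volumes over time.
public class VolumeBlender : MonoBehaviour
{
    [SerializeField] PostProcessVolume from;
    [SerializeField] PostProcessVolume to;
    [SerializeField] float duration = 2f;

    float t;

    void Update()
    {
        // Ramp t from 0 to 1, handing influence from one volume to the other.
        t = Mathf.Min(t + Time.deltaTime / duration, 1f);
        from.weight = 1f - t;
        to.weight = t;
    }
}
```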
They also talked about the new Shader Graph, a node-based shader creation tool built into the editor.
1 Game, 6 Headsets, 10 Controllers: Multiplatform VR with 'Floor Plan'
Nic Vasconcellos (Turbo Button, @njvas), Holden Link (Turbo Button, @holdenlink)
These guys initially created an elevator game for Gear VR and described the challenges of porting it to Oculus Rift. For example, now that the user can look outside the elevator, they had to add more 3D models to the scene. They also showed all the different interaction systems they tried once hands were possible. The takeaway was that it isn’t worth chasing the install base. It’s all about timing, and it’s better to pick your battles than to try to do it all.
Andrew Eiche (Owlchemy Labs, @buddingmonkey), Cy Wise (Owlchemy Labs, @cyceratops)
Andrew and Cy talked about how to make VR more accessible. They analyzed a wide selection of VR titles on the store against criteria mostly focused on mobility (e.g. How much bending is required? Can you play seated? Can you play with one hand?). They found that we as an industry do pretty well with seated experiences and limiting bending, but we fall short on one-handed play. They did a great job of pointing out experiences that did well in different ways.
They had two recommendations to explore:
*Notes courtesy of Alexandria Heston (@ali_heston, tweet)
Robin Hunicke (Funomena, @hunicke)
Robin took us through her process of creating Luna, from conception to the final product. Her big takeaway: if you have the ability to, slow down and explore the idea as much as you can before development. I was surprised that the game took five years to make. She came off Journey wanting to explore a more personal project about the sexual abuse she endured growing up. It took two years of working with an artist to explore themes and art styles, and she talked about how challenging it is to pitch this type of project. She showed lots of prototypes. It was not initially a VR project - VR hadn’t even crossed her mind. The VR piece came later, after they had already started exploring Intel’s hand recognition.
Joachim Holmer (Neat Corporation, @joachimholmer)
Joachim walked us through the inspiration behind their portal locomotion. They didn’t like the current state of teleport locomotion: you can travel as fast as you want and as far as you want. Adding a cooldown timer would have felt cheap, so they instead took inspiration from the portal guns of Portal and Unreal Tournament. Unlike Portal’s portals, theirs don’t allow you to throw objects through. He also talked through the development challenges of implementing this system.
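As a hedged sketch of just the relocation step (my own simplification, not Neat Corporation’s code), the core idea is to re-express the player rig’s pose relative to the exit portal at the moment the head crosses the entry plane:

```csharp
using UnityEngine;

// Hypothetical sketch: when the player's head crosses the entry portal's
// plane, re-express the rig's pose relative to the exit portal so the
// head doesn't appear to jump. Attach to the entry portal.
public class PortalTeleporter : MonoBehaviour
{
    [SerializeField] Transform exitPortal; // the paired portal
    [SerializeField] Transform playerRig;  // tracking-space root
    [SerializeField] Transform playerHead; // the HMD camera

    Vector3 lastHeadLocal;

    void LateUpdate()
    {
        Vector3 headLocal = transform.InverseTransformPoint(playerHead.position);

        // Did the head cross our local XY plane from front (+z) to back,
        // close enough to count as passing through the opening?
        if (lastHeadLocal.z > 0f && headLocal.z <= 0f && headLocal.magnitude < 1f)
        {
            // Map the rig's pose from the entry's frame into the exit's.
            // A real pairing usually also flips 180 degrees around up so
            // you come out facing away from the exit portal.
            Matrix4x4 relative = exitPortal.localToWorldMatrix * transform.worldToLocalMatrix;
            playerRig.position = relative.MultiplyPoint(playerRig.position);
            playerRig.rotation = relative.rotation * playerRig.rotation;
        }

        lastHeadLocal = headLocal;
    }
}
```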
Kevin Harper (ustwo, @angryarray)
Kevin works as a Unity developer for ustwo in NYC. His talk focused on how prototyping can uncover problems we didn’t even know existed. His team got into the habit of writing down ideas they wanted to prototype and then, on a Friday, running a “#1HourPrototype” (e.g. how does it feel to swim in VR?).
He ended by challenging the audience to tackle three problems he thinks are worth exploring: 1) weight, 2) health, and 3) inventory. He also showed a very clever prototype where you play as a gunslinger during a shoot-out, but your hat keeps falling down over your eyes, so you have to use your gun to lift up your hat.
Jeff Hesser (Harmonix), Kevin Cavanaugh (Harmonix)
They talked about how to create interfaces for gaze-based interactions, walking through the process of making a comfortable 3DoF experience using an angle-based approach. A designer would create a pizza-slice 3D model showing the comfortable viewing angle, and could then move the UI along the Z axis, knowing the angular proportions stay the same.
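As a rough sketch of that angle-based idea (the names and numbers are my own, not Harmonix’s tooling): a panel keeps a constant angular size if you scale it in proportion to its distance from the viewer:

```csharp
using UnityEngine;

// Sketch of the angle-based idea: the panel keeps a constant angular
// size because its scale grows in proportion to its distance.
public class FixedAngularSizeUI : MonoBehaviour
{
    [SerializeField] Transform head;               // gaze origin (camera)
    [SerializeField] float angularWidthDeg = 20f;  // the comfortable "slice"
    [SerializeField] float distance = 2f;          // tune along Z freely

    void LateUpdate()
    {
        transform.position = head.position + head.forward * distance;
        transform.rotation = Quaternion.LookRotation(transform.position - head.position);

        // A panel at distance d subtends angle theta when its width is
        // 2 * d * tan(theta / 2), so scale it to exactly that width.
        float width = 2f * distance * Mathf.Tan(0.5f * angularWidthDeg * Mathf.Deg2Rad);
        transform.localScale = Vector3.one * width;
    }
}
```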