June 23, 2020

Spatial Audio and other WWDC 2020 reveals

The last year has been one of anxiety at checking the news. What fresh hell awaits when Twitter is opened up? Today's kick from a pointy boot came with the announcement that the Alamo Ritz is closing down permanently as part of a Chapter 11 restructuring tied to this seemingly never-ending pandemic. The Ritz was the flagship of the Alamo Drafthouse and the site of countless good memories. While most other locations are staying open, the loss of the Ritz is a bit of an emotional gut punch. It represents a piece of the creative nexus of the city of Austin. The theater itself has been around since 1929 and has been a number of things over the years. Before the Alamo took it over in '07, it was a succession of movie houses (sometimes adult), a comedy space, an arcade, and a music venue.

Texas legends the Big Boys

Photos in the Ritz location show seminal punk bands Black Flag, the Big Boys, and the Misfits performing in the same spot where one would go to see film noir retrospectives like Noir City alongside the latest Marvel blockbusters. Walking out of an evening film into the busy, somewhat debauched atmosphere of 6th Street was always a stark contrast to the suburban theaters most of us attend these days. The decor was a perfect mix of old-world theater and modern flair. The Ritz was the go-to for Weird Wednesdays, Music Mondays, Master Pancake, and all manner of innovative programming. Many a premiere was held there, and occasionally you would get lucky and have Tarantino just show up to talk movies.

Friend of Fair Worlds and film editor Greg McClennan curated and hosted film screenings and marveled at the skill of the staff and projectionists. "Those projectionists would juggle 16mm, 35mm and 70mm without a blink while also being the most calm and professional humans alive. The staff, bless them, put up with a lot of stupid ideas and insanity from us all and I'll never be able to thank them enough for indulging and participating in so many spectacular nights. The Alamo is a magical place, but the Ritz had the ability to get everyone to begrudgingly slog downtown to seek shelter from the madness of 6th St. and, within its hallowed walls, force you to give over to the power of cinema. Maybe it was because we were all trapped together or maybe it was because the Ritz holds a special kind of theatrical punk rock magic in its bricks, but that place was special."



In these COVID times, I look forward to the end of my day, when I can switch my brain out of its loops of logic and creativity and go all out with my body. I slip my Oculus Quest onto my head and grab the controllers. I’m presented with a gorgeous vista over a valley in Iceland and feel a wave of relaxation. I look down at my hands and I’ve got Baseball Bat Hands. I start up the daily exercise and see a coach in front of me. She tells me a little about the inspiration for the routine, and we teleport to a frozen-over lake in Iceland with a gorgeous sunset splaying color across the ice. She says she is doing the exercise right along with me. The coach tells me that this one is going to focus a lot on the lower body: lots of squatting and lunging.

A portal opens up in front of me that feels like an optical illusion: it looks like a reflection of the world, cut out and flipped on its head. Streams of particles emanate from the portal, bridging the gap and making me think of the Rainbow Bridge of Asgard.

“Can’t Hold Us” by Macklemore and Ryan Lewis starts playing, and I have no time to react but to start hitting the orbs coming at me. The color of each orb designates which bat I should hit it with, and the direction of the hit is indicated by a glass ampoule attached to the orb. Since it’s leg day, I’m also seeing lots of neon triangles coming at me. It’s up to me to keep my head below the apex of each triangle. As the triangles form, the apex shifts between left, center, and right, forcing me to squat and lunge. The coach suggests letting my butt fall back to protect my knees.

The coach reminds me that the more accurate I am at hitting the orbs, the more orbs I will see. The song’s difficulty changes dynamically based on how well I’m doing. She also tells me to breathe, and I catch my breath between songs. Each song is a different landscape. She asks me to tighten my abs to get more power; power is tracked by the velocity of the bat when I hit the orbs.

By the end of the playlist, I’m sweating and ready to cool down. She has me shrug my shoulders and stretch out my arms. She says she’s proud of my work and that she will see me next time. After a few seconds, a screen pops up showing how accurate and powerful I was. Today I got a Supernatural score of “Double Platinum,” which indicates that I averaged between 90 and 95% on accuracy and power. If I had a heart-rate monitor, I could see that data as well. I also see my cumulative Supernatural score for the week compared to my friends on leaderboards.

----------------------------------------------------------------------

This description should give you a sense of what a workout feels like in Supernatural. I’ve done one of these 20-minute workouts every day for the last two weeks, and that’s been my routine. The short format and a new workout every day keep it fresh and keep me coming back.

The first inevitable question is why not just use Beat Saber, since the gameplay mechanics are so similar. I say do both. I’ve actually noticed that I’m now much better at Beat Saber: I feel much more confident with stronger shoulders, and I have new techniques to respond faster. In Supernatural, they advise you to control your breathing and tighten your abs to respond quickly.

The second inevitable question is whether it is worth $20 per month for this service. I don’t know if I can answer that. I would recommend trying it for one month and seeing if it works for you. I do recommend doing it more than once - I was initially skeptical for the first few days until I was able to pick up on the techniques and settle into a groove. So give it a try and let me know how it goes!

THE DISSENT from Erik Horn, co-founder of Fair Worlds

I was really excited when Supernatural was announced, as I have long been of the mindset that VR’s secret “killer app” is being an exercise device. The potential of VR as a workout first hit me when I was sprinting at full speed through a half-baked tech demo on the Virtuix Omni over five years ago. It honestly felt a bit more like ice skating mixed with snowboarding, but it was wonderful and exhausting all the same. I hope the device gets more adoption if and when location-based entertainment (LBE) VR makes a comeback post-Covid (more on that soon).

Fast-forward a few years: I bought wrist weights and became obsessed with “working out” in VR. I earned all of SuperHot’s Steam achievements (a first for me, as I’d never seen the appeal of achievements). My arms would burn after playing Rec Room Quests that featured the bow and arrow and its super-physical game mechanic. In a similar vein, I turned on the piñata feature in GornVR and most likely pulled a few muscles using the weights and the virtual mace to knock candy out of cartoon gladiators. (What a weird sentence.)

Then... Beat Saber launched in 2018, and we now have the gold standard for VR exercise in my mind: a VR game with almost infinite replayability AND an incredible aerobic workout hidden in its DNA.

And that is the word of the day for me: “Hidden.”

Before VR, I hated working out. All caps HATED it. Now I love it. Why? Because it doesn’t feel like working out. My routine now is that I get on an exercise bike, pop on my Quest (with a sanitary silicone insert that I can wash between workouts), change some Beat Saber settings (no obstacles) so I don’t topple off the bike blindfolded, and blissfully play for roughly 30-45 minutes, or until I notice through the sweat and high scores that my arms are about to fall off (exaggeration). I’ve never looked forward to working out before, but now I do. It’s an amazing way to escape quarantine life and “get my sweat on” (ugh).

So that brings me to Supernatural. I’m not going to say that I don’t like it or that it has issues, but simply: it’s not for me. I’m not motivated by the type of instructors they employ and the Peloton-esque style of daily classes. But you know what, thousands of other people are, and that’s awesome. I can’t argue with that, and I think this is the first VR game my wife will want to play repeatedly. Anything that furthers VR as a killer workout is great in my book.

My last thought on it is that it is absolutely beautiful. It's worth the price of admission (the first month is currently a free trial) just to hop in and marvel at how they achieved the 360 landscapes and how smoothly it all runs on the Quest. It’s a real achievement.

So, speaking of achievements, I need to go hop on the bike and into Beat Saber and try to get a few more.


The word LiDAR is popping up a lot these days, and this is just the beginning. Apple is giving it a big push, and we are really going to start seeing its benefits and uses in our daily lives sooner rather than later. I predict that in the next few years, LiDAR will be a household name.

I always wanted to have Terminator vision

Light Detection and Ranging (LiDAR) is really the stuff of '80s science fiction fantasies. Ten-year-old me would’ve wondered why we’re not all making a bigger deal out of having it on our phones. The phone sends out pulses of light into a room. That light bounces back, revealing the room’s current layout. It’s like sonar, except it uses light instead of sound waves. Thousands upon thousands of light pulses give information about the dimensions of the room, and the same information is captured for all of the objects in the room. The data essentially creates a 3D model of the space. It knows how big and far away the couch is, and how tall and wide the pile of laundry you have to fold on that couch is.
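If you want to see how little math is involved, here is a toy sketch of the time-of-flight principle in C# (illustrative only, not Apple's actual sensor code): distance is the speed of light multiplied by the pulse's round-trip time, divided by two.

```csharp
using System;

class LidarToy
{
    const double SpeedOfLight = 299_792_458.0; // meters per second

    // distance = (speed of light x round-trip time) / 2,
    // halved because the pulse travels out AND back.
    static double DistanceFromRoundTrip(double roundTripSeconds) =>
        SpeedOfLight * roundTripSeconds / 2.0;

    static void Main()
    {
        // A pulse that returns after ~33 nanoseconds hit something ~5 meters away.
        Console.WriteLine($"{DistanceFromRoundTrip(33e-9):F2} m"); // prints 4.95 m
    }
}
```

Do that millions of times per second across a grid of points and you have a depth map of the room.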

Obviously, this technology is good for more than just clothes-pile triangulation. LiDAR is essential in a diverse range of industries: autonomous vehicles, safety features on current cars, archaeology, geology, forestry, atmospheric science, mining, astronomy, and surveying all use LiDAR.

Oh, that's what that thing is

The newest development in LiDAR is that it now comes on the iPhone 12 Pro, making it the first smartphone with such capabilities. There aren’t many bells and whistles from apps implementing LiDAR on the phone just yet. What we get right now are significant improvements to anything AR-related, and the promise of big things to come.

The fact that Apple is sneaking this feature in now, on its niche high-end models, bodes well for LiDAR’s future. By introducing the LiDAR sensor and providing big ARKit updates, Apple is giving developers a chance to experiment. Apple has a history of introducing new features on its higher-end devices before trickling them down to cheaper models in later years. Inside word has it that the entire iPhone 13 line will have a LiDAR scanner, as opposed to just the Pro models of the 12. We can most likely expect more seamless interaction in AR games (Pokémon that jump from couch to chair to floor) and in shopping (sizing that 72 x 84 Pokémon painting in your bedroom). On the camera side, LiDAR will help with focus in low lighting.

Imagine a tiny cluster of pictures

LiDAR will most likely make its way to other products, and developers will find uses for it that most of us haven’t thought of yet. The hopeful guess is that LiDAR will be a big part of Apple’s much-rumored AR glasses. The possibilities are exciting, to say the least. Much of the experimentation happening on these sensors now could be what we are wearing in the future.


A Poly Puppy by Ayako Hirayama

Expeditions and Tours are closed for business.

Google announced recently in an email to Poly users that it will be shutting down the 3D-object creation and library platform “forever” next year.

For the uninitiated, Poly (while it’s still up) is a 3D gallery with everything from fully realized environments to smaller assets, many of them created with Google’s VR 3D creation tool, Blocks. Many of these assets are for sale or free with credits. The site has been a much-beloved beacon for VR artists and an important tool in their workflow, as it integrates with Unity. Shutting down the site is a huge blow for those who showcase their work there, as well as those who used its VR developer tools. The timing is particularly painful for VR artists who might already be challenged financially due to Covid. It would be akin to musicians losing Bandcamp, or artists who create commissioned paintings of dogs that look like Napoleon losing Etsy.

The service is said to shut down on June 30, 2021, and users won’t be able to upload 3D models to the site after April 30, 2021.

Alliterative headline aside, it would be a true shame to lose all this great content. While most artists will get a heads-up and be able to back up their projects, the connectivity of this community will be a great loss. Being able to scroll through an organized site with curated pages, genre categories, and links to artists' other works is a big win that will be sorely missed if it comes back in a less accessible form.

A petition by U.K.-based artist Rosie Summers (check out her amazing work) has been started to keep the site alive and make it open source. The petition states: “This directly impacts artists and creators using their (Google’s) paid product, Tilt Brush, as they lose the ability to easily share and view their creations with their clients and the world.” Summers laments the timing of the short-sighted decision, not only because of issues surrounding the pandemic, but also because it comes at the clear start of another VR/AR boom.

“To me the timing of the shutdown is particularly odd, as the cost of entry to VR is rapidly falling, and it feels like the XR content revolution is about to begin. Poly would and should have been a key component in the digital asset space.”

The petition already has over 500 signatures, and it doesn't seem like too big of an ask for Google to open source the site.

We propose that Google should open source as much of the Poly platform as possible, allowing the community to create a drop-in replacement for this vital service. This includes the brush shaders that are vital to displaying Tilt Brush creations correctly.

To sign the petition, go here.

Check out our other post about Expeditions and Tours shuttering which happened earlier in the year.

Todd: I’m a UX design engineer here at Fair Worlds, and I’ve been using Unity for five years. I started in 2015, largely because I was hearing so much about the potential of VR, but it seemed daunting right from the get-go. I was going to these indie game dev groups because I was interested in game development in general, and started to hear about Unity. I looked at the interface and there were so many different windows; it was really confusing. It’s much more approachable now than it was even five or six years ago.

I went to school in user experience design at the University of Washington and, in that process, got interested in accessibility work. VR had just jumped onto the map when the Oculus Rift and the Vive came out, so I got to do some capstone projects that were really exciting. One was with the (at that time top secret) Valve Index controllers.

They called them Knuckles controllers back then. We did a lot of experiments around what it felt like to be able to hold the controllers. Instead of tapping a button to throw something, we tried different things to simulate, say, what it feels like to grab a fireball, to see if there was any immersion breaking when that happens. I got involved with Fair Worlds pretty soon after that. It’s been a great journey, taking what I learned in those university years with Unity and now bringing it into agency life.

Erik: You said Unity was daunting when you jumped in. What was your entry point? Were there a series of tutorials you looked at? Were you going to the meetups and the jams? What was that experience like?

Todd: Do you guys remember the game Myst? (everyone enthusiastically agrees) My point of entry was HyperCard, if you remember that. It was the 1990s, so my point of entry was that you have stacks and then you’ve got cards, right? So every screen would be a card.

The syntax, the coding language, was very English. The classic Hello World example is like: put quote “Hello World” unquote into a field. You could actually read it like English and kind of understand it. Whereas if you look at C# or something like that, it’s much harder for people to get up to speed.

Gates: Well, just a little background about myself. I’m a filmmaker, and Erik and I had a company together a while back called Super Alright. It was similar to Fair Worlds although it wasn’t interactive; it was more video and some motion graphics. I stayed in film whereas Erik went more into the graphics, UX, and interactive field. I kind of moved more into filmmaking, but I’ve always been passionate about games. Chris is my brother. He’s very passionate about games. Between him and Erik tickling my brain about what’s happening in that world, and the pandemic shutting down the film industry, my interest became kind of two-fold. There’s the idea of maybe trying to pivot to game design as visual storytelling; the medium is exciting and the possibilities got me thinking about that world. And I‘ve been learning more about virtual cinema and the power of Unity for filmmaking itself. It looks like game engines are going to be replacing the Mayas and Cinema 4Ds of the world, or at least augmenting or supplementing them. It’s tricky for me because I don’t have an interactive background. I know a little bit of 3D, not much. I guess my whole thing is that I feel a little too behind the curve and a little too old to dive in and become a hardcore dev, but I want to be able to speak the language and I want to know what the possibilities are.

Chris: Quick aside for you there though, Gates. If you’re feeling like you’re a little too old school, if it’s any consolation, I know the creative director for Overwatch, that super huge Blizzard title. His very first directing gig was Overwatch, and he entered the game industry when he was like 33.

Mike: I’m still 1000 years older than that so it really doesn’t help (everyone laughs knowingly)

Todd: The industry is so new that everybody’s kind of getting up to speed on the new tools. I know a few friends who are 3D artists who don’t use the traditional tools. They don’t use Blender or Maya. They use Tilt Brush or other VR-only tools.

Gates: You can model like that with Tilt Brush?

Todd: Yeah, you can model. You can use Google Blocks. You can use Oculus Medium. There are four or five different 3D modeling tools that are VR-specific. I think it’s fun to play with your own limitations like that. I know in game design there’s a guy who created a game basically just using cubes. You do whatever kind of storytelling you can, starting off with just cubes or spheres and going from there.

Everything

Sean: You’re thinking of David O’Reilly. He’s a digital artist, but he made a few games that are really interesting. One’s called Everything. You can start on a micro level, a cellular level, and go up through to the universe, or you can start as a universe and decrease down to the cellular level. It's a different take on perspective.

I’m Sean. I have a small litigation support firm, and we use a lot of AI and technology to crunch the data, gather public data sets, and make recommendations to attorneys based on predictions of how judges are likely to rule on one motion versus another. I also had a small startup called Rap God.AI with a company called Second Brain here in town, two or three years ago. We built a rap engine, basically, that would emulate the style of various rappers. We trained the machine learning engine on all the different hip hop artists and lyrics we could find, and we went to the Techstars incubator. Then we went to Dubai to do a big tech conference there, which was cool. And then it didn’t take off, the money dried up, and everyone else kind of caught up to us. Now there are tons of artificial intelligence writing tools coming out, like GPT-3, chasing the market we were pursuing two years ago.

I’ve been looking at non-linear storytelling where you can just immerse yourself. For instance, there is a meditation app called TRIPP.

It claims that it’s artificial intelligence; every iteration is different, so every day there are different visuals and a different process. The unique thing about it is that it listens to your voice and your breath. When you exhale, little sparks and embers emerge from where your mouth would be, and when you inhale, they’re drawn back in. The graphic element of breathing in and out really paces and sets your breathing, which is cool. The intersection between biological processes and VR or AI is pretty interesting because it paces you through a meditation.

Todd: There is an experience called SoundSelf. It’s a meditative experience where you lie down on your back, put on a VR headset, and as you breathe in and out, it moves you forward in the experience. You’re also supposed to be humming, so you hum, and the visuals change. It’s a very psychedelic but meditative experience.

Gates: Is there some sort of advice you would give to someone starting out? Is it better to burn through a bunch of tutorials, or is just learning code better? I would be curious to hear what you guys have to say about starting out, common pitfalls, and the things you wish people had told you.

Todd: What I would suggest is looking at specific tutorials and failing fast. Start off with small chunks. Instead of going for the super ambitious idea, I would suggest going for something that’s going to be playable within a week. Work up to that bigger idea, but start off small. The nice thing about Unity is they’ve got some built-in tutorials.


What’s cool about the tutorials is that they have these little modules, so you can add on to whichever one interests you. For example, they have a Mario Kart one you can play around with, or a first-person shooter version, and you can explore whatever interests you. You start off with the full project and all of its files. The nice thing is you can be up and running immediately and you can kitbash: you can bring in your own 3D assets, you can cut up what it’s doing.

Gates: You can actually use that base as a springboard for your own projects instead of starting from scratch?

Todd: Totally. That way, you don’t have to start from a blank canvas, staring at an empty skybox and not knowing what to do. You can start with a template and learn from how they’re doing it. I think that’s probably the best approach, because if you start completely fresh, there’s so much onboarding you have to learn.

It doesn't end here. Stay tuned for part two of our roundtable with filmmakers and interactive developers!



Ready Player One
In the year 2044, reality is an ugly place. The only time teenage Wade Watts really feels alive is when he’s jacked into the virtual utopia known as the OASIS. Wade’s devoted his life to studying the puzzles hidden within this world’s digital confines, puzzles that are based on their creator’s obsession with the pop culture of decades past and that promise massive power and fortune to whoever can unlock them.


Tokyo Ghost Volume 1: Atomic Garden (Graphic Novel)
The Isles of Los Angeles 2089 — Humanity is addicted to technology. Getting a virtual buzz is the only thing left to live for, and gangsters run it all. And who do these gangsters turn to when they need their rule enforced? Constables Led Dent and Debbie Decay are about to be given a job that will force them out of the familiar squalor of LA and into the last tech-less country on Earth: The Garden Nation of Tokyo.


Rainbows End
Robert Gu is a recovering Alzheimer’s patient. The world that he remembers was much as we know it today. Now, as he regains his faculties through a cure developed during the years of his near-fatal decline, he discovers that the world has changed and so has his place in it. He was a world-renowned poet. Now he is seventy-five years old, though by a medical miracle he looks much younger…


Snow Crash
In reality, Hiro Protagonist delivers pizza for Uncle Enzo’s CosoNostra Pizza Inc., but in the Metaverse he’s a warrior prince. Plunging headlong into the enigma of a new computer virus that’s striking down hackers everywhere, he races along the neon-lit streets on a search-and-destroy mission for the shadowy virtual villain threatening to bring about infocalypse.


Malcolm Gladwell kicked off the Unity For Humanity 2020 Summit in typical Malcolm Gladwell fashion: he told a story. A meandering story that made me wonder if he had forgotten what event he was at halfway through. Then, like a yarn-spinning wizard, he tied it all up perfectly at the end. I’m a fan of Malcolm, and even when I know the ol’ Gladwell switcheroo is coming, I’m still delighted by the twist. On the surface, his keynote seemingly had exactly nothing to do with XR and technology, but it actually had absolutely everything to do with XR and technology.

I’m doing my best to paraphrase here, because I ain’t Malcolm.

There was a fella named Chester Wenger, who recently passed away at the age of 102. He was a pastor in the Mennonite church. Mennonites are kind of like the Amish, except the Mennonites are OK with cars and zippers; they are still very conservative and old-world religious, however.

Chester lived his life in service to the church, spending most of his early adulthood as a missionary in Ethiopia before moving to Pennsylvania to preach. In the 1970s, Chester’s youngest son, Philip, came out to his dad as gay. Chester accepted who Philip was, but the church did not. Philip was promptly fired from his job at the church and subsequently excommunicated. Philip found another church, and Chester, while disappointed in his church’s decision, continued preaching.

Cut to 40 years later: 2014. Pennsylvania has now legalized gay marriage, and Chester performs the wedding ceremony for his son Philip and Philip’s longtime partner. Knowing that this act violates church precepts, but wanting to be forthright, Chester goes to the church and informs them of what he has done. The church asks Chester to step down from his role as pastor of some 70 years, and he does, but then he writes an open letter on Facebook (again, lucky he isn’t Amish). It’s an appeal to his church to reconsider its position on gay people. He writes with eloquence, asking how the church thinks Jesus would behave toward sons and daughters who have it so much more difficult in communities around the world.

After he posts the letter, it goes viral. I even remember reading it, despite having a Facebook feed that mostly consists of snarky comments about werewolf movies. The letter continues to gain traction and becomes a catalyst for change in the Mennonite church regarding the LGBT community.

The tie-in between this story and XR is inclusion through empathy and storytelling. (Again, I’m not as eloquent as Malcolm.) Using technology to tell his own personal story, Chester was able to reach more people. By making the storytelling personal, he opened people up to the situation better than a cut-and-dried scriptural argument could. In his view, the church, by excluding a group of people, had contradicted the very teachings of its own doctrine. He didn’t provide dry facts and he didn’t do a PowerPoint presentation. It worked because he told a story. It was personal. People could get inside it. Stories have power where arguments do not.

That was the theme of the Summit: how this new technology can truly change the user. Facts are great, but facts with feelings, hidden in a story, make change. It was a powerful keynote and a great start to the event.

After the keynote, Embodied Labs spoke about creating modules to help others experience the aging process. Their VR training platform, the Dima Lab, helps build caregivers' empathy. Using film and animation, the user gets a first-person experience of having hallucinations from dementia. Multiple elderly-care workers said that the training has changed how they interact with patients. The Dima Lab modules feature several diverse protagonists in order to give different examples of the struggles that older adults face, ranging from hearing or vision loss to the very different types of hallucinations brought on by Alzheimer's, Parkinson's, or Lewy body dementia. The training has been given to over two thousand caregivers. (Learn more at beingpatient.com.)

Being Henry is a VR documentary about Henry Evans, a quadriplegic technology enthusiast who works with assistive robotics. Unable to speak, Henry uses technology to communicate with people. The experience gives you the opportunity to see through his eyes: how he lives, speaks, and interacts with the world using eye movements and technology.

Thicken the Plot focused on the work of Tamara Shogaolu. Tamara created a three-part experience about the plight of Egyptian LGBT people seeking asylum and the risks they face. I was drawn to some of the stylistic choices of her film work and her mix of gameplay. The idea of using the same core material for both playable and watchable versions is intriguing.

Another Dream, a hybrid animated documentary and VR game, brings the gripping, true love story of an Egyptian lesbian couple to life.

Deep is a simple VR experience controlled by breathing. As a recent devotee of mindfulness meditation, I can speak to the efficacy of mindful breathing; I’ve just been using the Calm app on my phone. Deep takes the benefits shown with meditation and deep breathing and puts them into a game, and it is being used to help treat those who suffer from PTSD.

I’m someone who has to grapple with the instinct to try to change minds with raw facts alone. Unity For Humanity was a good reminder that this isn’t always the most effective tool. Even in casual conversation, intellectual intelligence seems to work better in conjunction with emotional intelligence. Pair that with immersive technology, and the possibilities become very exciting.

For more of the over 50 talks at the Unity For Humanity 2020 Summit, check out the playlist here.


It seems to be somewhere in between a total loss and a simple site move, but the Google Expeditions app will be no more after June 30, 2021. Expeditions enabled students to take virtual field trips through 360-degree tours of significant historical or geological locations via Daydream.

While the app will go away, most of the content is said to be moving over to the Google Arts & Culture app as 360-degree experiences. The good news is that it will still exist (unlike, say, Scott Pilgrim vs. the World: The Game, which is only now getting an updated rerelease after being delisted for the last six years). The bad news is that it’s unclear how accessible the content will be in VR. Google Arts & Culture is a free app and web destination that gives users access to art and cultural pieces for viewing from the comfort of their personal computer or smart device.

The reasons listed for the discontinuation are COVID-related. Google’s Jennifer Holland dropped the news on Friday in a blog post. “Engaging students in the classroom has taken on an entirely different meaning this year,” Holland wrote. She later added: “We’ve heard and recognize that immersive experiences with VR headsets are not always accessible to all learners and even more so this year, as the transition to hybrid learning has presented challenges for schools to effectively use Expeditions.”

Makes sense. It does seem like this would have been the sort of thing reserved for the one A.P. class you could never get into. With kids stuck at home and technology leaping past Google Cardboard, this incarnation of educational VR may have just been the wrong thing at the wrong time.

Hopefully Google Arts & Culture can build on what Expeditions started as far as interactive education. Google Arts & Culture is continuing to expand its augmented reality (AR) content using interactive camera features, such as Art Filter and Art Transfer, that help you learn about cultural artifacts in new and engaging ways that would otherwise not be possible in the physical world.

See you in class


These last few months, I have been making augmented reality (AR) prototypes with ARKit and ARCore at my desk at CoMotion Labs as the Lead Prototype Developer at Fair Worlds, and I would like to share what I’ve learned.

Overview

You've probably heard “AR” in the news. But what the heck is AR, and why should you care?

AR overlays information on top of your existing world, unlike virtual reality (VR), which completely absorbs you into another world. You’ve probably seen a recent example of AR with Pokémon Go.

Pokémon Go players, like the one above, catch virtual Pokémon by wandering around the world. The game mobilized players to leave their homes and actually go on walks instead of binge-watching the latest show on Netflix. You could see people wandering around in groups, looking furtively back and forth for Pokémon and calling out to each other. You can still find players around Pokémon Gyms, which are real-life landmarks with virtual gyms superimposed on them. For a couple weeks in the summer, everyone was absorbed into the craze. Pokémon Go was the gateway application of AR.

Niantic, the developer of Pokémon Go, is set to make another AR experience with the beloved Harry Potter franchise in 2018. So AR is big and it’s not going anywhere.

With Pokémon Go leading the way, there has been a thirst for more augmented reality experiences. This year Apple opened up the floodgates for AR development, launching a single-camera AR solution with ARKit. Google followed up with its version of ARKit, ARCore, a few months later.

Before this, AR development was limited to only a select group of developers because the devices were not commercially available. Google was working on a two-camera AR solution with Project Tango, but it was not sold for public consumption.

Above is a demo of Project Tango 3D-capturing a physical room.

Microsoft has a similar ability to map the 3D world with the HoloLens, but the $3,000 price tag keeps it from becoming widely adopted.

Here you can see a table turn into a Minecraft world, from E3 2015.
So why is this a big deal? Because ARKit/Core is technology for the masses, or at least the masses that have a smartphone.

I won’t get into the super technical details, but if you are interested I would read this article by Matt Miesnieks on "Why is ARKit better than the alternatives?" What I will tell you is that ARKit/Core gives you the ability to place objects on surfaces.

ARKit/Core detects surfaces using the camera and the accelerometer/gyroscope, which is what allows you to drop lots of LaCroix on the floor as seen above. It doesn’t map your whole room into a 3D mesh for you like the HoloLens does, but you can place objects on detected surfaces like floors and tables.
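To make that concrete, here is a minimal Unity sketch of tap-to-place using AR Foundation, the cross-platform wrapper over ARKit/ARCore that Unity later shipped (at the time of this post you would have used the platform plugins directly). The prefabToPlace and raycastManager fields are hypothetical names you would wire up yourself in the Inspector.

```csharp
using System.Collections.Generic;
using UnityEngine;
using UnityEngine.XR.ARFoundation;
using UnityEngine.XR.ARSubsystems;

public class TapToPlace : MonoBehaviour
{
    public GameObject prefabToPlace;         // e.g. a can of LaCroix
    public ARRaycastManager raycastManager;  // lives on the AR Session Origin

    static readonly List<ARRaycastHit> hits = new List<ARRaycastHit>();

    void Update()
    {
        if (Input.touchCount == 0) return;
        Touch touch = Input.GetTouch(0);
        if (touch.phase != TouchPhase.Began) return;

        // Cast a ray from the screen tap against detected planes only.
        if (raycastManager.Raycast(touch.position, hits, TrackableType.PlaneWithinPolygon))
        {
            // The first hit is the closest detected surface point.
            Pose pose = hits[0].pose;
            Instantiate(prefabToPlace, pose.position, pose.rotation);
        }
    }
}
```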

ARKit/Core also gives you light estimation through the camera, which modifies the ambient intensity of the scene.

ARCore Light Estimation Demo
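Here is a hedged sketch of what consuming that estimate looks like with today's AR Foundation API; the component and event names are Unity's, but the wiring around them is an illustrative assumption.

```csharp
using UnityEngine;
using UnityEngine.XR.ARFoundation;

public class AmbientLightMatcher : MonoBehaviour
{
    public ARCameraManager cameraManager; // on the AR camera
    public Light sceneLight;              // the directional light to adjust

    void OnEnable()
    {
        sceneLight.useColorTemperature = true; // allow Kelvin-based tinting
        cameraManager.frameReceived += OnFrame;
    }

    void OnDisable() => cameraManager.frameReceived -= OnFrame;

    void OnFrame(ARCameraFrameEventArgs args)
    {
        // Estimates are nullable because support varies by platform and frame.
        if (args.lightEstimation.averageBrightness.HasValue)
            sceneLight.intensity = args.lightEstimation.averageBrightness.Value;

        if (args.lightEstimation.averageColorTemperature.HasValue)
            sceneLight.colorTemperature = args.lightEstimation.averageColorTemperature.Value;
    }
}
```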

Insights

I want to go through some of my personal experiences while developing for AR.

The first experience I created was called ARKatz. The basic premise is that you need to collect mischievous cats and capture them by booping their virtual noses.

The idea for the project came from the above meme that has been floating around, and I got excited about physically engaging with virtual characters in a DIRECT manner with the "boop." I'm telling you, it's the next paradigm shift in user interaction design. ;) It's going to be BIG!
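For the curious, the core of a boop can be as dumb as a distance check between the phone (which, in handheld AR, is effectively your hand) and the cat's nose. This is a simplified sketch, not the actual ARKatz code:

```csharp
using UnityEngine;

public class BoopDetector : MonoBehaviour
{
    public Transform catNose;        // an empty transform parented to the cat's nose
    public float boopRadius = 0.05f; // ~5 cm feels like a deliberate tap

    bool booped;

    void Update()
    {
        // In phone AR, the device's physical position IS the camera's position,
        // so moving the phone toward the cat moves this point toward the nose.
        float distance = Vector3.Distance(Camera.main.transform.position, catNose.position);

        if (!booped && distance < boopRadius)
        {
            booped = true;
            Debug.Log("Boop! Cat captured.");
        }
        else if (distance > boopRadius * 2f)
        {
            booped = false; // re-arm once the phone pulls away
        }
    }
}
```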

One of my core tenets as a designer is to try to make experiences that are more than just pushing a flat touch screen. We’ve seen a dystopian future in the TV show Black Mirror, where people become slaves to their devices.

I worked on the concept of virtual boops in the hope of re-engaging an appreciation of your surroundings. What I found was that when I went on walks and came across a cat, I wanted to put the virtual cats into the same scene as the real cat. I, as a dumb human, liked to create stories about these cats.

That’s the power of AR: you can re-contextualize the mundane.

Amazon and IKEA have realized this and have been updating their apps to allow users to see what their products will look like in context before they buy.

Housecraft by @AsherVo and @DanielZarick

There’s incredible value in seeing a virtual 3D object placed in your home. There’s no more guesswork; it’s overlaid onto your world.

Stay tuned to our site and blog to see more of the AR experiences we're building for our clients. It's going to be gigantic in 2018!


Earlier this month Amazon dropped this bomb on not just the VR/AR Twitterscape...but the whole mobile web.

I actually missed the initial wave of news because I was in back-to-back meetings, so I sent a snarky text to my co-founder James Brunk when he broke the news to me.

Text exchange

(He was asking for it to be the topic of our still-unnamed and still-unrecorded podcast; look for that soon!)

But what does this really mean? In a nutshell, Amazon has broken down the door for brands to jump into augmented reality. Because the Amazon App is the number 5 free app on the store, brands aren't saddled with the issue of getting someone to download a new app.

But what does it do currently? This really.

And this

You are able to see items featured on Amazon in the setting where you would place them (IRL) in your home or office, all in fully immersive 3D augmented reality. This isn't a "sticker" or a trick; it looks and feels great out of the box.

This is from the official release.

"Amazon’s latest augmented reality offering within the Amazon App launched today for customers with iOS 11 installed on their iPhone 6S or later. Using Apple’s ARKit, AR view helps customers make better shopping decisions by allowing them to visualize the aesthetic and fit of products in their own living space. Customers simply open the Amazon App, click on the camera icon and choose AR view. They can then select from thousands of items – from living room, bedroom, kitchen and home office products to electronics, toys and games, home décor and more. Whether customers are buying a sofa or a kitchen appliance, they can overlay it onto their existing living space, move it and rotate it to get a full 360-degree peek in a live camera view to make sure it fits their style and aesthetic."

But how can you be a part of it?

Well....Today's Phrase of the Day is OPTIMIZED 3D MODEL

If you are a brand that sells on Amazon, the best way to get there is to take your product and create an optimized 3D model for the app. It's that simple (and we can help).

The better news is that once you've done that, you're well on your way to creating your own AR/VR storytelling experiences with that model, and you can deliver it across a lot of different platforms.

Like Sketchfab (think the YouTube of 3D)

Toaster Remake by siotech2011 on Sketchfab

Or WebVR, or Room-Scale VR, or a video, or printed material....the list goes on. The 3D asset is the biggest barrier to getting to immersive storytelling.

Look for some real (augmented) world examples from us on this front very shortly!


Gif sourced from HERE Technologies recap video

We were thrilled to attend CES 2018 this year with our new client, HERE Technologies. We helped install and set up our interactive installation in the very impressive HERE booth, though it was less a booth and more a two-story building constructed from scratch right outside the convention center.

Our project was an interactive screen powered by a tablet that enabled our client, Alex Osaki, to walk customers through some of the new innovations around tracking for industry and shipping.

The above is the full split-screen video capture of the tablet that controlled the experience and the screens it projected onto.

While we were mostly occupied with that project, Nick and I were able to get around the show floor a bit. Here are some of our impressions.

Vive 2.0 (Pro)
Photo from Road To VR

I was honestly surprised to see this announcement. We were expecting something from HTC, but I didn't think the new headset would be coming out this soon, especially with such a large increase in screen resolution: 78 percent, to be exact.

The new headset will have a Dual OLED display with a 615 dpi resolution – that's a 78% increase over the current HTC Vive's resolution. - From The Road to VR

I've found that the above fact was left out of most mainstream coverage. A lot of folks are saying things like "a moderate upgrade in resolution," but in actuality this is a huge increase for what is already the best-in-class headset on the market. (Though we are becoming pretty partial to the new Microsoft family of MR headsets.)

Vuzix

Nick and I were able to take some time to demo the new AR offerings from Vuzix, and we are excited about the potential here. For technical and training applications this tech could be a great option for certain segments, and the more consumer-facing Vuzix Blade is interesting. They claim these are "fashion forward," and I think some would argue to the contrary...but in certain scenarios I could see wearing these out in public (then again, I'm the guy who shamelessly wears my Google Daydream on long flights). What got my brain going the most was the possibilities around location-based entertainment: groups wearing the glasses with a gamified feature that augments something physical, like an escape room or immersive theater.

Samsung 4D VR

The most eye-catching booth for my money was the mini theme park Samsung set up to showcase its VR offerings paired with high-end motion simulators. There was a snowboarding experience with a physical board, and a dragon ride with a full 360-degree motion simulator, much like you'd see at Space Camp. The lines were too long for us to try with our work schedule, but I was able to capture the above video. Notice that the two guys in this clip aren't exactly smiling....I'm not sure how enjoyable being spun fully upside down while wearing a mobile headset feels. I would rather have seen this done with the new Samsung Odyssey, giving you six degrees of freedom to prevent motion sickness. I have previously done the roller coaster demo seen next to it at SXSW, and the more simplified D-BOX chairs offer a surprising amount of immersion with far fewer stomach-churning rotations. All in all, this is the future of themed entertainment in my book, especially for groups, giving a more shared experience vs. the isolation of a "one at a time" queue.

For another good breakdown of VR news from the conference, we recommend this piece from Road to VR.

https://www.roadtovr.com/4-most-important-stories-ces-2018-breakdown-analysis/

I've got this [...physical object...]. Now, how can I bring it to an audience in [...new immersive medium...]?!

Baby Stroller by John Keane on Sketchfab

There is a theme in our work these days, and it's getting 3D assets optimized so that we can use them on a multitude of platforms: some old, tried-and-true ones like video and social media, and new ones that are popping up every day, like VR and AR (Google AR Stickers dropped yesterday...more on that soon).

The main issue most of our clients have is hunting down the actual 3D files for their product (usually a CAD file). Even when they can find the CAD, there is an education process about what these different file types are capable of. CAD is great for manufacturing and engineering but terrible for rendering videos or anything interactive; the models are too complex to render in real time. This gets us to the idea of optimization: creating 3D models that can be used across a variety of new immersive platforms. The best way to think about it is that you need a version of your product that could run in a modern video game. The work isn't as much as you'd think, though it varies from product to product since they are all "unique snowflakes": typically one to five days for a 3D artist. Once you have spent that time, there are so many things you can do with the result.
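As a quick illustration of what "could it run in a modern video game" means in practice, here is a small Unity helper that sums the triangle count under a model and compares it to a budget. The 100,000-triangle figure is an assumption for illustration; real budgets vary by platform and scene.

```csharp
using UnityEngine;

// Attach to the root of an imported product model to sanity-check its density.
public class PolyBudgetCheck : MonoBehaviour
{
    public int triangleBudget = 100_000; // illustrative, not a universal rule

    void Start()
    {
        int triangles = 0;
        foreach (MeshFilter mf in GetComponentsInChildren<MeshFilter>())
            triangles += mf.sharedMesh.triangles.Length / 3; // indices come in threes

        Debug.Log(triangles <= triangleBudget
            ? $"OK for real time: {triangles} triangles"
            : $"Needs optimization: {triangles} triangles (budget {triangleBudget})");
    }
}
```

A raw CAD export will usually blow past any such budget immediately, which is exactly why the optimization pass exists.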

Let's go through the use cases.

VIDEO

This one is a no-brainer: 3D videos have been notoriously difficult to produce and time-consuming to render. With real-time assets we can create these types of videos more easily and affordably.

FACEBOOK

Facebook launched a new feature this year that allows 3D objects to be natively posted in the news feed. See this link for details; this is a fantastic way to show off products on the ubiquitous social media platform.

This video shows how easy it is to post the models once they are created.

AR

This is a newer format and one that we're incredibly excited about. Phone- and tablet-based AR is primed to really take off this year as consumers realize they can unleash its potential on their favorite devices (which they already own and constantly upgrade).

Below is an example we made as a proof of concept for Vertiv, showing the use case of placing data center technology in your existing space to see if it fits your current footprint.

This was possible because we had already created the optimized asset for a VR project and also delivered it via the Sketchfab platform.

CRV 600 v2 by Erik Horn on Sketchfab

AMAZON AR View

I've written a whole blog post about this format, but the biggest takeaway is that this is one of the most downloaded apps on the planet and how a majority of people shop. Being able to include your product in its ecosystem in AR is a slam dunk.

VR

Our favorite of the new immersive mediums, Virtual Reality allows potential customers to interact with your product in new and fantastical ways and without the distractions and constraints of reality (there isn't a virtual phone in their avatar's pocket buzzing every 10 seconds with push notifications).

Our favorite example is our Whole Foods project, which shows what you can do with optimized 3D models of wine bottles and frozen chickens.

GIFS

Just like the one above for Whole Foods: who doesn't love a good GIF? Having the 3D asset allows for all kinds of creative ways to make them easily for social media platforms.

This year our Lead XR Prototyper Todd Little made it down to GDC 2018 - here are his thoughts on the experience.

I’m very happy that I went to GDC this year. I came away super inspired by the people I met and the talks I attended. These last months I had felt stuck in a routine and that I wasn’t being as creative as I could be. The number one thing I came away with is that I need to slow down and draw out my ideas instead of going straight to development. I also felt challenged to tackle problems facing accessibility in VR.

What follows are descriptions of and insights from the talks I attended at GDC 2018, split between two tracks: Unity talks and VR talks.

Unity Talks

AR Lightning Talks with Unity (Unity Labs)

Stella Cannefax (Unity, @computerpupper), Jono Forbes (Unity, @jonoforbes), Andrew Maneri (Unity, @yarharhar), Matt Schoen (Unity, @mtschoen)

Stella compared AR devices to babies: they have a very limited understanding of the world. Unity Labs is researching ways to improve that understanding. How do we teach the device what a ‘floor’ is? What a ‘chair’ is? What a door is? We could define a door as being on a vertical plane, having a door knob, and being openable. The technology currently only supports plane detection and image recognition, so how do we know if a door can be opened?

This gets into affordances: our relationship with objects and the semantic understanding of the world. How do we teach devices about those types of relationships? One possible avenue is machine learning. Unity has been making a push to get these tools into developers' hands, so that they can experiment and allow AI to learn from trial and error.

They pointed out that computer vision researchers have had recent success in semantic understanding experiments, which you can read more about here: “Semantic Scene Completion from a Single Depth Image.”

First Look at Unity 2018 Mobile AR Key Features

Jimmy Alamparambil (Unity, @jimmy_jam_jam), Tim Mowrer (Unity)

Currently, it’s difficult to do cross-platform AR development since ARKit and ARCore are separate services. Unity is trying to rectify this by creating a layer of abstraction, so that we can develop once and the new architecture, Unity’s AR Subsystems, talks to the specific provider (e.g. ARKit, ARCore, Vuforia). They are also decoupling their release schedule from the providers' release schedules by introducing a new concept in 2018: Packages. So if Apple releases a new version of ARKit, we don’t have to wait until Unity releases a new version; we can just import that package.
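Unity didn't show the internals, but the shape of the idea looks something like this sketch: the app codes against one interface, and a per-platform provider is chosen behind it. All names here are hypothetical, not the actual AR Subsystems API.

```csharp
using System;
using UnityEngine;

// The app only ever talks to this interface.
public interface IPlaneProvider
{
    event Action<Pose> PlaneDetected;
    void StartDetection();
}

// One provider per platform; each wraps its native SDK behind the interface,
// and each could ship (and be versioned) as its own package.
public class ArKitPlaneProvider : IPlaneProvider
{
    public event Action<Pose> PlaneDetected; // raised by native callbacks in a real build
    public void StartDetection() { /* call into ARKit here */ }
}

public class ArCorePlaneProvider : IPlaneProvider
{
    public event Action<Pose> PlaneDetected;
    public void StartDetection() { /* call into ARCore here */ }
}

public static class PlaneProviderFactory
{
    public static IPlaneProvider Create()
    {
#if UNITY_IOS
        return new ArKitPlaneProvider();
#else
        return new ArCorePlaneProvider();
#endif
    }
}
```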

Hardware vs. Software: UX Loops, 1936 to Now

Timoni West (Unity, @timoni)

Timoni walked us through the history of computing and how physical hardware changes the ideas of what is possible in software, and vice versa (from an early version of Duck Hunt from 1936, to the mouse, to game controllers).

Unity at GDC Keynote

This year's keynote focused on improving graphics rendering and workflows. The Scriptable Render Pipeline allows us to create our own rendering. Unity is shipping two pre-made pipelines: the High Definition (HD) and the Lightweight (LW). The Book of the Dead example shows off what is possible with the HD pipeline, while the Lightweight pipeline strips out unneeded rendering for lower-end devices.
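To give a feel for what "create our own rendering" means, here is a minimal custom pipeline that does nothing but clear each camera's render target. It follows the general shape of the SRP API as it later stabilized; the exact class names in the 2018 experimental release differed slightly.

```csharp
using UnityEngine;
using UnityEngine.Rendering;

// The asset you select in Graphics Settings; it creates the pipeline instance.
[CreateAssetMenu(menuName = "Rendering/TinyPipelineAsset")]
public class TinyPipelineAsset : RenderPipelineAsset
{
    protected override RenderPipeline CreatePipeline() => new TinyPipeline();
}

public class TinyPipeline : RenderPipeline
{
    protected override void Render(ScriptableRenderContext context, Camera[] cameras)
    {
        foreach (var camera in cameras)
        {
            context.SetupCameraProperties(camera);

            // A real pipeline would cull and draw here; we only clear.
            var cmd = new CommandBuffer();
            cmd.ClearRenderTarget(true, true, camera.backgroundColor);
            context.ExecuteCommandBuffer(cmd);
            cmd.Release();

            context.Submit();
        }
    }
}
```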

Rendering in 2018: How to Get the Most out of Unity's New Rendering Pipeline

Tim Cooper (Unity, @stramit), Natalie Burke (Unity)

They demonstrated how to get started with the Scriptable Render Pipeline. At the very beginning of a 2018 project, you can select HD or LW as a preset. Something to keep in mind is that materials will need to be adjusted when switching, since the pipelines use different shaders (for example, the Standard shader has HD and LW versions).

They also demonstrated how to use and blend between Post-Processing Volumes (the new Post-Processing V2 stack).

They also talked about the new Shader Graph, a built-in node-based shader creator.

VR Talks

1 Game, 6 Headsets, 10 Controllers: Multiplatform VR with 'Floor Plan'

Nic Vasconcellos (Turbo Button, @njvas), Holden Link (Turbo Button, @holdenlink)

These guys initially created an elevator game for Gear VR and described the challenges of porting it to the Oculus Rift. For example, now that the user can look outside of the elevator, they had to add more 3D models to the scene. They also showed all the different interaction systems they tried once hands were possible. The takeaway was that it isn’t worth chasing the install base; it’s all about timing, and it’s better to pick your battles than try to do it all.

Accessibility in VR: How It Can Be Better

Andrew Eiche (Owlchemy Labs, @buddingmonkey), Cy Wise (Owlchemy Labs, @cyceratops)

Andrew and Cy talked about how to make VR more accessible. They conducted an analysis of a wide selection of VR titles on the store with criteria mostly focused on mobility (e.g., how much bending? Can you play this game seated? Can you use one hand?). They found that we as an industry do pretty well with seated experiences and limited bending, but we fall short on one-handed experiences. They did a great job of pointing out experiences that did well in different ways.

They had two recommendations to explore (a rough sketch of the first appears after the notes below):

  1. If force-pull doesn’t make sense narratively for picking up objects, you can use hover pick-up to avoid bending.
  2. Experiment with putting text in-world as speech bubbles. What do speech bubbles look like in VR?

*Notes courtesy of Alexandria Heston (@ali_heston, tweet)
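Owlchemy didn't share implementation code, but the hover pick-up idea from recommendation 1 can be sketched as a hand ray plus a pull, so nothing ever requires bending to the floor. Names and input mapping here are illustrative:

```csharp
using UnityEngine;

public class HoverPickup : MonoBehaviour
{
    public Transform hand;       // tracked controller transform
    public float maxReach = 5f;  // how far the hand ray extends
    public float pullSpeed = 3f; // meters per second toward the hand

    Transform held;

    void Update()
    {
        if (held != null)
        {
            if (Input.GetButtonUp("Fire1")) { held = null; return; }

            // Float the object toward the hand; no bending required.
            held.position = Vector3.MoveTowards(
                held.position, hand.position, pullSpeed * Time.deltaTime);
            return;
        }

        // "Fire1" stands in for a grab button; real projects would use
        // their VR SDK's input system, and "Grabbable" is a tag you define.
        if (Input.GetButtonDown("Fire1") &&
            Physics.Raycast(hand.position, hand.forward, out RaycastHit hit, maxReach) &&
            hit.transform.CompareTag("Grabbable"))
        {
            held = hit.transform;
        }
    }
}
```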

The Story of 'Luna': Designing Accessible PC and VR Experiences

Robin Hunicke (Funomena, @hunicke)

Robin took us through her process of creating Luna from conception to final product. Her big takeaway: if you have the ability, slow down and explore the idea as much as you can before development. I was surprised that it took five years to make the game. She came off Journey and wanted to explore a more personal project around the sexual abuse she endured growing up. It took two years of working with an artist to explore themes and art styles. She talked about how challenging it is to pitch this type of project and showed lots of prototypes. It initially was not a VR project; that hadn’t even crossed her mind. The VR piece came when they had already started exploring Intel’s hand recognition.

The Portal Locomotion of 'Budget Cuts'

Joachim Holmer (Neat Corporation, @joachimholmer)

Joachim walked us through the inspiration behind the portal locomotion. They didn’t like the current state of teleportation locomotion: you can travel as fast as you want and as far as you want, and it would be cheap to just add a timer to it. Instead they took inspiration from the portal guns of Portal and Unreal Tournament. Unlike Portal’s portals, these don’t allow you to throw objects through. They talked about the development challenges in implementing this system.
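The core trick when a player steps through any portal of this kind is a relative-transform remap. This is a generic sketch of that math, not Neat Corporation's code:

```csharp
using UnityEngine;

public class PortalTeleport : MonoBehaviour
{
    public Transform entryPortal;
    public Transform exitPortal;

    void OnTriggerEnter(Collider other)
    {
        if (!other.CompareTag("Player")) return;

        Transform player = other.transform;

        // Position: express the player relative to the entry portal,
        // then re-express that same offset relative to the exit portal.
        Vector3 localPos = entryPortal.InverseTransformPoint(player.position);
        player.position = exitPortal.TransformPoint(localPos);

        // Rotation: carry the player's orientation across the same remap.
        // (Face-to-face portals usually also flip 180 degrees around up,
        // so you exit facing outward; omitted here for brevity.)
        Quaternion relative = Quaternion.Inverse(entryPortal.rotation) * player.rotation;
        player.rotation = exitPortal.rotation * relative;
    }
}
```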

Exploring the Unsolved Challenges of VR Gaming

Kevin Harper (ustwo, @angryarray)

Kevin works as a Unity developer for ustwo in NYC. His talk focused on how, through prototyping, we can uncover problems we didn’t even know existed. His team got in the habit of writing down ideas they wanted to prototype, then on a Friday would conduct a “#1HourPrototype” (e.g. how does it feel to swim in VR?).

He ended by challenging the audience to tackle three problems he thinks are worth exploring: 1) weight, 2) health, and 3) inventory. He also showed a very clever prototype where you play a gunslinger during a shoot-out, but your hat keeps falling down over your eyes, so you have to use your gun to lift up your hat.

Mind Control in Mobile VR: Gaze Activated UI in 'SingSpace'

Jeff Hesser (Harmonix), Kevin Cavanaugh (Harmonix)

They talked about how to create interfaces for gaze-based interactions, and went through the process of creating a comfortable experience for 3DoF headsets using an angle-based approach. The designer would create a pizza-slice 3D model to show the angle that’s comfortable, and could then move the UI along the Z axis knowing that the ratio would stay the same.
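That angle-based layout reduces to one line of trigonometry: a panel at distance z that should subtend an angle θ needs a width of 2·z·tan(θ/2), so it covers the same slice of your view wherever it sits in Z. Here is an illustrative Unity sketch (component and field names are mine, not Harmonix's):

```csharp
using UnityEngine;

public class AngularPanel : MonoBehaviour
{
    public Transform head;                // the 3DoF camera
    public float comfortableAngle = 30f;  // degrees of horizontal field to cover
    public float distance = 2f;           // meters in front of the viewer

    void LateUpdate()
    {
        // Keep the panel at a fixed distance along the head's forward axis,
        // facing the viewer (flip the rotation if your UI faces the other way).
        transform.position = head.position + head.forward * distance;
        transform.rotation = Quaternion.LookRotation(transform.position - head.position);

        // Size it so it always subtends the same angle:
        // width = 2 * distance * tan(angle / 2).
        float width = 2f * distance * Mathf.Tan(comfortableAngle * 0.5f * Mathf.Deg2Rad);
        transform.localScale = new Vector3(width, width * 0.5f, 1f); // 2:1 panel
    }
}
```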

I am trying to figure out how to perform in this pandemic. Not only am I trying to perform basic human behavior, which has started slipping away after being isolated for so long, but I also want to be onstage. I am a performer. These past seven or eight months are the longest period in the last 25 years in which I have not participated in some sort of live performance. I’ve got a void that requires attention from strangers, and I need to figure out how to fill this chasm, ASAP.

“Live” concerts have changed. I’ve watched several bands stream live shows, bands that I really like, but the lack of noise between songs is deafening. The quiet room-mic buzz between songs during The Dickies' stream on Facebook Live just felt weird. For all its weirdness, though, it was still nice to see something.

There was no audience there. It was a well-shot performance, and I was happy to see it, but it was lacking in so many ways. The experience was not very different from watching a YouTube video. In fact, it was a bit like watching TV before we could watch whatever we wanted whenever we wanted: I actually had to log in at a certain time to see the performance. That's the model of how TV worked before your family had a VCR and someone responsible enough to hit record; you watched from home at that time or you missed it. I suppose that would be a novelty to someone younger than me, but I don’t see it catching on, and it may not take off at all....Facebook decided it is no longer allowing music to be live streamed starting Oct. 1st (TBD if that holds, though).

Some of these shows have been considered successes. Maybe it's a music genre thing, maybe it's an age demographic thing, or maybe it's just that things are getting weirder. Fortnite has held in-game concerts that have garnered praise. DJ Marshmello and rapper Travis Scott have both held "concerts" within the game. 10 million concurrent users attended the Marshmello show in some capacity, 12.3 million for Travis Scott. At a given time in the game, you and other players are treated to (or bothered by, depending on your tastes) a show with a giant floating psychedelic thing-a-ma-deal. Lots of colors, and lots of visuals to get lost in. I asked my Fortnite-obsessed 10-year-old nephew if he was aware that such an event had occurred in the virtual play-space where he spends 80% of his waking time, and I was met with a resounding "no duh". I'm not sure that Travis Scott made a new music fan in the kid. However, he did get some name recognition from a consumer whose only other celebrity talking points are what's going on with the members of Paw Patrol.

Sadly none of the other characters from Uncut Gems show up

TikTok has started experimenting with the concert model. A few weekends ago the app premiered an "augmented reality concert" by The Weeknd (why drop the third "e"?). The XR company Wave created a world not dissimilar to the Fortnite experience, with trippy colors, moving lights, and a Weeknd avatar that moved around and danced. There is a series of fan mini-games throughout the concert, enabling the viewer to vote on what type of blacklight poster/headshop display should be seen during the next song. The show went so well that The Weeknd ended up as an investor in the company.

Again, maybe I'm just way out of the demographic, but even if I loved those songs, I have trouble wrapping my head around how this would get close to the feeling of a live show with real people present. It's more like playing Concert: The Video Game. Maybe it's supposed to be just that and not try to compete with reality. It's too early to tell if the virtual streaming model will work the same way MTV did and change how music is consumed. Live shows still existed, but the way we were exposed to music changed, which in turn changed the types of artists that got popular. Maybe the next big thing is the musician who is also the best software developer.


Just because it isn't for me doesn't mean it isn't valid. Every generation thinks the one after them is ruining what they created. I'm doing my best not to be the old man that yells at clouds, so I asked around the Fair Worlds office to get some takes from people who aren't me.

Xuny, my co-worker in our Seattle office, is a big fan of the Wave shows. Wave is the same company that put together the Weeknd TikTok experience. Essentially, they put on VR concerts. You sign on, slap on a headset, pick an avatar, and attend a concert with a group of other people at the same time.

Xuny: "I actually did more of these before Covid. I love live concerts, but I used to do the Wave when I would work late nights at my previous job. When I attended, the concerts were around 7pm, and I would just stay at the office, eat a quick dinner, and hop into the headset. It was a really great way to experience the concert setting without having to leave the office early, without having to head home to get ready and then back out to the venue to look for parking. All of the hassle is gone. So some of it is convenience. The Wave concerts are shorter than most live shows. Then there are also drinks at live shows, and if you are doing a late night with work the next morning, it's not easy."

"I saw Galantis, a big EDM band. I didn't watch in my headset, just on my desktop, so it was a different experience but still cool."

"I am an extrovert. I like live concerts for the energy and the crowd and the people. The other reason I like EDM is the light shows, and that is something that EDM in VR can recreate exceptionally well. The whole light show experience, looking at an environment that is augmented by pretty lights, is something that VR can deliver on and possibly even top a live show. One of the first Wave shows I saw, we were all in the concert area and then all the avatars were transported to a different world, an ethereal space. That drastic a change in surroundings is something that Wave will always have over a live experience." I will also note that I have never been to a club that didn't have a horrifying bathroom, so that may be a positive too.

"When you log in, your headset microphone is on and you're in there with 20–50 people that you can interact with, but mostly I'm just enjoying it on my own. That being said, I go to concerts on my own IRL anyway. I've enjoyed watching on my computer screen as well as in VR, but it's always much better in a headset."

I also spoke with Andy Slater. Andy is a multitalented musician who has both performed in and experienced many streamed music shows.


Andy: The first thing that popped up with any kind of organization for these virtual concerts was the Chicago quarantine concerts from DSS, a nonprofit studio that does residencies and cool performances out of Chicago. Early on they had some heavy hitters like Thurston Moore and Ricardo Lindsey doing weird experimental stuff.

They asked me to curate a concert. They let me do whatever I wanted. So, I did a performance, and I got a group called The Von Trapp Family to do an endearing performance that featured delayed scratching over The Sound of Music soundtrack. New experimental stuff. They were just so organized in putting all this together. It was great. You've got Thurston Moore on his phone and it sounds like shit, but it's great, because it's Thurston Moore and who cares. The webcam was sideways halfway through. The idea is to make all of this to where it's not a novelty; it's just a thing that's going to be this way for a while. I think audiences have been watching things on their phones for so long that they are either used to it or very forgiving when it comes to sound quality. Also, the chat functions are keeping audiences busy too. They can still pay attention to the set while also engaging with other people. So, it's not like going to the bar, having a conversation, and then losing your spot in the crowd. It made a few bucks, everybody had fun, and it was a no-stress situation.

I did another concert for a group called Shared Air. They are an L.A.-based electronic label. It featured groups from L.A. and Berlin, and a diverse audience, which is great, because it's hard for me to find an audience outside of the States. I was excited that people from Berlin stumbled onto what I do.

There is a group out of L.A. called Distant Discos. It's essentially a Zoom room where an audience can watch a DJ doing a set. My friend Vitigrrl, a Chicago DJ, did a set. It was wonderful because you can pop between the windows and see people dance. They did a program where the first hundred people that signed up got a cheap disco ball light display for their rooms, so potentially, when an audience member is going through all these Zoom rooms looking at people dancing, the rooms all have a similar party effect. It's a great way that they are branding themselves while also giving out a permanent physical connection to the virtual thing they are doing.

An Instagram photo of a photo of a computer streaming a show, shot on a webcam. We sounded better than this looks

Since the beginning of writing this article, I have been able to participate in the magic of performing to a webcam. My musical co-conspirator Ian MacDougall and I performed a 45-minute show on StageIt.tv. It has a pretty great model too. It's a show that is only seen at that moment by people who have gotten tickets. It is not recorded, but the crowd can interact with us. The tickets can be priced at however much the artist chooses, and StageIt.tv takes a small fee. We decided to do a "pay what you want" show, as this was our first time and we had no idea how it was going to go. The audience can also tip the performers with real money, but it is listed as coins and there is a weird conversion rate (I'm not sure why).

Ian is a tech genius as well as the other musician in the duo, and he set up amps and microphones through a mixer and ran them through a computer. I honestly never would have figured any of it out on my own, so it is safe to assume this would be a stumbling block for many bands that don't have an Ian. We tweaked our room sound and then connected with Peter Clarke on the StageIt site. He helped us soundcheck to get the proper levels he needed to make it sound right for streaming.

looking the audience in the eye

It was a little harrowing hoping the internet wouldn't crash (which it did during soundcheck but was fine during the show). The show went well on our end and gave me some of the endorphins that usually come with getting onstage. I actually enjoyed it quite a bit. I went in reminding myself that this was going to be a different feeling and to be comfortable with the uncomfortability of that. The idea that I'm essentially doing a show for HAL from 2001 took a few minutes to snap into place. I still need to impress sentient A.I. and get used to the idea that they aren't going to hoot when I end a song. It's just different. The audience interacts mostly through emojis and little digital coins letting us know we are getting tipped. The silence after saying "how y'all doing tonite". It's just different.

I keep describing the experience as methadone compared to heroin, which I probably shouldn't do because I haven't tried either of those. I should probably find a better description. It was like eating Little Caesars pizza as opposed to some wood-oven pizza in a high-end Italian joint. Sure, it's not as good, but it's still pizza.

Taking a step back, I'm reframing my take on this. I did enjoy my time as a musical cam-boy, and without making a contest out of it, there were some real positives about the experience. Xuny and Andy have my brain spinning on new things I can try and the benefits they see. I probably won't be asked to do a Wave show, but the point about time saved that Xuny brought up also applies to the performer. I like the stress of doing a live show, but I would be lying if I said that setting up my own thing on my own time with a crew of me and one other person wasn't nice. Talking to Andy about the experimental vibe of what he has participated in and seen makes me rethink what I can do. For 25+ years I've mostly been performing in the same way. I'm going to take some chances, try some stuff, and just see what happens. New things feel weird because they are new, and I have to remind myself of that.

I’m officially ready to be the backdrop to a Fortnite melee. I also have a lot of apologies I need to make to the cloud I’ve been swearing at. Please forgive me cloud.

I'm sorry that got so heated.

I miss going to the movies. It's been 163 days since I was in a theater. 163 days… That is the longest I've gone in my entire life without seeing a movie in a theater.

It’s basic but we all know a movie theater is absolutely the best place to see ANY movie.

The big dumb tent-pole flicks with explosions.

The comedies with the cuss words and foreign films with the words I have to read.

The art films that I really like and sometimes pretend to understand even though I didn’t.

And it doesn’t matter what kind of theater it is.

The dainty little 2 screen art house that sells wine.

The mega-plex where you have to put a down payment on a dump truck of popcorn and a water tower of Coke Zero (trying to cut my sugars).

And our beloved Alamo Drafthouse.

I would go any time, any day, any movie, but I really loved going on opening weekends when it’s packed with people like me.

I’m just reminiscing here.

The lights go down in a theater filled with strangers. The projector lights the screen. There's a shift of collective energy you can feel between the trailers and when the main feature begins. It's a collective "alright, let's go, we're ready". We're together as a crowd, and individually we feed off and add to the group with chuckles, gasps, laughs, groans, tears and every other appropriate (and sometimes not appropriate) response. It's more than the sounds of people, though. There is an energy to being in a crowd. On paper we're just watching another movie on a really big TV, but it's not the same, is it? You can buy a really big TV with a really impressive sound system for your home, but it really isn't the same, is it?

There has got to be a poet who can describe the human spirit of gathering better than me but Dylan Thomas sadly passed away before 2 Fast 2 Furious came out.

Many of us here at Fair Worlds come from film backgrounds of every kind. Back when we were in the office together (seems like a different life), it wasn't too hard to get sidetracked talking about what, when, where, and with whom we'd seen a film that moved us in any direction.

Those of us on the Austin team live in a great spot for cinema. Our office is right next to the Alamo Drafthouse office.

We have friends and family over there. It hurts to see them hurt.


It’s bleak overall. AMC has “substantial doubt” it can remain in business. They are the world’s biggest theater chain with over 11,000 screens across 15 countries.

According to one survey, 40 percent of China's cinemas could permanently shut down due to the financial hit they've incurred as a result of COVID-19, Variety reports. This would mean the loss of 5,000 venues and nearly 28,000 screens. China accounted for nearly 22 percent of the global box office last year; such a significant loss of theaters would produce a damaging financial ripple effect throughout Hollywood.

All that is scary for cinephiles and movie geeks, but let's not forget how resilient crowd gathering can be. Theater owners have gotten panicky before, and many an op-ed about the death of cinema has been written over the years. Television, video cassettes, DVD and streaming services have all been written up in their respective times as the nail in the coffin for moviegoers.

Yet people keep coming out and the kids still wanna have fun. Certainly there will always be a communal aspect that is difficult if not impossible to replace.

That being said, there are itches that need to be scratched, and a lot of us feel like we've been rolling in poison oak. And XR has taken a few cracks at what this could look like in the future.

I don’t think there were really that many people in the theater for Genisys

I don't actually have enough friends with VR headsets and matching schedules to pull off a Bigscreen VR hang, although it sounds ideal. It works on the model of a theater, with actual, specific showtimes. There is a lobby that looks like a theater lobby where you can hang out, and I assume you spend too much on virtual Raisinettes. The theaters themselves are customizable. The only drawback I see is the content. It's nothing new, nothing old, nothing weird, nothing classic. Just kind of oddly in the middle. As of this writing the selection feels like a United Airlines flight 3 years ago: Top Gun and old Michael Bay type pictures along with some anime. I suppose those are good choices to watch in VR for visual purposes, but I would guess that 95% of the time people are spending their money in the theater, they are there for something they haven't seen before.

If your friends don't have access to headsets, there are some more lo-fi options.

Every Thursday night a few of my friends have set up a makeshift movie hangout. There are a bunch of ways to do this. Netflix Party works, but Netflix is more of a background channel than something to appointment-watch. Amazon Watch Party won't allow you to swear in the chat. We've tried multiple ways to get this going, and so far this is the best version of how we have set our nights up.

We all chat in Facebook Messenger about what we wanna watch that week (except for the one guy who wisely isn't on Facebook). One guy sends a Zoom link to us all in Messenger, except he emails it to our friend who (again wisely) isn't on Facebook. From there we catch up on Zoom for a bit, and then another fella usually sets up a group stream we all log into on TwoSeven.xyz. I had no idea you could end a web address with "xyz", but I don't know a lot of other things too. TwoSeven is a free, Patreon-funded service that lets you watch videos together, but if the content is on a paid service (Netflix, Hulu, Disney+), every member has to have their own account. That limits some things. There is a video chat function on TwoSeven, but it's not very good. So we have TwoSeven on one monitor, Zoom on the other.

It's kind of unwieldy, and due to the internet and the fragility of both Zoom and TwoSeven, it seems like one or the other is always freezing up. Again, it's better than the other things we have tried. Also, for some reason, the load screen on TwoSeven is ALWAYS the start screen for a music video of Disturbed covering "The Sound of Silence". I mean, isn't life confusing enough right now?

Makes you wonder how good The Graduate could’ve been

Not quite me and the crew hitting the theater and then grabbing a drink after, but it is what it is. Without trying to replace physically being in a place with a group of people, what is the next best option? We are in the infancy of trying to bridge the gap, but now is the time for innovation.

Or reverse innovation.

Drive-ins are also having a resurgence for the first time since the 1980s. They were a juggernaut through the '50s into the '70s, before urban sprawl, VHS, mandated daylight saving time and finally digital projection chipped away at them. What was once antiquated, combined with current technology, may be the perfect compromise. You can get out of the house while staying safe in a hermetically sealed pod (your car). All the ticketing and concessions can be done online. Even the audio signal can run over WiFi instead of an AM channel. Currently we have an excess of event-space parking lots with no events happening, and digital projection and stage rigging have made it really easy to make pop-ups.

Maybe I'm just looking to save money by stuffing myself in the trunk of a Kia, but I would really like to watch Tenet on a giant screen with a crowd. The floorboards of my car are easily as dirty as those of a movie theater, and microwave popcorn doesn't make my stomach hurt nearly enough. Mostly I just want to get out of the house and feel a little sense of the "before times." I can't think of anything I wouldn't go to see right now…

Well….almost anything.


To date, the history of “Smart” or AR Glasses has had two modes — excitement and disappointment.

We have read about the promise of smart glasses for the last decade or so. One would think that these game changers would have… changed the game. So where are we with these devices? Exactly which game are we trying to change?


It's important at the start to delineate between a professional tool and a day-to-day device. We are not talking about smart glasses for industry, as those are here and are practical for a certain subset of industries and jobs. The HoloLens 2 is the current leader in our opinion, and there are more on the way for that niche. But these are multi-thousand-dollar devices that are not intended to be worn like normal glasses. Let's keep this discussion to a hypothetical consumer product, one that is worn from morning to night: Consumer Smart Glasses.


I think we can all agree that, once perfected, consumer smart glasses are a no-brainer. A wearable heads-up display that layers on information as we go through our daily lives would fulfill all our cyberpunk dreams. Visual 3D navigation, real-time translation and training are just the tip of the iceberg of what is possible. We haven't yet dreamed up the use cases that will be the "killer apps" of the future.

But are we ready for them? Hopefully they'll look cool like Tony Stark's spectacles instead of some '90s Oakleys, but either way, let's go.

Tony Stark AR Glasses compared to Oakleys

It's not like we're demanding Neuromancer cyberpunk silver eyeballs right now. This isn't Star Trek transporter technology. It seems like every few months we get an article like this one that lets us feel like we are behind the scenes of a big merger. Yes, indeed, Google acquired smart glasses maker North, but they're not making them and not shipping them. So what's the deal?

Between science fact and fiction, it's hard to tell who first started legitimately probing into the smart glasses game. We can only assume that it had to do with the disappointment of finding out that Arnold's sunglasses were not in fact producing the Terminator-vision seen by the titular killing machine. Either way, the idea has been around for a while, with the most public attempt coming in 2012 with Google Glass.


Google Glass may be forgotten by the majority of the general public, but its spectacular industry failure may still be causing clenched teeth among would-be developers. It cannot be overstated how quickly the words Google and Glass went from evolutionary tech leap to punchline.

In 2012, Time Magazine named Google Glass one of the best inventions of the year. The fashion-forward gadget got its own 12-page spread in Vogue. It was everywhere in the media. People were excited… until they actually tried it. It was plagued with bugs, had no support and awful battery life, and, let's face it: it looked dorky.

How dorky? How about "give me your lunch money" dorky. Multiple reports of people getting assaulted for wearing Google Glass surfaced. The public was not cool with the idea of weird-looking expensive monocles recording their every move. It became late-night monologue fodder and had its moment on SNL à la Fred Armisen's character Randall Meeks.

Joking aside, there were also practical security concerns, some of which seem a bit quaint now. This was right before the ubiquity of everyone recording everything with their phones. It was 2013, and somehow the idea of a crowd pulling out their phones in unison to record anything between life's most mundane and dramatic moments wasn't an accepted norm yet. Hitting record on your glasses vs. holding up your phone to record hardly seems like a real difference in 2020. In 2013, however, movie theaters were worried about bootlegs and banned Google Glass, thus sparing the world a compromised pirated viewing of Divergent. Casinos were worried about Ocean's 11-style heists by nerd versions of George Clooney and banned them as well.

They just made people mad. Some of it was a class issue. In San Francisco the general public saw Google Glass as a $1,500 totem of how Silicon Valley was rapidly pricing them out of their city. They quickly became a punchline, as evidenced by the blog White Men Wearing Google Glass. Initially it was simply a collection of photos mocking the tech monocles. The site still exists in name but has shifted to different subject matter, seeing as any photographer trying to snap a pic of someone wearing Google Glass in the wild would have a harder time than getting a shot of Bigfoot.

Sure, some other attempts at smart glasses have been made. Many. But reviews always seem to include the phrase "awkward first step" or "designed by people who've never worn glasses before". Snapchat Spectacles exist, but having the word "Snapchat" in the title certainly scared off a large demographic. I don't know any consumer who has ever worn a pair despite constantly hearing about these world changers. I'm starting to get the feeling that these are like the hoverboards promised to me when Back To The Future 2 came out in 1989.


We also would be remiss not to mention Magic Leap. They were ahead of the curve in terms of getting an AR headset to consumers. Unfortunately, their original price tag of $2,295 far exceeded consumers' ability and willingness to pay. Earlier this year, Magic Leap laid off a large portion of their workforce and pivoted towards the enterprise market. They cite COVID-19 as the main reason, but it should be mentioned that their low consumer sales definitely played a role.
As we mentioned before, this idea of everyday wear is really critical for adoption of AR headsets. Are you willing to be tethered to a puck that you carry on your belt? How does the device feel to wear for extended periods of time? Are you willing to pay extra for prescription lenses? We are seeing headset makers having to make compromises to pack in a lot of sensors without making the device uncomfortable (or off-balance).


Here is what’s going on now.

At the end of June, Google announced that it had acquired North, the company that made the Focals smart glasses. The Canada-based company was planning on Focals 2.0 being its next release, but since the Google acquisition those are not shipping. Consumers who pre-ordered a pair are getting a full refund. The Focals app is being shut down, and once again the "you can almost touch it" future of smart glasses feels yanked away from us.

What's the plan? It's all speculation for now, but the obvious guess is that a Google Assistant-powered pair of smart glasses is in the works, something that could work with all the existing Android and Google devices. How far away we are from that is an unanswered question.

With equal amounts of quiet to speculate and gossip in, lots of people are looking towards Apple. Of course there have been rumors. The rumors that Apple was going to at least tease some smart glasses this summer were starting to feel really credible. Alas, we didn't get any word about Apple Glasses from WWDC 2020 like we were hoping. There was talk that the lenses would launch this year, but, really, what has gone as planned this year? Rumors now point to 2022 as a launch date, with others even speculating 2023. We do know that Tim Cook is a fan of AR and that Apple acquired Akonia Holographics in 2018, so fingers crossed this isn't wishful thinking.

We can at least make a healthy guess, without getting into tech, that Apple is heavily invested in design. There is no chance they jumped into the smart glasses game without planning on being the stylistic antithesis of the fashion disaster that was Google Glass. As long as they don't look like a giant cheese grater, I feel reasonably assured I won't hate the way they look.


Tech-wise we are in the dark. Lots of patents have been bought and leaked. A new Apple patent shows how the lenses might auto-adjust, saving you from buying prescription lenses. It's called "Pupil Expansion." It generally relates to optical reflective devices, and more specifically to expanding light beams within optical reflective devices.

It's clear that the big boys are all invested. Facebook, Google and Apple all have third parties they are working with on both the tech and the frames. Throw Ray-Ban in the mix and take out a second mortgage.

In order to get anywhere, these companies have their work cut out for them. Our phones are in our pockets half the time (OK, a quarter of the time). Glasses would obviously be visible 100% of the time, and the way they look, for better or worse, says a lot about the person wearing them. Fashion will factor into their success in a much larger way than with any other technology. Look at Beats by Dre and what they did for the headphone industry. Hell, many argue that the sound quality of Beats is lesser, but the fashion aspect has made them worth billions. That's without bringing any significant new technology to the headphone game.

That's not to say that fashion alone is going to declare the "emperor of smart spectacles." Whoever ends up on top is going to have to nail both fashion and function. Headphones are meant to take us away from reality, whereas smart glasses need to accentuate and sharpen our surroundings. If these devices distract or impede vision, they are headed to the tech junk drawer. If they make you look stupid, they aren't even going to get tried on. It's a fine line, and a tough job figuring out how to pull off both. Marketing will be a big part. Which celebrity or athlete is going to be able to actually pull off making these things look cool? My guess is either Wallace Shawn or The Rock.

Time will tell, but hopefully not too much time. We could all use a new gadget to fixate on right now. Fortunately I’ve been told that they finally figured out hover boards and mine should show up any day now.

AirPods

Leading off with what got us the most excited: Apple's new features for the AirPods.

The big news is that the AirPods Pro are getting spatial audio. The new feature will offer 3D, surround-sound-style audio to replicate a movie theater experience, and will constantly recalibrate based on the position of your head relative to whatever device you're using. Your head movement will be tracked, allowing you to "look" at the sound you are hearing. This opens up a whole new ball game of ideas and innovation in developing audio for VR and AR experiences.
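
Apple hasn't detailed the processing, but the core trick of head-tracked spatial audio is simple to sketch: counter-rotate each sound source by the head's orientation so the sound stays pinned to the world instead of to your ears. Here is a toy Python version; the function and the crude sine-based pan are our own illustration, nothing like Apple's actual HRTF pipeline.

```python
import numpy as np

def head_relative_pan(source_dir_world: np.ndarray, head_yaw_rad: float):
    """Rotate a world-space source direction (x, z) into head space,
    then derive a naive left/right pan from its azimuth."""
    c, s = np.cos(-head_yaw_rad), np.sin(-head_yaw_rad)  # undo the head's yaw
    x, z = np.array([[c, -s], [s, c]]) @ source_dir_world
    azimuth = np.arctan2(x, z)   # 0 = straight ahead of the listener
    pan = np.sin(azimuth)        # -1 = hard left, +1 = hard right
    return azimuth, pan

# A source straight ahead; the listener turns their head 90 degrees left,
# so the sound should now appear on their right.
azimuth, pan = head_relative_pan(np.array([0.0, 1.0]), np.radians(90))
print(f"azimuth {np.degrees(azimuth):.0f} deg, pan {pan:+.2f}")  # 90 deg, +1.00
```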

Also announced is a nice time-saving feature: automatic switching. If you're like me, you are in a constant state of pairing and unpairing in order to switch between phone, computer and iPad. The new feature will recognize which device you are actively using and automatically switch over.

Tim Cook kicked off the Apple Worldwide Developers Conference 2020 keynote by addressing racism, inequality and injustice. Apple has created the Racial Equity and Justice Initiative with a commitment of 100 million dollars as well as a developer and entrepreneur program for black developers.

From there we got some sneak peeks at a number of changes coming to every platform. Lots of new features, some bigger than others. Here are the ones that made an impression.

iOS 14

The key innovation seems to be an effort at grouping and organizing apps (something I could indeed use some help with).

The App Library, as they are calling it, sits at the end of your home screen pages. It has a streamlined search feature on top that alphabetizes the apps.

Apps are also grouped into sub-categories based on frequency of use, recent additions and customizable curations.


Widgets

Widgets have always been a little sidelined in iOS but now seem to be getting some fresh attention, and already there is some online side-eyeing from Android users. Widgets are now movable onto any screen and resizable. A Smart Stack widget will let you scroll through all your widgets or set them to appear at different times of day based on their use. (I know I'll be reminding myself to meditate.)

Multiple open media

Picture-in-picture, so you can keep watching media while scrolling through other apps, is now resizable and a blessing for the ADD multitasker.

Translation

Siri translation sounds innovative and helpful for communicating across languages. It's no Babel fish in your ear, but it seems super helpful for conversational communication with someone who speaks a different language.

Texting

Pinning texts in Messages will be helpful for those of us who constantly refer to a text with specific information (in my case, a key code so I'm not locked out).

Group texts are getting some extra little tweaks, like specific replies for those of us who get confused as to who is talking about what from earlier in the thread.

Maps and Travel

Maps is adding new countries and a cycling feature to help you navigate cities by bike. The 2021 BMW 5 Series will be the first car to allow the iPhone to be used as a car key to both start and unlock your car. You can share your key with others by sending it to their phones.

Privacy

Privacy, in terms of where your data goes and who is collecting it, was emphasized. Maps will have a feature giving you the option of sharing an approximate location instead of a specific pin drop. Apps will have to ask if they can track you. Third-party apps will be forced to disclose what data they are collecting on you, and you will be able to see that information.

iPads

iPads are getting a much-needed search as well as some sidebars and the new widgets. The big iPad push seems to be the Scribble function, used in conjunction with the Apple Pencil: handwriting converted to text in every app.

AirPods

The AirPods' big new feature is spatial audio, giving an immersive audio experience. This could be great for VR.

Big Sur and Apple Silicon

Big Sur is the name of the new OS for Mac.

Not too much was unveiled with Big Sur other than many design tweaks and layout changes. Again, widgets are getting love from Apple.

Big Sur was really a smaller reveal next to the fact that it was designed to work with Apple Silicon. Apple Silicon is an in-house chip designed to replace the previously used Intel processors. Developers are already working on converting their apps; Microsoft Office and Adobe Creative Cloud will be ready to work with the rollout of Apple Silicon. According to Cook, the first Mac with Apple Silicon is scheduled to ship by the end of the year, with a transition expected to take two years.

RIP, Oculus Go. We’ll miss you. Sort of.

Some ideas seem so good that in the second that you hear them, you know that they will be "The Next Big Thing." These are ideas you are convinced will change the world.

And then a couple years later… they are no longer available.

We can now add the Oculus Go to that list.

On that same list of great-but-defunct ideas is the Circarama, which preceded the Oculus Go by about 65 years. Both inventions are now relics of the past, and both were bold attempts at new ways to present stories.

Last week, the team at Fair Worlds discussed the Oculus Go’s recent discontinuation and its similarities to the fate of Circarama theaters.

Before we dive into this, let’s start at the beginning.


In response to the shiny, new, three-camera film format of Cinerama, Walt Disney, in the parlance of our times, had a "hold my beer" moment. (Fun fact: Walt actually drank Scotch Mists.)

Adding eight more cameras to Cinerama’s three, Ol’ Walt revamped the format with the premiere of his Circarama in 1955, at Disneyland.

The Circarama camera rig was mounted on top of a car and used footage from 11 16mm cameras. To view the Circarama's first film, A Tour Of The West, audiences stood in the center of a round room and watched as the film was displayed on 11 movie screens wrapped around the perimeter. Viewers had to constantly turn around to figure out where to focus their attention. Some had to hold onto poles in the auditorium to steady their equilibrium and keep from getting nauseous. (Does this sound familiar?)

Eventually, the Circarama became Circle-Vision 360 and continued to show nature and travel films at Disneyland until '97. Outside of the park, however, it had no utility, and all proclamations that it was the next big thing fell flat. Do you have a Circle-Vision 360 theater in your town? Case in point.

The answer to why wider adoption never occurred mirrors some of the adoption failures of the Oculus Go.

Both innovations put the user in the center of a story and surround them with 360 degrees of visuals — and more importantly, filmed visuals. Even the camera systems are almost identical.*

https://variety.com/2019/digital/news/google-jump-shutting-down-1203219306/

In technical terms, both are what we call 3DoF (three degrees of freedom). This means that the user has rotational camera control and can look all around them, but they cannot walk into the image or interact with it. With 6DoF, the user has full control of both rotation and position. This is best demonstrated with the following GIF: 3DoF is above, 6DoF is below.

SOURCE: http://www.outpostvfx.com/blog/2016/12/19/vfx-for-vr-projects-and-why-you-are-not-prepared-for-it-part-1

The Go is a 3DoF experience, and I have always referred to a 3DoF experience as you being an observer. With 6DoF, you get to be a participant. That is the thing with VR - it’s the participatory experience that makes it incredible.
— Ross Safronoff , XR Developer at Fair Worlds
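
In code terms, the jump from 3DoF to 6DoF is just three more tracked values, but they are the three that turn an observer into a participant. A minimal sketch with made-up types of our own (no particular SDK):

```python
from dataclasses import dataclass

@dataclass
class Pose3DoF:
    """Rotation only, which is all the Go tracked: you can look anywhere,
    but leaning or stepping forward changes nothing."""
    yaw: float
    pitch: float
    roll: float

@dataclass
class Pose6DoF(Pose3DoF):
    """Rotation plus position, which headsets like the Quest track:
    walking, ducking and leaning now actually move the camera."""
    x: float = 0.0
    y: float = 0.0
    z: float = 0.0
```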

Despite the happy-go-lucky marketing, Oculus Go users should be seated for safety purposes. Even as a device for just watching VR and traditional films (watching stereoscopic 3D movies is an underrated use case for VR), the Oculus Go has a risk of inducing nausea. In fact, a huge factor in the quality of the experience is what one is sitting on.

Imagine sitting on a couch and trying to watch a 360 video. If that video truly utilizes all 360 degrees, it will be physically uncomfortable to turn all the way around. Even if you are in a swivel chair, which is the best option for 360 videos, you will still have micro head movements forwards and backwards. Eventually, this can cause nausea due to the disconnect between your eyes and inner ear.

The Oculus Go didn’t die because of advancements in immersive technology, it died because of stagnation in chair technology.
— Brad Parrett, Creative Director at Fair Worlds

The issue of needing a specific chair to properly enjoy a 360 movie became apparent when we unveiled a piece that our Creative Director, Erik Horn, produced for the NYTimes a few years ago. He proudly placed a Samsung Gear VR (another R.I.P.) on his mother, who was sitting on a couch. She quickly complained that she could not see what was behind her… and took the headset off.

We learned from that lesson when we created The Monarch Effect in collaboration with the Environmental Defense Fund (EDF). We opted to use the woefully underappreciated VR180 platform from Google. Creatively, this is more akin to the largest IMAX 3D movie you have ever seen, and it has some minor interactive capabilities. It is currently still on the Oculus Go storefront for free, and we hope to bring it to the Quest in 2021.

Though it is being retired to the trophy case of obsolete VR headsets (where there are a few), we have to applaud the Oculus Go. The little grey headset did a great job of setting a precedent for what consumers were going to want and need. It introduced a new audience to the concept of VR and, like a good showman, left them wanting more. While it lacked the technical prowess for more immersive, full-body experiences, it brought a lightweight comfortability and made waves as a wireless headset.

And maybe most importantly of all, it signaled the end of the Google Cardboard era, as it did not require a user to insert their phone into a frustrating "contraption." (Though the idea of the Cardboard was, and is, brilliant, the execution was maddening.)

What’s interesting in the history of the Oculus Go is that it was an R&D project. A lot of women were involved in it, and the straps came from bra straps… to make it feel comfortable. [The Oculus Go] feels more comfortable (to me) than the Oculus Quest when worn for long periods of time.
— Todd Little, Lead UX Design Engineer at Fair Worlds

Moving into the future, you can see the Go’s influence on the Oculus Quest (our current headset of choice) design-wise, and you’ll surely see its positive aspects emulated in future headsets to come. Ultimately, the Oculus Go will have a large place on the technological timeline of immersive storytelling.

So — we started with standing in line at a theme park in order to be amazed (and slightly nauseous) by a futuristic 360 degree film experience, and years later were able to experience a similarly awe-inspiring (and slightly nauseating) experience in our very own homes. Similar concept, same result.

We joke around as a company, but there are going to be bumps on the road to new immersive reality. In a few years, we’ll probably be reveling in the antiquated notion that our headsets were larger than sunglasses and required controllers at all.

In our current, socially distant isolation, we still would not mind going to a theme park at some point again… sigh. Even if it has to be in yet another outdated format, Cinerama.

*We realize that the Oculus Go featured more content than just 360 and 180 degree videos, but for the sake of this comparison, we focused on them as they were the predominant use case. We love(d) experiences like “Virtual Virtual Reality” from Tender Claws and the output of the Google Spotlight Stories, but that is for a different post.