The Urban Design of XR
So much of the existing conversation around XR is focused on personal interfaces and experiences, with the occasional acknowledgement of the need to support local and remote multi-user experiences. We in the XR community need to take some time to focus on how these immersive and spatial technologies will impact the way we map, build, and experience the world outside our own homes.
Google, Apple, ESRI (ArcGIS), and Unity are building tools that allow developers of these spatial interfaces to interact with the geography of the real world. Just as Apple’s ARKit became the basis for the Vision Pro’s functionality, it is only a matter of time before head-mounted devices can track the outside world, allowing us to seamlessly experience digital objects placed in our physical surroundings. Pokémon Go hinted at this future, and as these technologies improve, they will unlock a new tier of possibilities. There will be an opportunity to “build a new world” made up of digital objects, and I hope to provoke some thought about how that world should be designed.
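To make this concrete, here is a minimal sketch of what anchoring digital content to real-world geography can look like today, using ARKit’s geo-tracking API on iOS. The class name, coordinate, and altitude below are illustrative assumptions for the sake of the example, not a description of any particular product.

```swift
import ARKit
import CoreLocation

// A minimal sketch of anchoring digital content to a real-world location with
// ARKit geo tracking. The coordinate and altitude are illustrative assumptions.
final class GeoAnchorPlacer: NSObject, ARSessionDelegate {
    let session = ARSession()

    func start() {
        // Geo tracking is only supported on certain devices and in certain regions.
        ARGeoTrackingConfiguration.checkAvailability { available, error in
            guard available else {
                print("Geo tracking unavailable here: \(String(describing: error))")
                return
            }
            self.session.delegate = self
            self.session.run(ARGeoTrackingConfiguration())

            // Place an anchor at a latitude/longitude; the altitude is in meters
            // relative to the WGS-84 ellipsoid, so “floating above the street”
            // means tuning this value for the specific site.
            let coordinate = CLLocationCoordinate2D(latitude: 40.758, longitude: -73.9855)
            let anchor = ARGeoAnchor(coordinate: coordinate, altitude: 60)
            self.session.add(anchor: anchor)
        }
    }

    func session(_ session: ARSession, didAdd anchors: [ARAnchor]) {
        // Once ARKit localizes against its map, each geo anchor gets a transform
        // that a renderer (RealityKit, SceneKit, etc.) can use to position content.
        for anchor in anchors where anchor is ARGeoAnchor {
            print("Geo anchor resolved at transform: \(anchor.transform)")
        }
    }
}
```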
What new types of art, experiences, and stories can we create that were not possible before? Digital creations are far less costly to produce than physical ones; what does that unlock? New types of public art, sure, but what about new types of shows? What if there were a show you could experience on a street corner, with a new episode every week?
What is the importance of shared perspective and experience design among local users? Consider the simple act of pointing while communicating: I believe our digital interfaces have to preserve enough shared physicality that we can still point at something and be understood.
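As one hedged illustration of what “enough shared physicality” might require technically: ARKit’s collaborative sessions let nearby iOS devices converge on a shared coordinate space, so an anchor placed where one person points appears in the same physical spot for everyone. The sketch below assumes you supply your own peer-to-peer transport (MultipeerConnectivity is a common choice); the class name and the sendToPeers callback are hypothetical.

```swift
import ARKit

// A minimal sketch of a shared local coordinate space using ARKit's
// collaborative sessions. Peers that exchange the opaque collaboration data
// below end up agreeing on where anchors sit in the physical room.
final class SharedSpaceSession: NSObject, ARSessionDelegate {
    let session = ARSession()
    var sendToPeers: ((Data) -> Void)?   // supplied by your networking layer (assumption)

    func start() {
        let configuration = ARWorldTrackingConfiguration()
        configuration.isCollaborationEnabled = true
        session.delegate = self
        session.run(configuration)
    }

    // ARKit periodically emits data that must be relayed to nearby devices.
    func session(_ session: ARSession, didOutputCollaborationData data: ARSession.CollaborationData) {
        if let encoded = try? NSKeyedArchiver.archivedData(withRootObject: data, requiringSecureCoding: true) {
            sendToPeers?(encoded)
        }
    }

    // Data received from a peer is handed back to ARKit to merge the sessions.
    func receive(_ encoded: Data) {
        if let data = try? NSKeyedUnarchiver.unarchivedObject(ofClass: ARSession.CollaborationData.self, from: encoded) {
            session.update(with: data)
        }
    }
}
```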
I am currently working on an iOS app that explores some of these possibilities: placing digital public art in places never before possible with physical art (like floating above Times Square), with the knowledge that these iOS experiences will soon be available on the Vision Pro. I am also developing an app for the Vision Pro that prototypes some of these new types of experiences.
I would also be happy to focus more on the technicalities of the experiences that are possible now on the Apple Vision Pro, and what the Vision Pro’s current capabilities say about the future of the product and of the space. Since I own a Vision Pro for development and have been developing on it consistently since its launch, I have a lot of context for that conversation as well.