spatial interfaces need to allow us to point
It’s important for the “virtual” nature of the interface to fall away, and for the group members to begin operating with the standard comforts of sharing a physical space. And I think that idea is embodied by the physical act of pointing to show something virtual to someone else who’s next to you.
There is also the element of communication delay, but that shouldn’t be an issue in the With others scenario.
References
What is the importance of shared perspective and experience design among local users? For example, one can imagine the simple act of pointing while communicating: I believe our digital interfaces have to give us enough shared physicality that we can point to communicate.
This is followed by the issue of object permanence in multi-user experiences. I’m glad, but not surprised, to see Apple thinking carefully about the perspectives of multiple users in remote experiences, both in terms of audio and video. If I’m not looking at you, you should be able to see that. This also applies to in-person multi-user experiences: if multiple users are wearing these headsets in the same room, there needs to be a shared perspective (and graceful ways to manage unshared experiences). In other words, pointing at something needs to work. I expect them to take this thinking into the greater world (see ARGeoAnchors in ARKit).
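As a rough sketch of what that could look like in code today (my own illustration rather than anything from Apple; ARGeoAnchor and ARGeoTrackingConfiguration are real ARKit API, but the coordinate and overall structure here are placeholders), an ARGeoAnchor pins an anchor to a latitude/longitude, so two devices resolving the same coordinate should agree on where the shared thing physically sits:

```swift
import ARKit
import CoreLocation

/// A minimal sketch (not Apple sample code) of anchoring virtual content to a
/// real-world coordinate with ARGeoAnchor. Two devices that each resolve the
/// same latitude/longitude should end up agreeing on where that content sits,
/// which is the precondition for pointing at it.
final class SharedGeoAnchorSketch: NSObject, ARSessionDelegate {
    let session = ARSession()

    func start() {
        // Geotracking only works on supported devices and in supported regions,
        // so availability has to be checked first.
        ARGeoTrackingConfiguration.checkAvailability { available, error in
            guard available else {
                print("Geotracking unavailable:", error?.localizedDescription ?? "unknown reason")
                return
            }
            DispatchQueue.main.async {
                self.session.delegate = self
                self.session.run(ARGeoTrackingConfiguration())

                // Placeholder coordinate; both users would add an anchor for
                // the same latitude/longitude, shared out-of-band.
                let coordinate = CLLocationCoordinate2D(latitude: 37.7956, longitude: -122.3935)
                self.session.add(anchor: ARGeoAnchor(coordinate: coordinate))
            }
        }
    }

    // Once geotracking localizes, the anchor's transform is expressed in this
    // session's world space; rendering something there is up to RealityKit/SceneKit.
    func session(_ session: ARSession, didAdd anchors: [ARAnchor]) {
        for anchor in anchors where anchor is ARGeoAnchor {
            print("Geo anchor resolved at transform:", anchor.transform)
        }
    }
}
```

The point, for pointing, is that each session resolves the anchor independently against the same physical spot, which is exactly the shared perspective the in-person case needs.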
Even if we get perfect, display-free passthrough, how much confidence will we need that we’re sharing the same experience to feel comfortable? If my virtual screen is not exactly where your virtual screen is, will that feel uncomfortable? This relates back to the idea that spatial interfaces need to allow us to point.