Next is the issue of object permanence in multi-user experiences. I’m glad, but not surprised, to see Apple thinking carefully about the perspectives of multiple users in remote experiences, both in terms of audio and video: if I’m not looking at you, you should be able to see that. The same applies to in-person multi-user experiences: if multiple people are wearing these headsets in the same room, there needs to be a shared perspective (and graceful ways to manage unshared experiences). In other words, pointing at something needs to work. I expect them to take this thinking into the wider world (see ARGeoAnchors in ARKit).
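
ARKit already hints at the shape of this: an ARGeoAnchor is resolved against a global map rather than a per-device one, so two people standing in the same place agree on where an anchor sits. Here’s a minimal Swift sketch, assuming a session already running ARGeoTrackingConfiguration (the coordinate and the pinSharedMarker helper are placeholders of mine, not anything Apple has shown for the headset):

```swift
import ARKit
import CoreLocation

// Sketch only: pin content to a real-world coordinate so every participant
// resolves the same anchor. The coordinate (Apple Park) is a placeholder.
func pinSharedMarker(in session: ARSession) {
    // Geo tracking is only available on supported devices and in supported regions.
    guard ARGeoTrackingConfiguration.isSupported else { return }

    let coordinate = CLLocationCoordinate2D(latitude: 37.3349, longitude: -122.0090)

    // An ARGeoAnchor is localized against Apple's global data rather than a
    // per-device world map, so "pointing at something" lands in the same spot
    // for everyone standing there.
    let anchor = ARGeoAnchor(coordinate: coordinate)
    session.add(anchor: anchor)
}
```

The in-room case is presumably built on the existing collaborative-session machinery (ARWorldTrackingConfiguration’s isCollaborationEnabled), which already lets nearby devices merge their understanding of a space.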