Thinking different about AR


Gene Becker starts tearing into the current crop of AR Hypeware with a litany of design faults such as

  • Inaccuracy of position, direction, elevation
  • Line of sight
  • Lat/long is not how we experience the world
  • Simplistic, non-standard data formats
  • Public gesture & social ambiguity
  • Ergonomics
  • Small screen visual clutter

These are going to plague AR apps, though many of the issues are relevant to any mobile application. There are additional limits not mentioned, such as the difference between offline storage (theoretically unlimited, yet static) and online storage (even larger, and dynamic to the point of being interactive, but subject to signal drop-out and bandwidth issues which limit its utility).

I’ve thought about some of these issues – Line of Sight being one specific example – and my solution was to use OpenStreetMap vector data as an opaque underlay (as opposed to the AR overlay) on the AR camera view, in an attempt to obscure items which should not be visible along the line of sight. It’s a hack, it’s a kludge, it’s yet another layer to manage, but it might just work.
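The core of that occlusion idea can be sketched as a simple 2D sightline test against building footprints. This is a minimal sketch in Python, assuming footprints have already been pulled from OpenStreetMap vector data and projected onto a flat x/y plane; the function names and data shapes are illustrative, not from any real AR toolkit:

```python
# Before drawing an AR marker, test whether any building footprint
# blocks the straight line between the camera and the point of
# interest. Coordinates are flat x/y for simplicity; a real app would
# project lat/long first. All names here are illustrative.

def _segments_intersect(p1, p2, p3, p4):
    """True if segment p1-p2 properly crosses segment p3-p4.

    (Collinear touching is ignored, which is fine for a sketch.)
    """
    def cross(o, a, b):
        return (a[0] - o[0]) * (b[1] - o[1]) - (a[1] - o[1]) * (b[0] - o[0])
    d1 = cross(p3, p4, p1)
    d2 = cross(p3, p4, p2)
    d3 = cross(p1, p2, p3)
    d4 = cross(p1, p2, p4)
    return ((d1 > 0) != (d2 > 0)) and ((d3 > 0) != (d4 > 0))

def is_occluded(camera, poi, footprints):
    """Return True if any polygon edge blocks the camera-to-POI sightline."""
    for poly in footprints:
        for i in range(len(poly)):
            edge = (poly[i], poly[(i + 1) % len(poly)])
            if _segments_intersect(camera, poi, *edge):
                return True
    return False

# A unit-square building sits between the camera and a distant POI:
building = [(1, -1), (2, -1), (2, 1), (1, 1)]
print(is_occluded((0, 0), (3, 0), [building]))  # True: building blocks the view
print(is_occluded((0, 0), (0, 3), [building]))  # False: clear line of sight
```

A POI that fails the test would be drawn dimmed or hidden rather than floating on top of the building in front of it – which is exactly the effect the opaque underlay is after.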

The Ergonomics and Gesture/Social Ambiguity arguments are going to depend on where the AR is used, but suffice to say there’s a lot of work going into mobile apps using AR ‘vision’ at the moment and not enough into ‘AR audio’ and ‘AR haptics’. These could enable much better AR experiences which don’t require holding your phone at an awkward angle. They also, coincidentally, solve some of the issues with small-screen clutter – something that has already been tackled in iPhone apps (where we can see examples of excellence as well as the polar opposite) and, ironically, in another Apple device which doesn’t have a screen.

The data formats issue will become moot at some point, as we see standards arising from the platforms gaining prominence not only in the AR space but in all location-aware applications, and from the open data formats being pursued by forward-thinking governments.

The problem is that it’s an exciting time. It’s not dissimilar to the early web, but at least there everyone was on a single protocol base: we all used IP, we all used HTTP and HTML.

Where do we go from here with AR? Where are the published data sources which any AR browser can hook into? At the moment we have a proliferation of different platforms, each company wanting plugins for its platform developed for every service. This is the wrong way to go.
