When most of us think of augmented reality, thanks to Google, we think of dorky glasses with an ugly camera mounted on the frame. But what about the other senses? What if all of them were augmented and all those augmentations worked in concert?
That broader view of AR might be closer to what tech companies will be unveiling in the next couple of years. Humans take in enormous amounts of sensory input every second. We make judgements about threats, reactions, choices and pleasures from subtle and dramatic changes in the patina of inputs we're swaddled in.
Some decisions are made on minuscule cues -- a wink, a flicker at the periphery of vision, a rough burr on a flat surface, a chirp.
So, for AR to be effective, its outputs can sometimes be incredibly low-bandwidth. When I'm running with my Apple Watch, for example, I'm tuned to a subtle tap pattern on my wrist that signals each kilometre mark. For the first month I rarely noticed it; now it's simply part of the information those touches convey to me -- information that augments my knowledge of the exact distance I've travelled.
A Kickstarter project called the Vue AR glasses uses the same idea, except with vision. The Vue doesn't project images onto your retina or record video. Instead, it simply blinks a small LED at the edge of your vision to let you know you have a text or a message. Different colours can indicate who the missive is from. The glasses also let you receive and transmit audio via bone conduction.
Bone conduction is a very old technology. It bypasses the ear canal and sends sound through the bones of the skull, leaving your ears wide open to catch real-world signals. And because nothing sits in your ears, the bone-conducted information is private and unobtrusive. Again, audio cues from the glasses need not be complex: a signal from the left temple may mean a left turn towards a destination, for example. And because the bone-conducted sounds are in stereo, directional cues can be very carefully tuned across the acoustic landscape.
And directional cues don't need to be limited to the skull. The India-based company Lechal is already selling insoles that can convey information to your feet using another old technology: haptics. Haptic feedback is the result of carefully controlled vibrations that can simulate a variety of cues and surfaces. The old "rumble packs" on console game controllers used off-balance motors to send crude signals to players. Haptics today use a mix of variations in force and frequency to fool fingers into thinking they've pressed a touchpad, hit the end of a list or scrolled through a notched contact list. And neuroscientists are using even more delicate haptics to give amputees a sophisticated sense of touch.
Even smell, heat sensing and wind intensity can enhance AR and VR experiences. The FeelReal gaming mask, for example, can convey heat, wind, humidity and smell to gamers. Again, this is based on old tech, as anyone who remembers Smell-O-Vision will attest.
None of this rules out near-future AR also using high-bandwidth visual cues, like text, stills or video projected onto our retinas from unobtrusive eyewear. But that visual feedback will certainly be coupled with other bodily inputs that help us enhance our understanding of the objects around us and the way we move through the world. And that augmentation will be private. No one need know when our wrists or insoles tell us the distance we've travelled or where we need to turn. The audio cues or voice in our head can be for us alone. The information cards that float in front of storefronts or historic sites can provide knowledge or reminders only we can see.
Like our actual senses, AR that amplifies those inputs will be a multi-input enhancement of how we already make sense of our environment, from the tops of our heads right down to the soles of our feet.
Wayne MacPhail has been a print and online journalist for 25 years, and is a long-time writer for rabble.ca on technology and the Internet.
Photo: Benjamin Linh VU/flickr