There are two developments today giving us hope for how we may interact with the world around us: SmartThings and Google Glass. SmartThings is a company that, at first glance, provides the hardware to automate your life; their demonstration pitch asks, "Wouldn't it be smart if... your house could secure itself when you leave?" They provide users not only with sensor hardware, such as power meters for home plugs, movement sensors, connection sensors, and proximity sensors, but also a basic API for working with those sensors and easily building applications. Google Glass is Google's answer to the popular demand for augmented reality: a heads-up display with elegant hardware that can fit over your glasses, sit inside your goggles, or stand on its own. This week at the Google Technology Developer Group we explored the possibilities — what apps could be created and what business models were viable — while investors circled like sharks. So what's possible when open-source sensors combine with augmented reality?
On Their Own
As they are, the SmartThings sensors are fun. You're not just getting raw data that you then have to build interpretive algorithms for; the sensors come designed to tell you when certain events happen — the magnetic seal being broken, something being moved, something leaving proximity range, and so on — all linked through a wireless hub you can easily access. Suspect that your babysitter keeps going into your room? Use a seal sensor and know whether the door was opened after you left. What are the UV levels outside today? Use the multi-spectral sensor. Want to conserve energy? Have your thermostat cool only the room you're in.
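The subscribe-to-an-event model described above can be sketched in a few lines. This is only an illustrative toy in Python — real SmartThings apps use the company's own hosted API, and every name here (SensorHub, "seal.broken", and the rest) is hypothetical:

```python
# Toy sketch of the event model, NOT the real SmartThings API:
# sensors publish named events through a hub; apps subscribe to them.

class SensorHub:
    """Minimal publish/subscribe hub for sensor events."""

    def __init__(self):
        self._handlers = {}

    def subscribe(self, event_type, handler):
        """Register a callback for one event type, e.g. 'seal.broken'."""
        self._handlers.setdefault(event_type, []).append(handler)

    def publish(self, event_type, payload):
        """Deliver an event from a sensor to every subscribed handler."""
        for handler in self._handlers.get(event_type, []):
            handler(payload)


# "Suspect your babysitter keeps going into your room?" — subscribe to
# the door's seal sensor and record each time the seal is broken.
hub = SensorHub()
door_openings = []
hub.subscribe("seal.broken", lambda event: door_openings.append(event["sensor"]))

hub.publish("seal.broken", {"sensor": "bedroom-door"})
```

The point of the platform is that the hub and the event vocabulary already exist, so an app author only writes the handler at the bottom.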
Google Glass uses 'cards' to present information to the user in a notification-like window in the upper right-hand corner. It's not meant to replace a smartphone; it's designed to present information in real time, so the cards are ordered by time. You scroll through the cards by running a finger along the frame's temple, tap to select and enter the next level within that card, and so on. The cards are static, so the most movement you'll get is from an animated GIF, and the display is normally off to preserve battery life. Of all the apps discussed, the one that garnered smiles and laughter was an augmented-reality game in which people were given dances or tasks to perform in some area of a city, with Glass providing the notification and directions when you entered that part of the city: imagine people dancing on one block, pretending to be Mario on another, and running an 'invisible' obstacle course on a third, all orchestrated by Glass.
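A card in this model is just a small, static, time-stamped payload, optionally pinging the (normally off) display. The sketch below shows only the rough shape of such a payload; it is an assumption-laden illustration, not Google's actual Mirror API (which additionally requires OAuth and a POST to a timeline endpoint):

```python
import json


def make_card(text, notify=True):
    """Build a minimal, hypothetical timeline-card payload.

    Cards are static: just text (an animated GIF is as lively as it
    gets). A notification wakes the display so the wearer sees the card.
    """
    card = {"text": text}
    if notify:
        # Illustrative field names only — not the real API schema.
        card["notification"] = {"level": "DEFAULT"}
    return card


# The city-game example: a card delivered when the player enters a zone.
card = make_card("You've entered the dance zone -- start dancing!")
payload = json.dumps(card)  # what would be sent to the device
```

Because cards are time-ordered and static, the whole "app" reduces to deciding when to send which card — the game above is nothing more than geofence-triggered card delivery.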
Curation is hard; we want to explore as we go, to discover as questions arise, to visualize and navigate the surrounding world's information on demand. Smart sensors act as a tour guide offering up the world's meta information, while our five senses handle the raw physical information. Glass layers that meta information on top of the raw, letting us transition more seamlessly between the two points of view. So enter a room, survey the landscape, find the bar, then scroll through everyone's picture and bio (assuming they're playing as well) so conversations begin with something interesting about the other person instead of "Hello, what do you do?" It's almost like time travel: fast-forward to the good stuff. Walk down the street with a real-estate app and get all the information about the buildings with rental or condo space — vacancies, crime statistics, customer complaints, rooftop pools — then turn your head and the same information is presented about the building on that side of the street. Place thermal sensors inside and outside your home to visualize areas of high thermal diffusion (i.e., where you need new insulation). Place power meters throughout the house and visualize how each room spins your meter. Use moisture sensors to find the best room for your humidor and world-class cigar collection. You don't have to switch between online and offline; the meta-data is presented to you according to your context.
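The "turn your head and get that building's card" idea boils down to a lookup keyed on context. Here is a minimal sketch under made-up data — every building, field, and function name is invented for illustration:

```python
# Hypothetical context-to-overlay lookup for the real-estate example:
# given which way the wearer is facing, pick the matching meta-data card.

BUILDINGS = {
    "north": {"address": "12 Main St", "vacancies": 3, "rooftop_pool": True},
    "south": {"address": "15 Main St", "vacancies": 0, "rooftop_pool": False},
}


def overlay_for(heading):
    """Return the overlay line for the building the wearer faces, if any."""
    info = BUILDINGS.get(heading)
    if info is None:
        return None  # nothing catalogued this way: raw senses only
    return f"{info['address']}: {info['vacancies']} vacancies"


line = overlay_for("north")
```

The interesting engineering is everything hidden behind that dictionary — fusing GPS, compass heading, and shared sensor feeds into the right key — but the user-facing contract stays this simple: context in, meta-data out.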
The New Marketplace
The goal is ultimately to create new cloud-based platforms where innovation runs rampant. The Google Glass marketplace does not yet allow advertising or paid apps, leaving monetization to specialized applications built for paying customers. This will change eventually; we are only in the alpha phase, and Google is known for early releases that test the market. SmartThings already has a number of third parties developing on its platform, but development is currently limited to the sensors you own, not anyone else's. Imagine smart sensors that you could share like a Google Document: keep your thermal information private, or share it with whomever you choose. With access to communities of sensors you have a very interesting big-data challenge, and an even better visualization challenge. How will developers shape your meta-view of the world? Will they sharpen our vision? There will always be the raw information for the observant eye.