We have already witnessed a major transition in what we think of as computers as they moved from machines that filled rooms to ones that sat on our desktops, then our laps, and then in our hands. In each case we have gained versatility and convenience. Computing has become available in formats that are accessible at times and in situations we could not previously have imagined.
This “transmogrification” of computing is set to continue with the move to wearable computing. The best example of this, and the most likely to become a commercial reality, is Google Glass, Google’s project to combine glasses with a camera and a heads-up display (HUD), allowing the wearer to capture video and augment it with the onboard computer.
It is not hard to see the many ways in which this can enhance our ability to interact with what is happening around us. The camera becomes an extension of our eyes and forms a more natural and transparent way to capture images and video. Handheld cameras and smartphones make the act of capturing a shot obvious and so are likely to make people behave differently. Google’s glasses are largely invisible and are less likely to interfere with a true recording of the moment.
The true power of Google Glass lies in its augmentation features: it can interpret video and present relevant information to the wearer in real time. This is best illustrated in Google’s concept video “One Day”, which shows a wearer using the glasses for directions, general information and communication.
Although the current design of these glasses may not be the final form, they highlight that this technology is not far from being generally available. Indeed, researchers have shown that they can adapt display technology onto contact lenses, although the question of where to carry the actual computing element remains.
Of course, the Google Glass project raises as many issues as it does possibilities. There are technical ones in terms of how to control and power the device. There are also social and personal issues around privacy, as the glasses could be used to capture information without people knowing. And there is the more prosaic issue of looking like an “uber geek” when wearing them.
From wearable computing to cyborgs
Other examples of wearable computing include phone watches and smart sensors that measure activity and our body’s physiological state. Body sensors in particular are already influencing behaviour by making us aware of inputs and outputs: we can measure how many calories we have consumed and burned, for example. But this is just the start. Heart rate monitors can also tell us how stressed someone is and provide feedback to help them avoid or control the situation. These sensors can be built into fabric and communicate continuously with computing devices such as a phone, watch or even glasses.
Ultimately, however, we will see a parallel development alongside wearable computing: computers and sensors integrated directly into our bodies. Again, this is not entirely new. Implanted glucose sensors, insulin pumps, pacemakers and other devices have been inserted into the body to correct physical abnormalities for some time. Computers and circuitry to augment vision and movement in blindness and spinal injury are also being trialled. The major shift will come when non-clinical sensors and devices that interact directly with the body become commonplace.
This may come first in the form of epidermal electronics: circuitry tattooed directly onto the skin. The benefit is that this is relatively non-invasive and can be modified or removed easily. It may also be that implanted chips become simply a convenient way of carrying devices, which could then be powered as a byproduct of our normal physiological processes.
Wearable computing will come to the fore over the next five to ten years. This is almost inevitable, as we are reaching the limits of development of current mobile computing devices. We will, unfortunately, have to wait a while longer for our cyborg future.
David Glance is a director at the Centre for Software Practice at The University of Western Australia.