I have been following the Web 3.0/Internet of Things debate at the European Union (and indeed am a part of it through my various activities), and hence this is a fascinating post from Tim O'Reilly, which talks of sensor-based interaction and Cloud integration:
And that of course is the future of mobile as well. A mobile phone is inherently a connected device with local memory and processing. But it’s time we realized that the local compute power is a fraction of what’s available in the cloud. Web applications take this for granted — for example, when we request a map tile for our phone — but it’s surprising how many native applications settle themselves comfortably in their silos. (Consider my long-ago complaint that the phone address book cries out to be a connected application powered by my phone company’s call-history database, annotated by data harvested from my online social networking applications as well as other online sources.)
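The "connected address book" idea above can be sketched in a few lines: merge the carrier's call-history records with profile data harvested from an online social network, so the phone book becomes a live, annotated view rather than a static silo. All data, field names, and functions here are invented for illustration; a real implementation would call the carrier's and the social network's actual APIs.

```python
# Hypothetical data a phone company might expose: who you called, how often.
call_history = [
    {"number": "+15551234", "calls": 42},
    {"number": "+15559876", "calls": 3},
]

# Hypothetical profile data harvested from an online social network,
# keyed by phone number.
social_profiles = {
    "+15551234": {"name": "Alice", "status": "At the conference"},
}

def build_address_book(history, profiles):
    """Annotate each call-history entry with any matching online profile."""
    book = []
    for entry in history:
        profile = profiles.get(entry["number"], {})
        book.append({
            "number": entry["number"],
            "calls": entry["calls"],
            "name": profile.get("name"),    # None if no profile was found
            "status": profile.get("status"),
        })
    # Surface the contacts you actually call most often.
    return sorted(book, key=lambda e: e["calls"], reverse=True)

address_book = build_address_book(call_history, social_profiles)
```

The point of the sketch is the data flow, not the code: the address book is assembled on demand from two cloud-side sources, so it stays current without the user ever editing it.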
Put these two trends together (sensor-based interaction and cloud integration), and we can imagine the future of mobile: a sensor-rich device with applications that use those sensors both to feed and interact with cloud services. The location sensor knows you’re here so you don’t need to tell the map server where to start; the microphone knows the sound of your voice, so it unlocks your private data in the cloud; the camera images an object or a person, sends it to a remote application that recognizes it, and retrieves relevant data. All of these things already exist in scattered applications, but eventually, they will be the new normal. This is an incredibly exciting time in mobile application design. There are breakthroughs waiting to happen. Voice and gesture recognition in the Google Mobile App is just the beginning.
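The first example in the passage — the location sensor feeding the map server so the user never types a starting point — can be sketched as a tiny sensor-to-cloud loop. The sensor reading and the "cloud" service here are simulated stand-ins, and the tile URL is a made-up example; a real app would read the device GPS and issue an HTTP request to an actual map API.

```python
def read_location_sensor():
    """Simulate the phone's GPS sensor (fixed coordinates for the demo)."""
    return {"lat": 51.5074, "lon": -0.1278}

def cloud_map_service(location):
    """Stand-in for a remote map API: returns a tile centred on the device."""
    lat, lon = location["lat"], location["lon"]
    return {
        "center": (lat, lon),
        "zoom": 12,
        # Hypothetical tile endpoint, purely illustrative.
        "tile_url": f"https://maps.example.com/tile?lat={lat}&lon={lon}&z=12",
    }

def open_map():
    """The app never asks where to start: the sensor already knows."""
    return cloud_map_service(read_location_sensor())

tile = open_map()
```

The same shape — read a local sensor, hand the reading to a cloud service, render what comes back — applies equally to the voice and camera examples in the paragraph above.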