Our mobile devices are packed with sensors, and every year manufacturers add something new to enrich our lives. The challenge for developers is to find new and interesting ways to take advantage of these sensors.
Currently, your mobile device comes with:
- The proximity sensor makes sure you don’t accidentally hang up on the caller when taking a call, and placing your phone face down puts it into ‘do not disturb’ mode for meetings.
- Add a headset and you get virtual reality; add the camera and you get augmented reality.
- The gyroscope places your Pokémon in the environment, while the accelerometer tells you not to play while driving.
- The barometer senses a drop in pressure and lets your weather screensaver know to change from sun to rain.
- The digital compass tells your map app which way you’re facing.
- And now biometrics allow you to unlock your phone or log into your apps without needing a PIN or password.
As you can see, we have come a long way since the greatest thing on our phones was the Nokia Snake game. Each of these sensors delivers data to the application without the user having to think about it. You don’t have to remember to lock your phone as you take a call so your face doesn’t accidentally hang up on the caller, and you don’t have to tell Google Maps you’re facing north in order for it to direct you.
These sensors help create seamless experiences. The latest sensor added to our devices is an infrared camera.
Apple’s TrueDepth infrared system projects a grid of 30,000 invisible light dots onto the user’s face. An infrared camera then captures the distortion of that grid as the user rotates their head, mapping the face’s 3D shape.
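The geometry behind this kind of structured-light sensing can be sketched in a few lines: a projected dot appears shifted sideways (its disparity) in the infrared camera’s image, and that shift is inversely proportional to the depth of the surface it landed on. The focal length and projector-to-camera baseline below are illustrative stand-ins, not TrueDepth’s actual calibration.

```python
# Triangulating depth from the shift of a projected infrared dot.
# These calibration numbers are hypothetical, for illustration only.
FOCAL_PX = 600.0      # camera focal length, in pixels
BASELINE_M = 0.02     # distance between projector and camera, in metres

def depth_from_disparity(disparity_px):
    """Triangulated depth (metres) for an observed dot shift in pixels."""
    return FOCAL_PX * BASELINE_M / disparity_px

# A dot shifted 24 px corresponds to a surface 0.5 m away;
# nearer surfaces produce larger shifts.
print(depth_from_disparity(24.0))  # → 0.5
```

Repeating this for all 30,000 dots is what turns a flat infrared image into a 3D map of the face.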
Without depth perception, your device could be deceived by a photo or video. This new level of security allows Facebook to verify your identity with a selfie and Amazon to authorise purchases with one. KFC is not only letting you pay with your face, but also trying to predict your order based on previous orders, age, mood, and gender.
This current trend utilises the front-facing camera. I believe next-generation devices will focus on adding sensors to the rear camera.
In future, our mobile devices will have both motion-tracking and depth-perception cameras. GPS isn’t accurate enough to determine whether a person has crossed a room. These additional sensors will allow developers to create applications that can:
- Measure physical space.
- Anchor 3D models to real space so you can walk around them.
- Create characters that can interact with the environment: write on the wall, stand on the table.
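The first capability above, measuring physical space, follows directly from having per-pixel depth: a pixel plus its depth reading can be unprojected into a 3D point, and the distance between two such points is a real-world measurement. A minimal sketch, using hypothetical pinhole-camera intrinsics rather than any real device’s calibration:

```python
import math

# Hypothetical camera intrinsics (focal lengths and principal point,
# in pixels); a real app would read these from the device's calibration.
FX, FY = 600.0, 600.0
CX, CY = 320.0, 240.0

def unproject(u, v, depth_m):
    """Turn a pixel (u, v) plus its depth reading (metres) into a
    3D point in the camera's coordinate frame."""
    x = (u - CX) * depth_m / FX
    y = (v - CY) * depth_m / FY
    return (x, y, depth_m)

def distance(p, q):
    """Straight-line distance between two 3D points."""
    return math.sqrt(sum((a - b) ** 2 for a, b in zip(p, q)))

# Two pixels the user tapped on an object's edges, both read at 2 m depth:
left = unproject(220, 240, 2.0)
right = unproject(420, 240, 2.0)
print(round(distance(left, right), 2))  # object width in metres → 0.67
```

Frameworks such as ARKit and ARCore do this (and the tracking needed to anchor 3D models) for you, but the underlying arithmetic is no more than this.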
With this much environmental data, you could even create an application to help the blind see. Microsoft is currently experimenting with Seeing AI, which can read documents and recognise people and objects.
When depth and motion cameras become commonplace in our mobile devices, an application could tell you when you are about to walk into an object, what that object is, and how far away it is. The possible applications for next-generation devices are very exciting.
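The collision-warning part of such an app could be as simple as watching the nearest depth reading straight ahead of the walker. A minimal sketch, assuming the depth camera delivers a 2D grid of distances in metres (the sample grid and threshold here are made up for illustration):

```python
# Warn when the nearest surface in the centre of the frame is too close.
WARN_DISTANCE_M = 1.5  # hypothetical warning threshold

def nearest_ahead(depth_grid):
    """Nearest depth reading in the central third of the frame,
    i.e. roughly straight ahead of the walker."""
    rows, cols = len(depth_grid), len(depth_grid[0])
    centre = [row[cols // 3 : 2 * cols // 3]
              for row in depth_grid[rows // 3 : 2 * rows // 3]]
    return min(min(row) for row in centre)

def check(depth_grid):
    d = nearest_ahead(depth_grid)
    if d < WARN_DISTANCE_M:
        return f"Obstacle {d:.1f} m ahead"
    return "Path clear"

sample = [
    [3.0, 3.0, 3.0],
    [3.0, 1.2, 3.0],   # something 1.2 m away, dead centre
    [3.0, 3.0, 3.0],
]
print(check(sample))  # → Obstacle 1.2 m ahead
```

Identifying *what* the object is would then fall to a recognition model like the one behind Seeing AI, fed the same camera frames.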
Never miss another event or update from Adelphi Digital: subscribe here.
Are you ready for the next sensory wave?
The Melbourne office is offering complimentary 90-minute idea workshops in January/February 2018.