Amazon's brightly colored models of its Echo Dot speaker are designed for children. (Photo: IC)
The weather in London has been unpredictable recently, but when I visited my mother at the weekend, she had her own way of finding out what it would be like.
"Alexa!" she barked at the inanimate black mass in the corner. "What is the weather like this week?"
Alexa dutifully replied, instantly and clearly. The whole exchange took a few seconds and made it plain that my middle-aged parents have a better grasp of AI technology than I do.
Smart speakers are all the rage these days, with devices such as Amazon Echo and Google Home proving to be a huge hit in family homes. These speakers can be trained to recognize your voice and offer you intelligent suggestions, from the weather and news to your choice of music.
The market for such products is competitive, relying on low-cost, high-volume sales, and in China competitors including Xiaomi and Alibaba are rising to take on the Western technology giants.
The devices can hear and understand human speech impressively well, but what if they could see as well?
Researchers are currently working on smart speakers that can "see" using the latest developments in LIDAR, a laser-based sensing technology that works on a similar principle to radar.
But just how does this impressive technology work?
In May this year, a technology called SurfaceSight was unveiled at a technology conference in Glasgow by researchers from Carnegie Mellon University in Pittsburgh. LIDAR, as previously mentioned, works in a similar way to radar, but with pulses of laser light instead of radio waves: it bounces beams off surrounding physical objects and measures the tiny differences in how long they take to return.
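The arithmetic behind that measurement is simple: a reflected pulse's round-trip time, multiplied by the speed of light and halved, gives the distance to the object. A minimal sketch in Python (the function name and the 10-nanosecond example are illustrative, not taken from the SurfaceSight work):

```python
# Speed of light in a vacuum, in meters per second
SPEED_OF_LIGHT = 299_792_458.0

def range_from_round_trip(seconds: float) -> float:
    """Distance to a reflecting object, given a pulse's round-trip time.

    The pulse travels out and back, so the one-way distance is half
    the total path covered in that time.
    """
    return SPEED_OF_LIGHT * seconds / 2.0

# A pulse that returns after 10 nanoseconds bounced off something
# roughly a meter and a half away.
print(round(range_from_round_trip(10e-9), 3))  # 1.499
```

Because light covers about 30 centimeters per nanosecond, the timing hardware has to resolve fractions of a nanosecond to measure household distances.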
When the LIDAR unit revolves, it builds a 360-degree image of its surroundings. Researchers integrated this technology into an Amazon Echo speaker and taught it to recognize hand gestures and reply with appropriate responses.
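A revolving unit reports one distance per rotation angle; converting those polar readings into x, y coordinates is what produces the 360-degree map. A toy sketch under that assumption (the readings and function name are hypothetical):

```python
import math

def scan_to_points(readings):
    """Convert (angle_degrees, distance_m) LIDAR readings into (x, y) points
    centered on the sensor."""
    points = []
    for angle_deg, distance in readings:
        theta = math.radians(angle_deg)
        points.append((distance * math.cos(theta), distance * math.sin(theta)))
    return points

# Four readings at 90-degree intervals outline the room around the speaker.
scan = [(0, 1.0), (90, 2.0), (180, 1.0), (270, 2.0)]
for x, y in scan_to_points(scan):
    print(f"({x:.2f}, {y:.2f})")
```

A real sensor would report hundreds of readings per revolution, but the conversion per reading is exactly this.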
Soon, such technology may allow your speaker to recognize hand gestures, types of food, and even the clothes you wear. The technology is powerful enough that researchers have deliberately restricted the vision of current prototypes by capping the laser's effective range, sensitive to the possibility that the systems could be hacked by people wanting to spy on the speakers' owners.
Currently, prototypes can only see 6 millimeters deep; however, simple alterations to the software would vastly increase this range.
The software that processes the LIDAR beams could potentially be trained to recognize objects, from different types of vegetables to saucepans. One day, these smart assistants may be able to tell us what ingredients are missing from the chopping board if we are to complete a certain type of meal, or help us locate a lost item in the house.
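Real systems would train a machine-learning model on scan data, but the underlying idea can be conveyed with a nearest-match sketch: compare an object's measured size against stored templates and pick the closest. All the object widths below are invented for illustration:

```python
def classify_by_width(measured_width_m, templates):
    """Return the label whose typical width is closest to the measurement."""
    return min(templates, key=lambda label: abs(templates[label] - measured_width_m))

# Hypothetical typical widths, in meters, of objects on a kitchen counter.
TEMPLATES = {"carrot": 0.03, "saucepan": 0.22, "chopping board": 0.35}

print(classify_by_width(0.21, TEMPLATES))  # saucepan
```

A production recognizer would use many features of the scanned outline, not just one width, but the compare-against-known-shapes structure is the same.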
The spinning laser can sweep a full 360 degrees of a room almost instantly and could help locate any object that has been misplaced.
Smart speakers may also be able to recognize your smartphone and connect to it via Bluetooth in order to play music.
Whether society accepts something this radical remains to be seen, but what is considered strange, and what is not, is constantly changing.
The webcam hacking scares of several years ago have long faded from the public consciousness, as utility over time gradually wears away old fears.
Still, smart speakers that can see may appeal to some consumers, and if enough of them buy in, we can look forward to the next generation of smart AI assistants operating visually too. By the time I reach my mother's age, losing my keys in the house may not be such a hassle.