IBM has released its annual set of predictions for the next five years – and says it expects computers and smartphones to see, smell, touch, taste and hear.
You’ll be able to touch through your phone, experiencing different textures, and your device will be able to suggest new flavor combinations and even detect if you’re coming down with a cold.
“Just as the human brain relies on interacting with the world using multiple senses, by bringing combinations of these breakthroughs together, cognitive systems will bring even greater value and insights, helping us solve some of the most complicated challenges,” says IBM’s Bernie Meyerson.
A sense of touch would mean that shoppers could experience the texture of clothing before buying online, differentiating silk from linen or cotton, for example.
In terms of sight, says the company, systems will not only be able to look at and recognize the contents of images and other visual data, but will begin to make sense of them, much as a person views and interprets a photograph.
“Within five years, these capabilities will be put to work in healthcare by making sense out of massive volumes of medical information such as MRIs, CT scans, X-rays and ultrasound images to capture information tailored to particular anatomy or pathologies,” says IBM.
“By being trained to discriminate what to look for in images – such as differentiating healthy from diseased tissue – and correlating that with patient records and scientific literature, systems that can ‘see’ will help doctors detect medical problems with far greater speed and accuracy.”
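The discrimination IBM describes – separating healthy from diseased tissue – is at heart a supervised classification problem. A minimal sketch using a nearest-centroid classifier on invented two-feature “tissue” vectors (all the numbers here are illustrative, not from IBM’s research):

```python
import math

# Toy feature vectors: (mean intensity, texture score) per tissue sample.
# These values are invented purely for illustration.
healthy = [(0.2, 0.1), (0.3, 0.15), (0.25, 0.12)]
diseased = [(0.7, 0.6), (0.8, 0.55), (0.75, 0.65)]

def centroid(samples):
    """Component-wise mean of a list of feature vectors."""
    n = len(samples)
    return tuple(sum(s[i] for s in samples) / n for i in range(len(samples[0])))

def classify(sample, centroids):
    """Label a sample by its nearest class centroid (Euclidean distance)."""
    return min(centroids, key=lambda label: math.dist(sample, centroids[label]))

centroids = {"healthy": centroid(healthy), "diseased": centroid(diseased)}
print(classify((0.28, 0.14), centroids))  # a low-intensity, smooth sample
```

A production system would of course learn from millions of labeled scans rather than six hand-written points, but the structure – learn a summary of each class, then assign new cases to the closest one – is the same.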
Improving sound perception will allow devices to detect elements of sound such as pressure, vibrations and sound waves at different frequencies – and use them, for example, to predict when trees will fall in a forest or when a landslide is on the way.
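Detecting “sound waves at different frequencies” usually means a Fourier transform: decomposing a signal into frequency components and watching for telltale peaks. A sketch with a pure-Python discrete Fourier transform on a synthetic 5 Hz tone (real acoustic monitoring would use long recordings and an optimized FFT library):

```python
import cmath
import math

SAMPLE_RATE = 64  # samples per second; one second of signal below
# Synthetic 5 Hz sine wave standing in for a recorded sound.
signal = [math.sin(2 * math.pi * 5 * t / SAMPLE_RATE) for t in range(SAMPLE_RATE)]

def dft_magnitudes(x):
    """Magnitude of each DFT bin over the first half of the spectrum."""
    n = len(x)
    return [abs(sum(x[t] * cmath.exp(-2j * math.pi * k * t / n) for t in range(n)))
            for k in range(n // 2)]

mags = dft_magnitudes(signal)
# With a 1-second window, bin index k corresponds directly to k Hz.
dominant_hz = max(range(1, len(mags)), key=lambda k: mags[k])
print(dominant_hz)  # → 5
```

An early-warning system for landslides or falling trees would watch for characteristic low-frequency energy like this appearing or shifting over time.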
And your device will also be able to help when your palate’s feeling a little jaded, says IBM.
“IBM researchers are developing a computing system that actually experiences flavor, to be used with chefs to create the most tasty and novel recipes. It will break down ingredients to their molecular level and blend the chemistry of food compounds with the psychology behind what flavors and smells humans prefer,” the company says.
“By comparing this with millions of recipes, the system will be able to create new flavor combinations that pair, for example, roasted chestnuts with other foods such as cooked beetroot, fresh caviar, and dry-cured ham.”
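Behind such pairings is often the “shared flavor compound” heuristic: ingredients with more aroma compounds in common tend to pair well. A toy sketch of that idea, scoring overlap with Jaccard similarity (the compound sets are invented placeholders; real systems draw on databases of hundreds of compounds per ingredient):

```python
# Hypothetical aroma-compound sets per ingredient -- illustrative only.
compounds = {
    "roasted chestnut": {"furaneol", "pyrazine-a", "vanillin"},
    "cooked beetroot":  {"geosmin", "furaneol", "pyrazine-a"},
    "fresh caviar":     {"trimethylamine", "hexanal"},
}

def pairing_score(a, b):
    """Jaccard similarity of two ingredients' compound sets."""
    sa, sb = compounds[a], compounds[b]
    return len(sa & sb) / len(sa | sb)

# Find the best partner for roasted chestnut among the other ingredients.
best = max((i for i in compounds if i != "roasted chestnut"),
           key=lambda i: pairing_score("roasted chestnut", i))
print(best)  # → cooked beetroot
```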
And, says the report, your phone will soon be able to tell if you’re ill by checking you for germs. The company’s also working on systems to ‘sniff’ for disinfectant, for example, to check that hospital rooms are clean.
“The world is tremendously complex. We face challenges in deciphering everything from the science governing tiny bits of matter to the functioning of the human body to the way cities operate to how weather systems develop,” says Meyerson.
“Gradually, over time, computers have helped us understand better how the world works. But, today, a convergence of new technologies is making it possible for people to comprehend things much more deeply than ever before, and, as a result, to make better decisions.”