Outside of science fiction, at least, computers haven’t usually been built in our own image; they were developed to handle specific business or personal tasks. But that is about to change, says IBM.
A new generation of computers will operate on the basis of “concepts and constructs” rather than simple data, says Bernie Meyerson, IBM fellow and vice-president of innovation. Known as cognitive systems, these computers are starting to take on more human-like abilities, especially learning.
“You don’t program a cognitive system. A cognitive system, you teach,” says Meyerson. But while humans have the theoretical capacity for infinite imagination, he adds, they have a finite amount of memory.
“At the end of the day the difference is: Humans don’t scale. Computers do.”
The innovations that will take place in computers over the next five years will go beyond speech and image recognition. Machines will begin to understand the five human senses on a deeper level based on what they learn, IBM predicts. In its seventh annual “5 in 5” list of predictions, the company expects the following advances to take place:
Sight
Computers will be able to recognize meaning in visual imagery itself, not just the information we use to describe it. When humans take in a visual scene, they immediately gather “monumental amounts of data,” says Meyerson, equivalent to terabytes in a computer.
The capability to understand images at that level would be especially useful in the medical field, where computers could recognize the difference between healthy and diseased tissue, for example.
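To make the idea concrete, here is a deliberately simple sketch of a tissue classifier. Everything about it is invented for illustration: the two “image features” and the synthetic data stand in for the far richer measurements and models a real medical imaging system would rely on.

```python
# Illustrative sketch only: a toy classifier separating "healthy" from
# "diseased" tissue using made-up image features. Real systems would use
# far richer features and models trained on clinical data.
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)

# Hypothetical features per image patch: [mean intensity, texture contrast]
healthy = rng.normal(loc=[0.6, 0.2], scale=0.05, size=(200, 2))
diseased = rng.normal(loc=[0.4, 0.5], scale=0.05, size=(200, 2))

X = np.vstack([healthy, diseased])
y = np.array([0] * 200 + [1] * 200)  # 0 = healthy, 1 = diseased

model = LogisticRegression().fit(X, y)

# Classify a new, unseen patch described by the same two features.
new_patch = np.array([[0.45, 0.45]])
print("diseased probability:", model.predict_proba(new_patch)[0, 1])
```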
Sound
As it stands, speech recognition platforms are most widely used to translate spoken language into text. In some cases, notably in the security field, companies have tried to train computers to recognize stress in the human voice (e.g., to create a more versatile type of lie detector), with mixed results. But in five years, IBM predicts, computers will be listening to and translating baby talk for parents, detecting more complex emotions in people’s speech patterns, and even predicting avalanches.
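The core idea behind voice-stress detection can be illustrated with a toy example: compare the variability of a speaker’s pitch against a relaxed-speech baseline. The pitch values, baseline, and threshold below are invented assumptions; real systems extract pitch from recorded audio and rely on trained models rather than a single hand-set cutoff.

```python
# Deliberately simple sketch: flag speech whose pitch variability is well
# above a (hypothetical) relaxed-speech baseline.
import numpy as np

def stress_score(pitch_hz, baseline_std=12.0):
    """Return how far pitch variability exceeds the assumed baseline."""
    return float(np.std(pitch_hz) / baseline_std)

calm_speech = np.array([118, 121, 119, 122, 120, 117, 123, 119], dtype=float)
tense_speech = np.array([118, 140, 112, 150, 125, 160, 108, 145], dtype=float)

for label, pitch in [("calm", calm_speech), ("tense", tense_speech)]:
    score = stress_score(pitch)
    verdict = "stressed" if score > 1.5 else "relaxed"
    print(f"{label}: score={score:.2f} -> {verdict}")
```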
Touch
These won’t be your grandfather’s tablets. Tomorrow’s touch screens will provide an analog of “feel” to users. For example, a tablet could simulate the texture of a garment through a series of specific vibrations.
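One hypothetical way to picture such a mapping is to translate a fabric’s properties into parameters for a vibration motor. The properties, formulas, and numbers below are illustrative assumptions, not a description of any shipping haptic hardware or API.

```python
# Toy mapping from made-up fabric properties to a vibration pattern.
from dataclasses import dataclass

@dataclass
class VibrationPattern:
    frequency_hz: float   # how fast the actuator pulses
    amplitude: float      # 0.0 (off) to 1.0 (strongest)
    pulse_ms: int         # duration of each pulse

def texture_to_vibration(roughness: float, weave_density: float) -> VibrationPattern:
    """Translate two hypothetical fabric properties into a vibration pattern."""
    return VibrationPattern(
        frequency_hz=50 + 200 * weave_density,  # denser weave -> finer buzz
        amplitude=min(1.0, 0.2 + roughness),    # rougher fabric -> stronger pulses
        pulse_ms=int(5 + 20 * roughness),
    )

print(texture_to_vibration(roughness=0.7, weave_density=0.3))  # e.g. burlap
print(texture_to_vibration(roughness=0.1, weave_density=0.9))  # e.g. silk
```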
Taste
No, you won’t be able to lick your iPhone, and you won’t be able to feed anything edible to your computer. But the data from “wet bench” chemistry work already gets processed through computers, and chemists have identified (and been tinkering with) the molecular structure of flavours. Stir in a bit of analytics, and we could wind up with some really good-tasting recipes, or better yet, recipes that are both healthy and delicious.
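The “analytics” part can be sketched very simply: treat each ingredient as a set of flavour compounds and rank pairs by how many compounds they share. The compound lists below are rough placeholders rather than measured flavour chemistry.

```python
# Toy flavour-pairing score: count shared compounds between ingredients.
# The compound sets are illustrative placeholders, not a chemistry database.
from itertools import combinations

flavour_compounds = {
    "strawberry": {"furaneol", "linalool", "hexanal"},
    "basil":      {"linalool", "eugenol", "estragole"},
    "chocolate":  {"furaneol", "vanillin", "pyrazines"},
    "tomato":     {"hexanal", "linalool", "geranial"},
}

def pairing_score(a: str, b: str) -> int:
    """Count compounds two ingredients have in common."""
    return len(flavour_compounds[a] & flavour_compounds[b])

ranked = sorted(
    combinations(flavour_compounds, 2),
    key=lambda pair: pairing_score(*pair),
    reverse=True,
)
for a, b in ranked:
    print(f"{a} + {b}: {pairing_score(a, b)} shared compounds")
```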
Smell
While it may seem futuristic, the technology to “smell” things really isn’t that complex, since odours are really just airborne chemicals. Case in point: Eliminate all carbohydrates from your diet and your liver will begin to produce ketones, including volatile acetone. Next thing you know, your breath smells like nail polish remover. Not hard for a sensor to detect the same chemical signature and for a computer to infer your macronutrient intake.
But sensors will be able to detect much more in the future, says IBM. Just from your breath, they could identify a host of different diseases. Meanwhile, in a hospital setting, sensors could “smell” rooms to ensure they’ve been properly disinfected.
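The breath-acetone example boils down to very little code. The parts-per-million thresholds below are illustrative assumptions, not medical guidance; a real system would be calibrated against clinical data and actual sensor hardware.

```python
# Bare-bones sketch: map a (simulated) breath-acetone reading to a rough
# dietary inference. Thresholds are hypothetical illustrations only.

def interpret_breath(acetone_ppm: float) -> str:
    """Map a breath-acetone reading to a rough dietary inference."""
    if acetone_ppm < 1.0:
        return "typical mixed diet"
    if acetone_ppm < 10.0:
        return "likely low-carbohydrate intake (mild ketosis)"
    return "strong ketosis; flag for follow-up"

for reading in (0.5, 4.2, 25.0):   # simulated sensor readings in ppm
    print(f"{reading:5.1f} ppm -> {interpret_breath(reading)}")
```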