Most people know Watson as IBM’s answer to Jeopardy star Ken Jennings. But IBM’s aspirations for its artificially intelligent supercomputer are now less quiz show champion and more medical genius.
“Watson, the supercomputer that is now the world Jeopardy champion, basically went to med school after it won Jeopardy,” MIT’s Andrew McAfee, coauthor of The Second Machine Age, said recently in an interview with Smart Planet. “I’m convinced that if it’s not already the world’s best diagnostician, it will be soon.”
Watson is already capable of storing far more medical information than doctors, and unlike humans, its decisions are all evidence-based and free of cognitive biases and overconfidence. As IBM scientists continue to train Watson to apply its vast stores of knowledge to actual medical decision-making, it’s likely just a matter of time before its diagnostic performance surpasses that of even the sharpest doctors.
Back in 2011, McAfee wrote on his blog about why a diagnosis from “Dr. Watson” would be a game-changer:
It’s based on all available medical knowledge. Human doctors can’t possibly hold this much information in their heads, or keep up with it as it changes over time. Dr. Watson knows it all and never overlooks or forgets anything.
It’s accurate. If Dr. Watson is as good at medical questions as the current Watson is at game show questions, it will be an excellent diagnostician indeed.
It’s consistent. Given the same inputs, Dr. Watson will always output the same diagnosis. Inconsistency is a surprisingly large and common flaw among human medical professionals, even experienced ones. And Dr. Watson is always available and never annoyed, sick, nervous, hungover, upset, in the middle of a divorce, sleep-deprived, and so on.
It has very low marginal cost. It will be very expensive to build and train Dr. Watson, but once it’s up and running the cost of doing one more diagnosis with it is essentially zero, unless it orders tests.
It can be offered anywhere in the world. If a person has access to a computer or mobile phone, Dr. Watson is on call for them.
That’s one reason IBM has been pumping Watson full of medical knowledge — a subject area that’s actually significantly more contained than “all the world’s general knowledge,” which is what Watson tried to learn for Jeopardy.
Watson has “read” dozens of textbooks, all of PubMed and Medline (two massive databases of medical journals), and thousands of patient records from Memorial Sloan Kettering. Altogether, “Watson has analysed 605,000 pieces of medical evidence, 2 million pages of text, 25,000 training cases and had the assist of 14,700 clinician hours fine-tuning its decision accuracy,” Forbes reported in 2013.
And it’s getting “smarter” every year. So how would Dr. Watson work in practice? Here’s how IBM describes the process:
First, the physician might describe symptoms and other related factors to the system. Watson can then identify the key pieces of information and mine the patient’s data to find relevant facts about family history, current medications and other existing conditions. It combines this information with current findings from tests, and then forms and tests hypotheses by examining a variety of data sources — treatment guidelines, electronic medical record data and doctors’ and nurses’ notes, as well as peer-reviewed research and clinical studies. From here, Watson can provide potential treatment options and its confidence rating for each suggestion.
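The pipeline IBM describes — pool the patient’s evidence, form candidate hypotheses, test each against the data, and return options ranked by confidence — can be sketched in miniature. This is a hypothetical toy, not IBM’s actual system: the `Hypothesis` structure and the evidence-overlap scoring are illustrative stand-ins for Watson’s far richer evidence weighting.

```python
# Hypothetical sketch of a Watson-style diagnostic pipeline: score each
# candidate diagnosis by how much of the patient's evidence it explains,
# then return options ranked by confidence.

from dataclasses import dataclass

@dataclass
class Hypothesis:
    name: str
    supporting_findings: set  # findings this condition would explain

def rank_hypotheses(findings, candidates):
    """Return (diagnosis, confidence) pairs, highest confidence first.

    Confidence here is simply the fraction of observed findings the
    hypothesis accounts for -- a stand-in for real evidence scoring.
    """
    scored = []
    for h in candidates:
        explained = findings & h.supporting_findings
        confidence = len(explained) / len(findings) if findings else 0.0
        scored.append((h.name, confidence))
    return sorted(scored, key=lambda pair: pair[1], reverse=True)

# Symptoms, history, and test results pooled into one evidence set
evidence = {"fever", "cough", "chest pain", "elevated WBC"}
candidates = [
    Hypothesis("pneumonia", {"fever", "cough", "chest pain", "elevated WBC"}),
    Hypothesis("bronchitis", {"cough", "fever"}),
]
print(rank_hypotheses(evidence, candidates))
```

The key output shape matches the description above: not a single verdict, but a list of potential options, each with a confidence rating the physician can weigh.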
The supercomputer’s potential is huge, but — as The Wall Street Journal reported earlier this year — currently “just a handful of customers are using Watson in their daily business,” and it’s far from performing at the level and in the range of domains that should be possible in the future.
So far, IBM’s most high-profile AI partnerships are with MD Anderson Cancer Center, where Watson helps recommend leukemia treatments, and WellPoint, where Watson helps the insurer evaluate doctors’ treatment plans.
WellPoint, currently the exclusive reseller for Dr. Watson, has claimed that the system is already significantly better than human doctors at diagnosing lung cancer.
Watson is not yet able to leverage all the information it has absorbed, so it still has a ways to go before it catches up with our best human diagnosticians, whose versatility and agility are difficult to match. But Watson’s ability to learn, analyse, and apply knowledge suggests that it will get there — eventually.
“If and when Dr. Watson gets as good at diagnosis as Watson is at Jeopardy!, I want it as my primary care physician,” McAfee wrote back in 2011.
That day may come sooner than we imagined.