A robot roams the Great Barrier Reef without supervision seeking and destroying crown-of-thorns starfish. A child coughs into a smartphone and instantly his parents know whether this is a serious medical condition or just a cold. A computer learns to recognise a cancer from an x-ray image.
Machines are being taught to take the load when it comes to processing huge amounts of data, helping to find answers to problems, big and small, from seeking the origins of the galaxy to predicting human behaviour or finding the genetic causes of cancer.
The smart computer, which improves as it receives more information, is rolling out of the pages and screens of science fiction and into the real worlds of science, medicine and commerce.
It’s everywhere, even when we don’t realise it. Those anti-spam filters on the email box are a form of machine learning. They “learn” your preferences and take action based on past behaviour. Search engines also use a form of machine learning to bring up recommendations based on past activity.
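The spam-filter example can be made concrete. A common approach (one of several; real mail providers use more sophisticated variants) is a naive Bayes classifier that learns word frequencies from messages the user has already marked as spam or not, then scores new mail. A minimal sketch:

```python
from collections import Counter
import math

# A toy naive-Bayes spam filter: it "learns" from messages the user
# has already labelled, then scores new mail against those counts.
class SpamFilter:
    def __init__(self):
        self.counts = {"spam": Counter(), "ham": Counter()}
        self.totals = {"spam": 0, "ham": 0}

    def train(self, text, label):
        words = text.lower().split()
        self.counts[label].update(words)
        self.totals[label] += len(words)

    def is_spam(self, text):
        scores = {}
        for label in ("spam", "ham"):
            # Log-probabilities with add-one smoothing, so a word the
            # filter has never seen doesn't zero out the whole score.
            score = 0.0
            for word in text.lower().split():
                freq = self.counts[label][word] + 1
                score += math.log(freq / (self.totals[label] + 1))
            scores[label] = score
        return scores["spam"] > scores["ham"]

f = SpamFilter()
f.train("win free money now", "spam")
f.train("lunch meeting tomorrow", "ham")
print(f.is_spam("free money"))       # True
print(f.is_spam("lunch meeting"))    # False
```

The "learning" here is nothing more than updating word counts as the user labels mail, which is why such filters get better with use.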
The march of the machines is so loud that the Royal Society, the world’s oldest scientific academy, has just started a long-term project to assess machine learning’s current and future impact on society.
“Many of us don’t realise just how many of our daily interactions involve machine learning, and what a powerful impact it has,” says Professor Peter Donnelly, director of the Wellcome Trust Centre for Human Genetics at Oxford University.
Smart machines are everywhere
“Just on our phones, machine learning underpins the voice recognition software that allows us to call home in a few words, or dictate messages, the friends Facebook suggests we add, mapping apps, and Google search.
“Looking forward, machine learning could improve disease diagnoses and personalised treatments, and power driverless vehicles that could revolutionise transport systems.
“As well as increasing awareness of the potential applications of the technology, we want to raise the level of public debate and identify key scientific and technical challenges and the social, ethical and legal issues raised by machine learning.”
In Australia, machine learning is being applied and researched at an increasing rate.
On the Great Barrier Reef, a robot designed to seek out and control the crown-of-thorns starfish — responsible for an estimated 40% decline in the reef’s coral cover — is being released into the wild.
The COTSbot, built by the Queensland University of Technology, has stereoscopic cameras to give it depth perception, five thrusters for stability, GPS and pitch-and-roll sensors, and an injection arm to deliver a fatal dose of bile salts to the offending starfish.
“Human divers are doing an incredible job of eradicating this starfish from targeted sites but there just aren’t enough divers to cover all the COTS (crown-of-thorns starfish) hotspots across the Great Barrier Reef,” says the robot’s creator, Dr Matthew Dunbabin from QUT’s Institute for Future Environments.
“We see the COTSbot as a first responder for ongoing eradication programs, deployed to eliminate the bulk of COTS (crown-of-thorns starfish) in any area, with divers following a few days later to hit the remaining COTS.
“The COTSbot becomes a real force multiplier for the eradication process the more of them you deploy — imagine how much ground the programs could cover with a fleet of 10 or 100 COTSbots at their disposal, robots that can work day and night and in any weather condition.”
The robot can deliver more than 200 lethal shots over eight hours.
At the centre of the starfish hunter is a machine learning system. Roboticists spent six months training the robot using thousands of still images and videos of the reef, showing it how to find and recognise starfish among coral.
And the robot will continue to learn from experiences in the field.
“Its computer system is backed by some serious computational power so COTSbot can think for itself in the water,” says Dr Feras Dayoub, who designed the software.
“If the robot is unsure that something is actually a COTS, it takes a photo of the object to be later verified by a human, and that human feedback is incorporated into the robot’s memory bank.
“That in itself is quite an accomplishment given the complexity of underwater environments, which are subject to varying visibility as well as depth-dependent colour changes.”
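The feedback loop Dayoub describes — act autonomously only when confident, otherwise photograph the object and defer to a human — is a common pattern in deployed machine learning systems. A minimal sketch (the threshold and function names are illustrative, not taken from the COTSbot software):

```python
# Illustrative confidence threshold; the real system's value is not public.
CONFIDENCE_THRESHOLD = 0.99

def handle_detection(label, confidence, review_queue):
    """Act autonomously only when the classifier is sure;
    otherwise queue the detection for a human reviewer."""
    if label == "COTS" and confidence >= CONFIDENCE_THRESHOLD:
        return "inject"  # autonomous action on a confident detection
    # Uncertain detections are photographed and queued so a person can
    # verify them; the verified labels become new training data.
    review_queue.append({"label": label, "confidence": confidence})
    return "defer_to_human"

queue = []
print(handle_detection("COTS", 0.995, queue))  # inject
print(handle_detection("COTS", 0.60, queue))   # defer_to_human
```

The queued, human-verified examples are exactly what lets the robot "continue to learn from experiences in the field": they are fed back in as additional labelled training data.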
Dunbabin first built a vision system for detecting the starfish from underwater images 10 years ago but shelved the idea of building a robot because of the limitations of the eradication methods then in use.
A breakthrough from James Cook University (JCU) last year allowed him to relaunch the project.
“I was really pleased to hear about JCU’s announcement last year of a one-shot injection method that had proved just as effective,” says Dunbabin.
“That was the game changer that opened the doors for a robotic solution to the COTS problem. Combining this with new advances in machine learning meant we could make COTSbot a reality.”
In the health sector, machine learning will tackle the big issues, one day getting down to the complex matter of mapping cells and redefining what is meant by personalised medicine.
Machines will help crunch large amounts of data to identify the exact types of cancer attacking or to detect, better than the human eye, whether a mark on a chest x-ray is serious or something which can be ignored.
Cancers and bone fractures often missed by humans scanning the images will be picked up and analysed by machine.
Melbourne-based radiology business Capitol Health is implementing a machine-learning system created by the start-up Enlitic, founded by Victorian entrepreneur Jeremy Howard. The system learns from the millions of scans, images, tests and other records held in its database.
“This is the beginning of a transformation of global health services,” says Howard.
Studying scans is part art and part science. Experience is needed. Radiologists have to know what they are looking for in X-rays and other medical images. But with machine help, radiologists will be able to work faster, give a more accurate diagnosis and save lives.
Machine learning is also spinning off new types of businesses.
Cough into the phone, please
In Western Australia, a smartphone app for respiratory diseases is being tested this summer at the Joondalup Health Campus in northern Perth.
The technology is based on machine learning algorithms which use sound alone to diagnose and measure the severity of respiratory conditions without the need for additional hardware.
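The idea of diagnosing from sound alone can be illustrated in miniature: reduce an audio signal to a handful of summary features, then compare a new recording against labelled examples. The features, labels and signals below are invented for illustration — the actual ResApp algorithms are proprietary and far more sophisticated:

```python
import math

def features(samples):
    """Reduce a raw audio signal to two crude summary features."""
    energy = sum(s * s for s in samples) / len(samples)
    # Zero-crossing rate: a rough proxy for how "noisy" the sound is.
    zcr = sum(1 for a, b in zip(samples, samples[1:]) if a * b < 0) / len(samples)
    return (energy, zcr)

def classify(samples, labelled_examples):
    """1-nearest-neighbour match in the tiny feature space."""
    x = features(samples)
    def dist(example):
        return math.dist(x, features(example[0]))
    return min(labelled_examples, key=dist)[1]

wet = [0.8, -0.7, 0.9, -0.8] * 10    # invented "wet cough" waveform
dry = [0.1, 0.05, -0.1, 0.02] * 10   # invented "dry cough" waveform
examples = [(wet, "wet cough"), (dry, "dry cough")]
print(classify([0.7, -0.6, 0.8, -0.7] * 10, examples))  # wet cough
```

The key point the sketch shows is that no extra hardware is needed: everything the classifier uses is derived from the recorded sound itself.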
The algorithms were initially developed by the University of Queensland with funding from the Bill and Melinda Gates Foundation.
The software application is now being brought to market by ResApp Health, an ASX-listed medtech company which raised $4 million in July to commercialise its exclusive licence from the University of Queensland.
The Emergency Department at Joondalup treats nearly 100,000 patients a year, making it one of the busiest in Australia. Data has already been gathered from 450 children over eight months. The latest trials are to test adults.
“We have demonstrated the robustness of our machine learning algorithms to accurately diagnose the majority of childhood respiratory diseases,” says Dr Tony Keating, CEO of ResApp.
The latest study will gather data from adults coming in with respiratory conditions such as upper respiratory tract infections, bronchitis, pneumonia, asthma and chronic obstructive pulmonary disease.
ResApp is working with the University of Queensland to analyse the clinical data and is targeting 400 patients with preliminary results expected in the first half of 2016. ResApp plans to extend the adult study to a second site, one of the largest private emergency departments on the east coast of Australia.
Markets for ResApp’s technology include telehealth, emergency departments, clinics, at-home use, and the developing world, where the usual diagnostic hardware is too costly.