MIT engineers design AI model capable of detecting the coronavirus in a person's cough with 98% accuracy

 An algorithm can detect the coronavirus in people who are asymptomatic, just from listening to the way they cough. 

Coronavirus patients who don't have symptoms still exhibit subtle changes not always detectable by the naked eye - or ear. 

Researchers at MIT developed an AI-powered model that distinguishes asymptomatic people from uninfected individuals by analyzing recordings of coughs submitted by tens of thousands of volunteers online.

The algorithm accurately identified 98.5 percent of coughs from people who tested positive for the virus, including 100 percent of coughs from asymptomatic patients.

The team is gathering more samples, with the goal of producing an app that could be a convenient and free pre-screening tool. 

Researchers at MIT used AI to analyze thousands of coughs and detect differences in those of people with coronavirus. Their program is now able to identify 98.5 percent of people with COVID-19 and 100 percent of people who are asymptomatic but still positive

Such a program could fight the spread of the virus 'if everyone uses it before going to a classroom, a factory, or a restaurant,' says co-author Brian Subirana, director of MIT's Auto-ID Laboratory.

The lab had already been developing algorithms that analyze coughs and vocalizations to assess conditions such as pneumonia, asthma and even Alzheimer's disease, which is associated with neuromuscular degradation, such as weakened vocal cords.

When the pandemic started, Subirana thought the program might work for COVID-19, which is also thought to induce temporary neuromuscular impairment.

'The sounds of talking and coughing are both influenced by the vocal cords and surrounding organs,' he said. 'This means that when you talk, part of your talking is like coughing, and vice versa.'

Brian Subirana, director of MIT's Auto-ID Laboratory, and his team gathered recordings of more than 200,000 coughs to detect patterns in vocal cord strength, lung and respiratory performance, and muscular degradation

In April, they launched a website inviting people to record a series of coughs and fill out a survey detailing any symptoms, whether they have the virus, and how they were diagnosed.  

More than 70,000 forced-cough recordings were submitted to MIT's site, including 2,500 from people who had COVID-19, some of whom were asymptomatic.

In all, more than 200,000 intentional coughs were recorded.

The algorithm was able to find patterns in vocal cord strength, lung and respiratory performance, and muscular degradation with uncanny accuracy.
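The article does not describe the model's internals, but as an illustrative sketch only, a common front end for cough classifiers of this kind is a log-magnitude spectrogram computed from the raw waveform, which a neural network then scans for patterns like those described above. The frame length, hop size and synthetic audio below are assumptions for demonstration, not details from the MIT work:

```python
import numpy as np

def frame_signal(x, frame_len=400, hop=160):
    """Slice a waveform into overlapping frames (25 ms / 10 ms at 16 kHz)."""
    n_frames = 1 + (len(x) - frame_len) // hop
    return np.stack([x[i * hop : i * hop + frame_len] for i in range(n_frames)])

def log_spectrogram(x, frame_len=400, hop=160):
    """Hann-windowed log-magnitude spectrogram, a typical classifier input."""
    frames = frame_signal(x, frame_len, hop) * np.hanning(frame_len)
    mag = np.abs(np.fft.rfft(frames, axis=1))
    return np.log(mag + 1e-8)

# One second of synthetic audio at 16 kHz, standing in for a real cough recording
rng = np.random.default_rng(0)
audio = rng.standard_normal(16000)
feats = log_spectrogram(audio)
print(feats.shape)  # (98, 201): 98 time frames x 201 frequency bins
```

A trained network would take a grid of features like this as input and output a probability that the cough came from an infected person.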

It correctly identified all of the submissions from people who were asymptomatic but nonetheless tested positive for the novel coronavirus.

'We think this shows that the way you produce sound changes when you have COVID, even if you're asymptomatic,' Subirana said.

The University of Cambridge launched a similar project this spring, but reported only an 80 percent success rate.

An algorithm shouldn't replace traditional testing, Subirana says, since its main function is to distinguish asymptomatic coughs from 'healthy' coughs.
