TECH

Scientists find way to predict chances of waking patients from 'vegetative state'

Global Times

07:50, May 27, 2020


(Photo: Global Times)

Chinese scientists and clinicians in Shanghai have developed a new diagnostic and prognostic assessment for patients with disorders of consciousness, commonly known as a "vegetative state," by studying their brain activity during language processing. In clinical trials, the method proved far more accurate than traditional observation-based assessment.

The study was jointly carried out by a research team led by Wang Liping of the Institute of Neuroscience, Center for Excellence in Brain Science and Intelligence Technology, Chinese Academy of Sciences (CAS), and by Mao Ying and Wu Xuehai of the Department of Neurosurgery at Huashan Hospital, Fudan University. It was published in the journal Nature Neuroscience on Tuesday, according to the CAS institute in Shanghai.

Using a novel language paradigm, the study set out to provide a new, reliable approach to objectively characterizing and predicting states of consciousness and analyzing individual patients' language processing abilities, the Global Times learned from the CAS research team on Tuesday.

The scientists and clinicians elicited rhythmic brain responses that track word-, phrase- and sentence-level rhythms in speech, to examine whether bedside electroencephalography (EEG) recordings can help inform diagnosis and prognosis.
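The frequency-tagging idea behind such a paradigm can be sketched in a few lines of Python. This is an illustrative simulation, not the study's actual pipeline: the presentation rates, recording parameters and SNR measure below are assumptions. When words are presented at a fixed rate (here, 4 per second), two-word phrases recur at 2 Hz and four-word sentences at 1 Hz; a brain that tracks those structures shows spectral peaks at the corresponding frequencies in the EEG.

```python
# Illustrative sketch of frequency-tagging analysis on an EEG-like signal.
# All parameters are assumptions for demonstration, not the study's values.
import numpy as np

FS = 250        # sampling rate in Hz (assumed)
DURATION = 20   # seconds of recording (assumed)
RATES = {"word": 4.0, "phrase": 2.0, "sentence": 1.0}  # presentation rates

def tracking_peaks(signal, fs=FS):
    """Return a simple SNR at each linguistic rate: power in the target
    frequency bin divided by the mean power of neighboring bins."""
    freqs = np.fft.rfftfreq(len(signal), d=1.0 / fs)
    power = np.abs(np.fft.rfft(signal)) ** 2
    snrs = {}
    for name, f in RATES.items():
        idx = int(np.argmin(np.abs(freqs - f)))
        neighbors = np.r_[power[idx - 4:idx - 1], power[idx + 2:idx + 5]]
        snrs[name] = power[idx] / neighbors.mean()
    return snrs

# Simulate a "responsive" recording: cortical tracking at all three
# linguistic rates, buried in noise.
rng = np.random.default_rng(0)
t = np.arange(FS * DURATION) / FS
eeg = sum(np.sin(2 * np.pi * f * t) for f in RATES.values())
eeg += rng.normal(scale=2.0, size=t.size)

snrs = tracking_peaks(eeg)
for name, snr in snrs.items():
    print(f"{name}-rate SNR: {snr:.1f}")
```

In real recordings, clear peaks at the phrase and sentence rates would be expected only when the listener actually parses the linguistic structure, which is why their presence can serve as an index of cognitive processing in unresponsive patients.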

Nearly 100,000 patients fall into comas due to brain trauma, stroke, anoxia and other conditions and then enter a long-term disorder of consciousness, the traditional "vegetative state," according to the researchers.

Among these patients, those in a minimally conscious state, as opposed to those with unresponsive wakefulness syndrome, are the most likely to recover. However, the current assessment largely relies on observation by an experienced physician, which is subjective and has a misdiagnosis rate of around 40 percent.

The research team found that when patients hear speech with word-, phrase- and sentence-level structures, the brain activity recorded by EEG reflects their cognitive level, which could help determine their level of consciousness. These EEG-derived neural signals, including both speech-tracking responses and the temporal dynamics of global brain states, are associated with behavioral diagnoses of consciousness.

Crucially, because multiple EEG measures in the language paradigm were able to predict future outcomes in individual patients, EEG-based language assessment provides a new and reliable approach for objectively characterizing and predicting states of consciousness and for longitudinally tracking individual patients' language processing abilities at the bedside, the research team told the Global Times on Tuesday.

According to the research team, the new approach predicted individual patients' recovery from disorders of consciousness 100 days later with an accuracy of 80 percent.

Although neuroimaging techniques such as fMRI have shown potential for informing diagnosis and prognosis in unresponsive patients, they are expensive, require sophisticated brain imaging equipment and impose several restrictions, which limit their clinical application.

The EEG-based assessment of patients' language processing abilities at the bedside is expected to overcome the limitations of neuroimaging studies and provide accurate predictions, the Global Times was told.

