This study evaluated the use of virtual reality-based infrared pupillometry (VIP) to detect long COVID. It was a prospective, case-control cross-sectional study involving 185 participants aged 20-60 recruited from a community eye screening program.
Pupillary light responses (PLR) were recorded with a virtual reality head-mounted display in response to three light intensities. Nine PLR waveform features were extracted and analyzed statistically, and machine learning models were also trained and tested on the full PLR waveforms for classification.
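The study's nine waveform features are not enumerated in this summary, but two of them (constriction time, which figures in the key findings below, and constriction amplitude) are standard PLR metrics. As a minimal sketch, assuming a pupil-diameter trace sampled over time and a known stimulus onset, they can be extracted like this; the function and feature names are illustrative, not the authors' code:

```python
import numpy as np

def plr_features(t, diameter, stim_onset=1.0):
    """Illustrative extraction of two common PLR waveform features.

    t        : time stamps in seconds, shape (T,)
    diameter : pupil diameter in mm, shape (T,)
    """
    baseline = diameter[t < stim_onset].mean()   # pre-stimulus pupil diameter
    i_min = np.argmin(diameter)                  # sample of maximal constriction
    return {
        "constriction_amplitude": baseline - diameter[i_min],  # mm
        "constriction_time": t[i_min] - stim_onset,            # s, onset -> peak
    }

# Synthetic PLR trace: flat baseline, then a Gaussian-shaped constriction dip
t = np.linspace(0, 5, 500)
d = 6.0 - 1.5 * np.exp(-((t - 2.2) ** 2) / 0.5) * (t > 1.0)
print(plr_features(t, d))  # amplitude near 1.5 mm, constriction time near 1.2 s
```

In the real recording the trace would come from the headset's infrared eye tracker, with one trace per light intensity.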
Key findings:
– Constriction Time after the brightest stimulus was significantly associated with long COVID status
– Using selected pupillometric features, accuracy/AUC reached 0.8000/0.9000 for distinguishing controls from long COVID and 0.9062/0.9710 for distinguishing post-COVID from long COVID
– An LSTM model analyzing the whole pupillometric waveform achieved the highest accuracy/AUC of 0.9375/1.000 in differentiating long COVID from post-COVID
– Accuracy was 0.7838 for three-class classification (long COVID vs. post-COVID vs. control).
The study identified specific pupillometric signatures that can differentiate long COVID from post-COVID or control subjects using a VR headset.
Combining statistical identification of key features with machine learning analysis of the full waveforms enhanced classification performance.
The authors conclude that VIP shows promise as a non-invasive, low-cost, portable, and objective method to detect and monitor long COVID. This represents the first report of using VR-based pupillometry for this purpose. The technique could potentially provide an accessible tool for screening and assessing long COVID in clinical and community settings.
Tang, C. H., Yang, Y. F., Poon, K. C. F., Wong, H. Y. M., Lai, K. K. H., Li, C. K., … Chong, K. K. L. (2024). Virtual Reality-based Infrared Pupillometry (VIP) for long COVID. Ophthalmology. Advance online publication. PMID: 39631631. Retrieved from https://pubmed.ncbi.nlm.nih.gov/39631631