
Predicting ‘Long COVID Syndrome’ with Help of a Smartphone App

Posted on by

Zoe COVID Symptom Study Tracker app
Credit: Zoe Global

As devastating as this pandemic has been, it’s truly inspiring to see the many innovative ways in which researchers around the world have enlisted the help of everyday citizens to beat COVID-19. An intriguing example is the COVID Symptom Study’s smartphone-based app, which already has been downloaded millions of times, mostly in the United States and United Kingdom. Analyzing data from 2.6 million app users, researchers published a paper last summer showing that self-reported symptoms can help to predict infection with SARS-CoV-2, the coronavirus that causes COVID-19 [1].

New work from the COVID Symptom Study now takes advantage of the smartphone app to shed more light on Long COVID Syndrome [2], in which people experience a constellation of symptoms long past the time that they’ve recovered from the initial stages of COVID-19 illness. Such symptoms, which can include fatigue, shortness of breath, “brain fog,” sleep disorders, fevers, gastrointestinal symptoms, anxiety, and depression, can persist for months and can range from mild to incapacitating.

These latest findings, published in the journal Nature Medicine, come from a team led by Claire Steves and Tim Spector, King’s College London. Their collaborators include NIH grantee Andrew Chan, Massachusetts General Hospital, Boston, and others supported by the Massachusetts Consortium on Pathogen Readiness. The team began by looking at data recorded between March 24 and September 2, 2020 from about 4.2 million app users with an average age of 45, about 90 percent of whom lived in the U.K., with smaller numbers from the U.S. and Sweden.

For this particular study, the researchers decided to focus on 4,182 app users, all with confirmed COVID-19, who had consistently logged their symptoms. Because these individuals also started using the app when they still felt physically well, the researchers could assess their COVID-19-associated symptoms over the course of the illness.

While most people who developed COVID-19 were back to normal in less than two weeks, the data suggest that one in 20 people with COVID-19 is likely to suffer symptoms of Long COVID that persist for eight weeks or more. About one in 50 people continued to have symptoms for 12 weeks or more. That suggests Long COVID could potentially affect many hundreds of thousands of people in the U.K. alone and millions more worldwide.

The team found that the individuals most likely to develop Long COVID were older people, women, and especially those who experienced five or more symptoms. The nature and order of symptoms, which included fatigue, headache, shortness of breath, and loss of smell, didn’t matter. People with asthma also were more likely to develop long-lasting symptoms, although the study found no clear links to any other pre-existing health conditions.

Using this information, the researchers developed a model to predict which individuals were most likely to develop Long COVID. Remarkably, this simple algorithm, based on just age, gender, and number of early symptoms, accurately predicted almost 70 percent of cases of Long COVID. It was also about 70 percent effective in avoiding false alarms.
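To make the idea concrete, here is a minimal sketch of how a classifier like this could combine the three predictors. It is a toy logistic-regression-style score, not the study’s fitted model: the function name and every coefficient below are hypothetical placeholders chosen purely for illustration.

```python
import math

def predict_long_covid_risk(age, is_female, early_symptom_count, threshold=0.5):
    """Toy logistic score combining the three predictors highlighted in the
    study: age, sex, and number of symptoms in the first week.
    The coefficients are illustrative placeholders, NOT fitted values."""
    # Hypothetical coefficients (for illustration only)
    b0, b_age, b_female, b_symptoms = -4.0, 0.03, 0.4, 0.35
    z = b0 + b_age * age + b_female * is_female + b_symptoms * early_symptom_count
    prob = 1.0 / (1.0 + math.exp(-z))  # logistic function maps score to (0, 1)
    return prob, prob >= threshold

# Example: a 60-year-old woman reporting 6 symptoms in week one
prob, flagged = predict_long_covid_risk(60, 1, 6)
```

A real model would be fit to the cohort data and evaluated for sensitivity and specificity (the roughly 70 percent figures reported above), but the structure, a weighted sum of a few easily collected features passed through a logistic function, is this simple.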

The team also validated the algorithm’s predictive ability in data from an independent group of 2,472 people with confirmed COVID-19 and a range of symptoms. In this group, having more than five symptoms within the first week also proved to be the strongest predictor of Long COVID. And, again, the model worked quite well in identifying those most likely to develop Long COVID.

These findings come as yet another important reminder of the profound impact of the COVID-19 pandemic on public health. This includes not only people who are hospitalized with severe COVID-19 but, all too often, those who get through the initial period of infection relatively unscathed.

Recently, NIH announced a $1.15 billion investment to identify the causes of Long COVID, to develop ways of treating individuals who don’t fully recover, and, ultimately, to prevent the disorder. We’ve been working diligently in recent weeks to identify the most pressing questions and areas of greatest opportunity to address this growing public health threat. As a first step, NIH is funding an effort to track the recovery paths of at least 40,000 adults and children infected with SARS-CoV-2, to learn more about who develops long-term effects and who doesn’t. If you’d like to find a way to pitch in and help, getting involved in the COVID Symptom Study is as easy as downloading the app.

References:

[1] Real-time tracking of self-reported symptoms to predict potential COVID-19. Menni C, Valdes AM, Freidin MB, Sudre CH, Nguyen LH, Drew DA, Ganesh S, Varsavsky T, Cardoso MJ, El-Sayed Moustafa JS, Visconti A, Hysi P, Bowyer RCE, Mangino M, Falchi M, Wolf J, Ourselin S, Chan AT, Steves CJ, Spector TD. Nat Med. 2020 Jul;26(7):1037-1040. doi: 10.1038/s41591-020-0916-2. Epub 2020 May 11.

[2] Attributes and predictors of long COVID. Sudre CH, Murray B, Varsavsky T, Graham MS, Penfold RS, Bowyer RC, Pujol JC, Klaser K, Antonelli M, Canas LS, Molteni E, Modat M, Jorge Cardoso M, May A, Ganesh S, Davies R, Nguyen LH, Drew DA, Astley CM, Joshi AD, Merino J, Tsereteli N, Fall T, Gomez MF, Duncan EL, Menni C, Williams FMK, Franks PW, Chan AT, Wolf J, Ourselin S, Spector T, Steves CJ. Nat Med. 2021 Mar 10.

Links:

NIH launches new initiative to study “Long COVID”. 2021 Feb 23. (NIH)

COVID-19 Research (NIH)

Massachusetts Consortium on Pathogen Readiness (Boston)

COVID Symptom Study

Claire Steves (King’s College London, United Kingdom)

Tim Spector (King’s College London)

Andrew Chan (Massachusetts General Hospital, Boston)

NIH Support: National Institute of Diabetes and Digestive and Kidney Diseases


Using Artificial Intelligence to Detect Cervical Cancer

Posted on by

Doctor reviewing cell phone
Credit: gettyimages/Dean Mitchell

My last post highlighted the use of artificial intelligence (AI) to create an algorithm capable of detecting 10 different kinds of irregular heart rhythms. But that’s just one of the many potential medical uses of AI. In this post, I’ll tell you how NIH researchers are pairing AI analysis with smartphone cameras to help more women avoid cervical cancer.

In work described in the Journal of the National Cancer Institute [1], researchers used a high-performance computer to analyze thousands of cervical photographs, obtained more than 20 years ago from volunteers in a cancer screening study. The computer learned to recognize specific patterns associated with pre-cancerous and cancerous changes of the cervix, and that information was used to develop an algorithm for reliably detecting such changes in the collection of images. In fact, the AI-generated algorithm outperformed human expert reviewers and all standard screening tests in detecting pre-cancerous changes.

Nearly all cervical cancers are caused by the human papillomavirus (HPV). Cervical cancer screening, first with Pap smears and now also using HPV testing, has greatly reduced deaths from cervical cancer. But this cancer still claims the lives of more than 4,000 U.S. women each year, with higher frequency among women who are black or older [2]. Around the world, more than a quarter-million women die of this preventable disease, mostly in poor and remote areas [3].

These troubling numbers have kept researchers on the lookout for low-cost, easy-to-use tools that could be highly effective at detecting HPV infections most likely to advance to cervical cancer. Such tools would also need to work well in areas with limited resources for sample preparation and lab analysis. That’s what led to this collaboration involving researchers from NIH’s National Cancer Institute (NCI) and Global Good, Bellevue, WA, which is an Intellectual Ventures collaboration with Bill Gates to invent life-changing technologies for the developing world.

Global Good researchers contacted NCI experts hoping to apply AI to a large dataset of cervical images. The NCI experts suggested an 18-year cervical cancer screening study in Costa Rica. The NCI-supported project, completed in the 1990s, generated nearly 60,000 cervical images, later digitized by NIH’s National Library of Medicine and stored away safely.

The researchers agreed that all these images, obtained in a highly standardized way, would serve as perfect training material for a computer to develop a detection algorithm for cervical cancer. This type of AI, called machine learning, involves feeding tens of thousands of images into a computer equipped with one or more high-powered graphics processing units (GPUs), similar to something you’d find in an Xbox or PlayStation. The GPUs allow the computer to crunch large sets of visual data in the images and devise a set of rules, or algorithms, that allow it to learn to “see” physical features.

Here’s how they did it. First, the researchers got the computer to create a convolutional neural network. That’s a fancy way of saying that they trained it to read images, filter out the millions of non-essential bytes, and retain the few hundred bytes in the photo that make it uniquely identifiable. They fed 1.28 million color images covering hundreds of common objects into the computer to create layers of processing ability that, like the human visual system, can distinguish objects and their qualities.
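The core operation that lets such a network “see” is convolution: sliding a small filter across an image and recording how strongly each patch matches it. As a minimal illustration (not the researchers’ actual system, which stacks many learned filters across many layers), here is a bare-bones 2D convolution in plain Python, applied with a hand-made vertical-edge filter:

```python
def conv2d(image, kernel):
    """Minimal 2D convolution (no padding, stride 1): slide a small filter
    over the image and sum elementwise products at each position.
    Stacks of learned filters like this are the building blocks of a
    convolutional neural network's feature detectors."""
    kh, kw = len(kernel), len(kernel[0])
    out_h = len(image) - kh + 1
    out_w = len(image[0]) - kw + 1
    out = [[0] * out_w for _ in range(out_h)]
    for i in range(out_h):
        for j in range(out_w):
            out[i][j] = sum(
                image[i + di][j + dj] * kernel[di][dj]
                for di in range(kh) for dj in range(kw)
            )
    return out

# A tiny image with a dark (0) left half and a bright (1) right half.
image = [
    [0, 0, 1, 1],
    [0, 0, 1, 1],
    [0, 0, 1, 1],
]
# A vertical-edge filter: responds strongly where intensity jumps
# from dark to bright across a column boundary.
edge_filter = [[-1, 1],
               [-1, 1]]
response = conv2d(image, edge_filter)  # peaks over the dark-to-bright edge
```

In a trained network the filter values are not hand-made; they are learned from the training images, and later layers combine these low-level responses into detectors for more complex structures such as tissue patterns.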

Once the convolutional neural network was formed, the researchers took the next big step: training the system to see the physical properties of a healthy cervix, a cervix with worrisome cellular changes, or a cervix with pre-cancer. That’s where the thousands of cervical images from the Costa Rican screening trial literally entered the picture.

When all these layers of processing ability were formed, the researchers had created the “automated visual evaluation” algorithm. It went on to identify with remarkable accuracy the images associated with the Costa Rican study’s 241 known precancers and 38 known cancers. The algorithm’s few minor hiccups came mainly from suboptimal images with faded colors or slightly blurred focus.

These minor glitches have the researchers now working hard to optimize the process, including determining how health workers can capture good quality photos of the cervix with a smartphone during a routine pelvic exam and how to outfit smartphones with the necessary software to analyze cervical photos quickly in real-world settings. The goal is to enable health workers to use a smartphone or similar device to provide women with cervical screening and treatment during a single visit.

In fact, the researchers are already field testing their AI-inspired approach on smartphones in the United States and abroad. If all goes well, this low-cost, mobile approach could provide a valuable new tool to help reduce the burden of cervical cancer among underserved populations.

The day that cervical cancer no longer steals the lives of hundreds of thousands of women a year worldwide will be a joyful moment for cancer researchers, as well as a major victory for women’s health.

References:

[1] An observational study of Deep Learning and automated evaluation of cervical images for cancer screening. Hu L, Bell D, Antani S, Xue Z, Yu K, Horning MP, Gachuhi N, Wilson B, Jaiswal MS, Befano B, Long LR, Herrero R, Einstein MH, Burk RD, Demarco M, Gage JC, Rodriguez AC, Wentzensen N, Schiffman M. J Natl Cancer Inst. 2019 Jan 10. [Epub ahead of print]

[2] “Study: Death Rate from Cervical Cancer Higher Than Thought,” American Cancer Society, Jan. 25, 2017.

[3] “World Cancer Day,” World Health Organization, Feb. 2, 2017.

Links:

Cervical Cancer (National Cancer Institute/NIH)

Global Good (Intellectual Ventures, Bellevue, WA)

NIH Support: National Cancer Institute; National Library of Medicine


Building a Smarter Bandage

Posted on by

Smart Bandage

Credit: Tufts University, Medford, MA

Smartphones, smartwatches, and smart electrocardiograms. How about a smart bandage?

This image features a prototype of a smart bandage equipped with temperature and pH sensors (lower right) printed directly onto the surface of a thin, flexible medical tape. You also see the “brains” of the operation: a microprocessor (upper left). When the sensors prompt the microprocessor, it heats up a hydrogel heating element in the bandage, releasing drugs and/or other healing substances on demand. It can also wirelessly transmit messages directly to a smartphone to keep patients and doctors updated.
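The decision logic described above can be sketched in a few lines. This is a hypothetical simplification of the bandage’s control loop, not the Tufts firmware; the threshold values, function names, and the treatment of high pH as a warning sign (chronic wounds tend to be alkaline) are all assumptions made for illustration.

```python
from dataclasses import dataclass, field

# Illustrative thresholds (hypothetical, not from the Tufts prototype)
PH_ALERT = 7.5
TEMP_ALERT_C = 38.0

@dataclass
class BandageState:
    heater_on: bool = False          # hydrogel heater triggers drug release
    alerts: list = field(default_factory=list)  # messages for the phone

def check_bandage(ph, temp_c):
    """Sketch of the decision loop: when either sensor crosses its
    threshold, switch on the hydrogel heating element (releasing the
    drug payload) and queue a message for the paired smartphone."""
    state = BandageState()
    if ph >= PH_ALERT:
        state.alerts.append(f"pH {ph:.1f} elevated")
        state.heater_on = True
    if temp_c >= TEMP_ALERT_C:
        state.alerts.append(f"temperature {temp_c:.1f} C elevated")
        state.heater_on = True
    return state

# Example: alkaline wound bed, normal temperature
state = check_bandage(ph=7.8, temp_c=37.2)
```

The actual device runs this kind of logic on its onboard microprocessor and transmits the alerts wirelessly, but the essential pattern is the same: sense, compare against thresholds, actuate, and notify.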

While the smart bandage might help mend everyday cuts and scrapes, it was designed with the intent of helping people with hard-to-heal chronic wounds, such as leg and foot ulcers. Chronic wounds affect millions of Americans, including many seniors [1]. Such wounds are often treated at home and, if managed incorrectly, can lead to infections and potentially serious health problems.


Seven More Awesome Technologies Made Possible by Your Tax Dollars

Posted on by

We live in a world energized by technological advances, from that new app on your smartphone to drones and self-driving cars. As you can see from this video, NIH-supported researchers are also major contributors, developing a wide range of amazing biomedical technologies that offer tremendous potential to improve our health.

Produced by the NIH’s National Institute of Biomedical Imaging and Bioengineering (NIBIB), this video starts by showcasing some cool fluorescent markers that are custom-designed to light up specific cells in the body. This technology is already helping surgeons see and remove tumor cells with greater precision in people with head and neck cancer [1]. Further down the road, it might also be used to light up nerves, which can be very difficult to see—and spare—during operations for cancer and other conditions.

Other great things to come include:

  • A wearable tattoo that detects alcohol levels in perspiration and wirelessly transmits the information to a smartphone.
  • Flexible coils that produce high quality images during magnetic resonance imaging (MRI) [2-3]. In the future, these individualized, screen-printed coils may improve the comfort and decrease the scan times of people undergoing MRI, especially infants and small children.
  • A time-release capsule filled with a star-shaped polymer containing the anti-malarial drug ivermectin. The capsule slowly dissolves in the stomach over two weeks, with the goal of reducing the need for daily doses of ivermectin to prevent malaria infections in at-risk people [4].
  • A new radiotracer to detect prostate cancer that has spread to other parts of the body. Early clinical trial results show the radiotracer, made up of carrier molecules bonded tightly to a radioactive atom, appears to be safe and effective [5].
  • A new supercooling technique that promises to extend the time that organs donated for transplantation can remain viable outside the body [6-7]. For example, current technology can preserve donated livers outside the body for just 24 hours. In animal studies, this new technique quadruples that storage time to up to four days.
  • A wearable skin patch with dissolvable microneedles capable of effectively delivering an influenza vaccine. This painless technology, which has produced promising early results in humans, may offer a simple, affordable alternative to needle-and-syringe immunization [8].

If you like what you see here, be sure to check out this previous NIH video that shows six more awesome biomedical technologies that your tax dollars are helping to create. So, let me extend a big thanks to you from those of us at NIH—and from all Americans who care about the future of their health—for your strong, continued support!

References:

[1] Image-guided surgery in cancer: A strategy to reduce incidence of positive surgical margins. Wiley Interdiscip Rev Syst Biol Med. 2018 Feb 23.

[2] Screen-printed flexible MRI receive coils. Corea JR, Flynn AM, Lechêne B, Scott G, Reed GD, Shin PJ, Lustig M, Arias AC. Nat Commun. 2016 Mar 10;7:10839.

[3] Printed Receive Coils with High Acoustic Transparency for Magnetic Resonance Guided Focused Ultrasound. Corea J, Ye P, Seo D, Butts-Pauly K, Arias AC, Lustig M. Sci Rep. 2018 Feb 21;8(1):3392.

[4] Oral, ultra-long-lasting drug delivery: Application toward malaria elimination goals. Bellinger AM, Jafari M, Grant TM, Zhang S, Slater HC, Wenger EA, Mo S, Lee YL, Mazdiyasni H, Kogan L, Barman R, Cleveland C, Booth L, Bensel T, Minahan D, Hurowitz HM, Tai T, Daily J, Nikolic B, Wood L, Eckhoff PA, Langer R, Traverso G. Sci Transl Med. 2016 Nov 16;8(365):365ra157.

[5] Clinical Translation of a Dual Integrin avb3– and Gastrin-Releasing Peptide Receptor–Targeting PET Radiotracer, 68Ga-BBN-RGD. Zhang J, Niu G, Lang L, Li F, Fan X, Yan X, Yao S, Yan W, Huo L, Chen L, Li Z, Zhu Z, Chen X. J Nucl Med. 2017 Feb;58(2):228-234.

[6] Supercooling enables long-term transplantation survival following 4 days of liver preservation. Berendsen TA, Bruinsma BG, Puts CF, Saeidi N, Usta OB, Uygun BE, Izamis ML, Toner M, Yarmush ML, Uygun K. Nat Med. 2014 Jul;20(7):790-793.

[7] The promise of organ and tissue preservation to transform medicine. Giwa S, Lewis JK, Alvarez L, Langer R, Roth AE, et al. Nat Biotechnol. 2017 Jun 7;35(6):530-542.

[8] The safety, immunogenicity, and acceptability of inactivated influenza vaccine delivered by microneedle patch (TIV-MNP 2015): a randomised, partly blinded, placebo-controlled, phase 1 trial. Rouphael NG, Paine M, Mosley R, Henry S, McAllister DV, Kalluri H, Pewin W, Frew PM, Yu T, Thornburg NJ, Kabbani S, Lai L, Vassilieva EV, Skountzou I, Compans RW, Mulligan MJ, Prausnitz MR; TIV-MNP 2015 Study Group.

Links:

National Institute of Biomedical Imaging and Bioengineering (NIH)

Center for Wearable Sensors (University of California, San Diego)

Hyperpolarized MRI Technology Resource Center (University of California, San Francisco)

Center for Engineering in Medicine (Massachusetts General Hospital, Boston)

Center for Drug Design, Development and Delivery (Georgia Institute of Technology, Atlanta)

NIH Support: National Institute of Biomedical Imaging and Bioengineering; National Institute of Diabetes and Digestive and Kidney Diseases; National Institute of Allergy and Infectious Diseases


Cool Videos: Insulin from Bacteria to You

Posted on by

If you have a smartphone, you’ve probably used it to record a video or two. But could you use it to produce a video that explains a complex scientific topic in 2 minutes or less? That was the challenge posed by the RCSB Protein Data Bank last spring to high school students across the nation. And the winning result is the video that you see above!

This year’s contest, which asked students to provide a molecular view of diabetes treatment and management, attracted 53 submissions from schools from coast to coast. The winning team—Andrew Ma, George Song, and Anirudh Srikanth—created their video as their final project for their advanced placement (AP) biology class at West Windsor-Plainsboro High School South, Princeton Junction, NJ.

