New app uses smartphone selfies to screen for pancreatic cancer

29.08.2017

Pancreatic cancer has one of the worst prognoses -- with a five-year survival rate of 9 percent -- in part because there are no telltale symptoms or non-invasive screening tools to catch a tumor before it spreads.

Now, University of Washington researchers have developed an app that could allow people to easily screen for pancreatic cancer and other diseases -- by snapping a smartphone selfie.


BiliScreen is a new smartphone app developed by University of Washington computer scientists, electrical engineers and doctors that can screen for pancreatic cancer by having users snap a selfie. It uses a smartphone camera, computer vision algorithms and machine learning tools to detect increased bilirubin levels in a person's sclera, or the white part of the eye, which is an early warning sign of pancreatic cancer and other diseases.

Credit: Paul G. Allen School of Computer Science & Engineering

BiliScreen uses a smartphone camera, computer vision algorithms and machine learning tools to detect increased bilirubin levels in a person's sclera, or the white part of the eye. The new app is described in a paper to be presented Sept. 13 at Ubicomp 2017, the Association for Computing Machinery's International Joint Conference on Pervasive and Ubiquitous Computing.

One of the earliest symptoms of pancreatic cancer, as well as other diseases, is jaundice, a yellow discoloration of the skin and eyes caused by a buildup of bilirubin in the blood. The ability to detect signs of jaundice when bilirubin levels are minimally elevated -- but before they're visible to the naked eye -- could enable an entirely new screening program for at-risk individuals.

In an initial clinical study of 70 people, the BiliScreen app -- used in conjunction with a 3-D printed box that controls the eye's exposure to light -- correctly identified cases of concern 89.7 percent of the time, as measured against the blood test currently used.
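To illustrate how an agreement figure like this can be computed, the sketch below compares binary screening calls from a hypothetical app against reference blood-test results. All the labels here are invented for demonstration; the study's 89.7 percent figure came from its 70-person clinical evaluation, not from this toy example.

```python
# Illustrative only: compare binary screening calls from an app against
# reference blood-test results and report the fraction that agree.

def agreement_rate(app_calls, reference_calls):
    """Fraction of cases where the app's call matches the reference test."""
    assert len(app_calls) == len(reference_calls)
    matches = sum(a == r for a, r in zip(app_calls, reference_calls))
    return matches / len(app_calls)

# True = "case of concern", False = "no concern" (hypothetical labels).
app = [True, True, False, False, True, False, False, True, False, False]
ref = [True, True, False, False, False, False, False, True, False, False]

print(f"Agreement: {agreement_rate(app, ref):.1%}")  # prints "Agreement: 90.0%"
```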

"The problem with pancreatic cancer is that by the time you're symptomatic, it's frequently too late," said lead author Alex Mariakakis, a doctoral student at the Paul G. Allen School of Computer Science & Engineering. "The hope is that if people can do this simple test once a month -- in the privacy of their own homes -- some might catch the disease early enough to undergo treatment that could save their lives."

BiliScreen builds on earlier work from the UW's Ubiquitous Computing Lab, which previously developed BiliCam, a smartphone app that screens for newborn jaundice by taking a picture of a baby's skin. A recent study in the journal Pediatrics showed BiliCam provided accurate estimates of bilirubin levels in 530 infants.

In collaboration with UW Medicine doctors, the UbiComp lab specializes in using cameras, microphones and other components of common consumer devices -- such as smartphones and tablets -- to screen for disease.

The blood test that doctors currently use to measure bilirubin levels -- which is typically not administered to adults unless there is reason for concern -- requires access to a health care professional and is inconvenient for frequent screening. BiliScreen is designed to be an easy-to-use, non-invasive tool that could help determine whether someone ought to consult a doctor for further testing. Beyond diagnosis, BiliScreen could also potentially ease the burden on patients with pancreatic cancer who require frequent bilirubin monitoring.

In adults, the whites of the eyes are more sensitive than skin to changes in bilirubin levels, which can be an early warning sign for pancreatic cancer, hepatitis or the generally harmless Gilbert's syndrome. Unlike skin color, changes in the sclera are more consistent across all races and ethnicities.

Yet by the time people notice the yellowish discoloration in the sclera, bilirubin levels are already well past cause for concern. The UW team wondered if computer vision and machine learning tools could detect those color changes in the eye before humans can see them.

"The eyes are a really interesting gateway into the body -- tears can tell you how much glucose you have, sclera can tell you how much bilirubin is in your blood," said senior author Shwetak Patel, the Washington Research Foundation Entrepreneurship Endowed Professor in Computer Science & Engineering and Electrical Engineering. "Our question was: Could we capture some of these changes that might lead to earlier detection with a selfie?"

BiliScreen uses a smartphone's built-in camera and flash to collect pictures of a person's eye as they snap a selfie. The team developed a computer vision system that automatically isolates the white parts of the eye. The app then calculates the color information from the sclera -- based on the wavelengths of light that are being reflected and absorbed -- and correlates it with bilirubin levels using machine learning algorithms.
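The paper's actual pipeline is far more sophisticated, but the general idea can be sketched in a few lines: select whitish (sclera-like) pixels from an eye image, summarize their color, and map a yellowness feature to a bilirubin estimate. Every function name, threshold, and coefficient below is a made-up placeholder, not BiliScreen's method.

```python
# Minimal sketch of a sclera-color pipeline (illustrative, not BiliScreen's
# actual algorithm). An image is a list of (r, g, b) pixels in 0-255.

def sclera_pixels(pixels, brightness_floor=150):
    """Keep bright pixels as a crude stand-in for sclera segmentation
    (the real system uses a computer vision model)."""
    return [p for p in pixels if min(p) >= brightness_floor]

def yellowness(pixels):
    """Average yellow-blue opponency: high when the sclera reflects
    red and green but absorbs blue, as bilirubin does."""
    whites = sclera_pixels(pixels)
    r = sum(p[0] for p in whites) / len(whites)
    g = sum(p[1] for p in whites) / len(whites)
    b = sum(p[2] for p in whites) / len(whites)
    return (r + g) / 2 - b

def estimate_bilirubin(pixels, slope=0.05, intercept=0.3):
    """Toy linear model standing in for the trained ML regressor;
    the coefficients are arbitrary placeholders."""
    return slope * yellowness(pixels) + intercept

# A healthy-looking eye (near-white sclera; dark iris pixels are ignored)
# versus a jaundiced-looking one (yellow-tinted sclera).
healthy = [(250, 250, 248), (248, 249, 250), (40, 30, 25)]
jaundiced = [(250, 245, 180), (248, 240, 175), (40, 30, 25)]

print(estimate_bilirubin(healthy) < estimate_bilirubin(jaundiced))  # True
```

The key point the sketch captures is that the signal lives in the blue channel: elevated bilirubin absorbs blue light, so a lower blue reading relative to red and green pushes the estimate up.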

To account for different lighting conditions, the team tested BiliScreen with two different accessories: paper glasses printed with colored squares to help calibrate color and a 3-D printed box that blocks out ambient lighting. Using the app with the box accessory -- reminiscent of a Google Cardboard headset -- led to slightly better results.
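One simple way printed reference squares can support calibration is to fit a per-channel gain that maps the colors measured under the current lighting back to their known printed values, then apply that gain to the observed sclera color. The least-squares sketch below is a generic white-balance technique under invented numbers, not the calibration method the paper describes.

```python
# Generic per-channel color calibration (illustrative): fit a gain for each
# RGB channel that best maps measured reference-square colors to their
# known printed values, then correct an observed color.

def fit_gains(measured, known):
    """Least-squares gain per channel: g = sum(m*k) / sum(m*m)."""
    gains = []
    for ch in range(3):
        num = sum(m[ch] * k[ch] for m, k in zip(measured, known))
        den = sum(m[ch] * m[ch] for m in measured)
        gains.append(num / den)
    return gains

def apply_gains(color, gains):
    return tuple(c * g for c, g in zip(color, gains))

# Hypothetical scene: warm indoor light boosts red and dims blue.
known    = [(200, 200, 200), (100, 100, 100)]  # printed square values
measured = [(220, 200, 160), (110, 100,  80)]  # same squares as photographed

gains = fit_gains(measured, known)
sclera_corrected = apply_gains((242, 240, 184), gains)
```

After fitting, the blue gain comes out above 1 (restoring the dimmed blue channel) and the red gain below 1, so the corrected sclera color reflects the tissue rather than the lighting.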

Next steps for the research team include testing the app on a wider range of people at risk for jaundice and underlying conditions, as well as continuing to make usability improvements -- including removing the need for accessories like the box and glasses.

"This relatively small initial study shows the technology has promise," said co-author Dr. Jim Taylor, a professor in the UW Medicine Department of Pediatrics whose father died of pancreatic cancer at age 70.

"Pancreatic cancer is a terrible disease with no effective screening right now," Taylor said. "Our goal is to have more people who are unfortunate enough to get pancreatic cancer to be fortunate enough to catch it in time to have surgery that gives them a better chance of survival."

###

Co-authors include Allen School undergraduate student Megan A. Banks, Lauren Phillipi, and assistant professor of medicine Lei Yu.

The research was funded by the National Science Foundation, the Coulter Foundation and endowment funds from the Washington Research Foundation.

For more information, contact the research team at uwbiliscreen@gmail.com or Mariakakis at atm15@cs.washington.edu.

Media Contact

Jennifer Langston
jlangst@uw.edu
206-430-2580

 @UW

http://www.washington.edu/news/

Jennifer Langston | EurekAlert!
