News, Analysis, Trends, Management Innovations for
Clinical Laboratories and Pathology Groups

Hosted by Robert Michel


Study Finds Smartphones Can Be as Accurate as Pulse Oximeters at Reading Blood-Oxygen Saturation

Technology could enable patients to monitor their own oxygen levels and transmit that data to healthcare providers, including clinical laboratories

Clinical laboratories may soon have a new data point to add to their laboratory information system (LIS) for doctors to review. Researchers have determined that smartphones can read blood-oxygen levels as accurately as purpose-built pulse oximeters.

Conducted by researchers at the University of Washington (UW) and University of California San Diego (UC San Diego), the proof-of-concept study found that an unmodified smartphone camera and flash along with an app is “capable of detecting blood oxygen saturation levels down to 70%. This is the lowest value that pulse oximeters should be able to measure, as recommended by the US Food and Drug Administration,” according to Digital Health News.

This could mean that patients at risk of hypoxemia, or who are suffering a respiratory illness such as COVID-19, could eventually add accurate blood-oxygen saturation (SpO2) readings to their lab test results at any time and from any location.

The researchers published their findings in the journal NPJ Digital Medicine titled, “Smartphone Camera Oximetry in an Induced Hypoxemia Study.”

“In an ideal world, this information could be seamlessly transmitted to a doctor’s office. This would be really beneficial for telemedicine appointments or for triage nurses to be able to quickly determine whether patients need to go to the emergency department or if they can continue to rest at home and make an appointment with their primary care provider later,” Matthew Thompson, DPhil, Professor of Global Health and Family Medicine at the University of Washington, told Digital Health News.

Clinical laboratories may soon have a new data point for their laboratory information systems. (Photo copyright: University of Washington.)

UW/UC San Diego Study Details

The researchers studied three men and three women, ages 20-34. All were Caucasian except for one African American, Digital Health News reported. To conduct the study, a standard pulse oximeter was placed on a finger and, on the same hand, another of the participant’s fingers was placed over a smartphone camera.

“We performed the first clinical development validation on a smartphone camera-based SpO2 sensing system using a varied fraction of inspired oxygen (FiO2) protocol, creating a clinically relevant validation dataset for solely smartphone-based contact PPG [photoplethysmography] methods on a wider range of SpO2 values (70–100%) than prior studies (85–100%). We built a deep learning model using this data to demonstrate an overall MAE [Mean Absolute Error] = 5.00% SpO2 while identifying positive cases of low SpO2 < 90% with 81% sensitivity and 79% specificity,” the researchers wrote in NPJ Digital Medicine.

When the smartphone camera’s flash passes light through the finger, “a deep-learning algorithm deciphers the blood oxygen levels.” Participants were also breathing in “a controlled mixture of oxygen and nitrogen to slowly reduce oxygen levels,” Digital Health News reported.

“The camera is recording a video: Every time your heart beats, fresh blood flows through the part illuminated by the flash,” Edward Wang, PhD, Assistant Professor of Electrical and Computer Engineering at UC San Diego and senior author of the project, told Digital Health News. Wang started this project as a UW doctoral student studying electrical and computer engineering and now directs the UC San Diego DigiHealth Lab.

“The camera records how much that blood absorbs the light from the flash in each of the three color channels it measures: red, green, and blue. Then we can feed those intensity measurements into our deep-learning model,” he added.
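Wang’s description maps onto a simple preprocessing step: average each frame’s red, green, and blue pixel intensities into three time series (a contact PPG signal) that a model can consume. The Python sketch below illustrates that idea with synthetic frames standing in for real camera video; it is an assumption-laden illustration, not the researchers’ code.

```python
import numpy as np

def frames_to_ppg(frames):
    """frames: array of shape (n_frames, height, width, 3), values 0-255.
    Returns an (n_frames, 3) array of per-channel mean intensities --
    one red, green, and blue time series per video."""
    return frames.reshape(frames.shape[0], -1, 3).mean(axis=1)

# Synthetic stand-in for camera frames: 120 frames of 8x8 RGB video in which
# the red channel pulses every 30 frames (60 bpm at 30 fps), mimicking the
# blood-volume changes each heartbeat produces under the flash.
t = np.arange(120)
frames = np.zeros((120, 8, 8, 3))
frames[..., 0] = 180 + 10 * np.sin(2 * np.pi * t / 30)[:, None, None]  # pulsing red
frames[..., 1] = 60   # constant green
frames[..., 2] = 40   # constant blue

ppg = frames_to_ppg(frames)
print(ppg.shape)                          # (120, 3)
print(np.round(np.ptp(ppg[:, 0]), 1))    # 19.9 -- peak-to-peak red variation
```

In the real system these per-channel series (plus a trained deep-learning model) would be the inputs from which SpO2 is estimated; here the sketch stops at signal extraction.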

The deep learning algorithm “pulled out the blood oxygen levels. The remainder of the data was used to validate the method and then test it to see how well it performed on new subjects,” Digital Health News reported.

“Smartphone light can get scattered by all these other components in your finger, which means there’s a lot of noise in the data that we’re looking at,” Varun Viswanath, co-lead author in the study, told Digital Health News. Viswanath is a UW alumnus who is now a doctoral student being advised by Wang at UC San Diego.

“Deep learning is a really helpful technique here because it can see these really complex and nuanced features and helps you find patterns that you wouldn’t otherwise be able to see,” he added.

Each round of testing took approximately 15 minutes. In total the researchers gathered more than 10,000 blood oxygen readings. Levels ranged from 61% to 100%.

“The smartphone correctly predicted whether the subject had low blood oxygen levels 80% of the time,” Digital Health News reported.

Smartphones Accurately Collecting Data

The UW/UC San Diego study is the first to show such precise results using a smartphone.

“Other smartphone apps that do this were developed by asking people to hold their breath. But people get very uncomfortable and have to breathe after a minute or so, and that’s before their blood-oxygen levels have gone down far enough to represent the full range of clinically relevant data,” said Jason Hoffman, a PhD student researcher at UW’s UbiComp Lab and co-lead author of the study.

The ability to collect a full 15 minutes of data is a key improvement. “Our data shows that smartphones could work well right in the critical threshold range,” Hoffman added.

“Smartphone-based SpO2 monitors, especially those that rely only on built-in hardware with no modifications, present an opportunity to detect and monitor respiratory conditions in contexts where pulse oximeters are less available,” the researchers wrote.

“This way you could have multiple measurements with your own device at either no cost or low cost,” Thompson told Digital Health News. Thompson is also an adjunct professor of pediatrics at the UW School of Medicine.

What Comes Next

The UW/UC San Diego research team plans to continue its research and gather more diversity among subjects.

“It’s so important to do a study like this,” Wang said. “Traditional medical devices go through rigorous testing. But computer science research is still just starting to dig its teeth into using machine learning for biomedical device development and we’re all still learning. By forcing ourselves to be rigorous, we’re forcing ourselves to learn how to do things right.”

Though no current clinical laboratory application is pending, smartphone use to capture biometrics for testing is increasing. Soon, labs may need a way to input all that data into their laboratory information systems. It’s something to consider.

—Kristin Althea O’Connor

Related Information:

A Smartphone’s Camera and Flash could Help People Measure Blood Oxygen Levels at Home

Smartphones Can Measure Blood Oxygen Levels at Home

Smartphone’s Camera, Flash, Can Measure Blood Oxygen Up to 70% at Home

Smartphone Camera Oximetry in an Induced Hypoxemia Study

Cedars-Sinai Researchers Determine Smartphone App Can Assess Stool Form as Well as Gastroenterologists and Better than IBS Patients

Artificial intelligence performs BSS assessments with higher sensitivity and specificity than human diagnosticians

In a recent study conducted by scientists at Cedars-Sinai Medical Center in Los Angeles, researchers evaluated a smartphone application (app) that uses artificial intelligence (AI) to assess and characterize digital images of stool samples. The app, it turns out, matched the accuracy of participating gastroenterologists and exceeded the accuracy of study patients’ self-reports of stool specimens, according to a news release.

Though smartphone apps are technically not clinical laboratory tools, anatomic pathologists and medical laboratory scientists (MLSs) may be interested to learn how health information technology (HIT), machine learning, and smartphone apps are being used to assess different aspects of individuals’ health, independent of trained healthcare professionals.

The issue the Cedars-Sinai researchers were investigating is the accuracy of patient self-reporting. Stool can be more complicated than meets the eye, and when asked to describe their bowel movements, patients often find it difficult to be specific. A smartphone app that enables patients to accurately assess their stools, in cases where the function of their digestive tract is relevant to diagnosis and treatment, would therefore be a boon to precision medicine treatments of gastroenterology diseases.

The scientists published their findings in the American Journal of Gastroenterology, titled, “A Smartphone Application Using Artificial Intelligence Is Superior to Subject Self-Reporting when Assessing Stool Form.”

Mark Pimentel, MD

“This app takes out the guesswork by using AI—not patient input—to process the images (of bowel movements) taken by the smartphone,” said gastroenterologist Mark Pimentel, MD (above), Executive Director of Cedars-Sinai’s Medically Associated Science and Technology (MAST) program and principal investigator of the study, in a news release. “The mobile app produced more accurate and complete descriptions of constipation, diarrhea, and normal stools than a patient could, and was comparable to specimen evaluations by well-trained gastroenterologists in the study.” (Photo copyright: Cedars-Sinai.)

Pros and Cons of Bristol Stool Scale

In their paper, the scientists discussed the Bristol Stool Scale (BSS), a traditional diagnostic tool that classifies stool forms into seven categories. The seven types of stool are:

  • Type 1: Separate hard lumps, like nuts (difficult to pass).
  • Type 2: Sausage-shaped, but lumpy.
  • Type 3: Like a sausage, but with cracks on its surface.
  • Type 4: Like a sausage or snake, smooth and soft (average stool).
  • Type 5: Soft blobs with clear cut edges.
  • Type 6: Fluffy pieces with ragged edges, a mushy stool (diarrhea).
  • Type 7: Watery, no solid pieces, entirely liquid (diarrhea). 

In an industry guidance report on irritable bowel syndrome (IBS) and associated drugs for treatment, the US Food and Drug Administration (FDA) said the BSS is “an appropriate instrument for capturing stool consistency in IBS.”

But even with the BSS, things can get murky for patients. Inaccurate self-reporting of stool forms by people with IBS and diarrhea can make proper diagnoses difficult.

“The problem is that whenever you have a patient reporting an outcome measure, it becomes subjective rather than objective. This can impact the placebo effect,” Pimentel told Healio.

Thus, according to the researchers, AI algorithms can help with diagnosis by systematically doing the assessments for the patients, News Medical reported.

30,000 Stool Images Train New App

To conduct their study, the Cedars-Sinai researchers tested an AI smartphone app developed by Dieta Health. According to Health IT Analytics, employing AI trained on 30,000 annotated stool images, the app characterizes digital images of bowel movements using five parameters:

  • BSS,
  • Consistency,
  • Edge fuzziness,
  • Fragmentation, and
  • Volume.

“The app used AI to train the software to detect the consistency of the stool in the toilet based on the five parameters of stool form. We then compared that with doctors who know what they are looking at,” Pimentel told Healio.

AI Assessments Comparable to Doctors, Better than Patients

According to Health IT Analytics, the researchers found that:

  • AI assessments of stool were comparable to gastroenterologists’ assessments on BSS, consistency, fragmentation, and edge fuzziness scores.
  • AI and gastroenterologists had moderate-to-good agreement on volume.
  • AI outperformed study participant self-reports based on the BSS with 95% accuracy, compared to patients’ 89% accuracy.

Additionally, the AI outperformed humans in specificity and sensitivity as well:

  • Specificity (ability to correctly report a negative result) was 27% higher.
  • Sensitivity (ability to correctly report a positive result) was 23% higher.
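The sensitivity and specificity figures above follow directly from a confusion matrix. A quick illustration with made-up counts (not the Cedars-Sinai study’s data):

```python
# Hypothetical confusion-matrix counts, purely to illustrate how
# sensitivity and specificity are derived from a classifier's results.
tp, fn = 90, 10   # true positives, false negatives
tn, fp = 85, 15   # true negatives, false positives

sensitivity = tp / (tp + fn)  # ability to correctly report a positive result
specificity = tn / (tn + fp)  # ability to correctly report a negative result

print(f"sensitivity = {sensitivity:.0%}")  # sensitivity = 90%
print(f"specificity = {specificity:.0%}")  # specificity = 85%
```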

“A novel smartphone application can determine BSS and other visual stool characteristics with high accuracy compared with the two expert gastroenterologists. Moreover, trained AI was superior to subject self-reporting of BSS. AI assessments could provide more objective outcome measures for stool characterization in gastroenterology,” the Cedars-Sinai researchers wrote in their paper.

“In addition to improving a physician’s ability to assess their patients’ digestive health, this app could be advantageous for clinical trials by reducing the variability of stool outcome measures,” said gastroenterologist Ali Rezaie, MD, study co-author and Medical Director of Cedars-Sinai’s GI Motility Program in the news release.

The researchers plan to seek FDA review of the mobile app.

Opportunity for Clinical Laboratories

Anatomic pathologists and clinical laboratory leaders may want to reach out to referring gastroenterologists to find out how they can help to better serve gastro patients. As the Cedars-Sinai study suggests, AI smartphone apps can perform BSS assessments as well as or better than humans and may be useful tools in the pursuit of precision medicine treatments for patients suffering from painful gastrointestinal disorders.

—Donna Marie Pocius

Related Information:

Smartphone Application Using Artificial Intelligence is Superior to Subject Self-Reporting When Assessing Stool Form

Study: App More Accurate than Patient Evaluation of Stool Samples

Industry Guidance Report: Irritable Bowel Syndrome—Clinical Evaluation of Drugs

Artificial Intelligence-based Smartphone App for Characterizing Stool Form

AI Mobile App Improves on “Subjective” Patient-Reported Stool Assessment in IBS

Artificial Intelligence App Outperforms Patient-Reported Stool Assessments

University of Washington Researchers Develop Home Blood Clotting Clinical Laboratory Test That Uses a Smartphone and a Single Drop of Blood

UW scientists believe their at-home test could help more people on anticoagulants monitor their clotting levels and avoid blood clots

In a proof-of-concept study, researchers at the University of Washington (UW) are developing a new smartphone-based technology/application designed to enable people on anticoagulants such as warfarin to monitor their clotting levels from the comfort of their homes. Should this new test methodology prove successful, clinical laboratories may have yet one more source of competition from this at-home PT/INR test solution.

PT/INR (prothrombin time with an international normalized ratio) is one of the most frequently performed clinical laboratory blood tests. This well-proven assay helps physicians monitor clotting in patients taking certain anticoagulation medications.

However, the process can be onerous for those on anticoagulation drugs. Users of this type of medication must have their blood tested regularly—typically by a clinical laboratory—to ensure the medication is working effectively. When it is not, a doctor visit is required to adjust the amount of medication in the bloodstream.

Alternatively, where a state’s scope of practice law permits, pharmacists can perform a point-of-care test for the patient, thus allowing the pharmacist to appropriately adjust the patient’s prescription.

Though still in the early stages of development, if the UW’s new smartphone-based blood clotting test were cleared by the US Food and Drug Administration (FDA), users would only need to see a doctor when their readings went and stayed out of range, according to Clinical Lab Products (CLP).

The UW researchers published their findings in the journal Nature Communications, titled, “Micro-Mechanical Blood Clot Testing Using Smartphones.”

Enabling Patients to Test Their Blood More Frequently

More than eight million Americans with mechanical heart valves or other cardiac conditions take anticoagulants, and 55% of people taking those medications say they fear experiencing life-threatening bleeding, according to the National Blood Clot Alliance.

They have reason to be worried. Even when taking an anticoagulation drug, its level may not stay within therapeutic range due to the effects of food and other medications, experts say. 

“In the US, most people are only in what we call the ‘desirable range’ of PT/INR levels about 64% of the time. This number is even lower—only about 40% of the time—in countries such as India or Uganda, where there is less frequent testing. We need to make it easier for people to test more frequently,” said anesthesiologist and co-author of the study Kelly Michaelsen, MD, PhD, UW Assistant Professor of Anesthesiology and Pain Medicine, in a UW news release.

Shyam Gollakota, PhD
“Back in the day, doctors used to manually rock tubes of blood back and forth to monitor how long it took a clot to form. This, however, requires a lot of blood, making it infeasible to use in home settings,” said senior study author Shyam Gollakota, PhD (above), professor and head of the Networks and Mobile Systems Lab at UW’s Paul G. Allen School of Computer Science and Engineering, in the UW news release. “The creative leap we make here is that we’re showing that by using the vibration motor on a smartphone, our algorithms can do the same thing, except with a single drop of blood. And we get accuracy similar to the best commercially available techniques [used by clinical laboratories].” (Photo copyright: University of Washington.)

How UW’s Smartphone-based Blood Clotting Test Works

The UW researchers were motivated by the success of home continuous glucose monitors, which enable diabetics to continually track their blood glucose levels.

According to the Nature Communications paper, here’s how UW’s “smartphone-based micro-mechanical clot detection system” works:

  • Samples of blood plasma and whole blood are placed into a thimble-size plastic cup.
  • The cup includes a small copper particle and thromboplastin activator.
  • When the smartphone is turned on and vibrating, the cup (which is mounted on an attachment) moves beneath the phone’s camera.
  • Video analytic algorithms running on the smartphone track the motion of the copper particle.
  • If blood clots, the “viscous mixture” slows and stops.
  • PT/INR values can be determined in less than a minute.  

“Our system visually tracks the micro-mechanical movements of a small copper particle in a cup with either a single drop of whole blood or plasma and the addition of activators,” the researchers wrote in Nature Communications. “As the blood clots, it forms a network that tightens. And in that process, the particle goes from happily bouncing around to no longer moving,” Michaelsen explained.
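Michaelsen’s description suggests a simple stopping-time computation: track the particle’s (x, y) position frame by frame and declare clot formation once its frame-to-frame displacement stays below a threshold. The Python sketch below is a hypothetical illustration of that idea; the function name, thresholds, and synthetic trajectory are all assumptions, not UW’s algorithm.

```python
import numpy as np

def clot_frame(positions, motion_threshold=0.5, still_frames=10):
    """positions: (n_frames, 2) array of tracked particle (x, y) coordinates.
    Returns the first frame index after which frame-to-frame displacement
    stays below motion_threshold for still_frames consecutive frames,
    or None if the particle never settles."""
    disp = np.linalg.norm(np.diff(positions, axis=0), axis=1)
    run = 0
    for i, is_still in enumerate(disp < motion_threshold):
        run = run + 1 if is_still else 0
        if run >= still_frames:
            return i - still_frames + 1
    return None

# Synthetic trajectory: the particle circles freely for 60 frames
# (large displacements), then stops moving for 40 frames.
t = np.arange(60.0)
moving = np.stack([5 * np.cos(t), 5 * np.sin(t)], axis=1)
stopped = np.repeat(moving[-1:], 40, axis=0)
positions = np.vstack([moving, stopped])

print(clot_frame(positions))  # 59 -- the frame where motion ceases
```

In the real system the positions would come from the video-analytic tracker, and the stopping time would then be converted to a PT/INR value via calibration.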

The system produced these results:

  • 140 de-identified plasma samples: PT/INR with inter-class correlation coefficients of 0.963 and 0.966.
  • 79 de-identified whole blood samples: 0.974 for both PT/INR.

Another At-home Test That Could Impact Clinical Laboratories

The UW scientists intend to test the system with patients in their homes, and in areas and countries with limited testing resources, Medical Device Network reported.

Should UW’s smartphone-based blood-clotting test be cleared by the FDA, there could be a ready market for it. But it will need to be offered at a price competitive with current clinical laboratory assays for blood clotting, as well as with the current point-of-care tests in use today.

Nevertheless, UW’s work is the latest example of a self-testing methodology that could become a new competitor for clinical laboratories. This may motivate medical laboratories to keep PT/INR testing costs low, while also reporting quick and accurate results to physicians and patients on anticoagulants.

Alternatively, innovative clinical laboratories could develop a patient management service to oversee a patient’s self-testing at home and coordinate delivery of the results with the patient’s physician and pharmacist. This approach would enable the lab to add value for which it could be reimbursed. 

—Donna Marie Pocius

Related Information:

Smartphone App Can Vibrate a Single Drop of Blood to Determine How Well It Clots

Blood Coagulation Testing Using Smartphones

Micro-Mechanical Blood Clot Testing Using Smartphones

55% of Americans Taking Blood Thinners Indicate They Fear Suffering from Major Bleeding, 73% More Cautious with Routine Activities to Avoid Risk

University of Washington Develops New Blood Clotting Test

Dermatopathologists May Soon Have Useful New Tool That Uses AI Algorithm to Detect Melanoma in Wide-field Images of Skin Lesions Taken with Smartphones

MIT’s deep learning artificial intelligence algorithm demonstrates how similar new technologies and smartphones can be combined to give dermatologists and dermatopathologists valuable new ways to diagnose skin cancer from digital images

Scientists at the Massachusetts Institute of Technology (MIT) and other Boston-area research institutions have developed an artificial intelligence (AI) algorithm that detects melanoma in wide-field images of skin lesions taken on smartphones. And its use could affect how dermatologists and dermatopathologists diagnose cancer.

The study, published in Science Translational Medicine, titled, “Using Deep Learning for Dermatologist-Level Detection of Suspicious Pigmented Skin Lesions from Wide-Field Images,” demonstrates that even a common device like a smartphone can be a valuable resource in the detection of disease.

According to an MIT press release, “The paper describes the development of an SPL [Suspicious Pigmented Lesion] analysis system using DCNNs [Deep Convolutional Neural Networks] to more quickly and efficiently identify skin lesions that require more investigation, screenings that can be done during routine primary care visits, or even by the patients themselves. The system utilized DCNNs to optimize the identification and classification of SPLs in wide-field images.”

The MIT scientists believe their AI analysis system could aid dermatologists, dermatopathologists, and clinical laboratories detect melanoma, a deadly form of skin cancer, in its early stages using smartphones at the point-of-care.  

Luis Soenksen, PhD

“Our research suggests that systems leveraging computer vision and deep neural networks, quantifying such common signs, can achieve comparable accuracy to expert dermatologists,” said Luis Soenksen, PhD (above), Venture Builder in Artificial Intelligence and Healthcare at MIT and first author of the study in an MIT press release. “We hope our research revitalizes the desire to deliver more efficient dermatological screenings in primary care settings to drive adequate referrals.” The MIT study demonstrates that dermatologists, dermatopathologists, and clinical laboratories can benefit from using common technologies like smartphones in the diagnosis of disease. (Photo copyright: Wyss Institute Harvard University.)

Improving Melanoma Treatment and Patient Outcomes

Melanoma develops when pigment-producing cells called melanocytes start to grow out of control. The cancer has traditionally been diagnosed through visual inspection of SPLs by physicians in medical settings. Early-stage identification of SPLs can drastically improve the prognosis for patients and significantly reduce treatment costs. It is common to biopsy many lesions to ensure that every case of melanoma can be diagnosed as early as possible, thus contributing to better patient outcomes.

“Early detection of SPLs can save lives. However, the current capacity of medical systems to provide comprehensive skin screenings at scale are still lacking,” Soenksen said in the MIT press release.

The researchers trained their AI system by using 20,388 wide-field images from 133 patients at the Gregorio Marañón General University Hospital in Madrid, as well as publicly available images. The collected photographs were taken with a variety of ordinary smartphone cameras that are easily obtainable by consumers.

They taught the deep learning algorithm to examine various features of skin lesions such as size, circularity, and intensity. Dermatologists working with the researchers also visually classified the lesions for comparison.

Smartphone image of pigmented skin lesions

When the algorithm is “shown” a wide-field image like that above taken with a smartphone, it uses deep convolutional neural networks to analyze individual pigmented lesions and screen for early-stage melanoma. The algorithm then marks suspicious images as either yellow (meaning further inspection should be considered) or red (indicating that further inspection and/or referral to a dermatologist is required). Using this tool, dermatopathologists may be able to diagnose skin cancer and excise it in-office long before it becomes deadly. (Photo copyright: MIT.)
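The yellow/red flagging described above amounts to thresholding a per-lesion suspicion score. A minimal sketch of that triage step, where the score scale and cutoff values are illustrative assumptions, not values from the MIT paper:

```python
# Hypothetical triage step: map a model's per-lesion suspicion score
# in [0, 1] to the flags described above. The 0.5 and 0.8 cutoffs are
# invented for illustration.

def triage(score, yellow=0.5, red=0.8):
    if score >= red:
        return "red"     # further inspection and/or dermatologist referral required
    if score >= yellow:
        return "yellow"  # further inspection should be considered
    return "clear"

print([triage(s) for s in (0.2, 0.6, 0.93)])  # ['clear', 'yellow', 'red']
```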

“Our system achieved more than 90.3% sensitivity (95% confidence interval, 90 to 90.6) and 89.9% specificity (89.6 to 90.2%) in distinguishing SPLs from nonsuspicious lesions, skin, and complex backgrounds, avoiding the need for cumbersome individual lesion imaging,” the MIT researchers noted in their Science Translational Medicine paper.

In addition, the algorithm agreed with the consensus of experienced dermatologists 88% of the time and concurred with the opinions of individual dermatologists 86% of the time, Medgadget reported.

Modern Imaging Technologies Will Advance Diagnosis of Disease

According to the American Cancer Society, about 106,110 new cases of melanoma will be diagnosed in the United States in 2021. Approximately 7,180 people are expected to die of the disease this year. Melanoma is less common than other types of skin cancer but more dangerous as it’s more likely to spread to other parts of the body if not detected and treated early.

More research is needed to substantiate the effectiveness and accuracy of this new tool before it could be used in clinical settings. However, the early research looks promising and smartphone camera technology is constantly improving. Higher resolutions would further advance development of this type of diagnostic tool.

In addition, MIT’s algorithm enables in situ examination and possible diagnosis of cancer. Therefore, a smartphone so equipped could enable a dermatologist to diagnose and excise cancerous tissue in a single visit, without the need for biopsies to be sent to a dermatopathologist.

Currently, dermatologists refer many skin biopsies to dermatopathologists and anatomic pathology laboratories. An accurate diagnostic tool that uses modern smartphones to characterize suspicious skin lesions could become quite popular with dermatologists and affect the flow of referrals to medical laboratories.

—JP Schlingman

Related Information:

Software Spots Suspicious Skin Lesions on Smartphone Photos

An Artificial Intelligence Tool That Can Help Detect Melanoma

Using Deep Learning for Dermatologist-level Detection of Suspicious Pigmented Skin Lesions from Wide-field Images

‘There’s an App for That’ is Becoming the Norm in Healthcare as Smartphones Provide Access to Patient Medical Records and Clinical Laboratory Test Results

Amazon’s app-based employee healthcare service could be first step toward retailer becoming a disruptive force in healthcare; federal VA develops its own mHealth apps

More consumers are using smartphone applications (apps) to manage different aspects of their healthcare. That fact should put clinical laboratories and anatomic pathology groups on the alert, because a passive “wait and see” strategy for making relevant services and lab test information available via mobile apps could cause patients to choose other labs that do offer such services.

Patient use of apps to manage healthcare is an important trend. In January, Dark Daily covered online retail giant Amazon’s move to position itself as a leader in smartphone app-based healthcare with its launch of Amazon Care, a virtual medical clinic and homecare services program. At that time, the program was being piloted for Seattle-based employees and their families only. Since then, it has been expanded to include eligible Amazon employees throughout Washington State.

Mobile health (mHealth) apps are giving healthcare providers rapid access to patient information. And healthcare consumers are increasingly turning to their mobile devices for 24/7 access to medical records, clinical laboratory test results, management of chronic conditions, and quick appointment scheduling and prescription refills.

Thus, hearing ‘There’s an app for that’ has become part of patients’ expectations for access to quality, affordable healthcare.

For clinical laboratory managers, this steady shift toward mHealth-based care means accommodating patients who want to use mobile apps to access lab test results and on-demand lab data to monitor their health or gain advice from providers about symptoms and health issues.

Amazon, VA, and EMS Develop Their Own mHealth Apps

The Amazon Care app can be freely downloaded from Apple’s App Store and Google Play. With it, eligible employees and family members can:

  • Communicate with an advice nurse;
  • Launch an in-app video visit with a doctor or nurse practitioner for advice, diagnoses, treatment, or referrals;
  • Request a mobile care nurse for in-home or in-office visits;
  • Receive prescriptions through courier delivery.

The combined telehealth, in-person care, and mobile medical service includes dispatching nurses to homes or workplaces to provide “physical assessments, vaccines or common [clinical laboratory] tests.”

Glen Tullman, Executive Chairman of Livongo
“Amazon is a company that is experimenting a lot with a variety of opportunities in healthcare,” Glen Tullman (above), Executive Chairman of Livongo, a healthcare company specializing in treating diabetes, and an Amazon partner company, told CNBC. “It’s one to watch.” (Photo copyright: CNBC.)

However, the US federal Department of Veterans Affairs (VA) also is becoming a major player in the mHealth space with the development of its own mobile app—VA Launchpad—which serves as a portal to a range of medical services.

Veterans can access five categories of apps that allow them to manage their health, communicate with their healthcare team, share health information, and use mental health and personal improvement tools.

Neil C. Evans, MD, Chief Officer in the VA Office of Connected Care
“The VA was an early adopter of digital health tools and remains a leader within US healthcare in leveraging technology to enhance patient engagement,” Neil C. Evans, MD (above), Chief Officer in the VA Office of Connected Care, told Healthcare IT News. “These digital tools are allowing veterans to more actively understand their health data, to better communicate with VA clinical teams, and to engage more productively as they navigate their individual health journeys,” Evans added. (Photo copyright: Department of Veterans Affairs.)

mHealthIntelligence reported that mobile health tools also are enabling first responders to improve emergency patient care. At King’s Daughters Medical Center in Brookhaven, Miss., emergency medical technicians (EMTs) are using a group of mHealth apps from DrFirst called Backline to gain real-time access to patients’ HIPAA-compliant medication histories, share clinical data, and gain critical information about patients prior to arriving on the scene.

Using Backline, EMTs can scan the barcode on a patient’s driver’s license to access six months’ worth of medication history.

“In the past, we could only get information from [patients] who are awake or are willing to give us that information,” Lee Robbins, Director of Emergency Medical Services at King’s Daughters Medical Center in Brookhaven, Miss., told mHealthIntelligence. “Knowing this information gives us a much better chance at a good outcome.”

Smartphone App Detects Opioid Overdose

The opioid crisis remains one of the greatest health challenges in the US. The federal Centers for Disease Control and Prevention (CDC) reported 47,600 opioid-related deaths in 2017, and the problem has only worsened since then.

To curtail these tragic deaths, University of Washington (UW) researchers developed a smartphone app called Second Chance that they believe can save lives by quickly detecting when an opioid overdose has occurred.

The app uses sonar to monitor an opioid user’s breathing rate and, according to a UW press release, can detect overdose-related symptoms about 90% of the time from up to three feet away. The app then contacts the user’s healthcare provider or emergency services.

The UW researchers are applying for US Food and Drug Administration (FDA) clearance. They published their findings in the journal Science Translational Medicine.

While Demand for mHealth Apps Grows, Concern over Privacy and Security Also Increases

According to mobile data and analytics company App Annie, global downloads of medical apps grew to more than 400 million in 2018, up 15% from two years earlier.

“As with mobile banking, consumers are showing they trust mobile apps with their most sensitive information and are willing to leverage them to replace tasks traditionally fulfilled in-person, such as going into a bank branch or, in the case of medical apps, to a doctor’s office,” App Annie’s website states.

However, the proliferation of mHealth apps has raised privacy and safety concerns as well. While the FDA does regulate some mobile health software functions, it does not ensure an mHealth app’s accuracy or reliability.

In his article, “Dangers of Defective Mobile Health Apps and Devices,” published on the Verywell Health website, Kevin Hwang, MD, MPH, physician, researcher, and Medical Director of UT Physicians General Internal Medicine Center in the Texas Medical Center at the University of Texas Medical School at Houston, points out that “most mHealth apps have not been tested in a rigorous manner.”

Fierce Healthcare reported that federal lawmakers are worried veterans who use the VA’s 47 mHealth apps could find their sensitive healthcare information shared or sold by third-party companies. In fiscal year 2018, veterans participated in more than one million video telehealth visits, a VA press release reported.

US Rep. Susie Lee, D-Nevada, Chairperson of the House Veterans’ Affairs Subcommittee on Technology Modernization, told Fierce Healthcare, “As we assess the data landscape at the VA and the larger health IT space, we need to look at where protections exist or don’t exist and whether we need more guardrails.”

What does all this mean for clinical laboratories? Lab managers will want to keep an eye on the growing demand from consumers who want direct access to laboratory test data and appointment scheduling through mHealth apps. They should also stay mindful of HIPAA regulations governing how that information may be shared.

—Andrea Downing Peck

Related Information:

How Amazon is Using IoT to Care for Its Employees

Amazon Launches Amazon Care, a Virtual Medical Clinic for Employees

VA Seeing Substantial Growth in Telehealth, Key Patient Engagement Tools

VA Releases Launchpad App to Streamline Healthcare Access for Veterans and Caregivers

Drug Overdose Deaths

Smartphone App Can Detect Opioid Overdoses Using Sonar

VA Exceeds More than One Million Video Telehealth Visits in FY2018

Medical Apps Transform How Patients Receive Medical Care

Dangers of Defective Mobile Health Apps and Devices

mHealth Tools Help Providers Access Data When They Most Need it

Here’s How Amazon Employees Get Health Care Through a New App—A Glimpse of the Future of Medicine

VA Launches New mHealth App to Consolidate Vets’ Access to Resources

The VA Recommends Apps for PTSD and Pain Management. It’s Led to New Veteran Privacy Concerns
