In the dizzying swirl of health-related websites, social media and smartphone apps, finding a reliable source of health information can be a challenge. A group of researchers from the Johns Hopkins University schools of medicine and public health, as well as the university's Applied Physics Laboratory, has mapped out a course to navigate that complicated landscape.
Simon C. Mathews, a Johns Hopkins gastroenterologist and assistant director of the Armstrong Institute for Patient Safety and Quality at Johns Hopkins Bayview Medical Center, led a team of researchers who this week published "Digital Health: A Path to Validation" in the Nature Research journal npj Digital Medicine.
"There are more than 3 million mobile health apps out there, with 200 new ones springing up daily," says Mathews. "Patients, providers, payers, industry and regulators all have to navigate the confusion. It's a challenge to find solutions that provide real value."
Mathews tapped his Johns Hopkins colleagues at the Applied Physics Lab to help study the wide universe of health care technology.
"APL has experience in evaluating solutions in other industries," Mathews says. "We borrowed some of their engineering best practices in setting up our own approach to evaluation."
The npj Digital Medicine article illustrates a culture clash between technology and health care. Mathews and colleagues point to the tech startup mantra of "fail fast, fail often," which they say sits in direct opposition to the bioethical axiom "first, do no harm."
Mathews and his colleagues contend that many developers rush their technology to market, with too little focus on product design, safety testing and clinical trials. Pressure to move quickly encourages developers to perform only minimal verification and validation testing. Too often, says Mathews, end users' needs take a back seat to fast-track marketing.
In their paper, they cite a number of examples of how medical apps frequently do not provide meaningful clinical information and have an overall low quality of scientific support.
The researchers advocate a "digital health scorecard" for health-related technology. The scorecard would offer ratings based on rigorous and objective validation of health technology solutions. Mathews likens such a scorecard to Underwriters Laboratories or Consumer Reports, nonprofit organizations that test and evaluate products for safety and quality.
"Our proposed digital health scorecard takes a hybrid approach of these models," Mathews says. "It defines requirements and standards for digital health products, puts those products through rigorous and objective testing, then offers objective reports of the results."
The scorecard would report test results across four domains: technical validation, clinical validation, usability and cost.
Mathews and his colleagues see two avenues for bringing their digital health scorecard to life.
"Governmental regulatory bodies, such as the Food and Drug Administration, the Federal Trade Commission and the Centers for Medicare and Medicaid Services, could advocate and support this approach," says Mathews. "Alternatively, health care or hospital systems could establish a network of partners to help define requirements and test solutions according to transparent standards."
Several private and nonprofit organizations have taken on the task of evaluating health care technology for providers and consumers, the most encouraging of which, says Mathews, is a beta "online library" established in 2017 by the United Kingdom's National Health Service.
The NHS Apps Library offers government-sponsored validation, scoring health-related applications for clinical effectiveness, regulatory approval, safety, privacy, confidentiality and other measures.
"It's a step in the right direction toward a credible, trustworthy source for providers and consumers," says Mathews. "But there remains far more to be done."
In addition to Mathews, the article's authors are Alain B. Labrique of the Johns Hopkins Bloomberg School of Public Health and Michael J. McShea, Casey L. Hanley, Alan Ravitz and Adam B. Cohen of the Johns Hopkins University Applied Physics Lab. The authors report no conflicts of interest.