Our new paper explores the ethical use of mobile health technology in clinical care. While the focus is on psychiatry, the framework is not mental health specific and can readily be applied to any medical condition. In the paper we introduce an ethical framework that can be applied to make a more informed decision about whether using a certain app is the right choice. The full paper can be accessed at https://www.ncbi.nlm.nih.gov/pubmed/28005647, and below is a schematic of the framework. The paper goes into detail and provides background information, expanded descriptions, and clinical cases. A pre-publication draft is attached below; it is similar but does not reflect the final version posted on PubMed.
A new study published this week in the journal Health Affairs by Dr. Singh and colleagues examines the quality of mHealth apps across a broad range of illnesses, from depression to diabetes. The paper, "Many Mobile Health Apps Target High-Need, High-Cost Populations, But Gaps Remain" (http://content.healthaffairs.org/content/35/12/2310.abstract), examined 137 patient-facing apps directly available for download on the commercial Android and Apple marketplaces.
They evaluated each of the 137 apps across nine metrics:
1: Target population
2: Functionality related to patient engagement
3: Average app store star rating across all versions
4: Clinical utility on a 0-10 scale
5: Usability by the System Usability Scale
6: App's reactivity to information that could indicate health danger, e.g., response to suicidal thoughts
7: Presence of a privacy policy
8: Data sharing and whether it is secure
9: Cost data on the app
The results were interesting and provide the best picture to date of the quality of patient-facing mHealth apps.
One of the most important findings was that app store ratings (e.g., stars) do not correlate well with either the quality or the usability of health apps. A five-star rating on the app store may look impressive, but it is not informative about whether the app will offer high-quality medical information and services, or even be engaging and easy to use. Another concerning result was that only 64% of apps had privacy policies. This means more than one third of the sampled apps do not even outline what happens to users' sensitive and personal health data. Also of note, 60% of apps transmitted patient-generated data in non-secure forms. The authors also explored how appropriate the apps' responses were to certain situations (e.g., how does a depression app react if a user indicates serious thoughts of self-harm?). For depression apps only around 25% of responses were appropriate, which is low, and rates were low across many other conditions as well.
Overall the study brings new data on the state of quality of patient-facing health apps. The results suggest several areas for improvement, especially with regard to privacy and safety. The number of missing privacy policies and the number of apps sending personal health information in non-secure forms are concerning - but something we have the ability to address quickly. The high variability in clinical utility and usability scores, and the lack of correlation with marketplace star ratings, suggests the need for better ways to educate patients and clinicians about finding a good app. Perhaps most concerning is the lack of appropriate responses from most of the sampled apps, suggesting they may be offering incorrect or even dangerous information.
However, the situation also has a silver lining. Health apps are still in their early stages and the field is maturing. There is currently more research on health apps than ever before, and those results will guide a new generation of more useful, safe, and valid apps. There is now an increased focus on app safety, evidence base, and usability. Things are beginning to change and going forward can only improve.
Paper => http://content.healthaffairs.org/content/35/12/2310.abstract
News Story on This Study => http://www.cbsnews.com/news/health-apps-smartphone-miss-medical-emergencies/
A new study from Dr. Brennan Spiegel's Cedars-Sinai Center for Outcomes Research and Education investigated patients' willingness to share fitness tracker data with the hospital. The study is notable for its simplicity, scale, and striking results. (http://journals.plos.org/plosone/article?id=10.1371/journal.pone.0165908)
Nearly 80,000 patients were sent patient portal messages inviting them to share and sync their fitness tracker data; Apple HealthKit, Fitbit, and Withings were among the supported platforms. The question: of these roughly 80,000 (79,953 to be exact) people, how many elected to share their fitness tracker data? This was not a hypothetical question but a real-life choice that Dr. Spiegel and colleagues observed and measured.
The answer: four hundred and ninety-nine. Even after accounting for inactive portal users and removing children and other ineligible patients from the sample, there were still 66,015 eligible patients. The final total was 499/66,015, or less than 1 percent (0.75%). Looking at who those 0.75% who did upload their data were, results indicated the strongest predictors were 1) being a healthcare worker and 2) having health insurance.
So what do these results mean? This is the largest study to date examining patients' willingness to share mobile health data with a healthcare system. While not everyone owns a wearable device, estimates that roughly one third of the population has access to one suggest there were likely over 25,000 people in this sample who could have shared their data. With only 499 actually sharing their data, people appear more reticent to share than is often claimed. While this study did not explore why people did not want to share, this is an important topic for further research. Was it fear of data security breaches, encroachment on privacy, or a lack of perceived value in sharing? Are people simply not using their wearables and so have no data to share?
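The headline figures above can be sanity-checked with a few lines of arithmetic. A minimal sketch, using the numbers quoted in this post (the paper itself may round slightly differently):

```python
# Headline numbers from the PLOS ONE portal study, as quoted in this post.
invited = 79_953    # patients sent a portal invitation
eligible = 66_015   # remaining after excluding inactive users, children, etc.
shared = 499        # patients who actually synced fitness tracker data

share_rate = 100 * shared / eligible
# ~0.75-0.76% depending on rounding; either way, well under 1 percent.
print(f"Share rate: {share_rate:.2f}% of eligible patients")

# Rough upper bound on potential sharers, assuming ~1/3 wearable ownership.
wearable_owners = invited // 3
print(f"Estimated wearable owners in the sample: ~{wearable_owners:,}")
```

Even this back-of-the-envelope bound (over 25,000 likely device owners versus 499 sharers) shows the gap is not explained by device access alone.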
The data from this study do, however, tell us something about those who did share their fitness tracker data. The two strongest predictors were being a healthcare worker and having health insurance. The authors suggest that healthcare workers may have heard more about the study, and thus that further education and outreach may be a good route to increase engagement going forward. It is also possible that those working in the healthcare system and with insurance care greatly about their health and are already very engaged in their wellbeing. Again, it is impossible to know more, but this study raises interesting questions. The table below from the paper also explores other variables such as age, race, gender, BMI, and more.
The bottom line is that offering patients the opportunity to share data via a portal is not enough. Building the infrastructure to support data sharing is a necessary but insufficient step. Studies like this are important in providing the large-scale, real-world data we can all use to create more engaging and impactful medical technologies.
The paper is free to access at: http://journals.plos.org/plosone/article?id=10.1371/journal.pone.0165908
...."According to Dr. John Torous, co-director of the digital psychiatry program at Beth Israel Deaconess Medical Center and Harvard Medical School, there are about a dozen online programs on the market using cognitive behavior therapy techniques — for a variety of conditions, including depression — which also have rigorous evidence behind them. “There are maybe ten thousand or so mental health apps out there, and the number is increasing way faster than the evidence base,” Dr. Torous said, “so it’s good to see someone doing careful studies.”
Dr. Torous said the one caveat for all of them is adherence. “When you stop paying people to be in a study, when they stop getting reminder phone calls, they often stop doing it,” he said. “It’s like a gym membership that way; people may do it twice and then let it go.”