At its core, trust is fundamental to healthcare. We share private experiences and fears openly with clinicians knowing that such conversations are confidential and protected. In the U.S., federal laws like HIPAA and HITECH offer safeguards to ensure patients' privacy and confidentiality are respected and maintained. But what do we know about trust in newer digital health tools like smartphone apps aimed at health? A survey from Rock Health, conducted last year, offers some interesting data about willingness to share health data:
The finding that only 8% of respondents were willing to share their health data with technology companies suggests a strong lack of trust. That distrust may be well founded: a recent report from the U.S. Department of Health and Human Services notes that many current mhealth platforms and apps do not offer adequate patient protection and confidentiality. The report outlines five areas of special concern:
Healthcare data is an increasingly valuable target for theft and hacking. A recent industry report from Arxan examined 71 health apps and found that the majority of those sampled were vulnerable to hacking. While it may be easy to gloss over app security, given that most clinicians and patients lack the background and experience to fully evaluate it, security flaws have been the downfall of some of the largest mhealth-related efforts to date. After reports surfaced in October 2015 regarding the lack of security measures in apps on the U.K. National Health Service's app library, the government took the entire app library offline. The rise and fall of the health app rating company Happtique is another example of unrecognized security vulnerabilities leading to the collapse of a digital health company. Ensuring that apps can protect healthcare data requires collaboration with engineers and security experts.
So how do we build trust in digital health platforms? A good start is a dual approach focused on 1) transparency and 2) data security. Each alone is important, but neither alone is sufficient. An app can have the world's best security features yet fail to handle patient data in an ethical manner. Conversely, an app can take a patient-centered approach and give users full control over their data, but suffer from security flaws that effectively make the data public. While many app developers and companies are already following best practices and creating technologies that respect and protect confidentiality, many others are not there yet. For mobile health to reach its full potential and become a frontline tool in clinical care, trust will be critical, and building that trust through both transparency and security will be key. Here at BIDMC we are studying the transparency and ethics associated with mobile health, partnering with local engineering teams to better understand security vulnerabilities, and educating both clinicians and patients about what to look for when picking an app.
The recent release and rapid popularity of Pokemon Go has received much attention in the last week. In simplest terms, it is a smartphone game that requires users to travel around their physical environment to collect various characters. The app features augmented reality, as shown in the screenshots below: to let players see and capture these characters, the app superimposes them on live video from the smartphone's camera. Various social media sources and popular press articles have been quick to point out the potential mental health benefits of Pokemon Go. But what do we actually know about Pokemon Go and mental health?
The simple answer is: very little. This is a new app that people are only just beginning to use. While there is already anecdotal evidence that the app is helping some people be more active and even feel better, we do not yet know more. However, the app is interesting because it demonstrates how willing and excited people are to use augmented reality. There are already many augmented reality apps available in the commercial app stores, many seemingly featuring spiders (see figure below). The potential of augmented reality apps like these for exposure therapies is broad, and research in this area is not new: a 2005 paper discusses using augmented reality to help treat cockroach phobia. While virtual reality has to date received more attention and been the subject of numerous research endeavors, especially for PTSD, it requires additional hardware beyond the phone, namely glasses or goggles. Augmented reality requires nothing beyond the phone and thus is ready to scale to a population level today, as Pokemon Go has demonstrated. But Pokemon Go has also highlighted another potential area of interest for mental health and health in general.
By having users travel to real locations to collect characters, Pokemon Go pushes users to leave their homes, be active, and move. Physical activity remains a first-line recommendation for many, although not all, mental illnesses, with broad benefits for many patients. Therapies such as behavioral activation have been proven effective and in part seek to help people engage more with their environment (although there is more to behavioral activation than simply being active). Helping people be more active brings both physical and mental health benefits.
Thus observing how Pokemon Go uses augmented reality to make people more active is interesting and holds unique potential for health in general, not just mental health alone. But there are also risks to consider and much we simply do not know. Do apps like Pokemon Go cause people to overexert themselves or place themselves in dangerous situations? Do they lead to patterns of regular exercise in the long term, or just a one-time burst? Are there certain people for whom they may be harmful? Early privacy concerns have already been raised and are not yet fully settled. So do apps like Pokemon Go have any mental health benefits? It is too early to tell, but certainly the use of augmented reality and its uptake at a population level is noteworthy.