Practice Matters
Those Smartphone Health Apps? What You and Your Patients Should Know about the ‘Hidden’ Risks




Experts in health care technology and privacy discuss the challenges of ensuring that data shared in mobile health apps are safe and secure.

As smartphone apps become important tools for medical care — used for everything from tracking seizures and recording headaches to assessing the severity of Parkinson's disease and diagnosing dementia — so too are associated risks and potential pitfalls emerging.

“In the world we live in today, data are the new gold,” said Kirsten Ostherr, PhD, MPH, director of the Medical Futures Lab at Rice University. “The reason we get apps for free is because (the developers) get a lot of data about us. So the question is: What do they want to do with that data?”

Clinicians and patients who are using apps as part of a care plan may want to know the answer to that question, too — for good reason. More than 300,000 health apps were available in 2017; of those, 78,000 new apps had been added to the major app stores in the past year, according to Research2Guidance, a market research firm that has tracked the mobile-health market for eight years.

There are 84,000 health app publishers, and 3.6 billion apps were downloaded last year — but in some ways, the field is still getting started. One indication: The share of app usage among people who have migraine is currently far less than 1 percent, the researchers reported.

Some apps serve purely altruistic purposes, and their developers use the data only for research or clinical care. But many others collect data to make money, whether by selling it to other organizations, targeting advertisements, or performing other functions.

In general, it is nearly impossible for an ordinary app user to tell what data are being collected by a health app and how they are being used, said Eric J. Stieglitz, JD, a technology and privacy lawyer in New York.


In theory, an app's privacy policy would provide that information for users. But, in an analysis of privacy policies of 29 apps used by people with migraine, Stieglitz and his coauthors found a wide range of policies regarding disclosures.

Some reveal they intend to share users' data without providing meaningful details; others claim they will not share data; and some make no privacy claims one way or the other, the authors reported in the July issue of the journal Headache.

Stieglitz's coauthors were Mia Minen, MD, chief of headache research in the division of headache medicine at NYU Langone Medical Center, and John Torous, MD, director of the digital psychiatry division in the department of psychiatry at Beth Israel Deaconess Medical Center.

Dr. Torous, who helped develop a neuropsychiatric research app, is enthusiastic about smartphone-enabled opportunities for clinical care and research. But given that many apps are wholly unregulated, he believes clinicians and patients must become more aware of the risk/benefit ratio.

“People are being asked to give up a lot of information when they're using these apps,” he said. “They may not realize that the price of a free app is often themselves.”


Ipsit Vahia, MD, medical director for McLean Hospital's Institute for Technology in Psychiatry, shares that concern. He, Dr. Torous, and a colleague examined 72 apps being marketed to patients with dementia in an analysis published in the August 2017 issue of the American Journal of Geriatric Psychiatry.

“Inherently people with dementia have less ability to understand policies around privacy and data security, so we thought this might be an especially vulnerable population,” said Dr. Vahia, who is also the medical director of geriatric psychiatry outpatient services at McLean, an affiliate of Harvard Medical School.

Dr. Vahia and his colleagues found that fewer than half — 46 percent — had a privacy policy, and of those apps that did have policies, many were hard to understand.

Only 25 of the dementia apps examined described how they handle individual user data. Of those, 80 percent said they may share data with business partners or third parties; 60 percent said they may record the user's Internet protocol address or unique device identifier; and 52 percent said they may sell data in the event of a merger. (See “How Apps for Dementia Handle Individual User Data.”)

Calling dementia the biggest public health issue in America, as measured by overall cost of care, Dr. Vahia contends technology-based interventions must be a part of the solution. But if smartphone apps are to become widespread tools to improve care and quality of life for people with dementia, privacy protections must be in place, he told Neurology Today.



Stieglitz's review of apps for people with migraine also found a wide range of privacy policies, some of which used “plain English” to explain their privacy promises and some of which were “incomprehensible regarding collection and use of data.”

Having a privacy policy — even one that promises to protect a user's data — is of limited comfort, he said. Although app developers are legally bound to abide by the terms and conditions of a policy that a user has accepted, users have no way of knowing whether developers actually follow their own policies.

“They can promise you everything they want, but if nobody trained the (software) engineers on how to handle the data, it doesn't matter what they put in the policy,” he said. “If they are violating their own policy, they could be subject to legal consequences — if you can find the app developer — but they are still using your data.”

And for what purpose? That is the mystery that worries those who are looking at this issue closely. Dr. Torous pointed out that data from a health app can be pooled with data from other sources to create a user profile that many companies would like to have.

“The app may be collecting a GPS ping so it knows where you live and it knows what your symptoms are,” he said. “Could this one day be used by insurance companies to change your premiums?”


Dr. Minen, assistant professor of neurology and population health at NYU School of Medicine, sees patients' worry about the privacy of their personal health information firsthand.

When they enroll in a research study “and start reading the informed consent (document), that's when they may ask questions,” she said. “Some patients have asked whether the information collected could be provided to their health insurer and so forth.”

The answer is no, of course, and a member of the research team can explain how the personal health information is protected based on institutional review board policies. By contrast, commercial app developers have no such constraints — and, paradoxically, patients are typically unfazed.

“When we're talking with patients in the office about using commercial apps, they don't talk much about privacy and security and don't seem as concerned about it,” Dr. Minen said.

Dr. Ostherr's research underscores that observation. “We as health care consumers have really different and, in fact, contradictory attitudes about where and with whom we share our personal health information,” she said.

Through interviews with members of the general public, her research team found that individuals do not perceive the information they share on apps as particularly sensitive — even though it is — and in general are comfortable sharing it with corporations for vaguely understood purposes. They are less comfortable sharing information in research settings in which they feel individually identified and are asked to consciously agree to having their personal information collected.



“To me, this is a way of thinking about data privacy and security and trust that we're just starting to grapple with in society at large,” she said. “We really need to have a much more robust conversation about this, as more and more technology companies get into the health care space.”


None of the individuals interviewed for this article suggest that patients should avoid using health apps or that clinicians should advise against them. Rather, they encourage physicians to take a lead role in making apps safe for patients to use. The first step is becoming aware of the risks and benefits associated with health apps.



“If they were to recommend one, or even if they were to learn of a patient that was using an app for some kind of personal task, it is important for clinicians to raise the issue of privacy and security,” Dr. Vahia said. “In this day and age, for a physician to not discuss issues around privacy and security for an app would be akin to prescribing a medication without discussing its risks or side effects.”

Before recommending an app to their patients, physicians should use it, read the privacy policies, and become familiar with the content, Dr. Torous said.

“They can have two or three apps that they are comfortable with and keep up-to-date with and feel happy recommending to patients,” he said.

Beyond that, apps for some uses are quite easy to create, he said, and medical organizations might create their own apps so their patients don't have to risk misuse of personal data.

“In parts of the medical community, we can build these devices ourselves in a safe and secure manner, where we treat the data just like we treat patient data,” he said.

Dr. Ostherr suggests that medical societies could develop condition-specific apps in adherence with their guidelines — and in compliance with the Health Insurance Portability and Accountability Act so patient data could be linked to patients' electronic medical records.

The Consumer Technology Association and Xcertia, a joint mHealth collaborative effort — started by the American Medical Association, the American Heart Association, DHX Group, and the Healthcare Information and Management Systems Society — are in fact working together to develop guidelines that aim to assess the quality, safety, and effectiveness of mHealth apps in the key areas of operability, privacy, security, and content.

Dr. Minen wants to see research that helps patients and clinicians understand how app developers are using data.

“It's important for patients and consumers to understand why these (commercial) apps are created,” she said. “Some of the companies may be interested in helping patients, but, in terms of transparency, it would be good to see who is funding them and what happens with the revenue.”


• Rosenfeld L, Torous J, Vahia I. Data security and privacy in apps for dementia: An analysis of existing privacy policies. Am J Geriatr Psychiatry 2017; 25(8): 873–877.
• Ostherr K, Borodina S, Conrad R, et al. Trust and privacy in the context of user-generated health data. Big Data & Society 2017; January–June 2017.
• Minen MT, Stieglitz EJ, Sciortino, et al. Privacy issues in smartphone applications: An analysis of headache/migraine applications. Headache 2018; 58(7): 1014–1027.
• Research2Guidance. mHealth app economics 2017: Current status and future trends in mobile health. How digital intruders are taking over the healthcare market. November 2017.
• Research2Guidance. mHealth Developer Economics: How mHealth app publishers are monetizing their apps. March 2018.