In the previous two installments of this blog series we discussed the importance of protecting health information and how new technologies, on one hand, enable easy gathering and sharing of information but, on the other, create new challenges on how to protect that data.
In this post, we review what consumers should understand about their digital health data and what they can do to protect it and prevent misuse. Typically, the merits of collecting health and wellness information, and sharing such data as appropriate, will outweigh the associated risks. But we do need to make sure we understand the implications and work to minimize them.
Data Is Everywhere
The simple underlying questions are: who owns your health data, what commercial interest do others have in it and who is responsible for protecting it?
The answer to such questions used to be simple. Health data was owned by you (or your caregiver) – the “little book” your parents used to keep with your childhood diseases and records of broken bones and other mishaps – or by your care provider (doctor or hospital). In fact, the most comprehensive collection of health data was probably with your family physician, as he or she would be the focal point of all care decisions and the associated information.
Not so anymore. As medicine progressed and we collected more critical data, the need for centralized record keeping evolved. The first attempt to assemble health information independent of who provided the service was the universal immunization record. But we moved beyond that, and what used to be our “little book” of health is now entered into a personal health record (PHR) or patient portal that is linked to the hospital’s electronic health record (EHR) system. In addition, technical and scientific progress enabled us to collect data that we never considered the need to track before – for example, through our fitness trackers or smart watches.
The result is that we now have more information in more places that allows for uses we never imagined before – or we may not even be aware of. Technology, too, is advancing. Thus, insights and findings we are deriving from this type of information today may be vastly different (and perhaps more significant) in the future. About 20 years ago, we never thought that the human genome could be decoded. Now we see genetic information (and the resulting medical benefits) in a different light, and the price of a basic genetic test is so low that it even makes a nice birthday present.
Free Often Has a Price
Let’s face it – if a company provides “free” apps or relatively low-priced genetic testing services, it has to make money somewhere. And, unfortunately, this usually means the use of your health data and secondary uses of such data (including, but not limited to, commercial use).
Take, for example, a fitness tracking device or jogging app on your smartphone. Clearly, these technologies are a great way to monitor, analyze and improve your exercise habits. You can map your running or biking route and get information on distance, time, elevation and calories burned. But, if not properly secured, such an app also tracks your location, allowing a stalker to follow you or a burglar to know when you are not at home. There have also been reports of legitimate secondary uses of such applications, such as use by law enforcement.
Do we make the effort to fully understand where and how such data is stored, how it is protected and, most importantly, how it is being analyzed and shared? Does the device or app maker provide a clear, easy-to-understand privacy notice, and does it allow you to opt out of any tracking or sharing of your data for any purpose other than the intended one? Does the device or app maker really want you to understand what you are agreeing to and what it is doing with your data?
Similar concerns exist with genetic testing. One reason these tests are now so affordable (and so heavily advertised) is that secondary uses of the data are quite lucrative. But you, as the consumer, may not be aware of this, as it may be buried layers deep in fine-print legal terms. Indeed, depending upon the jurisdiction you live in and the applicability of specific laws and regulations, there may be a danger of waiving your privacy rights to genetic information derived from such testing. (As always, if you have legal questions, seek qualified legal counsel in the relevant jurisdiction.)
Data may, in some cases, be anonymized prior to sharing – meaning that identifying information like name, address and date of birth has been removed and replaced by anonymous identifiers. After that, the information can be freely sold to commercial entities for marketing or research purposes; pharmaceutical companies have reportedly signed million-dollar contracts for access to genetic databases. However, a residual risk remains: researchers have demonstrated that so-called “anonymized” data can be re-identified in certain instances (e.g., “Broken Promises of Privacy: Responding to the Surprising Failure of Anonymization” or “Re-identification of DNA through an automated linkage process”).
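To make the linkage attacks described in that research concrete, here is a minimal, hypothetical sketch – the datasets, names and field values below are invented for illustration. An “anonymized” record set that still carries quasi-identifiers (ZIP code, birth date, sex) is simply joined against a public roster, such as a voter list, that has names attached:

```python
# Hypothetical illustration of a linkage attack: names were removed from
# the "anonymized" dataset, but quasi-identifiers (ZIP, birth date, sex)
# were retained and can be matched against a public record.

anonymized = [
    {"id": "A1", "zip": "02139", "dob": "1985-07-04", "sex": "F", "diagnosis": "..."},
    {"id": "A2", "zip": "02139", "dob": "1990-01-15", "sex": "M", "diagnosis": "..."},
]

public_roster = [  # e.g., a voter list with names attached
    {"name": "Jane Doe", "zip": "02139", "dob": "1985-07-04", "sex": "F"},
]

def reidentify(anon, roster):
    """Match records whose quasi-identifiers coincide."""
    matches = []
    for a in anon:
        for p in roster:
            if (a["zip"], a["dob"], a["sex"]) == (p["zip"], p["dob"], p["sex"]):
                matches.append((a["id"], p["name"]))
    return matches

print(reidentify(anonymized, public_roster))  # [('A1', 'Jane Doe')]
```

With just three shared attributes, record “A1” – and its sensitive diagnosis – is tied back to a named person; real attacks scale the same idea to millions of records.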
Consumers should be concerned and make a specific effort to understand:
- Are the types of data collected and the times and conditions of collection clearly defined?
- Does the company provide a clear explanation of how your data is being used and how it is being shared or commercialized?
- Where is the company based, which country’s laws will apply and where is your data being held?
- What are your rights in regard to your data?
- What options do you have to restrict the use of your data, often referred to as “opt-out”? (Please note: this may require a higher fee/price or premium level of service.)
- What ability do you have to get your data deleted or corrected?
- Is the data shared and, if so, is it shared internationally?
- What is the policy in regard to legal and law enforcement requests?
- Does the company commit to notifying you in case of a data breach?
- In case of a change in ownership, bankruptcy or other legal entity or business changes, how does ownership, use or protection of your data change?
It’s All in the Genes
One specific area of concern is obviously genetic data that you may have sent to a genetic testing company to learn more about your ancestry – we have all seen the commercials. The global market for genetic tests is expected to reach $10 billion within a decade.
But even if data is anonymized, we as consumers should be cautious. And even where legal protections are in place, their applicability could be limited. For example, the U.S. Genetic Information Nondiscrimination Act (GINA) only prohibits the discriminatory use of genetic data in health insurance and employment; it does not cover areas such as life or disability insurance.
We need to be aware that our genetic data is a representation of us, more so than any other data set – be it an identification number like your Social Security number, your insurance information or even your medical records. It defines your heritage, racial origin, genetic diseases or disease risk, and basic physical traits like eye and hair color. It is a full representation of you.
It turns out, too, that re-identification is surprisingly easy, as researchers have demonstrated. Thus, your genetic (or derived) information could be used against you, as there may be a risk of discrimination. As consumers we should make an effort to understand what we agree to and how our DNA data may be used. This includes the “sharing” of your data with research partners, business partners and affiliates – yet these companies may fail to meet even basic transparency standards and may even share data internationally.
Where Do We Go From Here?
This article is intended to provide a basic discussion and is not written to be a complete and comprehensive review of a quite complex topic. As stated previously, we need to find a balance between using new technologies to our advantage while minimizing privacy risks.
We as consumers should:
- Use new technology wisely and in full understanding of the risks and benefits.
- Make sure we understand the business terms and privacy agreements we are about to accept.
- Understand – and use – options to restrict the use of your data.
- Use common best practices to secure your accounts – for example:
- Don’t share usernames and passwords across accounts
- Use strong passwords
- Use a separate email account to channel unsolicited marketing emails
- Do not link with your social or professional networks
- Be careful with using your home address or real name
- Do not shy away from asking questions, seeking advice or even challenging corporations and lawmakers to improve data protection
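As one concrete example of the “use strong passwords” advice above, a password manager – or even a short script – can generate high-entropy credentials for you. This sketch uses Python’s standard secrets module, which draws from a cryptographically secure random source:

```python
import secrets
import string

def generate_password(length: int = 16) -> str:
    """Generate a random password from letters, digits and punctuation
    using a cryptographically secure random source (the secrets module)."""
    alphabet = string.ascii_letters + string.digits + string.punctuation
    return "".join(secrets.choice(alphabet) for _ in range(length))

print(generate_password(20))  # a different 20-character password every run
```

Generating a unique password like this for every account – and storing it in a password manager rather than reusing it – addresses two items on the list above at once.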
The focus of this blog series was to discuss the privacy risks of new and evolving health and wellness technologies. There are obvious and significant personal and public health benefits of these technologies and it was not our intent to discuss (or dispute) them. We did want to, however, draw focus and attention to the risks and complexities of data privacy to assure that we don’t operate under a false sense of security.
Handling profoundly personal data without appropriate protections is a significant risk for a consumer, and it is important that we become empowered consumers, conscious custodians and responsible stewards of our data.
Resources
- TechRepublic – “The Dark Side of Wearables: How They’re Secretly Jeopardizing Your Security and Privacy”
- Business Insider – “Almost Every Fitness Tracker on the Market Leaves Their Users at Risk of ‘Long-Term Tracking of Their Location’”
- Forbes – “The Privacy Delusions of Genetic Testing”
- Federal Trade Commission – “Direct-to-Consumer Genetic Tests”
- Federal Trade Commission – “DNA Test Kits: Consider the Privacy Implications”
- Federal Trade Commission – “Consumer Generated and Controlled Health Data”
- Federal Trade Commission – “Mobile Health Apps Interactive Tool”
- theDataMap – “Documenting All the Places Personal Data Goes”
- HIMSS – “Privacy and Security Awareness Initiative”
About the Authors
Axel Wirth, CPHIMS, CISSP, HCISPP, is a distinguished solutions architect for the U.S. health care industry at Symantec Corporation. He provides strategic vision and technical leadership within Symantec’s health care vertical, serving in a consultative role to health care providers, industry partners and health technology professionals. Drawing from over 30 years of international experience in the industry, Mr. Wirth is supporting Symantec’s health care customers to solve their critical security, privacy, compliance and IT management challenges.
Carrie McGlaughlin, CISM, has worked two decades in health care IT and is the director of information technology and HIPAA security officer at the Buckeye Ranch, a behavior and mental health organization for youth and families.
Bayardo Alvarez, CPHIMS, is the director of information technology for Boston PainCare Center, an interdisciplinary practice focusing on the treatment and research of chronic pain. His responsibilities include overseeing Boston PainCare’s cybersecurity program and compliance. Bayardo has served in the healthcare industry for over a decade and has over 30 years of experience in information technology. He is also a member and chair of the HIMSS Privacy and Security Committee.
Lee Kim, JD, CISSP, CIPP/US, FHIMSS, is the director of privacy and security at HIMSS. In her role, she focuses on education and advocacy related initiatives involving health care information security and privacy. Lee has worked both on the technology and the legal aspects of health IT for over ten years.