Better World/Better Business

The Value of Your Personal Health Data Depends on Who’s Asking

Tori Orr
June 28, 2022

It used to be common knowledge that if you needed a little extra cash in the United States, you could sell your own blood plasma. There were limitations, of course. You needed to meet certain health standards, share your Social Security number, receive a blood test, and undergo a physical exam by a medical professional prior to the donation. If you cleared those hurdles, you could generally make between $30 and $60 per session, once per month. The more often you did it, the more likely you were to become anemic, but it was no riskier than donating blood to the American Red Cross. Most private centers would then deposit your payments onto a debit card that could be used just like your other payment cards.

Doing this wasn’t considered risky behavior, but most people realized it also wasn’t a big money-making activity. In contrast, selling personal health data seems genuinely riskier, for both the individual and the company trying to extract value from that data. The risks are different, but the trade is being considered all the same.

The Risks of Sharing Health Data

As highlighted in a recent More Intelligent Tomorrow podcast episode, there are many different types of health data. The podcast notes that

“…lab data is different from wearable data, which is different from data from your doctor, which is different from data from your dentist, and so on. All your data needs to be extracted and keyed in such a way it can be cross-referenced. But putting it all together isn’t enough. You must also think about the patient’s experience.”

No matter who is interested (or invested) in your personal data, behavioral science tells us we all tend to judge potentially harmful actions as worse than harmful inactions. For example, how do you feel about the risk of sharing the dates of a woman’s menstrual cycle out of context? This is the kind of data that could be sold or shared between companies when women track those details on their personal wellness apps or fitness devices. With legal decisions changing every decade, how are companies supposed to protect a woman’s right to privacy ethically, while still aggregating data meant to help women overall? And yet, this is the primary ethical question. Is doing nothing, or even preventing reuse, worse than taking action that could return a human benefit, monetary or otherwise?

There is no denying that companies are eager to monetize AI in a huge variety of applications.

Profitable machine learning systems are not primarily framed by principled ethics, but by economic logic. That is what business does. As consumers, we have almost no idea what personal risks we incur when voluntarily contributing our information to a third party. And yet, perhaps the specific objective of an ethics of selling personal health data should not be to stifle personal behavior, but the precise opposite: to reveal blind spots, promote autonomy, and foster personal responsibility.

The Changing Landscape of Data Brokering

Starting around 2019, thanks to a groundbreaking Vermont legal decision that exposed over 120 data brokers selling consumer data, there was growing public awareness of the data mining going on within all industries, including healthcare. There are countless articles educating the public on how to opt out of, or file lawsuits against, data mining companies that use their personal data without consent or without proper protections in place. After all, if it is your data, why should a business profit from it without compensating you or, at the very least, ensuring your privacy and safety should some of it escape their control?

This idea—brokering your personal health information rather than you voluntarily contributing it—is gaining even greater interest within the medical and health insurance sectors that are using everyday consumer information to predict health risks. Despite all this brokering, health data promises great things for a healthier and happier life for everyone. The opportunity to improve health outcomes and reduce inequalities lies in the value of data that’s collected on all of us as we interact with health services and with the digital world as part of our daily lives. As noted by Rob O’Neill, DataRobot’s Field CTO for the Healthcare sector, herein lies the challenge,

“Health data is often collected at an individual level, but value is usually only delivered following some sort of data aggregation, analysis, and action.”

That includes determining the data’s accuracy and annotating the context in which it was gathered.

O’Neill stresses that the challenges faced by citizens, health delivery organizations, and health-data private companies come from the legal frameworks that underpin data protection and governance. It is exactly the complexity of regulation and policy that can cause conflict and misunderstanding between individuals and organizations seeking to unlock the true value of health data.

To unlock the value of health data, we need a balanced and pragmatic approach to regulated data sharing, one that empowers the individual to choose how and when their data is used, while giving businesses incentives to improve the quality of data collected throughout the healthcare ecosystem. Machine learning can support accurate diagnoses, resulting in better treatments and greater cost effectiveness. In fact, AI tools can be harnessed to offer innovative solutions that positively impact the disadvantaged, hard-to-reach, and vulnerable segments of our society.

If we don’t keep the positives in mind, our default is to imagine all the undesirable side effects and ethical consequences, such as increased discrimination, mechanized death, and genetic, social, behavioral, or technological selection. Everyone is aware that the medical industry still lacks the long-term experience to reliably rule out the things that could go wrong. Few ethicists are considering the issue of selling or brokering health data, and scientists are alarmed at rising cynicism toward scientific research and medical consensus. The public often remains unconvinced by many scientific findings, and similar skepticism appears within the research community itself. The belief is that paying you for your personal data would only exacerbate the divide between open science norms and heavily gated communities.

Yet the dilemma remains: Is doing nothing inherently riskier than acknowledging and taking on these challenges?

One alternative, outlined by The Medical Futurist in a compelling article about the dystopian future of paying for healthcare by crowdfunding, shows how money from outside the health industry complicates

“…the desperate state of a healthcare system where victims of terrible illnesses have to ‘commodify’ themselves on online donation forums.”

They highlight how desperation over the rising costs of healthcare is creating a system of “organized begging,” in which people without access, such as minorities, migrants, and those who lack resources the most, are pushed into alternative means of paying for disasters outside their control. The costs of diseases like cancer and genetic disorders are being shifted to charity funding, rather than prompting us to ask whether our health should ever be outsourced in that way. Yet when we claim this method of data collection is unethical and should be banned, we deny the opportunity to people who may need that money, bringing us right back to the reasons many people sold blood plasma in the first place.

Can We Collect Health Data in an Ethical Way?

Health data is already being gathered in exchange for insurance offsets. The wearable device market is predicted to grow by 1.68 billion between now and 2024. There are no global policies, frameworks, or regulations in place for these apps, and although HIPAA does protect the information shared between you and your physician, health apps and the data you share through services like genome sequencing kits or at-home microbiome tests are not only subject to serious security risks, but also marketed through brokerages to anyone with the money to exploit them.

This phase could be an opportunity to educate citizens and lawmakers about the various levels of risk and reward, enabling a more data-literate society that understands how health data is collected and used. Together we could examine the ethical dimensions of how that collection is undertaken, leading to a population better equipped to make informed decisions about how, when, and for what purpose or benefit their health data is captured and utilized.

As technology empowers patients, it makes clear the need for equal-level partnerships. The #wearenotwaiting movement, driven by people with diabetes, is a good example of how this could work. Patient-first design can reshape policy and regulation around the specific needs of real people who, as a community, willingly pool their data in the interests of their own health, without interference from the insurance industry.

Every effort should be made to avoid creating medical eavesdroppers who want to watch us with access to the minute details of our lifestyles and medical decisions. Policy should be implemented with regulatory oversight so that third parties don’t get uncomfortably close to data about personal health decisions. In such a controlled landscape, the option to share data with a health insurer to motivate a healthy lifestyle should include anonymized access to personalized care. The resulting collaboration could be one example of what it means to make good, or even slightly risky, healthcare decisions that truly promote autonomy and foster personal responsibility.

Tori Orr
AI Ethics Communications
