
Facebook's Role in the Health Data Privacy Crisis


In part one of a series, we examine how health data sharing can go awry on social media.


On July 12, the Federal Trade Commission reached a settlement with Facebook, which agreed to pay $5 billion for data privacy lapses. Data privacy has come under attack in recent years. In the wild west of healthcare data, platforms built after existing protections were written could eventually drive improvements. But other technologies, including Facebook, have already dealt blows to health data privacy.

Patients want their health data to be secure, and healthcare leaders want meaningful ways to protect patients. Those desires could be incompatible with Facebook, which has a long history of handling private data with questionable security and little regard for ethics. And healthcare is starting to notice.

Sens. Lisa Murkowski, a Republican from Alaska, and Amy Klobuchar, a Minnesota Democrat who’s running for president, proposed the Protecting Personal Health Data Act this past June, with the goal of closing gaps in how health data are handled on platforms that didn’t exist when HIPAA was created. To understand the bill, it helps to look first at what health data are out there and how they are treated.

This is the first part in a look at personal health data. Perhaps the attention surrounding health data began with Facebook. That’s why, in analyzing the future of health data protections and threats to patient privacy, I’m starting with the social network. (I reached out to Facebook for comment but didn’t receive a response.)

My Facebook Data

One of the most disturbing aspects of Facebook data privacy became apparent to me in November 2018, when I had a baby. I was so excited to get ads for Nested Bean’s baby sleep sack, which I came to love. (Note: I’m not affiliated with that company, but I wish I were.) Still, pregnancy and childbirth are important to a patient’s health status, and Facebook knew this information about me.

Pregnancy is protected under the Pregnancy Discrimination Act, and some women have lost work or a promotion because they were expecting. When social media platforms collect that type of health information and surface it, there is no regulation over what marketing companies do with that data. An employer could know about my baby and my personal life based on my online digital footprint. A lack of clarity about digital health data could put protected classes at risk.

But here’s the disturbing part: The baby in the ad looked like my baby. The pictures of couples looked like the baby’s father and me. Was this a coincidence? Doubtful.

What does this mean for health data privacy on social media platforms such as Facebook?

Have you ever felt like your Facebook ads knew your personal health information? I remember when I realized they know what my baby and his dad looked like.

It shocked me.

That was interesting.

picture of my 3rd for reference that this baby is my only kiddo with eyebrows. pic.twitter.com/OzVIIZkSZ6

— janae sharp (@CoherenceMed) June 20, 2019

My older children are blonde. As a result, when I saw Facebook ads with models who looked like them, I assumed that everyone saw that generic look. But my youngest baby is part Asian-American, and so were the models in this new ad. How did Facebook know?

The extent of Facebook’s data collection is unclear, just as we don’t know how the company tracks pregnancy and race. Even the efficacy of Facebook’s data efforts has come into question. Women who suffered a miscarriage or whose child was stillborn have said they wished the algorithm could correct itself and stop displaying ads for baby gear.

Facebook collects preferences for each user. Some of those determine which ads we will see — and some seem to determine what the people in our ads look like. I’ve periodically checked mine ever since The New York Times published an article about Facebook predicting political party affiliation. At that point, my interests were in list format, organized by topic. The ads page has gone through different iterations, typically displaying which user ad preferences it collects and enabling users to tailor those ad preferences more closely to their interests. Facebook also has a list of advertisers who have uploaded a given individual’s information. I hadn’t done business with or given my information to all of the companies that had uploaded my information.

When was the last time you looked up which companies had uploaded your information to Facebook for advertising?

I looked today.https://t.co/irKLAd5EHc#healthIT #healthdata #dataprivacy pic.twitter.com/XOzby0eI3h

— janae sharp (@CoherenceMed) July 10, 2019

As of this writing, the ad preferences page consists of rows of stores or topics: broad categories under which individual companies or vendors fall, a departure from the simple list of 2016. As a result, the data Facebook holds on me are less transparent. The data about which companies have obtained or purchased my information for advertising, meanwhile, are very clear.
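For context, it may help to sketch how those advertiser “uploads” generally work. The sketch below, in Python, assumes the hashed-identifier matching approach Facebook describes for its customer-list audiences: the advertiser hashes the emails or phone numbers it already holds and uploads only the hashes, which the platform matches against its own users. The email addresses here are invented for illustration.

```python
import hashlib

def normalize_and_hash(email: str) -> str:
    """Trim, lowercase, then SHA-256 hash an email address."""
    normalized = email.strip().lower()
    return hashlib.sha256(normalized.encode("utf-8")).hexdigest()

# Hypothetical customer list an advertiser might hold after a sale.
customer_emails = ["new.parent@example.com", "expecting.soon@example.com"]

# Only the hashes are uploaded; the platform matches them against
# hashes of its own users' contact information to build an audience.
hashed_upload = [normalize_and_hash(e) for e in customer_emails]
print(hashed_upload)
```

Hashing hides the raw address from the platform, but it does nothing to limit what the match itself reveals, such as the fact that someone recently bought baby gear.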

But what happens when our health data are uploaded for advertising purposes? Apparently, Facebook’s knowledge is not always current, and the platform doesn’t always correct itself, as in the case of the women who lost children and still saw ads for baby gear.

How health data are protected, altered and shared to create our profiles is inconsistent and unregulated on many digital platforms.

Realizing the Data Privacy Gap

Advances in technology have created data security issues that evolve faster than regulations can keep pace. When we share health data publicly, the technology platform’s responsibility is unclear. But the extent to which social platforms should protect health data has been the subject of proposed regulations.

News surrounding data privacy and the U.S. presidential election has exposed gaps in personal data protection and inspired legislators to introduce several bills to protect individuals’ information. The big driver of this push was Cambridge Analytica, which collected information about Facebook users and their friends and sold predictive profiling services that could determine what might motivate an individual to buy. While its effect on the presidential election is up for debate, the publicity highlighted how thoroughly individuals were tracked online. Most consumers did not realize how advertisers were selling and using their data. (In this case, users shared their friend lists, and companies profiled them, hoping to change their behavior.) Data protection was not a given, and individuals had shared information without understanding the implications. The Cambridge Analytica episode allowed individuals and governments to see that no information was really safe on social media, including health information.
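To make the profiling concept concrete, here is a deliberately simplified, hypothetical sketch of how page likes can be turned into a trait score that then guides ad messaging. It illustrates the general technique only; the pages, weights and traits are invented, and this is not a claim about Cambridge Analytica’s actual model.

```python
# Toy illustration of like-based trait scoring. Real psychometric models
# are fit from survey data; every page name and weight here is invented.
TRAIT_WEIGHTS = {
    "openness": {"Modern Art Museum": 0.8, "Sci-Fi Book Club": 0.6},
    "neuroticism": {"Late Night Worry Memes": 0.7, "Sci-Fi Book Club": -0.1},
}

def score_traits(page_likes):
    """Sum the (invented) weights of a user's liked pages for each trait."""
    return {
        trait: sum(weights.get(page, 0.0) for page in page_likes)
        for trait, weights in TRAIT_WEIGHTS.items()
    }

# A hypothetical user's likes; an advertiser could then pick the ad
# framing (reassuring vs. novelty-focused) based on the scores.
print(score_traits(["Sci-Fi Book Club", "Late Night Worry Memes"]))
```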

While the pathway to better understanding and data protection is emerging, several important questions with significant implications for healthcare organizations and patients remain. What does an ideal health data law look like? How should potential health data laws be enforced? And these questions are just the starting point.

The questions surrounding health data privacy are urgent and relevant. While most patients are familiar with the ubiquitous HIPAA laws, which protect their medical information at “covered entities,” more consumers are voluntarily distributing private health information through technology every day. For example, people supply private health data to personal health apps, such as Fitbit, other activity trackers and fertility trackers, without understanding that health data privacy laws do not apply to these nontraditional areas. These apps fall into a gray zone of health data collection because they were not contemplated under the original privacy laws as a traditional “covered entity,” like a hospital system or traditional healthcare provider.

“Establishing a more consistent privacy framework for all health data, regardless of source or medium, can assist in establishing common expectations for both consumers/users and the operators of the devices. If alignment can occur with HIPAA, then the sometimes-convoluted and difficult analyses of when a device or service becomes subject to HIPAA could give way to an understanding that all data deserve and require protection.” — Matthew Fisher, J.D., partner, Mirick O’Connell Attorneys at Law, and chair, Health Law Group.

Marketing Data and Your Health

I’ve often been a proponent of health systems accessing the same marketing data sets available to companies trying to sell us yoga pants and using them to encourage healthy behavior patterns. Ethically, though, it remains an open question what access to health data an entity like Facebook should have.

The Murkowski-Klobuchar bill contains some important suggestions, including greater control over how our health data are used for marketing purposes. The proposal provides for “a process to limit the transfer of personal health data to third parties and provide consumers with greater control over how their personal health data is used for marketing purposes.” I know some of the personal data that Facebook has on me. I know that even though my relationship status is hidden and never updated, marketing companies know that I had a child. These companies know what we look like.

I really like buying cute baby clothes. But I’m not totally comfortable with an online platform knowing so much about my pregnancy in order to sell me things.

Is it safe to collect information about menstruation in order to sell different products? What about information regarding weight gain or loss, all to sell clothing? Some states have privacy laws specific to reproductive health conditions. Period-tracking apps that share data are likely not following the guidelines for those protections, and some of those apps have already shared the data people have tracked, including reproductive data. This type of data should be protected, but social media platforms and new consumer-facing apps face no such regulations because they didn’t exist when the laws were written.

There Is No Health Data Protection

Most people assume that the data they share about their health, wherever they share it, are subject to the same protections that govern the physician’s office. So, what do people need to know about social media?

The data collected on social platforms are not always what users anticipate, and that could apply to health data. Facebook harvests personal health data, including weight and alcohol consumption, from iPhone apps and other mobile apps, according to testing by The Wall Street Journal. I know that my Facebook ads are personalized according to my demographics, and that even the identity of my child’s father shaped the ads I was shown.
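The mechanism behind that harvesting, as described in the reporting, is app analytics: developers embed an SDK that sends “app events,” and some apps included health details users had typed in. Below is a hypothetical Python sketch of what such an event payload might look like; the event name, fields and values are invented for illustration, not taken from any real app.

```python
import json

# Hypothetical example of the kind of custom "app event" described in the
# reporting: a third-party health app logging user-entered data through an
# analytics SDK. The event and field names are invented for illustration.
app_event = {
    "event_name": "weight_logged",      # custom event chosen by the app
    "app_id": "1234567890",             # placeholder app identifier
    "advertiser_tracking_enabled": True,
    "custom_data": {
        "weight_kg": 68.5,              # health detail the user typed in
        "goal": "lose_weight",
    },
}

# Once serialized and sent off the device, the app's privacy policy is
# effectively the only thing governing what happens to this data next.
print(json.dumps(app_event, indent=2))
```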

Even more concerning, Facebook has started to actively promote the sharing of healthcare questions. The company offers a special designation for health groups in which users can ask questions anonymously. Christina Farr reported on this option in April, and it left me wondering: What happens to that health data on the back end?

Facebook has received (deserved) criticism for weak security within secret and closed groups. The company has suffered data breaches, and its users’ data have been misused by firms such as Cambridge Analytica, which used personal data to advertise for the Trump campaign. The personality profiles that Cambridge Analytica sold to its clients could be used to influence behavior. After taking a “personality” quiz through a third-party app, Facebook users were often unaware that their friends lists and behavior were being tracked.

I really miss those personality surveys and learning the identity of my spirit animal and even my best friend. Somehow, knowing that the company used the data to predict behavior makes my “Facebook soulmate” results mean more. Even outside of the health issues that your online behavior can predict, people are unaware of how the health information they volunteer is tracked and could be shared or sold.

According to reporting on the stance of Facebook’s attorneys, users should have no expectation of privacy when using the social platform. “There is no invasion of privacy at all, because there is no privacy,” Facebook attorney Orin Snyder argued in court. The company’s official legal position appears to be that there is no expectation of privacy, which means shared health data could be used for profit.

Attention Is the First Step

The lack of health data privacy created by technologies like Facebook has led many legislators to propose privacy protections. National attention to lapses and ethical issues has inspired the beginnings of changes that could lead to better health data protection. Through its lapses and data collection, Facebook has been fined and publicly battered, and these issues are starting to inspire guidelines to protect consumers. In part 2, we will take a closer look at the proposed Protecting Personal Health Data Act and where it might fall short.
