Imagine you have a highly sensitive medical condition that you want, or need, to keep secret. Maybe you’ve been diagnosed with HIV, or you’re trying to kick an opioid addiction.
Desperate to get some advice or talk to a kindred spirit, you bare your soul in a Facebook support group for people with your health problem.
But what if your membership in that Facebook group, which you assumed was confidential, wasn’t actually private?
And what if marketers could easily learn about your diagnosis and your name, email address, location and other identifying information?
Andrea Downing, a tech project manager and breast cancer advocate, has spent the past two years trying to tell the world about this alarming prospect.
Downing is an administrator for a private Facebook group helping women who have a gene mutation that puts them at risk for breast and ovarian cancer.
In 2018, she began to worry that leaks of personal data such as the Cambridge Analytica scandal, which affected up to 87 million Facebook users, could happen in the health sphere.
“There is much more wrong here than is being reported,” she remembers thinking. “I kept expecting others to be on top of that and nobody was.”
Downing thought there could be a similar risk for the women in her BRCA Sisterhood group who shared deeply sensitive information, including pictures of their mastectomies. Because their group was classified on Facebook as closed, members’ personal information was supposed to only be visible to other members.
Downing called a cybersecurity researcher named Fred Trotter, who says he confirmed her suspicion. Trotter said he found a loophole in the privacy settings for closed Facebook groups that would allow developers, marketers and others to download the membership lists of Facebook groups for thousands of diseases and conditions, from Alcoholics Anonymous to survivors of sexual assault.
Trotter said that without more information, it’s difficult to prove whether a third party developer exploited the alleged vulnerability.
“In less than an hour, I had extremely personal information that could be used against these women,” Trotter told CNN. “The kinds of things that they don’t tell their husbands about in some cases.”
They filed a complaint about Facebook with the FTC
Trotter believes Downing’s discovery had the potential for a leak “probably several orders of magnitude larger than Cambridge Analytica.”
In an interview, he said that because the vulnerability would have been present for all Facebook groups labeled “closed,” it would have affected far more people than that scandal, in which the Cambridge Analytica political consulting firm obtained the personal data of millions of Americans.
Further, Trotter argued that the alleged vulnerability might be worse due to the high value of healthcare data to companies, and the high potential for malicious actors to use sensitive information for illicit purposes.
To be clear, Trotter and Downing do not point to a specific smoking gun of a third party stealing and selling health data that users shared on Facebook at mass scale.
But they do allege that users’ identifiable information related to specific medical diagnoses could have been accessible for a period of years by those with Facebook developer accounts.
Trotter and Downing are still concerned about this, even though they say the alleged health data vulnerability was closed in 2018 when Facebook changed its settings. Facebook told The Verge in July 2018, “While we recently made a change to closed groups, there was not a privacy loophole.” A Facebook spokeswoman acknowledged to CNN that web developers did have access to membership lists for all closed groups before the fix.
Facebook says that simply being a member of a closed health group doesn’t constitute a health disclosure, and that it’s investing in ways to give its users clearer information about group privacy settings, particularly with regard to health groups.
Downing and Trotter have filed a complaint with the Federal Trade Commission, arguing that Facebook had an obligation to protect membership lists for health groups and that it failed to disclose this alleged vulnerability to its users.
If the FTC found that Facebook violated its health rules, the complaint could put Facebook on the hook for billions in potential fines.
It also raises troubling questions about the security of users’ personal health information on the social platform — and beyond.
It all started because Downing wanted to help women at risk for cancer
In a way, Downing’s journey to becoming a health privacy crusader began many years ago — when she was three years old and her mother was diagnosed with a hereditary cancer.
“Many of my earliest memories were not knowing whether my mom would live or die,” she said.
Her mother survived.
In 2004, after graduating from the University of Texas at Austin, Downing moved to San Francisco and took a job at Salesforce, a cloud computing service.
When she was 25, she learned she had a mutation on her BRCA1 gene that gave her up to an 87% chance of developing breast cancer and up to a 60% chance of developing ovarian cancer.
The news stunned her. Suddenly she was hurtling toward the same disease that killed her great-grandmother and grandmother and which nearly took her mother.
Downing turned to the internet to find others with the same gene mutation. In 2012 she became an administrator for a Facebook group called BRCA Sisterhood, moderating a forum that offered advice and encouragement to more than 10,000 women diagnosed with life-threatening BRCA mutations.
In those intimate communities, women shared their suffering and exchanged information that could save lives.
“These women were the only ones who were there when the healthcare system didn’t work for me,” Downing said. “We have a shared identity, and it’s an important part of me.”
But in early 2018, troubling revelations came to light that the data firm Cambridge Analytica had used stolen data from millions of Facebook users to create highly targeted political ads during the 2016 presidential campaign. After that scandal broke, Downing started questioning whether her group really was a safe space.
Downing discovered it was possible to use a browser extension to download the names, employers, locations and email addresses of all the women in her closed Facebook group. That browser extension has since been discontinued.
Then she called Trotter, whom she had met through biotech circles, and asked to tap into his expertise. Trotter had founded a health tech company, CareSet Systems, and had done cybersecurity work for the Air Force.
Trotter told her that he, too, could easily download the membership list of BRCA Sisterhood along with the same identifying information Downing found. That meant the privacy of members of every closed group on Facebook — not just health groups — was potentially vulnerable.
“I sell healthcare data for a living,” Trotter said. “If this were ethical, I’d be doing it.”
“Anybody without consent could scrape these lists,” said Downing, who feared marketers and insurance companies could exploit personal health information that users had posted in what they assumed was a safe space.
She and Trotter also feared the potential dangers could transcend medical groups. For example, a government could potentially identify members of a closed LGBT Facebook group in a country where homosexuality was criminalized.
They alerted Facebook to what they had found
On May 29, 2018, Downing and Trotter filed a 30-page vulnerability report to Facebook.
“We have found a vulnerability in Facebook’s Group system. This vulnerability could be exploited in a manner that results in the loss of life,” they wrote, spelling out numerous ways in which hostile actors might use the information to potentially blackmail those with stigmatized conditions or a hate group might target a persecuted religious minority.
Trotter said he was amazed cybersecurity experts hadn’t spotted the flaw sooner.
“There are many who have formal certifications in this (field) who missed it for over a decade,” he said.
After some initial skepticism, Trotter says he’s now spent countless weekends working with Downing to diagnose the problem and call for solutions.
But Facebook wrote the pair back and argued that what Downing had found wasn’t a vulnerability.
“We do not consider it to be a privacy or security concern in the product,” the Facebook representative told them.
The rep also told them that “people who are concerned about their membership in groups being seen by others are able to simply not use the Groups product, or can limit their usage to secret groups.”
“Some people may have legitimate reasons to want to create groups which have different feature-sets than the functionality we provide today,” the representative added. “If people are confused about the privacy of their membership in a group that is an unfortunate situation and we are committed to finding ways to reduce such occurrences.”
Six months later, Trotter and a lawyer named David Harlow submitted a 43-page complaint to the Federal Trade Commission, arguing that Facebook hadn’t properly protected health data in its private groups.
Co-signing their complaint was Matt Might, a senior lecturer in biomedical informatics at Harvard who had worked as an adviser on President Obama’s Precision Medicine Initiative and who has a son with a rare genetic condition.
In the December 2018 complaint, the group asserted that Facebook had an obligation to disclose the alleged breach of personal health information in its closed health groups but had failed to do so. Such Facebook groups constitute a “Personal Health Record” under the FTC’s definition, they argued, and therefore should be regulated according to strict confidentiality requirements.
The complaint acknowledged that some of the alleged privacy flaws had been resolved by changes Facebook had recently made to its platform, but that those didn’t fully excuse or fix the problems.
“We’ve been doing the digital equivalent of performing CPR,” Trotter said. “There’s a certain moral obligation when you’re first on the scene. We’re continually waiting for the cavalry to arrive.”
The complaint also alleged Facebook allowed third parties to access the membership lists of private health communities, a move that in some cases “was done contrary to the specific privacy decisions made by Facebook users.”
It also claimed Facebook broke FTC rules by explaining privacy policies to group users in an “unfair, deceptive and misleading way.”
Facebook has not issued a formal response to Trotter’s FTC complaint, and a Facebook spokeswoman declined CNN’s request for comment on the complaint.
A spokeswoman for the FTC told CNN that the agency couldn’t comment on its investigations.
She said the agency was “satisfied” with the settlement reached with Facebook in July after its year-long investigation, but noted that “if new information comes to light regarding new violations, the order does not stop us from looking into that information and taking appropriate action.”
Facebook has encouraged its users to join groups
One of Facebook CEO Mark Zuckerberg’s long-stated strategies has been to entice hundreds of millions of users into what the platform calls “meaningful” groups, built around common experiences or interests. Those could be for new parents, dog lovers or people with health issues.
“If you’re diagnosed with a rare disease, you can join a group and connect with people with that condition all around the world so you’re not alone,” Zuckerberg said in a 2017 speech to the Facebook Community Summit.
It’s not hard to imagine how support from Facebook health groups might save lives. Let’s say you’re diagnosed with a rare form of cancer and you join a Facebook health group to get more information. Another member could point you to a cutting-edge clinical trial that even your doctor didn’t know about.
But are those Facebook conversations with fellow patients truly private?
Facebook has historically sorted the groups its users can join into three tiers of privacy: open, closed, and secret.
Anyone could see or join an “open” group. Facebook’s users could see that a “closed” group existed, but couldn’t see its posts or join without an invitation. And “secret” groups weren’t visible to users at all until they accepted an admin’s invitation to join. (In 2018, Facebook changed the way it names groups, with “closed” becoming “private visible” and “secret” becoming “private hidden.”)
Dr. Roni Zeiger, who leads Facebook’s health strategy, told CNN in a phone interview that the platform is focused on adding privacy features to its health groups.
He wouldn’t say exactly how many patients have joined closed or secret health groups on Facebook, but said every day “millions of people” were using the platform to connect with others across “tens of thousands” of health groups, with both public and private settings.
“We’ve been speaking to members and admins as well as healthcare providers to figure out how to better support these groups,” Zeiger said. “The consensus is they could be better supported with better privacy.”
He said much of Facebook Health’s work has been motivated by the “useful” feedback that Downing and others have provided. He also highlighted a tweak Facebook announced this spring that will allow users in health groups to contact a group admin, who can then post a question to the group anonymously on their behalf.
However, Zeiger declined to answer specific questions about the alleged vulnerability. He also did not answer CNN’s follow-up email questions about it.
Instead, a Facebook spokeswoman pointed CNN to Facebook’s terms of service governing automated data collection, or “scraping.” Facebook’s policies bar developers from scraping unless previously approved by the platform, and state that third-party entities aren’t allowed to sell any data they obtain that way.
The social network has taken steps to tighten users’ privacy
Downing says it’s unclear whether data from BRCA Sisterhood or other Facebook health groups was acquired and sold to outside parties. Facebook didn’t respond to a request for comment on whether the data was sold.
In April 2018, Facebook restricted access to the application programming interface, or API, for groups, limiting outside developers’ ability to build software that interacts with them. As part of those changes, apps could no longer access the member list of a group, and needed permission from both Facebook and a group admin to access content in closed and secret groups.
This meant the hole Downing and Trotter believed they found was effectively plugged.
Facebook also changed its privacy settings for closed groups in 2018, including a step that allowed groups with more than 5,000 members to change their settings from “secret” to “closed.”
Facebook says the changes boost privacy for members of those groups.
But Facebook didn’t follow federal reporting procedures for a data breach, which require entities that collect people’s health information to notify the FTC, all affected Americans and in some cases, the media, within 60 days of the breach being discovered, according to the FTC website. If a company fails to do that, FTC rules provide for penalties of up to $43,280 per violation.
For its part, Facebook doesn’t acknowledge that group membership constitutes the type of health disclosure that would trigger a need to notify the FTC if a breach occurred.
Eric Perakslis, a former chief information officer for the FDA and a faculty member at Harvard, says he believes there is merit to Downing and Trotter’s concerns.
“When Andrea and others got on Facebook, there was a clear and obvious upfront benefit,” he said, adding that it wasn’t until later that most people realized they might be putting themselves at risk.
For those who are ill and desperately seeking answers and reassurance, an online social network can feel like a vital lifeline.
“If your child is sick, you actually don’t care if you get hacked,” Perakslis said.
But because those health groups could be vulnerable to exploitation, he said that the smartest course for Facebook might be simply to drop medical groups.
“That would be an instantaneous way to mitigate risk,” he said. “Any network could spin out something else.”
But Facebook has increasingly come under fire
Last April Zuckerberg announced Facebook was placing new emphasis on privacy principles, including encryption, secure data storage and private interactions.
“I know we don’t exactly have the strongest reputation on privacy right now, to put it lightly,” he said. “I am committed to doing this well.”
But by mid-2019, it seemed like everyone was turning up the heat on Facebook over its privacy issues.
In July, based on revelations that came to light during the Cambridge Analytica scandal, the Federal Trade Commission levied a $5 billion fine on Facebook. The fine, which the commission said “resolves all consumer-protection claims known by the FTC prior to June 12, 2019,” was by far the largest in the regulator’s history.
In October, New York Attorney General Letitia James announced that attorneys general for 47 states were supporting a multistate antitrust investigation into Facebook, alleging the company “may have put consumer data at risk.”
On October 15, after the Electronic Privacy Information Center sued to oppose the FTC’s settlement deal as “not adequate,” the Public Citizen Litigation Group, a public-interest law firm that advocates for consumers, filed an amicus brief citing Downing and Trotter’s allegations.
“Exploitation of health or similarly sensitive data can put consumers at risk of privacy invasions and other harms,” the brief said. “Marketers and scammers can take advantage of patients and caregivers, preying on their vulnerabilities for profit or harassment.”
In the brief, lawyers argued that the $5 billion fine against Facebook may be inadequate and doesn’t take into account the full range of violations that occurred.
In a responsive filing, the federal government argued that the $5 billion settlement did address health-related missteps Facebook may have committed, and requires the social media platform “to prepare Privacy Review Statements” for new or modified products or services that involve health information.
On November 5, Facebook disclosed it had found and fixed a problem that had allowed some of its app partners to continue to access group members’ information, like names and profile pictures, even after the changes made in April 2018. Downing and Trotter say that problem was similar but not the same as the alleged vulnerability they found.
“Although we’ve seen no evidence of abuse, we will ask them to delete any member data they may have retained and we will conduct audits to confirm that it has been deleted,” said Konstantinos Papamiltiadis, director of platform partnerships at Facebook.
That directive was given to “roughly 100 partners who may have accessed this information,” Facebook says, noting that the number of partners who actually acquired that data was smaller.
The next day, California Attorney General Xavier Becerra announced his office was investigating Facebook for privacy violations.
Trotter said these types of actions are necessary to rein in a company which has historically valued rapid growth over its users’ privacy.
“Facebook only restricts privacy after they are being criticized publicly,” he said. “Once there is a catastrophe, then they apologize.”
Downing is helping chart the future of online patient support
Today Downing and her husband, a scientist, live in the Bay Area with their 4-year-old son.
For the last year and a half she has opted out of full-time paid work in Silicon Valley, focusing her energy and skills instead on forming the Light Collective, a nonprofit that helps peer support groups foster safe human connections on the internet.
Their ultimate goal is to move her health group, BRCA Sisterhood, off Facebook. But because the group, with more than 10,000 members, is so large and Facebook is such a social monopoly, that’s nearly an impossible feat. The group’s decade of shared history is stuck on the platform. They feel they’re trapped, she says.
The primary vulnerability that Trotter and Downing identified is patched, the two say. It is no longer possible for marketers or others to download the membership lists of private health groups on Facebook, and it’s unclear the degree to which third parties could have exploited the alleged vulnerability, if they did at all.
But the fact that the vulnerability ever existed — and that members’ data could have been breached — remains a significant concern, Trotter says.
“It’s over,” he said. “We’ve already lost.”
In a September 2019 blog post, Trotter wrote an update about remaining privacy risks for members of Facebook groups. One, he argued, is that although non-members cannot see groups’ membership lists, members of private groups can. And that could create an incentive for malicious actors to use fake user accounts to join private health groups and harvest data.
Facebook declined to comment on Trotter’s latest allegation. But in a report it released in November, Facebook said it works to crack down on fake users and accounts “that seek to cause harm” and has found “many of these fake accounts (which) are used in spam campaigns and are financially motivated.”
The report said Facebook is improving its ability to identify and block fake accounts.
Still, given their ongoing concerns, Downing and her group feel they have no choice but to migrate to another platform that’s more private.
In early November, the BRCA Sisterhood began a fundraising effort to help build their post-Facebook future.
Their goal is to “declare independence, protect privacy, and develop rights to self-govern the data and content that they generate.”
One way they plan to do that, Downing says, is to build a “data trust,” similar to how public land and pension funds have been managed for centuries. She hopes BRCA Sisterhood can use it to secede from Facebook and provide a model for other patient communities.
“When I started all of this, I really just wanted to know that all my choices were my own. We need to have the rights to our own data,” she told CNN. “It’s about knowing that what I share is used for my benefit and not against me.”
Almost two years after they discovered the alleged breach, Downing and Trotter are still fighting.