
Forensic Genealogy Is Neat. Is It Ethical, Though?


After genealogy data was used to make an arrest in the Golden State Killer case, NIH bioethicist Benjamin Berkman had some questions.


Like many others, Benjamin E. Berkman, JD, MPH, became aware of the Golden State Killer case—and the recent suspect arrest made in it—thanks to the true crime enthusiast in his life. He said his wife first informed him that the serial rapist and killer’s identity may have been discovered through an online genealogy database.

He remembers asking, “Hold on, where did they get this data? Did they have permission to use it? How did they use it?” At first, Berkman told Healthcare Analytics News™, the coverage of the finding was “excited” and “breathless,” but skeptical voices quickly emerged.


The National Institutes of Health (NIH) bioethicist joined in that skepticism, teaming with colleagues Wynter K. Miller, JD, and Christine Grady, RN, PhD, to pen an Annals of Internal Medicine commentary that explores the ethics of using online genealogy tools to solve crimes.

The man suspected of terrorizing California communities for decades was pinpointed when comparable DNA—that of a relative—turned up in the database of a small genetics analysis firm called GEDmatch. Only after the fact did GEDmatch issue a privacy statement to those whose information resided in its system. Berkman thinks that companies that collect and store genealogy information should make people aware of the possibility before they submit their DNA, rather than after.

“It’s much worse to hide the ball and have people be surprised than it is to be clear and transparent, up front, in a way that lets people understand and internalize that their data might be used in this way,” he said. “I think it’s important enough that it should have its own discrete section so that people can really focus on it.”

The NIH trio raises a number of salient points and questions throughout the commentary. For instance, although many firms highlight the fact that people’s data might be used in research, there is a clear distinction between research and forensics. Some people might not be comfortable with contributing to their cousin’s arrest.

There are also matters of legality in evidence collection. The commentary raises “the abandonment doctrine,” the idea that one forfeits a “reasonable expectation of privacy” over something that is discarded (like a cigarette butt or a glass of water, from which DNA can be extracted). Does genealogical information in a database constitute abandoned evidence?

Berkman says that, by most legal definitions, it seems to.

“Whether that’s good social policy is a completely different question, but legally people shouldn’t have that expectation of privacy, because the law seems to be clearly going in a way that we think wouldn’t protect these data from police use,” he said.

Other matters elbow into this scenario: employment discrimination based on genetics (of which, the authors say, there is not yet evidence) and criminal discrimination or profiling as a result of genealogy (of which there may be examples). For the latter, they point to a case in Germany, in which a contaminated swab led authorities to wrongly target Romani individuals as suspects for a decade.

Those justice issues, in Berkman’s mind, are more concerning than the informed consent question: Asking or forcing companies to make consumers aware of the potential investigative use of their data is one thing, but removing human error and bias is far more difficult. The authors suggest that those concerns might justify limiting the use of genealogy data to cold cases in which all other investigative avenues have failed.

“I think this is an interesting technique, and I am not at all interested in shooting it down, I just want people to have a conversation about the tradeoffs,” Berkman said. “And maybe we as a society are okay with the potential ramifications. Like with any new technology, let’s be conscious and deliberate before rushing headlong into it.”

Related Coverage:

Holding Public Algorithms Accountable

A New Tool Uses DNA to Predict Eye, Hair, Skin Color

How Did 23andMe Stumble in Its Early Days?
