As 23andMe faces an uncertain future, who gets to access the DNA data it has collected?
A growing number of tech companies are collecting data that reveals intimate details about their customers. When those companies fail, what happens to those data?
Anyone who’s been keeping an eye on 23andMe will know that the genetic testing company is going through a bit of a rough patch.
In July, CEO Anne Wojcicki submitted a proposal to take the company private. In protest, all seven of its independent directors resigned a couple of weeks ago.
The developments — which come on the heels of declining sales and a damaging data breach — have brought into question the future of the company that pioneered personal genetic testing and analysis.
But if 23andMe goes private, or is sold, what happens to all the DNA data it holds?
As Kristen V. Brown wrote in The Atlantic a few days ago,
“potential buyers [of the company] may have very different ideas about how to use the company’s DNA data to raise the company’s bottom line. This should concern anyone who has used the service.”
23andMe’s privacy policy sets out to protect personal information and limit how it can be used — and customers are always free to have their records deleted.
But these data aren’t covered by medical records regulations — 23andMe isn’t a medical company — and while there are potential legal protections in the US around how collected data may and may not be used, it’s far from clear how challenges would play out in court.
Which means that, if the company is sold, there’s a reasonable chance that this treasure trove of personal genetic data could be used in ways it was never intended for — even in ways that potentially impact individuals who provided 23andMe with samples.
I suspect that some of the more speculative outcomes here — including individuals being profiled based on their DNA in ways that impact access to medical services and even employment — are not that likely, at least in the short term. That said, unless the data are definitively destroyed, there’s always a chance that someone tries to monetize them in ethically questionable ways in the future.
But this case raises a bigger issue: what happens when we give away ever-greater quantities of very intimate data with no concrete guarantee that they won't be used to our disadvantage in the future — especially when the data outlive transient companies, guarantees, and even regulations.
This is a challenge that encompasses a growing number of services and devices that collect health-related data, from Fitbits and smartwatches to health apps. But it also extends to medical devices that download or stream high-value, highly personal data to servers.
There is already a growing number of cases showing how seemingly innocuous data can be used to triangulate and extract potentially harmful inferences — it was only a few years ago that data from a pacemaker were used to help indict someone charged with arson. And the possibilities for extracting value from large datasets have expanded exponentially since then!
The arson incident is something of an isolated case. But as device manufacturers increasingly explore how to extract value from the massive amounts of behavioral and health data they collect, what guarantees are there that, over time, those data won't be used in ways they were never intended for?
This gets even more serious as tech companies like Neuralink get into the business of collecting large amounts of deeply intimate data from patients (in this case, data on brain function and state) — often without the deep-seated ethos of lifelong patient care that many dedicated medical companies have developed over time.
These are data that, to a certain extent, define who we are, how we function and behave, and even how we might respond to different circumstances and stimuli.
As such, they are exceptionally high value to any company that can work out how to use them in creative and innovative ways.
In this respect, the next few months with 23andMe could be an indication of what the future might hold for deeply intimate data that individuals no longer have control over.
Definitely an evolving story worth paying attention to.