The genetic testing company 23andMe, which holds the genetic data of 15 million people, filed for bankruptcy on Sunday night after years of financial struggles. That means all of its extremely personal user data could soon be up for sale, and that vast trove of genetic data could draw interest from AI companies looking for data to train their models, experts say.
“Data is the new oil, and this is very high quality oil,” says Subodha Kumar, a professor at the Fox School of Business at Temple University. “With the development of more and more complicated and rigorous algorithms, this is a gold mine for many companies.”
But any AI-related company attempting to acquire 23andMe would run significant reputational risks. Many people are horrified by the idea that they surrendered their genetic data to trace their ancestry, only for it to now be potentially used in ways they never consented to.
“Anybody touching this data is running a risk,” says Kumar, who is the director of Fox’s Center for Business Analytics and Disruptive Technologies. “But at the same time, not touching it, they might be losing out on something big as well.”
Training LLMs
Companies like OpenAI and Google have poured time and resources into making an impact in the medical space, and 23andMe’s data trove could attract interest from large AI companies with the financial means to acquire it. 23andMe was valued at around $48 million this week, down from a peak of $6 billion in 2021.
These companies are striving to build the most powerful general-purpose models possible, which are trained on vast amounts of granular data. But researchers have argued that high-quality data sources are drying up, which makes new and robust sources of information all the more coveted. A TechCrunch survey of venture capitalists earlier this year found that more than half of respondents cited the “quality or rarity of their proprietary data” as the edge that AI startups have over their competitors.
“I think it could be a really valuable data set for some of the big AI companies because it represents this ground truth data of actual genetic data,” Anna Kazlauskas, CEO of Open Data Labs and the creator of Vana, a network for user-owned data, says of 23andMe. “Some of the human errors that can exist in bio publications, you could avoid.”
Kumar says that 23andMe’s data could be especially valuable to companies in their push toward agentic AI, or AIs that can carry out tasks without human involvement, whether in medical research or corporate decision-making.
“The whole goal of agentic AI models has been a modular approach: you crack the smaller pieces of the problem and then you put them together,” he says.
Representatives for Google and OpenAI did not immediately respond to requests for comment.
Industry-Based Value
23andMe’s data could be valuable across different industries that use AI to sort through vast amounts of information, first and foremost medical research.
23andMe already had agreements in place with pharmaceutical companies such as GlaxoSmithKline, which tapped into the company’s data sets in the hopes of developing new treatments for disease. Kumar says that at Temple, he and colleagues are working on a project to create personalized treatments for ovarian cancer patients, and have found that genetic data can be “very, very powerful in understanding structures that we weren’t able to understand.”
Still, Alex Zhavoronkov, founder and CEO of Insilico Medicine, contends that 23andMe’s data may not be as valuable as some assume, especially when it comes to drug discovery. “Most low-hanging fruits have already been picked and there is significant data in the public domain published alongside major academic papers,” he wrote in an email to TIME.
But companies in many other industries will likely be interested, too. This is an abnormally large and nuanced data set: this volume of genetic data, especially data that comes with personal health and medical records, isn’t publicly accessible, Kazlauskas says. “All of that contextual data makes it really valuable, and hard data to get,” she says.
Potentially interested industries include insurance companies, which could use the data to identify people with higher health risks in order to raise their premiums. Financial institutions could track the connection between genetic markers and spending patterns in the process of assessing loans. And e-commerce companies could use the data to tailor ads to people with specific medical conditions.
Ethical and Privacy Concerns
But companies also face significant reputational risks in getting involved. 23andMe suffered a hack in 2023 that exposed the personal data of millions of users, severely damaging the company’s reputation. Bidders coming from other industries may have even weaker data protections than 23andMe did, Kumar says. “My worry is that some of the companies are not used to having this kind of data, and they may not have enough governance in place,” he says.
This is especially dangerous because genetic information is inherently sensitive and cannot be changed once compromised. The genetic information of relatives of people who willingly gave their data to the company is also at risk. And given AI’s well-known biases, the misuse of such data could lead to discrimination in areas like hiring, insurance, and loans. On Friday, California Attorney General Rob Bonta issued an “urgent” alert to 23andMe customers advising them to ask the company to delete their data and destroy their genetic samples under a California privacy law.
Eva Galperin, director of cybersecurity at the Electronic Frontier Foundation, worries that 23andMe’s genetic data might exist in a state of perpetual flux on the market. “Once you have sold the data, there are no limits to how many times it can be resold,” she says. That could result in genetic data falling into the hands of organizations that do not prioritize ethical considerations or have strong data security measures in place.
Insilico Medicine’s Zhavoronkov says all of these fears mean that potential AI-related bidders will likely be dissuaded from trying to buy 23andMe and its data. “Their dataset is actually toxic,” he says. “Whoever buys it and trains on it will get negative publicity, and the acquirer may be investigated or sued.”
Regardless of what ultimately happens, Kazlauskas says she is at least grateful that this conundrum has opened up larger conversations about data sovereignty. “We should probably, in the future, want to avoid this kind of situation where you decide you want to do a genetic test, and then five years later, this company is struggling financially, and that now puts your genetic data at risk of being sold to the highest bidder,” she says. “In this AI era, that data is super valuable.”