23andMe Data Breach: Legal, Privacy, & Reputation Fallout with Alya Sulaiman

Dezenhall Resources / October 2, 2025

This episode of Reputation Nation features data privacy attorney Alya Sulaiman in a deep dive on the 23andMe breach—how a credential stuffing attack spread through opt-in features like DNA Relatives, why HIPAA didn’t apply, how the FTC and state attorneys general responded, and the governance, consent, and trust lessons leaders need now. The conversation connects legal obligations, product design choices, and communications strategy for high-sensitivity data incidents.

23andMe Data Breach: Legal, Privacy, & Reputation Fallout with Alya Sulaiman Episode Transcript

AMM (00:00)

We’re excited to welcome Alya Sulaiman, Chief Compliance and Privacy Officer and Senior Vice President of Regulatory Affairs at Datavant. Alya is a recognized expert in healthcare data privacy, AI governance, and regulatory compliance.

Together with Alya, we’ll explore the fallout from 23andMe’s data breach, which exposed the personal information of nearly 7 million users.

Stacy:

All right, so just jumping off here, I think what is kind of captivating about the 23andMe story is that it concerns genetic data that consumers or customers of the site have submitted. And there’s been a lot written, and there’s curiosity, around: why isn’t this protected by HIPAA? And what are the data privacy laws that protect this genetic information? I thought it would be helpful to sort of set the table on where we are with a for-profit company and individuals voluntarily submitting their genetic data.

Alya:

Yeah, it’s a great question. And I think one of the most common misconceptions about HIPAA for anyone who doesn’t live and breathe health information privacy in kind of traditional context is that it applies to data.

It does not apply to data. It applies to entities that meet certain requirements that handle health information. And when you look at the privacy laws that have sprung up across, you know, I think it’s close to 20 states and counting, as well as some of the proposals that have, you know, percolated up through the federal legislature, but haven’t actually landed.

You continue to see that focus on entities as a threshold requirement for whether or not a law applies. So HIPAA applies to covered entities, primarily healthcare providers, health systems, health insurers, claims clearinghouses, and the business associates that serve and partner with them. 23andMe is none of these.

They are a direct-to-consumer company, Stacy. And as you mentioned, people are making a choice affirmatively to share data with 23andMe and use their services. It happens to be that the data 23andMe is collecting and analyzing and making available in digital form is health information, but that doesn’t mean it is protected health information under HIPAA. I think that is really shocking.

AMM (02:13)

I do think I agree. I was just thinking about the types of companies this information seems sort of similar to; it’s almost like financial information. It’s protected in a way, but not the way HIPAA protects it: credit bureaus, social media companies, when you use your credit card, that sort of thing. I’m curious how, in your mind, 23andMe fits into that versus the healthcare ecosystem, which I think is where most general consumers believe turning over data to 23andMe sits, as opposed to being more like putting in your credit card to buy something.

Alya (02:45)

Yeah, the good news is that it’s not a totally lawless environment out there. There are some rules that apply to really any direct-to-consumer company that handles particular types of identifiable information about individuals and also makes statements to those individuals and the public about the safeguards and controls it puts in place to protect that data. So, for example, the Federal Trade Commission has the FTC Act; Section 5 of the FTC Act is really the main backstop for financial companies, for direct-to-consumer companies, for lots of companies doing things with health data that, again, isn’t protected health information under HIPAA. And the idea there is that if you say you’re going to do something and you don’t do it, you can be held accountable under the FTC Act for engaging in unfair or deceptive trade practices. The FTC has in the past cracked down on companies for this, even health apps. And there are a couple of more recent enforcement actions that have specifically had genetic data companies in the crosshairs.

I would say that FTC enforcement, that’s a pretty lengthy and involved kind of process to go from investigation to resolution. And that’s where I think you see states really stepping up and those state consumer protection laws are that kind of second layer and that combination of rules that apply that I think is really relevant here. And I know Stacy, you have some very interesting thoughts on how states stepped up in this particular scenario.

Stacy: 

I mean, I am chomping at the bit to talk about this, because I think you have state attorneys general out there in a way we haven’t seen before. You know, they put out notices giving consumers step-by-step ways to opt out and get their data deleted. We haven’t really seen that before, and I wanted to get your thoughts. I’m trying to see what the political connection is, but just your thoughts on the AGs’ activities here.

Alya: 

You know, absolutely. When I start to think about the AGs that got involved proactively in the ways that you mentioned, there’s definitely a political lean there. But I want to emphasize what you said: this is almost unheard of, for regulators to tell victims right after a breach, no, don’t just change your passwords or freeze your credit, go demand that the company purge your records. I think that chorus of state attorneys general effectively, if indirectly, flexing the strength of their own state consumer protection laws in this way was really striking. And it started with California’s AG, Rob Bonta, no surprise there for anyone who’s familiar with our state AG. Then you had AGs from New York, Massachusetts, all following suit. And this kind of collective stance I thought was really interesting, because it takes a page out of the European data protection playbook, which enshrines this right to be forgotten, this right to erasure, and flexes it in a way that was really, again, more of an ethical stance: that deletion is the best safeguard when trust is broken.

And so regulators typically prefer companies to fix problems, right? A lot of their enforcement actions typically end in a long list of recommended corrective actions that companies should take. Here, you’ve got a bunch of very powerful AGs hinting that the better solution might just be for consumers to pull their data out, and people listened. I read one report that said almost 2 million people submitted deletion requests after these alerts from state consumer protection leads. That’s a pretty huge segment of the overall impacted user base exercising a deletion right all at once. I think there were even reports of capacity and processing issues just because of the volume of deletion requests that were coming in.

AMM (07:11)

Anecdotally, I know folks that trafficked in that conversation for a handful of weeks. It was, “Have you deleted your data from 23andMe? Has your family?” And it was something that I think really stuck in people’s minds. We’ve become a little bit numb to data breaches. I think generally you see them in the news: Equifax lost however many users’ data or Social Security numbers. It’s really rampant, but this was just a completely different level. And I’m glad you brought up GDPR and European regulations, because that’s what I thought of immediately. This felt very much like a European action.

Being on my side of the fence, dealing with the crisis management aspects of these issues, the fact that the AGs were from California, New York, and Massachusetts is, I think, no accident. That’s where we see most state regulation start, on both coasts, and in those three states in particular. But I want to get to the trust piece of this, because you raised that there was a huge trust concern. I think the AGs were certainly well within their rights, and it was probably not bad advice to send folks to delete that data. But if 23andMe comes back as a new entity, how do you either get your data back in the system if you want it (maybe you don’t), or how does 23andMe build trust to move forward in some manner with this type of data, so it either doesn’t happen again or there are new safeguards?

Alya (08:34)

It’s such a great question because, and I know, Stacy, you’ve lived this too, in healthcare, trust is our lifeblood, especially for any of us working on data-intensive projects. And it is sometimes used opportunistically, right, to justify practices that curb or slow innovation. But at the end of the day, if people don’t feel like you’re going to keep their information safe, they’re not going to want to work with you. And they’re not going to want to use your service, in the case of direct-to-consumer companies. So I do think there were lots of lessons learned from what 23andMe experienced very publicly, about explaining what happened, providing transparency to end users, and really being willing to partner and potentially invite independent oversight. There are probably lots of things that, looking back, that team thinks, okay, we could have maybe done a little differently. I work for a company where health data is literally all we do.

And, you know, at Datavant, we sit at the intersection of every corner of the healthcare industry. We work with life sciences and pharmaceutical companies, genomics companies, and specialty diagnostics labs. We work with health plans. We work with healthcare providers, 80,000 of them, in fact. And they all trust us to make the right data available at the right time for their most critical use cases.

We know that making the level of investment in our internal privacy and security infrastructure to actually execute on our commitments, but also being really transparent with our customers about what we are doing and why, not just when they send us a thousand-line questionnaire to complete as part of an annual vendor security review, but on an ongoing basis, is just so critical. Approaching privacy, security, and trust in a collaborative way means making sure that expectations are clear. I also think this idea of bringing in third parties to certify, assess, or validate your environments can really bolster credibility. In certain pockets of healthcare, you hear about different third-party certification frameworks, whether that’s HITRUST, or getting a SOC 2 Type 2 report, or getting your ISO 27001. There are so many acronyms and numbers out there that map to different levels of assurance. But I really believe that volunteering for that rigorous testing and, hopefully, achieving those top-tier certifications can not only be tangible proof that you take trust seriously; it also helps you internally to keep holding yourself to a really high standard and to create continuity in how the company and your customers talk about what privacy, security, and trust mean to you. I would have loved to see more of that independent oversight angle in the context of this issue. I think it’s definitely an area for improvement for any company that’s looking to reposition itself on the trust front.

Stacy: 

I want to build off that, Alya, and it kind of ties back to a recent action by the attorneys general, 28 of them, not just the blue staters. I’ll note my home state of Missouri is actually one of the plaintiffs in the bankruptcy filing. And those attorneys general note that the terms of use, the click-throughs, the privacy statements, etc., aren’t sufficient for transparency in what’s required for the sharing of, and granting of use rights to, individuals’ data.

So I wondered: what do you think? If you were advising a company that had this kind of data, and it sounds like you work with customers like that all the time these days, I feel like the terms of service are sort of the baseline, but maybe companies need to be even more in the face of consumers, explaining what they’re doing when they’re sharing data.

Alya:

There’s such a long debate here about the balance between meaningful transparency and what constitutes express, informed, and affirmative consent. And many of the state privacy statutes that those 28 AGs rely on have that express, informed, affirmative consent concept. I do think that there are definitely some key state statutes that support those arguments.

You know, California’s privacy laws are a really good example, where they have really intentionally evolved to remove an exemption that carved a bankruptcy sale out of the definition of a sale of data, to ensure that folks retain some rights to opt out of particular data transfers. So that’s another issue, right? On top of consent, what about changing your mind later on?

And does the state law whose jurisdiction you’re under give you that right? I do think that the strong companies out there really are developing creative ways to help patients, consumers, and individuals understand what’s going to happen to their data and why, and what they can do about it. I see all different variations of consent forms: consent forms that link to guides annotating and explaining them online, consent forms that aren’t just blocks of text in a PDF you click to sign, but walk you through in an interactive way, section by section, to ensure that you really understand what you’re agreeing to.

But the reality is that no one, at the federal level and even in some cases at the state level, is saying this is the right way to do it. And so of course you have different companies approaching transparency and express, informed, affirmative consent in different ways. I hate to bring this into it, but until we have a federal law that normalizes what the expectations are on that front, I think you’re going to continue to see a lot of variation. And I don’t think it’s necessarily illegal, right? Because these are, in many cases, reasonable interpretations of the rules we have to go off of at the moment.

AMM (15:26)

I’m actually glad you brought up federal regulation, because the patchwork of state regulations for many of the clients I have is almost impossible to navigate. And so then you move to either the highest common denominator or the lowest common denominator, and there are generally differences between them. Just from an operational perspective, that makes doing business more complicated. Not to say that you shouldn’t always go to the heaviest regulatory burden, but oftentimes those requirements are burdensome to operationalize, and we all know technology is generally ahead of the regulatory landscape. That’s just the way it’s always worked in our country. I’m curious how you would help businesses, from a practical perspective, navigate those things, because they can be costly, and they can reshape major pieces of how you operate. Do you have any thoughts on navigating the difference between the federal and state regulatory landscapes, particularly in this area?

Alya (16:21)

So my first reaction is: don’t navigate it for the first time when you’re in the middle of a crisis. You have to have a plan, and you have to be doing regular tabletop exercises with the ugly, scary fact patterns that you hope would only ever happen in your worst nightmares. You have to be doing that level of planning, preparation, and testing in order to avoid a whole lot of swirl when, and it’s really when, not if, a major issue happens. And I do think there’s a lot out there about textbook incident response protocols: isolating the breach, enlisting experts, informing law enforcement, and then, of course, beginning to notify your stakeholders, whether that’s your customers or your patients. But responding to a breach isn’t just about containment and tech fixes; as you mentioned, it’s dictated by a patchwork of notification laws and regulations. In the US, almost every state has a breach notification statute of some kind: when you hit a certain threshold, whether that’s the volume of individuals impacted or the type of data that was impacted, you’ve got to promptly alert affected individuals and, in some cases, regulators.

So there are tons of resources out there that are free and matrix out those different state laws and their notification requirements. Making sense of a resource like that requires you to have a really deep understanding of the data you maintain and the jurisdictions it’s regulated under. And one of the least understood parts of your overall incident response playbook is that you actually need incredibly strong data governance on the front end to execute on that playbook. You need to know where your data lives, what it is used for, where you got it, and what your accountability points are for any data within your systems, in order to actually respond appropriately when there’s a threat. And so, in addition to practice and prepare, I think it is definitely worth your time, it is worth the investment, to think about what data governance means at your organization and whether you can answer some basic provenance questions about the information you trade in: where it comes from and who would be upset if something happened to it.
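The “matrix” of state notification rules Alya describes can be as simple as a lookup table keyed by state. A minimal sketch in Python, where every state entry, threshold, and duty label is an illustrative placeholder and not a statement of any state’s actual law:

```python
# Hypothetical breach-notification matrix. Thresholds are invented for
# illustration; a real matrix comes from counsel-reviewed statutory research.
NOTIFICATION_RULES = {
    "CA": {"notify_individuals_at": 1, "notify_regulator_at": 500},
    "NY": {"notify_individuals_at": 1, "notify_regulator_at": 1},
    "MA": {"notify_individuals_at": 1, "notify_regulator_at": 1},
}

def notification_obligations(affected_by_state):
    """Given counts of affected residents per state, list the duties triggered."""
    duties = []
    for state, count in sorted(affected_by_state.items()):
        rule = NOTIFICATION_RULES.get(state)
        if rule is None:
            # A state missing from the matrix is itself a finding:
            # route it to manual legal review rather than guessing.
            duties.append((state, "unmapped: route to manual legal review"))
        elif count >= rule["notify_regulator_at"]:
            duties.append((state, "notify individuals and regulator"))
        elif count >= rule["notify_individuals_at"]:
            duties.append((state, "notify individuals"))
    return duties
```

The point Alya makes holds even in this toy version: the lookup is only useful if you already know, per jurisdiction, how many residents were affected, which is exactly the front-end data governance she calls out.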

AMM (18:59)

I love that you brought up tabletop exercises. We try to do that with our clients during times of peace whenever possible, for all the worst possible outcomes that exist. We say, what’s the question that keeps you up at two in the morning? And usually there are six or seven, and I think anybody that’s dealing with data, particularly sensitive data, should have that high on their list. But then there’s the piece that you raised about provenance and understanding, on the front end, where things come from, all of that.

Do you think that maybe we’re far enough into the data life cycle of how we all use data sort of constantly in our lives where companies on the whole are doing a good job of that on the front end or are we behind? I hit consent on things all the time because I want the app or I want to just get through the paperwork and the thought of saying no to not have the thing is just not something I generally do. I just get it done. I’m curious your thoughts there.

Alya (19:55)

Yeah, I, and this is really my personal perspective here, but I don’t think it’s realistic to expect people, individual people to sort of manage and account for all of the different places where their data may exist. I mean, just in healthcare, for example, if you are operating under the umbrella of HIPAA, there are hundreds of different use cases and contexts where your most sensitive data may be processed, because that’s just what it takes to deliver and pay for and measure the quality of healthcare today. And it can be processed without your consent because that’s how we set up the law to facilitate the delivery and payment and measurement of healthcare. So I do think, this is relevant because, you know, who hasn’t at this point in your life received a breach notification in the mail or, hey, you may be eligible to participate in our class action lawsuit. It’s really just become a normal part of life in the 21st century.

I think because of, again, just the necessity for these data flows and data uses to happen in complex ways, with lots of interconnected systems and players, it really has to be each company’s responsibility in the ecosystem to take data governance seriously. And for any company that lived through the early implementation of the GDPR (I know that was my job when I was in-house at Epic; I had the printed-out version of the entire GDPR on my desk, highlighted, for almost daily reference), there were some, yes, painful parts of that experience, but also some incredible resources and insights that came out of it, like doing a data map.

Where does personal data come into the company? Who touches it? For what purposes? What third-party systems or vendors do they use to touch and process it? And then what’s the eventual disposition of that data? Where does it live long-term? How long does it live there? When and how could we delete it? That’s obviously relevant to the 23andMe story too. So that data mapping exercise was one of the most valuable things that I ever led in my career. It also gave me a leg up for the ensuing years, because I knew: yes, we do handle credit card information, and here’s how; yes, we do sometimes receive genomic and genetic data, and here’s the context and why. For anyone in the legal or policy space, it’s really an incredible foundation for providing insight, advice, and guidance going forward.
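Those questions map naturally onto a simple record structure. A hypothetical sketch of one data-map row, with field names invented for illustration (no standard schema or vendor tool is implied), plus one query the 23andMe story makes obvious:

```python
from dataclasses import dataclass, field

@dataclass
class DataMapEntry:
    """One row of a v1 data map. Field names are illustrative, not a standard."""
    data_category: str                              # e.g. "genetic results"
    entry_point: str                                # where the data comes into the company
    owner: str                                      # accountable team or data steward
    purposes: list = field(default_factory=list)    # what it is used for
    vendors: list = field(default_factory=list)     # third-party systems that touch it
    storage_location: str = ""                      # where it lives long-term
    retention_period: str = ""                      # how long it lives there
    deletion_method: str = ""                       # when and how it can be deleted

def flag_deletion_gaps(entries):
    """Surface rows with no documented deletion path."""
    return [e.data_category for e in entries if not e.deletion_method]
```

Even a spreadsheet with these columns supports the deletion-request surge described earlier: you can answer “can we actually purge this?” per category instead of discovering the answer mid-crisis.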

But it’s a huge lift to do it. So you have to have buy-in from your senior leaders to undertake a project like that, and you have to stay focused while you’re doing it, because it’s super easy to, you know, turn over rocks in the process and see all sorts of little things that you want to start chasing and running down.

You’ve got to stay focused on the end state and recognize that just getting that version one of the data map and then eventually that version 1.1 and 1.2 is going to provide you with more peace of mind and just context to prioritize things like how you want to invest in privacy and security than really anything else you could do. So I do think it is a responsibility for companies. And even if you’re not GDPR regulated, it is a really worthwhile investment to put a data map together.

AMM (23:37)

The data map is an interesting concept and one that as a crisis management consultant that comes in when things are bad, I would love to see because you could point me to exactly where something came in, went out, how the problem happened, and having that context can be really important when you’re putting together a message map of, well, what’s missing? Who do we need to talk to? How does this fit in? That’s a really great takeaway that anybody that deals with these types of issues should consider both having the tactical operational piece of the data privacy map, and then how do you marry that with what you would need to deal with if you did have a worst-case scenario.

Alya (24:11)

Totally agree. And going back to the 23andMe example: this breach was announced in a very interesting way, right? A file was leaked on a dark web forum, where someone was essentially claiming to have a whole bunch of data for millions of 23andMe customers.

And I’m sure in the first 24 to 48 hours they were very focused on just verifying the breach. But once you get past the verification point, you then have to immediately turn to containment. To your point, Anne Marie, that data map gives you all the places you need to look from a containment perspective. For example, they had to shut off data exports. That would be in your data map: data could leave our systems in this way if the following things were true. That’s the type of detail you would hope is captured in a data map. So you’re essentially doing a little bit of pre-work for yourself for that worst-case scenario, so you know exactly which switches you need to turn off and which levers you need to pull down in order to effectively contain the incident.

Stacy:

I’ll just jump in here and say, it all sounds great, but, and you alluded to this a little bit, it is a gargantuan task. It is super hard, from a volume perspective, from a culture perspective, from a bandwidth perspective. And Anne Marie, I agree, it would be amazing to just have “this is where all the data lives, and this is who we share it with.” But I think not everyone has an Alya Sulaiman on their team. So it’s sage advice, but it’s not an easy lift, is my take.

Alya:

I totally agree. And at Datavant now, we are growing at a breakneck pace. We’ve announced a couple of significant acquisitions in recent months, and my mind goes to: okay, great, all these integration tasks. Lots of excitement for the business, but I am thinking about how we understand data flows, and how we get our arms around that sooner rather than later. And I think, Stacy, my only answer to the gargantuan-task point, which is a really good point, is that it doesn’t have to be perfect. It doesn’t have to be comprehensive to start. Maybe you just prioritize your highest-risk area: if you’ve got teams doing research with lots of identifiable data, maybe it’s just a deep dive for one quarter with that team, to understand exactly where they’re getting data from, how they use it, and its eventual disposition. That’s your V1, and maybe the next version moves to another team. Keeping it up to date is a whole separate issue, but there are privacy tools for that too, right? Like introducing the concept of a privacy impact assessment in an organization, and really educating people that when you do any of the following things, it might trigger the need to do a privacy impact assessment, and we’ve made it as easy as possible for you with this web form you fill out. And I do think there are a lot of ways now, especially, to automate capturing that information and having it spit out into at least an initial draft of a spreadsheet. And again, it can be a spreadsheet. You don’t need to buy a multimillion-dollar system or module of software to do this. You can start with a spreadsheet. The key thing is, again, not having to do that mapping exercise for the first time when you are trying to figure out the implications of a massive incident.
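The “web form” trigger logic Alya describes can be sketched in a few lines. The trigger questions below are assumptions made up for illustration, not any organization’s actual PIA criteria:

```python
# Hypothetical PIA trigger checklist; questions are illustrative placeholders.
PIA_TRIGGERS = {
    "new_data_category": "Collecting a category of personal data we don't already hold",
    "sensitive_data": "Processing health, genetic, biometric, or financial data",
    "new_vendor": "Sharing personal data with a new third-party vendor",
    "automated_decisions": "Using the data for automated or AI-driven decisions",
}

def needs_pia(answers):
    """Map yes/no form answers to triggered reasons.

    A non-empty result means a privacy impact assessment is required;
    unrecognized keys are ignored rather than treated as triggers.
    """
    return [PIA_TRIGGERS[k] for k, yes in answers.items()
            if yes and k in PIA_TRIGGERS]
```

The output of a form like this is exactly the kind of thing that can land as a new row, or a flag on an existing row, in the spreadsheet-grade data map she mentions.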

AMM (27:57)

This is something Stacy and I have dealt with together in the crises we’ve managed in the foxhole. But you can’t not start. No matter what information you have, as imperfect as it is, you’ve got to start somewhere. And the most important thing, if you don’t have the map you’re mentioning, is having at least a crisis team: the people you know can be responsible for each piece. You might not have all the answers, but, in my case as the communications person, it’s who knows what, where, when, and how, and how do we get things out. And on Stacy’s legal side, it’s, okay, what are the potential implications of this in all the different places this may happen?

And I think what I’m hearing from you is, if you have data in your organization, you’d better know that somebody on your crisis team knows who to call, who your insurance carriers are, at the very least the passwords, those very basic things, at least to start that process. Because in my experience in a breach, even if you have the best IT systems and the best folks, you don’t know where it’s coming from right away. It takes a little bit of time. That time might be hours, it might be days, and, you know, people are still important. Automating is great, and I love relying on tools, but if you’ve at least got a handful of folks in the foxhole with you that can get to the root, you’ve got a fighting chance to contain it.

Alya (29:13)

Totally. And on the people point: something I’m experimenting with at Datavant, and I kind of have an unfair advantage because our privacy, security, and compliance culture is just incredible. I’ve never worked at a place where people are excited to see me and my team.

Stacy: 

They’re in your book of highlights. 

Alya: 

Yeah, exactly. They’re inviting us to things proactively and want our input, but we’re trying to leverage that enthusiasm to identify what we’re calling a data steward per department: on our people team, for example, who’s going to be the person? And I do find that there are some folks who are just personally interested in this stuff, so we’ve had some luck with folks just raising their hand. I work with a data scientist at Datavant who has really become my data steward for our data science team, partnering on everything from how we use data in support of our data science initiatives to how we use AI responsibly with that data. There’s a clear personal interest, and by educating that one steward on the accountability points that matter, from a legal or regulatory perspective and from a communications and stakeholder perspective, they can then translate to their teams and their peers what exactly you care about and why, and why it’s important to maybe take a more formalized or structured approach to how certain decisions around data use are made, so that it can all filter up into that broader data map. But I do think that this concept of a data steward, maybe not for every department, but for your really high-risk and high-impact departments, is going to help you build that contact list, Anne Marie, that you mentioned is so critical when you’re actually in the foxhole dealing with a crisis.

AMM (31:14)

Love it.

So 23andMe has been reclaimed by its original founder, Anne Wojcicki, and she did so through her nonprofit, which I believe is called TTAM Research Institute. I think the deal covers the DNA business, the telehealth unit, and all that genetic data. Why would you do that? Why would you go from a for-profit entity to having this in a nonprofit entity? Does that change the regulatory landscape for 23andMe, or whatever the new company will be called?

Alya (31:46)

I think there are a whole lot of governance considerations there that I’m not going to speak to, but that undoubtedly were a core part of the decision-making here. From the privacy law perspective, it is well understood in the privacy legal space that privacy laws, especially at the state level, have been drafted in ways that define covered businesses or covered data controllers as for-profit entities. In some cases there are even revenue thresholds that apply to determine whether or not an entity fits under the umbrella of a state privacy law. So it is definitely possible that TTAM, as a tax-exempt nonprofit, could evade some of the regulatory pressures that a for-profit direct-to-consumer company may face. With that said, this kind of goes back to the question of how we think about privacy law as a society, right? And how we, as a country, evolve toward a different standard. Because if we continue to think about these laws as applicable to entities versus data, I think you’re always going to have those gray areas, those gaps, those open questions about whether you can just change the attributes or structure of a business or entity and be in a totally different space from a regulatory oversight perspective, when it comes to use of what is otherwise still sensitive, health-related, health-ish data. So it’ll be fascinating to see whether, in this Congress or the next, all the drafts of a federal privacy bill that have been circulating for years take this issue head-on: applicability to an entity versus applicability to data. But there are definitely implications for TTAM being a tax-exempt nonprofit from a privacy law perspective.

Stacy: 

I wonder, though, because it doesn’t address the trust issue. If that’s kind of a workaround to get a little more flexibility with privacy laws, I don’t know that that helps the cause. Ann Marie would probably know this: is there greater trust in nonprofits than there is in for-profit companies?

AMM (34:11)

I think usually there is a sort of underlying belief that a nonprofit has a more altruistic view or goal or desire to do good in the world.

Stacy (34:21)

This was my jumping-off question, which we didn’t start with, but I just wanted to get your take. You’ve been in-house counsel as well, and you advise your clients about the importance of managing security and data privacy. You said earlier that people assume it’s not if but when there’s going to be a breach. But when we talk about the boogeyman, the parade of horribles that can happen, I wonder if bankrupting the company was ever forecast as a possible outcome of a data breach.

Alya: 

I do think this entire 23andMe saga from recent years is going to be studied, analyzed, and evaluated, both in terms of the trust angle, the breach response, and the communications that were made to consumers and to regulators, but also this open question of who controls certain types of data. Something that was really striking for me as this saga played out was that you had senators tweeting with concern about this incident, focusing on who controls Americans’ DNA and whether there is a gap between where current law sits and how data uses, data collection, and certainly data availability have evolved, where the laws just haven’t caught up.

The other thing I would highlight is that, in the court of public opinion and from a policy perspective, when you are handling data like genomic data or all sorts of health-related information, people in our space, Stacy, have heard this a million times: you can change your credit card number, but you can’t change your blood type. Well, I’d take that a step further with genomic data, because it’s not just about you. It’s your identity encoded, it’s your lineage encoded, it’s about your family and your loved ones. I think this was a really bad scenario, as far as crises go for a company, based on the outcome. But I still wouldn’t put it at a nine or ten out of ten, because the nine or ten out of ten scenario would be the genetic data actually being used to blackmail individuals en masse, the harm that could come from that.

Stacy: 

Well, you might still see that. We don’t know who got that data, or what they’re going to do with it, or what they are doing with it, right? We only know one of the outcomes. I think that’s a really powerful message, and one with broader appeal than just to people who worry about crises. But it always struck me how casual folks were about the 23andMe spit parties, when everyone just wanted to know who they’re related to, and you sent your sample off to a company and people you didn’t know. The whole concept was a little uncomfortable to me. We’re not going to know what’s been done with that data. It could crop up anywhere, right?

Alya: 

So, yeah, I think the long-term consequences absolutely remain to be seen, along with the intangible damage. You were someone who was skeptical before, but I think there’s a whole host of folks out there who might have lost trust in an entire, really innovative industry segment over this. My takeaway is that the bankruptcy really isn’t just a business failure. I do think this is a bigger crisis in data stewardship. And it’ll be really interesting to see, one, who shows up to fill this gap. I mean, the ability to derive really meaningful insights from our DNA is just getting better and better, and frankly more trustworthy and more impressive. Eventually, we’re going to have to bridge this gap that’s been created around trust in the companies that are stepping up to solve really pressing healthcare problems with responsible use of this data

and all the innovation and promise that’s out there. But yeah, I think it’s a really sensitive time right now. And it’s funny, you talked about the spit parties. What also always bothered me personally is that I can’t control whether my parents or my sibling takes this test, and we share this DNA. So I think there’s this really interesting angle, too, about the butterfly effect across whole populations and whole communities, where folks who have never done 23andMe may still be, you know,

Stacy: 

just swept up in it. That’s a haunting thought, actually.

And with that, I’d really like to thank you for being here today. I’ve really missed working with you personally, so it was really cool to connect. And this is a topic I think we could talk about for hours and hours, because there are just so many layers. So thank you for sharing your expertise with us.

Alya:

Thank you for inviting me. A really wonderful and insightful conversation, and I appreciate the opportunity to reconnect, as always.