Rae Woods (00:02): From Advisory Board, we are bringing you a Radio Advisory, your weekly download on how to untangle healthcare's most pressing challenges. My name is Rachel Woods, you can call me Rae. Last week we talked all about the use of consumer data in healthcare. This is still a relatively new trend, but it is requiring all kinds of stakeholders, plus consumers themselves, to wrestle with concerns about data privacy. One of those stakeholders that is getting more and more active is the Federal Trade Commission, the FTC. So in today's episode, I want to talk about what the FTC is doing to protect sensitive health data. To do that, I've brought back Advisory Board's digital health expert, Ty Aderhold, and I've also brought researchers Sara Zargham and Paul Trigonoplos, who've been looking into the recent FTC rulings and trying to understand what the industry should do next. Sara, Paul, and Ty, welcome to Radio Advisory. Sara Zargham (01:01): Thanks for having us. Paul Trigonoplos (01:02): Thanks, Rae. Ty Aderhold (01:03): Glad to be here. Rae Woods (01:04): This is part two of a three-part series. And Ty, I'm not going to apologize actually, I'm going to say thank you for being with us for every single one of these episodes as we talk about data on Radio Advisory. Ty Aderhold (01:21): Yeah, I guess if we're talking about data, apparently I need to be here and I'm okay with that. Rae Woods (01:24): And we are so happy to have Paul and Sara, this is your first time on the podcast. How do you feel? Are you excited? Sara Zargham (01:29): I'm very excited to be here. Thank you for having me, Rae. Rae Woods (01:32): What a thrilling conversation that we are going to have for your first recording as we talk about the FTC. Let me tell you where my brain goes when I think about the FTC, I think about antitrust. And if I'm correct, some of the research that you all have done shows that from 2016 to 2020, half of the FTC's focus in healthcare was on antitrust. 
That's the majority. And very little, like 2% or something like that, was actually on data privacy, which is of course the conversation that we want to have today. We're starting maybe to see the FTC shift its focus. What are we seeing now? Sara Zargham (02:29): So like you mentioned, Rae, we are really starting to see the FTC crack down on companies that share consumers' sensitive health data. So one reason that the FTC is really stepping in is that not all health data is actually covered by HIPAA. For example, when health data exists in an app as opposed to through your provider or insurer, it isn't technically covered by HIPAA, and because of that, it's not subject to the same level of protection and confidentiality that HIPAA really demands. So we've been thinking a lot about this as almost a HIPAA gap or a gap in protection of health data depending on really its source and context. So HIPAA's one example, and then another big thing that we're seeing with the FTC is they're actually starting to really enforce specifically using the Health Breach Notification Rule. This was actually originally passed in 2009, so 14 years later we're just starting to see this being leveraged and HBNR- Rae Woods (03:33): Wow. Sara Zargham (03:33): ... yeah, long time, basically requires health apps to inform users when there's been an infringement on their information, and this is actually what happened with GoodRx recently. Rae Woods (03:45): And Paul, I think you have some personal experience with GoodRx and the Health Breach Notification Rule. Is that right? Paul Trigonoplos (03:54): Yeah. I use GoodRx in my personal life and I got an email a few months ago. Do you want me to read it? Rae Woods (04:00): Yeah, let's do it. Paul Trigonoplos (04:02): Okay. "The Federal Trade Commission alleges that between July 2017 and April 2020, you visited GoodRx or used the GoodRx app. 
During this time, we shared identifiable information related to you, including health information, without your permission. This information included details about drugs and health conditions you searched and your prescription medications. We shared this information with third parties, including Facebook. In some cases, GoodRx used the information to target you with health-related ads. The FTC alleges we broke the law by sharing your information without your permission. To resolve the case, we have agreed to an FTC order requiring that we'll tell third parties like Facebook who received that information to delete it. We'll never share your information with other third parties. We'll never share your information for other purposes unless we get your permission." And there's a few more bullets on basically things they will not do, that they were alleged to do, but still deny. Rae Woods (05:02): So what is your real talk translation of this email that you got? Paul Trigonoplos (05:07): It sounds like because they settled, they don't have to admit guilt, so they can just say sorry and we're going to change the way we act in the future. Also, use of the word alleged is a little bit galling here, but that might be a side point. Rae Woods (05:20): Yeah. Paul Trigonoplos (05:21): Because they did have to pay up at the end of the day. Rae Woods (05:24): My interpretation is, sorry, we promise not to do it again. Ty Aderhold (05:30): And Rae, the other thing I would add here is it doesn't truly address all of the data that now exists with third parties. And GoodRx isn't the only organization that has done this. So another consideration here is how many non-healthcare organizations, data aggregators, places like Meta that now have consumer health data. Rae Woods (05:56): And again, that data is not protected by HIPAA. Ty Aderhold (05:58): Right. As soon as it's outside of the hands of a healthcare provider organization or payer, it's not going to be protected. 
Sara Zargham (06:05): Absolutely, Ty. This is just the first of many to come. And actually following the GoodRx settlement, the FTC put other companies on notice for sharing consumer sensitive health information for advertising purposes. So this is probably just the first of a lot to come. Rae Woods (06:25): But it also took us 14 years to get here. So why is the FTC kind of finally feeling momentum to act now? Ty Aderhold (06:36): Frankly, Rae, I think we've just reached a breaking point, for regulators and for advocates, I think we've reached a point where there's been enough reports and sort of whistleblowing journalism that has been done. There's been enough outcry, particularly around providers sending data to Facebook, and that's health systems as well, there's been big reporting around that. And we haven't seen any other sort of agencies or regulators step up to sort of fill some of the gaps that have come about as we've continued to expand our data capabilities and the usefulness of consumer data, to reference back to our previous episode with Solomon. And so I think the FTC is seeing that gap and saying, "All right, if there's no one else that's going to step in here, we do have this law we passed back in 2009, let's start to use it to fill some of those gaps." Paul Trigonoplos (07:35): I also think the momentum here reflects just the posture of the FTC overall. Like Lina Khan has said that healthcare's kind of where they're going to be aggressive and separately you can see some signals in their new budget on how aggressive they're going to be on the consumer protection side this year and next year. And they also are going to try to move the law forward however they can to beef up their ability to prosecute what is either anti-competitive or harmful to consumers, this is on that list. Sara Zargham (08:08): Yeah, I agree, Paul. 
I think it's also a lot easier for the FTC to regulate, like you were saying, now that there is precedent that's been established with the FTC's win against GoodRx, and then also with recent settlements across the past few years with both Flo and BetterHelp as well, who also shared sensitive health information with large technology companies. Rae Woods (08:30): Meaning those are apps, Flo and BetterHelp are apps. Sara Zargham (08:34): Yes. Rae Woods (08:35): That collect user data, sell that data, right? Sara Zargham (08:38): Yes. Rae Woods (08:38): Got it. Sara Zargham (08:39): So with these wins, I think it's a lot easier really for the FTC to kind of continue the momentum here as opposed to with vertical integration for example, which is an area they are exploring, but they haven't really made any concrete strides there. Rae Woods (08:55): That's right. And that's something that we've talked about before on this podcast. But if what I'm hearing you say is that the momentum is just going to keep gaining speed, I want to talk about who's going to be impacted by that. Look across healthcare, who are the winners and the losers of the FTC's actions? Paul Trigonoplos (09:15): I'll start with the loser, which is just anyone that makes money using or selling consumer data, this makes it a lot harder to get the free or cheap data that you'd want to make that revenue. This is mostly tech companies, apps, some big tech a little bit, Facebook, Meta, they might be impacted because they can't get the same advertising data, but I mean, I think they'll be okay. It's really kind of smaller health tech companies that I think are going to be on the losing side here, rightfully or wrongfully, depending on how you look at it. Ty Aderhold (09:48): And that's especially true because for a lot of earlier stage startups or tech companies trying to operate in the healthcare app space, data has been their short term revenue as they try to build a larger model. 
So they've relied on patient data as a revenue source short term as they try to build out market share in a long term actual business plan. Rae Woods (10:15): Wait, wait, wait, I'm guessing that alarm bells are going off for our listeners right now, that are going, "Wait a minute, I'm partnering with a tech company, or I'm being told I should partner with a tech company, or I need to bring my business into the future and be thinking about artificial intelligence, to be thinking about a digital front door." And these are things Advisory Board has said. What does it mean for organizations that partner with tech companies? Ty Aderhold (10:38): I think the biggest thing is that you have to be doing your due diligence right now. We know health systems and health plans have been investing heavily in consumer data and getting consumer insights from big tech companies like Meta. Most of those arrangements, I'm guessing, involve data moving in both directions, and that is where these organizations need to be looking. What are we sending out? What are we giving up in return for some of these consumer insights that we've started to use and are really valuable to our organization? And do we need to change our practices there? Rae Woods (11:15): And I'm guessing that muscle is actually going to be very new for most healthcare organizations, is that right? Ty Aderhold (11:21): Yes. The people who have made the decisions around marketing and consumer data aren't used to having to consider the regulators and FTC and HIPAA and all these different regulations. And so certainly I would say a new muscle to flex. Rae Woods (11:42): So patients have to be the winner then, because the goal is to protect patients, people who are using these data in their daily lives. Is it as simple as saying that? 
Sara Zargham (11:53): I think on the surface, yes, the idea is that this precedent should protect data privacy somewhat, at least when it comes to apps selling your data to Google or Meta or really any other big tech company. But this isn't a comprehensive solution at the end of the day, to data privacy concerns, unfortunately. And there's still a lot of use cases that need to be hashed out, and this is, I think, really just the beginning in terms of any sort of real regulation in this space. I mean, we keep going back to the fact that the HBNR was passed in 2009 and then it took 14 years for really any action to be taken with it. So I think we're kind of a long time out from seeing real comprehensive protection. Rae Woods (13:33): Patients and consumers could ultimately benefit here. And what I mean by that is benefit from the FTC's crackdown on data sharing. And this is where I want to be particularly careful with what I'm about to say. If it sounds like patients could benefit, is it actually their responsibility to take a bigger part in their own data privacy? Is that even fair to ask that of the general population? Paul Trigonoplos (14:04): Who wants to go under the bus? Ty Aderhold (14:05): I have a pretty short answer for you, Rae. I would say no, it's not fair to place that on patients, particularly when this is oftentimes very sensitive data they might be giving in high stress scenarios where there's a million other things they need to be focused on other than the fine print in the app when they click through once initially downloading it. Rae Woods (14:35): And it's not only high stress scenarios. First of all, no one probably actually reads the terms and conditions. In last week's episode, Solomon very gently but bluntly made fun of me for giving away my data to Amazon when they were buying One Medical and we were all in a meeting and I said, "Let me get on and let me see what I can do, da, da, da, da." 
And then all of a sudden realized not only did I pay $40 for that visit, but I was giving away information to Amazon that is protected in a different way than if I was doing that through a provider telehealth program, right? Ty Aderhold (15:10): Right. But I would say the other thing, Rae, is that the terms and conditions are the most obvious place that a patient would know to look. And again, it should not be on them to know that. But a lot of the news stories that have broken are about a scheduling website for a provider: a patient puts in their information to try to schedule online and that information ends up going to Meta. And in that case, there are no terms and conditions. The patient is just going to the provider's website expecting that, hey, I'm just filling out this form. And those are the scenarios where, I would say, it should be 0% on the patient. Rae Woods (15:54): Yeah. What happens if the FTC starts coming down very hard? We said they're building momentum, they're starting to use this 14 year old rule and actually start going after folks. What happens if they really turn up the heat? Sara Zargham (16:10): Yeah. So we've been talking a lot about the harm to patients or the concerns around data privacy and data sharing. But another thing I did want to point out in that same vein, Rae, is that a lot of customers or consumers do really love these apps. They frequently use them and they can really be a value add when it comes to convenience. But unfortunately, like we've been talking about, some of them might be a bit sketchy, for lack of a better word, in terms of how they use the data and how they share the data. (16:43): But kind of going back to your question, if the FTC does come down really hard, patients may lose kind of that choice to use these convenient apps or platforms for free, even if they don't really mind giving up or selling their data. 
Ty also touched on this, there's the potential to stunt innovation as well to a degree, maybe a new market entrant or a new venture will be less willing to develop these apps if they can't actually rely on selling data as their initial early revenue source or even their main revenue source to enable the app to be free for consumers and kind of the general public. Rae Woods (17:24): And I think it's important to remember that from the consumer's perspective, folks are using a lot of different stuff at the same time. So thinking about my personal life, I am a Garmin loyalist, so I've got my kind of wellness apps that I use to track all the fitness that I do. I have got apps that help track other aspects of my wellness. Savvy listeners will remember that I complained to Ty once because my pediatrician requires an app to check in to our appointments for our son. And I had to do that, I didn't want to get that app when going to well visits for my kid, but our pediatrician said, "No, no, this is our new process." So I'm thinking off the top of my head, I probably have six different platforms that I'm using, knowingly giving my information away, let alone the kind of scheduling piece that Ty is now making me kind of panic about. It can feel impossible to get ahead of the data out there. Ty Aderhold (18:21): And Rae, the other way to think about this is you may have those six different apps that you knowingly said, "Okay, Garmin, I am going to give you this data." Garmin's actually an interesting example here, because there are apps like Apple and Garmin that are very clear that they're not going to sell your data, that they actually keep your data private. Apple has put out a ton of commercials lately across major television platforms that are focused on the privacy piece of Apple Health. 
But back to my main point, when you consider all of those different six apps that you've given data to, you may be okay with that app having your data, but do you want the data from all of those six different places connected somewhere into this giant portfolio of who you are and connected to your consumer spending habits and your sort of relative wealth and all these other things? That's the piece where it's like, ooh, do I want that? That might feel different than like, oh, I'm fine with having this one app collect data on me. Paul Trigonoplos (19:23): And Ty, your commentary made me think of the fact that increasingly the ability to sort of de-anonymize someone's identity from anonymized data, it's getting really easy. I think for a lot of large corporations, a lot of data-heavy organizations where that kind of data analysis and data science is their bread and butter, it only takes probably a dozen data points, tops, from an anonymous person to connect and form a de-anonymized version of their persona. So that puts some specifics behind your comment on do you really want all of your disconnected data to be stored in some single portfolio somewhere? Rae Woods (20:07): We've been talking a lot about third parties, we've been talking about apps, we've been talking about third party websites. Is that really all we should be focused on in this conversation? Paul Trigonoplos (20:16): I think every part of the healthcare sector has something to pay attention to here because they're all kind of still in the business of getting and sharing data at the end of the day. But one that I would call out is there was a recent Health Affairs article that we can include in the show notes, that said 98.6% of US hospitals inadvertently collect website data through website tracking services. Rae Woods (20:37): Wait, wait, wait, wait, wait. I'm sorry. Say that number again. Paul Trigonoplos (20:39): 98.6. Rae Woods (20:41): Oh my God. Paul Trigonoplos (20:42): Yeah. 
So functionally, 100% of US hospitals inadvertently are collecting data from how the user interfaces with their website online. Rae Woods (20:51): Do they know that? Paul Trigonoplos (20:52): No, most don't, or they plead ignorance, I'm not sure what the truth is there. But there was a big fallout kind of in the media after, everyone's like, "I had no idea. We did not know that this data was being collected and then sold." Rae Woods (21:05): So there are a lot of things in this conversation that are making me nervous, and I'm guessing that our listeners are in the same boat, but there's one that we haven't mentioned yet. We started off this conversation talking about GoodRx as kind of the first use case here. And Sara, you mentioned two others that came quickly. One of them is Flo, which is a period tracking, I believe it's an app. And there are some immediate questions, alarm bells, red flags that folks are thinking about in the context of period tracking, abortion and what it means to be able to legally get one as a patient, a pregnant person, and also what it means to have that knowledge as a clinician and a healthcare provider. We've seen the FTC speak about data privacy with regards to reproductive health after Dobbs. What is the FTC doing here? Sara Zargham (21:59): Yeah, so I think it's really important to point out that we're at the one-year anniversary of the overturn of Roe v. Wade, and there's still a lot that's unknown. Dobbs is a great example of an instance where really it's on patients or consumers to protect their own data. This is another area where we're kind of seeing the HIPAA gap and potentially some regulatory authority could come from the FTC. Following the overturn of Roe v. Wade, the FTC made a statement about the misuse of mobile location and health information, including reproductive health data and how it exposes consumers to significant harm. 
But at the end of the day, this isn't really a full solution to the reproductive health data privacy component, and we're already starting to see or hear about prosecutors really taking advantage of this gap between what HIPAA protects and what is covered by certain health apps. And this is really concerning in a post Roe v. Wade world where, like I was saying, prosecutors can access and use this unprotected reproductive health data to really enforce anti-abortion laws. Ty Aderhold (23:19): And this isn't just about health data, it's also about consumer data and location data that Sara briefly mentioned as well. And it ties back to our previous episode with Solomon, of there is so much that is not considered health data, that is just consumer data that we are seeing used in some of these reproductive health conversations: the patient's location data, whether they're at a clinic, their searches and text messages and other things. And so I think reproductive health becomes a great example of the harm that can come to patients and some of the potentially negative repercussions of using consumer data and not having regulation around it. Rae Woods (24:07): And this is one where I think we can all agree that we want to and need to see health leaders get more involved. First of all, they are on the hook for some of these decisions that they make and the knowledge that they may or may not have, particularly when it comes to things like abortion. And a year later, we are still getting a ton of questions from clinicians and from the delivery system about what this means for them. So I might vote we pause and spend an episode specifically talking about the implications of consumer data in the wake of Dobbs. Ty, are you willing to come back again for that episode? Ty Aderhold (24:45): Sign me up for one more. Rae Woods (24:47): Sara, Paul and Ty, thank you so much for coming on Radio Advisory. Sara Zargham (24:51): Thanks for having us, Rae. Paul Trigonoplos (24:53): Thank you, Rae. 
Ty Aderhold (24:54): Nice to be back. Rae Woods (24:59): Look, we are not saying that no health leader should partner with any tech vendor. We're not saying that there is a world in which you cannot use any consumer data. It's 2023, I don't think that's a realistic solution for today or for the future. But here's what is incredibly obvious to me, it is incumbent upon health leaders to be extremely vigilant on who they're partnering with, how they're using technology today, and clearly understand the implications for their clinicians, for their patients, and for their partners. And remember, as always, we are here to help. If you like Radio Advisory, please share it with your networks, subscribe wherever you get your podcasts and leave a rating and a review. Radio Advisory is a production of Advisory Board. This episode was produced by me, Rae Woods, as well as Katy Anderson, Kristin Myers, and Atticus Raasch. The episode was edited by Dan Tayag, with technical support by Chris Phelps and Joe Shrum. Additional support was provided by Carson Sisk and Leanne Elston. Thanks for listening.