Data Ownership & Data Privacy – The Next Frontier
Video Transcription
Hello and welcome to this session titled Data Ownership and Data Privacy, the Next Frontier. My name is Tony Fiola and I lead our cardiac diagnostics business at Boston Scientific. I think it's generally safe to say, as evidenced by this conference, that it's generally accepted that the future of healthcare will include a more digital ecosystem driven by data. Everybody believes that and has an interest in it, and that's why you're here. But for that to truly happen, a couple of existing challenges must be addressed. One challenge that came up already today, for example, is reimbursement, or compensation for using some of this information to better care for patients. It's also generally accepted that today, in the U.S. anyway, the healthcare system is very reactive in nature, and most of the reimbursement is around the treatment of patients who are already sick. Obviously, what we want is a more proactive healthcare system where we can screen, predict, and prevent future disease, but we need the reimbursement to follow that. Another thing we've got to address is data ownership and data privacy. It's no secret, you can just look around this room, that there's been a significant increase in the volume of data that clinicians are being asked to deal with, much of it produced by patients, especially when you consider the wearable market and all of the data coming from these different wearables. But core questions in data ownership remain. Things like: who owns the data? That came up earlier. Can the data be co-owned? Are the cybersecurity risks mitigated enough to the point where we can use it efficiently? And I think Chris, the CEO of Viz.ai, said it really well earlier today when he said data ownership is not very clear. So today, we want to continue that conversation about data ownership and data privacy by starting with what the current law states.
To help me with this, since I am not a lawyer, and the truth is, other than watching a few episodes of Law & Order, that's about as much as I know, are two actual attorneys: Christina Monticello and Payal Kramer. So thank you both for joining us. I'll say just a little bit about the two of them. Christina is a partner at AHSHSLASKY, did I say it right? You got it. Excellent. It's a nationally recognized boutique health law firm based in Princeton, New Jersey. She has extensive experience in the privacy and health information technology space, representing some of the most sophisticated health care clients in New Jersey and nationally. She has counseled several leading health information exchange organizations, hospitals, and health care systems, and works extensively with health care clients in the areas of HIPAA and other privacy and security laws, health information technology and electronic health information exchange, fraud and abuse, state licensing requirements, and other federal and state laws governing health care entities and the exchange of health information. Payal is counsel at BakerHostetler in Atlanta, here locally. So thanks for making the drive. She specializes in the clinical trial and data-based research space. She counsels research institutions, health care providers, and pharmaceutical and device manufacturers on all aspects of the conduct of human subjects and data-based research. Payal's experience includes FDA, OHRP, and HIPAA compliance, guidance on Medicare billing related to research, federal grant compliance, contract negotiations, and guidance on structuring relationships between institutions, researchers, CROs, and sponsors. So welcome to you both, and thank you so much for being here with us today. So why don't we get into it with an introductory question, and Christina, I'll start with you.
To a non-legal audience, which I assume most of our audience here today is, how would you describe the current state of patient data ownership, and are there any industry-regulated standards that people should be thinking about? Sure. So right now, across the United States, there really, unfortunately, is no consensus when it comes to ownership of patient and consumer health records. Ownership is something that we have talked about for years, probably decades and decades, but at the end of the day, that ownership piece never really makes it into the regulations or the statutes; it kind of takes a backseat in the final product of any regulations that come out. When we talk about health records specifically, so a medical record that gets created or generated by a hospital or a doctor in their private practice, there's really only one state right now in the entire United States that I'm aware of that actually gives ownership of that patient record to the patient, and that is the state of New Hampshire. When you start to look at the other states, there are about 19 or 20 that expressly give ownership of the medical record to the provider. So, very different from New Hampshire, they recognize that these are the property of the provider, in that case the hospital or the doctor or the other clinician. And then the rest of us, including my home state of New Jersey, haven't addressed the topic specifically at all. It is silent. However, in a lot of those states, including New Jersey, a lot of providers do still take the position that these are their records, right? The hospital owns the records. At minimum, it's the custodian of the records.
And if a doctor leaves to go to another practice, or they become affiliated with a different hospital, it's not that physician's record, it's not the patient's record to dictate, but the physician may still be entitled to have access to and copies of records if they need to continue care of the patient. So that's looking at it across the state level for medical records. When we start to talk in the broader context of consumer health records, so data that's created and maintained by a fitness application like Fitbit, or by wearables or medical devices, right? It becomes a lot more unsettled there, and we have far less consensus. There aren't regulations that expressly say the medical device company owns that data or the patient owns that data in that case. So we're looking at that very silent approach that a lot of states take with respect to those records and that health data as well. So when you start to talk about that broader health data, obviously if you take that data and you port it in, or a provider pulls it into their medical record in New Hampshire, the patient's going to own it once it becomes a part of their medical record, right? The wearable data is going to go into the medical record; the New Hampshire patient owns that. In Georgia, the provider would own it, because I believe Georgia is one of the states where the provider owns that record. But in the hands of the health application or the fitness device manufacturer or some other wearable company, that data is usually going to be treated, for now, as belonging to that medical device manufacturer or fitness application or other digital health application. When we start to move out of the ownership discussion and look at it through the lens of patient control and patient rights, that's where you have a much more universal consensus across the United States, and even globally outside the United States as well.
And almost universally, when we talk not about ownership but about the control that the individual who is the subject of that information should have, to be able to direct what happens to their information in certain circumstances, that's where we all have a general consensus: yes, the patient, the consumer, should have some level of rights and some level of control, and should be able to dictate what happens to their health data in that context, again, moving away from that actual ownership standpoint. And it's going to vary depending upon where you are. Right now, we don't really have comprehensive federal privacy legislation outside of the context of HIPAA. And while there has been some recent legislation introduced to try to address that, which Payal will talk about a little later, even federally we just don't have that conversation about ownership. We're really always just talking about control and rights that a patient may have with respect to the data. Thank you. Payal, would you add anything from a research standpoint? Sure. So in the research space, there are really two types of patient data. There's the data that is collected specifically for the purpose of the research. And then there's the data that serves some other purpose. Maybe it already exists in a medical record. Maybe, while it's collected pursuant to a protocol, it's added into the medical record. For that second type of data, it's going to be what Christina said: it's going to be regulated by those laws that deal with health information held by a provider. When we're talking about data that's collected specifically for a study, that is typically owned by the funder or sponsor of that research. And that's typically laid out in the contracts for conducting that research between the sponsor and the researcher. It's typically made clear to the patient in the informed consent form.
And truly, the purpose behind conducting that research is that the funder wants that data. Where it is a little bit different is where the funder of that research is a federal agency. In those instances, there's more of a desire to share that data across other researchers. Okay. Thank you. So, Christina, I'm going to come back to you. In your answer, you talked a little bit about it varying state to state, which can be somewhat confusing. In some cases, you talked about the manufacturer owning some of the data. In some cases, you talked about the role of the patient with respect to their data. But if you could bottom-line it for me, or bottom-line it for the audience the best you can: is an individual's health data their own commodity? Yeah. So, to some extent, it's a yes, but it's also a no, like the attorney's favorite answer there. Unlike other types of commodities, where the individual has more of an absolute right to sell their product or sell whatever good they have, right? We don't have that neat concept when we talk about medical records and health data. It just does not translate over as easily as if you're talking about selling widgets, right? That's the classic law school question we always got in contracts law, and in business associations law, when we talked about rights and property rights and transactions. It's always, I have 100 widgets to sell, and then you would apply your law school answer. So, it's not like selling 100 widgets where you're holding them out as the best and most up-to-date widgets in the industry, right? It just doesn't translate over as well. When we look at data and we look at databases, right? Data is transient. It's incredibly useful. It's absolutely necessary. I think Chris Mansi said, look, the practice of medicine is a team effort.
If we're not exchanging that data in order to treat our shared patients, then getting a complete picture of the patient's medical conditions becomes very difficult, and the data needs to be able to follow along. So, putting that property-right construct on a patient's data has never really sat right in the United States, even at the state level. And when we look at it for public policy reasons too, again, we recognize how critical the need is for that data, and how, if the individual did have those absolute rights, they could just unilaterally dictate what can and can't be done with their data in all circumstances, regardless of what else the need might be out there. So, for public policy reasons, we have carved out all of these exceptions, and we have taken a step back from those ownership concepts and talked again about, well, okay, what rights should the individual really have with respect to that information? What control should they have? In what circumstances is it okay to not get the patient's consent? How can we use that data to spur innovation, for predictive analysis, right? For internal healthcare operations and other really important public policy goals. It's not that, you know, the provider should be paying the patient in order to use their data. I'm the patient and I'm paying you as clinicians to provide me with treatment, right? I'm not going to Google my health conditions. And you need to be able to use data for that purpose, and obviously for other critical reasons. So, it's part of that balancing act that all states have had to struggle with, right? In terms of, well, where do we give on this? New Hampshire obviously is the only one that took that direct ownership approach. But even in line with that ownership, they still reflect those same rights and controls, right?
The individual still does not have an absolute right to trade in that commodity, which is their medical record, even in the state of New Hampshire. And so, I think over the past five years, really beginning with California, I'm not sure how many of you might be from California, but you know the CCPA was the first real comprehensive state privacy legislation. It was more of a consumer rights aspect. But starting with California, states started to sort of re-weigh all these balances, right? And take a harder look at the privacy of health data, and the privacy of other data that might not be health data but is, again, consumer-generated and being used by companies. And so, with New Jersey being one of the most recent, all the states have kind of started to fall in line, and we have about half that have re-undertaken this balancing and started to shift things. And again, they're keeping the ability for providers and others to maximize the potential of that data, recognizing all of those important public policy needs, but at the same time taking a hard look at what rights the individual should have and how much control they should have with respect to their data. Okay. Thank you. Payal, is the same general approach shared in the context of research? So in research, when we're talking about data that's held by a HIPAA covered entity, the patient has the right to give their authorization to allow their data to be used for research purposes, but that right isn't absolute. So HIPAA carves out several exceptions that are specific to research, to which the patient doesn't have the ability to say yes or no.
So, purposes that are preparatory to research: you know, if you're developing a protocol, you want to see if you have the patient base needed to conduct a particular study, and you access patient information to be able to make that determination. You may be able to have access to a limited data set, which includes protected health information such as dates, to be able to conduct database research without the patient's authorization. You can actually get full access to PHI with a waiver of authorization from an institutional review board or an ethics board. So that's where the patient's control is limited. And for FDA- and OHRP-regulated research, patient consent is required, but again, waivers might be available in those cases as well. Okay. Thank you very much. So in this case, I represent a device manufacturer for the sake of this conversation. And we know that patients today encounter multiple data stakeholders, right? From the patient to the healthcare provider, to the device or pharmaceutical manufacturer like myself, researchers, mobile app developers, et cetera; any number of people can come across the data. So maybe we can start looking at these other entities, if you will, that might be involved. And Christina, I'll start with you. Do the laws dictate what a company can and can't do with the patient's data, or does the patient dictate what can and can't be done? So it's a mix of both. When you look at the commonalities, I mean, we have these disparate state laws. We have HIPAA, which has been the only federal rule of law right now for patient privacy of health records. We have some FDA aspects; we have the FTC getting into the privacy and security mix. But a common element is that the laws will all dictate the type of stakeholder that is going to be responsible for complying with the law, right?
So HIPAA does not, and there was a lot of misconception about this during COVID, it does not apply to all of your health information no matter who has it and where it goes, right? We have covered entities, which are the entities or individuals who are responsible for complying with that law. So the laws all address specifically the type of stakeholder and what that stakeholder needs to do or can't do, as the case may be, depending on the law. In the HIPAA space, we have your traditional healthcare providers, we have health plans, and you have your health care clearinghouses, like Change Healthcare, which we all recall from the beginning of the year gave probably 80% of the country so much headache with that breach that happened with them. The second commonality when it comes to data stakeholders is that the state laws and HIPAA all also address the type of information. So again, they don't apply very broadly to all information. When it comes to what the stakeholder can and can't do, they're going to very carefully, or sometimes not so artfully, state the subsets of data that the stakeholder is responsible for protecting, as well as what the individual might be able to do with that data. The statutes will also sometimes wholly exclude categories of individuals and stakeholders from having to comply, right? So HIPAA is one that's very narrow, right? If you are a health application but you're not a covered entity, you don't need to comply with HIPAA, unless for some reason you are providing a service as a business associate on behalf of a covered entity. If you are a medical device manufacturer, you are also likely not responsible for complying with HIPAA, because you are not a healthcare provider that is engaged in one or more standard transactions, right? And that goes on and on.
New Jersey's privacy law, which just went into effect not too long ago, is a little unique because of what it doesn't carve out. What a lot of state laws will do is carve out covered entities and say, well, as long as you're complying with HIPAA, you don't have to comply with our state law. New Jersey is not one of those. We decided that even if you're a covered entity for purposes of HIPAA, you are still a stakeholder that must comply with the state privacy law; what it instead does is carve out PHI. So it says you need to comply for all other data that you maintain, but you don't have to comply with our state law for PHI, because we recognize that HIPAA is in that space and you can comply with HIPAA with respect to PHI. So states will take a little bit of a different approach when it comes to types of information, as well as the categories of individuals, the stakeholders, who have to comply. Sometimes you have a lot of other state laws that sit outside of that and even more granularly get into things like sensitive information, sensitive data. In New Jersey we also have HIV-specific confidentiality laws, and genetic privacy laws that act as an additional layer. So a stakeholder, regardless of who they are and what type of business they might be engaged in, may be responsible for complying with that law simply because of the type of information that they process, maintain, use, and disclose. And finally, I think another key aspect of all of that, with respect to stakeholders and what they can and can't do and what the patient or the consumer can and can't do, is that there's always that transparency element that's common in a lot of these regulations, right? We as the health application, or as the hospital, have to tell the patient: here's what we're allowed to do, right? Here's how we can handle your data if we're allowed to process it.
And here's what we can't do. And here are the rights that you have, right? Here are the circumstances where you must give us consent. And here are the circumstances where your consent is not needed. Things like treatment, payment, healthcare operations, those types of things that we're accustomed to using PHI for under HIPAA; you'll sometimes see elements of that also in those state privacy and consumer rights laws as well. Okay, thank you. Payal, I'll throw it over to you. So Christina talked a lot about stakeholders. Which stakeholders predominantly control the data collected by a researcher or a pharmaceutical or medical device company? So the FDA and OHRP, the Office for Human Research Protections, which oversees federally funded research, both consider any research that includes individually identifiable information or protected health information to be human subjects research, even if it's not an interventional study that's being done on a patient. So the FDA regulations and the OHRP regulations would apply to any of that research, which includes institutional review board oversight to make sure that the rights of the patient, including their privacy rights, are being protected, and to make sure that information about confidentiality, and risks to confidentiality, is provided within the informed consent form. And then in addition to that, when the researcher is a covered entity under HIPAA and the research they're doing is a covered function, which gets a little more complicated, they have to comply with the HIPAA requirements as well. So Payal, maybe a follow-up question before we move on. How does a mobile health app come into play? Is it the same? Is it different? So it depends. If the mobile health app is considered a medical device by the FDA, then it has to comply with the FDA regulations.
A mobile health app is considered a medical device to the extent that it either adds to an existing medical device or it transforms the platform into a medical device by claiming to diagnose, treat, mitigate, or cure a health condition or a disease; it's then a medical device itself and will be regulated by the FDA. In addition to that, the FDA has really been interested in digital health technologies, even if they aren't quite medical devices, in the space of clinical trials, because they're making clinical trials more accessible to communities throughout the country. And the FDA's guidance says that if you're using digital health technologies in your research, even if they aren't the products being researched, the researcher is responsible for making sure that the digital health technology is secure, and for making sure that patients or participants in the research are aware of the risks associated with using that digital health technology. Okay, got it. Thank you. Maybe one final question on this topic. We talked a lot about the different stakeholders, but who predominantly controls the data that's collected for the actual practice of medicine? So generally speaking, that is going to be the healthcare provider, obviously with certain limits that HIPAA and state laws might place on them, whether it's a state licensing law that prescribes how they may or may not use the information, which might be a little different from HIPAA. You also have things like the board of medical examiners, which may apply to a physician holding those records as well. But generally, providers would be permitted to use those records and generally dictate what happens to them in the normal course of their practice. So they'd be able to use them and generally make decisions in terms of how that data is used internally for things that fall into the treatment type of bucket. You can use it to train your staff. You can use it to get paid.
You can use it to analyze and use for quality improvement purposes. All of that can be driven by the healthcare provider, especially in terms of internal use. When we get outside of the provider, too, when we talk about things like disclosures, that's still driven by the healthcare provider, in that context of course within what HIPAA and state laws will permit. So the information can still be disclosed as part of that collaborative part of medicine. The provider can determine that they need a consult with another provider, that they need someone to take another look at the data. They can share that information; they can drive that without having to get the patient involved and get the patient's permission. There's no consent needed for a lot of those critical public policy purposes and treatment-based functions. Okay, got it. Thank you. Okay, so we've talked a little bit about the current state of data ownership. We've talked a little bit about the role of different stakeholders when it comes to a patient's data. But let's talk a little bit about the use of unidentified or de-identified data. And maybe Payal, I'll start with you. How does this concept of individual data ownership or control intersect, and I think Christina touched on it a little bit, with the use of patients' health data for the public interest and public benefit, again, for things like research? So research is probably where there's the greatest tension between an individual's right to the privacy of their data and the use of that data for public benefit. And our federal agencies that regulate research have a strong sense that sharing that data will accelerate medical innovation. The NIH, for example, considers itself to be the steward of the nation's research data and has implemented requirements that, when applicants are seeking funds from the NIH for research, the applicant needs to include a data sharing plan in their application.
So: what is their intent, and how will they share the data with other researchers to continue the progress? Now, the NIH does say that they need to consider legal and ethical issues that may arise from sharing that data. So they're certainly recognizing that there are other regulatory structures at play here. The FDA as well, in response to the Cures Act, created a framework for the use of real-world evidence in research to support new indications for existing products or to support post-market approval application requirements. And real-world data is data that is routinely collected in the course of providing medical care. So it could be information that's in the medical record, it could be information that's in billing or claims data, or even patient self-reported outcomes. And real-world evidence is then the clinical evidence about a product, and the risks and benefits of that product, that comes from analysis of this real-world data. The FDA actually created the Sentinel System, where it collects a lot of this information, and is advocating for the use of this information, especially in places where a control arm might be unethical in research. So there are many studies now out using real-world evidence. Okay, and maybe just a follow-up to that. Are patients able to opt out of certain secondary or de-identified data uses? And if so, how is that generally accomplished? So, we talked a little bit about HIPAA authorizations and how patients can't control the use of de-identified data in research through the structure of HIPAA. But here's what HIPAA did create: generally, an authorization should be for a specific purpose, but HIPAA created an exception for research, saying that there can be a broader authorization for undefined future research purposes. That authorization just needs to describe the research in a manner such that the patient could reasonably believe that their information may be used for that purpose.
So an authorization for future research is an option where patients can opt in to allow their information to be used for that. Additionally, the NIH, in talking about their data sharing plans, does tell researchers that in the informed consent forms, they need to notify patients of how information will be shared pursuant to their data sharing plans. Okay, thank you. Christina, anything you'd add to that? Yeah, so looking at the states, there's a notable absence of research-specific language in a lot of the privacy and consumer protection laws, right? They'll usually exempt or greatly expand permitted uses when you talk about de-identified data. It's a pretty common carve-out: however the state defines personal data or personally identifiable information, whatever they call it (personal data seems to be the more trendy terminology for consumer protection rights at the state level), they'll say, if the data's been de-identified in a specific fashion or anonymized, it's no longer subject to the state law, and as far as the state's concerned, they're kind of hands-off about that data then. There are only some exceptions to that, and that is where maybe the data's not fully de-identified, but it's been stripped of some data elements, et cetera. So the consumer might still have some ability to opt out in those cases of things like targeted advertisement, marketing, or research, even if it's not being used on their full identifiable data set and they're only really looking at a small subset of that. But when we talk about de-identified data, that's a big gap at the state level that just isn't addressed in most of the consumer and privacy protection laws. Okay, got it. Thank you both. Maybe one more question on this one, and, Payal, I'll throw it to you.
Where patient consent or providing an opt-in, opt-out approach is required, what are some of the challenges, and maybe some of the solutions for managing this, that people should know about? So, the biggest challenge that I hear from clients is: how do we operationalize this? You know, there's always the, well, if you want to create a research registry, then you get consent and authorization from every patient who walks in the door. Well, you know, you're already giving them a stack of paper, so how are you going to add one more consent document into that? And even if you do that, the other challenge is, well, what if you didn't capture every possible use that you had in mind? And then, if patients opt out, how do you make sure that every software system that you're using is able to code that opt-out, so that you're not accidentally pulling the data of that particular patient, because there isn't great interconnectivity between all of these systems? You know, one of the ways to address this, if you want to move forward with the research, is to look at the option of a waiver of HIPAA authorization and consent, available through an institutional review board or a privacy board, that allows access to that data as long as certain requirements are met. Great, thank you. Christina, how do you think about this in the context of the 50 states and how each state is attacking this? Yeah, so some states, again, when it comes to de-identified data, don't really look to those future uses and disclosures like the research context does, to address those downstream uses that may be a byproduct of data or research that's done. Some of them do, though, have a little more of a prescriptive consent process that has to be met, right? Where the patient has a right to opt out, the opt-out needs to be, like, conspicuous; it needs to be presented to them in a manner that's not, you know, embedded in legalese.
As a lawyer, you know, that is a problem for us. We have these massive boilerplate contracts, but if the individual is not really given an opportunity to give meaningful consent, or you're telling them that they can opt out but it's buried in 30 pages they aren't even required to click on and they have to go find it somewhere else, you're not actually presenting it to them, right? Some states have gone to the effort of making sure that consent, and opt-out, can't be captured in that way. But it is challenging, looking prospectively at those future disclosures. And one of the things the healthcare industry is struggling with right now is actually in the context of sensitive data. There are some new technological solutions being developed to help tag the data, whether it's de-identified data or an identifiable data set that is subject to restrictions of some kind: tag the data in that electronic setting and have the consent follow the data wherever it goes. And this is really important with some new requirements for entities that deal with substance use information, for example. It's not something that's necessarily applicable to your practice, but you might be part of a health system that has a substance use addiction specialist, right? One of the requirements there is that when you get consent for the first time from a patient to disclose their substance-use-specific information, which has some additional requirements that HIPAA doesn't typically place on a medical record, the consent has to actually follow it downstream. So that's one situation where we're really looking to technology to provide us with these solutions. It's something Europe is looking at too: can we have technology tag the data and better allow individuals consent, not just over identifiable data, but over how byproducts of that identifiable data, right?
All this aggregated de-identified data, can be used in a way that's meaningful for them but still preserves the usability of the data itself downstream. Okay, great. Thank you. So we've talked a lot about laws that govern at the state level, but it seems like when talking policy we should be talking at the national level, and yet it sounds like we basically have an absence of a comprehensive United States privacy law. Is that fair? Correct, yep. You'd say that's correct, okay. So Payal, maybe I'll start with you. How does the American Privacy Rights Act draft affect HIPAA and research? So the American Privacy Rights Act is the congressional effort to create a federal data privacy law. It would apply to entities that collect, process, or retain personal data and are subject to the FTC. Personal data is very broadly defined; it does include medical information, genetic information, and biometric information, amongst pretty much all other types of information. And it does exclude some entities, so entities that are smaller in number, where it, sorry about that. [Building announcement: May I have your attention, please? May I have your attention, please? The alarm has been investigated, and it is safe to reenter the building. You may resume your normal activities. You may still see the flashing strobe; however, there is not an emergency in the building at this time. Thank you.] That's good to know. That is good to know. I guess we missed something. All right. As you were saying. So the APRA proposes that entities in compliance with HIPAA are essentially deemed to be in compliance with the Privacy Rights Act. In the research space, there are several uses that are permissible. It includes the use of personal data for any peer-reviewed research that's conducted in the public interest, and it allows use of the data for any research that would be regulated by the FDA or the OHRP.
And then it allows the use of de-identified data for purposes of product development outside of peer-reviewed research. Okay, thank you. Christina, maybe over to you. Any non-research implications that you see? So, the APRA is trying to set that federal floor, right? It's trying to raise the bar from what we have with HIPAA; it's trying to reach a broader array of data. Those in the privacy community are hopeful that by deeming entities subject to HIPAA to be in compliance, it won't just add one more law that they have to juggle. So there's the hope that it'll somewhat align the regulatory and compliance efforts there. But at the same time, at least as currently drafted, the APRA is not going to fully preempt all of the various state privacy and consumer acts just by setting that federal floor. If this gets passed, and I would note it was tried two years ago, a different flavor, a different version of the legislation, this one is trying to address some of the concerns people raised with the prior legislation and the reasons that one may have failed to pass. So the APRA is not going to completely preempt all of those other state laws. California's CCPA is still going to exist. New Jersey is still going to have its privacy act. What entities are instead going to have to do is make sure they understand, like we have to do now with HIPAA, where the federal APRA might preempt the state law and where it doesn't, and where they have to make sure they're complying with both laws. The APRA is not going to automatically deem you to be in compliance with all of those state laws, licensing, consumer privacy, whatever the case may be. Okay, thank you. Payal, are there any other policy trends that you're seeing, whether from the FDA, which I know we've got some FDA folks here, or otherwise?
Yeah, so the FDA is advocating for an increase in diversity among clinical trial participants, to see how products are affecting the community at large. As part of that, the FDA is shifting towards a more decentralized clinical trial model and advocating for the use of digital health technologies in clinical trials. So they're shifting their focus a little bit more to cybersecurity related to these devices. In the spring of this year, the FDA released draft guidance on cybersecurity for any device that connects to the internet in any manner. As part of the premarket approval application, the manufacturer must address the vulnerabilities of that device and how the design of the device is intended to address those vulnerabilities. The other aspect we're seeing is that the OHRP has adopted the Common Rule, which was revised a couple of years ago after a long, ten-plus-year review of potential options for amendments. The FDA is now working to harmonize with the requirements of the Common Rule, and it recently published updated guidance on informed consent processes, which includes really beefing up the confidentiality requirements in consent forms. Okay, thank you. Christina, based on what you're seeing in the industry and all of these different laws and regulations, how do they affect stakeholders in terms of their data strategies and thinking about data control? All right, so it depends very much on whether you are a stakeholder sitting in one state, doing business with and holding data of residents within one state, versus operating domestically across the entire country. Let's take a typical New Jersey health system. You have a couple of hospitals, you have your outpatient services, you have affiliated physician practices, because New Jersey is one of those states in which the hospital can't outright own a physician practice.
So we have artfully crafted these affiliated practice arrangements between the physician groups and the hospitals. You may have nursing homes, you may have home health. You have all of these different service lines, all with their own state licensing laws, but you don't really have to worry about anything outside of your state box. Now take a digital health application that is rolling out a product to residents in ten different states, starting with California and expanding; they basically want to be able to cover the entire U.S. and market their product. It's a very different compliance challenge for them, because instead of just looking at California, instead of just focusing on New Jersey, they now have to be cognizant and aware of all of these other state laws. So I think the first critical thing, whether you're in one state or operating in multiple states, is really to implement privacy and security by design from the very beginning of whatever the project or initiative might be. If you're a hospital and you're bringing a new technology partner on, from the very beginning have that dialogue with them about how their systems implement privacy by design. Do they take into account what the data sources are? What data do they need? How is that data going to be used? There might be a difference between data that's being held and used for one purpose, and data over here that's going to be collected but won't be used for the same purposes as the other set of data. So from the very beginning, just look at that privacy by design, which is not a new concept, but it is something that is increasingly being focused on, and it is a huge concept over in the EU. And even regulators like the FTC, and probably the FDA, are starting to do that too.
And even the FTC, which regulates things like unfair and deceptive practices, which as a clinician you might not think of as applicable to your organization, but it is if you have anything on your website that holds out what you're doing, like your privacy notice, right? If you're collecting information from patients who come to your website, their IP address, all of that could trigger FTC enforcement and regulatory review. But even they are starting to release guidance saying that if you're an entity subject to their enforcement, you must really start to look at privacy by design from the very beginning stages. I can't tell you how many times I've been brought into a project that's been going on for a whole year, and I start to ask, okay, have you mapped out the data that's here? Do you know what state laws you're looking at? And it's crickets: nobody's asked us about that, we'll have to get you some answers and get back to you. You can head off a lot of that by making sure this is part of the conversation up front. So that, I think, is definitely the most critical thing there. Okay, great. I just want to follow up on something you said specific to Europe. We've talked a lot about the national level, and we talked about data ownership at the state level, where it seems like there's a little bit more structure, but you brought up Europe, and I know many of the people here either work at or work for companies that operate globally, and a lot of the clinicians in the audience certainly have a global footprint as well. Maybe we'll just take the EU. So even when we think about the EU or the GDPR, how are they thinking about data ownership?
So when we talk about data ownership, the GDPR is the European Union's very robust and very stringent set of data principles: the individual's consent is needed for really any type of processing of personal data that's subject to the regulation. And even the GDPR, as stringent as that series of regulations is, doesn't talk about ownership either. It's all about privacy and security and control, giving the individual who's the subject of that data the ability to exercise that control. But the GDPR and the European Union also back away from that ownership question, and that's something that may well be dealt with on a country-by-country basis within the EU. Regardless, when we bring it back to that conversation about control and what rights the individual has, the GDPR is much more stringent. That's where the CCPA really got a lot of its concepts and built upon them: the concept of a controller who is collecting data from an individual, where that individual needs to give consent to process all of that data, with many fewer exceptions than we're used to in the HIPAA space in particular, or even at the state level. We've had the GDPR in place as written for a couple of years now, and the EU is starting to build upon it even further. They want to take it to the next level, to make sure that data is usable, particularly for public health purposes and for research, and to give individuals a lot more say in, and really the ability to direct, how that data can be used. Even when we're talking about things like de-identified data, which in the US, I'm sure you all know, big data is just a major industry, and because it's been de-identified, a lot of us don't really have the ability to say, well, I really don't want my data being used, even if you can't identify me as the subject of that.
I'm not comfortable with my data being profited off of like that, right? And so the new initiative, the European Health Data Space initiative, is trying to tackle some of that, but it's also trying to tackle some other things that we in the United States have been dealing with somewhat separately from the privacy conversation. In addition to dealing with this issue of making data more accessible in the EU and usable for all these public policy purposes, this series of regulations will also address things like interoperability. If you're a hospital in England and you have a patient who is somewhere in France, you need to be able to communicate easily and get access to records, and the patient needs to be able to get access to those records very easily, both in France and in England. So they're trying to tackle that, and where we have usually kept that out of the privacy space, with a whole separate series of regulations, the interoperability regulations and information blocking rules, they're trying to tackle all of that now as well. So I think we're at the forefront of that, whereas when we talk about privacy, I think the GDPR was the big push that we're following now; it's sort of the opposite when we talk about interoperability, making data accessible and easy to use, and also information blocking, right? That I, as a patient, should be able to get access to my medical record wherever it is, whoever has it. It's not like Elaine from Seinfeld trying to sneak a peek at the paper chart the physician has, and then the physician blacklists her, right? We're talking about electronic data. The patient should be able to get that at their fingertips. All of their doctors, right? This is a team effort.
All of their doctors should be able to just get that at their fingertips, and they're trying to make that happen across the European Union, where these are separate countries all participating in it, so a little different politically than what we have with our states. Okay, thank you. Well, hey, before I see if I'm smart enough to figure out if there are any audience questions on this iPad, I thought I'd throw out some real-life scenarios that I've either heard about specifically from customers or that we're dealing with. So, Christina, why don't I start with you? Let's imagine this scenario. A patient walks into a physician's office with physiological data that they got off some consumer wearable they bought online, which may or may not be accurate. One: does the physician have a responsibility to rely on that data in their treatment plan? Two: if they do rely on that data from the consumer wearable, and maybe they find out later it was inaccurate, are they responsible for that? And three: what's the responsibility of the consumer wearable or fitness device manufacturer? So, starting with the first question: does the physician have a responsibility even to look at the data that the patient is bringing to them, to even do something with it? There's no state law, and there's no federal law, that's going to say the physician must utilize all data that a patient brings to them, whether it's from a wearable, a digital health app, et cetera. At the end of the day, determining what information is of clinical value to your treatment of this particular patient is really going to be professional discretion. Just like any other information that a patient might bring to you, you don't have to use it.
Is there value in that information, in that wearable data? If it's just a Fitbit and they're saying, well, here, I slept for eight hours, and you're not treating them for a sleep disorder, you may have absolutely no obligation to take a look at that data. If, however, you have a health application and you ask the patient to use it, now we're not talking about just something somebody got off the internet. You're telling the patient, I'd really like you to use this remote heart rhythm monitoring application because you have a history of heart attack, because you have a history of stroke, and the application is picking up on irregularities. And the patient says, well, yeah, here I have 30 days of data, there's a lot of this, it's alerting me to everything. In that case, the physician may have a professional responsibility to take a look at that data. The data is likely more valuable to the clinician and relevant to treating the patient, so that would be a scenario where the physician may have more of an ethical and clinical duty to use that data. The second part of the question, I think, was: okay, they've used the data, now what happens next? So, all right, the physician has decided they're actually going to use this data. It's clinically relevant, it's useful to them in their practice and their treatment of the patient. If they are using that information to make a treatment decision, or any other type of decision, about the individual, or even if, not for that specific patient but in the past, they have used that type of information, this remote heart rhythm data, to make decisions about other patients, then from a HIPAA standpoint, for sure, that information is likely going to be part of the designated record set for the physician.
So that means all of the HIPAA obligations that attach to designated record sets, all the rights that a patient has with respect to them, are going to attach to that data. And, depending on the organizational policies of the physician practice, or the hospital, or wherever the clinician's practice is located, it may also become part of the medical record. And remember, under HIPAA, even if something is not part of the medical record, it's considered part of what's called a designated record set if you use it to make a decision about an individual. And it's not about every single individual; you can't pick and choose and say, well, I didn't use it for this patient, so it's not part of their designated record set, but I did use it for that patient, so it is part of theirs. It doesn't matter: HIPAA looks at the type of information to determine whether it's part of your designated record set. That means not only does HIPAA attach, but the information blocking laws could attach to that information. So if a third party asks you to give them a complete copy of everything you have about the patient, information blocking is going to attach, which means you can't withhold that wearable data or that digital health data simply because you didn't create it. If it's part of the designated record set, our information blocking laws use that concept to say, well, now you must make that data available, unless there's a very specific exception that says you don't have to. On the medical record side, if you do decide to incorporate that data into your medical record, then it becomes part of your actual medical record and not just the designated record set, so it has all the same implications as when you release any other information from your medical record. And remind me of the third part of that question.
The third part is: what is the responsibility of the manufacturer of that consumer wearable or fitness device? Okay, so that also depends. Always the attorney answer: it depends, it's fact-sensitive, all of that. But if we're talking again about just a generic wearable that the patient gets on the internet, it's a very consumer-driven product. They market to consumers, and the product is used primarily by consumers in their personal or sports life. HIPAA is obviously not going to attach, so we don't have these concepts of designated record sets. It's not part of any medical record, so we don't have any of these state ownership-of-medical-record implications. And there really is no law out there right now, as we've talked about, that says the data is or isn't the device company's data. So they're likely going to treat that data as their data, to do with as they wish, subject to their privacy policy, of course, which triggers the FTC and unfair and deceptive practices. So it's not entirely without limitations; we do have some things that could affect what that app company can do with the data. If they're subject to the FDA, then FDA requirements are going to apply. And in some cases we do have mixed-use products, where the product is held out to consumers but at the same time is also marketed to physicians. So we might have a case where it's both: the consumer is using it independently, but at the same time the physician is saying, hey, I need you to use this other piece of the product, and is using it in their practice, in which case maybe HIPAA is going to apply, and maybe the data is not the health app's and might be the provider's. But again, it's very, very fact-specific. Okay, all right, thank you. Payal, final scenario that I'll give you before we wrap up here.
So device manufacturers like Boston Scientific, and I know many others in the audience, are increasingly incorporating AI into our device functionality. Typically, when we're doing this, we're trying to use research and data to train the AI to be better and more accurate in the future. And typically when we do this, we collaborate with many of the clinicians who are in the audience today, or their institutions. So in a scenario like that, where data is being used for research or for training, so that your devices and your AI are better in the future, what responsibility do the device manufacturers have? And where we're really working with our customers, with clinicians, what responsibility do they have? So in addition to the responsibilities we've talked about, like FDA obligations related to confidentiality, the manufacturer should really consider their contractual obligations with the clinician in conducting the research and obtaining that data. Often what I see is that research institutions or researchers will acknowledge that the manufacturer is not a HIPAA covered entity, but then still hold them to the security requirements and the breach notification requirements that are part of a covered entity's obligations under HIPAA. And then, of course, the manufacturer has to comply with the confidentiality obligations within the informed consent. Okay, great. Well, hey, we have two minutes left, and I got a question through my iPad here that hits a little too close to home. I say this because I have a 19-year-old and a 16-year-old at home. The question is: can you briefly touch on how the privacy laws you've been discussing do or do not address adolescent data, and whether or not parents are granted access? And they go on to say this can be an issue for sensitive information that adolescents may not want their parents seeing.
So I'm just curious what the legal and regulatory considerations in that scenario are. I can take that first if you want. Federally, we do have some protections, and it's not a substantial part of my practice, but I believe it's COPPA, the Children's Online Privacy Protection Act. So we do have some things at the federal level that apply to individuals under the age of 13 and that require parental consent for some things. And I know there's a lot of talk about Facebook and Instagram, and about taking a hard look at those rules and whether they still make sense. I do believe that is an important part of the draft APRA that's trying to set that sort of federal floor. That is an important part of it: they are looking at online data collected from adolescents in particular, and that is a major concern the proposed legislation is trying to address. At the state level, it depends. Again, it depends on the type of data the state is targeting in its consumer protection or privacy laws. A lot of the concern is about online data protection, not just for adolescents but for everybody, right? All the data being collected from individuals when they're browsing web pages that's then being used to target them. Some flavors of those laws do get into adolescents: when parental consent is needed, whether the entity, the data stakeholder there, is allowed to use that data for adolescents, and what kind of hoops they have to jump through in that regard. But I've got to say it's not as detailed as what's being attempted at the federal level. It's not an afterthought either; there's definitely some thought that's been put in by a lot of states in dealing with adolescents.
But from a consumer protection standpoint, they're more focused on general consumer protection rather than specifically looking at the problems we have with adolescents and Instagram, Facebook, WhatsApp, and TikTok. My oldest kid is 11, and I still can't keep up with this, and I'm hoping he doesn't find a lot of these apps for a long time. Got it. Well, hey, thank you both very much. We are right on time, so we will wrap here. I assume you'll both be around for at least a little bit, so if other people in the audience do want to come up and ask questions, please do. But again, Payal, Christina, I just want to thank you tremendously for your willingness to do this. It's obviously a complex topic that we need to continue to talk about if we want to keep moving digital health forward, so I think it's important that everybody at least tries to get an understanding of it. So thank you both for doing this. Truly appreciate it. I hope the audience enjoyed it, and I hope they have a great rest of the show. Thank you. And I'm definitely around. I'm going to check out the FDA session after this, but I'm definitely around, and if you're geeky about privacy like I am, I'm happy to talk with you as long as you want. Great. Thank you both. Thank you.
Video Summary
The session titled "Data Ownership and Data Privacy, the Next Frontier," led by Tony Fiola of Boston Scientific, explores the complexities of data privacy and ownership in modern healthcare. The panel includes health law experts Christina Monticello and Payal Kramer, who share insights into current laws and future legislation.<br /><br />Key issues discussed include the reactive nature of the U.S. healthcare system and the need for reimbursement models that support a proactive approach. The panel examines the explosion of data from wearable devices and the unclear ownership of such data. Monticello highlights the lack of consensus among states regarding data ownership, noting New Hampshire as the only state granting ownership of medical records to patients.<br /><br />Data privacy laws vary by state, adding complexity for organizations operating across multiple jurisdictions. Some states, like California with the CCPA, have more stringent privacy laws that include provisions for de-identified data and sensitive information. Monticello and Kramer also touch on the regulatory roles of HIPAA, the FDA, and the FTC, and their interplay with state laws.<br /><br />The discussion stresses the importance of privacy and security by design in handling data and the challenges of managing patient consent and opt-out mechanisms. The federal American Privacy Rights Act aims to create a more uniform national standard, though it won't entirely preempt state laws.<br /><br />The session concludes with real-life scenarios, exploring responsibilities around data from consumer wearables and the use of AI in medical devices. The panelists agree on the crucial need for ongoing dialogue and regulatory clarity to advance digital health.
Keywords
Data Ownership
Data Privacy
Healthcare
Wearable Devices
Legislation
HIPAA
CCPA
Patient Consent
AI in Medical Devices
Federal American Privacy Rights Act
HRX is a Heart Rhythm Society (HRS) experience. Registered 501(c)(3). EIN: 04-2694458.
Vision:
To end death and suffering due to heart rhythm disorders.
Mission:
To improve the care of patients by promoting research, education, and optimal health care policies and standards.
© Heart Rhythm Society
1325 G Street NW, Suite 500
Washington, DC 20005