Ethical Considerations in Digital Mental Health
Presentation Q&A
Video Transcription
So, Dr. Nebeker, we've had a few questions come in. You just mentioned that there is a network of about 900 individuals working on ethical issues around technology. How can a site tap into that network if they're trying to build a library of apps that they want to use with patients?

If they want to build a library of apps, there are a couple of resources, depending on what the needs are. The community we have on the Connected and Open Research Ethics (CORE) platform can be reached through the forum we've created; if you go to our ReCODE Health website, the CORE platform is accessible as one of our tools. Posting a question on the forum to ask what libraries may already exist and how they've been vetted would be a first step, just to leverage the community. We have people in the network from the UK, from Australia, and obviously from the U.S., and there are some really amazing things happening globally; we really are trying to help people not reinvent the wheel. There are also some very good resources from our colleagues up at UC Irvine, and from Dr. Torres as well. Those are folks who are also part of our network.

How would you suggest we begin to talk to our psychiatry trainees about using technology with patients?

I think it's really important to ask patients what they would like to learn about themselves, and to find out how comfortable they are with using technologies to begin with. What we've learned is that the older adults we've worked with are often quite unfamiliar with how to use apps. We've also done a study of apps targeting older adults, and many of those are, frankly, dangerous, because they're not designed for older adults; they're essentially preying on people. So I would suggest that trainees receive some education on what these apps are, how they're used, and the differences between apps that have been clinically tested and are used in healthcare deployment versus commercial-grade apps that could still be useful as a tool. It really requires thinking through what the app does and, going back to the framework: How are the data managed? How are privacy expectations considered? What is the person's literacy level with respect to data and technology? Is the app usable, and has it been tested with that person? Is it accessible to them? Do they need internet access? Might it cost them more money? Having a checklist and using that kind of framework to introduce trainees to how they will need to think about these things going forward is a good step in that direction. But also, what is their role in educating the patient, and do they want a relationship in which the patient captures information about themselves using a digital tool and shares that information with their provider? That kind of patient-provider relationship could really benefit from the patient's ability to self-monitor, using a tool that is safe and that can be shared with their provider.

We have several clients in our clinic who wear Fitbits, and we report their Fitbit progress as part of the electronic health record. Do you see any problem with that?
I'm not sure I understand whether the Fitbit progress is simply what patients have read off their Fitbit device, or whether it's part of a wellness program the clinic is involved in, in which case the clinic may have access to the actual Fitbit data. So I think it depends on whether they have access to Fitbit's database. If they do, then there are some obvious concerns about privacy: does the patient know that their data are accessible, and have they agreed to that? If they've agreed, that's one thing. If patients are simply reporting on their Fitbit progress, I think that's an interesting metric to keep. These are everyday things we can capture and share with the clinical team that previously were only self-reported when the patient came in for a visit. So the accuracy of these tools can really benefit the clinical relationship. There are obviously good things that can come from this, but we do have to keep data management and privacy in mind.

I'm going to butt in and take a moment for my own question. When you're doing a research project that involves a wearable, my experience has been that the privacy officer comes back and asks a lot of questions about where the data reside. I find it hard to know how to respond to those kinds of questions. Is that the kind of thing you go to the company for? How do you find out how the company uses the data, or who has access to the data, so you can respond responsibly to the privacy officer?

It's great that they're asking that question, because you really do need to know where the data reside. You need to know who has control of the data and whether the company is selling them to others; data equal money for these big companies, and these devices were not designed to be research tools. That is not their business model. Researchers have simply figured out that these tools could be really useful for research. Some companies, like Fitbit, interestingly, did not create research-friendly terms of service until about a year ago, and as a result the National Institutes of Health All of Us Research Program would not partner with them until their terms and conditions of use changed, because the way they managed consumer data put people at a disadvantage. There are some apps and devices that were developed intentionally as research tools. ActiGraph is one of those: they wanted to partner with researchers, and they built their tools so that privacy and data management standards were adhered to. So there are some companies that you will know you can trust and that are the go-tos, because they have in place what we would call research-friendly terms and conditions. That is not a well-organized list at this point in time. Fitbit has been one of the go-tos for researchers because Aaron Coleman, who runs Fitabase, has become the intermediary between Fitbit and researchers. He essentially created the database that stores the research data, so that research data are managed separately and do not become part of Fitbit's data. So there are ways of doing this responsibly, but it does take due diligence on the part of the researcher, and it raises the question of whether the privacy officer at your institution knows what a good answer looks like. When we started doing this research a few years back, the privacy officers and the IRBs had never really known they had to think about these things; they hadn't realized that reviewing a terms of service was an important part of their job.
In some cases, the terms of service shouldn't keep a person from participating in a study, provided the IRB and the researcher have reviewed them. Ecological momentary assessment is a good example. A researcher wants to monitor a person's emotional state in real time over the course of a six-week study, so they want to use an ecological momentary assessment app that can be downloaded onto the person's phone. But the terms and conditions of the company managing the ecological momentary assessment data say the company can access the individual's contacts, photos, phone list, and email, and the app may be costly to the person's data plan. All of that is in the terms and conditions, which is totally unacceptable for a person participating in a research study. If the company won't change its terms and conditions to be research friendly, one option is to give the person who is considering participating their own research phone with the app on it. It's burdensome, because now they have two devices to manage, but that's how you can work around it and still give the person the opportunity to say yes or no. I think we need to be thoughtful about giving people the information they need to make an informed decision, even if it's not information we like ourselves. At this point, though, it's an uphill battle to get these companies to become research friendly, because, again, that's not what they were built for.

And would you say that at times you've seen aspects of the terms of service integrated into the informed consent?

Yes, and not cut and pasted. In this one scenario, we said: if you want to use your own phone, please be aware that this app will have access to this information on your phone, and this is who they may share it with; if you choose to use the research phone, they will have none of your personal information, but you do have the added responsibility of carrying both your personal device and the research device. So you disclose what you think participants need to know in order to make an informed choice. That is something that normally doesn't happen; I mean, if we were to ask the group on this webinar how many of you read the terms and conditions before you download an app, I think for the most part nobody does. So the fact that IRBs and researchers are now increasingly reading these documents means they can make sure they're not asking somebody to do something that could invite potential harm, and they can tell participants the major points they need to be aware of; whether or not they care is up to them. They still have the choice, and we're maintaining our respect for their autonomy to make a decision.

Right. It is a lot more work for researchers, for certain.

Right. But I think as researchers, we have always felt that we need to make sure our participants truly understand. It's our obligation, it's our commitment as researchers. But yes, understanding the terms of agreement of some of these companies is challenging, to put it kindly.
Oh, absolutely. And I think the same goes for patients. If you think there's a tool that maybe you have used yourself for meditation or for fitness management, and you really like it and think it could benefit them, it's up to you to make sure you've thought through the potential downstream consequences of their using it. And maybe they care more than you do about their data and where those data are going.

Wonderful. All right, well, that wraps up our questions for today.
Video Summary
In this video, Dr. Nebeker discusses how a site can tap into a network of individuals working on ethical issues around technology in order to build a library of apps for patients. The Connected and Open Research Ethics (CORE) platform and the ReCODE Health website are suggested as resources for accessing this community. Dr. Nebeker also emphasizes the importance of considering patient preferences and comfort levels when introducing technology in psychiatry, particularly for older adults who may be unfamiliar with apps. Privacy concerns and data management are highlighted, with a recommendation to use a checklist and framework to evaluate apps. The use of wearables like Fitbits in clinics is discussed, with attention given to data privacy and management. The need for researchers and privacy officers to review the terms and conditions of companies offering wearable devices is emphasized. Some companies have research-friendly terms, while others may require additional measures to protect participant privacy. Dr. Nebeker suggests incorporating aspects of terms of service into informed consent documents to ensure participants make informed decisions.
Keywords
ethical issues
technology
library of apps
privacy concerns
data management
Funding for SMI Adviser was made possible by Grant No. SM080818 from SAMHSA of the U.S. Department of Health and Human Services (HHS). The contents are those of the author(s) and do not necessarily represent the official views of, nor an endorsement by, SAMHSA/HHS or the U.S. Government.