Crisis Management Using Asynchronous Telehealth and Apps for People with Serious Mental Illness
Video Transcription
Hello, and welcome. I'm Dr. John Torous, the Director of Digital Psychiatry at Beth Israel Deaconess Medical Center and technology expert for SMI Advisor. I am pleased that you're joining us for today's SMI Advisor webinar, Crisis Management Using Asynchronous Telehealth and Apps for People with Serious Mental Illness. Next slide. SMI Advisor, also known as the Clinical Support System for Serious Mental Illness, is an APA and SAMHSA initiative devoted to helping clinicians implement evidence-based care for those living with serious mental illness. Working with experts from across the SMI clinician community, our interdisciplinary effort has been designed to help you get the answers you need to care for your patients. Next slide, please. Today's webinar has been designated for one AMA PRA Category 1 credit for physicians, one continuing education credit for psychologists, and one continuing education credit for social workers. Credit for participating in today's webinar will be available until November 29th of this year, 2021. Next slide, please. Slides from today's presentation are available in the handout area found in the lower portion of your control panel. You can select the link to download the PDF now or at any time during the presentation. Next slide, please. Feel free to submit your questions at any time throughout the presentation by simply typing them into the question area found in the lower portion of your control panel. We'll reserve about 10 to 15 minutes at the end of the presentation for questions and answers. And next slide. Now, it's my great pleasure to introduce you to today's faculty for the webinar, Emma Parrish. Emma Parrish is a third-year graduate student in the San Diego State University and University of California, San Diego, joint doctoral program in clinical psychology.
Emma's recent research interests lie in real-time assessments and interventions through technology for people with serious mental illness, with a focus on suicide prevention, cognition, and functioning. Currently, Emma is completing a clinical practicum at the San Diego VA Center for Recovery Education, where she works with veterans with serious mental illness. Emma, thank you so much for joining us and leading today's webinar. Of course, thank you so much for having me. I'm really honored to be here, and thank you all for attending the webinar. So I'll go ahead and get started. I have no relationships, conflicts of interest, or disclosures related to the subject matter. The learning objectives for this webinar are to, first, learn about factors that impact suicidal ideation in people with serious mental illness; second, assess the risks of crisis policies in mental health mobile apps and be able to judge whether an app offers adequate protections for use with the patients you work with; and third, formulate an effective safety plan for the use of apps in patients with SMI. So first, I'll get started by talking about suicide among people with SMI, with some pretty sobering statistics. One in 20 people with a psychotic disorder die by suicide, and between 23% and 50% of people with a psychotic disorder have attempted suicide in their lifetime. Despite this, suicidal ideation and behavior among people with psychotic disorders are poorly understood. A number of authors have commented on this gap in the literature, which exists for several reasons: partially because studies of suicide prevention often exclude people who are experiencing psychosis or other people with SMI, and partially because there just isn't a whole lot of research out there. However, we do know some information about factors that impact suicidal ideation in people with SMI. First, hallucinations, particularly command hallucinations, are related to suicidal ideation and behavior.
Additionally, and perhaps a little bit less well-known, is social cognitive deficits. So social cognitive deficits, particularly negative social cognitive bias and threat perception, so meaning perhaps perceiving a neutral face or a neutral situation as either threatening or angry, are related to suicidal ideation in this population. And also, these social cognitive deficits may also be associated with reduced social support, because people may be viewing social situations as more threatening. Additionally, interpersonal constructs related to suicide are linked to suicidal ideation. So by this, I mean the interpersonal constructs that are related to the interpersonal theory of suicide. So for those of you who aren't familiar with this theory, it's a well-known theory of suicide that, among other things, posits that thwarted belongingness, or feeling like you don't belong with other people, or perceived burdensomeness, feeling like you're a burden to those in your life, converge with hopelessness to kind of create suicidal ideation among people. And there's evidence that thwarted belongingness and perceived burdensomeness are also factors that contribute to suicidal ideation in people with psychotic disorders. However, these interpersonal constructs have some unique sort of correlates among this population. So for my master's thesis, I and my team found evidence that these constructs are related to psychotic symptoms. So people who are hearing voices or experiencing mistrust in others feel less belongingness, and they feel like they're more of a burden to those in their life. And also social contexts. So people who are with other people feel like they belong more with others, but also that they may still be a burden to others. So these kind of constructs that exist in the general suicide literature also apply to people with SMI. And finally, among people with SMI, suicidal ideation may be more likely to be associated with an attempt. 
So people who are experiencing suicidal ideation are potentially more likely to attempt suicide or engage in some other type of suicidal behavior as well. I also wanted to touch on diversity factors that impact suicidal ideation, because I think this is really important. Unfortunately, there's not a lot of research at the intersection of diversity considerations, SMI, and suicide. But I wanted to bring up some general risk factors that are really important to consider, and as clinicians, we should always be considering the intersection of different identities. So minority stress theory, the idea that sociocultural prejudice and discrimination produce stressors that can have an impact on mental health for members of minority populations, has mostly been applied to people who are sexual or gender minorities, but it could also be extended more broadly to cover experiences of racism, classism, sexism, homophobia, and transphobia that may be impacting clients. Additionally, it's always important to keep in mind that sexual and gender minority individuals may be at a higher risk of suicidal ideation and behaviors as well. So these are some things to keep in mind when working with people with SMI. Now, I'd like to jump into the bulk of the presentation, which will be about smartphone apps and using mobile phones along with other interventions for people with SMI. First, I'd like to touch on mobile phone ownership and app use in SMI. A recent review, from 2016, found that 66% of people with psychosis owned a mobile phone. This was a review of a lot of the literature, and among the studies published in the two years before the review, from 2014 to 2016, 81% of people with psychosis owned a mobile phone. Now, that review was published about five years ago, so I would imagine that those numbers might be a little bit higher now.
Additionally, one survey study found that 90% of respondents who self-identified as having a schizophrenia spectrum disorder had more than one technology device available. By this, I mean a computer, a landline, a tablet, a smartphone, a mobile phone without apps, pretty much any kind of technology device that they could access. Some people with psychosis may still be digitally excluded by the digital divide, so they may not have the digital literacy to participate in using an app. But this can be overcome with interventions designed to improve digital literacy. There's one that's out there called DOORS, Digital Outreach for Obtaining Resources and Skills, which is cited in the bibliography as well. And with some patience and coaching, people can definitely learn to use mobile phones if that's something that they're interested in. There's also evidence to show that people with SMI are interested in receiving mobile services and that digital health interventions are feasible. So there are definitely still some potential obstacles to overcome in terms of digital literacy, but many people with SMI are motivated and interested in using mental health apps. Now I'd like to talk a little bit more about the mental health apps that are out there, provide some really exciting and encouraging information about them, but also caution against some of what's out there, in order for you to make informed decisions with your clients. First, I'd like to highlight a paper that I worked on with John Torous and co-led with Tess Phillip, who's now a graduate student at the University of Iowa, along with our team at UCSD. We were really interested in answering the question that's actually the title of the paper: are mental health apps adequately equipped to handle users in crisis? What happens with a mental health app when someone discloses suicidal ideation, suicidal intent, or anything that may suggest that they're in danger?
So we approached this paper in the way that a consumer might: we did an internet search for "top mental health apps." We just looked at general mental health apps that were out there, because crises may occur at any time among people who are using mental health apps and who may be experiencing mental health difficulties. We found a bunch of list articles, and we included 38 apps. These apps were not SMI-specific. Again, these were just general mental health apps, also not crisis-specific. We assessed the apps' crisis-specific resources in the app interface, crisis language in the apps, and their privacy policies as well. Importantly, not every app actually had a terms of service, and not every app had a privacy policy. So there were quite a few apps that didn't include one of these documents. And it's pretty well documented in the literature that many mental health apps that are out there actually don't include a privacy policy. So some apps didn't have that information for us. We also only included free apps. What we found is that in the app interface, when you actually open the app and you're there using it, 35% of apps provided any sort of crisis-specific resources. Three of these apps provided suicide-specific safety planning resources, and 10 had a suicide hotline number. However, among those that directed users to crisis-specific resources, there were some that were not necessarily appropriate for an audience in the United States. For example, there was one app developed in Australia that had the Australian crisis number, which is great if you're in Australia, but if you're a clinician working in the United States, not exactly ideal. Additionally, the placement of this information in the apps was variable.
For some apps, these resources were readily available on the front page, really easy to find, at the top right-hand corner or something like that. But for some, you had to click through a lot of different menus in order to actually find this information. Additionally, only 10% of apps contained any sort of crisis language in their policies. By policies, we mean both the terms of service and the privacy policy. The reason why we were interested in this is because we wanted to understand what would actually happen, based on their policies, if a user were to disclose, either in a chat box or through a voice feature or in whatever manner, that they were experiencing suicidal ideation. And also, if they were in need of help, what would that do? Some terms of service did address this. For example, one terms of service said, if you have suicidal thoughts or need medical help for any reason, consult a local doctor or therapist or call emergency services. But most of the terms of service only said to contact emergency services if it were a medical emergency. And only four encouraged users to contact the national emergency number for suicide specifically. Additionally, among the privacy policies, only one actually contained crisis language. This was for 7 Cups, which contains both a therapist function, where a user can chat with a therapist through the app, and a function where they can chat with a trained listener. It said that a user's information may be revealed through the therapist function if they or someone else is in imminent danger. However, it doesn't really tell you what will happen if a user were to reveal this in another part of the app. So overall, this really just highlights the inconsistent crisis language in policies and the inconsistent crisis resources in mental health apps. The takeaway: most mental health apps don't direct users to crisis resources, and those that do are inconsistent.
Now, you might be asking, what about crisis-specific apps that are out there? That was just a review of general mental health apps. There is a review, not done by our team, that evaluated many depression and suicide prevention apps based on the evidence-based framework for suicide prevention. The criteria that they looked for were tracking of mood and suicidal thoughts, development of a safety plan, recommendation of activities, any sort of information and education, access to support networks, and access to emergency counseling through either healthcare or a crisis hotline. So basically, they were trying to see whether the apps that are out there are actually following evidence-based practices. They found 69 apps: 20 for depression management, 46 for suicide prevention, and three combined. Of these 69 apps, they actually found that six provided an incorrect crisis hotline number. So it's really important when evaluating apps to check the crisis hotline number and make sure it's the correct one, both for your country and also the most up-to-date one in general. Most apps at least included emergency contact info, as well as direct access to the crisis hotline. However, only five apps offered all six evidence-based strategies. And even when an app used an evidence-based strategy, for example a safety plan, many of those apps, 11 out of 26, did not actually include all aspects of the safety plan. Regarding the five apps offering all six evidence-based strategies, I do find it important to point out that when you actually look at these apps, many of them are designed for different purposes. Some of the apps may be, for example, just a button where you contact people for support. So you input different contacts you would like to be supported by if you're in a crisis, and then the app is just a button that texts all of them to say, I need some help.
For that one, it wouldn't necessarily make sense to have a safety plan, since that's outside its purpose. However, the important takeaway is that a lot of the crisis apps that are out there are maybe not doing all of the evidence-based things that we would like them to do. So with all that in mind, all this information about policies and what's in the apps, I don't want to leave you with a grim picture of what's out there, because there are lots of tools to help you evaluate which apps are good, in order to use them in the most useful way. There are some frameworks out there for evaluating apps. This pyramid here is the APA framework that was developed for app evaluation. As you'll see, level one down there is accessibility. So is it accessible? Is it free? Does it cost money? Does it make medical claims? Is it credible? What's the business model? Is it stable? This can also include: is it available for both Android and iPhone? Can my client actually access it? So that's level one. Level two is privacy and security, also very important. How is the data collected? Where is the data stored? Is it stored on the device or in the cloud? Can you delete it if you want? Do they have a privacy policy? Do they collect personal health information? And if so, where is the health information going? The next level is clinical foundation. So as a clinician looking at it, what are your first impressions? Would this be helpful? Is it clinically valid? Are there studies that support it? What are your impressions after using it? And also, how are users feeling about it? Do they find it to be helpful? Next is engagement style, which is really grounded in autonomy and having a person take an active role in their care. In general with mental health apps, there is low adherence and high attrition.
So in this, it's about thinking through what can keep people engaged. The engagement style can include lots of different things; it really depends on the individual person and what they like. Some examples could be gamification, chatbots, or discussion posts where they can communicate with other people. And will this be usable in the short term or the long term? Then finally, therapeutic goal. This is all about: who owns the data? Can clinicians and clients access the data and export the data? Can it be used in a clinical manner? Even things like, can it be integrated with electronic medical records, and can data be shared with providers? So these are all really important things to go through when you're evaluating apps, starting at level one and going up to level five. There are also great resources out there to help you make these decisions and evaluate these apps without starting at square one and reading through the terms of service and looking at all these things yourself. This is MindApps.org. I would really highly recommend it. It's a tool developed by Dr. John Torous and his colleagues for app evaluation, and it can help you and your clients make informed decisions surrounding app use. In this area here, I'm not sure if you can see my cursor, but you can scroll over to the right, and it has lots of different dimensions that you can consider when making an informed decision about app use. For example, is it available on Android or iOS? What's the price? Are there hidden costs inside of the app, where it's free to download and then there are costs once you get in there? Who developed it? That's a really important thing to keep in mind. Was it developed by a government agency? Was it developed by a private company? Do they have a privacy policy? Do they share their data? What's the content? And is there any sort of evidence base?
So again, it's a really useful tool for evaluating apps before using them or recommending them to other people. Now what I'd like to do is highlight three crisis-related mental health apps: one that was included in the review of crisis mental health apps, and two that were not. I picked these apps to show three different crisis apps that are out there with varying degrees of evidence base, and to really illustrate the importance of evaluating these apps. So I'll go through each of them. This is by no means exhaustive, just three apps that could potentially be helpful and that show how you might go about evaluating these apps as well. The first one that I'd like to highlight is the Stanley Brown Safety Plan app. Many of you may be familiar with the in-person safety planning intervention. It's a frontline treatment for suicide prevention, and it's a brief approach where a provider and a client collaboratively create a plan for the client's safety: looking at warning signs that someone may be in a crisis, coping strategies, contacts for both distraction, if someone just needs someone to talk to, and distress, if someone needs to talk about what's going on, professionals to contact, and restriction of means. The safety plan intervention was actually developed by Stanley and Brown, and this app was created by the same developers. You'll see here on the left, there's the safety plan, and the app has all the different components of the safety plan. Through the Contacts tab, you can contact the National Suicide Prevention Lifeline and 911, you can look for treatment through the SAMHSA Treatment Locator, and there's also the Suicide Prevention Lifeline in Spanish, which is great. And then there's also some psychoeducation about what a safety plan is and how to fill it out, for clinicians and for individuals as well.
For all of these apps, I actually downloaded them on my phone, played around with them a bit, and entered some information to show you what they look like. So for example, in the Social tab, the first question is, who can you talk with who takes your mind off problems? Here I put two of my best friends, Juhi and Andrea, and you can tap on a contact and call, and the app will directly connect you to those people if you are experiencing a crisis. What healthy social settings can you go to that can take your mind off your problems? I'm a climber, I go climbing with friends a lot, so I put the rock climbing gym. I also put some internal coping strategies here too, such as going for a walk in my neighborhood or watching my favorite TV show, Parks and Recreation. So you can really personalize the app, which can help with engagement as well. In terms of empirical evidence, there is strong empirical support for safety planning in general. However, there's no empirical support for this particular app providing benefits over standard safety planning. There are some feasibility studies, not effectiveness studies, coming from Australia that use other safety planning apps, but nothing that uses this app in particular. Still, you can imagine that it would be useful for people to have their safety plan on their phone to access while they're on the go. Now let's get to the evaluation. I've put up the five tiers, the five levels, and I'd love to talk through them with you for this app in particular. Accessibility: it's credible, and it's totally free, which is great. However, it's only available on iOS. So if you have a patient who is an Android user, they wouldn't be able to use it. It also has offline access, which is great because people may not have internet all the time, and it's pretty easy to use.
I didn't give it a full check mark; I gave it a little question mark, since it's not available on Android. Privacy and security: this is where it gets a little tricky. The app claims on the App Store that it doesn't collect data, but it's actually really hard to find a privacy policy. If you click the link for the privacy policy, the link brings you to a general New York State mental health department page, and it's really hard to actually look into the privacy policies of the app. I gave that a question mark; maybe I should have given it an X, because I really couldn't find a privacy policy, but they claim that they don't collect data. The clinical foundation is pretty sound. There are no effectiveness studies of this app in particular, but it includes all six steps of the safety plan: warning signs, internal coping strategies, social contacts, et cetera, and making the environment safe. Next, engagement style. This is really specific to the individual, so it's hard to evaluate without working with a specific person. It's easy to use; however, it's a little wordy, so it may be tricky for some people who are experiencing cognitive deficits or don't really like to read. It's not gamified, but that's not really the purpose of the app. There are no studies on engagement, but it seems easy to use. In terms of therapeutic goal, we want to think about shared decision-making and data sharing with providers. The app doesn't directly interface with the healthcare system, but it could be filled out collaboratively with a provider. So another question mark there. The next app that I'd like to describe is called Suicide Safety Plan. It's very similar; I included it here to really contrast it with the Stanley Brown Safety Plan. Importantly, you may notice that it doesn't match the original safety plan exactly.
It adds reasons to live, which could be useful, but it doesn't differentiate between internal and external coping strategies. It also doesn't include the distinction between distraction contacts and distress contacts. It does include a really useful guide with some psychoeducation and a lot of resources, like what a warning sign is and what a coping strategy is, which could be really useful. And it does also connect to crisis resources. Here again is a slide where I downloaded the app and personalized it a bit. You can see how it fills out the list on the very first page with your safety plan, so all you have to do is open the app to see it as you go. It seems helpful and pretty user-friendly, but again, it's not the full safety plan. As with the other safety plan app, there's strong empirical support for safety planning, but no evidence for this app in particular, and this app is not exactly the same as the original safety plan. In terms of evaluation: accessibility, it's totally free to download, available for Android and iOS, has offline access, and you own the data. That's great. Privacy and security: it has a privacy policy, the data is stored on the device, you can delete the data, and they don't share it with anyone. Also a green check mark; that seems pretty sound to me. The clinical foundation: it does as it claims, it's patient-facing, it has a safety plan, and it has some really good psychoeducation. However, it doesn't actually include all the steps of the safety plan. So it's kind of so-so on the clinical foundation, since it's not exactly the safety plan. But again, the modifications that they make seem like they would be clinically useful. Engagement style: it seems easy to use, and it might be a little simpler than other approaches to safety planning. There's no data on engagement, but it allows for personalization, which is good. And again, engagement is very individual to each person. Therapeutic goal.
It could be used in conjunction with a provider; however, it doesn't allow for export of data and doesn't allow you to call a provider, but it could be filled out collaboratively with a provider. Part of the reason why I included this one is to really highlight the trade-offs that you as a clinician can think about. This one may have a little bit less of a clinical foundation, in terms of not being exactly the safety plan, but it's really more solid on accessibility and privacy and security. So I'm really just providing this information to arm you with all the different information for these apps, so you can see how you might go about thinking about them. Now, finally, I have another app that's not a safety plan that I wanted to talk about: Virtual Hope Box. A hope box in physical form is a physical representation of reasons for living. It's similar to a self-soothe box, if you're familiar with that from DBT. People can put whatever they like in their box that really reminds them of their reasons for living. It could be good memories, things that can distract them, or things that can help them get through a crisis. This box is then kept in someone's room or in their house, and they can use it when they're experiencing a crisis. However, one problem with the physical hope box is that you don't necessarily have it with you everywhere you go. So if you're out in public and you're really starting to experience crisis symptoms, that could be really difficult, and maybe you don't have your hope box with you. So the VA, the Department of Veterans Affairs, developed this Virtual Hope Box app so that people can take their hope box with them. I think it's a really neat, well-designed app. It's developed by the VA, but I wanted to point out that anyone can use it. You don't have to be a veteran to use it.
The virtual version of the hope box is more accessible on a daily basis, and it includes a lot of different components. For example, Distract Me includes puzzles like a word search, solitaire, things like that that people can play in order to distract themselves. There are also different relaxation exercises. This is an example of the controlled breathing exercise; as you go through it, the bar goes up and down to provide a visual representation of breathing. Something else that I really like about this app is that crisis supports and contacts are available on every single page. See this little phone? From any part of the app, users can touch that, and it will connect them to 911, the Veterans Crisis Line in Spanish or English, and also an outreach center. Something else that's really cool is that the app is password protected. Even when you are logged into your phone, when you open the app you have to put in your phone password. So if someone for some reason has your unlocked phone, they can't view the app without getting into it. You can also add additional support contacts; here I added my mom. Another feature of the app is the Remind Me feature. This allows you to import personal pictures of good memories that can remind you of your reasons to live and moments when you were happy in the past. Here I put a picture of me and my grandma; I had to give her a little bit of a shout-out. And then with Inspire Me, there are different inspiring quotes that you can scroll through, for example, no feeling is final. In terms of the evidence, there's support for use of the physical hope box in suicide prevention. And this specific app is actually supported by both a proof-of-concept study and a small randomized controlled trial. The proof-of-concept study included 18 veterans receiving DBT.
They found that the virtual hope box was used more than a physical hope box, and the veterans perceived it to be beneficial and said that they would recommend it to others. The randomized controlled trial compared the Virtual Hope Box with treatment as usual, with about 60 participants in each cell, and found that the Virtual Hope Box improved coping self-efficacy, but not actual clinical outcomes. In terms of accessibility, it's available on Android and iOS, it's completely free, it's also available in Spanish, and it's available offline. I gave that one an extra check mark because it's available in Spanish as well. Privacy and security: data is stored on a server, and it has password protection. However, you can't delete the data, and de-identified, anonymized data are shared. So that one's kind of so-so. The clinical foundation is sound; it has research supporting it, and it does what it claims. That's pretty good. Engagement is very specific to the individual. Again, this one is really easy to use. There are lots of colors, there are games. It has user-generated data, audio, music, contact lists, and personal photos. I think it's very solid in that respect. For the therapeutic goal, again, it doesn't allow for exporting data, but you could input your provider's number, and it could be used in conjunction with a provider in a very collaborative manner. So a little question mark there. Now I want to move on from talking about specific apps. Again, this was not an exhaustive list; there's a lot out there, but I wanted to go through some examples. I'll now talk about some general considerations when using these apps, and then move into applying all of this information to people with SMI. Importantly, while the current research is exciting, it doesn't focus on diverse populations, and so more work is definitely needed in this area.
So for example, more apps developed in Spanish and other languages would be really important, and also culturally sensitive adaptations as well. I think that a really good question about app engagement is: what about people who are older, maybe 60 plus? Would they be engaged with apps? There are no studies on age engagement with these particular apps or in people with SMI. And there's really no intended age bracket for the apps that I mentioned. However, in general, older adults have been engaged in studies of mobile health apps. So this is a review that looked at eight mobile health app studies and showed that older adults actually had the best engagement. You'll see older adults are this orange line here. The age group 60 years and older used the apps for the longest duration, and they were directly followed by people ages 50 to 59 years old. In our studies of mobile cognitive testing in my lab at UCSD, we've also found that generally older adults with SMI had better adherence to the surveys. So that's not necessarily an app in treatment, but older adults answered more of the surveys, answered more of the mobile cognitive tests. Of course, some older adults may require some assistance with technology, but it appears that older adults are pretty motivated to engage with these apps, which is great. Another consideration would be symptoms. So I think it's a very valid question: would people with greater levels of symptoms like depression engage with apps less? Potentially. We definitely need more data on this. However, having a mobile-based version of an intervention may be helpful for engagement with interventions like a safety plan or a hope box, where someone may want to access them in a crisis, on the go, wherever they may be, so having it more accessible and with them matters.
In our studies, we found that with ecological momentary cognitive tests, so again, not an intervention, but sending people cognitive tests on their smartphone throughout the day, adherence to these kinds of surveys was not related to positive symptoms, negative symptoms, depression, mania, or cognitive ability. And again, this is in a population of people with SMI. However, we did find that greater functional capacity was associated with greater adherence, which I think would make sense: if people are doing a little bit better, they may be more able to answer the surveys. Disorganization may be another consideration, but as a clinician, if someone's motivated and interested in using the app, you can definitely work to problem solve using compensatory strategies to help people keep organized and keep their phone charged. And I think that you can treat app engagement kind of like homework completion in CBT, or any sort of other practice that you're doing. So having conversations about motivation and barriers to engagement, while being understanding and keeping the client at the center of that conversation, to help them engage if they're interested. Additionally, here's a slide from the BIDMC psychiatry group showing that smartphones can capture daily metrics around steps, sleep, and self-reported symptoms, and that the data over time, once you understand people's patterns, may highlight elevations and other signals that could suggest risks that you as a clinician can then go and address. So it can help to address symptoms as well, similar to a paper and pencil diary. So now, for the rest of the presentation: we've talked a lot about general mental health apps for crisis management and what's out there, and I would love to bring it all back to working with people with SMI.
So there actually are not a lot of apps and studies out there, but there is one that I would like to highlight that was led by my mentor, Colin Depp, and colleagues at UCSD. It's called Safety and Recovery Therapy, or START, and the protocol paper was recently published, which is in the bibliography. So basically the idea behind this trial is that many people who are experiencing suicidal ideation, a suicidal crisis, or intent to potentially attempt suicide, after they present at urgent care, actually don't go on to engage in follow-up care, and this is especially true for people with SMI. So START is a brief suicide-specific CBT intervention for people with SMI, and one of the conditions is augmented with a mobile phone. So there are two arms to this trial, and both arms receive START, which is four weekly sessions covering early warning signs and triggers, symptoms influencing suicidal thinking, and social relationships. The sessions all start with a coping skill, focus on a different topic, and then personalize and help to implement the topic. Clients are getting homework with this as well. And the four sessions are basically designed as a bridge between presenting at urgent care and then getting connected with outpatient treatment. There's also a condition that includes a mobile augmentation. So you'll see this is just START versus START plus mobile augmentation. The mobile augmentation includes automated CBT scripts that build on in-person content. I'll show you a figure from the paper that highlights this on the next slides. To really extend the content of in-person CBT to everyday life, these mobile scripts are delivered through an ecological momentary intervention app called Ilumivu. And participants also receive training on the phones in session one. So it was a six-month pilot trial. We're still analyzing the data, but stay tuned, it should be coming out soon. And again, this is in a population of people with SMI.
So here's a protocol for the mobile augmentation to show you what it could look like. This is a figure from the paper. So for example, it starts off by asking, did you get a chance to try Look, Point, and Name today? Which is one of the techniques that's taught in START. They can say yes, and it was helpful. Yes, but it didn't help. No, it didn't. Depending on what people say, it'll then say, good to hear. Or maybe, oh, you said it didn't work. Have you tried this other thing? It can ask about early warning signs, suggest different strategies, and ask if a strategy was helpful or not. It asks if people have experienced any triggers, which is personalized to each participant. And then it says, you said in session that rewarding yourself with a cup of coffee can help. Did you try the strategy? Yes, and it was helpful. Great job in trying to cope ahead. And then it ends with an inspirational quote. So, given enough time and distance, the heart will always heal. For the outcomes of this study, we're looking to see if the mobile augmentation would impact the scale of suicidal ideation, and then also engagement in services following this intervention. So again, stay tuned for the results, but this is an example of a trial that is just wrapping up for people with SMI using a mobile-augmented CBT. However, unfortunately, there are no current publicly available crisis-related apps specifically designed for people with psychotic spectrum disorders. This isn't really terribly surprising considering that suicide is so understudied in this population, but already developed apps could potentially be used with this population. So I'll go over the different apps that I mentioned and how they may be applied to someone with SMI.
So for both of the safety planning apps, before recommending an app, working with an app, or incorporating it into treatment, I think it's really important to discuss the privacy of the app, make sure that the client is aware of the privacy and where their data will be going, make sure that they consent to it, get some informed consent as well, and really translate it into a way that clients can understand. When working with the client, help the client download the app and make sure that they're aware of the parameters of the app. So for example, this will or will not help you call 911 or the Suicide Prevention Lifeline. If I were working with a client with SMI and downloading this app, I might also help them program my number into the contact list as well. I think it would be really important to consider mobile phone literacy at this stage, to see where people are at, how they're doing in terms of utilizing a mobile phone, and any questions that they have, as well as any cognitive deficits, and working around and compensating for those as well. And also, like we talked about, considering symptoms. So problem solving if someone is experiencing a little bit of disorganization, maybe making a plan for how they'll keep their phone charged or how they'll remember to take their phone with them, so that they do have this app available when they need it. I would also collaboratively fill out all steps of the safety plan with the client, really going through it just like you would in a normal safety planning process, paying special attention to warning signs. So for example, is the client experiencing hallucinations that make them feel hopeless, or hallucinations or paranoia that cause them to isolate themselves?
Linking back to the interpersonal theory of suicide, really stressing the importance of social connections and understanding how clients may or may not be experiencing things like thwarted belongingness and perceived burdensomeness, really having an eye to the factors that impact suicidal ideation when filling out the safety plan. So again, it's very similar to filling out a safety plan as you would normally in your clinical practice, but adding on the app element, adding on potentially some compensatory strategies, and making sure that the client is informed of all of the different privacy policies and where their data will go, so that they can make an informed decision about what to do about the app. For the other safety planning app, I would approach it very similarly, so I'll skip to the Virtual Hope Box. First, again, I would discuss the privacy of the app and the parameters of the app: okay, this will help you call 911 or the crisis hotline, it will help you to call me, and so on, considering mobile phone literacy. This one I think would be a cool one to assign as homework or at-home practice if the client is motivated and able to use a phone. But again, I would really consider their level of mobile phone literacy and how they feel about doing this on their own. And you can also collaboratively go through this with a client, show them all of the different distraction and relaxation tools, and help them import photos into Remind Me. So the homework could be creating a virtual hope box, adding in photos, or that could be done in session. And then also discussing relaxation strategies that may be helpful. I know that even without these apps, I go through many of these relaxation strategies with clients, like controlled breathing, muscle relaxation, and different mindfulness exercises.
So it could be helpful to really link this back to things that you may have already talked to clients about when you're going through the virtual hope box. I'm calling this section safety planning for apps, because there are a number of risks involved with apps, a lot of which are linked to data and privacy. So first off, be skeptical and discerning when choosing mental health apps, both in terms of the privacy policy and also the clinical utility. There are a lot of apps out there that really don't have great interventions. They may have the incorrect crisis hotline. It's important to really evaluate that as well. Using readily available app evaluation tools, such as the APA framework, before recommending an app to a client or using it in treatment would be really important. And again, I would really recommend using mindapps.org in order to go through this. Be especially mindful of privacy policies and terms and conditions, especially if an app does not have a privacy policy. I think that may be a little bit of a red flag there. I also think it's really important to educate clients about apps before using them with a client. So shared decision-making is absolutely paramount. Making the decision together, not forcing apps on anyone, and really just having a conversation about the apps and whether or not the client thinks it would be helpful, if they're motivated to do it, if they think it would be something that they would use. Evaluating the level of phone literacy is really important, and taking potential cognitive impairment into account, so that you can adequately support someone in using a phone or connect them to outside resources that may help them use a phone. I know you're all very busy clinicians and may not have time to do that yourselves, so maybe finding a resource that could help them figure out how to use an iPhone or an Android phone.
Putting supports in place surrounding cognitive impairment. So for example, written instructions and reminders may be helpful. And really doing some problem solving and potentially implementing some compensatory strategies to help clients use the app. Also, ask clients what mental health apps they're already using. They may already be using a mental health app or a crisis prevention app, especially if they're pretty savvy. It may be helpful to ask the client what they're doing already. And if you haven't heard of the app before, do a little bit of research into its capabilities and the policies of the app. Again, when doing this research, you don't have to do it all by yourself; using readily available online resources can be really helpful. Make sure that clients understand what apps can and cannot do during a crisis. By this, I really mean, if there's an app that a client is using where there's a chatbot, knowing what will and will not happen if they disclose suicidal ideation. Can they call 911 from the app? Can they call the crisis hotline from the app? Do they need a separate card or separate information that has it on it? Explaining the crisis policy of the app as well as the crisis resources available is really important. And also making sure that clients know that not all apps are appropriate to use during a crisis, given the really broad variability of the evidence base and also of the crisis policies that are in there. So I have a little bit of a checklist. This is just something that I made and consolidated, but really, when using apps with clients: reviewing the privacy policy, reviewing technical aspects of the apps, looking at the functionality of the apps, understanding what contacts are available in the apps, whether you can input provider information, and also what to do in the case of acute crisis. So planning with numbers to call in order.
Something else to consider that I didn't include in the slides is that these apps can change. Since they're on the app store, the developers can add things and take things away. So even if you've used an app before with a client, check when it was last updated. Did they add anything? Is it still up to date? Is it still evidence-based? Stuff like that can also be really important to consider. And also, with what to do in the case of an acute crisis, making sure that clients have all of the tools that they need in order to get the help that they need. So in conclusion, people with SMI are at an increased risk of suicide. Crisis policies and language are inconsistent throughout mental health apps, as is the evidence base for crisis management apps. Mobile interventions can be really helpful. I think especially in this day and age, when you may be seeing people over telehealth, it can be helpful to have more technology that brings the different interventions you're doing in session out into the world and makes them more accessible for clients to use. However, definitely proceed with caution, and keep shared decision-making and the different principles of app evaluation in mind when implementing any of these interventions in your practice, and also double-check that the phone numbers are correct as well. So here is my contact information. I think this is also in the handout with the slides. My email, I have a Twitter, and also my lab has a Twitter, if you'd like to ask me any questions that you don't get a chance to during the Q&A, or if you'd just like to be in touch. And I also wanted to give a big thank you to Dr. John Torous for asking me to give this webinar. It's a really great honor. And for mentoring me throughout the process, as well as all the staff at APA and all the experts who helped to review the slides. Really appreciate all your valuable input. And then my wonderful mentors at UCSD, Drs.
Colin Depp and Raeanne Moore. I'm gonna speed through this bibliography, but it's here if you would like it. And I think now I will give the floor to John. Thank you so much. Oh, and thank you so much, Emma, for such a wonderful presentation. Before we switch to the question and answer, and again, if you have any, you can submit them now, I wanna take a moment to let everyone know about the SMI Advisor app, which is actually accessible on a mobile device, of course. You can use the SMI Advisor app to access resources, education, and upcoming events, complete mental health rating scales, and even submit questions directly to our team of SMI experts. You can download the app now at smiadviser.org/app, and I can assure you it has followed all of the principles that Emma was talking to us about. As I said, one of the questions that we have was also a compliment to Emma, many nice things coming in, saying that was very practical and hands-on, but also helping understand the bigger picture. In a broad sense, do you think that there need to be specific things, this question says, in safety plans for SMI apps, or would an app that's designed to help someone with a different condition be relevant around safety planning? That's a really great question, and also thank you for the compliments. I think that an app and a safety plan intervention that is relevant for, and developed for, people with other conditions would be relevant for people with SMI. Thinking in terms of my clinical training at the VA, we used the safety plan that was developed for veterans in general with people with SMI. So I think the same thing would be true for people using the app. And I'm not aware of any particular evidence looking at safety planning specifically in SMI and modifications. However, I do think that there are a lot of unique things to really keep in mind.
So with the warning signs in particular, psychotic symptoms are a big one, and how those may impact crises. I also think, on a paper that I'm working on with a postdoc in our lab, Dr. Samantha Chalker, we're finding that sometimes people may not have social contacts to list on the safety plan, just in general, not in the app, but in the VA. There actually wasn't any difference between people with SMI and without SMI in terms of the content of their safety plans, but keep in mind that people may be socially isolated or experiencing symptoms of paranoia that may make it difficult for them to reach out to people. So I guess that was kind of a long answer, but I think that apps developed for general populations can be used with people with SMI, while keeping in mind psychotic symptoms and social isolation as well when filling out the safety plan. That makes sense. And this is an answer that may be more straightforward: basically, how can people stay updated on the progress of your START trial? Oh, that's a good one. I think if you follow our lab on Twitter and me on Twitter, I can tweet about it when we put out information. We're currently working on the results of the main trial, and I'm also leading a paper looking specifically at safety plan recall and whether people actually remember their safety plans. So hopefully that should be coming out in the next year or so. So if you follow our lab on Twitter, we'll definitely tweet about it. And also, if you follow Colin Depp on Google Scholar, it'll be on Google Scholar and all of that as well. That's Depp with two Ps. Could you say the lab Twitter again in case people didn't catch it and wanna follow it? Yes, absolutely. The lab Twitter is at UCSD Cog Dynamics, and my Twitter is at Emma M Parrish, with no spaces, and it's Parrish with two Rs. Got it. And then a different question. Are you guys planning on making a safety planning app?
Oh, we are not, but I think that would be a really amazing future project and future endeavor. Okay. And then one last question that's actually a compliment again. It says, thank you for the webinar and your useful considerations for the population we work with. So lots of compliments coming in. And with that, I'm actually gonna begin to wrap us up. If you have any follow-up questions about this topic or anything related to evidence-based care for SMI, our clinical experts are now available for online consultations. Any mental health clinician can submit a question and receive a response from one of our SMI experts. Consults are always free and confidential, and any about technology and apps are of course welcome. SMI Advisor is just one of the many SAMHSA initiatives designed to help clinicians implement evidence-based care. We'd encourage you to explore resources available on the mental health, addiction, and prevention TTCs, as well as the National Center of Excellence for Eating Disorders and the Suicide Prevention Resource Center. These initiatives cover a broad range of topics, from school-based mental health through the opioid epidemic. To claim credit for participating in today's webinar, you'll need to have met the requisite attendance threshold for your profession. Verification of attendance can take up to five minutes. You'll then be able to select next to advance to complete the program evaluation and claim your credit. And last but not least, please join us next week on Thursday, November 4th, as Robert Cotes, Donna Rolin, and Megan Ehret present New Clozapine REMS: Staying Informed for the November 15th Changes, which is certainly coming up on November 15th. Again, this free webinar will be held on November 4th from 3 to 4 p.m. Eastern time, so at noon for those of you in Pacific time. Thank you for joining us. Thank you, Emma, for presenting. Until next time, take care. Thank you so much.
Video Summary
Dr. John Torous, Director of Digital Psychiatry at Beth Israel Deaconess Medical Center, welcomes viewers to an SMI Advisor webinar on crisis management using asynchronous telehealth and apps for people with serious mental illness (SMI). The webinar is part of the Clinical Support System for Serious Mental Illness, an initiative aimed at helping clinicians implement evidence-based care for those with SMI. The webinar offers one continuing education credit each for physicians, psychologists, and social workers.

The webinar focuses on mobile mental health apps and their potential use in crisis management for people with SMI. The speaker, Emma Parrish, emphasizes the importance of evaluating the privacy policies and clinical validity of these apps before recommending them to clients. She recommends using app evaluation frameworks, such as the APA framework, to assess factors like accessibility, privacy and security, clinical foundation, engagement style, and therapeutic goal.

Parrish highlights three crisis-related mental health apps: the Stanley-Brown Safety Plan app, Suicide Safety Plan, and Virtual Hope Box. She provides an overview of each app and discusses their potential applications for people with SMI. She also addresses considerations like mobile phone literacy, cognitive impairment, symptoms, and privacy concerns. Parrish advises clinicians to have open discussions with clients about app use, involve them in the decision-making process, and provide support and guidance when necessary.

In conclusion, Parrish emphasizes the need for cautious and informed use of mental health apps in the SMI population. She recommends staying updated on the progress of studies like the Safety and Recovery Therapy (START) trial, which explores the efficacy of a mobile-augmented cognitive behavioral therapy (CBT) intervention for people with SMI. She also encourages clinicians to follow her on Twitter for updates and further resources.
The webinar ends with a reminder of the SMI Advisor app, which provides resources, education, rating scales, and direct access to SMI experts.
Keywords
SMI Advisor webinar
asynchronous telehealth
serious mental illness
mobile mental health apps
privacy policies
clinical validity
app evaluation frameworks
cognitive impairment
open discussions
SMI population
Funding for SMI Adviser was made possible by Grant No. SM080818 from SAMHSA of the U.S. Department of Health and Human Services (HHS). The contents are those of the author(s) and do not necessarily represent the official views of, nor an endorsement by, SAMHSA/HHS or the U.S. Government.