Informed Decision Making for Picking SMI Smartphone Apps
Lecture Presentation
Video Transcription
Thank you, everyone, for joining us today. Today's talk on SMI Adviser is called Informed Decision Making for Picking SMI Smartphone Apps. For those of you who may not be familiar with the initiative, the Clinical Support System for Serious Mental Illness (CSSMI) is a Substance Abuse and Mental Health Services Administration (SAMHSA) funded initiative implemented by the American Psychiatric Association (APA). My name is Dr. John Torous. I'm a psychiatrist at Beth Israel Deaconess Medical Center and Harvard Medical School, and the focus of my research and clinical work is digital psychiatry: using technology and innovative mobile tools to improve recovery and care in serious mental illness. I have no conflicts of interest related to the subject matter of this presentation. So what are we going to talk about today in this world of technology and smartphone apps in serious mental illness? We're going to cover three broad buckets. The first is the potential of apps in serious mental illness: why are we talking about smartphones in this context? Second, we'll talk about the challenges in picking a good app. Imagine a patient comes to you and says, what do you think of this smartphone app? Or they tell you, I've been using this app for extra therapy, or I've been using this app to track my medications. How do you make sense of it? How do you help the patient, and yourself, reach an informed decision about which app to pick? We've all had training in how to recommend therapies, medications, and lifestyle interventions, but what about digital interventions? Third, we'll talk about the American Psychiatric Association app evaluation model, a free resource and tool you can use to structure a conversation about apps, and we'll go through in detail some of the good and bad things we see when we evaluate apps.
So while we can't recommend any one app for you and your SMI patients, we can certainly help you make better and more informed decisions. With that, we'll get started with the potential of technology in mental health. This is a picture from the Victorian era, imagining what the technology of the future would look like. You can see they got close with smartphones: that's certainly a phone with video. In the background of this picture, which you can't see here, there's also a flying car. We don't have flying cars yet, so technology hasn't advanced quite as far as the Victorians thought, but smartphones have become far smaller and more portable than they imagined in this picture. But as I said, we don't want to talk about fantasy; we want to talk about what is real, what patients are using today, and what we can use in clinical care today. That brings us to the question of access. The reason we're talking about technology at all is that we know it can be very hard for patients to get access to mental health care. On the left, the red and white map shows where there are psychologists in rural US counties, per 10,000 population by the census. That's from a paper in June 2018, and you can see there are many regions of the country with no psychologists. A similar map has been made for psychiatrists; it can be hard to find a psychiatrist as well. The map on the right shows where there is an internet connection, and you can see there is very little white on it. There are a few places, in Nevada, New Mexico, and Arizona, that may have no internet, but generally almost everywhere in the contiguous US and Puerto Rico has some form of internet connection, whereas many regions of the US still don't have a clinician in person.
So the potential of technology to bridge and increase access really is one of the main reasons we're talking about this today. It's also one of the main reasons technology is not going away as a driving force in mental health. This is not a passing fad, and again, you can just look at these two maps and see how different the access is. But access alone does not guarantee a quality service, and it also doesn't mean everyone can connect to the internet. That said, I would argue, and increasing evidence says, that more and more people have smartphone devices and do have ways to connect. We did a study in 2016 looking at technology access among people with psychosis in different research studies compared to the general population. On the dark blue line, you can see that among people with psychosis, ownership of phones or smartphones really went up, from about 25% in 2008 to over 75% by 2016. The thin line up top is the general population, and you can see that the digital divide is beginning to close: people with SMI, people with psychosis, are gaining access to technology. More than half of them, by 2016 when we did this research, had access to some form of a phone. It may not have been a smartphone, but increasingly it will be, as we'll talk about. On the other side is a case report that was the cover story of the American Journal of Psychiatry in 2017, about a patient we were seeing at the Massachusetts Mental Health Center, a community mental health center where we see many patients with psychosis. The patient volunteered to us that he had been using his smartphone to access a lot of therapy services and different CBT apps, because he wanted more appointments, more access, and more visits.
So it's a case where sometimes, if you ask patients, you may be surprised that they're already using this technology today. To give you more information on what patients with serious mental illness, in this case schizophrenia, are using, we did a research study with NAMI, the National Alliance on Mental Illness, in 2016. We found that 89% had access to a personal computer (they may not own it; they may access it through the library), 54% had a smartphone, and 52% had access to a landline. I want you to focus on that smartphone number, because it's not insignificant. Certainly some patients may not have a phone today, but increasingly they do. And what's very interesting is that patients eligible through certain Medicare and Medicaid programs can now qualify for a free smartphone. In the screenshot below that says "free smartphone," I don't endorse or even know the service; it's just a screenshot I took. But if you search for free phones under this kind of medical eligibility, a lot of patients with SMI will likely qualify for a service that gives them a free phone, and these days you often can't even get a flip phone: the phone someone receives today is a smartphone that will let them access and use a lot of different mental health apps. So even if your patients don't have smartphones today, it's likely that the next phone they get or qualify for will be a smartphone, possibly a free one. We're certainly seeing an upward trend in ownership of and access to technology, especially smartphones, among patients with serious mental illness. Assuming our patients have these devices and are getting more of them, what can we do with them? Why are we excited? One reason is the potential of active data. By active data, I mean something someone has to actively enter and engage with, such as a survey.
If you don't click on a survey, if you don't actively engage, we don't collect any data. For depression, what we have done, and are doing in different studies today, is offer people the PHQ-9 on a smartphone. You can imagine sliding a bar to give a zero-through-three response to each item, and these are three of the questions from the PHQ-9, including question nine about suicide and self-harm. The graph on the right, which looks almost like a stock ticker, shows what one patient's daily smartphone PHQ-9 looked like compared to the in-clinic paper version. The in-clinic paper scores are the green boxes at day zero and day 30. There are two things I want you to notice from this graph, which is complex at first. One, look at the blue line of daily PHQ-9 depression scores, where higher means more severe. For this one patient, every single day they reported more severe, higher depression scores to the smartphone than they did to the clinician on the paper-and-pencil scale. So people report symptoms differently to the smartphone. Two, look at the diamonds. Each one marks a day the person told the smartphone app they had a two or three on PHQ-9 question nine, about thoughts of suicide and self-harm. That happened eight times over the course of 30 days. Yet when they came back for the clinical visit at day 30, there's no diamond there: no report of suicidal thoughts. The point is that even by asking people survey questions on a smartphone, we can learn a lot about mental health and SMI. We almost have a new view into depression, a new window into what someone is experiencing, and it may be different from what they report in the clinic. And it's hard to know: do people report symptoms differently on the phone? Is it easier to report symptoms on the phone?
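To make the survey logic concrete, here is a minimal sketch (with hypothetical data, not from the study) of how a daily smartphone PHQ-9 entry could be scored and how a concerning answer on item nine, a two or three, might be flagged, as described above:

```python
# Hypothetical sketch: score one day's PHQ-9 entry and flag item 9 (self-harm).
# Items are scored 0-3; the total ranges 0-27. A 2 or 3 on item 9 is flagged,
# matching the "diamonds" described in the lecture.

def score_phq9(responses):
    """responses: list of nine item scores, each 0-3. Returns (total, flag)."""
    if len(responses) != 9 or any(r not in (0, 1, 2, 3) for r in responses):
        raise ValueError("PHQ-9 needs nine responses scored 0-3")
    total = sum(responses)        # overall severity, 0-27
    flag = responses[8] >= 2      # item 9: thoughts of suicide/self-harm
    return total, flag

# One hypothetical day of smartphone responses
total, needs_follow_up = score_phq9([2, 2, 1, 3, 2, 1, 2, 1, 2])
print(total, needs_follow_up)  # 16 True
```

The function names and data here are illustrative only; a real app would also need a safety workflow for what happens after a flag is raised, which scoring code alone cannot provide.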
Is there perhaps less fear of judgment, so people report symptoms more candidly? Do people under-report in the clinic, or over-report to the app? It's hard to know why these differences exist. But even by assessing symptoms on a phone, we gain new views and new windows into how people with SMI experience illnesses like depression. Passive data is a different potential of these technologies. In contrast to active data, passive data is something you don't have to engage with; it gets collected whether you press a button or not. We'll talk in future lectures about the serious ethical implications of collecting this type of data and the stakeholder engagement and protections you need. But to give you an example of passive data, imagine your phone telling you how many steps you took per day. Your phone can say you did 10,000 steps, 2,000 steps, or no steps, because it has an accelerometer in it. It's tracking your physical activity, and you don't have to tell it to; it does it automatically. Your phone also has GPS, and if you grant permission, some apps, like a weather app, know where you are and can give you information based on your location. Likewise, if people share that information with us, we can learn about mobility patterns in SMI. So there's a lot of data that can be collected passively, with no engagement; again, we'll talk more in future lectures about the ethical implications of this data. It can certainly be collected, and we're learning how to use it. But we're not using it directly in mainstream clinical care yet. There are some use cases we'll talk about, but part of the issue is that a smartphone can collect so much data that we have to figure out what is signal, what is noise, what is meaningful in SMI, and what will be of value.
So I think this is coming very quickly; a lot of people and a lot of companies are working on it. Really, the promise of passive data is functional outcomes. We can learn about people's sleep: when the accelerometer shows the phone isn't moving and you're not getting calls at night, that may be a proxy for sleep. Can we automatically record things we care about in SMI? We want to make sure people are getting good sleep and the right amount of exercise and physical activity, so could this be a new way to learn about functional outcomes? The other potential of technology in SMI is intervention. Active and passive data are ways to learn about people's lived experience; interventions are different, because these are ways we can, of course, intervene. One interesting app for substance use disorders that has FDA marketing approval is called reSET, and I believe there is also a version approved for opioid use disorder. That's the first app in the mental health sphere to get this type of approval. The screenshot of the app you see over there, called FOCUS, is an app developed by Dror Ben-Zeev at the University of Washington that is being used to offer on-demand services, psychoeducation, and actual interventions to people with schizophrenia. It's an interesting app they've done a lot of work and research on, with some very impressive preliminary efficacy results suggesting it can help people with their symptoms. So these are two examples of apps that are very far along in the research pipeline: reSET, again, has FDA clearance, and the FOCUS app for schizophrenia has had several impactful studies. The first study, which you can read there, is from 2014, and they've done many studies on FOCUS since.
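The sleep-proxy idea above can be sketched in a few lines. This is an illustrative toy only, with an assumed activity threshold and made-up hourly data, not a validated sleep algorithm: it just finds the longest overnight stretch of near-zero accelerometer activity.

```python
# Illustrative sketch: a crude "sleep" proxy from passive accelerometer data,
# treating the longest consecutive run of near-still hours as estimated sleep.
# The threshold of 5 movement counts/hour is an assumption for this example.

def longest_still_run(activity_counts, threshold=5):
    """activity_counts: hourly movement counts overnight.
    Returns the longest consecutive run of hours at or below threshold."""
    best = current = 0
    for count in activity_counts:
        current = current + 1 if count <= threshold else 0
        best = max(best, current)
    return best

# Hypothetical hourly counts from 9pm to 9am
overnight = [80, 40, 3, 0, 1, 0, 2, 0, 1, 60, 90, 70]
print(longest_still_run(overnight))  # 7 (hours of estimated sleep)
```

Real passive-sensing pipelines also have to handle the phone being off-body, charging, or shared, which is part of the signal-versus-noise problem described above.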
So these apps for SMI are being built, used, and studied today. That raises the question of what patients are actually using today. What are the patients you and I see engaging with? We've gone over the potential and some exciting apps that are out there, but what are people actually using? One point we have to acknowledge is that a patient having a smartphone is necessary to connect to and use apps, but not sufficient. To see that, look at the two graphs below. The state clinic graph is from our Massachusetts Mental Health Center, which treats patients with SMI, largely schizophrenia and bipolar disorder, as well as some treatment-resistant depression. I should bring the legend up: black is smartphone ownership, red is having ever downloaded an app, green is having ever downloaded a mental health app, and blue is currently using that mental health app. In the state clinic, a large percentage of our younger patients have a smartphone and a large percentage have downloaded an app; fewer have ever downloaded a mental health app, and even fewer are still using one. So, as we talked about, people increasingly have access to smartphones, but they may not be using mental health apps today. Still, some are; those numbers aren't zero. We actually went through and looked at what apps patients are using today in a population with schizophrenia. We got permission to look at their phones with them and count apps. Perhaps it's not surprising that SMI is not a contraindication to liking games on your phone; games, social networking, music, and navigation were the top four categories. And the patients we see in Boston weren't using many mental health apps outside of research studies; you can see in black that it was not a highly prevalent category on their phones.
And what's interesting is that one app that was very popular in our cohort was actually an app for psychic services, not psychiatric services. I bring it up because if you read the screenshot for this psychic app, it says it's about relationships, family, and career. It says: talk face-to-face to your favorite advisors; nonjudgmental advisors; personal. These are all elements of good care for SMI, right? Face-to-face care, nonjudgmental, talking about relationships, family, and career. So our patients are finding interesting resources online, though this is certainly not an app we want to recommend. If they want to use it for entertainment, that's perfectly fine, but it's interesting that the elements of good care appear in something like this, even though it's clearly not a mental health app. That leads to a question, before I get into what makes a good or bad app: what people are using to improve their mental health is not always what's labeled a mental health app. There was an interesting study in patients with depression that asked, basically, why are you using a mental health app and what are you using it for? Some people were using apps for skill acquisition: they wanted to learn certain CBT skills, they got an app, they learned the skill, and they stopped using it. Some people were using apps and technology for social connectedness: they wanted to go on a social network to have more friends and people to talk to, which really wouldn't be a mental health app. Some people were just curious, downloading different apps to see what they were. And some people were using apps for safety netting, to have people to rely on and call on; again, people may use social network apps as a way to have someone to talk to and be less lonely.
So just because something isn't called a mental health app doesn't mean people aren't using it to improve their mental health. The flip side is that an app clearly directed at SMI doesn't mean people are actually using it. I think you're getting a sense that there's a lot out there, but it's hard to know what people are and aren't using. Here is one app directed at schizophrenia that we did a study on. We got ethical approval from our IRB to look, anonymously, at who was using this app for schizophrenia: about a year and a half of data on who had logged in and what they had done. The app lets you track your medications, track symptoms, and keep a diary, and it has a lot of different features; I believe it's still available on both Android and Apple. Again, we don't recommend or endorse any particular app, but I want to show you the data we published in this paper. What's interesting is that the medication feature was accessed by about 225 people. Fewer people used the medication feature twice, and there's a decay curve: only about 50 people used it a hundred times. So despite being readily available and free, people were not logging in and using the medication feature consistently over the time this free app was available. The point is that just because something is labeled a mental health app doesn't mean people are running out to use it. If you build it, they may not come. In some ways, patients aren't waiting: people with SMI are finding ways to use technology however they want to use it. In that same NAMI study, we found that one of the most frequent uses of people's phones was listening to music or audio files to help block or manage voices.
So I don't think there's any research study saying that smartphones used to block voices can reduce hallucinations. But it's a case where patients have found interesting, useful applications of technology and smartphones, and we in the clinical community, as people who support and help them, need to catch up and learn what innovations those with schizophrenia are coming up with. In some ways, when we talk about technology and SMI, it's important to credit the interesting innovations patients themselves produce. This case example is a patient whose innovation we actually wrote up as a case report. He needed to increase the dose of his antipsychotic medication because he was having more hallucinations, and he asked about side effects. He said: I take this medication; how do I know it's going to be effective? How do I know the risk-benefit ratio will be worth it? He wanted to better track whether the number of auditory hallucinations went down as he started the medication. So he ran his own self-experiment. First he found a smartphone app: every time he had an auditory hallucination, he would unlock his phone, go to a counter app, and press plus one. He found that burdensome, so he bought himself a tally counter: every time he heard an auditory hallucination, he could click the tally counter. But that wasn't convenient to carry around either, so he bought a digital tally counter. What's interesting is that using a digital tally counter, nothing too fancy, he was able to make his own graph of his auditory hallucinations across different doses of the medication, and you can see the count of auditory hallucinations go down as the dose of the medication increases.
And again, this was a single patient's experience, but it's a case of someone with schizophrenia being interested and curious and finding his own solution. I thought it was a very impressive case, so we wrote it up in the literature, because we have to make sure our patients with SMI are also seen as innovative; they're working on solutions as well. That brings us to the point that there are a lot of different technologies and a lot of different use cases. We're not sure exactly what people are using, and different people access different things, but I hope I've conveyed that there are certainly apps out there, some people are using them, and they're using them in different ways. And it's a bit of a challenge if someone brings you something and asks: what do you think of it? Should I be using this? Is this good for my care? What would your answer be? This has always been a challenge in healthcare; going back to the Victorian era, we've always seen snake oil salesmen of one kind or another. I would invoke the Carl Sagan quote that extraordinary claims require extraordinary evidence. If a smartphone app claims it's going to do extraordinary things, you want to ask: where is that extraordinary evidence? Getting into the nuts and bolts can be really difficult, because we once tried to estimate how many mental health apps are out there, and we thought that if you count the meditation, mindfulness, symptom-specific, and medication apps, there may be 10,000 mental-health-related apps. That's too many to keep up with. We likened them to cargo ship containers in that they come and go. Think how many apps want to update and change on your phone; some apps stop being supported, and new apps come onto the marketplace. So they're not static like cargo containers; they're always in motion.
So it's really hard to keep up with every single app. It's not like a medication, where you know it has a certain active ingredient, or a manualized therapy like CBT, where we know the core components and they don't change. We recently published a paper where we tried to ask: is there a magic formula that will pick out the right apps? We took 120 apps and coded them using this pyramid. Does each app use surveys, yes or no? Does it record, say, your steps? Does it have a privacy policy? Does it give you badges and rewards? Does it connect you to social networks? Of all the different things apps can do, is there some combination of features that makes apps popular and useful-looking? What we found was that even using machine learning approaches, letting the computer find patterns in those 120 apps based on all these features, no magic formula came out saying a great app will always have certain features in a certain order, or will always do A, B, and C. The only thing we found is that a bad app tends to be one that is not updated very often, which is probably something you could have realized from common sense. If an app developer stops supporting an app and hasn't updated it in six months, that probably means they no longer care about it, and that's a reason for caution. So there is no magic formula for picking the right app. That's important to keep in mind, because you're going to see a lot of top-10 lists: the top 10 apps for schizophrenia, the top 10 for depression. You're going to see a lot of websites saying: we curate apps, we have our own scoring system, these are the best apps to look at. And it's important to remember that apps are constantly changing, as we talked about.
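The feature-coding approach described above can be sketched simply. The app names, feature list, and six-month cutoff below are illustrative assumptions; the one real takeaway the study reported, that staleness is a warning sign, is what the `stale` check models:

```python
# Sketch of the yes/no feature coding described in the lecture, with made-up
# apps. Each app becomes a binary vector for analysis, and apps not updated
# in roughly six months are flagged, the one reliable signal the study found.

FEATURES = ["surveys", "step_tracking", "privacy_policy",
            "badges_rewards", "social_network"]

def code_app(app):
    """Turn an app's feature dict into a binary vector (one int per feature)."""
    return [int(app.get(f, False)) for f in FEATURES]

def stale(days_since_update, limit=180):
    """Flag apps not updated in roughly six months: a reason for caution."""
    return days_since_update > limit

apps = {
    "MoodTrackerX": {"surveys": True, "privacy_policy": True,
                     "days_since_update": 30},
    "OldWellnessApp": {"surveys": True, "days_since_update": 400},
}
for name, info in apps.items():
    print(name, code_app(info), "stale:", stale(info["days_since_update"]))
```

Feeding such vectors to a classifier is straightforward, but as the study found, no combination of these features reliably separated good apps from bad ones.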
So even if you have a top-10 list, a scoring system, or a repository of apps, ask: when were those repository ratings done? When was the top-10 list made? Those apps have probably updated and changed since. As we'll talk about in a minute, the evidence for apps is also rapidly changing, so we don't really have a gold standard for what a good app is, and any scoring system someone tells you is a great way to score apps is probably not very reliable or valid. It can seem useful to say, here's a list of apps, let's pick these for SMI, but you have to be very careful, because you would never say the best medicine for depression is always medicine X, or the best treatment for anxiety is always problem-solving therapy. We would always look at the treatment and the person and customize. We don't want to shortchange people on apps by making recommendations based on ratings that are not reliable or valid. So we've taken a slightly different approach at the American Psychiatric Association with this evaluation model. It's a template and framework for making an informed decision about smartphone apps. We'll focus on four layers today, risk, evidence, ease of use, and interoperability, and we'll go into each of those. The idea is to give you some questions to ask of an app about risk, evidence, ease of use, and interoperability. If you look at the questions about risk and you say, I don't know the answer, or something doesn't sound right, that's a good reason to stop evaluating the app, because if the risks don't make sense, it doesn't matter how easy it is to use or what evidence it has. And the interesting thing about apps, as we go through each of these four layers, is that some of the risks may be different from what you expect.
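The stop-at-the-first-failing-layer logic described above can be written down directly. The layer names and answers below are assumptions for illustration; the ordering, risk first, then evidence, ease of use, and interoperability, follows the model as presented:

```python
# Sketch of the layered logic of the APA app evaluation model as described:
# work through the layers in order and stop at the first one that fails.
# An unknown answer counts as a concern, mirroring "I don't know" = stop.

LAYERS = ["risk_privacy", "evidence", "ease_of_use", "interoperability"]

def evaluate(app_answers):
    """app_answers maps layer -> True (acceptable) / False (concerning).
    Returns the first failing layer, or None if all layers pass."""
    for layer in LAYERS:
        if not app_answers.get(layer, False):
            return layer
    return None

# Hypothetical app: privacy looks fine, but there is no evidence base
print(evaluate({"risk_privacy": True, "evidence": False}))  # evidence
```

The design point is simply that the layers are ordered: a failing risk answer short-circuits the evaluation, so ease of use and interoperability are never even reached.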
Some of the evidence may be different from what you expect, and ease of use may be different from what you expect. As I said, we know how to evaluate medication and therapy studies, but there are some new features to thinking about evidence for apps that we'll go through here, so you can use this as a template and say: this makes sense, or, something seems worrisome, I should step back from this one. To talk about risk: the first issue is that a lot of apps are put out there without going through any peer review. Apple and Google, the major app stores at this point, are not evaluating apps for medical content, and certainly some apps out there are simply dangerous. The one that says "Schizophrenia, don't lose it" is something I found quickly just by searching schizophrenia in the iTunes store, and it turned out to be a pong game. It wasn't even a medical app; it was just a stigmatizing game of pong. The middle app is interesting: clearly someone was trying to make their own app about addiction. It lists an addiction to being right, which is probably not a diagnosis or an addiction we treat, though you can imagine the person was perhaps thinking of narcissism, which is also not an addiction, but there's something they were trying to get at. It lists collecting things as an addiction, and again you can imagine what they were reaching for. And it lists adrenaline, when producing adrenaline is useful for living and having normal physiology, though clearly too much adrenaline can be dangerous or fatal. So this is an app someone made with interesting ideas, but it clearly hasn't been looked at by someone with a medical background. The third screenshot says: help yourself by helping others.
And it says: help save the healthcare system money by avoiding being readmitted. So some of these apps are just quirky. I don't think most app developers mean to do harm, but sometimes people put up content that doesn't make sense, hasn't been looked at, and isn't evaluated. The Blue Whale challenge is an example of something that spread through social media where one of the challenges was actually to kill yourself, so that was something put up with genuinely malicious intent. The idea is that just because it's an app doesn't mean it's safe. Another risk that people don't always think about, and that isn't intuitive but is very important, is that when patients see a mental health app, they often assume it offers the same protections they get when they come to the clinic: my mental health data is going to be kept secure, it's not going to be sold, it's protected by things like HIPAA. But in the privacy policies of a lot of apps there is fine print that says: actually, no, this is not a medical app. It may look like one, but it's a health and wellness app, and because it's a health and wellness app it falls outside of medical jurisdiction, so you don't have any federal privacy protections for your health data, and the company can do what it wants with it, as outlined in the privacy policy. As a proof of principle, we printed out a bunch of privacy policies for dementia apps and read them all. What we found was that, in essence, 96% of these dementia apps either said, of course we're going to market and sell the personal data we collect from you, or said they wouldn't tell you whether they sell it. Only 4% of these apps promised not to market and sell your data. So I think it's important that you understand: is this a medical app?
If it's not, what's happening to your patients' data, the data they're entering about themselves? Where is it going? Again, this isn't something we come across with a medication; a pill isn't going to do this. An app is different: it's dynamic, and it sends information. The picture of the cloud on the right is just a nice cartoon. I always like to tell people there's no such thing as the cloud; it's just someone else's computer that you're storing your data on. Where that gets interesting, and we haven't published this data yet, is that we've been looking at what privacy policies say versus where data is actually being sent. An app may send your data to Facebook for analysis, or to Google, and sometimes even the privacy policy may not be accurate: we found some apps may be sending data to different places than the privacy policy states. That's a really tricky problem, because we can certainly help people read a privacy policy and scan it with them, but we can't intercept the app's transmissions. So you want the privacy policy to offer the right protections for patients, but even the privacy policy alone sometimes isn't perfect. If what I just said sounds disconcerting and worrisome, I would say the FDA agrees with you: things are a little bit out of control in terms of protecting people and making sure apps on the market are safe, evidence-based, and not stealing patients' data. The FDA is working on a new program called pre-certification, which we'll talk more about in future lectures. This is just a schematic diagram from the FDA of their new evaluation model for what they're calling software as a medical device, which apps will fall under.
So the FDA is currently piloting new programs so that the apps in the marketplace are not as confusing, and it's less difficult to figure out which are safe. But for risk right now, I think the most important questions are: Is there a privacy policy? Do you understand what data is collected? Is personal data de-identified? Can you opt out of data collection? Can you delete your data on the device? Who is data shared with? Does the data ever leave your device, or does it go to the web? Is your data encrypted? Does the app claim to be HIPAA compliant? Again, these are things you should be able to find pretty quickly from the app. It may be that you pick one or two apps to ask these questions about and use those with patients. But I think this is something you want to be aware of, because certainly, as I said, there are some risks that may not be intuitive. We'll now move to level two: evidence. And something to be aware of, as in any clinical study, is that a lot of apps increasingly will say, well, we have a really great study, and because the app did really well in the study, it's ready for clinical use. The slide here doesn't show apps; these are online computerized CBT programs. One is called MoodGYM, one is called Beating the Blues. They both have really good pilot studies. But there's this interesting study from 2015 where they took these CBT programs and put them in primary care in the UK. The primary care physician just said, here's the program, use it if you want, if they thought the patient had an indication such as depression. And what happened is, without the extra support of a clinical study, you can see the intervention did no better than usual care. It really didn't separate out. So when you took away the support of a clinical study, it didn't do as well.
I will say that they redid the study later, in 2017, with phone call support, and the online CBT did better. But the point is that sometimes a study and the real world don't exactly line up. An interesting case in substance abuse: there was a very impressive paper that came out in JAMA Psychiatry in 2015 about a substance abuse app called A-CHESS that really was helping people manage recovery from substance abuse. And there was an interesting paper that came out about a year later where they gave the A-CHESS platform to different mental health centers and different rehab centers. And what they showed is that it was actually very hard, in the real world, for actual clinics to implement this app, to get it into their workflow, to sustain it from a cost point of view. So again, just because something worked well in the clinical evidence doesn't mean it was easy to implement in the real world. There's also this interesting idea of a digital placebo effect. And you may ask, what is a digital placebo effect? The idea being: if you tell someone, here's an app, here's really high-tech, state-of-the-art care, which is how a lot of these apps are presented, how much of the benefit is placebo? How much is expectations, people thinking that they'll get better? And a lot of digital health studies don't have a placebo group. There was an interesting study this summer by Noone and Hogan, the citation is there, where they took a very popular mindfulness app called Headspace and made a placebo version with the same narrator's voice and the same animations but, in essence, content that was not real mindfulness. And they showed in a group of students, again, a very small study, that there was no difference in benefit between placebo Headspace and real Headspace. The placebo group, who got sham mindfulness, did just as well. So again, it's hard to know: what is placebo?
And how does the translation of evidence work? I had a slide of a book versus a movie, because if something is great in the book version, do you like the movie? If you like the movie, do you like the book? If CBT works well face-to-face, does it work as well in an app? It's hard to know. So I think there are some interesting questions about how evidence translates and what the role of placebo is in all of this. The evidence for most smartphone apps for depression is evolving but, as the slide says, currently small and weak. What I want to draw your attention to is the meta-analysis we did on smartphone apps for depression in World Psychiatry. You can find this paper for free online. If you look at the effect size for a smartphone app versus a waitlist, it looks not bad, at 0.56. But if you compare a smartphone app to an active control, it goes down to 0.22. The point of the slide being: the type of study matters in how good the smartphone app is going to look for depression. If you compare the smartphone app to something like walking, journaling, talking to someone else, or extra appointments, that's probably a better assessment of the app than a waitlist control. So as we get higher-quality studies with better control groups, what this slide is showing you is that the effect size of apps is still positive. They're not harming people, we can tell from this. But it's not as strong as perhaps the marketing we're reading is telling us. And of course, I think we all have to be careful of hype. There are certainly different headlines coming out. This is the headline based on the research we just discussed, where we said we need more evidence: "Swipe left on sadness: researchers say smartphones can help cure depression." I don't think that's quite what we said based on that research, which gets to the point that if you just look at the headlines, it can be a little bit misleading.
So another thing to think about when looking at the evidence for these things: this was a very interesting app called Project Evo that used different cognitive approaches toward depression. And this was a study where, over time, people used it and PHQ-9 depression scores went down. When you go back and ask, was there a dose effect? Comparing people who used the app never versus optimally, all the time, you would think the people who used the app more should perhaps have a stronger effect. In this case, people who never opened the app did pretty well. In some cases, at week six, they were having the most benefit. So you say, something's odd there, right? If people never opened the app but are benefiting the most, that makes you worried again about a placebo effect, where you ask, shouldn't there be a dose effect? So the point of showing you this is that the evidence is very new. It's evolving. We're getting new studies coming out every week now. So it's hard to judge these apps on evidence alone. I think it's important to at least ask: does the app actually do what it claims to do? Is there any peer-reviewed evidence? Is there any feedback to support its use? Does the content appear to be of at least reasonable value? Remember, we looked at that screenshot of the app that said collecting things was the addiction it was going to track. So simply the eyeball test is useful. A lot of apps won't have evidence, and as I said, a lot of the evidence is early, so you have to be pretty careful in evaluating it. That said, not everything has to have a peer-reviewed paper, certainly, but you want to be aware of the evidence and not be tricked by an app that says, well, we have this one feasibility study, so it's great. And you say, well, what does that really mean?
So I think using your clinical judgment will go very far with evidence. Engagement, level three of our evaluation framework, is an important topic too. Sometimes people think that because it's an app, people are going to use it, it's going to be really sticky, people are going to keep using it. PTSD Coach is a great app. I actually recommend it to patients I work with, partially because the privacy settings are great per our evaluation framework; they really don't share your data with people, and they work to protect your data. Partly because level two, evidence, is really good; they've done research on PTSD Coach. But now that we're talking about level three, engagement has been trickier. This is a screenshot: PTSD Coach has been downloaded over a hundred thousand times in 74 countries around the world. And PTSD Coach has run into what almost every mental health app runs into in a real-world setting outside of a clinical study: people fall off of it. Here are the downloads of PTSD Coach. We can see they had a lot fewer people coming back to use it the next day, though some people were using it later. But you can see that downloads is not always the best metric, and that just because you get someone to download an app in your office doesn't mean they're always going to use it. And sometimes people will say, well, we're dealing with SMI, we're talking about illnesses where there may be less motivation. Is people not using apps and sticking with them an SMI-specific thing? And I bring up this slide to say no. This is a large asthma study that had backing from Apple, and you can see they had the same problems with keeping people engaged. So it's a general thing for mobile health and smartphone apps. One interesting thing to think about when people are using apps is: what is the working alliance? We know that certainly the therapeutic alliance is very important in SMI treatment.
Having a good alliance certainly is very predictive of good outcomes. What does the alliance mean with technology? How is technology helping people reach their goals? Are the tasks appropriate to reach their goals? What is the bond? How are you supporting technology use in the clinic in a way that helps people through those tasks toward the goal? If you Google Henson and Torous, this paper should be out within a week or two of when this podcast comes out, and you can read more about what a digital therapeutic alliance means and its implications for SMI treatment. As I said, we don't have a lot of data or hard recommendations for what makes an engaging app or what's easy to use. And in part, patients are different. Some people may love technology. Some people may have a hard time. Some people may have vision impairment. Some people may always be on their screens. So I think it's just worth keeping in mind and having a discussion, saying: if you're using this app, does it look usable? Can you keep using it? Is it engaging? Is it going to keep you going? It's not enough to say, here's an app, it's going to be great, we don't need to talk about it again. And that brings me to the fourth point: interoperability. And by that, I mean, you see these four silos here on the left. We don't want people using different technologies that silo their data and their mental health care. You don't want all the information about medications going to one app, everything about diet to a different app, exercise to a different app, and CBT to a different app. Then it's impossible to put all the information together and understand how someone is doing. We live in an era of integrated care, the picture on the right of people having a conversation. So with any app, you always want to ask: how is this being used to facilitate a conversation?
How are we going to avoid siloing data, avoid fragmenting data, and instead make sure it's easy to get data off, to print data out, to look at it, and to use it as part of the treatment plan and treatment goals? Are we going to use an app to help track side effects of medication, and check in with that data in a week and discuss it? Are we going to track steps to get someone more physically active, and make it easy to get that data off the device, to look at it, and to make it part of treatment and a constant conversation? We don't want to collect data for no purpose. We don't want to fragment data. So again, we have some questions about interoperability: make sure you own your data. It's really rare today that you can share your data with electronic health records, but hopefully patients can print out their app data or their fitness tracker data, download it, or share it with you. So to review the four steps of the pyramid we talked about: risk, evidence, ease of use, and interoperability, this meaningful data use and sharing. And again, for each step, we can't tell you what's good or bad, but we can give you these questions to guide you and to inform a discussion with patients to make sure the app makes sense in care. So it's a lot of information, but the good thing is you can access all of this for free on the American Psychiatric Association website. If you go to the website and click on Psychiatrists, then Practice, then Mental Health Apps, you'll see this evaluation framework. We're actually going to be adding a lot more examples in the next couple of months; we're going to be adding more use cases, more worked-through app evaluations, more details.
So this website will be updated, but it's a good place to put it all together and learn more. We'll also have more resources and more learning opportunities through SMI Adviser, and we'll certainly have links and information through SMI Adviser as well. So thank you for listening and tuning in.
Video Summary
The video discusses the topic of informed decision making for picking smartphone apps for serious mental illness (SMI). The speaker, Dr. John Torous, is a psychiatrist who focuses on using technology to improve recovery and care for SMI. The video covers three main topics: the potential of apps in SMI, the challenges in picking a good app, and the American Psychiatric Association's app evaluation model.<br /><br />Dr. Torous explains that smartphones have the potential to bridge access to mental health care, especially in areas with limited resources. However, not all smartphone apps for mental health are safe or effective. He emphasizes the importance of considering risks, such as privacy and data security, when choosing an app. Additionally, the evidence for smartphone apps for SMI is still evolving, and it's important to critically evaluate claims of effectiveness.<br /><br />Engagement and ease of use are also factors to consider when selecting an app. Dr. Torous highlights the importance of an app's user interface and whether it can be easily integrated into a treatment plan. Interoperability is another key aspect, as it is important to ensure that data from different apps can be easily accessed and shared by both patients and healthcare providers.<br /><br />The speaker provides a link to the American Psychiatric Association website where a comprehensive evaluation framework for mental health apps can be found. This framework includes a set of questions and considerations for each of the four main aspects discussed: risk, evidence, engagement, and interoperability.
Keywords
informed decision making
smartphone apps
serious mental illness
Dr. John Torous
American Psychiatric Association
app evaluation model
privacy and data security
engagement and ease of use
Funding for SMI Adviser was made possible by Grant No. SM080818 from SAMHSA of the U.S. Department of Health and Human Services (HHS). The contents are those of the author(s) and do not necessarily represent the official views of, nor an endorsement by, SAMHSA/HHS or the U.S. Government.