Fostering Organizational Change to Promote Uptake ...
Lecture Presentation
Video Transcription
Hello and welcome. I'm Amy Cohen, a member of the clinical expert team with SMI Advisor and an associate research professor in UCLA's Department of Psychiatry and Biobehavioral Sciences. I'm pleased that you're joining us for today's SMI Advisor webinar, Fostering Organizational Change to Promote Uptake of Evidence-Based Practices: Lessons and Tools. SMI Advisor, also known as the Clinical Support System for Serious Mental Illness, is an APA and SAMHSA initiative devoted to helping clinicians implement evidence-based care for those living with serious mental illness. Working with experts from across the SMI clinician community, our interdisciplinary effort has been designed to help you get the answers you need to care for your patients. Now, I'd like to introduce you to the faculty for today's webinar, Dr. Allison Hamilton. Dr. Allison Hamilton is Chief Officer of Implementation and Policy at the VA Center for the Study of Healthcare Innovation, Implementation and Policy. She is also a research anthropologist in the UCLA Department of Psychiatry and Biobehavioral Sciences. Her work is focused on women's health, mental health, implementation science, and organizational change. I will also have the opportunity to join in midway through the talk to add some details about a project that I worked on with Allison. Allison, thank you so much for leading today's webinar.

Thank you so much, Amy. It is a pleasure to do this webinar, and I really appreciate the invitation and also the fantastic support from the APA in putting this together. I just want to start with a couple of disclosure and disclaimer statements: I don't have any conflicts of interest related to the subject matter, and the views that I will be expressing do not necessarily reflect the position or policy of the Department of Veterans Affairs or the United States government.
I also just want to acknowledge several support sources that both Amy and I have through VA, and also a number of individuals who have influenced and informed the work that we're going to be presenting today. So our focus today is on quality improvement and how to achieve it: what kinds of things you want to think about when you're embarking on a quality improvement effort. We're going to start with just a thought that all improvement will require change, but not all change will result in improvement. We're going to talk that through a little bit in terms of how we are going to know that improvement has actually been achieved. So what I'll walk you through are some quality improvement approaches, and they can be a bit bewildering, so we'll go through a few different types of quality improvement. We'll talk about some processes for identifying and homing in on QI priorities in order to develop feasible projects, and we'll review sources of evidence and approaches that will jumpstart your QI project options, particularly related to uptake of evidence-based practices. First, we just want to get started with a poll question so that we know a little bit about all of you who are in attendance today.

We have 50% or more of our audience members who say they have done one or more QI projects, so that's very exciting. About 38% of the audience says they don't know very much about QI, and then the rest of them, about 6% each, are split between "I'm trained in QI" and "I'm familiar with QI, but I haven't done a QI project." So I'll turn it back to you.

Great. Thank you so much. That's very helpful. So before we start talking about quality improvement, it's important just to pause and think about what we even mean by quality.
I think one of the most reliable sources of information we have about quality is the Institute of Medicine, which defines quality as the degree to which health services for individuals and populations increase the likelihood of desired health outcomes and are consistent with current professional knowledge. The six dimensions that they call upon for quality health care are that this health care is safe, it's effective, it's patient-centered, timely, efficient, and equitable. So now let's move into some common quality improvement approaches, and you may have heard of some or all of these. There are lots of acronyms that might just get a little confusing. You might have heard of total quality management, continuous quality improvement, Lean Six Sigma. That's something that's used in the VA a lot. We're going to talk about evidence-based quality improvement, which is an approach that we've used across several projects. So I'm just going to talk us through a few of these approaches and really focus on their commonalities and some of the things that we want to pay attention to in quality improvement, regardless of which particular approach one is embracing. So quality improvement overall is a systematic approach to using specific methods to improve quality, and what we're looking to do with quality improvement is achieve successful and sustained improvement. A lot of the roots of quality improvement are really in industry, where there's been a pretty careful examination of both systems and processes designed to improve outcomes. So there was a lot of work many decades ago in industrial quality management science, particularly calling on the techniques of continuous quality improvement and total quality management. So, you know, very strong roots in business and management and really thinking about quality assurance and customer satisfaction. 
Just to walk through a couple of the commonly used quality improvement approaches, Lean Six Sigma is one that maybe you've heard of, or maybe some of you are even trained or certified in Lean Six Sigma. And this approach relies on teams to improve performance, and it's really focused on making things leaner. So as the name suggests, it's about eliminating waste and also reducing defects. So teams are assigned well-defined projects with an impact on the organizational bottom line, and there's this emphasis on statistical thinking at all levels, and people get trained in advanced statistics and project management and go through different types of belts, green belt, black belt, etc., depending on their level of experience with Lean Six Sigma. They use this approach, DMAIC, for problem solving, which stands for define, measure, analyze, improve, and control. And you'll see some kind of parallel ways of thinking about the cycles that we use to engage in quality improvement. Another key part of Lean Six Sigma, and as you'll see other quality improvement approaches, is that the management or leadership environment is very important in the support of quality improvement initiatives. And in the case of Lean Six Sigma, it's really thought of as a business strategy, so very oriented toward what is that organizational bottom line and how does the leadership structure approach the use of this quality improvement technique in order to achieve those goals of eliminating waste and reducing defects. There's a program called the Baldrige Performance Excellence Program, which is actually a certification program that's meant to improve the competitiveness and performance of U.S. organizations, and they have a national quality award, and their criteria for being one of these Baldrige organizations is that the components of the organization are managed as a whole. 
They pay a lot of attention to data management, so managing cybersecurity risks, and they also look a lot at risk management as a component of performance management. There are many, many helpful improvement tools available on the Baldrige website. They are not free, but they're very good, and if an organization is looking to become certified in this approach, then, of course, you would embark on the full package that they offer. Getting to Outcomes is an approach that one of our colleagues, Matt Chinman, developed, which has been really popular in substance abuse treatment settings and reproductive health settings; it's a very well-laid-out approach to figuring out how to provide support for implementation in an evidence-based manner. There are about 10 steps, quality improvement being one of those steps, that one would use in this model to be able to achieve successful implementation, and there are a number of resources available for the Getting to Outcomes approach, and many, many articles and other presentations and so forth that could help if this is an approach that you're interested in using. So what Amy and I and our colleagues have really focused on for, I would say, over 15 years is an approach to quality improvement called Evidence-Based Quality Improvement, or EBQI. And the reason why this approach was developed, not by us, but by some of our colleagues, and some colleagues internationally, is that there really hasn't been a whole lot of success with continuous quality improvement in terms of actually demonstrating that using CQI results in the desired outcomes. So the steps that have been taken to move CQI along into this evidence-based approach involve a couple of different features. One, it really focuses on the research-clinical partnership. It also uses top-down and bottom-up features to engage organizational senior leaders and quality improvement teams.
So we think about it as a multilevel approach and one that is very, very team-oriented, and the evidence-based part really has to do with its emphasis on drawing on prior research evidence. So in other words, if there are topics that you want to address in your QI project, we would suggest within an EBQI approach that you look at what we already know about this issue. What are the clinical guidelines? Are there some previously validated care models? Are there ways that we would even go about engaging in change that we know are more successful than other ways? So it's evidence-based for the intervention or for the care model itself, but also for the way in which we would approach even supporting the change that the quality improvement teams might embrace. And it's very helpful that evidence-based quality improvement has a very strong track record of success. So now there's an evidence base for evidence-based quality improvement, suggesting that if sites and settings and organizations take on this approach to quality improvement, they will have a strong chance of achieving the outcomes that they're interested in achieving. So I just want to talk us through some steps around identifying priorities for quality improvement approaches and some of the things that we take into consideration when we're doing this type of work. So there are a lot of different ways of identifying gaps, or figuring out where we are starting from. We need to know what's not happening, what's missing. Some quality improvement approaches use the 80-20 rule: 20 percent of care processes consuming 80 percent of resources, and obviously we want to shift that balance. There are many, many different ways of figuring out what the issue is that's being addressed, and we would recommend figuring that out in close engagement and collaboration with stakeholders, including organizational leadership.
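The 80-20 (Pareto) idea mentioned above can be turned into a quick gap-finding computation: rank problem categories by how often they occur and keep the "vital few" that account for most of the total. Here is a minimal sketch in Python; the reason categories and counts are entirely made up for illustration and are not from any study.

```python
# Hypothetical sketch of a Pareto analysis: rank problem categories by
# frequency and keep the "vital few" covering 80% of all occurrences.
from collections import Counter

def pareto_vital_few(events, threshold=0.80):
    """Return categories, most frequent first, until `threshold` of all
    events is covered."""
    counts = Counter(events)
    total = sum(counts.values())
    vital, covered = [], 0
    for category, n in counts.most_common():
        vital.append(category)
        covered += n
        if covered / total >= threshold:
            break
    return vital

# Example: made-up reasons that referrals to a service stalled.
reasons = (["no one asked patient"] * 42 + ["referral not placed"] * 28
           + ["service at capacity"] * 18 + ["paperwork lost"] * 8
           + ["patient moved"] * 4)
print(pareto_vital_few(reasons))
# The three most common reasons cover 88 of 100 events (>= 80%).
```

In practice the "events" would come from chart review or administrative data, but the principle is the same: a small number of categories usually dominates, and those are the feasible QI targets.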
But the idea really is that we would identify these priorities for improvement based on the available data and then make choices about how to approach solving that problem using the data that is available. It's very important in the multilevel stakeholder engagement context to gain consensus on priorities. You don't want people at one level of the organization thinking, well, this is the priority for us, but people at another level thinking, well, our priority is different, because then it's going to be very hard to achieve the change that's desired; people will have different notions of what that change should look like. So gaining consensus is key, and data is a great way to do it: everyone looking at the data and saying, okay, what can we agree really needs to be fixed, and how are we going to do that? It's also really important to select a priority target that is important, but not only important, because it also has to be feasible. As we'll talk about in a couple minutes, feasibility is everything with your quality improvement projects. What we've seen in many cases is that teams will say, well, this is the priority target: we want to end homelessness. And, of course, that is incredibly important, but it's not going to be feasible within the context of a small quality improvement effort. So carving off a little piece of that and saying, well, what can we do that would help us get to that bigger-picture goal, that is going to be feasible, and that we could actually measure and potentially achieve the outcomes we're interested in? Okay. So the improvement process is not generally super fast, but there is evidence to suggest that organizations that make more rapid gains, relatively speaking, can answer three core questions. They're clear on what they're trying to accomplish (the aim), how they'll know if a change is an improvement (and that gets into measurement issues), and what changes can be made that will result in an improvement.
So in order to get to that point of achieving those outcomes, what can we actually do? What's that change concept? In the process of figuring out what your project will focus on, again, it's really important to think about how you're selecting that topic and having effective meetings. It seems simple, and something that we would want to do across the board, but especially when you're getting people together in teams that may not have worked together before and may be coming from different levels of the organization, that's where those good meeting skills and optimized time management mean a lot. You want people to keep coming back to the meetings, and if the meetings are not well run, if they're not efficient, if people don't feel like their time is being spent well, they might lose interest in being part of that quality improvement effort, and we have seen that. Conversely, we've seen that when those meetings are run well, people are on board. They want to stay. They want to keep coming back. Another really key idea within speeding up the improvement process is focusing on testing the change that you want to make. It's tempting, I would say, to spend time analyzing the current process rather than thinking about how you're going to change it. So people can do a very deep dive into current processes, figure out all of the problems, and then find themselves six or eight months later saying, wait, we haven't changed anything. We've just produced, you know, a really detailed analysis of the current processes. Of course, it's important to know the current process, but the idea is really to move as efficiently and appropriately as possible to testing the change that you want to make. Another thing that can happen that slows things down is that people might tend to collect more data than they really need.
So they collect way more information about the problem and the change than they're actually going to use, and again, that's another area where you can get kind of caught up in the process of just trying to be exhaustive with collecting data, when actually the only part you need is some small fraction of that data. So work together as a team to say: what data do we actually need to solve this problem and engage in quality improvement? And finally, it's very important to think about how you're going to share information about the project. One of the reasons why we really encourage this in QI projects is that lessons learned can spread. So if you figured out something, a way to solve a clinical problem, and someone hears about it, they might say, we want to do that, too. And that can happen very quickly, even surprisingly quickly. So it's important to think about where the opportunities are to share information about what we're doing, because you never know who's going to pick up on it and say, we want to try that, too. So, some of the essential ingredients that you need for quality improvement, and this is pretty much across the board of all those different approaches that I shared earlier. The essential ingredients, we would say, include an interdisciplinary team, and that's because you need people who have different perspectives, different backgrounds, and different levels of responsibility in the organization to be coordinated to achieve change. If you keep it focused on just one segment of a clinical setting, it may not be able to touch as much as you're interested in, because you're just not drawing on all the people who are actually in that setting on a daily basis. You need several different tools. You need tools for collaborative work. There are tools for brainstorming and conducting deep dives, and you need tools for describing your processes. IHI, the Institute for Healthcare Improvement, is an incredible resource for all of these materials.
There's a quality improvement essentials toolkit that I highly recommend checking out, because all of these items in green on this slide are available on that website. I'm pretty sure they're all free of charge, and they are excellent, and you can really adapt them for whatever you're interested in doing. There are other programs available on the IHI website that are also outstanding, but they make a lot of this material available to the general public. But the core concept that we want to keep an eye on with quality improvement is: if you can't describe what you're doing, you don't know what you're doing. So that attention to really keeping an eye on what our processes are, what we are doing, and where we are in that process is just really important for an efficient QI process. You've probably, considering that half of you have done QI, heard of PDSA, or Plan-Do-Study-Act, cycles. That's a lot like that DMAIC that we looked at a little bit earlier. Some have really conceptualized PDSA cycles as the building blocks of iterative health care improvement. And it's really pretty straightforward, consistent with what I've already shared with you. The idea is you want to plan this out with consistency and consensus building across your multilevel stakeholders. You want to actually do something to change what you've identified as a group, then study it, or evaluate it. And then you want to act and figure out what you need to do next. And it might mean that you're doing several PDSA cycles. You need those tools for data collection. And it's important to start with your questions: what am I trying to understand? And then from there, decide which tools and measures are going to be appropriate and parsimonious for what you're trying to achieve. And of course, throughout, it is very important to consider the ethical implications of the methods that you're choosing and make sure that those are in line with your institutional parameters.
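At its core, the Plan-Do-Study-Act loop is an iterated cycle with an explicit stopping check against the aim. The sketch below illustrates only that cycle structure; every name, number, and "change" in it is hypothetical, not drawn from any real QI project.

```python
# Hypothetical sketch of iterated PDSA cycles: each cycle plans one small
# change, carries it out, studies the measured result, and acts by either
# adopting or abandoning the change, stopping once the aim is reached.

def run_pdsa_cycles(baseline, aim, changes):
    """Iterate PDSA cycles until the aim is met or the change ideas run out."""
    current = baseline
    for cycle, change in enumerate(changes, start=1):
        # Plan: the next small change to test is change["name"].
        measured = change["effect"](current)   # Do: carry it out (simulated here)
        if measured > current:                 # Study: did the measure improve?
            current = measured                 # Act: adopt the change...
        # ...otherwise abandon it and move on to the next cycle.
        if current >= aim:
            return current, cycle
    return current, len(changes)

# Made-up example: a referral rate starts at 0.10; the aim is 0.25.
changes = [
    {"name": "kiosk screening",     "effect": lambda r: r + 0.08},
    {"name": "provider fact sheet", "effect": lambda r: r - 0.01},  # no help
    {"name": "feedback reports",    "effect": lambda r: r + 0.09},
]
final, cycles_used = run_pdsa_cycles(0.10, 0.25, changes)
```

The point of the sketch is the shape of the loop: a change that does not move the measure (cycle 2 here) is dropped rather than adopted, and the team keeps cycling until the aim is met or the ideas are exhausted.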
There are lots of tools available for data analysis and interpretation: histograms, scatter diagrams, Pareto charts, many others. And a lot of those are available, again, on that IHI website and on other websites that focus on quality improvement. So it's not so much a matter of figuring out which is the right one to use or which is the best one to use, but rather identifying tools that are really appropriately aligned with the question that you have and with the change that you're trying to achieve. This may seem like a lot, but it's important when you're engaged in quality improvement, especially over the longer term, to keep track of your quality improvement activities. This is important for the sake of documentation, but it's also important because this might be a process that you decide to engage in, or that you already are engaging in, over time. And keep track of what the best ways are to do this type of work: who is really staying at the table (looking at consistency of attendance, for example); are you doing some type of rotating leadership of the meetings, and how is that decision being made; what's happening with task allocation. You know, taking minutes, decisions about measures, informing leadership: these are all things to keep track of, because they're going to inform your subsequent efforts. And they may be relevant to others in your organization who want to take this on and say, hey, we don't know how to get this going. How do we organize our meetings? What do we do about leadership of the QI project? What were the challenges that were faced? The idea being that we want to spread quality improvement approaches because they are really effective at addressing gaps and problems in clinical settings; the more information we have about the process of doing that, the more effective those processes will be.
Now, I touched a little bit on leadership earlier, and I think in doing evidence-based quality improvement over many years now, this is really at the top of the list, because so much is happening at the leadership level; they are the bearers of resources and often of decision-making. So leadership support is really critical to quality improvement, and it's helpful to think about creative ways to get on different agendas and different mechanisms for reaching leadership to share with them what you're interested in doing, especially if they don't immediately come to the table. Not everyone on your QI team is gonna be the best person to go to leadership. There are gonna be some who, because of their position or status or relationship to leadership, are going to be better messengers than the next person, because they just have that connection to leadership that would help get the message across. Now, each project is going to be a little different in terms of how often you want that reporting to happen, the mechanisms for that reporting, and what types of things you wanna share. You know, in some cases, you wanna emphasize more of the successes. Maybe you've developed some innovations that would be really interesting to leadership. Maybe it's not the best idea to say, here are all the problems we had. On the other hand, in some projects, it's gonna be very important to share that. But it depends to a certain extent on the relationship that you have to leadership and who has those connections. It's also really important just to think about different levels of leadership and how you might reach different levels in different ways with different types of information. So be strategic about what you share with whom at the leadership level. So Amy and I wanna walk you through a really quick example of how we used evidence-based quality improvement in a study that was related to improving uptake of evidence-based practices for those with serious mental illness.
This study, we call it EQUIP, which is Enhancing Quality of Care in Psychosis. This was what we think of as a medium-scale, multi-site implementation study. It took place over the course of 15 months, and it was controlled at the clinic level. For those who might not be familiar with VA, the VISNs that you see there are regions. And what we did was have a pair of sites in each VISN, one of which was an intervention site and one of which was a control site, for a total of eight sites: four intervention and four control sites. In embarking on this project, again, back to this idea of even figuring out what our targets for quality improvement are: our team went to these regional leaders and said, here are some evidence-based care targets that you can choose from for your patients with serious mental illness. They had five to choose from. They all chose the same two, supported employment and wellness. And we're gonna focus on the supported employment example. We're very interested in supported employment, particularly because it has such an incredibly strong evidence base. But the problem is that the evidence base is strong, yet patients are still not getting offered this highly evidence-based service. So a substantial minority of those with serious mental illness still are not getting the services, and sometimes they're not getting the jobs. But it always struck me that the only key to getting a patient into supported employment is the expression on the part of the patient that he or she wants to work. So it's really a simple thing, but it doesn't end up being simple in implementation. One of the things that we've learned through many, many trials of supported employment is that it is hard for these patients to keep their jobs. And what supported employment offers is ongoing support. And what we've learned, again, from a number of studies is that when patients are working, it improves many other aspects of their lives.
So we used evidence-based quality improvement to support implementation at those intervention sites. And our EBQI strategies included engagement of leadership and having opinion leaders at the clinic. This was really critical: people who bought into what we were interested in improving and also into the idea of improving quality of care for patients with serious mental illness. A big part of our strategies revolved around provider and patient education. We had a lot of feedback loops that were provided to both staff and managers, and we helped the sites establish local quality improvement teams. So our design, just really quickly, and then I'm going to shift over to Amy: our design was informed by a framework that emphasizes how organizations change in very similar ways to how individuals change. We used mixed methods, so quantitative and qualitative methods, to evaluate both the processes and the variations in implementation and effectiveness. And we collected data from both patients and staff, at baseline and follow-up. So in total, we had 801 veterans enrolled in this study and 201 providers and managers enrolled. And I'm going to shift over to Amy to share some of the work that we did.

Thanks, Allison. So going back to your idea of an efficient quality improvement effort, you talked a lot about deciding the aims and the measures. And so the aim for the supported employment aspect of this study was what we would all, as clinicians, want for our clients. We wanted more people to use the service, which is supported employment, and ultimately we wanted more people to be employed at the end. And as you mentioned, really the only criterion for entry into supported employment is a client saying, yes, I'm interested in returning to work.
So what we did to start this quality improvement effort, and to be as efficient as possible, is we just took a short period of time to understand, at each site, what the process is from finding out whether a patient is interested in working to getting them back to work. So we built what's called a process map. Now, a process map is a type of flowchart, so it's not more complicated than that. And a high-level flowchart, meaning one that's sort of more of a meta picture, is like the one that we developed here. It typically shows six to 12 steps, giving a view of the process. And flowcharts really show clearly the major blocks of activity, or major components, in a process. So these flowcharts are especially valuable in the early phase of a project, which is exactly when we used it in the EQUIP study. And so what we tried to do was, if you look on the left side of your screen, which I can show here: we started with, okay, a patient's desire to work, yes or no. If yes, then they move to a provider, who then would make the referral to supported employment. If they're not interested, then we ask them again at the next appointment. Provider referred to supported employment: do they refer them, yes or no? If they don't refer them, of course the patient has the opportunity to look for employment on their own. If supported employment has the capacity, they take them on: yes or no, do they have capacity? If yes, they deliver the service and hopefully get them to employment. If not, they go on a wait list until services are available. As we mentioned before, one of the difficult parts about supported, competitive employment for this population is keeping them in the job and retaining them. So sometimes the job ends, for a variety of reasons. We ask them again: do you want another supported employment job? If so, they start the whole process again, or they may go back to supported employment services.
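The yes/no boxes of a high-level flowchart like this translate naturally into a short decision function, which can make the branches easy to check. Here is a sketch of the supported-employment process map just described; the field names are hypothetical, invented purely to mirror the flowchart boxes.

```python
# Hypothetical sketch of the high-level supported-employment process map as
# a decision function: each branch mirrors a yes/no box in the flowchart.

def next_step(patient):
    """Return the next step in the referral process for one patient."""
    if not patient.get("wants_to_work"):
        return "ask again at next appointment"
    if not patient.get("provider_referred"):
        return "patient may look for employment on their own"
    if not patient.get("service_has_capacity"):
        return "place on wait list until services are available"
    return "deliver supported employment services"

print(next_step({"wants_to_work": True, "provider_referred": True,
                 "service_has_capacity": False}))
# prints: place on wait list until services are available
```

Writing the map down this way makes each drop-off point explicit: every `return` before the last one is a place where an interested patient can fall out of the process, which is exactly where a QI effort can intervene.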
So if they don't want a supported employment job, we ask them again later; if they do want another supported employment job, we send them back to services. So, let's see here, can I click forward? So there are two parts. One is the part under the circle, which is actually the services delivered and the competitive employment. Our project was not part of that. That existed at the clinic, and may exist at your clinic. And we found that the problem was that supported employment service people were saying, we don't have anybody, or we have too many people, or we have the wrong people. And so we really focused in EQUIP on this part under the bracket, which is identifying appropriate people, getting them referred to supported employment, and thinking about the capacity of the service itself. So we started, and you'll see in EQUIP, we've worked on each of these parts under the bracket. So first, patient desire to work. What we found out was nobody was asking patients. Or they had asked somebody and said, I asked them a while ago, they said no, but hadn't asked them in months. And of course, as we all know, as just human beings, our minds change. And so we have to routinely ask the patients. So we built a patient-facing kiosk, which was in the waiting room, and it had a color printer next to it. And every time a patient came, they sat at the kiosk, and for this project, it asked them about supported employment as one of the questions. And it asked at every single visit that the patient came. And so what printed out was a report like this, which the client got and took to their clinical encounter. And if they wanted to work, it would say something like: you have reported that you're not working, but you might like to. What can you do? Well, you should discuss with your doctor a referral to a new VA program that helps people find and keep jobs.
So we pushed them to the identification that they had a desire, that we had a service that could meet that desire, and that they should advocate for that service in their clinical encounter. So we're sort of asking and pushing the patient, helping them with the words to be able to advocate for the service, since we know that this clientele may have trouble advocating for evidence-based practices. Then we moved on to the provider. And again, these are all things that we're pushing on at the same time in this project in order to reach our ultimate goal of people being employed. So, provider services. What we did was we went to providers and asked them about supported employment services at their site. And here are some of the quotes. One person said, supported employment barely exists. Another person said, well, there's also the supported employment program, but that's for people that can actually already work in the community right off the bat. Another person said, I wish we could clone the supported employment worker. So they had a very positive feeling about it. So what we were seeing here was where the problems were with this part of the process map. What's the problem? Either people don't understand what it takes to be in supported employment — that is, patients don't need to already be able to work. In fact, many of them haven't worked in decades. They just have to have an interest. That's an appropriate referral. Some people felt like they didn't really know that supported employment was there, or who the people were. It sort of barely existed. And others felt like, ooh, we need more people working over there, because we've got a great person, but capacity is low. So there were a lot of different attitudes and knowledge about supported employment, and we really wanted to attack those. So we built this Fast Facts for VA Clinicians, which was about supported employment.
And you'll see that it has facts about, you know, supported employment being an evidence-based approach. Who is eligible? There are no exclusions. It's integrated with mental health care, and there are standards of practice for employment specialists conversing with patients' mental health providers. Job search is individualized, the goal is competitive employment, and supports are continuous. So we attacked the provider box on the process map with education. Lastly, capacity. As we saw in some of the quotes, some places felt like there was no capacity left, and some people were like, I don't even know if it exists; what's the capacity for sending people over there? So we began to use the data that patients were giving us from those kiosks in a way that we fed back to providers. And as Allison mentioned, this sort of feedback loop is very important. For example, here's a quality improvement report, which we were giving to the clinic leaders and clinic line staff as often as every two weeks, sometimes once a month: how many people were referred to supported employment at your site, and how many were actually seen? Here you see a huge drop between how many people were referred and how many were seen. And in fact, in this report, you can also see how many people expressed interest, so you see a gap there too. We've got 53 people with interest, but only 43 were referred. What happened to those 10 people? And then of the 43 referred, only two were seen. So you can see: is my change happening? And as Allison talked about, we really want to have these data so we know when we've been successful. So here's our baseline, and we know where things are falling out. We also gave them benchmark reports, because oftentimes clinics feel like, well, I'm in a different clinic; we have harder patients, or we have a better supported employment program, or whatever the case may be.
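The quality improvement report described here is essentially a referral funnel: how many patients expressed interest, how many were referred, and how many were actually seen, with the drop-off at each step made visible. As a rough sketch of the arithmetic behind such a report (this code is illustrative only and was not part of EQUIP; the function name and clinic label are made up), in Python:

```python
def funnel_report(site, interested, referred, seen):
    """Summarize drop-off at each step of a supported employment
    referral process: interest -> referral -> first visit."""
    lines = [
        f"Site: {site}",
        f"  Expressed interest: {interested}",
        f"  Referred:           {referred}  (lost {interested - referred})",
        f"  Seen by SE staff:   {seen}  (lost {referred - seen})",
    ]
    return "\n".join(lines)

# Counts mirroring the example in the talk: 53 interested, 43 referred, 2 seen.
print(funnel_report("Clinic A", interested=53, referred=43, seen=2))
```

Making the loss at each step explicit, such as 10 people lost between interest and referral, is what lets a clinic ask the targeted questions the speaker describes.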
So we often told them: here are other sites that are doing this project, and here's your site in terms of people who want to work being seen by supported employment. So for example, this one shows that a particular site had only 4% of their people who wanted to work being seen by supported employment, whereas other sites had a higher percentage. What's going on there? Why are you different? So it's important to be able to give them that feedback. In terms of EQUIP outcomes: for patient preferences, 85% of the patients in this study, which included only individuals with schizophrenia, were unemployed. That's a very high percentage, but I don't think unusual. 53% of them wanted to return to work. This was a surprise to many of the clinicians and to leadership, mostly because they hadn't asked in a long time. These clinicians were really working on recovery, and these patients were really doing much better; new medications were coming out, and patients hadn't been asked again: what about employment? And patients were ready. And only 6% had received supported employment services in the year prior to EQUIP. So we knew that really the sky was the limit; we had lots of room to improve, which was great. In terms of service utilization, remember that was one of our goals: get people to use supported employment. In this study, which lasted a year, individuals at the intervention sites were 2.2 times more likely to use supported employment services than individuals at the control sites. And of course we controlled for baseline utilization and the desire to work. So the EQUIP process was working. Patients were being asked and were given words to advocate, providers were learning more about what to do with patients who were advocating, and supported employment capacity was increasing.
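The benchmark report described above reduces to a simple per-site rate: of the patients who want to work, what share have been seen by supported employment? A minimal sketch of that calculation, again purely illustrative (the site names and counts are hypothetical, and `benchmark` is not an actual EQUIP tool):

```python
def benchmark(sites):
    """Given {site: (want_to_work, seen_by_se)}, return the percentage
    of interested patients actually seen at each site, so sites can be
    compared against their peers."""
    return {name: round(100 * seen / want, 1)
            for name, (want, seen) in sites.items()}

# Hypothetical data; one site lags well behind its peers, as in the talk.
rates = benchmark({"Site 1": (50, 2), "Site 2": (40, 12), "Site 3": (60, 21)})
print(rates)  # Site 1 stands out at 4.0%
```

Putting all sites on the same denominator (patients who want to work) is what makes the "why are you different?" question fair across clinics with different caseloads.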
In terms of getting people actually back to work: the intervention sites' employment rates increased from 11% to 14%, and the control sites had a slight decrease during that same period, from 17% to 16%. So that's a non-significant difference, but it's certainly going in the right direction. And given how long it was taking for people to actually get supported employment services, we were very pleased with this outcome of getting at least some improvement in terms of competitive work. What about the organization? Well, as I said, those data really helped identify where some gaps were. And we saw at one site, they actually hired another full-time employee to be a supported employment specialist. Other sites didn't have the money to increase capacity in that way, so what they did was reorganize care so that psychology interns at the site could provide supported employment services. They trained them and used them in those roles. That was really an excellent way to work around financial issues. And we increased supported employment capacity in another way, by discharging people from the supported employment caseload who were not engaging anymore in the services or who were no longer appropriate, that is, they had lost interest in returning to work. So there were a lot of different ways we shifted things around organizationally to increase capacity. I'll turn it back to Allison now to give us some final conclusions. Thank you so much, Amy. Even though we've been thinking about this project for so long, it always strikes me, back to one of our earlier themes, how important it was for the leaders to be aware of what was happening in these specialty mental health clinics, because these things around hiring and shifting staff and so forth, as we all know, can't happen unless the people who control those resources know why those changes might need to happen.
So it really reinforces that idea of keeping the line of communication with leadership open and having the data to show: if we do this, we're going to see these improvements, and that has all kinds of positive outcomes for patients. I never get tired of that example, even though we've talked it through and thought about it for many, many years. So just to wrap up, and we'd love to hear your questions and comments, so I'm going to go through this fairly quickly, but there's so much work that's been done in quality improvement, and we really know a lot at this point about what's important to QI success. Just to reinforce some of the points that we've talked about today: leadership is absolutely critical. This is from a systematic review of quality improvement efforts. An organizational culture oriented toward quality improvement, one that values such effort and being able to make changes, is critical to success. Some type of data infrastructure or information system matters too: to really be able to drive these efforts from data, you have to have that data available in one way or another. And I would add at least one person on the team who knows how to work with that data, how to access it, and potentially how to translate it for other members of the team. This systematic review also found that experience matters. Not everyone, of course, has to have years and years involved in quality improvement, but it helps if at least a couple of people on the team do have that experience, so that they can guide others in the process. We know from this type of work in clinical settings that physician involvement is critical, that that sort of micro-level motivation to change can really help things move. It can make or break whether people feel like they even want to embark on this. Some degree of resources is necessary, though not necessarily extensive resources, and so are people who will take on the quality improvement leadership.
There are many ways that leaders can support quality improvement, and we don't have to go through all of these examples here, but I think we've talked sufficiently about how critical it is for leaders to not only say, yes, I support quality improvement, but really put some meat behind that: do people have time to measure clinic performance? Are there opportunities to discuss successes and failures? Are we making sure that the clients or the patients are being heard in these processes and have some role in decisions about what priorities to target? And really thinking creatively about using existing resources to strengthen activities. For example, we learned in several projects that in some cases there are systems redesign folks who aren't even necessarily being tapped for quality improvement, and meanwhile, they've got all this expertise. So think a little bit outside the box in terms of who it might be interesting and important to engage in these activities. So organizational change is not easy, and it doesn't happen quickly. I think we have enough evidence to suggest that when change is aligned with the internal context, efforts toward quality improvement are going to go a little bit more smoothly, but it's never a completely smooth road. It does take time to make organizational changes. And it really is an iterative and incremental process of building short-term successes. We've observed this over many studies: when people start to see, oh, our team can actually change something, it might be a small something, but once that small change is made, it's so satisfying, and it's something we don't have to think about anymore. And that can build on itself, where you say, let's try it again with this other problem that we have.
So the successes really build on one another, especially if a group can establish a rhythm and an approach to doing quality improvement. What we would absolutely suggest is that we really do need to measure organizational change, because if you don't measure it, you really don't have a lot to say to your stakeholders. You can say, yeah, we changed a lot, and then your facility director, or whoever it is, says, well, how do you know? And if you say, well, we just know, that's probably not going to be sufficient. So you want to be able to say, we changed it from this percent to that percent, or we changed this process, and here's some qualitative data to tell you how we did that. But this change, we don't really expect it to happen quickly, and it definitely requires a lot of coordinated activities, again, with people who come from different perspectives and different disciplinary backgrounds. So thank you so much for listening. It's our pleasure to share this work with you, and we really look forward to hearing what you're thinking about and what your questions might be.
Video Summary
In this video, Dr. Allison Hamilton discusses the importance of fostering organizational change to promote the uptake of evidence-based practices. The video is part of a webinar presented by SMI Advisor, a clinical support system for serious mental illness. Dr. Hamilton highlights the need for quality improvement in healthcare and introduces evidence-based quality improvement (EBQI) as an approach to achieve successful and sustained improvement. She discusses different quality improvement approaches, including Lean Six Sigma and the Baldrige Performance Excellence Program. Dr. Hamilton emphasizes the importance of leadership support and interdisciplinary teams in quality improvement efforts. She also explains the use of PDSA cycles (Plan, Do, Study, Act) and the value of data collection, analysis, and feedback in driving quality improvement. The video includes a case example of the implementation of supported employment services for individuals with serious mental illness, highlighting the use of process mapping, education, and capacity building to improve assessment and referral practices. The webinar concludes with a discussion of the challenges and benefits of organizational change and the importance of measuring change to demonstrate success.
Keywords
Dr. Allison Hamilton
organizational change
evidence-based practices
quality improvement
Lean Six Sigma
Baldrige Performance Excellence Program
PDSA cycles
supported employment services
Funding for SMI Adviser was made possible by Grant No. SM080818 from SAMHSA of the U.S. Department of Health and Human Services (HHS). The contents are those of the author(s) and do not necessarily represent the official views of, nor an endorsement by, SAMHSA/HHS or the U.S. Government.