AOHC Encore 2023
105 Where‘s the Data? Building Data Sets for Continuous Quality Improvement
Video Transcription
Hello, everyone. I'm Charles Yarbrough, and welcome to the course, which is Where's the Data? Building Data Sets for Continuous Quality Improvement for Occupational Environmental Medicine Programs. So with that, we'll get started. Again, I'm Charles Yarbrough, and we have our panelists, and I'm pleased that David Cathcart is the co-chair for the Corporate Health Achievement Award. So he'll be introducing himself and the other panelists at that time. So I'm going to give you a little bit of background, just as a start. Then the main event will be the panel discussion. So I'm glad to see all our panelists are here, and we're ready to go here in just a minute. But ACOEM's excellence award, which is this, began in 1996, and I can tell you I was there, so I know. And it's been one of the longest-running award programs actually in the country, and certainly within ACOEM. So it was initiated through a joint collaboration between three ACOEM committees to promote OEM as a quality-driven medical specialty, which we should be and are. It was a mixture of committees: awards, business and labor education, and public relations. And actually, I guess you might say the father of this whole award was the president at the time, a good friend and colleague, Dr. Kent Peterson. And he brought me into this whole award development at the time. So the program was developed by this task force, which I was a part of, and it was chaired by an executive committee member. And funding was actually from seven charter sponsors as a pilot. So it started as a pilot program based basically on the Malcolm Baldrige award, if you're familiar with that. Then the co-sponsorship came in with very generous grants from GlaxoSmithKline starting in 1999. So that's when it really kind of began. So the purpose of what we call the CHAA, the Corporate Health Achievement Award, remains unchanged though. Over all these years, it's over 25 years now, it's been a champion to improve worker health, safety, and environmental management; to communicate the highest standards of excellence to the business and the professional community; to provide model organizations with visibility and validation of their efforts; to recognize continuing efforts and education around improvement; and to emphasize performance measures, positive outcomes, and continuous improvement. So that has not changed. Here is just a collage of not all the winners, but many of them, just to kind of give you the scope and breadth of the type of winners over the years. So you can see here everything from industry to government to healthcare, you name it. The methodology is defined. It's been defined, like I say, based on the Malcolm Baldrige National Quality Award idea. So it is not prescriptive, meaning it does not tell you exactly what you have to do, like a specification. It gives you guidelines on what needs to be done instead of, you know, how to do it. So we'll get into that more with the panel discussion. But it emphasizes measurable outcomes. That's the key. It has been based on the ACOEM scope of practice. As that has gone through variations and revisions, so has the Corporate Health Achievement Award. And they go back and forth and they match each other. It's a comprehensive award in three dimensions: environmental, economic, and social. Now that was added about 15 years ago as part of the sustainability, if you're familiar with that, Global Sustainability Index. Those are the dimensions.
So we added that in to actually give more of a business relationship to corporate suites or senior management. But the four areas have stayed the same, management and leadership, healthy people, healthy environment and healthy company. And you'll see that play through on the panel. And the points are awarded for programs, dissemination, metrics and outcomes and positive trends. And there's 1,000 points. And we'll get into that more in a moment. This is a slide. It's a very simplistic slide, but it gives you an idea. There's a number of elements, about two dozen items, we call them, in the award. And each one of those is scored separately. And all those items are actually captured within those categories I just mentioned. And on each one of those, we look for four different levels of maturity, that's what we call them. Programs, are the programs even there that should be there, in the opinion of the examiners. The program deployment, meaning is this just a program that's one location but it should be at others. Metrics and outcomes. Now we're starting to get into the data. This is where data starts coming in. And then, if you're using the data to do a plan-do-check-act cycle and improve and you see positive trends, or even trends that show you need to change and you have changed, that's a positive trend, or an evidence of continuous improvement. So you get up to 100% of points on each item, if you really do each one of those, which is pretty difficult to do. But again, we're looking for excellence and continuous improvement as opposed to the ideal. So, we were talking, Rocky and the committee, what would be a nice way to kind of give some examples quickly. So I picked out a few here, we talked about them, just to kind of give you a flavor of some of these. These are actually on our website, CHA.org. And again, everybody on the committees and examiners, it's all voluntary. So we're just simply going to show you some information that's been loaded up from presentations in the past, because the award winners would come to the AOHC and present. And these have been loaded. So these are, and others, are on the website. So here's, I just want to give you a flavor of them before we kick off the panel. So this one's from Johnson & Johnson, and they were looking at how are they going to improve and what do they need to improve on. That's the first thing, how do you decide what it's going to be? So they did an analysis of all their costs and injuries and safety statistics and so on. It came up with the top three causes that they need to work on to improve in their health and safety. So one was ergonomics, as you can imagine. The second was the slips, trips, and falls, which we all see plenty of those. And a third, and they talk about each one of these in their award, but the third one I thought was kind of interesting was the vehicle accidents. They picked up on that as being an important part of their, and you come to think of it, you see a lot of people in the pharmaceutical industry are actually out and about, you know, a lot of travel, in cars and so on, and fleets. So they did something called the Safe Fleet Program, and they described that in their award application. And they have a number of, as you can see here, a number of elements that they approached. And the data behind this, as I showed it, showed definite improvements in reduction in vehicle accidents. That's one example. Here's one from the Baptist Health of South Florida, a large medical center in the Miami region. 
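To make the maturity scoring described above a bit more concrete, here is a minimal sketch of how item-by-item scoring across the four levels (program exists, deployment, metrics and outcomes, positive trends) could be tallied toward a 1,000-point total. The equal weighting per level, the item names, and the numbers are illustrative assumptions for this sketch only, not the official CHAA rubric.

```python
# Sketch of a four-level maturity score rolled up to a 1,000-point scale.
# Assumes each maturity level carries equal weight within an item; the real
# CHAA scoring weights are not spelled out in this talk.

from dataclasses import dataclass

LEVELS = ("program", "deployment", "metrics", "trends")  # four maturity levels

@dataclass
class Item:
    name: str
    achieved: dict  # level -> fraction of that level demonstrated (0.0 to 1.0)

    def score(self) -> float:
        """Fraction of this item's points earned, assuming equal weight per level."""
        return sum(self.achieved.get(level, 0.0) for level in LEVELS) / len(LEVELS)

def total_score(items: list[Item], max_points: int = 1000) -> float:
    """Normalize the average item score to a 1,000-point scale."""
    if not items:
        return 0.0
    return max_points * sum(item.score() for item in items) / len(items)

# Hypothetical example: programs exist and are mostly deployed, but outcome
# data is thin and there is no trend evidence yet, so the score stalls near
# the 500-point plateau the panel mentions.
items = [
    Item("hearing conservation", {"program": 1.0, "deployment": 0.8, "metrics": 0.2, "trends": 0.0}),
    Item("ergonomics", {"program": 1.0, "deployment": 1.0, "metrics": 0.4, "trends": 0.0}),
]
print(f"Estimated score: {total_score(items):.0f} / 1000")
```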
And again, this is something that will probably relate to a lot of us here in terms of taking care of workers' comp cases and trying to figure out how to make those go well, particularly in the health care area. There's so much workers' comp there because of injuries from lifting patients and so on. So this was their data over four years, showing the changes in indemnity cost and medical cost. And there's a positive trend there. The idea here is that having data for one year gets you into that third category of measuring. Having data over time shows a trend, and that gets you into the area of positive trends, that fourth level of maturity. The next one here is a company called Cianbro. It's up in the Northeast United States. They do construction work. They work on bridges and things like that. The CEO got very interested in safety and said they were going to make a change in the way they approach the safety and health of their employees. And it's very much a family-owned kind of feel to the company, a small company. And unfortunately, one of his workers had fallen off a bridge and died. And he said at that point they had to make a change. So they made the change and applied for the Corporate Health Achievement Award, which they did win. And I thought this was a nice slide in a number of ways. It shows trends, and it's showing recordable incident rate and lost time incident rate. The red is recordable and the blue is lost time. You can see there's quite a bit of improvement there. And what's kind of good about this slide was that it showed what they did and when they did it. A nice way to graphically show basically a plan-do-check-act cycle. They would plan, do, check, act, and then they would do something else based on that data. And it got better and better. So I thought this was a very good example of continuous improvement. And then lastly, for my part, is Caterpillar. And I remember these days; I was Associate Medical Director there many years ago. And we really worked on health care costs. I remember the CEO at the time, this was back in the mid to late 90s, saying that for the company, if we continue our current trend, we'll be spending more on health care costs than we're making in profit. So it was a dilemma that had to be solved. And so that drove a healthy approach to the workers and their members. So here we have a graph showing the trend for national industry employers, the health care cost inflation per member per year. That was indexed. And that's the blue line at the top. And that's the projection that would come after 2006. And it's still increasing, as we know. And so their strategy was to use quality principles to eliminate that gap and bring it down to a goal, basically meeting the CPI inflation rate per member per year, and they were able to show that they actually came very close to their goal. So that's another way of showing trend measurements. And it was what was important to the company, too; at that point, it was kind of a do or die situation. And they did quite well. And won other awards, too, by the way. So with that, it's my pleasure to introduce the panel leader here, David Cathcart. I've got to get my panel up here. Hey, good morning, everybody. I'm Rocky Cathcart. I'm really David, but I go by Rocky to stay incognito. You know, I was told that we're going to do this presentation at 8:30 on Sunday morning. And I thought, wow, is there going to be anybody here?
I'm really surprised, and pleasantly so, that there's this much interest. So kudos to you guys for being here. I want to just have the panel discussion now. We're going to just kind of talk about some of the nuances of the award. Why are we doing it? What's the interest in it? What's the purpose? And then, hopefully, we can get some questions from you guys as well. Obviously, you must have some interest in data, some interest in how to improve your organization's occupational health programs, your employee health programs. So that's what the point of this is. So if you have a question, feel free to come up to the microphone, and we'll recognize you. I do have some kind of pre-prepared questions to get us started. So with that, I'm going to just introduce Dr. Lori Orlando. Could you just quickly introduce yourselves, and maybe tell us your background a little bit, and why you have some interest, and what's your affiliation with this award? Sure. Yeah, my name is Lori Orlando. I'm the Director of Vanderbilt Faculty and Staff Health and Wellness. I've been at Vanderbilt for about 15 years. And my interest in this award is that Vanderbilt won the award back in 2002. We were the first, I believe, institution of higher learning slash academic medical center to win the award. So it was a really prestigious honor to be able to win this. And even just the process was really beneficial internally for the organization. So happy to be able to be here to talk a little bit more about that. Awesome. Thank you. Dr. Buchta. All right. I'm Bill Buchta from Wisconsin. I was with the Mayo Health System for over 20 years, and spent 15 years in Rochester at the mothership. And while I was there, I was the Director of Employee Health, and doing a lot of clinical work as well. And after I got there in 2001, I knew about this award, and I thought, you know, we really ought to apply for this. I think it was two or three years later, we applied. And we can get into more detail about this later if you really want to know. We didn't complete it, because as we delved into it, and the various departments did their work, we found that we had some gaps, and we didn't qualify. So we never actually completed the application. But it was very valuable to learn that in the process. And as a result, we were able to beef up some of those departments. But I don't know if Melanie or Laura Bray are here, but maybe they should try it again. So we can talk about the why's if you want to do more detail on that. Dr. Yarbrough, do you have anything more you can add about yourself? Well, I've been with companies, many companies, and also spent some time at the VA. So I've been in a company that has applied. I've always been on the other side. I've been one of the examiners coming in. But again, from an examiner's perspective, I think it's important that this is a learning experience. The whole point and purpose of the award is not to win, actually. It was never to win, per se. It was never top priority. From ACOEM's perspective, it was an education opportunity. And it was always meant to be education. When you do the report with an examiner, a group of examiners who are skilled to do this in occupational medicine, you get an invaluable report coming back, educational-wise. So it was always about education. And the award is nice, don't get me wrong. A nice big award, and the CEO likes it, and top management likes it, and your staff likes it.
But again, it's about improvement, finding your gaps, like you just mentioned, and giving you a high-quality feedback report after. Well, some people get site visits. Even if you don't get a site visit because you don't reach a high enough score, there's still feedback, which I think is invaluable, particularly because of the skill and background of the people doing the reports. And it's a group thing. A big part of this is consensus. Within the examiner point of view, it's amazing. It's very interesting for the examiners to talk among themselves, to say, oh, I didn't see that, or I didn't know, I disagree. And it's a consensus process. So our training is very much on consensus and coming up with a score that we think is reflective of the organization. Okay, and I'm Rocky Cathcart. I'm David, but again, go by Rocky. I started off in family medicine, oh gosh, too many years ago. Made the switch over to occupational medicine, and then was on our hospital staff, and was our chief medical officer at the time. And we felt it important to show that our hospital had good quality measures and good outcomes. And so we decided to seek the Malcolm Baldrige Award, and if you're familiar with that, it's an award that's given by the President of the United States, a very prestigious award, and it's very difficult to get. But our hospital system decided to start on that journey to apply for the Malcolm Baldrige Award. We didn't win, we didn't even get a site visit the first two or three times that we applied. And then eventually we got site visits, and eventually we won, after several years of trying. The reason I'm telling you about this is because for me at the time, it was really about the journey, or what I came to understand was it's not about the award itself, it's about the journey. It's about the journey of quality improvement, about looking at data and trying to understand what the data is telling you and then making the changes that the data are driving you to make. And so that's what happened. So eventually I came to work for BP. I've worked for BP now for the past 10 years. In fact, my anniversary was just the first of this month. I love my job. I'm the U.S. health manager, so my role is basically supporting our clinical sites, policy and targets, quality performance indicators, that sort of thing. My interest in this award came from my boss, Dr. David Turner, I'm sure many of you know him. When I came several years ago, he said, you know what, we need to apply for this. We've got a pretty good program here at BP, and we need to apply for this Corporate Health Achievement Award. It's much like the Baldrige Award. So this is really my first introduction to it. This was probably six years ago or so. So we put together the data that we had, which was, you know, we're a global organization. We had sites all over the United States, refineries and chemical companies and others. So we started trying to pull the data just from simple programs, like hearing protection programs, like respiratory protection programs, benzene programs. And we found out that the data was so terrible, so sparse, that it didn't tell us anything. So while we thought we were doing well, we actually had no clue if we were doing well or not. So we decided to shelve that application. And then a couple of years later, David was asking me to continue to work on this. And so we did, we started trying to improve our processes, improve our data collection, and we tried again.
This time, I put together an absolutely beautiful application and had all this on shiny paper. I sent it to the committee and got back, I got back my critique, which said that your data is horrible, still. You don't have good data, and you're not, you've not been able to show, as much as you say your program is good, you've not been able to show it through data. And they were right. There was no question about that. So we're back to the drawing board again. I decided to try and understand what this process is more, which is why I applied to be on the ACOM CHAA committee, and so that's my role in this, and that's why I'm here. So I've got a few questions that I would like to pose to the panel, and I'm not going to direct these to anybody, and if it's okay if I call you by your first names, I probably will do that. So the first one is, what's in it for me? Why should I apply, why should my organization, why should me as the medical officer for my organization bother to take the time to apply for this award? Anybody? Who wants to take that one? Well, I mean, I think we've sort of touched on a lot of this, even in just the introductory remarks and the overview that Charles gave. It's beneficial to do for multiple reasons. I mean, first and foremost is the sort of TQI process, right? Because what you're really wanting to do is be able to do a full assessment of your program, and this really gives that to you. It gives you a very structured, formal way to go about looking at your program comprehensively, and not just the things that maybe you in your department do, but it allows you to really recognize all of the different partnerships that you have with the different entities, because it encompasses workers' comp and the partners you might have with workers' comp or risk management or infection control or environmental health and safety, and it really gives you a broad picture of your overall program. It helps you to identify where you're doing well, where you have strong data, maybe where your data isn't as strong, or maybe where you have gaps. And then it also gives you that objective third-party look at the program, because if it's your program and you've been doing it, you want to do what are best practices, but until somebody else can take a look at it, you don't necessarily know where you have gaps. So I think that can really help. And that's beneficial. I mean, you can make the case that's beneficial not just to you as you do your programs, but it's helpful to the bottom line, because if you can identify where there are gaps, that's going to justify to your leadership, here's what we need to be doing, or here's where we need to invest more. So I think overall, it just really helps make your program the best it can be, and that's what it's about. That independent view, really. That independent view can really be helpful for you internally, for your leadership, and it helps you recognize what data you need. I think the other big thing, and you mentioned it, Rocky, is what is the data that you have, and how are you using it? Because you can have a ton of data, and it's great to have the data, but if you're not putting it in context, and if you don't have a way to say, okay, what is this data showing us, how are we using it, what story is it telling, and how are we using it to make our programs better, then that's, I think, where you kind of, to your point, might, I don't want to say run into trouble, but might recognize that there's another step that you can take. 
And so this process and this program is really a great deep dive to program evaluation and quality improvement. Yeah, please. You know, as an analogy, I think of the process as like a mirror, and it's a very shiny mirror, so don't blame the mirror for what you see when you look in it, because it's pretty neutral, and it's the same for everybody, and if you look in the mirror and you don't like what you see, that's your opportunity to spruce up and make it better. But talking about data, everybody's got data, but it's not all, as you say, good data, and the endpoints that you choose, your KPIs, they've got to be meaningful. And I'll give you an example, if you use like a mandatory influenza vaccine for healthcare workers, back in, that was really vogue 15 years ago, and that was a big safety badge that hospitals would wear, saying, well, we're 97% vaccinated, all our healthcare workers. Well, frankly, that's nice, but does it really matter? Does that move the needle? The needle is how many patient deaths were prevented from doing that, how many hospitalizations were prevented, how many healthcare workers were, how much did that decrease your absenteeism, your presenteeism, things like that. You want to get to the real root of the issue. Forget about these surrogate bogus indicators, they have to have meaningful impact. So like I said, the mirror is very shiny, and you will see, if you really use it properly, you're going to see some things you don't like, but it gives you that opportunity to tune them up. So that really raises a great point. I'm sorry, go ahead, Laurie. Go ahead. I was just going to add on to that, you were talking about the data, and it's almost like, in clinical practice, you wouldn't order a test just to order a test, right? You wouldn't order it just to have that piece of information, you order it in the context of what's going on with your patient, what's the story you're trying to tell, what's the problem you're trying to assess, and then you take that piece of data and you add it to all of the other information you have, and you come up with a plan and next steps. And so, I mean, you can kind of look at this in the same way. You don't collect the data just to collect it, you collect it to utilize it, put it in context and utilize it to answer questions and solve problems. Can you give us another example? I mean, this is such a great point, and we didn't have this down as a talking point, so I'm sorry to spring it on you guys, but, and I'm sure a lot of people probably came to this meeting saying, I wonder what data I need to collect. Okay, so you collect the percentage of hearing screens that are done in your organization that need to be, you know, and timeliness. Does that really matter? I mean, it probably does, but maybe only in the context if you had several hearing loss cases before, you got loud noise maps, and then you, you know, you do that, and then you can show that increasing the percentage actually decreased that, then there's a good data outcome, but just the fact that you've got, you know, 97 percent compliance rate with hearing screens in and of itself is not probably very telling. So there's not specific, if you came here looking for what's the specific data I need to collect, you're not going to get it from this meeting. 
What you hopefully will get is some insight when you go back and say, I wonder where our pain points are, where our problem areas are, and what measures do we need to implement to tell us something that we can change, and then see that those measures improve. Do you guys have any other examples? Well, I'll just make a point. I think the concept you're dealing with, and I think T would agree with this, is that just because you're compliant with a regulation doesn't mean it's a good program. Everybody's compliant, right? Does that mean it really made a difference? Quality improvement is what it's all about. So sometimes you have to go beyond the standard to get real results, and that's what it's all about. But just stopping at the standard probably isn't good enough. There's a reason the standard was put in in the first place, right? It's for safety, it's for healthcare protection or employee protection. So to Bill's point, you have that information that the regulation says you need to collect, but then it's up to you to look at that data and say, okay, we've seen a spike in standard threshold shifts. Why is that? Because you don't just stop there. Okay, we've got X number of standard threshold shifts. The point is to look at that and say, what's the next step? Probably the best example that I remember from Mayo was the ergonomics program with – we had all the easy lifts and all these devices you push around, and it wasn't until we invested a ton of money into the ceiling lifts, and ones that actually weren't just chains like you have in an auto repair shop, they're actually designed for people, that we saw a significant drop in injuries in our healthcare workers. Now, that's a true KPI, that's a real measure, and it made a meaningful difference. And within two, three years, we had a 30% drop in lifting injuries amongst our healthcare workers. So that was, to me, one of our most significant impacts. I don't know if you've got the same experience. Something similar. Patient safety. Yeah. Just to put it in context, I'm an examiner looking at this, and hearing conservation is a good example. We'll come in and say, do you have a program? Yeah, we've got one. Which is hearing conservation, because we see that you need to have one, I mean, from a regulatory point of view. Or just because we know that industry to some degree, and know that could be an issue. And a second level of maturity is, are those hearing conservation programs monitored, done everywhere they should be? So that gets you up to about, it gets you to 50% of the points. So it's not just that data is everything. You've got to have those two building blocks to build on each other. It's all building on each other. But then that's 50%. So it's not as though that's unimportant. It's just, that's not becoming excellent, which is the TQI. Great. Yeah, I think, when I was looking at that slope that you're talking about, you have the program, you implement the program, and that's where I think a lot of people stall, right there. There are a lot of people with 500 points who never get past that. They've got the data, but it's meaningless, or it doesn't have a meaningful KPI. So if you're going to apply for it, you've got to go into it with that. As I pointed out, we did not complete the application because we didn't have that data. And frankly, it was our industrial hygiene program that just didn't meet up to the standard. And if you've got a major gap like that, it's probably not worth going on.
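As a concrete illustration of the kind of before-and-after measurement the panel describes (for example, the drop in lifting injuries after ceiling lifts were installed), here is a minimal sketch of an OSHA-style incident-rate comparison. The case counts and hours worked are invented placeholders for the sketch, not Mayo's actual data.

```python
# Sketch: compare an injury rate before and after an intervention.
# Rates are expressed per 100 full-time workers (200,000 hours), a common
# OSHA-style normalization; the numbers below are made up for illustration.

def incident_rate(cases: int, hours_worked: float) -> float:
    """Cases per 100 full-time equivalents (200,000 hours worked)."""
    return cases * 200_000 / hours_worked

before = incident_rate(cases=48, hours_worked=4_000_000)  # year before ceiling lifts
after = incident_rate(cases=33, hours_worked=4_100_000)   # year after ceiling lifts

change = (after - before) / before * 100
print(f"Rate before: {before:.2f}, after: {after:.2f}, change: {change:+.0f}%")
# A sustained drop over several years, tied to when the change was made,
# is the kind of plan-do-check-act trend evidence the examiners look for.
```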
But again, you're looking in the mirror, you don't see what you like. Make the changes, and then you reapply, just like Rocky did. Yeah, again. So, yeah, data collection is hard. I mean, it's easy to get the numbers, but then it's really hard to glean some meaningful information out of that that you can action. So that's really, I think, the point of what we were trying to say here. Go ahead. And also, then it gives you that process that you can implement just going forward, right? Because after that, it's not about applying for that external validation. It's, okay, I've gone through this process, and now I know what I need to do just as part of my program going forward to continue that on an ongoing basis. Yeah, the program definitely improves as a result of that meticulous look at data. Go ahead, Charles. When you look at that question, why should I apply for this award, I think the simple answer from my perspective is because you can. There's tons of information on CHA.org that talks about the whole program. It's not as though this is an idea that hasn't been documented. There's all kinds of resources, and applying is something that could be in the future. So just to start is the key, I think. But the resources are there, and they've been honed over time, and I think they're ready for you to use any time you like. That kind of leads me into the next question, which is what is the value of participation to the applicants? So we kind of talked about what's in it for me, how does this help my organization? But from the application process itself, anybody want to tackle that one? Yeah, I think you probably already know where your hickeys are before you even apply, right? And so this gives you an opportunity. You've got a manager that doesn't see it, you see it, and you think, well, let's get an external independent observer to give us that same feedback. And if they give you that feedback, now you've got something else to take to your manager and say, well, we applied for this award, we're not going to get it because we don't meet the standards here. This is another validation of the fact that we have some work to do here. So it gives you some clout in your own organization with that outside perspective. So you've got to have a little humility to do that. I mean, if you're going into it just to build up your ego, probably not the right place to go. Yeah, yeah. Because you will not come away feeling well. Well, Laurie touched on this a minute ago, and if anyone wants to tackle this a little bit more, unpack this more, for me, when we applied for the award, and I can't recite all the particular areas right now, Dr. Yarbrough might be able to, but there's multiple areas. What I found was that many of these areas were outside of my direct purview. So I had to go to HR to find data. I had to go to our IH people who I don't directly deal with, more of an indirect relationship. I had to pull in sources from our employee benefits program that I really didn't have. So I had to pull in all of these people from different areas, environmental, to look at, help me find data that shows that our program is good. And they sent me data that showed our program was good, but it didn't show any improvement, and it really didn't identify areas. So it wasn't meaningful data.
So does anybody have any other thoughts around what's the – because this is, for me, I thought was one of the things that really helped improve the quality of our – of health in our organization, mainly in that it connected me with other parts of the organization that I really did not have insight into. I've heard that over and over and over again in terms of doing all the examinations and a lot of them. And the whole process was linking parts of the organization that really were kind of in the same – doing similar things that is healthy, you know, healthy company, healthy people, healthy environment. It was – but they were in different silos, pretty much silos. And so the collaboration that occurred with, you know, multidisciplinary way was really considered one of the high points. Well, we were told over and over again it was like this success, but that was a benefit to them. I think on a little bit of the negative side, I think what you pointed out, Rocky, is that all the different aspects of your organization that you'd be looking at are not under your control, and a lot of them aren't. And some of the people that are controlling those aspects may not be happy that you're revealing what's going on in their department. So you have to be very diplomatic about how you do this. And if you have some – like some really siloed departments and some that are in conflict, you may be treading into some really murky water. So be aware that you're not going to be just dealing with your own department. You're going to be looking over your entire organization. So if there's not a feeling of cooperation and a general acceptance of quality improvement, it may not be your best choice. But, you know, that's one of the things, though, that helps lift up the health team and the importance of health in the organization because you have to have the sign off of your CEO. The CEO of your organization has to send a letter that basically is supporting this. So it starts at the top. Once that happens, you start getting noticed, right? Because you're pulling people in and say, we need data, we need data. And so it takes a little bit of a commitment of the part of the organization, but it's a good thing because it really helps get some visibility. You were gonna say something, Lori? I mean, yeah, well, full disclosure. We did this application well before I got to Vanderbilt, but fortunately, we do have a good working relationship with all of the departments that we needed to reach out to, but to your point, I mean, I think it's absolutely true and it's absolutely valid, particularly if there are those silos, sort of building those relationships, you do need to be diplomatic, but this is a good opportunity in those situations to have that in, so to speak, or have that initial opportunity to connect because I know there are, in some instances, you know, like, well, I wish there was a way we could work together or I wish there was a way to be able to approach this issue or this topic. And this, again, gives you an opportunity from a very objective, you know, fact-based, data-driven way to say, you know, here's a challenge or here's an issue that we can collaborate on and that we can work together to make the institution better and to keep our employees safe and healthy. And so that does provide, I think, a good, a good introduction to being able to partner more. Sure. You know, and we've talked about some of the benefits and I'm curious to know, at least for this group, what were some of the detriments? 
What are some of the detriments in terms of applying for the award? And when I say detriment, I mean barriers. What barriers are there that you might face in your organization to applying? Well, I'll lay out sort of the first and obvious one: there's a cost involved. It is not necessarily inexpensive, and I know right now I think everybody's in a situation where, you know, cost savings and being, you know, fiscally frugal is important. So that can be a potential barrier. So again, it's about making the argument about why this is beneficial in the long run, right? And I think as, you know, preventive medicine specialists, I think a lot of us are probably familiar with making the argument that, you know, this is an ROI that we're gonna see down the road for our employees, and so I think being able to lay this out as this is important for our organization because, to the point that was made earlier, we're doing all of these things, right? In some cases, it's because there's a regulation that requires it and that's fine, but we wanna make sure that we're getting the most out of it and that we are, you know, spending the money and the resources and, you know, making the investment in these programs. We wanna make sure that we're getting our return on investment and we are doing things the best way that we can, and so getting this external validation is a really important piece of that. And so I think if you're looking at getting that sort of C-suite or that, you know, executive leadership buy-in to make the investment, making the argument about why the investment is good for them as well is important. So since you mentioned the cost, I'll just, I'm gonna divulge it, okay? So if you've got an organization that's more than 500 employees, the cost is $4,000. If you have 250 to 500, I believe the cost is, I think it's 1,000, is that right? I think it's 1,000. So. Julie back here is telling us. Oh, is Julie back here? Oh, okay, all right, thank you, Julie. Oh my gosh, I'm so glad you're here to keep us straight. Julie, by the way, is part of ACOEM and has been extremely helpful in supporting this. Julie, raise your hand so everybody sees you. So, yeah, so what do you get for that cost? I mean, yeah, you do some introspective looking and you find data problems, but from a standpoint of, you know, you're getting an external consultation by a group of your peers, so you get an independent look, and they will send a report back that says, here's what we found in your application that's lacking, here's the good points, here's the weak points, recommendations on how to improve your program. So it's really a consultation fee, essentially. And it's not OSHA doing it. Yeah, it's not OSHA, good point. It's not the Joint Commission. I mean, I think some of your managers might think, we don't want this kind of visibility. Who's gonna find out about this if we don't get it? I think there's some confidentiality. I mean, if you don't get the award, that is not disseminated, right? No, yeah, and then I would say another drawback is finding the data, which actually is a good thing, okay, because now you're actually looking for the data. Okay, go ahead. Yeah, I would just add another; I think I've heard it over and over again. A big barrier is the time it takes to actually do all the background work. So you talk about the actual dollar costs, but it's really the indirect costs.
And I've heard a number of times that this took a lot more time than we thought because they had to actually build relationships. You had to actually go out and try to pull together a report and go to the different organizations and bring it into your application. And get other people to spend their time. Exactly, so it's a significant time spent. It can be done over several years, so it's not like it has to be done immediately, but there is time. So, before we run out of time, and I wanna make sure we leave some time for questions, but you know, the Malcolm Baldrige Award was started many years ago, really to kind of improve the quality of American industry. And I think the Corporate Health Achievement Award is also very prestigious, particularly among companies in terms of improving quality for the health of their employees and demonstrating that. So it's a prestigious award if you win it. And it's very valuable just to simply go after it. But I'm curious to know what the panel thinks in terms of how does this apply to or promote occupational medicine generally, more globally? What's your thoughts on that? Does this have some value in that area? Is it just another award? That's a really good question. Right now, ACOEM has a whole council on external relations and communication. We're trying to get our brand out there. And one of the ways we're trying to do that is through the media. And I think it's Tuesday, or it's Wednesday morning, I think we'll be giving our Emmy Awards to the journalists that have written the best articles on OEM issues. We did this last year as well. And this year, I think we tripled our applications. So this is one way we can get, if we get some journalists that are looking to us for topics, this is one of the topics we can push out to them. Hey, we have this award winner who met all these very stringent criteria and has proven to be a shining example of good occupational environmental medicine in their communities. And you can get some national exposure out of that once we continue to build this media relations program. How about continuous quality improvement? I mean, this would seem to me to be something other than OSHA, you know? And if you want that one to drive your quality, that's probably not the best thing, but. Well, no, but I think you bring up a good point, because the fact that it is a rigorous application, the fact that it is very structured and has definable goals and measures that are used, I think it does set itself up as a good example for what sort of a model occupational health program would consist of, so that folks can look to it. And again, even if you don't necessarily apply, or even if you apply but don't win, because again, it's not winning the award that is the ultimate, most beneficial outcome to the organization, but it's saying, you know, here's what is important in an occupational program. And it gives you that standard opportunity to look to say these are the criteria we need. And not just, you know, you look at the categories and it's economic, it's environmental, and it's social. So there's that wellness component of a program too that I think sometimes isn't necessarily thought of in the same context as the regulatory piece or the safety piece and some of those other components of an occupational program.
But again, we're a preventive medicine specialty, and so having that wellness component and having something that can guide you as to how that might fit in and how that should fit into your program is important as well. Yeah, my final comment would be what was just put up on the slide a second ago: the award promotes comprehensive excellence in organizational practice. And again, we're looking for what is done well, not what needs to be fixed. So it's positive reinforcement, it's not an audit. So that's a little bit different from what you think of as quality audits. So just, you know, it's a little different from that. I'll just end by saying I see some faces out here of people representing organizations that should apply for this program. Yeah, you want to point them out? Okay, we got time for some questions. I think we got maybe just a few minutes. Anybody have a question? Want to step up and yeah, just if you just step up to the microphone, if you could. We are being recorded. We can't hear you, is it working? It should be working. Yeah, so I'm curious about the process of the evolution of the criteria. You want to address that a little bit and just how up to date that is. So for example, a lot of us are very interested in climate related issues right now and every organization should be doing something about it. Is that part of the environmental criteria right now? So if you could address that process for the evolution of the criteria and then specifically, I'm curious if climate is part of the environmental criteria. Well, healthy environment has always been a part of the four categories, as we call them. And so it evolves with the scope of practice and the guidelines. So over the years, the ACOEM guidelines, statements, and the scope of practice have been incorporated back and forth into the criteria. I don't remember if climate change is in the criteria per se. I mean, it's one of the items, but it would be pointed out by the examiners if they felt it was relevant; it's gonna be different depending on what type of industry or organization you're looking at. So that's one of the gap analyses that's done with the examiners. Yes, sir. Hi, thanks. Paul McGovern, occupational physician based in London working for Google. I'm gonna just pick up the microphone. Yeah. Two questions. One, I'd be grateful if the panel had anything to say about the area of triangulated data, mixing qualitative and quantitative methods, looking at data from very different sources to let those different pieces of data kind of boost and improve or support the narrative, the story that that data is telling. And secondly, if you have any experience or examples of analyzing second and third order outcomes, say a first order outcome of a program might be reduced injuries, a second order outcome might be reduced workers' comp, a third order outcome might be reduced insurance premiums. So as we develop these programs, the narrative changes to one which is less clinical and more something which really gets a business leader excited. And I think those two things tie together. I'd be really grateful for some insights and thoughts on that. I think I'd like to hire you to help us with our next application so we can, sorry, I mean, that's very creative. Go ahead. It is actually very helpful. I second that.
I think we do use external resources as part of the committee as we develop and continue to evolve the criteria for the award. We're actually using, I know the Society of Occupational Medicine put out this value proposition, for instance, about two years ago, I guess, two years ago now? A little more than that, yeah. So we put that into our thinking in terms of the evolution. So we're using external resources, the International Labor Organization, and so on. There are a lot of these programs going forward. So we're looking at the JBI model as well as continuous improvement. And we'd love to have, you know, these views and people who want to contribute as we evolve these in a global situation. I was asked to speak for the Southeast Asia conference, an international conference last fall, which I did, and the topic they wanted was quality improvement in occupational medicine, which I spoke about virtually. So there is international interest. So this is an award that is U.S. based in a sense, it's for U.S. companies, but international companies like BP also have substantial facilities here in the United States. So it also has international appeal. So I love the idea of the order, first order, second order. That may ring a bell with a lot of people. Yeah, and I can maybe give one example of sort of how we've looked at our data to do that first, second order evaluation. And it goes back to the wellness component of our programming. We have a wellness program, we have a health risk assessment, we have a health promotions program that takes that information and develops programs to help minimize risk, reduce risk, keep people healthy. And we have looked at our health risk assessment data over time and looked at how our employees track, you know, what are their scores in each of these different areas. And then we take that to say, are they high risk, low risk, medium risk? And then we took that one step further and sort of modeled a process that Dee Edington had done as well and said, okay, folks who are low risk have X healthcare costs. Folks who are at medium risk have these healthcare costs. Folks who are high risk have these healthcare costs. If we can move them from high to low or high to medium and medium to low, then we can see the healthcare cost savings trending in the same direction. But the interesting other thing that we found is that even if you have someone who is higher risk and you move them to a lower risk category, the costs don't ever go back down to the same point they would have been had they stayed at low risk to begin with. So, you know, that just sort of lets you know you wanna keep folks at low risk. So you wanna have those programs that can help keep them healthy as opposed to just being reactionary and identifying things as problems arise. Can I just say, that was such a great question, and it sort of answers the earlier question about the criteria keeping up. While there are some general areas in the award that you have to hit in your application, the creativity around showing that is really up to you, and that's why I loved your question, and I'm almost serious about asking you to look at our data. I mean, that was great. That was, yeah, that was great, but it was a great example of, you know, what you can do when you go to look at your programs and how you can improve. Great question, thank you. But who's next?
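Here is a minimal sketch of the Edington-style risk-migration arithmetic described above: bucket members by health risk, attach an average annual cost to each bucket, and project the effect of moving people into lower-risk tiers. The dollar figures and head counts are invented placeholders for the sketch, not Vanderbilt's data.

```python
# Sketch of a risk-migration cost projection. Average per-member costs and the
# number of people moved between tiers are hypothetical placeholders.

AVG_COST = {"low": 3_000, "medium": 5_500, "high": 9_500}  # assumed $/member/year

def projected_cost(counts: dict) -> int:
    """Total projected annual cost for a given risk distribution."""
    return sum(counts[risk] * AVG_COST[risk] for risk in counts)

current = {"low": 700, "medium": 200, "high": 100}
# Scenario: wellness programming moves 50 high-risk and 50 medium-risk members
# down one tier each, so the total head count stays at 1,000.
improved = {"low": 750, "medium": 200, "high": 50}

savings = projected_cost(current) - projected_cost(improved)
print(f"Projected annual savings: ${savings:,}")
# Note the panel's caveat: people who migrate down rarely return all the way to
# the cost level of those who stayed low-risk, so keeping people at low risk
# matters as much as moving the high-risk group.
```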
I'm sorry, we're running low on time. I don't know if he's gonna kick us out or not. One more minute. One more minute, okay. All right. We can hang around the back if you have questions. Okay, so go ahead. I don't think I'm gonna have a question. Me neither. I don't think you're gonna hire me here. Fred Houser, Kennedy Space Center. So some of us deliver our occupational medicine as a product, as a contractor, you know, to other organizations. You talked about stovepipes, and my big issue here is trying to figure out how I do this while working with another company that manages the IH, that manages the HR, that manages this. They may see this as intrusive, and I'd be concerned about that. Have you had any luck with organizations that deliver occupational medicine as a product, as a contractor, being successful with this? Because I love the process. I'd love to have that data. The award sounds great, but I like the idea of doing it because one of my big concerns is I feel like I'm operating in a stovepipe, and this sounds like a great way to get away from it. Yeah. Yeah, as we've outsourced, you know, Kennedy, Buck Ferguson, with the NASA space flight center back in Florida, was a big part of this. And they had a big quality program with NASA, one of the ones that did this early on. And we used that as an example. But again, we've gone to a lot of outsourcing. I think what you're describing is that everything becomes outsourced to the various vendors, and now it's a vendor management issue. So it's trying to find the correct people who are involved with vendor management, because the vendors will do what, you know, you contract them to do. And to get those contracts changed, to have this feed into an award program, or an excellence program, which is really what it is, is difficult. And I think it's trying to find the right, again, it starts with the first category. The first category is the first category because it's management and leadership. And then trying to identify who that is, and sometimes it's not very clear. Or sometimes the CEO has to assign somebody to that and have that happen. That's the only way I've seen it actually work. We're going to have to cut off, I think. So I'll tell you what, we'll be hanging here in the back of the room if you guys want to talk or wherever. We'd love to take your questions. Thank you guys for coming. I think this is an important topic. Thank you.
Video Summary
The video is a panel discussion about the Corporate Health Achievement Award, which recognizes organizations that excel in occupational environmental medicine. The panelists discuss the benefits of applying for the award, including the opportunity for comprehensive evaluation of the program, identification of gaps and areas for improvement, and validation of efforts by an external organization. They also highlight the importance of data collection and analysis in the award application process. The panel addresses the barriers to applying, such as the cost and time required to gather the necessary data and collaborate with different departments. They also discuss the value of participation in promoting the field of occupational medicine, both internally within an organization and externally through media exposure. The panel emphasizes the focus on continuous quality improvement and the impact of programs on various outcomes, such as reduced injuries, lower healthcare costs, and improved business performance. The criteria for the award evolve over time, incorporating standards and guidelines from occupational medicine organizations. Triangulated data, combining qualitative and quantitative methods, is valued in demonstrating program effectiveness. The panel concludes that applying for the award can help organizations improve their occupational health programs, promote collaboration across departments, and demonstrate excellence in the field of occupational medicine.
Keywords
Corporate Health Achievement Award
occupational environmental medicine
comprehensive evaluation
data collection
data analysis
barriers to applying
program impact
lower healthcare costs
improved business performance
program effectiveness