AOHC Encore 2022
107: Teaching with AR and VR
Video Transcription
Everything is okay, great, you can hear me, excellent. Okay, well, thanks everybody for bearing with us there for a moment. There's a little bit of irony in having technical challenges as we talk about a highly technical topic here, teaching with VR and AR. The point of this session is to really get you up to speed on what VR, AR, mixed reality, and XR really are, how they're being used currently in medical education, and how they will be used over the next five, 10, 15 years. And I think this is for anybody who's potentially interested in starting an education program that utilizes these technologies, or even if you aren't and you just think it's neat, I think that's an appropriate reason to attend as well. That's totally cool. As we go along, feel free to ask questions in the chat. I do have it up here so I can monitor that. We will leave some time at the end for questions as well.

But before we jump too much into talking about, I think, the future of medical education, let's talk a little bit about the past. So if you were trained to be a doctor or a nurse or an EMT or anything like that back in, I guess this looks like the 1940s perhaps, then the way that it worked is that you had some didactic sessions, right, where you learn just the kind of specific knowledge you need to know. Then you might have a little bit of simulation on pretty limited plastic mannequins. But then you get most of your training on real patients. Now fast forward to today, and it's a whole different world. Of course there are still the didactic components, and it is true that we still do really a small amount of simulation on pretty limited plastic mannequins. And it is true that in the end you still get most of your training on real patients, but everything is in color and the facilities are nicer. So it's really remarkable when you think about the fact that we now have supercomputers in our pockets, we have cars that can drive themselves, we have robots doing backflips, and yet we are still training people to save lives in the same way that we've been training them for closing in on 100 years now, which is really crazy.

So is VR going to solve all these problems? Yeah, sure, I mean, I'm a little biased, but I think it actually is going to have a pretty big impact on the way that we train people in the healthcare system. Is it also going to lead to a post-apocalyptic wasteland where everybody's disconnected from each other? Potentially, but that's beyond the scope of this presentation; those are interesting philosophical questions for another time.

Now, a little bit about myself and my background, and this is, I think, an important disclosure as well: I am the founder and CEO of SimX. We make VR and AR medical simulation software, and it's designed for nursing training, physician training, EMS training, and military medics, and it's kind of like a holodeck-style simulation where you're walking around and engaging with your virtual patients that way. However, I'm also an academic. I'm an emergency medicine physician and assistant professor out at Stanford, and despite the fact that I am an academic, I think this presentation comes less from the academic perspective and more as kind of like notes from the front lines, right?
My formal training is mostly in clinical medicine and then like administrative quality improvement stuff, not so much in the science of education, but I've been doing this now for about a decade, so I have a pretty good sense, which is about as long as you can do it, actually, just the technology's not really that old, and so I think that what we're gonna share and talk about today is really just from the perspective of someone who's been working on the front lines, and of course, this is not a sales pitch in any way, so we're gonna keep it neutral. Now, so why VR? Now, I think for me at least, my background prior to getting involved in this space was in quality improvement, patient safety. I worked for CMS and AHRQ, building packages of metrics and things like that to try to make care safer, and these are some of the pretty horrible statistics that exist around how many preventable deaths there are in the United States due to medical errors, these are some of the more alarmist ways to present this data. I think all of us probably on this call are aware that this is actually a very hard thing to calculate. Regardless, though, it is too much, and one of the things that I realized working for CMS and building metrics and things like that was that in that type of regulatory world, like a huge success is a program that increases safety by 5% or 10%, and what we really need is something that increases safety by 70% or 80%, and those types of changes I think are not gonna come through little tweaks to regulatory programs and better metric selection. You know, it's just really not the nature of those types of interventions. I think if we want that type of sea change, we need to look towards simulation. I mean, that's how the airline industry went from being relatively a risky endeavor to get on the plane, to now I think everybody is aware that flying across the country is safer than driving down the street, and a lot of that came from heavy use of high-quality simulation, and both of those things are important, both that the volume of simulation is high and that the quality, the fidelity is high. That picture on the left there is not a real cockpit. That's the simulated cockpit, and I think that's just remarkable how similar their practice and training environment is to the real thing. Now, of course, we know we don't have that. Our simulated environments look like this, right? We have these kind of glorified CPR dummies that vibrate to tell you that they're seizing, or they have little blue lights in their lips to tell you their oxygen levels are low, but they can't have missing limbs. They can't really have rashes. They can't have neurologic symptoms, and perhaps worst of all, you have to actually build kind of a fake hospital room around them if you're going to do simulation and take ventilators and defibrillators out of circulation so that you can preserve them for sim, and all this is a very expensive endeavor. The mannequins themselves are $100,000 to $200,000 often, but the fact that you have to build a simulation center around them to use them means that it really takes a lot of resources to start up a traditional simulation program, and additionally, the logistics of it are such that it's really hard to use frequently, and even if you have a multimillion-dollar sim center, it's pretty rare to have more than three or maybe four sim rooms that you can use simultaneously, and so because of that, sim becomes an event, like something that you do every quarter. 
You know, programs that are very prolific in sim might do it once a week, but this is still relatively low-frequency sim compared to what I think we would like to do. Now, that's where VR comes in, right, because your patient can be a baby or a grandmother. They can be vomiting. They can be missing limbs. And then you have a lot of environmental flexibility too. You don't have to just build one ED room and one OR room. You can do cases like this one at the top of Everest, or scenarios where you're in a moving transport helicopter, or pulling someone out of a burning car and resuscitating them on the ground, and especially for field providers, I think that environmental complexity is really valuable.

And then there's the fact that you can integrate any tool. I think simulationists, if there are any here in this session, know that we're kind of known for being scavengers, right? It's like, oh, you opened that central line kit and you're not going to use it? Why don't I take that down to the sim center, right? It's hard to get the equipment that you want, but in VR, you can build a REBOA into a scenario, whether you have a REBOA device in your hospital's possession or not, and you can train around it. Or, I actually think this scenario involves a Gamow bag, which is pretty rare equipment but critical in a high-altitude sickness scenario. So it gives you a lot of flexibility in what you can incorporate.

But the real power, I think, is in this middle section, the fact that with these new wireless headsets, you can really set up a mobile sim center anywhere, so you can lecture in the front of the room and sim in the back. You can make it part of small group sessions. You can send people home with headsets, and they can join together over the Internet and work together around the same virtual patients, and that kind of flexibility is what's going to allow us to 10X the amount of sim we integrate into our training and start integrating it into practice and skills training over the course of your career. And then, of course, it's a lot cheaper, so it's going to make simulation, I think, more accessible to the developing world but also to smaller nursing or EMT programs that don't generally have the big sim budgets in order to build big sim centers. So I think there's just loads of potential for this to enhance realism in many important ways and also be easier to use and also be cheaper, and having that trifecta of better, easier, and cheaper is something you really only get when there's a big leap in technology, which I think is where we are right now.

A couple of other important and valuable points about VR: it provides you a lot of opportunity to track performance in ways that are very challenging in traditional sim.
Historically, you generally have to have an instructor who's basically eyeballing the scenario at all times, and then you do a debrief that is based on everybody's recollection of events, whereas in VR, you can track objectively what was done, what was not done, timestamps of everything that happened in the scenario, when you talked to the patient, when you didn't, when you picked up a stethoscope and listened, or if you didn't, and all of that is objectively tracked and can be reported back and utilized in a debrief, so I think that there's a lot of potential there, too, to improve our ability to utilize simulation scenarios to learn and incorporate into our clinical practice, and then there's also the information-sharing component of it, the fact that, and this is just a handful of institutions that our organization and a few others have worked with around VR, but the fact that you can share information very easily. You know, in the sim community, traditionally, you have sim specialists at every single individual program building, in many cases, really the same types of sims. Everybody needs to have some ACLS, some NRP. Probably in the COVID era, people have independently built hundreds of COVID-related scenarios that they are not really sharing with each other because there's not really a mechanism to do so in traditional sim, but in VR, a sim becomes like a level in the game, becomes something that you can share with others, that you can distribute across the country and across the world, and I think will move us towards a sim community where you can have, you can be learning trauma from, you know, frontline army physicians. You can be learning infectious disease from physicians who are operating in, you know, in like tropical environments. So lots of, I think, information sharing that will let us build on the knowledge base that exists in the sim community and advance our ability to be able to utilize and design scenarios in ways that will train people well. And then, you know, the other reason to consider VR is because the evidence is there, and it's growing substantially, but we're already at a point that people are doing meta-analyses, and I mean, overwhelmingly, VR turns out to be better than traditional sim, I think, for memory retention, and this makes sense, right? The more that you can model the training environment in such a way that it reflects the actual practice environment, the more accessible those things that you've learned in their training environment will be when you get to the real world. And there's, I think, just a lot of little visual cues and environmental aspects that you can incorporate into VR that is just very hard in traditional sim. And then finally, the other reason to consider VR is because there's things you can do in it that you just literally can't do in traditional sim. The fact that you could do a mass casualty, multi-trauma scenario in just a blank, empty room in VR, and that would be prohibitively complex to set up in traditional sim. People try to do mass casualties, but they're extremely limited. Or I think something as simple as neurologic presentations, very hard in traditional sim, even though it's something we see in actual medicine all the time, that's just because mannequins can't have neurologic symptoms, and standardized patients can't really have realistic neurologic symptoms, and so we just don't train around it. 
Or incorporating family members and other ancillary providers into scenarios is something that we do occasionally, but I think one of the more powerful use cases I've seen for VR is that we've made some scenarios for pediatric ICU docs where the family members are there and they're crying, and you have to explain to them what you're doing while you're resuscitating their child. And yeah, you could do that with traditional sim if you have the baby mannequins and the fake ICU set up, and you have trained actors come in, but that's something that is an event. It's something you can do once or twice a year, whereas with VR, you can pop somebody in a headset and do that 10 times a day every single day with variations on it, and do that very easily in any empty room. So texture rendering accuracy versus gameplay interaction, which is better, bang for your buck, with wireless headsets? Are you asking between wireless headsets or from the HTC headsets versus the Oculus? Are you asking, it almost doesn't matter, because my answer is the same, which is that anyone who is operating in the U.S. should probably get an Oculus Quest 2 if you're considering hardware. I have no affiliation with any of these companies aside from building software for all of them. The software we build is compatible with all of them. The Oculus Quest 2, I think, is clearly the best bang for your buck in terms of the processing power, and we'll get to this in a minute. I would not recommend getting a wired headset in the current environment. Under any circumstances. There's not a good argument for it. The wireless headsets can be wired to a computer if you want more processing power. And so really, you are paying more for more limitations if you get a wired headset. Pico Neo 3 is a great headset too. That's very popular in Europe because it does not have the login limitations that you do with Oculus because it's a Facebook-affiliated headset. So the Pico Neo 3 is very good. I'd say it's a strong second choice. It is more expensive than the Oculus Quest 2, and it is slightly harder to use, I would say. And then your other good option in that category is the HTC Focus 3. It's also a wireless headset with very similar specs. Also just slightly harder to use than either the Quest 2 or the Pico Neo, but not, I mean, I think anyone with any degree of technological capabilities can figure it out. So let's actually take a step back and talk about what is VR, what is AR, what is mixed reality? And I think this single graphic helps, I think, distinguish between these things. When we're talking about VR, we're talking about a situation where your entire world is encompassed with virtual content. There's no real-world content. This is like your Oculus generally, so it's all virtual content. That one's pretty easy. Now, AR and MR, this gets a little bit trickier for people to distinguish. To be fair, there's not really one 100% agreed-upon definition, but the definitions that make sense to me, that I like, are that AR is where the real world is the star of the show. So you can see in these examples here, you're holding your phone, looking at some food, and it's showing you the nutritional information about that food. Or Google Glass, if you saw those old commercials where you'd look down the street and you see the Yelp reviews coming out the restaurants. The real world is the star of the show, and the virtual content is there to augment. Mixed reality is where the real world and the virtual world are intertwined. 
And so things like Pokemon Go, though sometimes called an AR app, is really more of a mixed reality app, if you have used that, because Pikachu is there in the room with you and you can walk around him. So that is more of a mixed reality. And then XR is a term that just encompasses all of these things, any combination of virtual world and real world. One more term that's important when we're talking about VR specifically is 3DOF versus 6DOF. You might hear people say that. That's three degrees of freedom versus six degrees of freedom. So there are VR experiences still that are only three degrees of freedom. These are generally the things where you put a phone into like a cheap headset that you buy on Amazon, and it tracks your up, down, left, right movements. But if you try to walk around the room, it can't tell that you're walking around the room. Whereas 6DOF tracks you in six degrees of freedom. These generally come in two flavors, either the wired VR headsets where you have external trackers, or the wireless headsets, and these have little cameras on the front that they use to basically have continuous video of the room and they use that to determine how you're moving around the room. And in the 6DOF experiences, you can walk around inside of the VR, AR, MR experience, and it can track your positioning in 3D space. So how are these things being used in medical education currently? So the very first use case I ever saw for VR in medical education was here in the upper left. That was people using VR augmentation for surgical trainers. I think it's kind of, like many first implementations, it's kind of cool in a whiz-bang way because it's fun. It is actually not that practical in my view, and this is no insult to these particular companies, but when you're doing a real laparoscopic surgery, you don't have a VR headset on, you're looking at a screen. And so this is actually a worse approximation of your actual practice environment than if you did not have the headset on at all. So, but I keep this as an example because I think that it's very common when you have a new technology and you probably, there are many other examples one could think of, of attempts to use VR, shoehorn VR, into medical education context where it doesn't actually add a lot of value. Now here in the upper middle, this is also, I think, one of the early use cases, which was anatomy training. And I think this actually makes a lot of sense. There's a lot of commercial products out there for anatomy training in VR. It is not quite like doing cadaver anatomy, but it still does give you that sense of 3D space and the ability to kind of peel back muscles and see what's underneath and see how things are connected. And so I think this is actually a great use case for VR, pretty valuable for people who are visual and spatial learners. Then here in the upper right is another use case that was relatively early, which is 360 videos. And if you, this is one that you can try usually with a cheap headset and your phones. There's lots of 360 videos, even on YouTube. And that might seem like it would not be that useful, but actually 360 videos are very, when they're done well, they're very all-encompassing. They really make you feel like you were there. And so a couple areas where it's very useful is for surgical training, because as many of you know, if you are a trained surgeon just trying to learn a new procedure, very often they will fly out to a hospital that is doing that procedure and watch in the OR. 
But with 360 video, there are procedures that are being broadcast in live 360 video and surgeons across the world can log in and watch it in VR. And it does feel like you are there and you can have the added advantage of being able to switch from looking over the surgeon's shoulder to now you're looking from the anesthesiologist perspective to now you're at the end of the bed if they have multiple cameras in the room. And so in some ways it provides even advantages over being there in person. Another area where 360 videos are helpful, we're using these at the Children's Hospital at Stanford to help with patient education. So for the kids who are going into the OR or going into the ICU, they can put on a headset and feel like they're in the OR for a little bit or feel like they're in the ICU for a little bit. And that way, when they are rolled into the OR, it's not the first time that they've been there. And when they wake up in the ICU and they hear the beeping of the machines, it's not the first time that they've seen or experienced that, and it's way less scary. And it really does work well. So that's, I think, patient education is another great use case for 360 video. Here in the bottom left, this is representative of the concept of scenario simulation in VR, which is obviously the area that I am most familiar with. That's where my company operates. This is not my company's product in this screenshot. But the idea of having a virtual patient in front of you that you interact with and talk with and run through a scenario with, that I think also obviously is a big use case for VR. And then here in the bottom right, I think this is an interesting one. This is a new company that I have no affiliation with called Catechist, and I include them just because they have, I think, an interesting concept that doesn't really have an analog outside of VR, which is that you put on a headset as an instructor and you record yourself demonstrating a procedure. And then your students will then put on a headset later on, and they will see you in VR demonstrating this procedure, and they can walk around it, and they can look at it, and they can rewind it and slow motion it. And so it's almost like a video that you can walk around inside of. And so I think that's kind of an interesting concept and is kind of an innovative way people are using this technology. Now, moving on to AR. And remember, if we're talking about AR, we're talking about where the real world is the star of the show. And with that definition, there's really not that many use cases that are active in medicine. Here in the bottom left is probably the only one that's in real clinical practice, which is the idea of using your phone or something to augment a textbook, and you get 3D models popping up, and that, I think, is interesting and useful. Here in the upper right, this is a real use case that is active in manufacturing and machine repair, where you can put on an AR headset and go down to a broken machine, and it will tell you, all right, undo these screws first, and then, okay, now pull back this door, and then check this wire, et cetera. And I classify that as education because it is just in time, real world education. And people here in the bottom right, this is a mock-up, this is not real, obviously, but it's representative of that same type of concept in surgery. And people have talked about this for a long time. 
What if you could wear an AR headset, open somebody up, and you can see their CT scan images projected over the area, and it can tell you where to drill or where the tumor is, and you can cut around it? And unfortunately, we're just extremely far away from that. As you can imagine, with a machine, you can just upload the CAD files into the system. It's very easy for your glasses to be able to recognize where the screws are and where the door is because it's the same every time for every machine. The human body is so heterogeneous that really the barrier is being able to train a computer vision algorithm to be able to recognize an arbitrary spine under any circumstances that you might encounter in the OR. And even for a given person, even if you have their CTs and MR scans that you can upload for training purposes, it can just look so different depending on the angle and the lighting and what your approach and things that we're very, very far away. You would also need to have extreme precision. I mean, if this is something that would be intended to be used to say, here is where you should cut and here's where you should not cut, then obviously your precision needs to be 99.99 plus percent because you don't want people cutting incorrectly based on your advice. And so this is all just to say that, unfortunately, this use case is pretty far away. Here in the upper left is something that is maybe a little bit more doable, which is just the idea of being able to use AR glasses. That's kind of a hands-free way to access patient information while you're in an encounter. And this is an OR example, but people have proposed that for outpatient clinics and things. And I think that the technology is doable for that. I think the question for that is whether there's really added value above and beyond getting that information from the EMR, given that you are still probably going to be charting in the EMR regardless. Or if you're in the OR, then your replacement for this is just a non-sterile nurse who can navigate the EMR for you, and that works pretty well. So given that there are pretty easy alternatives to this, I don't know that this is something that's going to take off either. Now, mixed reality. So there are a lot of mixed reality headsets out there. HoloLens would be classified. Sorry, I'm not sick, but I got COVID a couple months ago and I'm still coughing, unfortunately. I should go to the, I saw there's some long COVID sessions at this conference. Maybe I should go to that. But the HoloLens is a mixed reality headset. The Magic Leap is a mixed reality headset. Windows has a bunch of mixed reality headsets that are made by a variety of companies. And remember, this is where the real world and the virtual are intermixed. And there are, I think, more use cases for this in medicine. These two pictures on the left are both CAE products. One where you can overlay onto a pregnant mannequin, a virtual fetus, and you can see it's station change as it goes through the stages of birth, which is kind of interesting. Or here in the bottom left, you can take basically an empty plastic mannequin and a non-functional ultrasound probe, and it can show you ultrasound images based on recognizing the way that you're holding that probe and where it is located on this plastic mannequin's body. And additionally, you can then see through that and see the organs of that mannequin, so you can see how, as you fan through these organs, it's changing your ultrasound image. 
I think that's a great use case for mixed reality. Here in the upper right, this is not a commercial product, but it's an academic project out of Europe, where as you are doing chest compressions, you are seeing the blood pressure building up in the circulatory system and then eventually perfusing the brain, which really helps, I think, hammer home the lesson that you need to be compressing for a while before the brain starts getting blood. And so I think, again, that's a great use case for MR. Or here in the bottom right, these are actually early experiments by me, where you can project a virtual patient onto an empty hospital bed and then use a real stethoscope to listen to your virtual patient's heart. Now, that might seem like, well, that's got to be the holy grail of simulation, because then you are really practicing with the same tools and in the same environment where you're going to actually be operating clinically. And that's probably true. I think over the next five to 10 years, that's going to become a very valuable use case.

The thing is, there are some substantial limitations to mixed reality headsets right now. And I mention this because, if you're contemplating starting a program, many people think, well, mixed reality is the best of both worlds, right? I should definitely go with a mixed reality headset. And I do encounter a lot of places that have invested heavily in HoloLenses, for example, and then only discovered the things I'm about to tell you about after they have made these purchases. So I think it's just good to go in with full knowledge of this technology. The way that these are constructed is that there are small projectors with a very short throw that project virtual content onto the glass in front of your face, and all of the mixed reality headsets are generally some variation of this. The result of that is that the field of view is actually very small. This is not apparent when you are watching demo videos on the internet, but if you put on a headset, it becomes very apparent. And this is approximately what it looks like, actually. So it is not even that the virtual content disappears in your periphery; it's really a pretty prominent part of your field of view where virtual content cannot be projected. So, for example, in the use case that I was just showing where you have a virtual patient lying on the bed, if you're looking at their chest, their head and even their pelvis are not really visible because the field of view just isn't that big. It's almost like you're looking with a flashlight in a dark room in terms of where the virtual content can show up.

And then the other limitation is ghosting. Again, if you watch online videos, and I think this is very dishonest, frankly, on the part of the hardware manufacturers, they will show it to you as if the virtual content is opaque and looks like it is part of the real world. With the MR headsets, though, that is not the case, because, as you can imagine, there's glass right in front of your eyes. You're seeing the real world through that glass, and then it's projecting virtual content onto that same glass. There's no way to prevent the light from the real world from just passing right through the same areas where the virtual content is projected, and so you end up with something like 50% opacity; that is approximately what it looks like in real life.
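To make that ghosting point concrete, here is a minimal back-of-the-envelope sketch in Python of the additive blending just described. The specific numbers, and the assumption that the visor passes about 85% of real-world light, are purely illustrative and not any manufacturer's specifications; the only point is that an optical see-through display can add light but cannot block it.

```python
# Illustrative only: why optical see-through MR content looks "ghosted".
# An additive display can only ADD light on top of the real scene; it cannot
# subtract the real-world light already passing through the glass.

def perceived_luminance(real_world: float, virtual: float,
                        glass_transmission: float = 0.85) -> float:
    """Approximate luminance reaching the eye, in arbitrary units.

    real_world         -- luminance of the real scene behind the visor
    virtual            -- luminance of the projected virtual content
    glass_transmission -- assumed fraction of real-world light the visor lets through
    """
    return glass_transmission * real_world + virtual

# The same bright virtual object (virtual = 1.0) viewed against two backgrounds:
dark_room = perceived_luminance(real_world=0.05, virtual=1.0)
bright_room = perceived_luminance(real_world=0.90, virtual=1.0)
background_only = perceived_luminance(real_world=0.90, virtual=0.0)

print(f"dark room:   object {dark_room:.2f} vs background {0.85 * 0.05:.2f}")
print(f"bright room: object {bright_room:.2f} vs background {background_only:.2f}")
# In a dark room the object stands out sharply; against a bright real-world
# background the contrast collapses and the virtual content reads as roughly
# half-transparent, which is the ghosting effect described above.
```

Video-passthrough designs, discussed later in the talk, avoid this because they composite digitally rather than optically, so a virtual pixel can fully replace the real one behind it.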
There are some little tricks you can do to try to make it a little bit more prominent, but there's a trade-off: in the best versions of this, like you see on the right, the virtual content is pretty opaque, but then the real world becomes a little harder to see. And that is a problem for a lot of medical use cases where you do want to be able to see the details of the virtual content that you are putting in there. So that's a big limitation, I think, of the mixed reality headsets that are out there. And again, I have no particular financial investment in any of these types of technologies. I would love to see mixed reality become a little bit more usable, but I think those are just important things to keep in mind if you're contemplating a big hardware purchase in that area.

So what is coming next? There are a few things. Actually, I already mentioned this one, which is that I think wireless headsets are clearly the future. In fact, most of the manufacturers are going to stop making their wired headsets, I believe, in the near term. I would not buy a wired Oculus or an HTC Vive Pro at this stage in the game. In fact, sometimes people ask, well, should I sell or get rid of the wired headsets? And I would say yes, actually, because the wireless headsets are only $400. And so some people say, well, we invested $30,000 in six wired headsets and gaming laptops, and that is an unfortunate sunk cost. But for just a few thousand dollars, you can get more wireless headsets than that now and save yourself and your sim team all of the headaches of setting up and dealing with wired headsets. So I think it is worth transitioning if you haven't already. And if you're buying new, certainly don't get any wired headsets.

Just to hammer that home even more, edge computing is going to become a big thing. In the gaming world, there are already products like Google Stadia and others where you can run high-end games on something like a cell phone. The way that they do that is that the 3D processing is done in the cloud. What they're beaming to you is really just video, and what they're beaming back up to the server is just your inputs. And it works quite well. And especially with 5G, that is going to be something that will become accessible anywhere. That is just all the more reason not to worry about, oh, well, technically the processing power of the wired headsets is better. It's not really going to matter with the combination of edge computing and the increasing power of mobile chipsets.

What else is coming soon? Eye tracking and biometric tracking. There are headsets, like the Pico Neo 3 somebody mentioned, I believe it's the 3 or perhaps the upcoming Pico Neo, that include eye tracking inside of them. So it can not just track your head position, of course, but it can see where you're looking. And so for medical purposes, there's a lot of value there, to be able to say, hey, you're only looking your patient in the eyes 40% of the time that you're talking to them. What's going on? Being able to track with that level of precision is going to have a ton of value. And then biometric tracking. There are already third-party biometric sensors that some people are incorporating into VR training, but there are headsets coming out relatively soon that will make this a standard part of their hardware. These are, of course, tools that will allow you to measure levels of physiologic stress.
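As a rough illustration of the eye-contact metric just described, here is a minimal sketch of how such a number might be computed from eye-tracking data. The GazeSample fields and the sampling scheme are hypothetical stand-ins for whatever a headset SDK actually reports, not any particular vendor's API.

```python
# Hypothetical sketch: computing "percentage of time looking the patient in the
# eyes while talking to them" from per-frame eye-tracking samples.

from dataclasses import dataclass

@dataclass
class GazeSample:
    timestamp_s: float         # when the sample was taken
    looking_at_face: bool      # gaze ray intersected the virtual patient's face
    speaking_to_patient: bool  # trainee was addressing the patient at that moment

def eye_contact_percentage(samples: list[GazeSample]) -> float:
    """Percent of 'speaking to the patient' samples with gaze on the patient's face."""
    speaking = [s for s in samples if s.speaking_to_patient]
    if not speaking:
        return 0.0
    on_face = sum(1 for s in speaking if s.looking_at_face)
    return 100.0 * on_face / len(speaking)

# Toy data reproducing the "only 40% of the time" example from the talk:
samples = [
    GazeSample(0.0, True, True),
    GazeSample(0.5, False, True),
    GazeSample(1.0, False, True),
    GazeSample(1.5, True, True),
    GazeSample(2.0, False, True),
]
print(f"Eye contact while talking to the patient: {eye_contact_percentage(samples):.0f}%")
```

In practice, the same kind of event stream could feed the objective, timestamped debrief reports mentioned earlier, alongside records of exams, interventions, and communication.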
And the value of that kind of biometric tracking is that you can imagine a scenario where you are trying to keep people in the sweet spot for physiologic stress. Because if you're too stressed, you're not learning because you're freaking out, and if you're not stressed enough, then obviously you're not being pushed. Already, I know Mayo Clinic, for example, Mayo in Arizona, is doing VR with biometric sensors where they are working to keep people in that sweet spot of physiologic stress. And I think that is really cool. Now you're kind of training medical trainees like you train Olympic athletes, right? Where they're running on the treadmill with all those machines coming off of them, keeping them in the sweet spot. Being able to do that with VR and biometrics, I think, is actually coming relatively soon.

Hand tracking is also coming soon. There are already beta versions of hand tracking for both the HTC and the Oculus headsets that are pretty good, but they're not perfect. But I think within the next one to two years, actually, hand tracking will become good enough that we can abandon the controllers and you'll just be able to use your hands to pick up virtual objects.

And then another thing that's coming soon is alternative ways of doing mixed reality. So I mentioned the limitations of current mixed reality hardware. There are at least two reasonable options going forward. This picture in the upper left is something Magic Leap had proposed for a while: little mini projectors that, instead of projecting onto glass, project directly onto your retina and track your eye movements to adjust as you move your eyes. Now, it didn't work when Magic Leap tried to do it, but it's interesting technology that would certainly be the holy grail of mixed reality and would solve a lot of the problems associated with current mixed reality technology. But something that's a little bit more attainable is in the bottom right, which is basically just putting binocular cameras on the front of VR headsets. Then you're pulling the view of the real world into your VR headset, overlaying the virtual content, and presenting that in the VR headset. That would allow you to have mixed reality experiences that would solve the opacity issues and solve the field of view issues. HTC is already putting binocular cameras onto their headsets. The Varjo headset already has binocular cameras. It's really just the software to process this that is still being worked out. But this is, I think, probably a very promising way to allow mixed reality to be viable in the near term.

Volumetric capture is another thing that is coming in this space that will be very valuable. You can imagine, I'm sure, or you may have already seen, that people will make rooms with cameras all around the outside, put people in them, and capture 3D videos of people acting out scenes and things. Then you can put that into VR and walk around it. And that's cool, but it's very resource intensive. But now people have invented inside-out volumetric capture, where you can just drop a little camera array into a room like this one at the bottom. It will capture not only the direct light going into it, but also the little bits of light bouncing off walls and such that are relatively imperceptible to humans, and it can use that to calculate what the sides and sometimes the backs of objects look like. And so you can, in a flash, capture a 3D image of a space that you can walk around inside of.
And there's often kind of like dark spots and blank spots in it, but it's pretty cool technology and is advancing quickly. And it's gonna allow us to create, I think, photorealistic environments that you can walk around in in VR. Also, Smell-O-Vision is coming. And this is, there's a couple of companies that make smell add-ons for VR. And this is, it might seem silly at first, but actually I think for a training context is really valuable. We all know there are certain smells associated with certain types of medical conditions. And being able to incorporate that into your virtual reality training would be really valuable. And so I think there's some cool stuff coming here. And then everybody asks about, when will you be able to feel the virtual content? I'm sure some of you were waiting to ask that question for the end of this presentation. Unfortunately, the answer is not for a very long time. It's one of those things that seems like it would be very easy. Then it is in the sense that you can make gloves that will oppose all of your joints and make it feel like if you're trying to feel a metal ball, make it feel like there's a hard ball there in virtual reality. The problem of course is that, when you're just moving your fingers, it will feel that way. But if you move your hands together, well, there's nothing stopping your hands from moving together. And so you'll just go right through it. And so, well, then you need to oppose yourself all the way up to the elbows or really all the way up to the shoulders. If you wanna be able to go like this and feel that there's a virtual object in there. But then, if your body wiggles, they can't tell that and still just go right through it. And so really you would need to have a pretty fancy getup that would oppose a lot of your joints if you were to actually have resistance that would make objects feel real in VR. And the level of resistance you would need to provide in a suit like that is also problematic. People have made suits that have basically hydraulics and engines that are associated with them. And it's like you're running a lawnmower next to you. And once you get to that level of resistance, there's also a safety question that comes into play. So I think it will be a while, unfortunately. Now that said, there's actually, I think this is less important than people make it out to be especially in medical training. You know, there's a very short period of time in your training when it's important for you to learn how it feels when a scalpel cuts or how it feels when an IV goes in. And there's a stage of your training where that's important but it's a small stage. There's a much larger stage where it's important to know when you try to put in an IV and it doesn't work, what do you do next? When you try to intubate and you can't, what do you do next? And that kind of procedural training around, you know, what is my algorithm for addressing the medical situation? Or what is my differential diagnosis and how do I go about solving it? That is a much bigger part of the practice of medicine as we all know. And I think is very amenable to VR training even without haptics. And so this is, I think we overemphasize this because it's in science fiction and video games. It's really been emphasized but for actual educational purposes, I think haptics are really not critical. So it'll be a while but I think there's still plenty to do in the meantime. What else is coming? Well, this is my cheesy ending slide. You tell me, I don't know. 
You will help figure this out. If you're attending a session like this, then it means you're at least interested in this space. And we are really just at the very beginnings of this. I think three, four years ago was when VR finally became accessible to an average program and when commercial products really started taking off. And so you can imagine if you were at a conference three, four years after mannequins were released and you're trying to imagine how that technology would be used decades later, it'd be very hard. And that's where we are now in VR. So I think this is not, it's a little cheesy but it's also true that I think people like you in this session may very well be the pioneers that figure out the new ways that we can use VR in the future. So thanks so much for your time. Hopefully that helped kind of get you up to speed on this technology. Happy to answer any other questions. I see one right now. Development engines, Unity versus other platforms. What has the lowest associated cost? This is very nitty gritty question and it really depends on what you're trying to do. So the two big gaming engines are Unity and Unreal. Unreal is free, but if you're developing a commercial product then they take a percentage of your revenue versus Unity you have to pay per seat for. So it really depends on how widespread your distribution is. Unity is also drastically easier to use, I would say, than Unreal. So my company in particular uses Unity. There are a couple I know of that use Unreal. I think the Unity has usability advantages. Unreal, from a graphics perspective, tends to be a little easier to make hyper-realistic graphics in Unreal. Then versus modifying Sims for individual needs. So the other, I do have a version of this where I talk in great detail about build versus buy. And I think, I don't know, this I probably do have a bit of a bias because I run a company that makes custom simulations for people. I will say it is drastically easier for us. We can make a custom simulation in a few weeks because we already have a simulation engine versus we do encounter a lot of people that have been in this quagmire of building their own custom VR simulation that's cost them $100,000 and it's taking them a year because they're just trying to reinvent things that people in the commercial markets have been doing for years already. So I think that is something to keep in mind is it is probably worth talking to vendors out there who already have engines, who can already build something for you rather than trying to hire developers and make your own thing. What do I think of Neuralinks? Like the Tesla product, like the Elon Musk Neuralink project. I think it's very cool. I don't think that we will be able to use it for VR for a long time. That's like very matrix style. And there's a lot we don't know about how to stimulate specific visualizations in the brain. I'm not a neurologist, but I'm sure they could. My layman's understanding of this is that we're not very close to being able to use a Neuralink to like stimulate a particular visual scene in the mind of a human being. So probably a long ways away. Any other questions or comments or thoughts? Can you hear us in the room? Now I can, yeah. Just a quick question, a really good presentation. If you're in a healthcare education system and you're wanting to invest in a medical education program that's got VRA, do you have any sense, is it sort of like personal computers where in the 80s where you're having a lot of mergers with programs? 
Or how do you know if the company might still be in existence in a couple of years in order to do your upgrades so it's not obsolete? Is it, do you have sort of a sense of the business aspect of VR right now in medical education? Yeah, I mean, I think that's a fair question. And I think it is not always easy to tell in the modern era because it doesn't cost very much to have a flashy and very professional looking computer. You can have a flashy and very professional looking website. And so you can look at a company and say, well, this looks fantastic, but it's just two people in their garage doing the side project. And they might move on to other things. I think that's a great point. I mean, I, and again, I should say, I have a little bit of a bias in this and that I run what is certainly one of the largest of VR medical training companies currently. But I mean, I do think even just from an educator's perspective, there are now enough established players that it probably makes sense for your first foray to go with an established player just for that reason. Because there are a lot of fly-by-nights out there too that are popping up and falling away. And you don't want to end up hitching your wagon to a horse like that. Most of the, there are plenty of established companies that have enough customers now as well that you can use that as a gauge. I mean, if they have a customer base that is quite large and that involves, that includes institutions that are, I don't know, serious educational institutions, then I think that is a good signal. And they should be able to provide you with some references that you can talk to as well. And especially if you're doing a big investment, I mean, it's worth making a couple calls just to make sure that things are working well for institutions that have implemented that technology. I have a couple of questions. I hate to play devil's advocate, but I know it was a great presentation. I think the technology is great, but are there some drawbacks? And I'll give my personal example. I recently went on a virtual reality ride and I was nauseous for two days. So for me, it's challenging. The other thing is a concern with data. Like right now, a lot of organizations capture our data. Are there any measures in place that prevent manufacturers from capturing our data and using that in different ways that we don't even know about? Yeah, no, I think those are great questions. And I don't even consider that playing devil's advocate because I think there's definitely, we hear it all the time. I hear it all the time. I got sick the first time I did VR. I personally get sick in most VR implementations. I cannot play most VR games actually, despite my years of experience wearing VR headsets. And now I think it is important though to recognize that getting VR sickness is not inherent to VR technology. It is 100% about how it is utilized. So three DOF experiences generally make you sick. So experiences where it doesn't track your movement in 3D space because there will be a disconnect between what you see and what your body feels, right? So unless you stay perfectly still and only move your head around, and especially those kinds that are very popular where you're on like a roller coaster, I mean, that is just extremely nauseating. It's a vomit machine for most people. And then similarly, when you're in six DOF experiences where you can walk around the room a little bit, it is relatively popular for manufacturers to have you do joystick navigation. 
So you'll move a joystick and your virtual character will move even though you have not moved. That will also make you sick. Warping will also make you sick. Poor frame rates will also make you sick. Now, that said, I think it is also very easy for modern VR manufacturers who care about it to make experiences that have almost no potential for making you sick. If you do a six DOF experience with no joystick navigation, with high frame rates, where your virtual character only moves when you move and only walks around the space when you walk around the space, then you will generally not get sick in those experiences. I know, at least for my company, that is the type of experience we have focused on and we have done specific tests on nausea and user acceptance, and it is 98% do not get nauseous in a way that interferes with their experience. So I think that is definitely worth considering, though, because there are, again, tons of fly-by-night companies or gaming companies that just don't care about that at all, and it will 100% make you nauseous. And then, sorry, the other question you had, aside from nausea. It has to do with the capturing of data. Oh, the capture of data. I mean, I think that is a fair concern as well, and companies are going to differ based on that, and I think that is a question you should ask everybody. There are, like, our company in particular does not upload any performance data, any educational data to the servers. That is all stored locally on the instructor's machines and they can then put that in their LMS. But definitely, I mean, that is a common business model these days, is to offer something for cheap and make your money off the data, the user data. So I think it is something that you would be wise to watch out for and ask about. Great, thank you. Just a quick question about the modular development of VR headsets, AR headsets. It is a little hard to hear, actually, for whatever reason. The modular development of VR headsets, or modular compared to a built-in solution like Facebook or Mac? I cannot quite make out what you are saying. I don't know if there is a microphone you can get a bit closer to. Can you hear me better? Can you hear me better now? Oh, yeah, that is way better. Yeah, so the question is about modular headsets. Some of the, one of the universities just demonstrated using ultrasonic sort of device to simulate a kiss or oral sensations. And it could be modularly built into VR headsets. Are we moving towards that direction more and more? Or Meta and others will continue to dominate the market? You know, I think that it is going to be hard for, if a company wanted to make a purely modular VR headset, I would applaud that. I think they are going to have a hard time competing with Meta and HTC and even Pico to some extent, because these companies already have a business model that is similar to like the Nintendo or Sega of the past, where they are able to sell the headsets at a huge loss and make money on the software and the games. And I think it would just be inherently, I think, challenging for somebody who does not, is not already connected to a large network of developers at this stage to launch a new headset and sell it on the merits of the hardware alone. So I don't know, but I would say, I would rate the modularity of the big name headsets as like medium, moderate right now. Like I showed you those Smell-O-Vision add-ons and they can be added on to the Oculus Quest or to the HTC Vive Pro or HTC Focus 3. 
I haven't heard about the ultrasonic device you're talking about, but I think the big names are relatively amenable to third-party add-ons at this stage. So I'm optimistic that people will still be able to innovate on hardware and be able to integrate it with these headsets.

I see there's one other question. What's the typical cost for a custom build for a 30-minute VR training program? That is extremely variable depending on the company that you are talking to and probably depending to some extent on the build. I know in my own experience from developing, there are some experiences we could make for probably $10,000 and some experiences that would cost us $50,000 to make for somebody. And I think if you go out into the market, then you will find prices even up to $100,000 for a single scenario that would be 30 minutes long. So it's going to be very variable depending on the company you're talking to and depending on what it is that you want to accomplish. But yeah, I think as cheap as $10,000 if you find the right company and you have a vision that's amenable to it.

All right, any other questions? Well, I know we are a bit over time, but thanks so much, everybody, for making time and for your excellent questions along the way. Feel free to reach out to me at rrobera at stanford.edu. I think my email address is probably included somewhere, and I'll put it in the chat here as well. If you're interested in these topics or have any other questions or want to talk more, feel free to reach out. Thanks so much, everyone. Thank you.
Video Summary
In this video, the speaker discusses the use of virtual reality (VR) and augmented reality (AR) in medical education. The speaker emphasizes the potential benefits of VR and AR in improving medical training, particularly in terms of increasing safety and enhancing realism. They discuss various use cases for VR and AR in medical education, such as anatomy training, scenario simulations, and patient education. The speaker also highlights the importance of wireless headsets, eye tracking, biometric tracking, and alternative methods of mixed reality. They acknowledge potential drawbacks of VR, such as motion sickness, and address concerns about data privacy in VR. Overall, the video provides an overview of VR and AR technologies and their applications in medical education, while also addressing some of the challenges and considerations associated with using these technologies.
Keywords
virtual reality
augmented reality
medical education
improving training
anatomy training
scenario simulations
patient education
wireless headsets
data privacy