For many healthcare technology management (HTM) professionals, the demands of the present make it a challenge to spend time on the future. But Master Sgt. Wesley Reid has seen a glimpse of the future, and he's excited by what's ahead.
Reid, CBET, CHTM, is the Army service lead for the Department of Defense (DoD) Biomedical Equipment Technician Training Program at Fort Sam Houston in San Antonio, TX. Reid is busy, perhaps more so than most, and he's been busy for as long as he can remember. This is true not only in his current job, where he oversees the training of hundreds of technicians annually as they launch their careers as military biomedical equipment technicians (BMETs), but also in his previous position as the healthcare technology manager at Tripler Army Medical Center in Honolulu, HI.
Reid, who entered HTM nearly 20 years ago when he joined the Army out of high school and underwent technician training himself, has worked on enough equipment over that time to know that the field is constantly changing and that it's anybody's guess what might come next. Still, he said, he has seen the future, and from what he can tell, it's augmented reality (AR).
“The possibilities with AR are practically endless,” Reid said, from helping clinicians care for patients more effectively, to training new or experienced BMETs, to factoring into the day-to-day HTM work of medical device repair and maintenance. AR technology, in which computer-generated images are superimposed on the user's view of the real world, is going to revolutionize medical device management, Reid predicted, and it's going to revolutionize healthcare itself.
“We're all going to be augmented in the medical field,” he said. “I really believe that's where we're headed. And the more I learn about it and see what can be done with it, the more I think it's going to lead to incredible advancements—not only for those of us working in healthcare, but also for the patients we're trying to help.”
Coming Soon to a Hospital Near You
It might seem easy to dismiss Reid's enthusiasm for AR as just another example of the hype surrounding a technology that's been on everyone's “emerging trends” list for the better part of three decades. To do that, however, you'd have to ignore his experience both at Tripler and at the DoD's BMET training program. AR, Reid noted, is just the next step in the ongoing evolution of all manner of computational modeling and simulation (CM&S) technologies, which he and his colleagues at both sites have relied on for years.
Tripler, for example, maintains a medical simulation center with a virtual reality (VR) trainer for its da Vinci surgical system and a robot manikin that can blink and breathe. Medical students use those and other simulation-based training tools to practice everything from lumbar punctures and laparoscopic surgery to ultrasound imaging and fetal monitoring.
The San Antonio facility where the BMET program is run—known as the Medical Education and Training Campus—uses computational modeling combined with three-dimensional (3D) printing to develop ready-for-practice, disposable versions of the expensive componentry its students must learn to master. “An instructor will identify the need for something, and then if we don't have the computer-aided design (CAD) files for that part on hand, we'll use calipers and take measurements to model it in our system and then use one of our printers to build it,” Reid explained.
A typical project might involve a rebuild kit for a sterilizer. Instead of buying that kit for thousands of dollars, “we'll develop it here so they can work with it without having to worry about making mistakes. We want them to know what they should and shouldn't do before they're out in the field, maybe somewhere remote, and they wind up damaging something during routine maintenance.”
Reid's experience with AR technology is in “proof-of-concept” work he and others in the DoD training program have done. “We're not there yet; we still need to secure funding, and there's software development that needs to happen. But we are close—it's coming soon.”
He predicted that at some point in the very near future, clinicians will no longer have to be trained on expensive patient simulators. Instead, they'll don special “smartglasses” that will allow them to work with real people overlaid with holographic images and computer-generated signs and symptoms. “They'll look through their glasses and see a ‘real’ patient lying there, and then as they go through and do their interventions, the software will use artificial intelligence (AI) to show the patient reacting in different ways.”
Similarly, he said, in BMET training, an instructor in a lecture hall might use AR to help students visualize the inner workings of complex technologies and how they will respond to their handling and adjustments. “The way it is now, you have to interpret what the teacher is telling you and how that component or circuit is working. But with AR, you'll be able to see it in real life.”
For example, imagine a lecture about sterilizers where the teacher projects an augmented sterilizer onto the floor at the front of the room. “Wearing the glasses,” Reid said, “you'll be able to take parts away, to run it through the functions, and to watch as steam goes through it and fills the chamber. These are things you can't do right now—you can't even get a sterilizer into a lecture hall.”
Crunch Time: Computational Modeling
Medical devices, such as implants, can be made out of a variety of biocompatible materials, from metals to ceramics to a number of different plastics. The challenge, said Mark Semler, CEO of the Zucker Institute for Applied Neurosciences (ZIAN) at the Medical University of South Carolina, is that those materials must be able to perform certain tasks in a very limited physiological space.
“That's what engineering is all about,” he explained. “You have to engineer the geometry of your device so that it can operate within specific parameters.”
At ZIAN, Semler and his colleagues collaborate with industry manufacturers to develop medical devices used in the treatment of neurologic disorders. One of the keys to building devices that will work is a computational modeling technique called finite element analysis (FEA; also known as “finite element modeling”).
“You take whatever it is you're building” (e.g., a plate that will hold the bones together in a patient's neck), “and you break it down into as many tiny parts as your computer has the ability to crunch.” Next, Semler said, you feed the computer data on the conditions you expect those parts to face in a real-world environment. “You give it whatever forces you know or assume will act on it, and that lets you model how it's likely to perform.”
In the hypothetical case of that spinal plate, Semler said, he'd be able to see those forces in “pretty colors” on his computer, “with red being bad—something you want to avoid.” He then could adjust the design, with the smoothness and the features that the clinician wanted, and rerun the FEA until he was happy with the model.
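The workflow Semler describes (mesh the part, apply assumed loads, solve, inspect the stresses) can be illustrated with a toy one-dimensional example. The sketch below models an axially loaded bar, not a real spinal plate, and every number in it is an illustrative assumption rather than a value from any actual device analysis:

```python
import numpy as np

# Toy 1D finite element analysis: an axially loaded bar broken into small
# elements, in the spirit of Semler's description. Real device FEA is 3D
# and far more complex; all values here are illustrative assumptions.

E = 110e9   # Young's modulus of titanium, Pa (typical textbook value)
A = 1e-6    # cross-sectional area, m^2
L = 0.05    # bar length, m
F = 500.0   # axial load at the free end, N
n = 100     # number of elements ("as many tiny parts as you can crunch")

le = L / n          # element length
k = E * A / le      # stiffness of each element

# Assemble the global stiffness matrix for n+1 nodes
K = np.zeros((n + 1, n + 1))
for e in range(n):
    K[e:e + 2, e:e + 2] += k * np.array([[1, -1], [-1, 1]])

# Boundary conditions: node 0 fixed, load F applied at the last node
f = np.zeros(n + 1)
f[-1] = F
u = np.zeros(n + 1)
u[1:] = np.linalg.solve(K[1:, 1:], f[1:])  # solve the reduced system

# Element stress = E * strain; the "red" regions in a commercial tool
# would be wherever this approaches the material's limit
stress = E * np.diff(u) / le
print(f"tip displacement: {u[-1] * 1e6:.2f} um, "
      f"max stress: {stress.max() / 1e6:.1f} MPa")
```

In a real analysis, the engineer would iterate exactly as Semler describes: adjust the geometry, rerun the solve, and repeat until the stress picture looks acceptable.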
“Only then would you prototype it or print it with a 3D printer,” Semler said. “It's a way to test it before you build it, which can save you a lot of time and avoid problems in the long run.”
Semler emphasized that FEA is in no way “state of the art” and that engineers have been using computers for modeling for decades. Still, he said, as the years have gone by and computing power has improved, “it's allowed us to build more and more complicated models and get closer to what we need in the finished product in our designs.”
According to Semler, some designers are using VR and AR tools to incorporate 3D images of patient anatomy into their computer models. “They'll put on the goggles and headset and see the patient's skull floating in the room, and they can walk around it, slice into it, and look at the vasculature and the nerves.” As an engineer, Semler said, it's an exciting time to be in this field. “It's hard to predict what might come next.”
AR will become indispensable both for in-the-field operator training and for working BMETs, Reid also predicted. “Soon a clinician will be able to walk up to a new device and their headset will point them to the buttons they need to hit and the knobs they need to turn,” he said.
Equipment technicians, wearing safety-style smartglasses, will be able to look at a sterilizer, for example, and interact with a wealth of information about it. An image might appear before the technician showing the device's make and model number. Then, a prompt might appear: “Would you like to know more?” After giving a verbal command (e.g., “Start PM procedure”), a checklist would appear within the tech's field of view.
“And then you'd just run through the list,” Reid said. “It would tell you to check the door gasket, and the image would briefly highlight the door gasket as you're checking it”—and then turn green when that step was complete. “Then, when you're done with everything on the checklist, it would automatically log the procedure in your CMMS. The days of having to go back and put those notes in will be gone.”
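The PM workflow Reid envisions (step through a guided checklist, then log the completed procedure automatically) can be sketched as a simple data flow. Everything below is hypothetical: the device ID, step names, and the stand-in `cmms_log` structure are invented for illustration, and no real AR headset or CMMS product exposes this interface:

```python
from datetime import datetime, timezone

cmms_log = []  # stand-in for a real CMMS work-order database


def run_pm_procedure(device_id, checklist):
    """Walk each checklist step (where the AR display would highlight the
    relevant part), then log the finished procedure automatically."""
    completed = []
    for step in checklist:
        # In Reid's scenario the tech confirms each step verbally;
        # here every step simply passes.
        completed.append({"step": step, "status": "pass"})
    cmms_log.append({
        "device": device_id,
        "procedure": "PM",
        "steps": completed,
        "logged_at": datetime.now(timezone.utc).isoformat(),
    })
    return completed


steps = ["Check door gasket", "Verify chamber pressure", "Run test cycle"]
run_pm_procedure("STERILIZER-042", steps)
print(f"{len(cmms_log)} procedure(s) logged, "
      f"{len(cmms_log[0]['steps'])} steps completed")
```

The point of the sketch is the last step: once each check is confirmed in the headset, the record lands in the CMMS without a separate data-entry pass.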
CM&S: The New Paradigm
Reid may be one of just a few in HTM who have managed to wrap their minds around the potential of AR, but he's certainly not alone in his conviction that CM&S can transform the field.
“This is a new paradigm for HTM,” said Purna Prasad, PhD, chief technology officer at Northwell Health in New Hyde Park, NY. “I don't see a lot of people talking about it right now, but I really think they should be. There are so many exciting developments in this area with real applications and implications for improving HTM.”
In the early 2000s, at Stanford University, Prasad participated in the design and development of its Center for Advanced Pediatric and Perinatal Education, a first-of-its-kind simulation-based healthcare training and research facility. At that time, he said, he could not have envisioned what simulation and computational modeling would look like today.
“Not in my wildest imagination. 3D modeling, VR, AR, AI—everything about it has changed. And on top of that, with high-bandwidth Internet capabilities, now almost anyone can do it. It's becoming almost a mainstream thing,” said Prasad.
Indeed, as one recent report by the life sciences branch of the research firm Deloitte put it, healthcare may finally be catching up to the rest of the world when it comes to putting CM&S to use.1 The VR technology “that lets you pretend to be a star quarterback or space pirate can also help train young professionals or even provide pain and anxiety relief to patients,” the report stated. “The AR that puts Pikachu in your city park can also assist physicians with real-time information to use in diagnosis or even surgery.”
According to Deloitte, in the healthcare industry, the market for AR and VR software is expected to top $5 billion by 2025, while the market for virtual patient simulators alone should reach $1.5 billion in that same time frame.
“3D modeling, VR, AR, AI—everything about it has changed. And on top of that, with high-bandwidth Internet capabilities, now almost anyone can do it.”
—Purna Prasad, chief technology officer at Northwell Health in New Hyde Park, NY
Another report, by the research firm MarketsandMarkets, estimated that the global medical simulation market for software and devices of all kinds—not just those using AR/VR—will be worth $2.5 billion in 2022, up from $1.2 billion in 2017. “The benefits of simulation over traditional learning, increasing demand for minimally invasive treatments, and increasing focus on patient safety” are all driving the growth, according to the report.2
Healthcare providers are not the only ones turning to simulation and modeling. Medical device manufacturers are using the technologies as well, both to eliminate the “bad” ideas before they leave the whiteboard and to refine the good ones before their devices are used on real patients. CM&S, many device makers have recognized, can add tremendous value when defining the inputs and outputs of a medical system by allowing them to predict the performance of the system prior to design verification.
The technologies also can help facilitate the entire device development and implementation process, from human factors testing to the training of frontline clinicians and technicians. CM&S, noted the American Society of Mechanical Engineers (ASME) in a whitepaper, “has the potential to be applied at points throughout the product life cycle, from discovery and ideation to regulatory decision making, product launch, and post-market monitoring.”3
Nevertheless, added ASME, “it is still not used as widely as it could be” because of “scientific and technical challenges” that have yet to be worked out.
On a similar note, the Alliance of Advanced Biomedical Engineering reported that of the 1,500 new medical device applications that were submitted to the Food and Drug Administration (FDA) in 2017, only 220 (about 15%) included CM&S.4
A Matter of ‘Inputs’
One challenge related to CM&S, according to Linda Knudsen, principal mechanical engineer at Colorado-based Syncroness, has to do with computational modeling “inputs.”
In her work, Knudsen provides product development and automated test and assembly services not only to the medical device industry but also to aerospace and industrial customers. When modeling, she primarily uses two computer-aided engineering techniques—finite element modeling and computational fluid dynamics—to, for example, analyze blood flow in a device or calculate the stresses to which a device might be subjected when placed under an external load.
“Using a computer model to assess a medical device is inherently risky; it's important that you know whether your model is sufficiently credible.”
— Linda Knudsen, principal mechanical engineer at Colorado-based Syncroness
For modeling to be effective, Knudsen explained, “you have to make assumptions about the materials that device is going to be made out of and is going to interact with. And if you don't have data about those materials—or if the materials you're using aren't well characterized—you shouldn't even bother trying.”
An example of an input that wouldn't work for modeling is an uncharacterized human tissue material model, Knudsen said. “In a case like that, it would warrant traditional testing.”
Knudsen, who is on ASME's Committee on Verification and Validation of Computational Modeling, has been deeply involved in the creation of a new standard: ASME V&V 40-2018, Assessing Credibility of Computational Models through Verification and Validation: Application to Medical Devices. The standard, which is scheduled for publication later this year, was developed through collaboration among a wide variety of industry stakeholders, including device manufacturers, software providers, and regulatory agencies.
Once finalized, it will give analysts and users of modeling results “a framework for understanding how and when a computational model can or should be used,” Knudsen said. “Using a computer model to assess a medical device is inherently risky; it's important that you know whether your model is sufficiently credible.”
An FDA working group on CM&S within the agency's Center for Devices and Radiological Health is playing a key role in the development of the standard, Knudsen noted. A 2016 FDA guidance document offered device makers nonbinding recommendations on how modeling studies can be presented to support submissions for regulatory approval, stating that CM&S “studies have been used by sponsors to support device design/development and have been reported in medical device submissions” for many years.5
The FDA also has stated that it “recognizes the public health benefits offered by modeling and simulation, including those in the area of in silico clinical trials,” and that it “advocates for their use as one of many research and product development tools” because of its “critical role” in “organizing diverse data sets, exploring alternate study design strategies, and predicting performance,” among other things.6
Knudsen said that in her work, she is “cautious about modeling; I use it only when it's appropriate and where it can add value.” For example, she said, she recently completed a project involving a medical device designed for pediatric patients that both serves as a site for the administration of intravenous fluids and provides a port for taking blood samples. For such a device to work safely, she explained, the clinician must be able to flush it with saline after removing blood to avoid sending clots into the patient when intravenous fluids are subsequently delivered. “That flushing has to be very efficient—you want it to be fast, and you need to minimize the saline that goes into the patient,” Knudsen explained.
To develop a device that could do the job, she relied on computational fluid dynamics. “We had great success with that and came up with a promising design, and then we bench tested it with pig blood and saw good correlation,” Knudsen said. (In its guidance document, the FDA lists a number of applications for computational modeling, from calculating stress locations on a hip implant to determining the distribution of absorbed energy in therapeutic ultrasound.5)
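Knudsen's actual analysis used full computational fluid dynamics, but the underlying question (how much saline, for how long, to clear residual blood from the port) can be illustrated with a deliberately simplified well-mixed washout model. The volumes, flow rate, and threshold below are invented assumptions, not figures from her project:

```python
import math

# Simplified lumped washout model of a flush step: treat the port's
# internal dead volume as well mixed, so the residual blood fraction
# after flushing at rate Q for time t is exp(-Q * t / V).
# All numbers are illustrative assumptions, not real device data.

dead_volume_ml = 0.5    # assumed internal volume of the port, mL
flow_rate_ml_s = 1.0    # assumed saline flush rate, mL/s
target_residual = 0.01  # flush until less than 1% of the blood remains

# Solve exp(-Q * t / V) = target for t, then get the saline volume used
t = (dead_volume_ml / flow_rate_ml_s) * math.log(1 / target_residual)
saline_ml = flow_rate_ml_s * t

print(f"flush time: {t:.2f} s, saline used: {saline_ml:.2f} mL")
```

A real CFD model replaces the well-mixed assumption with the actual internal geometry, which is precisely where stagnant corners that trap blood show up; that is what makes the full simulation worth the effort.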
Computational modeling, when used appropriately, can offer several advantages over traditional device development methods, added Knudsen. “The main advantage is safety. I can make mistakes and not harm a soul. Another is to get results and insights that we can't get by testing on real patients,” as might be the case if a clinical trial would cause patients harm. “And it can be far better in terms of speed of development. We can change material properties and dimensions and do all kinds of variability and optimization before we even get to bench testing. So we're testing fewer products, but they're better to start with when we do bench test them.”
As an engineer, Knudsen said she appreciates how modeling can provide her with a fresh image of the device she otherwise never would have had. “It's that visualization of how my design impacts the performance of the device. For me, it helps me to be a much better engineer; it helps me come up with better ideas, and I think it translates to innovation,” she said.
In the Simulation Room
Better ideas also are top of mind at the Johns Hopkins Medicine Simulation Center in Baltimore, MD, where Julianne Perretta, MEd, RRT-NPS, works as lead simulation educator and is director of educational development and innovation.
Too often, said Louis Halamek, MD, FAAP, the simulation devices used to train clinicians don't even come close to replicating real life. Halamek is director of Stanford Medicine's Center for Advanced Pediatric and Perinatal Education—a simulation-based healthcare training and research center focused on fetal, neonatal, pediatric, and obstetric sciences.
“Look at what they've accomplished in aerospace, for example, and how they train people to interact with the technologies on the International Space Station. Or look at the commercial aviation industry, where the materials that go into a plane and the physics of flight are both well understood and can be accurately modeled.”
Sure, Halamek said, it's different in medicine, because human physiology is so complex, but that's not a good excuse for the fact that in healthcare “there is still nothing out there that can simulate a human being in high fidelity.” Healthcare simulation, he noted, “has a lot of catching up to do.”
Halamek, who is a neonatologist, said that for him to show a trainee how to insert an endotracheal tube into a newborn's malformed airway, he'd have to demonstrate the maneuver on a real patient because most manufacturers are only making models that reflect “normal” anatomic and physiologic conditions. “The standard patient simulators on the market right now can't model an anomalous airway.” VR simulators “hold huge potential” in this regard, he said, “but they remain relatively novel in healthcare, which is drastically different than what you see in other fields.”
The way Halamek sees it, if healthcare is going to bridge the simulation gap, better collaboration between engineers and clinicians will be needed. “There's a lack of communication between the two professions,” he said, which results in added time and cost to the development process and often to features that no one really needs.
Engineers working in silos are partly to blame, he said, but clinicians are holding the field back as well. “In many cases, clinicians are demanding a degree of sophistication or function that isn't necessary to achieve their goals,” Halamek said.
When it comes to the development of actual medical devices, Halamek noted, collaboration has been on the rise, with “engineers and clinicians in the trenches and groups from disparate backgrounds” often coming together to create innovative new products. “I'm hopeful we can do the same in simulation,” he said, “because I really think that's how we're going to advance the field.”
Perretta's simulation program, which serves both Johns Hopkins Hospital and the Johns Hopkins University School of Medicine, includes a 10-bed simulated hospital with a simulated operating room (OR), trauma bay, inpatient and critical care rooms, and a labor and delivery room. The center's primary use is as a research and educational hub for hospital employees and medical students. However, it also has become a resource for device manufacturers, who rely on it for medical device usability testing, and it has evolved into a kind of laboratory that the organization uses to make equipment purchasing decisions.
“The way that used to happen,” Perretta explained, “is very important decision makers at the hospital would sit at a conference table and argue back and forth at each other about why equipment would or would not work. Now, we take the bedside nurses, anesthesiologists, and surgeons, as well as anyone else who will be interacting with this equipment, and bring them in here and let them test it out for themselves in a simulated hospital environment. That way, we can see what will work and what won't.”
That process might lead to insight about how a certain device can be customized or modified to suit the needs of individual departments, Perretta said. “So if a nurse moves from oncology to pediatrics, it's not that they have to suddenly learn how to use a new piece of equipment. They just have to understand the different nuances of it.”
It also can result in determining that a device doesn't meet their needs at all—perhaps, for example, because its platform is incompatible with what they already have. “In cases like that,” said Perretta, “we'll give feedback to the manufacturer and tell them, ‘Look, here is what you need to change in order for this equipment to work for us.’”
Manufacturers that wish to conduct human factors testing at Johns Hopkins do so through a separate process, Perretta said. “They're pre–FDA approval devices, so it could be that they're making a significant change to an already existing device, or it could be that they're designing a brand-new device.”
An example might be a fluid warmer for the OR. “They designed it already, and it looks like it works well, but now they've got to test it, and the FDA wants to see that they tested it with engineers, as well as with nursing, anesthesia, and others,” Perretta explained.
The manufacturer would bring the device into the simulation facility, and it would make the rounds with manikins that replicate real-life patients. “Their human factors people and design specialists will watch,” Perretta said, “and if they observe software issues, sometimes they'll change it right there. Or if they see people struggling, but it's something they can't fix right away, they'll come back later with a new design and try again.”
In addition to satisfying FDA requirements, the goal is to make sure that the iteration that the FDA eventually approves “is the device that's the safest, most effective, and easiest to use,” Perretta explained.
The lab's “bread and butter” remains the training of Johns Hopkins employees. “Before we put a new device out where it will be used with real patients, we use the simulation center to teach clinicians how it works.” The lab is “the place to make the mistakes, to ask questions, to try something you haven't done before,” said Perretta. “If you do something wrong, or something doesn't work, then you get a chance to adjust and try it again.”
Although the lab wasn't originally built with HTM professionals in mind, they also benefit from what takes place within it, Perretta added. “We actually discovered this accidentally, but often our equipment in the lab breaks more frequently than what's seen clinically because novice users are the ones who mostly use it,” she said.
Recently, for example, Johns Hopkins Hospital had an OR table arm with a crack in it—damage that had not been seen previously on identical equipment in the facility's units. BMETs can look at such issues to get a sense of a device's potential weak points and perhaps take steps to shore them up.
Similarly, Perretta said, the HTM team can come into the lab and “test what may or may not work at the bedside.” For example, if a manufacturer issues a warning regarding an error that needs to be fixed, “but in the meantime suggests putting a sticker on the device saying, ‘Don't push this button,’ HTM can come in here and try that sticker out on us first. That way, rather than putting it out there in the real hospital environment”—and finding out two weeks later that it's ineffective—“they know beforehand what to do to make an impact.”
Finally, Perretta said, because HTM professionals can get into the lab and see the devices in action for themselves, they have an opportunity to speak up before purchasing decisions are made. “Sometimes, organizations will purchase equipment that may make clinicians happy, but then you find it's the exact opposite for biomedical engineering—that their needs aren't aligned.” When everybody's at the same table, she noted, “you can avoid those situations and make decisions with everyone's best interests in mind.”
The Future Is Virtual and Artificial
Perretta isn't the only one with experience in simulation who feels strongly about its potential for improving collaboration—and improving the work of clinicians and HTM alike. Sean Frenette and Michael Ballintyn, who are clinical engineers with Hartford Healthcare in Connecticut, recently had the chance to take a training course with a vendor where VR technology was used to create a virtual learning environment.
“It was more for the evaluation component of the course,” Frenette explained, “but we did get to use it to play around with troubleshooting, and it led to an interesting discussion about how we might use it in the future.”
For instance, Ballintyn said, they talked about using VR in the prepurchase process to visualize how a device might be situated to optimize clinical workflow. “In radiology, when you're bringing in technologies like magnetic resonance and computed tomography, it's tough to get an idea how they'll look in a finished space.” CAD files and drawings can be helpful, he noted, but VR would allow people to “basically walk through it, to move things around, to see how it would be to bring in a patient on a stretcher.”
“We're going to look back one day and we're not going to believe how we used to do things.”
—Wesley Reid, Army service lead for the Department of Defense Biomedical Equipment Technician Training Program in San Antonio, TX
As it stands now in many organizations, leadership may bring in new devices without considering their potential impact on workflow and without consulting HTM about potential design issues. “And then you end up with a situation where it's like, ‘Oh, I guess we didn't really think this through.’ Or there's a construction project, and the finished space is smaller than you need it to be and everyone is left trying to figure out what to do.”
For his part, the DoD's Wesley Reid said, he too can imagine using simulation technologies to make facility design and purchasing decisions. But he also envisioned a future where supply technicians, for example, will wear smartglasses that guide them to the products they need to find and then update the inventory automatically as they deposit the items in their cart.
Reid foresees the day when colleges will offer HTM training programs using VR and AR in their curricula to reduce what they have to spend on “real” medical equipment. “We're going to look back one day and we're not going to believe how we used to do things,” Reid predicted. The technologies are in place for this revolution right now—“it's just a matter of the industry putting them to work.”