Those overseeing clinical engineering (CE) functions at the top policy level—the Food and Drug Administration (FDA), the Joint Commission on Accreditation of Healthcare Organizations (JCAHO), and others in the profession—depend on sporadic assessments of nonstandard samplings of data when analyzing CE functions such as maintenance compliance, risk and failure analysis, and inventory management. This methodology results in the following: a fragmented view of operational functions, a limited view of the interrelationships of CE functions, and policies that reflect assumptions of equipment reliability.

The oversight of equipment maintenance, including the development of JCAHO standards, relies on data sampling, surveyor discretion, a range of policy reviews, and a presentation format that does not make apparent the interrelationships of CE functions or the status of internal policy reviews. In other words, the scope of CE services is monstrously large and interconnected, and it involves many people, each with his or her own viewpoint and communication style. I propose that we address this situation by bringing everyone and everything together, discussing and recording the results in a form that is clear and easy to share, and circulating the documented results. This proposal is not easy to implement, but it addresses a wide range of shortcomings. Among them are the following:

1. Sporadic data sampling

This results in procedural reviews that are incomplete and are not directly representative of service functions. What an equipment management plan states and what is delivered can represent two distinctly different outcomes.

Nobody really knows how risk assessments are performed in hospitals, or even how a critical piece of data, such as percentage maintenance compliance, is derived. It is currently possible for a hospital reporting high maintenance compliance to have a greater percentage of equipment due for inspection than a hospital reporting a low percentage or noncompliant status (e.g. by simply discounting all equipment not located at the time of calculation).
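To make that loophole concrete, here is a minimal sketch of the arithmetic. The figures and function name are invented for illustration; the point is only how the denominator is chosen:

```python
# Hypothetical illustration of the loophole described above: a hospital
# can raise its reported maintenance compliance simply by excluding
# equipment it could not locate from the denominator.

def compliance_pct(completed, due):
    """Percentage of inspections completed out of those due."""
    return 100.0 * completed / due

# Invented figures: 1,000 devices due for inspection, 700 completed,
# 250 of the remainder "not located at the time of calculation."
due_total = 1000
completed = 700
unlocated = 250

honest = compliance_pct(completed, due_total)                  # 70.0
discounted = compliance_pct(completed, due_total - unlocated)  # ~93.3

print(f"Counting all due equipment:  {honest:.1f}%")
print(f"Discounting unlocated items: {discounted:.1f}%")
```

The second figure looks "compliant" even though more devices are actually overdue, which is precisely why the derivation of the percentage matters as much as the percentage itself.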

Nobody knows the condition of hospital inventory and service records—what percentage of inventory cannot be located due to insufficient staff or mismanagement, or because it is not on the premises. Among other things, these data can have a direct impact on risk assessment, maintenance procedures, analysis of the effectiveness of maintenance, compliance calculation, failure analysis, the use of hazard alerts and recalls, and the length of time equipment can be expected to last, which in turn affects new technology planning and treatment costs.

Automated data retrieval via the Internet would increase data collection from health care organizations and would simplify many tasks, including accreditation surveys.

2. CE functions

The separate functions that CE performs, such as risk assessment, maintenance, and equipment planning, are rarely analyzed in a way that makes inconsistencies in policy development obvious.

Reviewing CE functions and policies as separate issues, and reporting the current state of development and future plans as individual phenomena, does not allow for a comprehensive view of the interrelationships of CE functions. A standardized presentation that includes all major functions and policies would make issues more visible to more people. Such a presentation would also force a continual and more orderly review of all issues, regardless of popularity.

3. Sporadic reporting

This restricts institutions in their ability to share significant data quickly and effectively with oversight groups and other institutions. Many types of issues, such as equipment failure and user errors, parts acquisition, and preventive maintenance corrective actions, are made available to other institutions on a voluntary basis only.

4. The monster in the closet

Balancing cost and patient safety. Under loose controls, cost consciousness can subtly and insidiously influence the way problem areas are interpreted, turning complex environments into “opportunities” for simplified administration of services (e.g. department mergers, department management downsizing, outsourcing) and cookie-cutter cost containment. There is real incentive, and real ability, to bypass effective issue assessment when oversight depends on surveys that rely on data sampling.

The problem is not the conversion of raw data into generalized assessments. Condensing and summarizing data for review purposes is a practical necessity in any oversight process involving large amounts of data and groups of people. At issue is the method used in collecting and displaying that data: determining the type and quantity of information necessary for meaningful oversight, and the form in which the information is presented. The FDA relies on voluntary failure reporting, “Medical Device Reporting” for manufacturers, importers, and user facilities and MEDWATCH for consumers and practitioners (Center for Devices and Radiological Health / CDRH, updated September 22, 2002). The JCAHO also relies on voluntary failure reporting, “Sentinel Events…involving death or serious physical or psychological injury, or the risk thereof.”

You cannot clearly see the solutions if you cannot clearly see the problems.

Adopt a standardized review procedure to help ensure that everyone in the CE profession has access to a more complete representation of CE functions, can observe the interrelationships of those functions, and can review more realistically the effectiveness and value of any one function. There are three basic areas that need to be addressed:

  • Base policy and (to some degree) accreditation on data that are collected on a continuous basis, rather than depending on periodic research and the intuition of inspection surveyors.

  • Assess the interrelationships of CE functions. Remember that functions such as inventory management, risk assessment, and maintenance compliance are linked one way or another to each other and to patient safety and resource expenditures.

  • Create a consistent presentation format. Because a standardized review of policies and standards necessarily would be broad in scope and would involve many people with different views, a standardized presentation format would help ensure that everyone is using the same language, referring to the same specific points, and addressing all the relevant issues.

We have entered a new era where business and engineering advances have formed partnerships with hospitals. This is reflected in increased demand for and availability of new and costly technologies, strict reimbursement formulas, and increased reliability of technologies.

It is no secret that one challenge facing CE services has migrated from the adoption of new technologies to the adoption of improved design and manufacturing methodologies. We do not inspect each physiologic sensor prior to use. We do not periodically inspect all AC-operated patient care devices. Trust in manufacturing, backed by performance history, enables us to “violate” what just years ago were pillars of ethics.

Increased reliability of technology, coupled with data-sampling oversight, has contributed to an environment in which CE service providers assume greater authority over their procedural methodology and participate in an accreditation survey that is more craps game than survey. That is to say, CE administrators can influence outcomes through the design and selection of operational initiatives, and surveyors pick and choose the trails they “dig down.” This does not mean that CE receives no scrutiny or guidance, but the integrity and form of analysis and reporting are left mainly to those doing the maintenance and program design. A JCAHO mandate is to educate; however, the student has become the authority and the exam is left to chance.

This is not good management design, but it is understandable how the system got this way. Consider the complexity of this environment—the matrix of efficacy, cost, and safety. Also, visualize it from the perspective of JCAHO: new and changing technologies; a mix of inside and outside service providers; large and small full-service and specialty patient care facilities; and a range of professional organizations, manufacturers, and service staff capabilities. Now administer effective risk assessment, maintenance protocols, and data collection and analysis—all in an atmosphere of live-or-die cost containment.

The JCAHO and the FDA certainly recognize that they cannot micromanage medical equipment servicing. As a result, they focus their attention on the bigger picture, hoping for trickle-down effects. But there is a gap. Voluntary efforts aside, only samples of information travel from the operations level to those with the ultimate say-so. The result: the view at the top is sporadic and fractured. Trickle-up data influence policy development, and nobody really knows how much of that policy actually trickles back down (see the Figure).

Everyone—CE service providers and the JCAHO—copes with complex and changing environments, and policies and standards that shift to accommodate change. It is a human environment with human reactions.

Four fears CE service providers have of standardizing procedures:

  • It will cost more money.

  • It may bring lower functioning service providers up in terms of quality, but may bring (my) higher functioning service down.

  • It will stifle my ability to address critical issues in ways that relate to my specific environment.

  • It will cost me precious time, reduce what little control I have over my department, and force me to pay attention to bureaucratic form rather than to the staff screaming at me for service.

Fears such as these are not easily addressed. One could, however, look to wisdom in an axiom such as: “A more controlled environment is a good one only if it truly offers greater control.”

What controls are in place in the current environment of care? How the JCAHO oversees CE functions is determined by how the JCAHO controls its standards development—and vice versa. The information for this oversight process comes from a patchwork of thousands of hospitals, plus the input of JCAHO staff and invited experts in the field. From the viewpoint of this proposal, an effective review process can only be achieved if all involved parties can see and can understand all the issues. Without a holistic approach to oversight, “management by crisis” can dictate that the issue screaming the loudest will generate the greatest activity. That is as true for policy development as it is for service.

In order to interpret one CE function, all functions and relationships that make up the practice of clinical engineering must be clearly visible. You would not dismantle a sophisticated piece of technology without having the appropriate service manual. Why should our behavior be any different in overseeing the set of interdependent CE functions?

Implementation means creating definition; it does not mean “set in stone.” When a system evolves, there must be mechanisms that permit core changes, such as renaming CE service functions. Nothing remains the same. Standardizing nomenclature and a presentation format means defining a structure that makes proposed alterations visible and includes the format used to track those changes.

  • Make the viewing of CE functions and their interrelationships (Table 1) part of a formal review process, as in the following examples:

    Inventory management impacts maintenance compliance calculation, recall, hazard, failure reporting, and equipment life-span analysis—which, in turn, impact treatment efficacy (patient safety) and efficiency (cost containment).

    Repair data impacts risk assessment, scheduled maintenance, recall, hazard, failure reporting, maintenance effectiveness analysis, and equipment lifespan analysis—which, in turn, impact treatment efficacy and efficiency.

    Risk assessment impacts scheduled maintenance and maintenance compliance—which, in turn, impact treatment efficacy and efficiency.

  • Maintain a standard presentation format. The tenet underlying this proposal is that some functions, such as quality of inventory records and risk assessment, directly impact others, such as maintenance scheduling and compliance calculation. These functions affect treatment efficacy and cost. Also, there are more functions and issues than can be analyzed and discussed conveniently by groups of people with varying degrees of expertise and involvement—without a standard recording and tracking format. The example shown in Table 2 is used only to illustrate the idea.

  • Develop an online system of data collection, so monitored issues are reported on an ongoing basis. Device nomenclature, data collection, and safety are just some of the issues involved in this stage of the implementation process.

  • Place the implemented changes on the Web and print, circulate, and discuss the issues highlighted by all involved parties.
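The interrelationships listed above lend themselves to a simple formal model. The sketch below is mine, not an official taxonomy: it paraphrases the Table 1 examples as a directed graph so that the downstream impact of a change to any one CE function can be traced mechanically rather than by intuition.

```python
# Hypothetical model of the Table 1 interrelationships as a directed
# graph. Node names paraphrase the examples in the text; every chain
# terminates in treatment efficacy (patient safety) and cost containment.

IMPACTS = {
    "inventory management": ["compliance calculation",
                             "recall/hazard/failure reporting",
                             "equipment life-span analysis"],
    "repair data": ["risk assessment", "scheduled maintenance",
                    "recall/hazard/failure reporting",
                    "maintenance effectiveness analysis",
                    "equipment life-span analysis"],
    "risk assessment": ["scheduled maintenance", "compliance calculation"],
    "scheduled maintenance": ["treatment efficacy", "cost containment"],
    "compliance calculation": ["treatment efficacy", "cost containment"],
    "recall/hazard/failure reporting": ["treatment efficacy", "cost containment"],
    "maintenance effectiveness analysis": ["treatment efficacy", "cost containment"],
    "equipment life-span analysis": ["treatment efficacy", "cost containment"],
}

def downstream(function, graph=IMPACTS, seen=None):
    """Return every function transitively affected by `function`."""
    seen = set() if seen is None else seen
    for nxt in graph.get(function, []):
        if nxt not in seen:
            seen.add(nxt)
            downstream(nxt, graph, seen)  # follow the chain of impacts
    return seen

print(sorted(downstream("inventory management")))
```

Even this toy version makes the central point visible at a glance: poor inventory records do not stay an inventory problem; they propagate to compliance figures, recall handling, and ultimately to patient safety and cost.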

Table 1.

Clinical engineering (CE) functions and their interrelationships.

Table 2.

Recording and tracking format for clinical engineering (CE) functions.

Objectives in Adopting a Standard Clinical Engineering Review Procedure
  • To better view and understand the relationships of different clinical engineering (CE) functions and to emphasize the effects that changes to one have on others. Creating standards to manage the process of reviewing CE functions also will enable deficiencies in the reliability of these functions to be seen more easily.

  • To better resolve efficacy issues (e.g. risk assessment) that arise from purely economic issues (cost containment).

  • To better track the work by different groups (e.g. Joint Commission on Accreditation of Healthcare Organizations, Association for the Advancement of Medical Instrumentation, ECRI, and Food and Drug Administration) on multiple issues.

  • To better share information of various CE functions within the network of hospitals and oversight organizations, and to present this work in a form that encourages participation and promotes greater understanding.

Author notes

Alan Pakaln worked for 23 years overseeing medical equipment services in hospitals in New York City. He is currently a clinical engineering consultant specializing in inventory management, risk assessment, and analysis of other service-related issues. Pakaln is editor of Common Sense Directions (www.commonsensedirections.org), a nonprofit online journal addressing issues relating to medical equipment management. E-mail: [email protected]