Patients with renal disease undergo a progressive loss of renal function, which eventually requires treatment to sustain life. The most widely used treatment modality is hemodialysis. Such treatment is generally undertaken three times weekly, with each session lasting between two and four hours, although a number of investigations are currently exploring the potential benefits of more frequent treatment. During dialysis, which requires access to the patient's blood circulation, blood is removed, passed through an artificial kidney (hemodialyzer), and returned to the patient. Within the artificial kidney, blood flows on one side of a semipermeable membrane, while an electrolyte solution continuously produced by the hemodialysis machine flows on the other side. During treatment, non-protein-bound low molecular weight compounds, and a limited proportion of higher molecular weight compounds, that accumulate as a consequence of renal insufficiency are removed, along with fluid gained between treatments, and electrolyte abnormalities are corrected. This article focuses on the evolution of hemodialysis technology, along with emerging concepts and the practicalities of managing such equipment.
Early treatments of renal disease were limited to easing the patient's suffering from acute or reversible renal failure.1 With the advent of a reliable method of gaining access to the patient's circulation in the early 1960s, treatment of patients with irreversible renal failure became possible. The dialysis systems used at this time were simple: the electrolytes were manually mixed with water in a refrigerated tank to minimize bacterial growth. There was minimal automated monitoring of treatment-related parameters, so the patient required continual monitoring during treatment by medical or nursing staff. These early approaches used sodium bicarbonate for buffering, but problems arising from the formation of calcium carbonate meant that this approach was abandoned in favor of acetate. Acetate remained in use until the 1980s when, with the increased use of high-efficiency dialysis treatments and the availability of new technology to minimize calcium carbonate formation, bicarbonate re-emerged as the preferred buffer.
By the mid-1960s treatments could be undertaken in the patient's home, and the first generation of proportioning systems capable of automatically mixing water with electrolytes began to be manufactured.2 These early systems incorporated audible and visible alarms to maintain essential parameters such as dialysis fluid temperature, pressure, and electrolyte concentration, as well as patient-related parameters (pressure in the extracorporeal circuit), within prescribed limits. In 1973, U.S. federal legislation (HR-1, Public Law 92-603) provided funds for the treatment of patients suffering from kidney failure, resulting in a rapid increase in the number of patients receiving treatment and drawing the corporate sector into dialysis research and development, which in turn accelerated the pace of technological progress.
Current technology retains the concepts of the earlier systems but uses an increasingly sophisticated approach to ensure patient safety. This is achieved by monitoring the blood and dialysis fluid circuits, with monitors linked to alarms to indicate abnormal conditions, stop treatment, and ensure that the patient is isolated and protected from the malfunction and its effects.
Systems in use today (see Figure 1) also incorporate ergonomic user interfaces, monitoring and control systems for ultrafiltration or fluid removal, computer interfaces, and continuous monitoring of patient-related parameters such as blood pressure and blood volume changes. These systems make extensive use of microprocessor technology, and in addition to providing the basic functions discussed above, they permit individualization of electrolyte concentrations in the dialysis fluid to meet differing clinical requirements. Safety checks are automatically performed before treatment. The computer interface permits clinicians to monitor treatments being performed in remote locations such as satellite dialysis units or the patient's home.
The majority of treatments use continuous blending of electrolytes with treated water to produce the dialysis fluid. This process, as well as the monitoring of vital parameters, is performed by commercially produced equipment. Such equipment is available from a number of producers within the United States, including Fresenius Medical Care North America (Waltham, MA); Gambro Renal Products (Lakewood, CO); and B. Braun (Allentown, PA). The use of such equipment is by no means universal, particularly when there is a lack of suitable water. Under such circumstances dialysis fluid regeneration may be used. The original REcirculating DYalysis (REDY) system was commercially produced until the mid-1970s. It used approximately 6 liters of dialysis fluid, which was recirculated through a cartridge containing immobilized enzymes and chemicals. As this also removed electrolytes contained in the dialysis fluid, calcium, potassium, and magnesium were infused into the cleaned solution to maintain the correct electrolyte concentration. This system is no longer in production, but two equivalent systems are currently available in the United States: the Allient Sorbent dialysis technology system (Renal Solutions, Warrendale, PA) and the RDS 3001 system (Dialysis Parts and Supplies, Inglewood, CA).
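The blending step described above amounts to simple volumetric arithmetic. The sketch below (Python) illustrates how a fixed proportioning ratio dilutes a concentrate to the final dialysis fluid concentration. The function name and the 1+34 ratio (a common ratio for a 35x acid concentrate) are illustrative assumptions; actual ratios vary by machine and concentrate formulation.

```python
def proportioned_concentration(concentrate_mmol_l, parts_concentrate=1,
                               parts_water=34):
    """Final solute concentration after continuous volumetric
    proportioning of concentrate with treated water."""
    total_parts = parts_concentrate + parts_water
    return concentrate_mmol_l * parts_concentrate / total_parts

# A hypothetical 35x concentrate with sodium at 4900 mmol/L yields a
# dialysis fluid sodium of 140 mmol/L after 1+34 dilution.
sodium = proportioned_concentration(4900.0)
```

The machine continuously verifies the result of this mixing by monitoring the conductivity of the final fluid, which is why conductivity alarms are among the essential safeguards discussed earlier.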
An alternate approach that does not rely on sorbent technology was used in the Aksys PHD System (Aksys, Ltd., Lincolnshire, IL [no longer in operation]), a fully automated personal dialysis system intended for use in an alternative care setting, such as the patient's home. It combined water treatment, dialysis delivery, and reprocessing into a single machine and used a low volume of dialysis fluid (approximately 50 liters) and hot water (85°C ± 5°) for the disinfection of both the blood and dialysate pathways. The NxStage System One (NxStage Medical Inc., Lawrence, MA) uses disposable blood and dialysis fluid circuits and, when used with the NxStage PureFlow SL system, prepares dialysate for use during hemodialysis. In contrast to conventional systems, which can be as large as 63 inches high by 19 inches wide by 28 inches deep and weigh as much as 300 pounds, the NxStage System One is only 15 by 15 by 18 inches and weighs about 75 pounds.
The dialysis fluid produced by the hemodialysis machine passes to the hemodialyzer, a device whose design has changed substantially over the past 30 years. The coil-type dialyzers widely used in the early 1970s are no longer produced, and parallel plate designs are used in only a small fraction of treatments. Today, the most commonly used device is the hollow fiber or capillary dialyzer, based on a design first used in the 1960s (Figure 2). Hollow fiber dialyzers offer flexibility in that they are available in a range of sizes to meet different clinical requirements and can be used not only for conventional treatment but also for newer treatment modalities such as high-efficiency and high-flux treatments. Recent dialyzer-related developments have focused on improving performance by optimizing the flow distribution within the fiber bundle and on improving the biocompatibility of the membranes used.
Recent and Emerging Developments
Real-Time Monitoring of Delivered Treatment
A number of studies have indicated a statistical association between treatment adequacy, expressed as the amount of urea (a breakdown product of ingested protein) removed, and survival. Adequacy assessment is currently performed monthly, involves blood sampling pre- and post-treatment, and provides only retrospective information. Recent advances in online measurement of the dialysis fluid using conductivity clearance permit repeated assessments of adequacy during individual treatment sessions.3
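As an illustration of how adequacy is quantified from the monthly pre- and post-treatment blood samples, the sketch below (Python) implements the widely used second-generation Daugirdas formula for single-pool Kt/V. The function name and the example values are illustrative and not drawn from this article.

```python
import math

def single_pool_ktv(pre_bun, post_bun, session_hours, uf_liters,
                    post_weight_kg):
    """Second-generation Daugirdas formula for single-pool Kt/V.

    R is the post/pre urea (BUN) ratio; the 0.008*t term corrects for
    urea generated during the session, and the final term accounts for
    convective removal with ultrafiltration (UF volume / post-weight).
    """
    r = post_bun / pre_bun
    return (-math.log(r - 0.008 * session_hours)
            + (4 - 3.5 * r) * uf_liters / post_weight_kg)

# Illustrative values: BUN falling from 80 to 25 mg/dL over a 4-hour
# session, with 3 L removed and a 70-kg post-dialysis weight.
ktv = single_pool_ktv(80.0, 25.0, 4.0, 3.0, 70.0)  # roughly 1.4
```

The online conductivity-based methods mentioned above estimate the same quantity during treatment, without waiting for laboratory results.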
Control and Monitoring of Vascular Stability During Therapy
Patients undergoing dialysis gain fluid from dietary intake between treatments, which is removed during treatment. Despite the use of microprocessor-linked volumetric fluid removal systems in the current generation of machines, tolerance of such fluid removal remains poor in many patients; faster and more efficient treatments push the limits imposed by human physiology. The fall in blood pressure resulting from such intolerance necessitates frequent nursing interventions and is debilitating to the patient. A number of different approaches have been developed to alleviate this problem, including automated blood pressure monitoring, either alone or linked to a control system able to automatically modulate fluid removal to minimize its impact on the patient; monitoring changes in blood volume during treatment by continuous online measurement of red cell or hematocrit concentration; use of a surrogate parameter for changes in blood volume reflecting fluid removal;4,5 automated profiling of the rate of fluid removal during treatment;6,7 and use of isothermic dialysis.8
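The blood-volume-based feedback approach can be sketched as a simple proportional rule. The Python fragment below is a hypothetical illustration of the principle only, not any manufacturer's control algorithm; the function name, the 5-point taper band, and the rate limits are assumptions.

```python
def adjust_uf_rate(prescribed_rate_ml_h, rbv_percent, rbv_limit_percent,
                   min_rate_ml_h=100.0, taper_band=5.0):
    """Scale the ultrafiltration rate back as the relative blood volume
    (RBV, expressed as % of its start-of-treatment value) approaches a
    prescribed lower limit.

    The full rate is allowed while RBV is at least `taper_band`
    percentage points above the limit; at or below the limit, the rate
    drops to the minimum so the vascular compartment can refill.
    """
    margin = rbv_percent - rbv_limit_percent
    scale = max(0.0, min(1.0, margin / taper_band))
    return max(min_rate_ml_h, prescribed_rate_ml_h * scale)

# RBV well above an 85% limit: the full 1000 mL/h prescribed rate.
full = adjust_uf_rate(1000.0, 92.0, 85.0)     # 1000.0
# RBV only 2 points above the limit: rate tapered to 400 mL/h.
tapered = adjust_uf_rate(1000.0, 87.0, 85.0)  # 400.0
# RBV below the limit: fall back to the 100 mL/h minimum.
minimum = adjust_uf_rate(1000.0, 84.0, 85.0)  # 100.0
```

Commercial systems use considerably more sophisticated models, but the essential feedback idea of slowing fluid removal before hypotension develops is the same.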
Dry Weight Monitoring
An important aspect of preventing hypotensive episodes during dialysis is accurate knowledge of the patient's dry weight. Dry weight can be defined as the post-dialysis weight of the patient that is as close as possible to normal hydration status. In dialysis patients, this weight is the lowest post-dialysis weight that the patient is able to tolerate without the development of intra- or interdialytic symptoms. Dry weight is generally established from clinical examination of the patient but is subject to inaccuracy, as clinical symptoms may not be present even with modest fluid overload. To permit a more objective method of dry weight assessment, a number of groups have begun to use bioelectrical impedance techniques.9–11
Moving Beyond Adequate Removal of Urea
Conventional dialysis is principally a diffusive process that favors the removal of low molecular weight uremic compounds such as urea. The attribution of long-term complications of dialysis to the inadequate removal of middle or large molecular weight compounds led to the development of removal methods using convective transmembrane transport: hemofiltration, a technique utilizing convective solute removal, and hemodiafiltration, a technique combining convective and diffusive removal. Both treatments use high-flux membranes and large volumes of ultrapure infusate to replace the ultrafiltrate, or fluid removed. Early variants of the technique used pharmaceutically produced infusion fluids; due to their high cost, only 2 to 5 liters of fluid were used during treatment. The currently used approach to delivering hemodiafiltration therapy, known as "online hemodiafiltration," utilizes higher volumes (>20 liters) of infusate derived from specially modified dialysis machines. In these, a portion of the bicarbonate-buffered dialysis fluid produced is diverted, passed through special filters, and infused into the extracorporeal circuit to replace the fluid removed. This additional treatment of the infusate is important because standard dialysis fluid contains bacteria and endotoxin fragments, the presence of which is known to contribute to microinflammation in the patient. If such fluid were infused directly into the patient, potentially fatal consequences could ensue. To ensure sterility and patient safety, the integrity of the filters is checked prior to each use.12
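The convective contribution to solute removal in these therapies is often approximated, as in standard textbook treatments, by the ultrafiltration rate multiplied by the membrane's sieving coefficient for the solute, with a dilution correction when the infusate is added before the filter (predilution). The Python sketch below illustrates this approximation; the function name and the example figures are assumptions chosen for illustration.

```python
def convective_clearance_ml_min(quf_ml_min, sieving_coefficient,
                                predilution=False,
                                qb_ml_min=300.0, qinf_ml_min=0.0):
    """Approximate convective clearance: ultrafiltration rate times the
    membrane sieving coefficient for the solute. In predilution the
    infusate dilutes the blood entering the filter, reducing effective
    clearance by the factor Qb / (Qb + Qinf).
    """
    clearance = quf_ml_min * sieving_coefficient
    if predilution:
        clearance *= qb_ml_min / (qb_ml_min + qinf_ml_min)
    return clearance

# Postdilution at 100 mL/min ultrafiltration with an assumed sieving
# coefficient of 0.7 for a middle molecule: ~70 mL/min of clearance.
post = convective_clearance_ml_min(100.0, 0.7)
# The same settings in predilution, with 100 mL/min of infusate at a
# 300 mL/min blood flow: clearance falls to 52.5 mL/min.
pre = convective_clearance_ml_min(100.0, 0.7, True, 300.0, 100.0)
```

This is why convective therapies depend on large infusate volumes: the clearance of middle molecules scales directly with the fluid volume exchanged across the membrane.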
Safety Standards and Risks
Hemodialysis requires the availability of large volumes of treated water. Every week, patients are exposed to approximately 400 liters of water in the dialysis fluid that, albeit with the interposition of a semipermeable artificial membrane contained within the hemodialyzer, comes into contact with the bloodstream. Thus, water used in the preparation of dialysis fluid must be treated to ensure that exposure to common water contaminants is minimal. Standards for water quality for use in the preparation of dialysis fluid, such as ANSI/AAMI RD62:2006, Water treatment equipment for hemodialysis applications, have been developed; these not only define the maximum permitted contaminant levels but also link compliance to reimbursement.
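The figure of roughly 400 liters per week follows directly from typical prescriptions. The sketch below (Python) shows the arithmetic for three 4-hour sessions; the 550 mL/min dialysate flow rate is an illustrative assumption within the commonly used range, not a value stated in this article.

```python
def weekly_dialysate_exposure_liters(sessions_per_week=3,
                                     session_hours=4.0,
                                     dialysate_flow_ml_min=550.0):
    """Weekly dialysis fluid exposure: sessions x minutes per session
    x dialysate flow, converted from milliliters to liters."""
    minutes_per_session = session_hours * 60.0
    return (sessions_per_week * minutes_per_session
            * dialysate_flow_ml_min / 1000.0)

exposure = weekly_dialysate_exposure_liters()  # 396.0 liters, ~400 L
```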
Technical, nursing, and medical staff should be aware of the risks to patients arising from inappropriate water quality. Optimal water quality should be ensured by the use of a quality assurance system with documented maintenance and disinfection protocols, the evaluation and audit of chemical and biological contaminant levels, and the establishment of “action limits.”
The early hemodialysis machinery was prone to technical problems, leading to an FDA report published in 1980. With awareness of these issues, patient safety became an important focus in the development and application of internationally recognized safety standards that ensure the patient is not placed at risk in the event of a fault during treatment. Today all equipment used in the delivery and monitoring of dialysis therapy complies with such standards. General safety requirements are covered by ANSI/AAMI ES60601-1:2005, Medical electrical equipment—Part 1: General requirements for basic safety and essential performance, and those specific to dialysis machines are covered by IEC 60601-2-16:1998, Medical electrical equipment, Part 2-16: Particular requirements for the safety of haemodialysis, haemodiafiltration and haemofiltration equipment. All equipment produced by major manufacturers complies with these standards, and compliance is denoted by appropriate certification. Standards continue to evolve, as they cannot foresee the risks associated with new technologies or applications.
More recent information relating to risks and hazards arising during the clinical use of hemodialysis equipment can be found in the Manufacturer and User Facility Device Experience (MAUDE) database of the FDA's Medical Device Reporting (MDR) program (www.fda.gov/cdrh/maude.html). Within the context of ensuring patient safety, the Patient Safety and Quality Improvement Act was introduced in 2005 to provide legal and confidentiality protection for safety information that healthcare providers share, thereby moving toward the creation of a "culture of safety" in hospitals, outpatient clinics, and dialysis facilities to reduce errors and adverse events.
Managing Hemodialysis Machines
Hemodialysis machines represent a complex interface between the patient and technology. Although such machines are highly technical, most hazards in hemodialysis are not caused by machine malfunction; they are related to user errors such as incorrect setting of control values or alarm limits. Key points in ensuring that adverse incidents do not occur include in-depth training of staff, covering the pre-treatment, performance, and post-treatment steps needed to safely perform each therapy mode; detailed instructions for the user to properly clean, disinfect, and maintain the dialysis machine; and proper use of the special functions of the machine, such as ultrafiltration and dialysate concentration profiles or feedback control based on physiological models. To further prevent errors and ensure patient safety, all personnel should use standard operating procedures or checklists rather than relying on memory, report minor technical defects, use the appropriate disposables for the machine, and avoid using the machine for "nonstandard" applications.
Dr. Nathan Levin is medical and research director of the Renal Research Institute, New York, and professor of clinical medicine at the Albert Einstein College of Medicine, New York.
Dr. Nicholas Hoenich is a clinical scientist and member of the School of Clinical Medical Sciences, Newcastle University, United Kingdom.