It is tough to give the last talk on the future with so many excellent presentations. Let me give you a little bit different sense of what may be defining future developments in the practice of pathology.

Obviously, Siemens (Malvern, Pa) and General Electric (Fairfield, Conn) were invited to speak today because collectively we have recently spent, between the General Electric purchase of Amersham and our purchases of Diagnostics Products Corporation, Bayer Diagnostics, and Dade Behring, $20 billion in this space (as of August 1, 2007, when the planned Siemens acquisition of Dade Behring had been announced). Why are we investing in in vitro diagnostics? There have been discussions about whether pathology is merging into radiology or radiology into pathology. I think the drivers are probably not that but, in part, some basic world views. First, it appears that there are reasonable returns on capital in diagnostics, compared with other worldwide investment opportunities. There is obviously the aging of the population. There are the advances in science, and then finally, there is a “hedging your bet” strategy, because at the end of the day, both imaging and laboratory diagnostic strategies compete in the same space. To the extent that you have more of the diagnostic waterfront covered, you have a better shot at having the diagnostic tools that actually make the diagnoses.

These are some top-level considerations that come into imaging companies investing in laboratory companies. I also think there are some interesting lessons from radiology for pathology and we will get to some of these similarities, especially in information technology.

This overall diagnostic vision is driven by what Elias A. Zerhouni, MD, the director of the National Institutes of Health, has described as the future of medicine, and that is that medicine will be predictive, personalized, and preemptive. Let me focus on the personalization part. In medicine today, we treat many illnesses pretty much the same way for everybody. For example, hypertension is pretty much treated the same way; congestive heart failure is treated the same way. We have a lot of one-size-fits-all medicine. In contrast, personalization is an integral aspect of the overall American economy, and so it is not surprising that medicine might also become personalized.

Is there a medical equivalent of the commercial concept of mass customization? How about Burger King's “have it your way” or Dell's business built on dynamically custom-configuring computers? Look back to automotive pioneer Alfred Sloan, who built General Motors into the company that successfully competed with Henry Ford. Henry Ford said the customer could have any color car as long as it was black, but it was Alfred Sloan who offered customers Buicks, Chevrolets, Oldsmobiles, Pontiacs, and lots of variety, making General Motors the leading auto manufacturer. Mass customization will come to medicine, and we are interested in being part of that.

We discuss molecular-based medicine as being the future, but some of this is already here. There was an interesting article recently in JAMA looking at trends in serum cholesterol.1 The JAMA article noted the drop in low-density lipoprotein cholesterol during the last 10 to 15 years, and when you tease out those numbers, what is clear is that the molecular-based medicine leading to insights into how 3-hydroxy-3-methylglutaryl coenzyme A reductase catalyzes a rate-limiting step in cholesterol synthesis has brought us to the point at which today we see lower lipid levels and a significant reduction in coronary artery disease. Obviously, there are a number of things in addition to statins involved in this decline, but it is worth noting that molecular medicine has started.

The physicist Niels Bohr noted “prediction is hard, especially of the future.” However, it is worth pondering that there was 1 prediction (made just before the original 1966 Star Trek science fiction TV show highlighting the distant future) that turned out to be blisteringly on target. Gordon Moore, one of the founders of Intel, made a prediction in the April 1965 issue of Electronics magazine that, based on the number of transistors one could pack into an integrated circuit, computer performance was going to double roughly every 18 months all the way out to 1975.2 It turns out that this prediction of doubling has held with remarkable accuracy for the last 40 years, and folks at Intel believe they have at least another 10 years of this performance. All of the technologies you have seen at this conference come from the computing growth described by Moore's rough equation.
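The compounding in Moore's rough equation can be sketched in a few lines; the 18-month doubling period is the figure quoted above, and the function names are illustrative.

```python
# Illustrative sketch of compound doubling under Moore's Law.
# Assumes the roughly 18-month doubling period quoted in the text.

def doublings(years, period_years=1.5):
    """Number of doublings that fit into a span of years."""
    return years / period_years

def growth_factor(years, period_years=1.5):
    """Overall performance multiple after the given number of years."""
    return 2 ** doublings(years, period_years)

# 40 years of doubling every 18 months is about 27 doublings:
factor = growth_factor(40)
print(f"{factor:.3g}")  # on the order of 1e8, i.e. ~100-million-fold growth
```

Roughly a hundred-million-fold improvement over 40 years is why imaging, genomics, and the other technologies discussed here became practical when they did.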

Modern imaging is entirely a Moore's Law play. If you look at genomics, that is essentially a Moore's Law play. Do not forget that Watson and Crick worked out the structure of DNA using wet chemistry and x-ray crystallography in 1953, but it has only been in recent years that we have had the computer power to whiz through DNA sequences.3

Magnetic resonance imaging, originally called nuclear magnetic resonance, has been around even longer. Felix Bloch and Edward Purcell discovered nuclear magnetic resonance in the 1940s. It is interesting that when computerized nuclear magnetic resonance entered clinical practice around 1984, radiologists immediately took the word “nuclear” out of the name to avoid patient concern about things nuclear.

Using the insight of Moore's Law, we can look at modern imaging equipment as computer cameras on a network. Today's genomics and proteomics worlds come from the computerization of analytical tools. Dr Jose Costa gave a great presentation on systems biology and I will not venture into that space, but clearly as we move from today's early discoveries with DNA microarrays, we will develop increasingly refined models of what we are testing for using this computer power.

It is worth noting that DNA expression studies and microarray results are highly dependent on the exact cell and tissue type you prepare them from. With an eye on turf wars with other medical specialties, anatomic pathology will remain the ground truth for selecting where microarray data should come from. Tissue localization has to come from a pathologist.

As we get to very high-dimensional data sets such as microarrays, one can see that one does not need, for example, the entire sequencing of the human genome; one only needs small parts of the data set to make powerful predictions.
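The point that a small part of a high-dimensional data set can carry most of the predictive signal can be sketched with a simple univariate feature-ranking pass. The data, feature scoring rule, and names here are all hypothetical, chosen only to illustrate the idea.

```python
# Hypothetical sketch: in high-dimensional data (e.g., thousands of
# microarray probes), a small, well-chosen subset of features often
# suffices for prediction. Toy data and scoring rule for illustration.

def rank_features(samples, labels):
    """Rank feature indices by absolute difference of class means."""
    n_features = len(samples[0])
    scores = []
    for j in range(n_features):
        pos = [s[j] for s, y in zip(samples, labels) if y == 1]
        neg = [s[j] for s, y in zip(samples, labels) if y == 0]
        scores.append((abs(sum(pos) / len(pos) - sum(neg) / len(neg)), j))
    return [j for _, j in sorted(scores, reverse=True)]

# Toy data: only feature 2 separates the two classes.
samples = [[0.1, 0.5, 5.0], [0.2, 0.4, 4.8], [0.1, 0.5, 0.2], [0.2, 0.6, 0.1]]
labels = [1, 1, 0, 0]
top = rank_features(samples, labels)[:1]
print(top)  # feature 2 ranks first
```

Real microarray analysis uses far more careful statistics, but the principle is the same: a handful of informative features, not the whole genome, drives the prediction.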

A central business implication of this rich albeit partial information is that our entire concept of health insurance may go away. To the extent health insurance plans pay for health care, insurance relies, as a basic principle, on the unpredictability of risk. However, at some point, if your predictive powers become so great that you can predict much of future health, the concept of randomness goes away. Insurance will not work as a business if the only folks who buy it are the ones who know they will get sick. In the insurance industry, this principle is known as “adverse selection.”
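The arithmetic behind adverse selection fits in a few lines; the dollar figures and probabilities below are purely illustrative.

```python
# Toy arithmetic (illustrative numbers only) for adverse selection:
# a premium priced for the average population fails once only the
# high-risk buyers remain in the pool.

COST_IF_SICK = 10_000  # hypothetical cost of care for one sick member

def fair_premium(p_sick):
    """Break-even premium for a pool with sickness probability p_sick."""
    return p_sick * COST_IF_SICK

population_premium = fair_premium(0.10)    # priced for everyone: $1,000
# Perfect prediction splits the pool; only those who know they will get
# sick (p = 1.0) still want to buy, so the break-even premium equals the
# full cost of care and the product is no longer insurance at all:
informed_pool_premium = fair_premium(1.0)  # $10,000
print(population_premium, informed_pool_premium)
```

Once the premium equals the cost of care, risk pooling has collapsed, which is exactly why highly predictive diagnostics strain the insurance model.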

What influence the politics of highly predictable patient-specific health status will have on how we finance future health care is also unknown. However, the extent of information provided by our new diagnostics will definitely have economic implications even if we cannot predict them today.

Let us take the evolution of radiology and look for lessons for pathology. Radiology started in 1895 when Wilhelm Conrad Roentgen took the first x-ray, that of his wife's hand. Interestingly, he purchased his x-ray tubes from Siemens.

If you look today, 100 years later, at an image of a hand on digital x-ray, there is not that much difference. I understand some fundamental technologies in pathology such as hematoxylin-eosin stains are even older. However, today's multislice computerized tomography (CT) reconstruction of the arteries of the hand provides quite a different view of the hand, one never before seen noninvasively.

As with all microarray-driven “-omics,” imaging now is an incredibly high-dimensional data space. How high-dimensional data spaces are handled is fascinating. When you have 3000 “slices” on a body CT scan, you simply cannot read them on a view box or a mechanical film alternator. This type of high-volume data demands imaging tools. Anatomic pathology handles similar volumes of data if you digitize specimen slides, so you can expect picture archiving and communication systems (PACS) to penetrate pathology in time.
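A back-of-the-envelope calculation shows why these data volumes rule out view boxes. The slice and slide dimensions below are typical assumed figures, not vendor specifications.

```python
# Back-of-the-envelope sketch (assumed sizes, not vendor figures) of why
# high-volume imaging data demands software tools rather than view boxes.

SLICE_BYTES = 512 * 512 * 2      # one CT slice: 512 x 512 pixels, 16-bit
ct_study = 3000 * SLICE_BYTES    # the 3000-slice body CT mentioned above

# A digitized whole-slide image at high magnification can reach roughly
# 100,000 x 100,000 RGB pixels:
slide_pixels = 100_000 * 100_000
slide_bytes = slide_pixels * 3   # uncompressed 24-bit color

print(f"CT study:  {ct_study / 1e9:.1f} GB")     # about 1.6 GB
print(f"WSI slide: {slide_bytes / 1e9:.1f} GB")  # about 30 GB uncompressed
```

A single digitized slide can dwarf an entire CT study, which is why PACS-style infrastructure is the natural next step for pathology.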

Another potential similarity between imaging and pathology is highlighted by the number of ways to diagnose diseases such as coronary artery disease. Do you want to study a patient's heart with an ultrasound stress test or with a nuclear stress test or with a 64 multislice CT or with our latest dual-source CT scanner, or do you use a treadmill?

Today we have software that integrates these separate diagnostic tools into a cardiology workstation, so the cardiologists in a practice, no matter what their background, no matter what their modality, whether invasive (cardiac cath, electrophysiology) or noninvasive (echo, nuclear), have a single shared work environment. I do not believe this exists in pathology today, but I think you are going to see more of these integrated work spaces in the future.

Images have to be heavily manipulated. The Figure shows a dual-source CT scanner image of 2 coronary artery stents. On the right side of the Figure is the data set viewed using volume rendering computer graphics techniques. On the left side of the Figure you have 2 multiplanar reconstructions of the same underlying data set. These multiplanar reconstruction views are colloquially referred to as “spaghetti views” of the coronary artery.

Dual-source (2 perpendicular computerized tomography [CT] beams) cardiac CT image of a heart with 2 coronary artery stents; multiplanar images on the left, volume-rendered image on the right. By using 2 CT beams within 1 gantry, image acquisition speed is increased to the point at which the heart rate does not need to be slowed by β-blocker medicines, thereby decreasing total radiation dose and increasing image resolution.


Image postprocessing in these 2 images is a way to manage high-dimensional data. Similarly, as we look at DNA microarrays, we are not going to be looking at them as thousands of little bright lights in the night sky; we are going to read them with integrated image processing tools that help us interpret the data.

We will see more computer-aided diagnosis as a byproduct of the digitization of data. Another appeal of an impending merger of pathology and radiology is that in vivo and in vitro techniques are converging in scale. Imaging has always provided gross anatomic views, but as you get into magnetic resonance imaging and nuclear medicine, you have the ability to look at cellular-scale function directly. This has paved the way for merging of technologies, such as positron emission tomography and CT, into single machines. Over time, there is hope that markers used today on an in vitro basis will be usable in vivo in combination with imaging.

Today many positron emission tomography CT scans are ordered looking for metastasis of cancer cells. The positron emission tomography component has the sensitivity to pick up very small numbers of malignant cells while the CT component allows precise localization of these cells. Findings of even a few distant cancer cells allow patients to avoid noncurative surgery. One upcoming example of fusion imaging is positron emission tomography magnetic resonance, which will offer new ways to precisely localize cells active in thought and other neurologic functions.

Imaging algorithms, such as the back projection algorithm for CT or Fourier transforms for magnetic resonance, are mathematically precise. However, most of our clinical world is wildly imprecise. The clinical world is a world of habit and daily activities. There are no algorithms for this. Even in the soft underbelly of clinical workflow, we see new ways to use computing.
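The back projection idea mentioned above can be sketched in miniature. Real CT reconstruction uses filtered back projection over many angles; this toy version, with made-up data and just two projections (0° and 90°), only illustrates the core "smear the projections back and sum" step.

```python
# Minimal sketch of (unfiltered) back projection, the idea behind CT
# reconstruction. Real scanners filter and use many angles; this toy
# uses just two projections, horizontal and vertical.

def project_rows(image):
    """Sum along rows: the projection seen by a horizontal beam."""
    return [sum(row) for row in image]

def project_cols(image):
    """Sum along columns: the projection seen by a vertical beam."""
    return [sum(col) for col in zip(*image)]

def back_project(row_proj, col_proj):
    """Smear each projection back across the grid and sum."""
    n = len(row_proj)
    return [[(row_proj[i] + col_proj[j]) / n for j in range(n)]
            for i in range(n)]

# A single bright point in an otherwise empty 4x4 field:
image = [[0] * 4 for _ in range(4)]
image[1][2] = 8
recon = back_project(project_rows(image), project_cols(image))
# The reconstruction peaks at (1, 2), where the two projections intersect.
```

The mathematical precision of this step is exactly the contrast being drawn: reconstruction is exact algorithmic machinery, while the clinical workflow around it is not.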

Hospital bed turnaround is a canonical clinical workflow and although not a big issue for pathology, it can illustrate how something as simple as discharging a patient and getting somebody new into that bed has lots of dependencies and human handoffs. Our current hospital information technology systems are primitive when it comes to handling complex organizational behaviors. Historically most clinical hospital information technology computing has emphasized 2 things. First, current systems serve as stores for data that you put in and pull out following a paradigm of making the paper chart electronic. They also serve as simple order entry systems with what can be characterized as simple 1-way messaging.

Today, we can expand the limits of medical computing with software known as workflow engines in which you can start modeling clinical processes, put timers on each process, put in exception handling, and if something does not happen the way it should, you can drive backup escalations of those processes. You can visualize these systems as software that takes flowcharts you might draw with Microsoft Visio and that then actually automates the steps on those flowcharts. Today we are seeing early automation of workflow with these workflow engines.
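The workflow-engine idea above can be sketched as steps with deadlines and an escalation handler. All step names, timings, and the bed-turnaround example are hypothetical, modeled on the flowchart-as-data description in the text.

```python
# Illustrative sketch of a workflow engine: model a clinical process as
# steps with deadlines, and escalate any step that overruns. The step
# names and timings are hypothetical.

from dataclasses import dataclass

@dataclass
class Step:
    name: str
    deadline_min: int        # allowed minutes before escalation
    elapsed_min: int = 0     # minutes actually taken so far
    escalated: bool = False

def run_workflow(steps, escalate):
    """Check each step; fire the escalation handler on any overrun."""
    for step in steps:
        if step.elapsed_min > step.deadline_min:
            step.escalated = True
            escalate(step)

# Bed-turnaround flowchart as data, with one step running late:
steps = [
    Step("discharge order written", 30, elapsed_min=20),
    Step("room cleaned",            45, elapsed_min=90),  # overdue
    Step("bed assigned",            15, elapsed_min=10),
]
alerts = []
run_workflow(steps, escalate=lambda s: alerts.append(s.name))
print(alerts)  # ['room cleaned']
```

Commercial workflow engines add persistence, timers that fire in real time, and multi-level escalation chains, but the core pattern is this: the flowchart becomes data, and the software enforces it.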

We have discussed a lot about technology at this conference, but I think we should keep in mind that the economics of medicine are about to change radically. American medicine is probably going to be reinserted into a free market economy in which it has not been since the start of Medicare in 1965. I think we are going to be under new price and performance pressures. If you want to get a flavor of what such a world might look like, read Michael Porter's new book Redefining Health Care.4 He is a well-known Harvard Business School professor with several best-selling books on competition. He is now focusing on health care.5

I would conclude by speculating that advances in computing will drive a merger of in vitro and in vivo pathology and imaging at the molecular level. Tools for high-dimensional data analysis are going to be integrated and their growth will continue to be driven by Moore's Law. Health care information technology will evolve to help us handle labor-intensive clinical workflow environments as well. Personalization of health care will be powered by this rich computer software and hardware.

1. Carroll MD, Lacher DA, Sorlie PD, et al. Trends in serum lipids and lipoproteins of adults, 1960–2002. JAMA. 2005;294:1773–1781.
2. Moore GE. Cramming more components onto integrated circuits. Electronics. 1965;38(8).
3. Watson JD, Crick FH. Molecular structure of nucleic acids; a structure for deoxyribose nucleic acid. Nature. 1953;171:737–738.
4. Porter ME, Teisberg EO. Redefining Health Care: Creating Value-Based Competition on Results. Boston, Mass: Harvard Business School Press; 2006.
5. Porter ME, Teisberg EO. How physicians can change the future of health care. JAMA. 2007;297:1103–1111.

Dr Rucker is an employee of Siemens Medical Solutions USA.

Presented at the College of American Pathologists Futurescape of Pathology Conference, Rosemont, Ill, June 9 and 10, 2007.

Author notes

Reprints: Donald W. Rucker, MD, Siemens Medical Solutions USA, Inc, 51 Valley Stream Pkwy, Malvern, PA 19355-1406 (donald.rucker@siemens.com)