When considering the fields of medical devices and medical electronics, it is relatively easy to look back over 100 years and identify those developments that had a major overall impact on individuals and society in general. It is more difficult to do the same for developments during the present time. We can reasonably assess current activities and achievements, but determining their overall impact is a bit more difficult. Assessing future developments and their impact is truly challenging. Further, if we look at the developments of the past and imagine ourselves with this challenge 100 years ago, we know that none of us could have predicted the advances in medical devices and medical electronics that we enjoy today. So why are we taking the challenge of looking to the future? I (M. R. Neuman) asked myself this question many times after I was invited to prepare this document. I finally decided it was well beyond my capabilities, and rather than accepting the blame for incorrect predictions of the future alone, I would invite some friends to join me in this speculation and share the crystal ball with them.
Let us start with the easy part. One hundred years ago, very little in the field of medical devices existed, and medical electronics did not exist at all. Three devices did exist, which not only impacted the practice of medicine at the time, but still represent the basis of important technology in the practice of medicine today. The first is the stethoscope, which was invented in its original form in 1816 by the French physician René-Théophile-Hyacinthe Laennec. This device was a hollow wooden tube with a funnel-shaped opening at one end that was placed against the patient's skin, with the other end pressed against the physician's ear (Fig. 1). It replaced the physician having to place his (just about all physicians were male at that time) ear on the patient's chest to hear faint heart sounds, which was awkward and also stressful for modest female patients. Today, we accept the modern stethoscope as a symbol of medicine, as well as an established and widely used diagnostic device. Although it looks different from what was invented nearly 200 years ago, little has changed in the basic principle, fundamental structure, and application of this medical device over that time.
The second device that was already in existence 100 years ago, although not for long, was the original medical X-ray imaging device. X-rays were discovered by Wilhelm Conrad Röntgen in 1895 and shown to penetrate the human body, revealing skeletal structure. Within a year this development was applied in clinical medicine, and still is the basis of much of medical imaging today. The imaging machines look quite different, but the principle is essentially the same as it was in Röntgen's time. For his discovery and description of X-rays, Röntgen received the first Nobel Prize in physics in 1901.
The third existing medical device 100 years ago was the electrocardiograph, a rather complex device for obtaining and displaying a signal referred to as the electrocardiogram (EKG). Although previous investigators had described the underlying biophysical principle, an electrical signal on the body surface that was related to activity of the heart, Willem Einthoven developed a practical device using a string galvanometer to record this signal and to study it on a group of human subjects. His early device was commercialized (Fig. 2) and used for clinical studies and diagnosis just about the time the first PROCEEDINGS OF THE IEEE issue was being published. As was the case with Röntgen, Einthoven was honored with a Nobel Prize for his discovery in 1924.
Keeping these existing advancements in mind, what other medical devices and electronic systems have had major impact on our society, ushering in significant improvements in medical care since the first pages of the PROCEEDINGS appeared? We will examine these from the perspectives of diagnostic, therapeutic, and rehabilitative devices. Although many different technologies and their underlying sciences were involved in the development of medical devices, this paper will focus on those that are related to electrical engineering and the present field of biomedical engineering, since one of the latter's roots was electrical engineering. The IEEE and its predecessor, Institute of Radio Engineers (IRE), played a significant role in the development of this field over the lifetime of the PROCEEDINGS.
Although the three technologies described above were established and used more than 100 years ago, medicine at this time was more of an art than a science. A few devices were beginning to be used in the practice of medicine to obtain quantitative data, but for the most part medicine was a qualitative art. While there were established hospitals, they were not yet seen as places to treat and cure disease to the extent that they are today. Times were changing, however, and new hospitals were being established and recognized as the appropriate institutions in which to care for seriously ill patients, even though much of medicine was still practiced in the patient's home. Nevertheless, looking back over this century, many important devices, concepts, and institutions were established that led to the modern medical techniques and devices that we appreciate today.
One of the best examples of the practice of the art of medicine is the previously mentioned stethoscope. This early medical instrument provided the clinician with information in the form of sound waves that were moderately amplified by the design of the device. These sound waves, however, still had to be interpreted by the clinician, who had to recognize not only the major sounds but the intricate nuances in timing and the low-amplitude interstitial sounds between the sound bursts. Thus, the art of using a stethoscope involved being able to interpret the sounds in terms of timing, tone, and intensity. This sounds similar to what a musician does, doesn't it? Indeed, a physician learns to interpret the sounds in a stethoscope in much the same way that a musician interprets the nuances in music that separate a spectacular performance from an ordinary one. Again, like the musician, the clinician must learn to recognize significant sounds by listening to many examples and learning from many patients who have clinical conditions that produce the characteristic sounds.
This is quite different from the approach taken by an electrical engineer, who would try to quantify the art and create a carefully organized, systematic approach to interpreting what the clinician hears. This process would help to make the interpretation of the sounds more objective, but it might lose some of the unspoken aspects of the art of listening to heart sounds, just as it would if applied to music. The engineer's approach was to quantify and record the sounds and graphically display them on a chart recording or an oscilloscope screen. This was known as the phonocardiogram. The clinician could then see the temporal relationships between the various components of the sounds and identify sounds known as “murmurs” that could be characteristic of diseases of the heart valves or other parts of the heart. By looking at a plot of the sounds, the engineer and the physician could measure the amplitudes of various components of the phonocardiogram as well as their temporal relationships. This also allowed physicians to quantify differences in heart sounds between patient encounters and to better understand disease progression. Further advances in this technology allowed a so-called “voiceprint” of the sounds to be plotted, giving a 3-D plot of the sounds as a function of time, amplitude, and frequency components. This further documented and quantified what the physician heard.
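The “voiceprint” described above is what signal-processing engineers now call a spectrogram: the signal is cut into short frames, each frame is windowed and transformed, and the magnitudes are laid out over both time and frequency. The following is a minimal, illustrative sketch, with a synthetic tone burst standing in for a heart sound; it is not the algorithm of any actual phonocardiograph.

```python
import cmath
import math

def spectrogram(signal, frame_len=64, hop=32):
    """Short-time Fourier magnitudes: one row per time frame, one column per frequency bin.

    A toy "voiceprint": each frame of the sound is windowed and transformed,
    giving amplitude as a function of both time and frequency.
    """
    window = [0.5 - 0.5 * math.cos(2 * math.pi * n / (frame_len - 1))
              for n in range(frame_len)]  # Hann window to reduce spectral leakage
    frames = []
    for start in range(0, len(signal) - frame_len + 1, hop):
        frame = [signal[start + n] * window[n] for n in range(frame_len)]
        # Magnitude of the DFT for the lower half of the bins
        mags = []
        for k in range(frame_len // 2):
            s = sum(frame[n] * cmath.exp(-2j * math.pi * k * n / frame_len)
                    for n in range(frame_len))
            mags.append(abs(s))
        frames.append(mags)
    return frames

# Synthetic "heart sound": a tone burst at 5 cycles per 64 samples, surrounded by silence
signal = ([0.0] * 128
          + [math.sin(2 * math.pi * 5 * n / 64) for n in range(128)]
          + [0.0] * 128)
spec = spectrogram(signal)
# The frame covering the middle of the burst should peak at frequency bin 5
mid = spec[len(spec) // 2]
print(max(range(len(mid)), key=mid.__getitem__))  # → 5
```

The plot of such a matrix, time on one axis, frequency on the other, magnitude as intensity, is exactly the 3-D “voiceprint” the text describes.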
Why do we concentrate on this technology in this paper when phonocardiography is not one of the major high-impact medical devices developed over the last 100 years? It is included here because it illustrates the approach that engineers have taken in assisting clinicians to make quantitative patient measurements. Engineers take a primarily qualitative measurement, such as the physician listening to the patient's heart through the stethoscope, and attempt to understand the measurement technique and the processes that the clinician uses to interpret the resulting information. They can then design an instrument that replicates what the clinician does and provides more consistent data that can be quantitatively assessed to assist in making the diagnosis. One can take this a step further and even develop computer algorithms to interpret the sounds and suggest diagnoses. Model 3200 of the ubiquitous 3M Littmann electronic stethoscope even includes sound analysis software that is based on wavelets. Thus, in our opinion, this quantitative approach to diagnostic medical instrumentation is one of the major contributions of electrical, biomedical, and other engineers, as well as computer scientists, to developing medical electronic devices over the last 100 years.
A major advancement in physiology over the last 100 years was the understanding of electrical fields and currents associated with mammalian cells, organs, and organisms. Although this work was important in understanding the basic physiology of electrically active cells such as nerve and muscle cells, it also led to the development of electrophysiological instrumentation that is used in the diagnosis of disease. Fig. 3 illustrates some of the signals that can be obtained from the surface of the body that are related to the functioning cells within the body. The most widely used of these is the EKG, which has already been discussed above. Today's modern electrocardiograph is much simpler than Einthoven's original string galvanometer. It provides recordings of the electrical activity of the heart as seen from different directions in the body and allows cardiologists to better understand heart rate, heart rhythm (the temporal relationship of a series of heartbeats, in terms of rate and morphology), injury currents secondary to damage to the heart muscle, the effects of drugs on the heart, and the size and position of the heart's chambers. This made it possible to quickly diagnose and monitor rhythm disorders of the heart, some of which are life-threatening, to diagnose injury to the heart muscle such as occurs during and following a heart attack, or to identify conditions in which the size of the heart changes, such as heart failure, a condition that affects many older individuals.
Two general types of devices can be used for recording and displaying the EKG. The standard electrocardiograph used in physicians' offices, clinics, hospitals, and emergency transport vehicles generally records 1, 3, 5, or 12 leads (electrode combinations that look at the EKG from different directions), with the 12-lead recording, typically made over a short period of time, being the most common. This device provides an opportunity to diagnose ongoing cardiac conditions. The second type of medical electronic device for observing the EKG is the cardiac monitor. This is used in critical care situations where the patient has the possibility of encountering a rhythm disorder that, if not quickly addressed, could lead to significant morbidity or death. In this case, the instrument displays the EKG in real time, and it generally has electronic features that identify life-threatening arrhythmias and sound an alarm so that immediate therapeutic measures can be taken.
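The heart-rate portion of a cardiac monitor's alarm logic can be suggested with a toy sketch: detect R waves, convert the intervals between them into a rate, and alarm when the rate falls outside preset bounds. Everything below (the simple threshold detector, the 40–150 beats/min limits, the synthetic strip) is a hypothetical illustration under stated assumptions, not the algorithm of any real monitor.

```python
def detect_r_peaks(ekg, threshold=0.6):
    """Indices of local maxima above a threshold: a crude R-wave detector."""
    peaks = []
    for i in range(1, len(ekg) - 1):
        if ekg[i] > threshold and ekg[i] >= ekg[i - 1] and ekg[i] > ekg[i + 1]:
            peaks.append(i)
    return peaks

def heart_rate_alarm(ekg, fs, low_bpm=40, high_bpm=150):
    """Return (mean_bpm, alarm) from a strip of EKG samples taken at fs Hz."""
    peaks = detect_r_peaks(ekg)
    if len(peaks) < 2:
        return None, True  # cannot establish a rhythm: treat as an alarm condition
    rr = [(b - a) / fs for a, b in zip(peaks, peaks[1:])]  # RR intervals, seconds
    bpm = 60.0 / (sum(rr) / len(rr))
    return bpm, not (low_bpm <= bpm <= high_bpm)

# Synthetic strip: impulse-like "R waves" every 200 samples at fs = 250 Hz (~75 bpm)
fs = 250
ekg = [1.0 if i % 200 == 100 else 0.0 for i in range(2000)]
bpm, alarm = heart_rate_alarm(ekg, fs)
print(round(bpm), alarm)  # → 75 False
```

Real monitors add morphology analysis, noise rejection, and detection of specific arrhythmias on top of this basic rate check.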
Another advancement in medical electronic devices in recent years has been the incorporation of computing hardware and software in the device to assist the clinician in the diagnosis associated with the variable(s) being measured. In the case of the electrocardiograph, this involved a preliminary, or more correctly, suggested diagnosis based on the recorded EKG. It was originally difficult to develop such a computer program, because clinicians often could not agree on the specific diagnosis from electrocardiographic recordings. Although it was relatively straightforward to match a computer's interpretation to that of a single clinician, agreement with a second clinician who held a slightly different diagnosis was not as good. It became clear to the people working in this area that they first had to agree on standards for interpreting the EKG, which made programming a computer to carry out this process a much simpler activity. Today, most clinical electrocardiographs have this capability.
Thus, we see that understanding and applying the EKG has yielded several medical electronic devices over the last 100 years and has played an important role in diagnosis and clinical monitoring of patients with heart disease. The use of the cardiac monitor in critical care medicine has made it possible to identify life-threatening arrhythmias as they occur and to quickly provide therapeutic measures to abort the rhythm disturbance, minimizing the risk of permanent damage to the heart or death.
Physiologists have known for a long time that cells produce electric potentials and fields, and externally applied electric fields and currents can influence the activity of these cells. Thus, another major advancement in biomedical electronic devices was the development of practical functional electrical stimulation of cells, nerves, and muscles. This advancement was best exemplified halfway through the last century by the development of the cardiac pacemaker. It is an electronic device that can be used to treat certain arrhythmias of the heart that prevent it from beating regularly. It provides a brief electric impulse to the heart muscle that causes the cells to contract, eventually producing a ventricular contraction that sends blood to the rest of the body. Early cardiac pacemakers were external devices that were connected to electrodes on the heart through the skin, and they had controls that could adjust the pulses to give amplitudes and durations appropriate for the individual patient. Through the work of the late Wilson Greatbatch, pacemakers that could be totally implanted within the body were developed and later commercialized. Product improvements over the years have led to the pocket-watch-size pacemakers that are used today. Fig. 4 illustrates an early pacemaker and a more current model. Through the use of cardiac pacemakers, millions of patients throughout the world have been able to live a high quality of life well beyond what would have been possible without this device.
Cardiac pacemakers, however, were not the only application of functional electrical stimulation to improve lives. Muscle stimulators were developed in the early 1960s to treat patients who had a condition known as drop foot following a stroke, whereby they had difficulty dorsiflexing (lifting the front of the foot) during the swing phase of walking. The newly developed electrical stimulator was controlled by a switch in the patient's shoe that caused the device to stimulate the muscles of dorsiflexion when the patient's foot was lifted from the ground at the beginning of the swing phase. Early devices were externally worn and were not widely accepted by patients until many years later, when totally implantable devices were developed.
Functional electrical stimulation has been applied to other skeletal muscles for treating paralyzed patients. A major effort over many years at Case Western Reserve University (Cleveland, OH) resulted in devices that can be used to stimulate the nerves leading to paralyzed muscles in patients with spinal cord injuries, to help these patients regain some function of otherwise nonfunctional limbs. This work led to quadriplegic patients gaining some hand function to carry out normal activities of living such as feeding themselves, operating electronic devices, and taking care of some personal hygiene. Work with lower extremity functional electrical stimulation in Slovenia as well as the United States helped paraplegic patients be able to stand up on their own and transfer from a wheelchair to a chair or a bed and to even walk and climb stairs. Clearly this work led to an improvement in the quality of life for these patients. Commercial products for hand control have been marketed, and it is reasonable to expect similar products for upper and lower extremities, which are completely implantable, in the future.
Functional electrical stimulation has also helped patients regain aural and visual sensation through development of the cochlear prosthesis and the visual prosthesis, the latter being currently under development. The former device helps profoundly hearing-impaired individuals to regain some hearing through electrical stimulation of multiple sites along the cochlea in the inner ear. Devices are now commercially available and are routinely implanted. The visual prosthesis, on the other hand, is still in the research and development phase, although one device is close to being marketed. These devices function by taking an image, sensed by a miniature digital camera, and using it as the basis for a pattern of electrical stimulation delivered by a miniature array of electrodes implanted over the retina of the eye, creating a rudimentary image on the retinal cells. Although this technology is still maturing, it has the potential to help visually impaired individuals to navigate their world and to live lives that are closer to what visually intact individuals can do.
More recently, electrical stimulation of the brain has been used to treat movement disorders such as Parkinson's disease or tremors that cannot be controlled by medication. The technique has also been used with patients with previously untreatable mental illnesses. This method of treatment offers many opportunities for patients to regain normal or nearly normal movement and mental function and be able to return to a higher quality of life.
Another area of medicine with major advances over the last 100 years is medical imaging. Even though, as mentioned above, the application of X-rays in medical diagnosis existed shortly before the PROCEEDINGS began publication, many improvements of this new technology and its application to medicine were made during the PROCEEDINGS' lifetime. The advancements are too numerous to describe in full here, but two are most notable: the development of the computed tomographic (CT) scanner and magnetic resonance imaging (MRI). Both of these developments led to Nobel Prizes for their developers, and both are mainstays of clinical practice today. The introduction of computers made various image reconstruction algorithms possible, allowing tomographic images to be produced from X-ray absorption measurements through the body in many different directions. New, faster, spiral scanning techniques have been introduced, making X-ray imaging the technique of first choice in many clinical situations. The resolution and clarity of these images far surpassed what could be done with planar X-rays.

Even though images were obtained from the transverse axis of the body, modern computer technology made it possible to reconstruct images from other planes of the body, and even 3-D renderings of internal structures became possible. Physicians were now able to look within the body noninvasively for diagnostic, and in some cases even therapeutic, purposes, and this greatly advanced the clinicians' tools for diagnosis and treatment. The uses of CT and MRI scans were also complementary in that their images emphasized different anatomical structures. CT is used for tissues with great differences in X-ray absorption, such as bone and various soft tissues, while MRI emphasizes soft tissue and shows contrast between tissues of differing water content.
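The reconstruction idea behind CT, each image pixel accumulating the contributions of every X-ray path that passed through it, can be shown with a deliberately tiny sketch. A real scanner uses filtered back-projection or iterative methods over hundreds of angles; the two-angle, unfiltered toy below only serves to illustrate the principle on a synthetic phantom.

```python
def back_project(row_sums, col_sums):
    """Unfiltered back-projection from two orthogonal parallel-beam projections.

    Each pixel receives the sum of every projection ray that passed through it.
    Two angles only suffice to locate a single bright spot, which is enough
    to demonstrate the idea; real CT combines many angles plus filtering.
    """
    n = len(row_sums)
    return [[row_sums[r] + col_sums[c] for c in range(n)] for r in range(n)]

# A 5x5 "phantom" with one dense voxel at row 3, column 1
n = 5
phantom = [[0.0] * n for _ in range(n)]
phantom[3][1] = 1.0

# Forward projections: absorption summed along rows (0 degrees) and columns (90 degrees)
row_sums = [sum(row) for row in phantom]
col_sums = [sum(phantom[r][c] for r in range(n)) for c in range(n)]

image = back_project(row_sums, col_sums)
# The brightest reconstructed pixel coincides with the dense voxel
peak = max(((r, c) for r in range(n) for c in range(n)),
           key=lambda rc: image[rc[0]][rc[1]])
print(peak)  # → (3, 1)
```

The streaks that the two rays leave across the reconstructed image are exactly the artifact that filtering and additional angles remove in clinical scanners.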
Today, MRI has in many countries become the primary technique used throughout the body in the routine diagnosis of many disease processes, replacing and sometimes surpassing CT. MRI has particular advantages in that it is noninvasive, uses nonionizing radiation, and has a high soft-tissue resolution and discrimination. It may also provide both morphological and functional information. The resultant MRI image is based on multiple tissue parameters, any of which can modify tissue contrast. MRI technology not only allows us to view the internal structure of the body, but also the relatively new technique of functional MRI allows us to examine metabolic function as well as tissue structure. Thus, for the first time, we are able to see what parts of the brain are active when a subject is performing a specific task such as moving a limb or engrossed in specific thoughts such as performing a mathematical calculation. Clearly these technologies have made a major contribution to our ability to look inside the body to better understand physiological function, diagnose disease, and assess the effectiveness of treatment.
Other imaging technologies have also contributed to our ability to look inside the body and understand its structure and function. One that is notable for its contributions to dynamic observations of soft tissues in the body is ultrasonic imaging. Pulse-echo high-frequency sound was introduced in the 1960s, first to detect midline shift caused by intracranial bleeding in head injury. Obstetricians adopted the technique, using fetal head measurement as a way of dating pregnancies. (Much of the early work was done by Ian Donald and colleagues working in Glasgow, U.K., where they made use of the ultrasonic flaw detectors used by welders in the local shipyards.) Position sensors were later added to the transducers enabling 2-D cross-sectional images to be produced.
Various techniques, such as rotating or rocking the transducers, were introduced to produce real-time moving images of the fetus. Recent developments using multisensor ultrasonic probes allow real-time dynamic images to be produced, and with the aid of computer image processing, 3-D images of internal structures including the unborn fetus are possible. This technology has also made it possible to dynamically observe the performance of all four heart valves in real time. Transducers were also pulsed in the Doppler mode to obtain information on blood flow. Combining the two enabled both structure and function of the heart to be determined. Echocardiography is now the technique of choice when examining the mechanical characteristics of the heart, replacing a range of earlier instrumentation techniques such as phonocardiography, ballistocardiography, and carotid pulse analysis.
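Converting a measured Doppler shift into a blood velocity rests on the standard Doppler relation, v = c·Δf / (2·f₀·cos θ), where c is the assumed speed of sound in soft tissue (about 1540 m/s) and θ is the angle between the ultrasound beam and the flow. The numbers below are purely illustrative.

```python
import math

def doppler_velocity(delta_f_hz, f0_hz, theta_deg, c=1540.0):
    """Blood velocity (m/s) from the standard pulsed-Doppler relation.

    v = c * delta_f / (2 * f0 * cos(theta)), with c the assumed speed of
    sound in soft tissue and theta the beam-to-flow angle in degrees.
    """
    return c * delta_f_hz / (2.0 * f0_hz * math.cos(math.radians(theta_deg)))

# Illustrative example: 2-MHz probe, 1.3-kHz measured shift, 60-degree beam angle
v = doppler_velocity(1300.0, 2e6, 60.0)
print(round(v, 3))  # → 1.001 (m/s)
```

The cos θ term is why sonographers keep the beam-to-flow angle well away from 90 degrees: as θ approaches 90 degrees the measured shift, and hence the velocity estimate, collapses toward zero.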
In nuclear medicine imaging, radionuclides are combined with other elements to form chemical compounds, or combined with existing pharmaceutical compounds to form radiopharmaceuticals. These radiopharmaceuticals, once administered to the patient, can localize to specific organs or cellular receptors. This property gives nuclear medicine the ability to image the extent of a disease process in the body based on cellular function and physiology, in contrast to the physical changes in tissue imaged by other modalities.
In the future, nuclear medicine may provide added impetus to the field known as molecular medicine. As our understanding of biological processes in the cells of a living organism expands, specific probes can be developed to allow visualization, characterization, and quantification of biologic processes at the cellular and subcellular levels. Nuclear imaging is an ideal specialty to adapt to the new discipline of molecular medicine, because of its emphasis on function and its utilization of imaging agents that are specific for a particular disease process.
The development of miniature camera technology has enabled a wide range of endoscopic devices to be produced, allowing physicians to visualize hitherto inaccessible areas of the gut, abdominal cavity, reproductive organs, and urinary tract. Earlier endoscopic devices used fiber-optic cables plugged into external cameras to generate their images. Now miniature video cameras can be mounted at the tip of the device, allowing more flexible and versatile instruments to be manufactured. As a large number of endoscopic procedures are carried out every day, device cleaning and sterilization for reuse are issues. New techniques for decontamination are required to reduce the turnaround time for the device. A newer technique, sometimes referred to as capsule endoscopy, consists of a capsule containing a video camera and a radio transmitter that passes freely through the gut, transmitting color images to an external recording device as it does so.
We have seen that the past 100 years have greatly enhanced our ways to look inside the body and dynamically observe its function. Röntgen's original X-ray techniques have evolved significantly over 100 years, yet concerns related to these technologies continue. It is understood that ionizing radiation can lead to DNA damage that can cause cancer. It remains controversial whether the exposure to ionizing radiation from some imaging devices is a minor or serious problem, yet efforts are underway to minimize the radiation required for imaging studies. Several years ago, a similar controversy existed regarding the use of ultrasound to image the fetus, especially early in pregnancy. Although no studies have definitively demonstrated fetal injury secondary to ultrasound exposure, most clinicians still take a conservative view and minimize ultrasonic exposure to the fetus.
The advent of digital image processing has generated a large amount of data from various modern imaging technologies. Clinicians need to have rapid access to this information and be able to compare images of the same subject using different imaging modalities. A recent development, and one that has great scope for the future, is the integration of images from the various modalities mentioned above. A picture archiving and communication system (PACS) is a medical imaging technology that provides economical storage of, and convenient access to, images from multiple modalities. Electronic images and reports are transmitted digitally via PACS, eliminating the need to manually file, retrieve, or transport film. The universal format for PACS image storage and transfer is Digital Imaging and Communications in Medicine (DICOM). Nonimage data, such as scanned documents, may also be incorporated. A PACS consists of four major components: the imaging modalities such as X-ray CT and MRI, a secured network for the transmission of patient information, workstations for interpreting and reviewing images, and archives for the storage and retrieval of images and reports. A full PACS should provide a single point of access for images and their associated data. That is, it should support all digital modalities in all departments throughout the enterprise.
Although bioelectricity and imaging techniques as described above have contributed to healthcare at levels appropriate for Nobel prizes, many other medical devices, technologies, and techniques have made important contributions over the last 100 years.
Advances in surgery began with the introduction of anesthesia in the 19th century. Instrumentation for the administration of anesthetic agents and the monitoring of surgical patients under anesthesia, as well as many surgical techniques, changed during the 20th century to the benefit of surgeons and patients alike. During the first half of the last century, the anesthesiologist had fairly minimal instrumentation to assist with these tasks. Today, in the operating room during a procedure, an anesthesiologist looks more like a computer operator than a physician. Modern anesthesia instrumentation is built around many different sensors and instruments whose outputs only a computer-based acquisition and interpretation system can assemble into a coherent view of the patient's condition for the anesthesiologist.
Modern operating room technologies started with the use of electrocautery, the application of radio-frequency currents to seal bleeding vessels and make bloodless cuts in tissue. The early devices generated the radio-frequency currents using a spark gap, not so different from the techniques used in early radio communication. This technology improved the surgical outcomes for many different kinds of operations and helped to shorten the time required for various surgical procedures. From the perspective of the patients, however, the most significant development was probably the endoscopic tools developed in the second half of the 20th century, since they allowed minimally invasive surgery to be performed. These tools gave surgeons the option to perform surgery through small incisions, which in turn reduced patient discomfort and minimized recovery time following the surgical procedure. Later, catheters joined the surgical tool chest of interventional cardiologists, allowing them to use the patient's blood vessels as access ports to internal organs. As a result of these tools, many cardiovascular procedures, such as coronary artery bypass surgery or device implantations, including recently developed artificial heart valve implants, are conducted today using minimally invasive techniques rather than open chest procedures. This has resulted in a significant improvement in survival following critical procedures such as cardiac surgery. These new technologies have required surgical device manufacturers to develop new tools that can be applied through endoscopic devices, and surgeons to develop new skills to use these tools through the endoscopes, often with limited touch and vision, two of the senses most important in surgery. One surgical colleague once commented that endoscopic surgery is “operating through a keyhole,” yet the procedures have been widely accepted and adopted by both surgeons and patients.
Surgery evolved from having the surgeon directly operate through the endoscope to performing procedures with robotic tools controlled by the surgeon. Visual imaging devices were also incorporated into the endoscope so that the surgeon could observe, using video technology, what she/he was directing the robots to do. Surgery thus became more like a video game, which required further advances in surgical skills. Today, devices on the market such as the da Vinci surgical robot put the surgeon in a corner of the operating room, looking at 3-D images of the surgical field and remotely operating robotic tools (Fig. 5). It is not much of a stretch of the imagination to have the surgeon located even farther from the patient and connected to the surgical robot through a secure, reliable Internet connection. Presumably an expert surgeon in one part of the world could operate on a patient in another part of the world with the same ease as if she/he were in the same room.
An important part of any medical instrumentation system is the sensor. This device is responsible for interfacing between the biologic organism and the electronic system that processes the data. Typical sensors must collect the physiological data by interacting with the biological structure without changing its properties or behavior. This is not a simple task, but the use of microfabrication technology has helped to move closer to this goal. Sensors are miniaturized so that their physical effect on the biologic system is also miniaturized, thereby minimizing disruption of that system. In recent years, the application of microelectromechanical system (MEMS) technology for biomedical sensors and instrumentation has helped to further reduce the size while increasing the complexity of biomedical sensing devices. This is particularly true for the diagnostic laboratory, or more correctly, the movement of the diagnostic laboratory to the patient's bedside. So-called “lab on a chip” devices have allowed chemical analysis of very small biomedical samples and even individual cells.
Even with these improvements in the biomedical sensors themselves, one still has to worry about the connections between the sensors and the signal processing and readout devices. This problem has been addressed through the use of wireless body sensor networks, which include special features such as fault tolerance and redundancy to provide reliable systems. Today, sensing devices can be incorporated directly into clothing or other wearable structures, so the individual being monitored does not have to take any special steps to apply the sensors beyond putting on an apparently normal piece of clothing that contains the wireless sensor nodes.
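One way the fault tolerance and redundancy mentioned above can be realized is to have several nodes measure the same quantity and fuse their readings robustly. The Python sketch below illustrates the idea under stated assumptions: the node identifiers and the plausibility range for skin temperature are hypothetical, and this is not the design of any particular body sensor network.

```python
# Hypothetical sketch of fault tolerance in a wireless body sensor
# network: redundant nodes measure the same quantity, implausible
# readings are discarded, and the median of the survivors is reported,
# so a single failed node cannot corrupt the fused output.

def fuse_redundant_readings(readings, valid_range=(30.0, 45.0)):
    """readings: list of (node_id, value) pairs from redundant
    temperature nodes; valid_range is an assumed plausibility window
    in degrees Celsius for skin temperature."""
    survivors = sorted(v for _, v in readings
                       if valid_range[0] <= v <= valid_range[1])
    if not survivors:
        return None  # every node failed: report a fault, not a guess
    mid = len(survivors) // 2
    if len(survivors) % 2:       # odd count: middle value
        return survivors[mid]
    # even count: average of the two middle values
    return (survivors[mid - 1] + survivors[mid]) / 2.0
```

A node that drops to a nonsense value (e.g., 0.0 after a broken skin contact) is simply excluded, and the remaining nodes still yield a usable estimate; the median, unlike the mean, is also insensitive to a single surviving outlier.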
Several years ago, the American Institute for Medical and Biological Engineering (AIMBE) had its members vote on what they thought were the most important advances in medical and biological engineering. This resulted in the technologies with the highest number of votes being admitted to the AIMBE Hall of Fame (http://www.aimbe.org/aimbe-programs/aimbe-hall-of-fame/). Although it is not possible to discuss all of these in this paper, they have been summarized in Table 1. It is important to note that many of these have already been mentioned in this paper and others go beyond medical devices and medical electronics. Nevertheless, these techniques and technologies would not have been possible without medical devices and the advances that they made possible.
Perhaps one of the most important advances in medical devices and medical electronics over the last 100 years is not a device or an electronic system. Instead, it is the recognition of biomedical engineering as an engineering discipline and the development of educational programs that lead to academic degrees in biomedical engineering. In addition, professional societies devoted to the new field of medical electronics (it was not referred to as biomedical engineering at that time) were established. One of the earliest was the IRE Professional Group on Medical Electronics that was organized in 1952 and was the predecessor of the present IEEE Engineering in Medicine and Biology Society. This was followed in 1959 by the establishment of the International Federation for Medical Electronics and Biological Engineering that several years later became the International Federation for Medical and Biological Engineering (IFMBE). As the field continued to advance, additional societies were formed all over the world for the exchange of ideas and to promote this new area of what was then primarily a subdiscipline of electrical engineering. It soon became clear that the application of engineering to problems in medicine and biology was more than just medical electronics, and the range of activities covered by these organizations was broadened to what we know today as biomedical engineering and bioengineering.
Another major step forward for the field was taken in late 2000, when the National Institute of Biomedical Imaging and Bioengineering was established as one of the National Institutes of Health (NIH) in the United States. This Institute, like others at NIH, has internal research programs and supports biomedical engineering research in the United States and other parts of the world. This new home for biomedical engineering further helped to stimulate the growth of the discipline.
All the authors agreed on the futility of predicting what will happen in medical devices and medical electronics in the next 100 years. Clearly, this is a task that is well beyond any of our abilities. Imagine if Professors Röntgen or Einthoven had been asked to do the same 100 years ago. Would they have been able to predict anything described in this paper? After all, they were Nobel laureates; they should have been in the best position to predict what we now know. It is likely that they were wise enough not to try this exercise, yet some of us have decided to stick our necks out and predict what we see in the future for medical devices and electronics. Fortunately, most of us will not be around to see how wrong we were. Two of us proposed that, to research this part of the paper, we should attend a science fiction authors' symposium, or better still, have them write this part of the paper rather than rely on our own imaginations. With this in mind, let us think a little about the future.
A good place to start is to look at what others think are important problems to solve in the field of biomedical engineering. In 2009, the U.S. National Academy of Engineering held a meeting with input from people all over the world to identify the major technologic challenges of the 21st century and to encourage engineers to come up with solutions. Their final list contained 14 challenges, and three of these were related to biomedical engineering. These grand challenges, as they became known, were: 1) to reverse engineer the brain; 2) to advance health informatics; and 3) to engineer better medicines. We know that biomedical engineers as well as other professionals are already addressing these grand challenges and that we are likely to see important solutions to these problems in the not too distant future.
Understanding the brain, not only in terms of its structure but also its function, can lead to many important advances in neuroscience, as well as practical applications in engineering. Just as artificial neural networks have adopted some of the ways that cultured and in vivo real neurons behave, understanding the functional activity of the brain can lead to improved approaches toward the parallel processing of data and possibly new approaches to computing. These technological advances will be important, but let us not forget that by understanding how the brain works we also have the opportunity to understand when it dysfunctions and to, perhaps, find ways to correct this dysfunction. Thus, medical and scientific benefits, as well as technologic benefits, may result from such work.
The second grand challenge, to advance health informatics systems, is already being addressed in many parts of the world. Initiatives to establish electronic health records hold the promise of providing important health information about patients wherever it might be needed. A major challenge is how to provide this information in a timely fashion anywhere in the world. Health information systems that currently exist have difficulty interacting with one another at the level of computer languages, let alone spoken languages. If such a system is to be useful worldwide, these challenges must be overcome.
The third grand challenge, to engineer better medicines, will be very important as we enter the era of personalized medicine. It is well known that individuals respond differently to pathogenic agents and medications, yet we currently prescribe medications based upon the average behavior of a group of patients. By understanding an individual patient and their need for and response to a medication, it will be possible to provide the medication that is most appropriate for that patient's condition. In the future, drugs and other treatment approaches will be engineered to meet a specific individual's needs while minimizing side effects. In other words, drugs will be engineered to the patient's own specifications. It is clear that economical methods to characterize the individual will be needed to do this.
Since work is already progressing on these and the other 11 grand challenges, it is expected that they will be met in the relatively short term.
Developers of future medical devices will need to be acutely aware of the costs associated with their technology. Today, medical costs continue to increase to the point where the expense of medical care will soon no longer be sustainable. This is especially true in the United States, where healthcare costs are expected to exceed $2.7 trillion this year, according to Medicare's Office of the Actuary. This comes to about $8650 per capita, which is nearly double the per capita spending of the next highest country, Norway. If our goal for medical devices is to reduce the cost of healthcare, one can start by looking at the points where the cost is highest. Cost categories for medical service providers are similar to those of any other service provider: labor, capital, and supplies. The bureaucracy associated with payment and reimbursement practices in many parts of the world is excessive, and more efficient systems will hopefully evolve in the future, thereby saving significant costs.
Expected changes for healthcare workers may be dictated by technological advancements in information technology as well as automation and robotics. Information technology is already enabling massive changes in medicine through the electronic medical record and drug interaction databases. An example of computer automation comes from IBM, which recently announced that it wants to apply its artificial intelligence computer system Watson, capable of answering questions posed in natural language, to help clinicians. This computer, which recently played “Jeopardy” and won, can quickly survey all written knowledge on a subject to answer a question, so it could help the physician by providing instant access to all documented medical knowledge. Although one might be skeptical about having the computer make a definitive diagnosis, it could be a big help in making clinical decisions. However, a major change in the future is likely to come from the use of robotics technology. For example, prototype robots are already being developed as caregivers for children and the elderly. The use of robots for surgery is also being explored, as mentioned above, and is likely to expand further in the future. These labor trends may mimic those seen in other industries, such as software development, where the talented labor can be anywhere in the world that Internet access exists. Today, many medical institutions outsource tasks such as the study of pathology slides and the reading of radiographs to experts working at overseas locations, taking advantage of the cost savings they offer and the time zone difference. Pap smears are already classified for cervical cancer using an automated imaging system that relies on fuzzy models. In the future, one may see a doctor via Skype, or undergo surgery performed by a robot driven by a surgeon living and working in a different part of the world.
Further, the use of information management and analysis will be important in the future. A group in Oxford, U.K., is looking at data fusion for early warning of patient deterioration in the emergency department. In August 2007, the National Institute for Health and Clinical Excellence (NICE) in the U.K. published a set of guidelines recommending that physiological “track-and-trigger” systems (based on scores assigned to each of the vital signs regularly observed by the nursing staff) be used to monitor all adults in acute hospital settings, including patients in the Emergency Department.
The Oxford group has now completed a 500-patient trial of the track-and-trigger system in the John Radcliffe Hospital Emergency Department (Oxford, U.K.). They have reviewed the paper track-and-trigger charts completed for these patients by the nursing staff and analyzed the continuous vital sign data generated by the bedside monitors to which the patients were connected. This has prompted a reevaluation of how patient deterioration may be optimally identified in this setting. The aim is to optimize patient monitoring in the Emergency Department, using a combination of an electronic system for the regular entry of nurse-validated observations (VitalPAC) and a real-time data fusion model (Visensia) to provide early warning if a patient deteriorates before the next set of observations is due. This is currently one of the most active research groups in the United Kingdom.
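A minimal sketch of a track-and-trigger scheme of the kind described above can be written in a few lines of Python. The vital-sign bands and the trigger rule below are hypothetical, loosely patterned on published early-warning scores; they are not the actual parameters of the VitalPAC or Visensia systems used by the Oxford group.

```python
def band_score(value, bands):
    """Return the penalty for one vital sign given (low, high, score) bands."""
    for low, high, score in bands:
        if low <= value <= high:
            return score
    return 3  # outside every defined band: treat as most abnormal

# Hypothetical scoring bands (NOT a clinically validated chart).
HR_BANDS = [(51, 90, 0), (41, 50, 1), (91, 110, 1),
            (111, 130, 2), (131, 250, 3), (0, 40, 3)]
RR_BANDS = [(12, 20, 0), (9, 11, 1), (21, 24, 2), (25, 60, 3), (0, 8, 3)]
TEMP_BANDS = [(36.1, 38.0, 0), (35.1, 36.0, 1), (38.1, 39.0, 1),
              (39.1, 42.0, 2), (30.0, 35.0, 3)]

def track_and_trigger(heart_rate, resp_rate, temp_c):
    """Aggregate per-sign scores and decide whether to trigger a review."""
    scores = [
        band_score(heart_rate, HR_BANDS),
        band_score(resp_rate, RR_BANDS),
        band_score(temp_c, TEMP_BANDS),
    ]
    total = sum(scores)
    # Trigger on a high aggregate score or any single extreme reading.
    alert = total >= 5 or max(scores) == 3
    return total, alert
```

A real deployment would score additional observations (blood pressure, oxygen saturation, conscious level), and a data fusion model such as Visensia replaces the fixed chart with a probabilistic model of normality learned from patient data.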
The capital equipment used in clinics is also likely to see much advancement during the 21st century. Imaging modalities offering increased resolution and automated diagnostic features are already on the way. One can also expect to see smaller therapeutic devices deployed at home, as the elderly population grows and the public recognizes the economic and quality-of-life advantages of home-based care. For example, in Japan, the government recently announced plans for the formation of 2000 centers around the country to support home healthcare services (http://e.nikkei.com/e/app/fr/gateway/rss_news.aspx?URL=/e/ac/tnks/Nni20120129D2901F05.htm).
The most impressive changes in the next 100 years are likely to be in the area of consumables. For many centuries, medicine was more palliative than curative. In the 20th century, however, more curative therapies began to appear. Today, technology can provide patients with intraocular lenses, insulin pumps, cardiac pacemakers, replacement joints, and much more. In the future, more of these devices will likely treat more medical conditions. For example, a total heart replacement may be available within the next 40 years. Although today left ventricular assist devices are used only temporarily in the United States, they have been used in patients for many years in Japan, where there are not as many heart transplants. Given that there are five million patients with heart failure in the United States, one can easily see the medical need and the market opportunity for such devices. Another area that is likely to see rapid development is the technology for the treatment of kidney disease using novel materials and artificial organs. As societal concerns over public health and healthcare budgets grow, additional focus will be placed on preventable conditions such as obesity and diabetes. A good example is the reduction in heart disease during the last two decades as a result of improved treatment methods and the promotion of a healthier lifestyle. One can expect similar improvements for obesity and type 2 diabetes, which seem to go hand in hand for many individuals. Multiple technologies are being developed for the treatment of obesity, ranging from devices that limit food intake or the absorption of high-calorie items to stimulators that send signals to the brain to indicate that the patient has eaten enough. All of these are likely to come from the efforts of engineers and scientists who are studying the pathology of disease and working on the miniaturization of devices, as well as from the development and application of novel materials.
Another approach to reducing healthcare costs will be to look at medical devices and the expenses associated with them. Today, new models of devices such as ventilators perform the same basic functions as previous models but add features that, although convenient for the therapist or useful in infrequently used modes of operation, are not essential for ventilating the patient. One way to reduce the costs of this type of medical device is to be satisfied with current models that carry out the basic functions and avoid replacing devices every time a new model with additional features appears on the market. Affordable medical devices that are more effective and less expensive can be possible in the future by returning those devices to their basic functions. The focus may need to shift toward producing lower-specification devices for all rather than higher-specification devices for the few. Engineering students at Michigan Technological University (Houghton, MI) are developing an inexpensive ventilator that carries out only basic ventilation functions and omits most of the features of hospital ventilators in use today. By doing this, the cost of the ventilator can be greatly reduced, and hospitals can afford to store backup ventilators for use in times of pandemic or man-made or natural disasters, when not enough full-function ventilators are available to meet patient needs.
Improved therapeutic methods will also be seen in the coming years. New methods of drug delivery will be developed that are more natural than today's bolus approach. Drug delivery mechanisms that can regulate drug dosage as a function of need will be commonplace, whether they are complete implanted systems such as an artificial pancreas or miniature MEMS devices that can be taken orally, sense the need for a particular medication, and appropriately dispense it. Improved methods of tissue engineering and regenerative medicine may help to eliminate the need for certain drugs or hormonal replacement therapy. Replacement organs may be grown using a patient's own stem cells, which can be programmed to produce a natural organ or to create a construct that will carry out the function of the organ it is replacing without necessarily being anatomically similar.
Rehabilitation from injury, worn-out organs, or other damaged internal structures will be greatly improved over currently used methods. Methods to regrow central as well as peripheral nerves will be understood, which will lead to the regeneration and reconnection of injured or damaged spinal cord and brain cells. Diseases such as Parkinson's disease, dementia, essential tremor, and stroke will be reversible through tissue engineering of nerve cells, and new techniques will allow high-resolution interfacing with the nervous system. Such interfaces will be used to operate orthotic and prosthetic appliances, but will also allow individuals to communicate directly with information processing systems. Although this sounds more like science fiction today, early research in these areas is already underway and producing preliminary results.
Up to this point, we have looked at the future in terms of devices and techniques to address significant medical problems. But significant social problems associated with medicine also need to be addressed in the future. We have already touched on one of these, healthcare costs, but there are other healthcare issues where medical devices and technology can be part of structures and systems that create solutions. For example, when one looks at the world in general, or even at specific parts of it such as the United States, one sees disparities in healthcare. Too many people are denied access to the healthcare system because of poverty, lack of transportation, rural location, limited education, or religious beliefs. Our society needs to find ways to remove these disparities and to make healthcare available to all who need and desire treatment. We need to worry about providing healthcare expertise to places where it is currently unavailable, be it a remote village in the rural United States or one in the developing world. We have already demonstrated that we have communications technologies that can reach just about everywhere on Earth. Mobile telephone and satellite communications networks can be used in the future to assist people who do not ordinarily have access to healthcare. They will obtain it either through telemedicine or through procedures in which an expert uses robotic technologies to assist in the examination and diagnosis of patients who are a great distance away. Our future is likely to include more extensive travel beyond the Earth, and this technology will be important for future space explorers and settlers.
These views of the future are based on the present and are really not much more than an extension of what is already being done. It is likely that these problems will be addressed or abandoned in the not too distant future, but what will the 200th anniversary issue of the PROCEEDINGS OF THE IEEE have on its pages (or, more likely, chips) that is entirely different from what we see today? Will mankind be part robot and part human, as some science fiction writers suggest? Will all disease be eliminated and life expectancy doubled, as it was over the last 100 years? Will aggression and war be antiquated terms relegated to history books? Who can say? The best we can do is watch the progress that is being made as we move into the next 100 years and extrapolate from that. Nevertheless, the farther out we extrapolate, the greater the error becomes, and extrapolation never leads to highly innovative new approaches. These can come only from extremely creative individuals who dare to think beyond the norms of science and society. Fortunately, there have been many such individuals in the past, and there is no reason to think that they will not be present in the future as well.
The PROCEEDINGS OF THE IEEE has guided us through the last 100 years and chronicled many of the advancements of mankind in engineering and science. We look forward to this great tradition continuing and to a similar group of authors looking back over the field 100 years from now and saying, as we say, “it has been a great ride, but the road continues on; and there is an even greater ride ahead.”
M. R. Neuman is with the Department of Biomedical Engineering, Michigan Technological University, Houghton, MI 49931-1295 USA.
G. D. Baura is with the Keck Graduate Institute of Applied Life Sciences, Claremont, CA 91711 USA.
S. Meldrum was with Norfolk and Norwich University Hospital. He is now a Consultant Clinical Engineer, Norwich, NR4 6AS, U.K.
O. Soykan is with Medtronic, Inc., Minneapolis, MN 5543-5604 USA (e-mail: Orhan.Soykan@Medtronic.Com).
M. E. Valentinuzzi is with the Institute of Biomedical Engineering, University of Buenos Aires, Buenos Aires C1053ABJ, Argentina.
R. S. Leder is with the Universidad Nacional Autonoma de Mexico, Mexico City 04510, Mexico.
S. Micera is with the Scuola Superiore Sant'Anna, Pisa 56127, Italy.
Y.-T. Zhang is with the Chinese University of Hong Kong, Shatin, NT, Hong Kong.