Fluoroscopy


Fluoroscopy is an imaging technique that uses X-rays to obtain real-time moving images of the interior of an object. In its primary application of medical imaging, a fluoroscope allows a physician to see the internal structure and function of a patient, so that the pumping action of the heart or the motion of swallowing, for example, can be watched. This is useful for both diagnosis and therapy and occurs in general radiology, interventional radiology, and image-guided surgery. In its simplest form, a fluoroscope consists of an X-ray source and a fluorescent screen, between which a patient is placed. However, since the 1950s most fluoroscopes have included X-ray image intensifiers and cameras as well, to improve the image's visibility and make it available on a remote display screen. For many decades fluoroscopy tended to produce live pictures that were not recorded, but since the 1960s, as technology improved, recording and playback became the norm.
Fluoroscopy is similar to radiography and X-ray computed tomography in that it generates images using X-rays. The original difference was that radiography fixed still images on film whereas fluoroscopy provided live moving pictures that were not stored. However, today radiography, CT, and fluoroscopy are all digital imaging modes with image analysis software and data storage and retrieval.
The use of X-rays, a form of ionizing radiation, requires the potential risks from a procedure to be carefully balanced with the benefits of the procedure to the patient. Because the patient must be exposed to a continuous source of X-rays instead of a momentary pulse, a fluoroscopy procedure generally subjects a patient to a higher absorbed dose of radiation than an ordinary radiograph. Only important applications such as health care, bodily safety, food safety, nondestructive testing, and scientific research meet the risk-benefit threshold for use. In the first half of the 20th century, shoe-fitting fluoroscopes were used in shoe stores, but their use was discontinued because it is no longer considered acceptable to use radiation exposure, however small the dose, for nonessential purposes. Much research has been directed toward reducing radiation exposure, and recent advances in fluoroscopy technology, such as digital image processing and flat panel detectors, have resulted in much lower radiation doses than in former procedures.
Fluoroscopy is also used in airport security scanners to check for hidden weapons or bombs. These machines use lower doses of radiation than medical fluoroscopy. Medical applications require higher doses because they demand greater tissue contrast, and for the same reason they sometimes require contrast media.

Mechanism of action

Visible light can be seen by the naked eye, but it does not penetrate most objects. In contrast, X-rays can penetrate a wider variety of objects, but they are invisible to the naked eye. To take advantage of this penetration for image-forming purposes, one must somehow convert the X-rays' intensity variations into a form that is visible. Classic film-based radiography achieves this by the variable chemical changes that the X-rays induce in the film, and classic fluoroscopy achieves it by fluorescence, in which certain materials convert X-ray energy into visible light. This use of fluorescent materials to make a viewing screen is how fluoroscopy got its name.
As the X-rays pass through the patient, they are attenuated by varying amounts as they pass through or scatter off the different tissues of the body, casting an X-ray shadow of the radiopaque tissues on the fluorescent screen. Images on the screen are produced as the unattenuated or mildly attenuated X-rays from radiolucent tissues interact with atoms in the screen through the photoelectric effect, giving their energy to the electrons. While much of the energy given to the electrons is dissipated as heat, a fraction of it is given off as visible light.
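The shadow-casting described above follows the Beer–Lambert law, I = I₀e^(−μx). The following is a minimal Python sketch under assumed values; the attenuation coefficients in MU_PER_CM are rough illustrative numbers, not reference data:

    import math

    # Beer-Lambert law: I = I0 * exp(-mu * x), where mu is the linear
    # attenuation coefficient of a material and x is the thickness traversed.
    # These coefficients (per cm) are rough illustrative values only.
    MU_PER_CM = {"soft_tissue": 0.2, "bone": 0.5}

    def transmitted_fraction(layers):
        """Fraction of the incident X-ray intensity reaching the screen
        after passing through a list of (material, thickness_cm) layers."""
        total_attenuation = sum(MU_PER_CM[m] * t for m, t in layers)
        return math.exp(-total_attenuation)

    # A ray through 15 cm of soft tissue vs. one that also crosses 2 cm of
    # bone; the bony path transmits less, so it casts a darker shadow.
    print(f"{transmitted_fraction([('soft_tissue', 15.0)]):.3f}")                  # ~0.050
    print(f"{transmitted_fraction([('soft_tissue', 13.0), ('bone', 2.0)]):.3f}")  # ~0.027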
Early radiologists would adapt their eyes to view the dim fluoroscopic images by sitting in darkened rooms, or by wearing red adaptation goggles. After the development of X-ray image intensifiers, the images were bright enough to see without goggles under normal ambient light.
Nowadays, in all forms of digital X-ray imaging, the detection of X-ray energy is handled by the same types of electronic sensors, such as flat panel detectors, which convert the X-ray energy into electrical signals: small bursts of current that convey information that a computer can analyze, store, and output as images. As fluorescence is a special case of luminescence, digital X-ray imaging is conceptually similar to digital gamma ray imaging: in both of these imaging mode families, the information conveyed by the variable attenuation of invisible electromagnetic radiation as it passes through tissues of various radiodensities is converted by an electronic sensor into an electric signal that is processed by a computer and output as a visible-light image.
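As a rough illustration of the digital half of this pipeline, the sketch below converts hypothetical per-pixel transmitted intensities into 8-bit display values by taking the negative logarithm (proportional to the attenuation integrated along each ray) and windowing the result; the raw values are invented, and real systems apply far more elaborate processing:

    import numpy as np

    # Hypothetical transmitted fraction recorded at each detector pixel.
    raw = np.array([[0.050, 0.027],
                    [0.048, 0.051]])

    # The negative log of transmission is proportional to the attenuation
    # integrated along each ray, the quantity of diagnostic interest.
    attenuation = -np.log(raw)

    # Window/level: map the attenuation range of interest onto 0-255 greys.
    lo, hi = attenuation.min(), attenuation.max()
    display = np.clip((attenuation - lo) / (hi - lo), 0.0, 1.0) * 255
    print(display.astype(np.uint8))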

History

Early era

Fluoroscopy's origins and radiography's origins can both be traced back to 8 November 1895, when Wilhelm Röntgen (in English script, Roentgen) noticed a barium platinocyanide screen fluorescing as a result of being exposed to what he would later call X-rays. Within months of this discovery, the first crude fluoroscopes were created. These experimental fluoroscopes were simply thin cardboard screens coated on the inside with a layer of fluorescent metal salt, attached to a funnel-shaped cardboard eyeshade that excluded room light, with a viewing eyepiece that the user held up to his eye. The fluoroscopic image obtained in this way was quite faint. Even when finally improved and commercially introduced for diagnostic imaging, the limited light produced by the fluorescent screens of the earliest commercial scopes required a radiologist to sit for a period in the darkened room where the imaging procedure was to be performed, to first accustom his eyes and increase their sensitivity to the faint image. The placement of the radiologist behind the screen also resulted in significant radiation doses to the radiologist.
In the late 1890s, Thomas Edison began investigating materials for their ability to fluoresce when X-rayed, and by the turn of the century he had invented a fluoroscope with sufficient image intensity to be commercialized. Edison had quickly discovered that calcium tungstate screens produced brighter images. He abandoned his research in 1903, however, because of the health hazards that accompanied use of these early devices. Clarence Dally, a blower of laboratory glassware and tubes at Edison's laboratory, was repeatedly exposed, suffered radiation poisoning, and later succumbed to an aggressive cancer. Edison himself damaged an eye while testing these early fluoroscopes.
During this infant commercial development, many incorrectly predicted that the moving images of fluoroscopy would completely replace roentgenographs, but the then-superior diagnostic quality of the roentgenograph and its safety advantage of a lower radiation dose through shorter exposure prevented this from occurring. Another factor was that plain films inherently offered recording of the image in a simple and inexpensive way, whereas recording and playback of fluoroscopy remained a more complex and expensive proposition for decades to come.
Red adaptation goggles were developed by Wilhelm Trendelenburg in 1916 to address the problem of dark adaptation of the eyes, previously studied by Antoine Béclère. The red light passed by the goggles' filters sensitized the physician's eyes before the procedure while still allowing enough light through for him to function normally.
More trivial uses of the technology also appeared in the 1920s–1950s, including the shoe-fitting fluoroscope used in shoe stores. Concerns about the effects of frequent or poorly controlled use were raised in the 1950s, leading to new guidelines, regulations, and ultimately the practice's end by the early 1960s: the radiation exposure risk outweighed the trivial benefit of a better shoe fitting.

Analog electronic era

Analog electronics revolutionized fluoroscopy. The development of the X-ray image intensifier by Westinghouse in the late 1940s, in combination with closed-circuit TV cameras of the 1950s, allowed for brighter pictures and better radiation protection. The red adaptation goggles became obsolete as image intensifiers allowed the light produced by the fluorescent screen to be amplified and made visible in a lighted room. The addition of the camera enabled viewing of the image on a monitor, allowing a radiologist to view the images in a separate room, away from the risk of radiation exposure. The commercialization of video tape recorders beginning in 1956 allowed the TV images to be recorded and played back at will.

Digital electronic era

Digital electronics were applied to fluoroscopy beginning in the early 1960s, when Frederick G. Weighart and James F. McNulty at Automation Industries, Inc., then in El Segundo, California, produced on a fluoroscope the world's first image to be digitally generated in real time, while developing a portable apparatus, later commercialized, for the onboard nondestructive testing of naval aircraft. Square wave signals were detected on a fluorescent screen to create the image.
From the late 1980s onward, digital imaging technology was reintroduced to fluoroscopy after the development of improved detector systems. Modern improvements in screen phosphors, digital image processing, image analysis, and flat panel detectors have allowed for increased image quality while minimizing the radiation dose to the patient. Modern fluoroscopes use caesium iodide screens and produce noise-limited images, so that the minimum radiation dose is used while images of acceptable quality are still obtained.
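A noise-limited image is one in which quantum noise from the finite number of detected X-ray photons dominates. Because photon counting obeys Poisson statistics, the signal-to-noise ratio grows only as the square root of the photon count, and hence of the dose; the counts in this sketch are illustrative:

    import numpy as np

    rng = np.random.default_rng(0)

    # Photon counting is Poisson-distributed: mean N, standard deviation
    # sqrt(N), so SNR = N / sqrt(N) = sqrt(N). Quadrupling the dose
    # therefore only doubles the SNR.
    for n_photons in (100, 1_000, 10_000):
        pixels = rng.poisson(n_photons, size=100_000)
        snr = pixels.mean() / pixels.std()
        print(f"{n_photons:>6} photons/pixel -> SNR {snr:6.1f} "
              f"(sqrt(N) = {n_photons ** 0.5:6.1f})")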

Etymology

Many names exist in the medical literature for moving pictures taken with X-rays. They include fluoroscopy, fluorography, cinefluorography, photofluorography, fluororadiography, kymography, cineradiography, videofluorography, and videofluoroscopy. Today the word fluoroscopy is widely understood to be a hypernym of all the aforementioned terms, which explains why it is the most commonly used and why the others are declining in usage. The profusion of names is an idiomatic artifact of technological change, as follows:
As soon as X-rays were discovered in the 1890s, both looking and recording were pursued. Both live moving images and recorded still images were available from the very beginning with simple equipment; thus, both "looking with a fluorescent screen" (fluoroscopy) and "recording/engraving with radiation" (radiography) were immediately named with New Latin words, both of which are attested since 1896.
But the quest for recorded moving images was a more complex challenge. In the 1890s, moving pictures of any kind were emerging technologies. Because the word photography was long since established as connoting a still-image medium, the word cinematography was coined for the new medium of visible-light moving pictures. Soon several new words were coined for achieving moving radiographic pictures. This was often done either by filming a simple fluoroscopic screen with a movie camera or by taking serial radiographs rapidly to serve as the frames in a movie. Either way, the resulting film reel could be displayed by a movie projector. Another group of techniques comprised the various kinds of kymography, whose common theme was capturing recordings in a series of moments, with a concept similar to movie film although not necessarily with movie-type playback; rather, the sequential images would be compared frame by frame. Thus electrokymography and roentgenkymography were among the early ways to record images from a simple fluoroscopic screen.
Television was also under early development during these decades, but even after commercial TV achieved widespread adoption after World War II, it remained a live-only medium for a time. In the mid-1950s, a commercialized ability to capture the moving pictures of television onto magnetic tape was developed. This soon led to the addition of the prefix video- to the words fluorography and fluoroscopy, with the words videofluorography and videofluoroscopy attested since 1960. In the 1970s, video tape moved from TV studios and medical imaging into the consumer market with home video via VHS and Betamax, and those formats were also incorporated into medical video equipment.
Thus, over time the cameras and recording media for fluoroscopic imaging have progressed as follows. The original kind of fluoroscopy, and the common kind for its first half century of existence, used no camera or recording medium at all, because for most diagnosis and treatment they were not essential. For those investigations that needed to be transmitted or recorded, movie cameras using film were the medium. In the 1950s, analog electronic video cameras appeared. Since the 1990s, there have been digital video cameras, flat panel detectors, and storage of data on local servers or secure cloud servers. Late-model fluoroscopes all use digital image processing and image analysis software, which not only helps to produce optimal image clarity and contrast but also allows that result to be achieved with a minimal radiation dose.
Whereas the word cine in general usage refers to cinema or to certain film formats for recording such a movie, in medical usage it refers to cineradiography or, in recent decades, to any digital imaging mode that produces cine-like moving images. Cineradiography records 30-frame-per-second fluoroscopic images of internal organs such as the heart taken during injection of contrast dye to better visualize regions of stenosis, or to record motility in the body's gastrointestinal tract. The predigital technology is being replaced with digital imaging systems. Some of these decrease the frame rate but also decrease the absorbed dose of radiation to the patient. As they improve, frame rates will likely increase.
Today, owing to technological convergence, the word fluoroscopy is widely understood to be a hypernym of all the earlier names for moving pictures taken with X-rays, both live and recorded. Also owing to technological convergence, radiography, CT, and fluoroscopy are now all digital imaging modes using X-rays with image analysis software and easy data storage and retrieval. Just as movies, TV, and web videos are to a substantive extent no longer separate technologies but only variations on common underlying digital themes, so too are the X-ray imaging modes. And indeed, the term X-ray imaging is the ultimate hypernym that unites all of them, even subsuming both fluoroscopy and four-dimensional CT. However, it may be many decades before the earlier hyponyms fall into disuse, not least because the day when 4D CT displaces all earlier forms of moving X-ray imaging may yet be distant.

Risks

Because fluoroscopy involves the use of X-rays, a form of ionizing radiation, fluoroscopic procedures pose a potential for increasing the patient's risk of radiation-induced cancer. Radiation doses to the patient depend greatly on the size of the patient as well as the length of the procedure, with typical skin dose rates quoted as 20–50 mGy/min. Exposure times vary depending on the procedure being performed, but procedure times of up to 75 minutes have been documented. Because of the long duration of some procedures, in addition to the cancer risk and other stochastic radiation effects, deterministic radiation effects have also been observed, ranging from mild erythema, the equivalent of a sunburn, to more serious burns.
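To see how these figures combine, the arithmetic below multiplies the quoted skin dose rates by procedure length and compares the totals against roughly 2 Gy, a commonly cited approximate threshold for transient erythema (the true threshold varies by source and by patient):

    # Cumulative skin dose = dose rate x beam-on time. 2 Gy (2000 mGy) is a
    # commonly cited approximate threshold for transient erythema.
    ERYTHEMA_THRESHOLD_mGy = 2000

    for rate in (20, 50):                  # mGy/min, the range quoted above
        for minutes in (10, 75):
            dose = rate * minutes
            note = " <- above erythema threshold" if dose >= ERYTHEMA_THRESHOLD_mGy else ""
            print(f"{rate} mGy/min x {minutes:>2} min = {dose:>4} mGy{note}")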
A study of radiation-induced skin injuries was performed in 1994 by the Food and Drug Administration, followed by an advisory to minimize further fluoroscopy-induced injuries. The problem of radiation injuries due to fluoroscopy was further addressed in review articles in 2000 and 2010.
While deterministic radiation effects are a possibility, radiation burns are not typical of standard fluoroscopic procedures. Most procedures sufficiently long in duration to produce radiation burns are part of necessary life-saving operations.
Fluoroscopy systems built around X-ray image intensifiers generally include dose-reducing features such as pulsed rather than continuous radiation, and last image hold, which "freezes" the screen and makes it available for examination without exposing the patient to unnecessary radiation.
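The dose saving from pulsed operation is essentially a duty-cycle calculation, as in the sketch below; the pulse width, pulse rate, and beam-on dose rate are assumed illustrative values, not figures from any particular system:

    # Pulsed fluoroscopy: the beam is on only for short pulses, so the
    # average dose rate scales with the duty cycle. Last image hold adds no
    # dose at all, since the held frame is re-read from memory.
    pulse_width_s = 0.010        # assumed 10 ms pulses
    pulse_rate_hz = 15           # assumed 15 pulses per second
    beam_on_dose_rate = 30.0     # assumed mGy/min while the beam is on

    duty_cycle = pulse_width_s * pulse_rate_hz
    average_dose_rate = beam_on_dose_rate * duty_cycle
    print(f"duty cycle: {duty_cycle:.0%}")                        # 15%
    print(f"average dose rate: {average_dose_rate:.1f} mGy/min")  # 4.5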
Image intensifiers have been introduced that increase the brightness of the screen, so that the patient can be exposed to a lower dose of X-rays. While this reduces the risk of harm from ionising radiation, it does not remove it entirely.

Equipment

X-ray image intensifiers

The invention of X-ray image intensifiers in the 1950s allowed the image on the screen to be visible under normal lighting conditions, as well as providing the option of recording the images with a conventional camera. Subsequent improvements included the coupling of, at first, video cameras and, later, digital cameras using image sensors such as charge-coupled devices or active pixel sensors to permit recording of moving images and electronic storage of still images.
Modern image intensifiers no longer use a separate fluorescent screen. Instead, a caesium iodide phosphor is deposited directly on the photocathode of the intensifier tube. On a typical general purpose system, the output image is approximately 10⁵ times brighter than the input image. This brightness gain comprises a flux gain and a minification gain, each of approximately 100. This level of gain is sufficient that quantum noise, due to the limited number of X-ray photons, is a significant factor limiting image quality.
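The two gain factors multiply, and the minification gain follows from the ratio of the input and output screen areas. A short sketch with assumed screen diameters (the 25 cm input and 2.5 cm output are round illustrative numbers, not a specific product's dimensions):

    # Total brightness gain = flux gain (from electron acceleration) times
    # minification gain (the same light concentrated onto a smaller output
    # screen). Minification gain is the ratio of input to output areas.
    def minification_gain(input_diameter_cm: float, output_diameter_cm: float) -> float:
        return (input_diameter_cm / output_diameter_cm) ** 2

    flux_gain = 100                           # order of magnitude quoted above
    mini_gain = minification_gain(25.0, 2.5)  # assumed diameters -> 100
    print(f"minification gain: {mini_gain:.0f}")
    print(f"total brightness gain: ~{flux_gain * mini_gain:,.0f}")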
Image intensifiers are available with input diameters of up to 45 cm, and a resolution of approximately 2–3 line pairs mm⁻¹.

Flat-panel detectors

The introduction of flat-panel detectors allows for the replacement of the image intensifier in fluoroscope design. Flat panel detectors offer increased sensitivity to X-rays, and therefore have the potential to reduce patient radiation dose. Temporal resolution is also improved over image intensifiers, reducing motion blurring. Contrast ratio is also improved over image intensifiers: flat-panel detectors are linear over a very wide latitude, whereas image intensifiers have a maximum contrast ratio of about 35:1. Spatial resolution is approximately equal, although an image intensifier operating in magnification mode may be slightly better than a flat panel.
Flat panel detectors are considerably more expensive to purchase and repair than image intensifiers, so their uptake is primarily in specialties that require high-speed imaging, e.g., vascular imaging and cardiac catheterization.

Contrast agents

A number of substances have been used as radiocontrast agents, including silver, bismuth, caesium, thorium, tin, zirconium, tantalum, tungsten and lanthanide compounds. The use of thoria as an agent was rapidly stopped as thorium causes liver cancer.
Most modern injected radiographic positive contrast media are iodine-based. Iodinated contrast comes in two forms: ionic and non-ionic compounds. Non-ionic contrast is significantly more expensive than ionic; however, it tends to be safer for the patient, causing fewer allergic reactions and fewer uncomfortable side effects such as hot sensations or flushing. Most imaging centers now use non-ionic contrast exclusively, finding that the benefits to patients outweigh the expense.
Negative radiographic contrast agents are air and carbon dioxide. The latter is easily absorbed by the body and causes less spasm. It can also be injected into the blood, whereas air cannot be, owing to the risk of an air embolism.

Imaging concerns

In addition to the spatial blurring factors that plague all X-ray imaging devices, caused by such things as the Lubberts effect, K-fluorescence reabsorption, and electron range, fluoroscopic systems also experience temporal blurring due to system latency. This temporal blurring has the effect of averaging frames together. While this helps reduce noise in images of stationary objects, it creates motion blurring for moving objects. Temporal blurring also complicates measurements of system performance for fluoroscopic systems.
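This latency-driven averaging is often modeled as a first-order recursive (IIR) temporal filter. The sketch below, using an assumed blending weight, shows both sides of the trade-off: noise on a static scene shrinks, while a sudden change reaches the display only gradually:

    import numpy as np

    rng = np.random.default_rng(0)

    # First-order recursive frame average: out[n] = a*in[n] + (1-a)*out[n-1].
    # A smaller 'a' averages more frames, cutting noise but adding lag.
    def temporal_filter(frames, a=0.25):
        out = np.asarray(frames[0], dtype=float)
        history = [out.copy()]
        for frame in frames[1:]:
            out = a * np.asarray(frame, dtype=float) + (1 - a) * out
            history.append(out.copy())
        return history

    # Static noisy scene: filtering reduces the pixel noise substantially.
    static = [100 + rng.normal(0, 10, size=(64, 64)) for _ in range(50)]
    filtered = temporal_filter(static)
    print(f"raw noise sigma: {np.std(static[-1]):.1f}, "
          f"filtered: {np.std(filtered[-1]):.1f}")

    # A sudden step (an object enters the beam at frame 10) only creeps up
    # to its true value on screen; this is the motion blurring / lag.
    step = [0.0] * 10 + [100.0] * 5
    for i, value in enumerate(temporal_filter(step)):
        if i >= 9:
            print(f"frame {i:2d}: displayed {float(value):6.1f}")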

Common procedures using fluoroscopy

A common procedure is the modified barium swallow study, during which barium-impregnated liquids and solids are ingested by the patient. A radiologist records and, with a speech pathologist, interprets the resulting images to diagnose oral and pharyngeal swallowing dysfunction. Modified barium swallow studies are also used in studying normal swallow function.

Gastrointestinal fluoroscopy

Fluoroscopy can be used to examine the digestive system using a substance that is opaque to X-rays, introduced into the digestive tract either by swallowing or as an enema. This is normally done as part of a double-contrast technique, using both positive and negative contrast. Barium sulfate coats the walls of the digestive tract, which allows the shape of the digestive tract to be outlined as white or clear on an X-ray. Air may then be introduced, which looks black on the film. The barium meal is an example of a contrast agent swallowed to examine the upper digestive tract. Note that while soluble barium compounds are very toxic, the insoluble barium sulfate is non-toxic, because its low solubility prevents the body from absorbing it.