| Title | A Comprehensive Guide to Radiographic Sciences and Technology |
|---|---|
| Author | Euclid Seeram |
| Genre | Medicine |
| Series | |
| Publisher | Medicine |
| Year of publication | 0 |
| ISBN | 9781119581857 |
Present-day CT scanners are MSCT scanners. One characteristic feature of MSCT is its two-dimensional detector array, compared with the one-dimensional detector array of SSCT. This means that there are additional specific technical factors that affect the dose in CT. One such notable factor is the pitch (P), which is defined by the International Electrotechnical Commission (IEC) as the distance the table travels per rotation (D) divided by the total collimation (W). This can be expressed algebraically as:

P = D/W
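As a worked example (the numbers here are hypothetical, chosen only for illustration): for a total collimation of W = 40 mm (e.g., 64 detector rows of 0.625 mm each) and a table travel of D = 55 mm per rotation, the pitch is P = 55 mm / 40 mm = 1.375. With all other factors held constant, patient dose is inversely related to pitch, which is why pitch appears among the CT-specific technical factors that affect dose.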
The increasing use of CT has led to widespread concerns about high patient radiation doses from CT examinations relative to other radiography examinations. The distribution of the dose to the patient in CT is significantly different from the distribution of the dose in radiography. These differences require additional CT-specific dose metrics. There are four essential CT-specific dose metrics: the computed tomography dose index (CTDI), the dose length product (DLP), the size-specific dose estimate (SSDE), and the ED. These and other elements of CT physical principles will be described further in Chapter 7.
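As a brief preview of that coverage (this is the standard relationship between these metrics, not detail taken from this section): the DLP is the product of the volume CTDI (CTDIvol) and the irradiated scan length, DLP (mGy·cm) = CTDIvol (mGy) × scan length (cm), so it reflects the radiation output of the entire scan rather than of a single rotation.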
Quality control
QC is an essential activity of all medical imaging departments, and it is part of a QA program. QA deals with people and includes the administrative aspects of patient care and quality outcomes; QC addresses the technical aspects of the performance of the equipment used to image patients. QA and QC programs have evolved into what is now referred to as Continuous Quality Improvement (CQI), which includes Total Quality Management (TQM). CQI was introduced by the Joint Commission on Accreditation of Healthcare Organizations (JCAHO) to stress that all employees play an active role in ensuring a quality product. The purpose of the procedures and techniques of CQI, QA, and QC is threefold: to ensure optimum image quality for the purpose of enhancing diagnosis, to optimize the radiation dose to patients and reduce the dose to personnel, and to reduce costs to the healthcare facility.
An effective QC program consists of at least three major steps: acceptance testing, routine performance testing, and error correction. Acceptance testing is the first major step in a QC program and ensures that the equipment meets the specifications set by the manufacturer; routine performance testing involves performing the actual QC tests on the equipment at varying frequencies (annually, semi-annually, monthly, weekly, or daily). Error correction means that equipment not meeting the performance criteria or tolerance limits established for specific QC tests must be repaired or replaced so that it falls within those limits. Tolerance limits include both qualitative and quantitative criteria used to assess image quality. For example, the tolerance for collimation of the x-ray beam is ±2% of the SID, as shown in the sketch below.
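The following is a minimal sketch (not from the text) of how such a tolerance check could be automated; the function and parameter names are illustrative assumptions.

```python
# Minimal sketch: check a collimation QC measurement against the
# ±2% of SID tolerance quoted above. Names are illustrative.

def collimation_within_tolerance(misalignment_cm: float,
                                 sid_cm: float,
                                 tolerance_fraction: float = 0.02) -> bool:
    """Return True if the measured light-field/x-ray-field misalignment
    is within the tolerance limit (default: 2% of the SID)."""
    return abs(misalignment_cm) <= tolerance_fraction * sid_cm

# At a 100 cm SID the tolerance limit is 2 cm:
print(collimation_within_tolerance(misalignment_cm=1.5, sid_cm=100))  # True
print(collimation_within_tolerance(misalignment_cm=2.5, sid_cm=100))  # False
```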
QC for DR has evolved from simple to more complex tests and test tools to ensure that DR equipment works properly, meets optimum image quality standards, and falls within the ALARA philosophy in radiation protection. The American Association of Physicists in Medicine (AAPM) has recommended several testing procedures for CR QC, using specific tools developed for that purpose. Examples of these test procedures include physical inspection of the imaging plate (IP), dark noise and uniformity, exposure indicator (EI) calibration, laser beam function, spatial accuracy, erasure thoroughness, aliasing/grid response, and positioning and collimation errors. A toy illustration of one such test follows.
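The sketch below checks the uniformity of a flat-field CR image; the sampling scheme and the 10% tolerance are assumptions made for illustration, not the AAPM procedure.

```python
import numpy as np

def uniformity_check(flat_field: np.ndarray, max_variation: float = 0.10) -> bool:
    """Return True if mean signal varies by no more than the assumed
    tolerance across five regions of interest (centre + four quadrants)."""
    h, w = flat_field.shape
    rois = [
        flat_field[h//2 - 50:h//2 + 50, w//2 - 50:w//2 + 50],  # centre
        flat_field[:h//2, :w//2], flat_field[:h//2, w//2:],     # top half
        flat_field[h//2:, :w//2], flat_field[h//2:, w//2:],     # bottom half
    ]
    means = np.array([roi.mean() for roi in rois])
    variation = (means.max() - means.min()) / means.mean()
    return variation <= max_variation

# Simulated flat-field exposure: mean signal 1000, small random noise.
image = np.random.normal(1000.0, 20.0, size=(512, 512))
print(uniformity_check(image))  # True for a uniform plate
```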
QC is now an essential requirement of CT imaging, and users must have a clear understanding of the various tests that play a significant role in dose-image quality optimization.
Imaging informatics at a glance
Imaging Informatics is the current term used by the Society for Imaging Informatics in Medicine (SIIM) to replace the older term medical imaging informatics. SIIM notes that imaging informatics “is the study and application of processes of information and communications technology for the acquisition, manipulation, analysis, and distribution of image data.”
Imaging informatics is based on core topics that range from information and communication technologies, PACS, radiology information systems (RIS), and the electronic health record to cloud computing, Big Data, AI, ML, and DL. In this section, these core topics are only highlighted, to set the stage for the more detailed coverage later in the book.
Information technology (IT) refers to the use of computers and computer communications technologies not only to process, store, and secure electronic data but also to communicate these data over a computer networking infrastructure. PACS is an excellent example of an informatics-rich medical device. A PACS includes image acquisition modalities such as digital radiography and CT, a computer network database server, storage and archival systems, and display workstations. A PACS may be connected to information systems such as the RIS and the hospital information system (HIS). Furthermore, these systems must be fully integrated and secured for efficient and effective data interchange. One such approach within the PACS infrastructure is cloud computing, which simply provides a means of using the Internet for storage and retrieval of data (for example) using specific software packages. Additionally, emerging topics that will have an impact on the practice of medical imaging include Big Data, AI, ML, and DL. A toy sketch of the store-and-retrieve pattern at the heart of a PACS follows.
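The sketch below is a toy model (all class and method names are illustrative assumptions, not a real PACS or DICOM interface) of the store-and-retrieve pattern that ties modalities, the archive, and workstations together; a cloud back end offers the same pattern over the Internet.

```python
from dataclasses import dataclass, field

@dataclass
class Study:
    patient_id: str
    modality: str                     # e.g. "DR" or "CT"
    images: list = field(default_factory=list)

@dataclass
class PACSArchive:
    """Stands in for the database server and storage/archival system."""
    studies: dict = field(default_factory=dict)

    def store(self, study: Study) -> None:
        """Called when an acquisition modality sends a completed study."""
        self.studies[study.patient_id] = study

    def retrieve(self, patient_id: str) -> Study:
        """Called by a display workstation (or an RIS/HIS query)."""
        return self.studies[patient_id]

archive = PACSArchive()
archive.store(Study(patient_id="12345", modality="CT", images=["slice_001"]))
print(archive.retrieve("12345").modality)  # CT
```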
Big Data is characterized by four Vs: Volume, Variety, Velocity, and Veracity. Volume refers to the very large amount of data; Variety deals with the wide array of data drawn from multiple sources; Velocity addresses the very high speed at which the data are generated; and Veracity describes the uncertainty of the data, such as their authenticity and credibility. AI uses computers in an “effort to automate intellectual tasks normally performed by humans.” A subset of AI is ML, which includes “a set of methods that automatically detect patterns in data, and then utilize the uncovered patterns to predict future data or enable decision making under uncertain conditions.” A subset of ML is DL, which uses algorithms that are “characterized by the use of neural networks with many layers.” These emerging technologies will evolve and, more importantly, become useful tools in medical imaging. Therefore, students and technologists alike should make every effort to grasp their meaning and applications so that they can communicate effectively with radiologists and medical physicists and participate actively in the management of patient care.
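The quoted definition of ML (detect a pattern in data, then use it to predict future data) can be made concrete with a toy sketch. Everything here, from the synthetic data to the one-feature threshold rule, is an assumption chosen for illustration, not a medical imaging algorithm.

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic training data: one feature, two classes.
x_train = np.concatenate([rng.normal(2.0, 0.5, 100),   # class 0
                          rng.normal(5.0, 0.5, 100)])  # class 1
y_train = np.concatenate([np.zeros(100), np.ones(100)])

# "Detect the pattern": place a threshold midway between the class means.
threshold = (x_train[y_train == 0].mean() + x_train[y_train == 1].mean()) / 2

# "Predict future data": apply the learned threshold to unseen values.
x_new = np.array([1.8, 4.9, 4.2])
print((x_new > threshold).astype(int))  # [0 1 1]
```

DL follows the same detect-then-predict idea but, as the quotation above notes, learns the pattern with neural networks of many layers rather than a single hand-derived rule.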
RADIATION PROTECTION AND DOSE OPTIMIZATION
Current radiation protection standards and recommendations for the safe use of ionizing radiation to image humans are based on the fact that exposure to radiation can cause biological effects. These bioeffects fall in the subject matter domain of radiobiology.
Radiobiology
Radiobiology is the study of the effects of radiation on biologic systems. These effects occur at the molecular and cellular levels and subsequently lead to whole-body biological effects, generally categorized as early and late effects.
The study of radiobiology is vital for technologists and radiologists working in radiology departments, and it involves an understanding of related physics and chemistry, types of biological effects, radiosensitivity, target theory, and direct and indirect effects. Furthermore, radiobiology involves discussion of deterministic effects (early effects) and stochastic effects (late effects). These topics and their associated subtopics (for example, subtopics for stochastic effects are radiation-induced