Anti-scatter grids are used in biomedical x-ray imaging to improve image quality by reducing the scatter radiation that reaches the image receptor. However, this benefit comes at the cost of increased radiation exposure, because the grid also absorbs part of the primary beam. Grid performance can be improved by optimizing the strip thickness, which lowers the required exposure and increases the net benefit of the grid. Evidence has shown that strip height may also affect grid performance. This work investigates the optimization of grid performance by varying both the strip thickness and the strip height at a constant grid ratio of 15:1 (r15). A series of grid designs using lead strips and carbon-fiber interspace material, intended for high-energy use, was evaluated. The performance of each design was determined by Monte Carlo simulation. For each grid design, the signal-to-noise ratio improvement factor (KSNR) was computed. A maximum KSNR of 1.895 was found among these designs at a strip height of 6.8 mm and a strip thickness of 66.8 µm. The best performance of the r15-series grids is 6% greater than that of a grid design with grid ratio 15:1 and strip frequency 44 cm⁻¹ reported in the literature; correspondingly, the transmission of scatter radiation is reduced by 40%. The results show that grid designs can be optimized with respect to both strip height and strip thickness. If the strip height and thickness cannot be optimized simultaneously, the recommendation is to optimize the strip height, which improves radiation protection without compromising grid performance. The findings provide useful guidance for designing high-performance anti-scatter grids that reduce the radiation exposure of patients.
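The figure of merit used above can be illustrated with a minimal sketch, assuming the commonly used definition of the SNR improvement factor, KSNR = Tp / √Tt, where Tp is the primary-radiation transmission of the grid and Tt is the total (primary plus scatter) transmission. The transmission values in the example are hypothetical and are not taken from the paper:

```python
import math

def snr_improvement_factor(t_primary: float, t_total: float) -> float:
    """SNR improvement factor K_SNR = T_p / sqrt(T_t).

    t_primary: fraction of primary radiation transmitted by the grid (T_p)
    t_total:   fraction of total (primary + scatter) radiation transmitted (T_t)
    """
    if not (0.0 < t_primary <= 1.0 and 0.0 < t_total <= 1.0):
        raise ValueError("transmissions must lie in (0, 1]")
    return t_primary / math.sqrt(t_total)

# Hypothetical transmission values for illustration only:
k = snr_improvement_factor(t_primary=0.70, t_total=0.20)
print(round(k, 3))  # 0.70 / sqrt(0.20) ≈ 1.565
```

A KSNR above 1 means the grid improves the signal-to-noise ratio relative to imaging without a grid; in a Monte Carlo study, Tp and Tt would be estimated from the simulated primary and scattered photon fluences for each grid design.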
Number of pages: 10
Journal: International Journal of Imaging Systems and Technology
Publication status: Published - Dec 2020