As NDE operations—particularly radiographic testing—transition from analog to digital technologies such as computed radiography (CR), users are learning that there is more to digital image quality than meets the eye. In fact, multiple factors determine the final perceived image quality of a computed radiograph. Many of these factors are misunderstood, and some are touted as the "key parameter" or "magic bullet" for producing optimum image quality. In reality, such claims are oversimplified; they are more marketing hype than substance. The truth? Perceived image quality results from the cascaded effects of many factors—such as sharpness, system noise, spot size and pixel size, subject contrast, bit depth, radiographic technique, and so on. Many of these factors are within the control of radiographers or the designers of equipment and media. This paper explains some of these key factors, dispels some of the myths surrounding them, and shows that qualities such as bigger, smaller, more, or less are not always better when it comes to CR image quality.
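The "cascaded effects" idea can be illustrated with a standard cascaded-systems sketch: the overall modulation transfer function (MTF) of an imaging chain is the product of its component MTFs, so no single component (focal spot, pixel size, or storage-phosphor blur) can be a "magic bullet" on its own. The Gaussian blur widths and pixel pitch below are illustrative values, not figures from this paper.

```python
import math

def gaussian_mtf(f, sigma_mm):
    """MTF of a Gaussian blur with standard deviation sigma_mm, f in cycles/mm."""
    return math.exp(-2.0 * (math.pi * sigma_mm * f) ** 2)

def pixel_aperture_mtf(f, pixel_mm):
    """MTF of a square sampling aperture of width pixel_mm: |sinc(pi * a * f)|."""
    x = math.pi * pixel_mm * f
    return 1.0 if x == 0.0 else abs(math.sin(x) / x)

def system_mtf(f, spot_sigma_mm, pixel_mm, phosphor_sigma_mm):
    """Cascaded system MTF: the product of the component MTFs.

    The product is never higher than the weakest component, which is why
    improving one parameter in isolation gives diminishing returns.
    """
    return (gaussian_mtf(f, spot_sigma_mm)
            * pixel_aperture_mtf(f, pixel_mm)
            * gaussian_mtf(f, phosphor_sigma_mm))

# Illustrative values: 0.05 mm focal-spot blur, 0.1 mm pixel pitch,
# 0.04 mm storage-phosphor scatter blur, evaluated at 2 cycles/mm.
f = 2.0
total = system_mtf(f, 0.05, 0.1, 0.04)
print(f"system MTF at {f} cy/mm: {total:.3f}")
```

Because each factor multiplies in, shrinking the pixel size alone (for example) cannot restore sharpness lost to focal-spot or phosphor blur, which is the abstract's point that "smaller" is not always better.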