Optical density calculation
Hello,
My goal is to quantify two regions (abnormal vs. normal) of white matter and compare them in terms of relative optical density. My understanding is that optical density is calculated as OD = -log10(T), where T is the fraction of light transmitted. For a 16-bit digital image, T = (mean intensity value / 65535), with background taken into account.
The value calculated from this formula differs from the optical density value reported by Image-Pro Premier, even though exactly the same regions are used.
My question is whether my line of thinking is correct, and how the program actually calculates optical density. Would it matter if I used weighted RGB values instead?
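To make the calculation concrete, here is a minimal Python sketch of the approach described above, assuming a 16-bit grayscale image; `region` stands in for the pixels of one outlined region, and its values are hypothetical:

```python
import numpy as np

# Hypothetical pixel intensities for one region of a 16-bit image.
region = np.array([30000.0, 32768.0, 35000.0])

t = region.mean() / 65535.0   # fraction of light transmitted
od = -np.log10(t)             # OD = -log10(T)
print(f"T = {t:.3f}, OD = {od:.3f}")
```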
Answers
Intensity and Density measurements in Image-Pro Premier use the image's Intensity Calibration. If the image doesn't have an intensity calibration (the default), then raw pixel gray levels are measured.
If you want to measure optical density on your samples, you first have to apply a proper Intensity Calibration to your image (Capture tab, Calibration group, Create button drop-down).
Create a new calibration, change its type to "Optical Density", set the Black and Incident levels, and apply it to the image.
Then draw object outlines (using the Measure tools) or use Count/Size. The Intensity and IOD measurements will be reported in the data table.
Yuri
I have indeed set the intensity calibration type to Optical Density, with Black set to 0 and Incident set to 65535. What I am curious about is how the program performs the calculation to produce its output, because the optical density reported by the program and the optical density from my own calculation based on the reported intensity differ slightly (by about 10%), and I wish to figure out why.
OD = -log10((Val - BlackLevel) / (IncidentLevel - BlackLevel))
where (Val - BlackLevel) is replaced by (RangeMax - RangeMin) / (Samples - 1) when it is less than or equal to 0, which keeps the logarithm finite for pixels at or below the black level.
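In code form, that formula might look like the following Python sketch. The parameter names mirror the terms above; the exact semantics of RangeMax, RangeMin, and Samples inside Premier are an assumption (here taken as the representable range and number of gray levels of a 16-bit image):

```python
import numpy as np

def od_image(img, black_level=0.0, incident_level=65535.0,
             range_max=65535.0, range_min=0.0, samples=65536):
    """Per-pixel OD per the formula above; names mirror its terms.

    How Image-Pro Premier defines RangeMax/RangeMin/Samples is an
    assumption on our part.
    """
    val = np.asarray(img, dtype=np.float64) - black_level
    # Guard: where Val - BlackLevel <= 0, substitute the smallest
    # intensity step so the logarithm stays finite.
    step = (range_max - range_min) / (samples - 1)
    val = np.where(val <= 0, step, val)
    return -np.log10(val / (incident_level - black_level))
```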
The difference between your calculation and Premier's is that you calculate the mean of the pixel intensities first and then convert it to OD, while Premier calculates the OD of every pixel and then takes the mean of those OD values. Since the calibration is not linear, these two approaches produce different Mean OD results. For example, two pixels with T = 0.1 and T = 0.9 give an OD of the mean of -log10(0.5) ≈ 0.301, but a mean OD of (1.000 + 0.046) / 2 ≈ 0.523.
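A quick way to see the size of this effect is to compare the two orderings directly; here is a minimal sketch with hypothetical pixel data:

```python
import numpy as np

rng = np.random.default_rng(0)
# Hypothetical 16-bit region with a wide spread of intensities,
# which is where the two orders of operations diverge most.
region = rng.integers(5000, 60000, size=1000).astype(np.float64)

incident = 65535.0
od_of_mean = -np.log10(region.mean() / incident)    # mean first, then OD
mean_of_od = (-np.log10(region / incident)).mean()  # OD per pixel, then mean

print(f"OD of mean intensity: {od_of_mean:.4f}")
print(f"Mean of per-pixel OD:  {mean_of_od:.4f}")
# -log10 is convex, so by Jensen's inequality mean_of_od >= od_of_mean;
# the two agree only when every pixel in the region has the same value.
```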
Yuri