Calibrating Optical Images and Gamma Camera Images for Motion Detection

Michael A. Gennert 1,2, Philippe P. Bruyant 1, Manoj V. Narayanan 1, Michael A. King 1
1 University of Massachusetts Medical School, Worcester, MA • 2 Worcester Polytechnic Institute, Worcester, MA

Abstract

Objectives: One approach to motion detection in SPECT is to observe the patient using optical cameras. Patient motion is estimated from changes in the images and is used to modify the reconstruction algorithm. An important subproblem is calibrating the optical cameras and the gamma camera: it is necessary to determine the transformation from the gamma camera coordinate system to the optical camera coordinate system such that, given a gamma camera point, one may compute the corresponding optical camera point. Conversely, given an optical camera point, one may compute the corresponding patient ray.

Methods: We have devised a calibration phantom that can be imaged using both optical and gamma cameras. The phantom comprises a set of Lucite disks; each disk supports 2 low-intensity light bulbs and a 0.8 mm diameter hole, centered between the bulbs, to hold a 99mTc point source. The radioactive source location for each disk in image coordinates is taken to be the midpoint of the bulbs. The radioactive source location in gamma camera coordinates is found by segmenting the reconstructed source distribution and computing the centroid of the activity of each source. At least 6 such point pairs are needed, although 7 are used in practice to provide increased accuracy. Using procedure PROJ_MAT_CALIB of Trucco & Verri, Introductory Techniques for 3-D Computer Vision, we compute the 11 parameters of the coordinate transformation and the residual error. Because we do not know in advance which optical camera points match which gamma camera points, an exhaustive search is used to find the lowest-error matches.
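The parameter computation described in the Methods can be sketched in Python. This is a minimal illustration in the spirit of Trucco & Verri's PROJ_MAT_CALIB, not the authors' implementation; the function name `calibrate` and the use of NumPy are our assumptions.

```python
# Direct-linear-transform calibration sketch: each world/image point pair
# contributes 2 homogeneous linear equations in the 12 projection-matrix
# entries; the system A m = 0 is solved via SVD.
import numpy as np

def calibrate(world_pts, image_pts):
    """Estimate the 3x4 projection matrix from >= 6 world/image point pairs.

    Returns (M, residual), where residual = ||A m|| for the unit-norm
    solution m, i.e. the smallest singular value of A.
    """
    A = []
    for (X, Y, Z), (x, y) in zip(world_pts, image_pts):
        A.append([X, Y, Z, 1, 0, 0, 0, 0, -x*X, -x*Y, -x*Z, -x])
        A.append([0, 0, 0, 0, X, Y, Z, 1, -y*X, -y*Y, -y*Z, -y])
    A = np.asarray(A, dtype=float)
    _, s, Vt = np.linalg.svd(A)
    return Vt[-1].reshape(3, 4), s[-1]   # nullspace vector, residual
```

With 7 point pairs, A is 14 × 12; the redundant pair improves accuracy and yields a meaningful residual.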
Results: We have been able to match optical and gamma camera points and determine the transformation. Tomographic reconstruction and segmentation take up most of the processing time; point matching and parameter calculation take less than 14 seconds of processor time on a Digital Alpha 433au workstation.

Conclusions: A calibration phantom can be imaged simultaneously to calibrate optical and gamma cameras, and the transformation can be computed with no other input required.

1. Introduction

One approach to motion detection in SPECT is to observe the patient using optical cameras. Patient motion is estimated from changes in the images and is used to modify the reconstruction algorithm. In order to relate changes in patient position, as observed by the optical cameras, to SPECT data, as observed by the gamma camera, it is necessary to determine the camera parameters. This is the calibration problem: determining the transformation from the gamma camera coordinate system to an optical camera coordinate system and vice versa.

2. Procedure

We designed a calibration phantom, comprising a set of Lucite disks, that can be imaged by an optical camera and a gamma camera. The disk arrangement is non-coplanar and asymmetric, guaranteeing a unique solution for the calibration parameter equations.

Figure 1. Calibration Processing Flow: Acquire Optical Image -> Detect Blobs -> image point list; Acquire Gamma Image -> Detect Blobs -> world point list; both lists feed Generate Matches -> Calculate Parameters -> Select Best. Figures 2-7 show module details.

Figure 2. Calibration Phantom comprising 7 Lucite disks (left), each holding 2 light bulbs and a 99mTc well (approx. 0.1 mCi). Complete phantom is shown at right.

Figure 3. Optical (left) and gamma (right) images of the phantom. The 7 pairs of light bulbs are clearly visible in the optical image. The reconstructed gamma image shows 64 of 128 slices, with 5 of 7 gamma source blobs.

Blob Detection
- Read image
- Threshold
- Segment
- Select regions (optical: manually; gamma: largest regions)
- Compute centroids

Optical Blobs (14 bulb blobs):

  X Centroid   Y Centroid
  235.57628     57.86440
  248.13846     60.53846
  463.48648    108.54054
  476.22018    113.28440
  341.06384    237.17021
  351.514      240.28038
  335.50632    329.51898
  347.50485    336.35922
  173.41379    329.89655
  184.41176    334.80392
  294.75       385.03906
  303.3356     394.97260
  153.42073    431.84756
  164.48685    439.45395

Gamma Blobs (7 source blobs):

  X Centroid   Y Centroid   Z Centroid
  88.3193      39.7305      62.2723
  66.9417      87.3680      79.6488
  42.0171      40.2948      71.6796
  53.7263      86.8387      29.7913
  70.5912      86.4228       6.88985
  53.8215      87.5115      48.3729
  82.4415      87.1382      31.0288

Figure 4. Blob Detection. Processing is similar for optical and gamma blob detection. For optical blobs, final centroids are computed by taking midpoints of pairs of closest blobs.

Match Generation
- Generate all possible image/world point matches; each world/image point pair gives a possible match, so N points yield N! permutations
- Compute camera parameters for each possible match
- For each parameter set, calculate the residual error, defined as the sum of squared distances between the observed and projected image points
- Select the parameter set with the lowest residual error

Figure 5. Match Generation. The best match is found by exhaustive search.

Calculate Parameters (procedure PROJ_MAT_CALIB of Trucco & Verri, Introductory Techniques for 3-D Computer Vision)
- Coordinate systems:
  - Gamma camera / world coordinates xw
  - Optical camera coordinates xc: a rotation R and translation T from world coordinates xw
  - Image coordinates xi: a projection from camera coordinates xc, with camera focal vector f along the optical axis and camera center offset c
- Write the world-to-image transformation equations: world point xw = (xw, yw, zw) projects to image point xi = (xi, yi)
- Rewrite the equations to be linear in the projection matrix entries mij: 2 equations in 12 unknowns per point pair
- N point pairs yield 2N equations, so at least 6 point pairs are needed to compute mij
- Solve A m = 0 by finding m in A's nullspace using Singular Value Decomposition (SVD)
- Extract the parameters from the SVD solution

Figure 6. Parameter Calculation. Parameters R, T, f, and c are found from mij.

Sample Output

IPL = ImagePointList[
    ImagePoint(177.5, 82.0), ImagePoint(222.5, 211.0), ImagePoint(193.5, 349.0),
    ImagePoint(293.5, 289.5), ImagePoint(359.0, 83.5), ImagePoint(312.0, 347.0),
    ImagePoint(269.5, 437.0)]
WPL = WorldPointList[
    WorldPoint[88.3, 39.7, 62.2], WorldPoint[66.9, 87.3, 79.6], WorldPoint[82.4, 87.1, 31.0],
    WorldPoint[53.8, 87.5, 48.3], WorldPoint[42.0, 40.2, 71.6], WorldPoint[53.7, 86.8, 29.7],
    WorldPoint[70.5, 86.4, 6.8]]
Res = 9.05
CPs = CameraParameters[
    T: [79.8, -19.6, 104.2],
    R: [[-0.923, -0.262, -0.280], [-0.010, 0.747, -0.664], [-0.383, 0.610, 0.692]],
    IC: [[317.2], [238.6]],
    fx: 650.7, fy: 672.4]

Figure 7. Sample Output showing image points and world points in correct correspondence, with residual error = 9.05. Camera parameters are computed from the matrix entries mij. For a 640 x 480 image the expected camera center offset is [319.5, 239.5]; note the close agreement with the computed image center IC = [317.2, 238.6].

3. Conclusions

We have successfully calibrated optical and gamma cameras. The best-match residual error is <10 pixel². The next-lowest residual error is >1000 pixel², giving confidence that the best match of optical and gamma points has been found.
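The exhaustive search that separates the best match from the next-lowest one can be sketched as follows. This is our own minimal illustration of the Figure 5 procedure; the function names `residual` and `best_match` are assumptions, not the authors' code, and the DLT system is re-derived inline so the sketch is self-contained.

```python
# Exhaustive match search sketch: try every permutation of the world points
# against the fixed image-point order and keep the pairing with the smallest
# calibration residual.
from itertools import permutations
import numpy as np

def residual(world_pts, image_pts):
    """Smallest singular value of the DLT system A m = 0 for this pairing,
    i.e. the least-squares residual of the candidate correspondence."""
    A = []
    for (X, Y, Z), (x, y) in zip(world_pts, image_pts):
        A.append([X, Y, Z, 1, 0, 0, 0, 0, -x*X, -x*Y, -x*Z, -x])
        A.append([0, 0, 0, 0, X, Y, Z, 1, -y*X, -y*Y, -y*Z, -y])
    return np.linalg.svd(np.asarray(A, dtype=float), compute_uv=False)[-1]

def best_match(world_pts, image_pts):
    """Return (best ordering of world_pts, its residual).
    For N = 7 points this evaluates 7! = 5040 candidate pairings."""
    return min(((p, residual(p, image_pts)) for p in permutations(world_pts)),
               key=lambda t: t[1])
```

A correct pairing drives the residual toward zero, while incorrect pairings leave an overdetermined, inconsistent system, which is why the gap between the best and next-best residuals is so large in practice.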
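The final extraction of R, T, f, and c from the matrix entries mij (Figure 6) can likewise be sketched. This follows the standard decomposition of a projection matrix M = K [R | T]; the function name `decompose` and the sign conventions are our assumptions, not the authors' code.

```python
# Sketch of extracting physical camera parameters from the 3x4 projection
# matrix: rotation R, translation T, focal lengths (fx, fy), and image
# center offset (ox, oy).
import numpy as np

def decompose(M):
    """Split a 3x4 projection matrix M = K [R | T] into its parameters.

    M may carry an arbitrary scale and sign; it is renormalized so that
    ||M[2,:3]|| = 1 and the object lies in front of the camera (Tz > 0).
    """
    M = np.asarray(M, dtype=float)
    M = M / np.linalg.norm(M[2, :3])      # fix the unknown scale
    if M[2, 3] < 0:                       # fix the unknown sign
        M = -M
    q1, q2, q3 = M[0, :3], M[1, :3], M[2, :3]
    ox, oy = q1 @ q3, q2 @ q3             # image center offset c
    fx = np.sqrt(q1 @ q1 - ox**2)         # focal lengths in pixels
    fy = np.sqrt(q2 @ q2 - oy**2)
    R = np.vstack([(q1 - ox*q3) / fx, (q2 - oy*q3) / fy, q3])
    Tz = M[2, 3]
    T = np.array([(M[0, 3] - ox*Tz) / fx, (M[1, 3] - oy*Tz) / fy, Tz])
    return R, T, (fx, fy), (ox, oy)
```

The sanity check reported in Figure 7 falls out of this step: for a 640 x 480 image, (ox, oy) should land near [319.5, 239.5].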