Fish position determination in 3D space by stereo vision
Miroslav Hlaváč, Martin Kozák
27. 07. 2011
Project goals
• Design a low-budget system to determine the 3D position of fish in a water environment in real time
• Explore the capabilities of a two-camera system
• Explore the capabilities of the Kinect depth sensor
• Test both systems under different conditions
• Compare results from the cameras and the Kinect
• The designed system will be used to track differences in fish motion
Used equipment and software
• Aquarium (60 x 30 cm) – a similar one is planned for the real application of this project
• Two Microsoft LifeCam Studio webcams
• Calibration object (chessboard)
• Kinect for Xbox 360
• Rubber testing object
• Matlab
Two-camera system
• A system of two cameras emulates the human eyes
• The cameras must be calibrated to determine the system parameters
• These parameters are then used to compute 3D coordinates from two different views of the scene (epipolar geometry)
Epipolar geometry
• From a single image we can only determine the viewing ray of a point; to determine depth we need information from the second camera
• Select a point in the left image and find the corresponding point on the epipolar line in the right image
• Compute 3D coordinates from this pair of points
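The last step above (recovering 3D coordinates from a matched point pair) can be sketched with a standard linear (DLT) triangulation. This is an illustrative NumPy sketch with made-up toy projection matrices, not the Matlab code actually used in the project:

```python
import numpy as np

def triangulate(P1, P2, x1, x2):
    """Linear (DLT) triangulation of one point seen in two views.
    P1, P2: 3x4 camera projection matrices; x1, x2: (u, v) pixel coordinates."""
    A = np.vstack([
        x1[0] * P1[2] - P1[0],
        x1[1] * P1[2] - P1[1],
        x2[0] * P2[2] - P2[0],
        x2[1] * P2[2] - P2[1],
    ])
    # The homogeneous 3D point is the null vector of A (last row of V^T).
    _, _, Vt = np.linalg.svd(A)
    X = Vt[-1]
    return X[:3] / X[3]

# Toy setup: identity camera and a second camera shifted along the x-axis.
P1 = np.hstack([np.eye(3), np.zeros((3, 1))])
P2 = np.hstack([np.eye(3), np.array([[-1.0], [0.0], [0.0]])])
X_true = np.array([0.5, 0.2, 4.0])
x1 = P1 @ np.append(X_true, 1.0); x1 = x1[:2] / x1[2]
x2 = P2 @ np.append(X_true, 1.0); x2 = x2[:2] / x2[2]
print(np.allclose(triangulate(P1, P2, x1, x2), X_true))  # prints True
```

With exact (noise-free) correspondences the DLT recovers the point exactly; with real pixel measurements it gives a least-squares estimate.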
Camera calibration
• Two sets of parameters for the cameras:
– Extrinsic (rotation and translation between the cameras)
– Intrinsic (focal length, skew and pixel distortion for each camera)
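As an illustration of how the two parameter sets combine, here is a minimal projection sketch; all numeric values (focal lengths, principal point, the 60 mm baseline) are hypothetical and chosen only for the example:

```python
import numpy as np

# Hypothetical intrinsic parameters (pixels).
fx, fy, skew, cx, cy = 800.0, 800.0, 0.0, 320.0, 240.0
K = np.array([[fx, skew, cx],
              [0.0, fy, cy],
              [0.0, 0.0, 1.0]])          # intrinsic matrix

# Hypothetical extrinsics: second camera 60 mm to the right of the first.
R = np.eye(3)                            # rotation between the cameras
t = np.array([[-60.0], [0.0], [0.0]])    # translation (mm)

P = K @ np.hstack([R, t])                # 3x4 projection matrix

X = np.array([100.0, 50.0, 1000.0, 1.0])  # a point 1 m in front of the rig (mm)
x = P @ X
u, v = x[0] / x[2], x[1] / x[2]
print(u, v)  # pixel position of the point in the second camera
```

Calibration is the inverse problem: given many image-world point pairs (the chessboard corners), estimate K, R and t.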
Kinect
• Gaming device for the Xbox 360
• Projects an IR light pattern onto the scene through a special grid
• Computes depth information from the distortion of the projected pattern
Camera results 1
• Manual selection of corresponding points
• The white point on the rubber testing object is selected manually and its 3D trajectory computed
• 3D coordinate accuracy is ±0.5 mm
Camera results 2
• We developed an online tracking system running at 7 fps
• Automatic corresponding-point selection:
– Threshold the image
– Apply binary opening to eliminate small distortions
– Compute the mean position of the white pixels to get the corresponding point in both images
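The threshold–open–centroid pipeline can be sketched on a synthetic frame; this NumPy/SciPy version is only a stand-in for the project's Matlab processing, and the image contents are made up:

```python
import numpy as np
from scipy.ndimage import binary_opening

# Synthetic 8-bit grayscale frame: dark background, one bright 4x4 marker
# (the tracked white point) plus a single bright noise pixel.
frame = np.zeros((20, 20), dtype=np.uint8)
frame[8:12, 8:12] = 255        # the tracked bright marker
frame[2, 2] = 255              # a small speck of noise

binary = frame > 128                               # thresholding
cleaned = binary_opening(binary, np.ones((3, 3)))  # opening removes the speck

# Mean position of the remaining white pixels = the corresponding point.
ys, xs = np.nonzero(cleaned)
centroid = (float(xs.mean()), float(ys.mean()))
print(centroid)
```

The 3x3 opening erodes away the isolated noise pixel but restores the 4x4 marker, so the centroid lands on the marker's centre.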
Kinect accuracy
• Real vs. Kinect distance as a function of water depth – the measurement is depth independent
[Plot: measured distance [cm] vs. real distance [cm]]
• Kinect accuracy along the x-axis in water is ±3.5 pixels
[Plot: object size [pixel] vs. shift from the center of view [cm]]
Kinect results
• We developed an online tracking system running at 30 fps
• Maximum measurable depth in clear water is 40 cm
• Maximum measurable depth in dirty water is 20 cm
• The depth of a fish is obtained by depth thresholding
• Minimal measurable distance is 80 cm
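The depth-thresholding step can be sketched as follows; the depth map and the threshold band are hypothetical values for illustration, not the project's actual measurements:

```python
import numpy as np

# Synthetic Kinect-style depth map in millimetres.
depth = np.full((10, 10), 1200, dtype=np.uint16)  # tank background
depth[4:6, 4:7] = 1000                            # fish closer to the sensor

# Keep only pixels within a depth band around the expected fish distance.
near, far = 900, 1100
mask = (depth > near) & (depth < far)

# Centre of the segmented region in image coordinates.
ys, xs = np.nonzero(mask)
fish_xy = (float(xs.mean()), float(ys.mean()))
print(fish_xy)
```

Because the Kinect outputs depth directly, segmentation reduces to this one thresholding step, which is why the Kinect pipeline runs faster (30 fps vs. 7 fps) than the two-camera one.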
Kinect vs. cameras
Kinect:
• No need for calibration (+)
• Depth map is a direct output (+)
• No dependence on color or outer light (+)
• Maximum water depth is limited (-)
• IR-reflecting materials cause errors in the depth map (-)
• Lower accuracy in water (-)
• Minimal distance 80 cm (-)
Cameras:
• Precision (+)
• Environment independence (+)
• Image segmentation (-)
• Localization of corresponding points (-)
• Calibration for each new system position (-)
• Requires more processing power (-)
Conclusion
• Both systems are usable for online 3D fish position determination in water
• We would recommend the Kinect in environments where accuracy is not the main concern, the water is shallow and clean, and more mobility is needed
• Cameras offer higher accuracy and environment independence, but they require more processing power (corresponding-point detection) and initial calibration
Acknowledgement
We would like to thank Ing. Petr Císař, Ph.D. for leading us through this project and for his advice.