Moveable Interactive Projected Displays Using Projector Based Tracking
Johnny C. Lee 1,2, Scott E. Hudson 1, Jay W. Summet 3, Paul H. Dietz 2 — 1 Carnegie Mellon University, 2 Mitsubishi Electric Research Labs, 3 Georgia Institute of Technology. UIST 2005, Seattle, WA
UIST 2004 – Automatic Projector Calibration
1. Embed light sensors in the surface.
2. Project patterns to find the sensor locations.
3. Pre-warp the source image to fit the surface.
(video clip 1)
Correspondence between the location data and the projected image is free (i.e., no calibration with an external tracking system is needed). Transforms passive surfaces into active displays in a practical manner. Variety of useful applications.
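The decoding step in the calibration above can be sketched in a few lines: each projected pattern contributes one Gray-code bit per axis, and the bit sequences a sensor reads convert directly back to its pixel coordinates. This is a minimal illustration (function and variable names are mine, not from the talk):

```python
def gray_to_binary(g):
    # Invert the Gray code: XOR-fold the shifted value back down.
    b = 0
    while g:
        b ^= g
        g >>= 1
    return b

def decode_location(bits_x, bits_y):
    # bits_x / bits_y: the 0/1 values a sensor read during the ten
    # X patterns and ten Y patterns, most significant bit first.
    gx = int("".join(map(str, bits_x)), 2)
    gy = int("".join(map(str, bits_y)), 2)
    return gray_to_binary(gx), gray_to_binary(gy)

# Round trip: encode a known pixel (n ^ (n >> 1) is binary-to-Gray),
# then decode the "sensed" bits.
bx = [int(c) for c in format(300 ^ (300 >> 1), "010b")]  # Gray code of 300
by = [int(c) for c in format(200 ^ (200 >> 1), "010b")]  # Gray code of 200
assert decode_location(bx, by) == (300, 200)
```

Gray code matters here because adjacent pixel columns differ in exactly one pattern, so a sensor sitting on a stripe boundary is off by at most one pixel rather than by an arbitrary amount.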
Touch Calibration Projector-based AR Shader Lamps, MERL/UNC Diamond Touch, MERL Interactive Whiteboard Everywhere Displays, IBM
Focus on Moveable Projected Displays
Goals of this work:
1. Achieve interactive tracking rates for hand-held surfaces.
2. Reduce the perceptibility of the location discovery patterns.
3. Explore interaction techniques supported by this approach.
Display Surface
Constructed from foam core and paper. Touch sensitivity is provided by a resistive film. Lighter than a legal pad.
Tablet PC-like Interaction Video clip 2
Talk Outline • Reducing Perceptibility • Achieving Interactive Rates • Pattern Size and Shape • Tracking Loss • Interaction Techniques/Demos
Gray Code Patterns
Black and White: the difference is visible to the human eye.
Frequency Shift Keyed (FSK): the difference between HF and LF regions is NOT visible to the human eye.
FSK Transmission of Patterns
• FSK transmission of the Gray code patterns makes the striped region boundaries invisible to the human eye.
• The patterns appear to be solid grey squares to observers.
• The light sensor is able to demodulate the HF and LF regions into 0's and 1's.
• This is accomplished using a modified DLP projector.
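A minimal sketch of the demodulation idea: within one bit window, the HF region toggles twice as often as the LF region, so counting signal edges separates the two. The 180/360 Hz figures come from a later slide; the sample rate and edge cutoff below are illustrative, not the actual firmware values:

```python
import math

def square_wave(freq_hz, duration_s, sample_rate_hz):
    # Ideal 0/1 square wave, as the light sensor might sample it.
    n = int(duration_s * sample_rate_hz)
    return [int(math.floor(2 * freq_hz * i / sample_rate_hz) % 2 == 0)
            for i in range(n)]

def count_edges(samples):
    return sum(1 for a, b in zip(samples, samples[1:]) if a != b)

def demodulate_bit(samples, edge_cutoff=9):
    # More edges than the cutoff -> HF region -> bit 1; else bit 0.
    return 1 if count_edges(samples) > edge_cutoff else 0

# One 1/60 s window at an (illustrative) 10.8 kHz sample rate:
hf = square_wave(360, 1 / 60, 10800)
lf = square_wave(180, 1 / 60, 10800)
assert demodulate_bit(hf) == 1
assert demodulate_bit(lf) == 0
```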
Inside a DLP projector
DLP = Digital Light Processing
• Many consumer projectors currently use DLP technology.
• "DLP" is a Texas Instruments marketing term for DMD = Digital Micro-mirror Device.
• Each mirror corresponds to a pixel.
• Brightness corresponds to the duty cycle of the mirror.
Pictures from Texas Instruments literature.
Inside a DLP projector Light source Color wheel DMD Projector Lens
Inside our modified DLP projector Light source Projector Lens DMD
FSK Transmission of Location Patterns
• Removing the color wheel flattens the color space of the projector into a monochrome scale.
• Multiple points in the former color space now have the same apparent intensity to a human observer, but are manifested by differing signals.
• The patterns formerly known as "red" and "grey" are rendered as 180 Hz and 360 Hz signals respectively.
• A monochrome projector is not ideal, but it serves as a proof-of-concept device until we can build a custom DMD projector.
Talk Outline • Reducing Perceptibility • Achieving Interactive Rates • Pattern Size and Shape • Tracking Loss • Interaction Techniques/Demos
Projector Specifications
InFocus X1 (~$800 new)
– 800 x 600 SVGA resolution
– 1 DMD chip
– 60 Hz refresh rate
Full-Screen Location Discovery Time:
– 20 images ( log2(# pixels) )
– 333 ms at 60 Hz → 3 Hz maximum update rate
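The timing numbers follow directly from the pattern count, assuming one Gray-code image per projector frame (a quick check of the slide's arithmetic):

```python
import math

width_px, height_px = 800, 600   # InFocus X1, SVGA
frame_rate_hz = 60.0

# One Gray-code image per coordinate bit on each axis:
patterns = math.ceil(math.log2(width_px)) + math.ceil(math.log2(height_px))
discovery_ms = patterns / frame_rate_hz * 1000.0
max_update_hz = frame_rate_hz / patterns

print(patterns, round(discovery_ms), max_update_hz)   # 20 333 3.0
```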
Incremental Tracking
• Project small tracking patterns over the last known location of each sensor to obtain incremental offsets.
• Black masks reduce the visibility of the tracking patterns.
• Tracking-loss strategies are needed (discussed later).
• Smaller area = fewer patterns = faster updates
– a 32 x 32 unit grid requires 10 images → 6 Hz update rate
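The incremental rate is the same arithmetic on a much smaller coordinate range (a quick check, assuming one pattern per frame):

```python
import math

frame_rate_hz = 60.0
grid = 32   # incremental tracking window, in grid units per axis

patterns = 2 * math.ceil(math.log2(grid))   # 5 bits per axis -> 10 images
update_hz = frame_rate_hz / patterns

print(patterns, update_hz)   # 10 6.0
```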
Tracking Demo Video clip 3
Latency and Interleaving
• Incremental tracking is a tight feedback loop: project, sense, update, repeat.
• The 6 Hz update rate assumes 100% utilization of the 60 frames/sec the projector can display.
• System latencies negatively impact channel utilization.
• Achieving 100% utilization of the projection channel requires taking advantage of the axis independence of the Gray code patterns.
System Latency – Full X-Y Tracking
Timeline: the projector alternates X-pattern and Y-pattern sets while the software draws the X-Y patterns, then updates and draws the next set; hardware and OS scheduling delays fall in between the pattern sets.
Result: only 73% utilization of the projection channel.
System Latency – Interleaved Tracking
Timeline: X and Y pattern sets alternate back to back; while one axis is being projected, the software updates and draws the other axis's patterns.
Result: 100% utilization of the projection channel and a 12 Hz interleaved update rate.
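Interleaving the axes doubles the effective rate: a fresh single-axis estimate arrives every half pattern set (a quick check of the slide's figure):

```python
frame_rate_hz = 60.0
frames_per_axis = 5   # log2(32) patterns for a single axis

# Without interleaving, a full X-Y update needs 10 frames (6 Hz) and
# software latency leaves idle frames. With interleaving, one axis is
# projected while the other is processed, so a fresh single-axis
# estimate lands every 5 frames:
interleaved_update_hz = frame_rate_hz / frames_per_axis

print(interleaved_update_hz)   # 12.0
```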
Talk Outline • Reducing Perceptibility • Achieving Interactive Rates • Pattern Size and Shape • Tracking Loss • Interaction Techniques/Demos
Tracking Pattern Size
Tracking Area → Tracking Rate
– 32 x 32 grid → 12 Hz interleaved
– 16 x 16 grid → 15 Hz interleaved (−75% area, +25% rate)
A smaller tracking area increases the risk of losing the sensor (i.e., it lowers the maximum supported velocity).
The log2 relationship makes it hard to gain speed through the use of smaller patterns.
Tracking Pattern Size Pixel Density Decreases
Tracking Pattern Size
large, coarse tracking pattern ↔ small, fine tracking pattern
– Preserves the physical size of the tracking pattern (cm)
– Preserves the maximum supported velocity (m/s)
Distance is approximated from the screen size.
The scaling factor is adjustable (precision vs. max velocity): ~2.5 mm; 25 cm/s
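One way to realize this distance-adaptive scaling: as the surface moves away from the projector, the same 800 projected pixels span a wider physical area, so the pattern must use fewer pixels to keep its physical extent constant. A hypothetical helper (the clamp limits and defaults are illustrative, not the talk's values):

```python
def pattern_size_px(physical_cm, screen_width_cm,
                    screen_width_px=800, min_px=8, max_px=64):
    # Pixels-per-cm falls as the apparent screen grows with distance;
    # scale the pattern in pixels to hold its physical size fixed,
    # clamped to a usable range.
    px_per_cm = screen_width_px / screen_width_cm
    return max(min_px, min(max_px, round(physical_cm * px_per_cm)))

# The same 4 cm pattern at three distances (apparent screen widths):
print(pattern_size_px(4, 50),
      pattern_size_px(4, 100),
      pattern_size_px(4, 200))   # 64 32 16
```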
Motion Modeling
Predicting the motion can be used to increase the range of supported movement (e.g., max acceleration vs. max velocity).
Much of the prior work in motion modeling is applicable. But no model is perfect, and mispredictions can lead to tracking loss, potentially yielding poorer overall performance.
Models are likely to be application and implementation specific.
Tracking Pattern Shape
We used square tracking patterns due to the axis-aligned nature of Gray code patterns.
Patterns with high radial symmetry are best for general movement in two dimensions.
Pattern geometry can be optimized for specific applications.
Talk Outline • Reducing Perceptibility • Achieving Interactive Rates • Pattern Size and Shape • Tracking Loss • Interaction Techniques/Demos
Detecting Occlusions or Tracking Loss
Causes of tracking loss:
1. occlusions
2. exiting the projection area
3. exceeding the range of motion supported by the tracking patterns
With FSK transmission, tracking loss corresponds to a disappearance of the carrier signal. This allows error detection on a per-bit basis.
Implemented on a low-cost PIC processor as:
1. sudden drop in signal amplitude
2. insufficient amplitude
3. invalid edge count
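The per-bit checks might look like the following sketch. All thresholds are illustrative stand-ins for the PIC firmware's values; the point is that a bit window is rejected whenever its amplitude or edge count could not have come from the FSK carrier:

```python
def classify_bit(samples, amp_min=0.2, lo_edges=4, hi_edges=14):
    # Returns 1 (HF), 0 (LF), or None when the carrier is implausible.
    amp = max(samples) - min(samples)
    if amp < amp_min:
        return None   # sudden drop / insufficient amplitude: carrier lost
    mid = min(samples) + amp / 2
    edges = sum(1 for a, b in zip(samples, samples[1:])
                if (a > mid) != (b > mid))
    if not lo_edges <= edges <= hi_edges:
        return None   # invalid edge count: not a plausible FSK window
    return 1 if edges > (lo_edges + hi_edges) // 2 else 0

hf = [1.0 if (i // 15) % 2 == 0 else 0.0 for i in range(180)]  # HF-like window
lf = [1.0 if (i // 30) % 2 == 0 else 0.0 for i in range(180)]  # LF-like window
assert classify_bit(hf) == 1
assert classify_bit(lf) == 0
assert classify_bit([0.0] * 180) is None   # occluded: no carrier at all
```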
Lost Tracking Behavior
Single/independent sensors:
1. Discard and hope the sensor has not moved.
2. Perform a full-screen discovery process (+333 ms).
3. Grow the tracking pattern around the last location until the sensor is reacquired.
Multiple sensors with a known geometric relationship:
1. Try the above three techniques.
2. Compute predicted lost-sensor locations using the locations of the remaining available sensors.
Tracking Loss With Multiple Sensors video clip 4
Estimating Lost Sensors
Available Sensors → Action
– 4/4: No estimation needed; compute the 4-point warping homography directly.
– 3/4: Measure 6 offsets; compute an affine transform to estimate the lost sensor from its last known location.
– 2/4: Measure 4 offsets; compute a simplified transform to estimate the lost sensor locations.
– 1/4: Measure 2 offsets; compute a 2D translation.
– 0/4: Try full-screen discovery.
Note: The transformations for each point cannot be implemented as a simple matrix stack because LIFO ordering of sensor loss and reacquisition is not guaranteed.
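The 3/4 case can be sketched as fitting an affine transform from the three still-tracked sensors (their last known vs. current positions) and applying it to the lost sensor's last known location. A minimal illustration, not the actual host code:

```python
def det3(M):
    return (M[0][0] * (M[1][1] * M[2][2] - M[1][2] * M[2][1])
          - M[0][1] * (M[1][0] * M[2][2] - M[1][2] * M[2][0])
          + M[0][2] * (M[1][0] * M[2][1] - M[1][1] * M[2][0]))

def solve3(A, b):
    # Cramer's rule for a 3x3 linear system.
    d = det3(A)
    out = []
    for col in range(3):
        Ai = [row[:] for row in A]
        for r in range(3):
            Ai[r][col] = b[r]
        out.append(det3(Ai) / d)
    return out

def affine_from_3(last, now):
    # Fit x' = a*x + b*y + c and y' = d*x + e*y + f from the three
    # tracked sensors; both fits share the same 3x3 design matrix.
    A = [[x, y, 1.0] for x, y in last]
    return (solve3(A, [p[0] for p in now]),
            solve3(A, [p[1] for p in now]))

def estimate_lost(last_pos, coeffs):
    (a, b, c), (d, e, f) = coeffs
    x, y = last_pos
    return (a * x + b * y + c, d * x + e * y + f)
```

A quick sanity check: if the surface scaled by 2x and translated by (5, −3), the fitted transform moves the lost sensor's last position the same way:

```python
last = [(0.0, 0.0), (10.0, 0.0), (0.0, 10.0)]
now = [(5.0, -3.0), (25.0, -3.0), (5.0, 17.0)]
est = estimate_lost((10.0, 10.0), affine_from_3(last, now))  # ~(25.0, 17.0)
```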
Talk Outline • Reducing Perceptibility • Achieving Interactive Rates • Pattern Size and Shape • Tracking Loss • Interaction Techniques/Demos
Supported Interaction Techniques Video clip 5
Supported Interaction Techniques Simulated Tablet PC Magic Lens Location Aware Displays Focus + Context Input Pucks
Conclusion
• Unifying the tracking and projection technology greatly simplifies the implementation and execution of applications that combine motion tracking with projected imagery.
– Coherence between the location data and the projected image is free.
– Does not require an external tracking system or calibration.
– Simple: the demos were created in about a week.
• This approach has the potential to change the economics of interactive displays.
– The marginal cost of each display can be as low as $10 USD.
– Museum: wireless displays could be handed out to visitors.
– Medical clinic: physical organization of patient charts/folders.
Future Work
• Removing the color wheel was a proof-of-concept workaround; construct a high-speed projector using a DLP development kit.
• Explore using infrared to project invisible patterns.
• Explore other applications where low-speed positioning is sufficient.
• Achieve 18+ Hz (36+ Hz interleaved) tracking with visible patterns and an unmodified DLP projector using the RGB sections.
• Use multiple projectors (or steerable projectors) to increase freedom of movement.
Acknowledgements
Funded in part by the National Science Foundation under grants IIS-0121560 and IIS-0325351.
Funded in part by Mitsubishi Electric Research Labs.
Johnny Chung Lee, johnny@cs.cmu.edu
Technical Details
• InFocus X1 ($800), 800 x 600, 60 Hz
• PIC16F819 at 20 MHz, 10-bit ADC
• Sensor package < $10 in volume
• 4-wire resistive touch-sensitive film
• IF-D92 fiber-optic phototransistors
• 45 bytes/sec for location data
• 25 mW during active tracking
• Latency: 77 ms – 185 ms