Computational method improves the resolution of time-of-flight depth sensors 1,000-fold

For the past 10 years, the Camera Culture group at MIT’s Media Lab has been developing innovative imaging systems—from a camera that can see around corners to one that can read text in closed books—by using “time of flight,” an approach that gauges distance by measuring the time it takes light projected into a scene to bounce back to a sensor.
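The time-of-flight principle described above reduces to a simple relation: depth is the speed of light times the measured round-trip time, divided by two (the light travels out and back). A minimal sketch of that conversion, not the group's actual implementation, might look like this:

```python
# Illustrative sketch of the time-of-flight depth relation (assumed example,
# not the Camera Culture group's implementation).
C = 299_792_458.0  # speed of light in vacuum, m/s


def tof_depth(round_trip_time_s: float) -> float:
    """Depth from a measured round-trip time.

    Light covers the sensor-to-scene distance twice, so halve the total
    path length to get the depth.
    """
    return C * round_trip_time_s / 2.0


# A 10-nanosecond round trip corresponds to roughly 1.5 meters of depth,
# which illustrates why fine depth resolution demands very fine timing.
print(tof_depth(10e-9))  # → 1.49896229
```

The halving step is the crux: any timing uncertainty in the sensor translates directly into depth uncertainty at half the distance light travels in that interval.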