
My friend Steve Silverman worked on a system, commercialized back in 2006 [1], that, unlike the MIT one, captures full frames, so you can get 3D at video frame rates.

It uses 5 ns pulsed-laser "photon torpedoes" at 30 Hz to illuminate the scene and then captures the returns at a much higher sampling rate, whereas the MIT system just scans a line at a time. So, unlike the MIT one, it's a small, hand-holdable system that captures full motion. [2]

You get full scene 3D without the drawbacks of scanning.
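To make the ranging idea concrete, here's a minimal sketch of the time-of-flight math behind any pulsed lidar (my own illustration, not ASC's actual processing): the round-trip time of the reflected pulse at each pixel gives that pixel's range, and the 5 ns pulse width bounds how finely you can resolve depth from the pulse alone.

```python
# Basic flash-lidar ranging math (illustrative sketch, not ASC's code).
C = 299_792_458.0  # speed of light, m/s

def range_from_round_trip(t_seconds):
    """Distance to target given the round-trip time of a laser pulse."""
    return C * t_seconds / 2.0

# A 5 ns pulse spans roughly 0.75 m of range, so sub-meter depth
# resolution requires timing the return finer than the pulse width.
pulse_extent_m = range_from_round_trip(5e-9)
print(pulse_extent_m)  # ~0.75 m
```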

They flew one of their cameras on the last Discovery Mission.

[1] http://asc3D.com

[2] http://www.youtube.com/watch?v=3L91F9o600E

http://video.google.com/videoplay?docid=-3656494784112768834

I had the chance to play with it a couple years ago - quite amazing.



Parallel vs. Serial.

The Google Tech Talk goes into a little detail about how they capture a stack of frames (slices in Z) into a buffer right behind the sensor and then dump it out for each snapshot.
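A hypothetical sketch of that "stack of slices in Z" idea (the bin spacing and array shapes here are my assumptions, not figures from the talk): each slice is the sensor sampled at one time bin after the pulse, and the bin where a pixel's return peaks gives that pixel's depth.

```python
import numpy as np

# Illustrative reconstruction of depth from a stack of time-sliced
# frames. BIN_DT (1 ns between slices) is an assumed value.
C = 299_792_458.0  # speed of light, m/s
BIN_DT = 1e-9      # assumed time between Z slices, seconds

def depth_map(slices):
    """slices: (n_bins, H, W) intensity stack; returns (H, W) range in meters."""
    peak_bin = np.argmax(slices, axis=0)   # time bin of the strongest return
    return C * (peak_bin * BIN_DT) / 2.0   # round-trip time -> one-way range

# Toy scene: a 2x2 sensor where each pixel's return lands in a different bin,
# i.e. each pixel sees a surface at a different depth.
stack = np.zeros((4, 2, 2))
stack[0, 0, 0] = stack[1, 0, 1] = stack[2, 1, 0] = stack[3, 1, 1] = 1.0
print(depth_map(stack))  # depths step up by ~0.15 m per pixel
```

The parallel/serial contrast above is exactly this: the whole (H, W) frame is resolved per time bin at once, instead of sweeping a line across the scene.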



