I am currently a graduate student at the Institute of Applied Art at National Yang Ming Chiao Tung University. Recently I have seen many interactive projection works and want to start learning about them. However, my undergraduate program did not cover much in this area, and I am not very familiar with the relevant software and technology. I am attaching a case from the Sony Future Experience Program for reference. Below are some related questions:
In the video, the projected UI interacts with physical objects that are later added to the surface. What sensing technologies and components are used to achieve this interaction?
As far as I know, most projectors do not have sensing technology built in. Are there any projectors that can handle sensing themselves (e.g. laser sensing…)?
I have become somewhat familiar with tools such as Unity and Arduino, and I feel there is a chance to achieve similar interactions, but I am not yet clear on the actual know-how. I would like to talk with a senior member to learn more about this part.
I would appreciate it if senior members of the club could provide some keywords for the technology in the video as a reference, or tell me what tools I could use to achieve the desired interactive effects. If anyone is interested, I would also be happy to chat privately to learn more about the technology and its applications. Thank you!
I don’t know exactly what components or sensing technologies are used, but from a quick look at the first YouTube reference it is almost certainly a computer-vision camera tracking the fingers, with an overhead projector shining down on the table surface, since you can see the images falling on the hands in the projected area. There might also be some touch sensing done with an IR camera below a rear-projection diffusion screen, but it is hard to tell for sure.
You could start by researching OpenCV and computer vision. I don’t think this is using a Kinect, as that is better suited to full-skeleton tracking. Look into “Blob Track” on the forum and in the wiki, and the Blob Track TOP in Operator Snippets (Help/Operator Snippets/TOP/Blob Track) to see how TouchDesigner can track objects. Optical flow might also be useful here.
On projectors with sensors: I worked for a projector manufacturer for many years, and we used GigE cameras to do automatic alignment, warping, stacking and blending using computer-vision techniques. But we did not build them directly into the projector, since it was usually most beneficial to be able to place the sensing cameras wherever you wanted, often covering more than a single projected image. There might be some built in by now, but as far as I know most projectors still don’t have built-in laser sensors or optical cameras. It would not be very hard to do, however. You could start your experiments with as little as a cheap webcam and a micro projector from Amazon or similar.
I did some beginner Unity programming, but I think TouchDesigner is a better way to build interactive work like this. You could probably have something like the above up and running within a few weeks once you borrow or buy a webcam and a cheap projector. So get on YouTube and look up optical-tracking demos! Good luck, and please report back what you learn!