This year I'm submitting a talk to SIGGRAPH titled "Real-VFX in your Living Room". It explores what happens when you take traditional visual effects and apply them to a scene built from live depth and color buffer data from a camera (in this case the Kinect). Deferred lighting, soft particles, snow accumulation, depth-buffer noise filtering, and more will be covered. This video was my supplementary submission material. I hope it gets accepted!
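To give a flavor of the depth-buffer noise filtering mentioned above, here is a minimal sketch of one common approach: temporal exponential smoothing of Kinect-style depth frames, where a value of 0 marks an invalid reading. The function name and the zero-means-dropout convention are my assumptions for illustration, not necessarily what the talk covers.

```python
def filter_depth(prev, frame, alpha=0.2):
    """Exponentially smooth a depth frame against the previous filtered frame.

    prev, frame: flat lists of depth values in millimetres; 0 = no reading
                 (a hypothetical convention for this sketch).
    alpha: blend weight for the new sample (higher = less smoothing lag).
    """
    out = []
    for p, f in zip(prev, frame):
        if f == 0:          # sensor dropout: hold the last good value
            out.append(p)
        elif p == 0:        # no history yet: take the raw sample as-is
            out.append(f)
        else:               # blend the new sample into the running average
            out.append((1 - alpha) * p + alpha * f)
    return out

# Feeding a noisy sequence through the filter damps per-pixel flicker:
frames = [[1000, 0, 2000], [1040, 1500, 0], [960, 1500, 2000]]
state = [0, 0, 0]
for frame in frames:
    state = filter_depth(state, frame)
```

In a real pipeline this would run per-pixel on the GPU before the depth buffer feeds the lighting and soft-particle passes, since raw Kinect depth flickers enough to make depth-dependent effects shimmer.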