I was reading the BBC's test of the Nikon D800, and it raised some good points about how its actual HD resolution falls short of true HD because of the interpolation algorithms Nikon uses to compress the much higher-resolution sensor data. What I was wondering is why the interpolation is happening in the camera at all. That can't be a very good place for it.

I think the next big advance will come when a nice full-frame camera can capture 24 raw frames a second. Once that happens, everything changes: suddenly you could use Lightroom to adjust every frame of a film, and all the downsizing interpolation could be done on a PC. The only real drawback would be syncing the sound, but even that could be handled in camera if it were set up to do it. Of course, an actual DSLR probably wouldn't be ideal because of the noise, but an interchangeable-lens camera with a full-frame sensor would be perfect.
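To give a feel for what "doing the downsizing on a PC" could look like, here is a minimal sketch in Python using NumPy. It downsamples a full-resolution frame to 1080p by simple block averaging (pixel binning). This is an assumption for illustration only: real cameras and raw converters use more sophisticated resampling (and line skipping, which is part of why in-camera HD can look soft), and the function name `downsample_to_hd` is made up here.

```python
import numpy as np

def downsample_to_hd(frame, out_h=1080, out_w=1920):
    """Downsample a frame to HD by averaging equal-sized blocks.

    A toy stand-in for the interpolation step; assumes the frame is
    at least as large as the target in both dimensions. A real raw
    pipeline would demosaic first and use proper resampling filters.
    """
    h, w = frame.shape[:2]
    fy, fx = h // out_h, w // out_w
    # Crop to an exact multiple of the target size, then average
    # each fy x fx block down to a single output pixel.
    cropped = frame[:out_h * fy, :out_w * fx]
    return cropped.reshape(out_h, fy, out_w, fx, -1).mean(axis=(1, 3))

# Example: a synthetic 2160x3840 RGB frame (a small stand-in for
# full 36 MP sensor data) binned down to 1080p.
sensor = np.random.rand(2160, 3840, 3)
hd = downsample_to_hd(sensor)
print(hd.shape)  # (1080, 1920, 3)
```

Because this runs on the PC after capture, you could swap in any resampling algorithm you like per frame, which is exactly the flexibility you lose when the camera bakes the interpolation in.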