720p displays 60 full frames per second, while 1080i displays 30 frames per second, delivered as 60 alternating fields (which might get de-interlaced inside the set).
The question I find never gets answered is: how many individual frames are being "captured" and "transmitted" per second?
If it turns out that it's 30 frames in both cases, then 1080i converted to 1080p inside the set should be superior, right? I assume the 60 fields would be de-interlaced into 30 proper frames and each frame doubled for display at 60 Hz (or possibly quadrupled for 120 Hz sets).
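To make the idea concrete, here's a toy sketch (my own illustration, not any actual TV's firmware) of "weave" de-interlacing: pairing each even field with its matching odd field to rebuild a full frame, then repeating each frame to fill a 60 Hz refresh. The function names and the tiny field sizes are made up for the example.

```python
# Toy "weave" de-interlace: 60 fields/s -> 30 woven frames/s,
# then frame-doubling -> 60 displayed frames/s.
# Fields are modeled as lists of scan lines.

def weave(field_even, field_odd):
    """Interleave an even field and an odd field into one full frame."""
    frame = []
    for even_line, odd_line in zip(field_even, field_odd):
        frame.append(even_line)  # lines 0, 2, 4, ...
        frame.append(odd_line)   # lines 1, 3, 5, ...
    return frame

def deinterlace_and_double(fields):
    """Weave consecutive field pairs, then show each frame twice."""
    frames = [weave(fields[i], fields[i + 1])
              for i in range(0, len(fields) - 1, 2)]
    displayed = []
    for f in frames:
        displayed.extend([f, f])  # doubled for a 60 Hz display
    return displayed
```

This only reconstructs the original picture perfectly when both fields of a pair came from the same instant in time, which is exactly the 30-frame scenario described above.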
On the other hand, if sports programming is being "captured" and "transmitted" at 60 Hz (60 individual frames per second), then 720p would have the edge, because 720p can display 60 unique frames per second, whereas 1080i converted to 1080p inside the set can only display 30 unique frames per second.
The spoiler is anything that's "captured" using a 1080i camera (where the individual fields are captured separately, 1/60 of a second apart) and "transmitted" at 1080i. I believe some Discovery programming used to be captured this way. No progressive display, whether 720p or 1080p, can ever deal with this cleanly, because the alternating fields aren't supposed to match up. I'm guessing this type of "capture" method is on the decline.
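Here's a toy illustration (my own example, not from any broadcast spec) of why this case is so hard on progressive displays: if a moving object shifts position between the two field captures, weaving the fields back together puts the object in different places on alternate scan lines, producing the familiar "combing" artifact.

```python
# Combing demo: a vertical bar moves one pixel per field.
# Each field samples the scene at a different time, so the
# woven frame's even and odd lines disagree about where the bar is.

def capture_field(t, parity, height=6, width=8):
    """Sample the scene at time t (bar at column t); keep only the
    scan lines of the given parity (0 = even lines, 1 = odd lines)."""
    lines = []
    for y in range(parity, height, 2):
        line = [1 if x == t else 0 for x in range(width)]
        lines.append(line)
    return lines

def weave(field_even, field_odd):
    """Interleave two half-height fields into one full frame."""
    frame = []
    for e, o in zip(field_even, field_odd):
        frame.extend([e, o])
    return frame

# Two fields captured 1/60 s apart (t=0, then t=1):
even_field = capture_field(0, parity=0)
odd_field = capture_field(1, parity=1)
frame = weave(even_field, odd_field)
# Even lines show the bar at column 0, odd lines at column 1: combing.
```

A set can hide this with motion-adaptive ("bob") de-interlacing, but that trades away vertical resolution on moving content rather than recovering a true progressive frame.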
Edited by jediphish, 13 March 2010 - 08:58 AM.