JonW said:
...Something else to keep in mind is that 1080i uses half the refresh rate of 720p, hence why 720p is preferred for fast action shows like sports. So when converting from 720p to 1080i there's more going on than just scaling, 1/2 the frames are dropped...
Not exactly. The higher frame rate is one reason, but not the main one. The main reason is that progressive video has no interlace motion error on moving images (or at least a whole hell of a lot less than 1080i).
Half the frames are not dropped. What gets dropped is half of each frame: the alternating lines within it. Each 720p frame lasts 1/60th of a second. Each serially delivered 720-line progressive frame is rescaled to 1080 lines. The odd lines from the first frame are kept and the even lines are dropped; those odd lines become the 540 lines that make up the first field of the interlaced frame. The even lines from the second progressive frame are kept and the odd lines are dropped; those remaining even lines become the 540 lines that make up the second field of the same interlaced frame. Two fields of 1/60th of a second each then make one interlaced frame of 1/30th of a second (actually 1/59.94 and 1/29.97). If they simply dropped half the frames outright, that would cut the resolution down to 540 lines.
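To make that concrete, here's a rough sketch of the line handling in Python. The helper names are mine, the frames are just numpy arrays of shape (lines, pixels), and a crude nearest-neighbor rescale stands in for whatever filter a real converter would use; horizontal rescaling (1280 to 1920) is ignored to keep the focus on the lines.

```python
import numpy as np

def upscale_720_to_1080(frame):
    # Crude nearest-neighbor vertical rescale from 720 lines to 1080 lines.
    # A real converter uses a proper scaling filter; this just keeps the sketch runnable.
    idx = np.arange(1080) * 720 // 1080
    return frame[idx]

def two_720p_frames_to_one_1080i_frame(frame_a, frame_b):
    # frame_a and frame_b are consecutive 720p frames, 1/60th of a second apart.
    a = upscale_720_to_1080(frame_a)   # first frame, rescaled to 1080 lines
    b = upscale_720_to_1080(frame_b)   # second frame, rescaled to 1080 lines

    field_1 = a[0::2]   # keep the odd-numbered lines (1st, 3rd, 5th...) of frame A -> 540 lines
    field_2 = b[1::2]   # keep the even-numbered lines (2nd, 4th, 6th...) of frame B -> 540 lines

    # Weave the two 540-line fields together into one 1080i frame (1/30th of a second of video).
    interlaced = np.empty_like(a)
    interlaced[0::2] = field_1
    interlaced[1::2] = field_2
    return interlaced

# Example: two flat gray 1280x720 frames in, one 1080-line interlaced frame out.
f1 = np.full((720, 1280), 100, dtype=np.uint8)
f2 = np.full((720, 1280), 200, dtype=np.uint8)
print(two_720p_frames_to_one_1080i_frame(f1, f2).shape)   # (1080, 1280)
```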
JonW said:
...Likewise when converting from 1080i to 720p 1/2 the resolution is dropped...
Again, not really. There are over two million pixels in a 1080i frame and fewer than a million in a 720p frame. That sounds a lot like over half the resolution is dropped, but resolution depends on much more than sheer pixels per frame. Since two 720p frames are delivered in the same amount of time as one 1080i frame, the delivered pixel rates are pretty similar, with 720p delivering about 89% of the pixels per second that 1080i does.
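If you want to check that number yourself:

```python
# Delivered pixel rates (the 1000/1001 NTSC factor cancels out, so it's ignored here)
px_per_sec_1080i = 1920 * 1080 * 30   # ~62.2 million pixels per second
px_per_sec_720p  = 1280 * 720 * 60    # ~55.3 million pixels per second
print(px_per_sec_720p / px_per_sec_1080i)   # ~0.889, i.e. about 89%
```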
Not only that, but there is a huge difference between potential resolution and perceived resolution. For still images, 1080i has somewhat higher potential resolution than 720p, but in the real world 1080i still images appear only mildly sharper, not significantly sharper (and they actually are not), due to many factors. For moving images, 720p is actually perceived as sharper (and in some ways it actually is sharper).
Bottom line, there is very little perceived difference in resolution between 720p and 1080i, even on 1080p-native displays, and of course the only resolution that matters is the resolution you actually perceive. If one system were truly superior to the other, wouldn't all the networks have adopted the superior format? Of course they would. But neither format is actually superior, which explains why folks can rarely if ever even tell the difference.