Can someone explain why 1080p needs more bandwidth than 1080i? Both have 1080 lines of 1920 pixels, and both deliver 30 complete pictures per second, so the same amount of data should be sent in each 1/30 of a second.
The "i" stands for interlaced: only the even-numbered lines (a 1920x540 field) are sent in one 1/60th of a second, then the odd-numbered lines (another 1920x540 field) in the next 1/60th. A full 1920x1080 frame is never sent all at once. The comparison that matters is at the same refresh rate: 1080i60 carries 60 half-frames per second, which is the data of 30 full frames, while 1080p60 carries 60 full frames. So at 60 Hz, 1080i needs half the bandwidth of 1080p. If you instead compare 1080i60 to 1080p30, the raw data rates are the same, which is where the confusion in the question comes from.
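Here's a quick back-of-the-envelope sketch in Python that makes the arithmetic concrete. It counts raw pixels per second only, ignoring compression, chroma subsampling, and blanking intervals; the `pixels_per_second` helper is just for this illustration, not anything from a real video library.

```python
# Raw pixel-rate comparison (pixel counts only; ignores compression,
# chroma subsampling, and blanking intervals).

WIDTH = 1920

def pixels_per_second(lines_per_picture: int, pictures_per_second: int) -> int:
    """Raw pixel rate for a given picture height and picture rate."""
    return WIDTH * lines_per_picture * pictures_per_second

# 1080i60: 60 fields/s, each field carries 540 lines (every other line)
i60 = pixels_per_second(540, 60)

# 1080p30: 30 full 1080-line frames/s -- same raw rate as 1080i60
p30 = pixels_per_second(1080, 30)

# 1080p60: 60 full 1080-line frames/s -- double the raw rate
p60 = pixels_per_second(1080, 60)

print(f"1080i60: {i60:,} pixels/s")
print(f"1080p30: {p30:,} pixels/s")
print(f"1080p60: {p60:,} pixels/s")
print(f"1080p60 / 1080i60 = {p60 / i60:.0f}x")
```

Running it prints identical rates for 1080i60 and 1080p30 (62,208,000 pixels/s) and exactly double for 1080p60, which is the "half the bandwidth" claim in numbers.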