
· Mentor · 40 Posts · Discussion Starter · #1
My plasma TV has 720p native screen resolution. I believe that D* sends us an MPEG-4 signal in 720p. Would it be best to leave my HR20 with a 720p output on the HDMI cable or go with 1080i? I really can't see a difference when I switch between the two. Thanks.
 

· The Shadow Knows! · 36,634 Posts
Bottom line is, do what looks best to you. Most people with 720p TVs like to use 720p, but personally (don't ask me why) everything on my TV looks better at 1080i.
 

· Premium Member · 41,526 Posts
keyoctave said:
My plasma TV has 720p native screen resolution. I believe that D* sends us an MPEG-4 signal in 720p. Would it be best to leave my HR20 with a 720p output on the HDMI cable or go with 1080i? I really can't see a difference when I switch between the two. Thanks.
If your TV is only 720p, then you'll never "see" 1080i, as a 1080i signal will be converted to 720p by your TV.
DirecTV "simply" passes on whatever the broadcaster resolution is.
Some are 720p and others are 1080i, but "only your TV" would know and if 720p is all your TV will display, then..... :)
 

· Hall Of Fame · 1,212 Posts
keyoctave said:
My plasma TV has 720p native screen resolution. I believe that D* sends us an MPEG-4 signal in 720p. Would it be best to leave my HR20 with a 720p output on the HDMI cable or go with 1080i? I really can't see a difference when I switch between the two. Thanks.
Your plasma TV is not 720p. It is 768p. There are only a very few sets ever made that were 720p, one of them being the previous 37-inch Panasonic plasma. That's the only one I know of that was ever a 720p plasma. Is that yours?
 

· Broadcast Engineer · 4,146 Posts
keyoctave said:
My plasma TV has 720p native screen resolution. I believe that D* sends us an MPEG-4 signal in 720p. Would it be best to leave my HR20 with a 720p output on the HDMI cable or go with 1080i? I really can't see a difference when I switch between the two. Thanks.
DTV sends channels in their original format (unless you count the soon-to-be-history MPEG-2 1080i channels, which are sometimes derezzed to 1280x1080 or 1440x1080). So if you get your local CBS/NBC affils in MPEG-4, they are at 1080i, while your local FOX/ABC affils will be at 720p. Most MPEG-4 sat channels other than these are 1080 (the ESPNs, currently in MPEG-2, are 720p).

If your set is truly a 720p-native set, setting the DVR to 720p is not much different than leaving it at 1080i, although you would avoid some double conversions by setting the DVR to 720.

If you set the DVR to 1080, for instance, the DVR rescales ABC from 720 to 1080, and then your HDTV rescales it from 1080 back down to 720, essentially reversing the process, both of which could be avoided altogether by setting the DVR to 720. Unnecessary rescaling can be done pretty transparently, but at best it can only match, never beat, leaving it in 720 mode all the way.

For 1080 content, setting the DVR to 1080 means it gets rescaled to 720 in the HDTV, and setting it to 720 means it gets rescaled in the DVR. Assuming the scalers are fairly equivalent (which is a safe bet) there would be little difference between them.

Bottom line, setting the DVR to 720 probably makes the best sense.
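
If it helps to see the two chains side by side, here's a rough little sketch (Python, purely illustrative; the function and structure are my own, only the resolutions are real) counting the rescale hops a 720p broadcast takes on a 720p-native panel:

Code:
# Illustrative only: count the rescale hops a 720p broadcast takes
# before it reaches a 720p-native panel, for each DVR output setting.

def conversion_chain(source_lines, dvr_output_lines, panel_lines):
    """Return the list of line counts the signal passes through."""
    chain = [source_lines]
    if dvr_output_lines != chain[-1]:      # DVR rescales if its output setting differs
        chain.append(dvr_output_lines)
    if panel_lines != chain[-1]:           # the HDTV rescales to its native panel
        chain.append(panel_lines)
    return chain

for dvr_setting in (720, 1080):
    chain = conversion_chain(720, dvr_setting, 720)
    print(f"DVR at {dvr_setting}: {' -> '.join(map(str, chain))}  ({len(chain) - 1} rescale step(s))")

# DVR at 720:  720                  (0 rescale steps)
# DVR at 1080: 720 -> 1080 -> 720   (2 rescale steps)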
 

· Mentor · 40 Posts · Discussion Starter · #6
dtrell said:
Your plasma TV is not 720p. It is 768p. There are only a very few sets ever made that were 720p, one of them being the previous 37-inch Panasonic plasma. That's the only one I know of that was ever a 720p plasma. Is that yours?
You're right. Mine is 768. I just had 720 on my mind! :D
 

· Mentor · 40 Posts · Discussion Starter · #7
TomCat said:
DTV sends channels in their original format (unless you count the soon-to-be-history MPEG-2 1080i channels, which are sometimes derezzed to 1280x1080 or 1440x1080). So if you get your local CBS/NBC affils in MPEG-4, they are at 1080i, while your local FOX/ABC affils will be at 720p. Most MPEG-4 sat channels other than these are 1080 (the ESPNs, currently in MPEG-2, are 720p).

If your set is truly a 720p-native set, setting the DVR to 720p is not much different than leaving it at 1080i, although you would avoid some double conversions by setting the DVR to 720.

If you set the DVR to 1080, for instance, the DVR rescales ABC from 720 to 1080, and then your HDTV rescales it from 1080 back down to 720, essentially reversing the process, both of which could be avoided altogether by setting the DVR to 720. Unnecessary rescaling can be done pretty transparently, but at best it can only match, never beat, leaving it in 720 mode all the way.

For 1080 content, setting the DVR to 1080 means it gets rescaled to 720 in the HDTV, and setting it to 720 means it gets rescaled in the DVR. Assuming the scalers are fairly equivalent (which is a safe bet) there would be little difference between them.

Bottom line, setting the DVR to 720 probably makes the best sense.
Thank you TomCat for the detailed explanation! It all makes more sense now.
 

· Icon · 526 Posts
I use native mode. My TV is 1365x768 and has to rescale everything anyway, so it's better that it only be done once.

Something else to keep in mind is that 1080i uses half the refresh rate of 720p, hence why 720p is preferred for fast action shows like sports. So when converting from 720p to 1080i there's more going on than just scaling, 1/2 the frames are dropped. Likewise when converting from 1080i to 720p 1/2 the resolution is dropped.

On the other hand, since DirecTV cannot transmit in 1080p24 format, an HD movie is converted to 1080i60. If you have a well-designed TV, it can detect the film content within the 1080i60 signal and convert it back to, or at least treat it as, 1080p24 - but this isn't going to happen if you let the HR2x output in 720p mode.
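
For anyone curious how 24 fps film rides inside a 1080i60 signal, here's a toy sketch of standard 2:3 pulldown (Python, names made up for illustration; a TV's film-mode detection essentially reverses this cadence):

Code:
# Toy 2:3 ("3:2") pulldown: every 4 film frames become 10 interlaced
# fields, turning 24 frames/s into 60 fields/s. A TV with good film-mode
# detection spots the repeated fields and reassembles the 1080p24 frames.

CADENCE = (2, 3, 2, 3)    # fields emitted per film frame, repeating

def pulldown(film_frames):
    fields = []
    for frame, repeats in zip(film_frames, CADENCE * (len(film_frames) // 4 + 1)):
        for _ in range(repeats):
            parity = "top" if len(fields) % 2 == 0 else "bottom"
            fields.append((frame, parity))
    return fields

print(pulldown(["A", "B", "C", "D"]))
# 10 fields from 4 frames: A,A, B,B,B, C,C, D,D,D (alternating top/bottom parity)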

In summary, if you have a high-quality TV, use Native mode; unfortunately, Native mode probably still slows down channel switching.
 

· New Member · 2 Posts
Hmm... I don't know about all that. I have a Panasonic 50" plasma, and when I'm watching channels that are in 720p, I can totally tell when I switch to another channel that is in 1080i. Furthermore, my TV tells me whether the content is 720p or 1080i... so how does it scale it back down from 1080i to 720p when it says it's in 1080i??? I have the HR21 set to Native, so it flickers when switching resolutions...
 

· Icon · 526 Posts
Your TV is telling you what signal it's getting, not what it's capable of displaying. If you're not sure what your TV can actually display, list the model # and we can look it up. A 50" Panasonic is typically 768p or 1080p, though.

Plasma, LCD, DLP, and LCoS displays, unlike CRTs, are fixed-pixel displays and are only physically capable of displaying a single resolution, although the vast majority of them will accept and convert various input resolutions.

That flickering you get with native mode as the TV adjusts to the changing signal is one of the drawbacks, and part of the slowness, but another advantage is that if your TV remembers settings by input resolution (like mine does), it will automatically show 480i/480p content in "wide mode" and 1080i/720p content in "full mode" or whatever you prefer.
 

· The Shadow Knows! · 36,634 Posts
Stuart Sweet said:
Bottom line is, do what looks best to you. Most people with 720p TVs like to use 720p, but personally (don't ask me why) everything on my TV looks better at 1080i.
veryoldschool said:
If your TV is only 720p, then you'll never "see" 1080i, as a 1080i signal will be converted to 720p by your TV.
DirecTV "simply" passes on whatever the broadcaster resolution is.
Some are 720p and others are 1080i, but "only your TV" would know and if 720p is all your TV will display, then..... :)
VOS, you're absolutely right, which is why I cannot account for my 720p tv looking better when fed 1080i content. Not a lot better, just a little sharper. My theory is that the 1080i signal is downsampled to 1366x768 progressive, and I do have the image enhancements on, while the 720p signal is upsampled to 1366x768, and perhaps the image enhancing software can't do a good job.

Or perhaps my TV is just possessed.
 

· Super Moderator · 13,787 Posts
My plasma has to either upconvert 720p to 768 or downconvert 1080i to 768.

Speaking only for me, I found that I get a better picture when the source is 1080i.

My plasma seems to do a better job downconverting than upconverting.

I think the only way anyone can choose is to test it and see what they come up with.

My 2¢. :)

Mike
 

· Premium Member · 41,526 Posts
Stuart Sweet said:
VOS, you're absolutely right, which is why I cannot account for my 720p tv looking better when fed 1080i content. Not a lot better, just a little sharper. My theory is that the 1080i signal is downsampled to 1366x768 progressive, and I do have the image enhancements on, while the 720p signal is upsampled to 1366x768, and perhaps the image enhancing software can't do a good job.
Or perhaps my TV is just possessed.
Sounds like it's better at "throwing out" parts of the image than at "making them up". :)
 

· Mentor · 40 Posts · Discussion Starter · #14
OK, another couple of questions......

I notice that on my DVR's HDTV settings, I can select any of the resolutions from 480i to 1080i. Would selecting 720p and 1080i allow the unit to switch between either resolution? Would it have to be in native mode to do that?
 

· The Shadow Knows! · 36,634 Posts
Technically if you are in native mode and have all resolutions checked, the DVR will put out 480i, 720p, or 1080i. However, your display will still show 768 lines progressively.
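
If it helps to picture it, here's a toy model of that behavior (Python; the names and logic are just my sketch of what's described in this thread, not the box's actual firmware):

Code:
# Sketch of the HR2x output decision as described above (illustrative only).

ENABLED = ["480i", "720p", "1080i"]      # resolutions checked in the HDTV settings menu

def dvr_output(broadcast_format, native_mode, enabled=ENABLED):
    """Format the DVR puts on the HDMI cable; the TV then scales it to its 768-line panel."""
    if native_mode and broadcast_format in enabled:
        return broadcast_format           # pass the broadcast format straight through
    return enabled[-1]                    # fixed mode: the DVR converts to the chosen output

print(dvr_output("720p", native_mode=True))    # -> '720p'  (TV does the scaling to 768)
print(dvr_output("720p", native_mode=False))   # -> '1080i' (DVR converts first, then TV scales)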
 

· Mentor · 40 Posts · Discussion Starter · #16
OK, just so I understand this.......

If I check all the resolutions and have the box set in native mode, the box (DVR) will output whatever that resolution is and pass it through the HDMI cable, letting the TV do the up/down conversion/interlacing to its native display resolution.

If I only check one box (say 720p) and de-select native mode, then the box does the up/down conversion/interlacing and sends that 720p output through my HDMI cable to my TV, which will then have to upconvert it to its native screen resolution.

Is this correct?
 

· New Member · 8 Posts
It seems that many of you are concerned about image scaling, and rightly so. Why don't you do what I did and get yourself a professional external scaler? I personally use the Anchor Bay DVDO VP50. However, Anchor Bay is coming out with a new processor called the DVDO Edge, which costs a lot less and has most of the features you'll need.

I put my HR20 into "Native" mode (turning on all the resolutions), my Panasonic plasma into 1:1 pixel ratio, and the VP50 is what resizes the image for my display. It makes most Standard Definition material look like High Def. I've put it side by side with having my plasma scale the image and with having the HR20 scale the image, and it's easily better to have the external processor do it instead of either.

As a nice plus, for BBC America and other SD programs that are run as letterbox, I have the VP50 scale the image properly so it fills the screen and maintains the original aspect ratio. Looks much better than using the Zoom feature of the Panasonic plasma.

As a side note, I'm actually a television network engineer, so I can tell even the most minute discrepancies in picture quality, but even my non-engineer friends can see the differences I'm talking about.
 

· Super Moderator · 13,787 Posts
jello2594 said:
It seems that many of you are concerned about image scaling, and rightly so. Why don't you do what I did and get yourself a professional external scaler? I personally use the Anchor Bay DVDO VP50. However, Anchor Bay is coming out with a new processor called the DVDO Edge, which costs a lot less and has most of the features you'll need.

I put my HR20 into "Native" mode (turning on all the resolutions), my Panasonic plasma into 1:1 pixel ratio, and the VP50 is what resizes the image for my display. It makes most Standard Definition material look like High Def. I've put it side by side with having my plasma scale the image and with having the HR20 scale the image, and it's easily better to have the external processor do it instead of either.

As a nice plus, for BBC America and other SD programs that are run as letterbox, I have the VP50 scale the image properly so it fills the screen and maintains the original aspect ratio. Looks much better than using the Zoom feature of the Panasonic plasma.

As a side note, I'm actually a television network engineer, so I can tell even the most minute discrepancies in picture quality, but even my non-engineer friends can see the differences I'm talking about.
Don't need to. ;)

I have it set to 480p & 1080i and it does a great job with scaling. :D

Mike
 

· Broadcast Engineer · 4,146 Posts
keyoctave said:
Thank you TomCat for the detailed explanation! It all makes more sense now.
Uhh...yeah. Except it doesn't exactly apply if you are 768-native and not 720.

That means that you will give up some small amount of rez some of the time on some 1080 content if you choose 720 as the DVR output. "Native" mode might make more sense for you, except it can lengthen channel-change times and HDMI handshake times, up to 7-8 seconds, which many find intolerable.

If your set was made before 2005 (when most sets started to do 1080 deinterlace properly), 720 might actually still look better, since the DVR DOES do deinterlace properly, and setting your DVR to 720 means that happens in the DVR rather than in the HDTV.

Best advice is to try them all and pick what works best for you. There will actually be little difference anyway.
 

· Broadcast Engineer · 4,146 Posts
JonW said:
...Something else to keep in mind is that 1080i uses half the refresh rate of 720p, hence why 720p is preferred for fast action shows like sports. So when converting from 720p to 1080i there's more going on than just scaling, 1/2 the frames are dropped...
Not exactly. The faster refresh is one reason, but not the main reason, which is that there will not be any interlace motion error on moving images (or at least a whole hell of a lot less than with 1080i).

Half the frames are not dropped. Half of each frame, the alternating lines in each frame, are dropped. Each frame of 720p takes 1/60th of a second. Each frame serially delivered as 720 progressive is rescaled into 1080 lines. The odd lines are kept from the first frame and the even lines are dropped. The odd lines then become the 540 lines that comprise the first field of the interlaced frame. The even lines are kept from the second progressive frame and the odd lines are dropped. Those remaining even lines then become the 540 lines that comprise the second field of the same interlaced frame. Two fields of 1/60th second each then make one interlaced frame of 1/30th second each (actually 1/59.94 and 1/29.97). If they simply dropped half the frames outright, that would reduce the resolution to 540.
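
In case a sketch makes that easier to follow, here's the same idea in toy Python (illustrative only; a real converter filters and rescales rather than literally copying lines, and the names here are made up):

Code:
# Toy model of 720p60 -> 1080i30: each 720p frame is rescaled to 1080 lines,
# then alternate lines from consecutive frames become the two 540-line fields.

def rescale_to_1080_lines(frame_720):
    """Placeholder scaler: stretch a 720-line frame to 1080 lines."""
    return [frame_720[int(i * 720 / 1080)] for i in range(1080)]

def p720_to_i1080(progressive_frames):
    interlaced = []
    for frame_a, frame_b in zip(progressive_frames[0::2], progressive_frames[1::2]):
        lines_a = rescale_to_1080_lines(frame_a)
        lines_b = rescale_to_1080_lines(frame_b)
        top_field    = lines_a[0::2]     # keep alternating lines of the 1st frame
        bottom_field = lines_b[1::2]     # keep the other lines of the 2nd frame
        interlaced.append((top_field, bottom_field))   # 540 + 540 lines, 1/30 s total
    return interlaced

frames = [[f"frame{n} line{l}" for l in range(720)] for n in range(2)]   # two 1/60 s frames
fields = p720_to_i1080(frames)
print(len(fields), len(fields[0][0]), len(fields[0][1]))   # 1 interlaced frame, 540 + 540 lines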

JonW said:
...Likewise when converting from 1080i to 720p 1/2 the resolution is dropped...
Again, not really. There are over 2 million pixels in a 1080i frame and less than a million in a 720p frame. That sounds a lot like over half the resolution is dropped, but resolution is dependent upon much more than sheer pixels per frame. Since two frames of 720p are delivered in the same time frame as 1 frame of 1080i, the delivered pixel rate is pretty similar, with 720p delivering about 88% of the pixels per second that 1080i does.
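
The arithmetic behind that figure, for anyone who wants to check it (nominal 60/30 Hz rates; the real 59.94/29.97 rates cancel out of the ratio):

Code:
# Delivered pixels per second for each format.
pixels_720p  = 1280 * 720  * 60    # 55,296,000 px/s (60 full progressive frames)
pixels_1080i = 1920 * 1080 * 30    # 62,208,000 px/s (30 full interlaced frames, as 60 fields)

print(pixels_720p / pixels_1080i)  # ~0.889, i.e. roughly the "88%" figure above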

Not only that, but there is a huge difference between potential resolution and perceived resolution. 1080i has a potential resolution somewhat higher than 720p for still images, but in the real world 1080i still images do not appear to be significantly sharper (and they actually are not), only mildly sharper, due to many factors. For moving images, 720p images actually are perceived as sharper (and in some ways they actually are sharper).

Bottom line, there is very little perceived difference in resolution between 720p and 1080i, even on 1080p-native displays, and of course the only resolution that matters is the resolution that is perceived. If one system were actually superior to the other, wouldn't all the networks have adopted the superior format? Of course they would. But neither is actually a superior format, which explains why folks can rarely if ever even tell the difference.
 