
No difference between Native On/Off and 480, 720, 1080...

Discussion in 'DIRECTV HD DVR/Receiver Discussion' started by CHaynes112, Dec 7, 2009.

  1. Dec 7, 2009 #1 of 27
    CHaynes112 AllStar

    Greetings,

    I'm new to the HDTV scene (but fairly tech savvy). I just bought a new 60" plasma (LG 60PS11) and have been trying to tweak everything to get the best possible experience. I have done a few basic things like disabling Energy Star mode, changing from Vivid to Standard, and enabling 24p mode. I have also made a few audio changes, like enabling auto volume level and TruSurround XT (as I don't have a real 5.1/7.1 surround system yet). I am getting ready to calibrate my TV as well (hopefully this will improve my SD quality). I am currently using an HR20-100 (0x368) connected to my TV via an HDMI cable.

    I have been trying to experiment with the Native feature and with different resolutions. Here's the thing... I can't seem to notice ANY difference with Native ON versus Native OFF. I also don't notice ANY difference when swapping resolutions (like going from 480i to 480p to 720p to 1080i). Can someone explain this to me... is my TV converting everything to 1080p?

    EDIT: I have all the modes enabled 480, 720, 1080...

    Just an FYI, I do see the differences when toggling between Original, Crop, Pillar, etc....

    -Thanks
     
  2. Dec 7, 2009 #2 of 27
    harsh Beware the Attack Basset

    Did you fully define all of the display modes the TV can do on the HR20? If you set it for native but only have 1080i as an option, it will convert everything to 1080i.
     
  3. Dec 7, 2009 #3 of 27
    CHaynes112 AllStar

    Yes, I have enabled all the modes. The screen flashes, then it says to press INFO if you can see the message.

    EDIT: Just an FYI, I do see the differences when toggling between Original, Crop, Pillar, etc....
     
  4. Dec 7, 2009 #4 of 27
    rudeney Hall Of Fame

    Make sure you are on an HD channel to test. The best would be a 1080i channel. If you are watching an SD channel, then you likely won't see much difference at all.
     
  5. Dec 7, 2009 #5 of 27
    DrummerBoy523 Godfather

    If you have Native set to ON, the DVR will output the video at whatever resolution the program was received in and let your TV do the conversion.

    If you have Native set to OFF, the DVR will convert the signal to whatever resolution(s) you have selected on your DVR. This avoids the resolution changes at the TV.
     
  6. Dec 7, 2009 #6 of 27
    RobertE New Member

    Make sure you're using an HD connection between the box and TV, either HDMI or component (red-blue-green), not composite (yellow).
     
  7. Dec 7, 2009 #7 of 27
    gully_foyle Hall Of Fame

    Your TV will convert everything to its fixed display format, whatever that is. However, if you force the HR2x to transmit something with less resolution (say, 480p), you should be able to see a difference, since (720p or 1080i) -> 480p -> (TV display format) will lose info at the 480p stage.

    To force 480p, you'd have to 1) turn off native mode and 2) uncheck 720p and 1080i. You would not want to run permanently like that, however.
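    The loss at the 480p stage is easy to see in raw pixel counts. A quick sketch (frame sizes only; interlaced formats actually deliver half the lines per field):

```python
# Approximate frame dimensions for common ATSC broadcast formats.
formats = {"480p": (720, 480), "720p": (1280, 720), "1080i": (1920, 1080)}

for name, (w, h) in formats.items():
    print(f"{name}: {w * h:,} pixels per frame")

# 480p carries only about a third of the pixels of 720p, so forcing
# (720p or 1080i) -> 480p -> (TV display format) discards detail that
# the TV's upscaler cannot restore.
```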

    Few of us can eyeball the difference between 1080i and 720p. But there can be artifacting, especially with sports, if there are too many conversions. Native mode takes the box's converter out of the picture.
     
  8. Dec 7, 2009 #8 of 27
    TomCat Broadcast Engineer

    Really? That does not sound right. Native is supposed to pass everything, regardless of format, in that format, meaning that the rescaling/deinterlacing in the DVR is essentially out of the circuit. The only real disadvantage of native is that it can increase channel acquisition times, as the HDMI handshake takes longer when the display itself is forced to handle multiple formats.

    About the only thing you really need to try to avoid is setting the DVR for anything less than 1080i when using a 1080p or 768p display, as that has the potential to reduce the resolution slightly for 1080i content.

    But actually, the best thing is to try all combinations and use what works the best. Theoretically, it makes sense to use either native on or 1080i at the DVR and let the TV rescale, but on careful experimentation, that may not actually be the best option. It can depend on the TV, and the scaler, deinterlacer, and other circuitry in the TV.

    I can give you a couple of examples: My original HD set was a 2004 Sony with a 1366x768 fixed resolution (pre-1080p). It did not deinterlace as well as the HD DTivo or HD DVR+, meaning that the 720p setting actually produced better PQ than 1080i, even for 1080i content.

    Deinterlacers have improved, but there are still idiosyncratic differences in how TVs handle things. My new 1080p Vizio seems to present the exact same resolution (this from an original 1080i source, either an HD DTivo or an HR2x) whether the DVR is set to 1080i or 720p, yet there is a bit more chroma in the 1080i setting. Using a 720p source yields about the same results. These tests were done with video that had a lot of high-resolution areas in the picture, and there seemed to be no loss of resolution whatsoever when changing from 1080i to 720p. Of course you can readjust chroma levels to suit your taste with either setting, but different settings also seem to affect how the sharpness control works.

    The Vizio sharpness is a setting from 0 to 7. When fed a 1080i signal from a 1080i source, it seems to look best set at about 5, though there are minor differences across 0-7. If I keep the sharpness set to 5 and feed it a 720p signal (from the same original 1080i source, by setting the DVR output to 720p), that looks a bit over-enhanced, and 4 is the best setting (so ironically, 720p is sharper, at an equivalent setting, than 1080i). Settings above 5 show increasingly high levels of over-enhancement, and 7 is so cartoony as to be unwatchable.

    So try all settings and choose for yourself.
     
  9. Dec 7, 2009 #9 of 27
    Mike Bertelson 6EQUJ5 WOW! Staff Member Super Moderator DBSTalk Club

    You are correct.

    From the HD DVR+ manual:

    "In Native mode, the receiver automatically adjusts resolutions to match the resolution of individual TV programs as they are tuned."

    Per the manual, it doesn't matter which resolutions you have checked: in Native mode, whatever format the original program is received in is what gets passed.

    Mike
     
  10. ccsoftball7 Godfather

    Then the manual is wrong. You must check each resolution you wish to pass and have native on.
     
  11. Mike Bertelson 6EQUJ5 WOW! Staff Member Super Moderator DBSTalk Club

    And there it is. Not an opinion, not a line out of the manual, but direct observation. It didn't take very long for the response (quicker than I thought). :)

    This is one of those situations where what is assumed, and what the manual says, is wrong.

    This has been covered quite a few times before, and each time it's been determined that only the checked resolutions get passed, as confirmed both by the LED on the front of the receiver and by what the TV says it's receiving. :D

    Mike
     
  12. RobertE New Member

    ccsoftball7 is correct. If you only have one resolution selected in the setup, the HR will only send that resolution. As far as it's concerned, that's the only resolution available on the TV, so that's what it sends.
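    The behavior the thread settles on, with Native passing a program format through only when that format is also checked, can be sketched as a toy model (hypothetical function and names, not actual HR2x firmware; the fallback to the highest checked resolution is a simplification):

```python
# Toy model of the HR2x output-resolution choice as described in this
# thread (hypothetical; not actual DIRECTV firmware).
ORDER = ["480i", "480p", "720p", "1080i"]  # low to high

def output_resolution(native_on, program_res, checked):
    """Return the resolution the box sends to the TV.

    checked: the set of resolutions ticked in the DVR's setup menu;
    only these are ever sent, even in Native mode.
    """
    if native_on and program_res in checked:
        # Native ON passes the program format through, but only
        # if that format is among the checked resolutions.
        return program_res
    # Otherwise the box converts to a checked resolution
    # (simplified here as the highest one checked).
    return max(checked, key=ORDER.index)

print(output_resolution(True, "720p", {"720p", "1080i"}))   # passed through
print(output_resolution(True, "720p", {"1080i"}))           # converted
print(output_resolution(False, "480i", {"480p", "1080i"}))  # converted
```

    This also models DogLover's use case below: leaving a problem resolution unchecked guarantees the box never sends it, even with Native on.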
     
  13. MountainMan10 Icon

    You also need to change the channel after changing the selected resolutions.

    If you have only 720p selected, and the box is sending 720p to the TV, and you then select 1080i, it will continue to send 720p until you change the channel.

    The best way to test is to press the resolution button on the front panel to switch the mode. Most TVs will briefly display the new mode.
     
  14. CHaynes112 AllStar

    Thanks to everyone for their input... I really appreciate it.
     
  15. DogLover Hall Of Fame

    And this is a really good function for some of us. My old DLP will not accept 480i over HDMI or component. It will also not accept 720p at all. Even if I use native on, I don't want it to ever send those resolutions because the TV will just display "unsupported signal".
     
  16. TomCat Broadcast Engineer

    I guess that actually makes sense: if you had a display that had a problem with one resolution or another, "native on" would be fairly impractical for that user. The ability to block certain resolutions allows them to use it with a finicky display like that.

    This is not the first time the manual has been wrong (it's almost as if it were written by certain DBSTalk posters! :rolleyes:). The manual also clearly states that HDMI has higher PQ than HD component, a myth that has been thoroughly debunked here many times.
     
  17. soloredd Legend

    On my 720p plasma, I've found Native OFF with ONLY 720p enabled gives the best picture for both HD and SD. Channel changes are quick, and nothing needs to be converted since the signal is always 720p (or so I assume).
     
  18. David MacLeod New Member

    On my two 1080i sets I've messed with all the settings a lot and never see any difference, so I leave Native on with resolution at 1080i only, since then I don't have to play with the format options.
     
  19. sptrout AllStar

    Kind of off-topic, but: is there ever a reason to have 480p selected, since there are no sources in that format? I think I remember reading a long time ago that a few TVs were built using that format, but even if that was correct at one time, it is no longer.

    In my case (with a 1080p TV) I have Native on, but only have 720p and 1080i as the compatible formats. I think (a question) that this is best for me, since 1) we seldom watch any SD material, and 2) it reduces switching time (because I have only two possible formats selected). Does this seem correct?
     
  20. veryoldschool Lifetime Achiever Staff Member Super Moderator DBSTalk Club

    "Correct" comes down to what you like.
    480p is there because some TVs don't handle 480i over HDMI.
     
