HD Lite a Myth?

Discussion in 'DIRECTV HD DVR/Receiver Discussion' started by Steve, Apr 1, 2007.

Thread Status:
Not open for further replies.
  1. Apr 1, 2007 #1 of 95
    Steve

    Steve Well-Known Member

    23,049
    149
    Aug 22, 2006
    Lower...
    A FIOS user posted that his FIOS HD was slightly better than the HR-20's HD.

    My OTA feeds come via a Winegard 8-bay on my roof directly from the Empire State Building in NYC, the same place the D* east coast network feeds originate. I don't believe anyone can get a "purer" HD input signal than this. I see no difference between the sat feeds on 80, 82, 86, and 88 and the OTA feeds on 2.1, 4.1, 5.1, and 7.1 on my Fujitsu 50" 40 series plasma. If his observation is correct, maybe it's because they all travel through the same HR-20 output circuitry and are being "normalized", so to speak? My Fujitsu does not have a built-in ATSC tuner, so I can't compare.

    Or are the reports of D*'s HD being "lite" greatly exaggerated? Curious what others have observed. TIA.

    /steve
     
  2. Apr 1, 2007 #2 of 95
    veryoldschool

    veryoldschool Lifetime Achiever Staff Member Super Moderator DBSTalk Club

    42,683
    348
    Dec 9, 2006
    This has been & will be kicked around forever....
    What is the definition of "lite"? If it is not using the full 19 Mbps bit rate, then most HDTV is "lite". OTA channels have sub channels that "steal" from their HD channel.
    SAT HD feeds don't use the full bit rate, nor does cable.
    Is it a myth that few [no] channels use the full bit rate? NO.
    I have posted threads, as have others [many]; one of which is here: http://www.dbstalk.com/showthread.php?t=80458
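    To put rough numbers on the bit-rate point (a back-of-the-envelope sketch; the ~19.39 Mbps ATSC payload is the real figure, the subchannel allocations below are just example values):

        # Rough ATSC bandwidth budget: one OTA broadcast channel carries about
        # 19.39 Mbps of MPEG-2 payload, shared by the main HD feed, any SD
        # subchannels, audio, and PSIP overhead.
        ATSC_PAYLOAD_MBPS = 19.39        # total usable payload per OTA channel

        subchannels_mbps = [3.0, 1.5]    # example: one SD subchannel + a weather loop
        audio_and_psip_mbps = 1.0        # example allowance for audio and tables

        hd_video_mbps = ATSC_PAYLOAD_MBPS - sum(subchannels_mbps) - audio_and_psip_mbps
        print(f"Left for the main HD feed: {hd_video_mbps:.2f} Mbps")
        # -> about 13.9 Mbps in this example, well under the "full" 19.39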
     
  3. Apr 1, 2007 #3 of 95
    Milominderbinder2

    Milominderbinder2 Cutting Edge: ECHELON '08

    4,107
    0
    Oct 8, 2006
    It's no myth.

    Here are the results a calibration company found...

    http://www.widemovies.com/dfwbitrate.html

    These tests are from a year ago. Can anyone find something more recent? They do show that it was being done, regardless of what some may say.

    - Craig
     
  4. Apr 1, 2007 #4 of 95
    tonyd79

    tonyd79 Hall Of Fame

    12,971
    204
    Jul 24, 2006
    Columbia, MD
    It is no myth technically, but the discussion point is at what point anyone can truly detect a difference in PQ.
     
  5. Apr 1, 2007 #5 of 95
    Steve

    Steve Well-Known Member

    23,049
    149
    Aug 22, 2006
    Lower...
    I didn't mean to say that D* isn't applying greater compression. What I meant to say is that it doesn't appear to be hurting the perceived PQ I get from my Fujitsu, so no harm, no foul, IMHO.

    What I'm curious about, tho, is if the OTA signal I'm receiving is being "dumbed down" by the HR-20's output circuitry, and that is why I'm not seeing any difference between 80 and 2.1. Has anyone with an ATSC OTA tuner in their display ever made this comparison?

    /steve
     
  6. Apr 1, 2007 #6 of 95
    Steve

    Steve Well-Known Member

    23,049
    149
    Aug 22, 2006
    Lower...
    Yup. /s
     
  7. Apr 1, 2007 #7 of 95
    cygnusloop

    cygnusloop Hall Of Fame

    2,281
    1
    Jan 26, 2007
    Troublemaker.....
    :lol: :lol: :lol:
     
  8. Apr 1, 2007 #8 of 95
    hdtvfan0001

    hdtvfan0001 Well-Known Member

    32,456
    258
    Jul 28, 2004
    The Myth is the term HD Lite - ain't no such creature. It's either Digital/HD (720p, 1080i, or 1080p) or it's not. All D*TV HD is full HDTV.

    That said, mixing up the bit rates, occasionally causing hiccups or reduced-bitrate transfers, is done periodically to leverage the total bandwidth of the HD channels - which is also, by the way, done by Dish and cable.

    MPEG4 comes into play as well.

    Once D*TV has more bandwidth with the 1-2 sats this year, the musical bitrate game should drop off as a regular practice.
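    The "musical bitrate" part is basically statistical multiplexing: the channels on a transponder share a fixed payload, and bits shift toward whichever feed is hardest to encode at the moment. A rough sketch of the idea (the transponder capacity and complexity numbers below are made up for illustration, not D*'s actual figures):

        # Toy statistical multiplexer: a fixed transponder payload is split among
        # channels in proportion to how complex their current video is, so a busy
        # sports feed borrows bits from a mostly static talking-head shot.
        TRANSPONDER_MBPS = 38.8          # assumed total payload, illustrative only

        # (channel, relative scene complexity right now) -- made-up weights
        channels = [("ESPN HD", 5.0), ("HBO HD", 2.0), ("Discovery HD", 3.0)]

        total_weight = sum(w for _, w in channels)
        for name, weight in channels:
            share = TRANSPONDER_MBPS * weight / total_weight
            print(f"{name}: {share:.1f} Mbps at this moment")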

    There you have it. :D
     
  9. Apr 1, 2007 #9 of 95
    Steve

    Steve Well-Known Member

    23,049
    149
    Aug 22, 2006
    Lower...
    I hear you. I think post #5 better explains what I'm asking tho. /steve
     
  10. Herdfan

    Herdfan Well-Known Member

    6,500
    98
    Mar 18, 2006
    Teays...
    Here is what I know from personal experience: Back in the fall of 1995, I had both D* and E*. I had the E* HD package only for $15 because I wanted to see NASCAR on TNT-HD and D* did not offer TNT-HD at the time.

    So I was able to directly compare D* and E* to each other on Discovery and HDNet. I could have done ESPN, but at 720p, both looked pretty good. With both Discovery and HDNet, E* looked better on my 57" CRT HD RPTV with both boxes feeding via component. Was it measurably better? Not really. But in a direct comparison, yes, you could tell a slight difference in sharpness.

    Could I have noticed it on a 720p DLP? Probably not. But on the best display technology, yes, there was a difference.
     
  11. STEVEN-H

    STEVEN-H Member

    631
    2
    Jan 18, 2007
    Louisville, KY
    Yes, but you are comparing OTA and MPEG-4 from DirecTV. HD Lite refers to the HD networks like HBO, Showtime, Discovery, HDNet, and the like, which D* transmits in MPEG-2 with lowered resolution and bit rate.
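    For scale, the resolution side of "HD Lite" is easy to quantify; a quick sketch (1440x1080 and 1280x1080 are the downscaled rasters commonly reported for those channels, used here only as examples):

        # Pixel-count comparison: full 1080i vs. the horizontally downscaled
        # rasters commonly reported for "HD Lite" carriage of 1080i channels.
        FULL_HD = (1920, 1080)
        lite_rasters = [(1440, 1080), (1280, 1080)]

        full_pixels = FULL_HD[0] * FULL_HD[1]
        for w, h in lite_rasters:
            pct_fewer = 100 * (1 - (w * h) / full_pixels)
            print(f"{w}x{h}: {pct_fewer:.0f}% fewer pixels than 1920x1080")
        # -> 1440x1080 drops 25% of the pixels; 1280x1080 drops a third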
     
  12. Steve

    Steve Well-Known Member

    23,049
    149
    Aug 22, 2006
    Lower...
    I am comparing the channels in the 80's to OTA. They're still MPEG-2, if I'm not mistaken. /s
     
  13. BobV

    BobV Godfather

    371
    0
    Dec 15, 2006
    I believe it's in the eye of the beholder, meaning how good the human eye is; age and other factors apply.
    At this point I will take any HD, Lite or not!! :p
     
  14. Doug Brott

    Doug Brott Lifetime Achiever DBSTalk Club

    28,939
    72
    Jul 12, 2006
    Los Angeles
    +1
     
  15. Tiebmbr

    Tiebmbr Godfather

    356
    0
    Mar 27, 2007
    "HD is in the eye of the beholder."
     
  16. Steve

    Steve Well-Known Member

    23,049
    149
    Aug 22, 2006
    Lower...
    Agreed. Just wondering if anyone with an ATSC tuner in their display has compared OTA HD to 80, 82, 86, and 88 from the HR-20, and what they've observed! :) When I compare OTA to sat using the HR-20 OTA tuner, I see no difference.
     
  17. veryoldschool

    veryoldschool Lifetime Achiever Staff Member Super Moderator DBSTalk Club

    42,683
    348
    Dec 9, 2006
    I have: OTA is #1, MPEG-4 is #2, MPEG-2 is #3, cable is #4. FWIW
     
  18. harsh

    harsh Beware the Attack Basset

    21,192
    183
    Jun 14, 2003
    Salem, OR
    Here's a link to another page on WideMovies.com: http://www.widemovies.com/directv-resolution.html

    Here's a "before and after" comparison supporting (quite unequivocably) the degradation in quality: http://www.widemovies.com/directvcomp.html

    These comparisons were done with MPEG-2 content.

    The test pattern comparison in the resolution test demonstrates that not only is the PQ better with 1920x1080i, but the bitrate is actually lower (on a substantially static image).
     
  19. Steve

    Steve Well-Known Member

    23,049
    149
    Aug 22, 2006
    Lower...
    I saw those, too. They were done back in '04 or '05, if I'm not mistaken. I really don't see much difference between 1920 and 1280 on the top and bottom HDNet resolution pattern examples on that page.

    My own comparisons yesterday on my Fujitsu aren't showing the same obvious differences on the 80-86 network broadcasts that are shown on the "before 9/7 after 9/14" movie stills, so maybe things have improved since 2005. Of course, I have no way of directly comparing what's going on with the D* HD channels from 70-79. /steve
     
  20. machavez00

    machavez00 Hall Of Fame

    3,739
    8
    Nov 2, 2006
    Phoenix,...
    Looks like the only channels that are getting "bit starved" are the 1080i feeds. The 720p feeds look untouched.
     