DirecTV 4K UHD plans

Discussion in 'DIRECTV General Discussion' started by SomeRandomIdiot, Jun 15, 2014.

  1. slice1900

    slice1900 Well-Known Member

    11,147
    1,715
    Feb 14, 2013
    Iowa
    The bit rate for HDMI and the bit rate for your internet service have absolutely nothing whatsoever to do with each other. I can promise you will not need 18 Gbps for 4K streaming unless you plan to somehow stream raw HDMI :)
     
    1 person likes this.
  2. slice1900

    slice1900 Well-Known Member

    11,147
    1,715
    Feb 14, 2013
    Iowa
  3. GregLee

    GregLee Hall Of Fame

    1,703
    19
    Dec 28, 2005
    Then, it is too good to be true. I draw your attention to this article at avsforum: http://www.avsforum.com/forum/286-latest-industry-news/1528750-comparing-mpeg-2-h-264-h-265-video-codecs-nab-2014-a.html. Here's the short version: more efficient codecs can get you substantial improvements at the low end of quality video. But at the high end of quality, there is very little improvement to be had. I am not looking for 4k video for my smart phone.
     
  4. patmurphey

    patmurphey Godfather

    1,034
    90
    Dec 21, 2006
    Netflix recommends a minimum of 25 Mbps.
     
  5. longrider

    longrider Well-Known Member

    4,181
    216
    Apr 21, 2007
    Elizabeth, CO
  6. jimmie57

    jimmie57 Hall Of Fame

    10,120
    889
    Jun 26, 2010
  7. studechip

    studechip Godfather

    1,987
    214
    Apr 16, 2012
    I noticed it said April 10th.
     
  8. harsh

    harsh Beware the Attack Basset

    23,160
    488
    Jun 14, 2003
    Salem, OR
    The V-Nova press release is dated April 1st.
     
  9. Diana C

    Diana C Hall Of Fame DBSTalk Club

    2,119
    296
    Mar 30, 2007
    New Jersey
    It is certainly possible to get UHD that small, but I question the "higher picture quality" statement. Every compression technology advance since Lempel-Ziv has been lossy (i.e. when decompressed, some information is missing). The advances from MPEG-2 to AVC to HEVC have all been based on getting smarter about what can be lost without it hurting the perceived picture quality. However, any original video compressed with HEVC will have lost information compared to one compressed with AVC which will have lost more than one compressed with MPEG-2.

    If we assume a color depth of 12 bits per pixel (the largest color depth supported by 4:2:2), an uncompressed 2160p/60 stream consists of almost 6 gigabits per second (5,971,968,000 to be exact). Getting that down to 7 or 8 megabits per second is a compression ratio of roughly 700:1, or a compressed stream that is about 0.14% of the original. I just don't see how you can get down to that without losing enormous amounts of information.

    The only way I can see it being even remotely possible is to require a LOT more processing power at the decompression end of the process. So, it may be possible, but you'd need a dedicated, high-power CPU, which would add several hundred dollars to the cost of display equipment and make its use in mobile devices impractical for many years (i.e. until Moore's law gets Atom processors up to the level of an i7).
     
  10. slice1900

    slice1900 Well-Known Member

    11,147
    1,715
    Feb 14, 2013
    Iowa
    You sure about that math? I calculate 17.9 Gbps for 12-bit 4:4:4 4Kp60, and even dropping all the way to 4:2:0 only cuts that in half. So it is even worse than what you say :)
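
    For reference, the back-of-envelope math works out like this. A minimal Python sketch, assuming 3840x2160 at 60 fps with 12 bits per sample (the figures used in this thread):

    ```python
    # Uncompressed bit rates for 2160p60 at 12 bits per sample, and the
    # compression ratio needed to squeeze each down to an 8 Mbps stream.
    WIDTH, HEIGHT, FPS, BIT_DEPTH = 3840, 2160, 60, 12

    # Effective samples per pixel for each chroma subsampling scheme:
    # 4:4:4 = three full-resolution planes, 4:2:2 = luma plus half-resolution
    # chroma, 4:2:0 = luma plus quarter-resolution chroma.
    SAMPLES_PER_PIXEL = {"4:4:4": 3.0, "4:2:2": 2.0, "4:2:0": 1.5}

    for scheme, samples in SAMPLES_PER_PIXEL.items():
        bps = WIDTH * HEIGHT * FPS * BIT_DEPTH * samples
        ratio = bps / 8e6  # ratio needed to fit an 8 Mbps stream
        print(f"{scheme}: {bps / 1e9:.1f} Gbps uncompressed, ~{ratio:,.0f}:1 to reach 8 Mbps")

    # 4:4:4: 17.9 Gbps (~2,239:1), 4:2:2: 11.9 Gbps (~1,493:1), 4:2:0: 9.0 Gbps (~1,120:1)
    ```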

    I'm not quite 100% ready to write these guys off since they've at least made it sound like they have some real players working with them, but if they're for real they should be able to provide a simple downloadable viewer app that can run on a smartphone and play some sample HD at a suitably tiny bit rate as a demonstration. Claims like this in the technology world come around several times a year; only once a decade are they for real. Put up or shut up. If they want to claim "but we're not trying to market this at consumers, we're selling it to the video providers," then why the press release? Lotta red flags so far.
     
  11. Laxguy

    Laxguy Honi Soit Qui Mal Y Pense.

    15,541
    617
    Dec 2, 2010
    Monterey...
    Evaluating HD on a tiny screen? I don't think that'd tell us much at all.
     
  12. slice1900

    slice1900 Well-Known Member

    11,147
    1,715
    Feb 14, 2013
    Iowa
    I was thinking in terms of an app to prove that it doesn't need much CPU (i.e. it had better run on older stuff like an iPhone 4 or Galaxy S2). I would assume that if they did an app, they could do a version for a PC that would give you a full-sized picture, but having it able to run the decoder on any PC made in the last decade-plus isn't going to tell us whether it is suitable to be included in set tops.
     
  13. Diana C

    Diana C Hall Of Fame DBSTalk Club

    2,119
    296
    Mar 30, 2007
    New Jersey
    I could be off (I did the multiplication pretty quickly using Windows Calculator...I may have forgotten one factor).

    As far as the press release goes, that was probably designed to recruit investors. :)

    But, math aside, it IS possible to reach these compression levels IF you have some hefty CPU power at decompression. All the existing video compression technologies have been specifically designed to be very lightweight at display time (that's how a smartphone can decode an H.265 video stream). If you have sufficient processing power at the receiver, you can embed hints in the data to allow it to be reconstructed. For example, you might take a gradually shaded surface and send just the code for the base color, along with a formula that describes the shading effect. That could reduce hundreds of thousands of bits in the source into a few dozen. But as I said, it would require so much processing power that the video processor in a DVR would be considerably more powerful than the CPU. Imagine adding the cost of a 3 GHz, 4-core CPU and a couple of gigabytes of RAM to the existing build cost of a DVR or STB. It would be prohibitively expensive.
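
    To make the "base color plus formula" idea concrete, here is a toy sketch (not any real codec, just an illustration of trading decoder CPU for bit rate): a vertical shade is described by two endpoint colors, and the decoder rebuilds every pixel from that hint.

    ```python
    # Toy "hint" encoder/decoder: a gradient frame described by two colors
    # instead of one value per pixel.  Illustration only, not a real codec.
    import numpy as np

    HEIGHT, WIDTH = 2160, 3840

    def encode_gradient(top_color, bottom_color):
        """A vertical shade becomes two endpoint colors (a few dozen bits)
        instead of tens of bits for every one of the 8.3M pixels."""
        return {"top": top_color, "bottom": bottom_color}

    def decode_gradient(hint):
        """The decoder burns CPU to reconstruct the full frame from the hint."""
        t = np.linspace(0.0, 1.0, HEIGHT)[:, None, None]   # 0..1 down the frame
        top = np.array(hint["top"], dtype=float)
        bottom = np.array(hint["bottom"], dtype=float)
        frame = (1 - t) * top + t * bottom                  # linear interpolation
        return np.broadcast_to(frame, (HEIGHT, WIDTH, 3))

    hint = encode_gradient((255, 200, 120), (30, 40, 90))   # sunset-ish shade
    frame = decode_gradient(hint)
    print(frame.shape)   # (2160, 3840, 3) -- millions of pixels from a handful of bytes
    ```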

    Something like this might be useful for backhaul tasks, as it would certainly save on satellite space, but the real challenge is getting UHD the "last mile" to the viewer.
     
  14. slice1900

    slice1900 Well-Known Member

    11,147
    1,715
    Feb 14, 2013
    Iowa
    Smartphones are pretty powerful now, especially if you dedicate DSP resources to a problem instead of trying to use a general-purpose CPU. The iPhone 6/6S do real-time HEVC video encoding for FaceTime over cellular via a dedicated block on the SoC - I think only 720p, but whether that is limited by the encoding complexity or the resolution of the front camera I'm not sure.

    I'm not comparing that to real-time 4K encoding of course, nor does it have to do the greatest job - it only has to be "better than H.264" to be a win for Apple and their customers. That it can do HEVC encoding at all shows that smartphones would be up to the task of decoding a stream that required more resources than HEVC decoding.

    Of course, smartphone SoCs (at least in high end devices like iPhones, Galaxy S6 and so forth) cost quite a bit more than those in set tops, and that's unlikely to change as everyone goes to a client/server model and tries to further drive down the cost of the clients that will be saddled with the grunt work of decoding. Adding the block to do decoding of 'whatever' just costs silicon area and makes the SoC cost more, so it becomes a cost/benefit decision. Dish might be more interested in making the Joeys more expensive if it made 4K delivery more efficient, since they don't have the ample bandwidth set aside for 4K that DirecTV does.
     
  15. SomeRandomIdiot

    SomeRandomIdiot Godfather

    1,348
    37
    Jan 6, 2009
    They have put up. It was very well received at the NAB.

    I was at their debut presentation at the Mandarin Oriental in Las Vegas on Saturday.

    It was also demonstrated on the NAB exhibit floor in 4 booths, including the Hitachi booth, where it was used in an Ultra HD ecosystem composed of Hitachi's 4K SK-UHD4000 camera and their Data Systems servers.

    Sky Italia is now implementing the Perseus compression technology for commercial distribution of content.

    They claim to be able to add 1-2 Mb/s on top of an existing MPEG-2 signal and achieve UHD with Perseus. Claims are for a greater than 50% compression improvement over existing techniques.
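
    As a rough illustration of what that claim implies: the MPEG-2 HD base rate and the HEVC UHD reference rate below are assumed typical figures for the sake of the sketch, not numbers from the press release; only the 1-2 Mb/s enhancement layer comes from the claim itself.

    ```python
    # Back-of-envelope on the Perseus claim.  Base and reference rates are
    # assumptions for illustration; the enhancement layer is the claimed figure.
    MPEG2_HD_BASE_MBPS = 8.0        # assumed typical satellite MPEG-2 HD stream
    ENHANCEMENT_MBPS = (1.0, 2.0)   # claimed Perseus add-on to reach UHD
    HEVC_UHD_REF_MBPS = 20.0        # assumed HEVC UHD rate for comparison

    for extra in ENHANCEMENT_MBPS:
        total = MPEG2_HD_BASE_MBPS + extra
        saving = 1 - total / HEVC_UHD_REF_MBPS
        print(f"Perseus UHD: {total:.0f} Mb/s total, ~{saving:.0%} below the HEVC reference")

    # 9 Mb/s total (~55% below) and 10 Mb/s total (~50% below), which is
    # roughly consistent with the ">50% improvement" framing.
    ```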

    Clearly, one could not put test equipment on the demos, but considering Sky Italia and Hitachi are actually using/demoing them, it certainly has passed their internal tests.

    As this adds on a layer to MPEG-2 to achieve 4K UHD, I am not exactly sure where this plays out with so many plans for HEVC in place.

    But they claim they can achieve similar results with other techniques besides MPEG-2.

    Again, they are very late to the game, which may make them the odd man out, but it is certainly NOT vaporware.
     
  16. SomeRandomIdiot

    SomeRandomIdiot Godfather

    1,348
    37
    Jan 6, 2009
    You clearly are unaware of Sinclair's testing, probably the most advanced of ANY Television Group.

    They fought and fought up through ~2005 not to go with 8VSB because of the issues we all know too well. Mobile HDTV and tiny indoor antennas would be mainstream today if the FCC had listened to them 10+ years ago.

    Broadcasters and the FCC know this in retrospect.

    Many might not like their corporate political stance, but Sinclair will be on the technological forefront, and your dismissal of them is very naive, showing you are out of the loop in that area.


    EDIT: By the way, broadcasters have no illusion of the FCC doing another ill-fated set-top giveaway for ATSC 3.0. The last one, btw, was paid for by the money the FCC made by taking back channels and selling off the frequencies. In the forthcoming auction, $1.8B is going to pay stations who remain on the air to move/repack the spectrum.

    In all likelihood, the move to ATSC 3.0 will be done the day the stations move to their new "home".
     
  17. SomeRandomIdiot

    SomeRandomIdiot Godfather

    1,348
    37
    Jan 6, 2009
    Where to start on this one....

    As I said a long time ago, I have been a fan of Dolby Vision HDR since I first saw it in early 2009. I am amazed it has taken this long for it to be seriously considered.

    That said, what you want will not happen for multiple reasons.

    First, there are 4 different HDR formats (and several additional homebrewed systems TV makers have put together) that the market is considering right now: BBC, Dolby Vision, Philips, and Technicolor.

    For the most part, these formats are INCOMPATIBLE (Think competing 3D Standards/Glasses).

    Dolby Vision HDR may already be the winner though, as in the last 90 days Netflix, Amazon, Vudu and Warner Brothers announced they will support Dolby Vision HDR. Brands supporting Dolby Vision are Hisense, Philips, Sharp, TCL and Vizio with their just-announced Reference line 65" and 120" series. However, there are NO Dolby Vision HDR UHD sets actually available for purchase today in the USA. The Vizio will most likely be the first.

    Dolby Vision wants TVs to do 1000+ NITS (really at least 1400-1600 NITS) and the format can actually do 4000 NITS (1 NIT = 0.29 FL / 100 NITS = 29.18 FL / 1000 NITS = 291.86 FL). This would give Dolby Vision a contrast ratio of roughly 21 million:1.
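
    For reference, those NIT-to-foot-lambert figures check out with a quick conversion (1 foot-lambert is about 3.426 NITS):

    ```python
    # Quick check of the NIT <-> foot-lambert conversions quoted above.
    # 1 foot-lambert = 3.4262591 cd/m^2 (NITS), so 1 NIT ~= 0.29 FL.
    NITS_PER_FL = 3.4262591

    for nits in (1, 100, 1000, 4000):
        print(f"{nits} NITS = {nits / NITS_PER_FL:.2f} FL")

    # 1 NITS = 0.29 FL, 100 NITS = 29.19 FL, 1000 NITS = 291.86 FL, 4000 NITS = 1167.45 FL
    ```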

    For comparison, most HDTVs used to do roughly 100 NITS. Over the past several years, the typical HDTV has done about 400 NITS, although a few jumped to roughly 750 NITS over the past 9-12 months.

    However, the Vizio Reference will only do 800 NITS - not even 1000, much less 1400-1600 (or even 4000). It appears the first generation of sets, none of which are on the market, will probably not go past 1000 NITS either, meaning the full scope of Dolby Vision HDR will not be visible in your home for at least the next 12+ months.

    Now, to further confuse things, Netflix announced at CES that they would work with Sony and LG to stream HDR content. However, the LG UHD and Sony UHD (X930C/X940C) sets shown at CES that demonstrated "HDR" are not Dolby Vision HDR, but their own homebrewed HDR schemes.

    There has been no clarification on whether Netflix will support the homebrewed Sony and LG HDR, or whether they will in fact force Sony and LG to adopt Dolby Vision for HDR viewing, a change from their CES models just 100 days ago.

    To confuse things even more, OLED cannot produce the brightness that LCDs can. In the LG OLED that was demo'd in a private suite at the Bellagio (not on the CES floor), they were able to increase the OLED from 500 NITS to 800 NITS, but as explained above, Dolby Vision wants roughly twice that as a minimum.

    Even though Philips says they are supporting Dolby Vision, they only demoed their "LCD Laser" HDR system at CES.

    Panasonic was calling their homebrew system "Dynamic Range Remaster" in its CX850 series.

    And Samsung has put their homebrewed HDR into their SUHD series such as the JS9500 while not calling it HDR, although it has double the brightness of a typical LCD at around 1000 NITS (but of course cannot produce 0 NITS as an OLED can).

    Sony's X940C has their homebrewed "X-tended Dynamic Range Pro" while the X930C has their homebrewed "X-tended Dynamic Range".

    And as for Sharp and real Dolby Vision: Sharp is essentially in bankruptcy - it only announced a $2 billion bailout 48 hours ago - and part of that involves shutting down a large portion of its North American television operation, so who knows if we EVER see a Sharp Dolby Vision set in the USA?

    What's the word.....Clusterf.....

    Think of the "homebrewed" HDR systems that Samsung, Sony, Panasonic, et al are using as basically the equivalent of Samsung producing quasi-3D from 2D programming on their sets several years ago.

    So besides the real HDR formats being incompatible with each other, they are not BACKWARDS COMPATIBLE - Dolby Vision-encoded content is NOT viewable on non-Dolby Vision systems. At the very least, it would look VERY flat and bland.

    If you are still actually reading this, this leaves Samsung out alone on an island. HDR is supposed to be available through their M-GO secure locker system - and it most likely will be.

    So to review thus far....

    There are NO Dolby Vision HDR Monitors actually available on the market in the USA today.

    There IS a Samsung available (obviously), but it is off on its own with no support except for their own streaming.

    Clearly, there are FEWER HDR monitors in living rooms today than UHD sets. In fact, less than 1% of the UHDs on the market today have "simulated" HDR, and 0% have one of the 4 "real" HDR formats.

    And we have yet to talk about the added payload. At minimum, HDR will add 10% to the payload, but in reality it adds 25%-30% (bitrate/size).

    Thus, a full UHD stream at 100/120 fps with Dolby Vision HDR et al will need at MINIMUM 25 Mbps.
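
    As a back-of-envelope check: the 20 Mbps base figure below is an assumption for illustration only; the overhead percentages are the ones above.

    ```python
    # Rough check of the 25 Mbps floor.  The base rate for a high-frame-rate
    # UHD HEVC stream is an assumed figure; the HDR overhead range is from above.
    BASE_UHD_MBPS = 20.0            # assumed HEVC 2160p100/120 base stream
    HDR_OVERHEAD = (0.10, 0.25, 0.30)

    for overhead in HDR_OVERHEAD:
        print(f"+{overhead:.0%} HDR payload -> {BASE_UHD_MBPS * (1 + overhead):.1f} Mbps")

    # +10% -> 22.0 Mbps, +25% -> 25.0 Mbps, +30% -> 26.0 Mbps
    ```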

    And circling back around...

    Between all that and the incompatibility, DirecTV has no reason to do 1080 HDR. Only the newer UHD sets will have HDR - and Samsung, the only way to watch DirecTV UHD, is incompatible with the system that Netflix, Vudu, Amazon et al now look to be behind.

    If DirecTV is going to do it, they might as well go UHD with Dolby Vision when they have an actual IRD that can output the proper format.

    While I agree that the homebrew systems that Samsung, Panasonic and Sony have put together in the interim look nice, there is no reason for DirecTV to enter a dead-end technology and further confuse consumers - especially with 1080 HDR.
     
  18. SomeRandomIdiot

    SomeRandomIdiot Godfather

    1,348
    37
    Jan 6, 2009
  19. Rich

    Rich DBSTalk Club DBSTalk Club

    36,570
    2,191
    Feb 22, 2007
    Piscataway, NJ
    Read it and was happy to see that the writer saw the same thing I did when I was doing comparisons.

    Rich
     
  20. Rich

    Rich DBSTalk Club DBSTalk Club

    36,570
    2,191
    Feb 22, 2007
    Piscataway, NJ
    So, for the sake of brevity (and my sanity), this is just like the fiasco with the BD operating systems? Did I boil that down correctly?

    Rich
     
