
1080P receiver coming soon ?

Discussion in 'DISH™ High Definition Discussion' started by ken310, Mar 7, 2006.

Thread Status:
Not open for further replies.
  1. ApK

    ApK Icon

    761
    0
    Mar 6, 2006
    Yeah, but wouldn't 1080p/60 be SWEET!?!

    Honestly, while I knew that even 1080p/30 was defined as part of the standard, I didn't realize it was being implemented already. I assumed that, because of bandwidth limitations, there was no channel allocation for it at first, and nothing that could display it even if there were.

    Before I'm ready to buy my first HDTV, I'd love to see a head-to-head comparison of 720p/60 and 1080p/30. Higher temporal resolution is supposed to be more immersive.
     
  2. normang

    normang Icon

    1,018
    1
    Nov 14, 2002
    Hardware specs for the impending Blu-ray or HD DVD players are probably not going to include anything for 1080p, because the initial players probably won't support it.

    I would not expect anything viewable in 1080p until later in 2007, and even that might be optimistic.
     
  3. ken310

    ken310 Legend

    109
    0
    Feb 24, 2006

    Not having seen it, I'm not sure. Some say the human eye can't see it, or the brain can't process it. I'm not sure what to think there.

    I'm not willing to wait that long. My HD picture is absolutely awesome! I'd like more content, but the quality of the picture is good enough for me.
    The SD isn't so great anymore after viewing HD, so beware.

    I'd like to know what we're receiving now. Someone said HD Lite, and I forget the numbers.

    I do know it's only going to be on real high-end tech like Blu-ray to begin with, so for the time being my TV (my second HDTV) is again way ahead of the broadcasting standards, which I consider a good thing. I'd be very happy with 720p, which from what I understand we're still a ways away from. For whatever reason, DBS seems to have such a learning curve; MPEG-4 has been a standard since 1999.

    How long did they sell the 921? And its software, is it right yet?
     
  4. Rogueone

    Rogueone Hall Of Fame

    1,133
    0
    Jan 29, 2004
    Um, not exactly. 1080i is a 30 fps display technology.

    The camera would be a 1080/30p recorder, or a 1080/24p converted to 30 frames. Regardless of how the digital signal is sent, once inside the TV it becomes an analog 1080i. Your TV then takes each of the 60 half-frames and displays them in an odd/even fashion, as you noted. But remember, it's half a frame, not a full frame, every 1/60th of a second. Also, this is only on CRTs, as it's due to the way the brain and phosphors work that you can "see" all 1,080 lines lit at the same time, even though they might not actually be lit for the entire 1/30th of a second. (Digital TVs like LCDs don't have phosphors, which stay lit by their nature, hence why they are not sold interlaced.)

    You really need to understand the difference between i and p so you don't fall into the trap of thinking 1080i is only 540. 1080i is a 60-cycle, 60-half-frame display method. It is also a 30-full-frames-per-second method. 720p and 1080p are 60 frames per second methods ONLY for programming recorded in 60p. For movies and normal TV, recorded with 1080/24 or 30p cameras, 720p/1080p is only a 30 fps method even though it's drawing 60 frames (since it can only receive 30 unique frames, it has to show each twice). TV is only a 30 fps medium. Movies are a 24 fps medium (converted to 30 for TV viewing).

    Don't get hung up on the 60 fps number, as it doesn't offer anything other than a possibly smoother image for fast motion (not everyone will notice the difference if not "prompted" to). Even if a TV could do 120 fps, you'd get no benefit, as the eye/brain combo cannot process those changes. It's just like in audio: your ear/brain cannot hear changes of less than 3 dB in level. Theoretical frames are nice, but your brain can still only process 30 a second. Progressive displays simply make sure all the pixels are lit, where interlaced displays rely on phosphors to stay lit a predetermined amount of time based on the strength of the electron beam that excited them into action. Hence, especially on older CRTs, interlacing done poorly will not look as good as progressive. But done right, the vast majority of people won't be able to tell.

    Actually, it's a 60 fps display technology, but there are no sources of 60p material as yet. The most likely source soon would be something like a gaming console, which would benefit from a 60 fps display ability. And fields only apply to interlacing, as a field is the half-frame drawn each 1/60th of a second when using interlacing. 2 fields = 1 frame, not 1 field = 1 frame.


    Well, not exactly again. 540 at 60 would still be only a 540 display. 1080i doesn't display 540 lines of data. It displays a full 1,080 lines of information, over 2 passes of 540, onto a phosphor medium designed to cause the human brain to "see" all 1,080 lines at the same time. There is no such thing as 540@60fps, as there are no 540-line displays anywhere.
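
    To make the fields-vs-frames arithmetic concrete, here's a minimal Python sketch (the names are mine, purely for illustration) of how a "weave" deinterlacer pairs two 540-line fields into one full 1,080-line frame, and why 60 fields/sec works out to 30 full frames/sec:

    Code:
    FIELD_RATE = 60          # 1080i sends 60 fields per second
    LINES_PER_FIELD = 540    # each field carries half of the 1,080 lines

    def weave(top_field, bottom_field):
        """Interleave a top (odd-line) and bottom (even-line) field into one frame."""
        frame = [None] * (LINES_PER_FIELD * 2)
        frame[0::2] = top_field      # lines 1, 3, 5, ...
        frame[1::2] = bottom_field   # lines 2, 4, 6, ...
        return frame

    top = [f"odd line {2 * i + 1}" for i in range(LINES_PER_FIELD)]
    bottom = [f"even line {2 * i + 2}" for i in range(LINES_PER_FIELD)]
    frame = weave(top, bottom)

    assert len(frame) == 1080            # a full 1,080-line picture, not 540
    print(FIELD_RATE // 2, "full frames/sec from", FIELD_RATE, "fields/sec")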
     
  5. Rogueone

    Rogueone Hall Of Fame

    1,133
    0
    Jan 29, 2004
    It's not just some saying this; there is no disagreement on this issue among the scientists who study such things. Here's a good explanation that should help you understand how it all works. Pages 2 and 3 talk about the eye, but I'd recommend the entire article if you really want to understand the basics behind TV (and there are links to explain digital TV as well).



    Wasn't it about 1 year? And no, it still has issues now and then. Like right now it keeps only grabbing a 2-day guide. I would not recommend anyone buy a used 921 unless you don't mind a box that doesn't perform as expected 99% of the time. I'd say mine only acts as expected about 75% of the time at points, and up to 90% of the time at others.
     
  6. ken310

    ken310 Legend

    109
    0
    Feb 24, 2006
    Do you know what we are receiving now, 540i or p? And what's expected, if known, for the immediate future?

    I know the 942 has been out for about a year, but the 921 has been out since, I believe, late '03.
     
  7. Rick_R

    Rick_R Legend

    156
    0
    Sep 1, 2004
    Simi Valley, CA
    MPEG-4 AVC (aka H.264), which is the version of MPEG-4 currently being implemented, was made an ANSI standard late in 2004. Since that time the chipmakers have been going crazy trying to make combined MPEG-2/MPEG-4 chips using this standard. (The other versions of MPEG-4 were considered at best marginal improvements over MPEG-2.)

    Now the chipmakers are delivering MPEG-4 chips and everyone is starting to put those chips into products. That is why Dish, DirecTV, and the hi-def DVD formats are coming out with these.

    Rick R
     
  8. IowaStateFan

    IowaStateFan Godfather

    270
    0
    Jan 11, 2006
    Actually, I didn't have any major problems with mine until recently. The new software that they sent out yesterday seems to be a good fix. I have the full guide, and everything seems to be more stable. I guess time will tell.
     
  9. Rogueone

    Rogueone Hall Of Fame

    1,133
    0
    Jan 29, 2004
    No such thing as 540i. 480i would be the old norm. 480p is the new standard-def resolution for digital broadcasts. HD channels are either 720p or 1080i, depending on what the broadcaster chose.
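
    If it helps keep them straight, here's a quick Python sketch of the usual digital broadcast formats (the resolutions are the nominal figures; the example networks are just context from the time):

    Code:
    # Common digital broadcast formats: width, height, scan type.
    formats = {
        "480i": (704, 480, "interlaced"),    # the old standard-def norm
        "480p": (704, 480, "progressive"),   # digital standard def
        "720p": (1280, 720, "progressive"),  # HD (e.g. ABC, FOX, ESPN)
        "1080i": (1920, 1080, "interlaced"), # HD (e.g. CBS, NBC)
    }

    for name, (w, h, scan) in formats.items():
        print(f"{name}: {w}x{h} {scan}, {w * h:,} pixels per frame")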

    The 921 hit right about the last week of '03.
     
  10. Alpaca Bill

    Alpaca Bill Legend

    193
    0
    Jun 17, 2005
    A little before that, since I had my first one activated on Dec 13th, 2003. I bought it from a retailer in AZ (I was in IL) on eBay, so he actually had it a few days prior to that. I had also been bidding on several others for the preceding week, so maybe as early as Dec 1, 2003 for people to actually have them in hand (to flip on eBay for a pretty good markup, at that).
     
  11. ken310

    ken310 Legend

    109
    0
    Feb 24, 2006

    I never did pin it down that far, but knew it was a lot longer than a year. I think maybe Rogueone had it confused with the 942?
     
  12. Rogueone

    Rogueone Hall Of Fame

    1,133
    0
    Jan 29, 2004
    No, it was late Dec when I noticed all the chatter and people placing orders thru dishdepot here. Based on all those comments, it seemed it must have just hit the market. I guess I just came in about a month late :)

    I bought mine around Feb 4th or so, 2004, from a local dealer by sheer luck. Everyone else was waiting on lists, and I didn't want to wait 3 months like some were being told they'd have to, so I started calling around and one shop said they'd have more in a week. None of the others expected to have any. Got it a week later :) Thought I was happy until I couldn't get my OTA stations to stay locked in half the time ;) hahaha
     
  13. ApK

    ApK Icon

    761
    0
    Mar 6, 2006
    Rogue, from what I understand, this is not quite accurate.

    If you have any sources, please point me to them. I last read about this stuff many years ago and truthfully, I could be wildly mistaken.

    First, in biology, no numbers are absolutes. Everything is really a range, and the numbers are just approximations or averages.

    Second, the studies done for Showscan (??? whatever that 60fps movie technology was called) a while back made it clear that 60fps was closer to the eye's actual motion rate, and even if the details of motion may not normally be discernible beyond 30fps, the effect is very real... more immersive, more of a "realistic feel," and less eye strain.

    Also, 3dB is (in the range of) what most people would describe as "noticeably louder," but that's not the same as "perceptible." I think the actual just-noticeable difference is down around 1dB. Just like when a room gets SLOWLY darker, if you make a sound SLOWLY louder without telling someone, by the time there's a 3dB change, the average person will go, "hey, that music got louder!" But that doesn't mean that if you carefully A/B'ed a 1 or 2 dB change they wouldn't notice a difference. They probably would, and that has an effect on their perception of the sound.

    EDIT: Some links...
    http://www.pechorin.com/m/2002/03/0..._TV_veterans_who_dont_understan-112870-2.html

    http://www.phys.unsw.edu.au/~jw/dB.html
     
  14. Rogueone

    Rogueone Hall Of Fame

    1,133
    0
    Jan 29, 2004
    The best test case for fps is video gaming. The consensus is that you need a minimum of 30 fps for a smooth picture while gaming. Drop below 30, and in many game types it's very obvious. At the same time, many games are visually better if your video card can display 60+ fps. But it's not because there are 60 fps; it's because the video card itself has limits to how much it can draw per second, and there are times in a game with lots of explosions, smoke, reflections, and the like where a card that is normally capable of 90 fps might only be able to draw 35 fps.

    There are video cards out today that can draw some games at 300fps; the only reason to get those cards, if you are playing an older game they can run that fast, is that they'll never, ever drop below 30 fps. There are NO visual improvements to having a 30, 60, 90, or 120 fps capable video card. It's strictly the need to keep the fps above 30 or you'll notice the stutter.

    I've been gaming since before 3D cards, and the fps being 30 or 100 doesn't change how the scenes look. Dropping below 30, though, that makes a difference, and many times makes a game unplayable.
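
    If you want to see why the headroom matters, here's a tiny Python sketch of the frame-time budget (toy numbers of my own, just to show the arithmetic):

    Code:
    # Each frame must be drawn within its time slice or the motion stutters.
    def frame_time_ms(fps):
        """Milliseconds available to draw one frame at a given rate."""
        return 1000.0 / fps

    for fps in (30, 60, 90, 120):
        print(f"{fps:3d} fps -> {frame_time_ms(fps):5.2f} ms per frame")

    # A card averaging 90 fps (~11 ms/frame) has ~22 ms of slack before a
    # heavy scene pushes it past the ~33 ms budget that keeps you at 30 fps.
    print(f"slack at 90 fps vs the 30 fps floor: "
          f"{frame_time_ms(30) - frame_time_ms(90):.1f} ms")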

    As to the question of 3dB, here is a snippet from this longer paper:
     
  15. Oompah

    Oompah Cool Member

    16
    0
    Feb 7, 2006
    That's a good explanation of deciBels (dB). The easiest way to remember what's capitalized and what is not is that the Bel is named after Alexander Graham Bell, so it's a proper name, while the prefix, deci (1/10), is not. Bell noted that 10 times the power (1 Bel) about doubles the perceived volume of sound; multiplying that number by 10 made the quantity easier to work with (10 tenths of a Bel, or 10 deciBels). It's dB, not DB or Db or db.
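
    Since the math is simple, here's a little Python sketch of the standard decibel formula (dB = 10 x log10 of the power ratio), which also shows why the 3 dB and 1 dB figures keep coming up in this thread:

    Code:
    import math

    def db(power_ratio):
        """Decibels for a given power ratio."""
        return 10 * math.log10(power_ratio)

    def power_ratio(decibels):
        """Power ratio for a given number of decibels."""
        return 10 ** (decibels / 10)

    print(db(10))           # 10.0 dB  -> 10x the power, roughly "twice as loud"
    print(db(2))            # ~3.01 dB -> doubling the power
    print(power_ratio(1))   # ~1.26    -> a 1 dB step is only ~26% more power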

    More information than you wanted...

    Deci is from the Latin decimus, meaning tenth. In modern usage, Latin prefixes (deci [1/10], centi [1/100], milli [1/1,000], micro [1/1,000,000], nano [1/1,000,000,000], pico, etc.) are fractions, while Greek prefixes (deca [10], hecto [100], kilo [1,000], mega [1,000,000], giga [1,000,000,000], etc.) are multiples. Think microseconds (1/1,000,000 second), millimeter (1/1,000 meter), centimeter (1/100 meter), cent (1/100 dollar), and decimal (1/10), but decade (10 years), hectare (100 ares), kilometer (1,000 meters), and megahertz (MHz = 10 ^ 6 Hz). Don't forget mile (mille passuum = 1,000 paces; a pace [double step] is about 5 feet), and centuries (100 years)... hey, they thought those up before "modern" usage.

    Still more useless information...

    Since computers work most efficiently in the binary system (powers of two rather than powers of ten), and, coincidentally, two to the tenth power (1,024) is close to 1,000 (ten to the third power), we see Kilo (2 ^ 10 = 1,024), Mega (2 ^ 20 = 1024 * 1024 = 1,048,576, about a million), Giga (2 ^ 30 = 1,073,741,824, about a billion), etc. in the computer world. Notice the capitalization (the power-of-two values are bigger than the similar power-of-ten). We've got Megabytes (2 ^ 20 bytes) and Gigabytes (2 ^ 30 bytes).
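
    And a quick Python sketch of how close (but not equal) the power-of-two and power-of-ten prefixes really are:

    Code:
    # Decimal (power-of-ten) vs binary (power-of-two) prefixes side by side.
    for name, dec_exp, bin_exp in [("kilo", 3, 10), ("mega", 6, 20), ("giga", 9, 30)]:
        decimal, binary = 10 ** dec_exp, 2 ** bin_exp
        print(f"{name}: 10^{dec_exp} = {decimal:,} vs 2^{bin_exp} = {binary:,} "
              f"({(binary / decimal - 1) * 100:.1f}% bigger)")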
     
  16. ken310

    ken310 Legend

    109
    0
    Feb 24, 2006
    Now can you break the video down as well?

    Seriously, very cool!
     
  17. Rogueone

    Rogueone Hall Of Fame

    1,133
    0
    Jan 29, 2004
    Welcome, Oompah, nice one :)

    What more do you want to learn, Ken? As specifically as you can state it :)
     
  18. ken310

    ken310 Legend

    109
    0
    Feb 24, 2006

    24, 30, 60, 120 fps ?? :confused:


    Thanks!
    Ken
     
  19. Oompah

    Oompah Cool Member

    16
    0
    Feb 7, 2006
    24 frames/sec is a standard dating almost from the beginning of movie making that was probably a compromise between smooth motion on one side, and quantity of film consumed, exposure time, and the mechanical contraptions used to shoot and project film on the other. With a 24 fps video standard, making the conversion from film to video is much simpler, and it requires 4/5 the storage and bandwidth of 30 fps.

    30 frames/sec (60 interlaced fields/sec) kept power-supply ripple - a problem in the early days of TV - from making television images "squirm," since frames keep a fixed phase relationship to the 60 Hz power-line cycles (in the USA). The PAL TV standard uses 25 fps (50 interlaced fields) in Europe and other parts of the world where the power-line frequency is 50 Hz. For a reason I've forgotten, NTSC video is really 29.97 frames/sec (actually 30 x 1000/1001... why?). I hope that the new "30" and "60" digital standards are exactly 30.000 & 60.000 fps!
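
    (For what it's worth, the film-to-video conversion is the classic 3:2 pulldown, and the 1000/1001 factor dates to the addition of color in 1953: the rate was nudged down by 0.1% so the color subcarrier wouldn't interfere with the sound carrier.) A small Python sketch of both bits of arithmetic:

    Code:
    from fractions import Fraction

    # 3:2 pulldown: 24 film frames become 60 fields (30 video frames) per second
    # by holding each film frame for alternately 3 and 2 fields: 24 x 2.5 = 60.
    fields = []
    for n in range(24):                    # one second of film
        fields.extend([n] * (3 if n % 2 == 0 else 2))
    print(len(fields), "fields =", len(fields) // 2, "video frames per second")

    # NTSC's "29.97" is exactly 30 x 1000/1001:
    ntsc = Fraction(30) * Fraction(1000, 1001)
    print(ntsc, "=", float(ntsc))          # 30000/1001 ~= 29.97002997...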
     
  20. Rogueone

    Rogueone Hall Of Fame

    1,133
    0
    Jan 29, 2004
    Oh, hehe. When I went on to talk about 60 and 120, I was doing so in relation to video gaming. In video gaming, the CPU and GPU (video card) have to be able to draw the scene fast enough for it to seem smooth.

    Ever since 3D cards came out for computers, the magic number has always been 30fps for smooth video. But when online gaming like Quake and Half-Life became big, people noticed that if they were only getting 35 or 40 fps during single-player gaming (just you and the scripted events of the game), then during multiplayer (using a game map and putting 8 to 16 real people on there shooting rockets, bullets, rayguns, etc.) the fps would often drop in half during heavy firefights. You see, in SP the number of items to track and draw was controlled, but in MP, things like bullets from a Gatling gun, or smoke trails and explosions from multiple rocket launchers, would cripple the GPU. So it was quickly realized you needed at least 60 fps to keep the video above 30 during most firefights.

    But as the games became more and more graphically intense, more and more speed was needed. Today it is very typical for a GPU to be able to run over 100 fps during normal play, and still in the 60-or-higher range during firefights. But my point, in our consideration of video, is that these 60, 90, 120 fps rates don't improve the picture; they allow the GPU to not become overwhelmed. The images seen by the gamer are the same whether they are looking at 60 fps or 30 fps.

    Now, to some, it's possible to notice less blurriness if you swing around quickly in a 180 with the higher frame rates, which is similar to those who mention sports appearing smoother when viewed in 720p/60. The thing is, those "moments" where you'll see or not see that motion blur are probably no more than 1 or 2% of the time you're watching a sport. We just aren't shown camera angles that allow that effect to be prevalent very much. And if you aren't getting the biggest of displays, it's harder still to see these effects. Smaller, under-40" displays are going to make seeing imperfections harder, especially if it's a 1080i image on a 1080p display (not many people are buying CRT projectors/tubes anymore, which are the 1080i displays).
     