
What HD Monitor Resolution to Get?

Discussion in 'DISH™ High Definition Discussion' started by Oompah, Feb 26, 2006.

Thread Status:
Not open for further replies.
  1. Stewart Vernon

    Stewart Vernon Roving Reporter Staff Member Super Moderator DBSTalk Club

    21,609
    380
    Jan 7, 2005
    Kittrell, NC
    More completely wrong info!

    1080i scans 1080 lines. 720p scans 720 lines.

    720p displays all 720 lines in one sweep of the screen. 1080i displays 540 in the first sweep, then 540 more in the 2nd sweep to complete the 1080 lines.

    1080i is NOT 540 lines overlapping. It is interlaced scanning vs progressive. I hate to keep typing the same info in discussion after discussion...

    For the sake of argument, let's talk about 720p as displaying lines 1-7 for a second. The monitor would display them as follows:

    Line 1
    Line 2
    Line 3
    Line 4
    Line 5
    Line 6
    Line 7

    All of the above would be done in one pass, progressive, to fill the screen with one full frame of info.

    Now, let's talk about 1080i as displaying lines 1-10 (because it has more lines total than 720p does). The monitor would display them as follows:

    First pass:
    ----------
    Line 1
    Line 3
    Line 5
    Line 7
    Line 9

    THEN, second pass:
    ----------
    Line 2
    Line 4
    Line 6
    Line 8
    Line 10

    All of the above would be done in two passes, interlaced as shown, to fill the screen with one full frame of info.
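
    Here is a quick Python sketch (just an illustration, not anything a TV actually runs) of the scan order being described, using the same toy line counts as above:

    Code:
    # Print the order in which scan lines are drawn for a progressive
    # display versus an interlaced one (7 lines for the "720p" example,
    # 10 lines for the "1080i" example).

    def progressive_order(total_lines):
        # One pass: every line, top to bottom.
        return list(range(1, total_lines + 1))

    def interlaced_order(total_lines):
        # Two passes (fields): odd-numbered lines first, then even-numbered lines.
        odd_field = list(range(1, total_lines + 1, 2))
        even_field = list(range(2, total_lines + 1, 2))
        return odd_field, even_field

    print(progressive_order(7))       # [1, 2, 3, 4, 5, 6, 7]
    odd, even = interlaced_order(10)
    print(odd)                        # first pass:  [1, 3, 5, 7, 9]
    print(even)                       # second pass: [2, 4, 6, 8, 10]

    Either way, every line of the frame gets drawn; only the order differs.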

    This is done so fast that the human brain cannot detect it is happening. This is why movies and TV work. With the exception of some folks who are particularly sensitive and get headaches or have seizures (rare), we simply cannot see the interlacing happen.

    Thus... end result... 1080i displays a higher definition image than 720p. Period. It just displays them differently.

    Another example is dealing cards... Cards are dealt in an interlaced format! Each player does not get all 5 (or whatever depending on the game) at one time, but rather each player gets a card before anyone gets a 2nd card or a 3rd and so forth. It doesn't affect at all the number of cards dealt. Everybody gets all their cards.

    1080i gives more information than 720p. It just displays differently.

    The reason why 1080i is a supported format:

    1. Because it works, and the human brain cannot see the interlacing!
    2. It can be done using less bandwidth than a progressive picture would.

    Sure, 720p at 60 frames might look better than 720p at 30 frames... but most of those 30 extra frames are identical information, not new data. So yes, 1080p, if it ever happened, might look better than 1080i... but the reality is, 24-30 frames per second is all that is needed for motion. Film is only 24 frames per second, and you don't hear folks in the movie theater complaining!
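
    For the bandwidth point, here is the back-of-the-envelope math in a quick Python sketch (standard 1280x720 and 1920x1080 frame sizes assumed; raw pixel counts only, ignoring compression):

    Code:
    # Raw pixels transmitted per second for each format.
    formats = {
        "720p60":  (1280, 720, 60),   # 60 full frames per second
        "1080i30": (1920, 1080, 30),  # 30 full frames, sent as 60 half-frame fields
        "1080p60": (1920, 1080, 60),  # full-rate progressive, for comparison
    }

    for name, (width, height, frames_per_sec) in formats.items():
        pixels_per_sec = width * height * frames_per_sec
        print(f"{name}: {pixels_per_sec / 1e6:.1f} million pixels/sec")

    # 720p60:  55.3 million pixels/sec
    # 1080i30: 62.2 million pixels/sec
    # 1080p60: 124.4 million pixels/sec -- roughly double what 1080i needs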
     
  2. LtMunst

    LtMunst Hall Of Fame

    1,267
    3
    Aug 24, 2005

    So not true. What is the pixel resolution of a 1920x1080i native display? Answer-1920x1080.

    A 1080i set does not scan 540 lines twice per frame. It scans 2 separate sets of 540 lines for each frame. If you cannot understand the fundamental difference between these 2 scenarios, then what can I say? :nono2: :whatdidid:
     
  3. olgeezer

    olgeezer Guest

    1,833
    0
    Dec 5, 2003
    On another note, what type of interlace is a pinochle deal? :D
     
  4. LtMunst

    LtMunst Hall Of Fame

    1,267
    3
    Aug 24, 2005
    Technically, at any given point in time, your retina is only being struck by photons from a single pixel. That means that 1920x1080i is really 1x1p. :lol:
     
  5. jrb531

    jrb531 Icon

    916
    0
    May 28, 2004
    If the monitor could display all 1080 lines at once, it would be a 1080p set.

    The very reason most monitors are 1080 "i" is that they are capable of doing 540 lines in one pass, but 720 is too much.

    In fact, a 720p set is more expensive than a 1080i set. Why? Because a 720p set has to be able to display 720 lines "at the same time," while a 1080i set only has to do half that number.

    720p = 720 scan lines in every pass = 60 full frames refreshed per second, which is why fast-moving programs look better on 720p - because the picture is updated more often.

    1080i = 540 scan lines in every pass

    Even lines in pass #1
    Odd lines in pass #2

    so the full 1080 scan lines are painted only 30 times a second.

    This is the very reason for the "i" = interlaced!

    So stop saying that a 1080 "i" set can display 1080 scan lines at the same time - it cannot. Only a 1080 "p" set can display 1080 lines at the same time, and not only are those sets expensive beyond belief (if you can find them), but current hardware has a hard time compressing that much data.

    -JB
     
  6. harsh

    harsh Beware the Attack Basset

    21,192
    183
    Jun 14, 2003
    Salem, OR
    What other people think is good is pointless; what they know to be bad is worth observing and weighing.

    I think the key is to find a reseller that will let you trade if you made the wrong decision. You can study specifications and familiarize yourself with the technologies until you have no eyesight left. In the end, it comes down to what pleases you given the programming that you watch. For those who watch Disney-provided programming, the answer is probably a 720-line set; otherwise, a 1080-line set has the potential to offer the best picture.

    As was pointed out in a recent technology comparison, make sure you drag the whole household down to the store to see if they all agree. If one person in the house can't stand to watch the TV, you're going to regret it.
     
  7. Rogueone

    Rogueone Hall Of Fame

    1,133
    0
    Jan 29, 2004
    jb, please stop showing how uneducated you are. Please take the time to read THIS; it shows how TV actually works.

    Your statements are so screwed up it's almost funny, except someone is likely to think you know something you don't. Also, since I have a degree in communications that required learning how TVs work and how to repair them, um, I'm pretty sure I know what the heck I'm talking about, and I'm pretty sure you need to stop listening to teenagers who work at Best Buy (sorry, couldn't help the poke at the amazingly inaccurate statements that get made at stores like Best Buy).

    Here's the deal, JB and EVERYONE else who is too lazy to read through that link above:

    TV was created with a 30 frame per second requirement. Movie cameras operate at 24 fps (this was already told to you several times now). Why movies picked 24 fps, as best I've been able to conclude, was cost. It saved 6 frames of film per second, but it was still fast enough for our brains not to "see" the individual frames (like one of those picture books where you flip the pages to see the scene move).

    Why TV is 30 and not 24 fps is really simple and based on how TV is powered. It's called AC. You know, that plug in your wall? AC is 60Hz, so by pulling that 60Hz directly into the TV and configuring the signal so that each vertical sweep goes from top to bottom in 1/60th of a second (one cycle of the AC wall power), it made the vertical element very simple. In the beginning, technology wasn't sufficient to handle a 30fps progressive-scanned image, so engineers decided to take that 30fps signal and make it a 60-cycle signal with half of each frame drawn on each sweep of the vertical beam. This reduced costs and made the TV easier to build. They came up with the interlaced concept: take the odd lines, paint them first in a single 60Hz sweep, then paint the even lines.

    Phosphors were picked that would stay lit just over 1/60th of a second, so line 1 is still barely lit as line 2 is drawn. So that you don't see a mixed image when the even lines are lit and the next frame's odd lines start, these phosphors are designed to basically stop glowing at the same time the matching line is being lit. Since this is occurring 60 times a second, the brain is seeing 30 unique images, which it in turn processes as a single, continuous, moving image. The brain is incapable of actually seeing the 30 individual frames.
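
    If it helps, here is that timing worked out in a quick Python sketch (simplified to the nominal 60 Hz figure; real NTSC color actually runs at 59.94 fields per second):

    Code:
    # The 60 Hz wall-power frequency sets the field (sweep) rate; two
    # interlaced fields -- odd lines, then even lines -- make one full frame.
    field_rate_hz = 60
    fields_per_frame = 2

    field_period_ms = 1000 / field_rate_hz          # time for one sweep
    frame_rate = field_rate_hz / fields_per_frame   # full pictures per second
    frame_period_ms = field_period_ms * fields_per_frame

    print(f"one field every {field_period_ms:.1f} ms")       # ~16.7 ms
    print(f"{frame_rate:.0f} full frames per second")        # 30
    print(f"one full frame every {frame_period_ms:.1f} ms")  # ~33.3 ms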

    Progressive scanning came into being due to PCs and text. The static, non-moving images of a computer, at 60Hz, are so synchronized with the eye/brain and its 30 fps processing that it caused most people to see flicker. This is why even bumping your display to 62Hz or 70Hz eliminates all flicker. So when DVDs came out, and digital TVs were designed, it was realized you could take the 480i DVD image, read it digitally, and before sending it to the TV, it could be sent as a full frame. But the TV still works on a 60Hz system, and the phosphors are still designed to work only for 1/60th of a second. So the incoming 480p images couldn't be at 30fps; they had to send each frame twice, hence 60 fps, in order for the TV to display correctly.

    When HD is discussed, 720p is the equivalent of "poor man's" HD. The computer-equivalent image is roughly 1280x720. 1080i/p are 1920x1080 pixels - actual, individual entities in the signal, and hopefully on the CRT, though some aren't large enough to actually have that many phosphors. That is a different issue than your claim of 1080i being only 540 lines.

    Please keep in mind, everybody who is confused on this: there are NO MORE than 30 frames per second sent to your TV. NEVER more, and film will be 24 fps. Having a display that refreshes 60 times a second is useless, but is required due to having to work within the limitations of the phosphors on the inside of CRT tubes. And you'll notice almost no CRTs do 720p, and I would guess that 720p is 30fps instead of 60 as a result. If 720p were 60fps, then it could easily be viewed on a CRT (the phosphors have to be relit every 1/60th of a second for progressive scanning). But LCD and plasma can work at 30fps no problem, since the pixels are individually lit, not lit by a common, sweeping electron beam that won't return for 1/30th of a second.

    If some sales guy told you 720p was higher definition than 1080i, sorry, you got misinformed by either a sleazy salesperson or, more likely, someone who didn't realize they didn't know what they were talking about.
     
  8. olgeezer

    olgeezer Guest

    1,833
    0
    Dec 5, 2003
    And the reason this was done, is that original TV was developed as a B&W system. Because of bandwidth issues, when color was developed it had to fit in the same space. Color wheels were rejected, and an interlaced system was developed. It has served very well, and can do an excellent job with HD as well as SD in CRT displays.
     
  9. Rogueone

    Rogueone Hall Of Fame

    1,133
    0
    Jan 29, 2004
    Oh, and while discussing how AC impacts your TV: the reason TVs were 4:3 originally was, of course, because movies originally were. But also, the horizontal scan rate is about 15,734 Hz. The horizontal part of your TV (CRTs only) is driven by multiplying that 60Hz signal from the AC up to a speed that allows the electron beam to get from the left to the right and back in time for the second line to start. TV is actually 525 lines, but some of those lines are intentionally unviewable. They carry other information not intended to be viewed. So 480 becomes the "viewable" portion of the image, and that is why we call it 480i and p, as that is about how many viewable lines are on normal TV.

    So, when JB asks why a 720p set costs more than a 1080i set, there are 2 factors. 1) 1080i sets are typically CRT, which is still cheaper to make. 2) If you look at 720p and 1080i within the same technology, like LCD or plasma at both resolutions, you'll notice the 1080i is MUCH more expensive. It's easy to make a 720-resolution piece of LCD glass; it's much more difficult and costly to make a 1080 piece of glass which is the same, say, 1.3" size the 720 is.

    And with CRTs, it all has to do with the horizontal scan rate. You see, for 480i, the horizontal is only about 15,734 Hz, which is about 262 lines per pass. For 1080i, you need to do 540 lines per pass, or 32,400 Hz. This takes more expensive circuitry to boost the 60 Hz up to 32,400. And for a 720p-based CRT, it has to do all 720 per pass, or 43,200 Hz. Again, more cost to make. If you look at high-end monitors for computers like SGIs, their monitors have to be able to sync over 100kHz, which is why the high-end monitors cost so much more than the ones we see in stores. The 3-gun CRT projectors I used to install in the 90's were $20g and $25g based on being able to do 80k or 100+k horizontal signals.
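
    Here is the arithmetic in a quick Python sketch (visible lines only; real broadcast standards add blanking lines, which is why the actual rates come out a bit higher: roughly 15.7 kHz for 480i, 33.75 kHz for 1080i, 45 kHz for 720p):

    Code:
    # Horizontal scan rate = lines drawn per vertical pass * passes per second.
    vertical_rate_hz = 60   # vertical sweeps (passes) per second

    lines_per_pass = {
        "480i":  480 // 2,    # 240 visible lines per pass (interlaced)
        "1080i": 1080 // 2,   # 540 visible lines per pass (interlaced)
        "720p":  720,         # all 720 visible lines on every pass (progressive)
    }

    for fmt, lines in lines_per_pass.items():
        print(f"{fmt}: {lines * vertical_rate_hz:,} Hz horizontal")

    # 480i:  14,400 Hz
    # 1080i: 32,400 Hz
    # 720p:  43,200 Hz -- the faster the sweep, the pricier the deflection circuitry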

    That is what drives up costs the most in CRTs that are high-res. And of course, with those higher resolutions, you need more phosphors on the tube, a mask with more holes, etc.
     
  10. Stewart Vernon

    Stewart Vernon Roving Reporter Staff Member Super Moderator DBSTalk Club

    21,609
    380
    Jan 7, 2005
    Kittrell, NC
    Hey, and if we are going to consider all factors... Consider that anything you see has already happened, since it takes time for the light to travel to your eyes... so you are always looking at the past!

    Even at live events, you are always looking into the past! :)
     
  11. tomcrown1

    tomcrown1 Hall Of Fame

    1,576
    0
    Jan 16, 2006
  12. Rogueone

    Rogueone Hall Of Fame

    1,133
    0
    Jan 29, 2004
    dats a good one :)
     
  13. liferules

    liferules Godfather

    305
    0
    Aug 14, 2005
    Your link doesn't work. I guess I'll have to be lazy...:D
     
  14. Rogueone

    Rogueone Hall Of Fame

    1,133
    0
    Jan 29, 2004
    Weird, how did the link I pasted get all jacked up like that? :D It's a-workin' now. And that is a great web site for simple layman's-type explanations for lots of stuff.
     
  15. Rogueone

    Rogueone Hall Of Fame

    1,133
    0
    Jan 29, 2004
    Pretty good read, but the guy is sorta wrong on one comment:

    "However, because the interlaced half-frames differ in time by 1/60sec, subjects moving rapidly will appear doubled or blurry if one "froze" the video as when hitting Pause (or taking a screen snapshot). When viewed normally, high speed motion will still appear to be very smooth and rest of the scene will be with high detail due to the high resolution."

    The "frame" is sent comlete, there is no "shift" of 1/60th a second between the 2 halves of the same frame. 1 of the 30 frames is cut in half, and displayed in 2 parts. Where does one come up with the idea that the original would look "time shifted" as long as both halves are of the same frame? urg this type of thing drives me nuts :)

    But he is half right, he just stated it wrong. IF you were to pause at a point where the 2nd half of one frame and the 1st half of another were being drawn, THEN you'd MAYBE see his "time shifting." But you'd have to have some seriously fast motion, if you think about it.

    If you are watching normal TV like CSI or Lost or 24, etc., most of what you see is "normal" motion. Even in most sporting events, there is less fast action than normal action. Basketball, hockey, soccer sorta, car racing - these are sports with constant motion, but depending on camera angles, zoom distance, etc., the action may or may not be "fast" from a drawing perspective. A slap shot would definitely be "fast," or a camera looking down the start/finish line watching the cars whiz by in a blur, haha.
     
  16. bhenge

    bhenge AllStar

    83
    0
    Mar 2, 2005
    Hey, I didn't mean to restart an old war, but the discussion is great. When I made my comment, we were talking about the native resolution of displays, not which resolution provided a 'better' or higher-quality picture once you construct a full frame of data 30 times per second (24 for film). The point I tried to make was that if you feed a 720p signal to a native 1080i display, you would need a scaler to downconvert the 720p signal to 540 lines every 1/60th of a second. In other words, to display in 1080i, images in 720p must be "downconverted" by eliminating 180 lines (or 25%) every 1/60th of a second. This is because a native 1080i screen can only display (is buffer a better word?) 540 lines per 1/60th of a second to show 1080 lines per frame. 720p images, however, display 720 lines every 1/60th of a second. My older Pioneer Elite RPTV is native 1080i... it won't even display a 720p source, because the TV's internal scaler cannot convert the signal; I need the scaler in my 811 to do that. If this is wrong, please let me know. ;-)
     
  17. jrb531

    jrb531 Icon

    916
    0
    May 28, 2004
    OK, Mr. Insult, answer this:

    You are telling me that my good old Projection TV that does 1080i really is displaying a full 1080 scan lines "at one time" and for some reason it cannot do a lowly 720?

    Hmmm, it can do SD, VCR, DVD and even 1080i, but not 720p?

    Why? You see, I had always thought that the reason was that in 1080i the projection set only has to draw 540 lines per pass, and this is why no 720p, but I guess I am wrong.

    Silly me.

    LCDs have a fixed resolution, so an "LCD" 1080 does have 1080 lines - this I understand, which is why most LCDs only go up to 720. But what about CRTs and projection sets that do 1080i... do these all have 1080 lines of resolution?

    -JB

    Quote:

    1080i indicates a frame composed of 1920x1080 pixels, usually at 60 interlaced frames per second. This means that there are actually 30 full complete 1920x1080 frames per second made up of two half-frames each 1/60th of a second. The half frames alternate between the even numbered horizontal lines and the odd lines. Upon viewing, the two half-frames are seen as a whole entire frame, although they differ in time by 1/60th of a second.

    Hmmm, still seems like a 1080 "i" only draws 540 each pass. Guess I'm just too stupid to understand.
     
  18. Oompah

    Oompah Cool Member

    16
    0
    Feb 7, 2006
    Thanks for all the replies, folks! I've been working late the last few nights and need to do more than skim over them to do them justice.

    Please, gents... I did not want to restart some old arguments when I asked the question!

    More later. I'm tired.
     
  19. Rogueone

    Rogueone Hall Of Fame

    1,133
    0
    Jan 29, 2004
    What I keep trying to get you to understand is that that is not the case most of the time (sporting events maybe; normal HD it's not). 720/60p does seem to be the camera speed for most sporting events, but again, that is still useless during normal playback as the eye only sees 30 of them (but great for slow motion/stop action). Normal TV is 24fps, according to some data I'll post next.

    In any case, the 1080i set isn't displaying half of one frame then half of another. This is where you keep stumbling. 1080i sets take a full frame (here, the upconverted 720p frame) and simply reproduce its 2 halves in 2 passes. As you'll read shortly, you will see 1080 lines, not 540. So NO, it is not accurate to say 1080i only displays 540 at a time. Now, sure, the 1080i drops every other frame from a 720/60p source since it can only show 30 fps, but again, you couldn't see more than 30 of the 60 being shown anyway :)
     
  20. Rogueone

    Rogueone Hall Of Fame

    1,133
    0
    Jan 29, 2004
    I'm not insulting you, I am informing you that your understanding of the technology is in error.

    Your CRT does not do 720p because of the way the phosphors work. Here is a snippet from this LINK:
    "Of the three HDTV formats that are at the bandwidth limit two won out. ABC, and FOX for the over the air (OTA) networks and ESPN and ESPN2 for the cable/satellite networks chose 720p/60fps and NBC, CBS, PBS, WB, UPN and independents along with the other cable/satellite networks chose 1080i/30fps. Here is that darn interlaced format that is technically inferior again, why? In a word, or phrase, CRTs. The CRT based television was based on an interlaced scan system from the beginning of TV. Basically the cost to provide interlaced video on a CRT is much less expensive than the cost to provide progressive video on a CRT. Additionally, the persistence of phosphors had evolved to a point where interlaced video was more than acceptable for HDTV use. It was either force HDTVs to be even more expensive than they are for CRT based units or allow the interlaced format to spur on faster acceptance. Obviously the cost factor won out.​
    But what of the argument that with 1080 line interlaced video there is only 540 lines of video being displayed on the screen at any given time? If you have poked about on the internet much exploring this subject I’m sure you have seen this claim. In short it is a false claim. The claim of only 540 lines of video is based on the fact that the odd lines are scanned on one pass, or field, and then the even lines are scanned. What is forgotten here is a couple of things. First, on a CRT the persistence of the phosphors I mentioned before. Persistence is the ability of a phosphor to glow for a time after the electron beam has moved on, sort of like the glow of a filament in an incandescent light for a while after the electricity is turned off. This persistence is what keeps the first 540 lines of video lit while the second 540 lines of video is being painted. Now it is true that the prior scan of video will not be as bright as the current scan, but our brains will average this out which is why TV works for most humans and some dogs even. In short the phosphors provide the deinterlacing on the screen itself.​
    Now move to fixed pixel type displays like plasmas, LCDs, LCOS, DLPs, SEDs, etc and you have a completely different matter. These type displays are progressive in nature and any interlaced video fed into these displays will be deinterlaced by combining the two fields into a common frame for display. In the case of 1080i/30fps a video memory image of 1920x1080 is created and then scaled to the resolution of the display and displayed at the refresh rate of the display. Since most, if not all, fixed pixel type displays refresh at 60 times per second, each deinterlaced frame would be displayed twice. If the display has a resolution of 1920x1080 pixels, then the full HDTV resolution will be displayed, obviously not just 540 lines of video."
    then there is this on the actual recording of the programming:
    "Another issue to discuss when talking about the difference to the viewer between 1080i/30fps and 720p/60fps video is the source of the video. If the source of the interlaced video is the same frame for both the odd and even lines, such as it would be for movie frames and progressive cameras, the deinterlacing will reconstruct the progressive frame back to the original. Movies are shot at 24fps and even if displayed on a 60fps display the effective frame rate will still be at 24fps, so having a 720p/60fps signal and corresponding display does not help at all. In fact the efficiency is not as good as a lot of data is redundant. The 1080i/30fps matches up for 24fps video with half as many redundant frames. Most prime time HDTV shows are also 24fps, so the only case where the 60fps would offer an improvement would be when the source is also 60fps, such as sporting events.​
    Also there is the interlace artifact where the object moves in the 1/60th of a second between the odd lines being scanned and the even lines being scanned. This was important back in the days of iconoscope cameras which were interlaced in the capture the same as the CRT tubes used for display, because these cameras had the same constraints as CRTs as far as interlaced video is concerned. Modern CCD solid state cameras use a matrix of pixels to capture the images as a full frame and the pixels are shifted out of the captured image matrix electronically. No longer is it necessary to have a different frame between the odd and even scans and these naturally progressive cameras are making the classic interlace artifact a thing of the past. Remember if the two passes are made from a common frame capture, the reconstructed image will end up progressive, even if the transmission is interlaced."
    Keep in mind that even if a sporting program is shot at 60fps, the brain can only see 30. So, when a 720p signal is "upconverted" (not downconverted), the "frame" is multiplied by 3, then divided by 2 (hence the term 3:2 or 2:3 pulldown, whichever it is). This transforms the frame from 720 to 1080; THEN the converting chip sends out the 1080 frame in 2 passes. It's still the same frame!! And since YOU can't see more than 30 fps anyway, the fact that it tosses out every other frame is irrelevant, since you can't "see" those anyway. And when you do slow-mos, all 60 "frames" are still there and displayable, as the hard drive of the DVR didn't stop storing the 720p program. (This is a lot like video gaming, where there are video cards capable of running games at 300+ fps now. But at 300fps the game isn't any smoother than at 35/45/55/60. For a long time, 60 has been the magic number for online play, as it allows for a 30fps drop in framerate before you "notice" stutter from the video card being overwhelmed.)
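
    A simplified Python sketch of that conversion path (the 3/2 line-count ratio, 720 x 3 / 2 = 1080, then the split into two passes); plain nearest-line repetition stands in here for a real scaler's filtering:

    Code:
    # Upscale one 720-line frame to 1080 lines, then split it into the two
    # interlaced fields a 1080i set paints on its two passes.
    def upscale_720_to_1080(frame_720):
        # Nearest-line resize: map each of the 1080 output lines back to a source line.
        return [frame_720[i * 720 // 1080] for i in range(1080)]

    def split_into_fields(frame_1080):
        odd_field = frame_1080[0::2]    # lines 1, 3, 5, ... -> first pass
        even_field = frame_1080[1::2]   # lines 2, 4, 6, ... -> second pass
        return odd_field, even_field

    source = [f"line {n}" for n in range(1, 721)]   # one 720p source frame
    upscaled = upscale_720_to_1080(source)          # same picture, now 1080 lines
    odd, even = split_into_fields(upscaled)
    print(len(upscaled), len(odd), len(even))       # 1080 540 540

    Both passes come from the same frame, which is the whole point: nothing is lost, it's just drawn in two halves.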




    In both cases, you will "SEE" 30 frames per second, as that is ALL your brain can process. If the initial image was a 24p recording, as the notes state, you are repeating twice as many frames in 720p as in 1080i, and 1080i is displaying more than DOUBLE the pixels per frame. For 60fps recorded programs, while 720p delivers only about 11% fewer pixels per second than 1080i, you can see all 100% of the 1080i pixels while you can only "see" 50% of the 720p frames, since your brain can't detect every other frame. It's like the adage: if a tree falls in the forest and no one hears it, did it make a sound? The answer: it doesn't frackin' matter, 'cause if you WERE there to hear it, you would have :) If a TV frame is shown on screen and your brain can't detect it, was it really there? Hmm :rolleyes:
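
    Here is that pixel math in a quick Python sketch (standard frame sizes assumed):

    Code:
    # Pixels per frame and raw pixels per second for the two HD formats.
    p720_frame = 1280 * 720      #   921,600 pixels in a 720p frame
    p1080_frame = 1920 * 1080    # 2,073,600 pixels in a 1080-line frame

    print(p1080_frame / p720_frame)   # 2.25 -- each 1080 frame carries 2.25x the pixels

    # For 60 fps source material, raw throughput:
    p720_per_sec = p720_frame * 60      # 55,296,000 pixels/sec (60 full frames)
    p1080i_per_sec = p1080_frame * 30   # 62,208,000 pixels/sec (30 frames via 60 fields)
    print(1 - p720_per_sec / p1080i_per_sec)   # ~0.11 -> 720p60 is about 11% behind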




    I'm not saying all this to say 720p looks bad; I'm saying it to help you understand your assertions about 1080i are wrong, and that except for programming recorded at 60fps, which is limited to some sporting events, 1080i will have the better picture. And in any case, 1080i will always have the higher-resolution picture. As to the question of why CRTs don't normally do 720p, that article explains it well. Expense.


    I'll give a fast example. In 1995, I worked at a company that sold conference room equipment. We sold 35" Mitsubishi tube "monitors." They were over $5000; our cost was about $3500 to $4000. Circuit City et al. were selling the same 35"-sized sets for under $2000. Why the difference? The CC model only did TV signals; the one we sold could handle anything short of an SGI computer, so up to about 80kHz horizontal (which is somewhere around 1600x1200 or maybe closer to 2000x something). SGIs went over 100k, and it took extremely special monitors to handle SGI computers.



    As the article stated about the phosphors as well, they nowadays stay lit for nearly 1/30th of a second on interlaced TVs. What happens to the picture if you suddenly dump 720/60p into that set? It's blurry as hell. The phosphors would be lit twice as long as necessary, and the picture would blow chunks. So you'd have to do 720/30p or 1080i. And since 1080i only requires a horizontal/vertical combo speed able to draw 540 lines per pass, it's cheaper to make, and the same phosphors that work for 480i work for 1080i. Does it make any more sense now? 720/60p isn't possible on a 480i-based CRT. And 720/30p would cost more for an inferior picture in most cases.
     