
Picture Quality


15 replies to this topic

#1 OFFLINE   E91

E91

    Godfather

  • Registered
  • 341 posts
Joined: Oct 07, 2008

Posted 24 September 2010 - 01:56 PM

So, I'm seeing some pretty dramatic improvements in PQ since I switched back to D* (HR24). I had a VIP722 DVR for two years, with a Samsung LCD.

I'm a little curious as to what factors are driving this effect: 1) something about the HR24 vs. the ViP series, 2) compression, or 3) native passthrough.

What do you guys think?

#2 OFFLINE   Codfishjoe

Codfishjoe

    AllStar

  • Registered
  • 80 posts
Joined: Sep 02, 2010

Posted 24 September 2010 - 02:05 PM

It's mainly the compression DirecTV uses.
I am not really a codfish.

#3 OFFLINE   jdspencer

jdspencer

    Hall Of Fame

  • Registered
  • 6,565 posts
Joined: Nov 07, 2003

Posted 24 September 2010 - 02:56 PM

Shouldn't that be "less compression DirecTV uses"? :)
DirecTV since '96, Waivers for ABC, CBS, NBC, & Fox, HR23-700 & HR24-500/AM21, using ethernet based MRV.

#4 OFFLINE   mdavej

mdavej

    Hall Of Fame

  • Registered
  • 2,249 posts
Joined: Jan 30, 2007

Posted 24 September 2010 - 03:29 PM

I don't see any difference on my 722k versus my HR20 (going from memory). Are you talking about HD or SD or both? And what kind of difference exactly - compression artifacts, softness, macroblocking, color depth, etc.? If anything, my 722k actually seems a little better (sharper). I'm also seeing fewer compression artifacts on SD with Dish than with D*.

#5 OFFLINE   E91

E91

    Godfather

  • Topic Starter
  • Registered
  • 341 posts
Joined: Oct 07, 2008

Posted 24 September 2010 - 03:48 PM

I don't see any difference on my 722k versus my HR20 (going from memory). Are you talking about HD or SD or both? And what kind of difference exactly...?


With regard to SD: I do watch a lot of SD on a 27" SD set I have in my dining area. But with E* I had the 722 feeding it via coaxial, while with D* I have an HR24 connected via component inputs. So the picture is better with D*, but it is not a fair comparison.

With regard to HD, I'm talking mostly about color depth and sharpness of picture. It's hard to quantify, but the picture just looks a lot better now. I did not really get a lot of compression artifacts with either E* or D*.

#6 OFFLINE   Jason Whiddon

Jason Whiddon

    Hall Of Fame

  • DBSTalk Club
  • 2,262 posts
Joined: Aug 17, 2006

Posted 24 September 2010 - 06:06 PM

As I increased my set size, Dish's compression (or heavier use of it) showed up a bit more. IMO, it's easy to see a difference between Dish and DirecTV on sets 50 inches and larger. Dish isn't bad, but DirecTV looks better.
65" VT50 / BDP-S6200
X4000 / Outlaw Model 7125
Klipsch RF82 II and RC62 II / Hsu VTF-15H (2)
Directv HR44-200 / HR24-500

 


#7 OFFLINE   TomCat

TomCat

    Broadcast Engineer

  • Registered
  • 3,549 posts
Joined: Aug 31, 2002

Posted 24 September 2010 - 07:51 PM

...With regard to HD, I'm talking mostly about color depth and sharpness of picture. Its hard to quantify, but the picture just looks a lot better now. I did not really get a lot of "compression artifacts" with either E* or D*.

Color depth and sharpness are both probably identical, since those are defined by the 4:2:0 color space, the limits of the color gamut, and the fixed resolution (just as your 1080p 1920x1080 TV has a fixed resolution, the actual resolution of the images it receives is also fixed).

Both DBS services use the same compression algorithm, MPEG-4 AVC (Part 10). But surprisingly, compression has nothing at all to do with actual resolution, sharpness, or color depth. Those are fixed during digitization, which is done before compression. And speaking as someone who does this for a living, I'd guess the digitization step is probably done identically by both DISH and DTV (compression choices may be quite different).

There are two things (all else held equal) that can affect how good digital PQ is (assuming we factor out PQ differences before digitization or in the source material). Surprisingly, the DVR is not one of them. All DVRs have the same PQ, even from different manufacturers; signals in the digital domain have a fixed PQ in consumer delivery scenarios. That means if we both tune in the LIL sat signals, the PQ of Fringe is exactly, precisely the same going into the HDMI connector on my TV as it is going into the HDMI connector on yours; things can vary from that point on.

1) Pre-compression conditioning. This usually comes down to the skill or choices of the compressionist. The goal is to send no unneeded information into the compression algorithm, so a lot of signal conditioning can happen here. Mostly this means noise reduction, which can impair resolution if done with a heavy hand.

2) Compression artifacts. While these do not actually change the resolution (unless that choice is made on purpose; remember HD-Lite?) or the color space (all consumer HD is 4:2:0), they can change the perceived resolution by masking the actual resolution. A skilled compressionist will produce a signal with fewer artifacts, partly depending on how well #1 above is performed.

There are a number of choices the compressionist can make that affect perceived resolution and PQ, even if the compression levels are identical and even though, ironically, color depth and resolution are fixed. But even if those are fixed, other factors can lead a person's subjective impression of PQ to be better or worse, even for well-trained eyes. Just as the same wine expert will judge the same wine very differently on a different day, subjective impression is very malleable, and that is ironically one of the tools the compressionist himself uses. Since it is subjective and dynamic, it greatly complicates the task of preserving PQ in spite of compression for the masses.

I had both DTV and DISH side by side for a few months into the same TV (this was pre-HD). The signals were different, but only slightly so. DISH had a bit more mosquito noise and DTV a bit more quantization error (the compression artifacts can be balanced one against the other depending upon choices made at compression; DISH made slightly different choices than DTV). But they were very close (and pretty consistent channel to channel). I never would have been able to point at a picture and say "That's obviously DTV," for instance. They were so close that I could not state a preference. At least for their SD at that time.

But you are not the first to comment that the PQ for HD is now often considered a little better on DTV than on DISH (I have yet to see the opposite comment, but maybe that is partly due to me perusing DTV sites and not DISH sites anymore). As a former DISH sub and a current loyal DTV sub, I'd like to at least think you are probably right about that.

It is very difficult to have a definitive answer, even with the same material side by side from both vendors. Bottom line, the differences you may see between DISH and DTV are definitely not due to the DVR, and very likely not due to the level of compression (which is much the same between DISH and DTV). The end result you see is most probably due to the choices made by the various compressionists and how they balance the effects of the various tools available to them in the MPEG toolbox. Compression techniques are so varied, adjustments interact and cause such small differences, and video varies so widely in its character, that the compressionists themselves don't know for sure what the differences will be. It is almost a black art. But some will make better choices than others, and it may just be that those working for DTV are a little better at this than those from DISH.
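The point about pre-compression noise reduction can be demonstrated with any generic compressor. This is a toy sketch (using Python's zlib, not any broadcast encoder), showing that random noise inflates the compressed size of otherwise redundant data, which is exactly why encoders condition the signal first:

```python
# Toy demonstration: noise makes data harder to compress, so removing
# noise before the encoder saves bits for real picture information.
import random
import zlib

random.seed(0)
# A smooth "image" row: values change slowly, like real video content.
clean = bytes(128 + int(20 * ((i % 50) / 50)) for i in range(10000))
# The same row with random sensor noise added.
noisy = bytes(min(255, max(0, b + random.randint(-5, 5))) for b in clean)

print(len(zlib.compress(clean)))  # small: the redundancy is easy to exploit
print(len(zlib.compress(noisy)))  # much larger: the noise carries no picture info
```

The same budget effect applies to a fixed-bitrate satellite channel: bits spent encoding noise are bits not spent avoiding visible artifacts.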

Edited by TomCat, 25 September 2010 - 08:26 PM.
minor tweaks added for clarity

It's usually safe to talk honestly and openly with people because they typically are not really listening anyway.

#8 OFFLINE   E91

E91

    Godfather

  • Topic Starter
  • Registered
  • 341 posts
Joined: Oct 07, 2008

Posted 24 September 2010 - 11:02 PM

Color depth and sharpness are both probably identical, since those are defined by the 4:2:0 color space and the fixed resolution... But some will make better choices than others, and it may just be that those working for DTV are a little better at this than those from DISH.


What a fantastic post! Thanks for the info. I really enjoyed reading this - learned a lot.

#9 OFFLINE   Jason Whiddon

Jason Whiddon

    Hall Of Fame

  • DBSTalk Club
  • 2,262 posts
Joined: Aug 17, 2006

Posted 25 September 2010 - 07:55 AM

A few comments I want to add about 4:2:0:

*) Blu-ray's native "output" is 4:2:2 20 bit (10-bit color), and many players can do color interpolation and output 4:4:4 36 bit (12-bit color)

*) The HR20-23 output 4:4:4 24 bit (some sort of 8 bit with color interpolation), and the HR24-500 outputs RGB.

4:2:0 is the native format, but I'm not aware of anything consumer-wise that outputs it; there's always some interpolation going on.

 


#10 OFFLINE   TomCat

TomCat

    Broadcast Engineer

  • Registered
  • 3,549 posts
Joined: Aug 31, 2002

Posted 25 September 2010 - 08:11 PM

A few comments I want to add about 4:2:0:

*) Blu-ray's native "output" is 4:2:2 20 bit (10-bit color), and many players can do color interpolation and output 4:4:4 36 bit (12-bit color)

*) The HR20-23 output 4:4:4 24 bit (some sort of 8 bit with color interpolation), and the HR24-500 outputs RGB.

4:2:0 is the native format, but I'm not aware of anything consumer-wise that outputs it; there's always some interpolation going on.

Two things to keep in mind regarding this:

1) 4:4:4 is a color space definition (as are 4:2:2 and 4:2:0) that defines how much resolution chroma pixels have in relation to luma pixels. RGB is a color scheme; a way of matrixing and displaying the various colors available. Those are two very different things applied at two very different stages of the signal chain. What is usually used as a scheme is Y Pb Pr, which is a similar way of matrixing color values to RGB. It is a much more efficient way to transport full color than RGB, but both will present an identical output once the chroma components are combined at output (at the display, typically). Since they are different things, you can select any color space definition at digitization and then matrix the chroma components in any matrixing scheme. And the results will typically be pretty much the same no matter how you do that.

2) Consumer equipment may have the capability of a higher color space than 4:2:0, but so far there is little if any content available in any higher color space, which means that will not be of any advantage.

Actually, there is no visible (discernible) difference between 4:2:0 and 4:2:2 or 4:4:4. Because human vision can not resolve chroma values as finely as it can luminance values, 4:2:2 and 4:2:0 can be used for consumer distribution with the same perceived result as 4:4:4. The chroma values of 4:2:0 are resolved at a resolution 1/4th that of 4:4:4, but we can't distinguish the difference. The physical difference is that 4:4:4 defines a pixel value for each Pb and Pr pixel, while each group of 4 pixels (Pb or Pr) shares a common value for 4:2:0. IOW, resolution of chroma is quantifiably less, yet not noticeable to the human vision system.

4:2:2 and 4:4:4 become important only when there will be generational changes; when post-production transitional effects are added, for instance, or editing is done. If at 4:2:0, those might induce generational losses due to concatenated rounding errors in the math required. But for consumer distribution (where no further math is done), 4:2:0 is just fine and has no disadvantages, while it has the advantage of taking fewer bits to transport, meaning compression for a given artifact level can be greater (or artifact levels will be less for equivalent compression levels). And that is one of the reasons why we don't get content in 4:2:2 or 4:4:4; it would have no advantage at that point and would make compression harder and produce more artifacts.
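As a concrete illustration of the 4:2:0 scheme described above (a minimal sketch, not any vendor's actual pipeline): each 2x2 block of chroma samples collapses to a single shared value, quartering chroma resolution while luma stays at full resolution:

```python
# Toy 4:2:0 chroma subsampling: every 2x2 block of chroma pixels (Cb or Cr)
# is averaged down to one shared value, so 16 samples become 4.
def subsample_420(chroma):
    """chroma: 2D list of sample values with even dimensions."""
    return [
        [sum(chroma[2 * r + dr][2 * c + dc] for dr in (0, 1) for dc in (0, 1)) // 4
         for c in range(len(chroma[0]) // 2)]
        for r in range(len(chroma) // 2)
    ]

cb = [[100, 102, 200, 198],
      [101, 103, 201, 199],
      [50,  52,  150, 148],
      [51,  53,  151, 149]]
print(subsample_420(cb))  # [[101, 199], [51, 149]] -- 16 samples become 4
```

Since the eye resolves chroma much more coarsely than luma, the averaged values are, per TomCat's point, perceptually indistinguishable from the originals at normal viewing distance.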

It would be nice if DTV had 4:2:2 or 4:4:4 video available to them when converting MPEG2 to MPEG4 (they don't) but it hasn't seemed to cause them any problem with PQ regardless of it not being available to them.

Edited by TomCat, 25 September 2010 - 08:32 PM.


#11 OFFLINE   Richard L Bray

Richard L Bray

    Legend

  • Registered
  • 117 posts
Joined: Aug 19, 2006

Posted 27 September 2010 - 06:37 AM

Actually, there is no visible (discernible) difference between 4:2:0 and 4:2:2 or 4:4:4. Because human vision can not resolve chroma values as finely as it can luminance values, 4:2:2 and 4:2:0 can be used for consumer distribution with the same perceived result as 4:4:4. The chroma values of 4:2:0 are resolved at a resolution 1/4th that of 4:4:4, but we can't distinguish the difference. The physical difference is that 4:4:4 defines a pixel value for each Pb and Pr pixel, while each group of 4 pixels (Pb or Pr) shares a common value for 4:2:0. IOW, resolution of chroma is quantifiably less, yet not noticeable to the human vision system.

You obviously know a lot more about encoding than I do. However, what we see on our displays is also a factor of how our components handle the incoming signal.

The components don't necessarily handle 422, 444, and RGB with equal quality. Spears and Munsil are acknowledged experts in this area. Their following article provides information on the subject:
http://www.spearsand...colorspace.html

In my case, my PS3 loses chroma detail (shown on the Spears and Munsil Chroma Multiburst pattern) with 444 output but not with RGB, and the PS3 output also requires a chroma alignment adjustment. My Oppo handles all of this properly. My PS3's 444 output additionally requires a Y/C delay adjustment which isn't required with the Oppo. Plus, I get better chroma patterns if I have my DVDO Duo convert the HR24's RGB output to 444 than if I let the VT25 display do it internally prior to its 444 processing; and I've read some expert opinions that the VT25 doesn't necessarily get the same results with 422 and 444.

Finally, there are several displays that are known to poorly handle RGB input (which is output by the HR24-500).

Edited by Richard L Bray, 27 September 2010 - 06:50 AM.


#12 OFFLINE   Jason Whiddon

Jason Whiddon

    Hall Of Fame

  • DBSTalk Club
  • 2,262 posts
Joined: Aug 17, 2006

Posted 27 September 2010 - 07:36 AM

The 2010 Pannys are designed for 444 36 bit, and my Duo outputs it to the Panny on auto, but I did find a slight chroma alignment error. I used the YC Delay on the Duo to correct it properly.

My HR24 sends RGB 24 bit, my TiVo sends YCbCr 4:4:4 24 bit, and my BD85 sends 4:2:2 20 bit to the Duo. The Duo outputs 444 36 bit to the TV, which is just color interpolation, i.e., filling in bits.

 


#13 OFFLINE   TomCat

TomCat

    Broadcast Engineer

  • Registered
  • 3,549 posts
Joined: Aug 31, 2002

Posted 28 September 2010 - 10:27 PM

...what we see on our displays is also a factor of how our components handle the incoming signal.

The components don't necessarily handle 422, 444, and RGB with equal quality...

That is very true, Richard (and thanks for the interesting link). But remember, I was speaking about the differences between the actual signals, which, if no new binary math is performed on them and all else is held equal, cannot be resolved any differently by the human vision system (IOW, 4:2:2 or 4:4:4 cannot on their own merits appear better than 4:2:0). That is a very narrow and specific fact which implies nothing beyond that, but it is a cold, hard fact nevertheless.

And that (all else held equal) is the distinction. It is very possible for a particular combination of consumer equipment to handle a 4:2:0 signal in an imperfect manner that it might not exhibit with an otherwise "identical" 4:2:2 signal, for instance. That is a case of all else not being held equal. The rules of physics still apply, even though something else in the equation (something not constant across both signals being compared) may make things appear different.

The point I was making, or the one you seem to be calling into question at least, is that the use of circuitry with a different color space than 4:2:0 has no bearing on either improving or preserving PQ of 4:2:0 signals. It has the power to do neither of those things. And in a case where the equipment handles it imperfectly, the opposite (degradation of PQ as in your examples) may happen.

Often a processing chip is chosen by a manufacturer for reasons of economics or supply alone, and the fact that it also handles a higher color space may be merely incidental, and certainly not relevant. The HDMI 1.4 spec handles much greater color depths than are available to consumers, yet that adds no intrinsic value to PQ when the source does not have those greater color depths in the first place (which is 98% of the time). Likewise, the Sharp Quattron TV adds a yellow subpixel to its displays, but since all video that can feed such a display is already matrixed from a 3-color system, it also does not improve PQ (other displays can already reproduce every color available in the HD gamut).

Edited by TomCat, 28 September 2010 - 10:33 PM.


#14 OFFLINE   TomCat

TomCat

    Broadcast Engineer

  • Registered
  • 3,549 posts
Joined: Aug 31, 2002

Posted 28 September 2010 - 10:53 PM

The 2010 Pannys are designed for 444 36 bit, and my Duo outputs it to the Panny on auto, but I did find a slight chroma alignment error. I used the YC Delay on the Duo to correct it properly.

My HR24 sends RGB 24 bit, my TiVo sends YCbCr 4:4:4 24 bit, and my BD85 sends 4:2:2 20 bit to the Duo. The Duo outputs 444 36 bit to the TV, which is just color interpolation, i.e., filling in bits.

Video available to consumers is typically 8 bit. All consumer HD that is transported is 8 bit. There may be consumer cameras that handle things at a higher bit depth.

But consider processing at a higher bit depth than the source when no new real math is performed, which is what happens when you plug an HDMI cable into a TV. Even though the equipment may transport or accept digital words at a higher bit depth, all bits beyond the 8 bits of the source video are nulled out, all zeroes; the zeroes are merely placeholders and represent no real information (and no increase in actual bit depth or quality).

36-bit binary words with their 28 least-significant bits nulled out have identical quality to 8-bit binary words. IOW, there is no advantage whatsoever for 8-bit source material.

It is similar to ripping a CD backwards. If you rip a conventional CD (recorded at 705 kbps per channel) down to a 128 kbps MP3, there is a definite, undeniable loss of audio quality. If you then convert that digital file back up to CD specifications and burn it to a CD-R, the actual quality remains MP3 quality.

Once the original information is discarded (once you digitize at 8 bits, for instance) there is no way to recover what was thrown away, and circuitry with a higher bit-depth capability is of no value to that signal. It is the same principle as 4:2:0 color carried by 4:4:4 circuitry; the end result is the same 4:2:0 chroma resolution.
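The bit-depth argument above can be sketched in a few lines (hypothetical 8-bit and 12-bit samples, not any real device's pipeline): widening a word by shifting in zero bits is perfectly reversible and adds nothing, while quantizing down first discards bits that no later widening can restore:

```python
# Toy sketch: widening an 8-bit sample to 12 bits just shifts in zero
# placeholder bits; truncating recovers it exactly, so nothing was added.
def widen(s8):
    return s8 << 4        # 8-bit -> 12-bit, low 4 bits are all zero

def narrow(s12):
    return s12 >> 4       # 12-bit -> 8-bit, low 4 bits are discarded

# Round-tripping an 8-bit source through 12-bit circuitry is lossless.
assert all(narrow(widen(v)) == v for v in range(256))

# By contrast, quantizing a true 12-bit source down to 8 bits discards
# real information that no subsequent widening can restore.
original_12bit = 0xABC
lossy = widen(narrow(original_12bit))
print(hex(lossy))  # 0xab0, not 0xabc -- the low 4 bits are gone for good
```

This is the CD-to-MP3-to-CD-R analogy in miniature: the second conversion changes the container, not the quality.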

Edited by TomCat, 28 September 2010 - 10:58 PM.


#15 OFFLINE   Jason Whiddon

Jason Whiddon

    Hall Of Fame

  • DBSTalk Club
  • 2,262 posts
Joined: Aug 17, 2006

Posted 28 September 2010 - 11:03 PM

You post a lot of technical info, but you pass over some simple parts of the process. When my Duo accepts 4:2:2 20 bit and outputs 4:4:4 36 bit, it's using color interpolation (filling in the blanks with color data). This is similar to what LCDs do with frame interpolation, and it works well. I understand what you are getting at with ripping the CD backwards and converting an MP3 to CD quality, but it's not that cut and dried.

The technology is there to interpolate color and framerate, and some of the newer devices do it well. I've tested the 4:4:4 output with the Spears and Munsil Blu-ray (chroma alignment patterns; it passed), as well as discussed it with Stacey Spears, and you dismiss it too easily.

Also, the Panasonic BD85, which can output 4:4:4 36 bit directly, was found by professional reviewers to do this VERY well with displays that can accept and handle it properly. The 2010 Panny plasmas, one of which I own, do this.

What is happening with the BD85 and the Duo to the Panasonic, is this "new real math" you reference. Spears and Munsil are working on new test patterns for 4:4:4 36 bit, and I've spoken with Ken, the engineer at AnchorBay, about it also.

 


#16 OFFLINE   TomCat

TomCat

    Broadcast Engineer

  • Registered
  • 3,549 posts
Joined: Aug 31, 2002

Posted 28 September 2010 - 11:39 PM

You post a lot of technical info, but you pass over some simple parts of the process. When my Duo accepts 4:2:2 20 bit and outputs 4:4:4 36 bit, it's using color interpolation (filling in the blanks with color data). This is similar to what LCDs do with frame interpolation...

I think what it does is create null bits or zeroes for the 16 least-significant of those 36 bits. It "fills in the blanks" with zeroes, or no real data.

It likely does the same thing for 4:4:4. The only difference between 4:4:4 and 4:2:0 is that for 4:4:4 each Pb and Pr pixel has a unique value and for 4:2:0 each group of 4 pixels share the same value. Do you really believe that converting that to 4:4:4 creates 4 different individual values for those pixels rather than four values that are the same? From what information? There is no information to draw from.
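A minimal sketch of the replication-style conversion described above (illustrative only, not what any particular chip actually does): turning a 4:2:0 chroma plane into a "full-resolution" plane by copying each shared value into its four pixels produces a bigger array containing exactly zero new information:

```python
# Toy 4:2:0 -> 4:4:4 "upsampling" by replication: the four pixels that
# shared one chroma value now each carry an identical copy of it.
def upsample_420_to_444(chroma):
    out = []
    for row in chroma:
        wide = [v for v in row for _ in (0, 1)]  # duplicate horizontally
        out.append(wide)
        out.append(list(wide))                   # duplicate vertically
    return out

print(upsample_420_to_444([[101, 199], [51, 149]]))
# [[101, 101, 199, 199], [101, 101, 199, 199],
#  [51, 51, 149, 149],  [51, 51, 149, 149]]
```

Every output pixel is a copy of an input pixel, which is the point: the discarded per-pixel chroma detail cannot be reconstructed from what remains.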

But let's say it actually does intelligently look at each digital word or pixel and then interpolates from that what the new bits should be. There is no way for it to magically know what those bits should be, because the information that would be represented by them was discarded at digitization. Anything it could possibly do would be a wild-assed guess; not even an educated guess because there is no intelligence available to make that guess. That will not result in an improvement of PQ, and may have greater odds of degrading it (there is only one right answer and 16 bits worth of wrong answers).

If it dithers the color values, that is also not of any value. Your eyes already can't resolve chroma resolution finer than what you are getting assuming you are far enough away from the screen to not be able to distinguish individual pixels, and that (dithering) is not really any different from the job your eyes do naturally.

Think of the CSI expert in the cop show who gets a smeary security-cam shot of a license plate and then, magically through his scary software, turns it into perfect HD resolution. That's the same fantasy. Other than a little weak edge enhancement (which only works on relatively clear pictures anyway) you just can't do that, even at the NSA. There is nothing magical about interpolation, and it does not improve quality, especially at the consumer level. If the information is gone and you have no clue or magical Hollywood algorithm available, you are not going to get it back.

Frame interpolation is really a different animal. That is done for temporal reasons, not for resolution or color-depth reasons. It works only because of the shortcomings of low frame rates. It does not add new information, it only reshuffles the information it has available, and it does that with very little intelligence; it would take much more processing power to intelligently create in-between frames that are anything other than a blend of the adjacent frames which is prohibitive at a data rate of 1.5 Gbps. It does nothing to or for either resolution or color.

An example of actually being able to improve the PQ of an image (in this case its color depth, contrast, gamma, and detail over an extended luminance range) would be HDR photography. But in that case the extra information added is real; it exists in the raw image or in the bracketed images taken at the same time as the original, information that has not been discarded. Which is why that works and very little else does.

Edited by TomCat, 28 September 2010 - 11:47 PM.




