HR20 Power Consumption?

Discussion in 'DIRECTV General Discussion' started by georgecostanza, Aug 21, 2007.

Thread Status:
Not open for further replies.
  1. Tom Robertson

    Tom Robertson Lifetime Achiever DBSTalk Club

    21,331
    246
    Nov 15, 2005
    Your statement "shaving off 10 watts per day" was misleading, then; I used that number. 10 Wh per day works out to only about 0.4 W of continuous draw.
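    For anyone following along, the conversion is just energy divided by hours; a quick sketch, nothing fancy:

    [code]
    # Convert a daily energy figure (watt-hours per day) into the
    # equivalent continuous power draw in watts.
    def wh_per_day_to_avg_watts(wh_per_day):
        return wh_per_day / 24.0  # 24 hours in a day

    print(wh_per_day_to_avg_watts(10))  # 10 Wh/day ~= 0.42 W average
    [/code]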

    Lopping 10W from a device that currently uses 34W is pretty massive. Again, is this the lowest hanging fruit? The most economical place to look?

    Cheers,
    Tom
     
  2. reverett1522

    reverett1522 Cool Member

    20
    0
    Jul 19, 2007
    My Kill-A-Watt says 43 watts.

    That's a fun little device; I got it for tailgating, so I know what I can plug in without overloading my generator. What was interesting was that the back of my LCD TV said 3 amps (which at 120 volts should equal 360 or so watts), but the meter said it drew only between 100 and 105 watts, depending on volume!
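    (The nameplate amps are a worst-case rating, not the typical draw. A rough sketch of the math, assuming a standard 120 V US circuit and ignoring power factor:)

    [code]
    # Nameplate rating vs. measured draw: the label gives the maximum
    # current the set may pull; the Kill-A-Watt reports actual watts.
    line_voltage = 120     # volts, typical US household circuit (assumed)
    nameplate_amps = 3     # from the back of the LCD TV

    nameplate_watts = line_voltage * nameplate_amps  # 360 W worst case
    measured_watts = 105                             # what the meter read

    print(f"nameplate max: {nameplate_watts} W, measured: {measured_watts} W")
    [/code]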
     
  3. dphil9833

    dphil9833 AllStar

    51
    0
    Jul 6, 2007
    According to CNET, the D* DVR draws 33 watts; see http://reviews.cnet.com/4520-6475_7-6400401-2.html
     
  4. techntrek

    techntrek Godfather

    298
    0
    Apr 26, 2007
    Whoops, my bad, Tom. I did write "per day", but that's not what I meant.

    Attaining energy conservation at the consumer level is far cheaper and easier than increasing production at the other end. The rule of thumb in the renewable industry is that for every $1 you spend reducing consumption, you save $3-$4 on production. For instance, by the time I've finished reducing my consumption I will have cut it by 75% (down to 10 kWh per day, yearly average). I'm halfway there so far. The cost to cut it to 10 kWh per day will be about $15,000, which includes a new heating/cooling + water-heating system that will cut that consumption to 1/3 of what I spend now on air heating/cooling + water heating.

    At 10 kWh per day of usage, I can install a solar system to meet my annual needs for about $20,000. If I stayed at 40 kWh per day, the solar system would cost between $60,000 and $80,000. Not a figure I have to spend.
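    (Back-of-the-envelope on that: the dollar figures are the quotes I mentioned, and treating cost as roughly linear in daily usage is my assumption:)

    [code]
    # Rough solar-PV sizing: system cost scales roughly linearly with
    # daily energy use (an assumption; real quotes vary by site).
    cost_per_kwh_day = 20000 / 10  # ~$2,000 per kWh/day of usage

    for daily_kwh in (10, 40):
        print(f"{daily_kwh} kWh/day -> ~${daily_kwh * cost_per_kwh_day:,.0f}")
    # 10 kWh/day -> ~$20,000; 40 kWh/day -> ~$80,000 (top of my range)
    [/code]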

    So the low-hanging fruit is the 10 watts your TV uses today to stay in standby (which with modern circuitry should be 1-2 watts), the 35 watts your DVR uses (should be 10 or less), the 15 watts your microwave uses to run the clock and read the touchpad (should be 1-2), etc. It's very easy to do at the design stage, but since it costs a little more, it won't ever happen until it's mandated. Until then, use switched outlets.
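    To put numbers on that low-hanging fruit, here's a quick sketch of what those always-on loads cost per year; the wattages are the ones above, and the $0.10/kWh rate is just a round figure I'm assuming:

    [code]
    # Annual cost of always-on standby loads, assuming they draw
    # continuously and electricity costs $0.10/kWh (assumed rate).
    standby_watts = {"TV": 10, "DVR": 35, "microwave clock": 15}
    rate = 0.10  # dollars per kWh, assumed

    for name, watts in standby_watts.items():
        kwh_per_year = watts * 24 * 365 / 1000
        print(f"{name}: {kwh_per_year:.0f} kWh/yr, ~${kwh_per_year * rate:.2f}/yr")
    # The 35 W DVR alone works out to ~307 kWh and ~$31 a year.
    [/code]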
     
  5. paulman182

    paulman182 Hall Of Fame

    4,839
    3
    Aug 4, 2006
    I believe any savings gained by removing power from these low-wattage devices will be more than offset by losses encountered when they fail from the repeated power on/off cycles.

    I have seen it too many times in broadcasting. The equipment we leave turned on just works and works. Money-starved radio stations try to save cash by turning everything off, and much of it fails at one of the daily turn-ons.
     
  6. georgecostanza

    georgecostanza Mentor

    34
    0
    Jan 10, 2005
    PCs have been putting hard drives to sleep for years, and Microsoft made S3 standby the primary shutdown method in Vista. I think a DVR can survive 2-3 wake/sleep cycles a day.
     
  7. georgecostanza

    georgecostanza Mentor

    34
    0
    Jan 10, 2005
    Actually, the sun radiates about 2000 watts per square meter at the surface of the earth. Do the math; that is way more than we could ever consume, even if captured at 20% efficiency.
     
  8. TDTivo

    TDTivo Cool Member

    26
    0
    Nov 3, 2006
    I work in broadcast TV, and the only things that get turned off in the station are the lights. I know we do have problems with power-downs; the only time we power-cycle anything is for lockups!
     
  9. Tom Robertson

    Tom Robertson Lifetime Achiever DBSTalk Club

    21,331
    246
    Nov 15, 2005
    Welcome to your first post, TDTivo! Glad to have you posting. :welcome:

    True power-downs are one issue, but controlled, reduced consumption of components is a different matter. Again, I'm not sure how much can be saved, nor of the value of a reduced-power mode in a satellite receiver, especially a DVR, but the concept is intriguing enough to at least seriously consider.

    If used in conjunction with an auto-tune function, whereby the unit could power back up when a user regularly wishes to watch something in the morning without fully recording it, there might be some possibility.

    But remember, a DVR could be receiving pre-staged VOD or showcases at any time, 24/7. I'm just not sure how much power saving is truly realistic without giving up major functionality. Perhaps some options could be given to users? Though that still won't likely do much to dent the consumption of the 10M DVRs out there.

    Cheers,
    Tom
     
  10. paulman182

    paulman182 Hall Of Fame

    4,839
    3
    Aug 4, 2006
    I wasn't really talking about sleep cycles, "standby", or a reduced-power mode. Turning a power strip on and off, or plugging and unplugging the power cord, is harder on electronic devices than leaving them powered. I prefer to use the devices as they were designed.
     
  11. techntrek

    techntrek Godfather

    298
    0
    Apr 26, 2007
    Consider this: there are computer processors (like the Pentium-M) with related chipsets, used in laptops, that regularly drop into low-power modes. The laptop is still fully functional: you can still type, use the mouse, and the screen stays active, and the various chips go full-power the instant you ask them to play Doom. On a DVR, the only functionality lost in the lowest power state would be the 90-minute buffer. Make the buffer optional in "standby mode" for those who must have it, and the unit can still cut 10+ watts.

    So I pulled out my Kill-A-Watt yesterday, pulled my system away from the wall, and started doing some tests on my TV and HR20. I hadn't had a chance to test the TV since I got it in February, so this was the time to do it while I was back there testing the DVR. Many people here can appreciate the pain of the physical moves and of navigating the cable spaghetti.

    TV test results:
    47" LCD TV with backlight set to its lowest setting: 90 watts
    backlight set to the highest setting: 254 watts
    backlight set at 80%, where I usually keep it: 205 watts
    backlight set at 75%, barely noticeable visual change: 195 watts

    Conclusion: a very minor change in my settings saves me 10 watts whenever the set is on; call it 60 watt-hours per day at my viewing habits, or roughly 220 kWh over the 10-year expected lifetime. A good example of how a minor change at the software level makes a big difference. I also now know that when I install a whole-house UPS, I can roughly double my TV runtime just by dimming the backlight all the way down!
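    (The arithmetic, spelled out; the 6 hours/day of viewing is my assumption:)

    [code]
    # Lifetime energy saved by dimming the backlight from 80% to 75%.
    watts_saved = 205 - 195  # measured above: 10 W while the set is on
    hours_per_day = 6        # assumed viewing time
    years = 10               # expected lifetime of the set

    kwh_saved = watts_saved * hours_per_day * 365 * years / 1000
    print(f"~{kwh_saved:.0f} kWh over {years} years")  # ~219 kWh
    [/code]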

    HR20 test results:
    During bootup, before "gathering satellite data" displays on-screen: average 27 watts, went as low as 25, high as 30
    During bootup, after "gathering satellite data" displays on-screen: 35 watts
    After bootup, live TV on-screen: 34-37 watts, average 35 watts
    Completely slipped my mind to test "standby mode". I should have had a V-8!

    Conclusion: I'm guessing the receiver circuitry (which includes powering the switch/LNBs downstream) isn't turned on until the unit starts to gather data; the video processing, CPU, and hard drive are already fully active by then. Once the receivers are active, the usage jumps 10 watts. There's a possible chance to save some watts with the existing hardware design based on these observations, by disabling the receivers in standby mode 90% of the time: activate them for a minute out of every ten to listen for updates/commands/showcases the other 10% of the time. The capacity of the command channel(s) from the sats would come into play, since unit-specific commands would have to be repeated more often. There may be other savings possible with the existing hardware design, based on the wide fluctuation in power consumption observed; I wouldn't assume this if the consumption were the same the entire time.
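    The duty-cycle math on that proposal, as a sketch (the 10 W tuner draw is from my measurements above; the 10% listen cycle is just the scheme I'm suggesting):

    [code]
    # Average tuner power under a 10% listen duty cycle in standby.
    tuner_watts = 10   # extra draw once the receivers go active (measured)
    duty_cycle = 0.10  # awake 1 minute out of every 10 (proposed)

    avg_watts = tuner_watts * duty_cycle
    saved = tuner_watts - avg_watts
    print(f"average tuner draw in standby: {avg_watts:.0f} W, saving ~{saved:.0f} W")
    [/code]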
     
  12. JeffBowser

    JeffBowser blah blah blah

    2,549
    18
    Dec 21, 2006
    Nice write-up, thanks techntrek. I do appreciate what it takes to navigate a mess of cables and get behind a set. I still have a full-size 65" rear-projection as my main TV, sitting on carpet with a stack of crap on top. It takes quite an effort to move it to get behind.

    I had done my own crude power test of my setup, and with everything off, I was still drawing 43 watts. I now know 35 of that is the HR20. My uneducated opinion is that that's needlessly high, but without knowing more about the internal design, they may have had a reason for it.


     
  13. georgecostanza

    georgecostanza Mentor

    34
    0
    Jan 10, 2005
    Thanks, tech, that's the info I was looking for, though it would have been nice to get the standby power. 35 watts isn't too bad, though, considering most PCs run 100-130 watts without the monitor. Also, I was wrong in my earlier post: the sun delivers about 1000 W/m^2 at the surface, not 2000.
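    (Doing the math with the corrected figure; the 4,000 TWh/year US consumption number and the 5 peak-sun-hours/day are my assumptions:)

    [code]
    # Rough panel area needed to supply US electricity from solar,
    # using the corrected 1000 W/m^2 peak irradiance.
    irradiance = 1000       # W/m^2 at peak, corrected figure above
    efficiency = 0.20       # capture efficiency, from my earlier post
    sun_hours = 5           # equivalent peak-sun hours per day (assumed)
    us_twh_per_year = 4000  # rough US annual electricity use (assumed)

    wh_per_m2_per_year = irradiance * efficiency * sun_hours * 365
    area_km2 = us_twh_per_year * 1e12 / wh_per_m2_per_year / 1e6
    print(f"~{area_km2:,.0f} km^2 of panels")  # on the order of 11,000 km^2
    [/code]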
     
  14. JeffBowser

    JeffBowser blah blah blah

    2,549
    18
    Dec 21, 2006
    Since you bring it up twice, I must say: when I said solar is not enough, I meant it isn't commercially feasible with current technology, not that the entire sun's output would be insufficient :rolleyes:

     
  15. techntrek

    techntrek Godfather

    298
    0
    Apr 26, 2007
    I should have checked the standby on my TV, too, but since it's on a switched outlet that number would have been useless to me. It would have demonstrated the standby loss of a typical big-screen TV, though. I highly recommend the Kill-A-Watt. Using it I discovered my 18-year-old fridge was using 6.5 kWh per day. Ouch. I bought a new fridge that uses 1.5 kWh per day. That's 12.5% off my bill every month from now on, with one change. Back when I had 5 of the Hughes receivers, I found they each used 10 watts in standby (12 when "on"... again, room for big improvement). Putting them on switches saved me 27 kWh per month.
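    The fridge math checks out against my earlier 40 kWh/day figure, by the way; a quick sketch (the switched-receivers line assumes they were off about 18 hours a day):

    [code]
    # Fridge swap: daily savings as a share of my total daily usage.
    old_fridge = 6.5    # kWh/day, measured with the Kill-A-Watt
    new_fridge = 1.5    # kWh/day, measured
    total_usage = 40.0  # kWh/day, my figure from earlier in the thread

    saved = old_fridge - new_fridge
    print(f"saves {saved} kWh/day = {saved / total_usage:.1%} of my bill")  # 12.5%

    # The five Hughes receivers: 5 x 10 W, switched off ~18 h/day (assumed)
    print(5 * 10 * 18 * 30 / 1000, "kWh/month")  # 27.0 kWh/month
    [/code]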

    As for solar electric (PV): a recent Modern Marvels episode indicated that current-technology solar panels covering 10% of Nevada's land area would harvest enough electricity to supply the US. There are many facts missing there, though. Does this include transmission losses (put at about 20 kWh per year for the average home) if the panels aren't installed locally? Does this include storage losses from batteries/hydro/flywheels to cover nighttime use (you lose about 10% converting to batteries and back), or is it just for daytime use? Etc. Now that some of the big-oil tax breaks have been eliminated, we need to apply that money to the solar industry and other renewables. Renewables' only real disadvantage has been a nearly complete lack of funding from Congress, unlike oil and coal.
     
  16. Halo

    Halo Godfather

    266
    0
    Jan 13, 2006
    There are a few other ways for future models to save power.

    The HR20 uses a fairly old BCM7038/BCM7411 chipset. The newer BCM7400 is more than twice as powerful and also integrates two MPEG-4 decoders on-die (replacing the single-decoder 7411 chip). This would enable picture-in-picture while also improving menu, guide, and channel-change speeds. The BCM7400 is also a die shrink compared to the older chips (130nm to 65nm). Intel has shown with its newest generation of processors (Core 2) that CPU performance can be increased while cutting power from 120+ watts to under 40 watts. Reducing the chip count and board size (no need for the bus between the 7038 and 7411) saves a little power and cost.

    Even the largest modern hard drives use less than 10 watts. The Seagate 750 (with 4 platters) uses around 9.5 watts. The upcoming Samsung 1TB drive uses only 3 platters, with record data density. Using similar technology, a 320GB single-platter hard drive could be used for the HR20. In general, single-platter drives are the quietest, coolest-running, and most power-efficient drives. They are also cheaper to produce, more reliable in the long term, and (because of the higher data density) have faster read/write speeds.

    Poor power-supply design is very common in consumer electronics. A well-designed power supply can improve efficiency by several percent and extend component life and product reliability.
     
  17. Tom Robertson

    Tom Robertson Lifetime Achiever DBSTalk Club

    21,331
    246
    Nov 15, 2005
    I suspect by now most readers know I'm not a huge proponent of turning off a DVR's features to save some electricity. That said, I am interested in pondering the overall question from a technology and scientific inquiry standpoint. To that end I thought I would help the cause by taking some measurements since I had the means and a few extra moments tonight.

    I took one HR20 with a replacement disk inside as my test platform. First, I booted the unit normally, using an APC UPS as my watt meter. (Home Depot didn't have a Kill-A-Watt in stock, so I went with what I had.) At boot time the unit drew 27-28W, dropped two watts, then climbed to 32-34W while receiving info, then stabilized at 34-35W when watching live or recorded programming. I detected no difference in standby mode.

    I then unplugged the satellite cables, and the unit stabilized at 29W. Using a SWM connection rather than a direct connection, the unit also drew 29W.

    Then, to see what turning off the disk might represent, I unplugged the internal drive and used an external drive that was not included in the measurement. The unit stabilized at 25-26W.

    I did not take the time to test an external drive with SWM, but I'm fairly confident such a combination would use about 20W. So in theory, for some portion of the day, the HR20 could drop from approximately 34W to 20W by turning off the satellite tuners and the disk, if the user were willing to give up certain features.
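    To put a scale on that: a sketch of fleet-wide savings, using the 10M-DVR figure from earlier in the thread and assuming (purely for illustration) 8 hours a day in the reduced state:

    [code]
    # Fleet-scale savings if the HR20 dropped from ~34 W to ~20 W
    # (tuners + internal disk off) for part of each day.
    watts_saved = 34 - 20  # per unit, from the measurements above
    hours_low = 8          # hours/day in the reduced state (assumed)
    fleet = 10_000_000     # DVRs in the field, rough figure from earlier

    wh_per_year = watts_saved * hours_low * 365 * fleet
    print(f"~{wh_per_year / 1e9:.0f} GWh/year across the fleet")  # ~409 GWh/yr
    [/code]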

    BTW, the SWM used 20W by itself.

    Hope this helps,
    Tom
     
  18. techntrek

    techntrek Godfather

    298
    0
    Apr 26, 2007
    Sweet, even more hard-core measurements. Your numbers come pretty close to what I saw at various points, which suggests they should apply across the model line. Oh, I should note I have the HR20-700. Did you measure a -700 or a -100?

    It certainly proves it's possible to shave some watts, assuming the hardware can support it. It will be interesting to see how the HR21 does.
     
  19. JeffBowser

    JeffBowser blah blah blah

    2,549
    18
    Dec 21, 2006
    Funny thing about all this watt-shaving talk: I was reading this thread the other night while gazing out at my heated pool and the approximately 50 low-voltage lights scattered around my property. I amused myself thinking about going around switching my 35-watt HR20 off to save money, when I'm spending $$$ a month to keep my pool at my preferred 86 degrees and Lord knows how many watts driving 50 or so low-voltage lights so I can admire my coconut trees after the sun goes down :lol:
     
  20. CCarncross

    CCarncross Hall Of Fame

    7,058
    60
    Jul 19, 2005
    Jackson
    Don't forget, this is basically a special-purpose PC, just like any modern DVR. I don't think any of us could build a custom PVR that uses the same or less power AND still does everything the HR20 does; of course, I'm not talking about its ability to actually receive the sat transmissions, since the D* tuner PC cards aren't ready yet. Can you even get a Shuttle-type setup to use under 50W?
     