Discussion in 'DIRECTV HD DVR/Receiver Discussion' started by rwjs60, Mar 23, 2010.
So much for global warming.
What I find funny is these guys in Florida and So-Cal turning their heat on at night, when I turn my heat off at night here in IL (I used to turn it off; now my fiancée makes me keep it set to at least 60°). I was without power for 2 nights this winter after an ice storm and I still stayed at the house (it got down to about 40° at night). My fiancée stayed at the neighbors' house though (they are on a different branch that never lost power).
Didn't you get the note? It's "Global Climate Change" now.
In So. Cal, an HR2x might just help keep the room temp stable... LOL :lol:
Not a chance of that when the outdoor temp is zero deg F or below... Any residual heat off the DVR simply goes 'poof'.
I always get a good chuckle out of yearly temperature variation data for different areas that we visit...
Here in MN, our yearly temperature variation can easily be 120 degrees (from -20 to +100 degrees F) or more.
Compare that to places in the Caribbean that have a temperature variation of only about 40 degrees.
Ain't it fun living in a world of extremes...
Wow, if you're turning your heat completely off every night, it's no wonder we're paying more for heat and facing an energy crisis. You're wasting a lot of energy there, bub. It takes much more energy to warm a house that's been brought down to 50 degrees than it would, say, going from 62 back up to 68; your furnace runs for hours to make up that deficit. Your fiancée has it right by only turning it down a little bit. No one should be shutting their heat all the way off, especially in your neck of the woods: one misfire of the furnace and you're looking at busted pipes and a whole lot of damage.
I find this strange.
Yes it takes more energy to go from 50º to 68º than from 62º to 68º, "but" there is a savings by not heating the house to keep it at 62º.
I can't see how the heat loss and subsequent BTUs needed to return to 68º would be significantly different, unless the furnace would be MUCH MORE efficient under light load than heavy load.
From the US Dept of Energy:
We should prob get :backtotop
Thanks, and "yes".
Not to mention during the day the sun is out and warms the house back up part of the way without me having to run the heat.
I guess I shouldn't say I actually turned it off. I used to have it set to about 50°; that way I wouldn't risk the pipes freezing or anything, but the furnace usually wouldn't kick on at all overnight. Also, heat loss through conduction is proportional to the difference in temperatures. As the temperatures get closer (as the temp in the house drops), the heat loss goes down.
I turn my heat on when the temperature in the house drops below 70
But do you close the windows?
Ha HA HA HA!!!!!
Looking at the OP's first post, the OP remarks that the buffer showed he could go back for hours. The one question is whether it truly went back for hours or just the 90 minutes that the buffer should hold. If it did go back for hours, then there is definitely a problem with the receiver, as it would appear there was a runaway recording.
I'm not sure why it would take years of study to figure this out. Common sense would seem to indicate that how much energy you use is based pretty much solely on the heat loss of the structure, and there is less heat loss when the outside differential is lower (when outside temp is closer to inside temp). A furnace is not less efficient when the inside ambient temp is lower, either, so it makes no sense at all to think that this strategy would use more energy.
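The argument above can be sketched with a toy steady-state model based on the same idea (heat loss is proportional to the indoor/outdoor temperature difference). All the constants here are made up purely for illustration:

```python
# Toy overnight-setback model. Assumes steady-state Newton-style cooling:
# heat lost per hour is k * (indoor - outdoor), so the furnace must
# supply exactly that much to hold a setpoint. The loss coefficient k,
# outdoor temp, and hours are invented numbers, not real house data.

def furnace_energy(setpoint, t_out=20.0, hours=8, k=1.0):
    """Energy (arbitrary units) to hold `setpoint` degrees F for `hours` hours."""
    return k * (setpoint - t_out) * hours

hold_68 = furnace_energy(68)      # hold 68 °F all night
setback_50 = furnace_energy(50)   # let it idle at 50 °F instead

# The morning "recovery" burn roughly equals the heat the house gave up
# while cooling down, so it can't wipe out the setback savings.
print(hold_68, setback_50)  # 384.0 240.0: less is lost at the lower temp
```

Since the loss at 50° is always smaller than the loss at 68° for any positive k, the setback strategy comes out ahead in this model no matter what constants you pick.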
But then it's hard to figure some folks' reasoning. My mom (now 88) likes to leave the ceiling fan on to "keep the room cool" when she is gone, and will hear none of the explanation that a ceiling fan only works through evaporative heat loss from your skin, so it does nothing for an empty room. I also know folks who crank their thermostat to 90 degrees when they first come in the door, actually thinking it will warm the room up faster.
No wonder health care almost didn't pass. :eek2:
But you would only know this for sure if you had another control group of 75 drives that you regularly spin down. Your conclusion is far from scientific, and appears to be based on the fact that people like to think that how they choose to do things is the best way to do things. Because you want to think it is a better strategy does not support at all whether it is or is not.
I manage media servers with hundreds of drives, mostly in RAID configurations. But just because they don't spin down doesn't comfort me that they will last longer. Maybe they do, maybe they don't. I still end up replacing a lot of HDDs.
If the earlier numbers are accurate, who would not be willing to pay the extra 11 cents to lengthen the life of their HDD?
But actually, the jury is still out on whether a constantly-spinning HDD has a longer life than one that spins up and down. If there is a measurable difference, and no one really knows which would last longer, it is probably so insignificant as to not really matter.
To the OP, placing your DVR in standby probably saves a little energy. The old Replays consumed 25 watts when "on", and 21 watts in standby (but the HDD stayed spun up all the time unless you entered a code in an engineering menu). That probably extends to other DVRs as well.
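Using the ReplayTV numbers above (25 W on, 21 W in standby) and an assumed electricity rate of 11¢/kWh (picked only to match the figure mentioned earlier; your rate will differ), the standby savings work out to a few dollars a year:

```python
# Back-of-envelope standby savings. The 25 W / 21 W figures come from
# the old ReplayTV units mentioned above; the $0.11/kWh rate is an
# assumption for illustration.

ON_WATTS = 25
STANDBY_WATTS = 21
RATE_PER_KWH = 0.11       # assumed $/kWh
HOURS_PER_YEAR = 24 * 365

def annual_cost(watts, rate=RATE_PER_KWH):
    """Yearly cost in dollars of drawing `watts` continuously."""
    return watts / 1000 * HOURS_PER_YEAR * rate

savings = annual_cost(ON_WATTS) - annual_cost(STANDBY_WATTS)
print(f"${savings:.2f}/year")  # prints "$3.85/year"
```

A 4 W difference running around the clock is about 35 kWh a year, so the dollar amount stays small at any realistic rate.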
A study by someone (Google, I think, possibly Seagate) found that, on average, if a drive survives infant mortality (makes it through the first 4-6 months), the odds of it failing in the next two years are very slim. But after that it runs roughly an 8% chance of failure per year, and that compounds: by 8 years a drive has close to even odds of surviving (about 51% survival to 49% failure), and the cumulative chance of failure reaches roughly 60% by year 11 and 63% by year 12. In my experience, HDDs become obsolete and systems get replaced too quickly to test that. My guess would be that 5-6 years is pretty good, but my guess isn't scientific either, and Google/Seagate/whoever is probably more accurate.
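The compounding described above is easy to check directly, treating the 8% annual failure rate as a flat assumption (the real study reported rates that vary by drive age):

```python
# Compound an assumed flat 8% annual failure rate.
# Survival after n years is simply 0.92 ** n.

ANNUAL_FAILURE = 0.08  # assumed constant rate, for illustration

def survival(years, p_fail=ANNUAL_FAILURE):
    """Probability a drive is still alive after `years` years."""
    return (1 - p_fail) ** years

print(round(survival(8), 2))       # 0.51: close to even odds at 8 years
print(round(1 - survival(11), 2))  # 0.6: cumulative failure by year 11
```

Note this is cumulative failure; the chance of failing *in* any single year stays 8% under this assumption.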
Remember the old Seagate 60 MEGAbyte RLL drives (circa 1980?)? If you powered them down 90% of the time, you'd have to reach under them and spin the spindle to get them going again. Stiction..... I've never powered down a computer since then. :lol:
Is that because all modern HDDs are still plagued with the problem of one run of Seagates from 30 years ago? Guess what, I think they solved that little problem sometime in the previous century. Changing your behavior to avoid a problem that is very likely by now an imaginary one is a prime example of superstitious behavior.
It's very much like telephones. People old enough to have grown up in the post-war analog age, before the digital technology of the late 60's and 70's, remember volume levels so low that you might have to speak loudly to be heard, especially on long distance. Those who grew up later never experienced that problem, which is why people over about 60 still tend to talk loudly on today's telephones. They are still compensating for a problem that was fixed nearly half a century ago.
Geez man, lighten up. Your replies are more intense than Jack Bauer's. Didn't you see the :lol: ?
Aside from that, it was NOT "one run" of those drives. Those problems continued for a couple of years across virtually the entire Seagate RLL line, both full- and half-height models, from 10 MEGAbytes up to the whopping 70+ MEGAbytes. I still have scars on my knuckles from reaching into the bays and spinning the motors.
You must be thinking of the more recent IBM Deskstar "Deathstar" drives (circa 2000). The "SeaSnakes", as they were fondly known, were from back in the mid-80's IIRC.
Oh, and just FTR, I never turn off any computers. That's because one time back in 1979 I turned one off, and when I turned it back on, the power supply blew a cap and the Chinese capacitor paper went flying all over the room.
I apologize. Truly, I do. It is not my goal to upset you. But if you re-read carefully, I also did not dis you, not even a little bit. It was merely an anthropological observation of expected human behavior, not a condemnation of it. Sorry, but I find that sort of thing fascinating. When someone significantly alters their behavior for decades on end because there is the tiniest remote chance that Chinese capacitor paper might once again fly around the room creating a mini-Armageddon, I have to marvel at how human thinking actually works and at what drives behavior.
You are free to behave as you wish, of course. But there is still the option of evolving beyond certain behaviors. I think evolution is not something that just happens on its own, it takes self-realization and real effort. And it is our responsibility as well. Otherwise, what are we all really here for?
The phenomenon has always been "climate change" despite being referred to sometimes as global warming... since that is part of the causation. Even years ago when Al Gore was calling it global warming, his movie talked about it causing winters more extreme than anything anyone living has ever seen.
I suspect that dropping the "global warming" name mostly has to do with the crackpots who idiotically argue that because every winter isn't warmer than the previous one, global warming must be a scam... as if anyone anywhere ever claimed that the normal and expected variations in the severity of winters would no longer occur because of global warming.
That said, I don't want to give up my 24/7 live buffer to reduce my carbon footprint.