How can I convince my dad that a 550 watt power supply isn't a big difference from a 250 watt power supply?

So just the other day I bought a new graphics card for my computer, but I needed a better power supply to run it. Everything was going well until my dad found out that the power supply's rating had jumped from 250 watts to 550 watts. He said it would make the electricity bill fly up in price, and because I didn't want to be a pain I just agreed with him and took it out. I told some of my friends about the situation and they all said it wouldn't make much of a difference in cost, something about how the supply only converts as much mains power as the computer actually draws, rather than constantly pulling its full rating.

So my question is: am I right in thinking that the extra wattage and the graphics card won't cost that much more to run, and how can I put it into words so it's clear to him that it won't cost much more than what I was using before?

Graphics card: http://www.amazon.co.uk/Gigabyte-GTX560-PCI-E-Graphics-Card/dp/B0050MUTUY/ref=sr_1_1?ie=UTF8&qid=1356914279&sr=8-1

Power supply: http://www.amazon.co.uk/OCZ-OCZ550FTY-UK-Fatal1ty-Series-Supply/dp/B001MTYS06/ref=sr_1_1?s=computers&ie=UTF8&qid=1356914331&sr=1-1

Other information: he freaked out and told me that 550 watts is way too much to be using. I don't think he understands that 550 watts is the maximum the supply can deliver; my PC won't actually be drawing that amount, it's just headroom in case the computer needs it.

Thankful for any help please 🙂


9 Responses to “How can I convince my dad that a 550 watt power supply isn't a big difference from a 250 watt power supply?”

  1. Adrian says:

    550W is the "capacity" of the new PSU; it does not mean it will use that much. The power drawn will be whatever the system needs, no more. Adding a newer video card may increase power usage by 50 to 100W, depending on what video card you replaced or added…

    Even in the worst case of 100W more (which is likely for a GTX 560), and assuming you use the computer 10 hours a day, that is 1 kWh (kilowatt-hour) of extra energy per 10-hour day. In most western countries a kilowatt-hour costs between 5 and 20 cents.
    So the worst case is about 20 cents a day more, or about $6 a month more… If you use it for fewer hours, or electricity costs less than 20 cents a kilowatt-hour, it will be even less (maybe half if you use it half the time, or half again if it costs 10 cents/kWh). The arithmetic is worked through in the sketch at the end of this response.

    If your dad is concerned about the cost, offer to pay the $5 or $6 a month yourself, and his argument will be gone. Show him my calculations and assumptions and let him decide. If you offer to pay, and the cost is only $3 to $5 a month more (a fair worst-case estimate in my opinion), he may let it slide….

    The other way of looking at it: even if the 250W PSU was almost fully used (250W of actual draw), adding a video card that uses 50-100W more is only a 20-40% increase in the computer's overall power usage, not double, regardless of the PSU's size or power rating.

    EDIT: Bloody Hair has a similar argument, and it is just as valid. Show your dad both of our responses. I could have a 1000W power supply but draw only 150W from it; that draw is what the electricity usage is, not the rating of the PSU. Everyone else who says "it is double", etc. has no understanding of power usage or how computers actually draw power from the PSU. I'm a P.Eng., been into electronics since 1971….
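
    As a rough sketch of the arithmetic above, here it is in Python, using the same worst-case assumptions (100 W of extra draw, 10 hours a day, 20 cents per kWh); the figures are estimates, not measurements:

    ```python
    # Worst-case extra running cost from the new card (illustrative figures only).
    extra_watts = 100        # assumed extra draw from the GTX 560 over the old card
    hours_per_day = 10       # assumed daily usage
    price_per_kwh = 0.20     # assumed electricity price, upper end of 5-20 cents/kWh

    extra_kwh_per_day = extra_watts / 1000 * hours_per_day   # 1.0 kWh extra per day
    cost_per_day = extra_kwh_per_day * price_per_kwh          # $0.20 per day
    cost_per_month = cost_per_day * 30                        # about $6 per month

    print(f"Extra energy: {extra_kwh_per_day:.1f} kWh/day")
    print(f"Extra cost:   ${cost_per_day:.2f}/day, about ${cost_per_month:.2f}/month")
    ```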

  2. Obamavenger says:

    It’s more than double. Do the math.

  3. Gamingbugr says:

    Just tell him that the PC draws the same amount as before.

  4. Scatteredmist says:

    It's more than double, so in theory it will need an additional 300 watts. Don't they teach maths in school? Your dad's right, and he has to fork out for the electricity bill. Sorry, but I have to be honest.

  5. bloody hair says:

    Adding a power supply that can supply more watts won't automatically make it constantly drain that amount from the socket; it'll only drain what it needs to, and the moment it doesn't need to drain as much, the power consumption decreases.

    The GTX 560 peaks at around 160 watts, the CPU typically peaks at around 95 watts, and everything else in an average PC takes about 50 watts at peak (that means fans, HDDs, RAM, sound card, USB hubs, etc.).
    When you're not gaming, the graphics card and CPU both draw fewer watts.

    Maybe around 15-30 watts for the CPU at idle, or when you're browsing the internet or doing something similar that barely tasks the CPU.

    The graphics card idles at around 15 watts.

    Everything else will idle at around 10 watts.

    So here's how to convince him.

    First lure him in: tell him that yes, the wattage will go up since you added the graphics card, but it doesn't take anywhere near 550 watts. At peak it'll take only about 300 watts, and peak power consumption occurs very rarely, if ever, since pretty much nothing requires 100% of all your components' ability.

    The majority of the time, when you're browsing the internet or something, it'll take only 50-60 watts (the sums are added up in the sketch below).
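
    Adding up those per-component estimates makes the point concrete; a minimal Python sketch, treating the rough figures above as assumptions rather than measurements:

    ```python
    # Peak vs. idle draw for the system described above, in watts (rough estimates).
    peak = {"gtx_560": 160, "cpu": 95, "everything_else": 50}
    idle = {"gtx_560": 15, "cpu": 25, "everything_else": 10}  # CPU idle taken mid-range of 15-30 W

    print("Peak draw:", sum(peak.values()), "W")   # ~305 W, well under the 550 W rating
    print("Idle draw:", sum(idle.values()), "W")   # ~50 W, i.e. light web browsing
    ```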

  6. Who says:

    Of the first 3 answers, 2 are stupid; is this a record?

    Yep, 550 watts is 2x 250 watts, sure enough.

    But a power supply ain't like a light bulb. It DOESN'T just supply 550W, end of story; it will ONLY supply what is needed, up to its maximum.

    If the PC only requires 250W, then that's all it will supply (in fact it's possible it could be more efficient than the 250W unit, in that it could actually take LESS power from the wall than the 250W one did).

    The big thing comes if you plug more stuff into the PC so that it now requires, say, 350W.

    A number of things can happen if you only have a 250W power supply.
    The BEST thing that can happen is that it just shuts down and doesn't supply anything.
    The WORST thing that can happen is that you get "brown outs". That's when, say, the 12V rail only gets to 10V and the 5V rail only gets to 4V, so that the TOTAL output from it does not exceed 250W.
    This can be REALLY bad for electronic components, because they can get into a sort of "limbo land" where they are not switched on correctly, but the voltage is not low enough for them to be fully switched off. It CAN be a recipe for blown components.

    With a 550W power supply, NO PROBLEM at all; it just supplies the 350W (a toy sketch of this follows below).
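
    A toy Python sketch of that behaviour, assuming an idealised PSU that simply delivers whatever the system demands up to its rating (real overload behaviour is messier, as described above):

    ```python
    # Idealised model: a PSU delivers the demanded power, capped at its rating.
    def power_delivered(demand_w: float, rating_w: float) -> float:
        return min(demand_w, rating_w)

    print(power_delivered(350, 550))   # 350 -> the 550W unit covers it with headroom
    print(power_delivered(350, 250))   # 250 -> an overloaded 250W unit cannot; expect shutdown or brown-out
    ```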

  7. Cool PR says:

    The average cost of electricity is about $0.11 per kilowatt-hour, which means your computer will cost around 25 cents for 4 hours of use. Not much, in my opinion.

  8. ? says:

    That specification is PEAK.

    I have a video card rated at 250W, yet most of the time my entire computer doesn't draw as much as 175W.

    I KNOW how much it draws because the LCD display on my UPS battery backup
    tells me that (and other things) at a glance….

    I've never seen my high-end quad-core computer draw more than 400 watts off its 650-watt supply, and that was while running multiple video file transcoding operations that were all using the CUDA cores on the nVidia card at the same time.

    BTW, computer power supplies are "switching" power supplies and are very efficient, because they draw power in proportion to how much is demanded from them.

    The software for my UPS also tells me that my computer actually uses between 3 kWh and 4 kWh on an average day…

    Power is sold by the kWh, and as of 2010 electric rates ranged between $0.062 per kWh (Wyoming) and $0.2512 per kWh (Hawaii).

    My current electric rate, based on my last bill, is $0.12 per kWh, so my computer, which I leave running 24 hours a day, 365 days a year, costs me ~$0.50 a day (the sums are sketched at the end of this response).

    Because the hard drives and fans in my computer all run on "fluid dynamic" bearings, I know they actually wear more spinning down when powered off and spinning back up when turned on again ONCE than they do running for six months or so…

    So theoretically they will live much, much longer if I simply leave them running.

    My computer also automatically runs all its maintenance tasks, checking for updates, defragging the drives for which it is allowed (NOT my SSD, which does not require defragmentation) and virus scanning while I am sound asleep (and thus these necessary tasks do not interfere with my use of the computer).

    I don't worry about the power my computer uses, considering that I first replaced my large CRT monitor (a 22" Sony 4:3) with a 20" 4:3 LCD that cut power use from 195W to 25W, then replaced that LCD monitor with a 24" 16:9 LED-backlit LCD monitor that draws 7 watts (less than the previous LCD monitor drew on standby).

    I understand that there are diminishing returns in trying to reduce power usage past a certain point.

    Frankly, worrying about the details might save me some power, but I'd likely spend more money on antacids from the worrying. And I have better uses for my time.

    Some of that time will be used to leave you and everyone else with a little giggle….

    Consider this: your newer power supply is likely more efficient than your old one.
    On top of that, if your old PSU was ever pushed near its limits, you should know that once a PSU is running at 70% of its capacity (or more), efficiency drops.

    So it's entirely plausible that the new 550W supply uses LESS power in many situations than the older one did. :)
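
    A short Python sketch of the running-cost arithmetic in this response, taking the reported 3-4 kWh/day and $0.12/kWh figures as assumptions:

    ```python
    # Cost of leaving the machine on 24/7, using the figures quoted above.
    kwh_per_day = 3.5        # midpoint of the 3-4 kWh/day reported by the UPS software
    price_per_kwh = 0.12     # dollars per kWh, from the commenter's bill

    daily_cost = kwh_per_day * price_per_kwh
    print(f"~${daily_cost:.2f}/day, ~${daily_cost * 365:.0f}/year")   # about $0.42/day, $153/year
    ```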

  9. Jeff Northon says:

    Graphics cards are energy pigs. Why would they have to put all those fans and heat-removal fins on them if they weren't?

    Don’t tell him, or throw some money at the electrical bill.
