52A too much for a 20A device? What do I need to do?

Ask questions about projects relating to: aerodynamics or hydrodynamics, astronomy, chemistry, electricity, electronics, physics, or engineering

Moderators: kgudger, bfinio, MadelineB, Moderators

Locked
bigdog989
Posts: 2
Joined: Thu Sep 18, 2008 4:27 am
Occupation: Student
Project Question: n/a
Project Due Date: n/a
Project Status: Not applicable

52A too much for a 20A device? What do I need to do?

Post by bigdog989 »

Hi, my question today is: if I were to buy a remote-controlled relay kit off eBay that runs at 12 VDC and, I think, 20 A, would I be able to power it from inside my computer, which can output 12 VDC at 52 A?

I guess I’m worried my computer power supply will be too powerful for the device and blow it up.

Here is a link to the relay device I’m talking about - http://cgi.ebay.com.au/12-Channel-Wirel ... dZViewItem

Thanks
Craig_Bridge
Former Expert
Posts: 1297
Joined: Mon Oct 16, 2006 11:47 am

Re: 52A too much for a 20A device? What do I need to do?

Post by Craig_Bridge »

I would strongly recommend that you do some reading on Ohm's Law and Kirchhoff's Laws to understand some basic electrical circuits. A passive device like relay contacts doesn't require power; it simply switches power.

When I looked at the remote control package at the link you supplied, I saw that the unit has its own built-in power supply to power the receiver and the actuator coils on the relays. You don't need to provide any control power.

The controlled contacts provided by the 12 relay channels have contact ratings. Essentially: do not exceed the specified open-circuit voltage, and do not exceed the load current specified at 14 VDC and 120 VAC. If you exceed the open-circuit voltage, the contacts may arc and/or pit, significantly reducing their life expectancy. If you exceed the current ratings, you risk welding the contacts shut or melting the unit.
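As a rough illustration of checking a load against those ratings (a sketch only; the rating numbers below are placeholders, not from the eBay listing, so substitute the values printed on the relays in the actual kit):

```python
# Sanity-check a load against relay contact ratings.
# These rating values are placeholders; use the numbers printed
# on the relays in your actual kit.
CONTACT_MAX_VOLTS_DC = 14.0  # assumed DC contact voltage rating
CONTACT_MAX_AMPS = 10.0      # assumed contact current rating

def within_ratings(load_volts, load_amps):
    """Return True if the load stays inside both contact ratings."""
    return load_volts <= CONTACT_MAX_VOLTS_DC and load_amps <= CONTACT_MAX_AMPS

print(within_ratings(12.0, 1.0))   # a 12 V, 1 A fan -> True
print(within_ratings(12.0, 20.0))  # a 20 A load -> False
```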

What are you trying to switch on and off using this device (e.g. what is the controlled load)? The power requirements of the load will provide the information needed to determine if you are within the maximum contact ratings.

I doubt there is any simple way to deliver 20 amps to a single load from a computer power supply, let alone 52 amps. Fortunately, I doubt you have a 20-amp load to power, either.
-Craig
MFagan
Former Expert
Posts: 10
Joined: Sun Sep 07, 2008 1:16 pm
Occupation: College Engineering Student
Project Question: n/a
Project Due Date: n/a
Project Status: Not applicable

Re: 52A too much for a 20A device? What do I need to do?

Post by MFagan »

I would recheck the rating of the computer power supply. I could understand 5.2 amps, but I have never seen a computer power supply that could source 52 amps at any voltage. 5 amps seems about right.
bigdog989
Posts: 2
Joined: Thu Sep 18, 2008 4:27 am
Occupation: Student
Project Question: n/a
Project Due Date: n/a
Project Status: Not applicable

Re: 52A too much for a 20A device? What do I need to do?

Post by bigdog989 »

I guess I’ll start with what MFagan said, since computers are my area of expertise. A computer even five years old only needed about 120 W to run. A computer runs off several voltages, including 12 V, 5 V, and 3.3 V, and that 120 W is divided between them. As an example, let's say the 12 V part used all of that wattage: 120 W / 12 V = 10 A. Because that wattage is also shared with the other voltages, 10 A is an overestimate; it would more likely be around 5 A.

Today's computers are far superior, not only on the data-processing side but also on the video-rendering side, so they draw far more current than ever before. One fix was to raise the wattage and create two or more 12 V rails, each rail having its own independent current limit not shared with the other rails. This also enabled manufacturers to stay within safety standards. It was a good idea, but many extreme gamers and overclockers found they needed more current on certain rails, so some companies went back to the original single-12 V-rail design with a major boost in amperage and wattage.
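The arithmetic above can be checked in a couple of lines (a sketch using the 120 W example figure from this post):

```python
# Upper bound on 12 V rail current if the entire 120 W budget went to it.
watts = 120.0
volts = 12.0
amps = watts / volts
# The real 12 V share is lower, because the 5 V and 3.3 V rails
# consume part of the same wattage budget.
print(amps)  # 10.0
```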

It just so happens I bought one of these top-of-the-line power supplies a few weeks ago; have a look at this picture :) [attached image: power supply label]


Getting back to Craig_Bridge: I do see that the receiver has a small battery on it, but I also notice two capacitors next to it, sitting beside a pair of connectors labeled + and -, which I thought were used to power the device, with the battery acting as a backup supply. Not all of these devices I found on eBay have batteries on the receiver.

Each relay has three connectors. I believe they are normally referred to as normally open, normally closed, and common; is that right?

I only want to power a few computer case fans, neons, and LEDs, all of which are currently being powered by the power supply. I guess I should say they are all 12 VDC and require very low current, none above 1 A.

Is it possible to have too much amperage if it's not being drawn by a device?

Thanks to all that have replied, much appreciated
MFagan
Former Expert
Posts: 10
Joined: Sun Sep 07, 2008 1:16 pm
Occupation: College Engineering Student
Project Question: n/a
Project Due Date: n/a
Project Status: Not applicable

Re: 52A too much for a 20A device? What do I need to do?

Post by MFagan »

Alright, I am just used to the older style of power supplies that were quite limited. That Corsair is a pretty powerful one (they're a well known brand). It sounds like you've got the rest pretty well worked out.
Craig_Bridge
Former Expert
Posts: 1297
Joined: Mon Oct 16, 2006 11:47 am

Re: 52A too much for a 20A device? What do I need to do?

Post by Craig_Bridge »

bigdog989 wrote: I guess I’m worried my computer power supply will be too powerful for the device and blow it up.
Just because something is CAPABLE of doing something doesn't mean it WILL. I reiterate:
Craig_Bridge wrote: I would strongly recommend that you do some reading on Ohm's Law and Kirchhoff's Laws to understand some basic electrical circuits.
For example, if you have an LED with a series current-limiting resistor designed to limit the current from a 12 VDC supply to the LED to 2 mA, then, absent a short circuit, the maximum current draw will be 2 mA as long as the power supply voltage is 12 VDC. If the forward voltage drop of the LED is 0.7 VDC, then the voltage drop across the series resistor will be 11.3 VDC. If the current is 2 mA, then the resistor must be about 5.6 kΩ. This was a straightforward application of Ohm's Law. Now, if you apply Kirchhoff's Law to this circuit, the current flow in any part of the circuit will be 2 mA. In other words, your high-powered computer power supply will only be delivering 2 mA if this is the only circuit attached.
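The Ohm's Law steps above can be sketched numerically (using the same 0.7 V forward-drop figure from this post; many real LEDs drop more, so check the data sheet for your part):

```python
# Series resistor for an LED on a 12 V supply, per Ohm's Law: R = V / I.
supply_v = 12.0
led_drop_v = 0.7    # forward drop assumed in the post above
current_a = 0.002   # 2 mA design current

resistor_drop_v = supply_v - led_drop_v        # 11.3 V across the resistor
resistance_ohms = resistor_drop_v / current_a  # R = V / I
print(resistance_ohms)  # 5650.0, i.e. about 5.6 kΩ
```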

Because you are so concerned with burning something up, let's think about a failure case. What if the series current-limiting resistor is shorted out? In that case, 12 VDC will be applied directly to the LED, and there will be a 12 VDC drop across the LED. If you looked up the LED's data sheet for how much current it would draw with a 12 VDC drop, you would find it is off the scale; the LED's power rating would be exceeded, and the likely outcome is a very small explosion of the LED's plastic housing. Nothing dangerous to people, but the LED would be destroyed and you'd have the smell of burnt silicon.

Typically, circuits are protected from faults by fuses or circuit breakers. In your case, if you want to ensure that your circuit won't have the power to do extensive damage, consider purchasing a fuse and fuse holder and wiring it in series with your power supply.
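One rough way to pick a fuse rating for a low-current fan/LED circuit like the one described (a sketch only: the load currents here are hypothetical, and the 50% headroom figure is a rule-of-thumb assumption, not a spec):

```python
# Sum the expected load currents, add headroom so the fuse survives
# normal operation, then pick the next standard fuse size above that.
loads_amps = [0.2, 0.2, 0.3, 0.1]          # hypothetical fan/neon/LED currents
standard_fuses = [0.5, 1.0, 2.0, 3.0, 5.0] # common miniature fuse ratings

total = sum(loads_amps)   # about 0.8 A expected steady draw
margin = total * 1.5      # 50% headroom (assumed rule of thumb)
fuse = next(f for f in standard_fuses if f >= margin)
print(fuse)  # 2.0
```

A fuse chosen this way still blows well below the supply's 52 A capability, which is the point: the fuse, not the supply's maximum output, sets how much current a fault can draw.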
-Craig
