
# How do transformers increase voltage and lower current? Why does current not go up?

1. Normally, if you increase the voltage from a power supply, the current also increases.

However, I remember from GCSE that transformers use very high voltage and low current, but how is that possible? How can you increase the voltage without the current also increasing?
2. (Original post by blobbybill)
Normally, if you increase the voltage from a power supply, the current also increases.

However, I remember from GCSE that transformers use very high voltage and low current, but how is that possible? How can you increase the voltage without the current also increasing?
Your first sentence sounds like a description of Ohm's law - current through a resistor is proportional to the p.d. across the resistor... however, that's not a feature of power supplies in general.

The best you can get is 100% efficiency which means
power in = power out

power = IV

what people usually mean when they talk about transformers increasing voltage and decreasing current is the current needed to deliver a given amount of power.
so if V is 230V and I is 13A, that's a power of 2990W

to get the same power at 23V would mean a current of 130A - a fairly large current that'd need thick (and therefore expensive, copper isn't free) wires to carry it
and at 2300V it would be 1.3A, which you could carry on really thin wires

this doesn't mean the current through a resistor goes down as the voltage goes up, though - see Ohm's law
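The arithmetic above is easy to check for yourself. Here's a minimal Python sketch (the function name `current_for_power` is just for illustration) computing the current needed to deliver that same 2990W at each voltage, assuming an ideal lossless supply where P = IV:

```python
# Current needed to deliver a fixed power P at a given voltage,
# assuming an ideal (lossless) system where P = I * V.
def current_for_power(power_w, voltage_v):
    return power_w / voltage_v

P = 2990  # W, the 230 V x 13 A example above

for v in (23, 230, 2300):
    print(f"{v:>5} V -> {current_for_power(P, v):.1f} A")
# 23 V -> 130.0 A, 230 V -> 13.0 A, 2300 V -> 1.3 A
```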
3. (Original post by Joinedup)
Your first sentence sounds like a description of Ohm's law - current through a resistor is proportional to the p.d. across the resistor... however, that's not a feature of power supplies in general.

The best you can get is 100% efficiency which means
power in = power out

power = IV

what people usually mean when they talk about transformers increasing voltage and decreasing current is the current needed to deliver a given amount of power.
so if V is 230V and I is 13A, that's a power of 2990W

to get the same power at 23V would mean a current of 130A - a fairly large current that'd need thick (and therefore expensive, copper isn't free) wires to carry it
and at 2300V it would be 1.3A, which you could carry on really thin wires

this doesn't mean the current through a resistor goes down as the voltage goes up, though - see Ohm's law

Even if you had really thick copper wires capable of carrying a current of 130A, how would you actually get the current to be that high with a small voltage? I know Ohm's law, so how would it even be possible to have a high current and a small voltage when the two are proportional?
4. (Original post by blobbybill)
Even if you had really thick copper wires capable of carrying a current of 130A, how would you actually get the current to be that high with a small voltage? I know Ohm's law, so how would it even be possible to have a high current and a small voltage when the two are proportional?
The resistance of a copper wire is inversely proportional to its cross-sectional area - so your 130A wires would need to contain about 10 times more copper for a given length to avoid overheating.

you'd connect load components designed for that voltage across your supply.
they'd have a lower resistance and therefore pass a higher current

e.g. a 24V 40W light bulb is just as powerful as a 240V 40W light bulb... but its resistance is lower and so it carries a higher current.
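You can work out those bulb resistances explicitly. A rough Python sketch (the helper names are made up for illustration), combining P = IV with Ohm's law V = IR to get R = V²/P:

```python
# Operating resistance and current of a bulb from its rated voltage and power.
# From P = I * V and V = I * R it follows that R = V**2 / P and I = P / V.
def bulb_resistance(voltage_v, power_w):
    return voltage_v ** 2 / power_w

def bulb_current(voltage_v, power_w):
    return power_w / voltage_v

for v in (24, 240):
    print(f"{v} V 40 W bulb: R = {bulb_resistance(v, 40):.1f} ohm, "
          f"I = {bulb_current(v, 40):.3f} A")
# the 24 V bulb is 14.4 ohm drawing 1.667 A;
# the 240 V bulb is 1440.0 ohm drawing 0.167 A
```

Same 40W either way - the low-voltage bulb just gets it with a much lower resistance and a much higher current.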
5. (Original post by Joinedup)
The resistance of a copper wire is inversely proportional to its cross-sectional area - so your 130A wires would need to contain about 10 times more copper for a given length to avoid overheating.

you'd connect load components designed for that voltage across your supply.
they'd have a lower resistance and therefore pass a higher current

e.g. a 24V 40W light bulb is just as powerful as a 240V 40W light bulb... but its resistance is lower and so it carries a higher current.
Ok. But I still don't get how the transformer would know the power it is aiming for. I get that a 24V 40W bulb is the same power as a 240V 40W bulb, but I don't get how the transformer would know which power (which wattage) output it is aiming for?

And yeah, I get how a high current/low voltage supply can deliver the same power as a high voltage/low current one, but why is that different from Ohm's law? Ohm's law says that if you increase the voltage, the current increases too. I really don't get this.

AKA, if you had two transformers - 10V and 50V. Going from 10V to 50V, how would it know whether it is going to follow Ohm's law (and also increase the current, because the voltage has increased), or whether it is going to use a current that means it can reach the desired power output?
6. (Original post by blobbybill)
Ok. But I still don't get how the transformer would know the power it is aiming for. I get that a 24V 40W bulb is the same power as a 240V 40W bulb, but I don't get how the transformer would know which power (which wattage) output it is aiming for?

And yeah, I get how a high current/low voltage supply can deliver the same power as a high voltage/low current one, but why is that different from Ohm's law? Ohm's law says that if you increase the voltage, the current increases too. I really don't get this.

AKA, if you had two transformers - 10V and 50V. Going from 10V to 50V, how would it know whether it is going to follow Ohm's law (and also increase the current, because the voltage has increased), or whether it is going to use a current that means it can reach the desired power output?
There isn't a choice about which laws get obeyed... really it's a choice about what you hold constant and what you let vary. The load resistance doesn't have to be constant - if you want to hit a specific power output at a given voltage, you pick a resistor to match.
if you want to keep the resistance constant and see what happens as you vary the p.d. across it, that's fine too - but be aware that you've chosen to keep the resistance constant; constant load resistance isn't a law

tbh it'd probably be more concrete and understandable to look at some actual questions for now, rather than trying to mentally juggle the concepts.
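One way to make "choose what's constant" concrete is to compute both scenarios side by side. A hypothetical Python sketch (the function names are mine, not anything standard):

```python
# (a) Hold the resistance constant and vary V: Ohm's law, I = V / R,
#     so raising the voltage raises the current.
def current_fixed_resistance(voltage_v, resistance_ohm):
    return voltage_v / resistance_ohm

# (b) Hold the target power constant and vary V: pick the load so that
#     P = V**2 / R, i.e. R = V**2 / P, so raising the voltage lowers the current.
def resistance_for_power(voltage_v, power_w):
    return voltage_v ** 2 / power_w

# (a) the same 10 ohm resistor at 10 V and then at 50 V
print(current_fixed_resistance(10, 10), current_fixed_resistance(50, 10))
# current rises from 1.0 A to 5.0 A

# (b) the same 50 W target at 10 V and then at 50 V
for v in (10, 50):
    r = resistance_for_power(v, 50)
    print(f"{v} V needs R = {r} ohm, drawing {v / r} A")
# current falls from 5.0 A to 1.0 A, because the load was changed
```

Both runs obey Ohm's law at every point; they just differ in whether R or P was the thing you decided to keep fixed.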
7. Whenever you have a problem like this, first think about the basic principle of conservation of energy. Energy into a system has to equal energy out of a system. Since power is E/t, power in has to equal power out - this is the fixed quantity in the system. With a transformer you are changing the voltage, so the current has to change in inverse proportion to ensure that energy is conserved, as P = IV.
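That conservation argument can be sketched numerically. Assuming an ideal (lossless) transformer, and using the standard turns-ratio relation Vs/Vp = Ns/Np, which this thread doesn't derive - the 100:1000 winding counts below are made up for illustration:

```python
# Ideal transformer: the voltage ratio follows the turns ratio, and
# conservation of energy forces V_p * I_p = V_s * I_s.
def secondary_voltage(v_primary, n_primary, n_secondary):
    return v_primary * n_secondary / n_primary

def secondary_current(v_primary, i_primary, v_secondary):
    # power in = power out, so I_s = V_p * I_p / V_s
    return v_primary * i_primary / v_secondary

v_p, i_p = 230.0, 13.0  # 2990 W in
v_s = secondary_voltage(v_p, n_primary=100, n_secondary=1000)
i_s = secondary_current(v_p, i_p, v_s)
print(v_s, i_s)  # stepped up to 2300.0 V at 1.3 A - still 2990 W out
```

The transformer never "aims" for a power: the load connected to the secondary draws whatever current its resistance dictates at 2300V, and conservation of energy then fixes the primary current.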

Updated: January 26, 2017