If the cathode is required to be heated, why do they use a low-voltage power supply when they could use a high-voltage supply with higher current and therefore more heat?
They cheat by using the transformer trick: massive V, low I, same amount of power. So you are far better off with 400 V at, say, 2 A than with 400 kV at 0.02 microamps. It's far easier to heat the cathode with that than with the EHT, which carries only the tiniest current.
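For concreteness, here is the heating power P = VI in each case, using the numbers quoted above (they're just the example figures from this post):

$$P_{\text{low-V}} = 400\ \text{V} \times 2\ \text{A} = 800\ \text{W}$$
$$P_{\text{EHT}} = 400\,000\ \text{V} \times 0.02\ \mu\text{A} = 4\times 10^{5}\ \text{V} \times 2\times 10^{-8}\ \text{A} = 8\ \text{mW}$$

Despite the enormous voltage, the EHT delivers about a hundred thousand times less power, which is why it's useless for heating a filament.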
So you're saying that in all cases, when there is a low voltage, there is a high current?
So the current from 6 V is enough to heat up a cathode? How? Wouldn't 2 A still be very cold?
thanks
No, I'm not saying that. I'm just saying it's not practical to use the tube's own (EHT) current to heat the cathode unless you have a shedload of power; it's obviously not enough to boil electrons off. So what you do is have a second heat source (a flame does the job well) trained on the wire, which allows the electrons to be boiled off.
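To put numbers on the 6 V question: what matters is power, not current or voltage alone. The heater filament has a low resistance, so even a 6 V supply drives a large current through it. As a rough sketch, assuming a hot-filament resistance of about 3 Ω (an illustrative figure, not from any datasheet):

$$I = \frac{V}{R} = \frac{6\ \text{V}}{3\ \Omega} = 2\ \text{A}, \qquad P = I^{2}R = (2\ \text{A})^{2} \times 3\ \Omega = 12\ \text{W}$$

12 W dissipated in a thin wire easily makes it glow red-hot; 2 A is neither "hot" nor "cold" on its own, the temperature comes from the I²R dissipation in the filament. The EHT side, by contrast, only carries microamps, which is why it can't do the heating job.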
So what the hell is the point in having a low-voltage supply to heat the cathode when it doesn't do ****?