Hey guys, I've come across this question, and although a somewhat shady method gives me the correct answer, I would appreciate it if anyone could provide a detailed working.
"Mains-operated power supplies have large-value capacitors to help keep the output voltage constant. In a full-wave rectified supply without a capacitor, the output falls to zero every 10 milliseconds. If the output must be maintained at at least 90% of its maximum value when a load of 1.0 kilohm is connected to the output terminals, show that the minimum capacitance needed in the power supply circuit is about 100 microfarads."

(Note: my source says "10 microseconds", but that must be a typo for 10 ms, since full-wave rectification of 50 Hz mains gives a ripple period of half the 20 ms mains period, and the stated answer of 100 µF only works out with 10 ms.)
Quick answers, even brief ones, will be wholly appreciated.
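For what it's worth, here is a sketch of the standard approach I've been trying: assume the capacitor discharges exponentially through the load between successive rectified peaks, and require that the voltage has not dropped below 90% of the peak by the time the next peak recharges it. The numbers below come straight from the problem statement (with the 10 ms ripple period):

```python
import math

# Assumption: between rectified peaks the capacitor discharges
# exponentially through the load, V(t) = V0 * exp(-t / (R*C)).
T = 10e-3      # ripple period of full-wave rectified 50 Hz mains, in seconds
R = 1.0e3      # load resistance, in ohms
ratio = 0.90   # output must stay at or above 90% of the peak

# Require V(T) >= 0.9 * V0:
#   exp(-T / (R*C)) >= 0.9
#   => C >= T / (R * ln(1/0.9))
C_min = T / (R * math.log(1 / ratio))
print(f"Minimum capacitance: {C_min * 1e6:.0f} microfarads")
```

This gives roughly 95 µF, i.e. "about 100 µF" as the problem claims. A common shortcut is to linearise the exponential (exp(-x) ≈ 1 - x for small x), which gives C ≥ T/(0.1·R) = 100 µF exactly; perhaps that is the "shady" method in question.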