@Ace of flames01: Volts are a measure of the electrical potential difference between two points, such as between the two slots in a wall outlet. Amps (amperes) are a measure of the flow of electric current. If a circuit contains a resistive element, such as the element in a heater, the current flowing through that element equals the voltage across it divided by its resistance (Ohm's law: I = V/R).
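That relationship can be sketched in a couple of lines. The 120 V and 12-ohm heater values below are illustrative assumptions, not figures from the discussion:

```python
def current_amps(voltage_volts: float, resistance_ohms: float) -> float:
    """Current through a resistive element, per Ohm's law: I = V / R."""
    return voltage_volts / resistance_ohms

# A hypothetical 12-ohm heater element plugged into a 120 V outlet:
print(current_amps(120.0, 12.0))  # -> 10.0 amps
```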
Oh... I got sidetracked here:
A typical lightning bolt bridges a potential difference (voltage) of several hundred million volts.
Most measurements have been in the range of 5,000 to 20,000 amps, but a famous strike just before the Apollo 15 launch in 1971 was measured at 100,000 amperes by magnetic links attached to the umbilical tower. Currents over 200,000 amps have been reported.
From articles in Windpower Engineering & Development, we learn that lightning bolts carry from 5 kA to 200 kA, and voltages vary from 40 kV to 120 kV. So if we take round mid-range values, say 100 kA and 100 kV, this bolt would carry this much power, P:
P = 100 × 10^3 A × 100 × 10^3 V
  = 10,000 × 10^6 VA, or watts
  = 1 × 10^10 W
Recall that 10^10 watts is 10,000,000,000, or 10 billion watts.
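The multiplication above checks out in one line, using the 100 kA and 100 kV figures chosen as averages:

```python
current_a = 100e3  # 100 kA in amperes
voltage_v = 100e3  # 100 kV in volts

power_w = current_a * voltage_v  # P = I * V
print(power_w)  # -> 10000000000.0, i.e. 1e10 W (10 billion watts)
```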
Now assume this power is delivered for 1 second, so the energy released is 10^10 W·s (watt-seconds, which are just joules). On your electric bill, you'll see you pay for watt-hours, or Wh. So let's convert W·s to Wh:
E = 10^10 W·s × (1 hr / 3600 s)
  = 1/36 × 10^8 Wh
  ≈ 0.0278 × 10^8 Wh
  ≈ 2.8 × 10^6 Wh, or watt-hours, per our average lightning bolt.
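The unit conversion is easy to verify. This sketch assumes, as above, that the full 10^10 W is delivered for exactly 1 second:

```python
power_w = 1e10      # 10 billion watts, from the P = I * V step
duration_s = 1.0    # assumed discharge time of 1 second

energy_ws = power_w * duration_s  # watt-seconds (joules)
energy_wh = energy_ws / 3600.0    # 3600 seconds per hour
print(round(energy_wh))  # -> 2777778, about 2.8 million watt-hours
```

So one such bolt carries roughly 2.8 MWh, matching the hand calculation.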