Hacker News
A lightning bolt would be worth only about a nickel (2015) (engineering.mit.edu)
55 points by porlune on June 23, 2022 | 120 comments



> The average lightning strike contains about 1 million joules, enough energy to fry the founding father in his boots. “The typical house in the U.S. has 100 amp service or about 28 horsepower,” says Kirtley.

Boy, do I get frustrated when people mix compatible units without converting between them. The unit that I hate more than any other unit in the universe is the kWh, which is dimensionally equivalent to the joule, so I don't understand why we don't just use that instead.

"The typical house in the U.S. has 100 amp service or about 28 horsepower" -- seems that it would be way more interesting to say that "the typical house has 100 amp service at 120V, which means 12,000 J/s".

The way the original quote is phrased (and the introduction of horsepower of all things) seems insane to me; the clarification adds zero value. You still haven't addressed the main question, which is "is the energy in a lightning bolt a significant amount of energy compared to household usage". For all I know 28 horsepower is 1,000,000 J/s, so a lightning bolt would only power a house for a second.

EDIT: as many commenters have pointed out, apparently most houses get 240V service, so just double the number above. Still, this is easily fixable, and the main point is that horsepower does not add any value to this discussion.
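To close the loop on that question, here's a quick back-of-envelope in Python (assuming the article's ~1 MJ per bolt and 100 A service at 240 V):

```python
# How long would one average lightning bolt run a maxed-out US house?
bolt_energy_j = 1_000_000        # ~1 MJ per strike, per the article
house_power_w = 100 * 240        # 100 A service at 240 V = 24 kW

seconds = bolt_energy_j / house_power_w
print(f"{seconds:.0f} s at full load")   # ~42 seconds
```

So even at the corrected 240 V, one bolt runs a fully loaded house for well under a minute.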


> 100 amp service at 120V

Strictly speaking, 240V. Normal electric service in North America is 240V split-phase, with the distribution transformer's center tap grounded and serving as the neutral line. We normally only use the full 240V for heavy loads like electric ovens, arc welders, large air conditioners, and such.

Large buildings often use 208V three-phase power, yielding 120V phase-to-neutral, and large commercial lighting installations are often 277V taken from one leg of a 480V three-phase feed. Voltages greater than 240 are not permitted in residential service, and I wouldn't be surprised if phase-to-neutral > 120 is out as well for homes.


Do you work in clean energy tech by any chance? Software people who know these things are like unicorns


There's probably a large overlap between software people and people who subscribe to Technology Connections: https://www.youtube.com/watch?v=jMmUoZh3Hq4


The residential side can be discovered if you're a DIY homeowner.


The kilowatt hour is a fantastic unit when talking about electrical consumption.

Are you running a 100W load (0.1kW) for an hour? That's 0.1kWh. Running it for ten would make a whole kilowatt hour.

This allows for easy calculations of how much something is going to cost in electricity, and the units are such that it's easy to do the math in your head.
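The mental arithmetic above, spelled out as a trivial sketch (the $0.15/kWh price is an assumed example):

```python
def energy_cost(load_w, hours, price_per_kwh):
    """kWh = kW * hours; cost = kWh * price."""
    kwh = load_w / 1000 * hours
    return kwh, kwh * price_per_kwh

# The parent's example: a 100 W load for ten hours is one kWh.
kwh, cost = energy_cost(100, 10, 0.15)
print(f"{kwh} kWh, ${cost:.2f}")
```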


You're using a toy example to strawman the Joule counterpoint. How much do you use in a month, which is the typical billing cycle? Off-peak vs. on-peak cost?

In the end there's no practical utility to this. We just pretend that we're living in a world where you get a 100W light bulb and know exactly what it will cost you, and not a world where half your bulbs claim to be 100W but are actually 14W with 100W-incandescent-equivalent and such.


> You're using a toy example to strawman the Joule counterpoint.

Not a toy example. That's exactly how I estimate energy consumption for off the shelf devices. And for battery life (W*h but still).

> 100W light bulb and know exactly what it will cost you, and not a world where half your bulbs claim to be 100W but are actually 14W with 100W-incandescent-equivalent and such.

If you've lived with lighting you're responsible for, you've replaced bulbs. You know the different technologies and the packages say how much power they require.


Same here, I find the kWh to be a very useful unit for daily calculations of that sort, where I want a quick upper bound on things to make estimations.

Last year I switched off my fully-functional 2008 workstation (a lovely Fujitsu Celsius W370 on OpenBSD, a furry joy) because of such an upper bound difference (300W vs 65W for the ThinkCentre that hides among the books on my desk's side).

This sort of works in a similar way with light bulbs as well. Although lumen would be the appropriate unit for luminosity, the packaging uses wattage to indicate luminosity.

Although lumens and watts are correlated, they aren't dimensionally equivalent the way joules and kilowatt-hours are (CMIIW).

That "100W" on the package of an electrically 14W bulb simply means "it's only using 14W, but shines like a 100W bulb, go ahead, BOGOF".


> 300W vs 65W

A watt to watt comparison is fine. Why hours? I can tell you right now that the big one uses ~4.5x the power. Is it really that much easier to convert the time you're using the device to seconds? If you're going to multiply by the electricity cost anyway, might as well break out the calculator one step early.


I use watts when I want to compare power, and watt-hours when dealing with energy. Hours because hours are a lot closer to the real spans of time I use, and therefore much easier to calculate with. E.g. it's more common to run a 3kW AC for 3h than 3s. So, 9 kWh instead of 32,400 kWs for the same time span.

> If you're going to multiply by the electricity cost anyway

I don't usually convert to money. If it's a linear cost per kWh, then I can deal with that at the end or, more likely, don't actually really care.

The _actual_ cost you pay is often tiered anyway, so who knows what the price _actually_ is until you have the whole month's worth of power usage finalized. And then it's not clear what I could account to what tier.

> might as well break out the calculator one step early.

Bold to assume I use a calculator much.


Yeah exactly, I didn't even need to take time into account, I was looking for the upper bound presuming it's always on with a simple Wattage comparison.

But that was mostly because that workstation, albeit lovely, had a very long boot process.

The ThinkCentre boots in under a minute, so I actually end up only booting it when needed.

(Some of my work can be done offline, and I jump at every opportunity to `halt -p` and be in the quiet offline space/state.)


>In the end there's no practical utility to this. We just pretend that we're living in a world where you get a 100W light bulb and know exactly what it will cost you, and not a world where half your bulbs claim to be 100W but are actually 14W with 100W-incandescent-equivalent and such.

Where I live, the pricing is based on the amount of energy "consumed" during a month. Plus, we use 220V.

I use the kWh extensively. My induction cooker is rated at 1300 watts, so I know that by running it for 1 hour I am consuming 1.3 kWh.

My monthly "consumption" before installing 5 kW of on-grid solar used to be around 300 kWh during the summer months, so over time I have learned to "reduce" my monthly usage, aka kWh, by replacing my 2 kW electric hot water geyser with a solar water heater.

The kWh is definitely a good indicator for me.


To take your example, your 14 Watt lightbulb would use about 0.35kWh per day if running all day, or about 10kWh per month of nonstop usage. At $0.15/kWh, that's about a dollar and fifty cents, assuming you'd want to leave it on all day and night.

All of that math can be done in your head too, if you're willing to approximate the number of hours in a day to 25 and the numbers of days in a month to 30.
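The same arithmetic as a sketch, using the comment's numbers ($0.15/kWh assumed):

```python
bulb_w = 14          # the "100W-equivalent" LED
price = 0.15         # assumed $/kWh

kwh_day = bulb_w / 1000 * 24
kwh_month = kwh_day * 30
print(f"{kwh_day:.2f} kWh/day, {kwh_month:.1f} kWh/month, ${kwh_month * price:.2f}")
```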


I'll take one more shot at explaining why this continues to be a strawman.

Why would I be running the light all day? The assumptions being made here are unrealistic enough that the answer becomes meaningless. The realistic question here, "how much money would I save by switching from a 100W incandescent bulb to a 14W LED bulb" is not helped by any of this kWh nonsense.


Maybe your utility company bills in joules, but mine does in kWh. You can easily figure how many kilowatts your bulb does (divide by 1000), and you can come up with an estimate of how many hours you use it in a day/month/year, which is then easy to turn into an actual dollar amount.

A joule is just a watt second instead of a kilowatt hour, and the 3600 factor (seconds per hour) is really annoying to use in mental math.
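The conversion in one place, for anyone who wants to sanity-check that 3600 factor (the five-hours-a-day usage is an assumed example):

```python
joules_per_kwh = 1000 * 3600       # 1 kW sustained for 3600 s = 3.6 MJ

bulb_kw = 100 / 1000               # a 100 W bulb
kwh = bulb_kw * 5 * 30             # five hours a day for a month
joules = kwh * joules_per_kwh
print(f"{kwh} kWh = {joules:.0f} J")   # 15 kWh = 54,000,000 J
```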


Hours are usually more convenient to work with than seconds when talking about energy use (even in this case, when you are comparing to other things). Plus it's the de facto standard unit more people are familiar with. Power bills, electric car capacity, and energy efficiency are all normally described in kWh.


This is true, but it is exactly what I am railing against. It's just tradition -- energy in houses is kWh, so let's use that for other things too.

Instead, let's just use Joules everywhere. Easy peasy. Why are my batteries rated in Ah -- it's not like they're providing a variable voltage source; just give me Joules.

I know this is a ridiculous hill to die on, but I will die on it!


> it's not like they're providing a variable voltage source; just give me Joules.

They are, technically. https://www.batterypowertips.com/how-to-read-battery-dischar...

Probably doesn't matter though. :)


The typical house has 240 volt service in the US (two legs of 120V). I would also say a decent number of homes have 200 amp service as well.


> The typical house has 240 volt service in the US (two legs of 120v).

Can we wire that in a way that would allow installing .uk or .de Schuko Type-F sockets?

It would be funny (and weird!) to have 220V available on a european socket for say a desktop equipped with a 1.5kW PSU for the upcoming Nvidia 4090 :)


You’d probably need to do this with an AC-DC-AC converter to also get the circuit converted to the 50Hz cycle European power systems use, but it’d just be a waste because of the power losses involved (not to mention the expense of the hardware). Also, a typical 15A circuit in a home will handle ~1.8kW of sustained load, so as long as you can dedicate a circuit to such a rig, you’d be fine.


Computer power supplies won't care about 50/60Hz difference (maybe except some slight efficiency change).


Computer power supplies don't give a shit about frequency. They will happily accept 120/50 or 240/60.


There may be ordinances requiring a specific plug type, and I'm pretty sure the NEC has opinions on outlets. :)

That said, it's your house, so you may be able to do that in some places. I've thought about it myself, so I can get a euro tea kettle.

That said, the power will still be at 60Hz not 50, which will matter for some uses (e.g. impedances will change).


You probably have a few such sockets around. Not the UK or EU form factor, but 240V at least. Look at how your dryer or AC plugs into the wall and you'll see something different from the usual plugs.


These are all standardized and have meaning. https://www.electronicshub.org/electrical-outlet-types/

There are definitely 240V sockets for higher-power loads, because they need half the current for the same power at double the voltage, and current (current density, technically) is what causes joule heating and melts wires / starts fires.

The sockets will be incompatible with Euro or UK plugs. You could change the socket to a euro socket, but the frequencies will be wrong for Euro or UK devices (60Hz in the US, 50 in Europe & the UK). This may or may not be important depending on what you're hooking up to it (understand that impedance and other electrical things are a function of frequency). The safest option would be to get something to convert the power (there seem to be products to do that) and put _that_ behind the Euro or UK outlets.


Maybe motors, because you can replace the neutral (in the EU) with a hot (in the US) and it's basically doing the same thing.


A surprising (but still tiny) number of US houses can also have three-phase brought out to them by the electric utility. Some utilities will charge you upfront for stringing it up and lighting up the wires, but a surprising number will just amortize it into the monthly cost of service.

It's still rare enough you have to ask ahead of time if you want it for any building you get into, but in some major metro areas it has gone from "you have got to be joking" to "sure, let me check on that".

Now if only I can find a reliable, durable three-phase solar inverter...


208V three-phase gets you 120V on each hot leg to neutral (208/√3 ≈ 120). Many items will specify 110V/120V, so that's close enough.


This is true, but I'm fairly certain that the amps of available power are calculated on a 120V basis: a washing machine drawing 10 actual amps at 240V would use 20 accounting amps at maximum power.


The rating of a service (and by extension, the main breaker) is available at 240V. It's true that if all of your 200A load is consumed on a single leg, with zero amps consumed on the opposing leg, that you would only be able to consume 200A * 120V nominal or 24kW, but if the load is balanced, you can pull 200A @ 240V nominal or 48kW.
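The two cases in numbers, as a sketch of the comment's nominal figures:

```python
rating_a = 200

worst_case_w = rating_a * 120    # all load on one leg: 24 kW
balanced_w = rating_a * 240      # load balanced across both legs: 48 kW
print(worst_case_w, balanced_w)
```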


> The unit that I hate more than any other unit in the universe is the KwH, which is dimensionally equivalent to the Joule, so I don't understand why we don't just use that instead.

kWh/yr is worse. It's just watts but obfuscated. Gets used for appliances.


The kWh makes sense if you consider that while units are largely path-independent, mental calculations are not. At any given moment, the sensible measure of your house's electricity usage is in kW. And to work backwards to figure out consumption, the hour certainly beats the second. Sure, you could call it 3.6e6 joules, but at that point, what does it buy you?

Horsepower is clearly insane though, I have no idea why you'd bother.


For most realistic conversations you'd be talking about the relative consumption of two devices. Watts/kW is totally fine for this.

Regardless, head-math or otherwise, the vast majority of people will never do this calculation at all, except maybe to weigh the relative power consumption. And if they do the math, having a calculator and having all the units be compatible with each other (so just a natural conversion of W to J/s) is totally fine.


Idk, I think OP indicated that a typical lightning bolt would power a typical house (presumably at the stated nominal full load) for about 15 minutes. I.e., you'd need 4 strikes per house, per hour. Now granted, most houses won't run at peak load all the time; maybe you'd only need one strike per hour per house, but that's still clearly a lot more lightning than is generally seen.


The article is short and such an indication should be easy to cite directly. If I missed it, please let me know.

As far as I can see, nowhere in the article does it give enough information (even if you take into account unit conversions) to say how much of a house's electricity would be supplied by a lightning bolt.


The kWh is of course exactly what you want if you're working with power over time.

Example: a 100 amp house circuit running maxed out in the US will use 12 kWh per hour, or 0.2 per minute. Try doing it in your head with joules. Annoying right?

12 kWh / h? Am I a crazy person? No. I'm working on a useful problem.


I thought that quoting in horsepower was very interesting as many people are currently thinking about the load that EVs put on the grid.


Fun facts & questions from the Feynman lectures:

> "On an ordinary day over flat desert country, or over the sea, as one goes upward from the surface of the ground the electric potential increases by about 100 volts per meter. Thus there is a vertical electric field E of 100 volts/m in the air. The sign of the field corresponds to a negative charge on the earth’s surface. This means that outdoors the potential at the height of your nose is 200 volts higher than the potential at your feet! You might ask: “Why don’t we just stick a pair of electrodes out in the air one meter apart and use the 100 volts to power our electric lights?”

> "Although the electric current-density in the air is only a few micromicroamperes per square meter, there are very many square meters on the earth’s surface. The total electric current reaching the earth’s surface at any time is very nearly constant at 1800 amperes. This current, of course, is “positive”—it carries plus charges to the earth. So we have a voltage supply of 400,000 volts with a current of 1800 amperes—a power of 700 megawatts! With such a large current coming down, the negative charge on the earth should soon be discharged. In fact, it should take only about half an hour to discharge the entire earth. But the atmospheric electric field has already lasted more than a half-hour since its discovery. How is it maintained? What maintains the voltage? And between what and the earth? There are many questions."

https://www.feynmanlectures.caltech.edu/II_09.html
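Feynman's two headline figures multiply straight out (he rounds 720 MW down to "a power of 700 megawatts"):

```python
potential_v = 400_000    # volts between the upper atmosphere and ground
current_a = 1800         # total fair-weather current reaching the surface

power_mw = potential_v * current_a / 1e6
print(f"{power_mw:.0f} MW")   # 720 MW
```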


'The Action Lab' channel on YouTube did a little experiment with this - https://www.youtube.com/watch?v=0VgqRigZAHA


Lightning-powered bitcoin mining is obviously the next big thing for green energy!

Now you need a double helping of luck - first that you get struck by lightning, and second that your miner guesses the right hash to make a block. But maybe the two somehow combine? After all, people who survive a lightning strike are said to be so lucky that they should buy a lottery ticket... so following this logic, lightning powered mining would be extra efficient!

Now I'm off to patent my new PoS invention - proof of strike :)


I know a number of people who have been struck by lightning. It seems like you've got a decent chance at survival if you're not touching anything metal.


Do you know a lot of people who stand on top of tall towers during storms?

Otherwise, unless you know tens of thousands of people there is very little chance of knowing multiple people who have been struck by lightning.


I also know two people, my mother and a leader at my boyscout camp. Both wearing rubber-soled shoes and survived without needing hospitalization.

Maybe the birthday paradox or something similar is at play? Only 64 commenters, but it appears more common than your belief would indicate.

I bet if you are a regular golfer you would know more, and if you spend 100% of your time in a concrete jungle you would know less.


The rubber soles don't mean much, though. The electric current just passed through kilometers of insulation (air). It doesn't care about 2 cm of rubber.


It's the kind of story you're going to hear and remember. Lots of rare things are not notable, nothing you would tell a story about. Also, people could be counting near misses as being "hit by lightning"


TIL one gallon of gas is 120 lightning bolts, and the relative cost of energy between electricity and fuel is near equilibrium.


To illustrate that even more vividly: if a typical lightning bolt is a few km long, the energy released is equivalent to detonating a stream of gasoline roughly 0.1mm in diameter (not much thicker than a human hair).
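That figure checks out with round numbers; here's a sketch assuming ~1 MJ per bolt, ~34 MJ per litre of gasoline, and a 3 km channel (all assumed values):

```python
import math

bolt_mj = 1.0            # energy per strike, per the article
gasoline_mj_per_l = 34.0 # typical energy density of gasoline
length_m = 3000.0        # "a few km"

volume_m3 = bolt_mj / gasoline_mj_per_l / 1000   # litres -> cubic metres
area_m2 = volume_m3 / length_m                   # cross-section of the stream
diameter_mm = 2 * math.sqrt(area_m2 / math.pi) * 1000
print(f"{diameter_mm:.2f} mm")   # ~0.11 mm
```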


Reminds me of the units used for vehicle fuel efficiency: litres per 100 km.

But liters is a unit of volume (length x length x length) and kilometers is a unit of distance (length). Hence, this efficiency metric is equivalent to L^3/L or just L^2. That is, the unit of vehicle fuel efficiency is a measure of area!

The area of what, you ask?

If you made the contents of your tank into a long thin stream of fuel as your vehicle moved along, then its cross section is the instantaneous fuel usage. You can imagine your car driving along, "sucking up" this long thin streamer of fuel as it moves.[1] The thicker this line of fuel, to more it needs for the same distance.

My car gets about 7L/100km, which works out[2] to just 0.07 mm^2, which is surprisingly thin!

[1] I can't take credit for this concept, I got it from XKCD's What If section: https://what-if.xkcd.com/11/

[2] https://www.wolframalpha.com/input?i=7+L+%2F+100+km+in+mm%5E...
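The footnoted conversion is just volume over distance, using the same numbers as the comment:

```python
litres_per_100km = 7

area_m2 = (litres_per_100km / 1000) / (100 * 1000)   # m^3 per m
print(f"{area_m2 * 1e6:.2f} mm^2")   # 0.07 mm^2
```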


When you phrase it like that, it does feel like the quoted energy figure is too low - after all, lighting that tiny a stream of gasoline isn't going to illuminate the sky anywhere as bright as a lightning bolt.

Perhaps the 'nickel of electricity' is what's remaining in electrical energy after all the rest has been used up as light, heat and sound across the sky?


> lighting that tiny a stream of gasoline isn't going to illuminate the sky anywhere as bright as a lightning bolt.

Gasoline will burn for much longer so the energy will be released slower, so the peak will be much lower. And we perceive the peak light not the total amount of energy (see 1000 lumen stroboscope going 1 ms on - 999 ms off vs 1 lumen light turned on constantly).

Also gasoline will release more radiation in infrared part of the spectrum.


You are comparing two different time scales and two different spectra.

The lightning finishes in microseconds. The flame front speed of well-mixed gasoline/air mixture is about 16.5 m/s (see https://en.wikipedia.org/wiki/Flame_speed ) so 6 microseconds to cross that 0.1 mm.

Seems comparable, right? But the lightning covers its entire length in microseconds, not just one 0.1mm patch. Plus, the 0.1mm calculation assumed pure gasoline, not a gasoline/air mixture at roughly a 12:1 air-fuel ratio. Any guidance from a real-life comparison would also be affected by the diffusion speed of oxygen. If it takes significantly longer to burn the same energy, then the intensity (energy/time) will be significantly lower.

In addition, the spectra are different. Have you ever seen a fuel-based camping lantern? They use a mantle to make the light significantly brighter. (See https://www.youtube.com/watch?v=F3rncxf4Or8 for details.) This means the visible light from burning fuel isn't a good guide for the amount of visible light which can be generated from the same amount of energy.


Not in visible light, but I can easily believe it would be as bright if we could both see infrared and also if we made all that gasoline burn as fast as lightning propagates.


But the energy release from the gasoline would be much slower, right?


You have to remember that most of the energy in a storm has to do with its exchange of air masses. It's more like a boiler than a dynamo. So it doesn't surprise me that lightning in practice isn't energy dense. But figuring out how to capture the energy seems like a fun experiment, more to see how far we can go with materials science than anything else.


The energy needed to bring a teapot to boil is enough to fling a nickel into orbit.


Regrettably, I doubt the average home has a stove hot enough to boil teapots.


I was curious: it comes to about 310 kJ if you ignore friction, or about the same calories as you gain by eating an apple.
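For what it's worth, ~310 kJ lines up with escape velocity rather than a low orbit; a sketch with assumed figures (1 L of water heated by 80 °C, a 5 g nickel):

```python
# Energy to bring a teapot of water to the boil
boil_j = 4186 * 1.0 * 80             # specific heat (J/kg/K) * kg * delta-T

# Kinetic energy of a nickel at escape velocity, ignoring friction
fling_j = 0.5 * 0.005 * 11_200**2    # 0.5 * m * v^2
print(f"{boil_j/1000:.0f} kJ vs {fling_j/1000:.0f} kJ")   # ~335 kJ vs ~314 kJ
```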


Only if you ignore friction.


This reminds me of an old (misleading) graphic in Wired magazine suggesting that people on treadmills could be significantly reducing their electricity bills.


Olympic track cycling medallist powers toasting a slice of bread, which takes everything he's got and leaves him exhausted: https://youtu.be/S4O5voOCqAQ


That reminds me of a thread I saw on an Internet forum a few years ago about some guy suggesting powering his fridge with a bicycle during a power outage. Turns out this is an extremely complicated problem which is probably best solved by ignoring it.


I bet you would get better results directly spinning the compressor via a flywheel rather than trying to generate the electricity to run the refrigerator.


Yeah, toasters and microwaves take a ton of energy, 800-1000W.

Things like laptops and some desktops could be easily powered by bicycle, though. I managed 220W for 30 mins, and could probably idle 50-75W for hours, which is more than enough for multiple laptops.


It would, because people would become very aware of the energy sinks and their cost :)

One city noticed changes in consumption when new houses had meters installed on the ground floor rather than in the basement.


In the UK they're rolling out smart meters that have a display showing real-time usage.

They're advertised as saving energy, which I think is a bit misleading, as the saving is based only on this phenomenon.


Which is a very short lived behaviour change. Our smart display went missing during a house move, and hasn't been compatible with most of the energy suppliers we've had since moving in.

It was probably the worst energy scheme they could have spent the estimated £16bn on. That much in insulation would have slashed domestic heating costs, but as ever that's a much less sexy project.


I've often wondered if it would be viable to run a gym where all of the equipment is designed to harness customers' energy to help power the building. I suspect it wouldn't make enough of a difference to be worth it but have never seen anyone run the numbers.


Let's say a person outputs 500W and we can somehow capture all of it. That would be 10 cents' worth of electricity in an hour.

Note that these are very generous estimates, but it does demonstrate how silly the idea of trying to generate electricity by capturing exercise output is (in a purely economic sense).
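In numbers (both the output and the price are generous assumptions):

```python
person_w = 500           # very generous sustained output
price_per_kwh = 0.20     # assumed electricity price

kwh = person_w / 1000    # one hour of effort
print(f"${kwh * price_per_kwh:.2f} per person-hour")   # $0.10
```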


500W is an extremely high number as well. Pro cyclists can reach up to 400W in short bursts; untrained people typically can't maintain more than 100-150W for long. We really use quite a lot of energy when measured in how many humans it would take to provide that much energy "manually".


While it's true that the power output of a human is not much (certainly not enough to power a house), we do have a lot of energy. Most of us carry at least 1 kg (~2 lbs) of fat, which contains ~37 MJ of energy. For comparison, a stick of dynamite has ~1 MJ of energy.

https://en.wikipedia.org/wiki/Fat#Biological_importance

https://en.wikipedia.org/wiki/Dynamite#Form


Why stop there? The matter itself is energy, so combine a human with an anti-human....


Sweet oblivion looks just like me


> Most of us carries at least 1 kg (~2 lbs) fat

It's a lot more than that. Even a fit (healthy, but not necessarily peak athletic shape) adult man will have about 15% body fat, 20% for a woman.


> Pro cyclists can reach up to 400 Watt in short bursts

I heard pro cyclists can reach up to 2kW. I managed to make 500W on a rowing machine for 5 minutes, but I'm a couch potato. I believe 400W is a normal rate for cyclists. When you are sitting, your body already produces about 100W of heat.


So cyclists measure their capability in terms of FTP - essentially what power output they can maintain at full throttle for an hour.

This hugely depends on body weight / gender / training levels etc., body weight being a big deal since that’s what you’re transporting. So the other way folks measure output is W/kg of body weight.

A beginner adult male will be in the 100-200W zone, around 0.5-1.5 W/kg. Usually anyone can train themselves into the 200-300W (3-4 W/kg) zone, which is the recreational pace - the groups of cyclists you see on the road. Beyond a 300W FTP (at 150lb body weight; 4-5 W/kg) you're reaching race pace. The ones you see on screen have upwards of 5-6 W/kg FTP output. They obviously have other constraints, like putting out this power at the end of a 200km ride for 20 minutes, which makes it extra hard.

Finally we come to the KW numbers - all these folks have two kinds of muscles (fast twitch and slow twitch). The sprinters are saddled with a higher proportion of the kind of fibers that can allow huge spurts of power - they put out about 1000-1500W for about 5-10s. These are probably what you’re thinking of. This is pretty much an end of ride (or a sprint section) empty your tanks effort.

Semi-related tidbit: track cyclists are a middle kind of beast here: they put out 600-1000W for a couple of minutes but don't have to worry about riding 200km to get there.


Yes, and the Wired graphic was particularly misleading in that it showed, as I recall, multiple dollar amounts accumulating on per-session treadmill displays.

On the other hand, it is possible to imagine lifestyles enhanced by various 10W contributions. 10W for heated clothing. 10W for a laptop. 10W average to power a several km electric bicycle or Aptera-size commute. And so on.


How are you proposing we produce 10W to heat clothes? We produce around 80W just by sitting idle; just wear insulating clothes! How do you produce 10W to power an electric bike? 10W is a laughable amount of power; just get rid of the electrics and pedal a bit. 10W is barely enough to charge a phone these days. Anything that gets moderately hot while in use is consuming more than 10W.


e.g., s/lifestyles/post-collapse lifestyles/

Based on Ali Express, heated clothing (5V power in a pocket) is popular enough in China.


This isn't exactly what you are talking about, but this art/concept for self-powered student housing came to mind: https://www.humanpowerplant.be/human_power_plant/human-power...


In their defense this could be seen as a type of gamification. Ultimately you can put any dollar amount on a kWh of electricity. It's the kWh generated that matters.

Not saying it's a good defense though.


The Grand Tour has tried to answer that question. I thought the turnstiles idea was really good, especially if you place them in a building with heavy traffic. Though probably not really cost effective.

https://www.youtube.com/watch?v=WhwyH6tT-EQ


As a quasi-sidenote, most gyms already implement this on a smaller scale on their elliptical and spinning machines, as they're usually self-powered and you can regulate the resistance the machine offers, which I assume must move the magnets closer to or farther from the flywheel.


However, most energy you burn in a gym is converted into heat energy, which the gym then burns extra electricity trying to air condition.


A dude did that, it didn't last long, but I'm still thinking about opening one. Just to avoid the double madness of powering lights and devices so people spend energy in the air :)

Even if you don't generate much, at least your biomechanical efforts are used somewhere.


Surely it's triple madness if you're driving to a gym to go on the running / cycling machine.


Not just driving to the gym, but driving twice around the parking lot to find a closer space for a shorter walk.

And wasn't it Mumford who observed that, given the time it takes to earn the money to buy a car, on average it can be faster to walk than to drive?


Oh yes I forgot to mention this :)


You could have them power fans or keep the lights on. Things that can use just a few watts.


Each one, yes, but if you have 20 people averaging 100W, you can power a kettle :)


Having been to a gym, I sincerely hope any power scavenged is used to power the ventilation system.


Well, people with treadmills technically could significantly reduce their electricity bills—by running outside instead.


Therefore, one could easily generate lightning with equal destructive power? Why is it not used yet in war (airplane to ground) or for entertainment (airplane to airplane)?


Not really. Think lasers - they're not expensive because of the energy output, they're expensive because concentrating energy output is far more difficult than simply obtaining and outputting energy.


A megajoule in a microsecond, that's a terawatt impulse with decidedly RF personality. Anything that could generate that would be as big as anything that could capture it. Not exactly nimble enough to weaponize.


Tesla coils, arc lamps.


aiming is a tough thing


This looks like the opposite of low-hanging fruit. That fruit is hanging very high, and on top of that it is very small and hard to get to the edible part.

Solar, wind, hydro, biofuel, geothermal, maybe even day-night temperature cycles - all of these look much more promising in the "free" energy department. Actually it's hard to think about a worse energy source. Earthquakes maybe? :-)


A lightning bolt is literally the opposite of clean power. Clean in this sense means lack of electrical noise. Random.org built their RNGs on atmospheric noise.

https://api.random.org/features


I call BS. A lightning bolt generates at least 1.21 gigawatts, which is good for many purposes...


If you really mean gigawatts (unit of power), then you have to multiply it by the time period when this peak power is reached to get the energy output. Lightning bolts have very short durations, tens of microseconds.

Let's be generous and give it 1 millisecond. 1 millisecond times 1 gigawatt is 1 million joules, which is the estimate that the article gives.
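Peak power times duration is energy; the famous figure only needs a moment (the 1 ms duration is the generous assumption above):

```python
peak_w = 1.21e9          # "1.21 gigawatts!"
duration_s = 1e-3        # a generous 1 ms

energy_mj = peak_w * duration_s / 1e6
print(f"{energy_mj:.2f} MJ")   # ~1.2 MJ, the article's ballpark
```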


Woosh, he is making a Back to the Future reference.


Well, today I learned something about physics and pop culture.


> Lightning bolts have very short durations

That is why you have to hit the cable at a very specific time. It's not that hard.


martin_a, or marty_m? Hmmmm....


It’s also pretty easy to figure where it will strike. Just wire up the ol’ clock tower.


The question is how long.

For example, the flash on a standard compact camera (not a smartphone, which has LEDs, but an actual camera) is about 1kW!

But you only get that 1kW for about 1ms (a millisecond!). It seems to last longer because the light gets "burned" into your eye.


For 0.2 seconds, which Wolfram Alpha says is 67.2 kWh.


Great Scott!


Some websites suggest the average lightning bolt has 5 GJ of energy, which is 5000x what this page quotes. I think someone has done a math error.

E.g. https://en.wikipedia.org/wiki/Harvesting_lightning_energy#:~...).


At one point, I realized that lightning is the breakdown of the dielectric material of a capacitor. This means you could harvest electricity by draining it with a large sheet or web of conductors at cloud level. No idea how well this would work or how much energy there is to collect, and it would be pretty impractical.


To say this another way... A lightning bolt would have to hit the average US home every 12 minutes to keep everything powered (assuming perfect conversion and no losses)
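Inverting that gives the implied average household draw (assuming the article's ~1 MJ per bolt):

```python
bolt_j = 1_000_000
interval_s = 12 * 60     # one strike every 12 minutes

avg_power_w = bolt_j / interval_s
print(f"{avg_power_w:.0f} W average draw")   # ~1,389 W
```

which is in the right ballpark for average (not peak) US household consumption.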


I wonder if instead of the bolt itself we could capture the charge accumulation on a continuous basis

That might be "easier" in some aspects


It's really hard to do because within the cloud each water droplet contains a tiny bit of charge. The air between them is an insulator. The cloud is many cubic miles. So to extract the charge you either need to touch every droplet, or make a spark between each droplet and a neighbouring one (which is what happens during a lightning strike).


But worth a fortune if you can put it in a small package deliverable on command (i.e., a weapon).


if so little power, how does it fry a tree?


It's not so little power, it's so little _energy_ (power sustained over a period of time). It's a crap-tonne of power, just over a _very_ short period of time.

And yes, power is what fries the tree: https://en.wikipedia.org/wiki/Joule_heating


mind=blown. I never considered the time factor


And this is why Nikola Tesla died poor.



