Hacker News

The sun is a 1000 watts per square meter exposure at your body. A cell tower might be 100 W if you hugged the antenna, but apply the inverse-square law over the actual distance and you receive roughly 0.00000001 W even with a strong signal. Your cell device is a milliwatt transmitter, equivalent to an LED light shining on you.

You don't need a hat and sunscreen for street lights or flashlights, and you wouldn't need them for cellular power levels either; the exposures differ by orders of magnitude.
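The inverse-square comparison above can be sanity-checked with a few lines. The 500 m distance is an assumption for illustration, and the isotropic-radiator model is a simplification (real tower antennas are directional):

```python
import math

def power_density(tx_watts, distance_m):
    """Free-space power density (W/m^2) from an isotropic radiator.
    Real tower antennas are directional, so this is only a ballpark."""
    return tx_watts / (4 * math.pi * distance_m ** 2)

SUN = 1000.0                         # W/m^2 at the surface, per the comment
tower = power_density(100.0, 500.0)  # assumed 100 W tower, 500 m away
print(f"tower: {tower:.2e} W/m^2")            # ~3.2e-5 W/m^2
print(f"sun/tower ratio: {SUN / tower:.1e}")  # ~3e7
```

Even before accounting for the sun's higher photon energies, the raw power densities are about seven orders of magnitude apart under these assumptions.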



The Sun is also at 5400K.

Therefore, 99.9999038793% of the energy emitted is at wavelengths shorter than 1E-4m (0.1mm).

Or, in absolute terms: the remaining 0.0000961207%, everything at wavelengths longer than 1E-4 m, amounts to 0.0009612 W/m², or 961 µW/m², of the Sun's 1000 W/m².

You might complain that 961 µW is much more than 10 nW, but again, that 961 µW is spread across every wavelength longer than 1E-4 m. Within any narrow band the Sun is far weaker; otherwise it would drown out the cell phone tower (or rather, we pump power into the antenna to overcome the "noise" from the Sun).

I'd reproduce the numbers, but unfortunately my HP48 underflows at such narrow bandwidths.

And the "LED" at your face is, of course, much more powerful than the Sun since (as you pointed out) by the inverse square law, it has to pump out a lot of power to reach the tower.

Nor do I happen to think LEDs are harmless. Using LEDs to affect biological systems is a very rich area of study.

Do I think non-ionizing radiation is harmful? I doubt it. But comparing a 1000W/m^2 @ 5400K black body radiator to a cell phone tower is dishonest.


Exactly, the sun's output is mostly much higher-energy radiation, hundreds of terahertz, approaching ionizing at the UV end.

The concern with 5G seems to revolve around its use of higher-frequency millimeter-wave radiation, compared to 3G/4G, which has shown no repeatable damaging results at normal power levels.

If higher frequency = worse, which is broadly true, then comparing 5G to the sun is not dishonest IMO. I am trying to give some perspective: it seems odd to worry about extremely low-power cellular radiation while giving little thought to the extremely powerful nuclear radiator in the sky.

Dosage matters: cellular frequencies will cook you given enough power, and so will visible light. The power levels we are talking about do not generate enough heat to damage our tissue, so if they do harm, it would need to be through some other, unknown process. Should we keep looking for possible other processes? Yes, of course. However, I would be much more concerned about the much more powerful visible-range artificial radiators around us every day, like the monitor I am staring at right now, emitting 100 times the radiation of my cellphone right at my face all day long.


>Therefore, 99.9999038793% of the energy emitted is at wavelengths shorter than 1E-4m (0.1mm).

Shorter wavelengths have higher photon energy. It's the short stuff that you have to worry about.


Sure, ionizing vs non-ionizing, but millimeter waves do cause localized, penetrating heating, and the intensity of millimeter-wave radiation from 5G is orders of magnitude stronger than that from the Sun.

Is it consequential? Probably not, but the OP's original comparison of 1000 W/m² of Sun vs. 100 W from a tower and nW from a cell phone is disingenuous.


As a general heuristic, sure. But that is hardly an airtight thing.


This is a tangent, but it's crazy that we've built a communications network that relies on basically LEDs shining out of our pockets (with less interference from the sun, but still).


Except our pockets (unless you wear tinfoil pants), and many other materials, are transparent at those wavelengths, which makes it a lot less crazy.


You don't think making an antenna that can detect an LED shining in a flat miles away is crazy because pockets are transparent?


In darkness our eyes can easily detect a 1W LED from kilometers away.


But can they distinguish a specific 300 mW LED in a stadium with tens of thousands of other LEDs next to it?


If it were a different color, yes. If each of them were a different color, your eye would still be able to, as long as you had enough focus. So the analogy holds pretty well.


My understanding is that healthy human eyes are able to detect individual photons.


No. Amateur radio operators have done it, and it’s not difficult to imagine how one would.


That doesn't make it any less amazing.


Indeed, nothing is difficult to imagine how to do if the method is common knowledge.


Absorption rates vary by orders of magnitude depending on signal frequency.

A simple power comparison is not a great measure of effect.


The Sun transfers orders of magnitude more heat to your body than cell towers or your phone.

The problem with every argument about radiation from cell phones being dangerous is that for every proposed mechanism, the Sun is orders of magnitude more damaging.

The obvious worries with cell phones are repeated stress injuries, insomnia and disruption of personal relationships. There's just no plausible mechanism for the radiation to be damaging, though.


Out of curiosity, do you wear sunscreen to keep yourself cool?


Even at 100% absorption these are minuscule amounts of power compared to the huge doses of ionizing radiation we bathe in every day.


What ionizing radiation are you referring to? I'd hope you are not bathing in ionizing radiation everyday...


Sorry, I always forget UVB is not technically in the ionizing range, but "near-ionizing", with enough energy to cause non-thermal DNA damage.


It is, because if you assume the worst case of perfect 100% absorption, we're still talking about orders of magnitude too little power to matter.


The higher the frequency, the less penetration there is (skin effect), so higher = "better".
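The penetration trend can be sketched with the good-conductor skin-depth formula. Tissue is really a lossy dielectric, so this is only a rough illustration of the 1/√f trend, not dosimetry, and the conductivity value below is an assumption:

```python
import math

MU0 = 4 * math.pi * 1e-7  # permeability of free space, H/m

def skin_depth_m(freq_hz, sigma_s_per_m):
    """Good-conductor skin depth: delta = 1 / sqrt(pi * f * mu0 * sigma).
    Only illustrates that penetration falls as frequency rises."""
    return 1.0 / math.sqrt(math.pi * freq_hz * MU0 * sigma_s_per_m)

SIGMA_TISSUE = 1.5  # assumed rough conductivity of muscle-like tissue, S/m
for f in (0.9e9, 2.4e9, 28e9):  # cellular, Wi-Fi, 5G mmWave frequencies
    print(f"{f / 1e9:5.1f} GHz -> depth {skin_depth_m(f, SIGMA_TISSUE) * 1e3:.1f} mm")
```

Under these assumptions the 28 GHz depth comes out several times shallower than the sub-GHz one, which is why millimeter-wave exposure is mostly a skin-surface question.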


You're falsely equating the sun's wide spectrum with 5G's very narrow set of bands. Most of the sun's spectrum at the surface is visible and infrared light, plus a bit of UV.


Note that without significant shielding, the sun will make you very sick or kill you.



