
From time to time I'll see things talking about how thanks to all of our radio communications, we're effectively broadcasting our existence to the entire universe. However, when I did some back-of-the-envelope calculations a while ago I found something interesting - by the time a radio station's broadcast reaches Alpha Centauri (the solar system closest to ours), ignoring any effects of our atmosphere, a 1 m^2 receiver would, on average, receive a single photon every seven hours. Of course that is only a single radio station, but throwing more radio stations into the mix would just make the signal less coherent and indistinguishable from random noise. When you consider the sun as well, it seems like all of the Earth's radio signals would be a drop in the bucket.
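That estimate is easy to sanity-check (a minimal sketch; the 50 kW transmitter power and 100 MHz carrier frequency are my assumptions, since the original figures aren't given):

```python
import math

# Hypothetical numbers: a 50 kW FM station at 100 MHz, radiating
# isotropically with no atmospheric absorption.
POWER_W = 50e3           # transmitter power (assumption)
FREQ_HZ = 100e6          # carrier frequency (assumption)
PLANCK_JS = 6.626e-34    # Planck's constant, J*s
LY_M = 9.461e15          # one light year in meters

d = 4.37 * LY_M                         # distance to Alpha Centauri
flux = POWER_W / (4 * math.pi * d**2)   # W per m^2 at that distance
photon_energy = PLANCK_JS * FREQ_HZ     # energy per radio photon, J
rate = flux / photon_energy             # photons per second on a 1 m^2 receiver

print(f"one photon every {1 / rate / 3600:.1f} hours")
```

With these assumed numbers the interval comes out to roughly eight hours, the same order of magnitude as the seven-hour figure above.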

How much noise would the Earth have to broadcast in order to change that? For this question, the goal is for the noise broadcast from Earth to still be noticeable by the time it reaches a distance of 500 light years (that only covers about 0.01% of the galaxy). You can assume that alien civilizations are looking for signals of intelligent life, but not that they are focused on our solar system.

Alexander
Rob Watts
  • I wish I could set a bounty right now. – The Square-Cube Law Sep 06 '18 at 17:42
  • I've seen a documentary that claimed that all our radio signals get scrambled at the edge of the solar system because of "magnetic bubbles" in the Heliosheath. Unfortunately the documentary wasn't in English and I cannot find it on YouTube. – Elmy Sep 06 '18 at 17:51
  • @YElm if that's the case, then the noise level might just have to be enough for it to be obvious that something is producing radio signals other than the sun. – Rob Watts Sep 06 '18 at 17:53
  • 3
    This might actually be a better fit on Physics. –  Sep 06 '18 at 17:55
  • Think about how mobile phones work. There are millions of them, all putting out extremely low power radio signals, yet somehow the cell can communicate with them. This is because communication happens on a very narrow frequency. Any intelligence would do as seti do and listen on very narrow bands, then use computer interpolation. There is no doubt that the earth's electromagnetic output is 'visible' to our nearest neighbours even if it's massively attenuated. I'm going to say that the lower frequency transmissions are probably discernable up to 10's of light years away. – Richard Sep 06 '18 at 18:15
  • Just to give you some idea of the power of some radio broadcasts.. some transmitters can turn 100 amps into EM waves that just disappear into thin air. – Richard Sep 06 '18 at 18:17
  • Just to be on the safe side, you should fluctuate the Sun. (Ref to book "Three Body Problem" by Cixin Liu). – Congenital Optimist Sep 06 '18 at 18:22
  • This is a fabulous question, but most especially with the [tag:hard-science] tag, I don't expect an answer. How would anyone provide an answer (or judge it to be correct) when we have yet to find an example of anyone else's detectability? We can explain how much transmission noise we produce, but without the comparison of an alien civilization (get to work Seti@Home!), we can't prove how noticeable it is at all. – JBH Sep 06 '18 at 18:39
  • @JBH One thing would be to compare the radio noise we emit with the similar noise emitted by the Sun and planets. Once the man-made noise drops below, say, 10% of that it's probably undetectable. Note that we'd need to compare like with like and some specific signals (e.g., strategic radars) may be detectable if you know what to look for even if they're small compared with the general noise level. – Mark Olson Sep 06 '18 at 18:53
  • @MarkOlson, Those are assumptions (c.f. your phrase "may be detectable if..."). Until either we go out into interstellar space and see that we're noticeable or another civilization drives up with the benevolent welcome message "we noticed you and thought we'd drop by," it's only an assumption and therefore unanswerable with the [tag:hard-science] tag. – JBH Sep 06 '18 at 18:57
  • @MarkOlson a 10% increase in the radio output would probably be easily noticeable, but it depends on how variable the sun's radiowave output is - the more consistent it is, the easier it would be to notice a sudden variation. – Rob Watts Sep 06 '18 at 19:03
  • not directly related, but some people do try to send messages that will rise above the noise: https://en.wikipedia.org/wiki/Active_SETI – Bald Bear Sep 06 '18 at 19:16
  • 1
    @Richard suggests that "There is no doubt that the earth's electromagnetic output is 'visible' to our nearest neighbours even if it's massively attenuated.] In this article from the Seti Institute we learn that a general broadcast signal of 0.75 Janskys from HD 164595 (94 ly away) would need to be 10^20 watts, and if directed at us would need to be a trillion watts. Both being "far beyond what we ourselves could do." – JBH Sep 06 '18 at 19:21
  • @JBH seti are wasting their time listening for radio signals then. High power low frequency (radar etc) signals are powerful enough to be detected within 10 light years or so. The higher the frequency, the more power you need. Hence SETI's preferred unit of measurement. – Richard Sep 06 '18 at 19:38
  • @Richard, using Kingledion's (MichaelKjorling's) equations and AlexP's comment below, Radar at 261Khz transmitting to 10 light years has a signal strength 10^27 lower than transmitted. Using the equations, you'd need to broadcast at 102.5 dbm or 17.8MW which is 7X the strength of the Taldom transmitter referenced by AlexP. A quick search did not reveal any stronger transmitters so we appear to not be broadcasting signals that can be detected with a -200db threshold at 10 ly. Seti appears to know what they're doing. – JBH Sep 06 '18 at 20:45
  • I wish I were in the right order of magnitude more often I can tell you that. As I said.. 10's of light years.. less than 10 stars. – Richard Sep 06 '18 at 20:59

1 Answer


First off, this question is just the inverse of this question. So, apologies to MichaelKjorling, I'm going to crib his answer to get some results.

Assumptions

  • In order for Earth to be noticeable at 500 light years, an alien civilization at that distance must be able to detect us.

What is the attenuation over 500 light years?

Free space path loss in decibels (dB) is given by

$$20\log_{10}\frac{4\pi d}{\lambda}$$ where $d$ is the distance between the antennas and $\lambda$ is the wavelength of the signal in question.

In the US, the highest-power general transmissions (as far as I can tell) are UHF stations in the 512-608 MHz range. This may be an important assumption to revisit; how much energy escapes into space depends heavily on directional broadcasting characteristics. But we'll run with it for now and call the wavelength 0.5 meters. Meanwhile, 500 light years is $4.7\times10^{18}$ meters.

Plugging into the equation above, the free space path loss is 401 decibels. That is a lot! It means the power at the receiver is a factor of $10^{40}$ lower than at the emitter.
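The path-loss figure is easy to reproduce (a sketch using the 0.5 m wavelength and 500 ly distance assumed above):

```python
import math

def fspl_db(distance_m: float, wavelength_m: float) -> float:
    """Free space path loss in dB between isotropic antennas."""
    return 20 * math.log10(4 * math.pi * distance_m / wavelength_m)

d_500ly = 4.7e18  # 500 light years in meters
print(fspl_db(d_500ly, 0.5))  # about 401 dB
```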

What is the most sensitive receiver that we have?

First, since radio transmission terminology is not generally well known, a brief discussion. In order to receive a signal, you need to be able to pick it out of the background noise. That background includes both noise from deep space and interference from local signals. For example, if our UHF transmissions at 550 MHz happen to line up with the most common radio frequency on a distant planet looking for us, in-atmosphere receivers are going to have a tough time finding us.

The unit of measure for the signal here is dBm, or decibel-milliwatts. If $P$ is the power of a signal in milliwatts, then the signal strength is $$10\log_{10}(P).$$ Thus, a 1 W signal is 1000 mW or 30 dBm. The gain of an antenna is measured in terms of dBi. This measure uses the same log scale that the signal strength does, so we can simply add the measures together to calculate total signal strength.
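That conversion, as a one-liner (assuming the input power is given in watts):

```python
import math

def watts_to_dbm(power_w: float) -> float:
    """Signal strength in dBm for a power given in watts."""
    return 10 * math.log10(power_w * 1000)  # watts -> milliwatts, then log scale

print(watts_to_dbm(1.0))  # 30.0 dBm for a 1 W signal
```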

The rating of a receiver is the smallest signal that it can pull out of the noise background. Following MichaelKjorling's answer, assume that an alien civilization specifically searching for us can detect a signal at -200 dBm. In addition, assume an 80 dBi antenna gain from a dish like Arecibo.

How much noise do we have to put out?

Free space path loss is -400 dB; antenna gain at the foreign civilization is +80 dBi; and the signal detection threshold is -200 dBm. Therefore, the emission strength must be 120 dBm at the source for the signal to be detectable. This is equivalent to $10^{12}$ mW, or 1 GW of emission power.
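Putting the link budget together (a sketch using the rounded figures above; the dish gain and detection threshold are the assumptions from the previous section):

```python
PATH_LOSS_DB = 400     # free space path loss over 500 ly (rounded)
RX_GAIN_DBI = 80       # assumed Arecibo-class dish gain
THRESHOLD_DBM = -200   # assumed detection threshold

# All terms are on a log scale, so the budget is simple addition.
required_dbm = THRESHOLD_DBM + PATH_LOSS_DB - RX_GAIN_DBI
required_w = 10 ** (required_dbm / 10) / 1000  # dBm back to watts

print(required_dbm, required_w)  # 120 dBm = 1e9 W = 1 GW
```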

Conclusions

The Earth is almost certainly not going to reach this power level. WBCT is the most powerful FM radio station in the US, with an effective radiated power of 320 kW, or about 85 dBm. But its antenna is not pointed into space, so its emissions are further attenuated by the atmosphere before reaching deep space. There is also no reason to increase the power of such a station, because it already has plenty of power to reach the horizon. Any direct signal's line of sight is limited by the horizon; past it, the signal no longer reaches the surface and coverage moves up into the atmosphere, where it isn't really useful.
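For comparison (WBCT's 320 kW figure is from above; the rest is arithmetic):

```python
import math

wbct_w = 320e3  # WBCT effective radiated power, W
wbct_dbm = 10 * math.log10(wbct_w * 1000)  # convert to dBm
shortfall_db = 120 - wbct_dbm              # vs. the 120 dBm requirement

print(f"{wbct_dbm:.0f} dBm, {shortfall_db:.0f} dB short")  # 85 dBm, 35 dB short
```

A 35 dB shortfall means our strongest FM station is more than three orders of magnitude too weak, before even accounting for atmospheric losses.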

Furthermore, if you start increasing the emission power of lots of different radio communication methods, they will start interfering with each other. A 120 dBm signal emerging from the Earth's atmosphere will bounce off the Moon and back to Earth at a much higher power than it will have by the time it reaches 500 light years away.

So, all in all, the Earth is unlikely to ever emit enough noise to reach a planet 500 ly away. Now, a directed signal is another story...

kingledion
  • The Russian long wave Taldom transmitter radiates 2.5 MW at 261 kHz. Just saying. And some military radars emit tightly collimated microwave impulses in the megawatt range. But on the whole you are right, we don't emit the kind of power which would be noticeable from far away; especially if we consider that there is a very close very noisy very large nuclear fusion reactor... – AlexP Sep 06 '18 at 19:59
  • @AlexP That is at 261 kHz; I was calculating in the 500 MHz range. Still, even MW emitters won't come close to putting out GW of noise. – kingledion Sep 06 '18 at 20:02
  • Please do note that both antenna gain and path loss are dependent on frequency (actually wavelength). Importantly in this case, antenna gain drops as wavelength goes up (or frequency goes down). An antenna that provides 80 dBi gain at 3 GHz will have a much lower gain at 550 MHz. – user Sep 10 '18 at 09:52