In her interview, astronomer Jill Tarter discusses searching for extraterrestrial signals and techno-signatures without knowing exactly what to expect. As a result, astronomers must monitor a wide range of frequencies, from 1 to 10 GHz. In particular, they are looking for signals that cannot occur in nature. For example, engineered transmitters can compress their power into radio signals far narrower in bandwidth than any natural emission, and optical lasers can concentrate light at a single, unnaturally pure frequency. I found this concept of searching for something without knowing what it will look like interesting because there are several layers of uncertainty involved. How do you measure each layer of this uncertainty?
A question I would like to ask on this point of quantifying uncertainty is: can historical data help identify extraterrestrial signals? In particular, could historical data establish a baseline range of expected "natural" signals, so that any signal deviating sufficiently from that norm would be flagged as a candidate extraterrestrial signal? It seems that as we learn more about science and collect more observations and data, the bar for identifying a signal as "extraterrestrial" rises. In fact, Tarter mentions that an extraterrestrial explanation is often the "last resort," invoked only after science and other methods fail. For example, UFO sightings were recorded throughout history because people had no explanation for them, yet today we can debunk many of them. Even if we declare a signal extraterrestrial based on our current understanding of science, could that conclusion itself be debunked as our scientific understanding improves?
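The "deviation from a historical baseline" idea above is essentially statistical anomaly detection. As a toy illustration only (the bandwidth numbers and the 3-sigma threshold are my own invented assumptions, not anything from the interview, and real SETI candidate screening is vastly more involved), one could flag observations whose measured bandwidth lies far outside the historical distribution of natural signals:

```python
import statistics

def flag_anomalies(historical, new_observations, threshold=3.0):
    """Flag observations lying more than `threshold` standard deviations
    from the historical mean (a simple z-score test). This sketches the
    'deviates from the natural norm' idea, nothing more."""
    mean = statistics.fmean(historical)
    stdev = statistics.stdev(historical)
    return [x for x in new_observations
            if abs(x - mean) / stdev > threshold]

# Hypothetical bandwidths (Hz) of previously catalogued "natural" signals.
natural_bandwidths = [1.2e6, 0.9e6, 1.5e6, 1.1e6, 1.3e6, 1.0e6, 1.4e6]

# A ~1 Hz-wide signal is anomalously narrow compared with this baseline,
# echoing Tarter's point that nature does not produce such narrow signals.
candidates = flag_anomalies(natural_bandwidths, [1.2e6, 1.0])
print(candidates)  # only the 1 Hz-wide signal is flagged
```

Of course, this sketch also exposes the worry raised above: the baseline itself shifts as we collect more data, so what counts as "anomalous" today may be reclassified as natural tomorrow.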