Caleb Scharf: It may be that the immensity of the SETI challenge lies not just in the scale of the datasets (and in acquiring them in the first place) but also in the open-endedness of the questions. Once we allow for the fact that we may not know how signals are encoded (whether in radio or optical light, or otherwise), the game quickly becomes very, very expensive. For example, deep-learning models built on Google's "Transformer" architecture can be trained across thousands of processors, with hundreds of millions of parameters. But the energy costs are staggering: a single training run can rack up a carbon footprint equivalent to hundreds of thousands of pounds of carbon dioxide. What's the answer? It could lie in quantum computing.