This Isn't Science Fiction — It's Happening Right Now
In 2015, a group of scientists and other public figures published an open statement calling for a moratorium on deliberate transmissions into space until a worldwide consensus is reached. Among the signatories were Elon Musk and Stephen Hawking. Hawking was blunt: actively contacting extraterrestrial civilizations could lead to human extinction. He compared it to Native Americans encountering Columbus, and we all know how that "contact" ended.
On the other side, astronomer Douglas Vakoch and his organization METI International argue that humanity has already been leaking its presence through radar signals, television broadcasts, and radio waves for decades. Silence is futile. Instead of waiting passively, we should transmit deliberately — send carefully designed signals demonstrating our civilization's goodwill.
The debate remains unresolved. No international law prevents anyone from pressing the send button. Any radio telescope on Earth could, in theory, shout into the cosmos without the consent of a single other human being.
If you've read the Three-Body trilogy, your blood should be running cold right now.
Because this has already happened once.
Ye Wenjie's Choice: A Desperate Woman's Enter Key
Red Coast Base. Ye Wenjie receives a warning signal from the Trisolaran world: "Do not answer! Do not answer! Do not answer!"
The Trisolaran who sent that message risked their life to deliver it. The meaning was unmistakable: once your coordinates are exposed, your civilization is finished.
Ye Wenjie replied.
Not because she didn't understand the risk. Not because she was insane. But because she had lost all faith in human civilization. The Cultural Revolution had destroyed her family, her beliefs, her last shred of trust in humanity. Her reply wasn't a scientific decision — it was a verdict. She judged all of humanity and sentenced us to death.
This is Liu Cixin's most brutal answer to the METI debate: the decision to reply won't be made by rational calculation, but by the psychological state of whoever has their finger on the button. Institutions, laws, international consensus — all of it fails in the face of one person's despair.
In reality, we have two-key systems to prevent unauthorized nuclear launches. But we have zero mechanisms to stop someone at an observatory, on some quiet night, from sending a message into the void.
The Dark Forest Says: Shut Up or Die
Luo Ji's Dark Forest theory delivers the ultimate answer from cosmic sociology: in a universe of limited resources and opaque information, any civilization that reveals its location will be destroyed. Not because the other party is evil, but because the cost of eliminating a potential threat is always lower than the cost of determining whether they're friendly.
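That logic can be written down as a one-line expected-cost comparison. Here is a minimal sketch in Python; every number in it (strike cost, verification cost, the odds of hostility, the price of guessing wrong) is a placeholder I invented for illustration, not anything taken from the novels.

```python
# Hedged toy model of the Dark Forest cost asymmetry.
# All numbers are made-up placeholders for illustration only.

COST_OF_STRIKE = 1.0          # annihilation is cheap for an advanced civilization
COST_OF_VERIFICATION = 50.0   # centuries of observation, probes, diplomacy
P_HOSTILE = 0.1               # assumed chance the other civilization is a threat
COST_IF_WRONG = 1_000_000.0   # being destroyed yourself: effectively unbounded

# Option A: strike first, no questions asked.
expected_cost_strike = COST_OF_STRIKE

# Option B: try to verify intent, and eat the existential loss if you misjudge.
expected_cost_verify = COST_OF_VERIFICATION + P_HOSTILE * COST_IF_WRONG

print(f"strike:  {expected_cost_strike:>12.1f}")
print(f"verify:  {expected_cost_verify:>12.1f}")
# As long as COST_IF_WRONG dwarfs everything else, even a tiny P_HOSTILE
# makes striking the "rational" choice -- which is the whole horror of it.
```

The exact numbers don't matter; the asymmetry does. As long as being wrong is existential and striking is cheap, verification never competes.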
The Singer civilization erased the solar system with a single two-dimensional foil. It wasn't war. It wasn't even hostility — just cleanup. Like swatting a mosquito without a second thought.
By this logic, every METI advocate is a lunatic. Every deliberate transmission is a death warrant signed on behalf of all humanity.
But here's the problem: Liu Cixin himself undermined this theory in the third book.
The Returners, a pan-universal super-civilization alliance, openly broadcast across the Dark Forest, calling on all civilizations to return the mass they had taken from the main universe into their pocket universes, so that the cosmos could one day collapse and be reborn. They weren't afraid of exposing their position. Their very existence proves that chains of suspicion are not eternal, and that inter-civilizational cooperation becomes possible at higher stages of development.
Trisolarans and humans experienced half a century of cultural exchange during the Deterrence Era. Trisolarans learned literature and emotional expression. The Dark Forest theory says trust between civilizations is impossible? Then what was that?
The trilogy's real message isn't "stay silent forever." It's this: the Dark Forest is real, but it's not permanent. It's a phase of civilizational evolution, not the destination.
Silence Itself Is the Dark Forest
Let me ask a sharper question: if we stay silent forever, how are we any different from the suspicious civilizations hiding in the Dark Forest?
We choose not to reply because reason tells us the risk is too great. But this is exactly how the chain of suspicion operates — I don't know if you're friendly, you don't know if I'm friendly, so we both stay quiet. And the silence itself reinforces the suspicion.
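That feedback loop can be sketched as a toy game. The payoffs below are invented for illustration (nothing here comes from the book): broadcasting only pays off if the other side answers in good faith, while silence costs nothing either way, so silence wins under almost any belief about the other player, and the loop never breaks.

```python
# Toy chain-of-suspicion model: two civilizations, each chooses
# BROADCAST or SILENCE without knowing the other's intent.
# Payoffs are invented placeholders, not canon.

PAYOFFS = {
    # (my move, their move): my payoff
    ("broadcast", "broadcast"): 10,   # contact, trade, shared knowledge
    ("broadcast", "silence"):  -100,  # I exposed my coordinates for nothing
    ("silence",   "broadcast"): 0,    # I stay hidden and learn a little
    ("silence",   "silence"):   0,    # the Great Silence continues
}

def expected_payoff(my_move: str, p_they_broadcast: float) -> float:
    """My expected payoff given my belief about the other side."""
    return (p_they_broadcast * PAYOFFS[(my_move, "broadcast")]
            + (1 - p_they_broadcast) * PAYOFFS[(my_move, "silence")])

for belief in (0.9, 0.5, 0.1):
    b = expected_payoff("broadcast", belief)
    s = expected_payoff("silence", belief)
    print(f"P(they broadcast)={belief:.1f}  broadcast={b:7.1f}  silence={s:7.1f}")

# Even at 90% confidence that the other side will answer in good faith,
# silence still comes out ahead -- and my silence teaches them the same lesson.
```

Run it and silence dominates at every belief level. Each civilization's "rational" quiet becomes the next civilization's evidence that the forest is dark.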
The Dark Forest isn't the natural state of the universe. It's a product of fear. Every civilization that chooses silence adds another brick to the Dark Forest's walls.
The Fermi Paradox asks: "Where is everyone?" Maybe the answer is this: every civilization read their own version of The Three-Body Problem and scared themselves into silence. The universe is quiet not because no one's out there, but because everyone is afraid.
If that's the answer to the Great Silence, it's more tragic than any alien invasion.
We Should Reply — Not Because It's Safe, But Because Silence Is Certain Death
My conclusion might make you uncomfortable: we should reply.
Not because replying is without risk. The risk is real. If the Dark Forest theory has even a one percent chance of being correct, replying means gambling the fate of our entire civilization.
But consider the alternative: what happens to a civilization that hides forever?
The sun has a finite lifespan. Resources will run out. If we maintain eternal silence and never make contact with other civilizations, we will never access the breakthrough knowledge and technologies that contact could bring. We'll be like someone locked in a room, eventually dying of the room's own limits: killed not by the monsters outside but by our own fear.
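One last hedged sketch makes the wager explicit. The horizons and probabilities below are placeholders I'm choosing for illustration, not predictions: silence caps our future at the lifetime of our own star, while contact trades a real chance of annihilation for a shot at an open-ended future.

```python
# Toy comparison of long-run survival horizons.
# All figures are invented placeholders, not forecasts.

SOLAR_LIFESPAN_YEARS = 5e9        # silence can't outlive the home star
P_DESTROYED_IF_WE_REPLY = 0.5     # assumed Dark Forest risk, deliberately pessimistic
HORIZON_IF_CONTACT_WORKS = 1e12   # interstellar civilization: effectively open-ended

expected_years_silent = SOLAR_LIFESPAN_YEARS
expected_years_reply = (1 - P_DESTROYED_IF_WE_REPLY) * HORIZON_IF_CONTACT_WORKS

print(f"stay silent: ~{expected_years_silent:.1e} years, then a certain end")
print(f"reply:       ~{expected_years_reply:.1e} expected years")
# Even at coin-flip odds of annihilation, the expected horizon of contact
# dwarfs the guaranteed ceiling of hiding. That's the essay's wager.
```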
Liu Cixin returns to one theme throughout the trilogy: survival is a civilization's first need, but mere survival is not enough. Cheng Xin chose kindness and lost everything. Wade chose survival at all costs and was executed. Zhang Beihai chose to flee and created a new civilization. Every choice carries a price, but only those who dared to act left a mark.
The civilization that survives the longest in the universe won't be the quietest one. It'll be the bravest.
So if SETI receives a signal tomorrow, my answer is: reply. But reply with clear eyes. Know that this may be the most dangerous decision humanity ever makes. Know the lesson of Ye Wenjie — never let a desperate person make this choice for all of us.
Let this decision be made by human civilization at its most lucid and most courageous, collectively.
This isn't a gamble. It's evolution.