“What actually transpires beneath the veil of an event horizon? Decent people shouldn’t think too much about that.”
— Academician Prokhor Zakharov, “For I Have Tasted The Fruit”
The video for the Singularity Inductor invites the player to continue the train of thought that Miriam began in the quote for the Controlled Singularity technology: why would a perfect God create a universe at all? But instead of popping the stack to consider Miriam’s plight in the broader context of her being a character in a video game, it encourages the player to ask what happens if we push the stack one meta-level deeper.
It has long been a fairly popular cosmological theory that our universe might be inside a black hole. As such, it’s a sci-fi theme that the player is expected to have some rough familiarity with. The logical extension of this theory is that black holes in our universe might contain entire sub-universes of their own.
Now recall that this project represents the ability for people to create and manipulate a persistent black hole. In this light, then, Zakharov has constructed an entire universe. He is literally the creator-god of its inhabitants. And he brought forth their universe in a great Big Bang for industrial purposes. In essence, eons of pain, suffering, and evil were brought into being as a side effect of getting a free Quantum Converter at every base.
This line of logic almost assuredly applies to the Singularity Laser weapon and the Singularity Engine as well. Which means that Zakharov’s University is building entire universes on an assembly-line basis. And he’s doing it all so that his Gravships can more efficiently roast Gaian bases at the end of the game.
He dismisses any concerns like these with what amounts to a verbal shrug. We already know that he has a refined philosophical disdain for what others might consider pressing moral concerns. For him, the fierce moral urgency is located entirely in the quest to most quickly find the best answers to entirely practical questions. What is this phenomenon? How is it best described? What possible use does it have?
So Zakharov would clearly maintain that it just doesn’t matter what scale of horrors might take place beneath the event horizon. It’s no different in principle than the old Bioenhancement Centers, in which row upon row of brains were grown in jars to develop better direct brain implants. And generations of people have lived and died on Planet in the time it took to progress through the eleven tiers of the technology tree since then.
So the objective Ethical Calculus does not necessarily give strong weight to the well-being of sentient beings in general. Nor does it seem to require any particular concern for any being who exists on a meta-level distinct from one’s own. If a simulator need have no concern for those whom he simulates, it follows that a god need not have any care for the universe he brings to life.
If you’re a decent person, Zakharov grants that the thought of all that suffering probably bothers you. He’d say it’s an irrational preference, of course. But he’ll generously grant that it’s not necessarily altogether unworthy. So his advice to someone plagued by such an overabundance of conscience is to just try not to think about it too much. Not when there’s still so much science to be done.