It’s not phrased the way I would express it, but isn’t this just utilitarianism taken to the extreme? If 70 million people die in exchange for generations of future people living lives with much less suffering and much more joy, that might be a worthy trade-off. In human terms this is (in my opinion) an unacceptable calculation, but theoretically I get it.
Perfect doesn’t exist; I don’t think anyone was claiming a perfect system could be built. The thing to consider is: ‘will this action, which requires the death of 70 million people, result in less human suffering overall?’ If 70 million people need to die for 100 billion+ future people to live amazing lives devoid of most suffering, that might be a worthwhile trade-off.
Or we find a way to achieve less human suffering without 70 million people dying. And it’s not like a beautiful future was created out of all that death and suffering in China. It’s a repressive dictatorship that has crushed human rights and is committing a genocide against an ethnic minority while constantly threatening war with its neighbors. But hey, they have fast trains and pretty neon skylines, so it was all worth it.
I mean, that’s why we send young men to die in our wars (at least our older wars, like WWII). We figure that these men’s deaths will allow far more people to live far better lives, so the trade-off makes sense. Unlike that example, though, I would never personally be OK with the 70-million-Chinese-lives trade-off. The benefits are too uncertain and the suffering of those people too immense.