One of the absurd things about longtermism is the assumption that, no matter how high the quantity of humans and no matter how contrived the method of increasing that quantity, the proposed system will result in people living lives worth living, so simply increasing the number of people is a net good. It's entirely plausible that such a system also creates unimaginably high numbers of unhappy people, and that its complete earth surveillance program simply eliminates them before they can cause a rebellion and destroy the wet dream of 10^58 people living in a tiny room with a permanently attached Matrix-like VR setup.
I think there’s a straw man somewhere here. It’s not like utility monsters/demons and the “mere addition paradox” (AKA the repugnant conclusion) aren’t already well-known problems in the philosophical study of utilitarianism.