The "end game" from the user's point of view would be if someone indeed came up with "content selectors". It would be quite an achievement to build one that was actually useful.
Maybe an easy version could work exactly the same way as blockers do, except they'd invert the way the rules are evaluated, and of course the rules would need to be custom as well. And then if the rules failed to find some actual content on the page, I guess you might not even know about it :).
That closely describes uMatrix in default-deny mode. For each individual site and subdomain, the user gets to choose whether to allow cookies, images, CSS, JS, XHR, and frames. Too bad it's unmaintained, with most of its practical and commonly intended use cases covered by the simpler uBlock Origin.
Perhaps I misunderstood. But let's say I do run uMatrix in deny mode: I get the site's HTML as Tim Berners-Lee intended. Yet that may already contain some kind of advertising material, so to view only the actual content I'd need e.g. an XPath expression to pick it out. So I would have a database of sites and XPaths (or other selectors) describing how to find the content, instead of describing how to get rid of non-content.
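A minimal sketch of that database idea, with made-up names (`CONTENT_RULES`, `extract_content`) and the limited XPath subset that Python's stdlib `xml.etree` supports, just to illustrate the inversion:

```python
import xml.etree.ElementTree as ET

# Hypothetical rule database: hostname -> XPath pointing at the actual
# content, rather than at the things to strip out.
CONTENT_RULES = {
    "example.com": ".//div[@id='article']",
}

def extract_content(hostname, html):
    """Return only the content nodes; None means 'no rule' or 'no match'."""
    rule = CONTENT_RULES.get(hostname)
    if rule is None:
        return None
    root = ET.fromstring(html)
    matches = root.findall(rule)
    # A failed match returns None explicitly, so you at least *can* notice
    # that the rule silently found nothing on this page.
    return matches or None

page = ("<html><body><div id='ads'>buy!</div>"
        "<div id='article'>text</div></body></html>")
nodes = extract_content("example.com", page)
print([n.text for n in nodes])  # only the article div survives
```

The interesting failure mode from the comment above shows up directly: when the site changes its markup, `findall` quietly returns an empty list, and without the explicit `None` signal you'd never know content was dropped.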
Can I tell uMatrix to show only a certain XPath from the page? If so, I agree it is a content selector, albeit a pretty impractical one :).
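uMatrix can't do that, but uBlock Origin's cosmetic filters can roughly approximate the inversion with plain CSS, hiding everything except an assumed content container (the site and `#article` selector here are made-up examples, not a real rule):

```
! Hypothetical filter: hide every direct child of <body> on example.com
! except the assumed content container #article
example.com##body > :not(#article)
```

It's still subtractive under the hood (hide non-matches rather than extract matches), so it inherits the same blind spot: if `#article` stops existing, the page just goes blank.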
Except you'd still have client-side execution in that case. Obviously not everyone cares, but there are some straightforward objective reasons to care if you're blocking it from being rendered anyway, like: why waste cycles on it?