Another likely factor is that more and more sites are blocking AI crawlers, e.g. through Cloudflare's support for blocking AI crawlers and AI agents. A widely-used browser will give them a backdoor to that content through a user's connection.
I am not sure if this is happening, but as blocking becomes more prevalent, having a widely-used browser will help.
"Oops, we got caught using our customers' internet connections as exit nodes for the largest residential proxy ever to exist, both on pages they visited and ones that they didn't. But don't worry, this was an unauthorized experimental rollout to only parts of the world that we don't have legal nexus in. The program has been halted, and the person responsible has been sacked. Mynd you, møøse bites Kan be pretty nasti..."
It can (and likely will) just transmit standard browser signals. The AI integration is more of a UI layer on top, not something that gets advertised in the User-Agent header.
That lack of distinguishing signals, combined with the genuine human behavior patterns that something like Puppeteer can't replicate, is going to make this practically impossible to block.
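To illustrate the point: a lot of blocking today amounts to matching self-identifying tokens in the User-Agent string. Here's a minimal sketch of that kind of filter (the token list is illustrative, not exhaustive) — a dedicated crawler that announces itself gets caught, while an AI-integrated browser sending an ordinary Chrome UA sails straight through:

```python
# Naive UA-based filter of the sort many sites deploy.
# Token list is illustrative only, not a complete inventory of AI crawlers.
AI_CRAWLER_TOKENS = ("GPTBot", "ClaudeBot", "CCBot", "PerplexityBot")

def is_blocked(user_agent: str) -> bool:
    """Return True if the UA string self-identifies as a known AI crawler."""
    ua = user_agent.lower()
    return any(token.lower() in ua for token in AI_CRAWLER_TOKENS)

# A crawler that announces itself is trivially blocked...
print(is_blocked("Mozilla/5.0; compatible; GPTBot/1.2"))  # True

# ...but a browser sending a stock Chrome UA string is indistinguishable
# from a regular user at this layer.
print(is_blocked(
    "Mozilla/5.0 (Windows NT 10.0; Win64; x64) AppleWebKit/537.36 "
    "(KHTML, like Gecko) Chrome/127.0.0.0 Safari/537.36"
))  # False
```

Which is exactly why detection has to fall back on behavioral and fingerprinting heuristics — and those don't help when the traffic really is coming from a normal browser on a residential connection.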