Why is it that every time I read an article about modern British politics and culture, I walk away feeling almost happier about the state of the US?
It's almost like God, while planning the course of current events, looked at the inhabitants of the US recovering from 4 years of Trumpian misrule and decided to console them by having the original English-speaking developed country end up even worse off than they were.
(I'm kidding, of course. God doesn't actually exist, and I know there are plenty of places far worse off than the US or the UK. I'm also not actually this American-centric in my views. My point is that the UK somehow seems worse off these days than the US is, and that's a sentence I never thought I'd say ten years ago.)