It's good etiquette, for one, and encouraging good etiquette (both on the parts of website operators and website requestors) is a good thing.
As a website operator, I've actually increased ratelimits for a particular crawler on a service I ran, where limits were normally much more stringent, just because its user agent made it easy to identify the people crawling and I liked what they were doing.
I know some web services effectively require you not to lie about your user agent (this applies more to APIs): they'll block or severely ratelimit user agents that look browser-like or are generic ones such as "requests".
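As a sketch of what a well-behaved client might send, here's a descriptive User-Agent built with Python's standard library; the crawler name, URL, and contact address are all placeholders, not anything from a real project:

```python
from urllib.request import Request

# A descriptive User-Agent: name/version plus a URL and contact address so
# operators can identify the crawler and reach its owner. All placeholder values.
ua = "ExampleCrawler/1.0 (+https://example.com/bot; ops@example.com)"
req = Request("https://example.com/data", headers={"User-Agent": ua})

# urllib normalizes header names to "User-agent" internally.
print(req.get_header("User-agent"))
```

An identifiable string like this is exactly what lets an operator single out a crawler and, as above, loosen (or tighten) limits for it specifically.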