I'm not really commenting on that; I'm saying the practice works in my favor as an interviewee.
However, I do think it's a good way to filter candidates. I should clarify that what I'm talking about is fairly basic programming tasks, not very hard LeetCode-style DSA tasks. I've never been given an actually hard task in an interview; they've all been fairly simple ones, like writing a bracket tax calculator, or writing a class that stores car objects and can look them up by plate number. I also helped a friend with a take-home where we fetched some data from SpaceX's API and displayed it in an HTML table.
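To give a sense of the level I mean, here's a minimal sketch of the car-storage exercise (the class name and fields are my own, just for illustration, not from any specific interview):

```typescript
// Minimal sketch of the "store cars, look them up by plate" exercise.
// Names and fields are illustrative.
interface Car {
  plateNumber: string;
  make: string;
  model: string;
}

class CarRegistry {
  // Keyed by plate number so lookups are O(1).
  private cars = new Map<string, Car>();

  add(car: Car): void {
    this.cars.set(car.plateNumber, car);
  }

  // Returns the car with the given plate, or undefined if it isn't registered.
  getByPlate(plateNumber: string): Car | undefined {
    return this.cars.get(plateNumber);
  }
}

// Usage:
const registry = new CarRegistry();
registry.add({ plateNumber: "AB12345", make: "Volvo", model: "V70" });
console.log(registry.getByPlate("AB12345")?.model); // "V70"
```

That's roughly the whole exercise: pick a sensible data structure and wrap it in a small, readable class.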
Every time I do these, people act like I'm Jesus for solving a relatively simple task. Meanwhile, I'm just shocked that this is something my peers struggle with. I would honestly have expected any decent dev to handle these with roughly the same proficiency as me, but it turns out almost nobody can.
That's why I think it's a good way to test candidates. If you're going to work as a programmer, you should be able to solve these types of tasks. I don't care if you're frontend, backend, finance, healthcare, data science, or whatever kind of programming you normally do; you should be able to do these kinds of things.
If someone can't, then in my judgement they don't really know programming. They may have figured out some way to get things done anyway, but I bet the quality of their work reflects their lack of understanding. I've seen a lot of code written by these kinds of developers, and it's very clear that many of them really don't understand the code they're writing. It's honestly shocking how bad most "professional software developers" are at writing simple code.