
It's long been observed (e.g. Emily Bender has written some articles to that effect) that NLP technology underperforms on languages that aren't English, especially when they are significantly different structurally.

If you train and evaluate something mostly on a language like English, you're going to end up with a model that assumes everything works like English, which means, among other things, very little morphological complexity.
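You can see this bias at the subword level already. A rough sketch of the effect, assuming the Hugging Face transformers library and the English-centric GPT-2 BPE vocabulary (the specific example words are just illustrative):

  from transformers import AutoTokenizer

  # GPT-2's BPE vocabulary was learned from overwhelmingly English text
  tok = AutoTokenizer.from_pretrained("gpt2")

  # English: a common word usually comes out as one or two subword tokens
  print(tok.tokenize("houses"))

  # Turkish "evlerinizden" ("from your houses") packs the same meaning into
  # one morphologically complex word; an English-trained vocabulary tends to
  # shatter it into many short pieces that don't line up with its morphemes
  print(tok.tokenize("evlerinizden"))

The model then has to reassemble the morphology from those fragments, which it rarely sees enough data to do well.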


