>>why would I give up getting the computer to automatically check for me at compile time that I'm passing the right number of parameters to a function
I never understood this: how is it possible that a developer calling a function doesn't test whether they're calling it right? Once they've tested it, what does the additional compiler check add?
Unless, of course, your calling function mutates the value of a variable across so many types that it's impossible to realistically test it. Then it would make sense to use a type to prevent the variable from becoming something it shouldn't be.
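A minimal sketch of that case (the names here are hypothetical): an annotation pins the variable to one type, so a static checker such as mypy can enforce it across every call site instead of you having to test each one.

```python
# Hypothetical example: the annotation pins user_id to int, so a
# checker like mypy flags any assignment that changes its type.
user_id: int = 42

def load_user(user_id: int) -> dict:
    # Hypothetical lookup; returns a stub record for illustration.
    return {"id": user_id}

# A static checker would reject this before the code ever runs:
#   user_id = "42"   # error: incompatible type "str", expected "int"
record = load_user(user_id)
```

At runtime Python ignores the annotations; the enforcement comes entirely from the checker or the IDE.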
But most people don't have that situation most of the time, even in a big application.
Maybe that explains why something like Python has such wide adoption.
How many ways can a single function be called? It may be called from a loop, with a quickly changing variable whose type shifts here or there (especially with dynamic typing), and you really can't just say "I tested it with this one input and it worked".
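A minimal Python sketch of that failure mode (the function and data are made up for illustration): the loop works for the values you tested, then a differently typed value sneaks in and only fails at runtime.

```python
def add_tax(price, rate=0.1):
    # Works for numbers; fails at runtime if price is a string.
    return price + price * rate

items = [10, 20.5, "30"]  # the "30" sneaks in, e.g. from unparsed input
results = []
for item in items:
    try:
        results.append(add_tax(item))
    except TypeError:
        # Only discovered at runtime, and only on this particular input.
        results.append(None)
```

With an annotation like `def add_tax(price: float, ...)`, a static checker would flag the `"30"` call path before the program ever ran.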
The main advantage of typing, for me, is the IDE showing errors as you type and better autocomplete; types also serve as documentation (of course you should read the docs, but types are a helpful quick reminder).
Maybe I am just more absent-minded than most dynamic-language programmers, but live IDE diagnostics and autocomplete are very important for me.
It's not as accurate or satisfying in my experience. For example, I get errors in Python only after running the code, errors that a statically typed language would surface in the IDE as I type. Autocomplete is also less than optimal: in my experience, IntelliJ IDEs > LSPs for typed languages > untyped languages.
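For example (the function here is made up), an annotated call site is checked as you type, while unannotated Python only fails when the bad call actually executes:

```python
def greet(name: str) -> str:
    # With the annotation, an IDE or mypy flags a bad call while typing:
    #   greet(42)  ->  error: Argument 1 to "greet" has incompatible type "int"
    # Without it, greet(42) raises AttributeError only at runtime.
    return "Hello, " + name.title()

print(greet("ada"))  # prints "Hello, Ada"
```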