
He was right in a sense: the '00s and '10s were the era of the dynamic languages, in momentum if never quite in absolute usage.

But my sense is that we're gradually swinging back around to static typing, via gradual typing plus the mainstreaming of ML-style features like generics, unions, and records that are more expressive than the C/Java model.



Dynamic languages compare well against languages with poor 'static' type systems imho. In both paradigms, null is still the biggest source of pain.

In part, I think the move to mainstream more ML-style features is an attempt to finally address this.


I think instead you see both dynamic and static languages borrowing ideas from each other. It is not without reason that Go is succeeding: it makes writing in a statically typed language feel a lot like writing in a dynamic one.

Meanwhile Julia gives dynamic languages some of the same feel as static ones.

Both dynamic and static languages have benefits, so I don’t find it weird that they borrow from each other.

Personally I prefer opting in to use static type checking over opting out.


Don't you lose a lot of the benefits of static type checking if you don't use it systemically?


Julia strikes a nice middle ground. Functions that don't specify the types of their arguments are implicitly generic, and are JIT-compiled when called, based on the actual argument types. So even if a function has no type annotations, calling it with arguments of unsupported types produces an error at a level further down, when some nested function call has no implementation available for those argument types.
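A minimal sketch of the idea (g is just a made-up example function):

    # g has no type annotations, so it is implicitly generic.
    g(x) = 2x + 1

    g(3)        # 7    -- a specialisation for Int is compiled on first call
    g(2.5)      # 6.0  -- another specialisation, this time for Float64
    g("oops")   # MethodError: no method matching *(::Int64, ::String)
                #   -- the failure surfaces in the nested call to *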

The error messages are not as nice as if the types were specified at the top level, but there's usually still enough static type checking to know whether the code will run or not before it runs, which is one of the big benefits of static type checking. The other benefit of static types is performance due to compile time optimisations, which Julia can benefit from if the functions are type-stable.


Sounds like an interesting approach. I'm actually working on a language as a hobby; I've been experimenting with implicit generics, and it takes you pretty far.

Do you have type constraints for generic arguments in Julia? I.e. when you're looking at the signature, does it ensure that the operations performed on values inside are going to work?


You can put arbitrary type-level restrictions on any argument in a function's signature. You can also interleave runtime with compile time to restrict things based on value. Julia's type system is fully parametric and allows values in the parameters.
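For a flavour of both (norm1 and StaticVec are made-up names, not from any library):

    # A type-level restriction via a where clause: only Real element types allowed.
    norm1(v::Vector{T}) where {T<:Real} = sum(abs, v)

    # Values can appear as type parameters too, e.g. a compile-time length N.
    struct StaticVec{N, T}
        data::NTuple{N, T}
    end
    Base.length(::StaticVec{N}) where {N} = N   # N comes from the type, not the data

    v = StaticVec((1, 2, 3))   # inferred as StaticVec{3, Int64}
    length(v)                  # 3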

If I write

    f(x :: Number) = x + 1   # fallback for any Number
    f(x :: String) = x * x   # * on strings is concatenation in Julia
    f(x :: Int)    = x - 1   # more specific than the Number method
this defines 3 separate methods for the function f. The Int method is actually more specific than the Number method because Int <: Number, so we get the following behaviour:

    julia> f(1.0)
    2.0

    julia> f("hi ")
    "hi hi "

    julia> f(2)
    1
Regarding this part of your question

> when you're looking at the signature, does it ensure that the operations performed on values inside are going to work?

I guess the answer depends on what you mean. Julia's interfaces are somewhat implicit, and there are no ahead-of-time checks by default, so if I define my own subtype of Number that has no method for addition, then f will error on that type.

However, the static-analysis package JET.jl can easily detect, statically, that this will happen before the code runs.
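A rough sketch of both points (MyNum is a made-up type; this assumes JET's @report_call entry point):

    # A Number subtype with no arithmetic defined on it.
    struct MyNum <: Number
        val::Int
    end

    f(x :: Number) = x + 1      # same Number method as above

    f(MyNum(3))                 # fails at runtime: there is no usable + for MyNum

    # JET.jl can flag the same problem without running the code:
    using JET
    @report_call f(MyNum(3))    # statically reports the missing method in the nested call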


Not quite. The main problem with gradual typing is performance: checking and converting values at every boundary or interaction between statically and dynamically typed code introduces a lot of overhead.
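Roughly, the cost looks like this (a conceptual sketch, in Julia syntax, of the kind of check a gradually typed runtime has to insert at such a boundary; the names are made up):

    # Validate/convert an untyped value before it enters typed code.
    boundary_check(x) = convert(Vector{Int}, x)

    typed_sum(v::Vector{Int}) = sum(v)

    # The dynamic side pays the check on every crossing; for containers the
    # check/conversion can even be O(n), which is where the overhead comes from.
    sum_from_dynamic(x) = typed_sum(boundary_check(x))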



