Comment by BoppreH
I agree, but I also think that static analysis is a requirement for high quality. The obvious solution is (global) type inference, but I have yet to see a language that embraces it.
Yes, with the caveat that I'm assuming generics here. I'm working on a language where you can have:
def add(a, b):
    return a + b

add(1, 2)      # Works
add('a', 'b')  # Works
add(1, 'b')    # Type error!
So changing one line should never break things outside its "light cone" (call stack + reachable data structures).

There are practical limits on global type inference; e.g., sometimes when writing Haskell you won't be sure what's going on with some code that won't typecheck, and the way to deal with it is to break it down piece by piece, adding type annotations, until you see where your mental model is failing.
This only gets harder with more type inference, as the actual issue may be far away from where the type error occurs.
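A minimal Haskell sketch of that failure mode ('combine' and 'greeting' are invented names, not from the thread):

-- With no signature, GHC infers combine's type purely from its body.
combine x y = x ++ y        -- inferred: combine :: [a] -> [a] -> [a]

greeting :: String
greeting = combine "hello, " "world"

-- If the body is later edited to, say,
--     combine x y = x + y  -- now inferred: Num a => a -> a -> a
-- the definition itself still typechecks, and GHC reports something like
-- "No instance for (Num [Char])" at the use in 'greeting', far from the
-- line that actually changed.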
This is why in Haskell it is considered best practice to add type annotations to all top-level definitions, at the very least.
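Applying that practice to the same sketch, a top-level signature pins the intent down, so the mistake is reported at the definition rather than at some distant caller:

combine :: String -> String -> String   -- the signature records the intent
combine x y = x ++ y

greeting :: String
greeting = combine "hello, " "world"

-- The same bad edit, combine x y = x + y, now fails right at the
-- definition ("No instance for (Num [Char])" from the use of '+'),
-- and the caller 'greeting' is no longer where the error shows up.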
Have you seen OCaml? Its inference stops at module boundaries (though you don't need to annotate them) and at some of the more specialized type features. It also sidesteps some inference problems by design: for example, integers and floats have different operators, and there are no type-directed traits.
But in practice it's quite handy to annotate module interfaces and even individual functions.
I hear that OCaml without interface files can cause spooky-action-at-a-distance type errors[1]. Have you had experience with that?
> But in practice it's quite handy to annotate module interfaces and even individual functions.
Yes, but it also limits what information the types can represent, because of the limits of syntax and human patience. A fully inferred language could associate numbers with possible value ranges, lists with their sizes, enumerations of fixed string values, etc.
> spooky-action-at-a-distance type errors
Global type inference is literally this. Inferring types leaves gaps in the typing information; being global means the gaps can be arbitrarily large, so a type conflict could theoretically span every line of a large program.
Do you really want type inference to be global? The idea that changing one line in a function might change the type of a whole other function (and thus the variables assigned to its result, which affect the types of other functions, which ...) just seems like a lot to have to reason about.
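A toy Haskell illustration of that ripple effect, with no signatures anywhere (all names invented):

-- Every type below is inferred; nothing is pinned down.
base x = x            -- inferred: base :: a -> a
step y = base y       -- inferred: step :: a -> a
top  z = step z       -- inferred: top  :: a -> a

label :: String
label = top "ok"      -- fine as things stand

-- Change one line at the bottom of the chain:
--     base x = x + 1  -- now: Num a => a -> a
-- and the inferred types of 'step' and 'top' silently change with it,
-- so 'label' stops compiling even though neither it nor 'step' nor 'top'
-- was edited. The error surfaces at the far end of the chain.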