To me, the nonlinearity is the key here. Residue is already off the cliff of undecidability, so meaningful chunks are quite hard to take out of it. Whereas simpler type systems impose dramatically less load than more-complex ones (which themselves often fall off the same cliff).
I find it interesting that Rust gets so much mileage out of a programming model that lacks HKTs entirely and really eschews existentials.
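(A rough illustration of the point, with names of my own invention: Rust can't abstract over a type constructor like `F<_>`, but trait objects give it a limited, existential-flavored escape hatch by erasing the concrete type behind a vtable. A minimal sketch:)

```rust
use std::f64::consts::PI;

// Hypothetical trait for illustration; any object-safe trait works.
trait Shape {
    fn area(&self) -> f64;
}

struct Square(f64);
struct Circle(f64);

impl Shape for Square {
    fn area(&self) -> f64 { self.0 * self.0 }
}
impl Shape for Circle {
    fn area(&self) -> f64 { PI * self.0 * self.0 }
}

fn main() {
    // Each element's concrete type "exists" but is erased at the use site:
    // `Box<dyn Shape>` hides it behind a vtable, roughly an existential package.
    let shapes: Vec<Box<dyn Shape>> =
        vec![Box::new(Square(2.0)), Box::new(Circle(1.0))];
    let total: f64 = shapes.iter().map(|s| s.area()).sum();
    assert!((total - (4.0 + PI)).abs() < 1e-9);
}
```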
-
-
I think this is … somewhat predictable? I'm no expert, but given the quantity of stuff people have been able to encode tolerably well in System F (plus or minus some sugar/matching support), I'm a bit skeptical about the practical need for points between there and full DT.
-
Also Rust is a lot more than System F, even without higher kinds or first class polymorphism.
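(One concrete way it's "more": ownership makes the term language substructural in a way plain System F's isn't. A moved-from value can't be reused, which no System F typing rule enforces. A tiny sketch:)

```rust
// Consuming a Vec by value: the caller gives up ownership.
fn consume(v: Vec<i32>) -> usize {
    v.len()
}

fn main() {
    let v = vec![1, 2, 3];
    let n = consume(v);
    // consume(v); // error[E0382]: use of moved value: `v` — affine usage enforced
    assert_eq!(n, 3);
}
```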
-
Formally, I'm quite curious about that! Rust's types went beyond my intentions, but it's not at all clear to me how much "more" it is, vs. a bunch of encoding sugar (e.g. traits) that's still formally just System F? Here my "not an expert" hat is to be very prominently exhibited.
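(To make the "traits as encoding sugar" reading concrete, here's a hand-rolled sketch in the style of dictionary passing — names like `MonoidDict` are mine, and real Rust uses monomorphization/vtables rather than explicit records, but the desugared shape is an ordinary System F term: a function universally quantified over `T`, taking the operations as a plain value.)

```rust
// Hypothetical explicit "dictionary" standing in for a `Monoid` trait bound.
struct MonoidDict<T> {
    unit: T,
    op: fn(T, T) -> T,
}

// Roughly: forall T. MonoidDict<T> -> [T] -> T — no trait machinery,
// just universal quantification plus an ordinary argument.
fn fold_monoid<T: Copy>(d: &MonoidDict<T>, xs: &[T]) -> T {
    xs.iter().fold(d.unit, |a, &b| (d.op)(a, b))
}

fn main() {
    let sum = MonoidDict { unit: 0i64, op: |a, b| a + b };
    assert_eq!(fold_monoid(&sum, &[1, 2, 3, 4]), 10);
}
```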
-
(Like I was re-reading "F-ing modules" last night and daydreaming about alternative timelines in which it had been published _before_ we abandoned the module-centric design in Rust)