Conversation

Breaks down on more complicated expressions. I then tried an explicit substitution approach, like in the PiSigma paper, but then my normal forms weren't fully reduced, so I needed a more interesting conversion rule.
I then wondered, "how do I fully reduce my 'normalised terms' into 'normal forms', as opposed to 'weak head normal forms'?" It seemed like I had a gap in my understanding, and things were breaking down.
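(For a concrete picture of that gap, here's a minimal sketch in Rust, with my own names and a plain de Bruijn term representation rather than anything from this thread: `whnf` only reduces the head and never looks inside a lambda, while `normalize` also reduces under binders, which is the extra step that takes a weak head normal form to *the* normal form.)

```rust
// Minimal untyped lambda terms with de Bruijn indices (illustrative only).
#[derive(Clone, Debug, PartialEq)]
enum Term {
    Var(usize),                 // de Bruijn index
    Lam(Box<Term>),             // \. body
    App(Box<Term>, Box<Term>),  // function application
}

// Shift free variables at or above `cutoff` by `d` (standard de Bruijn machinery).
fn shift(t: &Term, d: isize, cutoff: usize) -> Term {
    match t {
        Term::Var(i) if *i >= cutoff => Term::Var((*i as isize + d) as usize),
        Term::Var(i) => Term::Var(*i),
        Term::Lam(b) => Term::Lam(Box::new(shift(b, d, cutoff + 1))),
        Term::App(f, a) => Term::App(Box::new(shift(f, d, cutoff)), Box::new(shift(a, d, cutoff))),
    }
}

// Substitute `s` for variable `j` in `t`.
fn subst(t: &Term, j: usize, s: &Term) -> Term {
    match t {
        Term::Var(i) if *i == j => s.clone(),
        Term::Var(i) => Term::Var(*i),
        Term::Lam(b) => Term::Lam(Box::new(subst(b, j + 1, &shift(s, 1, 0)))),
        Term::App(f, a) => Term::App(Box::new(subst(f, j, s)), Box::new(subst(a, j, s))),
    }
}

// Beta-reduce a top-level redex: (\. b) a  =>  b[0 := a].
fn beta(b: &Term, a: &Term) -> Term {
    shift(&subst(b, 0, &shift(a, 1, 0)), -1, 0)
}

// Weak head normal form: reduce the head only, never go under a lambda.
fn whnf(t: &Term) -> Term {
    match t {
        Term::App(f, a) => match whnf(f) {
            Term::Lam(b) => whnf(&beta(&b, a)),
            head => Term::App(Box::new(head), a.clone()),
        },
        _ => t.clone(),
    }
}

// Normal form: take the WHNF, then keep normalizing under binders and in arguments.
fn normalize(t: &Term) -> Term {
    match whnf(t) {
        Term::Lam(b) => Term::Lam(Box::new(normalize(&b))),
        Term::App(f, a) => Term::App(Box::new(normalize(&f)), Box::new(normalize(&a))),
        other => other,
    }
}

fn main() {
    // \x. (\y. y) x  -- already in WHNF, but not in normal form
    let t = Term::Lam(Box::new(Term::App(
        Box::new(Term::Lam(Box::new(Term::Var(0)))),
        Box::new(Term::Var(0)),
    )));
    assert_eq!(whnf(&t), t);                                      // whnf stops at the lambda
    assert_eq!(normalize(&t), Term::Lam(Box::new(Term::Var(0)))); // normalize reduces under it
    println!("ok");
}
```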
Note that I'm also trying to avoid HOAS, as is commonly used in Haskell/ML implementations, because Rust isn't as good at that style. (Not sure, but it might lead to Rc cycles that wouldn't be cleaned up.)
This Tweet was deleted by the Tweet author.
Maybe I'm getting mixed up between 'environments' and 'contexts' too - I was attempting to use the typechecking context to do my substitutions, but it's obvious now that that would break down in more complex examples. In HOAS you get an environment for free.
AFAIU: context = name -> type, env = name -> value. And the difference between "normalization" and "evaluation" is more or less that normalization goes under binders and "fully evaluates" (to "*the* normal form"); mere "evaluation" doesn't (necessarily).
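(A tiny sketch of that split in Rust types; `Type`, `Value`, `Context`, and `Env` are placeholder names of mine, not anything from the thread.)

```rust
use std::collections::HashMap;

// Placeholder payload types, just to pin down the shapes.
#[derive(Clone)]
enum Type { Unit }

#[derive(Clone)]
enum Value { Unit }

// context = name -> type: what the typechecker consults.
type Context = HashMap<String, Type>;

// env = name -> value: what the evaluator/normalizer consults.
type Env = HashMap<String, Value>;

fn main() {
    let mut ctx: Context = HashMap::new();
    ctx.insert("x".to_string(), Type::Unit);

    let mut env: Env = HashMap::new();
    env.insert("x".to_string(), Value::Unit);

    println!("{} name(s) in the context, {} in the env", ctx.len(), env.len());
}
```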
(I'm also not -certain- HOAS couldn't work in Rust - do you _need_ to use Rc everywhere? Can't you use &Fn during the normalization process? I might experiment a little...)
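(For what it's worth, a HOAS-flavoured sketch along those lines does seem to fit in Rust with `Rc<dyn Fn(Term) -> Term>` for binder bodies; the names here are mine, and this says nothing about the Rc-cycle worry from upthread, it's just the shape of the experiment.)

```rust
use std::rc::Rc;

// HOAS-style terms: a lambda's body is a host-language closure.
#[derive(Clone)]
enum Term {
    Var(String),                    // free variables / neutral heads
    Lam(Rc<dyn Fn(Term) -> Term>),  // binder represented as a Rust closure
    App(Rc<Term>, Rc<Term>),
}

// Weak head normalization: apply the closure whenever the head is a lambda.
fn whnf(t: Term) -> Term {
    match t {
        Term::App(f, a) => match whnf((*f).clone()) {
            Term::Lam(body) => whnf((*body)((*a).clone())),
            head => Term::App(Rc::new(head), a),
        },
        other => other,
    }
}

fn main() {
    // (\x. x) y  ~>  y
    let id = Term::Lam(Rc::new(|x: Term| x));
    let app = Term::App(Rc::new(id), Rc::new(Term::Var("y".into())));
    match whnf(app) {
        Term::Var(name) => println!("reduced to {}", name),
        _ => println!("did not reduce to a variable"),
    }
}
```

(Swapping in a borrowed `&dyn Fn` instead of `Rc` would push a lifetime parameter onto `Term`, which seems to be the trade-off the experiment would have to weigh.)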
(Incidentally, if you know how to make the ugly parts of that (e.g. (*foo).clone()) less ugly, I'd be curious.) (Haven't looked at yours in detail yet because bbiab :)
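(Not from the thread, but for reference: when `foo` is an `Rc` or `Box` of something clonable, these spellings all clone the pointee rather than the pointer, so it's mostly a matter of taste; `Term` is a hypothetical stand-in here.)

```rust
use std::rc::Rc;

#[derive(Clone, Debug)]
struct Term; // stand-in for whatever `foo` points at

fn main() {
    let foo: Rc<Term> = Rc::new(Term);

    let a = (*foo).clone();        // explicit deref, then clone the Term
    let b = foo.as_ref().clone();  // &Term via AsRef, then clone
    let c = Term::clone(&foo);     // path call; &Rc<Term> deref-coerces to &Term

    // By contrast, Rc::clone(&foo) would clone the pointer (bump the refcount).
    println!("{:?} {:?} {:?}", a, b, c);
}
```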