@sigfpe What do you think of the following grand scheme:
- Define Haskell-style typeclasses for mathematical structures - rings, fields, vector spaces - and instance them on Float, etc.
- Now instance them on an algebraic datatype representing symbolic calculations.
- Now you have a simple version of Mathematica in Haskell or whatever. But what if...
- Instead of symbolic calculation on an algebraic datatype, do it on syntax terms in a language with a hygienic macro system, so you can e.g. compute a function’s derivative at compile time.
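A minimal sketch of the first two bullets, using Haskell's built-in `Num` class as a stand-in for a hand-rolled `Ring` class (the `Expr` type and `poly` function are illustrative names, not from the thread): the same polymorphic function can be run numerically on `Double` or symbolically on an expression datatype.

```haskell
-- Symbolic expression datatype; instancing Num on it makes
-- ordinary numeric code produce symbolic terms.
data Expr
  = Var String
  | Lit Double
  | Add Expr Expr
  | Mul Expr Expr
  deriving (Show, Eq)

instance Num Expr where
  (+)         = Add
  (*)         = Mul
  fromInteger = Lit . fromInteger
  negate e    = Lit (-1) * e
  abs         = error "abs: omitted in this sketch"
  signum      = error "signum: omitted in this sketch"

-- One polymorphic function, two interpretations:
poly :: Num a => a -> a
poly x = x * x + 3 * x + 1

numeric :: Double
numeric = poly 2          -- 11.0

symbolic :: Expr
symbolic = poly (Var "x") -- a symbolic term tree, built by the Num instance
```

Running `poly` at `Expr` gives `Add (Add (Mul (Var "x") (Var "x")) (Mul (Lit 3.0) (Var "x"))) (Lit 1.0)` - exactly the kind of term a simple Mathematica-style system would then rewrite.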
Is that interesting? And is there a particular representation, such as geometric algebra, which makes such symbolic computations particularly nice and uniform?
Replying to @TimSweeneyEpic
You should look into the more general idea of metaprogramming in Haskell; I think you'll find some ways to execute your idea there.
Replying to @PLT_cheater @TimSweeneyEpic
I think it's interesting to try this stuff without metaprogramming. E.g. you can go a long way with automatic differentiation without doing any metaprogramming, even though naively it seems calculus requires something meta.
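A minimal sketch of the point about automatic differentiation (the `Dual` and `diff` names are illustrative, not from the thread): forward-mode AD with dual numbers computes derivatives by ordinary overloaded arithmetic, with no metaprogramming anywhere.

```haskell
-- A dual number carries a value together with its derivative.
data Dual = Dual Double Double
  deriving (Show, Eq)

instance Num Dual where
  Dual a a' + Dual b b' = Dual (a + b) (a' + b')
  Dual a a' * Dual b b' = Dual (a * b) (a' * b + a * b')  -- product rule
  negate (Dual a a')    = Dual (negate a) (negate a')
  fromInteger n         = Dual (fromInteger n) 0  -- constants have derivative 0
  abs                   = error "abs: omitted in this sketch"
  signum                = error "signum: omitted in this sketch"

-- Differentiate any polymorphic numeric function at a point by
-- seeding the input with derivative 1 and reading the result off.
diff :: (Dual -> Dual) -> Double -> Double
diff f x = let Dual _ d = f (Dual x 1) in d

-- d/dx (x^2 + 3x + 1) at x = 2 is 2*2 + 3 = 7
example :: Double
example = diff (\x -> x * x + 3 * x + 1) 2  -- 7.0
```

The calculus happens in the `Num` instance itself: the chain and product rules are pushed through the computation as it runs, which is why no compile-time term inspection is needed.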
You’re right! I’m just realizing that a metaprogramming-free framework can express, reflect, and reify arbitrary terms in some subset of the language (e.g. lambdas on vector spaces). Wow.
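A sketch of that reify-without-metaprogramming observation (all names here are illustrative): a lambda polymorphic over `Num` can be reified into a term simply by applying it to a symbolic variable, and the resulting term can then be differentiated symbolically - the compile-time macro idea from earlier in the thread, recovered at runtime.

```haskell
-- Single-variable symbolic terms.
data Expr = Var | Lit Double | Add Expr Expr | Mul Expr Expr
  deriving (Show, Eq)

instance Num Expr where
  (+)         = Add
  (*)         = Mul
  fromInteger = Lit . fromInteger
  negate e    = Lit (-1) * e
  abs         = error "abs: omitted in this sketch"
  signum      = error "signum: omitted in this sketch"

-- Symbolic derivative of a reified term.
d :: Expr -> Expr
d Var       = Lit 1
d (Lit _)   = Lit 0
d (Add f g) = Add (d f) (d g)
d (Mul f g) = Add (Mul (d f) g) (Mul f (d g))  -- product rule

-- Reify a lambda by applying it to the symbolic variable,
-- then differentiate the term it builds.
deriv :: (Expr -> Expr) -> Expr
deriv f = d (f Var)

-- Reflect a term back into an ordinary function.
eval :: Double -> Expr -> Double
eval x Var       = x
eval _ (Lit c)   = c
eval x (Add f g) = eval x f + eval x g
eval x (Mul f g) = eval x f * eval x g
```

Here `deriv (\x -> x * x)` yields a term denoting `1*x + x*1`, and `eval` reflects it back, so `eval 3 (deriv (\x -> x * x))` is `6.0` - express, reflect, and reify, all without macros.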