Wondering why I chose Racket over Haskell. I can think of two things: at this point, Racket brings more joy, and I really miss Scheme macros.
But maybe more so: I am reluctant to introduce Haskell into a team with essentially all embedded systems people.
The distance between Lua and JavaScript (which are already used) and Scheme is much less than the jump over to Haskell. While Haskell is a great tool, it is also a big time investment to learn a completely different perspective.
(should have stuck to a single thread...)
Got distracted by writing a Scheme interpreter, which made me realize that I need one anyway for partial evaluation, so that got integrated into the state machine compiler.
Cleaned up the Lua compiler code a bit, and then it became easy to write a simple Scheme to Lua compiler using it (just lambda, application, if, and a top-level module). This mixes well with LuaJIT.
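A minimal sketch of what such a compiler core could look like (hypothetical code, not the actual library): s-expressions as nested Lua tables, compiled straight to Lua source strings, with only lambda, application, and if.

```lua
-- Hypothetical sketch: compile a tiny Scheme subset to Lua source.
-- S-expressions are nested Lua tables; symbols are strings, numbers are numbers.
local function compile(e)
  if type(e) == "number" then return tostring(e) end
  if type(e) == "string" then return e end          -- variable reference
  local op = e[1]
  if op == "lambda" then
    -- (lambda (x y) body) -> function(x, y) return <body> end
    return "function(" .. table.concat(e[2], ", ") .. ") return "
           .. compile(e[3]) .. " end"
  elseif op == "if" then
    -- Scheme's if is an expression; and/or works when the true branch is truthy
    return "((" .. compile(e[2] ) .. ") and (" .. compile(e[3])
           .. ") or (" .. compile(e[4]) .. "))"
  else
    -- application: (f a b) -> (f)(a, b)
    local args = {}
    for i = 2, #e do args[#args + 1] = compile(e[i]) end
    return "(" .. compile(op) .. ")(" .. table.concat(args, ", ") .. ")"
  end
end

-- ((lambda (x) (if x 1 2)) 42)
local src = compile({{"lambda", {"x"}, {"if", "x", 1, 2}}, 42})
local f = load("return " .. src)
print(f())  -- 1
```

The and/or encoding of if is a known cheat (it breaks when the true branch evaluates to false or nil); a real compiler would emit a statement-level if, which is exactly where the statement-expression issues discussed later come from.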
Then I realized this could also be slightly modified to create a higher-order abstract syntax (HOAS) representation in Lua (representing syntax as functions). Not sure yet if this is just another distraction or a better way to represent the compilers/evaluators.
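To make the HOAS idea concrete, here is a hypothetical tagless-final style sketch (all names are made up): a term is a Lua function over an interpretation record, and an evaluator is just one such record. A compiler emitting Lua source would be another record with the same shape.

```lua
-- Hypothetical HOAS sketch: syntax as functions instead of data.
-- An interpretation supplies lit/lam/app/ifte; a term abstracts over it.
local eval_interp = {
  lit  = function(n) return n end,
  lam  = function(f) return f end,             -- binder is a Lua function
  app  = function(f, a) return f(a) end,
  -- branches are thunks so only the taken branch is evaluated
  ifte = function(c, t, e) if c then return t() else return e() end end,
}

-- The term ((lambda (x) (if x 1 2)) 42), written against any interpretation I.
local function term(I)
  return I.app(I.lam(function(x)
    return I.ifte(x,
      function() return I.lit(1) end,
      function() return I.lit(2) end)
  end), I.lit(42))
end

print(term(eval_interp))  -- 1
</imports>
```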
Oleg says so. Who am I to question. Anyway, I don't know yet what is better for my case: go the macro route and focus on easy syntax extension, or go the HOAS route, build a fixed DSL, and focus on adding partial evaluation annotations.
And yes, I'm heading towards re-implementing a dumbed-down untyped version of MetaOCaml in a 1 kloc Scheme compiler written in Lua. Don't do as I do; this is just for play.
Read the MetaOCaml stuff here:
okmij.org/ftp/ML/MetaOCa
I think I figured out a sane semantics: the Scheme file contains a start function, which spawns tasks (closures). Conceptually those closures run in a new task context, but they can be specialized to C state machine code.
The neat part here is that any Scheme code can be used to construct a network of connected tasks, and that network structure is what is compiled to C.
This resembles some form of hibernation, where the start code runs on a workstation, and the code hibernates before the state machine receives events. The image then resumes on the target machine and starts processing events.
Figured out the implementation, without scheduler for now. Just supports 2 coroutines connected through a single channel, where blocking read/write is always coroutine jump.
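A minimal sketch of that setup (hypothetical, and simplified: the writer here is the main thread rather than a second coroutine): a blocking write transfers control, and the value, straight into the reader coroutine; a blocking read is just a yield back.

```lua
-- Hypothetical sketch: one channel, no scheduler.
-- Blocking read/write is always a coroutine jump.
local got = {}
local reader = coroutine.create(function()
  while true do
    local v = coroutine.yield()   -- blocking read: jump out, wait for a value
    got[#got + 1] = v
  end
end)
coroutine.resume(reader)          -- run the reader up to its first read

local function write(v)           -- blocking write: jump into the reader
  coroutine.resume(reader, v)
end

for i = 1, 3 do write(i) end
print(table.concat(got, " "))     -- 1 2 3
```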
Next step is analysis of task/channel network at compile time, then see what is necessary to implement the scheduler as a coroutine.
Compiler structure is surprisingly free of bad hacks. It all just meshes together, which makes me believe I'm on the right track. I.e. I'm not adding any "clever" extensions that complicate things.
Only one hack wrt partial evaluation. I don't think I can get around this because of the implicit embedding. The compiler at some point expects a collection of mutually recursive definitions, but it's not clear how to stop PE without an annotation or more involved analysis.
So for now it just stops PE of (spawn! <closure>) when it finds a (begin (define ...) ...) form. At that point the lexical environment contains all parameters necessary to specialize the code per task.
I'm stoked because I finally found how to represent the PE tricks in Staapl (Forth-style language) in something that is not too weird. This might actually be useful in a broader sense.
(In Staapl it's something that I discovered accidentally: peephole optimization works by treating the compiled instruction stream as a data stack for macros. A hack. Not something I could translate to a language with bindings and expressions.)
But in this Scheme implementation, the partial evaluation is defined in a direct way.
An eye-opener for me. It seems it was necessary to just try implementing it to see the structure of the problem.
Removing use of statement expressions :-)
I had to implement a different approach for the C switch statement anyway, so this is not a big change. The infra is already there.
Generated C code is running. 2 coroutine example.
I've ended up specializing all code for each task, which then gets rid of a task pointer. A coroutine call is now just save return address + save optional argument + goto. All context is accessible from one "global" pointer.
Next up is a scheduler that can be inlined. Will be a bit more work... Prob build the static task network in first pass, then specialize at each blocking point.
Nope, next up was realizing that coroutines without synchronization primitives (other than yield/call other coroutine) cannot really be called concurrent. Making this explicit removes some magic.
It only becomes concurrent once the tasks are decoupled, i.e. not explicitly resuming each other, but isolated by a scheduler. task->sched->task jumps are still coroutine calls. Obvious now, but it took a while to figure out that's what the code was trying to tell me.
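The decoupled version could be sketched like this (hypothetical round-robin scheduler, not the generated C): tasks only ever jump to the scheduler, and the scheduler jumps to tasks, so task->sched->task is still plain coroutine calls.

```lua
-- Hypothetical sketch: tasks isolated by a scheduler instead of
-- resuming each other directly.
local sched = { queue = {} }
function sched.spawn(f) table.insert(sched.queue, coroutine.create(f)) end
function sched.yield() coroutine.yield() end    -- task -> sched jump
function sched.run()
  while #sched.queue > 0 do
    local t = table.remove(sched.queue, 1)
    coroutine.resume(t)                         -- sched -> task jump
    if coroutine.status(t) ~= "dead" then
      table.insert(sched.queue, t)              -- round-robin re-queue
    end
  end
end

local log = {}
sched.spawn(function()
  for i = 1, 2 do log[#log + 1] = "A" .. i; sched.yield() end
end)
sched.spawn(function()
  for i = 1, 2 do log[#log + 1] = "B" .. i; sched.yield() end
end)
sched.run()
print(table.concat(log, " "))  -- A1 B1 A2 B2
```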
Cleaned up some more. I think this is pretty much it for the coroutine part. Building the compiler as a modified interpreter was the trick. Should have listened: when there's an interpreter available, you can just modify it slightly and run it. That basically gives PE mostly for free.
Cleaned up the Scheme->Lua compiler to use the factored-out macros. The base language is now just lambda, if, set!, and let*, with begin + local defines -> letrec -> set! + let*.
I think this will be useful to release as a rocks library.
Help me find a pun that includes Lua, in the style of Conniver, Scheme, Gambit, Racket, etc...
Maybe "lure" would be a good one.
Name is reserved: luarocks.org/modules/zwizwa
Not packaged correctly yet, but there's a stripped-down GitHub repository to support the Lua rock.
Got distracted with the Scheme->Lua compiler. Been debugging it to try to get it to the point where it can run the Ribbit VM.
Starting to understand why people are tempted to bootstrap compilers. Doing this in Lua works up to a point, but I'm starting to really miss pattern matching, and the only way out I see is to use the Scheme-in-Lua to bootstrap a Scheme to write a pattern matcher.
This is getting out of hand...
Also, I don't think I'm ever going to write a direct-output compiler again. (The best way to learn is to make told-you-so mistakes.)
You really want to be able to clean up the output in an additional pass, even if it is just for debugging.
Once you leave AST/Sexpr domain it's game over.
But what I did find out (probably re-invent) is that it makes sense to generate the output as a token stream even if it is generating a nested expression, at least for compilers written in a semi-imperative style.
Probably an artifact of trying to do this in one pass. It seems to be mostly about the pretty-printer. When splitting up into more passes, the intermediate passes do want to be pure.
Made a draft of a simple recursive matcher. Thing is, with s-expressions available and Lua's interned strings for memoization, there's a lot of room for building small s-expression based DSLs inside Lua.
{"(add ,a ,b)", function(m) return m.a + m.b end}
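A hypothetical sketch of such a recursive matcher (my own simplification, skipping the reader: patterns are nested Lua tables rather than the "(add ,a ,b)" string form, with ",x" strings as pattern variables):

```lua
-- Hypothetical sketch: recursive matcher over s-expressions as nested
-- Lua tables. Returns a bindings table on success, nil on failure.
local function match(pat, expr, bindings)
  bindings = bindings or {}
  if type(pat) == "string" and pat:sub(1, 1) == "," then
    bindings[pat:sub(2)] = expr          -- pattern variable: bind it
    return bindings
  elseif type(pat) == "table" and type(expr) == "table" then
    if #pat ~= #expr then return nil end
    for i = 1, #pat do
      if not match(pat[i], expr[i], bindings) then return nil end
    end
    return bindings
  else
    return pat == expr and bindings or nil
  end
end

-- (add ,a ,b) against (add 1 2)
local m = match({"add", ",a", ",b"}, {"add", 1, 2})
print(m.a + m.b)  -- 3
```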
Added quotation support to the reader.
But I also discovered that the pattern can be represented as a function (HOAS), and that you can basically turn any parameterized data constructor into a matcher.
This blew my mind.
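One hypothetical way to realize the constructor-as-matcher trick (all names invented here): apply the parameterized constructor to fresh marker objects to obtain a pattern, then unify that pattern with the input, binding wherever a marker sits.

```lua
-- Hypothetical sketch: turn a parameterized data constructor into a matcher
-- by running it on unique marker tables and unifying the result.
local unpack = table.unpack or unpack  -- Lua 5.1/LuaJIT compatibility

local function mk_add(a, b) return {"add", a, b} end

local function matcher(cons, nvars)
  local marks, index = {}, {}
  for i = 1, nvars do
    marks[i] = {}                      -- each empty table is a unique marker
    index[marks[i]] = i
  end
  local pat = cons(unpack(marks))      -- the constructor builds the pattern
  local function go(p, e, out)
    if index[p] then out[index[p]] = e; return out end
    if type(p) == "table" and type(e) == "table" and #p == #e then
      for i = 1, #p do
        if not go(p[i], e[i], out) then return nil end
      end
      return out
    end
    return p == e and out or nil
  end
  return function(e) return go(pat, e, {}) end
end

local match_add = matcher(mk_add, 2)
local r = match_add({"add", 1, 2})
print(r[1], r[2])  -- 1  2
```

This only works for constructors that embed their arguments verbatim in the result, but that restriction is exactly what makes them invertible.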
I think I need to read more...
