An EBNF generator would be that. You could expand the generated EBNF into lambda expressions. But EBNF-by-example is difficult; smells like it will require some machine-learning ingredients.
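A minimal sketch of what "expanding EBNF into lambda expressions" could mean, using parser combinators in Python; the grammar and the combinator names (`lit`, `alt`, `seq`) are illustrative, not from the thread:

```python
# Hypothetical sketch: each EBNF production becomes a lambda.
# A parser is a function: string -> list of (value, rest) pairs.

def lit(c):     # terminal: match one literal character
    return lambda s: [(c, s[1:])] if s[:1] == c else []

def alt(p, q):  # EBNF alternation: p | q
    return lambda s: p(s) + q(s)

def seq(p, q):  # EBNF concatenation: p, q
    return lambda s: [((a, b), rest2)
                      for (a, rest1) in p(s)
                      for (b, rest2) in q(rest1)]

# EBNF:  bit = "0" | "1" ;   pair = bit, bit ;
bit  = alt(lit("0"), lit("1"))
pair = seq(bit, bit)

print(pair("10x"))   # [(('1', '0'), 'x')]
```

An EBNF-by-example generator would have to *search* for the `bit` and `pair` definitions instead of being handed them, which is where the machine-learning flavor comes in.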
-
There is only going to be a single consistent solution (but many possible implementations), so gradient descent may not work: a unique discrete target gives no gradient signal to descend along.
End of conversation
New conversation -
Are you just describing a high-level abstraction of machine learning? You could start with the regex generators that take example inputs, then perhaps extend that to grammars.
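A regex-by-example generator can be as crude as brute-force enumeration; a sketch where the piece alphabet, the examples, and the `induce` helper are all invented for illustration:

```python
import itertools
import re

# Illustration only: enumerate concatenations of small regex pieces and
# return the first one consistent with the positive/negative examples.
PIECES = ["a", "b", "a*", "b*", "(ab)*"]

def induce(positives, negatives, max_pieces=3):
    for n in range(1, max_pieces + 1):
        for combo in itertools.product(PIECES, repeat=n):
            rx = re.compile("".join(combo) + "$")  # match() anchors the start
            if all(rx.match(p) for p in positives) and \
               not any(rx.match(q) for q in negatives):
                return "".join(combo)
    return None

print(induce(["ab", "abab"], ["ba", "aa"]))   # (ab)*
```

Extending this to grammars means enumerating productions instead of regex pieces; the search space grows quickly, which is where smarter search or learning becomes attractive.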
-
It is not hard to implement. I just wonder if we can skip a few hundred lines of implementation in favor of a few dozen lines of implicit constraints.
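One reading of "a few dozen lines of implicit constraints": replace the hand-written generator with declarative input/output constraints plus a generic generate-and-test search. A toy sketch; the candidate pool and the examples are made up:

```python
# The "implementation" shrinks to a candidate pool plus constraints.
candidates = [
    lambda x: x + 1,
    lambda x: x * 2,
    lambda x: x * x,
]

examples = [(1, 2), (3, 6), (5, 10)]   # implicit constraints: f(inp) == out

def synthesize(pool, io_pairs):
    for f in pool:
        if all(f(inp) == out for inp, out in io_pairs):
            return f
    return None

f = synthesize(candidates, examples)
print(f(7))   # 14 -- the search settled on lambda x: x * 2
```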
End of conversation
New conversation -
I think a more compact (but harder to find) way would be to have the language be entirely artifacts and side effects of a minimal set of interacting (type) constructors.
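One concrete version of "a language as artifacts of a minimal set of interacting constructors" is the S/K combinator basis, where identity and booleans fall out of how S and K interact. A sketch in strict Python (so FALSE is encoded as K(I) rather than the lazy S(K)):

```python
# Two constructors; everything below them is an emergent artifact.
S = lambda f: lambda g: lambda x: f(x)(g(x))
K = lambda x: lambda y: x

I     = S(K)(K)             # identity emerges from S and K alone
TRUE  = K                   # TRUE  t e -> t
FALSE = K(I)                # FALSE t e -> e (strict-evaluation-friendly)
NOT   = lambda b: b(FALSE)(TRUE)

print(I(42))                          # 42
print(TRUE("then")("else"))           # then
print(NOT(TRUE)("then")("else"))      # else
```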
-
I am worried that this might not be very didactic, whereas a mapping of $language to Lisp would be intuitive to understand?
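For contrast, the "map $language to Lisp" route is easy to demonstrate; a hypothetical few lines that render nested terms as s-expressions:

```python
# Illustrative only: print nested tuples as Lisp-style s-expressions.
def to_sexp(term):
    if isinstance(term, tuple):
        return "(" + " ".join(to_sexp(t) for t in term) + ")"
    return str(term)

print(to_sexp(("add", 1, ("mul", 2, 3))))   # (add 1 (mul 2 3))
```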
- 1 more reply
New conversation -
Take a look at http://racket-lang.org
-
Lisp can be defined on a single page. A decent constraint satisfaction algorithm is going to be half a page of code. How does Racket come in here?
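"Half a page" is about right for a plain backtracking solver; a sketch, with a made-up map-coloring problem standing in for the constraint set:

```python
# Minimal backtracking search over finite domains. Problem encoding
# (three mutually adjacent regions, three colors) is illustrative.

def solve(domains, constraints, assignment=None):
    assignment = assignment or {}
    if len(assignment) == len(domains):
        return assignment
    var = next(v for v in domains if v not in assignment)
    for value in domains[var]:
        trial = {**assignment, var: value}
        if all(ok(trial) for ok in constraints):
            result = solve(domains, constraints, trial)
            if result:
                return result
    return None

doms = {"A": "rgb", "B": "rgb", "C": "rgb"}
def differ(x, y):  # constraint holds vacuously until both are assigned
    return lambda a: x not in a or y not in a or a[x] != a[y]
cons = [differ("A", "B"), differ("B", "C"), differ("A", "C")]
print(solve(doms, cons))   # {'A': 'r', 'B': 'g', 'C': 'b'}
```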
- 1 more reply
New conversation -
https://en.m.wikipedia.org/wiki/Rosetta-lang?wprov=sfti1 … not exhaustively example-based, but nearly everything else
-
Why not ask the same question on Quora and address it to Alan Kay? (https://www.quora.com/profile/Alan-Kay-11 …) I don't think he has tried exactly the same thing, but he must have lots of ideas about this :)