Tweets
-
Just pushed version 0.1.0 of the `sentencepiece` @rustlang crate, which provides a rustic interface to the sentencepiece unsupervised text tokenizer. https://crates.io/crates/sentencepiece … https://rustdoc.danieldk.eu/sentencepiece/
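
As an impression of what the binding looks like in use, here is a minimal sketch. The names (`SentencePieceProcessor::open`, `encode`, the `piece`/`id` fields) are my reading of the crate's rustdoc and may differ between versions; the model path is a placeholder, so check the docs linked above.

```rust
use sentencepiece::SentencePieceProcessor;

fn main() -> Result<(), Box<dyn std::error::Error>> {
    // Load a pretrained sentencepiece model; "en.model" is a placeholder.
    let spp = SentencePieceProcessor::open("en.model")?;

    // Encode a sentence into subword pieces and their vocabulary ids.
    for piece in spp.encode("Unsupervised tokenization is neat.")? {
        println!("{}\t{}", piece.piece, piece.id);
    }

    Ok(())
}
```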
-
Daniël de Kok Retweeted
direnv 2.21.0 has been released. This is a massive release! * `.envrc` files are now loaded with `set -euo pipefail`, which is probably going to expose issues in existing scripts. * Ctrl-C now actually works during reload in bash and zsh. And more! https://github.com/direnv/direnv/releases/tag/v2.21.0 …
-
Daniël de Kok Retweeted
Our #rust and #ocaml bindings for #PyTorch are now compatible with PyTorch 1.4; updated opam package/crate are available! A bunch of examples can be found on GitHub: RNNs, GANs, Reinforcement Learning... https://github.com/LaurentMazare/tch-rs … https://github.com/LaurentMazare/ocaml-torch …
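
For a flavor of the Rust side, here is a minimal tensor-and-autograd sketch in the style of the tch-rs README; treat it as illustrative, since builder names have shifted between tch-rs releases.

```rust
use tch::Tensor;

fn main() {
    // Elementwise ops on a tensor built from a slice.
    let t = Tensor::of_slice(&[3.0f32, 1.0, 4.0, 1.0, 5.0]);
    (&t * 2).print();

    // Reverse-mode autodiff: d(x*x + x)/dx at x = 2 is 5.
    let x = Tensor::from(2.0f32).set_requires_grad(true);
    let y = &x * &x + &x;
    y.backward();
    x.grad().print();
}
```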
-
Dear lazyweb, I mean fellow @rustlang crustaceans. I am working on a crate that binds a C library, but can only reasonably unit-test it by providing a ~500KB data file. Would you find this acceptable for a tiny binding? Alternative: feature-gate those tests.
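
A sketch of the feature-gating alternative: declare an opt-in feature in Cargo.toml (`[features] model-tests = []`) and compile the data-dependent tests only when it is enabled. The feature name and fixture path here are hypothetical.

```rust
#[cfg(test)]
mod tests {
    // Built only with `cargo test --features model-tests`; the whole item
    // is compiled out of a default `cargo test` run.
    #[cfg(feature = "model-tests")]
    #[test]
    fn tokenizes_with_bundled_model() {
        // Embed the ~500KB fixture at compile time; the path is illustrative.
        let model: &[u8] = include_bytes!("../testdata/test.model");
        assert!(!model.is_empty());
    }
}
```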
-
Daniël de Kok Retweeted
For some reason, I haven't been invited to Davos this year. Was it something I said?
pic.twitter.com/6Qmrup7HrB
NowThis
Davos Panelist Calls Out Davos For Not Addressing the Obvious: ‘It feels like I’m at a firefighters conference and no one’s allowed to speak about water.’ This historian wasn’t afraid to confront the billionaires at Davos about their greed.
-
0.45 dependency parsing LAS improvement on Dutch, with only half the parameters. That's what I call a productive week. Enjoy the weekend!
-
Daniël de Kok Retweeted
Firefox market share vs Mozilla Foundation chair salary (2.5 million/year in 2018) pic.twitter.com/htEKDNFMDH
-
Daniël de Kok Retweeted
1/7 Did you ever need to run a piece of C# code on Windows 3.11? Me neither, but I did it anyway. A thread. pic.twitter.com/ZW0p29d9Ef
-
Finally made the switch from TensorFlow to PyTorch. The tch-rs @rustlang bindings for libtorch are really nice. Learning the libtorch API and porting the BERT model from Hugging Face Transformers to Rust was really quick: https://crates.io/crates/sticker-transformers/0.1.0 …
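
As an impression of how libtorch-style model code reads in tch-rs, here is a small classifier-head sketch; the layer sizes, path names, and use of `nn::seq` are illustrative, not taken from sticker-transformers.

```rust
use tch::{nn, nn::Module, Device, Tensor};

fn main() {
    // Parameters are registered in a VarStore, tch-rs's counterpart to a
    // PyTorch module's parameter registry.
    let vs = nn::VarStore::new(Device::Cpu);
    let root = vs.root();
    let net = nn::seq()
        .add(nn::linear(&root / "fc1", 768, 128, Default::default()))
        .add_fn(|xs| xs.relu())
        .add(nn::linear(&root / "fc2", 128, 2, Default::default()));

    // A batch of 4 dummy 768-dimensional encoder outputs.
    let xs = Tensor::randn(&[4, 768], tch::kind::FLOAT_CPU);
    let logits = net.forward(&xs);
    println!("{:?}", logits.size()); // [4, 2]
}
```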
-
I am very happy that we use far less gas for heating/hot water than the average of all house types. But that bar chart is weird. The deltas are probably proportional, but the bars themselves… pic.twitter.com/HbIQ2IwG1I
-
The final, finetuned models use each layer almost equally. The interesting patterns here seem to be: for lemmatization, almost all layers are used equally; for the other tasks, use increases per layer. 4/4 pic.twitter.com/LqoAQDaYXM
-
However, for morphological and POS tagging, it seems that the classifiers rely most on the initial/middle layers for ML BERT, whereas for BERTje they rely more uniformly on all the layers. 3/4
-
This can give hints about where information is represented in the initial encoder. Interestingly, the information seems to be distributed quite differently between ML BERT and BERTje. In both, the last layers are used most for dependencies and the initial layers for lemmatization. 2/4 pic.twitter.com/KJKaVYklei
-
More Dutch BERT explorations: since we used scalar weighting, we can see which layers are used per task. As is common when finetuning pretrained models, we freeze the encoder weights for the first epoch, so that large softmax gradients don't ‘destroy’ the encoder. 1/4
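
Scalar weighting here is the ELMo-style trick of combining all encoder layers with softmax-normalized learned scalars, which is what makes per-layer usage readable afterwards. A minimal tch-rs sketch of the idea follows; the names and shapes are mine, not the paper's code.

```rust
use tch::{Kind, Tensor};

/// Combine per-layer hidden states with softmax-normalized learned scalars
/// (ELMo-style scalar mixing). `weights` holds one scalar per layer; in a
/// real model it would be a trainable variable in a VarStore.
fn scalar_mix(layers: &[Tensor], weights: &Tensor) -> Tensor {
    let probs = weights.softmax(0, Kind::Float);
    layers
        .iter()
        .enumerate()
        .map(|(i, layer)| layer * probs.get(i as i64))
        .fold(Tensor::zeros_like(&layers[0]), |acc, t| acc + t)
}

fn main() {
    // Three fake "encoder layers" of shape [batch=2, hidden=4].
    let layers: Vec<Tensor> = (0..3)
        .map(|_| Tensor::randn(&[2, 4], tch::kind::FLOAT_CPU))
        .collect();
    let weights = Tensor::zeros(&[3], tch::kind::FLOAT_CPU); // uniform mix
    let mixed = scalar_mix(&layers, &weights);
    println!("{:?}", mixed.size()); // [2, 4]
}
```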
-
And then your 5yo daughter walks into the room at 4 am, shouting “good morning everyone”. Turns out she thought her clock said 9am. :oops:
-
I played Comanche in the early ’90s, but never realized that the Voxel Space rendering algorithm is so simple. Cool! https://github.com/s-macke/VoxelSpace …
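
The core idea is small enough to sketch: for each screen column, march front to back over a heightmap, perspective-project each sample's height, and draw a vertical slice only where it rises above everything drawn so far. A toy Rust sketch under those assumptions; the heightmap, scale constants, and framebuffer are stand-ins, not the repo's code.

```rust
/// Render one screen column of a Voxel Space-style terrain. `heightmap`
/// returns terrain height at a world position; `column` holds the column's
/// pixels from top (index 0) to bottom. All constants are illustrative.
fn render_column(
    heightmap: impl Fn(f32, f32) -> f32,
    column: &mut [u8],
    cam: (f32, f32), // camera position in the heightmap plane
    cam_height: f32, // camera altitude
    horizon: f32,    // screen row of the horizon
) {
    let mut lowest_free = column.len() as f32; // occlusion: lowest undrawn row
    let mut z = 1.0f32;
    while z < 400.0 {
        // Sample terrain straight ahead and perspective-project its height.
        let h = heightmap(cam.0, cam.1 + z);
        let row = (cam_height - h) / z * 120.0 + horizon;
        // Only rows above everything already drawn are visible.
        if row < lowest_free {
            for y in row.max(0.0) as usize..lowest_free as usize {
                column[y] = 1; // palette/color index, illustrative
            }
            lowest_free = row.max(0.0);
        }
        z += 1.0;
    }
}

fn main() {
    // A rolling sine-based heightmap and a 240-pixel-tall column.
    let terrain = |x: f32, y: f32| 40.0 * ((x * 0.05).sin() + (y * 0.05).cos());
    let mut column = vec![0u8; 240];
    render_column(terrain, &mut column, (0.0, 0.0), 80.0, 120.0);
    println!("drawn pixels: {}", column.iter().filter(|&&c| c != 0).count());
}
```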
-
Daniël de Kok Retweeted
Hot off the press: BERTje. We collected a large and diverse corpus of Dutch and trained a monolingual BERT model. The model is available for download. Paper: http://arxiv.org/abs/1912.09582 Joint work by @WietseDEV, me, @AriannaBisazza, @tommaso_caselli, Gertjan van Noord & Malvina Nissim pic.twitter.com/c1aRvR8QOv

Nix/NixOS, occasional tinkerer with electronics. Dad of a Lego queen. Opinions are my own.