Yeah, but a geometry-grounded approach would ideally be the go-to, to an extent.
-
Evolution may have something to say about that.
-
Psh, biology is just an implementation detail of physics, obviously!
-
There's a trade-off in there: an ensemble of good models usually beats a single model.
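As a hedged aside on the ensembling point, a toy numpy sketch (the setup and numbers are my own illustrative assumptions, not from the thread) of why averaging independent, roughly unbiased predictors reduces error:

```python
import numpy as np

# Toy sketch (illustrative assumption): five "models" each predict a
# zero target with independent unit-variance noise; averaging their
# predictions shrinks the error variance by roughly the ensemble size.
rng = np.random.default_rng(0)

n_models, n_points = 5, 1000
preds = rng.normal(size=(n_models, n_points))    # truth is 0 everywhere

single_mse = np.mean(preds[0] ** 2)              # ~1.0
ensemble_mse = np.mean(preds.mean(axis=0) ** 2)  # ~1/5

print(single_mse, ensemble_mse)
```

The reduction only holds to the extent the models' errors are decorrelated, which is where the trade-off against one strong model comes in.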
-
A point in favor for graph neural networks?
-
I think networks like group convolutions, for example, enforce a type of Occam's razor on the 'complexity' of features, and if the minimum necessary group size is enforced, you can get better generalization. Not necessarily due to sparsity, but definitely due to fewer 'relationships' that are likely to pick up on spurious correlations.
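To make the group-convolution idea concrete, here is a minimal numpy sketch of a C4 (90-degree rotation) lifting layer; the sizes, names, and random setup are my own illustrative assumptions:

```python
import numpy as np

def correlate2d_valid(img, filt):
    """Plain 'valid'-mode cross-correlation, enough for this sketch."""
    kh, kw = filt.shape
    out = np.empty((img.shape[0] - kh + 1, img.shape[1] - kw + 1))
    for i in range(out.shape[0]):
        for j in range(out.shape[1]):
            out[i, j] = np.sum(img[i:i + kh, j:j + kw] * filt)
    return out

rng = np.random.default_rng(1)
image = rng.normal(size=(8, 8))
base_filter = rng.normal(size=(3, 3))

def c4_features(img):
    # Correlate with all four 90-degree rotations of ONE base filter,
    # then max-pool over orientation: the filter's parameters are shared
    # across the whole C4 orbit, which is the Occam's-razor constraint.
    responses = [correlate2d_valid(img, np.rot90(base_filter, k))
                 for k in range(4)]
    return np.max(np.stack(responses), axis=0)

# Equivariance check: rotating the input only permutes which orientation
# channel fires, so the pooled feature map just rotates along with it.
features = c4_features(image)
print(np.allclose(np.rot90(features), c4_features(np.rot90(image))))  # True
```

One filter here stands in for four independent ones, which is the sense in which the symmetry constraint cuts down the relationships available for overfitting.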
-
Are you talking about Conway's law?
-
> why very sparse networks are so effective

Is it true? Transformers are usually more effective than convolutional neural networks.
-
I think that’s true at the moment, but in comp neuro, sparsity is a commonly studied property that’s observed to correlate with good computation. Biological neurons tend to self-organize in sparse ways.
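On the sparsity point, a toy magnitude-pruning sketch (the weight distribution, sizes, and threshold are my own illustrative assumptions): zero out the 90% smallest-magnitude weights of a random linear map and check how much of its behavior survives:

```python
import numpy as np

# Illustrative assumption: a heavy-tailed weight matrix, so a few large
# weights carry most of the energy (loosely mimicking trained networks).
rng = np.random.default_rng(2)
W = rng.normal(size=(64, 64)) * rng.random((64, 64)) ** 4
x = rng.normal(size=64)

# Magnitude pruning: keep only the top 10% of weights by |value|.
threshold = np.quantile(np.abs(W), 0.9)
W_sparse = np.where(np.abs(W) >= threshold, W, 0.0)

dense_out, sparse_out = W @ x, W_sparse @ x
cosine = dense_out @ sparse_out / (
    np.linalg.norm(dense_out) * np.linalg.norm(sparse_out))

print(np.mean(W_sparse == 0))  # ~0.9: the map is now 90% zeros
print(cosine)                  # yet its output stays well aligned
```

This only shows that heavy-tailed weights tolerate pruning, not that sparsity causes good computation; the causal direction is exactly what the comp-neuro observation leaves open.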