The human mind has this remarkable property of filtering out the massive fraction of its input that is irrelevant to what it can control
-
Learning doesn't just mean compressing data; it's mainly about *discarding* data. This requires a supervision signal. Control is that signal
New conversation -
AI must seize the manifolds of production
-
Without empathy? Without imagination? Do you see how inhuman this is?
New conversation -
1) What are we projecting when we assert that it can only exist through the only method we perceive... our own consciousness?
-
So by using GANs to generate data, are we assuming that reality itself can also be generated by a neural network?
-
Reality might well be probabilistic so using mixtures of probability distributions isn't really a bad idea.
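A minimal sketch of that mixture idea, assuming a simple Gaussian mixture as the stand-in for a "probabilistic reality" (the function name and parameters are illustrative, not from the thread):

```python
import random

def sample_mixture(weights, components, n, rng):
    """Draw n samples from a mixture of Gaussians: pick a component
    by its weight, then sample from that component's Normal(mu, sigma)."""
    samples = []
    for _ in range(n):
        mu, sigma = rng.choices(components, weights=weights, k=1)[0]
        samples.append(rng.gauss(mu, sigma))
    return samples

rng = random.Random(42)
# Two components: N(0, 1) with weight 0.7 and N(5, 0.5) with weight 0.3.
xs = sample_mixture([0.7, 0.3], [(0.0, 1.0), (5.0, 0.5)], 1000, rng)
mixture_mean = sum(xs) / len(xs)  # should land near 0.7*0 + 0.3*5 = 1.5
```

A generator network is, in effect, a far more flexible version of `sample_mixture`: it also maps simple randomness to samples, but learns the mapping instead of fixing it by hand.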
New conversation -
If I had a sufficiently large ensemble of generators, I could learn to generate those as well...
-
Basically, let's say the Wasserstein distance between my generator and the real world is zero. The discriminator has no incentive to update. G wins
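The claim about a zero Wasserstein distance can be illustrated numerically. This is a toy sketch, not the thread's setup: `w1_empirical` uses the 1-D closed form (sort both samples, average the absolute differences), and the two "generators" are just Gaussians I made up.

```python
import random

def w1_empirical(xs, ys):
    """1-D empirical Wasserstein-1 distance for equal-size samples:
    sort both samples and average the pairwise absolute differences."""
    assert len(xs) == len(ys)
    return sum(abs(a - b) for a, b in zip(sorted(xs), sorted(ys))) / len(xs)

rng = random.Random(0)
real = [rng.gauss(0.0, 1.0) for _ in range(10_000)]
# A poor generator, shifted away from the data: distance is large,
# so a critic still has a gradient to follow.
fake_far = [rng.gauss(3.0, 1.0) for _ in range(10_000)]
# A generator that matches the data distribution: distance is near zero,
# leaving the critic nothing to penalize. G wins.
fake_close = [rng.gauss(0.0, 1.0) for _ in range(10_000)]
```

When `w1_empirical(real, fake_close)` is (near) zero, any update the discriminator makes penalizes real and fake samples alike, which is exactly the "no incentive to update" point.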
New conversation -
What I mean is that I could effectively model the randomness in 'arbitrary artifacts' as well. Heck, G would even simulate them.