Sometimes people say, "X is technologically feasible (with deep learning), but it would require a dataset much larger than what we can collect." A more accurate take: we don't have the technology to do X given a realistically sized dataset.
-
-
Every problem is solvable with k-NN. You "just" need enough data.
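The k-NN quip can be made concrete: nearest-neighbor prediction memorizes the training set, so with enough representative data it can approximate almost any decision boundary. A minimal sketch, with illustrative data (the function name and dataset are my own, not from the thread):

```python
from collections import Counter
import math

def knn_predict(train, query, k=3):
    """Classify `query` by majority vote among its k nearest training points.

    `train` is a list of (point, label) pairs; distance is Euclidean.
    No model, no assumptions -- just memorized data, which is why the
    data requirement grows with problem complexity.
    """
    neighbors = sorted(train, key=lambda pair: math.dist(pair[0], query))[:k]
    votes = Counter(label for _, label in neighbors)
    return votes.most_common(1)[0][0]

# Toy dataset: two clusters with labels "a" and "b".
train = [((0.0, 0.0), "a"), ((0.1, 0.2), "a"),
         ((1.0, 1.0), "b"), ((0.9, 1.1), "b")]
print(knn_predict(train, (0.2, 0.1)))  # near the "a" cluster -> a
```

The catch, as the thread notes, is the word "just": covering a high-dimensional input space densely enough for this to work requires exponentially much data.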
-
-
-
A model with sufficient (accurate) simplifying assumptions can do a lot with a small amount of data.
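A toy illustration of that point (my own example, not from the thread): if the simplifying assumption "the relationship is linear" happens to be accurate, three points suffice to recover the rule exactly, and the model extrapolates far outside the training range, which a memorization-based method like k-NN cannot do.

```python
def fit_line(xs, ys):
    """Least-squares fit of y = a*x + b.

    The 'simplifying assumption' is linearity; when it holds,
    a tiny dataset pins down the model completely.
    """
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    a = sum((x - mx) * (y - my) for x, y in zip(xs, ys)) \
        / sum((x - mx) ** 2 for x in xs)
    b = my - a * mx
    return a, b

# Three points sampled from y = 2x + 1 recover a = 2, b = 1 exactly,
# so the model predicts correctly even at x = 100.
a, b = fit_line([0.0, 1.0, 2.0], [1.0, 3.0, 5.0])
print(a * 100.0 + b)  # extrapolates to 201.0
```

Of course, the same assumption applied to a nonlinear problem would be confidently wrong, which is the trade-off the next replies pick at.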
-
Yes, but making assumptions can compromise the ethical applicability of the model.
-
-
It makes me think of this: are there attempts to use topological results to generate specific data (where possible)? One could imagine doing so to guarantee certain topological invariants. Something similar has been done to avoid holes in detector coverage without knowing sensor locations.
-
-
-
Good thing we never have to say this. We synthesize data from physics simulation. This changes the problem and introduces other problems, but feasibility is in sight. It requires not only sim2real but also a loop closure with real2sim.
-
-
-
Does this really generalize? Scale of dataset and complexity of problem are deeply intertwined. Unless you're an oracle that can always pick the perfect priors, at some point you're going to run into problems where you need more data to appropriately define the distribution.
-
-
-
Pioneers like Amelia Earhart may disagree. Pushing and probing leads to exploration and discovery. Not bizarre, but not for everybody.
-