On a positive note, I am optimistic that near-term AI technology will mostly obliterate bullshit jobs (transport, retail, packaging), while making the services of experts dramatically better and more affordable. There will be a lot more quality jobs, including entirely new professions.
-
Replying to @Plinz
Do you think we’ll create a self-improving AI that improves at a more-than-linear pace?
-
Replying to @JMoVS
I don't know how to put any probability estimates on it, but self-improving AI is quite certainly possible, and a sigmoid is the default pattern one would expect, no?
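[For intuition, a minimal editorial sketch of why a sigmoid is the default expectation: if capability grows in proportion to current capability but is damped by some finite ceiling K (resources, complexity), the standard logistic model produces an S-curve. All numbers here are illustrative.]

    # Euler-integrate the logistic equation dx/dt = r * x * (1 - x/K)
    def logistic_growth(r=1.0, K=1.0, x0=0.01, steps=20, dt=1.0):
        x = x0
        trajectory = [x]
        for _ in range(steps):
            x += dt * r * x * (1 - x / K)
            trajectory.append(x)
        return trajectory

    # Slow start, rapid middle, saturation near K: a sigmoid.
    print(logistic_growth())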
-
Convnets and stochastic gradient descent are probably not the right answer. But it might be that meta-learning breakthroughs, new ways of adaptive credit assignment, autocompositional reward optimizer networks, or function-agnostic Gibbs samplers are just around the corner?
-
What if you let nodes in the neural net grow connections to any other node, rather than just change the weights of pre-existing connections? And also the ability to add nodes and layers? Along with some (flexible) criteria to judge the success of the model.
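[This is essentially the idea behind neuroevolution methods such as NEAT, which mutate network topology as well as weights. A toy editorial sketch, with hypothetical names and the fitness criterion left as a stand-in for the "flexible criteria":]

    import random

    class GrowableNet:
        def __init__(self):
            self.nodes = {0, 1}                           # one input, one output node
            self.edges = {(0, 1): random.uniform(-1, 1)}  # (src, dst) -> weight

        def mutate_add_connection(self):
            # Connect two nodes; no-op if the connection already exists.
            src, dst = random.sample(sorted(self.nodes), 2)
            self.edges.setdefault((src, dst), random.uniform(-1, 1))

        def mutate_add_node(self):
            # Split an existing edge: src -> new -> dst (NEAT keeps the
            # old weight on one half so behavior changes only slightly).
            (src, dst), w = random.choice(list(self.edges.items()))
            new = max(self.nodes) + 1
            self.nodes.add(new)
            del self.edges[(src, dst)]
            self.edges[(src, new)] = 1.0
            self.edges[(new, dst)] = w

    net = GrowableNet()
    for _ in range(5):
        random.choice([net.mutate_add_connection, net.mutate_add_node])()
    print(len(net.nodes), "nodes,", len(net.edges), "connections")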
-
I’m not that deep into the field yet, but has there been any experimentation with multi-dimensional neural networks? All I’ve seen have been linear NNs that could be represented in 2D.
-
Almost all approaches deal with high-dimensional spaces by reducing a very high-dimensional space of input values to a smaller set of dimensions. https://www.quora.com/How-could-artificial-intelligence-work-to-understand-perspective-i-e-our-3-dimensions/answer/Joscha-Bach-1
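[A minimal editorial sketch of that reduction step, using PCA via SVD on made-up data; the shapes and numbers are purely illustrative:]

    import numpy as np

    rng = np.random.default_rng(0)
    X = rng.normal(size=(1000, 784))   # e.g. 1000 flattened 28x28 images
    X = X - X.mean(axis=0)             # center the data

    # Project onto the top k principal directions.
    U, S, Vt = np.linalg.svd(X, full_matrices=False)
    k = 32                             # keep 32 of 784 dimensions
    Z = X @ Vt[:k].T                   # low-dimensional codes
    print(Z.shape)                     # (1000, 32)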
-
Maybe I’m misunderstanding your point. What I have in mind is this graph with different layers, but in all the representations I’ve seen, only ever two adjacent layers are connected to each other; they never seem to be cross-connected.
-
Recurrent networks are currently harder to train, but some types (like LSTMs) are quite popular. But that is unrelated to 3D.
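[On the cross-connection question above: connections that skip layers do exist, most prominently as the skip connections in residual networks. A minimal editorial sketch with toy numpy data:]

    import numpy as np

    def relu(x):
        return np.maximum(0, x)

    rng = np.random.default_rng(0)
    x  = rng.normal(size=16)         # input
    W1 = rng.normal(size=(16, 16))
    W2 = rng.normal(size=(16, 16))

    h1 = relu(W1 @ x)                # layer 1
    h2 = relu(W2 @ h1)               # layer 2
    h3 = h2 + h1                     # skip connection: h1 bypasses layer 2
    print(h3.shape)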