They do a series of experiments to highlight the accuracy of this 6D representation. Source code in PyTorch is here: https://github.com/papagina/RotationContinuity
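For context, the paper's 6D representation decodes two stacked 3D vectors into a rotation matrix via Gram-Schmidt. A minimal NumPy sketch (the official PyTorch implementation lives in the linked repo; the function name here is just illustrative):

```python
import numpy as np

def rotation_from_6d(x):
    """Map a 6D vector (two stacked 3D vectors) to a rotation matrix.

    b1 = normalize(a1); b2 = normalize(a2 - (b1.a2) b1); b3 = b1 x b2.
    """
    a1, a2 = x[:3], x[3:]
    b1 = a1 / np.linalg.norm(a1)
    b2 = a2 - np.dot(b1, a2) * b1
    b2 = b2 / np.linalg.norm(b2)
    b3 = np.cross(b1, b2)
    return np.stack([b1, b2, b3], axis=1)  # columns are b1, b2, b3

# A generic 6D vector decodes to a valid rotation: R^T R = I, det R = +1.
R = rotation_from_6d(np.array([1.0, 2.0, 0.5, -0.3, 0.8, 1.1]))
assert np.allclose(R.T @ R, np.eye(3))
assert np.isclose(np.linalg.det(R), 1.0)
```

The point of the construction is that every 6D input maps to a valid rotation and the map is continuous, unlike quaternions or axis-angle.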
Interesting paper on manifold-valued regression. The proposed solution is equivalent to the one in the Homeomorphic VAE paper of @lcfalors, @pimdehaan and @im_td
https://arxiv.org/pdf/1807.04689.pdf (→ Eq. 33)
very interesting, thanks for highlighting this to me!
Note that, since NNs are not shy of high-dimensional parametrizations, you can go wild with this rotation vector, e.g. if you suffer from occlusions or ambiguous rotations; see http://openaccess.thecvf.com/content_ECCV_2018/papers/Martin_Sundermeyer_Implicit_3D_Orientation_ECCV_2018_paper.pdf
Thanks. Yes, this is another way of doing things: they do everything in embedding space and use an L2 or cosine loss.
I suppose you refer to Gram-Schmidt? One of the key procedures of linear algebra for orthonormalizing vectors in an inner-product space.
An alternative that works well is to do soft classification and then quaternion fitting, e.g. https://arxiv.org/abs/1907.04298. This way you can model orientation ambiguity. But yes, to get high precision you naively end up with a huge net!
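The "quaternion fitting" step can be sketched with a standard weighted quaternion average (the eigenvector method; the linked paper's exact procedure may differ, and the weights here are just the soft-classification scores by assumption):

```python
import numpy as np

def average_quaternions(quats, weights):
    """Return the unit quaternion maximizing sum_i w_i * (q . q_i)^2.

    This is the principal eigenvector of M = sum_i w_i q_i q_i^T, and is
    sign-invariant (q and -q represent the same rotation).
    """
    quats = np.asarray(quats, dtype=float)
    weights = np.asarray(weights, dtype=float)
    M = np.einsum('i,ij,ik->jk', weights, quats, quats)
    eigvals, eigvecs = np.linalg.eigh(M)  # ascending eigenvalues
    return eigvecs[:, -1]  # eigenvector of the largest eigenvalue

# Two copies of the same rotation with flipped quaternion sign still
# average to that rotation, which a naive weighted sum would not.
q = np.array([1.0, 0.0, 0.0, 0.0])
q_avg = average_quaternions([q, -q], [0.5, 0.5])
assert np.isclose(abs(q_avg[0]), 1.0)
```

Because the fit only sees q q^T, it is immune to the double-cover sign ambiguity that plagues direct quaternion regression.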
The Lie algebra representation is differentiable, so I never thought it was discontinuous. What a surprise.
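The discontinuity is easy to see numerically near rotation angle π, where two nearly identical rotations have axis-angle (log-map) vectors almost 2π apart. A small NumPy demonstration:

```python
import numpy as np

def rotz(theta):
    """Rotation by theta about the z-axis."""
    c, s = np.cos(theta), np.sin(theta)
    return np.array([[c, -s, 0.0], [s, c, 0.0], [0.0, 0.0, 1.0]])

def log_map(R):
    """Axis-angle (Lie algebra) vector of R, with angle in [0, pi]."""
    theta = np.arccos(np.clip((np.trace(R) - 1.0) / 2.0, -1.0, 1.0))
    if np.isclose(theta, 0.0):
        return np.zeros(3)
    axis = np.array([R[2, 1] - R[1, 2], R[0, 2] - R[2, 0], R[1, 0] - R[0, 1]])
    axis = axis / (2.0 * np.sin(theta))
    return theta * axis

eps = 1e-3
v1 = log_map(rotz(np.pi - eps))  # approximately (0, 0,  pi - eps)
v2 = log_map(rotz(np.pi + eps))  # approximately (0, 0, -(pi - eps)):
                                 # the axis flips sign across theta = pi
# The two rotations differ by only 2*eps, yet their log vectors are ~2*pi apart.
assert np.linalg.norm(v1 - v2) > 6.0
```

So the map SO(3) → R³ is fine locally, but there is no continuous way to cover all of SO(3) with one 3D (or 4D) chart, which is exactly the paper's argument for going to 6D.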