It's not really a matrix of gradients; it's either a vector of gradients or a matrix of first derivatives. And that's setting aside the fact that "matrix" is sort of a way to get around these objects being tensors, at least if you're calling the functions "fields".
Can you recommend any book in particular on this tensor approach?
New conversation
The Hessian is then the Jacobian of the gradient.
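That relationship can be checked numerically. Below is a minimal sketch using central finite differences (the function `f`, the evaluation point, and the step size `h` are illustrative assumptions, not anything from the thread): differentiating the gradient as a vector-valued map yields the Hessian.

```python
# Sketch: the Hessian as the Jacobian of the gradient, via central differences.

def grad(f, x, h=1e-5):
    """Central-difference gradient of f: R^n -> R at point x."""
    g = []
    for i in range(len(x)):
        xp = list(x); xp[i] += h
        xm = list(x); xm[i] -= h
        g.append((f(xp) - f(xm)) / (2 * h))
    return g

def jacobian(F, x, h=1e-5):
    """Central-difference Jacobian of F: R^n -> R^m at point x."""
    m = len(F(x))
    J = [[0.0] * len(x) for _ in range(m)]
    for j in range(len(x)):
        xp = list(x); xp[j] += h
        xm = list(x); xm[j] -= h
        Fp, Fm = F(xp), F(xm)
        for i in range(m):
            J[i][j] = (Fp[i] - Fm[i]) / (2 * h)
    return J

# Toy example: f(x, y) = x^2 y + y^3.
# Exact Hessian at (1, 2) is [[2y, 2x], [2x, 6y]] = [[4, 2], [2, 12]].
f = lambda v: v[0] ** 2 * v[1] + v[1] ** 3
hessian = jacobian(lambda v: grad(f, v), [1.0, 2.0])
```

The symmetry of the result (`hessian[0][1] ≈ hessian[1][0]`) is the usual equality of mixed partials for smooth functions.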
And then there is the Laplacian.
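The Laplacian ties back to the Hessian: it is the trace of the Hessian, i.e. the sum of the unmixed second partials. A small sketch (the function and points are assumptions for illustration):

```python
# Sketch: the Laplacian as the sum of second partials, by a central
# second-difference in each coordinate direction.

def laplacian(f, x, h=1e-4):
    """Sum over i of d^2 f / dx_i^2 at x, by central differences."""
    fx = f(x)
    total = 0.0
    for i in range(len(x)):
        xp = list(x); xp[i] += h
        xm = list(x); xm[i] -= h
        total += (f(xp) - 2 * fx + f(xm)) / h ** 2
    return total

# Toy example: f(x, y) = x^2 + y^2 has Laplacian 2 + 2 = 4 at every point.
f = lambda v: v[0] ** 2 + v[1] ** 2
lap = laplacian(f, [0.3, -1.2])
```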
@ReginaP67870358 But once I saw them in calc, all the misery was tailor-made for me
Ez game tho 

HAHAHAHAHAHAHA the professor explained them, but I have no idea what they are


pic.twitter.com/T6gsNzjucA
New conversation
What's really confusing is when someone also uses the terms Jacobian and Hessian to refer to the determinants of those matrices. Also don't forget about Laplacians and d'Alemberians.
*d'Alembertians...missed the t
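On the determinant usage: in change-of-variables formulas, "the Jacobian" often means the determinant of the Jacobian matrix. The classic instance is the polar-coordinate map, whose Jacobian determinant is r. A numeric sketch (the map and the sample point are the standard textbook example, the differencing code is an assumption):

```python
# Sketch: Jacobian *determinant* of the polar map (r, th) -> (r cos th, r sin th).
# For this map the determinant is exactly r, which is why dx dy = r dr dth.

import math

def polar(v):
    r, th = v
    return [r * math.cos(th), r * math.sin(th)]

def jacobian_det_2d(F, v, h=1e-6):
    """det of the 2x2 central-difference Jacobian of F at v."""
    J = [[0.0, 0.0], [0.0, 0.0]]
    for j in range(2):
        vp = list(v); vp[j] += h
        vm = list(v); vm[j] -= h
        Fp, Fm = F(vp), F(vm)
        for i in range(2):
            J[i][j] = (Fp[i] - Fm[i]) / (2 * h)
    return J[0][0] * J[1][1] - J[0][1] * J[1][0]

# At (r, th) = (2, 0.7) the determinant should come out close to r = 2.
det = jacobian_det_2d(polar, [2.0, 0.7])
```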
End of conversation
New conversation
Newton’s method nicely marries the 1st and the 3rd for its step p toward the minimum of f: p = -(∇²f)⁻¹ ∇f, where ∇²f is another notation for the Hessian.
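That step can be sketched in two variables. The function f, the starting point, and the hand-derived gradient and Hessian below are illustrative assumptions; for a quadratic f, a single Newton step lands exactly on the minimum:

```python
# Sketch of one Newton step p = -(H)^-1 g for a 2x2 Hessian H and gradient g,
# solving H p = -g by Cramer's rule.

def newton_step_2d(g, H):
    (a, b), (c, d) = H
    g0, g1 = g
    det = a * d - b * c
    return [-(d * g0 - b * g1) / det, -(-c * g0 + a * g1) / det]

# f(x, y) = (x - 1)^2 + 2 (y + 3)^2 is quadratic with minimum at (1, -3).
x, y = 5.0, 2.0
g = [2 * (x - 1), 4 * (y + 3)]    # gradient of f at (x, y)
H = [[2.0, 0.0], [0.0, 4.0]]      # Hessian of f (constant here)
p = newton_step_2d(g, H)
x_new, y_new = x + p[0], y + p[1]  # lands exactly on (1, -3)
```

For non-quadratic f the same step is iterated, and in practice the linear system H p = -g is solved by factorization rather than an explicit inverse.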
And when f is a sum of squares, it becomes the Gauss-Newton method, which can be written entirely in terms of the Jacobian.
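Concretely, for f(x) = ½ Σ rᵢ(x)², the gradient is Jᵀr and the Hessian is approximated by JᵀJ, where J is the Jacobian of the residuals, giving the step p = -(JᵀJ)⁻¹ Jᵀr. A one-parameter toy sketch (the data and the model rᵢ = x·tᵢ - yᵢ are made-up assumptions):

```python
# Sketch of one Gauss-Newton step using only the residual Jacobian:
# gradient = J^T r, Hessian approx = J^T J, step p = -(J^T J)^-1 J^T r.

t = [1.0, 2.0, 3.0]   # made-up inputs
y = [2.1, 3.9, 6.2]   # made-up observations

def gauss_newton_step(x):
    r = [x * ti - yi for ti, yi in zip(t, y)]    # residuals r_i = x t_i - y_i
    J = t                                         # dr_i/dx = t_i
    JtJ = sum(Ji * Ji for Ji in J)                # Gauss-Newton "Hessian"
    Jtr = sum(Ji * ri for Ji, ri in zip(J, r))    # gradient J^T r
    return x - Jtr / JtJ                          # x + p

# The residuals are linear in x, so one step from anywhere reaches the
# least-squares slope sum(t_i y_i) / sum(t_i^2).
x_fit = gauss_newton_step(0.0)
```

Because the residuals here are linear, a second step does not move the estimate; for nonlinear residuals the step is iterated, which is exactly what least-squares solvers do.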
End of conversation