Of all the directions you can walk in a continuous (often high-dimensional) space, the gradient offers the highest probability of getting to a minimum, so why would it be odd to descend? (I realize that when everyone is descending, a few should explore the mountains, too.)
In a convex problem, there's an enormous range of directions that get you to a minimum with probability 1, and it's not clear the gradient is special in any way. In a nonconvex problem, your statement is false.
New conversation
Surely that depends on the shape you're dealing with. And I seem to recall that some descent algorithms overshoot local minima, decreasing the chances of getting stuck in one.
Learning rate optimisation
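The overshooting mentioned above can be sketched in a few lines: with enough momentum (or a large enough learning rate), the iterate carries speed past a shallow basin and lands in a deeper one. The tilted double-well below and all parameter values are made up for illustration; they are not from the thread.

```python
def grad(x):
    # f(x) = (x**2 - 1)**2 + 0.3*x, a tilted double well:
    # shallow minimum near x ≈ +0.96, deeper one near x ≈ -1.04
    return 4 * x * (x**2 - 1) + 0.3

def descend(x, lr=0.05, momentum=0.0, steps=500):
    """Plain gradient descent, or heavy-ball descent if momentum > 0."""
    v = 0.0
    for _ in range(steps):
        v = momentum * v + grad(x)
        x = x - lr * v
    return x

plain = descend(1.5)                # settles in the nearby shallow well (x > 0)
heavy = descend(1.5, momentum=0.9)  # velocity carries it over the barrier (x < 0)
```

Same starting point, same landscape; only the accumulated velocity decides which basin you end up in.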
End of conversation
New conversation
But if you could find a vector pointing to the minimum, the problem would become trivial. You just need to follow it in a straight line until you're there.
Genetic algo solves this, no?
It's an option.
End of conversation

New conversation
Master of clickbait tweets!
That's why evolution is effective. Take many random walks, select the closest route to the objective. No gradient. Sometimes mix routes that optimize different dimensions (sex). Also, don't look for an optimum, but for a locally stable solution.
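The recipe above (many random walks, keep the closest routes, occasionally mix them, no gradient anywhere) is essentially a genetic algorithm. A minimal sketch, minimizing distance to the origin; population size, mutation scale, and all other numbers here are arbitrary choices for illustration:

```python
import random

def fitness(x):
    # negative squared distance to the objective (the origin); higher is better
    return -sum(xi * xi for xi in x)

def evolve(pop_size=40, dims=5, generations=200, seed=0):
    rng = random.Random(seed)
    # many random walkers, scattered uniformly
    pop = [[rng.uniform(-5, 5) for _ in range(dims)] for _ in range(pop_size)]
    for _ in range(generations):
        pop.sort(key=fitness, reverse=True)
        parents = pop[: pop_size // 2]  # select the closest routes
        children = []
        while len(children) < pop_size - len(parents):
            a, b = rng.sample(parents, 2)
            # crossover: mix routes that optimize different dimensions
            child = [rng.choice(pair) for pair in zip(a, b)]
            # mutation: an occasional random step; note there is no gradient
            child[rng.randrange(dims)] += rng.gauss(0, 0.3)
            children.append(child)
        pop = parents + children
    return max(pop, key=fitness)

best = evolve()  # a point close to the origin, found without any derivative
```

Selection plus crossover plus mutation is enough; no direction information is ever computed.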
The gradient always points along the steepest slope of the landscape at the point where it's measured. Depending on the point and the landscape, sure, it may not point directly at the minimum. But you wouldn't know where the minimum is anyway. If you knew, you could just jump there.
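The gap between "steepest slope here" and "direction to the minimum" is easy to see on a stretched bowl. The quadratic below is a made-up example: its minimum sits at the origin, but the gradient is ten times steeper along y, so the steepest-descent direction and the straight line to the minimum disagree at most points.

```python
import numpy as np

# f(x, y) = x**2 + 10*y**2 has its minimum at the origin
p = np.array([1.0, 1.0])
g = np.array([2 * p[0], 20 * p[1]])  # analytic gradient of f at p
to_min = -p                          # the straight line to the minimum

# cosine of the angle between steepest descent (-g) and the true direction
cos = (-g) @ to_min / (np.linalg.norm(g) * np.linalg.norm(to_min))
print(cos)  # ≈ 0.774, roughly a 39° disagreement
```

The more anisotropic the curvature, the larger that angle gets, which is exactly why steepest descent zig-zags on ill-conditioned problems.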