"First, the inner optimization path need not be stored nor differentiated through, thereby making implicit MAML memory efficient and scalable to a large number of inner optimization steps."
"Second, implicit MAML is agnostic to the inner optimization method used, as long as it can find an approximate solution to the inner-level optimization problem."
"We show that an ε-approximate meta-gradient can be computed via implicit MAML using Õ(log(1/ε)) gradient evaluations and Õ(1) memory, meaning the memory required does not grow with number of gradient steps."
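The quoted properties can be illustrated with a toy sketch of the implicit meta-gradient computation. This is an assumed minimal implementation on a quadratic inner loss, not the paper's code: the inner problem is phi* = argmin_phi L(phi) + (lam/2)||phi - theta||^2, and the meta-gradient is (I + (1/lam) H)^{-1} g, where H is the Hessian of L at phi* and g is the outer-loss gradient. All names (`lam`, `solve_inner`, etc.) are illustrative. Only the final inner iterate is used, so memory stays constant in the number of inner steps, and the linear solve uses conjugate gradient with matrix-vector products only.

```python
import numpy as np

# Toy quadratic inner loss L(phi) = (1/2) phi^T A phi - b^T phi (assumed example).
A = np.array([[3.0, 1.0], [1.0, 2.0]])  # Hessian of the inner loss
b = np.array([1.0, -1.0])
lam = 1.0  # strength of the proximal regularizer toward theta

def inner_grad(phi, theta):
    # Gradient of L(phi) + (lam/2)||phi - theta||^2
    return A @ phi - b + lam * (phi - theta)

def solve_inner(theta, steps=200, lr=0.1):
    # Any approximate inner solver works; no unrolled path is stored,
    # so memory does not grow with the number of gradient steps.
    phi = theta.copy()
    for _ in range(steps):
        phi -= lr * inner_grad(phi, theta)
    return phi

def conjugate_gradient(matvec, g, iters=50, tol=1e-12):
    # Solve matvec(v) = g using only matrix-vector products (O(1) memory).
    v = np.zeros_like(g)
    r = g.copy()
    p = r.copy()
    rs = r @ r
    for _ in range(iters):
        Ap = matvec(p)
        alpha = rs / (p @ Ap)
        v += alpha * p
        r -= alpha * Ap
        rs_new = r @ r
        if rs_new < tol:
            break
        p = r + (rs_new / rs) * p
        rs = rs_new
    return v

theta = np.array([0.5, 0.5])
phi_star = solve_inner(theta)
grad_outer = phi_star  # e.g. outer loss (1/2)||phi*||^2, so its gradient is phi*
# Implicit meta-gradient: solve (I + H/lam) v = grad_outer
meta_grad = conjugate_gradient(lambda v: v + (A @ v) / lam, grad_outer)
```

For this quadratic toy problem the result can be checked against the direct solve `np.linalg.solve(np.eye(2) + A / lam, grad_outer)`; in the non-quadratic case the Hessian-vector product `A @ v` would be replaced by an automatic-differentiation HVP, keeping the same constant-memory structure.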