Nice, but t-SNE looks *much* worse than it should for this example. I suspect it's the learning rate. For n = 1 million points you need to increase the learning rate from the default of 200; a good rule of thumb is to set learning_rate = n/12.
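A minimal sketch of that heuristic, assuming scikit-learn's TSNE; the point-cloud file name is hypothetical, and the only change from the defaults is scaling learning_rate with the number of points:

```python
# Sketch: apply the n/12 learning-rate rule of thumb with scikit-learn's TSNE.
# "trex_points.npy" is a placeholder for your own (n_points, 3) array.
import numpy as np
from sklearn.manifold import TSNE

points = np.load("trex_points.npy")
n = points.shape[0]

tsne = TSNE(
    n_components=2,
    learning_rate=max(n / 12, 200),  # scale with n instead of keeping the default 200
    init="pca",
    verbose=1,
)
embedding = tsne.fit_transform(points)
```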
Definitely, we just grabbed the default parameters. Thanks for the tip on hyperparameters; we'll rerun and update!
(14 more replies)
New conversation
And a closer look at ivis (https://github.com/beringresearch/ivis …) because we're biased
#OpenSource pic.twitter.com/t3OCDaA3qy
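For context, a minimal sketch of running ivis itself (github.com/beringresearch/ivis); parameter names follow the project's README at the time of writing, so check the repo for the current API, and the random data is a stand-in for the real point cloud:

```python
# Sketch: embed a 3-D point cloud with ivis.
import numpy as np
from ivis import Ivis

points = np.random.rand(10_000, 3).astype("float32")  # placeholder point cloud

model = Ivis(embedding_dims=2, k=15)  # k nearest neighbours used to build triplets
embedding = model.fit_transform(points)
print(embedding.shape)                # (10000, 2)
```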
Gorgeous! Any chance of running it just a bit longer, to see how ivis looks once it has settled down (compared to @leland_mcinnes's UMAP)?
Final configuration! UMAP ran for ~8 hours and ivis for ~2 hours under default parameters. pic.twitter.com/uH5ko9Btr1
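A sketch of that default-parameter comparison, assuming the umap-learn and ivis packages; exact defaults depend on the installed versions, and the data loading is a placeholder:

```python
# Sketch: embed the same point cloud with UMAP and ivis under default parameters,
# timing each run as described above.
import time
import numpy as np
import umap
from ivis import Ivis

points = np.load("trex_points.npy")           # hypothetical (n, 3) point cloud

t0 = time.time()
umap_emb = umap.UMAP().fit_transform(points)  # all defaults
print(f"UMAP: {time.time() - t0:.0f}s")

t0 = time.time()
ivis_emb = Ivis().fit_transform(points)       # all defaults
print(f"ivis: {time.time() - t0:.0f}s")
```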
(2 more replies)
New conversation
This is oddly satisfying to watch. Which dataset did you use? I'm working on a new algorithm and would love to get it into that table!
We used the dataset from https://3d.si.edu/t-rex. Head to the downloads section and grab the file called 'Water-tight - T. rex and Triceratops full skeletons, 1:20 scale, .STL (48MB)'. It would be great to see how your algorithm performs!
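A sketch of turning that STL download into a point cloud for the embeddings above; trimesh is an assumption (any STL reader exposing vertices would do), and the local file name is whatever you saved the 48 MB download as:

```python
# Sketch: load the downloaded STL and extract xyz points.
import trimesh

mesh = trimesh.load("t_rex_triceratops_1_20.stl")  # hypothetical local file name
points = mesh.vertices                             # (n_vertices, 3) xyz coordinates
print(points.shape)

# Alternatively, sample points uniformly from the surface to control n directly:
sampled, _ = trimesh.sample.sample_surface(mesh, 1_000_000)
```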
End of conversation
New conversation
Wow that's so cool! Really impressed by the UMAP result. But LLE doesn't look particularly helpful...