kaalam.ai

@kaalam_ai

The authors of Jazz, the efficient open-source AI platform. Tweets and news from Twitter and arXiv.

Global
Joined November 2017

Tweets


  1. Pinned Tweet
    8 Jun 2019
  2. Retweeted
    3 Feb

    PyTorch for research, C++ for production?

  3. Retweeted

    If an algorithm has "mastered natural language", I would expect it to be able to do some of the things language is for -- communicating information, receiving information, acting on the world... Not merely output something that statistically sounds like language.

  4. 3 Feb

    How did no one think about that before? "The core idea of our approach is to transform existing, pre-trained word embeddings via semantic differentials to a new "polar" space with interpretable dimensions defined by such polar opposites"

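The "polar" transformation quoted above is easy to sketch: each interpretable dimension is the direction between a pair of semantic opposites, and a word's coordinate on that axis is the projection of its embedding onto that direction. The toy vectors and antonym pairs below are illustrative, not the paper's actual data or method details:

```python
import numpy as np

# Toy pre-trained embeddings (in practice: word2vec/GloVe vectors).
emb = {
    "good":  np.array([ 1.0,  0.2, 0.0]),
    "bad":   np.array([-1.0, -0.1, 0.1]),
    "hot":   np.array([ 0.1,  1.0, 0.3]),
    "cold":  np.array([ 0.0, -1.0, 0.2]),
    "pizza": np.array([ 0.6,  0.7, 0.5]),
}

# Each polar axis is the difference vector of an antonym pair,
# normalized to unit length.
pairs = [("good", "bad"), ("hot", "cold")]
axes = np.stack([emb[a] - emb[b] for a, b in pairs])
axes /= np.linalg.norm(axes, axis=1, keepdims=True)

def to_polar(word):
    """Coordinates of `word` in the interpretable polar space:
    one score per opposite pair."""
    return axes @ emb[word]

print(np.round(to_polar("pizza"), 2))
```

By construction, a word's sign on each axis says which pole it is closer to, which is what makes the transformed dimensions readable.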
  5. Retweeted
    3 Feb

    Added ImageNet validation results for 164 pretrained models on several datasets, including ImageNet-A, ImageNetV2, and ImageNet-Sketch. No surprise, models with exposure to more data do quite well. Without extra data, EfficientNets are holding their own.

  6. Retweeted
    21 Jan

    Those 3 dreaded Stackoverflow words

    Cats comic where a little kitty asks for help catching mice.
    Tiger: Catching mice is no longer considered a best practice
    Bigger Tiger: Recommend upgrading to relying on humans
    Lion: Covered by how to stalk birds. Marked as duplicate
  7. Retweeted
    31 Jan

    Got graphs? 📈📊 In episode 2 of Neural Structured Learning, Software Engineer Arjun Gopalan discusses what natural graphs are and how their data can be used to train neural networks. Watch now →

    Training with natural graphs
  8. Retweeted
    1 Feb

    Sentiment analysis is still mostly bullshit, friends.

  9. Retweeted
    31 Jan

    Today we announce a novel, open-source method for text generation tasks (e.g., summarization or sentence fusion), which uses edit operations instead of generating text from scratch, leading to fewer errors and faster model execution. Read about it below.

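The edit-operation approach described above predicts a tag per source token, keep it, delete it, or insert a short phrase before it, instead of decoding the output from scratch. A toy sketch of applying such tags to a sentence-fusion example (the tagger itself, which would be learned, is not shown, and nothing here is the announced model's actual implementation):

```python
# Apply per-token edit tags: each tag is (op, phrase), where `op` is
# "KEEP" or "DELETE" and `phrase`, if set, is inserted before the token.
def apply_edits(tokens, tags):
    out = []
    for tok, (op, phrase) in zip(tokens, tags):
        if phrase:
            out.append(phrase)
        if op == "KEEP":
            out.append(tok)
    return " ".join(out)

# Fuse two sentences into one by deleting the first period and the
# repeated subject, and inserting "and".
tokens = ["Turing", "was", "born", "in", "1912", ".",
          "Turing", "died", "in", "1954", "."]
tags = [("KEEP", None)] * 5 + [("DELETE", None), ("DELETE", None),
        ("KEEP", "and")] + [("KEEP", None)] * 3
print(apply_edits(tokens, tags))
# Turing was born in 1912 and died in 1954 .
```

Because most output tokens are copied, the model only has to get a handful of tag decisions right, which is where the speed and error-rate gains come from.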
  10. 31 Jan
  11. Retweeted
    30 Jan
  12. Retweeted
    30 Jan
  13. Retweeted
    Replying to

    Make sure to check out the KerasTuner implementation for , as well! 😄✨ Vignettes, documentation, and more available on Turgut's GitHub.

  14. Retweeted
    29 Jan

    Machine Unlearning “Once users have shared their data online, it is difficult to revoke access and ask for the data to be deleted. ML exacerbates this problem because any model trained with said data may have memorized it, putting users' privacy at risk.”

  15. Retweeted
    28 Jan

    Check out Meena, a new state-of-the-art open-domain conversational agent, released along with a new evaluation metric, the Sensibleness and Specificity Average, which captures basic, but important attributes for normal conversation. Learn more below!

  16. Retweeted
    28 Jan

    Procedural Content Generation via Reinforcement Learning “A new approach to procedural content generation in games, where level design is framed as a game (as a sequential task problem), and the content generator itself is learned.”

  17. Retweeted
    26 Jan

    In this paper we present a simple yet powerful idea: when using a recurrent AE to perform online lossy compression of a highly temporally correlated signal, one should feed the state of the decoder back to the encoder. We compare FRAE to many natural autoencoder designs.

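The feedback idea in the FRAE tweet above, routing the decoder's recurrent state back into the encoder at the next time step so the encoder only has to encode what the decoder cannot already predict, can be shown as pure dataflow. Everything below (linear maps, sizes, names) is an illustrative, untrained toy, not the paper's architecture:

```python
import numpy as np

rng = np.random.default_rng(0)
d_x, d_h, d_z = 4, 8, 2          # signal, decoder-state, and code sizes

# Toy linear maps standing in for the encoder/decoder networks.
W_enc = rng.normal(size=(d_z, d_x + d_h)) * 0.1
W_dec = rng.normal(size=(d_h, d_z + d_h)) * 0.1
W_out = rng.normal(size=(d_x, d_h)) * 0.1

h = np.zeros(d_h)                # decoder state, also fed to the encoder
recon = []
for t in range(5):
    x_t = rng.normal(size=d_x)   # would be a temporally correlated signal
    # Encoder sees the current frame AND the decoder's previous state.
    z_t = np.tanh(W_enc @ np.concatenate([x_t, h]))
    # Decoder updates its state from the code and its own past state.
    h = np.tanh(W_dec @ np.concatenate([z_t, h]))
    recon.append(W_out @ h)

print(len(recon), recon[0].shape)
# 5 (4,)
```

The key structural point is the shared `h`: the code `z_t` only needs to carry the residual information the decoder's state does not already contain, which is why feedback helps on highly correlated signals.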
  18. Retweeted
    25 Jan

    Learning to adapt class-specific features across domains for semantic segmentation by Mikel Menta et al.

  19. Retweeted
    26 Jan

    Partially-Shared Variational Auto-encoders for Unsupervised Domain Adaptation with Target Shift by Ryuhei Takahashi et al.

  20. Retweeted
    23 Jan

    Q-learning is difficult to apply when the number of available actions is large. We show that a simple extension based on amortized stochastic search allows Q-learning to scale to high-dimensional discrete, continuous or hybrid action spaces:

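The amortized-search trick above replaces the exact argmax over a huge action set with a max over a small set of candidates drawn from a learned proposal distribution. A minimal sketch with toy stand-ins for the Q-function and the proposal (nothing here is the paper's actual architecture or training procedure):

```python
import numpy as np

rng = np.random.default_rng(1)

def q_value(state, action):
    """Toy Q-function over a continuous action space (its optimum,
    by construction, is at action == state)."""
    return -np.sum((action - state) ** 2)

def proposal(state, n_samples=32):
    """Stand-in for the learned, amortized proposal network: returns
    candidate actions to evaluate, instead of enumerating or
    optimizing over the whole high-dimensional action space."""
    return state + rng.normal(scale=0.5, size=(n_samples, state.shape[0]))

def greedy_action(state):
    # Approximate argmax_a Q(s, a) by a max over the sampled candidates.
    candidates = proposal(state)
    values = [q_value(state, a) for a in candidates]
    return candidates[int(np.argmax(values))]

s = np.array([0.3, -0.7])
best = greedy_action(s)
print(np.round(best, 2))  # close to the toy optimum at s
```

The cost of action selection is now a fixed number of Q evaluations per step, independent of the size of the (discrete, continuous, or hybrid) action space.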
  21. Retweeted

    Machine learning is about fitting to a static distribution -- human learning is about gathering knowledge that may turn out to be useful in a future that is guaranteed to share little commonality with the past.


