Daniel Adamec

@DanielAdamec5

Math! Language! Other things! Check out my secondary account for side-projects here.

Joined: February 2019

Tweets


  1. Retweeted
    Feb 2

    Wikipedia2Vec: An Efficient Toolkit for Learning and Visualizing the Embeddings of Words and Entities from Wikipedia
  2. Jan 28

    PersLay: A Neural Network Layer for Persistence Diagrams and New Graph Topological Signatures
  3. Retweeted
    Jan 27

    Capturing Evolution in Word Usage: Just Add More Clusters? (arXiv:2001.06629v2 [] UPDATED)
  4. Retweeted
    Jan 27

    PDE-based Group Equivariant Convolutional Neural Networks. (arXiv:2001.09046v1 [cs.LG])
  5. Retweeted
    Jan 27

    MagNet: Discovering Multi-agent Interaction Dynamics using Neural Network. (arXiv:2001.09001v1 [cs.LG])
  6. Retweeted
    Jan 27

    PoWER-BERT: Accelerating BERT inference for Classification Tasks. (arXiv:2001.08950v1 [cs.LG])
  7. Retweeted
    Jan 27

    From Nesterov's Estimate Sequence to Riemannian Acceleration. (arXiv:2001.08876v1 [math.OC])
  8. Retweeted
    Jan 24

    I have been waiting for that: Scaling Laws for Neural Language Models, by the / team!
  9. Retweeted
    Jan 13

    "Data-Dependence of Plateau Phenomenon in Learning with Neural Network --- Statistical Mechanical Analysis", Yuki Y…
  10. Retweeted
    Jan 13

    Inductive Document Network Embedding with Topic-Word Attention
  11. Retweeted
    Jun 9, 2019
  12. Retweeted
    Jan 9

    Training Neural SDEs: We worked out how to do scalable reverse-mode autodiff for stochastic differential equations. This lets us fit SDEs defined by neural nets with black-box adaptive higher-order solvers. With , and .
  13. Retweeted
    Dec 19, 2019

    Check out our latest work, DeFINE: Deep Factorized Input word Representations, accepted at . DeFINE is as efficient as existing methods such as Adaptive Inputs from , but delivers better performance. Work done at
  14. Retweeted
    Dec 14, 2019

    Interested in semantic processing? New preprint out by , and myself: Differential contributions of left-hemispheric language regions to basic semantic composition.
  15. Retweeted
    Dec 13, 2019

    Biases for Emergent Communication in Multi-agent Reinforcement Learning
  16. Retweeted
    Dec 10, 2019

    Embedding Comparator: Visualizing Differences in Global Structure and Local Neighborhoods via Small Multiples
  17. Retweeted
    Dec 2, 2019

    Probing Natural Language Inference Models through Semantic Fragments
  18. Retweeted
    Nov 27, 2019

    A Mutual Information Maximization Perspective of Language Representation Learning
  19. Retweeted
    Nov 26, 2019

    "Rethinking Softmax with Cross-Entropy: Neural Network Classifier as Mutual Information Estimator", Zhenyue Qin, Do…
  20. Retweeted
    Nov 25, 2019

    Our recent paper on abstracting semantic spaces for tasks/domains using structured meta learning is up on arXiv. The inferred structure helps us improve upon existing transfer learning strategies. Link:
