vochicong

@vochicong

I play chess/shogi ❤️ work with CI/CD, NLP/RL/ML 😎 speak Vietnamese, Japanese and English, forgot Russian 🤠 majored in Operations Research 🤓

Tokyo
Joined June 2008

Tweets


  1. Pinned Tweet
    Sep 4, 2019
  2. Jan 31
  3. Retweeted
    Jan 28

    New paper: Towards a Human-like Open-Domain Chatbot. Key takeaways: 1. "Perplexity is all a chatbot needs" ;) 2. We're getting closer to a high-quality chatbot that can chat about anything Paper: Blog:

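The "perplexity is all a chatbot needs" takeaway refers to the paper's finding that a model's perplexity tracks human judgments of conversation quality. As a minimal sketch (not the paper's code), perplexity is just the exponentiated average negative log-likelihood of the tokens:

```python
import math

def perplexity(token_probs):
    """Perplexity of a sequence, given the probability the model
    assigned to each token: exp of the mean negative log-likelihood.
    Lower means the model is less 'surprised' by the text."""
    nll = -sum(math.log(p) for p in token_probs) / len(token_probs)
    return math.exp(nll)

# A model that gives every token probability 1/4 has perplexity 4,
# i.e. it is as uncertain as a uniform choice among 4 tokens.
print(perplexity([0.25, 0.25, 0.25, 0.25]))
```

In practice the per-token probabilities come from the chatbot's softmax outputs on held-out dialogue; the formula itself is model-agnostic.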
  4. Retweeted
    Jan 27

    A lot has been said about the dangers of large-scale language models trained on the internet, like GPT-2. We fine-tuned a version of GPT-2 to avoid generating descriptions of non-normative behavior (killing, suicide, things inconsistent with social norms)

  5. Retweeted

    A few days ago, in a hotel elevator in Hawaii, a white mom holding two babies couldn't press the button for her floor, so I asked "Which floor?" and pressed it for her. She said "Thank you, Darling!" and I turned around with a surprised look on my face... (continued)

  6. Retweeted
    Jan 23

    Google Dataset Search is now officially out of beta. "Dataset Search has indexed almost 25 million of these datasets, giving you a single place to search for datasets & find links to where the data is." Nice work, Natasha Noy and everyone else involved!

  7. Retweeted
    Jan 14

    By restructuring math expressions as a language, Facebook AI has developed the first neural network that uses symbolic reasoning to solve advanced mathematics problems.

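"Restructuring math expressions as a language" means serializing an expression tree into a flat token sequence (typically prefix notation) so a standard seq2seq model can read and emit it. A toy sketch of that serialization step, assuming a simple tuple-based tree format (the paper's actual representation differs in detail):

```python
def to_prefix(expr):
    """Flatten an expression tree into prefix-notation tokens.
    Internal nodes are tuples (operator, operand, ...); leaves are
    numbers or variable names."""
    if isinstance(expr, tuple):
        op, *args = expr
        tokens = [op]
        for arg in args:
            tokens += to_prefix(arg)
        return tokens
    return [str(expr)]

# x**2 + 3*x as a tree, flattened for a sequence model:
tree = ("+", ("pow", "x", 2), ("*", 3, "x"))
print(to_prefix(tree))  # ['+', 'pow', 'x', '2', '*', '3', 'x']
```

Because prefix notation needs no parentheses, the token sequence is unambiguous and can be decoded back into a tree, which is what makes sequence models a workable fit for symbolic math.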
  8. Retweeted

    A while ago, I thought doing things in a clever way was cool. Now, I understand that simple is better than clever. Clever is mostly showing off.

  9. Retweeted
    Dec 24, 2019

    Some folks still seem confused about what deep learning is. Here is a definition: DL is constructing networks of parameterized functional modules & training them from examples using gradient-based optimization....

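That definition (networks of parameterized functional modules, trained from examples by gradient-based optimization) can be shown in miniature: a single linear module y = w*x + b, fit by plain gradient descent on mean squared error. An illustrative sketch, not any particular framework's code:

```python
def train(examples, lr=0.05, steps=500):
    """Fit y = w*x + b to (x, y) examples by gradient descent on MSE."""
    w, b = 0.0, 0.0                      # the module's parameters
    n = len(examples)
    for _ in range(steps):
        # Analytic gradients of mean((w*x + b - y)**2) w.r.t. w and b.
        gw = sum(2 * (w * x + b - y) * x for x, y in examples) / n
        gb = sum(2 * (w * x + b - y) for x, y in examples) / n
        w -= lr * gw                     # gradient step
        b -= lr * gb
    return w, b

# Recover y = 3x + 1 from four examples.
w, b = train([(0, 1), (1, 4), (2, 7), (3, 10)])
print(round(w, 2), round(b, 2))  # 3.0 1.0
```

Deep learning replaces the single linear module with a composition of many such modules and computes the gradients by backpropagation instead of by hand, but the training loop has exactly this shape.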
  10. Retweeted
    Dec 24, 2019
  11. Retweeted
    Dec 23, 2019

    Far more people than I expected were asking Santa for an HHKB, so the official account is giving one away too!! ① Follow ② RT this tweet to enter! Deadline 12/25 (I couldn't arrange the new HHKB, for which I take the blame, but I did my best, so please bear with me...)

  12. Dec 22, 2019

    Looking forward to this "In a not so future release, you will be able to use your custom language model fine-tuned on custom corpus for the encoder model."

  13. Retweeted
  14. Retweeted
    Dec 17, 2019

    A 2019 year-end comparison of Japanese morphological analyzers - Qiita

  15. Retweeted
    Dec 18, 2019

    🔥🔥 Series A!! 🔥🔥 Solving natural language is going to be the biggest achievement of our lifetime, and it is the best proxy for artificial intelligence. No single company, not even the tech titans, will be able to do it alone; the only way we'll achieve this is by working together

  16. Retweeted
    Dec 16, 2019
    Replying to

    All you can do is patiently explain Japan's garbage collection rules. Until the scheduled garbage pickup that began in 1960 took root in Japan, the cities were full of garbage too.

  17. Retweeted
    Dec 12, 2019

    Good morning, Japanese friends. Hello, Friends from Japan 🇯🇵! Thanks to , we now have a state-of-the-art Japanese language model in Transformers, `bert-base-japanese`. Can you guess what the model outputs in the masked LM task below?

  18. Retweeted
    Dec 11, 2019

    The wait is over! You can now take the new specialization course on . We’re excited to see more people learn how to apply ML to build awesome things! Thank you for your partnership!

  19. Retweeted
    Dec 11, 2019

    A Japanese BERT pretrained model has been added to huggingface/transformers.

  20. Retweeted

    TF 2.1 is coming soon (RC0 out now), with extended TPU support: (via )

  21. Retweeted
    Dec 6, 2019

    This is really cool work from UberAI on a tough question: Is it possible to control the generations of an unconditionally trained language model? We loved it so much that we added it to our repo and made an online demo to play with it! Give it a try👉


