François Chollet (Verified account)
@fchollet

Deep learning @google. Creator of Keras. Author of 'Deep Learning with Python'. Opinions are my own.

United States · fchollet.com · Joined August 2009

Tweets

© 2021 Twitter

François Chollet (@fchollet) · 11 Mar 2019

Are you a deep learning researcher? Wondering if all this TensorFlow 2.0 stuff you heard about is relevant to you? This thread is a crash course on everything you need to know to use TensorFlow 2.0 + Keras for deep learning research. Read on! pic.twitter.com/dFNI2E6yjF
François Chollet (@fchollet) · 11 Mar 2019

1) The first class you need to know is `Layer`. A Layer encapsulates a state (weights) and some computation (defined in the `call` method). pic.twitter.com/og1hmez7vu
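The code screenshot attached to this tweet is not captured in the transcript. Here is a minimal sketch of the pattern it describes, a layer holding weights as state and doing its computation in `call` (the `Linear` layer name and the dimensions are illustrative):

```python
import tensorflow as tf

class Linear(tf.keras.layers.Layer):
    """A simple densely-connected layer: y = x.W + b."""

    def __init__(self, units=32, input_dim=32):
        super().__init__()
        # State: a kernel matrix and a bias vector.
        w_init = tf.random_normal_initializer()
        self.w = tf.Variable(w_init(shape=(input_dim, units)), trainable=True)
        b_init = tf.zeros_initializer()
        self.b = tf.Variable(b_init(shape=(units,)), trainable=True)

    def call(self, inputs):
        # Computation: an affine transform of the inputs.
        return tf.matmul(inputs, self.w) + self.b

linear_layer = Linear(units=4, input_dim=2)
y = linear_layer(tf.ones((3, 2)))  # shape (3, 4)
```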
François Chollet (@fchollet) · 11 Mar 2019

2) The `add_weight` method gives you a shortcut for creating weights. 3) It's good practice to create weights in a separate `build` method, called lazily with the shape of the first inputs seen by your layer. Here, this pattern prevents us from having to specify `input_dim`: pic.twitter.com/9aZ9AkZFVQ
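The attached code image is missing from this transcript. A sketch of the lazy-build pattern described, using `add_weight` inside `build` (layer name and sizes are illustrative):

```python
import tensorflow as tf

class Linear(tf.keras.layers.Layer):
    def __init__(self, units=32):
        super().__init__()
        self.units = units

    def build(self, input_shape):
        # Weights are created lazily, once the input shape is known,
        # so no `input_dim` argument is needed in __init__.
        self.w = self.add_weight(shape=(input_shape[-1], self.units),
                                 initializer="random_normal",
                                 trainable=True)
        self.b = self.add_weight(shape=(self.units,),
                                 initializer="zeros",
                                 trainable=True)

    def call(self, inputs):
        return tf.matmul(inputs, self.w) + self.b

layer = Linear(units=4)       # no input_dim specified
y = layer(tf.ones((3, 2)))    # build() runs here with input_shape (3, 2)
```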
François Chollet (@fchollet) · 11 Mar 2019

4) You can automatically retrieve the gradients of the weights of a layer by calling it inside a GradientTape. Using these gradients, you can update the weights of the layer, either manually, or using an optimizer object. Of course, you can modify the gradients before using them. pic.twitter.com/XcFLBMPFhy
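The original code image is lost; a minimal sketch of the described workflow, with random data for illustration only:

```python
import tensorflow as tf

layer = tf.keras.layers.Dense(4)
optimizer = tf.keras.optimizers.SGD(learning_rate=0.1)

x = tf.random.normal((8, 2))
y_true = tf.random.normal((8, 4))

with tf.GradientTape() as tape:
    y_pred = layer(x)  # forward pass, recorded on the tape
    loss = tf.reduce_mean(tf.square(y_pred - y_true))

# Retrieve the gradients of the loss w.r.t. the layer's weights...
grads = tape.gradient(loss, layer.trainable_weights)
# ...and use them to update the weights via the optimizer.
optimizer.apply_gradients(zip(grads, layer.trainable_weights))
```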
François Chollet (@fchollet) · 11 Mar 2019

5) Weights created by layers can be either trainable or non-trainable. They're exposed in the layer properties `trainable_weights` and `non_trainable_weights`. Here's a layer with a non-trainable weight: pic.twitter.com/QKGyZr7OxM
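The screenshot is not preserved; a sketch of a layer with a non-trainable weight (a running-sum layer is a common illustration of this pattern, though the exact example in the image is an assumption):

```python
import tensorflow as tf

class ComputeSum(tf.keras.layers.Layer):
    """Keeps a running sum of its inputs in a non-trainable weight."""

    def __init__(self, input_dim):
        super().__init__()
        self.total = self.add_weight(shape=(input_dim,),
                                     initializer="zeros",
                                     trainable=False)

    def call(self, inputs):
        # Updated by hand; never touched by gradient descent.
        self.total.assign_add(tf.reduce_sum(inputs, axis=0))
        return self.total

my_sum = ComputeSum(2)
_ = my_sum(tf.ones((2, 2)))
_ = my_sum(tf.ones((2, 2)))
# my_sum.total is now [4., 4.]; it appears in non_trainable_weights only.
```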
François Chollet (@fchollet) · 11 Mar 2019

6) Layers can be recursively nested to create bigger computation blocks. Each layer will track the weights of its sublayers (both trainable and non-trainable). pic.twitter.com/WeJMQ2nkte
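The attached image is missing; a sketch of nesting, where an outer layer composes three `Dense` sublayers and tracks all of their weights (the block's architecture here is illustrative):

```python
import tensorflow as tf

class MLPBlock(tf.keras.layers.Layer):
    """A block built out of sublayers; their weights are tracked automatically."""

    def __init__(self):
        super().__init__()
        self.dense_1 = tf.keras.layers.Dense(32, activation="relu")
        self.dense_2 = tf.keras.layers.Dense(32, activation="relu")
        self.dense_3 = tf.keras.layers.Dense(1)

    def call(self, inputs):
        x = self.dense_1(inputs)
        x = self.dense_2(x)
        return self.dense_3(x)

mlp = MLPBlock()
y = mlp(tf.ones((3, 64)))  # first call creates all the weights
# 3 Dense sublayers x (kernel + bias) = 6 tracked weight tensors
```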
François Chollet (@fchollet) · 11 Mar 2019

7) Layers can create losses during the forward pass. This is especially useful for regularization losses. The losses created by sublayers are recursively tracked by the parent layers. pic.twitter.com/TcsfyBqilg
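The code image is lost; a sketch of a layer that creates a regularization loss in its forward pass via `add_loss` (the activity-regularization example is an assumption, chosen to match the description):

```python
import tensorflow as tf

class ActivityRegularization(tf.keras.layers.Layer):
    """Adds a loss proportional to the squared L2 norm of its inputs."""

    def __init__(self, rate=1e-2):
        super().__init__()
        self.rate = rate

    def call(self, inputs):
        # Losses created here are collected in `layer.losses`,
        # and tracked recursively by any parent layer.
        self.add_loss(self.rate * tf.reduce_sum(tf.square(inputs)))
        return inputs

layer = ActivityRegularization()
_ = layer(tf.ones((2, 2)))
# layer.losses now holds one scalar regularization loss
```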
François Chollet (@fchollet) · 11 Mar 2019

8) These losses are cleared by the top-level layer at the start of each forward pass -- they don't accumulate. `layer.losses` always contains only the losses created during the *last* forward pass. You would typically use these losses by summing them when writing a training loop. pic.twitter.com/ROYk5AOQUL
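The original loop isn't preserved; a sketch of how such losses are typically consumed in a custom training loop, adding `model.losses` to the main loss (the model and data here are arbitrary examples):

```python
import tensorflow as tf

# Any model whose layers create losses; here, via a kernel regularizer.
model = tf.keras.Sequential([
    tf.keras.layers.Dense(4, kernel_regularizer=tf.keras.regularizers.l2(1e-3)),
])
optimizer = tf.keras.optimizers.SGD(learning_rate=0.1)
loss_fn = tf.keras.losses.MeanSquaredError()

x = tf.random.normal((8, 2))
y_true = tf.random.normal((8, 4))

with tf.GradientTape() as tape:
    y_pred = model(x)
    # `model.losses` holds only the losses from this forward pass.
    loss = loss_fn(y_true, y_pred) + sum(model.losses)

grads = tape.gradient(loss, model.trainable_weights)
optimizer.apply_gradients(zip(grads, model.trainable_weights))
```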
François Chollet (@fchollet) · 11 Mar 2019

9) You know that TF 2.0 is eager by default. Running eagerly is great for debugging, but you will get better performance by compiling your computation into static graphs. Static graphs are a researcher's best friend! You can compile any function by wrapping it in a tf.function: pic.twitter.com/CnSrpVmCJc
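The screenshot is missing; a minimal sketch of wrapping a computation in `tf.function` (the wrapped function is illustrative):

```python
import tensorflow as tf

dense = tf.keras.layers.Dense(4)

@tf.function  # compiles this Python function into a static graph
def forward(x):
    return dense(x)

# The first call traces and compiles the graph; later calls reuse it.
y = forward(tf.ones((3, 2)))
```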
François Chollet (@fchollet) · 11 Mar 2019

10) Some layers, in particular the `BatchNormalization` layer and the `Dropout` layer, have different behaviors during training and inference. For such layers, it is standard practice to expose a `training` (boolean) argument in the `call` method. pic.twitter.com/FA5pZM3kWS
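The code image isn't captured; a sketch of the `training` argument pattern, using a simplified dropout layer (the built-in `tf.keras.layers.Dropout` is more elaborate than this):

```python
import tensorflow as tf

class Dropout(tf.keras.layers.Layer):
    """Drops units at random during training only."""

    def __init__(self, rate):
        super().__init__()
        self.rate = rate

    def call(self, inputs, training=None):
        if training:
            return tf.nn.dropout(inputs, rate=self.rate)
        return inputs  # identity at inference time

layer = Dropout(0.5)
x = tf.ones((2, 4))
y_train = layer(x, training=True)   # some units zeroed out (and rescaled)
y_test = layer(x, training=False)   # unchanged
```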
François Chollet (@fchollet) · 11 Mar 2019

11) You have many built-in layers available, from Dense to Conv2D to LSTM to fancier ones like Conv2DTranspose or ConvLSTM2D. Be smart about reusing built-in functionality.
François Chollet (@fchollet) · 11 Mar 2019

12) To build deep learning models, you don't have to use object-oriented programming all the time. All layers we've seen so far can also be composed functionally, like this (we call it the "Functional API"): pic.twitter.com/OohI9IZlQ5
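The attached example is missing; a sketch of the same kind of MLP composed via the Functional API (the architecture is illustrative):

```python
import tensorflow as tf

# Layers are called on symbolic inputs to build a graph of layers.
inputs = tf.keras.Input(shape=(64,))
x = tf.keras.layers.Dense(32, activation="relu")(inputs)
x = tf.keras.layers.Dense(32, activation="relu")(x)
outputs = tf.keras.layers.Dense(1)(x)

model = tf.keras.Model(inputs=inputs, outputs=outputs)
y = model(tf.ones((3, 64)))
```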
François Chollet (@fchollet) · 11 Mar 2019

The Functional API tends to be more concise than subclassing, and provides a few other advantages (generally the same advantages that functional, typed languages provide over untyped OO development). Learn more about the Functional API: https://www.tensorflow.org/alpha/guide/keras/functional
François Chollet (@fchollet) · 11 Mar 2019

However, note that the Functional API can only be used to define DAGs of layers -- recursive networks should be defined as `Layer` subclasses instead. In your research workflows, you may often find yourself mixing and matching OO models and Functional models.
François Chollet (@fchollet) · 11 Mar 2019

That's all you need to get started with reimplementing most deep learning research papers in TensorFlow 2.0 and Keras! Now let's check out a really quick example: hypernetworks.
François Chollet (@fchollet) · 11 Mar 2019

A hypernetwork is a deep neural network whose weights are generated by another network (usually smaller). Let's implement a really trivial hypernetwork: we'll take the `Linear` layer we defined earlier, and we'll use it to generate the weights of... another `Linear` layer. pic.twitter.com/11HjEvBBkh
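The original implementation in the image is lost. A heavily simplified sketch of the idea (the original used the custom `Linear` layer; here a plain `Dense` layer generates a flat weight vector for a 2-in/1-out linear map, which is applied by hand; all dimensions are made up):

```python
import tensorflow as tf

input_dim, output_dim = 2, 1
num_weights = input_dim * output_dim + output_dim  # kernel + bias = 3

# The (smaller) network that generates the weights of the other network.
hypernetwork = tf.keras.layers.Dense(num_weights)

x = tf.ones((4, input_dim))
# Generate a flat weight vector from some conditioning input.
weights = hypernetwork(tf.ones((1, 8)))            # shape (1, 3)
kernel = tf.reshape(weights[0, :input_dim * output_dim],
                    (input_dim, output_dim))
bias = weights[0, input_dim * output_dim:]

# Apply the generated weights as a linear layer.
y = tf.matmul(x, kernel) + bias
```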
François Chollet (@fchollet) · 11 Mar 2019

Another quick example: implementing a VAE in either style, either subclassing (left) or the Functional API (right). I've posted this before. Find what works best for you! pic.twitter.com/3xUliC3nFb
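The side-by-side code image is not preserved. A compact sketch of the subclassing version only, tying together several of the patterns above (the reparameterization trick, `add_loss` for the KL term); all layer sizes are illustrative:

```python
import tensorflow as tf

class Sampling(tf.keras.layers.Layer):
    """Reparameterization trick: z = mean + exp(log_var / 2) * eps."""

    def call(self, inputs):
        z_mean, z_log_var = inputs
        eps = tf.random.normal(tf.shape(z_mean))
        return z_mean + tf.exp(0.5 * z_log_var) * eps

class VAE(tf.keras.Model):
    def __init__(self, original_dim=784, latent_dim=2):
        super().__init__()
        self.encoder_hidden = tf.keras.layers.Dense(64, activation="relu")
        self.z_mean = tf.keras.layers.Dense(latent_dim)
        self.z_log_var = tf.keras.layers.Dense(latent_dim)
        self.sampling = Sampling()
        self.decoder_hidden = tf.keras.layers.Dense(64, activation="relu")
        self.decoder_out = tf.keras.layers.Dense(original_dim,
                                                 activation="sigmoid")

    def call(self, inputs):
        h = self.encoder_hidden(inputs)
        z_mean, z_log_var = self.z_mean(h), self.z_log_var(h)
        z = self.sampling((z_mean, z_log_var))
        # KL divergence term, added as a layer-created loss (see tweet 7).
        kl = -0.5 * tf.reduce_mean(
            1 + z_log_var - tf.square(z_mean) - tf.exp(z_log_var))
        self.add_loss(kl)
        return self.decoder_out(self.decoder_hidden(z))

vae = VAE()
reconstruction = vae(tf.ones((3, 784)))
```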
François Chollet (@fchollet) · 11 Mar 2019

This is the end of this thread. Play with these code examples in this Colab notebook: https://colab.research.google.com/drive/17u-pRZJnKN0gO5XZmq8n5A2bKGrfKEUg 🦄🚀