actually besides http://fast.ai i always credit @chollet's DL book, starting with its very early draft; so the choice of Xception and use of Keras is due to the latter :)
-
I haven't been recommending Xception because grouped convs and depthwise separable convs are still slow in CuDNN AFAIK.
-
i like how lean the Xception model is; will try to compare the inference speed between my previous VGG-like model and Xception
-
VGG is the slowest. Compare it to rn34
-
ok - i abandoned resnets in favor of xception because the former seemed to require a larger input image size; will revisit
-
No modern CNN requires a specific input size. You can use any size you like with rn34. Be sure to use adaptive pooling to make this work (fastai does that for you)
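For illustration, a minimal sketch of the adaptive-pooling trick described above, assuming PyTorch and torchvision (this is illustrative, not fastai's actual code):

```python
import torch
import torch.nn as nn
from torchvision.models import resnet34

model = resnet34()
# AdaptiveAvgPool2d(1) collapses any H x W feature map to 1 x 1, so the
# final linear layer always sees a fixed-size vector, whatever the input size.
model.avgpool = nn.AdaptiveAvgPool2d(1)

# Inputs of different spatial sizes all work, as long as they are large
# enough to survive the network's downsampling:
for size in (160, 224, 320):
    x = torch.randn(1, 3, size, size)
    print(size, model(x).shape)  # torch.Size([1, 1000]) in every case
```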
-
Replying to @jeremyphoward @graphific
The Keras API puts a lower bound on the input image size - e.g. for rn50 it requires inputs no smaller than 197x197
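A quick way to check this, assuming TensorFlow's bundled Keras (the 197-pixel floor applies to the Keras version discussed here; later releases lowered it):

```python
from tensorflow.keras.applications import ResNet50

# Older keras.applications releases validated input_shape and rejected
# ResNet50 inputs below 197x197; newer ones allow much smaller inputs.
try:
    model = ResNet50(weights=None, include_top=False, input_shape=(96, 96, 3))
    print("accepted, output shape:", model.output_shape)
except ValueError as err:
    print("rejected:", err)
```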
-
Hi Helena, all modern CNNs perform downsampling on their inputs, which means they start with large feature maps that get smaller as you go down the network. In general, convolution is a downsampling operation unless you explicitly pad your inputs.
-
That means all CNNs have a minimum input size: the smallest size you can call the network on without producing empty feature maps. At that size, the output of your network is a 1x1 feature map.
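To make the arithmetic concrete, here is a small sketch that brute-forces the minimum input size for a stack of downsampling stages (the stage list is a made-up VGG-like stack with unpadded convolutions, not any particular released model):

```python
def out_size(n, kernel, stride, padding):
    """Spatial output size of a conv/pool stage on an n-pixel input."""
    return (n + 2 * padding - kernel) // stride + 1

def min_input_size(stages):
    """Smallest input whose feature maps stay non-empty through all stages."""
    size = 1
    while True:
        n, ok = size, True
        for kernel, stride, padding in stages:
            n = out_size(n, kernel, stride, padding)
            if n < 1:
                ok = False
                break
        if ok:
            return size
        size += 1

# Four blocks of [3x3 valid conv, 2x2 stride-2 pool], as (kernel, stride, padding):
stages = [(3, 1, 0), (2, 2, 0)] * 4
print(min_input_size(stages))  # 46 -- below that, some feature map is empty
```

At that minimum size the final stage outputs exactly a 1x1 feature map, as described above.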
-
For most nets, this tends to be roughly 50% of the input size the network was designed for. Downsampling is absolutely necessary when you have large inputs. To get rid of this limitation, you need to create your own network, where you would do less downsampling.
-
For instance, you could edit Xception and 1) remove max pooling layers, 2) use "same" padding in all convolution layers. This would create a version of Xception that would be compatible with smaller input sizes.
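A hedged sketch of that edit, assuming TensorFlow / Keras (a toy Xception-style block, not the real keras.applications.Xception graph):

```python
import tensorflow as tf
from tensorflow.keras import layers

def xception_like_block(x, filters):
    # Depthwise-separable convolutions with "same" padding preserve the
    # spatial size, and the usual stride-2 MaxPooling2D layer is omitted,
    # so this block does no downsampling at all.
    residual = layers.Conv2D(filters, 1, padding="same")(x)
    x = layers.SeparableConv2D(filters, 3, padding="same", activation="relu")(x)
    x = layers.SeparableConv2D(filters, 3, padding="same")(x)
    return layers.add([x, residual])

inputs = tf.keras.Input(shape=(32, 32, 3))  # a small, CIFAR-like input size
x = xception_like_block(inputs, 64)
x = xception_like_block(x, 128)
x = layers.GlobalAveragePooling2D()(x)
outputs = layers.Dense(10, activation="softmax")(x)
model = tf.keras.Model(inputs, outputs)
model.summary()
```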