I'm going into this with limited knowledge. I built a simple NN implementation from scratch to understand the math, but that's about it. So I'm going to tweet my progress and thoughts in this thread, in case someone might get value out of it. [2/n]
I'm currently taking in 512 samples for each left/right channel, running an FFT, and feeding the complex numbers as separate real/imaginary components into the network. 2048 neurons at the input layer. Then dense 1024/512/256/512/1024 hidden layers. [3/n]
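A rough NumPy sketch of that input framing (names and layout are my own illustration, not from any existing code): one 512-sample frame per channel, FFT each, split real/imaginary parts into 2048 input values.

```python
import numpy as np

FRAME = 512  # samples per channel per frame

def frame_to_input(left, right):
    """Turn one 512-sample stereo frame into a 2048-value input vector."""
    spec_l = np.fft.fft(left[:FRAME])   # 512 complex bins
    spec_r = np.fft.fft(right[:FRAME])  # 512 complex bins
    # Feed the complex numbers as separate real/imaginary features:
    # 512 bins * 2 (re/im) * 2 (channels) = 2048 input neurons.
    return np.concatenate([spec_l.real, spec_l.imag,
                           spec_r.real, spec_r.imag])

x = frame_to_input(np.random.randn(FRAME), np.random.randn(FRAME))
```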
At the output are 9216 neurons, corresponding to 9 buckets of 512 complex values each. This will be, in essence, 9 buckets from left to right in the mix, with 0 being left, 4 being middle, and 8 being right. [4/n]
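A sketch of how those 9216 outputs could be unpacked (assuming each bucket is laid out as contiguous real/imaginary blocks, which is my guess at a layout, not something stated above):

```python
import numpy as np

BUCKETS, BINS = 9, 512

def output_to_buckets(y):
    """Reinterpret a 9216-value output vector as 9 complex spectra,
    one per spatial bucket (0 = left ... 8 = right)."""
    y = y.reshape(BUCKETS, 2, BINS)      # (bucket, re/im, bin) -- assumed layout
    return y[:, 0, :] + 1j * y[:, 1, :]  # 9 buckets of 512 complex values

specs = output_to_buckets(np.zeros(BUCKETS * 2 * BINS))
```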
And I'm building training sets by taking a bunch of song snippets and chords made of sine waves, and mixing them at known spatial points. Might add reverb and such, too. That way I can do standard backprop. [5/n]
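One way to generate such a training pair, as a minimal sketch assuming a simple linear pan law (the actual mixing scheme isn't specified above):

```python
import numpy as np

def mix_at_bucket(mono, bucket, n_buckets=9):
    """Pan a mono snippet to a known spatial point, 0 = left ... 8 = right.
    Returns the stereo mix plus the target bucket for backprop."""
    pan = bucket / (n_buckets - 1)   # 0.0 (hard left) .. 1.0 (hard right)
    left = (1.0 - pan) * mono
    right = pan * mono
    return left, right, bucket

t = np.linspace(0, 1, 512, endpoint=False)
chord = np.sin(2 * np.pi * 440 * t) + np.sin(2 * np.pi * 550 * t)  # sine-wave "chord"
left, right, target = mix_at_bucket(chord, bucket=4)  # dead center
```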
One thing I realized a few minutes ago is that doing loss minimization directly on complex-valued outputs might not work well. So I need to figure out how to run an inverse FFT, drop the imaginary component of the result, and compute my cost function there. No clue how to do this yet. [6/n]
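The idea, sketched in NumPy (a hand-rolled illustration, not the eventual Keras code): inverse-FFT both the predicted and target spectra, drop the imaginary parts, and compute the loss on the resulting time-domain signals.

```python
import numpy as np

def time_domain_mse(pred_spec, true_spec):
    """MSE between the real parts of the inverse FFTs of two spectra."""
    pred_t = np.fft.ifft(pred_spec).real   # drop the imaginary component
    true_t = np.fft.ifft(true_spec).real
    return np.mean((pred_t - true_t) ** 2)
```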
TensorFlow/Keras seems like it should work great for all of this, but holy crap, it's hard to get into. I don't think I know enough of the terminology to intelligently google things yet. But hey, that's why I'm doing this! It's fun to be a complete noob sometimes :) [7/n]
Neat, apparently Lambda layers in Keras let you do arbitrary operations. Should be able to do my iFFT and imaginary-part drop there.
Replying to @daeken
Sure, you could put this in a Lambda: https://www.tensorflow.org/api_docs/python/tf/spectral/ifft … or just write a custom layer. Happy to answer any questions!
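A possible shape for that Lambda, assuming the layer's input is 512 real parts followed by 512 imaginary parts (my assumption about layout; note that `tf.signal.ifft` is the current name for the `tf.spectral.ifft` op linked above):

```python
import tensorflow as tf
from tensorflow.keras.layers import Lambda

def ifft_real(x):
    # x: (batch, 1024) = 512 real parts, then 512 imaginary parts (assumed layout)
    re, im = x[:, :512], x[:, 512:]
    spec = tf.complex(re, im)
    return tf.math.real(tf.signal.ifft(spec))  # iFFT, then drop the imaginary part

ifft_layer = Lambda(ifft_real)
out = ifft_layer(tf.zeros((2, 1024)))
```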
Replying to @fchollet
Awesome, thanks! I may well have some questions. This is really my first foray and I'm approximately 95% lost, so we'll see how things go :) Outside of not knowing how to find the things I need thus far, I'm enjoying myself.
Actually, quick question already: how do custom layers play with Keras' ability to run things on the GPU? I imagine that with a Lambda using only tf operations, it'll figure it out and compile it efficiently, but I can't imagine it doing the same with a custom layer.
You would typically write custom layers out of TF ops, almost all of which have CUDA implementations. So as long as you stick to these ops, you don't need to worry about gradients or running on GPU, everything will just magically work. Exciting times compared to 5 years ago ;)
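So a custom-layer version of the same iFFT-and-drop step might look like this sketch, built purely from stock TF ops so gradients and GPU placement come for free, as described above (the real/imag split layout is still my assumption):

```python
import tensorflow as tf

class IFFTReal(tf.keras.layers.Layer):
    """Inverse FFT of (real, imag) halves, keeping only the real part.
    Built from stock TF ops, so autodiff and CUDA kernels just work."""
    def call(self, x):
        re, im = tf.split(x, 2, axis=-1)  # assumed layout: real half, then imag half
        return tf.math.real(tf.signal.ifft(tf.complex(re, im)))

y = IFFTReal()(tf.zeros((3, 1024)))
```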
Replying to @fchollet
Great! I'm excited to actually get this running to the point of testing :)