Is there an efficient way to sort a list of integers so that each consecutive element differs from the last in as few bits as possible? I came up with an O(n^2) algorithm to do it, but I'm hoping there's a better way.
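(For reference, a simple O(n^2) approach is a greedy nearest-neighbor ordering by Hamming distance. This is just a sketch of one such baseline, not necessarily the algorithm the poster had in mind.)

```python
def greedy_hamming_sort(nums):
    """Greedy O(n^2) ordering: start with the first number, then
    repeatedly append the remaining number with the fewest differing
    bits (popcount of the XOR with the last chosen number)."""
    remaining = list(nums)
    out = [remaining.pop(0)]
    while remaining:
        i = min(range(len(remaining)),
                key=lambda j: bin(out[-1] ^ remaining[j]).count("1"))
        out.append(remaining.pop(i))
    return out
```

On Python 3.10+ the popcount can be written as `(out[-1] ^ remaining[j]).bit_count()`.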
I have an idea to reduce dimensionality. Split the number into, say, 4-bit chunks. For each chunk, treat it as if it were a Gray-coded integer and convert it back into a normal binary integer. Now you've divided the number of dimensions by 4 without discarding locality info.
-
I'm not sure I fully follow. How would the distance function look?
-
Just Euclidean distance (or maybe Manhattan, idk)
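(Presumably meaning distances between the decoded chunk vectors. A minimal sketch of both options, assuming equal-length vectors:)

```python
import math

def manhattan(p, q):
    """Manhattan (L1) distance between two coordinate vectors."""
    return sum(abs(a - b) for a, b in zip(p, q))

def euclidean(p, q):
    """Euclidean (L2) distance between two coordinate vectors."""
    return math.sqrt(sum((a - b) ** 2 for a, b in zip(p, q)))
```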