I will be streaming "Cache Invalidation Isn't Hard" at http://twitch.tv/handmade_hero shortly. The topic of this lecture will be that cache invalidation isn't hard, and will include discussions on cache invalidation and its not-being-hard-ness.
A 128-bit hash collides roughly every 2^64 unique inputs. I would add the extra key validation when Unicode defines a UTF-64 standard.
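For context on where that 2^64 figure comes from: it's the birthday bound, so it's a probability estimate, not a guarantee. Below is a rough Python sketch (my own illustration, not anything from the thread) of the standard approximation that for an ideal b-bit hash, the chance of at least one collision among n distinct inputs is about 1 - exp(-n^2 / 2^(b+1)); the function name and the 128-bit default are just placeholders.

```python
import math

# Illustrative birthday-bound arithmetic (sketch only):
# for an ideal b-bit hash, P(at least one collision among n inputs)
# is approximately 1 - exp(-n^2 / 2^(b+1)).
def collision_probability(n: int, bits: int = 128) -> float:
    """Approximate chance of >= 1 collision among n uniform random b-bit hashes."""
    # expm1 keeps precision when the probability is tiny.
    return -math.expm1(-(n * n) / (2.0 ** (bits + 1)))

for exp in (32, 48, 64):
    n = 2 ** exp
    print(f"n = 2^{exp}: P(collision) ~= {collision_probability(n):.3g}")
```

Even at 2^64 inputs the odds work out to only about 39%, so "roughly every 2^64 unique inputs" is the scale at which a collision becomes likely, not a hard threshold.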
-
-
Fair point. I get that the probability is incredibly low, but I was thinking along the lines of something where getting the correct answer 100% of the time mattered, like caching user profile pictures (although maybe it doesn't matter all that much).
-
I guess I don't understand whether "roughly every 2^64 unique inputs" is some sort of guarantee or just a probability. Would you have any suggestions on what to read to get a better understanding of how that works?