And if you are hashing your users' passwords with high-cost bcrypt or scrypt, a smaller blacklist provides good cross-coverage.
Replying to @TychoTithonus, @thorsheim, and others
Do you think people could estimate a blacklist size based on a specific hash speed, or are there only broader guidelines than that?
Replying to @PwdRsch, @thorsheim, and others
That is a super interesting idea, and I bet quite possible! A calculator would need to let the user select their threat model variables.
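The calculator idea above can be sketched as a back-of-the-envelope rule: if a high-cost hash caps an attacker at R guesses per second, then over a window of T seconds the attacker can try at most R × T candidates, so a blacklist of the attacker's top R × T guesses covers that entire budget. This is a minimal sketch under that assumption; the function name and the example numbers are illustrative, not figures from the thread.

```python
# Hypothetical blacklist-size calculator: assumes an attacker who guesses
# in popularity order, capped at a fixed rate by the hash cost.

def min_blacklist_size(guesses_per_second: float, attack_seconds: float) -> int:
    """Smallest blacklist (in entries) that covers the attacker's guess budget."""
    return int(guesses_per_second * attack_seconds)

# Example threat model (illustrative): bcrypt tuned so one cracking rig
# manages ~100 guesses/second, defending against a full day of guessing.
print(min_blacklist_size(100, 24 * 3600))  # 8640000
```

A real calculator would expose the threat-model variables the tweet mentions (rig size, hash cost parameter, attack window) instead of hard-coding them.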
Replying to @TychoTithonus, @PwdRsch, and others
I feel the need to mention this paper by @CormacHerley about password blacklists and bloom filters: https://www.microsoft.com/en-us/research/wp-content/uploads/2016/02/popularityISeverything.pdf
Replying to @lakiw, @TychoTithonus, and others
And I'd like to add its follow-up work: https://www.usenix.org/system/files/conference/soups2017/soups2017-segreti.pdf
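The papers above use probabilistic data structures to track popular passwords compactly. As a flavor of the approach, here is a minimal Bloom-filter membership sketch for a password blacklist. It is illustrative only: the bit-array size and hash count are arbitrary choices, not the papers' recommended parameters.

```python
import hashlib

class BloomFilter:
    """Minimal Bloom filter for a password blacklist (illustrative sketch)."""

    def __init__(self, num_bits: int = 1 << 20, num_hashes: int = 7):
        self.num_bits = num_bits
        self.num_hashes = num_hashes
        self.bits = bytearray(num_bits // 8 + 1)

    def _positions(self, item: str):
        # Derive k independent bit positions by salting SHA-256 with an index.
        for i in range(self.num_hashes):
            digest = hashlib.sha256(f"{i}:{item}".encode()).digest()
            yield int.from_bytes(digest[:8], "big") % self.num_bits

    def add(self, item: str) -> None:
        for pos in self._positions(item):
            self.bits[pos // 8] |= 1 << (pos % 8)

    def __contains__(self, item: str) -> bool:
        return all(self.bits[pos // 8] >> (pos % 8) & 1
                   for pos in self._positions(item))

bf = BloomFilter()
bf.add("monkey1")
print("monkey1" in bf)  # True: Bloom filters never give false negatives
```

The trade-off that makes this attractive for huge blacklists: membership tests can produce rare false positives (a rejected password that wasn't actually listed), but never false negatives, and the filter is far smaller than the password list itself.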
At least I'm reinventing better and better wheels. I must be on to something! ;)
Replying to @TychoTithonus, @m33x, and others
I simply love these Twitter discussions that end up with @PwdRsch, @lakiw, @m33x & @CormacHerley being right, with a link to prove it. :D
Replying to @thorsheim, @TychoTithonus, and others
Sorry to wade in late. Agree with @thorsheim: ban the top 1-10k. Users will get why monkey1 isn't OK. Banning 300M will probably annoy too much for the benefit.
Replying to @CormacHerley, @thorsheim, and others
I know math is hard, but am I seriously the only one capable of mentally guesstimating the coverage of a 300M blacklist?
Replying to @marcan42, @CormacHerley, and others
"monkey1" isn't in the top1k and is barely in the top10k. To pass the 300M blacklist all you need is "monkey%33". This stuff is exponential.
FFS, the 300M blocklist only covers 37% of /usr/share/dict/words, with no numbers or other transformations!
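A coverage check like the one behind that 37% figure is easy to reproduce. Below is a minimal sketch: it counts what fraction of a word list appears verbatim (case-folded, no mangling rules) in a blocklist. The file paths in the commented-out example are conventional locations assumed for illustration, not taken from the thread.

```python
def coverage(words, blocklist):
    """Fraction of `words` found verbatim in `blocklist` (lowercased,
    no numbers or other transformations applied)."""
    blocked = {w.strip().lower() for w in blocklist}
    cleaned = [w.strip().lower() for w in words]
    hits = sum(1 for w in cleaned if w in blocked)
    return hits / len(cleaned) if cleaned else 0.0

# With real files (assumed paths, one entry per line):
# with open("/usr/share/dict/words") as d, open("blocklist.txt") as b:
#     print(coverage(d, b))

print(coverage(["monkey", "Dragon", "xylophone"], ["monkey", "dragon"]))  # 0.6666666666666666
```

Note that checking verbatim membership understates an attacker's reach: mangling rules ("monkey" → "monkey1", "M0nkey") multiply effective coverage, which is exactly why a raw 300M list measured this way looks weaker than its size suggests.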
Replying to @marcan42, @CormacHerley, and others
I think we're agreed: 320M blacklist is suboptimal. Instead, check common dicts & passwords, masks, length, & let user select random phrase
Replying to @TychoTithonus, @marcan42, and others
Causing trouble here, but [[citation needed]] ;p I'm struggling because I agree with you, but I can't back up the claim that a 320M blacklist is sub-optimal...