Today I discovered that <input maxlength> doesn't count multi-byte characters (e.g. emojis) properly! E.g. if maxlength="1", you cannot insert an emoji at all! Testcase: http://dabblet.com/gist/78b6ab63152068da047398e57786a438 Chrome bug (wontfix): https://bugs.chromium.org/p/chromium/issues/detail?id=747649 Spec bug (open): https://github.com/whatwg/html/issues/1467
Replying to @LeaVerou
This matches string.length in JS tho. Which is stupid and bad, but it's what we're stuck with. I'd wager nearly all custom impls of this have the same behavior too, nobody uses [...str].length
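The code-unit counting the reply describes is easy to verify in a console. This sketch shows why maxlength="1" rejects an emoji: `String.prototype.length` counts UTF-16 code units, so an astral-plane character like U+1F600 counts as 2, while spreading the string iterates by code point (the family emoji is an assumed example to show that even code points undercount perceived characters):

```javascript
const face = "😀"; // U+1F600, outside the BMP, so it's a surrogate pair in UTF-16

// .length counts UTF-16 code units — the same unit <input maxlength> uses,
// which is why maxlength="1" cannot fit this emoji at all.
console.log(face.length);        // 2

// Spreading iterates by code point, the count people usually expect.
console.log([...face].length);   // 1

// Even code points don't match perceived characters for ZWJ sequences:
const family = "👨‍👩‍👧"; // three emoji joined by U+200D ZERO WIDTH JOINER
console.log([...family].length); // 5 (3 emoji + 2 ZWJs), though it renders as one glyph
```

Counting grapheme clusters (what a user perceives as one character) needs `Intl.Segmenter` or a Unicode-aware library, which is part of why the spec issue is still open.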
Replying to @tabatkins @LeaVerou
A solution would be to replace maxlength with pattern="<highly convoluted regexp>". When https://tc39.github.io/proposal-regexp-unicode-property-escapes/ gets accepted, that might actually become somewhat feasible, i.e. `pattern=".\p{Mn}*"` to allow diacritics.
Yep. I made sure pattern="" enables the `u` RegExp flag behind the scenes to make this possible. https://www.w3.org/Bugs/Public/show_bug.cgi?id=26915