Don't allow "UTF-16 surrogate codepoints" in char or str #8319
Comments
At least it's consistent now:
Because UTF-8 encodings of these surrogates are not considered valid UTF-8 (https://en.wikipedia.org/wiki/UTF-8#Invalid_code_points), and because OS APIs often do weird/unsafe things if you give them mismatched surrogates, or surrogates where they aren't expecting surrogates.
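To make that first point concrete, here is a minimal sketch using the current standard library (not the APIs that existed when this thread was written): the byte sequence that would encode a surrogate "as if" it were an ordinary code point fails Rust's UTF-8 validation.

```rust
fn main() {
    // U+D800 is the first UTF-16 surrogate. Encoding it like a regular
    // code point would give the bytes 0xED 0xA0 0x80, but UTF-8 excludes
    // U+D800..=U+DFFF, so validation rejects the sequence.
    let surrogate_bytes = [0xED, 0xA0, 0x80];
    assert!(std::str::from_utf8(&surrogate_bytes).is_err());

    // The code point just past the surrogate range (U+E000) is accepted.
    let ok_bytes = [0xEE, 0x80, 0x80];
    assert_eq!(std::str::from_utf8(&ok_bytes).unwrap(), "\u{E000}");
}
```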
The cases in this bug report are both fixed.
Thanks, @thestinger!
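For later readers, a rough sketch of how both reported cases behave in present-day Rust; the original fix predates these exact rules, so treat the comments as illustrative rather than a description of the patch that closed this issue.

```rust
fn main() {
    // Case 1: a surrogate escape in a char or string literal is rejected
    // at compile time (uncommenting either line is a compile error).
    // let c = '\u{D800}';
    // let s = "\u{D800}";

    // Case 2: the lossy integer-to-char cast no longer exists; only
    // `u8 as char` is accepted, so a cast can never produce a surrogate.
    // let c = 0xD800u32 as char; // compile error: only `u8` casts to `char`
    let c = 0x41u8 as char;
    assert_eq!(c, 'A');
}
```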
I think a surrogate escape in a char or str literal should be disallowed at compile time, and a cast that produces a surrogate should assert. Or even better, the incomplete int as char could be removed.
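As an illustration of the checked conversions that later replaced the lossy cast (API names are from the current standard library, not what existed when this issue was filed), the surrogate range simply cannot be turned into a char:

```rust
use std::convert::TryFrom;

fn main() {
    // Instead of an unchecked `int as char` cast, the checked conversions
    // make U+D800..=U+DFFF unrepresentable as a `char`.
    assert_eq!(char::from_u32(0x0041), Some('A'));
    assert_eq!(char::from_u32(0xD800), None);
    assert!(char::try_from(0xDFFFu32).is_err());
    assert_eq!(char::try_from(0xE000u32).unwrap(), '\u{E000}');
}
```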