In strings and regular expressions, are `\uD834\uDF06` and `\u{D834}\u{DF06}` equivalent? If not, how does the latter behave — does it throw an error? I’d expect both to represent U+1D306.
https://people.mozilla.org/~jorendorff/es6-draft.html#sec-patterns says:
RegExpUnicodeEscapeSequence[U] ::
    [+U] u LeadSurrogate \u TrailSurrogate
    u Hex4Digits
    [+U] u{ HexDigits }
Should `[+U] u{ LeadSurrogate } \u{ TrailSurrogate }` be added there?
They're equivalent in strings, but not in /u regular expressions. In that context,
\uD834\uDF06 represents the single code point U+1D306, but \u{D834}\u{DF06} represents two code points: U+D834 and U+DF06.
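The string-literal equivalence is easy to check directly; a quick sketch (using Node-style console output):

```javascript
// Both escape forms yield the same two UTF-16 code units in a string literal:
console.log('\uD834\uDF06' === '\u{D834}\u{DF06}'); // true
// And both equal the code point escape for U+1D306:
console.log('\uD834\uDF06' === '\u{1D306}'); // true
// The string still consists of two code units (a surrogate pair):
console.log('\uD834\uDF06'.length); // 2
```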
(In reply to Allen Wirfs-Brock from comment #1)
> They're equivalent in strings, but not in /u regular expressions. In that
> context, \uD834\uDF06 represents the single code point U+1D306, but
> \u{D834}\u{DF06} represents two code points: U+D834 and U+DF06.
When does this matter? Does it constitute an observable difference? Since both are equivalent in strings, I assumed not, but testing shows otherwise:
/\uD834\uDF06/.test('\uD834\uDF06'); // true
/\u{D834}\u{DF06}/u.test('\uD834\uDF06'); // false: with /u, the lone surrogates \u{D834} and \u{DF06} don't match the pair, which is treated as the single code point U+1D306
Ah, I guess it matters in situations like:
/\uD834\uDF06{2}/u.test('\uD834\uDF06\uD834\uDF06'); // true
/\uD834\uDF06{2}/u.test('\uD834\uDF06\uDF06'); // false
/\u{D834}\u{DF06}{2}/u.test('\uD834\uDF06\uD834\uDF06'); // false
/\u{D834}\u{DF06}{2}/u.test('\uD834\uDF06\uDF06'); // also false: \u{D834} doesn't match the lead surrogate of the pair
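With the /u flag, a lone-surrogate escape such as \u{D834} matches only an actual unpaired surrogate in the input; it never matches one half of a surrogate pair, because the pair is read as a single code point. A quick sketch of that behavior:

```javascript
// With /u, the input is processed as code points, so the pair \uD834\uDF06
// is seen as the single code point U+1D306:
console.log(/\u{D834}/u.test('\uD834\uDF06')); // false: no lone U+D834 in the input
console.log(/\u{D834}/u.test('\uD834'));       // true: an actual unpaired lead surrogate
// Without /u, matching operates on code units, so the lead surrogate is visible:
console.log(/\uD834/.test('\uD834\uDF06'));    // true
```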