#1802 — Section 7.6: Backwards compatibility and U+2E2F in `Identifier`s

I wrote a (new) script that generates a regular expression that matches valid JavaScript identifiers as per ECMAScript 5.1 / Unicode v6.2.0.

Then I made it do the same thing according to the latest ECMAScript 6 draft, which refers to Unicode Standard Annex #31: Unicode Identifier and Pattern Syntax.

After comparing the output, I noticed that the two regular expressions are identical except for one thing: ECMAScript 5 allows U+2E2F VERTICAL TILDE in `IdentifierStart` and `IdentifierPart`, but ECMAScript 6 / UAX #31 doesn’t.
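The difference is visible with Unicode property escapes in a modern engine (a sketch, not the generated regular expressions themselves; ES5’s `UnicodeLetter` rule is approximated here by the general categories its grammar names):

```javascript
const ch = "\u2E2F"; // U+2E2F VERTICAL TILDE, general category Lm

// ES5 IdentifierStart: $, _, or any "Unicode letter"
// (general categories Lu, Ll, Lt, Lm, Lo, Nl) — Lm admits U+2E2F.
const es5Start = /^[$_\p{Lu}\p{Ll}\p{Lt}\p{Lm}\p{Lo}\p{Nl}]$/u;

// ES6 / UAX #31 IdentifierStart: $, _, or ID_Start — which excludes
// U+2E2F because that character carries the Pattern_Syntax property.
const es6Start = /^[$_\p{ID_Start}]$/u;

console.log(es5Start.test(ch)); // true
console.log(es6Start.test(ch)); // false
```

So the one character in the delta is a modifier letter (Lm) that UAX #31 deliberately carves out of `ID_Start` via `Pattern_Syntax`.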

Was this potentially breaking change intentional? I’m fine with disallowing U+2E2F, but only if we’re sure it doesn’t break any existing code.

We decided on Sep 17 that it was intentional. Implementations need to use the Unicode ID_Start property instead.