#1585 — Should parseInt handle octal and binary integer literals
- bug_id:
1585
- creation_ts:
2013-07-16 08:18:00 -0700
- short_desc:
Should parseInt handle octal and binary integer literals
- delta_ts:
2015-07-10 08:35:05 -0700
- product:
Draft for 6th Edition
- component:
technical issue
- version:
Rev 19: September 27, 2013 Draft
- rep_platform:
All
- op_sys:
All
- bug_status:
RESOLVED
- resolution:
WONTFIX
- priority:
Normal
- bug_severity:
enhancement
- everconfirmed:
true
- reporter:
Erik Arvidsson
- assigned_to:
Allen Wirfs-Brock
- cc:
["claude.pache", "erik.arvidsson", "mathias", "waldron.rick"]
- commentid:
4535
- comment_count:
0
- who:
Erik Arvidsson
- bug_when:
2013-07-16 08:18:56 -0700
Currently `parseInt` handles a `0x` prefix as hexadecimal. Should it also handle `0b` as binary and `0o` as octal?
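For context, a minimal sketch of the behaviour the question refers to (results assume a pre-ES6 engine, where `Number` does not recognise `0b`/`0o` prefixes either):

```js
// parseInt recognises a 0x/0X prefix and parses the remainder as hexadecimal.
parseInt('0x1F');   // 31

// It does not recognise 0b or 0o: it consumes the leading '0' and stops
// at the first character that is not a decimal digit.
parseInt('0b11');   // 0
parseInt('0o42');   // 0

// An explicit radix works today, but the caller must strip the prefix first.
parseInt('11', 2);  // 3
parseInt('42', 8);  // 34
```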
- commentid:
5900
- comment_count:
1
- who:
Rick Waldron
- bug_when:
2013-10-13 08:43:28 -0700
I was about to file this and found the dup, so I'm bumping the revision number.
- commentid:
7636
- comment_count:
2
- who:
Mathias Bynens
- bug_when:
2014-04-09 15:54:16 -0700
WONTFIX as per April 9 TC39 meeting.
It’s a compatibility risk — both `parseInt('0b11')` and `parseInt('0o42')` return `0` in ES5. Use `Number('0b11')` and `Number('0o42')` instead.
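To illustrate the resolution, a short sketch contrasting the two functions (the `Number` results assume an ES6-capable engine that understands the new binary and octal literal forms):

```js
// Existing code may rely on parseInt stopping at the first non-digit,
// so changing its prefix handling would be a breaking change.
parseInt('0b11');   // 0 in ES5 and ES6
parseInt('0o42');   // 0 in ES5 and ES6

// Number (i.e. the ToNumber string conversion) does accept the ES6
// BinaryIntegerLiteral and OctalIntegerLiteral forms.
Number('0b11');     // 3
Number('0o42');     // 34
+'0o42';            // 34 (unary plus uses the same conversion)
```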