#324 — How should Intl.NumberFormat format negative zero?

It's not clear from the specification how negative zero (-0.0) should be formatted. Should it use the negativePattern or the positivePattern?

I understand there is a possibility that the spec would need to display zero using a negative pattern (e.g., formatting -0.01 with a maximum of one fraction digit), but what if the input is IEEE negative zero, which fails the x < 0 test in 12.3.2 step 4(a)?
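The step 4(a) comparison can be checked directly in JavaScript: IEEE negative zero compares equal to zero, so the `x < 0` test is false even though the sign bit remains observable through other operations. A small sketch (plain language semantics, not spec text):

```javascript
// The spec's negativity test: IEEE -0 compares equal to 0,
// so it does NOT select the negative pattern.
const x = -0;
console.log(x < 0);            // false: -0 fails the x < 0 test
console.log(x === 0);          // true: -0 and +0 are equal under ===
console.log(1 / x);            // -Infinity: the sign bit is still observable
console.log(Object.is(x, -0)); // true: Object.is distinguishes the zeros
```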

IEEE negative zero is an artifact of the representation of numbers in IEEE 754; it's not relevant to users, and so should be treated in the same way as positive zero. I think the algorithms in the current spec version do that.

The case of a real negative number displayed as -0.0 is more interesting. I doubt that -0.0 makes much sense to users, but I don't have evidence, and at least ICU4J happily produces this string.
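As a point of comparison (Number.prototype.toFixed, not Intl itself), plain JavaScript shows the same behavior: the sign is decided by an x < 0 test on the unrounded value, so a genuinely negative input that rounds to zero keeps its minus sign, while IEEE negative zero does not:

```javascript
// toFixed picks the sign from the unrounded value (x < 0),
// then rounds the magnitude, so -0.01 renders as "-0.0".
console.log((-0.01).toFixed(1)); // "-0.0": sign survives rounding to zero
console.log((0.01).toFixed(1));  // "0.0"
console.log((-0).toFixed(1));    // "0.0": IEEE -0 fails x < 0, so no sign
```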