[Python-ideas] Disallow "00000" as a synonym for "0"
Steven D'Aprano
steve at pearwood.info
Fri Jul 17 17:22:45 CEST 2015
On Fri, Jul 17, 2015 at 10:28:19AM -0400, Eric V. Smith wrote:
> On 07/16/2015 06:15 AM, Neil Girdhar wrote:
> > As per this question:
> > http://stackoverflow.com/questions/31447694/why-does-python-3-allow-00-as-a-literal-for-0-but-not-allow-01-as-a-literal
> >
> > It seems like Python accepts "000000000" to mean "0". Whatever the
> > historical reason, should this be deprecated?
>
> No. It would needlessly break working code.

I wonder what working code uses 00 when 0 is wanted? Do you have any
examples? I believe that anyone writing 00 is more likely to have made
a typo than to actually intend to get 0.

In Python 2, 00 has an obvious and correct interpretation: it is zero
in octal. But in Python 3, octal is written with the prefix 0o, not 0:

py> 0o10
8
py> 010
  File "<stdin>", line 1
    010
      ^
SyntaxError: invalid token

(The 0o prefix also works in Python 2.7.)

In Python 3, 00 has no sensible meaning. It's not octal, binary or
hex, and it shouldn't be decimal. Decimal integers are explicitly
prohibited from beginning with a leading zero:

https://docs.python.org/3/reference/lexical_analysis.html#integers

so the mystery is why *zero* is a special case permitted to have
leading zeroes. The lexical definition of "decimal integer" is:

decimalinteger ::= nonzerodigit digit* | "0"+

Why was it defined that way? The more obvious:

decimalinteger ::= nonzerodigit digit* | "0"

was the definition in Python 2. As the Stack Overflow post above
points out, the definition of decimalinteger actually in use seems to
violate PEP 3127, and supporting "0"+ was added as a special case by
Georg Brandl.

Since leading 0 digits in decimal int literals are prohibited, we
cannot write 0001, 0023 etc. Why would we write 0000 to get zero?

Unless somebody can give a good explanation for why leading zeroes are
permitted for zero, I think it was a mistake to allow them, and an
ugly wart on the language. I think it should be deprecated and
eventually removed.
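The behaviour described above is easy to check for yourself. Here is a small sketch (mine, not from the post) that uses compile() so we exercise the tokenizer itself, rather than int(), which happily accepts strings like "010":

```python
def accepts_literal(src):
    """Return True if `src` compiles as a Python 3 expression."""
    try:
        compile(src, "<test>", "eval")
        return True
    except SyntaxError:
        return False

assert accepts_literal("0")        # plain zero: fine
assert accepts_literal("00000")    # leading zeroes on zero: the special case
assert not accepts_literal("010")  # leading zero on nonzero: SyntaxError
assert accepts_literal("0o10")     # explicit octal prefix: fine
assert eval("0o10") == 8
```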
Since it only affects int literals, any deprecation warning will occur
at compile-time, so it shouldn't have any impact on runtime
performance.

-- 
Steve
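The difference between the two grammar productions can also be seen by translating them into regular expressions. This is an illustrative sketch of mine, not the reference tokenizer; the names are made up:

```python
import re

# "0"+ variant, as in the Python 3 grammar quoted above
py3_rule = re.compile(r"[1-9][0-9]*|0+")
# "0" variant, the stricter Python 2 definition
py2_rule = re.compile(r"[1-9][0-9]*|0")

for src in ("0", "00000", "10", "010"):
    print(src, bool(py3_rule.fullmatch(src)), bool(py2_rule.fullmatch(src)))
```

Only "00000" is treated differently by the two rules: both reject "010", both accept "0" and "10", but the "0"+ variant additionally accepts any run of zeroes.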