From: Meador Inge
Date: Thu, 19 Jan 2012 06:17:44 +0000 (-0600)
Subject: Issue #2134: Clarify token.OP handling rationale in tokenize documentation.
X-Git-Tag: v2.7.3rc1~149
X-Git-Url: http://git.ipfire.org/gitweb.cgi?a=commitdiff_plain;h=da747c3d977d2c90e9877be7264adf7516bbf599;p=thirdparty%2FPython%2Fcpython.git

Issue #2134: Clarify token.OP handling rationale in tokenize documentation.
---

diff --git a/Doc/library/tokenize.rst b/Doc/library/tokenize.rst
index 30677eaadce7..707503528157 100644
--- a/Doc/library/tokenize.rst
+++ b/Doc/library/tokenize.rst
@@ -15,6 +15,12 @@ implemented in Python. The scanner in this module returns comments as tokens as
 well, making it useful for implementing "pretty-printers," including colorizers
 for on-screen displays.
 
+To simplify token stream handling, all :ref:`operators` and :ref:`delimiters`
+tokens are returned using the generic :data:`token.OP` token type. The exact
+type can be determined by checking the token ``string`` field on the
+:term:`named tuple` returned from :func:`tokenize.tokenize` for the character
+sequence that identifies a specific operator token.
+
 The primary entry point is a :term:`generator`:
 
 .. function:: generate_tokens(readline)
diff --git a/Misc/NEWS b/Misc/NEWS
index 0233823bfb45..2193af0c468d 100644
--- a/Misc/NEWS
+++ b/Misc/NEWS
@@ -495,6 +495,9 @@ Tests
 Documentation
 -------------
 
+- Issue #2134: The tokenize documentation has been clarified to explain why
+  all operator and delimiter tokens are treated as token.OP tokens.
+
 - Issue #13513: Fix io.IOBase documentation to correctly link to the
   io.IOBase.readline method instead of the readline module.
 
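
For illustration only (not part of the patch above): a minimal sketch of the behaviour the new documentation paragraph describes, written for the Python 2.7 branch this commit targets and using the generate_tokens() entry point that the surrounding docs mention. The source string and variable names are invented for the example; every operator and delimiter in it is reported as token.OP, so the token's string value is what tells them apart.

    # Hypothetical example, not taken from the patch: '=', '(', '+', ')' and '*'
    # all arrive with type token.OP; only their string values differ.
    import token
    import tokenize
    from StringIO import StringIO   # Python 2.7, matching the v2.7.3rc1 target

    source = "result = (1 + 2) * 3"

    for tok_type, tok_string, start, end, line in tokenize.generate_tokens(
            StringIO(source).readline):
        if tok_type == token.OP:
            print tok_string        # prints =, (, +, ), * on separate lines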