From: Miss Islington (bot) <31488909+miss-islington@users.noreply.github.com>
Date: Sat, 25 Jan 2020 19:36:04 +0000 (-0800)
Subject: bpo-36654: Add examples for using tokenize module programmatically (GH-12947)
X-Git-Tag: v3.7.7rc1~67
X-Git-Url: http://git.ipfire.org/gitweb.cgi?a=commitdiff_plain;h=6dbd843dedc9e05c0e3f4714294837f0a83deebe;p=thirdparty%2FPython%2Fcpython.git

bpo-36654: Add examples for using tokenize module programmatically (GH-12947)

(cherry picked from commit 4b09dc79f4d08d85f2cc945563e9c8ef1e531d7b)

Co-authored-by: Windson yang
---

diff --git a/Doc/library/tokenize.rst b/Doc/library/tokenize.rst
index 4c0a0ceef7dc..4dd56f9e7c8b 100644
--- a/Doc/library/tokenize.rst
+++ b/Doc/library/tokenize.rst
@@ -267,3 +267,22 @@ The exact token type names can be displayed using the :option:`-e` option:
     4,10-4,11:          RPAR           ')'
     4,11-4,12:          NEWLINE        '\n'
     5,0-5,0:            ENDMARKER      ''
+
+Example of tokenizing a file programmatically, reading unicode
+strings instead of bytes with :func:`generate_tokens`::
+
+    import tokenize
+
+    with tokenize.open('hello.py') as f:
+        tokens = tokenize.generate_tokens(f.readline)
+        for token in tokens:
+            print(token)
+
+Or reading bytes directly with :func:`.tokenize`::
+
+    import tokenize
+
+    with open('hello.py', 'rb') as f:
+        tokens = tokenize.tokenize(f.readline)
+        for token in tokens:
+            print(token)
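
Note (not part of the patch): both documented examples yield ``tokenize.TokenInfo``
named tuples with ``type``, ``string``, ``start``, ``end`` and ``line`` fields.
A minimal sketch of inspecting those fields, using an illustrative ``source``
string and ``io.StringIO`` in place of the ``hello.py`` file assumed above::

    import io
    import token
    import tokenize

    # Illustrative source text; generate_tokens() only needs a readline
    # callable returning str, so StringIO can stand in for an open file.
    source = "x = max(1, 2)\n"

    for tok in tokenize.generate_tokens(io.StringIO(source).readline):
        # Each TokenInfo carries the token type, its exact text, and its
        # (row, column) start/end positions; tok_name maps type to a name.
        print(token.tok_name[tok.type], repr(tok.string), tok.start, tok.end)

Running this prints one line per token, e.g. ``NAME 'x' (1, 0) (1, 1)``,
which is the same information shown by the command-line examples earlier
in the document.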