Bytes | Developer Community
Which one to use: generate_tokens or tokenize?

According to the Python documentation:

18.5 tokenize -- Tokenizer for Python source
....
The primary entry point is a generator:
generate_tokens(readline)
....
An older entry point is retained for backward compatibility:
tokenize(readline[, tokeneater])
====
Does this mean that one should preferably use generate_tokens? If so,
what are the advantages?

André Roberge
Jul 18 '05 #1
[André Roberge]
According to the Python documentation:

18.5 tokenize -- Tokenizer for Python source
...
The primary entry point is a generator:
generate_tokens(readline)
...
An older entry point is retained for backward compatibility:
tokenize(readline[, tokeneater])
====
Does this mean that one should preferably use generate_tokens?
Yes.
If so, what are the advantages?


Be adventurous: try them both. You'll figure it out quickly. If you
have to endure "an explanation" first, read PEP 255, where
tokenize.tokenize was used as an example motivating the desirability
of introducing generators.
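For the adventurous, here is what the generator entry point looks like in practice (a minimal sketch; the sample source string is made up, and in current Pythons the yielded 5-tuples are named tuples with `.type` and `.string` attributes):

```python
import io
import token
import tokenize

source = "total = price * quantity\n"

# generate_tokens takes a readline callable; io.StringIO supplies one.
# Being a generator, it yields one (type, string, start, end, line)
# tuple at a time -- no tokeneater callback, and you can stop early.
for tok in tokenize.generate_tokens(io.StringIO(source).readline):
    print(token.tok_name[tok.type], repr(tok.string))
```

Because the tokens arrive lazily, you can feed the stream straight into any loop or pipeline and break out as soon as you've seen what you need, which is exactly the convenience PEP 255 uses `tokenize` to motivate.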
Jul 18 '05 #2
