My first attempt at python
Will Ware
wware at world.std.com
Fri Dec 8 09:01:44 EST 2000
More information about the Python-list mailing list
Jason Stewart (kahuna01 at hotmail.com) wrote:
> This is the first piece of code that I wrote in Python...
> This same functionality might be in a module somewhere, but I
> wrote it for practice. It's a string tokenizer.

In the "string" module, there's a function called "string.split" that does this same thing. The Python libraries are vast, deep repositories of boundless ingenuity:

http://www.python.org/doc/current/lib/lib.html

> def tokenize(string_to_chop):
>     "Break a string up into individual words"
>     last_space = 0
>     token = []
>     for index in range(len(string_to_chop)):
>         if string_to_chop[index:index+1] == " ":
>             token.append(string_to_chop[last_space:index])
>             last_space = index + 1
>         elif (index+1 == len(string_to_chop)):
>             token.append(string_to_chop[last_space:index+1])
>     return token

Instead of writing "string_to_chop[index:index+1]", you can just write "string_to_chop[index]". Two lines later you write "last_space = index + 1", which suggests that you expect the tokens on the line to always be separated by exactly one space; otherwise you'll end up with confusing results. Try tokenize("abc  def ghi") (with two spaces between "abc" and "def") and see what it does.

--
# - - - - - - - - - - - - - - - - - - - - - - - -
# Resistance is futile. Capacitance is efficacious.
# Will Ware    email: wware @ world.std.com
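[Editor's note: a runnable sketch of the point above. The tokenize function is reproduced from the quoted post; the comparison with the string split method (which, with no separator argument, splits on runs of whitespace) illustrates why consecutive spaces produce empty tokens.]

```python
def tokenize(string_to_chop):
    "Break a string up into individual words"
    last_space = 0
    token = []
    for index in range(len(string_to_chop)):
        if string_to_chop[index] == " ":
            # Appends everything since the last space -- which is the
            # empty string when two spaces are adjacent.
            token.append(string_to_chop[last_space:index])
            last_space = index + 1
        elif index + 1 == len(string_to_chop):
            token.append(string_to_chop[last_space:index + 1])
    return token

# Two spaces between "abc" and "def" leave an empty token behind:
print(tokenize("abc  def ghi"))   # ['abc', '', 'def', 'ghi']

# The library split collapses runs of whitespace instead:
print("abc  def ghi".split())     # ['abc', 'def', 'ghi']
```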