Simple lru cache?
Ype Kingma
ykingma at accessforall.nl
Sat Oct 5 04:48:16 EDT 2002
More information about the Python-list mailing list
Steve Holden schreef:

> "Martin v. Loewis" <martin at v.loewis.de> wrote in message
> news:m3elb5etuk.fsf at mira.informatik.hu-berlin.de...
>> "Steve Holden" <sholden at holdenweb.com> writes:
>>
>>> What, you mean like
>>>
>>> mylst = mylst[:n] + mylst[n+1:] + [mylst[n]]
>>>
>>> for example?
>>
>> I think Ype will criticize this solution for its O(m) complexity, where
>> m is the number of elements in the cache.
>
> Well, the order of the behavior wasn't mentioned in the question, but I
> would quite understand if he felt it wasn't a good idea to reconstruct the
> whole list that way.

I know that the order of magnitude is still larger than when using the
doubly linked list, but the solution Steve gave at least allows these long
operations to be done in loops implemented within the interpreter, without
needing a for statement with all the name lookups getting in the way. It
also gets rid of all the doubly linked list elements.

The idea is to save a database lookup, which currently takes around 10 msec
on average, which is good for quite a bit of work on current CPUs.

> The fact is that an LRU cache is far from simple!

It's nice to be reassured that investing some human time in a classical
space/time tradeoff is worthwhile :)

Thanks guys,

Ype
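For readers following along, here is a minimal sketch of the list-based LRU cache the thread is discussing: a dict holds the cached values, while a plain list tracks recency order, and a cache hit moves the key to the end of the list with the slicing idiom Steve suggested. This is O(m) per hit in the cache size m, but the slicing runs in the interpreter's built-in loops rather than a Python-level for loop, which is the point Ype makes above. The names (`LRUCache`, the `lookup` callable standing in for the expensive database query) are illustrative, not from the original post.

```python
class LRUCache:
    """A small list-based LRU cache sketch (hypothetical, for illustration)."""

    def __init__(self, capacity, lookup):
        self.capacity = capacity
        self.lookup = lookup      # the expensive function being cached
        self.values = {}          # key -> cached value
        self.order = []           # keys, least recently used first

    def get(self, key):
        if key in self.values:
            # Cache hit: move the key to the most-recently-used end
            # using the slicing idiom from the thread (O(m), but done
            # by the interpreter's C-level list operations).
            n = self.order.index(key)
            self.order = self.order[:n] + self.order[n+1:] + [self.order[n]]
            return self.values[key]
        # Cache miss: perform the expensive lookup and store the result.
        value = self.lookup(key)
        if len(self.order) >= self.capacity:
            oldest = self.order.pop(0)   # evict the least recently used key
            del self.values[oldest]
        self.values[key] = value
        self.order.append(key)
        return value
```

A doubly linked list (or, in modern Python, `collections.OrderedDict` or `functools.lru_cache`) would make the move-to-end step O(1), at the cost of the extra node bookkeeping the thread mentions.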