[Python-Dev] under what circumstances can python still exhibit "high water mark" memory usage?
Chris Withers
chris at simplistix.co.uk
Wed Oct 14 09:11:46 EDT 2015
More information about the Python-Dev mailing list
- Previous message (by thread): [Python-Dev] Rationale behind lazy map/filter
- Next message (by thread): [Python-Dev] under what circumstances can python still exhibit "high water mark" memory usage?
- Messages sorted by: [ date ] [ thread ] [ subject ] [ author ]
Hi All,

I'm having trouble with some Python processes that are using 3GB+ of memory, but when I inspect them with either heapy or meliae, injected via pyrasite, those tools only report total memory usage to be 119MB.

This feels like the old "Python high water mark" problem, but I thought that was fixed in 2.6/3.0? Under what circumstances can a Python process still exhibit high memory usage that tools like heapy don't know about?

cheers,

Chris

PS: Full details of the libraries and versions being used are here: https://groups.google.com/forum/#!topic/celery-users/SsTRZ7-mDMI

This post feels related and seems to suggest the high water mark problem is still there: http://chase-seibert.github.io/blog/2013/08/03/diagnosing-memory-leaks-python.html
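[Not part of the original message: a minimal sketch of the symptom being described, using only the standard library. It allocates many small objects, frees them, and compares the OS-level peak resident set size (`ru_maxrss`, which by definition never decreases) against what object-by-object accounting can see afterwards. Heap inspectors like heapy and meliae work roughly like the `gc.get_objects()` sum below, which is why they can report far less than the process's resident footprint.]

```python
import gc
import resource
import sys

def peak_rss_kb():
    # ru_maxrss is the process's *peak* resident set size: kilobytes on
    # Linux, bytes on macOS. It is a true high-water mark and is sticky:
    # it never goes back down, even after memory is released.
    return resource.getrusage(resource.RUSAGE_SELF).ru_maxrss

before = peak_rss_kb()

# Allocate ~100 MB as many small objects, the pattern most likely to
# leave fragmented allocator arenas behind.
data = [b"x" * 512 for _ in range(200_000)]
peak = peak_rss_kb()

del data
gc.collect()
after = peak_rss_kb()  # still at the peak: the high-water mark is sticky

# What an object-level inspector can account for after the free:
live_kb = sum(sys.getsizeof(o) for o in gc.get_objects()) // 1024

print(f"peak RSS grew by ~{peak - before} kB during allocation")
print(f"live Python objects now account for only ~{live_kb} kB")
```

Running this shows the gap Chris describes: the OS-level figure stays at its maximum while the tracked-object total drops back to a small number. Whether the *current* (non-peak) RSS also stays high after the `del` depends on allocator fragmentation and whether whole pymalloc arenas could be returned to the OS.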