Only store new session data if the data is non-default
Forcing every person (and web crawler) who visits the site to have a unique cookie means that every page response is unique, and thus un-cacheable. It also means that Googlebot crawling my site leaves millions of sessions clogging the database, all of which contain nothing but default data.
I've made a small change to web/session.py:Session._save so that it only stores the session and sets a cookie if the data differs from the initializer. With that done, I can stick Varnish in front of the site, and it correctly serves cached pages to anonymous browsers and dynamic pages to logged-in users, drastically reducing the load.
Could this (or similar code to the same effect) be merged? I think it would be generally useful, and it would save me having to maintain a custom build of web.py for each of the sites I run :P
def _save(self):
    current_values = dict(self._data)
    # Ignore the keys that are auto-generated for every session.
    del current_values['session_id']
    del current_values['ip']
    if not self.get('_killed') and current_values != self._initializer:
        self._setcookie(self.session_id)
        self.store[self.session_id] = dict(self._data)
    elif web.cookies().get(self._config.cookie_name):
        # Session is back to default data: expire the now-useless cookie.
        self._setcookie(self.session_id, expires=-1)
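The core of the change is just a dict equality check after stripping the keys web.py generates on every request. A standalone sketch of that decision, with made-up session data (the `session_is_default` helper and the example dicts are illustrative, not part of the patch):

```python
def session_is_default(data, initializer):
    """Return True when the session holds nothing beyond the defaults,
    ignoring the keys web.py auto-generates for every session."""
    current = dict(data)
    current.pop('session_id', None)
    current.pop('ip', None)
    return current == initializer

initializer = {'logged_in': False}

# A fresh session created for a crawler: only auto-generated keys differ,
# so no cookie needs to be set and nothing needs to be stored.
anonymous = {'session_id': 'abc123', 'ip': '10.0.0.1', 'logged_in': False}

# A user logged in, so the session carries real state and must persist.
logged_in = {'session_id': 'def456', 'ip': '10.0.0.2', 'logged_in': True}

print(session_is_default(anonymous, initializer))  # True  -> skip store/cookie
print(session_is_default(logged_in, initializer))  # False -> persist as usual
```

Because the anonymous response carries no Set-Cookie header, a cache like Varnish can treat it as shared and serve it to every cookie-less visitor.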