As somebody who administers multiple webservers, each hosting a set of Django sites, keeping on top of my Python stack is very important. Out of (probably bad) habit, I rely on Ubuntu for a number of my Python packages, including python-django and a lot of python-django-* extras. The websites need these packages to run, but as long as the packages keep existing in the repos, that isn't a problem. I do this rather than using VirtualEnv (et al) because I want Ubuntu to install security updates for me.
However, the Ubuntu repos don't cater for everybody. There are cases where I'll use pip or easy_install to suck in the latest version of a Python package. The catch is that when Ubuntu updates Python (as occasionally happens), the new interpreter can't see any of your pip-installed packages, so you effectively lose them all.
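Roughly what that looks like in practice (the paths below are the Ubuntu defaults on my boxes and South is just an example package, so treat this as an illustration rather than gospel):

```
# pip installs system-wide packages under the *current* interpreter's version:
$ sudo pip install South
# ...ends up in /usr/local/lib/python2.6/dist-packages/

# After a release upgrade bumps the default Python to 2.7, the new
# interpreter only searches /usr/local/lib/python2.7/dist-packages/,
# so everything pip put under 2.6 quietly drops off sys.path:
$ python -c "import south"
ImportError: No module named south
```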
What terrifies me is that the deeper I get and the more servers I administer, the more likely it is that one day an OS update will cost me hours and hours of my time: running around testing sites and reinstalling Python packages through pip. The worst part is the potential downtime for client sites, though I do test on my development machine (always at Ubuntu-latest), so that should offset some of the worry.
Is there anything I can do to make sure that when Python is updated, the existing non-dpkg'd Python packages are brought forward?
That would make sure I always had access to the same packages. I'd still have to test for incompatibilities, but it would be a good start.
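If nothing better exists, the fallback I can see is a snapshot-and-reinstall routine around each update. A rough sketch, relying only on pip's freeze output and requirements-file support (the snapshot path is just an example):

```
# Before the update: snapshot every pip-installed package and its version.
$ pip freeze > /root/pip-snapshot.txt

# After Python has been upgraded: reinstall the same versions
# against the new interpreter.
$ sudo pip install -r /root/pip-snapshot.txt
```

That would at least bring the same versions forward, but it's still a manual step I'd have to remember to run on every box.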
There's perhaps one better solution: an application that behaved like apt and dpkg but for interacting with PyPI (where pip and easy_install get most of their mojo). Something that stored a local list of installed packages, checked for updates like apt, managed installing, and so on. Does such a thing exist? Or is it a rubbish idea?
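To make the idea concrete, the workflow I'm picturing would feel something like this. The tool name and subcommands here are entirely made up; it's only to show the apt-style behaviour I mean:

```
# Hypothetical pip/apt hybrid, sketched purely to illustrate the idea:
$ pypkg update              # refresh package metadata from PyPI, like apt-get update
$ pypkg list --upgradable   # compare the locally recorded package list against PyPI
$ pypkg upgrade             # install newer versions and record what changed
```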