A few years ago, Khalid wrote a wonderful script that would help make your Apache server sane again after the server's opcode cache started throwing segmentation faults (he aptly named it logwatcher). It was great because, at the time, APC would crash for unknown reasons and completely kill a website. It took care of an important issue (the one change I made was to clear the APC cache instead of restarting the server, but all in all, super ^_^).
Nearly every place I work at, a common need comes up that goes unimplemented: backing up the database(s). Working with Drupal, I commonly see Backup and Migrate used for backing up databases, which isn't a bad first approach. But a number of questions come to mind: what if you have a database server that doesn't just serve Drupal websites? (I have a db server that hosts sites running MediaWiki, WordPress, and, in the past, Ruby on Rails applications.)
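For a server like that, a small cron-driven script that dumps every database (not just the Drupal ones) covers all the applications at once. Here is a minimal sketch; the backup path, retention window, and credentials setup (e.g. a `~/.my.cnf`) are assumptions, not anything prescribed above:

```shell
#!/bin/sh
# Hedged sketch: nightly dump of every database on a shared MySQL server.
# Assumes credentials come from ~/.my.cnf; BACKUP_DIR is a placeholder path.
BACKUP_DIR="${BACKUP_DIR:-/var/backups/mysql}"
STAMP=$(date +%Y-%m-%d)
mkdir -p "$BACKUP_DIR/$STAMP"

# List databases, skipping MySQL's own bookkeeping schemas.
for DB in $(mysql -N -e 'SHOW DATABASES' \
            | grep -Ev '^(information_schema|performance_schema|mysql|sys)$'); do
    # --single-transaction gives a consistent dump of InnoDB tables
    # without locking them for the duration of the dump.
    mysqldump --single-transaction "$DB" | gzip > "$BACKUP_DIR/$STAMP/$DB.sql.gz"
done

# Keep two weeks of dated dump directories (retention is an assumption).
find "$BACKUP_DIR" -mindepth 1 -maxdepth 1 -type d -mtime +14 -exec rm -rf {} +
```

Run from cron, this backs up the MediaWiki, WordPress, and Rails databases alongside Drupal's, which a module-level tool like Backup and Migrate cannot do.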
I was asked to archive a site where I could not host the codebase on our servers AND I did not have access to the database or media assets in any way. Rather than cobbling something together with wget, we found a way to archive the website using HTTrack.
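The basic HTTrack invocation looks something like the following; the URL and output path are placeholders, not the actual site archived here:

```shell
# Hedged sketch: mirror a live site into a self-contained static copy.
# -O sets the output directory; the "+" pattern is a filter that keeps
# the crawl on the site's own domain; --mirror is the standard crawl mode.
httrack "https://example.com/" -O /var/archives/example.com "+*.example.com/*" --mirror
```

The result is a browsable tree of static HTML and assets that can be served from anywhere, with no need for the original database or codebase.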
At CalArts, I have long wanted to move the search functionality on our Drupal-powered websites to something better. We have been using the Lucene API module (a PHP port of Lucene search) on most of them since September of last year, but (even though I am a big fan of the module) we really wanted a way to offload search onto another VPS or server; basically, something more flexible.