IP address changes

All my IP addresses have rather hurriedly changed. If you're using the names below, you'll be fine; once the DNS changes propagate (~3h), everything will work again.

If you're not, you'll need to update things, ideally to use the names :) Note that fairygodmother.avi.co now resolves to the same IP address as bigbadwolf.avi.co, not just to the same host.

Old            New            Name
80.87.131.17   80.87.129.54   swamp.avi.co
80.87.131.16   80.87.128.7    bigbadwolf.avi.co
80.87.131.48   80.87.130.153  donkey.avi.co
80.87.135.231  80.87.130.154  merlin.avi.co
80.87.131.18   80.87.128.7    fairygodmother.avi.co
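
If you want to check whether you're seeing the new addresses yet, dig (in Debian's dnsutils package) will tell you; once the change has propagated you should get the new IP back:

$ dig +short swamp.avi.co
80.87.129.54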

Postfixadmin Installer for Wheezy

Debian Wheezy ships with Dovecot 2.x, which has a different config layout to the 1.x version in Lenny and Squeeze. In response, I've created a wheezy branch of postfixadmin-installer (there's an issue for it, too) which configures Dovecot 2.x, and it's actually been a really easy switch.

In much the same way as the current version generally does away with the heavily commented documentation masquerading as a config file, this one simply moves /etc/dovecot out of the way and writes two files into a fresh one - dovecot.conf and dovecot-sql.conf (the same filenames as for 1.x). This makes for a pretty hilarious reduction in size, too:

root@pfa:~# find /etc/dovecot/ -type f -exec cat {} \; | wc -l
48
root@pfa:~# find /etc/dovecot_2013-01-29/ -type f -exec cat {} \; | wc -l
1772
root@pfa:~#

Anyway, with some incredibly limited testing, and assuming you have already installed dovecot, this seems to work. If you want to test it (please!), enable squeeze-backports (which carries Dovecot 2.x) on a Squeeze box and then:

apt-get install libwww-perl mysql-server postfix
apt-get -t squeeze-backports install dovecot-common dovecot-imapd dovecot-pop3d
wget --no-check-certificate https://raw.github.com/BigRedS/postfixadmin-installer/wheezy/postfixadmin-installer
perl ./postfixadmin-installer
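
If you want to be sure you've ended up with the 2.x Dovecot from backports rather than Squeeze's 1.2, it will tell you:

$ dovecot --version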

And, finally, here's that working config I'm using, in case that's what you're after:
/etc/dovecot/dovecot.conf

protocols = imap pop3
log_timestamp = "%Y-%m-%d %H:%M:%S "
mail_location = maildir:/var/vmail/%d/%n
mail_privileged_group = vmail
# This should match the UID of the owner of the /var/vmail hierarchy, and
# be the same as the one postfix uses.
first_valid_uid = 999
# Allow people to use plaintext auth even when TLS/SSL is available (you
# might not want this but it is handy when testing):
disable_plaintext_auth = no
# Uncomment this to get nice and verbose messages about authentication
# problems:
# auth_debug=yes

ssl = no

protocol imap {
}

protocol pop3 {
  pop3_uidl_format = %08Xu%08Xv
}

# 'plain' here doesn't override the disable_plaintext_auth default of 'yes';
# add any other auth mechanisms you want here.
#auth_mechanisms = plain
userdb {
  driver = sql
  args = /etc/dovecot/dovecot-sql.conf
}
passdb {
  driver = sql
  args = /etc/dovecot/dovecot-sql.conf
}

service auth {
  unix_listener /var/spool/postfix/private/auth {
    mode = 0660
    # yes, 'postfix' (or the user that owns the above socket file), not vmail
    user = postfix
    group = postfix
  }
}

/etc/dovecot/dovecot-sql.conf

connect = host=localhost dbname=vmail user=vmail password=1lgI2ehK6aEqytjkeDFT4Z7Pq
driver = mysql
default_pass_scheme = MD5-CRYPT
password_query = SELECT username AS user,password FROM mailbox WHERE username = '%u' AND active='1'
user_query = SELECT maildir, 999 AS uid, 122 AS gid FROM mailbox WHERE username = '%u' AND active='1'
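
A quick way to check the SQL auth is actually working is to speak POP3 at it directly (the disable_plaintext_auth = no above makes this painless); swap in a real mailbox and its password, obviously:

$ telnet localhost 110
+OK Dovecot ready.
USER someone@example.com
+OK
PASS theirpassword
+OK Logged in.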

Tidying up postfixadmin installer

I've *finally* merged about a billion changes into master in postfixadmin-installer. Chief amongst them: most of the boring output now goes to a logfile, the vacation plugin might work after install, and the setup password is randomised. This is all procrastination in order to avoid working out how to configure Dovecot on Wheezy.

It's still a big pile of poor hacks rather than a 'proper' script, but if you just don't look at the source you'll be fine!

Sitecreator

I've just spent a few days using up spare holiday, which means I've been making things for work that work doesn't want but I do. This time it's sitecreator, a tool for configuring websites and all their dependencies (Unix users, databases, ssh keys, DNS records etc.) on servers.

Since there are so many possible things for the site to rely upon, and I'm not *that* fond of reinventing the wheel, all it really does is generate passwords and call scripts. There's a configuration file that tells it how many passwords to generate, how to work out what the username should be, and perhaps to generate a couple of other things (like database names) if needed. Another bit of the config then explains which scripts to call and with which arguments (including those recently-generated passwords and usernames), and at the end it tells you what it thinks it did. I've written a few scripts for it already (mirroring what I want to do with it).
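
Stripped of the config file and the reporting, what happens for each site boils down to something like this sketch (not sitecreator itself; the helper-script names and pwgen calls here are made-up stand-ins for whatever the config points at):

#! /bin/bash
# Rough sketch of the idea only: derive a username from the site name,
# generate some passwords, then hand them to the scripts that do the real work.
site="$1"                       # e.g. example.com
user="${site%%.*}"              # 'example'
db_pass=$(pwgen -sy 14 1)
ssh_pass=$(pwgen -sy 9 1)

./create-mysql-user.sh "$user" "$db_pass" "$user"   # username, password, database name
./create-unix-user.sh  "$user" "$ssh_pass"

echo "MySQL: $user / $db_pass (db: $user)"
echo "SSH:   $user / $ssh_pass"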

For example, here's a relatively simple config file with some explanation of what's going on, and some output with that configuration:

avi@amazing:~$ sitecreator example.com
MySQL:
        username: example
        password: gN?@c6$Y7}Y{yg
        database: example

SSH:
        username: example
        password: r;x6kEgO!

MySQL dev:
        username: example_dev
        password: vA!)9WIMo&by}'
        database: example

avi@amazing:~$

And there's at least one other example config file in etc/config/. Anyway, hopefully this'll be useful to somebody else who isn't quite into automation enough to have already done this (or to have started using puppet or similar), but does have enough users or systems to configure that some automation would be good.

Oh, it's not very well tested yet, and I've still not come up with a sane thing to do with the output from the scripts :)

Network Manager disabling Virt-manager’s bridge

This doesn't work, and it's filed as bug 1099949 in Ubuntu. So we'll see how that goes.

As of about six hours ago, I've had this regularly popping up in my syslog:

Jan 13 20:13:54 amazing NetworkManager[1347]: <info> (virbr0): device state change: unavailable -> disconnected (reason 'none') [20 30 0]

virbr0 is the bridge created by virt-manager for its VMs to communicate on and, frankly, NetworkManager has no business doing anything to it, let alone disconnecting it (especially when it doesn't know why it's doing it).

Fortunately, NetworkManager has an unmanaged-devices option that you can put in the irritatingly-capitalised file at /etc/NetworkManager/NetworkManager.conf. It belongs in the [keyfile] section (so you need to make sure keyfile is listed under plugins):

[main]
plugins=ifupdown,keyfile
dns=dnsmasq

[ifupdown]
managed=false

[keyfile]
unmanaged-devices=mac:2e:e7:1f:7c:ef:76
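
NetworkManager needs restarting to pick up the change:

sudo service network-manager restart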

Annoyingly, there doesn't appear to be a 'managed-devices' configuration, and virbr0's mac address changes from time to time. So far, sticking this at the end of /etc/rc.local to get the mac address of virbr0 and replace the old one in that file seems to be working:

#! /bin/bash

echo -n "Before  : "
egrep '^unmanaged-devices' /etc/NetworkManager/NetworkManager.conf
mac=$(ifconfig virbr0 | grep HWaddr | awk '{print $NF}');
echo "New mac : $mac";
perl -pi -e "s/^unmanaged-devices.+/unmanaged-devices=mac:$mac/" /etc/NetworkManager/NetworkManager.conf
echo -n "After   : "
egrep '^unmanaged-devices' /etc/NetworkManager/NetworkManager.conf

Half an hour in, I've still got network connectivity on my VMs! :)

Converting from Apache1-style to (Debian-style) Apache2-style vhosts

Yeah, some of us are still doing that migration.

Anyway, historically Apache vhosts all live in one file at /etc/apache/httpd.conf or, if you're really lucky, in something like /etc/apache/vhosts.conf.

Apache2 in Debian uses two directories - /etc/apache2/sites-available and /etc/apache2/sites-enabled. sites-available contains one file for each vhost and, to enable a vhost, its file is linked to from sites-enabled. This is all fairly nice and elegant and human-friendly, but tedious to migrate to from Apache1.

Since this one coincided with a feeling that I should know more awk, here's how I just did this one:

cp /etc/apache/vhosts.conf /etc/apache2/sites-available
cd /etc/apache2/sites-available
# Split into one file per vhost (vhost1, vhost2, ...), starting a new one at each <VirtualHost line
awk '/^<VirtualHost/{n++} {print > ("vhost" n)}' vhosts.conf
# Rename each file after its ServerName
for i in $(ls vhost*); do name=$(grep -i ^ServerName $i | awk '{print $2}'); mv $i $name ; done
rm /etc/apache2/sites-available/vhosts.conf

Yeah, the name should be doable in the initial awk, but by that point I sort-of just needed to get it done.
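
Once they're split out and named, the sites still need enabling; that's just a symlink in sites-enabled pointing at the file in sites-available, which a2ensite does for you:

cd /etc/apache2/sites-available
for f in *; do a2ensite "$f"; done
apache2ctl configtest && /etc/init.d/apache2 reload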

Finding exploited WordPress pages

WordPress seems to be hilariously easy to compromise (this might be a bad place to write that), and the general form of an exploit is to inject code like this:

<?php $a = base64_decode('YSBsb25nIHN0cmluZyBvZiBiYXNlNjQgdGV4dAo=.......');

right at the top of a script. base64_decode is rarely used by the Good Guys outside of mailers and tricks with images, and it's almost never legitimately found right at the top of a script. I did write a really convoluted script that found calls to base64_decode and exec and guessed whether they were nefarious (generally, for example, legitimate code calls base64_decode with a variable, as in base64_decode($mailBody), rather than a bare string like base64_decode('dGV4dAo=')), but that just ate all my I/O and didn't really work.

So I came up with a much cruder way of doing it. Have a script called ~/bin/base64_in_head:

#! /bin/bash
# Print the filename and exit 0 if 'base64' appears in the first ten lines; exit 1 otherwise
file="$1"
head "$file" | grep -q base64 || exit 1
echo "$file"
exit 0

And then run it like this:

$ ionice -c3 find /home/user/public_html/ -name \*.php -exec ~/bin/base64_in_head {} \;

I've not yet had a situation where that's missed a file that later manual greps have found.

Unattended Virtualmin installs

A while ago I was asked to concoct a fire-and-forget script to install Virtualmin without prompting.

It's really easy:

#!/bin/bash
if [ -z "$1" ]; then
        echo "Usage"; echo "  $0 [hostname]"; echo ""; exit 1
fi
# Fetch Virtualmin's installer and run it non-interactively with the given hostname
wget http://software.virtualmin.com/gpl/scripts/install.sh -O install.sh
export VIRTUALMIN_NONINTERACTIVE="1"
chmod +x install.sh
./install.sh -f -host "$1"
rm install.sh

And then you call it like so:

./virtualmin.sh virtualmin.vm.avi.co

Per-extension logging in MediaWiki

This is another of those things that took me rather longer to work out than I would have liked, so hopefully this'll appear in the sorts of searches I should have done.

MediaWiki has this nifty feature where you can split the logging for particular extensions out into individual files by doing things like this:

$wgDebugLogGroups = array(
    'SomeExtension' => '../logs/wiki_SomeExtension.log',
);

What's not made overly clear (well, with hindsight, it is implied by the manual) is that the keys of the hash don't necessarily have anything to do with the name of the extension. I assumed that, in debugging SimpleCaptcha, what I wanted was

$wgDebugLogGroups = array(
    'SimpleCaptcha' => '../logs/wiki_SimpleCaptcha.log',
);

But not so! What I actually wanted was

$wgDebugLogGroups = array(
    'captcha' => '../logs/wiki_confirmedit.log',
);

And, as far as I can find, this isn't documented *anywhere*. For other extensions that are similarly lacking in documentation, you can find this out by poking around in the code and looking for where the extension does this sort of thing:

function log( $message ) {
    wfDebugLog( 'captcha', 'ConfirmEdit: ' . $message . '; ' . $this->trigger );
}

That first argument to wfDebugLog is what you want as the key in the hash. Why it can't just use the name of the class invoking it, which is the name used to configure the rest of the extension, I've no idea.
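
A quick way to find the right key is to grep the extension's directory for wfDebugLog calls (assuming it lives in the usual place under extensions/):

grep -rn "wfDebugLog(" extensions/ConfirmEdit/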

Allowing uploads of arbitrary files in MediaWiki

I did RTFM and I did what it said, and still my MediaWiki complained when I tried to upload executable files and things with funny file extensions or MIME types. If $wgFileExtensions is empty but $wgEnableUploads = true and $wgStrictFileExtensions = false, it should just let me upload anything (I can't think what other behaviour one would expect there), but set like that I still can't upload my dodgy files.
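
That is, with settings along these lines in LocalSettings.php:

$wgEnableUploads = true;
$wgStrictFileExtensions = false;
$wgFileExtensions = array();    // empty, so in theory anything goes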

So I've removed the code it uses to check.

Here's a pair of diffs if you'd also like to do this. They're against version 1.17.0, but I suspect it hasn't changed very much.

This just comments out the two blocks of code in UploadBase.php which check whether files are considered safe and warn if they're not - it prevents the checking and the warning:

wiki:/home/wiki/public_html# diff includes/upload/UploadBase.php includes/upload/UploadBase.php.bak
447,455c447,454
< // ## Avi Commented this out so that we can upload whatever we like to our server. That was nice of him
< // // Check whether the file extension is on the unwanted list
< // global $wgCheckFileExtensions, $wgFileExtensions;
< // if ( $wgCheckFileExtensions ) {
< // if ( !$this->checkFileExtension( $this->mFinalExtension, $wgFileExtensions ) ) {
< // $warnings['filetype-unwanted-type'] = $this->mFinalExtension;
< // }
< // }
< //
---
> // Check whether the file extension is on the unwanted list
> global $wgCheckFileExtensions, $wgFileExtensions;
> if ( $wgCheckFileExtensions ) {
> if ( !$this->checkFileExtension( $this->mFinalExtension, $wgFileExtensions ) ) {
> $warnings['filetype-unwanted-type'] = $this->mFinalExtension;
> }
> }
>
557,570c556,569
< // ## Avi Commented this out so that we can upload whatever we like to our server. That was nice of him
< // /* Don't allow users to override the blacklist (check file extension) */
< // global $wgCheckFileExtensions, $wgStrictFileExtensions;
< // global $wgFileExtensions, $wgFileBlacklist;
< // if ( $this->mFinalExtension == '' ) {
< // $this->mTitleError = self::FILETYPE_MISSING;
< // return $this->mTitle = null;
< // } elseif ( $this->checkFileExtensionList( $ext, $wgFileBlacklist ) ||
< // ( $wgCheckFileExtensions && $wgStrictFileExtensions &&
< // !$this->checkFileExtension( $this->mFinalExtension, $wgFileExtensions ) ) ) {
< // $this->mTitleError = self::FILETYPE_BADTYPE;
< // return $this->mTitle = null;
< // }
< //
---
>
> /* Don't allow users to override the blacklist (check file extension) */
> global $wgCheckFileExtensions, $wgStrictFileExtensions;
> global $wgFileExtensions, $wgFileBlacklist;
> if ( $this->mFinalExtension == '' ) {
> $this->mTitleError = self::FILETYPE_MISSING;
> return $this->mTitle = null;
> } elseif ( $this->checkFileExtensionList( $ext, $wgFileBlacklist ) ||
> ( $wgCheckFileExtensions && $wgStrictFileExtensions &&
> !$this->checkFileExtension( $this->mFinalExtension, $wgFileExtensions ) ) ) {
> $this->mTitleError = self::FILETYPE_BADTYPE;
> return $this->mTitle = null;
> }
>

And this just stops Setup.php from 'making safe' the $wgFileExtensions array by removing whatever's in $wgFileBlacklist from it. I don't think that would have caused any complaints had I not already done Bad Things to those two variables, but it's late and it can't hurt to turn this off, too:

wiki:/home/wiki/public_html# diff includes/Setup.php includes/Setup.php.bak
296,298c296,297
< // ## Avi Commented this out so we can upload whatever we like to our server. That was nice of him
< //# Blacklisted file extensions shouldn't appear on the "allowed" list
< //$wgFileExtensions = array_diff ( $wgFileExtensions, $wgFileBlacklist );
---
> # Blacklisted file extensions shouldn't appear on the "allowed" list
> $wgFileExtensions = array_diff ( $wgFileExtensions, $wgFileBlacklist );