Debugging And Removing A Spam Attack Through Postfix

Some days it’s a bad idea to ignore (or rather, just not check) email, and to ignore weird stuff when you see it. It turns out that for two days straight my server had been spewing out spam, after a user on the server had their password compromised. The server is an Ubuntu box with Postfix as the underlying mail transport.

The spam was being sent with a from address at a domain hosted on my server, where the local part (xxx) was a random string. The server doesn’t allow relaying, so to send mail from that domain to random other domains (Yahoo, Gmail, etc.) they’d have to be sending it as a user on my system.

To temporarily fix things, all the email that was in the queue got put on hold.

# postsuper -h ALL

This puts the mail on the back burner until you figure out what to do. The server won’t try to deliver it at all until you “un-hold” it. This had to be done a couple of times before I figured out where the spam was coming from. Thing is, what to do with 600,000+ emails sitting in the hold queue?
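
As an aside, a quick way to keep an eye on how much is sitting on hold (held messages show a ‘!’ after their queue ID in the mailq listing):

# mailq | grep -c '^[0-9A-F]*!'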

First thing was to figure out where the mail was coming from. Looking through the logs, it seemed like all the spam was being sent through the same user.

Oct 27 10:01:14 amarok postfix/smtpd[25071]: C3E4FB1CEA3: client=unknown[], sasl_method=LOGIN, sasl_username=bob

Ok, so it looks like ‘bob’ got his email login compromised. Now at least there’s a starting point. After a bit of digging through the logs I found roughly where the spam started, and confirmed it by checking where the user was logging in from. Unless he could jump from Vancouver to Bulgaria in a minute, and then decided to send mail every second, it was pretty easy to figure out.

Ok, so what now?

Next step: get a list of the IPs the user was logging in from, sorted and uniqued, so I had each of the IPs used to spam.

# grep sasl_username=bob /var/log/mail.log | sed 's/.*client=[^[]*\[\([^]]*\)\].*/\1/' | sort -u > iplist.txt

Now I tried a few different things to figure out if the IPs were real or not. I figured if they came from the Vancouver area they were probably legit, but if they were from Asia, Russia, or a host of other countries they were probably not. I tried a few different methods of doing an automated lookup of where the IPs were from, but the reverse lookup tools seemed inconsistent at best for automated use.
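
For what it’s worth, one of those automated attempts looked something like this (a rough sketch, assuming the geoip-bin package that provides geoiplookup is installed; the results weren’t reliable enough to act on blindly):

# for ip in $(cat iplist.txt) ; do echo -n "$ip " ; geoiplookup $ip ; done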

In the end I basically did manual lookups on each IP, sometimes on just the first octet to identify the network block. Honestly, after a while I just deleted from the list the IPs I knew were commonly used by Rogers and Shaw, and then deleted all the mail from everything else:

root@server:/var/log# grep -wFf /root/iplist.txt m.log | cut -f 6 -d ' ' | cut -f 1 -d ':' | postsuper -d -

This greps m.log (a combination of mail.log, mail.log.1, and mail.log.2, the three days of log files I knew had relevant data) for the IPs in iplist.txt. Each matching log line is cut down until just the mail queue ID is left, and that is piped into postsuper, which deletes the message.
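
For the record, m.log was just the three logs concatenated together:

# cat mail.log mail.log.1 mail.log.2 > m.log

And to make the cutting concrete, here’s the sample log line from earlier run through the same two cuts, leaving just the queue ID:

$ echo 'Oct 27 10:01:14 amarok postfix/smtpd[25071]: C3E4FB1CEA3: client=unknown[], sasl_method=LOGIN, sasl_username=bob' | cut -f 6 -d ' ' | cut -f 1 -d ':'
C3E4FB1CEA3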

After a few of those I was down to 5,000 messages in the queue, down from 600,000. Not bad. Still a few to deal with though.

# mailq | grep '<spammed from address>' | awk '{ print $7 }' | sort -u

Now that’s a list of all the “bad” email addresses in the queue that need to be deleted (the grep pattern here being the spammed from address). So, using a great little tool called pfdel for deleting Postfix messages by To or From address, I ran each of those bad addresses through it to delete their messages from the queue:

# for i in `mailq | grep '<spammed from address>' | awk '{ print $7 }' | sort -u` ; do pfdel $i ; done

This took me down to 38 messages in the hold queue, which were easily looked at to see if they were legit (hint: if it was coming from a .ru or .br address, or a spammy looking domain, it got nuked).

So that’s it, short, sweet, and not the way I wanted to spend my Sunday night. Now dealing with removing my server from all the blacklists, that’s another issue… Ugh :(


Moving From nVidia To ATI In Ubuntu Karmic Koala

A bit ago I got an almost-free computer from work with a couple of parts I wanted for upgrading my home system: bumping the CPU a bit and replacing the video card that goes WRRRAAAAAAWWWRRRRRRR-ER-ER-ER WRRRRRAAAAAWWWWRRRR with something a little quieter. Last night I finally got around to swapping the parts, and hit a couple of issues moving from the nVidia Ti 4200 video card to an ATI Radeon 9250, so I figured I’d document them as an excuse to get something on this page so it doesn’t completely stagnate.

First of all, I did it completely wrong: I simply shut down the system, swapped the video cards, and booted it up. The graphics, which were configured for the nVidia card, completely failed on boot, giving me just a flashing console as it tried to run X11 with a driver for a card that wasn’t installed at all.

I was lucky and had a computer next to me that I could use to ssh into the Linux system and do maintenance remotely; if you don’t have that, you’ll want to edit the GRUB config at boot to enter single user mode.
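
For reference, the gist of that recovery route (from memory, and for GRUB legacy; GRUB 2 menus differ a bit): at the boot menu, highlight your kernel entry, press ‘e’ to edit it, append “single” to the kernel line, and boot:

kernel /boot/vmlinuz-<version> root=UUID=<uuid> ro quiet splash single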

To recover the video, the first step was to remove the nVidia-specific customization from the /etc/X11/xorg.conf file. I have no idea why I had device-specific configuration in there, as the last couple of Ubuntu releases have required only a skeleton file and will figure out your video card for you. So the section I had at the bottom, which looked like this:

Section "Device"
Identifier "Configured Video Device"
Driver "nvidia"
Option "NoLogo" "True"

was removed.

Next I removed the nvidia packages I had installed:

$ dpkg -l | grep nvidia | grep ii
[list of packages, for me they were the "173" version, you might have different ones]
$ sudo apt-get remove nvidia-settings nvidia-glx-173 nvidia-common nvidia-173-kernel-source

Then I installed the non-free, evil, binary-only packages for the ATI drivers (in theory I could have rebooted, let the system boot up with the default video drivers, and installed the restricted drivers from the ‘hardware’ application under the system configuration menu).

$ sudo apt-get install fglrx-kernel-source xorg-driver-fglrx

Packages install, reboot, and voila, the system should boot up as normal.
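
A quick sanity check that the fglrx driver actually took (glxinfo comes from the mesa-utils package if you don’t have it already):

$ glxinfo | grep -i 'opengl renderer'
[should name the ATI card, not a software/Mesa renderer]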


Leopard Time Machine Backups To a Samba SMB Share

The short answer is to look here for the instructions that finally made it all work for me. After a little “mishap” with my address book a few days ago (it helpfully synced, empty, to the net and to my phone, so nowhere had a backup), I finally figured that checking out the much-touted “Time Machine” backup software from Apple would probably be a good idea.

Course, being that I use my laptop 99% of the time on my lap on the couch, having an external hard drive connected would kinda suck. Also, I’m not paying the Apple tax to get the Time Capsule hardware when I have a plethora of unused external disks here.

So what I did:

  • First step: plug a 500G disk into my Linux workstation.
  • Format it as ext3 and mount it in a safe place:

    • Find the disk’s UUID
    • Put it in /etc/fstab

  • Make it a share in Samba
  • Mount it under Leopard and make sure it automounts.
  • Follow the instructions here to tell Leopard to allow unsupported mounts (the defaults write… command) and test.
  • In my case it failed with the “backup disk could not be created” error, so I followed the rest of the instructions to create the sparse image and copy it to the Samba share (the whole sequence is sketched after this list).
  • Test again, correct typos.
  • Run the Time Machine config to select the share. Wait for the long ‘processing’ to finish.
  • Success!
  • Run backup.
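
For reference, here’s a rough sketch of the key commands from both sides. The device names, paths, share names, and sizes are placeholders rather than my actual setup, and the Mac-side bits are the gist of the linked instructions, so double-check against them:

# Linux side: find the disk's UUID, then wire it into /etc/fstab
$ sudo blkid /dev/sdb1
/dev/sdb1: UUID="1234-abcd" TYPE="ext3"
$ echo 'UUID=1234-abcd /srv/timemachine ext3 defaults 0 2' | sudo tee -a /etc/fstab

# Share it in /etc/samba/smb.conf (then restart Samba):
# [timemachine]
#     path = /srv/timemachine
#     read only = no
#     valid users = myuser

# Leopard side: let Time Machine offer unsupported network volumes
$ defaults write com.apple.systempreferences TMShowUnsupportedNetworkVolumes 1

# If you hit the "backup disk could not be created" error, pre-create the
# sparse image (named <hostname>_<en0 MAC without colons>.sparsebundle)
# and copy it onto the mounted share:
$ hdiutil create -size 300g -type SPARSEBUNDLE -fs HFS+J -volname "Time Machine" mylaptop_0016cb123456.sparsebundle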

I now feel a little bit safer, and get to use the funky Time Machine interface :)


Allow Me To De-Fanboy A Bit (Linux)

First of all, I love Linux. I discovered it in 1994(ish), right around the time kernel 1.0 was about to come out. The fact that it was a homebrew system, that it was a rebellious band of freedom fighters coming together as a community to create something great, and that anyone could contribute with ease all tickled my heart.

One of the great things about Linux is that there is a real feeling that the community actually listens. I have no idea how you’d submit a bug to Microsoft or Apple, I really don’t. I don’t know if I need to be an MVP, a Gold-level partner, an ACG (Apple Certified Genius(tm)(r)(c)) (actually I just totally made that up), or what. With Linux distros, desktops, and major (and minor) projects there’s generally a way to submit bugs or get support via bug tracking software of some sort, and more often than not you get a response. You can even submit suggestions, and there’s a good chance they’ll actually get implemented or fixed (I believe a bug I reported against the Mozilla browser got fixed, go me!).

However there are now three bugs that I’ve reported and recently got updates about that pissed me off a bit.

[Bug 145524] Show emblems from Nautilus

In the main file manager for GNOME there is some inconsistency with the “emblems” you can apply to folder icons… seems like a simple fix to me, but I’m not a GTK coder. The only real activity this has gotten over the last 4 years is people agreeing with me that it’s a bug, showing examples of it, asking about the status, or doing the classic OSAM (Open Source Asshole Move… I totally just made that up too) of “well, you write a patch then!”

[Bug 85973] no drop down box for “find” in groups

This is another simple (to me anyway) omission from when Pan (my newsreader of choice) got rewritten from the GTK 1.x to the GTK 2.x toolkit. Basically the whole thing was re-written and a few minor details were missed. No biggie, I guess. Except that in six (6) years this hasn’t been done. It was re-prioritized around 2004, a suggestion was made in 2006, and then in 2008 the commenter from 2006 was reproached for the way they did an OSAM.

[Bug 123796] hiding a panel with dual monitor moves panel to other monitor

This was an issue I hit probably around the time I first got a dual monitor setup in 2003, 5 years ago; it’s an easily reproducible issue with panels and dual monitor setups. The only activity it’s gotten in 5 years is people reporting that it’s still happening.

These are the sort of things that turn people off FLOSS and the community, which is sad. If you’re not a coder with a good knowledge of the guts of the system, the main ways you can contribute are to write documentation and report bugs, and people asking “how can I help” are constantly told this. But when those contributions are ignored for months and years, well, why bother? To put things in perspective, the last bug was reported against GNOME 2.4; the current version is 2.22. That’s 9 releases (assuming they did actually go all the way through 2.4, 2.6, 2.8 … 2.22).

From the developer perspective, of course, I’m sure it’s tough dealing with an army of demanding whiners complaining that you’re not fixing some incidental bug on some hardware or software setup you may not even have, all while you have a day job and put out software in your spare time for free, and they do nothing to help you along other than mention that it’s still not fixed…. still not fixed…. still not fixed…..

Course, at least I don’t have to pay Microsoft $99 per incident to be told it’s not a bug, it’s a feature, and it’s my fault anyway. Well, that’s how I imagine it is, anyway. A search for “how to report a bug to Microsoft” has the top results either dead links (requiring me to log in first), outdated information, or articles saying how it’s impossible to submit a bug to Microsoft. Submitting a search for “report a bug” from the front page of microsoft.com crashes Firefox!!!

I don’t know if it’s better to not be able to report bugs, or report them and have them ignored… :)


Update: Seems people have been going through and updating old bugs lately…. the following just got an update:

[Bug 113556] META Refresh does not update back history as expected

This is the oldest bug in the list, one I reported to Mozilla in 2001… looks like Mozilla is doing a big bug triage and this bug is being shuffled around.


Thoughts on the iPod and Music Management

As I said I just got an iPod, so now I have to deal with managing the music for it in my particular situation. I have the following going on at home right now:

  • A fileserver with approximately 140G of MP3s, all nicely tagged via MusicBrainz‘s tools and organized nicely into directories (i.e. music-player independent). All music is stored there.
  • Laptop from work. “Daily” music is stored here, podcasts are downloaded for playing at work, and when I download my radio show during the day at work, I load it onto my mp3 player from here.
  • Windows workstation with headphones. Minimal music management is done here; this is a gaming and accounting box.
  • Linux workstation, hooked up to speakers. Music playing is done here, and music management (basically rating songs and creating playlists) is done in Rhythmbox. No podcasts here.

I’ve never had a decent MP3 player before; the one I have now has 256MB of space, so not a lot of daily changes get made.

Anyway, music management on the iPod is supposed to be done through iTunes, which is a nice application; however, it has a few downfalls.

You can only sync your iPod on one computer.

Bah. Unless there’s a trick around it, when you sync an iPod to a computer, and then take it to another one and try to sync it, it tells you the device is new and anything on it will be nuked off. Say I were to sync music for the gym on it at home, then try to put the radio show or something new on it at work. No fun at all.

The obvious alternative is to manage it under Linux, except that the new iPods don’t work under Linux thanks to Apple encrypting the iPod database. The encryption has already been broken, but the fix hasn’t made its way into any released products.

What I was hoping to do was to sync things like my radio show and podcasts via the laptop, and load up the music under Linux. Looks like that won’t work, at least for now :(

My solution in the meantime….

I’m making my 141G of music available to my work laptop, on the theory that it moves from work to home and back. Inconvenient, however, as I don’t want to have to pull out the laptop each time I add some new music. And of course the laptop doesn’t have that big a hard drive, so I’m mounting the fileserver on the laptop and adding all the music in it to the iTunes library. In theory it’ll just show up as “can’t find file” when I’m not at home.

This will let me get everything I need, for now. I’m hoping that I can, in theory, manage songs via Linux when I get there.

Anyone have better ideas? It comes down to:

  • Can’t sync an iPod on more than one computer with iTunes.
  • iTunes doesn’t run (well) on Linux (last time I checked) and isn’t what I want to use under Linux anyway.
  • Can’t manage an iPod from Linux

Anyone have anything similar? Any better ideas?


Gentoo Apache 2 Updates

Just so my bitching isn’t restricted to Microsoft: today I spent a fair chunk of time fixing the webserver after upgrading to the latest version of Apache in the Gentoo repository. Version 2.0 had been installed and working fine for a while, but the config layout changed in the new version and a bunch of things moved. Generally I try not to change config files that don’t need to be replaced if I can avoid it, as there’s less chance of messing up an existing config. It seems the new install didn’t like that, though, and it was only after lots of mucking around (and many times thinking “man, I should downgrade to 2.0”) that I finally realized I basically had to blow away all the old configs (the default server layout had changed) and leave only my own virtual host settings before it would finally work. Bleah. I keep wanting to switch to Ubuntu (still) but just can’t bring myself to undo all the tweaking and setup I’ve done. However, when the server is upgraded with gobs more disk space (1TB drives are getting cheaper all the time) I will move over to something a little less volatile :)


Linux Northwest 2007 Wrapup

Long time no blog. I’ve been feeling a bit down lately, with busyness, stress, life, work, etc. all sucking, so I’ve been pretty quiet. A nice change from that was Saturday, when Dana, Wim, Tammie, Clayten and I went down to Linuxfest Northwest, the yearly geek fest in Bellingham. Definitely not a place for non-geeks though… there was lots of heavy, heavy geeking out going on :)

We were a bit late getting down there due to border traffic, so I missed the Xen presentation given by Reverend Ted. I chose to be a complete bastard and not go to Dana’s presentation on strong authentication, and instead went to Production Grade Scripting by Brian Martin. Quite a good presentation, though the first half was a bit slow. I was impressed with the way he set up his scripts and how output was filtered, with the shell of the script having all the fun stuff already set up to go.

Next on the list was OpenID, which was kinda meh. Lots of interesting discussion about security afterwards, but since OpenID is still relatively new and hasn’t really found its place in the world, it’s sort of hard to address what it does right and wrong when no one really knows where it’s going or what it’s trying to do (or rather, where it should be going).

Last and best, IMHO, was Stuart from RealNetworks talking about scaling web services, especially in relation to their Rhapsody online music service. Stuart was a really good speaker: no PowerPoint, just him talking passionately about what he does, lots of joking and very high energy, moving from hyperbole to reality and then back to hyperbole again. A few things gleaned from the talk:

  • “The same is better than better.” – IE: having a computer that is the same as the rest of the computers in the data center is much better than having one or two that are tweaked, or different in any way. Obviously exceptions are going to happen, but if you’re rolling out 30 webservers, do you want them to be built automagically from script / PXE all exactly the same, or each one built by someone different with a slightly different configuration?
  • “A reactive sysadmin is better than a contemplative sysadmin.” – IE: Someone who has lower knowledge / training but who knows that when condition X happens hit button Y is better to have than someone who sits with his feet on the desk wondering why condition X happened….
  • “Document startup and shutdown procedures.” – Everyone needs to know how to take down a server, what it will affect, and what else needs to be done to it or other systems before and after… nothing sucks more than “ooops, so that was tied into that system!” :)
  • “Documentation sucks!” – Both hyperbole and truth. It’s written for the wrong reasons for the wrong people by the wrong people.
  • “People are idiots. You are an idiot.” – Ain’t that the truth! And knowing that, make sure that everything is scriptable so that you don’t have to rely on anyone knowing anything other than “hit this button when that happens”.
  • “Script everything!” – Similar to the ideas of Extreme Programming, where devs just need to hit a button and get a “everything ok” or “something wrong” indication. Since everyone is an idiot, no one should ever log into a server and type anything, and if they do it should be a script. I assume that at Real they have scripts run from web pages or something. Obviously way more important in larger scale installations of course.
  • Testing environments suck. – IE: you can spend $15m on a complete duplicate of your web service, but is it worth it? Basically all testing environments suck, and it sounds like at Real they basically poke at it and say “yup, it should work ok….”, then swap from test environment to production, and swap back the second that anything goes wrong.
  • … much more….

We also ran into a guy from the One Laptop Per Child project with a real OLPC laptop. Looked like a Fisher-Price toy (or like the original iBook), but was super cool. Semi-sucky hardware (128MB RAM, 100MB flash disk, Geode processor) but it also has some neat stuff. High resolution screen, built-in camera, wifi (with cute little ears), and a display with a black and white mode that is actually more readable the brighter it is outside (ie: you can use your laptop outside in the sun). It’s also running a stock(ish) Fedora Linux distro, so you can even play around with it if you don’t have the hardware (which hardly anyone does) via LiveCD. It also sounds like the price will be about $175 USD, which is pretty cheap for a cool toy to play around with and hack on. There was a rumor that there was a “buy 2, get 1” deal where you pay for two OLPCs, get one, and the other goes to someone who needs it, but that’s just a rumor.

Other than that the day was pretty uneventful. Aside from a mishap driving home (apparently if you keep going north on the I-5 you don’t get to the 264th border crossing :) we made it home, and I was back for a yummy dinner cooked by my loverly lady.


The Great Ubuntu Upgrade that Wasn’t

I played with Ubuntu Linux a few days ago and reported that it went on slick and easy and was nice and stable, and that I was considering replacing my desktop system with it. Well, in a fit of boredom this weekend I decided to do just that. Because I’m sane, and have had “oops” moments before, I decided to do it on a totally separate hard drive so if something went wrong I could revert back and still get work done. I didn’t have any standard hard drives sitting around, so I decided to install the three 18G super-fast SCSI drives I still had from the old system. I later found I also had a couple of 40G IDE drives, but leaving those 15,000 RPM drives and their adapter just sitting in my closet seemed like such a crime.



Playing with the Latest Ubuntu Linux

Decided to throw Feisty Fawn, the codename for the latest Ubuntu release, onto the second hard drive in my main Windows box today, just to shake things up.

System specs are as follows:

  • Intel Core 2 Duo E6600 CPU (2.40Ghz)
  • 2G DDR2
  • nVidia GT7950
  • 2x 320G SATA Hard Drive

The system is my Windows Photoshop / gaming box, with dual hard drives, one of which has never been used. So I disconnected the Windows drive and rebooted…

Booted up fine, but there were a couple of moments of “WTF?” when nothing happened, or my monitor went into power-saving mode. I had to hit CTRL-ALT-F1 and then ALT-F7 to get back into the graphical environment. Or I think that did it anyway; I’m not sure if I was just impatient.

Since it’s a clean box, I just hit the ‘Install’ icon and let ‘er go. Simple install: language, keyboard, my name and the name of the computer, then the partitioner (for all of which I used the defaults), and off it went. The full install took about 20 minutes, during which you’re still in the Live CD environment, so I played games while waiting. When it was done I hit the ‘Reboot’ button and was thrown into the new environment.

I have to say, bootup was pretty sexy (at least compared to what I’m used to with mainly servers), and fast. Either this computer is lightyears better than my other Linux box, or Ubuntu has really got some super-optimizations in their bootup system. I went from GRUB to the GDM login screen in only a few seconds. I had the same issue as with the Live CD, having to go to the console and then back via ALT-F7 to get into the graphical environment. Odd.

Honestly, the only real reason I was doing this was to play with the cool eye candy stuff. I followed the Beryl install guide, which has their “three click” method (and it worked perfectly), and was up with super-sexy desktop eye candy in about 5 minutes. Sweet! Much more stable than under my Gentoo setup as well.
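
I don’t remember the guide’s exact three clicks, but my assumption is that on Feisty it boils down to installing the packages and starting the manager, something like:

$ sudo apt-get install beryl emerald
$ beryl-manager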

The new addition to the Movie Player that allows it to download codecs on demand is cool too. Basically it’s just a frontend to the package manager, letting the system search for what’s needed and then telling the package manager to install it. Very slick. The only sucky part is that if you have another package manager going at the same time it bails out (understandably), and then you have to close the movie player and start again. Not a big deal though, and it works perfectly.
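
Under the hood that’s roughly equivalent to installing the GStreamer codec packages by hand. The exact set depends on the media; a typical guess (and it is a guess, the tool picks for you) would be:

$ sudo apt-get install gstreamer0.10-plugins-ugly gstreamer0.10-plugins-bad gstreamer0.10-ffmpeg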

The one thing I was a bit disappointed with was the lack of widescreen support. My monitor does 1680×1050 native resolution, and it took a bit of fiddling to get it working. Not much, but it was more than the rest of the system needed.
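
I don’t remember the exact fiddling, but the usual fix at the time was to list the native mode explicitly in the Screen section of /etc/X11/xorg.conf, along these lines (a sketch; the rest of the section is omitted):

Section "Screen"
Identifier "Default Screen"
SubSection "Display"
Modes "1680x1050" "1024x768"
EndSubSection
EndSection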

And that was about it. I had accomplished what I wanted (so far), so I plugged my Windows hard drive back in and continued on with a day of coding and gaming. Now my next step is to see if I want to replace my current Gentoo desktop with Ubuntu, and what I need to do to make that happen (special setup, backing up the systems, etc.).
