In other news, I've hauled my home PC out of the 32-bit stone age and into the gleaming 64-bit bronze age with a reasonably priced Athlon64 3200+. This was an effort to get reasonable framerates out of X3: Reunion, but unfortunately I've discovered that the bottleneck now lies more with the graphics card than it ever did with the CPU.
With modern games, it seems that a decent GPU paired with a reasonable CPU is a much better combination than a decent CPU and an average graphics card. My GeForce 6600 GT simply doesn't cut it with Black & White 2, X3 or Valve's 'The Lost Coast', which does a nice job of demonstrating what it's like to have a decent optical physics model in games.
Now I'm left wondering exactly when I'll be able to buy a top-end graphics card without paying too much of a premium, bearing in mind that my gaming experience gets worse with each new game purchase.