Things have been working fine except for one extremely nagging issue - the initial draw of a Konsole window takes 5-10 seconds. It's obnoxious, especially when you need a terminal so often.
I found out this was due to NVIDIA's drivers on newer boards - as described in the KDE techbase and illustrated in the Ubuntu forums and How-To's. Luckily it was an easy fix - running
nvidia-settings -a InitialPixmapPlacement=2 -a GlyphCache=1
as root set the appropriate settings in the NVIDIA driver to allow windows to draw at the speed they should. But I don't necessarily blame NVIDIA - the blame deserves to be cast further.
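One caveat worth noting: attributes set with nvidia-settings -a only apply to the running X session, so the fix has to be reapplied after every restart. A minimal sketch of one way to persist it - the autostart location is an assumption for a typical KDE setup, not something from the driver documentation:

```shell
#!/bin/sh
# Re-apply the NVIDIA pixmap/glyph workaround at login.
# Assumes the proprietary nvidia driver and nvidia-settings are installed;
# the attribute values are the same ones used in the one-off fix above.
nvidia-settings -a InitialPixmapPlacement=2 -a GlyphCache=1
```

Dropping a script like this into ~/.kde/Autostart (or whatever autostart mechanism your desktop environment provides) and marking it executable should keep the setting in place across sessions.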
The Linux Hater's Blog brings up a great point about Xorg not being able to allocate offscreen buffers - something I hadn't realized. Xorg lacks a memory manager, so all the stuff you need for full OpenGL support simply can't be done. The points made in the rant are absolutely right - the memory management infrastructure for pbuffers and framebuffer objects has to be there, otherwise you're hosed.
The core issue that follows from this deficiency is that X11 is inherently unable to support OpenGL properly. Without offscreen buffers, all the great stuff you should be able to do directly in hardware can only be accomplished with a software renderer. Of course, this defeats NVIDIA's entire business model of making the GPU the most important part of your workstation. So the NVIDIA Linux drivers must, by sheer necessity, replace huge chunks of the Xorg implementation.
After reading the Linux Hater's post, a lot of other things made sense - why NVIDIA's drivers are so invasive, why you magically don't need to install Xgl to run Compiz Fusion if you're using the proprietary nvidia driver (because it already replaced that part of Xorg for you, thanks), and why KDE's desktop effects had window resizing slowdowns.
NVIDIA didn't break things - they fixed things. They're just trying to live in our broken world.