Saturday, February 28, 2009

Old is the new New

The open beta of Quake Live opened this Tuesday, and I made sure to jump on and register an account as soon as I could. Of course, like most other people online on Tuesday night, I was stuck in a queue with tens of thousands of other players. I finally had an opportunity to play last night for about 30 minutes, just to make sure my account worked and to see how things were put together.

It is definitely the classic Quake III Arena in all its OpenGL goodness. When the title first launched over nine years ago, its hardware requirements may have stopped it from becoming ubiquitous; it supported hardware rendering only and, unlike other titles shipping at the time, didn't have a software renderer. Ah, how times have changed. A quick look at Steam's hardware survey shows how the desktop tide has turned, and now tons of people have way more than enough horsepower. And it doesn't stop at the desktop - Q3 has hit every major console, is being developed for the Nintendo DS, and has even been ported to the iPhone. Official Linux and Mac support for Quake Live is reportedly a priority after the beta, which would grant a growing OS X demographic access. Ubiquity is no longer a problem, and making the installation as easy as a multi-platform browser plugin lowers the barrier to entry to near nil.

The key factor that stops Quake Live from being just a Q3A port is the infrastructure it resides within. The game proper is the endpoint, but the content itself is driven by the ladders, achievements, matchmaking and map inventory system contained within the Quake Live web application. It is one of those blindingly obvious why-isn't-everyone-doing-this moments when you see how the game is orchestrated from the Quake Live portal; the strengths of the browser as a platform are completely leveraged, while the strengths of your desktop are used to power the game itself. The creators didn't try to cram Quake III into the browser itself, thereby consigning it to some sort of Flash-based hell. Instead they let you use the browser just as you would normally use it: for networking, finding a game, chatting, browsing leaderboards, looking at achievements, bugging friends, strutting your profile and other... forgive me for saying this... "social networking" features. When it comes time for the deathmatch, an external application is launched in tandem, allowing a fully fledged and fast OpenGL app to run on its own.

An interesting effect of this split brain between an online presence and a desktop renderer is that it accomplishes exactly what Valve wants to do via the Steam Cloud, where preferences and saves are stored on a central network instead of client-side. Most desktop gamers don't like the idea of savegames or prefs being stored on a remote server pool, and I would agree. For single-player experiences I would much rather hack my own .ini files and not be stranded when someone's cloud goes down in flames (clouds do NOT equal uptime... see Google and Amazon themselves for examples). For multiplayer games, however, this is acceptable; if a server is dead or a line is cut you wouldn't be able to play multiplayer anyway, so it doesn't matter where the configs reside. As an added bonus, when the configs live remotely players can't hack them to reduce the map to a wireframe or make some other esoteric modification that gives them a competitive edge. Again, hacks are fine in single player, but not in a multiplayer scenario.

Add into the mix the fact that the economy has positively tanked and people have eviscerated their discretionary spending, meaning $60 titles are no longer in the budget. Quake Live brings a new title, albeit of an old game, to market for the price of absolutely free. While it is true that CPMs and CPCs for online advertising have dropped through the floor, hopefully Quake Live will be able to cash in on its unique presentation, dedicated fan base and sheer volume of eyeballs. If the advertising model works and Quake Live stays free, it will have a huge edge over other FPS titles this year.

Quake Live is a game-changing title, even if they didn't change the game. But why bother? Quake III Arena was arguably one of the most well-rounded and polished multiplayer first-person shooters out there, with textbook weapon balancing and gameplay mechanics that became a staple of the genre. Why change something that works? The only balance issue the original Quake III Arena had was that, towards the end, veteran players became so good that it was no longer possible for a new player to have any fun on a map. Now, with Quake Live's matchmaking mechanics and dynamic skill levels, even that mismatch has been mitigated.

Yeah, I've already gone on too long about this. This approach just makes so much sense from an engineering perspective and a gaming perspective that I'm sure tons of titles are now going to flood into the market, ready to follow suit.

Monday, February 16, 2009

Who Can You Count On During Crunch Time? Turns Out... Nobody.

I've long depended on the software community to save my butt in times of need. And it used to.

It stopped helping this week, and instead started wreaking havoc.

You'll notice I did not say the open source community. And I did not say the Java community. Even though those two communities are the ones this latest rant is aimed at. No... this issue has already burned me big time with commercial companies, which is why I left the likes of IBM, Microsoft and Oracle. But now I'm not sitting any better... everyone has sunk to the same level of mediocrity.

Bugs are now being exhaustively reported, patched and submitted to release managers. And yet months, even years go by without so much as a cursory review. A few good examples come to mind... there were fairly blatant bugs, even typos, in the Hibernate dialect for the H2 RDBMS. The author of H2 reported the bug, patched it and even wrote unit tests for the project. Has the fix even seen daylight? No. It has been open since July of 2008.
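
In the meantime the only practical move is to carry the fix yourself. A minimal sketch of that kind of local workaround, assuming the typo lives in a method you can override (the package, class name and the specific method shown here are illustrative, not the actual patch):

```java
// Hypothetical stopgap: subclass the shipped dialect, override whatever
// method carries the typo, and point Hibernate at the subclass via the
// hibernate.dialect property. The overridden method below is only an
// example of the pattern, not the actual bug.
package com.example.persistence;

import org.hibernate.dialect.H2Dialect;

public class PatchedH2Dialect extends H2Dialect {

    @Override
    public String getQuerySequencesString() {
        // Return the corrected SQL instead of the broken string inherited
        // from the parent class.
        return "select sequence_name from information_schema.sequences";
    }
}
```

Set hibernate.dialect=com.example.persistence.PatchedH2Dialect and you're unblocked - but now you're shipping a one-off dialect that has to be re-checked against every upgrade.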

Here's an even worse example: thousands (if not millions) of people rely on Apache's Commons Codec library. It's used for string matching, Base64 encoding and a slew of other things. One of the phonetic (speech) codecs suffers from an ArrayIndexOutOfBoundsException during encoding. A simple mistake to remedy, and one that was remedied and committed to their source repository. Was such an obvious bug ever fixed in a production release? No. In fact, a new release hasn't been made in five years.
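
Until a release ships, the best you can do at the call site is fence off the encoder so a bad input can't take the whole request down with it. A minimal defensive sketch, with Soundex standing in for whichever phonetic encoder trips the bug (the wrapper class and method names are mine, not part of the library):

```java
// Defensive wrapper around a Commons Codec phonetic encoder: swallow the
// runtime ArrayIndexOutOfBoundsException (and the checked EncoderException
// the interface declares) instead of letting it propagate.
import org.apache.commons.codec.EncoderException;
import org.apache.commons.codec.StringEncoder;
import org.apache.commons.codec.language.Soundex;

public class SafePhoneticEncoder {

    private final StringEncoder delegate = new Soundex();

    /** Returns the encoded form, or null if the underlying codec fails. */
    public String encodeOrNull(String input) {
        if (input == null) {
            return null;
        }
        try {
            return delegate.encode(input);
        } catch (EncoderException e) {
            return null;
        } catch (RuntimeException e) {
            // Covers the ArrayIndexOutOfBoundsException described above.
            return null;
        }
    }
}
```

It works, but papering over a library bug at every call site is exactly the kind of busywork a point release would make unnecessary.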

And some bugs are bad precisely because the maintainers refuse to fix them, labeling them as features instead. For example, does Spring's Hibernate DAO framework actually begin a transaction when you call... say... beginTransaction()? Nope, beginTransaction is a do-nothing operation. Wow, that makes things easy to troubleshoot and fix.
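
The workaround everyone ends up at is to stop asking the DAO layer for a transaction at all and let Spring's own transaction machinery drive it. A minimal sketch using TransactionTemplate, assuming a PlatformTransactionManager (e.g. HibernateTransactionManager) is already wired up in the context; WidgetService, WidgetDao and rename() are hypothetical placeholders for your own classes:

```java
// Programmatic transaction demarcation through Spring's transaction manager,
// rather than relying on the DAO/session proxy's beginTransaction().
import org.springframework.transaction.PlatformTransactionManager;
import org.springframework.transaction.TransactionStatus;
import org.springframework.transaction.support.TransactionCallbackWithoutResult;
import org.springframework.transaction.support.TransactionTemplate;

public class WidgetService {

    /** Hypothetical DAO interface, defined here only so the sketch compiles. */
    public interface WidgetDao {
        void rename(long id, String newName);
    }

    private final TransactionTemplate txTemplate;
    private final WidgetDao widgetDao;

    public WidgetService(PlatformTransactionManager txManager, WidgetDao widgetDao) {
        this.txTemplate = new TransactionTemplate(txManager);
        this.widgetDao = widgetDao;
    }

    public void renameWidget(final long id, final String newName) {
        txTemplate.execute(new TransactionCallbackWithoutResult() {
            @Override
            protected void doInTransactionWithoutResult(TransactionStatus status) {
                // This block runs inside a real transaction that Spring
                // begins, then commits or rolls back on an exception.
                widgetDao.rename(id, newName);
            }
        });
    }
}
```

Declarative @Transactional gets you the same thing with less ceremony, but either way the transaction boundary lives in Spring's transaction manager, not in the DAO call you thought was starting one.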

Okay, so far I've described problems that all have ready workarounds. That's the only saving grace in these instances - the projects are open source, so fixes can be applied and binaries re-built. But do you really want patched, out-of-band libraries going into your production system? And what about when you hit the really big problems mere days before the "big release," like finding a fatal, obvious and unfixed bug in your JMS broker? It's been crunch time for two weeks, you're already sleep deprived, your code is absolutely going out in two days... are you going to make a gentle post on the dev list after unit testing a thoroughly researched patch for an obvious bug the maintainers missed? No. You're going to punch the laptop.

Basically I've succumbed to the entropy and decay of all the frameworks I used to depend on. Hibernate Core has over 1500 bugs that have yet to be triaged or assigned a release, and it doesn't even appear to be actively maintained anymore. Commons Codec hasn't seen a release since July of 2004... kids born around its last release are headed towards elementary school. And the instability of ActiveMQ 5.1 continues to plague its 5.2 release.

The standard reaction to this kind of rant is "if you don't like it, why don't you submit patches?" "Why don't you join the project and help out?" "Stop complaining and contribute!" Yet contributions have been made, entire bugs were fixed by others MONTHS ago, and their addition to the project has netted nothing. What hope is there for a sleep-deprived guy like myself to contribute before his project goes down in flames and the powers that be bail on these frameworks for the rest of their collective careers?

Tuesday, February 10, 2009

Retelling Yet Another Myth

Last month I put together my second MythTV box. I had become tired of my VIA Chrome box... video acceleration was a joke. So I thought an onboard GeForce 7050PV would work, since it had XvMC support. Plus NVIDIA's (binary-only) drivers were usually pretty good. Right?



I ordered Shuttle's SN68PTG5 AM2 case, figuring I could re-use an old AMD64 CPU I had and stuff in the hard drive from the previous MythTV box. When I got the case in, it was a bit larger... and louder... than I expected. I needed to push my TV forward to get the box to fit behind it. The sheer amount of real estate I had to work with made it worthwhile, however - there was plenty of space around the motherboard, even with the ginormous heatsink installed.

Maneuvering around the box wasn't bad - I was able to put in the old drive and fill both memory channels with two Corsair XMS2 512M PC2 6400 sticks without a single flesh wound. Putting in the CPU was another issue, however - I was short a pin.



Yeah... my old, to-be-recycled CPU was for Socket 939. This was an AM2 socket board - meaning it took 940-pin CPUs. Gah.

I paused for a few days and ordered a 2.1GHz AMD Athlon X2. It was a 65nm Brisbane core and only consumed 45W, so the lower thermals would serve a Myth box fairly well. Plus AMD chips are cheaper than avocados right now... it was an easy sell.



I wrangled the case together, put on some new thermal paste, and re-attached Shuttle's massive (did I mention it's massive?) heat pipe for the CPU. I installed my old pcHDTV tuner card to receive HDTV channels from my local cable provider, then sealed everything up. I attached the barebones StreamZap IR receiver via USB, then did an openSUSE install with the MythTV repositories.

lircd actually worked without a hitch... the StreamZap didn't have any issues. Neither did the pcHDTV card - it worked out of the box as well. Things were going well, so I shoved the machine behind the TV, connected it via HDMI and went on my way.



At first things appeared to work well. Thanks to NVIDIA's nvidia-settings application I was able to effortlessly set up my 1080p LCD to work over HDMI. X configuration was TONS easier than with VIA's chipset - there it took me 20-30 hours to get the monitor configuration correct for SVGA out. Of course it doesn't hurt that HDMI video is essentially DVI in a different form factor... but I digress.

No, the big pain was XvMC. With acceleration enabled, video would play back (either live or recorded) for anywhere from ten seconds to five minutes, then send X into a complete CPU spin. I'd suddenly have 100% usage on a single core and a complete lockup of X11. X had to be SIGKILL'd - repeatedly - before things would stop spinning out of control.

Luckily I had bought a dual-core CPU, so I could easily regain control. No matter how hard I tried, however, video acceleration just wouldn't do anything but hard-lock MythTV.

Finally I gave up and just told MythTV to use the CPU for decoding. This worked acceptably, even with 720p streams. The CPU was beefy enough - it's a dual-core, dual-channel rig with bandwidth to spare. It's just a shame that neither NVIDIA nor VIA could provide a chipset that allows accelerated MPEG2 playback in Linux. C'mon - is it really that bad?

Now things are recording and playing back just fine. Vsync is disabled right now - so I do get image tearing during high-motion scenes. But hopefully NVIDIA's next round of Linux drivers will fix XvMC, give me accelerated video and take all my worries away.