I remember back in the days when I had free time, I decided to shop around for an RPG. It was a toss-up between Baldur's Gate and Morrowind. I heard about the open-endedness (is that even a freakin' word?) of Morrowind, so it won the day. I remember when I first popped it in and my GeForce3 started spewing out frames.
I think for the first week solid all I did was take screenshots. I wasn't so much a player character as I was a tourist.
Now, believe it or not, I'm sitting here doing production code moves and listening to the Morrowind soundtrack in VLC Player. Even if I can't play I'm still hideously hooked. That's how freaking engrossing Bethesda makes things... they put your cortex right in the world they created and never let it go.
That's why I fear Oblivion. I may lock myself into the den and never reappear.
Friday, March 31, 2006
Wednesday, March 29, 2006
Too Many Ideas
I just got back from TheServerSide's Java Symposium. A great conference... it was definitely interesting seeing the faces behind the big names in Java. Now I'm overflowing with ideas for both enterprise Web apps and ConsultComm, but I'll be damned if there's any time to see them happen.
Epic is now getting press for the Unreal 3 engine, which they're saying will be more of a platform for studios to launch titles with. An interesting point in the engine and tools' feature set is dynamically generated landscapes. Considering how insanely vast levels are expected to be nowadays, dynamic terrain generation is going to have to become the norm. Next, dynamic content generation (augmented by tons of user-generated content) is going to have to become standard if you want your development team to be slightly smaller than, say, Texas.
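Out of curiosity, the core trick behind that kind of terrain generation fits in a few lines. This is a toy 1-D midpoint-displacement fractal in Python — my own sketch of the general idea, not anything out of Unreal's actual toolchain:

```python
import random

def midpoint_displacement(left, right, depth, roughness=1.0, seed=None):
    """Generate a 1-D heightmap by repeatedly splitting each segment at
    its midpoint and jittering the new point. A toy illustration of
    procedural terrain, not any engine's real technique."""
    rng = random.Random(seed)
    heights = [left, right]
    spread = roughness
    for _ in range(depth):
        new = []
        for a, b in zip(heights, heights[1:]):
            mid = (a + b) / 2 + rng.uniform(-spread, spread)
            new.extend([a, mid])
        new.append(heights[-1])
        heights = new
        spread /= 2  # halve the jitter each pass so the terrain stays coherent
    return heights

terrain = midpoint_displacement(0.0, 0.0, depth=8, seed=42)
print(len(terrain))  # 2**8 + 1 = 257 samples
```

The point is that a couple of numbers (seed, depth, roughness) stand in for megabytes of hand-authored heightmap, which is exactly why it scales where artist head-count doesn't.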
Speaking of user-generated content and severe addictions, Linden Lab has secured additional funding for Second Life. And, like a true cookie pusher, Icculus brought the Second Life Linux client up to date. This native client may just hit beta after all!
Oh yeah, and I've become so... mentally absent due to lack of sleep that I kinda reformatted my entire hard drive, wiping out my partition table along with it. See... ummm... my OGG Vorbis player used to be on /dev/sda. Soooo... if I wanted to reformat the player, I'd just need to reformat /dev/sda. But... well... then I got a SATA drive. I can tell Linux hackers are already rolling their eyes at what's coming next... and they're right. Since my new drive landed on the SCSI device block, my player moved to /dev/sdb. My hard drive is now /dev/sda. So, not paying any attention at all, I tried to reformat what I thought was my OGG player. But... ummm... nope. It wasn't.
So now I just reinstalled XP, have so far rebooted over ten (not kidding) times, and am trying to restore the backups I was luckily able to make. I'm kinda cheesed that openSuSE 10.1 isn't slated to be released until April 13th - meaning that it's pointless to install an older version now, but I also can't wait another two freakin' weeks. I'll have to make do with my Wintendo in the meantime. Grrrr.
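For anyone wanting to avoid my fate: the kernel will happily tell you what it calls each disk before you aim mkfs at anything. A minimal, Linux-only sketch (the function name is mine):

```python
def list_block_devices(path="/proc/partitions"):
    """Return the block device names the kernel currently knows about,
    parsed from /proc/partitions. On my box this would have shown both
    sda and sdb, making the mixup obvious before any mkfs command."""
    devices = []
    try:
        with open(path) as f:
            for line in f.readlines()[2:]:  # skip the two header lines
                fields = line.split()
                if fields:
                    devices.append(fields[-1])
    except OSError:
        pass  # non-Linux systems won't have /proc/partitions
    return devices

print(list_block_devices())
```

Thirty seconds of that beats a week of restoring backups.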
Monday, March 20, 2006
GPU - The Next FPU?
What was the difference between a 486/SX and a 486/DX2? The added floating-point unit. Hellz yeah.
In the budding days of personal computers and 486 goodness, CPUs were strictly integer machines. Floating-point math was simply too much for the little chicklets that didn't even need active cooling. I recall that back in the olden days of ISA cards and SIMMs you could purchase separate FPUs to hammer into your brute-force socket... this granted you massive power that your Excel spreadsheets never previously dreamed of.
I recall how up in arms everyone was that Quake actually demanded a freakin' FPU. Who'd id think we were, Rockefeller?
Now that graphics cards have become cheap, plentiful and pretty freakin' powerful, vector processing has become all the rage. Projects like BrookGPU and GPGPU have made it (relatively) easy to create userspace apps that run on the GPU just as they would on a CPU. This means your GeForce 7800 could crunch numbers on certain vectorizable tasks while it sits idle between Second Life sessions.
Of course, GPUs are only good for vector processing, which means only certain types of algorithms are really suited to them. You don't need to look much further than critiques of IBM's Cell architecture to see how people feel about a rash of vector processing units versus a CPU that can branch effectively.
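The "stream" model those projects use boils down to mapping one branch-free kernel over a whole data set. A hypothetical sketch in Python (BrookGPU's real API is a C dialect; this just shows the shape of the idea, with names of my own invention):

```python
def stream_map(kernel, stream):
    """Apply the same kernel to every element of a stream -- the
    GPU-friendly pattern. No element depends on its neighbours, so the
    hardware is free to process them all in parallel."""
    return [kernel(x) for x in stream]

# A 'shader-like' kernel: a pure function with no data-dependent branches
saxpy = lambda x: 2.0 * x + 1.0
print(stream_map(saxpy, [0.0, 1.0, 2.0]))  # [1.0, 3.0, 5.0]
```

The moment your kernel needs an `if` whose outcome differs per element, or one result feeds the next, the parallelism evaporates — which is exactly the critique leveled at heavily vector-biased designs like Cell.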
It was interesting to see that Nvidia and Havok have teamed up to offload physics to the GPU. Bear in mind this doesn't work for actor physics, since that code branches and can't be effectively streamed or predicted, but it does extremely well with cloth and particle physics.
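Particle physics is the poster child here because every particle gets exactly the same math, with no branching between them. A rough sketch of one semi-implicit Euler timestep (function and parameter names are mine, not Havok's API):

```python
def step_particles(positions, velocities, dt, gravity=-9.8):
    """Advance every particle one timestep. Each particle is updated
    independently with identical arithmetic -- precisely the
    data-parallel workload a GPU pipeline eats for breakfast."""
    # Semi-implicit Euler: update velocity first, then position
    new_vel = [(vx, vy + gravity * dt) for vx, vy in velocities]
    new_pos = [(x + vx * dt, y + vy * dt)
               for (x, y), (vx, vy) in zip(positions, new_vel)]
    return new_pos, new_vel

pos, vel = step_particles([(0.0, 0.0)], [(1.0, 0.0)], dt=1.0, gravity=-10.0)
print(pos, vel)  # [(1.0, -10.0)] [(1.0, -10.0)]
```

Swap the lists for a few hundred thousand particles and the same two comprehensions map straight onto GPU pipes; actor AI, with its tangle of conditionals, never will.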
Nvidia could be realizing that Ageia's physics accelerator is more than vaporware, and is stepping up its offering to get into that product space. However, from the specs it appears that only Ageia's PhysX (*groan*) will be able to handle physics asynchronously. Although, I guess if you run physics calculations like shaders you could send multiple calculations down each pipe on the graphics card itself.
Should be interesting to see how the space pans out. The GPU is quite a hoss nowadays, and needs extremely fast access to memory, so it may not be incorporated as a side-by-side processor or on-die any time soon. But still... one has to wonder if multiple fast vector units are the next big thing to go on a proc.
Saturday, March 11, 2006
Userland, Not Wonderland
So... I've been engrossed in Second Life now. Before I could never justify trying it out. Then they dropped the price to "free." Then they provided a Linux client. I buckled. So sue me.
On the forums for the Linux client things devolved, as they always do for Linux gaming, into the extremely stupid "all packages are incompatible and unlinkable" debate. The crux of the issue is that people claim libraries are always changing in Linux distros, so that you can never dynamically link to libraries. People argue that this makes it impossible to offer a single, standard installation or provide supportable, sustainable products.
Of course, this hasn't stopped Epic, id or Introversion. Not to mention the zillions of enterprise applications that are written for Linux. The ones that, if they break, cost companies downtime and lost revenue. The moving libraries argument is shallow at best, especially in light of the many successes people have had with deploying commercial software packages to Linux distros.
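The runtime-linking lookup those forum arguments revolve around is easy to poke at from Python: `ctypes.util.find_library` does the same library-name resolution the dynamic loader does, and a `None` result is exactly the "moving libraries" failure mode being complained about. A Linux-centric sketch:

```python
import ctypes.util

# find_library performs the same name lookup the dynamic loader uses
# (consulting ldconfig's cache on Linux). A hit means the app would
# resolve the library at runtime; None is the dreaded "missing shared
# library" case the forum posters blame on shifting distros.
for name in ("c", "m", "definitely_not_a_real_lib"):
    print(name, "->", ctypes.util.find_library(name))
```

Which is also why the commercial ports that work simply bundle their dependencies or link them statically, rather than betting on every distro shipping the same versions.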
Now, if they wanted to complain about the lack of unified package management or still-immature video drivers, that's something different. While Linux at least has package management, most vendors put it aside and instead just ship a GUI installer (such as Loki's).
While we're on the subject of drivers, I noticed that Vista will be running most drivers in userland, presumably so they can be loaded and unloaded easily without tinkering with the kernel. I hope this is something implemented by design and not forced by some abstraction layer. While running drivers in userland seemed like a great idea in the early days of Windows NT, hiding hardware behind its hardware abstraction layer put some serious restrictions on what drivers could do. I recall that DirectX was a real headache for Microsoft to add to NT, mainly because of the barriers the HAL put in front of direct hardware access.
I actually think Linux handles this fairly well. While it's a monolithic kernel (I'm not going to get into the whole Tanenbaum v. Linus debate), allowing modules to be loaded and unloaded with dependency tracking seems to work very well. While you can still get a nice kernel panic from an errant driver, for the most part modules behave fairly well. Well enough that I recall an interview with Linus (I can't find the article) where he thought developers needed to be less afraid of running in kernel mode - too many were retreating to userland to invoke code.
Then again, what the hell do I know. Most of my CS knowledge is leaking out of my head. Microsoft may have it right and I have it wrong. I always cringe when I hear "drivers in user space," but if you constantly get bad press for someone else's crappy drivers crashing your OS I guess there's no good alternative.
Thursday, March 09, 2006
Create Your Own Content... 'Cause It Takes Us Too Long Otherwise
Kotaku had an interesting article about how user-created content has started to supplant studio-created content. The gist is that creating content by artists and developers takes waaaay too long... and studios are starting to come up with creative ways to get around the issue. Procedurally generated content is one solution - but another approach is to make content generation part of the game.
Second Life is the gold standard of this approach. It's not a game per se - it's a world where users can create whatever they want and make their world whatever they wish. The Sims did this somewhat, although the content is largely generated from the mod community rather than making it an in-game mechanic.
Spore is supposed to really take advantage of both user-generated and procedurally generated content: you create your own... whatevers... that evolve into something on both micro and macro scales.
Now that Linden Lab has at long last released an alpha build of their Linux client... it looks like I'm going to have to check out this trend first-hand.
There goes the last of my productivity...
Wednesday, March 01, 2006
Just One Small Tweak...
Blech. So I was working on finishing my ConsultComm changes in January, then had to pass out for a LOOOOOOOOOOOOONG while. Now I try to pick them back up and - presto - I'm starting to get into scope creep.
I'm just trying to fix a few final bugs, but then blammo NetBeans 5.0 comes out, along with a new Swing layout that actually makes sense. So I upgrade and start tweaking form layouts. Then blammo, SourceForge launches its subversion service, and I feel compelled to migrate. Then blammo, someone I know gets an OS X machine, so I can finally test OS X Java packages.
Sooo... now I went from fixing two bugs to overhauling the project. Again. Basically every major revision of ConsultComm has come from me putting the project off to the side, someone giving a really good suggestion, picking up development again just to do the one last tweak, then ending up rewriting the entire ConsultComm codebase.
I haven't been coding in quite a long time, so hopefully I can get back into the groove. I'm firing up my Groove Salad.