Thursday, May 22, 2008

Google Sites = The Reason for a Google Account

The whole reason I wanted to start developing on Google App Engine was to build a repository of code samples, how-to's, documentation, projects, blog posts, all that stuff. Something akin to Confluence. It turns out Google already did it for me.

Google just recently launched Google Sites, which is essentially a content management system for whatever you like. It's exactly what I needed - and I'm planning on moving my docs, code snippets and projects over soon.

Saturday, May 10, 2008

Wow... Actual Support! They Know We Exist!

Wanted to buy an album I had just discovered... but didn't want to haul myself to the neighborhood Best Buy. So I wanted to see if the band had an online purchase method. They didn't... but they sold through Amazon. I knew Amazon sold DRM-less MP3's, so I decided to check it out. For whole album downloads you are required to use their lil' download app, so imagine my aghast expression when I saw:

Sweet. Works only on 32-bit Linux (can work on 64 but has problems with library dependencies), but otherwise purchasing was swell. No issues.

Thanks Amazon for realizing Linux users buy music, too!

Tuesday, May 06, 2008

Spent

A while back I received this e-mail, completely out of the blue:
Wise men say, you only have to resign yourself to what you cannot improve

I'm not sure why I got the e-mail. Or who sent it. No real idea. Could well be spam. But it kinda stuck with me for some odd reason.

I've been working 80-hour weeks lately, as promised. That means I've had to give up working on my open-source projects. I feel real pangs of guilt when people ask for bugfixes or when the next release will come out... especially when users as kind as tomasio even volunteer their own time for icon assets. But I've cut sleep down to a few wee hours and just have nothin' left.

I'm hoping I can get everything going at work, get things on a stable foundation, then give myself free time once again. At least, that's the lie I tell myself.

Tuesday, April 15, 2008

Two Great Tastes - maemo & Qt

Fresh off the wire, it has been announced that Nokia will introduce Qt, my favorite C++ toolkit, to the maemo platform, my favorite portable hardware platform. Two great tastes that go great together.

If this kind of platform expansion and cooperation with Qt developers (such as KDE authors) is what will come of the Nokia acquisition of Trolltech, it may not be as bad as I predicted. Here's to hoping that Nokia sees Qt as a toolkit that will serve mobile, embedded and desktop platforms. Especially with the recent fame of low-cost low-footprint laptops, Qt and maemo have to be getting some additional attention.

If the WiMAX-enabled Nokia N810 becomes more powerful and popular, the addition of Qt and simplified cross-compiling could provide a huge increase in third-party applications hosted on an already open mobile device.

Monday, April 14, 2008

Java 6.9

Someone just directed my attention to the Java 6 update 10 intro on Sun's site. What the living...?

First off, this isn't a minor update. This is taking a backhoe to the foundations of Java, hitting a water line, but digging a basement anyway. Why this wasn't released in 7 I don't know... I guess it's because the update is largely centered around deployment of Java as a platform and not adding any functionality to the underlying API. But damn, it's an overhaul.

First, Java is now chopped neatly into libraries, so you only download what you need. That means Java installations can be one-third of what they were in the previous release. Java can now be downloaded and installed more efficiently as well, thanks to some much easier-to-use JavaScript and HTML-fu.

Konqueror has done this for a while now, but applets will now execute within a full JVM instead of a half-baked nsplugin. This allows for more robust applets and, from my experience with Konqueror, plugins that are more crash-resistant.

Finally, the fairly... blech... look of Java has at long last been completely overhauled with Nimbus. Previously I've had to use javootoo's Look and Feel libraries to make things look remotely presentable. Now Nimbus should be able to fill that gap nicely by adding more modern window decorations and UI components.

These were all desperately needed improvements to have Java make inroads into the desktop space. Let's hope it isn't too late.

Saturday, April 05, 2008

Independent Horticulture

Another great invention by the creators of Penny Arcade: Greenhouse.

Steam has done a great job making independent and smaller titles much more apparent to the populace, and since titles don't have to compete for shelf space a genre for every palate can be made readily available. And while CodeWeavers has done their best to allow Steam & Source titles to run on Linux & OS X, it can't be said that Steam is a cross-platform solution.

Not so with Greenhouse. It offers native support for OS X, Windows and Linux in tandem. And their inaugural title will be cross-platform. And if they continue to support independent and episodic titles, this could be a bigger competitor to Steam than GameTap.

Friday, April 04, 2008

Introversion's Procedural Art

The huge amount of effort required for content creation was a hot topic a couple of years ago, as many people saw the enormous cadre of artists and animators making AAA titles and realized no garage developers could hope to reach that type of scale. The fear at the time was that this would mean the end of indie development.

Of course after Peggle, Portal and Crayon Physics hit the mainstream it suddenly became apparent bigger doesn't equal better. Or more sales.

I've always loved the approach Introversion has taken with development. They're truly dedicated garage developers, spending more time trying to perfect a fractal tree than they really should. But I can respect spending an inordinate amount of time trying to wrap one's head around a concept like procedurally generated landscapes.

When I heard that Gamasutra was hosting an event with Chris Delay speaking on the topic of procedurally generated content, I definitely wanted to jump on the opportunity. While they had a fairly unrehearsed HP shill asking the questions, Chris had some great points.

Chris emphasized that the main reason his titles have procedurally generated textures and meshes is that artistic content is just not a space he feels Introversion can compete in, since other companies have mastered that area. He saw it as neither a positive nor a negative thing; it's just the case for Introversion. Should artists be afraid? Chris doesn't think so. Procedural content cannot replace people, since it ultimately can't produce those unique items that make an environment distinct. While you can generate the landform that the world consists of (mountains, hills, streets, clouds, etc.), it cannot add fine-grained details to the world.

You automagically gain several efficiencies with procedural content:
  • You don't have to re-draw or re-generate a scene if you need to modify level of detail
  • You end up with a large amount of content and detail that artists can't get (you can delve as deep as you want into a fractal)
  • If you do procedural animation, you can have adaptive animations that exist as a consequence to a number of actions

While I'd considered using fractal algorithms for landscape generation or building trees, I hadn't thought much about procedural animation. Of course Spore uses it for their character builder, but introducing this as a new way of rigging meshes would again immensely help developers. Not needing an entire team of dedicated animators or texture artists would make things much more palatable.

There are tradeoffs of course, and Chris repeatedly mentioned that procedurally generated content requires a different way of thinking about memory management. Rather than loading assets off disk, you generate them in memory at runtime - so you don't worry about texture compression, but you do now have to worry about the LOD given to your algorithm and how much memory the resulting data structure will reside within. You can't let your procedure go willy-nilly and create too many verts.
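The memory tradeoff is easy to see even in a toy sketch (class and parameter names are mine, nothing to do with Introversion's actual code): one-dimensional midpoint displacement conjures a fractal ridge line out of nothing but a seed, with the recursion depth acting as the LOD dial that bounds how many "verts" you end up holding in memory.

```java
import java.util.Random;

// Toy illustration of procedurally generated content: 1-D midpoint
// displacement builds a fractal "mountain ridge" from a seed. The asset
// is generated in memory at runtime; `levels` is the LOD knob that
// bounds how much memory the result occupies.
public class MidpointDisplacement {

    public static double[] ridge(int levels, double roughness, long seed) {
        int n = (1 << levels) + 1;        // 2^levels segments, plus endpoints
        double[] height = new double[n];
        Random rng = new Random(seed);    // same seed -> identical landscape
        double amplitude = 1.0;

        // Displace the midpoint of each segment, halving the segment size
        // and shrinking the random displacement at every level.
        for (int step = n - 1; step > 1; step /= 2) {
            for (int i = step / 2; i < n; i += step) {
                double mid = (height[i - step / 2] + height[i + step / 2]) / 2.0;
                height[i] = mid + (rng.nextDouble() * 2.0 - 1.0) * amplitude;
            }
            amplitude *= roughness;
        }
        return height;
    }

    public static void main(String[] args) {
        double[] r = ridge(4, 0.5, 42L);
        System.out.println(r.length);     // 17 sample points
    }
}
```

Crank `levels` up and you delve deeper into the fractal for free; crank it down and the same "asset" fits in a fraction of the memory.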

Introversion's latest undertaking, Subversion, sounds interesting. Right now Chris describes it as more of a thought experiment, so who knows if we'll actually see it. But what he's pursuing is procedurally generating cities from a 10 kilometer view all the way down to pens and desks inside a building. Not only does this employ a landscape generator for hills and mountains, but it also procedurally generates streets and buildings based on markets and traffic demand. Each procedural algorithm feeds its brothers, affecting their ultimate output. For example, more traffic makes more roads, which can make bigger buildings.

One difficulty Chris found with this approach was that it was often hard to catch bad results - sometimes you would have cities being built entirely on one side of an area, with the other side completely blank. Or sometimes a fire escape would open into nothingness on the 30th floor. It's all a matter of finding a way to re-seed or compensate when these failures occur. Or maybe it just makes the whole concept quaint.
Sunday, March 30, 2008

Intel Not Killing VPU After All

Looks like Intel isn't killing the VPU after all, but instead birthing it. Larrabee, their GPU/HPC processor, is supposedly an add-in proc slated for 2009/2010. Although I'm going to go out on a limb and say it will probably become part-and-parcel of their mainline CPU and, instead of being a discrete co-processor, will quickly be absorbed as additional cores of their consumer processor line. But I digress.

Additional information about Larrabee continues to trickle out, but it definitely seems to introduce vector processing instruction sets to be used for general computing, not just as a GPU.

Even if this comes out as a daughterboard or discrete chipset, it should be a compelling reason to pick up a good assembly programming book and start hacking again. How long will it take (non-Intel) compilers to optimize for the vector instruction sets?

Saturday, March 08, 2008

Nifty Nokia

I'm really enjoying the n770. I'm definitely putting an n810 on my wishlist for the end of this year.

First thing I did was re-flash the device with OS 2007 Hacker Edition, an OS intended for the n800 but crammed into n770 hardware. It works rather well, with only occasional reboots, but then again I'm working with a heavily used and refurbished unit. Who knows if it's the OS or the device. Google Talk, contacts, Bluetooth, 802.11b/g, a stripped-down Mozilla engine and MPEG4 playback all work well.

I turned my lil' Nokia into a pocket translator with the Google Talk translator bot - the streamlined chat interface of OS 2007 turned the Nokia into a very handy (and quick) translation service.

Also tried to crack a test WRT54G router I have lying around using Aircrack, but I couldn't inject wifi packets using the OS 2007 wireless drivers, so I had to resort to the slower WEP cracking that needs a fair amount of seed traffic. It was still neat to browse all surrounding AP's in a full-screen xterm. With the n770's fantastic resolution, even the small typeface was definitely readable.

I've also been mowing through a number of third-party apps. There is a fantastic developer community around the device - their Sourceforge-like approach to the Maemo Garage and the extensibility of the platform has served the developer and user community well.

It took me a while to find out what type of video the n770 will natively accept. There are several good resources out there, such as Andrew Flegg's Perl script that easily transcodes video into an n770-digestible format. The wide screen and nice resolution make mobile video much more palatable. The only caveat was that newer releases of MPlayer tag video with a newer but much less understood "FMP4" codec tag, which OS2007HE doesn't understand. I had to tweak the script to pass the value "DX50" to the ffourcc option in order for the built-in media player to recognize the MPEG4 codec used. I also had to make sure encoding only happens at 15 frames per second, otherwise audio quickly gets out of sync.
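For reference, the mencoder invocation underneath ends up looking roughly like this (file names and the scale value are my own guesses - the Perl script wraps settings along these lines, so treat this as a sketch rather than the script's actual command):

```
mencoder input.avi -o output.avi \
    -ofps 15 \
    -vf scale=400:240 \
    -ovc lavc -lavcopts vcodec=mpeg4 \
    -ffourcc DX50 \
    -oac mp3lame
```

The `-ffourcc DX50` is the workaround for the FMP4 tag, and `-ofps 15` keeps the audio in sync.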

When I get a free second I'm going to try getting some OpenVPN binaries to work as well. Would be very nifty to have an SSH stack and VPN access wherever I go.

Got Flash 9 somewhat working, although sound doesn't appear to work. Not a deal breaker tho, considering I'm working on a refurbished device running an unsupported OS meant for an entirely different hardware platform.

All in all, I'm a pretty happy gopher. Not sure what that means, but I am.

Tuesday, March 04, 2008

When $300 Is More Popular than Free

For the past two years digital delivery has been supplanting shelf space, but those attached to selling physical inventory have poo-poo'ed the viability of such consumerism. But good ole' Trent may be proving once and for all that the merch sells itself.

The Reg puts it well when it says "Nine Inch Nails cracks net distribution" - their latest album has gone up for sale in several interesting ways on their site: get the first volume (nine tracks) for free. If you like it, you can buy all the volumes lossless (36 tracks), including a 40-page PDF booklet, for a measly $5. For only ten stinkin' bucks you can get the whole thing as a two-disc CD set with a printed booklet. For $75 you can get the audiophile version, digital versions, Red Book CD versions, hardcover slip case and more. Or you can pay $300 and get a super-mega-uber-limited-edition-collectors pack.

Or at least you could before all 2,500 sold out.

At a time when people keep claiming that pirated music is killing the industry and no one will pay for music anymore, it seems awfully incongruous that 2,500 units at $300 a pop sold out in barely a day.

Same thing happened back in the day when I bought a copy of Uplink. I could buy it cheaply on its own or shell out some extra bucks and get the signed "limited edition." Of course I now have a proudly signed copy of Uplink on my shelf.

It's not hard to upsell customers, even (or especially) with digital distribution. Give them schwag and they will come.

Monday, March 03, 2008

A Rite of Passage

I purchased a well-used Nokia n770 Web tablet from a friend last month and, as tradition dictates, I must christen the device by authoring an entire blog post using only said device.

It really is a sweet little device... and since it runs a Debian-derived distro I can do pretty much anything I want with it. From checking e-mail to WEP cracking, it runs the gamut.

The screen is positively beautiful. Video on this thing makes me giddy. Plus I have more connectivity options than I can shake a stick at.

I can totally understand why the n800 has the following it does now.

Wednesday, February 20, 2008

Procedurally Generated Pinkslips

Penny Arcade's recent podcast featured a rant - no... more of a reckoning - against Spore. I find Spore's idea of dynamically generated content interesting, mainly because of my bias towards one-man development teams and procedurally generated content. But Mike and Jerry don't want to see artists and writers out of a job... and the concept that zombie algorithms can build music or images is looked upon with disdain. To them, games are an artistic outlet for modelers, musicians and authors. But to developers they can seem like a growing necessity that a garage studio simply can't bankroll.

Eskil Steenberg's Love is described by Rock, Paper, Shotgun as "...lavish impressionistic artwork brought to life... in motion it was suggestive of a smokey, dynamically lit version of Okami." Dynamic terrain deformation and procedurally generated assets allow Eskil to wrap some amazing gameplay in what looks like a surreal and compelling atmosphere.

Not only does this mean that players get to glimpse into chaos, they get to play with it. And anyone who names such an ambitious effort after "For The Love Of Game Development" inspires hope in a lot of indie developers.

Tuesday, February 19, 2008

UML Hell

I've been searching for a while for a UML editor that I like. So you don't have to, I installed (or tried to install) several UML editors and took each for a spin. I needed diagramming mainly for collaboration and presentations... and my choices usually broke down into a) crappy but usable or b) pretty but unusable.

  • Poseidon for UML Community Edition is "free," but you have to register the product. I dislike typing. Didn't install it.
  • Gaphor wouldn't even install or run with my Python setup. Tried for 15 minutes, then threw in the towel.
  • Umbrello I've actually used for some time now and consider it my favorite UML editor. When it doesn't crash. Which it does. A lot. I used both the KDE 3.5 and KDE 4 versions; both enjoy the segfault.
  • UMLet stayed up, but the UI just didn't do it for me. It was more a random collection of widgets than an enforced UML diagramming tool.
  • Violet was one I really, really liked. It was simple to the point of minimalism, which I like. However, it had some serious UI bugs that made all elements change their text attribute at the same time.
  • Dia isn't a strict UML editing tool - it's more of a casual diagramming tool. It works really, really well when you want to brainstorm or braindump ideas. But I was looking for something that strictly enforced UML patterns and let me define attributes, methods, classes, sequence diagrams, etc.
  • ArgoUML, once again, is the only one that makes the cut. This is the open-source relative of Poseidon and includes a ton of functionality. ArgoUML has been under active development for years and years, and continues to be the only big player on the block. And with Java WebStart deployment it's exceptionally easy to get cross-platform installation on everyone's machine.

So ArgoUML is still the hands-down cross-platform favorite, with Dia playing a different role yet still the only other contender. At least I finally freakin' settled on one for good.
Monday, February 18, 2008

Intel Killing the VPU?

I had faintly heard grumblings of Intel loathing Nvidia, but I didn't really put two and two together until listening to Gordon Mah Ung on this week's No BS Podcast.

Good ole' Gordo spelled out why Nvidia ultimately acquired AGEIA - because Intel deep-sixed Havok's use of GPU's, which had been in development for several years. While Havok was an independent company, they worked with both ATI and Nvidia to support GPU processing of their physics API. Once Intel bought them, the interoperability was tossed in the trash. I'm sure this was a pretty big dig at the GPU makers.

So Nvidia purchases the #2 player in the market to ensure this doesn't happen again. Let's see who enjoys the #1 spot in shipping titles during Christmas of '08.

Eff the Function Lock

Dear Microsoft:

The Function Lock key was a funny joke at first, but now it is just immensely annoying. While I love nothing more than spending 15 minutes figuring out why my F2 key stopped working, I really need to move on with my life.

If you want another lock key, try using the scroll lock. I haven't used it in a hundred years. You can have it if you want.

Sunday, February 17, 2008

Make a VPU Socket Already! Get It Over With!

My lands. If the floating point unit took this long to become mainstream, I'd still be using a Core 2 Duo with a math co-processor.

Not to go all Halfhill or anything, but it has been apparent for at least the past two or three years that a vector co-processor, or an on-die VPU, is an immediate need. Both Nvidia and AMD are bringing GPU's closer to the CPU, and AMD's multi-core platform has at once included VPU/FPU/integer math/memory controller/CISC/RISC/misc./bacon&swiss together to take many types of tasks and integrate them under one die's roof.

And now that Nvidia has wisely acquired AGEIA and their PhysX platform, it seems a general-purpose vector processing platform is getting closer. A standalone PhysX card never took off in the consumer marketplace, and purchasing another Radeon or GeForce just for physics processing (as both AMD and Nvidia were touting at electronics expos) never caught on either. But a generalized, open-platform physics API that takes advantage of unused GPU cycles would definitely catch on. Spin your GPU fan faster and get real-time smoke effects... sign me up.

Nvidia has been extremely forward-thinking with their Linux drivers, and I hope they continue to be trend-setters with the PhysX API. The PhysX engine and SDK were made freely available for Windows and Linux systems prior to Nvidia's acquisition, but hardware acceleration currently only works within Windows. Since Nvidia is porting PhysX to their own CUDA programming interface, it seems entirely probable that the Linux API would plug in to Nvidia's binary-only driver. And why not release the PhysX API under the GPL? They could port to CUDA (whose specification is already open, available and widely used), then reduce their future development efforts by letting a wide swath of interested engineers maintain the codebase as needed.

Widely available drivers, development kits and APIs will help drive hardware sales in an era where Vista w/ DirectX 10 adoption isn't exactly stellar. I won't invest in being able to run Crysis in DX10 at the native resolution of a 22" LCD, but I will invest to get more particle effects or more dynamic geoms. At that point you're adding to the whole gameplay proposition instead of polishing up aesthetics with continuously diminishing returns.

Saturday, February 09, 2008

Encryption Would Be Easy... If We Let It

Whenever something sensitive comes around my desk on a slip of paper, I can't help but think how much more accessible and secure the info would be if it were passed around using public key cryptography. After all, it has been seventeen years since the more than capable crypto advocate Phil Zimmermann made the case with PGP. Surely by now all e-mail clients can securely pass info back and forth using some asymmetric key algorithm, right? Right?

Well, yes... unless you're freakin' Outlook. And of course what do 9 out of 10 enterprises mandate you use? Outlook. Fantastic.

Now, I've been able to sail under the radar with Evolution, which sports both excellent WebDAV support and public key encryption. I've got the best of both worlds in Linux. However, the rest of my correspondents aren't so lucky - they need to use a Windows e-mail client that can book conference rooms and schedule appointments in Microsoft Exchange. So... stuck with some variant of Outlook.

About a year ago I went out on a quest to find an interoperable public key encryption plugin for Outlook. I tried several... and all failed. I went out looking again, and the playing field hasn't changed a bit.

First, you might notice that there were several Outlook plugins originally vying for PGP/GPG abilities, but they have largely atrophied or merged. OutlGPG became GpgOL from g10, but executable distribution was moved to Gpg4win, meaning Gpg4win became the single player. The only other option would be G DATA's GnuPG-Plugin, but aside from being over five years old it was never that great. And Gpg4win wasn't much better - it too could only do plaintext, and even then as an attachment.

All Linux and Windows mail clients with some remote sanity use MIME to encode their encrypted payload, and yet Gpg4win (from what I've been able to find) refuses to do so. At best I get an attachment which I need to decrypt separately.

Now look at Thunderbird, KMail or Evolution. All can encrypt and decrypt inline, natively, within the mail browser. And it works seamlessly without any additional windows or superfluous UI components. This isn't rocket surgery.
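Under the hood this is all just GnuPG, which those clients drive for you. The command-line equivalent is a couple of one-liners (the recipient address and file names here are hypothetical):

```
# Sign and encrypt a message to a correspondent's public key...
gpg --armor --sign --encrypt --recipient alice@example.com message.txt

# ...which they verify and decrypt with their own private key:
gpg --decrypt message.txt.asc
```

If a lone command-line tool can do it, a commercial mail client ought to manage the MIME wrapping too.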

Until someone out there makes an interoperable GPG plugin for Outlook 2003 that works with OpenPGP MIME-compatible messages, no one will adopt public key encryption.

Maybe that's the whole idea.

Sunday, February 03, 2008

Designing Head Patterns

When I saw Jason McDonald's design patterns quick reference guide, I marked out. I forwarded the link to everyone I knew who had heard of the Gang of Four and printed out a nice, one-page color version to keep. I almost freakin' framed it. I like how succinct it is... it gives you the class diagrams so your brain is instantly sparked, and just enough description that you still have to think analytically about the pattern.

A few people asked me about the original Design Patterns book, and a few others mentioned how much they liked Head First Design Patterns. I have to sadly admit - I originally wrote off the Head First series as just another "X for Dummies" clone, but now that I've read the RMI section of their Java book and the first 100 pages of their Design Patterns book, I have to admit I judged... sigh... a book by its cover.

The RMI section was actually fairly straightforward and illustrative, and even included an informative chapter on Jini. The Design Patterns book has done a good job of explaining the Decorator pattern, something that I can't do with a pencil and paper in front of someone. Both books have been good references for me to pass along to other developers.

If only there was a pattern to remove outdated cliches two paragraphs ago.

Friday, February 01, 2008

Trying to Work (Together)

Photo by Jerry Daykin, re-used with permission from the Wikimedia Project

I'm currently bashing my head against a wall. Both physically and metaphysically.

Nothing is cooler than Jini. However, unfortunately nothing is more obscure than Jini. It is something that exists ethereally as a specification, not necessarily (although occasionally) as an implementation. It's the how, not the what. It's an adverb. Or something. My brain is lightly fried.

See, you can start with Jini's reference implementation via the starter kit, which exists as a concrete example of the specification. However, the last release was at the end of 2005. This is probably due to this implementation being picked up by the Apache Incubator and turned into Apache River. Apache River does seem to be under active development, but it has yet to issue an official release and has distributed just one release candidate.

Alright... so the Sun/Apache standard implementation is in transition. Who else has Jini services ready? Rio is a java.net project that appears to be both flexible and standards-compliant, so it looks like a contender. There even appears to be moderate development traffic. However, the stable binaries don't seem to match the current on-line documentation, and I haven't been able to download the latest milestone release. It does appear to have Spring integration, however - and may yet be a contender. But until the documentation syncs up with the latest stable release, it's a bit hard to follow.

Cheiron also has a Jini-compliant implementation with Seven, and it appears to be receiving some good development traffic as well. I'm still researching how to go about an implementation, and their docs currently match up with their releases. I'm (still) trying to read their documentation and see what I need to do to get up to speed. It appears that most people discussing Jini in forums use Seven Suite as their implementation of choice, although Rio has a strong following as well due to its ease of Spring integration and nice administrative tools.

But for me this means I'm writing a helluva lot of "Hello World" and "Echo" applications, reading until my eyes bleed and trying to figure out how to get this all to work in a local development environment. Jini has been around forever... maybe I'm just having a hard time catching up with the rest of the world.

I'm hoping Spring integration makes this all a bit more straightforward for complete knobs like myself. Please oh please tell me they're adding Jini to the next milestone. Having ubiquitous services over a self-healing network is just too good. I'd love to be able to scale just by plugging in a server and walking away.

Monday, January 28, 2008

Nokia Acquires Trolltech & Qt

Trolltech, maker of my favorite development platform Qt 4, just announced they're being purchased by Nokia. Damn.

History has pretty much shown that independent, open-source shops being purchased by mega-corps largely results in a slovenly product. Novell's purchase of SuSE has resulted in a distro that kernel dumps... kernel dumps occasionally on bootup. I don't hold fantastic hopes for Qt 4, unfortunately.

It appears that others share my scepticism. A good deal of the comments on LWN - which still boasts a pretty coherent readership - seem to have the sort of timidity that stems from being burned before. The Register, which prints an overall positive article, still feels it needs to assert that Nokia has made claims of continuing Qt development.

It feels like the OSS world is shrinking and expanding at the same time. While awareness and adoption are at an all-time high, the high-profile projects are starting to be absorbed into the same machine they raged against.

Saturday, January 19, 2008

Keyboard and Mouse Synergy

I've been using Synergy for... what... three years now? It's one of those integral pieces of software that I can't do without anymore. If you have ever craved a dual-screen setup for your laptop but still want to use your desktop at the same time, Synergy is the perfect thing for you.

Synergy works like a KVM in reverse. You give it two or more machines, each with its own display. Pick the machine with the nicest keyboard and mouse - that will become the "host." You tell Synergy where the other machines' monitors are located (i.e. the laptop is to the left of the desktop's monitor) and Synergy will transmit all keyboard and mouse events to the other machines. You have basically connected your mouse and keyboard to your remote machines via TCP/IP.

For example, let's say you have a desktop at home and a laptop at work. Pretty typical setup. And you have a nice dual-monitor setup at work: your laptop's monitor on the left side of your desk, and a nice DVI monitor on the right side. When you get home, you'd like to have the same sort of setup... except you don't want to detach your desktop's monitor at home and re-attach it to your laptop every day.

Synergy will connect your laptop and your desktop together at home so one keyboard/mouse can control the contents of both screens. Copy and paste, lock screens, whatever you like. You can't drag-and-drop files from one desktop to another, mind you - they're still physically separate machines. But you can very easily browse the Web on one monitor and code on the other, all using the same keyboard.
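As a sketch of what that layout looks like in practice (the screen names are hypothetical - they need to match each machine's hostname or a declared alias), the server's synergy.conf for the setup above might be:

```
section: screens
    desktop:
    laptop:
end

section: links
    desktop:
        left = laptop
    laptop:
        right = desktop
end
```

Then you run `synergys -c synergy.conf` on the machine with the nice keyboard and `synergyc desktop` on the laptop, and the cursor hops between screens.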


The above image is translated from http://synergy2.sourceforge.net/about.html - but unlike the source image it isn't animated (thanks, blogspot). It gives you a sense of how the desktops can sit side by side while Synergy allows the mouse cursor to "hop" over to the other screen. Do take a look at the animated version on Synergy's site to get a better sense of it.

It's cross-platform, so Linux desktops and Windows desktops can still work well together. It even works with OS X. So your PowerBook can share a screen with your WinXP desktop alongside your Linux server. Nifty!

    Tuesday, January 15, 2008

    Jini in the Skull

    Now that I have my uber-huge project out the door, I'm trying to think smarter about development in general. I kept thinking that being smarter about development meant thinking bigger - so I initially tried to get more involved in the infrastructure of things. But it wasn't a good fit in my brain - you can't lose yourself in a PowerEdge like you can lose yourself in a stream of bytecode.

    This hit home when I was talking to a former C++ programmer at work. Because I'm a) lazy and b) often muddle explanations, I often tell new Java developers... whilst stammering and mealy-mouthed... that Java passes objects by reference. This quickly helps people understand(ish) why setting a value on an Object inside a method doesn't necessitate a return value. However, when I say that I'm perpetuating a distinctly wrong concept. One I should be ashamed about.

    I try to avoid the long conversation that ensues, but I should tell developers the truth. Java always passes by value, never by reference. In fact, it passes references by value. Most people look at me like a Buick is growing out of my head when I say "passes references by value." But the fact is that variables only hold primitives and references, and are always passed by value.

    Take a look at Java is Pass-By-Value, Dammit! It all boils down to this fact: C and Java always pass by value... that's why you can't do the standard "swap" function. For example, in C++ you might do:

    Object hello("hello");
    Object world("world");

    swap(hello, world);

    printf("%s ", hello);
    printf("%s", world);

    // note the ampersands - C++ lets you pass by reference
    void swap(Object& obj1, Object& obj2) {
        Object swap = obj1;
        obj1 = obj2;
        obj2 = swap;
    }

    This transposes the value in obj1 with the one in obj2. You should see "world hello" from running this craptastic pseudocode. Try this same code in Java and nothing happens - you've locally overwritten some values, but you didn't swap your references:

    String hello = "hello";
    String world = "world";

    swap(hello, world);

    System.out.println(hello);
    System.out.println(world);

    void swap(String obj1, String obj2) {
        String swap = obj1;
        obj1 = obj2;
        obj2 = swap;
    }

    Here you'll just get "hello world" out - your references remained intact, because you passed by value.
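If you genuinely need swap-like behavior in Java, the trick is to mutate something both the caller and the callee can reach - for example a shared two-element array. A quick sketch (the class and method names here are mine, purely for illustration):

```java
class SwapDemo {
    // Swaps the *contents* of the array. The array reference itself is
    // still passed by value, but caller and callee see the same array object.
    static void swap(Object[] pair) {
        Object tmp = pair[0];
        pair[0] = pair[1];
        pair[1] = tmp;
    }

    public static void main(String[] args) {
        Object[] pair = { "hello", "world" };
        swap(pair);
        System.out.println(pair[0] + " " + pair[1]); // world hello
    }
}
```

No references get reseated here - only the slots of an object both sides share change, which is exactly what pass-by-value allows.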

    This is ultimately a cleaner way to develop... so it's a major plus for both C and Java. And my New Year's resolution is to stop telling people that Java passes by reference just so I can end the conversation sooner.

    On the flip side of the complexity coin, I've been reading the Jini specification from the Jini community as well as Jini and JavaSpaces Application Development by Robert Flenner. Jini has evidently been around forever, but I've only recently become interested in it. Remember the old pre-turn-of-the-century adage that Java would be running on everything in your house, from your kitchen toaster to the fridge? Evidently around 2000 or so Jini was sold as the premier way to let your toaster auto-discover your refrigerator and... er... do... heating and cooling... stuff. Who the hell knows. The idea of automagically connected and integrated micro-device clusters communicating across a mesh network is cool, but practical consumer applications are pretty much nil. Then once EJB's started incorporating RMI, Jini came back to the forefront as an easy way to do the heavy lifting of RMI without the thick refried JavaBean layer.

    Once you get Jini up and running, it is wicked cool. Start up your Jini registrar and then poof, services get published across a network. Look for a remote service and poof you can discover it and invoke it - no need for stubs or manual marshalling. Once you get the Jini infrastructure up, you don't have to teach developers how to use it... they just implement an interface and the rest is done for them. You can have a mesh network of peer-to-peer nodes up and running within seconds, and the actual node developers don't even know they've done it.
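The developer-facing surface really is that small. A hypothetical service - the interface and class names are invented for illustration, and this sketch omits the actual export/registration plumbing - is just a Remote interface plus an implementation:

```java
import java.rmi.Remote;
import java.rmi.RemoteException;

// Hypothetical service contract: clients discover this interface through
// the Jini lookup service and invoke it as if it were a local object.
interface Thermostat extends Remote {
    double currentTemperature() throws RemoteException;
}

// The node developer just implements the interface; once the Jini
// infrastructure is up, publication and discovery are handled for them.
class ThermostatImpl implements Thermostat {
    public double currentTemperature() {
        return 21.5; // canned reading, purely for the sketch
    }
}
```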

    No crappy WSDL's. No UDDI. No thick & slow SOAP XML transport over HTTP. Bytecode and serialized objects all the way. We're not just talking a faster mode of transport, we're talking about delving down two entire network layers.

    The application for such technology is mind-boggling... which maybe is why people aren't using it as much as they could. Damn shame, too.

    Thursday, January 03, 2008

    Now Maybe We Can Talk

    Well... it's out. The press releases were finally sent out, the media embargo was lifted, and my big no-sleep-'till-prod project has finally been announced to the public. It was almost four months ago I last prepped the exit music, and hopefully I'm back for a little while now.

    Project Apricot is in full swing, the NDS finally has a widely available homebrew cart, I'm still filing bugs for openSUSE 10.3, accelerated MPEG2 playback isn't working on my Mythbox, and I finally got a copy of The Orange Box along with a copy of Orcs and Elves for the NDS. There's plenty to do.

    In my five-minutes-here-five-minutes-there I've been enjoying Carmack's mobile-turned-DS title, although it took me a while to adjust my frame of reference. Don't expect more than a turn-based DooM mod... it's a sprite-based engine that is first-person but completely turn-based. But once you adjust your expectations you realize it's a return to form of sorts. Carmack was a pencil-and-paper AD&D player, and those roots definitely run deep in this title. You can see where his experience as a DM sparked a lot of ingenuity and good design in a rather primitive (but eminently playable) title.

    Wednesday, December 05, 2007

    Left Undone

    I'm so damn tired. And I'm leaving so many things left undone.

    It's amazing to me how close Novell can get with openSUSE, and how far away they still remain. I've seen even anti-SuSE people convert from Fedora Core or Ubuntu lately, due to openSUSE's (relative) stability, out-of-the-box Compiz and working wifi drivers. But it seems they're ignoring fixes to some pretty obvious issues, even when the answer is served up to them on a silver platter.

    Case in point: the Intel 3945 wireless chipset not connecting in openSUSE 10.3. Ben gave patches, source RPMs, binary RPMs, explicit debug steps, log files... everything. In freakin' October. And it's still not released via Novell's update service. It's a big, obvious issue... and it's getting no attention.

    There are so many things I need to take care of, too. I have an entire graveyard of half-completed electronic assemblies that need to be pieced together into some working mechanism. I need to research how I can get XvMC to freakin' work on my VIA EPIA box. Evidently I decided to pick the one "pro" chipset that works with virtually no distributions out there, due to the complete insanity behind driver support. No accelerated MPEG2 for me until then.

    DeskBlocks and ConsultComm are both abandonware right now, too. I really want to devote some time to them... especially DeskBlocks... but there's absolutely no freakin' time. I can't get over feeling extremely guilty over leaving a half-eaten OSS project sitting out on SourceForge... bitrotting with the rest of the half-eaten projects out there.

    Wednesday, November 21, 2007

    Games 'n' Music Finally Viable?

    It appears that the widely available Games N Music homebrew cartridge has finally been hacked - the filesystem can now be written to after all. It now looks to be a possible development platform...

    Dammit. Wish I had time to mess with it.

    Sunday, October 07, 2007

    SuSE Ten Point Ugh

    It definitely seems like Novell's openSUSE distribution has taken a page from RedHat's Fedora Core. RedHat took their mainline distribution and split it into two - an Enterprise product, and a "community" product. Ultimately their community product had a lot of problems with communication and collaboration, although it seems they've leveled off since. The "Core" distro was really a way to publicly test and iron out bugs for their enterprise product.

    10.3 definitely feels this way. Although they have an open Bugzilla, there are lots of bugs that should have been caught. Eclipse doesn't even start, Intel wireless cards have problems with NetworkManager and autoplug, and Xgl has crashes and other weird problems. Novell is even looking to awkwardly cram all of KDE into /usr, like RedHat infamously did. Mosfet blew a mosfet about this layout, and rightfully so.

    I had a similar volume of really, really annoying bugs in 10.2, although most were eventually remedied after several weeks. And I even grew to like some of the features that annoyed me at first. But the first few months of a release are choppy at best.

    So I'll hold off on any big judgement. The problem with Linux desktop releases is that everything is really close to almost working... so close that it's maddening. I think that's why people fly off the handle when a new release comes out: you're so close to having a perfect setup that it drives you up the wall when a few nagging bugs completely screw it up.

    Saturday, September 15, 2007

    Prep the Exit Music

    Started a new job this week. The motivational speech from the .com's owner was "if you're not working 80 hours here, let me know. I'll write you a letter of recommendation and get you out of here."

    So, alas, I'm going to fade away into work. I'll let the two most nerdcore l33t play the music as I sign off into the sunset.

    You know I never post YouTube videos on this blog, right? Right? So trust me when I say this is deserving of breaking my streak.

    Until later...

    Monday, September 03, 2007

    Crystal Chair

    I remember, still quite distinctly, one particular moment in the December of 2002. My job wasn't what it once was, and I was starting to contribute more to some random open source projects. I wanted to see what game engines "looked like," so I downloaded the developer documentation for CrystalSpace on my (now destroyed) iPAQ. It was December however - and of course I had to do a blitz of Christmas shopping.

    While walking all over the mall's green turf, I tried to read the CrystalSpace HTML docs. Finally I took a break, crashed in a lounge chair inside a Von Maur and pored over half the pages I had sync'd. It was in that chair that I was first turned on to CrystalSpace's automagic "smart pointers." When things finally clicked about factory design patterns. When I finally saw how platform-independent file mounting could work. Things largely fit into place for the first time in my head, and a rush of Computer Science courses that were starting to leak out of my brain finally kicked back into place. It all weirdly made sense.

    I still think of that December of 2002 whenever I pass by that chair. Last year, before I aged another decade, I gave myself a crap-or-get-off-the-pot objective: to finally have a working game out by February of 2007, or otherwise just give up the ghost. While I did some extensive hacking, spent a lot of late nights and tried to make it happen, in the end I just wasn't able to produce the goods. I have a partially completed project out there, but I couldn't finish it. I finally gave up the daydream.

    I passed by the chair yesterday and thought about how fall was coming soon. Christmas shopping season. And how I'm not going to be perched in that chair poring over docs anymore.

    Sitting Upright with Tux

    I saw the weirdest thing in a local putt-putt establishment today. Sitting front-and-center was an upright arcade version of an old favorite, Tux Racer.

    Everyone in my family, young and old alike, has played a more updated version of Tux Racer on my Linux box at home. PlanetPenguin Racer was the more "finished product," granting desperately needed functionality and map layouts to its elder version. I'm used to it being an open source game on every Linux distro that even two year olds love to play.

    Evidently Roxor Games turned this into an upright. It was so weird seeing the familiar Tux splash screen behind a quarter slot... I had to do a double and triple take. And then take a crappy cell phone photo. But it was fantastic to see it occupying floor space. It's like the smell of roast beef and mashed potatoes when you walk through the front door. Not the front door of an arcade... the front door of... crap, forget it.

    It was just cool to see.

    Thursday, August 30, 2007

    Best Intentions

    I seem to have hit an interesting cycle of professional life. Hire, work 70 hour weeks, burnout. Hire, 70 hour weeks, burnout. My independent development fits into the cycle. When I did work for Planeshift I was in the burnout phase - but then I ran out of time to hack when I started to look for new jobs in the "quit" phase. I lost my Planeshift development responsibilities and instead started flailing about in my new (paid) job.

    Recently I went through another burnout period, which yielded the beginnings of ConsultComm version 4 and DeskBlocks. But before I was even finished I started to interview for new jobs, quit my old one and moved on to the flurry of new corporate projects.

    I'm starting the new job soon, and I'm sure that means my open-source development will once again need to go on hold for a bit. I'm hoping to finish my latest round of ConsultComm bugfixes first, so I can at least leave with a clean conscience.

    Paul Graham pretty much explains the above circumstance in his article “Holding a Program in One’s Head”, which pretty accurately reflects the difficulties of hacking. Code often exists like a house of cards in your head, and one weak wind can send it all into collapsination. It's really bad in the office, where random walk-bys and phone calls are pretty much the modus operandi.

    At least I'm starting over again. And a fresh start means I might just make it happen this time, right? Right?

    Thursday, July 26, 2007

    We Suck Less Than Our Competition!

    My head is a cloud from the latest plague that I caught, so I'll make this disjointed and brief.

    Everyone (myself included) has been caught up in the Linux for the desktop craze. I do use Linux as my primary desktop 95% of the day, and the other 5% is either a) scanning b) iTunes or c) gaming.

    However, there are several things that I have grown... calloused... towards. Audio can be wonky at times, with devices being exclusively reserved. Things will become completely unresponsive during high I/O loads. Still, it's a developer's panacea... things work sensibly, have an easy interface and are interoperable.

    Still, I can understand why Con Kolivas was frustrated. He did his best to fix up CPU scheduling and make things more desktop-friendly, and in the end his kernel patches were nibbled on and digested into someone else's mainline patch. I can see both sides of the story at times... but I think advocates like Kolivas are desperately needed in the Linux desktop world. Especially when it comes to speed. His argument is that our gajillion-gigawatt processors should be cutting through our daily chores like cake... but instead we're dealing with the same lag time as the "desktop search engine" indexes every freakin' file on our 1 TB hard drive.

    A familiar notion nowadays is that of the early zygote of a hacker... addicted to the Amiga 500 or Commodore 64. Kolivas brings an interesting perspective on why: hardware no longer sells. We're dealing with the same scraps as we had before, just with increasing amperage. Hardware is sold because of the OS now, whereas the hardware pushed the OS in the late 80's. And so it has been ever since.

    Or maybe not. Take a look at the XO Laptop from the One Laptop Per Child project. Do you care what operating system it runs? Nope, the hardware is the thing that drives the device. The OS reflects the hardware's abilities and limitations, but in this instance "operating system" is an abstract notion. You don't care that it's running Linux, or Windows, or OS X... just that it's linking together a mesh network of 50 kids over 20 square miles.

    Notice it has a "view source" key? Kids can evidently take a look at the current running program's source code on-the-fly, in hopes that they'll want to peek under the hood and maybe hack a little.

    Sounds like OLPC is spawning a new generation of Amiga 500 hackers, doesn't it? Both stood to be inspired by "cheap, cheerful, unique" computers that sparked their interest as kids. Here's to hoping that we're encouraging another generation of Kolivases.

    Thursday, July 12, 2007

    My Cheydinhal Home

    I've been suffering from pretty extreme headaches for the past five weeks, so I decided to take a break from... well... everything and try and relax. Part of that relaxation involved finally finishing the main quest & guild quest in Oblivion. It only took fourteen months - better late than never!

    Things have been so crazy busy that I haven't played much of anything at all since I walked away almost exactly one year ago. Damn... where does the time go? I've been doing more casual gaming since that fateful summer in 2006 - thanks to Nintendo's solution to gaming in a bathroom stall. Now I can fill even the narrowest crevices of time with some match-three variation.

    At any rate, I finally finished the important quests in Oblivion. I trucked back to my home in Cheydinhal, put all my memoirs up for display on my shelves, read a few final tomes I had sitting on my desk, visited a few more scenic locales, then went back to My Cheydinhal Home to reside in perpetuity. It was hard to give up, but after a solid three days of gaming I was ready to finally put the DVD-ROM away.

    It's been hard to code with the headaches, but I've been trying. I'm working on finishing up Deskblocks and rolling out a point release for ConsultComm. It's just really, really hard to concentrate and code when it feels like a titanium spork inside your dura mater is trying to shovel its way back out through your skull.

    Friday, June 29, 2007

    Wow. Just..... wow.

    I was always a big fan of ReiserFS. It did a good job of recovering the filesystem from sudden calamity, and was great with metadata over many small files. I moved to ext3 recently, despite it being considerably slower, mainly for the more cautious journalling features.

    I had heard about the disappearance of Hans Reiser's wife, him becoming a suspect, and his friend admitting to the murder of eight other people. But Wired's article on the subject is an unbelievable piece of art. By juxtaposing ReiserFS code snippets with the crescendoing story line (especially if you recognize the code), an amazing amount of suspense and revelation works in tandem with the first-hand narrative. Reading
    + if (!JF_ISSET(node, JNODE_HEARD_BANSHEE))
    + warning("nikita-3177", "Parent not found");

    gave me absolute goosebumps.

    Sunday, June 17, 2007

    A Package On Your Doorstep

    I first started working with CrystalSpace in 2003. Back then compatibility would change from version to version, building CS was a regular task and CEL was still an up-and-coming project.

    Then this January the team announced the first stable release of CrystalSpace: one that's feature-complete and ready for production. It's amazing how mature the project has become since I started... documentation has flourished, CEL has become a fantastic development framework, titles can be developed with little to no coding and Blender integration is now ready for prime time and sponsored by the Blender project itself.

    Not only that, CrystalSpace has finally hit the prime time: binary packages are now widely available on most Linux distros. Debian has been carrying packages for a while, but now native SuSE RPM's are available via PackMan.

    No, you shut up.

    Now people can develop entire titles while only focusing on content creation and scripting, allowing people to focus on what makes a game a game. Not only that, there are copious templates and examples for how to build projects. And you no longer have to build packages every night. And it's open source and readily available. It blows my mind.

    Wednesday, June 13, 2007

    I've Never Been So Excited for an Apricot

    Blender has been the model editor of choice for CrystalSpace for a long, long time. Most of my CrystalSpace documentation has revolved around using Blender to create content for CrystalSpace code. The two together are like peanut butter and bananas. Fantastic.

    Recently Jorrit made an exciting announcement on behalf of the CS team - Blender and CrystalSpace are partnering to build Apricot, an independently developed and completely open game title. To fully understand my unbridled enthusiasm you have to understand Elephants Dream, originally called Blender's project Orange. This was an open movie project - a full movie title with all content released under a flexible Creative Commons license. Textures, models, all production files are freely available. This did unbelievable things for Blender. Not only did this give users access to professionally generated content, new documentation and a whole new realm of tutorials, it also pushed the envelope for Blender itself. Orange generated demand for whole new genres of features, and kept the Blender development team pushing point release after point release to keep up. If I recall correctly, Blender's hair generation system was largely built due to demands made by artists creating content for Elephants Dream. Not only did the movie promote Blender, it made Blender a production-quality product that could demonstrate it was ready for prime time.

    The same envelope is getting ready to be pushed for CrystalSpace now. That's where my unbridled enthusiasm lies; CrystalSpace has been a commercial-quality 3D engine for a while now, but now every stage of the production process will be thoroughly tested and fleshed out. While I have no doubt that this project will result in enhanced functionality grown by the demands of the game developers, I'm most excited about the tool chain being completely fleshed out. In my mind while the Blender exporters for CS were fantastic, all the corner cases hadn't been completely covered. With Apricot, model exporters should be polished, skeletal animation should be more integrated into Blender armatures, physics should be more strictly related to Blender riggings and meshes should have attributes that more exactly equate to CrystalSpace equivalents. This should make the entire end-to-end content generation process as smooth as a polished stone.

    Nothing like a real-life, production quality project will take the edges off of the various and sundry tools used for development. It's amazing how much one will forbear when it's not a "huge issue," but when you encounter the same "not a huge issue" twenty times a day it suddenly becomes something worth tackling. Ideas and features may be the result of inspiration, but the remaining 99% of time spent refining a project is sheer perspiration. I'm looking forward to both projects' continued trial by fire, and seeing what has been forged once the fires have quieted down.

    What the NDS and iPhone Have In Common

    This past weekend I saw a copy of Opera for the Nintendo DS just idly sitting on the store shelf. I picked up a copy and thought to myself "damn, I'm out of the loop. I didn't even realize this was out yet!" Evidently I picked up an early copy - the release wasn't reported until the following Monday.

    I'm surprised how much I'm using the browser. I didn't think I'd be the type to roam through my house checking random sites on the NDS. But in an age of ubiquitous WebMail and continuously streaming blogs, a device that allows you to quickly scroll through snippets of online text is actually pretty useful. There turns out to be plenty of opportunities where I "just want to check something," such as see if a webcomic has been posted for today or check if I've received new e-mail at work. Instances not exactly worth booting up a laptop, but perfect for just cracking open the NDS and hopping on wireless briefly.

    The DS doesn't have an open third-party SDK, and no accessible means for running homebrew currently exists. Instead, Nintendo is hoping that Web applications will grant enough functionality to fill the gap. Sound familiar?

    Steve Jobs' recent keynote hammered home the insistence that while 3rd party API exposure won't be available for the iPhone, Web applications will be more than enough to offer custom functionality. He suggests that a Web browser can be used in lieu of an ability to launch third-party applications.

    The assertion that modern Web applications, what with their asynchronous JavaScript and XML, can replace standard applications is pretty ridiculous. Can JavaScript monitor what roaming tower your SIM card is using? Er... no. Can XML be used to play Doom? No. While you may be able to monitor an RSS mashup, no Web application can leverage the hardware in your hand. Saying that any Web application is going to replace a device's native API is hella stupid.

    But will the lack of third-party applications hurt the iPhone's success? Not likely. Lack of homebrew availability on the NDS hasn't exactly hurt sales all that much. If you do something and do it well, you're going to sell.

    Sunday, June 10, 2007

    Muted Games N Music

    In the time that has passed since I first lamented the DS' limited ability to execute independent applications, Nintendo DS development has become increasingly mainstream, and along with that a slew of affordable and easy-to-use NDS flash cards have become available that allow independently developed applications to be executed on the DS. There's even a retail flash card now - the "Games 'n' Music" cartridge. I'm not linking to it... it lacks the filesystem drivers that make it a useful device. But it's the first flash cartridge (that I know of) that's widely available in retail channels such as your local neighborhood Best Buy.

    I picked one up, just to see what it was like. It lacks DLDI support, which means it can't interact with the filesystem. If it can't interact with the filesystem, that means no save games, no loading libraries, no loading maps, no user profiles. Blech.

    But it's in retail, which makes it interesting. And it boots DS Linux, which is at least mildly intriguing. And it's cheap... only $35 for the flash card, a 128 MB microSD card and a microSD USB reader. I might waste an equal amount on a craptastic NDS title... so I don't feel too entirely guilty about buying a flash cart that's missing a DLDI.

    I think I understand why Datel didn't offer DLDI support. By disabling DLDI people can't execute pirated ROM's from commercial cartridges - instead people are stuck with pure homebrew that doesn't require local storage. This could possibly limit ROM execution of course... but this also wrecks a lot of homebrew.

    Thus far I've booted Linux, tried a video and failed to play one homebrew title. Maybe this will eventually gain usefulness once the cart's filesystem is cracked, but until then it may just stay in the bag.

    Thursday, June 07, 2007

    Bluetooth Affinity

    Now that I have my handy-dandy Bluetooth phone I'm a regular Bluetooth addict. I've always had an odd engineering respect for the specification (the true measure of a geek is how emotionally attached they can become to a tech spec). Bluetooth seems to have been designed by engineers for engineers, and done very well. The fact that it uses the Hayes Command Set gives me that odd sensation of nostalgia... like seeing a Pac-Man clone on a modern $600 cell phone.

    I was pretty impressed with Linux Bluetooth support - a single KDE desktop applet allowed me to browse all Bluetooth devices within range and view their individual services. Within only a few seconds I was able to transfer files and view device status... very schwag.

    One thing I didn't understand until now was the concept of Bluetooth "profiles." Just as TCP may be a conduit for HTTP or FTP, Bluetooth is a conduit for OBEX exchanges or headset commands. It makes sense in retrospect: Bluetooth host devices offer up services such as HSP (headset communication), OPP (pushing files) or BPP (printing).

    One thing I was hoping to do was send vCards and vCals to and from the phone, but thus far I've been unable to. Most projects appear to attempt vCard and vCal access via OBEX, which Linux supports, yet no OSS project I've run across so far has been able to do vCard or vCal retrieval or transmission. My phone doesn't have the SYNCH profile used for PIM exchange, yet it supposedly transmits raw vCards over Bluetooth... but I haven't been able to get Linux to receive them. Something I can hopefully remedy.
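For context, the payload in question is nothing exotic - a vCard is just structured text, along these lines (a made-up contact for illustration):

```
BEGIN:VCARD
VERSION:2.1
N:Doe;John
FN:John Doe
TEL;CELL:+1-555-555-0100
END:VCARD
```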

    This limitation seems to exist even in commercially available products. DataPilot evidently supports syncing the phone book, but not the calendar or text messages. Mobile Action is evidently C|Net's choice, but it also only handles contacts.

    From what I can tell, my handset supports:

    - HSP (Headset Profile): uses the Hayes Command Set for headset operations
    - HFP (Hands-Free Profile): used for car interop; a Synchronous Connection-Oriented link with mono PCM audio
    - DUN (Dial-up Networking Profile): dial-up networking from a laptop or other workstation, using SPP
    - OPP (Object Push Profile): transfers are always instigated by the sender (client), not the receiver (server), using the OBEX operations connect, disconnect, put, get and abort
    - FTP (File Transfer Profile): uses OBEX as a transport, based on GOEP, for getting folder listings, changing folders, and getting, putting and deleting files
    - BPP (Basic Printing Profile): sends print jobs to printers without needing drivers
    - A2DP (Advanced Audio Distribution Profile): used for headphones & media players; possible ALSA integration
    - AVRCP (Audio/Video Remote Control Profile): used in concert with A2DP or VDP to allow a single remote control (or other device) to control all of the A/V equipment to which a user has access

    Tuesday, June 05, 2007

    Requiescat In Pace

    It has been a fantastically horrible month in my real life. I'll just leave it at that. All my current projects, including the portal I was hoping to launch, are pretty much indefinitely on hold.

    In other news, I've been trying to find a better way of connecting to my home network while abroad. I've been using SSH to connect and locally forward ports to the home network, but that meant every service had to be hard-coded. Instead I've been evaluating OpenVPN on OpenWRT. Both are amazing projects, and both worthy of considerable attention.

    I haven't evaluated OpenWRT in nearly four years now... they had just started using a package manager when I last tried to flash my WRT54Gv2. It's in an amazingly well-adjusted and highly advanced state now... it was hard to believe how flexible the OS & utilities were. Everything worked out of the box with a minimum of hacking. Especially for someone like myself who used to construct Linux home firewalls out of old workstations, this fit my schema perfectly. I was a HyperWRT guy, but as that firmware grew stale I moved to an entire distro-in-RAM.

    OpenVPN is further evidence of why IPSec tunnels never gained proper adoption in the roadwarrior market segment. IPSec works fantastically when joining disparate networks through concentrators, but it just doesn't offer the flexibility, interoperability and ease-of-use that SSL tunnels do. I was an IPSec advocate in the days of FreeS/WAN, but once opportunistic encryption adoption didn't reach ubiquity they closed up shop. PPTP offers good interoperability and ease-of-use, but is ultimately just PPP with some wrappers around it. OpenVPN has proven itself to be a secure and flexible compromise between the two, while still maintaining ease of use behind firewalls, proxies and NATs. It may lack a certain "purity" that IPSec has, but for roadwarrior and ad-hoc connections OpenVPN is indispensable.

    Juniper Networks has a pretty good "Instant Virtual Extranet" platform that incorporates an SSL-based VPN solution which does a great job - it even supports Linux. I have to give a big tip o' the hat to Juniper Networks on that one - they actually developed a VPN client that works properly in Linux. Launch a Java Applet, grant it rights to install a client stub on your machine, then an SSL tun0 interface automagically pops up on your Linux box. Bravo.

    But I digress.

    The NetworkManager GUI within both Gnome and KDE has support for OpenVPN tunnels, so I decided to give it a try. At first I attempted the simplest case using a static key. For the life of me I couldn't get static key support to work with NetworkManager... it wouldn't even establish a connection, despite the fact that it worked manually on the console. I gave up on static keys and instead created a public key infrastructure, issuing client keys as needed. This allowed me to establish a connection just fine, but it brought one critical bug to the surface: DNS resolution was subsequently borked, since NetworkManager wiped out resolv.conf once the tunnel was initialized. Fooey.
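Building the PKI is handled by the easy-rsa scripts that ship with OpenVPN 2.0 - roughly the following sequence, though the easy-rsa path and the client name here are assumptions:

```shell
# From the easy-rsa directory bundled with OpenVPN (location varies by distro)
cd /usr/share/doc/openvpn/examples/easy-rsa
. ./vars                    # load key-generation environment variables
./clean-all                 # wipe any previous keys (destructive!)
./build-ca                  # create the certificate authority
./build-key-server server   # server certificate/key pair
./build-key laptop          # one client key per roadwarrior machine
./build-dh                  # Diffie-Hellman parameters for the server
```

Each roadwarrior then gets its own `build-key` invocation, so a compromised laptop's certificate can be revoked without re-keying everything else.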

    So instead I created a manual script that initiates the tunnel. That appears to be working pretty well now... no big worries. For now it's a straight UDP tunnel, but I might change it to TCP down the road.
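The manual script is nothing fancy - something along these lines, where the config and log paths are assumptions:

```shell
#!/bin/sh
# home-vpn: bring up the OpenVPN tunnel by hand, bypassing NetworkManager
# (and its habit of clobbering resolv.conf).
openvpn --config /etc/openvpn/client.conf \
        --daemon home-vpn \
        --log /var/log/home-vpn.log
```

Since OpenVPN itself never touches resolv.conf unless told to, DNS keeps working after the tunnel comes up.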

    The configuration wasn't too bad - I followed the HOWTO Quickstart pretty much by the letter by creating the keys, issuing them to clients then using their sample server and client config files. On OpenWRT, all I had to do was create an /etc/init.d/S50openvpn script to start OpenVPN on startup, then add the following firewall rules:
    ### OpenVPN traffic
    ## -- Permit initial negotiation
    iptables -t nat -A prerouting_wan -p udp --dport 1194 -j ACCEPT
    iptables -A input_wan -p udp --dport 1194 -j ACCEPT
    ## -- Permit tun interfaces
    iptables -A forwarding_rule -i tun+ -j ACCEPT
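The startup script itself can be minimal; a sketch of the sort of thing I mean, where the binary and config paths are assumptions:

```shell
#!/bin/sh
# /etc/init.d/S50openvpn -- launched at boot by OpenWRT's init
/usr/sbin/openvpn --config /etc/openvpn/server.conf --daemon
```

The `tun+` wildcard in the last firewall rule above matches tun0, tun1 and so on, so forwarding keeps working no matter which tun device the daemon grabs.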


    Now I'm able to connect and browse at will. Not too shabby!

    Amazing that you can build a VPN concentrator, WAP, firewall and management station for a little more than $50. Ultimately you end up with more usability than you could get with a $200 cheapo desktop.

    Thursday, May 17, 2007

    Volunteer Voluntschmeer

    I'm so freakin' tired right now.

    ConsultComm 4 development is proceeding slower than snot, but what's new. Work is workin' me... like... 45 hours a week, plus I've got family matters to attend to. I'd like to have help developing this next version - but from whom?

    I've tried to get colleagues and friends to help out, but that lasted for less than a week once their interest waned. I put out a job posting on SourceForge and received two responses, but they never e-mailed back. I've tried recruiting, but no luck.

    I can't say I blame 'em. Finding volunteers for an open source project sucks. I'm just as guilty as the next guy. I volunteered and did some coding for PlaneShift back in the day, and I loved it. Stayed up late nights in IRC, chatted with the technical architects, worked hard. But in the end I had to quit my day job, find a new one, perform for interviews, etc. The fun development had to go on pause for a while. And when I tried to resume development, the PlaneShift developers weren't really interested in taking me on anymore. Can't say I blame 'em.

    I tried to just "donate code" to the Java Desktop Integration Components library, but I was convinced instead to create a new incubator project. I feel guilty from time to time... I just didn't have the time or desire to keep up the project, and it's basically remained dormant since the time I dumped off the source & JavaDocs.

    It's tough to donate time and effort. Life sucks up a lot of energy. And often life just plain ol' sucks. When you already donate 10 hours a day to work, 6 hours a day to sleeping, 4 hours a day on family and 2 hours a day on basic upkeep of the house and oneself that leaves... lessee... two whole hours?

    Great. One can only wonder why I can't finish anything. The sad thing is, I know all my fellow hackers out there are in the same boat. We're getting old. We're getting jobs. We're getting families. And our internal organs are shutting down. It's hard to have the gusto to be a nighttime hacker extraordinaire.

    Sing it with me! "Shiny, happy people holding haaaaaaaaaaaaaaaaaaaaaaands!!!!!!!"

    Tuesday, May 08, 2007

    Converge to a Crash

    Since the days of calculator watches, convergence of personal electronics has been the goal of every gadget-loving geek. Our coffee makers should make eggs and toast. Our watches should be able to convert metric to imperial and give pi to fifteen digits. Our game consoles should be the media hub of our living room. PDA's should be integrated into every electronic device imaginable, but interoperate with none.

    I had a dream called the Samsung i500. Part PalmOS PDA, part phone, sync'd to Linux, allowed third-party applications & Palm gaming. It should have consolidated my iPAQ, cell phone and mobile gaming console into one lil' compact unit.

    Hilarity.

    The phone was serviceable at first, but it was neither a good PDA nor a good phone. Third party apps worked, but space was limited. Sync'ing contacts was a pain because the Linux USB Visor drivers locked up my machine. Things like voice dialing just didn't work. And for some inexplicable reason you could use the phone to receive text messages, just not send text messages. Right.

    The i500 had finally taken all the abuse it could and started corrupting my contacts and calendar databases. They eventually got so bad they caused memory exceptions and locked up the phone. Ugly.

    During this time I acquired my NDS and an iPod. Calendaring & contacts were actually being served up much better by the iPod than any PDA I had used in the past (thanks to its native vCard and .ics file format support). The NDS was my stop for mobile gaming. And my i500 became nothing more than a half-assed brick.

    But this doesn't just deal with the idiocy of smartphones. Look who else is doing this - namely console manufacturers. Sony wanted the PS3 to be your high-def media center, file server, gaming center, music server, bread toaster all-in-one. In the end, however, sales were awful. CNN deemed that "the PS3 may be the chrome-trimmed headstone on the grave of convergence." Not to say that you can't have successful convergent devices... you definitely can. But you have to do all parts well, not just some. My i500 was a weak PDA and a poor cell phone, but I mistakenly thought that those two weak facets added together would equal a stronger convergent device. Instead I got a broken PDA and a crappy phone.

    Same thing with gaming. Cell phone games are popular, but I'd wager dollars to doughnuts the industry has hit its peak. I know that Intel, IBM and Nvidia are ramping up their own system-on-a-chip products for mobile devices, hoping to bring vertex shading to 2" screens. But if you're a gamer, purchasing 4-5 titles a year, what system are you going to rely on? A J2ME-based cell phone that's so tiny the ligaments in your thumb pop, or a pocket-sized DS Lite? Would you rather have a phone that third-party publishers and developers still can't deploy product onto, or would you rather have a handheld console where you can just buy a $30 title off the shelf?

    So forget convergence, I'm back to just buying what I need. Now I just need Dockers to bring back their Mobile Pant.