Sunday, April 27, 2014


Middle of last year, the project I had been working on since 2012 was drawing to an end, and reflecting back on the work I had done, I realized that the quality of the code was incredibly uneven. Some parts were easy to maintain and extend, whereas others were poorly structured and a severe headache to work with.

Having identified this quality issue with my code, I began to contemplate what the problem could be. I re-read all the great books on code quality and the software development process: Clean Code, The Pragmatic Programmer, and so forth. It turned into a bit of an obsession.

By the time I had identified the common factors in the problematic code (mostly code with too many responsibilities, plus one instance of using inheritance where aggregation would have been better), the quest for a solution had inadvertently left me a changed man.

Problem solving is something I am very good at. I think it comes from my mixed education: both physicists and programmers are tasked with solving problems--it's the very essence of both crafts, I'd argue--and I have one foot in each profession. The crux is that before you can solve a problem, you first need to realize that it is a problem (and that it can be solved by you).

My epiphany was that problems with my development process were "problems" in the same sense I knew from problem solving; that is, I could apply myself to them. I've always liked the concept of kaizen, continuous improvement, but the idea of applying it to the craft of programming had never occurred to me. Now everything came together very rapidly. I have probably improved more in the last 9 months than I did in the previous 4½ years, and I was by no means a bad programmer to begin with.

Over these months, I have made innumerable alterations to almost every aspect of how I work. Most changes are small, but among the more drastic was a change to how I write my code.

I used to half-write a function, then half-write a function it called, and so forth, jumping around the source file like a madman; then, when all the compiler errors went away, I'd cross my fingers that I hadn't made any mistakes putting everything together. Besides creating one hell of a clutter, this method tends to introduce a lot of unnecessary bugs. The code simply isn't very well thought through, because no human alive can keep all that half-finished code fresh in mind at once. To compensate, it tends to produce pretty long functions and some severely unhealthy complexity.

I've realized that if you start at the top and make it a rule to always finish your functions before doing anything else--keeping them short, giving everything descriptive names, and freely calling functions that haven't yet been written and using fields that aren't yet defined--you produce much clearer code. This way, you don't need to keep much in your mind at all. When you move on to implement the next function, you only need to look at its name to know what you intended it to do.
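To make the idea concrete, here is a minimal sketch of the top-down rule (a hypothetical example I've made up for illustration, not code from the project in question). The top-level function is written in full first, naming helpers that don't exist yet; each helper is then finished in turn, and each stays short enough to hold in mind at once.

```python
def generate_report(records):
    # Written and finished first. At that point, the three helpers
    # below were only descriptive names; their names alone say what
    # they are intended to do.
    valid = filter_valid_records(records)
    totals = summarize_by_category(valid)
    return format_as_table(totals)

# Only after generate_report was complete were these implemented,
# one at a time, top to bottom.

def filter_valid_records(records):
    # Drop records with no amount.
    return [r for r in records if r.get("amount") is not None]

def summarize_by_category(records):
    # Sum amounts per category.
    totals = {}
    for r in records:
        totals[r["category"]] = totals.get(r["category"], 0) + r["amount"]
    return totals

def format_as_table(totals):
    # One "category: total" line per category, sorted by name.
    return "\n".join(f"{cat}: {amount}" for cat, amount in sorted(totals.items()))
```

The point isn't the report itself; it's that at no moment does more than one unfinished function exist, so nothing half-done has to be carried in your head.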

Before incorporating this method into how I work, I mostly got things right, but my ambition ended there. My biggest problem was that I never consistently put any thought into what I was doing. I tried to get better, but my attempts at improvement were wildly undirected, so I saw no tangible change in quality. For the most part, it isn't very fruitful to assimilate solutions to problems that aren't your problems. So even though I read all the right books and blogs, I was fairly stagnant in terms of skill. I thought I was pretty good; I just had no idea how much better I could become. The solution, to reiterate, is to use self-reflection as a tool to identify problems, which you can then solve and use as leverage for greater understanding.

All that said, I'm still learning and improving. The day I stop is the day I am dead.

Saturday, April 26, 2014

Epitaph to Neverwinter Nights

The Internet isn't very good at keeping its own history; I guess it's partly too young to have learned that it needs to, but it's beginning to become apparent that it should. The Internet Archive is great and all, but it only captures part of the Internet that was. A lot of things are fleeting, and you don't realize just how fleeting they are until they're already gone. This applies especially to communities. With that in mind, here is my testimony about a community that came and went.

It was in the early 2000s. I was a socially awkward teenager playing a game called Neverwinter Nights in my bedroom. This is not to be confused with the recently released MMORPG of the same name, or the older MMORPG of the same name. Neverwinter Nights was a computer RPG, one in a long line of attempts to bring the Dungeons and Dragons franchise to the computer. It wasn't the best in its genre by any means, but it had one feature that set it apart from anything RPG-like before or since: it was very easy for players to make and share content. In fact, you could build entire worlds from community-created maps, quests, and rules, and keep them running on a server indefinitely: so-called Persistent Worlds, or PWs for short.


After a while, a pretty large number of these Persistent Worlds started cropping up. Each had its own tone. Some were dedicated to non-stop hack-and-slash action (mimicking games like Diablo); some were absurdly hardcore RPG experiences where all experience was awarded by game masters for role playing, rather than through gameplay mechanics like killing monsters. That was the sort of place where you'd get an hour-long pause and discussion about what should happen when you use alcohol to numb the pain before treating a crippling 2 HP injury on a fellow player, with intense historical and contemporary medical research going on in the background...

The Goldilocks world for me was called World of Haven (WoH). It encouraged role playing, but didn't enforce it. I spent the sort of time on this server only a teenager can afford; the hours must tally in the thousands. I played probably half a dozen characters, but I sank the most time into an alcoholic, down-on-her-luck sword master called Joanna, and a god-of-disease-worshiping half-orc monk called Roshnak.

The people who frequented the server were a mixed bunch, though a decent portion of the crowd were servicemen in the US armed forces. The player community around the PW was very close-knit, yet welcoming. There was also a sense of unspoken friendship among many of the more active role players who, despite rarely breaking character, tended to associate with the same people.

All good things must come to an end, and Neverwinter Nights was no exception. Around 2005, the Neverwinter communities started to severely hemorrhage players. There was a new player in town, World of Warcraft, which had hit the online RPG scene like a bomb. There was a glimmer of hope that Neverwinter Nights 2, the sequel released in 2006, would stem the tide and save the community. Unfortunately, it didn't offer nearly the same modding capabilities as its predecessor, which was really the final nail in the coffin. There was no resisting the new generation of MMORPGs anymore, and the community slowly ran into the sand.

There are still a very small number of servers running, but that isn't the same as saying the community still lives. It was the people who created the experience. What truly set NWN apart from the games that replaced it is that absolutely everything was community-created: the maps, the quests, and most importantly the role-playing adventures, made enjoyable by the fellow players you interacted with. And sadly, almost all of those players are gone.