I’ve been a nerd, and proud of it, more or less since I could speak. I decided I wanted to be a scientist at the age of nine, and loathed sport at school with a deep passion. However, my true nerdly status was properly recognised only a week ago when, frazzled by wrestling with Windows 7 drivers, for a diversion I clicked on a link to www.nerdtests.com and filled in the questionnaire. It granted me the rank of Über-Cool Nerd King, and no award has ever pleased me more.
So what exactly constitutes nerdhood? To Hollywood, a nerd is anyone in thick-rimmed spectacles with a vocabulary of more than 1,000 words, some of which have more than three syllables. What a feeble stereotype. In the IT world, a nerd is someone who knows what an INF file is used for and can use a command prompt without their hands shaking, but that’s still a bit populist for an Über-Cool Nerd King. Developers who know four programming languages might be admitted to the lower echelons, but true nerd aristocracy belongs only to those with a deep knowledge and love of programming language design. If you’ve ever arm-wrestled someone to settle a dispute over late versus early binding, you just might be a candidate.
Back to the dark ages
In the late 1980s and early 1990s, I steeped myself in programming language design. I was fluent in 14 languages, some leading-edge such as Oberon, Occam, Prolog and POP-11. I wrote books on object-orientation and coded up my own recursive-descent parsers. I believed we were on the verge of making programming as easy and fun as building Lego models, if only we could combine Windows’ graphical abilities with grown-up languages that supported lists, closures, concurrency, garbage collection and so on. That never happened, because along came the web and the dotcom boom, and between them they dumbed everything down.
HTML took us back to the dark ages so far as program modularity and security were concerned, but it was simple and democratic and opened up the esoteric world of the programmer to everybody else. If you’d had to code all websites in C++ then the web would be about one-millionth the size it is today. I could only applaud this democratic revolution and renounce my nerdish elitism, perhaps forever.
Progress did continue in an anaemic sort of way with Java, C# and interpreted scripting languages such as Python and Ruby, which modernised the expressive power of Lisp (although none of them ever acquired a tolerable graphics interface).
The rebirth of nerdhood
Stripping away the bloat
The need to keep today’s massively parallel hardware busy has spawned some profoundly nerdish research. Languages such as Erlang and Scala have resurrected the declarative programming style pioneered by Lisp and Haskell, while Google has been working on the Go language to control its own huge processor farms. Designed by father-of-Unix Ken Thompson, Rob Pike and Java-machine specialist Robert Griesemer, Go strips away the bloat that’s accumulated around object-oriented languages.
It employs strong static typing and compiles to native code, so it’s extremely efficient for system programming. Go has garbage collection to guard against memory leaks, but its star feature is concurrency via self-synchronising “channels”: pipes that pass data between lightweight concurrent processes called “goroutines”. And because channels are themselves first-class objects, you can send channels through channels, enabling huge network programs to reconfigure their topology on the fly – for example, to route around a failed processor or to balance a sudden load spike. Google has declared Go open source, and recently released a new version of its App Engine that offers a Go runtime in addition to supporting Java and Python.
I had sworn that Ruby would be my final conquest, but an itching under my crown signals the approach of a nerd attack.