The Lurker


posted by ajf on 2010-12-05 at 09:08 pm

One of the highlights of YOW Melbourne last week was Guy Steele's talk on designing algorithms for parallelism. He started with a punch card he'd written in the 1960s, a program that dumped the machine's memory, and walked us through the series of contortions required to get it loaded and working as a debugging tool: fitting everything on a single card, using self-modifying code to produce 16-bit values in memory that couldn't be encoded in the card's 12-bit columns, determining experimentally that the hardware permitted "getting away with" certain undefined operations, reading from the address containing an instruction because that instruction happened to be represented by a useful number, and so on.

Steele called it the dirtiest code he had ever written. Not only did later hardware require fewer such "dirty tricks", but the tools that improve developer productivity (macro assemblers, garbage collection) sometimes take away the degree of control that made many of those tricks possible.

But then I stumbled on this interview from a couple of months ago, in which the interviewee compares the things he has created with HTML5 to games developed for the Atari 2600 (my emphasis):

Something else that took me a while to internalize: you have to accept that with Web development, anything that's worth anything will be a hack. Not just prototyping; production code as well. That's hard to swallow when you're used to proper, clean, sterile programming. [...]

And eventually that battery of hacks in your sleeve might make you stand above. My crude and jaded metaphor of Web development is button mashing when playing video games. Everyone hates button mashers, but working with cutting-edge Web really is flying blind a lot of the time — you're trying out all sorts of things that sometimes don't logically make a lot of sense. But they somehow work. If you get used to that mentality and you get familiar with those hacks, you will train your instincts to know which buttons to mash first, and give yourself more buttons as well.

We're not talking about programs restricted to eighty 12-bit columns, and we're not talking about getting the most out of early, primitive gaming hardware like the 2600; we're talking about writing code for enormously complex machines, conforming to specifications that require millions of lines of code to implement.

Can somebody explain to me why the fuck this is considered acceptable?

Having spent the last eight months working with a team of people producing high-quality applications for mobile phones really hammers home the feeling that writing a browser-based application is like tying your hands behind your back and trying to type with your molars.

Related topics: Rants, Web, Mindless Link Propagation
