384 pages, Paperback
First published July 13, 2004
I admit that when I first learned all this about computer memory allocation, I was disappointed--no, offended!--in a linguistic sense. Programmers were so inept at metaphor-creation, I thought. Memory leak: this wasn't a "leak" of memory at all. ... Why not name it to show the origin of the problem, with the programmer? "Memory gluttony," it should have been called. Or "memory hogging." Even the routine they used to request memory from the operating system had been named incorrectly. It was called "malloc," short for memory allocation, "m" "alloc." But of course human beings don't read "m" then "alloc" unless there is a separator, a space, a dash. No, by the implicit structures of the English language, everyone pronounces it "MAL-loc." Mal, loc. Mal: bad. Loc: Location. Bad location! But of course they'd have trouble keeping track of memory when they'd named their tool so stupidly!

This quote isn't terribly representative, but it struck me because that is my kind of tangent. Mostly the book is not technical, and has some great scenes and descriptions, like when a drunk person in an argument "felt his tongue start to slide around in his mouth like a bar of soap on a wet shower floor."
Debugging: what an odd word. As if "bugging" were the job of putting in bugs, and debugging the task of removing them. But no. The job of putting in bugs is called programming. A programmer writes some code and inevitably makes the mistakes that result in the malfunctions called bugs. Then, for some period of time, normally longer than the time it takes to design and write the code in the first place, the programmer tries to remove the mistakes.
The thoughts were gone, decomposed, passed into code, where they worked, where they ran, but could not be reassembled into human-think. All those tumbling thoughts had become marching lines of stars, pointers to pointers to arrays of pointers, functions calling functions calling functions. Layers of code talking to code, machines muttering to themselves in their own language.
... there is the problem of crossing the chasm between human and machine "thought": some fundamental difference in the way humans and computers are designed to operate. I understand the world by telling stories; the human mind makes narrative, this happens then that, events given shape so we can draw a circle around them, see them relate, cohere, connect. We're built to tell stories to one another, and to be understood. But the computer was built to do, to run. It doesn't care about being understood. It is a set of machine states - memory contents, settings of hardware registers - and a program, a set of conditions that determines how to go from one machine state to the next. [...] So reading the code was a matter of banishing the human story from my mind.
...my perception of the machine had been changed forever. I knew then it was just an approximation, a fudge, a best-case work-around on the intractable problem of time. The machine seemed to understand time and space, but it didn't, not as we do. We are analog, fluid, swimming in a flowing sea of events, where one moment contains the next, is the next, since the notion of "moment" itself is the illusion. The machine - it - is digital and digital is the decision to forget the idea of the infinitely moving wave, and just take snapshots, convincing yourself that if you take enough pictures, it won't matter that you've left out the flowing, continuous aspect of things. You take the mimic for the thing mimicked and say, Good enough.