I've spent most of my life avoiding doing much of anything that I perceived as hard. I mean, why bother? Much of my motivation for this approach admittedly has had to do with fear of humiliation; my dread of gym classes is a good example. Avoiding mathematics of all sorts has more to do with an unfortunate lack of aptitude. I really wish sometimes that I hadn't become so clever about avoiding things that seemed difficult. Ironies abound: avoiding hard things has led to plenty of humiliation. I seemed ready to suck in my belly and take the humiliation with the pretense that at least it served my plan to avoid whatever it was I was avoiding. One would think that avoiding hard things is akin to "going with the flow," but damned if I'm not a stubborn fellow. I'm rather proud of how little work history I've accumulated, if for no other reason than that it's been a sort of plan all along.
As much as I don't seem to mind being stupid, I very much like smart people. Sometimes being stupid in front of smart people can lead to very informative interactions, but at other times it can be downright embarrassing. I'd do well to show a little reticence with my mouth and my keyboard.
A recent post at Phil Jones's Platform Wars concerned users of software with attitude versus developers of software with attitude. As a programmer and developer, Jones strips away the pretentiousness of the users' attitude, which boils down to users telling developers what to do. Jones wrote:
But the idea that development is a commodity - which is essentially what he's saying here - that it's like the water supply which can be turned on or off or piped around at the will of the user, is wrong, wrong, wrong. The reason is, that good software creation, like any other creative activity, requires a deep knowledge of the nature and constraints of the medium. You can't invent the transistor without a profound understanding of physics. Nor write a great novel without being a master of your own language. Nor a great painter without knowing paint. Nor invent radio or television or the computer without a background in the relevant science.

Heaven forfend I should attempt to learn software engineering; that's way too hard. But I do think users and developers should talk with each other; developers say the most interesting things and are very well represented in the blogosphere. So I typed a comment into the post. I rather knew that my comment wasn't really responsive to the thrust of his post, and that I was vaguely trying to get at something without having gotten a handle on what that something was.
As luck would have it, the post coincided with this weekend's BloggerCon conference, and it was linked at Dave Winer's Scripting News. The result: plenty of people went to Platform Wars to read Jones's insightful post, and if they wanted to comment they had to scroll past my moronic comment first. I hadn't bargained on that.
Years ago, as part of a standardized test for a teacher's accreditation, I had to write an essay in a short time. The test booklet gave the standard advice about making an outline before writing, but I'm not an outline-making kind of guy. We had to open an envelope to discover our essay question. It was essentially: "Technology: Good or Bad?" The gist of my essay was that people--the genus Homo--are tool makers. It's hard to think of people without thinking about language. Language is a kind of tool, a technology, so good or bad, people are rather stuck with technology. And to make matters even more muddled, every technology seems to bring with it unintended negative consequences. I barely passed the essay portion of the test!
My comment at Platform Wars had something to do with the unintended, and not always negative, consequences of new technologies. I get goose pimples all over when I think about the great things that computers and the Internet will make possible. I'm also the sort who wants computer software to let me do stuff without any care about what makes the software tick. Recently, and in no small part as a result of reading John Robb's Global Guerrillas, some of the negative potential of these new technologies has begun to dawn on me. What if the most innovative and up-to-date supply chain management software is used by criminal gangs running drugs and guns as well as by health departments and NGOs delivering lifesaving health care?
I've been following the development of Kiva with great interest. The model of micro-credit financing is so innovative that I'm certain it's creating something good. Another great thing is that in addition to the wonderful Kiva site there are several blogs following the adventure. A post at IntoContext relates some hard questions about the business model:
The response was very critical with very pointed remarks about the failures of Kiva even in a large sense about what their intentions were. They brought up issues we had never considered like what if people use Kiva to launder money. What measures had Kiva taken to protect the MFIs.

Whoa! I hadn't thought of any of those things. I'm not the least bit suspicious of the good folks at Kiva; it's rather that new technologies bring attendant negative consequences.
I always groan when I hear, "Guns don't kill people; people do." Still, I generally find it easy to imagine technologies as lacking intentions. Drivers of automobiles ought not to drive drunk, and it's really rather hard to imagine that drunks would have much of anything useful to say to automobile designers. Or, for that matter, what users of satellite communications devices might have to say to rocket scientists. The same sort of differential in knowledge between users of technologies--especially moronic ones like me--and developers of technologies applies to software, and to the sorts of discussions users and developers of software systems might have with each other.
Not being a rocket scientist, I have nothing of use to add to discussions of rocket design. Nonetheless, I don't think I should be excluded from discussions about nuclear weapons. Indeed, I think there are avenues of scientific investigation down which scientists ought not to proceed, and such discussions are not the sole province of scientists. Because I haven't paid much attention to software other than what it might do for me, I hadn't until very recently imagined that software development posed dilemmas on the order of nuclear weapons. When I wrote the comment at Platform Wars, it was discussions about human responsibility between users and developers that I had in mind.
The trouble is I don't know enough to write a sensible comment on the subject, or a sensible blog post for that matter. I believe strongly in the great positive potential for everyone in peer production, and in the arena that Jones calls the peerosphere. With that comes the responsibility to know something about "the relevant science" of software development. Oh no! That sounds like hard work. Still, I've managed to learn something about physics and something about biology in order to engage in thoughtful discussions about other technologies where vital questions of human responsibility come into play. It's about time my education in this area began as well.
There's been some great discussion of a recent article by Jaron Lanier, Digital Maoism: The Hazards of the New Online Collectivism (more here and here). The conversation about the piece is particularly interesting because implicit in it is an understanding of the architecture of software design, the fundamental science that informs the discussion.
It's a good thing that developers discuss their work online, and now I have a bit more motivation to listen in. I hope I don't write too many more mindless comments on any more of their blogs, but I probably will.