1. The Revenge of the Hackers

I wrote the first version of A Brief History of Hackerdom in 1996 as a web resource. I had been fascinated by hacker culture as a culture for many years, since long before I edited the first edition of ``The New Hacker's Dictionary'' in 1990. By late 1993, many people (including myself) had come to think of me as the hacker culture's tribal historian and resident ethnographer. I was comfortable in that role. At that time, I had not the faintest idea that my amateur anthropologizing could itself become a significant catalyst for change. I think nobody was more surprised than I when that happened. But the consequences of that surprise are still reverberating through the hacker culture and the technology and business worlds today. In this essay, I'll recapitulate from my personal point of view the events that immediately led up to the January 1998 ``shot heard 'round the world'' of the open-source revolution. I'll reflect on the remarkable distance we've come since. Then I will tentatively offer some projections into the future.

2. Beyond Brooks's Law

My first encounter with Linux came in late 1993, via the pioneering Yggdrasil CD-ROM distribution. By that time, I had already been involved in the hacker culture for fifteen years. My earliest experiences had been with the primitive ARPANET of the late 1970s; I was even briefly a tourist on the ITS machines.
I had already been writing free software and posting it to USENET before the Free Software Foundation was launched in 1984, and was one of the FSF's first contributors. I had just published the second edition of ``The New Hacker's Dictionary''. I thought I understood the hacker culture -- and its limitations -- pretty well. As I have written elsewhere, encountering Linux came as a shock. Even though I had been active in the hacker culture for many years, I still carried in my head the unexamined assumption that hacker amateurs, gifted though they might be, could not possibly muster the resources or skill necessary to produce a usable multitasking operating system. The HURD developers, after all, had evidently been failing at this for a decade. But where they had failed, Linus Torvalds and his community had succeeded. And they did not merely fulfill the minimum requirements of stability and functioning Unix interfaces. No. They blew right past that criterion with exuberance and flair, providing hundreds of megabytes of programs, documents, and other resources. Full suites of Internet tools, desktop-publishing software, graphics support, editors, games...you name it. Seeing this feast of wonderful code spread in front of me as a working system was a much more powerful experience than merely knowing, intellectually, that all the bits were probably out there. It was as though for years I'd been sorting through piles of disconnected car parts -- only to be suddenly confronted with those same parts assembled into a gleaming red Ferrari, door open, keys swinging from the lock and engine gently purring with a promise of power... The hacker tradition I had been observing for two decades seemed suddenly alive in a vibrant new way. In a sense, I had already been made part of this community, for several of my personal free-software projects had been added to the mix. But I wanted to get in deeper...because every delight I saw also deepened my puzzlement. It was too good!
The lore of software engineering is dominated by Brooks's Law, articulated in Fred Brooks's classic The Mythical Man-Month. Brooks predicts that as your number of programmers N rises, work performed scales as N but complexity and vulnerability to bugs rise as N-squared. N-squared tracks the number of communications paths (and potential code interfaces) between developers' code bases. Brooks's Law predicts that a project with thousands of contributors ought to be a flaky, unstable mess. Somehow the Linux community had beaten the N-squared effect and produced an OS of astonishingly high quality. I was determined to understand how they did it. It took me three years of participation and close observation to develop a theory, and another year to test it experimentally. And then I sat down and wrote The Cathedral and the Bazaar to explain what I had seen.

3. Memes and Mythmaking

What I saw around me was a community which had evolved the most effective software-development method ever and didn't know it! That is, an effective practice had evolved as a set of customs, transmitted by imitation and example, without the theory or language to explain why the practice worked. In retrospect, lacking that theory and that language hampered us in two ways. First, we couldn't think systematically about how to improve our own methods. Second, we couldn't explain or sell the method to anyone else. At the time, I was only thinking about the first effect. My only intention in writing the paper was to give the hacker culture an appropriate language to use internally, to explain itself to itself. So I wrote down what I had seen, framed as a narrative and with appropriately vivid metaphors to describe the logic that could be deduced behind the customs. There was no really fundamental discovery in The Cathedral and the Bazaar. I did not invent any of the methods it describes.
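(As an aside, the N-squared effect Brooks describes is simple combinatorics: N developers have N*(N-1)/2 potential pairwise communication paths. A minimal sketch, with a function name of my own invention purely for illustration:)

```python
# Toy illustration of the arithmetic behind Brooks's Law: useful work grows
# linearly with team size N, while the number of pairwise communication
# paths -- Brooks's proxy for coordination overhead and interface bugs --
# grows as N*(N-1)/2, i.e. quadratically.
def comm_paths(n: int) -> int:
    """Pairwise communication paths among n developers."""
    return n * (n - 1) // 2

for n in (5, 50, 500):
    print(f"{n} developers -> {comm_paths(n)} communication paths")
```

Five developers share 10 paths; five hundred share 124,750 -- which is why adding programmers multiplies coordination costs far faster than it multiplies output.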
What was novel in that paper was not the facts it described but those metaphors and the narrative -- a simple, powerful story that encouraged the reader to see the facts in a new way. I was attempting a bit of memetic engineering on the hacker culture's generative myths. I first gave the full paper at Linux Kongress, May 1997 in Bavaria. The fact that it was received with rapt attention and thunderous applause by an audience in which there were very few native speakers of English seemed to confirm that I was onto something. But, as it turned out, the sheer chance that I was seated next to publisher Tim O'Reilly at the Thursday night banquet set in motion a more important train of consequences. As a long-time admirer of O'Reilly's institutional style, I had been looking forward to meeting Tim for some years. We had a wide-ranging conversation (much of it exploring our common interest in classic science fiction) which led to an invitation for me to give The Cathedral and the Bazaar at Tim's Perl Conference later in the year. Once again, the paper was well-received -- with cheers and a standing ovation, in fact. I knew from my email that since Bavaria, word about The Cathedral and the Bazaar had spread over the Internet like a fire in dry grass. Many in the audience had already read it, and my speech was less a revelation of novelty for them than an opportunity to celebrate the new language and the consciousness that went with it. That standing ovation was not so much for my work as for the hacker culture itself -- and rightly so. Though I didn't know it, my experiment in memetic engineering was about to light a bigger fire. Some of the people for whom my speech was genuinely novel were from Netscape Communications, Inc. And Netscape was in trouble. Netscape, a pioneering Internet-technology company and Wall Street highflier, had been targeted for destruction by Microsoft. 
Microsoft rightly feared that the open Web standards embodied by Netscape's browser might lead to an erosion of the Redmond giant's lucrative monopoly on the PC desktop. All the weight of Microsoft's billions, and the shady tactics that would later trigger an antitrust lawsuit, were deployed to crush the Netscape browser. For Netscape, the issue was less browser-related income (never more than a small fraction of their revenues) than maintaining a safe space for their much more valuable server business. If Microsoft's Internet Explorer achieved market dominance, Microsoft would be able to bend the Web's protocols away from open standards and into proprietary channels that only Microsoft's servers would be able to service. Within Netscape there was intense debate about how to counter the threat. One of the options proposed early on was to throw the Netscape browser source open -- but it was a hard case to argue without strong reasons to believe that doing so would prevent Internet Explorer dominance. I didn't know it at the time, but The Cathedral and the Bazaar became a major factor in making that case. Through the winter of 1997, as I was working on the material for my next paper, the stage was being set for Netscape to break the rules of the commercial game and offer my tribe an unprecedented opportunity.

4. The Road to Mountain View

On January 22nd 1998 Netscape announced that it would release the sources of the Netscape client line to the Internet. Shortly after the news reached me the following day, I learned that CEO Jim Barksdale was describing my work to national-media reporters as ``fundamental inspiration'' for the decision. This was the event that commentators in the computer trade press would later call ``the shot heard 'round the world'' -- and Barksdale had cast me as its Thomas Paine, whether I wanted the role or not.
For the first time in the history of the hacker culture, a Fortune 500 darling of Wall Street had bet its future on the belief that our way was right. And, more specifically, that my analysis of `our way' was right. This is a pretty sobering kind of shock to deal with. I had not been very surprised when The Cathedral and the Bazaar altered the hacker culture's image of itself; that was the result I had been trying for, after all. But I was astonished (to say the least) by the news of its success on the outside. So I did some very hard thinking in the first few hours after word reached me. About the state of Linux and the hacker community. About Netscape. And about whether I, personally, had what it would take to make the next step. It was not difficult to conclude that helping Netscape's gamble succeed had just become a very high priority for the hacker culture, and thus for me personally. If Netscape's gamble failed, we hackers would probably find all the opprobrium of that failure piled on our heads. We'd be discredited for another decade. And that would be just too much to take. By this time I had been in the hacker culture, living through its various phases, for twenty years. Twenty years of repeatedly watching brilliant ideas, promising starts, and superior technologies crushed by slick marketing. Twenty years of watching hackers dream and sweat and build, too often only to watch the likes of the bad old IBM or the bad new Microsoft walk away with the real-world prizes. Twenty years of living in a ghetto -- a fairly comfortable ghetto full of interesting friends, but still one walled in by a vast and intangible barrier of mainstream prejudice inscribed ``ONLY FLAKES LIVE HERE''. The Netscape announcement cracked that barrier, if only for a moment; the business world had been jolted out of its complacency about what `hackers' are capable of. But lazy mental habits have huge inertia.
If Netscape failed, or perhaps even if they succeeded, the experiment might come to be seen as a unique one-off not worth trying to repeat. And then we'd be back in the same ghetto, walls higher than before. To prevent that, we needed Netscape to succeed. So I considered what I had learned about bazaar-mode development, called up Netscape, and offered to help with developing their license and with working out the details of the strategy. In early February I flew to Mountain View at their request for seven hours of meetings with various groups at Netscape HQ, and helped them develop the outline of what would become the Mozilla Public License and the Mozilla organization. While there, I met with several key people in the Silicon Valley and national Linux community. While helping Netscape was clearly a short-term priority, everybody I spoke with had already understood the need for some longer-term strategy to follow up on the Netscape release. It was time to develop one.

5. The Origins of `Open Source'

It was easy to see the outlines of the strategy. We needed to take the pragmatic arguments I had pioneered in The Cathedral and the Bazaar, develop them further, and push them hard, in public. Because Netscape itself had an interest in convincing investors that its strategy was not crazy, we could count on them to help the promotion. We also recruited Tim O'Reilly (and through him, O'Reilly Associates) very early on. The real conceptual breakthrough, though, was admitting to ourselves that what we needed to mount was in effect a marketing campaign -- and that it would require marketing techniques (spin, image-building, and rebranding) to make it work. Hence the term `open source', which the first participants in what would later become the Open Source campaign (and, eventually, the Open Source Initiative organization) invented at a meeting held in Mountain View at the offices of VA Research (now VA Linux Systems) on 3 Feb 1998.
It seemed clear to us in retrospect that the term `free software' had done our movement tremendous damage over the years. Part of this stemmed from the fact that the word `free' has two different meanings in the English language, one suggesting a price of zero and one related to the idea of liberty. Richard Stallman, whose Free Software Foundation has long championed the term, says ``Think free speech, not free beer'', but the ambiguity of the term has nevertheless created serious problems -- especially since most ``free software'' is also distributed free of charge. Most of the damage, though, came from something worse -- the strong association of the term `free software' with hostility to intellectual property rights, communism, and other ideas hardly likely to endear it to an MIS manager. It was, and still is, beside the point to argue that the Free Software Foundation is not hostile to all intellectual property and that its position is not exactly communistic. We knew that. What we realized, under the pressure of the Netscape release, was that FSF's actual position didn't matter. Only the fact that its evangelism had backfired (associating `free software' with these negative stereotypes in the minds of the trade press and the corporate world) actually mattered. Our success after Netscape would depend on replacing the negative FSF stereotypes with positive stereotypes of our own -- pragmatic tales, sweet to managers' and investors' ears, of higher reliability and lower cost and better features. In conventional marketing terms, our job was to rebrand the product, and build its reputation into one the corporate world would hasten to buy. Linus Torvalds endorsed the idea the day after that first meeting. We began acting on it within a few days. Bruce Perens had the opensource.org domain registered and the first version of the Open Source website up within a week.
He also suggested that the Debian Free Software Guidelines become the `Open Source Definition', and began the process of registering `Open Source' as a certification mark so that we could legally require people to use `Open Source' only for products conforming to the OSD. The particular tactics needed to push the strategy seemed pretty clear to me even at this early stage (and were explicitly discussed at the initial meeting). Key themes:

5.1. Forget bottom-up; work on top-down

One of the things that seemed clearest was that the historical Unix strategy of bottom-up evangelism (relying on engineers to persuade their bosses by rational argument) had been a failure. This was naive and easily trumped by Microsoft. Further, the Netscape breakthrough didn't happen that way. It happened because a strategic decision-maker (Jim Barksdale) got the clue and then imposed that vision on the people below him. The conclusion was inescapable. Instead of working bottom-up, we should be evangelizing top-down -- making a direct effort to capture the CEO/CTO/CIO types.

5.2. Linux is our best demonstration case

Promoting Linux must be our main thrust. Yes, there are other things going on in the open-source world, and the campaign will bow respectfully in their direction -- but Linux started with the best name recognition, the broadest software base, and the largest developer community. If Linux can't consolidate the breakthrough, nothing else will, pragmatically speaking, have a prayer.

5.3. Capture the Fortune 500

There are other market segments that spend more dollars (small-business and home-office being the most obvious examples) but those markets are diffuse and hard to address. The Fortune 500 doesn't merely have lots of money, it concentrates lots of money where it's relatively easy to get at. Therefore, the software industry largely does what the Fortune 500 business market tells it to do. And therefore, it is primarily the Fortune 500 we need to convince.

5.4. Co-opt the prestige media that serve the Fortune 500

The choice to target the Fortune 500 implies that we need to capture the media that shape the climate of opinion among top-level decision-makers and investors: very specifically, the New York Times, the Wall Street Journal, the Economist, Forbes, and Barron's Magazine. On this view, co-opting the technical trade press is necessary but not sufficient; it's important essentially as a pre-condition for storming Wall Street itself through the elite mainstream media.

5.5. Educate hackers in guerrilla marketing tactics

It was also clear that educating the hacker community itself would be just as important as mainstream outreach. It would be insufficient to have one or a handful of ambassadors speaking effective language if, at the grass roots, most hackers were making arguments that didn't work.

5.6. Use the Open Source certification mark to keep things pure

One of the threats we faced was the possibility that the term `open source' would be ``embraced and extended'' by Microsoft or other large vendors, corrupting it and diluting our message. It is for this reason that Bruce Perens and I decided early on to register the term as a certification mark and tie it to the Open Source Definition (a copy of the Debian Free Software Guidelines). This would allow us to scare off potential abusers with the threat of legal action. It eventually developed that the U.S. Patent and Trademark Office would not issue a trademark for such a descriptive phrase. Fortunately, by the time we had to write off the effort to formally trademark ``Open Source'' a year later, the term had acquired its own momentum in the press and elsewhere. The sorts of serious abuse we feared have not (at least, not yet as of July 2000) actually materialized.

6. The Accidental Revolutionary

Planning this kind of strategy was relatively easy. The hard part (for me, anyway) was accepting what my own role had to be.
One thing I understood from the beginning is that the press almost completely tunes out abstractions. They won't write about ideas without larger-than-life personalities fronting them. Everything has to be story, drama, conflict, sound bites. Otherwise most reporters will simply go to sleep -- and even if they don't, their editors will. Accordingly, I knew somebody with very particular characteristics would be needed to front the community's response to the Netscape opportunity. We needed a firebrand, a spin doctor, a propagandist, an ambassador, an evangelist -- somebody who could dance and sing and shout from the housetops and seduce reporters and huggermug with CEOs and bang the media machine until its contrary gears ground out the message: the revolution is here! Unlike most hackers, I have the brain chemistry of an extrovert and had already had extensive experience at dealing with the press. Looking around me, I couldn't see anyone better qualified to play evangelist. But I didn't want the job, because I knew it would cost me my life for many months, maybe for years. My privacy would be destroyed. I'd probably end up both caricatured as a geek by the mainstream press and (worse) despised as a sell-out or glory-hog by a significant fraction of my own tribe. Worse than all the other bad consequences put together, I probably wouldn't have time to hack any more! I had to ask myself: are you fed up enough with watching your tribe lose to do whatever it takes to win? I decided the answer was yes -- and having so decided, threw myself into the dirty but necessary job of becoming a public figure and media personality. I'd learned some basic media chops while editing The New Hacker's Dictionary. This time I took it much more seriously and developed an entire theory of media manipulation which I then proceeded to apply.
The theory centers around the use of what I call ``attractive dissonance'' to fan an itchy curiosity about the evangelist, and then exploiting that itch for all it's worth in promoting the ideas. This is not the place for a detailed exposition of my theory. But intelligent readers can probably deduce much of it from the phrase ``optimal level of provocation'' and the fact that my interview technique involves cheerfully discussing my interests in guns, anarchism and witchcraft while looking as well-groomed, boyishly charming and all-American wholesome as I can possibly manage. The trick is to sound challengingly weird but convey a reassuring aura of honesty and simplicity. (Note that to make the trick work, I think you have to genuinely be like that; faking either quality has a high risk of exposure and I don't recommend it.) The combination of the ``open source'' label and deliberate promotion of myself as an evangelist turned out to have both the good and bad consequences that I expected. The ten months after the Netscape announcement featured a steady exponential increase in media coverage of Linux and the open-source world in general. Throughout this period, approximately a third of these articles quoted me directly; most of the other two thirds used me as a background source. At the same time, a vociferous minority of hackers declared me an evil egotist. I managed to preserve a sense of humor about both outcomes (though occasionally with some difficulty). My plan from the beginning was that, eventually, I would hand off the evangelist role to some successor, either an individual or organization. There would come a time where charisma became less effective than broad-based institutional respectability (and, from my own point of view, the sooner the better!). 
I am attempting to transfer my personal connections and carefully built-up reputation with the press to the Open Source Initiative, an incorporated nonprofit formed specifically to manage the Open Source trademark. At time of writing I am the president of this organization, but hope and expect not to remain so indefinitely.

7. Phases of the Campaign

The open-source campaign began with the Mountain View meeting, and rapidly collected an informal network of allies over the Internet (including key people at Netscape and O'Reilly Associates). Where I write `we' below I'm referring to that network. From 3 February to around the time of the actual Netscape release on 31 March, our primary concern was convincing the hacker community that the `open source' label and the arguments that went with it represented our best shot at persuading the mainstream. As it turned out, the change was rather easier than we expected. We discovered a lot of pent-up demand for a message less doctrinaire than the Free Software Foundation's. Tim O'Reilly invited twenty-odd leaders of major free software projects to what came to be called the Free Software Summit on March 7. When these leaders voted to adopt the term `open source', they formally ratified a trend that was already clear at the grass roots among developers. By six weeks after the Mountain View meeting, a healthy majority of the community was speaking our language. The publicity following the Free Software Summit introduced the mainstream press to the term, and also gave notice that Netscape was not alone in adopting the open-source concept. We'd given a name to a phenomenon whose impact was already larger than anyone outside the Internet community had yet realized. Far from being fringe challengers, open source programs were already market leaders in providing key elements of the Internet infrastructure. Apache was the leading web server, with more than 50% market share (now grown to more than 60%).
Perl was the dominant programming language for the new breed of web-based applications. Sendmail routes more than 80% of all Internet email messages. And even the ubiquitous domain name system (which lets us use names like www.yahoo.com rather than obscure numeric IP addresses) depends almost entirely on an open-source program called BIND. As Tim O'Reilly said during the press conference following the summit, pointing to the assembled programmers and project leaders: ``These people have created products with dominant market share using only the power of their ideas and the networked community of their co-developers.'' What more might be possible if large companies also adopted the open source methodology? That was a good start to our `air war', our attempt to change perceptions through the press. But we still needed to maintain momentum on the ground. In April, after the Summit and the actual Netscape release, our main concern shifted to recruiting as many open-source early adopters as possible. The goal was to make Netscape's move look less singular -- and to buy us insurance in case Netscape executed poorly and fell short of its goals. This was the most worrying time. On the surface, everything seemed to be coming up roses; Linux was moving technically from strength to strength, the wider open-source phenomenon was enjoying a spectacular explosion in trade press coverage, and we were even beginning to get positive coverage in the mainstream press. Nevertheless, I was uneasily aware that our success was still fragile. After an initial flurry of contributions, community participation in Mozilla was badly slowed down by its requirement for the proprietary Motif toolkit. None of the big independent software vendors had yet committed to Linux ports. Netscape was still looking lonely, and its browser still losing market share to Internet Explorer. Any serious reverse could lead to a nasty backlash in the press and public opinion.
Our first serious post-Netscape breakthrough came on 7 May when Corel Computer announced its Linux-based Netwinder network computer. But that wasn't enough in itself; to sustain the momentum, we needed commitments not from hungry second-stringers but from industry leaders. Thus, it was the mid-July announcements by Oracle and Informix that really closed out this vulnerable phase. The database outfits joined the Linux party three months earlier than I expected, but none too soon. We had been wondering how long the positive buzz could last without major ISV support and feeling increasingly nervous about where we'd actually find that. After Oracle and Informix announced Linux ports other ISVs began announcing Linux support almost as a matter of routine, and even a failure of Mozilla became survivable. Mid-July through the beginning of November was a consolidation phase. It was during this time that we started to see fairly steady coverage from the financial media I had originally targeted, led off by articles in The Economist and a cover story in Forbes. Various hardware and software vendors sent out feelers to the open-source community and began to work out strategies for getting advantage from the new model. And internally, the biggest closed-source vendor of them all was beginning to get seriously worried. Just how worried became apparent when the now-infamous Halloween Documents leaked out of Microsoft. These internal strategy documents recognized the power of the open source model, and outlined Microsoft's analysis of how to combat it by corrupting the open protocols on which open source depends and choking off customer choice. The Halloween Documents were dynamite. They were a ringing testimonial to the strengths of open-source development from the company with the most to lose from Linux's success. And they confirmed a lot of peoples' darkest suspicions about the tactics Microsoft would consider in order to stop it. 
The Halloween Documents attracted massive press coverage in the first few weeks of November. They created a new surge of interest in the open-source phenomenon, serendipitously confirming all the points we had been making for months. And they led directly to a request for me to confer with a select group of Merrill Lynch's major investors on the state of the software industry and the prospects for open source. Wall Street, finally, came to us. The following six months were a study in increasingly surreal contrasts. On the one hand, I was getting invited to give talks on open source to Fortune 100 corporate strategists and technology investors; for the first time in my life, I got to fly first class and saw the inside of a stretch limousine. On the other hand, I was doing guerrilla street theater with grass-roots hackers -- as in the riotously funny Windows Refund Day demonstration of March 15 1999, when a band of Bay-area Linux users actually marched on the Microsoft offices in the glare of full media coverage, demanding refunds under the terms of the Microsoft End-User License for the unused Windows software that had been bundled with their machines. I knew I was going to be in town that weekend to speak at a conference hosted by the Reason Foundation, so I volunteered to be a marshal for the event. Back in December I'd been featured in a Star Wars parody plot in the Internet comic strip "User Friendly". So I joked with the organizers about wearing an Obi-Wan Kenobi costume at the demonstration. To my surprise, when I arrived I found the organizers had actually made a passable Jedi costume -- and that's how I found myself leading a parade that featured cheeky placards and an American flag and a rather large plastic penguin, booming out "May the Source be with you!" to delighted reporters. To my further surprise, I was drafted to make our statement to the press. I suppose none of us should have really been astonished when the video made CNBC. 
The demonstration was a tremendous success. Microsoft's PR position, still trying to recover from the exposure of the Halloween Documents, took another body blow. And within weeks, major PC and laptop manufacturers began announcing that they would ship machines with no Windows installed and no `Microsoft tax' in the price. Our bit of guerrilla theater, it appeared, had struck home.

8. The Facts on the Ground

While the Open Source campaign's air war in the media was going on, key technical and market facts on the ground were also changing. I'll review some of them briefly here because they combine interestingly with the trends in press and public perception. In the eighteen months after the Netscape release, Linux continued to grow rapidly more capable. The development of solid SMP support and the effective completion of the 64-bit cleanup laid important groundwork for the future. The roomful of Linux boxes used to render scenes for the Titanic threw a healthy scare into builders of expensive graphics engines. Then the Beowulf supercomputer-on-the-cheap project showed that Linux's Chinese-army sociology could be successfully applied even to bleeding-edge scientific computing. Nothing dramatic happened to vault Linux's open-source competitors into the limelight. And proprietary Unixes continued to lose market share; in fact, by mid-year only NT and Linux were actually gaining market share in the Fortune 500, and by late fall Linux was gaining faster (and more at the expense of NT than of other Unixes). Apache continued to increase its lead in the Web-server market. (By August 1999 Apache and its derivatives would be running fully 61% of the world's publicly-accessible Web servers.) In November 1998, Netscape's browser reversed its market-share slide and began to make gains against Internet Explorer.
In April 1999 the respected computer-market researchers IDG predicted that Linux would grow twice as fast as all other server operating systems combined through 2003 -- and faster than Windows NT. In May, Kleiner-Perkins (Silicon Valley's leading venture-capital firm) took a lead position in financing a Linux startup. About the only negative development was the continuing problems of the Mozilla project. I have analyzed these elsewhere (in The Magic Cauldron). They came to a head when Jamie Zawinski, a Mozilla co-founder and the public face of the project, resigned a year and a day after the release of the source code, complaining of mismanagement and lost opportunities. But it was an indication of the tremendous momentum open source had acquired by this time that Mozilla's troubles did not noticeably slow down the pace of adoption. The trade press, remarkably, drew the right lesson: "Open source," in Jamie's now-famous words, "is [great, but it's] not magic pixie dust." In the early part of 1999 a trend began among big independent software vendors (ISVs) to port their business applications to Linux, following the lead set earlier by the major database vendors. In late July, the biggest of them all, Computer Associates, announced that it would be supporting Linux over much of its product line. And preliminary results from an August 1999 survey of 2000 IT managers revealed that 49% consider Linux an "important or essential" element of their enterprise computing strategies. Another survey by IDC described what it called ``an amazing level of growth'' since 1998, when the market research couldn't find statistically significant use of Linux; 13% of the respondents now employ it in business operations. 1999 also saw a wave of wildly successful Linux IPOs by Red Hat Linux, VA Linux Systems, and other Linux companies.
While the overblown dot.com-like initial valuations investors originally put on them didn't outlast the big market corrections in March 2000, these firms established an unmistakable for-profit industry around open source that continues to be a focus of investor interest. -------- 9. Into the Future I have rehearsed recent history here only partly to get it into the record. More importantly, it sets a background against which we can understand near-term trends and project some things about the future. First, safe predictions for the next year: The open-source developer population will continue to explode, a growth fueled by ever-cheaper PC hardware and fast Internet connections. Linux will continue to lead the way, the sheer size of its developer community overpowering the higher average skill of the open-source BSD people and the tiny HURD crew. ISV commitments to support the Linux platform will increase dramatically; the database-vendor commitments were a turning point. Corel's commitment to ship their entire office suite on Linux points the way. The Open Source campaign will continue to build on its victories and successfully raise awareness at the CEO/CTO/CIO and investor level. MIS directors will feel increasing pressure to go with open-source products not from below but from above. Stealth deployments of Samba-over-Linux will replace increasing numbers of NT machines even at shops that have all-Microsoft policies. The market share of proprietary Unixes will continue to gradually erode. At least one of the weaker competitors (likely DG-UX or HP-UX) will actually fold. But by the time it happens, analysts will attribute it to Linux's gains rather than Microsoft's. Microsoft will not have an enterprise-ready operating system, because Windows 2000 will not ship in a usable form. (At 60 million lines of code and still bloating, its development is out of control.) I wrote the above predictions in mid-December of 1998.
All are still holding good as of July 2000, eighteen months after they were written. Only the last one is arguable; Microsoft managed to ship Windows 2000 by drastically curtailing its feature list; adoption rates have not been what they hoped. Extrapolating these trends certainly suggests some slightly riskier predictions for the medium term (eighteen to thirty-two months out). Support operations for commercial customers of open-source operating systems will become big business, both feeding off of and fueling the boom in business use. (This has already come true in 1999 with the launch of LinuxCare, and Linux support-service announcements by IBM and HP and others.) Open-source operating systems (with Linux leading the way) will capture the ISP and business data-center markets. NT will be unable to resist this change effectively; the combination of low cost, open sources, and true 24/7 reliability will prove unstoppable. The proprietary-Unix sector will almost completely collapse. Solaris looks like a safe bet to survive on high-end Sun hardware, but most other players' proprietary Unixes will quickly become legacy systems. (Eight months later we saw the first casualty: SGI's IRIX, and in mid-2000 SCO agreed to be acquired by Caldera. It now looks probable that a number of Unix hardware vendors will switch horses to Linux without much fuss, as SGI is already well into the process of doing.) Windows 2000 will be either canceled or dead on arrival. Either way it will turn into a horrendous train wreck, the worst strategic disaster in Microsoft's history. However, their marketing spin on this failure will be so deft that it will barely affect their hold on the consumer desktop within the next two years. (Eight months later, a just-published IDG survey suggests that ``dead on arrival'' looks more likely all the time, with most large corporate respondents simply refusing to deploy the initial release.)
At first glance, these trends look like a recipe for leaving Linux as the last one standing. But life is not that simple, and Microsoft derives such immense amounts of money and market clout from the desktop market that it can't safely be counted out even after the Windows 2000 train wreck. So at two years out the crystal ball gets a bit cloudy. Which of several futures we get depends on questions like: will the DOJ actually succeed in breaking up Microsoft? Might BeOS or OS/2 or Mac OS/X or some other niche closed-source OS, or some completely new design, find a way to go open and compete effectively with Linux's 30-year-old base design? At least Y2K fizzled... These are all fairly imponderable. But there is one such question that is worth pondering: Will the Linux community actually deliver a good end-user-friendly GUI interface for the whole system? I think the most likely scenario for late 2000/early 2001 has Linux in effective control of servers, data centers, ISPs, and the Internet, while Microsoft maintains its grip on the desktop. Where things go from there depends on whether GNOME, KDE, or some other Linux-based GUI (and the applications built or rebuilt to use it) ever get good enough to challenge Microsoft on its home ground. If this were primarily a technical problem, the outcome would hardly be in doubt. But it isn't; it's a problem in ergonomic design and interface psychology, and hackers have historically been poor at these things. That is, while hackers can be very good at designing interfaces for other hackers, they tend to be poor at modeling the thought processes of the other 95% of the population well enough to write interfaces that J. Random End-User and his Aunt Tillie will pay to buy. Applications were this year's problem; it's now clear we'll swing enough ISVs to get the ones we don't write ourselves. I believe the problem for the next two years is whether we can grow enough to meet (and exceed!)
the interface-design quality standard set by the Macintosh, combining that with the virtues of the traditional Unix way. A year after I first wrote that last paragraph, help may be on the way from the inventors of the Macintosh! Andy Hertzfeld and other members of the original Macintosh design team have formed an open-source company called Eazel with the explicit goal of bringing the Macintosh magic to Linux. We half-joke about `world domination', but the only way we will get there is by serving the world. That means J. Random End-User and his Aunt Tillie; and that means learning how to think about what we do in a fundamentally new way, and ruthlessly reducing the user-visible complexity of the default environment to an absolute minimum. Computers are tools for human beings. Ultimately, therefore, the challenges of designing hardware and software must come back to designing for human beings -- all human beings. This path will be long, and it won't be easy. But I think the hacker community, in alliance with its new friends in the corporate world, will prove up to the task. And, as Obi-Wan Kenobi might say, ``the Source will be with us''. ---------------------------- 1. An Introductory Contradiction Anyone who watches the busy, tremendously productive world of Internet open-source software for a while is bound to notice an interesting contradiction between what open-source hackers say they believe and the way they actually behave -- between the official ideology of the open-source culture and its actual practice. Cultures are adaptive machines. The open-source culture is a response to an identifiable set of drives and pressures. As usual, the culture's adaptation to its circumstances manifests both as conscious ideology and as implicit, unconscious or semi-conscious knowledge. And, as is not uncommon, the unconscious adaptations are partly at odds with the conscious ideology.
In this paper, I will dig around the roots of that contradiction, and use it to discover those drives and pressures. We will deduce some interesting things about the hacker culture and its customs. We will conclude by suggesting ways in which the culture's implicit knowledge can be leveraged better. ---- 2. The Varieties of Hacker Ideology The ideology of the Internet open-source culture (what hackers say they believe) is a fairly complex topic in itself. All members agree that open source (that is, software that is freely redistributable and can readily be evolved and modified to fit changing needs) is a good thing and worthy of significant and collective effort. This agreement effectively defines membership in the culture. However, the reasons individuals and various subcultures give for this belief vary considerably. One degree of variation is zealotry; whether open source development is regarded merely as a convenient means to an end (good tools and fun toys and an interesting game to play) or as an end in itself. A person of great zeal might say ``Free software is my life! I exist to create useful, beautiful programs and information resources, and then give them away.'' A person of moderate zeal might say ``Open source is a good thing, which I am willing to spend significant time helping happen''. A person of little zeal might say ``Yes, open source is OK sometimes. I play with it and respect people who build it''. Another degree of variation is in hostility to commercial software and/or the companies perceived to dominate the commercial software market. A very anticommercial person might say ``Commercial software is theft and hoarding. 
I write free software to end this evil.'' A moderately anticommercial person might say ``Commercial software in general is OK because programmers deserve to get paid, but companies that coast on shoddy products and throw their weight around are evil.'' An un-anticommercial person might say ``Commercial software is OK, I just use and/or write open-source software because I like it better''. (Nowadays, given the growth of the open-source part of the industry since the first public version of this paper, one might also hear ``Commercial software is fine, as long as I get the source or it does what I want it to do.'') All nine of the attitudes implied by the cross-product of the above categories are represented in the open-source culture. The reason it is worthwhile to point out the distinctions is that they imply different agendas, and different adaptive and cooperative behaviors. Historically, the most visible and best-organized part of the hacker culture has been both very zealous and very anticommercial. The Free Software Foundation founded by Richard M. Stallman (RMS) supported a great deal of open-source development from the early 1980s on, including tools like Emacs and GCC which are still basic to the Internet open-source world, and seem likely to remain so for the foreseeable future. For many years the FSF was the single most important focus of open-source hacking, producing a huge number of tools still critical to the culture. The FSF was also long the only sponsor of open source with an institutional identity visible to outside observers of the hacker culture. They effectively defined the term `free software', deliberately giving it a confrontational weight (which the newer label `open source' just as deliberately avoids). Thus, perceptions of the hacker culture from both within and outside it tended to identify the culture with the FSF's zealous attitude and perceived anticommercial aims.
RMS himself denies he is anticommercial, but his program has been so read by most people, including many of his most vocal partisans. The FSF's vigorous and explicit drive to ``Stamp Out Software Hoarding!'' became the closest thing to a hacker ideology, and RMS the closest thing to a leader of the hacker culture. The FSF's license, the ``General Public License'' (GPL), expresses the FSF's attitudes. It is very widely used in the open-source world. North Carolina's Metalab (formerly Sunsite) is the largest and most popular software archive in the Linux world. In July 1997 about half the Sunsite software packages with explicit license terms used the GPL. But the FSF was never the only game in town. There was always a quieter, less confrontational and more market-friendly strain in the hacker culture. The pragmatists were loyal not so much to an ideology as to a group of engineering traditions founded on early open-source efforts which predated the FSF. These traditions included, most importantly, the intertwined technical cultures of Unix and the pre-commercial Internet. The typical pragmatist attitude is only moderately anticommercial, and its major grievance against the corporate world is not `hoarding' per se. Rather it is that world's perverse refusal to adopt superior approaches incorporating Unix and open standards and open-source software. If the pragmatist hates anything, it is less likely to be `hoarders' in general than the current King Log of the software establishment; formerly IBM, now Microsoft. To pragmatists, the GPL is important as a tool rather than an end in itself. Its main value is not as a weapon against `hoarding', but as a tool for encouraging software sharing and the growth of bazaar-mode development communities. The pragmatist values having good tools and toys more than he dislikes commercialism, and may use high-quality commercial software without ideological discomfort.
At the same time, his open-source experience has taught him standards of technical quality that very little closed software can meet. For many years, the pragmatist point of view expressed itself within the hacker culture mainly as a stubborn current of refusal to completely buy into the GPL in particular or the FSF's agenda in general. Through the 1980s and early 1990s, this attitude tended to be associated with fans of Berkeley Unix, users of the BSD license, and the early efforts to build open-source Unixes from the BSD source base. These efforts, however, failed to build bazaar communities of significant size, and became seriously fragmented and ineffective. Not until the Linux explosion of early 1993-1994 did pragmatism find a real power base. Although Linus Torvalds never made a point of opposing RMS, he set an example by looking benignly on the growth of a commercial Linux industry, by publicly endorsing the use of high-quality commercial software for specific tasks, and by gently deriding the more purist and fanatical elements in the culture. A side effect of the rapid growth of Linux was the induction of a large number of new hackers for whom Linux was their primary loyalty and the FSF's agenda primarily of historical interest. Though the newer wave of Linux hackers might describe the system as ``the choice of a GNU generation'', most tended to emulate Torvalds more than Stallman. Increasingly it was the anticommercial purists who found themselves in a minority. How much things had changed would not become apparent until the Netscape announcement in February 1998 that it would distribute Navigator 5.0 in source. This excited more interest in `free software' within the corporate world. The subsequent call to the hacker culture to exploit this unprecedented opportunity and to re-label its product from `free software' to `open source' was met with a level of instant approval that surprised everybody involved.
In a reinforcing development, the pragmatist part of the culture was itself becoming polycentric by the mid-1990s. Other semi-independent communities with their own self-consciousness and charismatic leaders began to bud from the Unix/Internet root stock. Of these, the most important after Linux was the Perl culture under Larry Wall. Smaller, but still significant, were the traditions building up around John Ousterhout's Tcl and Guido van Rossum's Python languages. All three of these communities expressed their ideological independence by devising their own, non-GPL licensing schemes. ------------- 3. Promiscuous Theory, Puritan Practice Through all these changes, nevertheless, there remained a broad consensus theory of what `free software' or `open source' is. The clearest expression of this common theory can be found in the various open-source licenses, all of which have crucial common elements. In 1997 these common elements were distilled into the Debian Free Software Guidelines, which became the Open Source Definition. Under the guidelines defined by the OSD, an open-source license must protect an unconditional right of any party to modify (and redistribute modified versions of) open-source software. Thus, the implicit theory of the OSD (and OSD-conformant licenses such as the GPL, the BSD license, and Perl's Artistic License) is that anyone can hack anything. Nothing prevents half a dozen different people from taking any given open-source product (such as, say, the Free Software Foundation's gcc C compiler), duplicating the sources, running off with them in different evolutionary directions, but all claiming to be the product. In practice, however, such `forking' almost never happens. Splits in major projects have been rare, and always accompanied by re-labeling and a large volume of public self-justification.
It is clear that, in such cases as the GNU Emacs/XEmacs split, or the gcc/egcs split, or the various fissionings of the BSD splinter groups, the splitters felt they were going against a fairly powerful community norm [SP]. In fact (and in contradiction to the anyone-can-hack-anything consensus theory) the open-source culture has an elaborate but largely unadmitted set of ownership customs. These customs regulate who can modify software, the circumstances under which it can be modified, and (especially) who has the right to redistribute modified versions back to the community. The taboos of a culture throw its norms into sharp relief. Therefore, it will be useful later on if we summarize some important ones here. There is strong social pressure against forking projects. It does not happen except under plea of dire necessity, with much public self-justification, and with a renaming. Distributing changes to a project without the cooperation of the moderators is frowned upon, except in special cases like essentially trivial porting fixes. Removing a person's name from a project history, credits or maintainer list is absolutely not done without the person's explicit consent. In the remainder of this paper, we shall examine these taboos and ownership customs in detail. We shall inquire not only into how they function but what they reveal about the underlying social dynamics and incentive structures of the open-source community. -------------- 4. Ownership and Open Source What does `ownership' mean when property is infinitely reduplicable, highly malleable, and the surrounding culture has neither coercive power relationships nor material scarcity economics? Actually, in the case of the open-source culture this is an easy question to answer. The owner(s) of a software project are those who have the exclusive right, recognized by the community at large, to re-distribute modified versions.
(In discussing `ownership' in this section I will use the singular, as though all projects are owned by some one person. It should be understood, however, that projects may be owned by groups. We shall examine the internal dynamics of such groups later in this paper.) According to the standard open-source licenses, all parties are equals in the evolutionary game. But in practice there is a very well-recognized distinction between `official' patches, approved and integrated into the evolving software by the publicly recognized maintainers, and `rogue' patches by third parties. Rogue patches are unusual, and generally not trusted [RP]. That public redistribution is the fundamental issue is easy to establish. Custom encourages people to patch software for personal use when necessary. Custom is indifferent to people who redistribute modified versions within a closed user or development group. It is only when modifications are posted to the open-source community in general, to compete with the original, that ownership becomes an issue. There are, in general, three ways to acquire ownership of an open-source project. One, the most obvious, is to found the project. When a project has had only one maintainer since its inception and the maintainer is still active, custom does not even permit a question as to who owns the project. The second way is to have ownership of the project handed to you by the previous owner (this is sometimes known as `passing the baton'). It is well understood in the community that project owners have a duty to pass projects to competent successors when they are no longer willing or able to invest needed time in development or maintenance work. It is significant that in the case of major projects, such transfers of control are generally announced with some fanfare. 
While it is unheard of for the open-source community at large to actually interfere in the owner's choice of succession, customary practice clearly incorporates a premise that public legitimacy is important. For minor projects, it is generally sufficient for a change history included with the project distribution to note the change of ownership. The clear presumption is that if the former owner has not in fact voluntarily transferred control, he or she may reassert control with community backing by objecting publicly within a reasonable period of time. The third way to acquire ownership of a project is to observe that it needs work and the owner has disappeared or lost interest. If you want to do this, it is your responsibility to make the effort to find the owner. If you don't succeed, then you may announce in a relevant place (such as a Usenet newsgroup dedicated to the application area) that the project appears to be orphaned, and that you are considering taking responsibility for it. Custom demands that you allow some time to pass before following up with an announcement that you have declared yourself the new owner. In this interval, if someone else announces that they have been actually working on the project, their claim trumps yours. It is considered good form to give public notice of your intentions more than once. More points for good form if you announce in many relevant forums (related newsgroups, mailing lists); and still more if you show patience in waiting for replies. In general, the more visible effort you make to allow the previous owner or other claimants to respond, the better your claim if no response is forthcoming. If you have gone through this process in sight of the project's user community, and there are no objections, then you may claim ownership of the orphaned project and so note in its history file. 
This, however, is less secure than being passed the baton, and you cannot expect to be considered fully legitimate until you have made substantial improvements in the sight of the user community. I have observed these customs in action for twenty years, going back to the pre-FSF ancient history of open-source software. They have several very interesting features. One of the most interesting is that most hackers have followed them without being fully aware of doing so. Indeed, the above may be the first conscious and reasonably complete summary ever to have been written down. Another is that, for unconscious customs, they have been followed with remarkable (even astonishing) consistency. I have observed the evolution of literally hundreds of open-source projects, and I can still count the number of significant violations I have observed or heard about on my fingers. Yet a third interesting feature is that as these customs have evolved over time, they have done so in a consistent direction. That direction has been to encourage more public accountability, more public notice, and more care about preserving the credits and change histories of projects in ways which (among other things) establish the legitimacy of the present owners. These features suggest that the customs are not accidental, but are products of some kind of implicit agenda or generative pattern in the open-source culture that is utterly fundamental to the way it operates. An early respondent pointed out that contrasting the Internet hacker culture with the cracker/pirate culture (the ``warez d00dz'' centered around game-cracking and pirate bulletin-board systems) illuminates the generative patterns of both rather well. We'll return to the d00dz for contrast later in the paper. ------------- 5. Locke and Land Title To understand this generative pattern, it helps to notice a historical analogy for these customs that is far outside the domain of hackers' usual concerns. 
As students of legal history and political philosophy may recognize, the theory of property they imply is virtually identical to the Anglo-American common-law theory of land tenure! In this theory, there are three ways to acquire ownership of land. On a frontier, where land exists that has never had an owner, one can acquire ownership by homesteading, mixing one's labor with the unowned land, fencing it, and defending one's title. The usual means of transfer in settled areas is transfer of title -- that is, receiving the deed from the previous owner. In this theory, the concept of `chain of title' is important. The ideal proof of ownership is a chain of deeds and transfers extending back to when the land was originally homesteaded. Finally, the common-law theory recognizes that land title may be lost or abandoned (for example, if the owner dies without heirs, or the records needed to establish chain of title to vacant land are gone). A piece of land that has become derelict in this way may be claimed by adverse possession -- one moves in, improves it, and defends title as if homesteading. This theory, like hacker customs, evolved organically in a context where central authority was weak or nonexistent.  It developed over a period of a thousand years from Norse and Germanic tribal law. Because it was systematized and rationalized in the early modern era by the English political philosopher John Locke, it is sometimes referred to as the `Lockean' theory of property. Logically similar theories have tended to evolve wherever property has high economic or survival value and no single authority is powerful enough to force central allocation of scarce goods. This is true even in the hunter-gatherer cultures that are sometimes romantically thought to have no concept of `property'. For example, in the traditions of the !Kung San bushmen of the Kgalagadi (formerly `Kalahari') Desert, there is no ownership of hunting grounds. 
But there is ownership of water-holes and springs under a theory recognizably akin to Locke's. The !Kung San example is instructive, because it shows that Lockean property customs arise only where the expected return from the resource exceeds the expected cost of defending it. Hunting grounds are not property because the return from hunting is highly unpredictable and variable, and (although highly prized) not a necessity for day-to-day survival. Waterholes, on the other hand, are vital to survival and small enough to defend. The `noosphere' of this essay's title is the territory of ideas, the space of all possible thoughts [N]. What we see implied in hacker ownership customs is a Lockean theory of property rights in one subset of the noosphere, the space of all programs. Hence `homesteading the noosphere', which is what every founder of a new open-source project does. Faré Rideau correctly points out that hackers do not exactly operate in the territory of pure ideas. He asserts that what hackers own is programming projects -- intensional focus points of material labor (development, service, etc), to which are associated things like reputation, trustworthiness, etc. He therefore asserts that the space spanned by hacker projects is not the noosphere but a sort of dual of it, the space of noosphere-exploring program projects. (With an apologetic nod to the astrophysicists out there, it would be etymologically correct to call this dual space the `ergosphere' or `sphere of work'.) In practice, the distinction between noosphere and ergosphere is not important for the purposes of this paper. It is dubious whether the `noosphere' in the pure sense Faré insists on can be said to exist in any meaningful way; one would almost have to be a Platonist philosopher to believe in it.
And the distinction between noosphere and ergosphere is only of practical importance if one wishes to assert that ideas (the elements of the noosphere) cannot be owned, but their instantiations as projects can. This question leads to issues in the theory of intellectual property which are beyond the scope of this paper (but see [DF]). To avoid confusion, however, it is important to note that neither the noosphere nor the ergosphere is the same as the totality of virtual locations in electronic media that is sometimes (to the disgust of most hackers) called `cyberspace'. Property there is regulated by completely different rules that are closer to those of the material substratum -- essentially, he who owns the media and machines on which a part of `cyberspace' is hosted owns that piece of cyberspace as a result. The Lockean logic of custom suggests strongly that open-source hackers observe the customs they do in order to defend some kind of expected return from their effort. The return must be more significant than the effort of homesteading projects, the cost of maintaining version histories that document `chain of title', and the time cost of doing public notifications and a waiting period before taking adverse possession of an orphaned project. Furthermore, the `yield' from open source must be something more than simply the use of the software, something else that would be compromised or diluted by forking. If use were the only issue, there would be no taboo against forking, and open-source ownership would not resemble land tenure at all. In fact, this alternate world (where use is the only yield, and forking is unproblematic) is the one implied by existing open-source licenses. We can eliminate some candidate kinds of yield right away. Because you can't coerce effectively over a network connection, seeking power is right out. 
Likewise, the open-source culture doesn't have anything much resembling money or an internal scarcity economy, so hackers cannot be pursuing anything very closely analogous to material wealth (e.g. the accumulation of scarcity tokens). There is one way that open-source activity can help people become wealthier, however -- a way that provides a valuable clue to what actually motivates it. Occasionally, the reputation one gains in the hacker culture can spill over into the real world in economically significant ways. It can get you a better job offer, or a consulting contract, or a book deal. This kind of side effect, however, is at best rare and marginal for most hackers; far too much so to make it convincing as a sole explanation, even if we ignore the repeated protestations by hackers that they're doing what they do not for money but out of idealism or love. However, the way such economic side-effects are mediated is worth examination. Below we'll see that an understanding of the dynamics of reputation within the open-source culture itself has considerable explanatory power. --------------- 7. The Joy of Hacking In making this `reputation game' analysis, by the way, I do not mean to devalue or ignore the pure artistic satisfaction of designing beautiful software and making it work. We all experience this kind of satisfaction and thrive on it. People for whom it is not a significant motivation never become hackers in the first place, just as people who don't love music never become composers. So perhaps we should consider another model of hacker behavior in which the pure joy of craftsmanship is the primary motivation. This `craftsmanship' model would have to explain hacker custom as a way of maximizing both the opportunities for craftsmanship and the quality of the results. Does this conflict with or suggest different results than the `reputation game' model? Not really. 
In examining the `craftsmanship' model, we come back to the same problems that constrain hackerdom to operate like a gift culture. How can one maximize quality if there is no metric for quality? If scarcity economics doesn't operate, what metrics are available besides peer evaluation? It appears that any craftsmanship culture ultimately must structure itself through a reputation game -- and, in fact, we can observe exactly this dynamic in many historical craftsmanship cultures from the medieval guilds onwards. In one important respect, the `craftsmanship' model is weaker than the `gift culture' model; by itself, it doesn't help explain the contradiction we began this paper with. Finally, the `craftsmanship' motivation itself may not be psychologically as far removed from the reputation game as we might like to assume. Imagine your beautiful program locked up in a drawer and never used again. Now imagine it being used effectively and with pleasure by many people. Which dream gives you satisfaction? Nevertheless, we'll keep an eye on the craftsmanship model. It is intuitively appealing to many hackers, and explains some aspects of individual behavior well enough [HT]. After I published the first version of this paper on the Internet, an anonymous respondent commented: ``You may not work to get reputation, but the reputation is a real payment with consequences if you do the job well.'' This is a subtle and important point. The reputation incentives continue to operate whether or not a craftsman is aware of them; thus, ultimately, whether or not a hacker understands his own behavior as part of the reputation game, his behavior will be shaped by that game. Other respondents related peer-esteem rewards and the joy of hacking to the levels above subsistence needs in Abraham Maslow's well-known `hierarchy of values' model of human motivation [MH]. 
On this view, the joy of hacking is a self-actualization or transcendence need which will not be consistently expressed until lower-level needs (including those for physical security and for `belongingness' or peer esteem) have been at least minimally satisfied. Thus, the reputation game may be critical in providing a social context within which the joy of hacking can in fact become the individual's primary motive. --------- 8. The Many Faces of Reputation There are reasons general to every gift culture why peer repute (prestige) is worth playing for: First and most obviously, good reputation among one's peers is a primary reward. We're wired to experience it that way for evolutionary reasons touched on earlier. (Many people learn to redirect their drive for prestige into various sublimations that have no obvious connection to a visible peer group, such as ``honor'', ``ethical integrity'', ``piety'' etc.; this does not change the underlying mechanism.) Secondly, prestige is a good way (and in a pure gift economy, the only way) to attract attention and cooperation from others. If one is well known for generosity, intelligence, fair dealing, leadership ability, or other good qualities, it becomes much easier to persuade other people that they will gain by association with you. Thirdly, if your gift economy is in contact with or intertwined with an exchange economy or a command hierarchy, your reputation may spill over and earn you higher status there. Beyond these general reasons, the peculiar conditions of the hacker culture make prestige even more valuable than it would be in a `real world' gift culture. The main `peculiar condition' is that the artifacts one gives away (which are, interpreted another way, the visible sign of one's gift of energy and time) are very complex. Their value is nowhere near as obvious as that of material gifts or exchange-economy money. It is much harder to objectively distinguish a fine gift from a poor one. Accordingly, the success
of a giver's bid for status is delicately dependent on the critical judgement of peers. Another peculiarity is the relative purity of the open-source culture. Most gift cultures are compromised -- either by exchange-economy relationships such as trade in luxury goods, or by command-economy relationships such as family or clan groupings. No significant analogues of these exist in the open-source culture; thus, ways of gaining status other than by peer repute are virtually absent. ------------- 9. Ownership Rights and Reputation Incentives We are now in a position to pull together the previous analyses into a coherent account of hacker ownership customs. We understand the yield from homesteading the noosphere now; it is peer repute in the gift culture of hackers, with all the secondary gains and side-effects that implies. From this understanding, we can analyze the Lockean property customs of hackerdom as a means of maximizing reputation incentives; of ensuring that peer credit goes where it is due and does not go where it is not due. The three taboos we observed above make perfect sense under this analysis. One's reputation can suffer unfairly if someone else misappropriates or mangles one's work; these taboos (and related customs) attempt to prevent this from happening. (Or, to put it more pragmatically, hackers generally refrain from forking or rogue-patching others' projects in order to be able to deny legitimacy to the same behavior practiced against themselves.) Forking projects is bad because it exposes pre-fork contributors to a reputation risk they can only control by being active in both child projects simultaneously after the fork. (This would generally be too confusing or difficult to be practical.) Distributing rogue patches (or, much worse, rogue binaries) exposes the owners to an unfair reputation risk. Even if the official code is perfect, the owners will catch flak from bugs in the patches (but see [RP]). 
Surreptitiously filing someone's name off a project is, in cultural context, one of the ultimate crimes. It steals the victim's gift to be presented as the thief's own. Of course, forking a project or distributing rogue patches for it also directly attacks the reputation of the original developer's group. If I fork or rogue-patch your project, I am saying: ``you made a wrong decision [by failing to take the project where I am taking it]''; and anyone who uses my forked variation is endorsing this challenge. But this in itself would be a fair challenge, albeit extreme; it's the sharpest end of peer review. It's therefore not sufficient in itself to account for the taboos, though it doubtless contributes force to them. All three of these taboo behaviors inflict global harm on the open-source community as well as local harm on the victim(s). Implicitly they damage the entire community by decreasing each potential contributor's perceived likelihood that gift/productive behavior will be rewarded. It's important to note that there are alternate candidate explanations for two of these three taboos. First, hackers often explain their antipathy to forking projects by bemoaning the wasteful duplication of work it would imply as the child products evolved in more-or-less parallel into the future. They may also observe that forking tends to split the co-developer community, leaving both child projects with fewer brains to work with than the parent. A respondent has pointed out that it is unusual for more than one offspring of a fork to survive with significant `market share' into the long term. This strengthens the incentives for all parties to cooperate and avoid forking, because it's hard to know in advance who will be on the losing side and see a lot of their work either disappear entirely or languish in obscurity. 
Dislike of rogue patches is often explained by observing that they can complicate bug-tracking enormously, and inflict work on maintainers who have quite enough to do catching their own mistakes. There is considerable truth to these explanations, and they certainly do their bit to reinforce the Lockean logic of ownership. But while intellectually attractive, they fail to explain why so much emotion and territoriality gets displayed on the infrequent occasions that the taboos get bent or broken -- not just by the injured parties, but by bystanders and observers who often react quite harshly. Cold-blooded concerns about duplication of work and maintenance hassles simply do not sufficiently explain the observed behavior. Then, too, there is the third taboo. It's hard to see how anything but the reputation-game analysis can explain this. The fact that this taboo is seldom analyzed much more deeply than ``It wouldn't be fair'' is revealing in its own way, as we shall see in the next section. ----------- 10. The Problem of Ego At the beginning of the paper I mentioned that the unconscious adaptive knowledge of a culture is often at odds with its conscious ideology. We've seen one major example of this already in the fact that Lockean ownership customs have been widely followed despite the fact that they violate the stated intent of the standard licenses. I have observed another interesting example of this phenomenon when discussing the reputation-game analysis with hackers. This is that many hackers resisted the analysis and showed a strong reluctance to admit that their behavior was motivated by a desire for peer repute or, as I incautiously labeled it at the time, `ego satisfaction'. This illustrates an interesting point about the hacker culture. It consciously distrusts and despises egotism and ego-based motivations; self-promotion tends to be mercilessly criticized, even when the community might appear to have something to gain from it. 
So much so, in fact, that the culture's `big men' and tribal elders are required to talk softly and humorously deprecate themselves at every turn in order to maintain their status. How this attitude meshes with an incentive structure that apparently runs almost entirely on ego cries out for explanation.  A large part of it, certainly, stems from the generally negative Europo-American attitude towards `ego'. The cultural matrix of most hackers teaches them that desiring ego satisfaction is a bad (or at least immature) motivation; that ego is at best an eccentricity tolerable only in prima-donnas and often an actual sign of mental pathology. Only sublimated and disguised forms like ``peer repute'', ``self-esteem'', ``professionalism'' or ``pride of accomplishment'' are generally acceptable. I could write an entire other essay on the unhealthy roots of this part of our cultural inheritance, and the astonishing amount of self-deceptive harm we do by believing (against all the evidence of psychology and behavior) that we ever have truly `selfless' motives. Perhaps I would, if Friedrich Wilhelm Nietzsche and Ayn Rand had not already done an entirely competent job (whatever their other failings) of deconstructing `altruism' into unacknowledged kinds of self-interest. But I am not doing moral philosophy or psychology here, so I will simply observe one minor kind of harm done by the belief that ego is evil, which is this: it has made it emotionally difficult for many hackers to consciously understand the social dynamics of their own culture! But we are not quite done with this line of investigation. The surrounding culture's taboo against visibly ego-driven behavior is so much intensified in the hacker (sub)culture that one must suspect it of having some sort of special adaptive function for hackers. Certainly the taboo is weaker (or nonexistent) among many other gift cultures, such as the peer cultures of theater people or the very wealthy! ------- 11. 
The Value of Humility Having established that prestige is central to the hacker culture's reward mechanisms, we now need to understand why it has seemed so important that this fact remain semi-covert and largely unadmitted. The contrast with the cracker culture is instructive. In that culture, status-seeking behavior is overt and even blatant. These crackers seek acclaim for releasing ``zero-day warez'' (cracked software redistributed on the day of the original uncracked version's release) but are closemouthed about how they do it. These magicians don't like to give away their tricks. And, as a result, the knowledge base of the cracker culture as a whole increases only slowly. In the hacker community, by contrast, one's work is one's statement. There's a very strict meritocracy (the best craftsmanship wins) and there's a strong ethos that quality should (indeed must) be left to speak for itself. The best brag is code that ``just works'', and that any competent programmer can see is good stuff. Thus, the hacker culture's knowledge base increases rapidly. The taboo against ego-driven posturing therefore increases productivity. But that's a second-order effect; what is being directly protected here is the quality of the information in the community's peer-evaluation system. That is, boasting or self-importance is suppressed because it behaves like noise tending to corrupt the vital signals from experiments in creative and cooperative behavior. For very similar reasons, attacking the author rather than the code is not done. There is an interesting subtlety here that reinforces the point; hackers feel very free to flame each other over ideological and personal differences, but it is unheard of for any hacker to publicly attack another's competence at technical work (even private criticism is unusual and tends to be muted in tone). Bug-hunting and criticism are always project-labeled, not person-labeled. 
Furthermore, past bugs are not automatically held against a developer; the fact that a bug has been fixed is generally considered more important than the fact that one used to be there. As one respondent observed, one can gain status by fixing `Emacs bugs', but not by fixing `Richard Stallman's bugs' -- and it would be considered extremely bad form to criticize Stallman for old Emacs bugs that have since been fixed. This makes an interesting contrast with many parts of academia, in which trashing putatively defective work by others is an important mode of gaining reputation. In the hacker culture, such behavior is rather heavily tabooed -- so heavily, in fact, that the absence of such behavior did not present itself to me as a datum until that one respondent with an unusual perspective pointed it out nearly a full year after this paper was first published! The taboo against attacks on competence (not shared with academia) is even more revealing than the (shared) taboo on posturing, because we can relate it to a difference between academia and hackerdom in their communications and support structures. The hacker culture's medium of gifting is intangible, its communications channels are poor at expressing emotional nuance, and face-to-face contact among its members is the exception rather than the rule. This gives it a lower tolerance of noise than most other gift cultures, and goes a long way to explain the taboo against attacks on competence. Any significant incidence of flames over hackers' competence would intolerably disrupt the culture's reputation scoreboard. The same vulnerability to noise goes far to explain the public humility required of the hacker community's tribal elders. They must be seen to be free of boast and posturing so the taboo against dangerous noise will hold. 
[DC] Talking softly is also functional if one aspires to be a maintainer of a successful project; one must convince the community that one has good judgement, because most of the maintainer's job is going to be judging other people's code. Who would be inclined to contribute work to someone who clearly can't judge the quality of their own code, or whose behavior suggests they will attempt to unfairly hog the reputation return from the project? Potential contributors want project leaders with enough humility and class to be able to say, when objectively appropriate, ``Yes, that does work better than my version, I'll use it'' -- and to give credit where credit is due. Yet another reason for humble behavior is that in the open source world, you seldom want to give the impression that a project is `done'. This might lead a potential contributor not to feel needed. The way to maximize your leverage is to be humble about the state of the program. If one does one's bragging through the code, and then says ``Well shucks, it doesn't do x, y, and z, so it can't be that good'', patches for x, y, and z will often swiftly follow. Finally, I have personally observed that the self-deprecating behavior of some leading hackers reflects a real (and not unjustified) fear of becoming the object of a personality cult. Linus Torvalds and Larry Wall both provide clear and numerous examples of such avoidance behavior. Once, on a dinner expedition with Larry Wall, I joked ``You're the alpha hacker here -- you get to pick the restaurant''. He flinched audibly. And rightly so; failing to distinguish their shared values from the personalities of their leaders has ruined a good many voluntary communities, a pattern of which Larry and Linus cannot fail to be fully aware. On the other hand, most hackers would love to have Larry's problem, if they could but bring themselves to admit it. ------------- 12. 
Global Implications of the Reputation-Game Model The reputation-game analysis has some more implications that may not be immediately obvious. Many of these derive from the fact that one gains more prestige from founding a successful project than from cooperating in an existing one. One also gains more from projects which are strikingly innovative, as opposed to being `me, too' incremental improvements on software that already exists. On the other hand, software that nobody but the author understands or has a need for is a non-starter in the reputation game, and it's often easier to attract good notice by contributing to an existing project than it is to get people to notice a new one. Finally, it's much harder to compete with an already successful project than it is to fill an empty niche. Thus, there's an optimum distance from one's neighbors (the most similar competing projects). Too close and one's product will be a ``me, too!'' of limited value, a poor gift (one would be better off contributing to an existing project). Too far away, and nobody will be able to use, understand, or perceive the relevance of one's effort (again, a poor gift). This creates a pattern of homesteading in the noosphere that rather resembles that of settlers spreading into a physical frontier -- not random, but like a diffusion-limited fractal. Projects tend to get started to fill functional gaps near the frontier (see [NO] for further discussion of the lure of novelty). Some very successful projects become `category killers'; nobody wants to homestead anywhere near them because competing against the established base for the attention of hackers would be too hard. People who might otherwise found their own distinct efforts end up, instead, adding extensions for these big, successful projects. 
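The diffusion-limited-fractal analogy above can be made concrete with a toy simulation. The sketch below (purely illustrative; the function name and parameters are mine, not the essay's) grows a classic diffusion-limited aggregation (DLA) cluster: walkers wander in from the frontier and stick wherever they first touch settled territory, which is roughly the "not random, but diffusion-limited" settlement pattern described here.

```python
# Toy diffusion-limited aggregation (DLA) sketch -- an illustrative model
# of growth that clusters at the frontier rather than spreading at random.
import random

def grow_dla(n_particles=200, size=61, seed=42):
    """Grow a DLA cluster on a size x size grid; return the occupied cells."""
    random.seed(seed)
    center = size // 2
    cluster = {(center, center)}  # the first "homestead"
    steps = ((1, 0), (-1, 0), (0, 1), (0, -1))
    for _ in range(n_particles):
        # Launch a walker from a random cell on the grid boundary.
        edge = random.choice(("top", "bottom", "left", "right"))
        if edge == "top":
            x, y = random.randrange(size), 0
        elif edge == "bottom":
            x, y = random.randrange(size), size - 1
        elif edge == "left":
            x, y = 0, random.randrange(size)
        else:
            x, y = size - 1, random.randrange(size)
        while True:
            # Stick as soon as the walker touches the existing cluster --
            # new settlements form adjacent to already-settled territory.
            if any((x + dx, y + dy) in cluster for dx, dy in steps):
                cluster.add((x, y))
                break
            dx, dy = random.choice(steps)  # otherwise keep diffusing
            x = min(max(x + dx, 0), size - 1)
            y = min(max(y + dy, 0), size - 1)
    return cluster
```

Rendered on a grid (for example, printing `#` for occupied cells), the cluster comes out dendritic and branchy, not uniformly filled in: growth happens almost entirely at the tips of the frontier, much as new projects cluster near the edge of already-explored territory.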
The classic `category killer' example is GNU Emacs; its variants fill the ecological niche for a fully-programmable editor so completely that no competitor has gotten much beyond the one-man project stage since the early 1980s. Instead, people write Emacs modes. Globally, these two tendencies (gap-filling and category-killers) have driven a broadly predictable trend in project starts over time. In the 1970s most of the open source that existed was toys and demos. In the 1980s the push was in development and Internet tools. In the 1990s the action shifted to operating systems. In each case, a new and more difficult level of problems was attacked when the possibilities of the previous one had been nearly exhausted. This trend has interesting implications for the near future. In early 1998, Linux looks very much like a category-killer for the niche `open-source operating systems' -- people who might otherwise write competing operating systems are now writing Linux device drivers and extensions instead. And most of the lower-level tools the culture ever imagined having as open-source already exist. What's left? Applications. As the year 2000 approaches, it seems safe to predict that open-source development effort will increasingly shift towards the last virgin territory -- programs for non-techies. A clear early indicator is the development of GIMP, the Photoshop-like image workshop that is open source's first major application with the kind of end-user-friendly GUI interface considered de rigueur in commercial applications for the last decade. Another is the amount of buzz surrounding application-toolkit projects like KDE and GNOME. A respondent to this paper has pointed out that the homesteading analogy also explains why hackers react with such visceral anger to Microsoft's ``embrace and extend'' policy of complexifying and then closing up Internet protocols. 
The hacker culture can coexist with most closed software; the existence of Adobe Photoshop, for example, does not make the territory near GIMP (its open-source equivalent) significantly less attractive. But when Microsoft succeeds at de-commoditizing [HD] a protocol so that only Microsoft's own programmers can write software for it, they do not merely harm customers by extending their monopoly. They also reduce the amount and quality of noosphere available for hackers to homestead and cultivate. No wonder hackers often refer to Microsoft's strategy as ``protocol pollution''; they are reacting exactly like farmers watching someone poison the river they water their crops with! Finally, the reputation-game analysis explains the oft-cited dictum that you do not become a hacker by calling yourself a hacker -- you become a hacker when other hackers call you a hacker. A `hacker', considered in this light, is somebody who has shown (by contributing gifts) that he or she both has technical ability and understands how the reputation game works. This judgement is mostly one of awareness and acculturation, and can only be delivered by those already well inside the culture. ---------------- 13. How Fine a Gift? There are consistent patterns in the way the hacker culture values contributions and returns peer esteem for them. It's not hard to observe the following rules: 1. If it doesn't work as well as I have been led to expect it will, it's no good -- no matter how clever and original it is. Note the `led to expect'. This rule is not a demand for perfection; beta and experimental software is allowed to have bugs. It's a demand that the user be able to accurately estimate risks from the stage of the project and the developers' representations about it. This rule underlies the fact that open-source software tends to stay in beta for a long time, and not get even a 1.0 version number until the developers are very sure it will not hand out a lot of nasty surprises. 
In the closed-source world, Version 1.0 means ``Don't touch this if you're prudent.''; in the open-source world it reads something more like ``The developers are willing to bet their reputations on this.'' 2. Work that extends the noosphere is better than work that duplicates an existing piece of functional territory. The naive way to put this would have been: Original work is better than duplicating the functions of existing software. But it's not actually quite that simple. Duplicating the functions of existing closed software counts as highly as original work if by doing so you break open a closed protocol or format and make that territory newly available. Thus, for example, one of the highest-prestige projects in the present open-source world is Samba -- the code that allows Unix machines to act as clients or servers for Microsoft's proprietary SMB file-sharing protocol. There is very little creative work to be done here; it's mostly an issue of getting the reverse-engineered details right. Nevertheless, the members of the Samba group are perceived as heroes because they neutralize a Microsoft effort to lock in whole user populations and cordon off a big section of the noosphere. 3. Work that makes it into a major distribution is better than work that doesn't. Work carried in all major distributions is most prestigious. The major distributions include not just the big Linux distributions like Red Hat, Debian, Caldera, and S.u.S.E., but other collections that are understood to have reputations of their own to maintain and thus implicitly certify quality -- like BSD distributions or the Free Software Foundation source collection. 4. Utilization is the sincerest form of flattery -- and category killers are better than also-rans. Trusting the judgment of others is basic to the peer-review process. It's necessary because nobody has time to review all possible alternatives. 
So work used by lots of people is considered better than work used by a few. To have done work so good that nobody cares to use the alternatives any more is therefore to have earned huge prestige. The most possible peer esteem comes from having done widely popular, category-killing original work that is carried by all major distributions. People who have pulled this off more than once are half-seriously referred to as `demigods'. 5. Continued devotion to hard, boring work (like debugging, or writing documentation) is more praiseworthy than cherrypicking the fun and easy hacks. This norm is how the community rewards necessary tasks that hackers would not naturally incline towards. It is to some extent contradicted by: 6. Nontrivial extensions of function are better than low-level patches and debugging. The way this seems to work is that on a one-shot basis, adding a feature is likely to get more reward than fixing a bug -- unless the bug is exceptionally nasty or obscure, such that nailing it is itself a demonstration of unusual skill and cleverness. But when these behaviors are extended over time, a person with a long history of paying attention to and nailing even ordinary bugs may well rank ahead of someone who has spent a similar amount of effort adding easy features. A respondent has pointed out that these rules interact in interesting ways and do not necessarily reward highest possible utility all the time. Ask a hacker whether he's likely to become better known for a brand new tool of his own or for extensions to someone else's and the answer ``new tool'' will not be in doubt. But ask about (a) a brand new tool which is only used a few times a day invisibly by the OS but which rapidly becomes a category killer versus (b) several extensions to an existing tool which are neither especially novel nor category-killers, but are daily used and daily visible to a huge number of users and you are likely to get some hesitation before the hacker settles on (a). 
These alternatives are about evenly stacked. Said respondent gave this question point for me by adding ``Case (a) is fetchmail; case (b) is your many Emacs extensions, like vc.el and gud.el.'' And indeed he is correct; I am more likely to be tagged `the author of fetchmail' than `author of a boatload of Emacs modes', even though the latter probably have had higher total utility over time. What may be going on here is simply that work with a novel `brand identity' gets more notice than work aggregated to an existing `brand'. Elucidation of these rules, and what they tell us about the hacker culture's scoreboarding system, would make a good topic for further investigation. 14. Noospheric Property and the Ethology of Territory To understand the causes and consequences of Lockean property customs, it will help us to look at them from yet another angle; that of animal ethology, specifically the ethology of territory. Property is an abstraction of animal territoriality, which evolved as a way of reducing intra-species violence. By marking his bounds, and respecting the bounds of others, a wolf diminishes his chances of being in a fight that could weaken or kill him and make him less reproductively successful. Similarly, the function of property in human societies is to prevent inter-human conflict by setting bounds that clearly separate peaceful behavior from aggression. It is fashionable in some circles to describe human property as an arbitrary social convention, but this is dead wrong. Anybody who has ever owned a dog who barked when strangers came near its owner's property has experienced the essential continuity between animal territoriality and human property. Our domesticated cousins of the wolf know, instinctively, that property is no mere social convention or game, but a critically important evolved mechanism for the avoidance of violence. (This makes them smarter than a good many human political theorists.) 
Claiming property (like marking territory) is a performative act, a way of declaring what boundaries will be defended. Community support of property claims is a way to minimize friction and maximize cooperative behavior. These things remain true even when the ``property claim'' is much more abstract than a fence or a dog's bark, even when it's just the statement of the project maintainer's name in a README file. It's still an abstraction of territoriality, and (like other forms of property) based in territorial instincts evolved to assist conflict resolution. This ethological analysis may at first seem very abstract and difficult to relate to actual hacker behavior. But it has some important consequences. One is in explaining the popularity of World Wide Web sites, and especially why open-source projects with websites seem so much more `real' and substantial than those without them. Considered objectively, this seems hard to explain. Compared to the effort involved in originating and maintaining even a small program, a web page is easy, so it's hard to consider a web page evidence of substance or unusual effort. Nor are the functional characteristics of the Web itself sufficient explanation. The communication functions of a web page can be as well or better served by a combination of an FTP site, a mailing list, and Usenet postings. In fact it's quite unusual for a project's routine communications to be done over the Web rather than via a mailing list or newsgroup. Why, then, the popularity of Web sites as project homes? The metaphor implicit in the term `home page' provides an important clue. While founding an open-source project is a territorial claim in the noosphere (and customarily recognized as such) it is not a terribly compelling one on the psychological level. Software, after all, has no natural location and is instantly reduplicable. It's assimilable to our instinctive notions of `territory' and `property', but only after some effort. 
A project home page concretizes an abstract homesteading in the space of possible programs by expressing it as `home' territory in the more spatially-organized realm of the World Wide Web. Descending from the noosphere to `cyberspace' doesn't get us all the way to the real world of fences and barking dogs yet, but it does hook the abstract property claim more securely to our instinctive wiring about territory. And this is why projects with web pages seem more `real'. This point is much strengthened by hyperlinks and the existence of good search engines. A project with a web page is much more likely to be noticed by somebody exploring its neighborhood in the noosphere; others will link to it, searches will find it. A web page is therefore a better advertisement, a more effective performative act, a stronger claim on territory. This ethological analysis also encourages us to look more closely at mechanisms for handling conflict in the open-source culture. It leads us to expect that, in addition to maximizing reputation incentives, ownership customs should also have a role in preventing and resolving conflicts. 16. Project Structures and Ownership The trivial case is that in which the project has a single owner/maintainer. In that case there is no possible conflict. The owner makes all decisions and collects all credit and blame. The only possible conflicts are over succession issues -- who gets to be the new owner if the old one disappears or loses interest. The community also has an interest, under issue (C), in preventing forking. These interests are expressed by a cultural norm that an owner/maintainer should publicly hand title to someone if he or she can no longer maintain the project. The simplest non-trivial case is when a project has multiple co-maintainers working under a single `benevolent dictator' who owns the project. 
Custom favors this mode for group projects; it has been shown to work on projects as large as the Linux kernel or Emacs, and solves the ``who decides'' problem in a way that is not obviously worse than any of the alternatives. Typically, a benevolent-dictator organization evolves from an owner-maintainer organization as the founder attracts contributors. Even if the owner stays dictator, it introduces a new level of possible disputes over who gets credited for what parts of the project. In this situation, custom places an obligation on the owner/dictator to credit contributors fairly (through, for example, appropriate mentions in README or history files). In terms of the Lockean property model, this means that by contributing to a project you earn part of its reputation return (positive or negative). Pursuing this logic, we see that a `benevolent dictator' does not in fact own his entire project unqualifiedly. Though he has the right to make binding decisions, he in effect trades away shares of the total reputation return in exchange for others' work. The analogy with sharecropping on a farm is almost irresistible, except that a contributor's name stays in the credits and continues to `earn' to some degree even after that contributor is no longer active. As benevolent-dictator projects add more participants, they tend to develop two tiers of contributors: ordinary contributors and co-developers. A typical path to becoming a co-developer is taking responsibility for a major subsystem of the project. Another is to take the role of `lord high fixer', characterizing and fixing many bugs. In this way or others, co-developers are the contributors who make a substantial and continuing investment of time in the project. The subsystem-owner role is particularly important for our analysis and deserves further examination. Hackers like to say that `authority follows responsibility'. 
A co-developer who accepts maintenance responsibility for a given subsystem generally gets to control both the implementation of that subsystem and its interfaces with the rest of the project, subject only to correction by the project leader (acting as architect). We observe that this rule effectively creates enclosed properties on the Lockean model within a project, and has exactly the same conflict-prevention role as other property boundaries. By custom, the `dictator' or project leader in a project with co-developers is expected to consult with those co-developers on key decisions. This is especially so if the decision concerns a subsystem which a co-developer `owns' (that is, has invested time in and taken responsibility for). A wise leader, recognizing the function of the project's internal property boundaries, will not lightly interfere with or reverse decisions made by subsystem owners. Some very large projects discard the `benevolent dictator' model entirely. One way to do this is to turn the co-developers into a voting committee (as with Apache). Another is rotating dictatorship, in which control is occasionally passed from one member to another within a circle of senior co-developers; the Perl developers organize themselves this way. Such complicated arrangements are widely considered unstable and difficult. Clearly this perceived difficulty is largely a function of the known hazards of design-by-committee, and of committees themselves; these are problems the hacker culture consciously understands. However, I think some of the visceral discomfort hackers feel about committee or rotating-chair organizations is because they're hard to fit into the unconscious Lockean model hackers use for reasoning about the simpler cases. It's problematic, in these complex organizations, to do an accounting of either ownership in the sense of control or ownership of reputation returns. 
It's hard to see where the internal boundaries are, and thus hard to avoid conflict unless the group enjoys an exceptionally high level of harmony and trust. 17. Conflict and Conflict Resolution We've seen that within projects, an increasing complexity of roles is expressed by a distribution of design authority and partial property rights. While this is an efficient way to distribute incentives, it also dilutes the authority of the project leader -- most importantly, it dilutes the leader's authority to squash potential conflicts. While technical arguments over design might seem the most obvious risk for internecine conflict, they are seldom a serious cause of strife. These are usually relatively easily resolved by the territorial rule that authority follows responsibility. Another way of resolving conflicts is by seniority -- if two contributors or groups of contributors have a dispute, and the dispute cannot be resolved objectively, and neither owns the territory of the dispute, the side that has put the most work into the project as a whole (that is, the side with the most property rights in the whole project) wins. (Equivalently, the side with the least invested loses. Interestingly, this happens to be the same heuristic that many relational database engines use to resolve deadlocks. When two threads are deadlocked over resources, the side with the least invested in the current transaction is selected as the deadlock victim and is terminated. This usually selects the longest running transaction, or the more senior, as the victor.) These rules generally suffice to resolve most project disputes. When they do not, fiat of the project leader usually suffices. Disputes that survive both these filters are rare. Conflicts do not as a rule become serious unless these two criteria ("authority follows responsibility" and "seniority wins") point in different directions, and the authority of the project leader is weak or absent. 
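The deadlock-victim heuristic mentioned above can be sketched in a few lines of Python. This is an illustrative toy, not code from any real database engine; the names `Transaction` and `pick_deadlock_victim` are invented for the example, and "work invested" stands in for whatever cost measure a real engine tracks (rows written, log bytes, and so on).

```python
# Toy sketch of least-invested deadlock-victim selection: when two
# transactions deadlock, abort the one that has done the least work,
# so the longer-running ("more senior") transaction survives.
from dataclasses import dataclass


@dataclass
class Transaction:
    txn_id: str
    work_invested: int  # stand-in for rows written, log bytes, etc.


def pick_deadlock_victim(a: Transaction, b: Transaction) -> Transaction:
    """Return the transaction to abort: the one with the least invested."""
    return a if a.work_invested < b.work_invested else b


# The senior transaction wins; the newcomer is rolled back and retried.
senior = Transaction("long-running-batch", work_invested=50_000)
junior = Transaction("fresh-query", work_invested=12)
assert pick_deadlock_victim(senior, junior) is junior
```

The parallel to the hacker custom is direct: the side with the most accumulated investment in the contested territory prevails, and the side with the least invested pays the cost of backing off.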
The most obvious case in which this may occur is a succession dispute following the disappearance of the project lead. I have been in one fight of this kind. It was ugly, painful, protracted, only resolved when all parties became exhausted enough to hand control to an outside person, and I devoutly hope I am never anywhere near anything of the kind again. Ultimately, all of these conflict-resolution mechanisms rest on the wider hacker community's willingness to enforce them. The only available enforcement mechanisms are flaming and shunning -- public condemnation of those who break custom, and refusal to cooperate with them after they have done so. 18. Acculturation Mechanisms and the Link to Academia An early version of this paper posed the following research question: How does the community inform and instruct its members as to its customs? Are the customs self-evident or self-organizing at a semi-conscious level, are they taught by example, are they taught by explicit instruction? Teaching by explicit instruction is clearly rare, if only because few explicit descriptions of the culture's norms have existed to be used up to now. Many norms are taught by example. To cite one very simple case, there is a norm that every software distribution should have a file called README or READ.ME that contains first-look instructions for browsing the distribution. This convention has been well established since at least the early 1980s; it has even, occasionally, been written down. But one normally derives it from looking at many distributions. On the other hand, some hacker customs are self-organizing once one has acquired a basic (perhaps unconscious) understanding of the reputation game. Most hackers never have to be taught the three taboos I listed earlier in this paper, or at least would claim if asked that they are self-evident rather than transmitted. 
This phenomenon invites closer analysis -- and perhaps we can find its explanation in the process by which hackers acquire knowledge about the culture. Many cultures use hidden clues (more precisely `mysteries' in the religio/mystical sense) as an acculturation mechanism. These are secrets which are not revealed to outsiders, but are expected to be discovered or deduced by the aspiring newbie. To be accepted inside, one must demonstrate that one both understands the mystery and has learned it in a culturally approved way. The hacker culture makes unusually conscious and extensive use of such clues or tests. We can see this process operating at at least three levels:

Password-like specific mysteries. As one example, there is a USENET newsgroup called alt.sysadmin.recovery that has a very explicit such secret; you cannot post without knowing it, and knowing it is considered evidence you are fit to post. The regulars have a strong taboo against revealing this secret.

The requirement of initiation into certain technical mysteries. One must absorb a good deal of technical knowledge before one can give valued gifts (e.g. one must know at least one of the major computer languages). This requirement functions in the large in the way hidden clues do in the small, as a filter for qualities (such as capability for abstract thinking, persistence, and mental flexibility) which are necessary to function in the culture.

Social-context mysteries. One becomes involved in the culture through attaching oneself to specific projects. Each project is a live social context of hackers which the would-be contributor has to investigate and understand socially as well as technically in order to function. (Concretely, a common way one does this is by reading the project's Web pages and/or email archives.) It is through these project groups that newbies experience the behavioral example of experienced hackers. 
In the process of acquiring these mysteries, the would-be hacker picks up contextual knowledge which (after a while) does make the three taboos and other customs seem `self-evident'. One might, incidentally, argue that the structure of the hacker gift culture itself is its own central mystery. One is not considered acculturated (concretely: no one will call you a hacker) until one demonstrates a gut-level understanding of the reputation game and its implied customs, taboos, and usages. But this is trivial; all cultures demand such understanding from would-be joiners. Furthermore the hacker culture evinces no desire to have its internal logic and folkways kept secret -- or, at least, nobody has ever flamed me for revealing them! Respondents to this paper too numerous to list have pointed out that hacker ownership customs seem intimately related to (and may derive directly from) the practices of the academic world, especially the scientific research community. This research community has similar problems in mining a territory of potentially productive ideas, and exhibits very similar adaptive solutions to those problems in the ways it uses peer review and reputation. Since many hackers have had formative exposure to academia (it's common to learn how to hack while in college) the extent to which academia shares adaptive patterns with the hacker culture is of more than casual interest in understanding how these customs are applied. Obvious parallels with the hacker `gift culture' as I have characterized it abound in academia. Once a researcher achieves tenure, there is no need to worry about survival issues. (Indeed, the concept of tenure can probably be traced back to an earlier gift culture in which ``natural philosophers'' were primarily wealthy gentlemen with time on their hands to devote to research.) 
In the absence of survival issues, reputation enhancement becomes the driving goal, which encourages sharing of new ideas and research through journals and other media. This makes objective functional sense because scientific research, like the hacker culture, relies heavily on the idea of `standing upon the shoulders of giants', and not having to rediscover basic principles over and over again. Some have gone so far as to suggest that hacker customs are merely a reflection of the research community's folkways and have actually (in most cases) been acquired there by individual hackers. This probably overstates the case, if only because hacker custom seems to be readily acquired by intelligent high-schoolers! 19. Gift Outcompetes Exchange There is a more interesting possibility here. I suspect academia and the hacker culture share adaptive patterns not because they're genetically related, but because they've both evolved the one most optimal social organization for what they're trying to do, given the laws of nature and the instinctive wiring of human beings. The verdict of history seems to be that free-market capitalism is the globally optimal way to cooperate for economic efficiency; perhaps, in a similar way, the reputation-game gift culture is the globally optimal way to cooperate for generating (and checking!) high-quality creative work. Support for this theory comes from a large body of psychological studies on the interaction between art and reward [GNU]. These studies have received less attention than they should, in part perhaps because their popularizers have shown a tendency to overinterpret them into general attacks against the free market and intellectual property. Nevertheless, their results do suggest that some kinds of scarcity-economics rewards actually decrease the productivity of creative workers such as programmers. 
Psychologist Theresa Amabile of Brandeis University, cautiously summarizing the results of a 1984 study of motivation and reward, observed ``It may be that commissioned work will, in general, be less creative than work that is done out of pure interest.'' Amabile goes on to observe that ``The more complex the activity, the more it's hurt by extrinsic reward.'' Interestingly, the studies suggest that flat salaries don't demotivate, but piecework rates and bonuses do. Thus, it may be economically smart to give performance bonuses to people who flip burgers or dig ditches, but it's probably smarter to decouple salary from performance in a programming shop and let people choose their own projects (both trends that the open-source world takes to their logical conclusions). Indeed, these results suggest that the only time it is a good idea to reward performance in programming is when the programmer is so motivated that he or she would have worked without the reward! Other researchers in the field are willing to point a finger straight at the issues of autonomy and creative control that so preoccupy hackers. ``To the extent one's experience of being self-determined is limited,'' said Richard Ryan, associate psychology professor at the University of Rochester, ``one's creativity will be reduced as well.'' In general, presenting any task as a means rather than an end in itself seems to demotivate. Even winning a competition with others or gaining peer esteem can be demotivating in this way if it is experienced as work for reward (which may explain why hackers are culturally prohibited from explicitly seeking or claiming that esteem). To complicate the management problem further, controlling verbal feedback seems to be just as demotivating as piecework payment. 
Ryan found that corporate employees who were told, ``Good, you're doing as you should'' were ``significantly less intrinsically motivated than those who received feedback informationally.'' It may still be intelligent to offer incentives, but they have to come without attachments to avoid gumming up the works. There is a critical difference (Ryan observes) between saying, ``I'm giving you this reward because I recognize the value of your work'' and ``You're getting this reward because you've lived up to my standards.'' The first does not demotivate; the second does. In these psychological observations we can ground a case that an open-source development group will be substantially more productive (especially over the long term, in which creativity becomes more critical as a productivity multiplier) than an equivalently sized and skilled group of closed-source programmers (de)motivated by scarcity rewards. This suggests from a slightly different angle one of the speculations in The Cathedral And The Bazaar; that, ultimately, the industrial/factory mode of software production was doomed to be outcompeted from the moment capitalism began to create enough of a wealth surplus that many programmers could live in a post-scarcity gift culture. Indeed, it seems the prescription for highest software productivity is almost a Zen paradox; if you want the most efficient production, you must give up trying to make programmers produce. Handle their subsistence, give them their heads, and forget about deadlines. To a conventional manager this sounds crazily indulgent and doomed -- but it is exactly the recipe with which the open-source culture is now clobbering its competition. 20. Conclusion: From Custom to Customary Law We have examined the customs which regulate the ownership and control of open-source software. We have seen how they imply an underlying theory of property rights homologous to the Lockean theory of land tenure. 
We have related that to an analysis of the hacker culture as a `gift culture' in which participants compete for prestige by giving time, energy, and creativity away. We have examined the implications of this analysis for conflict resolution in the culture. The next logical question to ask is "Why does this matter?" Hackers developed these customs without conscious analysis and (up to now) have followed them without conscious analysis. It's not immediately clear that conscious analysis has gained us anything practical -- unless, perhaps, we can move from description to prescription and deduce ways to improve the functioning of these customs. We have found a close logical analogy for hacker customs in the theory of land tenure under the Anglo-American common-law tradition. Historically [Miller], the European tribal cultures that invented this tradition improved their dispute-resolution systems by moving from a system of unarticulated, semi-conscious custom to a body of explicit customary law memorized by tribal wisemen -- and eventually, written down. Perhaps, as our population rises and acculturation of all new members becomes more difficult, it is time for the hacker culture to do something analogous -- to develop written codes of good practice for resolving the various sorts of disputes that can arise in connection with open-source projects, and a tradition of arbitration in which senior members of the community may be asked to mediate disputes. The analysis in this paper suggests the outlines of what such a code might look like, making explicit that which was previously implicit. No such codes could be imposed from above; they would have to be voluntarily adopted by the founders or owners of individual projects. Nor could they be completely rigid, as the pressures on the culture are likely to change over time. Finally, for enforcement of such codes to work, they would have to reflect a broad consensus of the hacker tribe. 
I have begun work on such a code, tentatively titled the "Malvern Protocol" after the little town where I live. If the general analysis in this paper becomes sufficiently widely accepted, I will make the Malvern Protocol publicly available as a model code for dispute resolution. Parties interested in critiquing and developing this code, or just offering feedback on whether they think it's a good idea or not, are invited to contact me by email. 21. Questions for Further Research The culture's (and my own) understanding of large projects that don't follow a benevolent-dictator model is weak. Most such projects fail. A few become spectacularly successful and important (Perl, Apache, KDE). Nobody really understands where the difference lies. There's a vague sense abroad that each such project is sui generis and stands or falls on the group dynamic of its particular members, but is this true or are there replicable strategies a group can follow? 22. Bibliography [Miller] Miller, William Ian; Bloodtaking and Peacemaking: Feud, Law, and Society in Saga Iceland; University of Chicago Press 1990, ISBN 0-226-52680-1. A fascinating study of Icelandic folkmoot law, which both illuminates the ancestry of the Lockean theory of property and describes the later stages of a historical process by which custom passed into customary law and thence to written law. [Mal] Malaclypse the Younger; Principia Discordia, or How I Found Goddess and What I Did To Her When I Found Her; Loompanics, ISBN 1-55950-040-9. There is much enlightening silliness to be found in Discordianism. Amidst it, the `SNAFU principle' provides a rather trenchant analysis of why command hierarchies don't scale well. There's a browseable HTML version. [BCT] J. Barkow, L. Cosmides, and J. Tooby (Eds.); The adapted mind: Evolutionary psychology and the generation of culture. New York: Oxford University Press 1992. An excellent introduction to evolutionary psychology. 
Some of the papers bear directly on the three cultural types I discuss (command/exchange/gift), suggesting that these patterns are wired into the human psyche fairly deep. [MHG] Goldhaber, Michael K.; The Attention Economy and the Net. I discovered this paper after my version 1.7. It has obvious flaws (Goldhaber's argument for the inapplicability of economic reasoning to attention does not bear close examination), but Goldhaber nevertheless has funny and perceptive things to say about the role of attention-seeking in organizing behavior. The prestige or peer repute I have discussed can fruitfully be viewed as a particular case of attention in his sense.  ---------------------  The Circus Midget and the Fossilized Dinosaur Turd -or- "What up with that software industry?" A Treatise on Free Software Development. With apologies to Eric S. Raymond. ---- By Martin Hock (oxymoron@bigsky.net) Copyright 1998. This is a parody. It is completely fictitious. I assume no liability. Please don't hurt me. Yes, there's an actual point to this. I went down to the Ethnic Quarter of the Montanan "city" I live in today, which normally consists of approximately three black people. Today, however, was different. Not only were there the normal three black people, but there were a couple of weird Europeans who had apparently gotten lost. On my way into the Cheap Legal Drugs Mart, I happened to overhear their conversation, which went approximately as follows: "You looka at the state ofa the software industry today, my frien, anda what do you see? You see a biga ball of the shit. That'sa what you see." The other guy didn't say anything, probably because he was too busy staring at a woman across the street. Still, it got me thinking. What up with that software industry, anyway? As I went home that night, I couldn't shake the image of the slobbering man from my mind. 
While I watched for the umpteenth time the Juiceman Juicer infomercial formed by a beam of electrons refreshing half the screen 60 times a second, I suddenly realized that I could make money off this concept if I went around the country making speeches about what up with that software industry. I looked at the room around me. Filled with empty beer bottles and crinkled pornography magazines dating back to the late 1970's, I realized that sinking all of my money into the simple pleasures in life brought me all the satisfaction that I ever needed. Oh, right, the software part. Yeah, anyway, I thought back to when I was a little kid and how I used to love the circus. I didn't like the lions, or the stupid gymnasts, or the evil foul-smelling clowns. What I liked were the freaks. They helped remind me that there were people in the world who were even more pathetic than myself. I especially liked the midget. His bulging little eyes used to follow me around my room, his stained leotard a constant reminder to the audience that bladder control is essential to functioning as a part of society. I wondered what that little man got paid. Probably sub-minimum wage. My parents used to feel guilty when they walked by him. He had a little tattered hat next to him with a small card taped in front that simply stated, "Donations." It was always empty, except for a couple of pennies. "The horrible way that circus treats that poor man," my mother always said. "If he didn't like it, he'd work somewhere else," my father would respond gruffly, his mono-brow dipped downward in the middle. They never put anything in the hat. Other days, we used to go to the museum. There were many things to look at when we went there, but the ones I most liked to observe were the dinosaurs. They were so huge and fierce. They reminded me that there were forces in life stronger even than parents. The big, bony structures didn't really tell me much, though. What I really liked to look at were the turds. 
They were these gigantic, ellipsoid masses. I could almost touch them except for a thin pane of Plexiglas. The small brass plate called it "excrement" or "feces" but I knew better; it was a turd, nothing less. I would dream about going in there at night, shattering the barrier, and taking the mass home with me. It wasn't scatological or anything. What I really wanted to do was drop it on a car from the overpass. Those cinder blocks did hardly any damage on the hardtops and hitting the windshield was nearly impossible from such an angle. The midget was a lot like free software. True, getting into the carnival wasn't free, so I guess that's like the hardware. But you could look at the midget all you liked. You could take pictures of the midget and bring them home. He modified himself sometimes; you'd see a new stain every time the carnival came in town. He'd get a little older, a little uglier. Back when I was a kid it was really cool, but if I went there today to see the midget, I wouldn't even care. There are better things to do with one's afternoon than to go look at a midget. The fossilized dinosaur turd was a lot like commercial software. It was big and robust. It was well supported by a velveteen cushion. It even had a nice layer of security instated by the Plexiglas. I could have stolen it, but there would be potential repercussions. I know that I could have taken the midget with me, but what would be the point? Also, the turd has a lot of potential uses. You could drop it on a car, a bus, or even a pedestrian. That's what I call adaptive. I could have modified the midget by feeding him lead shot over a course of several weeks, but this would have been time consuming. Why waste your time when the turd is already there, ready for use? So that's what I have to say about software development. You wanna give me my money now? Oh, I suppose you'd expect a little more than that for ten grand. All right, I'll continue. Look at the midget. 
It is feeble and weak compared to the dinosaur turd. It is the undiscovered, the lost. There was no banner trumpeting the arrival of the midget in town. However, it is alive. The dinosaur turd, though famous and strong, is dead. It has little hope for improvement, as the dinosaur that laid it is long extinct. Young dinosaurs may have frolicked in the field of turds, but a thick dust cloud ended all hopes of survival. A dust cloud, you might notice, made up of thousands of tiny particles, all working in unison. The midget stands alone, hoping for support, but the dust particles, all driven by the jurassic breeze, manage to topple even the largest dinosaur. Only the small, well-protected creatures remain. So what of the dust? Ah, it is the proletariat rebellion, waiting to happen, to conquer the bourgeois beast! It is inevitable, but we can bring it on ourselves if we work hard enough. We must employ thousands of workers at equal wages to create a giant fan fit for the ages. Then, we make a solar-powered generator, which allows for the falling away of the state since we won't have to turn the crank ourselves. Then, we just sit back and relax as the winds blow the dust and blissful anarchy sets in. But what of the tiny creatures? Ah, these are the seeds of a new generation! These will grow up one day to form factions, which can only be prevented from taking over the government if we implement plenty of checks and balances... Oh, I'm done now? I get the check already? But I have another nine and a half hours... ------------ Fame? Ego? Oversimplification! (I originally wrote this 14 July 1998 in response to a thread on Slashdot.) Many messages appearing on Slashdot in the last couple of days have made me wince pretty hard...and consider whether, in fact, I was really wise to try to haul the social dynamics of hackerdom out into the light. 
What's bothering me the most is some of the people who have gotten enthusiastic about the analysis I presented in The Cathedral and the Bazaar (CatB) and Homesteading The Noosphere (HtN), but, in their enthusiasm, are arguing something like a bad parody of it. I don't use the word `fame' at all in either paper, except once in reporting on Fare Rideau's critique of an early version of HtN. (The reference has since been removed; Fare reworded his critique after reading this essay.) This is not an accident. `Fame' is a vulgar, brassy, and shallow thing when compared to the earned and considered esteem of one's peers. Believe me on this, because I've had quite a bit of both (especially lately) and I know which one feels like a cheap high with a bad hangover and which one is food for the soul. And so, I think, do most hackers. It oversimplifies my work and (much more importantly) insults the people and culture my work describes to imply that most hackers have some inner fantasy of tickertape parades, talk-show appearances, and hordes of adoring groupies. But that is exactly what the word `fame' connotes -- and the way people have been flinging it around in disagreement and (worse) agreement with me suggests that a lot of them need to think carefully about the difference between `fame' and `peer repute'. That difference is crucial to understanding our culture. Because `fame' is a mob phenomenon, essentially an emotional response. It's irrational and self-reinforcing. There are people who are famous for being famous. The photographer who took the pictures for my People interview back in 1996 during my pre-CatB first fifteen minutes of fame called them `face people'. Often, there's nothing behind the face. Peer repute, on the other hand, is a much subtler and solider thing. The earned and considered approbation of one's peers has to come from accomplishment, from productivity. Often those peers are few, and this becomes more true as one becomes more accomplished. 
Higher levels of it, unlike fame, become progressively harder to earn because one's own standards for who is a fit peer keep rising.

Linus said "I am your God" at Linux Expo on stage and brought down the house. The line was ironic and hilarious precisely because what he has is not `fame', not uncritical adoration, not the masses gazing up at him in awe, but rather a rational peer response to real achievement. He knows that; and he knows that we know it. I thought most of us did, anyway. The last day or two of Slashdot makes me wonder.

So, in case it needs saying again, don't confuse `peer repute' with `fame'. And if you've interpreted CatB and HtN as assertions that `fame' is the only significant motive for hackers, think again. Reality, as usual, is more subtle and complex than that.

-----------------------

Raymond on 9/11: Decentralism Against Terrorism

(I wrote this on September 11th, 2001, hours after learning that the World Trade Center had been destroyed, with thousands of lives lost, by terrorists who hijacked two jetliners using carpet knives.)

Some friends have asked me to step outside my normal role as a technology evangelist today, to point out in public that a political panic reaction to the 9/11 terrorist attack could do a great deal more damage than the attack itself.

Today will not have been a victory for terrorism unless we make it one. If we reward in any way the Palestinians who are now celebrating this hideous crime in the streets of the West Bank, that will have been a victory for terrorism. If we accept "anti-terrorism" measures that do further damage to our Constitutional freedoms, that will have been a victory for terrorism. But if we learn the right lessons, if we make policies that preserve freedom and offer terrorists no result but a rapid and futile death, that will have been a victory for the rest of us.

We have learned today that airport security is not the answer.
At least four separate terror teams were able to sail right past all the elaborate obstacles -- the demand for IDs, the metal detectors, the video cameras, the X-ray machines, the gunpowder sniffers, the gate agents and security people trained to spot terrorists by profile. There have been no reports that any other terror units were successfully prevented from achieving their objectives by these measures. In fact, the early evidence is that all these police-state-like impositions on freedom were exactly useless -- and in the smoldering ruins of the World Trade Center lies the proof of their failure.

We have learned today that increased surveillance is not the answer. The FBI's "Carnivore" tap on the U.S.'s Internet service providers didn't spot or prevent this disaster; nor did the NSA's illegal Echelon wiretaps on international telecommunications. Video monitoring of public areas could have accomplished exactly nothing against terrorists taking even elementary concealment measures. If we could somehow extend airport-level security to the entire U.S., it would be just as useless against any determined and even marginally competent enemy.

We have learned today that trying to keep civilian weapons out of airplanes and other areas vulnerable to terrorist attack is not the answer either -- indeed, it is arguable that the lawmakers who disarmed all the non-terrorists on those four airplanes, leaving them no chance to stop the hijackers, bear part of the moral responsibility for this catastrophe.

I expect that in the next few months, far too many politicians and pundits will press for draconian "anti-terrorist" laws and regulations. Those who do so will be, whether intentionally or not, cooperating with the terrorists in their attempt to destroy our way of life -- and we should all remember that fact come election time.
As an Internet technologist, I have learned that distributed problems require distributed solutions -- that centralization of power, the first resort of politicians who feed on crisis, is actually worse than useless, because centralizers regard the more effective coping strategies as threats and act to thwart them.

Perhaps it is too much to hope that we will respond to this shattering tragedy as well as the Israelis, who have a long history of preventing similar atrocities by encouraging their civilians to carry concealed weapons and to shoot back at criminals and terrorists. But it is in that policy of a distributed response to a distributed threat, with every single citizen taking personal responsibility for the defense of life and freedom, that our best hope for preventing recurrences of today's mass murders almost certainly lies.

If we learn that lesson, perhaps today's deaths will not have been in vain.

---------------------

The Biology of Promiscuity

Why do human beings screw around when it complicates our lives so much? Why do we preach fidelity at each other and then, so often, practice adultery?

The cheap and obvious answer, "because it feels too good to stop", isn't a good one, as it turns out. Evolutionary biology teaches us that human beings, like other animals, are adaptive machines; "feels good" is simply instinct's way to steer us towards behaviors that were on average successful for our ancestors. So that answer simply sets up another question: why has our species history favored behavior that is (as the agony columns, bitter ballads, tragic plays and venereal-disease statistics inform us) often destructive to all parties involved? This question has extra point for humans because human sex and childbirth are risky business compared to that of most of our near relatives.
Human infants have huge heads, enough to make giving birth a chancy matter -- and even so, the period during which they remain dependent on nurturing is astonishingly long and requires a lot of parental investment.

If we were redesigning humans to cope with the high investment requirement, one obvious way would be to rewire our instincts such that we pair-bond exclusively for life. It's certainly possible to imagine an evolved variant of humanity in which "infidelity" is never an issue because mated pairs imprint on each other so specifically that nobody else is sexually interesting. Some birds are like this.

So why aren't we like this? Why haven't promiscuity and adultery been selected out? What adaptive function do they serve that balances out the risk to offspring from unstable matings?

The route to an answer lies in remembering that evolutionary selection is not a benign planner that tries to maximize group survival but rather a blind competition between individual genetic lines. We need to look more closely at the conflicting strategies used by competing players in the reproduction game.

Male promiscuity has always been relatively easy to understand. While total parental investment needs to be pretty intense, men have a dramatically lower minimum energy and risk investment in children than women do; one index of the difference is that women not infrequently died in childbirth under pre-modern conditions. This means genetic lines propagating through us hairy male types have an optimum strategy that tilts us a little more towards "have lots of offspring and don't nurture much", while women tilt towards "have few offspring, work hard at making sure they survive to breed".

This also explains why cultures that have not developed an explicit ideology of sexual equality invariably take female adultery much more seriously than male adultery.
A man who fails to take a grave view of his mate's "unfaithfulness" is risking a much larger fraction of his reproductive potential than a woman who ignores her husband's philandering. Indeed, there is a sense in which a man who is always "faithful" is under-serving his genes -- and the behavioral tendency to do that will be selected against. His optimal strategy is to be promiscuous enough to pick up opportunities to have his reproductive freight partly paid by other men, while not being so "faithless" that potential mates will consider him a bad risk (e.g. for running off with another woman and abandoning the kids).

What nobody had a good theory for until the mid-1990s was why women cooperate in this behavior. Early sociobiological models of human sexual strategy predicted that women should grab the best provider they could attract and then bend heaven and earth to keep him faithful, because if he screwed around some of his effort would be likely to be directed towards providing for children by other women. In these theories, female abstinence before marriage and fidelity during it was modeled as a trade offered men to keep them faithful in turn; an easy trade, because nobody had noticed any evolutionary incentives for women to cheat on the contract.

In retrospect, the resemblance of the female behavior predicted by these models to conventional moral prescriptions should have raised suspicions about the models themselves -- because they failed to predict the actual pervasiveness of female promiscuity and adultery even in observable behavior, let alone concealed.

Start with a simple one: If the trade-your-fidelity-for-his strategy were really a selective optimum, singles bars wouldn't exist, because genotypes producing women with singles-bar behavior would have been selected out long ago. But there's an even bigger whammy...
Actual paternity/maternity-marker studies in urban populations done under guarantees that one's spouse and others won't see the results have found that the percentage of adulterous children born to married women with ready access to other men can be startlingly high, often in the 25% to 45% range. In most cases, the father has no idea and the mother, in the nature of things, was unsure before the assay.

These statistics cry out for explanation -- and it turns out women do have an evolutionary incentive to screw around.

The light began to dawn during studies of chimpanzee populations. Female chimps who spurn low-status bachelor males from their own band are much more willing to have sex with low-status bachelor males from other bands. That turned out to be the critical clue.

There may be other incentives we don't understand, but it turns out that women genetically "want" both to keep an alpha male faithful and to capture maximum genetic variation in their offspring. Maximum genetic variation increases the chance that some offspring will survive the vicissitudes of rapidly-changing environmental stresses, of which a notably important one is co-evolving parasites and pathogens.

Assume Jane can keep Tarzan around and raise four children. Her best strategy isn't to have all four by Tarzan -- it's to have three by Tarzan and one by some romantic stranger, a bachelor male from another pack. As long as Tarzan doesn't catch them at it, the genes conditioning Jane's sexual strategy get 50% of the reproductive payoff regardless of who the biological father is. If the stranger is a fitter male than the best mate she could keep faithful, so much the better. Her kids will win.

And this isn't just a human strategy either. Similar behavior has been observed in other species with high parental investment, notably among birds.

So.
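The bet-hedging arithmetic behind Jane's mixed strategy can be sketched with a toy Monte Carlo model. The specific numbers here (a 50% chance that a given father's genotype matches the pathogen environment, all-or-nothing survival per paternal genotype, a brood of four) are illustrative assumptions of mine, not claims from the essay:

```python
import random

def brood_loss_rate(mixed, n_trials=100_000, seed=42):
    """Estimate how often an entire brood of four is lost, under a toy
    bet-hedging model: each father's genotype either matches or fails the
    pathogen environment (probability 1/2 each, independently), and a
    child with a failing paternal genotype does not survive."""
    rng = random.Random(seed)
    losses = 0
    for _ in range(n_trials):
        tarzan_ok = rng.random() < 0.5
        stranger_ok = rng.random() < 0.5
        if mixed:
            # Three children by Tarzan, one by the romantic stranger.
            survivors = 3 * tarzan_ok + 1 * stranger_ok
        else:
            # All four children by Tarzan.
            survivors = 4 * tarzan_ok
        if survivors == 0:
            losses += 1
    return losses / n_trials

print(brood_loss_rate(mixed=False))  # ~0.50: one bad draw wipes out the brood
print(brood_loss_rate(mixed=True))   # ~0.25: both fathers must fail at once
```

Note that the expected brood size is the same either way (two survivors on average in this toy setup); what mixed paternity buys is a lower chance of total lineage loss. That variance reduction is the "maximum genetic variation" payoff the essay describes.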
The variation effect predicts that mated women should have a fairly strong genetic incentive to sneak off into the bushes with romantic strangers -- that is, other men who are (a) from outside their local breeding population, and (b) are physically attractive or talented or intelligent, or (c) show other, socially-mediated signs of high fitness (such as wealth or fame).

It may also explain why polyamory is only now emerging as a social movement, after women's liberation, and why its most energetic partisans tend to be women. Our instincts don't know about contraceptive intervention; from our genes' point of view sexual access is equivalent to reproductive use. As our instincts see it, polyamory (the ideology of open marriage) enables married women to have children with bachelor males without risking losing their husband's providership for any children. Men gain less from the change, because they trade away a claim on exclusive use of their wives' scarce reproductive capacity for what may be only a marginal increase in access to other women (relative to the traditional system combining closed marriage and high rates of covert adultery).

This model may not please prudes and Victorians very much, but at least it explains her cheatin' heart as well as his.

(Thanks to Gale Pedowitz for the email discussion that stimulated this essay.)

In ``The evolution of human mating: Trade-offs and strategic pluralism'', Steven W. Gangestad and Jeffry A. Simpson have explored some similar themes, focusing on within-sex variation in mating strategies and the idea that there may be tradeoffs between fitness-to-mate and willingness-to-nurture signals.