An experiment in collective intelligence. Stupidity. Whatever.
Context: post to linux-elitists. What makes free software tick.
ToDo: TWikify more fully.
On the ethics-versus-pragmatics debate over why free software is good, I tend to come down slightly on the pragmatists' side: I use GNU/Linux because it does what I want it to do, and I expect it to continue doing so. But I see the two issues as inextricably bound.
I put this question to RMS at a dinner some years ago: do you believe in free software because it is good, or because it is free? Richard's answer: because it is free.
Richard is the idealist, the evangelist, the missionary. And to him freedom is an absolute goal.
I'm of a more pragmatic bent: the tools I use should work. They should be long-term credible. There should be a growth path. There should be consistency over time. They should be extensible. They should be flexible.
When I began taking free software seriously, ~1997, I spent a couple of years looking hard at the economic, legal, and technical underpinnings of the movement. One of the more interesting items I turned up was a history of the computing industry in the US, from the 1940s onward.
"Cognition and Capabilities: Opportunities Seized and Missed in the History of the Computer Industry" Richard N. Langlois
This begins with the emergence of IBM out of the ENIAC and UNIVAC era of the 1940s and '50s, displacing Remington Rand (later Sperry Rand) and beating out GE and RCA. A number of transition points are noted, to which I add a few examples and trends of my own where Langlois's analysis leaves off:
- Emergence of programmable (rather than special-purpose) computers.
- Modularization of central processing units and peripherals, by IBM in the 1950s, to create new capabilities, combined with a leasing model for hardware revenues, which reduced the cannibalization effect of new products on the existing base (cf. Clayton Christensen, The Innovator's Dilemma).
- Unification of the computing platform from diverse systems with incompatible characteristics to a line with a common operating system: System 360. This also extended the modular philosophy of earlier offerings.
- At about the same time, a transition from vacuum tubes to solid-state circuitry, reducing costs, power consumption, and floor-space requirements; and increasing efficiency.
- Emergence of the minicomputer. DEC's first systems had a higher cost per unit of performance than IBM's offerings, but offered finer granularity and a lower entry point to enterprise computing.
- Emergence of open systems. Unix appeared on the scene in the mid-1970s, and came to trounce DEC's VMS systems in the 1980s on the strength of an open minicomputer architecture that extended not merely across systems from one vendor but, at least at the source level, across vendors.
- Personal computing also emerged in the enterprise in the 1980s, largely as a pushback against decision making and control bottlenecks in the "Glass House" corporate IT bureaucracy. PCs allowed both software and hardware allocation decisions to be decentralized and put in the hands of end users, or at least departments.
- Apple's loss of marketshare to the PC. On the strength of lower cost, less centralized control, modularity, and flexibility, a competitive, though arguably technically inferior, PC market overcame Apple's early lead, and holds it to this day.
Turning this into an analytic tool rather than merely a laundry list, I see the following principles and themes emerging:
Many of the successful strategies pitted a slightly inferior, but good-enough, competitor against a more elegant solution. In all cases, "worse" also tended to be less expensive (on a per-unit if not a per-capability basis), more flexible, more modular, and less centrally controlled.
For the classic treatment of this, see Richard P. Gabriel's "Worse is Better" and "The Rise of Worse is Better".
A product with reduced cost, or a reduced cost of entry, dominates a more expensive one.
Selling pieces to be assembled (or assembling pieces and selling many different products) beats a highly tuned, but single-purpose, system.
Reducing centralized control, often expressed as "the right to fork" in free software discussions, means that more ideas can be tried, and that the proving ground for new development is larger. This is critical, as the inventor of a new technology rarely foresees all of its possible applications. It also means no patent royalties or other licensing restrictions.
This dynamic is key to understanding both the rise and the likely fall of the Microsoft PC market. PCs emerged and succeeded as a decentralization tool -- they empowered users and broke the stranglehold of the corporate IT fiefdom. Today, Microsoft increasingly plays the role of controlling authority, dictating the terms under which other actors in the IT market can participate. GNU/Linux and free software offer decentralization and autonomy to hardware, software, and service vendors, as well as to end users.
Providing a uniform base on which to roll out services tends to increase utility -- IBM's System/360, DEC's minis, Unix, the PC, and GNU/Linux, as well as industry and technology standards such as ASCII, the RFCs, etc.
So you ask: what the hell do freedom and ethics have to do with this?
Simply: the free software development model feeds each of these success factors, point for point (spelled out further below).
First, a few more examples of the same dynamic at work, drawn from free software's own history:
- The dominance of free and open documentation formats such as HTML (and variants) and DocBook over closed, proprietary, or single-purpose standards: the dead word-processor format of your choice, "OpenDoc" (an Apple initiative), the GNU "info" format, etc. This is a theme Tim O'Reilly has raised on numerous occasions, and it fits hand in glove with the Langlois analysis.
- The success of the X11 windowing system over OpenLook / NeWS, despite the arguable technical superiority of the latter, largely on licensing grounds.
- The success of GNU/Linux against both proprietary Unices and Microsoft in the small to mid-sized server space. Similarly, the overwhelming success of GNU/Linux over all comers in the embedded/handheld space, largely based on costs, capabilities, and flexibility.
- The success of LAMP (GNU/Linux, Apache, MySQL, Perl) in dominating the web applications space. This is turning, somewhat; more accurately, LAMP is extending: Apache and GNU/Linux still dominate the webserver and OS components, but we're seeing the addition of Postgres, PHP, and Java in the database and language slots. The competing, and losing, strategy is the legacy MS Windows, IIS, ASP, SQL Server mix.
And on top of these, I'd like to revisit briefly the foundations of free software -- the principles on which it is grounded, which lead to its existence, and which point to its probable ultimate success:
- A development model: the open source "Bazaar" described by Eric Raymond, which takes advantage of many eyes, tight development cycles, and continuous evolution.
- A legal framework of free software licensing, including both copyleft (GNU GPL and similar) and less restrictive free licenses.
- An economic model which provides sufficient benefit to the individuals or firms developing works they do not exclusively retain.
- A software architecture of largely independent, modular design, allowing individual developers to "wrap their minds" around a given problem, and allowing code to be readily shared among different projects.
- A widespread, very low-cost distribution network: the Internet.
- Ready access to reasonably powerful computers with development tools.
- Open, accessible standards. GNU/Linux itself was based on the convergence of the POSIX standard and x86 hardware, with the additions of TCP/IP networking, X11, numerous RFCs, etc. Closing standards, or making them inaccessible (through licensing or royalty requirements), is a serious threat to GNU/Linux.
Free software depends on the intrinsic freedoms RMS espouses. This doesn't mean that a given adopter needs to embrace them, weigh them as a significant deciding factor, or even be aware of them. As a whole, however, free software's success is pinned on these principles. And due to the dynamic they create -- technical, cost, and control characteristics -- the ultimate success of free software is inevitable. It's an inevitability that can be put off for a time, at a cost. But it's where we're headed.
And no, I don't foresee a world in which all software is free. But by and large the key components will be, and any significant software sector will have its free software alternatives. The proprietary tools which do remain should be the better for the competition, and may be successful if they provide a compelling advantage.
And since Google is my filing cabinet and I can never turn up the Langlois document when I'm looking for it, some search keywords:
Keywords: computer industry history economics ibm dec vax unix microsoft apple richard n. langlois Cognition and Capabilities: Opportunities Seized and Missed in the History of the Computer Industry system 360 RISC ken olsen
To spell out the point-for-point claim made above -- how the free software development model feeds each of those success factors:
- Free software tends to expediency: people build stuff that works, with an emphasis on "get it working now" rather than "get it perfect later".
- Though much is made of free as in speech, I feel that free as in beer is also crucial. Among the factors contributing to Microsoft's dominance of the PC market is its dual use of price as a weapon: undercutting rivals' products when faced with competition, and taking usurious profits, which finance its other projects, when in a monopoly position. Despite its market dominance, Microsoft is utterly dependent on only two franchises -- operating systems and office software -- for all its profits. If either can be undercut, the company is stuck pitting revenues against marketshare. Losing on either count is not tenable long term. Free software allows both cost and marketshare to be attacked simultaneously.
- The free software development model is to produce largely independent, modular software. The exceptions to this rule are largely "liberated" proprietary products, notably Mozilla (which required a complete redesign) and OpenOffice.org (which will have to face the same music). The Unix philosophy of small, specialized tools, each doing one job, largely applies. Even the larger projects -- emacs, perl, apache, the kernel itself -- are themselves highly modularized.
- Free software is by definition decentralized. Sun will eventually learn this. The right to fork is not optional.
- GNU/Linux offers a uniform computing environment over virtually the entire electronic landscape. This includes serving as the binary compatibility standard on Intel architecture for POSIX environments, and source compatibility across a wide range of processors (a dozen architectures in the Debian project), scaling from wristwatches to mainframes.
Reading list: some texts which have influenced my thinking on free software, in addition to Langlois's article. No particular order. Intent is to extend the list to include a brief paragraph describing each work's significance.
Richard Stallman, The GNU Manifesto
Makes the moral argument, lays the roadmap, and starts the journey. A technological, political, and moral manifesto.
Steve McConnell, Code Complete
Many of the technical aspects of what makes free software tick (largely, modularity) are laid out in this book, published, ironically, by Microsoft Press.
Carl Shapiro & Hal Varian, Information Rules
Lock-in. What it is. How to avoid it (for users) or instill it (for vendors). Much on Microsoft from a DoJ specialist.
Lawrence Lessig, Code and Other Laws of Cyberspace
Interplay of law, commerce, society, and technology.
Steven Weber, The Success of Open Source
An extremely good, cogent, and thorough review and analysis of free software's history, foundations, motivators, and potential. Extends Langlois's history through 2004, adding both breadth and depth. Makes extensive references to Lessig and Mancur Olson (below). Busts a number of the more prevalent myths and sloppy thinking on the topic without introducing too many of his own.
Clayton Christensen, The Innovator's Dilemma
Or: how to understand ground-up revolutions.
Jared Diamond, Guns, Germs, and Steel
The best nutshell I can give is that I debated including this in the bibliography of a talk on free software licensing for the O'Reilly Open Source Summit in Monterey, 2000, then found that Red Hat's CTO Michael Tiemann was giving an entire presentation on it. Briefly: what circumstances give rise to robust, sustainable, overwhelming diversity.
Eric Raymond, The Cathedral and the Bazaar
The social-organizational concept behind free software projects. Somewhat fanciful, but the stake in the ground to which most subsequent analyses refer.
Other books and works of note:
These are notable, but not essential, works. ToDo: expand on these as well.
- Mancur Olson, The Logic of Collective Action
- Jane Jacobs, Cities and the Wealth of Nations
- Jeremy Campbell, Grammatical Man
- Douglas Hofstadter, Gödel, Escher, Bach
- Robert M. Pirsig, Zen and the Art of Motorcycle Maintenance
- Neal Stephenson, In the Beginning Was the Command Line
For an earlier take on a lot of this, as well as some interesting links, see my post "Comments and Criticism" to the Kuro5hin article "An Economist/Programer writes on Open Source", largely concerning U.S. Federal Reserve economist Gerald P. Dwyer, Jr. and his article "The Economics of Open-Source Software" (PDF), posted May 29, 2000.
-- KarstenSelf - 10 Oct 2003
-- KarstenSelf - 25 Mar 2003
"There is nothing new under the sun, but there are lots of old things
we don't know yet."