The Good, the Bad and the Ugly?
A Brief Look at Public Domain Software
|Originally published February, 1998|
|© 1998, 2005 Carlo Kopp|
The massive proliferation of public domain software tools and operating systems is a phenomenon without historical precedent. The huge growth in the Internet which we have seen since the early nineties has clearly benefited the user of public domain packages and operating systems.
Many public domain tools rival or better their commercial equivalents, and this raises many important questions in relation to the commercial software industry. How well can the commercial software industry weather this phenomenon? Who are the biggest beneficiaries of public domain software? Who are the biggest losers? What long term trends can we expect to see?
The sheer volume of public domain software available at this time makes it exceedingly difficult to generalise on many issues, since what might be true for some packages may not be true for others. The FUD produced by many commercial vendors in relation to competing free software may or may not be valid; indeed, it is often no more than FUD, lacking substance.
To place this in perspective it is helpful to briefly review the key areas in which public domain software has made its biggest inroads if for no other reason than to highlight areas in which commercial vendors have failed to address the needs of the customer base.
It is hardly an overstatement to observe the basic reality that public domain software has flourished precisely in those areas where the commercial vendors have failed to meet the expectations of the user community. Were the industry to have done the right thing by the consumers who feed it, there would be no public domain software worth mentioning. At its most basic level the public domain software boom reflects the fundamental rules of supply and demand: vendors unable to satisfy demand produce a reaction in the user base, which crafts its own tools to substitute for the overpriced or inadequate commercial product.
This may be a somewhat bitter pill for many of our vendors to swallow, but it is a reflection of modern times. Cheap and available hardware from the early nineties onward and widely available programming skills have become the "enabler" for this revolution in computing.
I am clearly a biased observer in this context, since I use almost exclusively public domain tools and operating systems. However, being immersed in the technology I am hopefully in the position to objectively judge its strengths and weaknesses. As a techno-literate in the discipline, many of what arguably qualify as shortcomings of public domain software are quite irrelevant to me as a user. This may not be true for other less well equipped users.
This leads us into the first important issue with public domain software, which is that of "stratification" of the user base into techno-literates and lay users. There can be no doubt that the biggest beneficiaries of the public domain software boom are the techno-literate users, who have the skills to locate, install, integrate, modify if necessary, and support and debug such code. Lay users then remain at the mercy of commercial vendors, since they are unable to independently support a public domain toolset. It is interesting to observe in this context that a number of consultancies (internationally) are now selling support services for public domain tools, such as the GCC compiler or Linux and FreeBSD operating systems. Is this a developing trend toward a new market segment? This remains to be seen, but if the growth trend continues, then almost certainly we will be seeing a market segment of vendors specialising in the support of public domain toolsets.
Making arbitrary divisions across public domain software is fraught with difficulty, given the enormous diversity of what is available. However, there is some merit in dividing the domain into three core areas, operating systems, compilers and development tools, and finally general tools. Each is deserving of some exploration to place them into context, in relation to commercial software.
The impetus for public domain operating systems was the popularity of Unix in the programming community, coupled with the availability of high performance PCs, and the arguably high pricing structures associated with commercial Unix variants. Despite long standing complaints from the user community, the vendor community strongly resisted the idea of flooding the market with a Unix port priced to compete against Microsoft's DOS.
I can recall an interesting discussion over lunch in San Jose, in early 1992, with a senior salesperson from Sun Microsystems. I suggested that Sun should exploit the release of the Intel SVR4 Solaris port, then in Alpha test, and flood the market with the product priced below USD 150 per licence. At that time Sun were negotiating the purchase of the SVR4 licence with the intent of full ownership, so there were no royalty cost impediments to such a scheme, but I was told that the idea was considered unacceptable to Sun's marketing hierarchy: it would diminish profitability. Unlike Microsoft, who clearly understood the benefits of a huge installed user base, Sun's then marketeers could not see any value in this idea. They preferred the model of fighting for a slice of the established SCO market, expanding the base of the Interactive Systems SVR4 port.
In the end no commercial vendor was prepared to bite the bullet, and as a result we saw the Linux phenomenon, closely followed by FreeBSD and its cousin, NetBSD.
Linux is today one of the fastest growing products in the public domain "marketplace", widely used in academia and by a growing number of home users. The baseline Intel port provides a fully featured Unix-clone, with an impressive package of precompiled tool binaries on most CD-ROM releases. More importantly, an ever increasing number of commercial software vendors are porting their Unix hosted packages to Linux, a clear indicator of the size of the user base. Established staples such as Adobe Acrobat Reader are available, as well as tools such as StarOffice, an MS Office clone, and even Wolfram's trusty Mathematica (please visit http://www.linux.org/apps/applications.html for more complete details).
Not content with the Intel architecture base, the Linux community have recently ported to the PowerPC architecture. I recently had the opportunity to play with a Linux installation on a 200 MHz Apple PowerMac and I was favourably impressed with the performance and stability of what is ostensibly a Beta port.
While Linux has the lead by a significant margin in terms of installed base, we have also seen significant growth in the FreeBSD base. The current 2.2.5 FreeBSD release is well and truly to a commercial standard and I am happy to argue that it is more robust than the production SunOS 4.1.2 used to be, many years ago. FreeBSD is the production port of the public domain 4.4BSD Unix and retains the superb networking performance of the BSD kernel. Since the BSD kernel was designed initially to run with resource limited 16 and 32 bit minicomputers, it is very lean and efficient in comparison with commercial SVR4 or NT.
It is interesting to note that FreeBSD has become very popular in the ISP community, and is competing strongly against low end commercial web servers. Recent releases of FreeBSD incorporate loadable kernel modules for Linux and SCO emulation, the former allowing FreeBSD hosts to run precompiled Linux binaries. To these you can add also commercial BSD/OS binaries, and a respectable number of commercial FreeBSD ports (more details at http://www.freebsd.org/commercial/software.html).
The next phase in the Linux/FreeBSD saga will be multiprocessing, with development ports for both operating systems in existence and likely to be available later this year in robust production versions.
Both Linux and FreeBSD use the XFree86 X11 server (http://www.xfree86.org/), which currently supports a truly massive range of boards, from the very basic up to the top-of-the-range Matrox cards. XFree86 supports not only basic 8-bit colour, but also, with suitable cards, 16-bit and 24-bit colour. Recent releases of XFree86 deliver respectable performance, and combined with a suitable Pentium and a fast graphics adaptor will deliver performance comparable to many commercial Unix workstations. Should you seek better performance, you can move up to Xinside/XiG's commercial XAccel server, available for FreeBSD, Linux, and commercial BSD/OS, SCO, and Solaris for Intel. With performance in the range of hundreds of thousands of Xstones, this fusion of public domain operating system, commercial X11 server and cheap PC hardware allows a clever user to craft a respectable Unix graphics workstation at a fraction of the cost of a commercial workstation.
If we are looking at support for X11, then the public domain is the place to go for tools like window managers, of which there is now a huge number, nearly all public domain. Of course, most of these public domain X tools are then repackaged by commercial OS vendors, something also largely true of Unix shells.
It has been argued that the boom in public domain Unix is the death-knell of the Unix workstation, already under assault from MS NT at the bottom end of the range. Whether this proves to be the case remains open to some debate. Every box hosting Linux or FreeBSD is a box hosting Unix instead of NT. Cold comfort perhaps for vendors who make their money from selling hardware, using Unix as a tool to sell that hardware, but for the Unix community it means that Unix can be deployed very cheaply in comparison with NT.
It is interesting in this context to look at a market in which the lowest cost tier is public domain Unix, the next cost tier is NT, and the top cost tier is commercial Unix. Since public domain Unix is free, nobody can compete with it for cost.
Traditionally Unix has been mostly available as ports to proprietary hardware, its purpose in the eyes of most vendors being that of a selling tool, rather than an end in itself. The propensity of many vendors to tweak and twiddle their ports, loading them up with proprietary "enhancements", has been mostly yet another commercial tool, designed to attract other vendors' clients as much as to minimise application portability and thus tie existing customers down. The failure of the Unix vendor base to accommodate the Intel platform with very cheap ports of the same Unix variants used on larger platforms has produced a situation where public domain Unix has filled the resulting vacuum. Is it too late for the vendor community to respond? Probably yes at this stage, since freebie Unix cannot be beaten for price. However, a Unix for Intel port priced below $100, with a precompiled package of public domain binaries, could still be a successful seller.
Arguably the Unix vendor community has misunderstood the market in a fundamental way, since the real game has been operating systems rather than hardware platforms. Most users do not care what machine architecture they use, since their primary interest is in the operating system and its suitability for the applications they run, with hardware performance/price ratio being most important at the bottom (i.e. desktop) end of the market. Raw hardware performance has been a critical decider only at the upper end of the market; at least this has been my experience over many years of writing and evaluating tenders for Unix machinery.
We can expect to see further ongoing growth in the public domain Unix base, especially once multiprocessing ports of Linux and FreeBSD mature. This suggests that the low end commercial Unix workstation and workgroup server is now a historical artifact, to be supplanted by public domain Unix. How well NT will compete against public domain Unix in the technically literate segment of the market remains to be seen; most of its success to date has been in displacing Novell workgroup servers.
Without any doubt the jewel in the crown of the public domain development tools is the Free Software Foundation's GNU gcc C compiler and its associated package of supporting tools. Often criticised as big and cumbersome, whatever its warts it does have the distinction of being ported to virtually every flavour of Unix in existence, and is mostly robust and mature. I have supported the gcc compiler for a number of commercial clients, including software developers, some of whom prefer to use it over vendor supplied commercial compilers. Where significant pressure exists to support multiple vendors, using the gcc compiler bypasses the problem of maintaining compiler specific ports for different platforms. I could not help but chuckle at last year's Avalon airshow, where a major aerospace vendor's demo software crashed, spewing gcc error strings over the console of the demonstration machine. In the highly regimented aerospace development community, this was an interesting revelation.
The gcc compiler may be the most widely used public domain development tool in existence, but it is by no means the only one. A quick browse through the development packages list on FreeBSD 2.2.2 provides us with a good indication of the range of goodies available, i.e. autoconf-2.12, bcc-95.3.12, boehm-gc-4.10, cflow-2.0, cgiparse-0.8c, crossgo32-1.3, crosssco-1.3, cutils-1.3.2, cvsup-15.0, cxref-1.3, dlmalloc-2.6.4, dmake-4.0, fpp-1.0, gcc11-2.6.3, gdbtk-4.16, gmake-3.75, id-utils-3.2, jp-mimekit-1.6, lclint-2.2a, libident-0.20, libmalloc-1.18, libslang-0.99.38, libwww-4.0D, linux_devel-0.2, linuxgdb-4.16, m4-1.4, mkmf-4.11, mprof-3.0, noweb-2.7d, perl-5.003, prcs-1.1.1, scogdb-4.16, swig-1.1b5, tkcvs-6.0, tvision-0.4, xmake-1.01, xwpe-1.4.2, and xxgdb-1.12. Impressive?
Tools like lclint are the equal of any commercial syntax checker I have seen to date, xxgdb does much the same job as many vendors' commercial screen based debuggers, and libmalloc allows you to bypass nasty vendor specific malloc behaviour.
Let's now take a brief look at the languages (compilers and interpreters) available on the same release, i.e. STk-3.1, bcc-95.3.12, bwbasic-2.20, cim-1.92, eiffel-13a, elk-3.0.2, eperl-2.1.1, expect-5.22.0, gofer-2.30a, guavac-0.2.5, guile-iii, icon-9.3, jp-tcl-7.6, kaffe-0.8.4, lcc-3.6, mit-scheme-7.3, mixal-1.06, modula-3-3.6, modula-3-lib-3.6, modula-3-socks-1.0, moscow_ml-1.4, ocaml-1.03, p2c-1.21a, pbasic-2.0, perl-5.003, pfe-0.9.9, pgcc-126.96.36.199, pgcc-2.7.2c, python-1.4, rexx-imc-1.6d, safe-tcl-1.2, sather-1.0.5, scheme48-0.36, schemetoc-93.3.15, scm-4e1, swi-pl-2.8.6, tcl-7.6, tclX-7.5.2, xlispstat-3.44, and xpl486-4.1. While many of these fall into the domain of esoterica, many are solid mainstream products.
Of course, if you wish to focus on mathematical modelling, often considered esoteric within itself, then the same CD yields Fudgit-2.41, Wingz-142, blas-1.0, calc-2.9.3, calctool-2.4.13, eispack-1.0, fftpack-1.0, freefem-3.0, gnuplot-325b, hexcalc-1.11, lapack-2.0, linpack-1.0, numpy-1.0b3, octave-2.0.1, oleo-1.6, ranlib-1.0, ss-1.3.3, xgraph-11.3.2, xlispstat-3.44, xplot-0.89, xspread-2.1, xvgr-2.10.1. While the humble gnuplot is a mainstay of academia, libraries such as linpack, eispack, fftpack are arguably indispensable for scientific programming, while octave is a massive and capable package by any metric.
No browse through the wondrous world of development tools would be complete without a look at editors, this yielding nothing less than aXe-6.1.2, asedit-1.3.2, beav-1.40.6, bpatch-1.0, e93-1.2.6, emacs-19.34b, gvim-4.6, joe-2.8, jove-4.16, mule-2.3, nedit-4.0.3, nvi-m17n-1.79-970408, staroffice-3.1b4, uemacs-3.12, vile-6.3, vim-4.6, xcoral-2.5, xed-1.3, xemacs-19.15, and xvile-6.3. Emacs is of course a traditional favourite in the programming community, despite its gigantic command set, while vim, an improved vi, is steadily growing in popularity.
Given the often high quality of many of these tools, developed by postgraduate computer scientists and maintained by universities, it is entirely feasible to develop and maintain commercial quality software using public domain tools alone. Indeed, I can think of a number of organisations which do exactly that.
While many of these tools are the result of university development projects, many are also the result of the same market pressures discussed earlier. Where vendors cannot deliver a suitable product at the price the market can bear, then a public domain clone will appear sooner or later, in effect ruining the market for commercial developers. Most users, subjected to internal budgetary pressures, will think twice before spending good money on a commercial tool which might offer only a marginal gain in useful capability over a public domain freebie.
The propensity of many software vendors to overload their products with often meaningless features, thereby reducing performance and increasing demands upon the host platform, is nothing less than a direct incentive for users to jump ship and go public domain instead. The software "bloat" phenomenon is clearly counterproductive to the maintenance of an existing user base, and it is very sad to see that many software vendors have yet to come to grips with this. Granted, bells and whistles sell well to an illiterate user base. In a literate user base, which understands that bells and whistles mean more RAM on the host system, a faster CPU, longer learning curves, and more bugs to fight, the marketing paradigm is clearly broken.
While operating systems and software development tools are clearly key areas in which public domain software has chewed significantly into the commercial marketplace, they are hardly the only ones.
Areas where public domain tools hold significant shares of the user base include mailers, newsreaders, browsers, benchmarking tools, terminal emulators and graphics. The currently immature GIMP will eventually provide an equivalent in many respects to Adobe's Photoshop, an established mainstay in image manipulation. With established public domain image manipulation tools like xv, it is possible to do much of what Photoshop can do without expenses other than learning time.
The humble Xfig drawing package, a poor man's equivalent to Adobe Illustrator, may be limited in capability compared to its commercial equivalent, but is perfectly usable for a wide range of 2D drawing and illustration tasks. Anybody who doubts this should visit my Xfig page at http://www.csse.monash.edu.au/~carlo/xfig.html. Most of the artwork used in this column is produced with Xfig.
Text processing and document manipulation is another area where public domain tools play an important role, especially in providing format conversions between proprietary packages, and from proprietary formats to public domain formats. To attempt to list the huge number of format converters is arguably a futile task, since they pop up like mushrooms, whenever a frustrated programmer must grapple with vendor specific data formats. Again this is another example of the user base defeating the commercial tactic of format incompatibility, designed to tie users down to proprietary products.
There can be no doubt that public domain software has changed and will continue to change the character of the marketplace. There can also be no doubt that the biggest beneficiaries of the public domain software boom are the techno-literates in the industry, be they individual users or organisations which are competent to exploit these tools. While the market for software continues to grow, there will still be many users out there with limited literacy who will have little choice but to use commercial software, but as the market matures this fraction will become less and less significant.
The questions which developers of commercial software, and operating systems, must ask are interesting ones indeed. Should we accommodate Linux and FreeBSD users with ports of our Unix hosted software? Should we consider the possibility that our pricing structures are driving users to develop public domain clones of our products? Are we allowing bloat of our products to create openings for public domain clones? Should we use public domain tools for internal development tasks? Is the bugginess of our product and the quality of our support good enough to keep our customers from moving to public domain tools?
The user of software products may also ask some interesting questions. Should I spend my $$$ on a commercial tool when I can get the same job done with a public domain tool? What cost penalties in support time do I incur with a public domain tool, compared to a commercial tool? Are the penalties of vendor dependency worth the benefits in productivity?
There are no simple answers to these questions, but the reality which we as an industry must grapple with is that public domain software is now an important player in the marketplace, and those who ignore it will do so at their peril.
For disbelievers, I must note that this series of feature articles was produced on a public domain FreeBSD 2.2.2 system, using only public domain tools and utilities...
|Last Updated: Sun Apr 24 11:22:45 GMT 2005|
|Artwork and text © 2005 Carlo Kopp|