Rob Lucas: THE CRITICAL NET CRITIC

Over at the NLR, Rob Lucas provides an incisive analysis of IT and Capital:

Computers, software etc. are not always and not inherently communications technologies, and their history is distinct from that of the telecoms infrastructure. IT goods only become infrastructural insofar as they play a role within communications technologies, and they have historically done this primarily through their integration with a pre-existing telecommunications infrastructure. We thus need to make a distinction here between means of communication and what we might call the ‘means of computation’. It is only the former which may be viewed as inherently infrastructural, ‘shared’ in the strong sense, and immediately prone to the types of ‘network effects’ exemplified in technologies such as the telephone. [17] Because of their socially general character, the development of new means of communication is typically a difficult matter for individual capitals; they tend to be hugely expensive to develop, and to involve complex issues of coordination. For these reasons they are often brought into being by state fiat—just as the Internet was. Goods produced for the purpose of information processing are not inherently prone to the same problems. They can, from the outset, be produced by individual capitals without concern for the general problems of infrastructure. But, like many significant technical innovations, their production tends to start out complex, expensive and difficult, before being finessed and cheapened over time. In the process they make a transition from what are effectively prototypes and short runs to mass-produced goods, at least insofar as a market exists or can be created for them. It is in essence this dynamic that Carr reads as a process of commoditization. At this level IT goods are little different from microwaves or refrigerators.
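
A quick aside on the ‘network effects’ point: the classic telephone illustration is that the number of possible pairwise connections grows with the square of the number of subscribers, so each new user makes the network more useful to everyone already on it. A minimal sketch of that arithmetic (the subscriber counts are illustrative assumptions, not figures from the article):

```python
# Illustrative network-effect arithmetic: an n-subscriber telephone network
# supports n*(n-1)/2 distinct pairwise connections, so potential usefulness
# grows much faster than the user count. Subscriber figures are assumed.
def possible_connections(n: int) -> int:
    """Number of distinct pairs that can call each other on an n-user network."""
    return n * (n - 1) // 2

for subscribers in (10, 1_000, 1_000_000):
    print(f"{subscribers:>9,} subscribers -> "
          f"{possible_connections(subscribers):,} possible connections")
```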

Beyond this, however, it is possible to identify a strong affinity for standardization which is peculiar to the computer, for all true computers—in Alan Turing’s sense—are logically identical: all are ‘universal Turing machines’ capable in principle of running any program written for any other computer, from the first mainframe to the most advanced modern data centre. The possibility of copying software and functionality from one computer to another is thus basic to computing. [18] Furthermore, all encoded data is—like language—necessarily iterable and thus, in principle, capable of being copied. There are, of course, many situations in which it is not only possible, but also very useful to be able to copy software and data from one computer to another; a parallel argument can be made at the level of hardware components. Concrete incompatibilities between individual machines thus appear an impediment to transfers that ‘should’ by definition be possible. For users, there are penalties in purchasing non-standard IT goods that inhibit such transfers, and concomitantly for producers a strong incentive to enable them—especially once standards have already begun to cohere. The creation of standards to interlink such machines and to facilitate the transfer of data and functionality thus inevitably presents itself as a problem to be solved, and it is this that ultimately leads, amongst other things, to the Internet and Web, two technologies which are most fundamentally realizations of technical standards—TCP/IP (Transmission Control Protocol/Internet Protocol) for the former, and HTTP (Hypertext Transfer Protocol) for the latter. More than any particular technology, phenomena like the Internet and Web are products of communications protocols: sets of precise rules for transferring data from one computer to another.
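
To make concrete what ‘sets of precise rules for transferring data’ means in practice, the sketch below writes an HTTP/1.1 request by hand over a TCP socket. The host name and headers are illustrative assumptions, but the shape of the exchange (a request line, headers, and a terminating blank line, all carried over TCP/IP) is exactly the kind of rule the passage has in mind:

```python
# A minimal sketch of HTTP-over-TCP: the request line, headers and terminating
# blank line are HTTP's "precise rules"; TCP/IP handles moving the bytes.
# The host and port below are illustrative assumptions.
import socket

HOST = "example.com"
PORT = 80

with socket.create_connection((HOST, PORT), timeout=10) as sock:
    request = (
        "GET / HTTP/1.1\r\n"
        f"Host: {HOST}\r\n"
        "Connection: close\r\n"
        "\r\n"
    )
    sock.sendall(request.encode("ascii"))

    # Read until the server closes the connection, then show the status line.
    response = b""
    while chunk := sock.recv(4096):
        response += chunk

print(response.split(b"\r\n", 1)[0].decode("ascii", errors="replace"))
```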

If the computer, then, is not inherently a means of communication, the universality of its basic logical construction means that it is highly probable people will want to find standard ways to move data, software and hardware components between individual computers. And, once these are well established, it is only a small step to start using the transfer of data between computers for communication between people, especially once the mass production of hardware—which itself is promoted by, and in turn promotes, technical standardization—results in computers becoming widely distributed across society. The computer then, as a social artefact, has at least some strong elective affinities with means of communication, and we see these realized in the fusion of means of communication and computation which is the Internet.

Once a large network such as the Internet arises, the penalties of departing from its standards become so great as to rule it out in most cases, thus reinforcing the underlying proclivity for standardization. Yet standardization deprives individual IT manufacturers of significant ways of qualitatively differentiating their products from those of competitors. If a given IT commodity is fundamentally generic, and it can thus be readily exchanged for one produced by a competing capital, competition will tend to be focused more strongly around factors such as speed and capacity—which have so far proven technically quite open-ended—and price. [19] It may then be that the importance of standardization here means that competitive dynamics, which are obviously general to capitalism as a whole, are particularly acute when computing is involved, and even more so once the computer has begun to be used for communication.

There is also inherent in the computer, however, a tendency to undermine competitive dynamics at the level of software. Since what is produced for one is, at least in principle, capable of running on all others—and this possibility is enhanced with increasing standardization—and since software code is highly labour-intensive to produce initially but inherently iterable once written, there is a strong possibility that a single capital can serve the entire market for a particular software commodity. If the technical universality of the computer may promote competitive dynamics at some levels, then, it promotes monopolies in software. It may be tempting to refer to the ‘natural monopoly’ character of means of communication here, but it is notable that these monopolistic possibilities in software exist independently of any technological convergence of computation with the means of communication. The formation of monopolies at the level of software is, however, not the end of the story. As noted above, the very ease with which monopolies are potentially formed in software makes extraordinary counter-measures rational for competitors. It may even be rational to give away software entirely for free, or to fund the development of a Free or Open Source Software equivalent, if it helps prevent a competitor from developing a monopoly position from which it may thereby threaten to impinge upon other lines of production in which profits are still possible, or to grab a market that might be ‘monetized’ in some other way later. The tension between monopoly and competitive dynamics in software thus tends not only to lead to declining prices, as in the case of hardware, but to destroy software’s commodity status altogether, as the market gradually fills with free alternatives to a dominant product.
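
The monopoly tendency described here is at bottom a matter of cost structure: almost all of the cost of software sits in writing the first copy, and next to none in producing the second. A rough sketch with made-up figures (neither number comes from the article) shows how the average cost per copy collapses as the user base grows, which is why a single producer can plausibly serve the entire market:

```python
# Illustrative arithmetic (assumed figures): software has a large fixed cost to
# write once and a near-zero cost to copy, so the average cost per unit collapses
# as the user base grows -- the cost structure that favours a single supplier.
FIXED_DEV_COST = 50_000_000   # hypothetical cost of producing the first copy
MARGINAL_COST = 0.01          # hypothetical cost of distributing one more copy

for users in (1_000, 100_000, 10_000_000):
    avg_cost = FIXED_DEV_COST / users + MARGINAL_COST
    print(f"{users:>10,} users -> average cost per copy ~ ${avg_cost:,.2f}")
```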

These general considerations correspond more or less to what has actually happened in the history of IT in terms of standardization, rapid technical progress and deflationary tendencies at the level of hardware, and strong monopoly tendencies at the level of software coupled with tendencies for software to be decommodified, even before the full convergence of IT and means of communication. Such dynamics have led to the centralized data centres of the present, which run on vast quantities of cheap, standard hardware and Free or Open Source software, and which supply potentially universal markets with proprietary—but often free (in terms of price)—software services. Of course, this cursory sketch isolates peculiarities of IT from the larger macroeconomic context. In any real economic history of these developments, other factors such as the role of cheap Asian labour would have to be considered.

Finally, if both hardware price deflation and the destruction of the commodity status of software tend to limit the prospects for profitability of IT firms even while monopoly positions develop, we might expect the dominant players to exploit their positions to derive revenue from non-IT sources or seek to move into fresh lines of production. And this is precisely what we do see: Google giving away most of its software services for free, but deriving its revenue from marketing; Amazon remaining centred on its role as a retailer of non-IT goods, such as books, but branching out into the production of new kinds of gadgets; Apple taking a significant part of its revenue from content sales via the iTunes store, while repeatedly moving into new lines of production. The imperative to exploit tech-monopoly positions for non-tech revenues might also help explain the increasing alignments between these tech giants and commercial content providers. If it is tempting to appeal to the evidence of the continuing buoyancy of leading IT players, against the grim economic outlook described by Carr, it may be that their success is precisely predicated on their extracting revenues from areas that are not presently afflicted with the general limits of the IT industry. These currently successful companies, defying a global context of crisis, may be the exceptions that prove the rule.

 
