[Figure 8.1 appears here. Over 1965-2015 it plots the 'quantity' of capital in utils (left scale, normalized to 100 in 1970) under a 1970 equilibrium and under a 1974 equilibrium - the two series ending the period at 89 and 117, respectively - together with the number of automotive factories and the number of oil rigs (right scale).]
Figure 8.1 Energy User-Producer, Inc.
Note: The number of automotive factories and oil rigs is hypothetical. The 'quantity' of capital with a 1970 equilibrium assumes that the 'util-generating capacities' of an automotive factory and an oil rig have a ratio of 2:1, while the 'quantity' of capital with a 1974 equilibrium assumes that the ratio is 1:1.
For instance, if in 1970 an automotive factory cost $40 million and an oil rig $20 million, the company's 33 factories and 20 oil rigs would be worth $1.72 billion:
4. $1.72 bn = 33 * $40 mn + 20 * $20 mn
Now, assume that 1970 was a year of perfectly competitive equilibrium. According to the neoclassical scriptures, this happy situation means that the 2:1 ratio between the price of automotive factories and oil rigs 'reveals' to us their relative efficiencies. It tells us that an automotive factory generates twice as many universal utils as an oil rig. With this assumption, we can then use Equation (5) to compute the overall 'quantity' of capital (Q) owned by Energy User-Producer, Inc. for any year t:6
5. Qt = N1t * $40 mn + N2t * $20 mn
where N1t and N2t are, respectively, the number of automotive factories and the number of oil rigs owned in year t.
The results, normalized to Q1970 = 100, are depicted by the thin line in the upper part of Figure 8.1. 7 We can see that the increase in the number of the presumably less productive oil rigs was more than offset by the decline in the number of the presumably more productive automotive factories. As a result, the overall 'quantity' of capital fell by 11 per cent throughout the period.
Of course, there is no particular reason to assume that 1970 was a year of perfectly competitive equilibrium. Any other year could do equally well. So, for argument's sake, let us pick 1974 as our equilibrium year and see what it means for the computations.
Many things happened between 1970 and 1974. The most important for our purpose were probably the threefold rise in the price of crude oil and the accompanying increase in the price of energy-producing equipment, including oil rigs. So, for the sake of illustration, let us assume that, by 1974, the price of oil rigs doubled to $40 million, while the price of automotive factories remained unchanged at $40 million. Now, since these are equilibrium prices, the 'revealed' efficiency ratio between automotive factories and oil rigs must be 1:1 (as opposed to 2:1 in a 1970 equilibrium). To compute the 'quantity' of capital owned by Energy User-Producer, Inc., we would now use Equation (6):
6. Qt = N1t * $40 mn + N2t * $40 mn
The results of this computation, again normalized to Q1970 = 100, are plotted by the thick line in the upper portion of Figure 8.1. The difference from the previous calculations is striking: whereas equilibrium in 1970 implies an 11 per cent fall in the 'quantity' of capital, equilibrium in 1974 shows a 17 per cent increase. The reason for this divergence is that, compared with the first assumption, the 'relative efficiency' of oil rigs is assumed to be twice as large, with the result that each additional oil rig adds to the aggregate util-generating capacity double what it did before.
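To make the arithmetic of Equations (4)-(6) and the normalization described in footnote 7 easy to check, here is a minimal Python sketch. Only the 1970 counts (33 factories, 20 rigs) and the two price assumptions come from the text; the later counts, the year labels and the helper names (quantity, normalize) are invented for illustration, chosen merely so that the two normalized series diverge roughly as the figure shows.

```python
# Hypothetical counts of automotive factories and oil rigs. Only the 1970
# values come from the text; the later years are invented for illustration.
counts = {
    1970: (33, 20),   # (automotive factories, oil rigs)
    1974: (28, 27),
    2010: (15, 47),
}

def quantity(n_factories, n_rigs, factory_price, rig_price):
    """Price-weighted aggregate 'quantity' of capital, as in Equations (5) and (6)."""
    return n_factories * factory_price + n_rigs * rig_price

# Equation (5): 1970 prices taken as equilibrium (factory:rig efficiency ratio 2:1).
q_eq_1970 = {t: quantity(n1, n2, 40, 20) for t, (n1, n2) in counts.items()}
# Equation (6): 1974 prices taken as equilibrium (ratio 1:1).
q_eq_1974 = {t: quantity(n1, n2, 40, 40) for t, (n1, n2) in counts.items()}

def normalize(series, base=1970):
    """Rebase a series to 100 in the base year, as explained in footnote 7."""
    return {t: 100 * v / series[base] for t, v in series.items()}

for label, series in (("1970 equilibrium", q_eq_1970), ("1974 equilibrium", q_eq_1974)):
    print(label, {t: round(v, 1) for t, v in normalize(series).items()})
# The same physical capital goods yield one series that falls and one that rises,
# depending solely on which year's prices are treated as equilibrium.
```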
So now we have a dilemma. Since we are dealing with the same collection of capital goods, obviously only one of these 'quantity' series can be correct - but which one? To answer this simple question all we need to know is in which of the two years - 1970 or 1974 - the world was in perfectly competitive equilibrium. The embarrassing truth, though, is that no one can tell.
6 This computation contains an important caveat that must be noted. According to neoclassical theory, the equilibrium price ratio is proportionate to the relative efficiencies of the marginal automotive factory and oil rig, but that isn't enough for our purpose here. In order to compute the overall 'quantity' of capital, we need to know the relative efficiencies of the average factory and rig, and that latter information is not revealed by prices. Neoclassical manuals tell us that the average and marginal efficiencies are generally different, and that save for special assumptions there is no way to impute one from the other. This discrepancy is assumed away by the national account statisticians. The standard practice - which we follow here for the sake of simplicity - is to take equilibrium prices as revealing average efficiencies and utilities. This bypass enables the measurement of 'real' quantities, at a minor cost of making the result more or less meaningless.
7 A 'normalized' series is calibrated to make its value equal to a given number in a particular 'base year' (in this case, 100 in the base year 1970). To achieve this calibration, we simply divide every observation in the series by the value of the series in the base year and multiply the result by 100 (so the normalized series = Qt / Q1970 * 100). The absolute magnitudes of the observations change, but since all the observations are divided and multiplied by the same numbers, their magnitudes relative to each other remain unaltered.
The consequence of this inability is dire. Depending on the year at which prices were in equilibrium, the divergence between the two measures in Figure 8.1 could be much bigger or much smaller, only that we would never know it. Moreover, the predicament goes beyond capital goods. It applies to any heterogeneous basket of commodities, and it grows more intractable the more encompassing the aggregation.
Neoclassical theorists love to assume the problem away by stipulating a world that simply leaps from one equilibrium to the next. In our illustration, this assumption would make both 1970 and 1974 equilibrium years, eliminating the difficulty before it even arose.
Unfortunately, though, this assumption doesn't serve the statisticians. The reason is that, with ongoing equilibrium, all 'pure' price changes - i.e. changes that do not correspond to alterations of the commodity - must be attributed to variations in technology or tastes; yet, if tastes keep changing, we lose the basis for temporal comparisons. With consumer preferences having shifted, the same automotive factory may represent in 1974 a different 'util generating capacity' than it did in 1970; similarly, an increase in the number of rigs may denote a decrease as well as an increase in their 'util generating capacity', all depending on the jerky desires of consumers. So once the util dimensions of commodities are no longer fixed, we lose our benchmark and can no longer talk about their 'real' quantities.
Therefore, here the statisticians part ways with the theoreticians. They assume, usually implicitly, that equilibrium occurs only infrequently. And then they convince themselves that they can somehow identify these special points of equilibrium, even though there is nothing in neoclassical manuals to tell them how to do so.
Quantity without equilibrium
Elusive equilibrium is devastating for measurement. Consider again our example of oil rigs and automotive factories. The first produce energy and the second use it, so their relative prices depend crucially on the global political economy of the Middle East, home to two thirds of the world's known oil reserves and one third of its daily output. The difficulty for the statisticians is that this region, with its complex power conflicts between large corporate alliances, regional governments, religious movements and superpowers, never settles into a perfectly competitive equilibrium; but if so, how can we use the relative prices of oil rigs and automotive factories as a measure of their relative 'quantities'?
And the problem is hardly unique to oil rigs and automotive factories. Indeed, given that the entire world is criss-crossed by huge corporate alliances, complex government formations, contending social groups, mass persuasion, extensive coercion and the frequent use of violence, how can any market ever be in equilibrium? And if that is the case, what then is left of our attempt to quantify the 'capital stock' or any 'real' magnitude?
The statisticians try to assume the problem away by mentioning it as little as possible. The most recent UN guidelines on the system of national accounts, published in 1993, contain only ten instances of the word 'equilibrium', none of which appear in Section XVI on Price and Volume Measures, while the word 'disequilibrium' is not mentioned even once in the entire manual (United Nations. Department of Economic and Social Affairs 1993). The more recent OECD document on Measuring Capital (2001) no longer refers to either concept.
Fortunately, an earlier version of the UN national account guidelines, published in 1977, before the victory of neoliberalism, was not as tight-lipped. This older version provides occasional advice on how to deal with unfortunate 'imperfections', so we can at least get a glimpse of how the statisticians conceive the problem. One 'special case of difficulty' identified here is internal, or non-arm's length, transactions between related enterprises or branches of the same company. Since transfer prices set under these conditions may be 'quite arbitrary', to use the UN's wording, the advice is to 'abandon value [meaning price] as one of the primary measures' and replace it with a 'measure of physical quantity' combined with an estimate of 'what the equivalent market price would have been' (United Nations. Department of Economic and Social Affairs 1977: 12). Needless to say, the authors do not explain why we would need prices if we already knew the physical measures. They are also silent on where we could find 'equivalent' market prices in a world where all markets have already been contaminated by power.
Another challenge is the repeated need to 'splice' different price series when a new product (say an MP3 player) replaces an older one (a CD player). Since the splicing involves a price comparison of two different objects, the guidelines recommend that the replacement 'should take place at a time when the assumption that the price differences between the two products are proportional to the quality differences is most likely to be true' (p. 10). In short, splice only in equilibrium. But since nobody knows when that happens (if ever), the guidelines concede that, in practice, the decision 'must be essentially pragmatic' - that is, arbitrary.
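For concreteness, here is a minimal Python sketch of one common way such splicing is carried out: the price gap at the replacement date is imputed entirely to quality, and the new product's series is rescaled by that ratio and chained onto the old one. The products, dates and prices are invented, and this illustrates the general technique rather than the UN guidelines' actual procedure.

```python
# Hypothetical price series for the outgoing and incoming product.
cd_player  = {2003: 100.0, 2004: 95.0, 2005: 90.0}     # old product
mp3_player = {2005: 180.0, 2006: 150.0, 2007: 120.0}   # new product, overlaps in 2005

# At the splice year the price difference is assumed to reflect quality alone,
# so the new series is rescaled by the ratio of the two prices at that date.
link = cd_player[2005] / mp3_player[2005]   # 0.5 - the imputed quality adjustment

spliced = dict(cd_player)
spliced.update({t: p * link for t, p in mp3_player.items()})
print(spliced)   # {2003: 100.0, 2004: 95.0, 2005: 90.0, 2006: 75.0, 2007: 60.0}
```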
The impossibility of equilibrium blurs the very meaning of 'well-being' - and therefore of the underlying substance that utils supposedly measure. The neoclassical conception of well-being is anchored in the 'sovereign consumer', but how can we fathom this individual sovereign in a world characterized by conflict, power, compulsion and brain-washing? How many consumers can remain 'autonomous' under such conditions? And if they are not 'autonomous', who is in the driver's seat? Henry Ford is reputed to have said that 'If I'd asked people what they wanted, they would have asked for a faster horse'. But even if we take a less condescending view and assume that consumers do have some autonomy, how do we separate their authentic needs and wants from falsely induced ones?
These are not nitpicking questions of high theory. The ambiguities here have very practical implications. For example, the national accounts routinely add the value of chemical weapons to that of medicine - this under the assumption that, in perfectly competitive equilibrium, one dollar's worth of the former improves our lives just as much as one dollar's worth of the latter. Consumers, however, do not live under such conditions, and rarely are they asked whether their country should produce chemical weapons, in what quantities or for what purpose. So how do we know that producing such weapons (Bentham's pushpins) indeed makes consumers better off in the first place, let alone better off to an extent equal to how well off they would have been had the same dollars been spent on medicine (Bentham's poetry)? Perhaps the production of chemical weapons undermines their well-being? And if so, should we not subtract chemical weapons from GDP rather than add them to it? 8
And it's not like medicine and weapons are in any way special. What should we do, for instance, with items such as private security services, cancerous cigarettes, space stations, poisonous drugs, polluting cars, imbecile television programmes, repetitive advertisements and cosmetic surgery? In the absence of perfectly competitive equilibrium, how do we decide which of these items respond to autonomous wants, which ones help create false needs, and which are simply imposed from above?
Neoclassicists cannot answer these questions. Yet, unless they do, all 'real' measurements of material aggregates - from the flows of the national accounts to the stocks of capital equipment and personal wealth - remain meaningless. 9
8 In practice, the 'real' quantity of weaponry is measured in exactly the same way as the 'real' quantity of food or clothing - i.e. by deflating its money value by its price (U.S. Department of Commerce. Bureau of Economic Analysis 2005: 33-35). Presumably, this 'real' quantity represents the nation's happiness from annihilating one enemy multiplied by the number of enemies the weapons can kill (plus the collateral damage). The statisticians are understandably secretive on how they actually determine this utilitarian quantum.
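A minimal sketch of the deflation step that footnote 8 refers to - dividing a nominal money value by a price index to obtain a 'real' quantity. The spending figures and deflator values below are invented for illustration.

```python
# Hypothetical nominal spending on weaponry ($bn) and a price deflator (2004 = 1.0).
nominal_spending = {2004: 100.0, 2005: 110.0}
price_index      = {2004: 1.00,  2005: 1.04}

# 'Real' quantity = nominal value deflated by its price index.
real_quantity = {t: nominal_spending[t] / price_index[t] for t in nominal_spending}
print(real_quantity)   # {2004: 100.0, 2005: ~105.8} -> 'real' weaponry grew by ~5.8%
```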
9 Alternative 'green' measures, such as the Index of Sustainable Economic Welfare (ISEW) and the Genuine Progress Indicator (GPI), do not really solve the quantification problem. These measures, pioneered by Herman Daly and John Cobb (1989), try to recalibrate the conventional national accounts for 'externalities' and other considerations. They do so by (1) adding the supposed welfare impact of non-market activities, (2) subtracting the assumed effect of harmful and unsustainable market activities and (3) correcting the resulting measure for income inequalities. Although the aim of these recalibrations is commendable, they remain hostage to the very principles they try to transcend. Not only do they begin with conventional national accounting as raw data, but their subsequent additions and subtractions accept the equilibrium assumptions, logic and imputations of utilitarian accounting (for a recent overview of 'green accounting', see Talberth, Cobb, and Slattery 2007).
Hedonic regression
The final nail in the neoclassical measurement coffin is the inability to separate changes in price from changes in quality. The gist of the problem is simple. Even if we could do the impossible and somehow measure relative quantities at any given point in time, as time passes the 'things' we measure are themselves changing.
In our example of Energy User-Producer, Inc., we assumed that the automotive factories and oil rigs remained the same over time and that only their number changed. But in practice this is rarely if ever the case, and the implications for measurement are devastating.
To illustrate the consequences, suppose an oil rig made in 1984 cost 40 per cent more than one made in 1974. Assume further that the new rig is different than the old one, having been altered in order to make it more productive. From a neoclassical perspective the higher price is at least partly due to the new rig's greater 'quantity' of capital. The question is how big are the respective effects? What part of the increase represents the augmented quantity and what part is a 'pure' price change?
As the reader must realize by now, the simple answer is that we do not know and, indeed, cannot know. The statisticians, though, again are forced to pretend. Since the measured items are constantly being transformed, they need to be, again and again, reduced back to their universal units, however fictitious.
Luckily (or maybe not so luckily), neoclassical theorists have come up with a standard procedure for this very purpose, fittingly nicknamed 'hedonic regression'. The idea was first floated in the late 1930s and had a long gestation period. But since the 1990s, with the advance of cheap computing, it has spread rapidly and is now commonly used by most national statistical services. 10
How would a statistician use this procedure to estimate the changing quantity of the oil rig in the above example? Typically, she would begin by specifying the rig's underlying 'characteristics', such as size, capacity, weight, durability, safety, and so on. Next, she would regress variations in the price of the rig against observed changes in these different characteristics and against time (or a constant in cross-section studies); the estimated coefficient associated with each characteristic would then be taken as the weight of that characteristic in the 'true' quantity change of the rig, while the time coefficient (or constant) would correspond to the 'pure' price change. Finally, she would decompose the observed price change into two parts, one reflecting the increased quantity of capital and the other reflecting pure price change. 11
10 For early studies on the subject, see Court (1939), Ulmer (1949), Stone (1956) and Lancaster (1971). Later collections and reviews include Griliches (1971), Berndt and Triplett (1990), Boskin et al. (1996), Banzhaf (2001) and Moulton (2001). For a critical assessment, see Nitzan (1989).
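A minimal Python sketch of the kind of regression just described: observed price changes are regressed on changes in characteristics plus a constant, and the constant is then read off as the 'pure' price change. This is an illustration on invented data, not any statistical agency's actual model; the two characteristics, the 'true' coefficients and the noise level are all assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic cross-section of 200 rigs: rates of change of two characteristics.
n = 200
x1 = rng.normal(0.05, 0.02, n)   # change in 'extracting capacity'
x2 = rng.normal(0.02, 0.01, n)   # change in 'durability'

# Invented 'true' structure used only to generate the observed price changes.
true_b0, true_b1, true_b2 = 0.03, 0.4, 0.6
p = true_b0 + true_b1 * x1 + true_b2 * x2 + rng.normal(0, 0.005, n)

# Ordinary least squares: constant + characteristic changes.
X = np.column_stack([np.ones(n), x1, x2])
coef, *_ = np.linalg.lstsq(X, p, rcond=None)
b0, b1, b2 = coef

quality_change = b1 * x1.mean() + b2 * x2.mean()   # imputed 'quantity' component
print(f"'pure' price change (constant): {b0:.3f}")
print(f"average imputed quality change: {quality_change:.3f}")
# Whatever split the regression happens to produce must be accepted as the
# quantity/price decomposition - there is no independent way to test it.
```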
Hedonic regressions are particularly convenient since they cannot be 'tested', at least not in the conventional econometric sense. Standard econometric models are customarily judged based on their overall 'explanatory power' measured by such statistics as R2 or F-tests, on the 'significance' of their coefficients indicated by t-statistics and on other bureaucratic criteria. Hedonic regressions are different. These models do not try simply to explain variations in price by variations in the commodity characteristics. Rather, their purpose is to decompose the overall variations in price into two parts - one attributed to changes in characteristics and a residual interpreted as a 'pure' price change. However, since there is no a priori reason to expect one part to be bigger than the other, we have to accept whatever the empirical estimates tell us. If changes in characteristics happen to account for 95 per cent of the price variations, then the 'pure' price change must be 5 per cent. And if the results are the opposite, then the 'pure' price change must be 95 per cent.
This peculiar situation makes hedonic computations meaningful only under very stringent conditions. First, the characteristics that the statistician enumerates in her model must be the ones, and the only ones, that define the items' 'quality' (so that we can neither omit from nor add to the list of specified characteristics). Second, the statistician must use the 'correct' regression model (for instance, she must know with certainty that the 'true' model is linear rather than log-linear or exponential). And last but not least, the world to which she applies this regression must be in a state of perfectly competitive equilibrium, and the 'economic agents' of this world must hold their tastes unchanged for the duration of the estimates.
11 For instance, suppose oil rigs have two characteristics, 'extracting capacity' and 'durability', and that x1 and x2 represent the temporal rates of change of these two characteristics, respectively. A simple-minded, cross-section hedonic regression could then look like Equation (1):
1. p = b0 + b1 x1 + b2 x2 + u,
where p is the overall rate of change of the price of oil rigs, b0 is the 'pure' price change, b1 and b2 are the respective contributions of changes in 'extracting capacity' and in 'durability' to changes in the 'quantity' of oil rigs, and u is a statistical error term. Now suppose that, based on our statistical estimates, b1 = 0.4 and b2 = 0.6, so that:
2. p = b0 + 0.4 x1 + 0.6 x2
Next, consider a situation in which 'new and improved' rigs have twice the 'extracting capacity' of older rigs (x1 = 2), the same durability (x2 = 0), and a price tag 40 per cent higher (p = 0.4). Plugging these numbers back into Equation (2), we would then conclude that new rigs have 80 per cent more 'quantity of capital' (0.4 * 2 + 0.6 * 0 = 0.8) and that the pure price change must have been a negative 40 per cent (b0 = -0.4).
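The same decomposition, restated as a few lines of Python so the arithmetic of footnote 11 can be checked directly; all the numbers come from the footnote.

```python
# Decomposition of an observed price change into a characteristics ('quantity')
# component and a residual 'pure' price change, using footnote 11's numbers.
b1, b2 = 0.4, 0.6   # estimated contributions of 'extracting capacity' and 'durability'
x1, x2 = 2.0, 0.0   # rates of change of the two characteristics for the new rig
p = 0.4             # observed overall rate of change of the rig's price (+40%)

quantity_change = b1 * x1 + b2 * x2       # 0.8 -> 'quantity of capital' up 80%
pure_price_change = p - quantity_change   # b0 = -0.4 -> 'pure' price down 40%

print(f"quantity change: {quantity_change:+.0%}")
print(f"'pure' price change: {pure_price_change:+.0%}")
```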
Now, since the model cannot be tested, these three conditions must be true. Otherwise we end up with meaningless results without ever suspecting as much. Sadly, though, the latter outcome is precisely what we end up with in practice: the first two conditions can be true only by miraculous fluke, while the third is a social oxymoron. And since none of the necessary conditions hold, we can safely state that, strictly speaking, all 'real' estimates of a qualitatively changing capital stock are arbitrary and therefore meaningless. Indeed, for that matter every quantitative measure of a qualitatively changing aggregate - including consumption, investment, government expenditure and the GDP itself - is equally bogus. Statements such as 'real GDP has grown by 2.3 per cent' or 'the capital stock has contracted by 0.2 per cent' have no clear meaning. We may see that these stocks and flows are changing, but we cannot say by how much or even in which direction.
Quantifying labour values
Do the Marxists avoid these measurement impossibilities? 12 On the surface, it looks as though they should. In contrast to the neoclassicists whose capital is measured in subjective utility, Marxists claim to be counting it in objective labour units; and so, regardless of how and why capital accumulates, at least its magnitude should be problem free. But it isn't.
Concrete versus abstract labour
According to Marx, the act of labour has two distinct aspects, concrete and abstract. Concrete labour creates use value. Building makes a house, tailoring creates a coat, driving creates transportation, surgery improves health, etc. By contrast, abstract labour creates value. Unlike concrete labour which is unique in its characteristics, abstract labour is universal, 'a productive activity of human brains, nerves, and muscles . . . the expenditure of human labour in general . . . the labour-power which, on average, apart from any special development, exists in the organism of every individual' (Marx 1909, Vol. 1: 51).
The key question is how to quantify this abstract labour. In the above quotation Marx uses a 'biological' yardstick. Abstract labour, he says, is the 'activity of human brains, nerves and muscles'. But as it stands, this yardstick is highly problematic, for at least two reasons. First, there is no way to know how much of this 'biological activity' is embodied in the concrete labour of a cotton picker as compared to that of a carpenter or a CEO. Second, and perhaps more importantly, relying on physiology and biology here goes against Marx's own insistence that abstract labour is a social category.
12 As noted, most neo-Marxists have abandoned the labour theory of value. As we shall see at the end of this section, however, the question remains relevant to their inquiry as well.
Marx resolves this problem, almost in passing, by resorting to another distinction - one that he makes between skilled labour and unskilled, or simple, labour. The solution involves two steps. The first step establishes a quantitative equivalence between the two types of labour: 'Skilled labour', Marx says, 'counts only as simple labour intensified, or rather as multiplied simple labour, a given quantity of skilled being considered equal to a greater quantity of simple labour' (ibid.). The second step, a few lines later, ties this equivalence to abstract labour: 'A commodity', he asserts, 'may be the product of the most skilled labour, but its value, by equating it to the product of simple unskilled labour, represents a definite quantity of the latter labour alone'. In other words, abstract labour is equated with unskilled labour, and since skilled labour is merely unskilled labour intensified, we can now express any type of labour in terms of its abstract labour equivalence.
This solution is difficult to accept. The very parity between abstract and unskilled labour seems to contradict Marx's most basic assumption. For Marx, skilled and unskilled labour are two types of concrete labour whose characteristics belong to the qualitative realm of use value. And yet, here Marx says that labour skills are also related to each other quantitatively and that this relationship is the very basis of value.
This claim can have two interpretations. One is that objective labour value depends on subjective use value - a possibility that Marx categorically rules out in the first pages of Das Kapital. The other interpretation is that there are in fact two types of use value: subjective use value in consumption and objective use value in production. The first type concerns the relationship between people and commodities; as such, it cannot be quantified and is justly ignored. The second type involves the technical relationship of production and hence is fully measurable. Unfortunately, this latter interpretation isn't convincing either. As we have seen in Chapter 7, it is impossible to objectively delineate the productive process; but, then, how can we hope to objectively measure something that cannot even be delineated?
Value theorists, though, seem to insist that use value in production can be measured. So, for argument's sake, let's accept this claim and assume that unskilled labour indeed is a measurable quantum. What would it take to compute how much of this quantum is embedded in commodities?
For this calculation to be feasible, one or both of two conditions must be met. 13 The first condition is satisfied when a commodity is produced entirely by unskilled labour. In this case, all we need to do is count the number of socially necessary hours required to make the commodity. A second condition becomes required if production involves different levels and types of skills. In this case, simple counting is no longer possible, and the theory works only if there exists an objective process by which skilled labour can be converted or reduced to unskilled (abstract) labour. Let us consider each condition in turn.
13 Note that in and of themselves, these conditions, although required, may be insufficient.
A world of unskilled automatons?
Begin with the first situation, whereby all labour is unskilled. According to Marx, this condition is constantly generated by capitalism, which relentlessly strives to de-humanize, de-skill and simplify labour, to turn live labour into a universal abstraction, a 'purely mechanical activity, hence indifferent to its particular form' (Marx 1857: 297). The abstraction process does not mean that all labour has to be the same. It is rather that capitalism tends to generate and enforce skills that are particularly flexible, easy to acquire and readily transferable. In this sense, abstract labour is not only an analytical category, but a real thing, created and recreated by the very process of capitalist development. 14
This claim, central to the classical Marxist framework and later bolstered by Harry Braverman's work on Labor and Monopoly Capital (1975), elicited considerable controversy, particularly on historical grounds. Workers, pointed out the critics, were forever resisting their subjugation by capital and in reality prevented labour from being simplified to the extremes depicted by Marx (cf. Thompson 1964; for a review, see Elger 1979). 15
According to Cornelius Castoriadis, Marx's conception of abstract labour was not only historically incorrect, but logically contradictory (1988, Vol. 1: General Introduction). First, if workers indeed were systematically deskilled, degraded and debilitated, asked Castoriadis, how could they possibly become, as Marx repeatedly insisted, historical bearers of socialist revolution and architects of a new society? According to this description, it seems more likely that they'd end up as raw material for a fascist revolution. 16
14 'The indifference to the particular kind of labor corresponds to a form of society in which individuals pass with ease from one kind of work to another, which makes it immaterial to them what particular kind of work may fall to their share. . . . This state of affairs has found its highest development in the most modern of bourgeois societies, the United States. It is only here that the abstraction of the category "labor", "labor in general", labor sans phrase, the starting point of modern political economy, becomes realized in practice' (Marx 1911: 299). The slaughter-house worker in Upton Sinclair's novel The Jungle (1906) needs no more than a minute to learn his job.
15 It is of course true that, over time, the distribution of skills undergoes major shifts, cyclical as well as secular. But a significant portion of these shifts occurs through the training of new workers rather than the quick re-training of existing ones. The rigidity of existing specialization seems obvious for highly skilled workers. Few engineers can readily do the work of accountants, few pilots can just start working as doctors, and few managers can easily replace their computer programmers. Moreover, the rigidity is also true for many so-called blue-collar workers, such as auto mechanics, carpenters, truck drivers and farmers, who could rarely do each other's work.
16 This was a potential that Mussolini, a self-declared student of Lenin and editor of a socialist tabloid, was quick to grasp. George Mosse (2000), whose father owned a tabloid chain in the Weimar Republic, comments in his memoirs that, unlike the socialists, the fascists didn't try to impose on workers their intellectual fantasies of freedom. They understood what workers wanted and gave them exactly what they deserved: entertainment, soccer and a strong hand. The dilemma for socialists was expressed, somewhat tongue in cheek, by George Orwell:
Second, and crucially for our purpose here, according to Castoriadis capitalism itself cannot possibly survive the abstraction of labour as argued by Marx. The complexity, dynamism and incessant restructuring of capitalist production required not morons, but thinking agents. If workers were ever to be reduced to automatons, doing everything by the book like Hasek's The Good Soldier Schweik (1937), capitalist production would come to a halt in no time.
The foregoing argument does not mean that capitalists do not try to automate their world, only that they cannot afford to completely succeed in doing so. In other words, over and above the conflict between capitalists who seek to objectify labour and workers who resist it, there is the inner contradiction of social mechanization itself: the tendency of mechanization to make power inflexible and therefore vulnerable. As we shall see later in the book, social mechanization in capitalism occurs mostly at the level of ownership, through the capitalization of power, a process which in turn leaves capitalist labour considerable autonomy and qualitative diversity.
