We are ruled, in part, by algorithms. They govern some of what we see and read, how we communicate and work; this is widely understood. What is less obvious is how we should situate this elementary truth in socio-historical terms, or even what an algorithm actually is. We might usefully begin by defining the electronic computer as a deterministic but flexible device for modelling rules by which sets of symbols are reliably transformed into other sets of symbols. The simplicity of this definition is pointedly deflationary, but the centrality of symbols in human social life meant that these conceptually simple machines could have dramatic implications when socially generalized. As they have come to mediate everyday activities and interactions, the social world has been subordinated to the typically opaque rules that they embody. The symbols by which we live and think are increasingly governed by machines operating according to someone else’s rules. When did this phenomenon begin? With Google’s famous PageRank algorithm and the rise of social media, or does it have an older provenance?
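What such a device amounts to can be suggested with a toy sketch of my own (nothing of the kind appears in the book): a table of substitution rules applied blindly, and reproducibly, to a string of symbols. Everything of social consequence lies in whose rules they are and what the symbols stand for.

```python
# A toy rule-follower illustrating the definition above: a deterministic
# mapping from symbols to symbols. The particular rules are an arbitrary
# invention for the sake of illustration, not anything from Daston.
RULES = {"1": "0", "0": "1"}   # someone else's rules, applied blindly

def transform(symbols: str) -> str:
    """Reliably turn one string of symbols into another, rule by rule."""
    return "".join(RULES.get(s, s) for s in symbols)

assert transform("10110") == "01001"
```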

In her latest book, Rules, the historian of science Lorraine Daston has presented a novel thesis on the role of ‘rules’ in modernity. Born in 1951 in East Lansing, Michigan—a small university city about 100 miles west of Detroit—Daston studied history of science at Harvard, completing a PhD titled ‘The Reasonable Calculus: Classical Probability Theory, 1650–1840’ in 1979 under the supervision of Newton scholar I. Bernard Cohen—a key figure in the emergence of history of science as an academic field in the US. Published in book form with Princeton in 1988, Daston’s doctoral project traced the arc of mathematical probability theory in figures like Bernoulli, Condorcet and Laplace, who attempted to formalize probabilistic rules underlying the judgements of the rational subject. In Condorcet’s hands, this formed the basis for ‘social mathematics’—an early attempt at social science. Founded on elegant formalisms that often led to absurd conclusions, this kind of probabilistic theory can appear, viewed from a certain angle, as an early ancestor of the post-war fetish for formalism and model-building in the Anglophone social sciences. Daston located its decline in the emergence of a statistical worldview that did not depend on discredited assumptions about the micro-level reasonableness of individual behaviour. ‘What does it mean to be rational?’ was her opening question; she is still grappling with the history of answers to it.

Daston’s early work contributed to a wave of scholarship on probability and statistics that played a notable role in post-Kuhnian history of science, roughly from Ian Hacking’s pathbreaking Emergence of Probability in 1975, to its 1990 sequel, The Taming of Chance. Though technical, these topics proved anything but narrow, bearing upon—among other things—notions of inference and scientific method, insurance and the quantification of risk, the emergence of the modern state, and the eugenics movement. Foucault was an important—albeit distal—influence here, supplying a precedent for a kind of epistemic history of the basic structures of knowledge. But while such thinking has often been philosophically informed, it has been little concerned with ‘French theory’: Hacking was an analytic philosopher turned historian; Daston set out as a historian of mathematics. In the 1980s she was a member of a research group at the University of Bielefeld assembled by historian and philosopher of science Lorenz Krüger. She has been based in Germany ever since: in 1994, Krüger became founding director of the Max Planck Institute for the History of Science in Berlin, but when he died soon after, Daston assumed the directorship, remaining in that post until 2019. There she convened other scholars for collaborative research, leading to some collectively authored texts, including How Reason Almost Lost Its Mind (2013): a study of prominent attempts to redefine rationality in formal and mechanical terms in Cold War American economics, political science, psychology and sociology, which anticipated some themes of her latest book. Under Daston, the Institute’s Working Group has been associated with a ‘historical epistemology’ focused on fundamental categories of science such as objectivity and observation—an approach reflected in her own writing.

Published in 1998, Wonders and the Order of Nature, 1150–1750, co-authored with Katharine Park, a historian of medieval and renaissance science, studied the role that wonders, marvels and prodigies—strange phenomena such as comets, monstrous births or a luminescent veal shank seen by the scientist Robert Boyle—played in bounding notions of natural order, until they became associated with civil and religious turbulence in the early modern period, and were ultimately suppressed by Enlightenment intellectuals in favour of the regular and lawlike. Her most conceptually striking book is Objectivity (2007)—again co-authored, with historian of physics and philosopher of science Peter Galison—which analysed changes in scientific image-making as exemplified in the atlases that have been central in defining individual specialisms over hundreds of years. In these images, Daston and Galison perceived shifts in the notion of scientific truth: from a truth-to-nature which aimed to capture essential characteristics—in the illustrations of Linnaean botanical classifications, for example—to an objectivity which attempted to obliterate the subjectivity of the scientist—often by mechanical means such as photography—and finally to the trained judgement of credentialled specialists. If technology and changes in scientific labour were often prominent in these shifts, Daston and Galison were at pains to identify preceding changes in mentalité, rather than appealing to any simple transformation in what Marxists used to call the ‘base’. In 2019 she made a foray into philosophical anthropology with Against Nature, a short book on the relation of moral and natural order, which argued against transcendent notions of reason for one ‘embedded in the specifics of the human sensorium’.

If Rules draws from the same toolkit, it is also a departure: a philosophical history that ‘hopscotches’ over the centuries since antiquity to construct an argument about the shifting relationships between rules and exceptions, universals and particulars. Based on a lecture series delivered at Princeton in 2014, it is more conversational in tone than much of her earlier work, but does not fully cross over into the popular history of science and technology: conceptual and scholarly in temperament, various aspects of Daston’s argument require some degree of contextual knowledge to be properly understood. It has three declared aims: firstly, to shed light on ‘how mathematical algorithms intersected with political economy during the Industrial Revolution’; secondly, ‘to reconstruct the lost coherence of the category of rule that could for so long and apparently without any sense of contradiction embrace meanings that now seem antonymical to each other’—not just algorithm, law and regulation, but also model and paradigm; and thirdly, ‘to examine how rules were framed in order to anticipate and facilitate bridge-building between universals and particulars’—which is to say, Daston considers questions such as how general rules might historically have been related to specific cases. The argument is structured schematically by three oppositions: rules, according to Daston, can be thick and thin, flexible and rigid, general and specific. As history, the book is confined to the West, although other traditions are touched upon in places; as theory it seems to aspire to a broader scope. In structure, it is partly thematic, partly chronological, moving through the centuries before cutting back again.

The ancient Greek word kanon, which referred to rods and straightedges typically used in construction, was also applied to Pythagorean music theory, the sculptor Polykleitos’s specifications for the ideal male body, Ptolemy’s tables for astronomical computation, and physical architectural models; by the Hellenistic period, it was applied to exemplary orators and poets; early Christians used it to refer to the gospels and other scripture, the decrees that ordered religious life and, ultimately, canon law. The Latin regula had much the same connotations, but also related to reasoning by precedent, in the context of Roman law. According to Daston, three principal semantic clusters can be perceived here: measurement and calculation; models or paradigms; laws and regulations. The puzzle that she sets out to solve in Rules is the relative eclipse of the second: if measurement and calculation, laws and regulations are still with us, the ancient notion of the model or paradigm persisted into early modernity, only—according to Daston—to fall into neglect around the turn of the nineteenth century. In addition to changing dictionary definitions, Daston finds signs of this decline in much-discussed perplexities about the role of rules in Thomas Kuhn’s notion of scientific paradigms—can a paradigm be rendered completely explicit, for example?—and in Wittgenstein’s famous question about how it is possible to follow even mathematical rules without an infinite regress of interpretation, in which some meta-rule is required to specify how to follow every rule; significant, too, is the answer he ultimately found in ‘customs’.

Since it has, according to Daston, become hard to understand this ancient concept of the model or paradigm, she sets about reconstructing it, starting with a case study of the Rule of Saint Benedict—a fifth- or sixth-century book of precepts for the communal living of Benedictine monks. Although fine-grained, these depend upon the discretion of the abbot, who is himself supposed to exemplify life according to the Rule—he is ‘the rule of the Rule’. This is a matter not of following some rigid procedure, but of freely making nuanced distinctions; of moving from particular to particular via analogy, with a model as the basis; of honouring principles rather than literalistically following prescriptions. For Daston, the ancient ‘home’ of rules was in technê or ars: ‘fields guided by precepts but responsive to the vicissitudes of practice’, engaging both head and hand, form and matter, as opposed to the universal and necessary truths of epistêmê—though Aristotle recognized a continuum between these poles, with technê also involving ‘reasoning from causes and achieving some degree of generality’.

If technê or ars can be translated as the ‘arts’, this was a much broader category—stretching from logic to cookery—than its contemporary cognate. The arts in this sense ‘became a bustling factory of rule-making from roughly the late fifteenth through the eighteenth centuries’ with an outpouring of how-to manuals, such as Albrecht Dürer’s on geometry, claiming to offer rules for the raising of craft into art. Though previously opposed to the more prestigious artes liberales that constituted the core of university curricula, artes mechanicae rose in status in early modern Europe. The innovations of skilled artisans had implications for science: figures like Galileo, Newton and Leibniz took an interest in engineering, shipbuilding, ballistics; Bacon contrasted the stagnation of natural philosophy with the progress of the mechanical arts; Descartes’s Rules for the Direction of the Mind (1628)—dealing with mathematical problems, among other things—resembled ‘the heuristics of the artisans’ handbooks’. Such handbooks expounded rules embedded in the particulars of practical contexts, addressed to practitioners with some experience, and assumed constant adjustment. In the late seventeenth and eighteenth centuries, ‘the discourse of improvement and self-improvement in the mechanical arts merged with that of public utility, as mercantilist governments across Europe sought to fill their coffers by raising the quality of exportable manufactures—once again by issuing rules.’

Like the Rule of Saint Benedict, such rules were, for the most part, ‘thick’: embedded in contexts, festooned with illustrations, qualifications, exceptions and advice on application. Even mathematical rules were expressed in practical examples, with generality emerging from an ‘accumulation of specifics’. According to Daston, these rules should be understood as inextricable from such paraphernalia: the examples were the rule. Cookbooks supply an example of something ‘thinner’: rules meant to be unambiguous and followed step-by-step. But they still assumed varying degrees of experience, with the maximally explicit reserved for the most untrained, since ‘thin rules for those without any background experience . . . require standardization, routinization, and a painstaking breakdown of the task at hand into simple steps.’ The generality of thin rules ‘presupposes that the class of cases to which they apply is unambiguous, that all cases in this class are identical, and that they will remain so for all eternity’. Although all rules in the mechanical arts aimed to minimize chance, formal probability theories and statistics remained irrelevant: a prerequisite for their emergence would be homogeneity in a given domain, and the world was not yet orderly enough for that. The ascent of thin rules required a new uniformity:

Algorithms designed to be executed by computers are the thinnest of rules. This is not because such rules are in any sense minimalist—on the contrary, programs can be both long and complex—but because they assume complete uniformity in execution and conditions of application.

If the term ‘algorithm’ now tends to refer to step-by-step operations performed specifically by computers, for most of history since ancient Mesopotamia, such sequences have been found primarily in classrooms and textbooks. Although the word came much later—an import from Arabic mathematics—examples can be found on cuneiform tablets. Like other forms of rule according to Daston, ancient algorithms were contextually embedded, defined in terms of concrete specifics: methods for calculating the area of a field or dividing up bread, expressed in terms of the manual ‘calculating technologies’—abacus, knots—used in reckoning (mathematics appears here as emphatically material, dependent upon hand work); they were ‘thick rules in disguise’, which can only be formalized in the language of modern mathematics at the risk of anachronism and the occlusion of original meanings. Historically, such things were contrasted with axiomatic ideals of demonstration—long associated, although Daston does not mention this, with Euclid’s Elements. It was only in the twentieth century that algorithms would be claimed for mathematical proof, in the context of David Hilbert’s foundational programme of deriving all of mathematics from a provably consistent and finite set of axioms—notably, though again Daston does not discuss this, not just in Kurt Gödel’s famous negative proof that any consistent formal system rich enough to express arithmetic must be incomplete, but also in Alan Turing’s own rebuttal to the Hilbertian programme, ‘On Computable Numbers, with an Application to the Entscheidungsproblem’ (1936). That paper has been retrospectively claimed—in an apparent effort to inflate Turing’s parental role—as a theoretical model for the electronic computer that would emerge in the 1940s and 50s.
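The abstract machine of Turing’s paper can itself be sketched in a few lines: a finite table of rules driving a read/write head along an unbounded tape. The sketch below is a gloss of my own, with an arbitrary illustrative rule table (a unary incrementer), not anything drawn from Daston or from Turing’s text.

```python
# A minimal sketch of a Turing-style machine: a finite rule table mapping
# (state, symbol) -> (symbol to write, direction to move, next state).
def run(tape, rules, state="start", head=0, blank="_"):
    while state != "halt":
        write, move, state = rules[(state, tape.get(head, blank))]
        tape[head] = write
        head += 1 if move == "R" else -1
    return tape

# Illustrative rules: scan right past the 1s of a unary numeral,
# append one more 1, then halt.
rules = {
    ("start", "1"): ("1", "R", "start"),
    ("start", "_"): ("1", "R", "halt"),
}
tape = {0: "1", 1: "1", 2: "1"}                          # unary 3
assert list(run(tape, rules).values()).count("1") == 4   # unary 4
```

The interest of such a construction for Turing was not engineering but proof: because the rule table is finite and explicit, questions about what can be computed become questions about what such tables can and cannot do.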

If modern mathematics gained generality through abstraction in the work of people like Hilbert and Moritz Pasch—a nineteenth- and early twentieth-century German Jewish mathematician who argued for a purging of physical interpretation from Euclidean geometry—an alternative route towards the general was to proceed by a sort of induction, from particular case to particular case. According to Daston, this was how premodern beginners learned algorithms, following something like the taxonomic processes of natural history, distilling in memory a paradigm that ‘epitomizes the genus of problems in a single, still-specific problem’ (there are echoes here of the varying forms of scientific truth in Objectivity). For Jens Høyrup, a historian of ancient mathematics, the modern tendency to view such things with disdain lies in a ‘mathematical Taylorism’ premised on a separation of head from hand. And for Daston, it was in the ‘Taylorism avant la lettre’ of the late eighteenth century that algorithms ‘became modern—and began to thin down’.

Though in ancient Greek and Latin, mechanice/mechanica referred to force-multiplying devices like levers and pulleys, by the thirteenth century, according to Daston, this term came to be associated with lowly and unfree forms of manual labour; the status of the mechanical was raised in the seventeenth century—as exemplified by Newtonian mechanics—only to be enmired again in an association with the ‘lowest class of manual labour, conceived as all hands and no head . . . mindless, repetitive and banausic’, as work was subordinated to an increasing division of labour, before the application of actual machines. As volumes of calculating work escalated through early modernity, particularly in areas such as astronomy and celestial navigation, that work had come increasingly to be viewed as a kind of drudgery, prompting innovations such as John Napier’s seventeenth-century calculating rods and logarithms—printed tables of which reduced laborious multiplication and division to addition and subtraction—but the first steps towards mechanical computation were to come later. Famously, French engineer Gaspard Riche de Prony organized the post-revolutionary production of logarithmic tables through an intensive division of labour inspired by Adam Smith’s pin factory (itself derived from the Encyclopédie article on pins). It was Prony’s example that inspired Charles Babbage to make the leap into thinking about actual machines, with his early nineteenth-century attempts to automate the production of tables of polynomial functions, which—like logarithms—had important uses in science and navigation (Daston does not discuss the intended ends of Babbage’s project, or why the British state saw fit to fund it at great expense).
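The principle that made logarithmic tables so labour-saving bears spelling out: since log ab = log a + log b, a multiplication collapses into two look-ups, an addition and a reverse look-up. In the sketch below, which is only a gloss, Python’s math module stands in for the printed table that human computers once produced by hand.

```python
import math

# Multiplication by table: since log(a*b) = log(a) + log(b), the hard
# operation becomes two look-ups, one addition and an antilog look-up.
# math.log10 stands in for the printed tables whose production Prony
# organized through an intensive division of labour.
def multiply_via_logs(a: float, b: float) -> float:
    log_sum = math.log10(a) + math.log10(b)   # two look-ups, one addition
    return 10 ** log_sum                      # reverse (antilog) look-up

assert abs(multiply_via_logs(345.0, 678.0) - 345.0 * 678.0) < 1e-6
```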

Paralleling the course taken by manufactures, the large-scale calculations exemplified in the tables of the mid-nineteenth-century British Nautical Almanac—published by the Royal Greenwich Observatory as a tool for determining longitude on the basis of the position of the moon—were produced through distributed piecework, before being brought under one roof, and then finally mechanized. The division of mathematical labour had moved calculation away from the reasoning from particular to particular that characterized earlier instruction in algorithms, with people like Prony specifying in detail how to execute a single type of calculation—‘an achievement in both mathematics and political economy’. This was the emergence of thin rules, for which context had been fixed, unpredictability and variability eliminated: a world ‘made safe for thin rules to function’. This in turn laid the basis for computational machinery:

It was the division of labour, not the machines, that made the algorithms mechanical—and made the actual mechanization of algorithms on a grand scale thinkable.

Although Babbage’s efforts were largely unsuccessful at the time, they can seem prescient when viewed from a present in which mechanized computation is ubiquitous. But between him and us lies an era, roughly 1870 to 1970, of calculating machines working in tandem with humans. If this periodization may seem odd in relation to the standard origin story of the computer—which locates it in the Second World War—Daston is on solid ground in considering the history of actual social practices, rather than that of inventions, which is generally far more questionable. As is well known, computational labour through the first half of the twentieth century was largely feminized (the women who did this work were known as ‘computers’), and there is a faint echo here of Ruth Schwartz Cowan’s More Work for Mother (1983): rather than simply lightening workloads, machines brought new demands; this was a period in which even mechanized calculation demanded—Daston emphasizes—exhausting levels of attention. At the Nautical Almanac, the installation of a Hollerith tabulator in the 1930s to calculate positions of the moon was a major, disruptive transformation that initially involved an expansion of the workforce. Under Georges Bolle, the French railways were introducing similar machinery at that time to track shipments and rolling stock, an effort that required centralization of the workplace and feminized labour. In such developments, figures like Bolle—whose motto was ‘first organize, then mechanize’—and Leslie Comrie at the Nautical Almanac devoted themselves to an intensive analysis and reorganization of the computational work process that made them forerunners of the programmer:

the analytical intelligence applied to making human-machine cooperation in calculation work was a rehearsal for an activity that would become known first as operations research and later computer programming[.]

Daston demarcates these developments from the origins of the Artificial Intelligence research programme: it was calculation rather than intelligence in general that was made algorithmic, and early computing machines tended rather to discredit calculation as an intelligent activity (she notes the waning cultural status of the calculating savant in this period). But an early model of the pioneering 1956 AI program, Logic Theorist—in which humans performed the various functions—‘recalls the division of labour at centres of Big Calculation ever since Prony’s logarithm project’, and one of its programmers, Herbert Simon, later reflected that analytical methods applied in the division of labour might be useful in modelling scientific discovery with a computer program. Perhaps, Daston conjectures, the shift ‘from mindless machines to machine minds’ was facilitated by that from visible processes of computational labour to the black-boxing of algorithms in inaccessible computer code.

At this point the argument veers suddenly from computing and algorithms to other points in Daston’s semantic clusters—regulations and laws—and back to the early modern period. This registers even at the level of style, as we move from the conceptual construction of the first five chapters to an easier, chattier mode for the remaining three. The early modern is where Daston seems most at home as a historian, even if she ventures beyond science and technology here; there is also—as we shall see—a possible structural reason for this change in quality. Regulations are closer to particulars than laws, attempting to bring fine-grained order to everyday life; they have proliferated in modern societies and especially cities. With much amusement, Daston tracks the fate of sumptuary regulations from the High Middle Ages to the early modern period, as rulers attempted—typically to maintain signs of social hierarchy—to regulate mounting expenditure on fashion: an endless and largely futile cat-and-mouse game in which prohibition of the latest look would only provoke ‘new, still more extravagant frippery’. These regulations were detailed, and sought to ‘narrow the margin of discretion to a sliver’, yet they could not eradicate ambiguity or the need for interpretation, and they almost always failed.

Next we turn to Enlightenment Paris, where the Police—‘vanguard of the absolutist bureaucracy’—developed a ‘seductive fantasy’ of bringing order to the swelling population, issuing rigid ordinances on such things as traffic and hygiene, which were also prone to failure—as evidenced by their repeated reissuance. Competition between European metropolises drove this search for order, leading to ‘the first version of modernity, a modernity that had as yet very little to do with science and technology (the nineteenth-century version of modernity) and everything to do with orderliness, predictability, and, yes, rules’. The Dutch led the way with Amsterdam, at the same time as they were pioneering ‘mathematically based annuities, lotteries, and insurance schemes . . . to rein in the role of chance in human affairs’. And despite the high rate of failure, the emergence of instances of urban order fuelled hopes of more general victory. Over time some regulations deepened into habits and norms, and republican governments tended to have more success—perhaps, Daston suggests, due to greater legitimacy—but urban order is always partial. Daston’s final examples of typically failed regulation are the attempts at spelling reform that began with the consolidation of national languages. Common to these upsurges in regulation is an attempt to deal with the increasing scale and complexity of early modernity:

Wherever human interactions expanded and intensified at an accelerated pace, regulations cropped up to order the perceived disorder of many people doing many different things in many different ways in the same space[.]

If regulations are tied to particulars, laws are maximally universalizing. Ideas of natural law and laws of nature coevolved in early modern Europe, reconfiguring concepts that had persisted since antiquity: the basis of natural law shifted from the divine to a human reason that attempted to be as self-evident and universally valid as the axioms of geometry, while God became the legislator of a law-governed nature. Natural order need not be universal: the ancient concepts of physis and natura referred to ‘specific nature’ (a concept already discussed in Against Nature)—‘that which makes something unmistakably what it is, and not something else’; ecological perspectives can grasp a local order; Bacon and Boyle referred to ‘customs of nature’, which hold most of the time. Yet the awkward metaphor of universal laws of nature spread in talk of natural regularities, displacing other terms, after Descartes used it for the fundamentals of his mechanical philosophy. This universalizing shift was also grounded in political and economic transformations:

Globe-spanning ambitions of trade and empire revived the rhetoric of universality, and the geographically more circumscribed ambitions of absolute monarchs to consolidate their territories elevated the value of uniformity.

Central in these shifts were competing notions of God’s wisdom versus his power: the far-sighted Leibnizian legislator who founded His creation on perfect laws which merely unfolded through time, or the Newtonian Universal Ruler who actively intervened here and there to correct the odd cosmic wobble. The former ultimately won out in the deterministic worldview of the mid-eighteenth century, though it lost its rationality: ‘Laws of nature like gravitation came to be seen as God’s positive law, universal and inexorable but arbitrary.’ At the same time, a growing gap between natural and human realms strained the analogy between natural law and laws of nature; Kant drove a wedge between them, supplanting natural law with human reason, and reconceiving laws of nature not as the ‘edicts of God’, but as the precondition for understanding nature as an order at all.

Finally, Daston focuses on exceptions, rule-bending and -breaking. As rules became more rigid and ambitious from the seventeenth century, judgement and discretion became more controversial. Catholic casuistry, which reasoned from one particular case to another, not aiming to ascend to the universal or even make generalizations, fell foul of Pascal’s famous polemic. The concept of equity shifted in meaning, from an exception to a law that would be unjust in the particular case to a conformity with a higher law. Meanwhile, with the enshrinement of the Rule of Law, sovereign prerogative came to be viewed as arbitrary caprice. Daston scans early modern debates over sovereign exceptions, noting their resonances with and divergences from Schmitt’s later contempt for a natural law tradition that purged the legal order of exceptions and arbitrariness. There was an analogy here that Schmitt himself marked: ‘What a miracle was to nature, governed by laws of nature, prerogative became to the polity, governed by natural law: an intolerable exception to rules that held everywhere and always.’ Yet there was an irony in Schmitt: the modern sovereign exception, as exemplified by Hitler, is nothing without ‘a rational bureaucracy of rules’. Institutions and procedures had largely supplanted sovereign prerogative and exercises of executive discretion.

Rules are defined by their exceptions, and in an unpredictable world, rules were thickened by the exceptions they incorporated. With standardization and the development of ‘pockets of predictability and uniformity’, rules could be thinned down. More implicit forms of rules such as models or paradigms, and the cognitive skills they implied, could then become suspect. But the preconditions of thin rules can collapse, returning us to the thicker kind, and the bureaucratic and technical infrastructure on which thin rules depend is never perfect:

Computer algorithms, the thinnest rules of all, require an anonymous army of human monitors to correct their oversights and excesses on social media platforms. Behind every thin rule is a thick rule cleaning up after it.

How to evaluate Daston’s construction? There is no doubting her capacities as a scholar: an erudition both broad and deep has been distilled into this book. An impressive range of primary historical texts and secondary literature has informed it, in English, German, French, Italian and Latin, and atypically for a theoretical edifice of this sort, Daston has stepped into the archives at points: she brings new details to light in the discussion of mechanization at the Nautical Almanac. But many of these paths have already been well-trodden: the chapters on natural law, laws of nature and sovereign exceptions, like the discussions of Babbage and Prony and the feminization of computational labour, must therefore be judged as parts of the overall argument rather than as contributions in their own right.

As noted above, the historical epistemology associated with the Max Planck Institute has tended to concern itself with fundamental categories of scientific thinking—probability, objectivity, observation. There is a pleasure in uprooting concepts like this—particularizing something that has a claim to the universal, exposing the mess of contingencies that led to its growth. Daston has form in such operations, but is the argument of Rules the same kind of thing? How might we locate it in relation to historical epistemology? In a 2009 essay for Critical Inquiry, Daston drew a line in the sand between history of science and a Science Studies that she viewed as a relatively disreputable—and declining—field: that associated with the earlier Bruno Latour and the ‘strong programme’ originating at the University of Edinburgh, which famously attempted to bracket questions of the truth value of actual scientific claims when studying how scientists came to hold specific beliefs. If, according to Daston, the two had coincided in the post-Kuhnian moment, and shared some concerns—both adopted positions of estrangement in relation to contemporary science, and tended to have the political implication of strengthening the positions of the losers by taking them seriously—they parted ways in the 1990s as historians of science became ‘disciplined’ as professional historians. While Science Studies had to accept the category of ‘science’ as given in contemporary terms, and got somewhat lost in the theological disputes of the strong programme, history of science became more radically historicizing, specifying the mutations and limits of its central category, and refusing to read historical instances of ‘natural knowledge’ through the lens of current science. And while the strong programme necessarily relativized scientific claims, the historians’ historicism need not. Yet that historicism seemed to be reaching limits, dissolving into micro-history—and Daston indicated a path forward in a turn to philosophy.

If something is being historicized in Rules, it is not a matter of tracing the emergence of a major category of scientific knowledge; nor is this a contained study of a particular set of debates, as with her work on probability. One of the characteristic qualities of broadly Foucauldian approaches to history is the attempt to historicize categories—madness, sexuality—that might seem too general or abstract to have a history; on the face of it, a history of ‘rules’ might be viewed as a similar proposition. Yet Daston does not attempt to delimit her category in this way, and as she confesses, it is so expansive as to risk encompassing the entire history of humanity. There is thus a notable shift of register here, from a strong historicism that emphasizes radical, ruptural novelty to an implicitly anthropological perspective in which historical changes appear as reconfigurations of invariants, modernization as a transfiguration of human rationality itself; in this sense Rules might be read alongside Against Nature as a step towards philosophical anthropology in which ‘rules’ appear as something fundamental to the human world. But in Daston’s presentation, the unity of this category depends almost entirely on the ancient etymological origin-point where her three semantic clusters meet. One might reasonably ask whether this unity isn’t ultimately a mirage: does it make any sense to discuss mechanical computation, sumptuary regulations, theories of sovereignty and early modern science together as instances of a single underlying thing? After all, etymology has a way of drawing together the disparate in surprising and often arbitrary-seeming combinations—an effect of semantic differentiation over time—and the fact that terms share a common root need not imply that their referents are actually linked.

One could interpret this etymological argument as a rhetorical gambit to achieve a certain end—gaining a vantage point on a long-standing concern of Daston’s work: shifts in notions of rationality that have come with transformations in social order since the onset of modernity, and that have tended to involve a narrowing of reason in the pursuit of formalization, optimization, standardization, mechanization. The crux of her thinking here would seem to be the opposition between a narrow but sometimes seductive formalism and the vicissitudes of a reality that must be either spuriously excluded or have order imposed upon it. This is something that we find in her work on probability theory and on Cold War rationality; mutatis mutandis, a similar structure is at play in Objectivity, where it is the subjectivity of the scientist that is subordinated to an often machinic process. In the current iteration, the key opposition would seem to be between the subtleties of judging on the basis of models or paradigms, and the reduction of rationality to something so standardized it can be reproduced by a machine. In this sense, some aspects of her ‘semantic clusters’ are implicitly more important than others; as noted, one feels the tone change as she turns to law and regulations—as if the main points have been made and she is going through the motions to complete her rhetorical manoeuvre. If this interpretation is correct, the question over the unity of her category would become a pragmatic one: does this ploy serve its purpose? Might something more direct have worked better?

A blind spot that would have been obvious in a less circuitous approach to the history of algorithms, calculation and models is the failure to address the career of the model in modern mathematics, physics, economics, computing: the construction of models of previously unachievable complexity was a central application of early electronic computers, with pioneers like John von Neumann and Jay Forrester very much concerned with modelling dynamic systems such as weather and the world economy. Far from vanishing at the turn of the nineteenth century, the model is still, of course, alive and well in the form of the Large Language Model. One could reasonably characterize the ascendancy of such connectionist AI—roughly, machine learning based on feeding data into models of neural systems—over symbolic AI, which aims to model the logic and representations involved in reasoning according to explicit rules, as a move, in Daston’s terms, from thin and rigid to thick and flexible. While the ‘model’ as exemplified in such things is a long way from the premodern concept, this is, at least, an elephant in the room.
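That contrast can be made concrete with a schematic sketch of my own (a gloss on these terms, not anything in Daston): in the symbolic style the rule is written down in advance and can be read off; in the connectionist style a behaviour is fitted to examples, and the ‘rule’ survives only implicitly in learned parameters.

```python
import math

# Symbolic: an explicit, inspectable rule, written down in advance.
def symbolic(x: float) -> int:
    return int(x > 0.5)

# Connectionist: a one-weight logistic unit fitted to examples by gradient
# descent. The data and learning rate are arbitrary illustrations.
examples = [(0.1, 0), (0.4, 0), (0.6, 1), (0.9, 1)]
w, b = 0.0, 0.0
for _ in range(5000):
    for x, y in examples:
        p = 1 / (1 + math.exp(-(w * x + b)))   # the unit's prediction
        w -= 0.5 * (p - y) * x                 # the 'rule' now lives in w and b,
        b -= 0.5 * (p - y)                     # legible only through behaviour

def connectionist(x: float) -> int:
    return int(1 / (1 + math.exp(-(w * x + b))) > 0.5)

assert all(connectionist(x) == symbolic(x) for x, _ in examples)
```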

There is another, perhaps more significant, shift in Daston’s thinking here. A materialist mode, while never absent in the explanatory admixture of Daston’s books and the work of the Max Planck Institute, has come to the fore: what accounts ultimately for the changing nature of ‘rules’ are factors like the division of labour, expansion of trade, the intellectual/manual labour distinction, and transformations in the nature of the state. With her move from radical historicism towards philosophy, has Daston also taken a materialist turn? She notably endorses an understanding of mathematics as grounded not in some abstract cognitive realm, but in the hand-work of reckoning. We are not so far here from the classical Marxist history of science, as exemplified in works such as Soviet physicist Boris Hessen’s 1931 attempt to explain Newton’s Principia in terms of the economic exigencies of early capitalism, Henryk Grossman’s ‘Social Foundations of the Mechanistic Philosophy and Manufacture’ (1935), Anton Pannekoek’s History of Astronomy (1951), or even J. D. Bernal’s monumental Stalinist tome Science in History (1954)—none of which was quite as vulgar as the Marxism of that time is typically reputed to be. After playing an important role in earlier historiography of science and technology, Marxism has largely been consigned to the bad old days of these disciplines since the seventies. Hacking’s Emergence, which might be read as a transitional work, asserted that while some ‘undogmatic version’ of a materialist approach ‘must be right’, it could not explain origins, and turned to Foucault for an alternative.

For many since, Foucault has played this role, historicizing changes in ‘discourse’ without granting a special causal role to anything in particular. One usually finds him endorsed a page or two after a glancing dismissal of the base/superstructure metaphor, as he is in Daston’s Objectivity and Paul Edwards’s The Closed World: Computers and the Politics of Discourse in Cold War America (1996). The latter was an exemplary study of early electronic computing shrouded in a hazy metaphorology that derived heuristics from a cultural-studies reading of the Terminator films. Other alternatives, more sociological but often refraining from anything that looks too much like an explanation (one glimpses here the shadow of Latour), have come from Science and Technology Studies: Jon Agar’s The Government Machine: A Revolutionary History of the Computer (2003) located the genesis of the computer in the pneumatic rising and falling motions—themselves unexplained—of certain professional groups within the British civil service, and flirted with the extraordinary idea that the computer is a materialized metaphor for the state. While works like these have made real contributions to scholarship, this has often seemed in spite of theoretical frameworks that can look merely decorative, offered in place of a disavowed materialism.

This is not something that Daston can be accused of. The major difference between her historical perspective here and that of Marxists like Bernal is that hers is shorn of any sense of progressivism about the march of science and technology, but in this respect, she is in sympathy with many of his successors; arguably, even the late Marx was post-Kuhnian in this sense. So how does this book look if we try to read it as a work of historical materialism? One striking effect of this strange category of ‘rules’ is that it groups together the emergence of mechanical computation, bureaucratization and rationalization—historical phenomena that can intuitively seem related, but which are difficult to articulate together. In so far as Daston connects these shifts to the labour process, her argument seems to straddle ‘the economic’ and ‘the political’—a relationship that has been a neuralgic point in the history of Marxism. But there is a key enigma here that Daston does not address, and nor—to my knowledge—does anyone else: if the origins of modern computing lie in labour process transformations that closely paralleled developments in manufacturing, how to account for this parallel given that for much of this history, the relevant production processes were typically not directly capitalist, but subsumed within states? Prony, Babbage, the staff of the Nautical Almanac—all were working for states rather than businesses. In fact, early capitalist firms displayed very little interest in the division of computational labour or its mechanization, and states generally led the way in this area well into the post-war period. So why did they act so much like manufacturing firms when they ventured down this path? Is there a single, universal kind of rationalization—as seems to be implied by Weber’s classical formulations—and a single process of the division of labour, which state and capital have in common, in spite of distinct goals, or was something more arbitrary at work: did industrial capital merely supply the state with a model? The leading theorizations of this transition from Chandlerian business historians and systems theorists have obscured the problem by treating the differences between organizations—whether public or private—as a simple matter of scale. The answer will probably have a bearing on theories of bureaucracy and the modern state.

Finally: what might have motivated Daston’s materialist turn? The most serious—if less theoretically ambitious—works in the history of computing, such as Martin Campbell-Kelly’s, tend to take some interest in the labour process, political economy and economic history. These are hardly avoidable if one is to take a couple of steps beyond the Great Man tales and techno-fetishism that make up the discursive raw material of the history of machines, for computing devices have spent the vast majority of their history embedded in work processes, and their most visionary early pioneer—Babbage—was, of course, a political economist. Daston’s turn is thus implied by her object: perhaps, as people try increasingly to trace the roots of a capitalism now saturated in computing machinery, it is time for a renewed materialism—of the rules, exceptions and errors that structure an ailing world.