- Post 2: A question for a classmate about their first post that refers to Learning Materials and is at least 75-100 words long.
Jennifer Booker
Option 1
Work Cited
Miller, D. (2019). Anthropological Studies of Mobile Phones. Technology and Culture, 60(4), 1093-1097.
http://dx.doi.org/10.1353/tech.2019.0103
Summary
This article reviews the mobile phone from an anthropological viewpoint. It mentions several of the anthropologist's books, which discuss the effects of the mobile phone when it was introduced into different cultures. It describes the changes the mobile phone brought about in people's lives and the wide range of uses people found for it. In addition, the article discusses the suppliers of mobile phones to small cities, towns, and villages, and details the friendly approaches those suppliers used to gain the population's trust in the new product while being there solely to generate revenue.
Approach
This article takes the approach of both technological determinism and social constructivism. Countries such as Mozambique, Samoa, and Papua New Guinea, to name a few, were introduced to the mobile phone by the corporation Digicel. The introduction of the mobile phone to these areas had a wide range of effects, and the phone began to reshape their cultures. “One result of this was that we emphasized the way people expanded their social connections. In order just to get by, many Jamaicans without any income tried to find ways to ‘link-up’ with Jamaicans who had some kind of income” (Miller, 2019); the mobile phone helped them make these connections. Social constructivism comes into play with the mention that certain cultures viewed the phone as a means of both concealment and making things apparent. An example of this is the use of the phone to “arrange meetings with a lover, the phone seems to make it much easier to make all the requisite arrangements in secret” (Miller, 2019). In another example of social constructivism, this new technology allowed people who were ostracized for health-related illnesses to feel some type of connection without having to provide anything in return: “They may find themselves shunned by their kin, but through using random calling, they can find supportive strangers who give them emotional and practical care and don’t ask for money, sex, or pigs in return, unlike family and boyfriends” (Miller, 2019). When Digicel introduced the phone to these areas, it used the approach of technological determinism, because it wanted the phone to reshape the culture. It wanted to replace the practice of traveling miles to deliver news or information with the use of a cell phone. Its goal was to help a culture stay more connected to family and friends and reduce long-distance travel.
In conclusion, this artifact, the mobile phone, shaped cultures and allowed cultures to shape it.
WC 426
The Digital Construction of Technology
Rethinking the History of Computers in Society
NATHAN ENSMENGER

Nathan Ensmenger is an associate professor in the School of Informatics and Computing at Indiana University. He is currently working on a book that explores the development and use of computerized decision technologies in medicine, finance, and public policy. He is particularly grateful for the contributions to this essay of William Aspray, Eden Medina, and Suzanne Moon.

Envisioning Latour 2.0
When the young anthropologist Bruno Latour visited the laboratory at
the Salk Institute for Biological Studies, he famously set aside all of his pre-
conceptions about the goals and behaviors of its inhabitants. Rather than
accept as reality the Salk researchers’ self-interpretation of their collective
enterprise, he carefully observed their day-to-day activities and material
practices and came to his own somewhat startling conclusion. What the
scientists and technicians at Salk spent the greatest part of their day doing,
noticed Latour, was “coding, marking, altering, correcting, reading, and
writing” various forms of documentary material. In this they resembled
nothing so much as a “strange tribe” of “compulsive and manic writers”
whose principal function seemed to be the manufacture of paper docu-
ments. Even their large and expensive experimental instruments acted pri-
marily as “inscription devices,” technologies designed specifically to “trans-
form material substance into a figure or diagram.”1 It was through these
various and repeated acts of inscription and transcription, argued Latour,
that ordinary data was transmuted into scientific fact.
1. Bruno Latour and Steve Woolgar, Laboratory Life, 48–51.
Whatever you might think about Latour’s overall methods and analysis, his close attention to the material practices of scientific knowledge production inspired generations of historians and sociologists of science and technology to take seriously the notion that technique and technology are epistemologically significant. The tools we use to think with affect the character of our thoughts; to write something down is to transform it. The material culture of the laboratory is important because experimental instruments are agents in the production of scientific knowledge. It matters who built these instruments, and how, and for what purposes; it matters how these instruments are used, and by whom. These are no longer controversial assertions, even outside the narrow confines of the academic literature on science and technology studies.
Given that the close observation of material practices has proven such
a productive methodology, it is curious how little attention has been paid
within the historical community to the single most widespread and signif-
icant innovation that has occurred within the material culture of the labo-
ratory—and indeed, almost every site of scientific or technoscientific activ-
ity in contemporary society.
Let us imagine, for a moment, that Latour were to return to the Salk In-
stitute of the present to revisit his observations of four decades previous.
There are many things about the institute that would be familiar: the stark
modernism of the Louis Kahn–designed architecture; the academic creden-
tials and distinctions of the research staff and their ambitious young post-
doctoral fellows; the perpetual conversational obsession with publications,
priority, and position. Latour might even recognize some old friends, or at
least familiar faces. And yet there would be one striking and obvious differ-
ence that would be immediately evident: with the possible exception of the
janitorial staff, every single employee of the Salk Institute would spend the
majority of their time each day interacting with a computer screen. From
scientist to secretary, their work would revolve around computer technol-
ogy. Even those operating experimental instruments or other equipment
would do so via a computer-based interface. In fact, to a naive observer, it
might seem as if the designated role of most of the Salk researchers and
technicians was simply to shuffle from one computer screen to another, with
little perceptible difference among the activities engaged in at each loca-
tion. Every room in the institute would contain at least one computer, and
it would be difficult to distinguish between the computers in the rooms des-
ignated as “laboratories” and those labeled as “offices” (whether faculty or
administrative). There would be entire rooms devoted to computers, some
of which would only rarely be visited by human beings. All of the comput-
ers in the institute would be networked to every other, and computers would
serve as the primary means of communication both within the institute and
to the outer world. There would be nary a piece of actual paper in sight—
with the possible exception of the Ph.D. diplomas hanging on the office
walls, which would be as likely to reflect degrees in the fields of computer
science and bioinformatics as in molecular biology.
It might be that our hypothetical Latour version 2.0 would explain
away the pervasive presence of computers and other digital technologies in
the laboratory as simply being the modern incarnation of the inscription
device. After all, the majority of these technologies would be used, at least
in part, for the creation of digital documents. In fact, the very essence of all
of these machines could be described as being literary, their primary func-
tion being the reading and writing of codes—albeit codes intended to be
read primarily by machines rather than people. Seen from this perspective,
there would be no significant difference, at least in analytical terms, be-
tween the traditional version of a scientific “paper” and its electronic equiv-
alent, or between the tracings on the paper tape of a 1960s-era gas chro-
matograph and the digital representation of the same produced by a more
modern instrument. The larger interpretation of the laboratory would
remain essentially the same, with the computer being merely the most con-
venient contemporary tool available to perform the more timeless and ab-
stract tasks associated with scientific knowledge production.
To dismiss so easily this dramatic transformation of material practice
would, however, run counter to the entire theoretical and methodological
revolution in science studies that Latour himself played such a key role in en-
abling. It would also require him to ignore the visible evidence of another,
perhaps even more profound incorporation of computer technology into
the modern biological laboratory. Scattered across the Salk Institute are
buildings whose very names—the Crick-Jacobs Center for Computational
and Theoretical Biology, the Computational Neurobiology Laboratory, the
Razavi Newman Center for Bioinformatics—bear witness to the centrality of
the computer not just to the production, but to the content of scientific
knowledge. Lily Kay, among others, has documented the ways in which
concepts from computer science and information theory disseminated
throughout the biological sciences in the late twentieth century. It is now
commonplace, for example, to talk about the human genome as a code to be
decrypted, the brain as a neural network, and disease as a “subspecies of in-
formation malfunction or communications pathology.”2 These are not mere
metaphors, but statements about ontology. As the noted biologist Richard
Dawkins described it, “genetics has become a branch of information tech-
nology. The genetic code is truly digital, in exactly the same way as computer
codes. This is not some vague analogy, it is the literal truth.”3 For many work-
ing in the modern biological sciences, living cells are not like computers—
they are computers. While the long-term utility and durability of this com-
putational turn in biology might still be an open question, the existence of
the phenomenon is undeniable. Without presuming to know the mind of
Latour, it seems safe to assume that if he were to repeat his visit to the Salk
Institute, he would both notice and take seriously the transformative power
of the electronic digital computer and its kindred technologies.
2. Lily Kay, “Who Wrote the Book of Life?”; Hunter Crowther-Heyck, “George A.
Miller, Language, and the Computer Metaphor of Mind”; Donna Haraway, “Cyborg
Manifesto”; Cornelius Borck, “Toys Are Us.”
3. Richard Dawkins, “Genetics” (emphasis added).
The pervasive ubiquity of the computer and the computational mindset
are hardly confined to the Salk Institute or the biological sciences. In the past
several decades, computational models and techniques have transformed
the theory and practice of disciplines as diverse as physics, economics, psy-
chology, linguistics, anthropology, meteorology, cognitive sci-
ence, and ecology. The dominance of the computer in the practice of engi-
neering has been especially dramatic: until the final stages of production, it
is not unusual for a manufactured good to live an almost entirely virtual
existence. Engineers use computer-aided design tools to construct digital
models, evaluate those digital models using computational techniques like
finite element analysis, and test their performance in virtual environments
via virtual instruments before transmitting their designs in digital form over
electronic networks to computer-controlled machine tools. Many of the
products that these engineers design with computers contain their own
computers embedded within them: microprocessor-based control systems
are used as key components in everything from automobiles to elevators,
from refrigerators to pacemakers, from electronic books to children’s toys.
In fact, there are few technologies, industries, or social practices that have
not been significantly influenced, if not radically transformed, by the incor-
poration of computers and computer-based technologies.
Outside of the academic historical literature, the centrality of the com-
puter to contemporary social, political, and economic life is widely recog-
nized. No technological development of the past century is considered to
be as profoundly influential as the invention of the electronic digital com-
puter. Indeed, in most contemporary contexts, the word “technology” has
come to mean computer technology. When educators advocate for more
technology in the classroom, medical practitioners for more technology in
the hospital, and economists for the development of a more technology-
proficient workforce, they are not talking about filing cabinets, stetho-
scopes, or drill-press operators; what they are calling for is more comput-
ers, computer-based diagnostic systems, and computer-savvy technicians.
There is a vast and growing popular literature on the impact of computer-
ization on almost every aspect of modern society. And while historians of
technology are right to be skeptical of the hyperbole and simplistic deter-
minism that characterizes much of this literature, we also ignore it at our
peril, as David Edgerton has recently suggested.4 By not engaging more
substantially with the technological phenomenon that most of our contem-
poraries regard as one of the most consequential of all in human history,
historians of technology run the risk of becoming increasingly irrelevant,
losing our voice in a conversation to which we, of all disciplines, are
uniquely prepared to contribute.
4. David Edgerton, “Innovation, Technology, or History.”
But what exactly does the history of technology have to say to the broad range of questions raised by the hegemonic technological, intellectual, and
ideological dominance of computers, computing, and the computational
mindset? Thus far our contributions have largely been confined to the his-
tory of the computer, which is a worthy topic and one that capitalizes on
our traditional strengths of studying engineers, innovation, and industries.
But this focus on the machinery of computation also limits our ability to
speak to larger questions. Consider, for example, the many computers we
noticed earlier in our imagined tour of the Salk Institute: in terms of their
underlying physical architecture they would be essentially identical, com-
modity hardware such as could be purchased anywhere by anyone. But
each of these generic machines would be transformed, depending on the
software program it was running, into an almost infinite range of specific
devices, from word processor to communications tool to simulation model
to (no doubt surreptitiously) video game console. Historians of technology
are only just beginning to come to terms with the history of software, a sub-
ject of even larger scope and complexity than the history of the hardware
that runs it. And as for the larger history of computerization, as it trans-
formed the ways in which the Salk biologists conceptualize and practice
their discipline, or engineers and architects design and build things, or
artists make music, movies, or photographs, or average citizens communi-
cate, consume, and interact with their environment—these are obviously
not just one history but many, all linked in fundamental and significant
ways by their shared reliance on the vast sociotechnological network of
computers, microprocessors, and other digital devices.
It may be that the story of the computerization or, as I will argue, the
digitization of modern society is too massive, recent, or amorphous a topic
for any one discipline to claim in its entirety. Communications depart-
ments, information schools, interdisciplinary programs in the digital
humanities, and the emerging discipline of internet studies have all laid
claim to some of this territory, and for legitimate reasons. But many of
these approaches are frustratingly ahistorical, adopting unquestioningly
the claims of computer enthusiasts and internet utopians that we are living
through a technological revolution unprecedented in all of human history.
There is a desperate need for historians of technology, with their long tra-
dition of providing nuanced, theoretically sophisticated analyses of tech-
nological and cultural developments, to provide some historical context for
understanding these phenomena.
In this essay I will explore the ways in which the history of science and
technology has thus far engaged with the history of computers, computing,
computerization, and other closely related technologies and practices. I will
argue for a new approach toward integrating these histories and addressing
more directly the broader questions being raised by academics in other dis-
ciplines, by policy makers and business leaders, and by the larger general
public.
Whither the History of Computing?
The conventional classification used within the history of technology
discipline to designate works dealing with the topics outlined above is “his-
tory of computing.” For most of the past few decades this has been a serv-
iceable category, covering in theory both machines (computers) and proc-
esses (computing). In recent years, as our understanding of the relevant
histories of modern-day ICTs (information and communications tech-
nologies) and other digital devices has expanded to include a whole host of
developments and technologies for which no one term is a satisfactorily
comprehensive descriptor—including, for example, the data-processing
machines that predated the electronic digital computer, such as the me-
chanical tabulating machine, or the many communication devices whose
histories are essential to understanding the social and technological archi-
tecture of the contemporary smart-phone—specialists in the history of
computing have experimented with using other unifying concepts around
which to organize their respective disciplines. For example, it is no coinci-
dence that so many of these historians hold positions in schools of infor-
mation, given that the seemingly universal desire to manage and control in-
formation is a common theme in much of their work. This said, “history of
computing” remains the dominant, catchall term for describing all these
subdisciplines.
Within the history of computing literature, the primary concern has
been the development of the electronic digital computer. This represents
both the popular understanding of what is the most significant innovation
in the history of computing and the background of many of the ear-
liest historians working in this area. These included many computer pro-
fessionals-turned-amateur historians who, like many non-academic histo-
rians of technology, were concerned primarily with the key moments of
invention and questions of priority.5 The academic historians who wrote
about computing tended to have backgrounds in the history of science,
mathematics, or technology, and although they produced much more
sophisticated histories, they also tended to address questions of interest to
their respective disciplines and focus on the contributions of the tradi-
tional academic, scientific, and engineering elites. As a result, these histo-
ries gravitated naturally toward the high-status activities associated with
the design and theorization of computers, rather than toward the more
mundane work of actual computation. To the degree that they dealt with
computing, as opposed to the computer, they focused almost exclusively on
scientific computing. In the popular literature, of course, the emphasis has
always been on great men and important “firsts,” on the massive early artifacts that now look so impressive mounted in museums, and on the lineage of technological descent from past accomplishments that best explains the shape of things in the present.
5. Herman Lukoff, From Dits to Bits; David E. Lundstrom, A Few Good Men from Univac; Michael Williams, A History of Computing Technology; Alice Rowe Burks, Who Invented the Computer?
It did not take long, however, for the academic historians at least to dis-
cover a history of computing that predated the invention of the electronic
digital computer, and that challenged the very centrality of the computer in
that history. Most obvious were the immediate precursors of the large-scale
electronic-computing experiments of the World War II period, including
mechanical calculating machines, human computing projects, and analog-
electric cybernetic control systems.6 It turned out that there were also entire
industries devoted to information and data processing, such as the business
machines industry, whose origins were distinct from those of scientific
computing and pursued an entirely different technological trajectory, but
which came to define during the immediate postwar period not only the
technical architecture of the electronic computer, but also its cultural
meaning and social significance.7 In fact, the “Cambrian explosion” of in-
novation that occurred in the business machines industry during the last
decades of the nineteenth century, which produced most of the firms, such
as IBM, Burroughs, Honeywell, and Remington Rand, that would later play
such formative roles in the early commercial computer industry, was at
least as significant in the history of modern computing as the later innova-
tions that would emerge from the wartime experiments with electronic cal-
culating machines.8 The fact that none of these companies viewed them-
selves as being primarily involved in “computing,” at least for the first
half-century or more of their existences, complicated our understanding of
what the history of computing was really about. In their excellent (and ex-
traordinarily durable) historical synthesis of this second generation of his-
tory-of-computing literature, Martin Campbell-Kelly and William Aspray
characterized the computer as “the information machine,” which aptly cap-
tured this new perspective on relevant history—or histories, as Michael
Mahoney repeatedly argued is the more appropriate description.9
6. Paul Ceruzzi, Reckoners; David Alan Grier, When Computers Were Human; David A. Mindell, Between Human and Machine.
7. James Beniger, The Control Revolution; JoAnne Yates, Control through Communication; Alfred Chandler and James Cortada, A Nation Transformed by Information.
8. James Cortada, Before the Computer; Lars Heide, Punched-Card Systems and the Early Information Explosion.
9. Martin Campbell-Kelly and William Aspray, Computer; Michael S. Mahoney, “The Histories of Computing(s).”
The expansion of the history of computing to include more informa-
tion-processing technologies than just the electronic computer opened up
the field to a broader range of participants as well. Historians looking
beyond the manufacturing of computers began asking questions about how
computers were used, by whom, and for what purposes. They uncovered
the crucial contributions made by nonelite actors like technicians, opera-
tors, and programmers, and in doing so rediscovered the significant pres-
ence of women in computing.10 They also revealed the ideological dimen-
sions of the computer revolution: far from being an inevitable consequence
of economic rationality, the desire to computerize was often driven by the
need for centralized administrative control, or to advance individual or
professional agendas, or simply to appear cutting-edge and “shiny.”11 For a
wide variety of efficiency experts, systems men, management consultants,
and government officials, the novel and as yet inchoate technology of elec-
tronic computing represented the ideal tool with which to achieve goals that
already had been decided on. In this case, computerization was a means to
an end, not the end in itself. But although this new generation of historians
of computing engaged explicitly with other historical literatures like those
of business, labor, and social history, they continued to take seriously the
centrality of technology in the larger structures of power and processes of
social change. To borrow a felicitous phrase from Jon Agar’s history of
computing initiatives in the British civil service, historians of computing
were “putting the ‘bureau’ back into ‘bureaucracy.’”12 In doing so, they not
only enriched the specialist history of computing literature, but reminded
historians in other subdisciplines that any serious study of mid- to late-
twentieth-century history would necessarily have to engage with innova-
tions in computing and information technology.
10. Jennifer Light, “When Computers Were Women”; Marie Hicks, “Only the Clothes Changed”; Nathan Ensmenger, “Making Programming Masculine.”
11. Thomas Haigh, “The Chromium-Plated Tabulator”; Nathan Ensmenger, “Letting the ‘Computer Boys’ Take Over”; Eden Medina, Cybernetic Revolutionaries; Joseph A. November, Biomedical Computing; Christopher D. McKenna, The World’s Newest Profession.
12. Jon Agar, The Government Machine, 6.
The Protean Machine
Perhaps the most promising development in the recent literature on the
history of computing has been the increasing focus on software. The his-
tory of software has long been recognized as a critical subject of historical
inquiry, but it is only in the past decade that historians have developed the
tools and methods to write about it effectively. While the significance of
software is widely acknowledged, coming to terms with it from a historical
perspective has proven extraordinarily difficult.13
13. Ulf Hashagen, Reinhard Keil-Slawik, and Arthur L. Norberg, eds., History of Computing; Martin Campbell-Kelly, “The History of the History of Software”; Michael S. Mahoney, “What Makes the History of Software Hard.”
First, a note on why software is so central to our modern understand-
ing of what computers are and what they can be used for. The first elec-
tronic digital computers were designed as special-purpose machines un-
derstood primarily in terms of existing traditions of mechanical (or at least
mechanically assisted) calculation. But it was soon realized that, by reengi-
neering these devices to eliminate the distinction between the operating
instructions of the device (its program) and the data on which it operated,
the electronic digital computer could be reinvented—and reconceptual-
ized—as a universal logic machine. It is this inherent flexibility, and its abil-
ity to be programmed via software to serve an almost infinite number of
purposes, that makes the electronic digital computer such a powerful and
compelling technology. Given the right software, an electronic digital com-
puter can simulate, control, or even replicate almost any other complex
technological, social, or even biological system. “What the gears cannot do
the computer might,” the pioneering computer scientist Seymour Papert
famously suggested, “The computer is the Proteus of machines. Its essence
is its universality, its power to simulate.”14 While the perceived universality
of the computer has certainly been overstated, it is clear that it is software,
as much as the computer itself, that makes such claims and predictions
plausible.
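To make the stored-program idea concrete, the distinction between a machine's instructions and its data being erased so that a new program turns the same hardware into a different device, here is a minimal sketch in Python. The tiny instruction set and the two sample programs are invented for illustration and are not drawn from the essay or from any historical machine.

```python
# A toy stored-program machine: instructions and data live in the same memory
# list, so nothing distinguishes them except how they are used.
# (Illustrative sketch only; the instruction set is invented.)

def run(memory):
    """Execute instructions stored in memory, starting at cell 0, until HALT."""
    pc = 0  # program counter
    while True:
        op, *args = memory[pc]
        if op == "HALT":
            return memory
        elif op == "ADD":            # memory[dst] = memory[a] + memory[b]
            a, b, dst = args
            memory[dst] = memory[a] + memory[b]
        elif op == "PRINT":
            print(memory[args[0]])
        pc += 1

# The same "hardware," two different programs:
adder   = [("ADD", 4, 5, 6), ("PRINT", 6), ("HALT",), None, 2, 3, 0]
greeter = [("PRINT", 2), ("HALT",), "hello, world"]

run(adder)    # behaves like a calculator: prints 5
run(greeter)  # behaves like a teletype: prints "hello, world"
```

Loading a different list into the same interpreter is all it takes to change what the "machine" is, which is the flexibility the essay attributes to the stored-program computer.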
Software is also what defines our relationship to the computer. It is what
we experience when we interact with the machine. It turns the generic, com-
modity computer configuration—screen, keyboard, and the (quite literally)
black boxes that contain all of its essential circuitry—into a multipurpose
collection of capabilities that reflects our particular requirements and
desires, such as an email client, word processor, media player, simulated
oscilloscope, or a collection of virtual Angry Birds, among many other
things. We might not know what kind of computer we are using or who
manufactured it, but we definitely know what software we are currently run-
ning. It is software that provides the computer with such an unusual degree
of sustained interpretive flexibility, and software that provides the computer
with much of its perceived economic, social, and cultural significance.15
14. Seymour Papert, Mindstorms, viii.
15. Sherry Turkle, The Second Self.
The idea that it is the software that defines the computer is not some
mere flight of fancy sprung from the fevered imagination of a postmodern
theorist, but is rather the essence of all modern theories of computation.
For present-day computer scientists, the computer is by definition a ma-
chine that runs a certain kind of software program; whether the machine is
electronic, digital, biological, or even material is irrelevant. What matters is
that it can run software. It is this notion of the abstract computer, the Pla-
tonic ideal known as the universal Turing machine, that renders the com-
putational mindset so compelling—and indeed, so hegemonic. Any system
that can be described in terms of a Turing machine is a type of computer
and can be understood using computational terminology. This is what al-
lows Dawkins to describe the genome as computer code, the physicist Ste-
phen Wolfram to conclude that the universe is fundamentally digital, and
the psychologist Steven Pinker to represent the human brain as the inter-
section of Darwin and a computer program.16
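The universal Turing machine invoked here is easy to render in code. The sketch below is a minimal single-tape simulator, assuming an invented transition-table format, with a sample table that increments a binary number; it illustrates the abstraction the essay describes rather than reproducing anything from it.

```python
# A minimal single-tape Turing machine simulator (illustrative sketch).
# The transition table maps (state, symbol) -> (new state, symbol to write, move).

def run_tm(tape, transitions, state="start", blank="_"):
    tape = list(tape)
    head = 0
    while state != "done":
        symbol = tape[head] if 0 <= head < len(tape) else blank
        state, write, move = transitions[(state, symbol)]
        if head == len(tape):        # extend the tape to the right as needed
            tape.append(blank)
        elif head < 0:               # extend the tape to the left as needed
            tape.insert(0, blank)
            head = 0
        tape[head] = write
        head += 1 if move == "R" else -1
    return "".join(tape).strip("_")

# Sample table: scan right to the end of the number, then add 1 with carry.
increment = {
    ("start", "0"): ("start", "0", "R"),
    ("start", "1"): ("start", "1", "R"),
    ("start", "_"): ("carry", "_", "L"),
    ("carry", "1"): ("carry", "0", "L"),
    ("carry", "0"): ("done",  "1", "R"),
    ("carry", "_"): ("done",  "1", "R"),
}

print(run_tm("1011", increment))  # -> "1100"
```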
But we are running ahead of ourselves. From a historical perspective,
this understanding of software as the essence of computing took some time
to develop. The first electronic digital computers were simply programma-
ble calculators. The pioneering ENIAC machine, for example, was not so
much programmed as configured, with each new application requiring ex-
tensive preparation, because the machine needed to be rewired using plug
cables and mechanical dials. The work involved in “setting up” the com-
puter was considered to be low-skilled clerical work and was accordingly
assigned to low-status, female machine operators. The assumption of the
ENIAC project leaders and most other early hardware designers was that
once scientists or engineers had decided what work the computer needed
to do, “programming” it to actually complete the task would be a relatively
mechanical process of translating from one language (English, for exam-
ple) into another (machine language, or assembly code). As it turned out,
neither step in this process was straightforward. Not only was the work of
programming extraordinarily difficult, but it also became evident that even
deciding what to program presented serious challenges. Indeed, by the end
of the 1950s the critical “reverse salient” of the nascent computer industry
had shifted from hardware design and construction (building the comput-
ers themselves) to software development (“software” was defined at the
time as being everything about a computer installation that was not obvi-
ously “tubes, transistors, wires, tapes and the like”).17 It was this “everything
else” about computers that turned out to be the real complication.
16. Dawkins, “Genetics”; Stephen Wolfram, A New Kind of Science; Steven Pinker, How the Mind Works.
17. John W. Tukey, “The Teaching of Concrete Mathematics.”
What made software so difficult to develop is exactly what makes it so
interesting to historians. Whereas the computer itself was a definite mate-
rial artifact that could readily be identified and isolated for testing, evalua-
tion, and improvement, software systems were inextricably intertwined
with a larger system of computing that included not just machines, but also
people and processes. The software that had to be developed to computer-
ize an accounting operation, for example, included not only computer
code, but also an analysis of existing operations, the reorganization of pro-
cedures and personnel, the training of users, the construction of peripheral
support tools and technologies, and the production of new manuals and
other documentary materials.18 Of all of the aspects of software develop-
ment, writing the actual application code generally involved no more than
a third of the overall time and effort. And even after the accounting appli-
cation had been designed, coded, tested, and debugged (and in the process
often redesigned and reprogrammed), the system would have to be oper-
ated and, unexpectedly, continuously maintained—not because the soft-
ware application would “break,” but rather because the context in which it was used
or the other systems it interacted with, including such nontechnical sys-
tems as corporate accounting policies and governmental regulations,
would change over time. As much as two-thirds of the costs of a software
system were incurred after the software was developed and operational.19
For computer users, the vague boundary between the social and the tech-
nical aspects embodied by the software was an expensive nightmare; for
historians of technology, it is a goldmine. A better example of the complex-
ity of a sociotechnical system or a heterogeneous network can scarcely be
imagined.20 Software is where the technology of computing intersects with
social relationships, organizational politics, and personal agendas.21
18. Andrew Friedman and Dominic Cornford, Computer Systems Development.
19. Nathan Ensmenger, “Software as History Embodied.”
20. John Law, “Notes on the Theory of Actor-Network”; Bruno Latour, “Social Theory and the Study of Computerized Work Sites.”
21. Thomas Haigh, “Inventing Information Systems.”
As I have argued extensively elsewhere, software is an extraordinarily
heterogeneous technology; it straddles the boundaries between science and
technology, art and engineering, and the intellectual and the material.22
Software is clearly a built object, designed and implemented by humans, yet
it is also a mathematical formalism, an appropriate object of study for the
scientist or theorist.23 The people who develop software refer to themselves
alternatively as programmers, computer scientists, or software engineers—
as well as black artists, wizards, hackers, gurus, and cowboys.24 They do not
fit neatly into established academic or professional categories. The systems
they construct are as much literary as technological productions, and are
often referred to by practitioners as such.25 Like a poem, a program exists
in the mind of its creator, regardless of whether it is ever written or per-
formed. Software might even be considered a form of incantation: words
are spoken (or at least written) and the world changes.26 A computer pro-
gram is invisible, ethereal, and ephemeral. It exists simultaneously as an
idea, as language, as technology, and as practice. Certain forms of software,
such as a sorting algorithm, can be generalized and formalized as mathe-
matical abstractions, while others remain inescapably local and specific,
subject to the particular constraints imposed by corporate cultures, formal
and informal industry standards, and/or government regulations.27 In this
sense, software sits ambiguously at the intersection of science, engineering,
and business. As may be imagined, all this heterogeneity renders software
extraordinarily difficult to isolate, understand, and write about.28
22. Nathan Ensmenger, The Computer Boys Take Over.
23. Michael S. Mahoney, “Software as Science”; Herbert Simon, The Sciences of the Artificial.
24. Maurice Black, “The Art of Code”; Paul Graham, Hackers & Painters; Ensmenger, The Computer Boys Take Over.
25. Frederick Brooks, The Mythical Man-Month.
26. Wendy Hui Kyong Chun, “On ‘Sourcery,’ or Code as Fetish.”
27. JoAnne Yates, Structuring the Information Age.
28. The ephemeral nature of software also poses serious challenges to the archivist, museum curator, and historical researcher. Even when the actual source or machine code of a particular software package has been saved, the social and technological systems that allow it to function are generally not available. For good reason, archivists warn about this period in history degrading into a digital dark age.
Because the heterogeneity of software inevitably shifts the eye of the his-
torian from his traditional focus on the computer as artifact and toward the
larger context in which computers function in society, the emerging schol-
arship in this area has opened up new questions, sources, and sites of his-
torical analysis. Among other things, the study of software allows the histo-
rian of computing the opportunity to analyze failure. Whereas the story of
computer hardware is dominated by triumphal and deterministic progress
narratives driven by the seemingly inexorable march of Moore’s law toward
smaller, faster, and less expensive microelectronics, the history of software is
characterized by conflict, tension, and disillusionment. More than three-
quarters of all software development projects fail to be completed. The costs
associated with software development continue to rise, and complaints
about software projects being over-budget, behind-schedule, and bug-rid-
den remain a constant refrain within the industry literature.29 For more
than four decades key leaders in software development have been warning
about a looming “software crisis” threatening the health and future of their
industry. The Y2K and H1B crises and a whole host of other conflicts and
debates provide ample opportunity to explore questions about technical ex-
pertise, professional identity, community dynamics, and race, ethnicity, and
gender. In fact, most of the controversies attributed to various aspects of
computerization, including concerns about technologically driven unem-
ployment, government surveillance, breaches of privacy, cybercrime, and so
on, are really, at their hearts, debates about software and/or software devel-
opers. While clearly, in the end, software is largely a success story—after all,
without functioning software, there would have been no computer revolu-
tion—the prominent visibility of failure in the software story is an excellent
lens through which to view the messiness and permeability of the computer/
society continuum. As Rosalind Williams wrote about her experience as an
MIT administrator attempting to implement a university-wide software sys-
tem, “In a digital world, technological consciousness and cultural conscious-
ness are simultaneously heightened. . . . The relationship between the two is
one of constant and often painful tradeoffs.”30
29. Eloina Pelaez, “A Gift from Pandora’s Box”; Ceruzzi, “Moore’s Law and Technological Determinism”; David C. Brock and Christophe Lécuyer, “Digital Foundations.”
30. Nathan Ensmenger, “The ‘Question of Professionalism’ in the Computer Fields”; Rosalind Williams, “Historians of Technology in the Information Age.”
Another productive avenue of research opened up by the history of
software has to do with the study of computer use and users. Historians of
technology have long been interested in the ways in which artifacts and
their users are co-constructed through use-practices. Given the inherent
plasticity of the programmable digital computer, this process of co-con-
struction is particularly visible. Obviously some users, such as operators,
technicians, and programmers, have a great deal of control over the struc-
ture, function, and meaning of the technology. But all users have at least the
perception of control over the computer as it is represented and made tan-
gible by software. One of the defining features of software is its literary
nature: the way the software works is determined, to a greater or lesser
degree, by how the software is written. This implies, in theory at least, that
software can also be rewritten, which means that all software is contingent,
transitional, and subject to constant renegotiation and redesign. Whereas
conventional engineers and architects plan carefully before committing
their ideas to manufacturing, computer programmers face no such mate-
rial constraints on their creativity. While this allowed programmers an un-
precedented degree of freedom and creativity (“build first and draw up the
specification afterwards” was a frequent mantra of the software industry),
it also created unrealistic expectations on the part of the ultimate end-users
of the software. Software applications were perpetually works in progress,
with new features being requested and new bugs introduced.
As more of the programmability of the computer was made visible to
the end-users (what is a spreadsheet, after all, but a specialized interface to
a programming language?), the possibilities for users to reconfigure their
software to their own preferences and requirements became even more ap-
parent. This became even more true with the development of the personal
computer, which created both new users and a new category of computer
software: the mass-market, prepackaged consumer good that, as Campbell-
Kelly suggests, has as much in common with the products of the entertain-
ment industry as with the custom-made software systems of the earlier
mainframe era.31 The taxonomy of software types that Campbell-Kelly
develops is an excellent reminder that although all software shares some
essential characteristics, specific software systems, such as those developed
for particular machines, industry standards, regulatory environments, cor-
porate cultures, or technical ecosystems, are very different technologies in-
deed.32 In any case, the software developed for personal computers, which
is inexpensive, often amateurish, and, in the early years at least, extremely
limited, nevertheless provided a kind of power to computer users that was
previously nonexistent. Jon Lindsay, for example, in a recent issue of Tech-
nology and Culture describes the way in which individual air force pilots
could circumvent the policies of the U.S. military simply by loading a piece
of software smuggled on a floppy disk onto a personal computer.33 And
Steven Levy, among others, suggests that the development of spreadsheet
software created the context for a fundamental change within U.S. financial
markets.34 There are few consumer-oriented software packages today that
are not built around the idea that end-users will also act as programmers.
The business models of most present-day video-game companies, for ex-
ample, rely upon the players themselves to generate their games’ core func-
tionality and content.
31. Martin Campbell-Kelly, From Airline Reservations to Sonic the Hedgehog.
32. Jeanette Hofmann, “Writers, Texts and Writing Acts”; Jessica Johnston, Technological Turf Wars.
33. Jon R. Lindsay, “War upon the Map.”
34. Steven Levy, “A Spreadsheet Way of Knowledge”; Robert X. Cringely, Accidental Empires.
Everything Is Digital
The incorporation of the history of software into the history of com-
puting has greatly expanded its ability to address the questions outlined at
the beginning of this essay: namely, how historians of science and technol-
ogy can engage productively with the pervasive and powerful influence of
computers and computational thinking in almost every intellectual, eco-
nomic, and social activity of the previous half-century. Nevertheless, the
rich possibilities suggested by the inclusive term “computing” have been
limited by the narrow specificity implied by the word “computer.” A key
feature of the very phenomena that we are interested in studying has to do
with the hegemony of computational discourse, the way in which an in-
creasing number of complex physical and social systems are being rede-
fined in terms of an abstract and universalizing understanding of the com-
puter. Unless we, as historians, adopt this same broad conception of the
computer as a timeless ontological entity (which is, of course, fundamen-
tally ahistorical), it can be difficult both to acknowledge the computer as a
significant and unifying presence in contemporary history and to find a
useful language for talking about related though definitively distinct tech-
niques and technologies.
One approach to solving this historiographic conundrum is to situate
the electronic digital computer within a larger history of information tech-
nology. This has the distinct virtue of linking the computer to earlier or
parallel technological developments without suggesting, for example, that
a Hollerith tabulator is simply a primitive attempt at implementing a Tur-
ing machine. The term “information technology” also encompasses com-
munications technologies, which in the era of the iPhone are revealed to be
central to our overall understanding of what it means to be a “computer-
ized” society. And for some scholars, the concept of “information” provides
a more fundamental unit of analysis than even the abstract, timeless, uni-
versal computer. According to information theory, for example, informa-
tion is just another property of matter, a measure of its degree of organiza-
tion and the negative of entropy. Almost everyone else uses the term in a
more colloquial sense, but with the shared assumption that information,
and the desire to organize and communicate it, is common across all peri-
ods and cultures. The study of how various societies and individuals engage
in information-seeking behaviors, develop systems of information organi-
zation, management, and communication, and conceptualize the role of
information in other processes of social and cultural change has proven to
be an important complement to the history-of-computing approach.35
35. Chandler and Cortada, A Nation Transformed by Information; Bruce Allen Bimber, Information and American Democracy; Daniel Headrick, When Information Came of Age; Reijo Savolainen, Everyday Information Practices; William Aspray and Barbara M. Hayes, Everyday Information.
The problem with the term “information” is that it contains too many
multitudes; outside of the technical literature, it is used almost indiscrimi-
nately. Once information is adopted as a fundamental unit of analysis, then
almost everything becomes an information technology: cuneiform scratch-
es, quipu knots, smoke signals, quill pens and parchment, church bells,
newspapers, optical telegraphs, and so on. While there are some interesting
commonalities among all of these technologies, it is not clear that lumping
them together into a single conceptual category adds much to our under-
standing of information. Although there are important historical questions
that, very broadly, can be asked about the history of information, the dan-
ger is that we lose sight of the specific character of the underlying techno-
logical changes that make some forms of information technology especially
meaningful in specific historical contexts. In popular literature on informa-
tion in particular, the role of technology is simultaneously taken too seri-
ously (making it fundamentally determinist) and not seriously enough (in
the sense that the particularities of any given technological system are
rarely analyzed in any actual detail). There does seem to be something espe-
cially powerful, for example, about the digital representations of informa-
tion made possible by the technology of the computer. These representa-
tions are not unprecedented and do not stand outside of history as some
enthusiasts and theorists would have us believe, but their specific technical
features are nonetheless highly significant. While the laboratories at the
Salk Institute have always been deeply connected to the history of infor-
mation, the remarkable changes in scientific practice and material culture
that have occurred there in recent decades seem more fundamental than a
mere change in scale or scope.
For these reasons and more, I am going to suggest that a productive
strategy for addressing the questions raised by computers and information
technology is to talk not in terms of “computerization,” but rather “digitiza-
tion.” There are obvious similarities between the two processes, but focusing
on the constellation of technologies and practices linked together by the
common characteristic of being digital offers some significant advantages.
First, a few clarifications are necessary: not all computers are digital,
and not all digital devices are computers. Not every digital device encodes
information in exactly the same format, although most modern digital
devices store and communicate data in a binary format. Digital devices are
not necessarily electronic—consider, for example, the digital data stored in
the punched cards of a Jacquard loom or the paper tapes that control a me-
chanical player piano—but the invention of the vacuum tube and the tran-
sistor made it possible to communicate and manipulate digital data as a
series of electronic pulses. Subsequent innovations in chemistry, physics,
and semiconductor manufacturing made possible the mass production of
densely packed collections of transistors on a silicon wafer.36 As a result,
most modern digital devices contain their own tiny microprocessor com-
puter and therefore share a collective family resemblance. Many of the
same hardware and software technologies that can be found in the internals
of your laptop computer can also be found in your cell phone, digital cam-
era, and high-definition LED television. The skills required to design and
program digital devices are therefore the same as those required to design and pro-
gram more conventional computers. Nevertheless, the concepts of “com-
puter” and “digital” are not always interchangeable, and it is important to
maintain this distinction.
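The point that digital data is independent of any particular medium can be illustrated with a short sketch (Python, invented for this purpose, not drawn from the essay): the same bit sequence derived from a piece of text could be carried equally well by voltages, magnetic spots, or holes punched in a card.

```python
# Digital data is medium-independent: the same bit sequence can be carried by
# voltages, magnetic spots, or holes in a card. (Illustrative sketch.)

def to_bits(text):
    """Encode text as a string of bits (8 bits per byte, UTF-8)."""
    return "".join(f"{byte:08b}" for byte in text.encode("utf-8"))

def from_bits(bits):
    """Decode a bit string back into text."""
    data = bytes(int(bits[i:i + 8], 2) for i in range(0, len(bits), 8))
    return data.decode("utf-8")

def punch_card(bits):
    """Render the bits as a punched-card-style row: 'O' = hole, '.' = no hole."""
    return "".join("O" if b == "1" else "." for b in bits)

bits = to_bits("Salk")
print(bits)              # 01010011011000010110110001101011
print(punch_card(bits))  # .O.O..OO.OO....O.OO.OO...OO.O.OO
print(from_bits(bits))   # Salk
```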
So why digitization and not simply computerization? To begin with, of
the two concepts, the former is broader and more inclusive, encompassing
both conventional computing devices and novel hybrid technologies like
smartphones and video game consoles, while still being coherent enough to
remain analytically productive; the latter is limited by its close historical
association with one particular technology. This association has long been
problematic: many of the founders of the discipline of computer science
soon regretted the conflation of “computing” with “computer,” and within
a decade of its founding, the discipline’s principal professional society, the
Association for Computing Machinery, proposed dropping “machinery” al-
together. In any case, to computerize an organization or process still im-
plies the adoption of certain types of machines; but what it means for that
same organization or process to “go digital,” on the other hand, involves a
range of technologies and practices, some of which might incorporate
traditional computers, though not necessarily.
36. Ross Knox Bassett, To the Digital Age; Christophe Lécuyer, Making Silicon Valley.
More significantly, the idea of digitization encompasses not only arti-
facts, but also data representations. Among the many commonalities found
in digital devices—including the fact that most are constructed around a
common core of components and formal and informal standards and
design conventions—they all, by definition, operate on data stored in a dig-
ital format. The formats built around binary data are particularly amenable
to being stored and manipulated electronically. Digital data is not required
to be either binary or electronic, but the shared infrastructure built around
the data stored digitally in such a medium allows digital devices an extraor-
dinary range of interconnectivity. Once data is in digital format, it can be
replicated, transformed, and communicated by using an ever-increasing
range of readily available technologies. The notion of being digital implies
therefore both a specific type of technology and the structure of the under-
lying data. Unlike information, which lacks grounding in any particular
medium, digital implies an underlying technological architecture. This is
important because, as we well know, the construction of any technological
architecture is never a value-neutral proposition. In the same way that to
write something down is to transform it, to represent something in digital
format is to fundamentally alter its nature.37 This is an essential insight
drawn from the history of technology that is too often neglected in most
popular treatments of the computer and the information revolution.38
37. Paul Edwards, The Closed World; Kathryn Henderson, On Line and on Paper.
38. James Gleick, The Information.
It is important to note that this process of digitization is not the same
as quantification. Although digital data is essentially numeric data (binary
data is typically represented as a series of “1”s and “0”s, for example), to
digitize a phenomenon is not simply to translate it into numbers. The
defining motivation of quantification is measurement; the principal goal of
digitization, however, is manipulation. The representation of an acoustic
wave as an MP3 file, for example, involves much more than another
method of measuring and quantifying sound; in fact, as a means of cap-
turing the information contained in the original sound wave, the MP3 for-
mat represents a regression from alternative, analog representations. Al-
though the digital data in an MP3 file is numeric data, these numbers are
not so much a measurement of sound as a model of sound. The value of that
model is not so much that it is accurate as that it is manipulable; MP3 data is
valuable because it is easy to capture, store, communicate, analyze, and
transform. It is only within a digital ecosystem of networks and devices that
digital data becomes truly significant; but the rapidly increasing scale and
scope of this ecosystem makes the imperative to digitize almost irresistible.
There is, of course, a close relationship between the historical processes that
encouraged and enabled quantification and those that currently drive dig-
itization, but they are not identical.
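A toy digitizer makes the manipulation-versus-measurement distinction concrete. In the hypothetical sketch below (the sample rate and bit depth are arbitrary choices, not taken from the essay or from the MP3 standard), a tone is reduced to a list of quantized samples, after which reversing, attenuating, or mixing it are trivial list operations.

```python
import math

# Digitize a 440 Hz tone: sample it and quantize each sample to a small integer.
# The resulting list is less a measurement of the sound than a model of it;
# its value is that it can be copied, transmitted, and manipulated trivially.

SAMPLE_RATE = 8000   # samples per second (arbitrary for this sketch)
LEVELS = 256         # 8-bit quantization

def digitize(freq_hz, duration_s):
    n = int(SAMPLE_RATE * duration_s)
    samples = []
    for i in range(n):
        value = math.sin(2 * math.pi * freq_hz * i / SAMPLE_RATE)  # analog value in [-1, 1]
        samples.append(int((value + 1) / 2 * (LEVELS - 1)))         # quantized to 0..255
    return samples

tone = digitize(440, 0.01)

# Once digital, manipulation is just list processing:
reversed_tone = tone[::-1]                                    # play it backwards
quieter = [s // 2 for s in tone]                              # attenuate
mixed = [(a + b) // 2 for a, b in zip(tone, reversed_tone)]   # mix two signals

print(len(tone), tone[:8])
```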
It is the combination of data and the means of manipulating it that
makes the concept of digitization so much more compelling than comput-
erization. The common project of the scientists, technicians, and support
staff at the present-day Salk Institute is not, after all, the wholesale adoption
of computer technology, but rather the generation, manipulation, and pres-
entation of digital data. To play once more on an observation of Latour, they
are a strange tribe of compulsive and manic digitizers, not compulsive and
manic computerizers. And the digital data generated by the instruments and
researchers at the institute are not simply digital representations of the
numeric data or paper documents previously recorded or inscribed in an
earlier era; what makes this new kind of digital data so powerful is that it can
be incorporated into a digital model represented in software on a digital
computer. The integration of digital data and digital model allows for the
digital simulation of the original physical system, which is what is so revo-
lutionary about the presence of the computer in the scientific laboratory.
New knowledge is produced not by observing and experimenting on the
natural world, but by simulating the natural world within a virtual environ-
ment. This is a fundamental shift in the epistemological foundations of the
scientific enterprise. Peter Galison writes about the origins of this shift in
microphysics; Paul Edwards describes this process as it occurred in meteor-
ology; and Diane Bailey, Paul Leonardi, and Stephen Barley describe a sim-
ilar process under way in automotive engineering.39 There is a small but
growing literature on the philosophical implications of digital modeling in
the sciences.40 The influence of digital technology on the processes of scien-
tific knowledge production and engineering design is nevertheless not yet
thoroughly documented by historians and is therefore an area that begs for
further work.
39. Peter Galison, “Computer Simulations and the Trading Zone”; Paul Edwards, A
Vast Machine; Diane Bailey, Paul Leonardi, and Stephen Barley, “The Lure of the Virtual.”
40. Eric Winsberg, Science in the Age of Computer Simulation.
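A toy example may clarify the pattern, with the obvious caveat that the climate, physics, and engineering models discussed in this literature are incomparably more elaborate. In the Python sketch below, with invented parameter values, digital data supplies the parameters and a digital model supplies the rule for stepping the system forward; the "experiment" then takes place entirely inside the simulation.

```python
# Toy example only: the cited climate and physics models are vastly more
# complex, but the pattern is the same. Digital data (measured parameters)
# is combined with a digital model (a rule for stepping the system forward)
# to simulate a physical system instead of observing it directly.

# "Data": parameters that would come from instruments or prior measurement.
initial_temp_c = 90.0        # temperature of a cooling object (assumed)
ambient_temp_c = 20.0        # temperature of the surrounding room (assumed)
cooling_rate = 0.05          # per-minute rate constant (assumed)

# "Model": Newton's law of cooling, advanced in one-minute steps.
def simulate(minutes):
    temp = initial_temp_c
    history = [temp]
    for _ in range(minutes):
        temp += -cooling_rate * (temp - ambient_temp_c)   # dT = -k (T - T_ambient)
        history.append(temp)
    return history

# "Experiment in a virtual environment": ask the model a question that
# would otherwise require waiting and measuring.
trajectory = simulate(60)
print(f"Temperature after an hour: {trajectory[-1]:.1f} °C")
```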
A vivid illustration of the difference between computerization and dig-
itization can be found in the motion-picture industry. Almost all contem-
porary filmmaking (the word “film” here is a quaint reference to an earlier
technological era) incorporates at least some degree of computer-generated
graphics, if only to draw in a background or to erase unwanted elements.
In fact, in most studios the production process, from start to finish, has
become almost entirely digital, and computers are therefore omnipresent and
indispensable. But the computer is only one element of a larger digital
toolchain whose components, taken together, have entirely transformed
the modern film industry. Computer-generated images are more
than simply the representations, in digital form, of the same visual infor-
mation that in a previous generation would have been captured and stored
using photochemistry. To be sure, there are some that are mere digital
paintings or virtual backdrops, but for the most part, computer-generated
images are two-dimensional snapshots of what is really a three-dimen-
sional digital model or sculpture. Computer artists in the film industry do
not use computers so much to draw digital images as to construct digital
environments. Once a setting or character has been modeled in digital for-
mat it takes on an almost material reality. If a director wants to change the
angle of the shot or the perspective of the camera, he or she does not need
to have an artist or animator redraw the scene; instead, a virtual camera is
simply rotated within the digital environment. In an epic production like
the Lord of the Rings trilogy, entire armies of virtual actors were developed
to populate the massive battle scenes.41 These digital soldiers were not ani-
mations drawn on computers; they were simulated life forms whose broad
patterns of behavior were preprogrammed into their software, but whose
individual actions emerged only as the simulation played out in real time.
The essential characteristic of this new mode of film production is that it is
digital, not simply that it is computerized; the difference is that digital im-
plies both a kind of tool and a model of data representation.
41. Thompson, “Scale, Spectacle and Vertiginous Movement.”
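The logic of the virtual camera can be suggested with a small Python sketch; the scene geometry and the pinhole projection are invented for the example and bear no relation to any studio's actual pipeline. Because the set exists as a three-dimensional model, producing a new shot means re-projecting the same points from a new viewpoint rather than redrawing anything.

```python
# Illustrative sketch only: once a scene exists as a three-dimensional
# digital model, "changing the camera angle" is just re-projecting the
# same points from a new viewpoint; nothing is redrawn by hand.

import math

# A digital "set": a few 3-D points standing in for modeled geometry.
scene = [(0.0, 0.0, 5.0), (1.0, 0.5, 6.0), (-1.0, 1.0, 7.0)]

def rotate_y(point, angle_rad):
    """Rotate a point about the vertical (y) axis, i.e. swing the camera around it."""
    x, y, z = point
    cos_a, sin_a = math.cos(angle_rad), math.sin(angle_rad)
    return (cos_a * x + sin_a * z, y, -sin_a * x + cos_a * z)

def project(point, focal_length=1.0):
    """Simple pinhole projection of a 3-D point onto a 2-D image plane."""
    x, y, z = point
    return (focal_length * x / z, focal_length * y / z)

# The same model rendered from two different camera angles:
for angle_deg in (0, 30):
    angle = math.radians(angle_deg)
    frame = [project(rotate_y(p, angle)) for p in scene]
    print(f"camera at {angle_deg:>2}°:",
          [(round(u, 2), round(v, 2)) for u, v in frame])
```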
There are many questions that the history of technology might ask
about the process of digitization as it has occurred in industries and activ-
ities as varied as science, filmmaking, musical performance, engineering
design, and social interaction. Many of these involve the traditional con-
cerns of the historian of computing. But there are also new insights to be
gained from shifting our emphasis from the computer to the digital. At the
moment, our treatment of these topics is too narrow, too focused on ma-
chines rather than data, representations, and processes. In the broader cul-
ture, talk of computerization and the computer age already sounds dated
and irrelevant. Without chasing too closely the whims of the contemporary
zeitgeist, historians of technology need to engage more directly with the
popular conversation about the digital revolution. It is the activity of digi-
tization, not computerization, that is occurring repeatedly throughout our
laboratories, factories, studios, schools, shopping malls, and living rooms.
Like the terms “computer,” “computing,” “information,” and “information
technology,” “digital” can be a vague and elusive descriptor. Nevertheless,
the idea of the digital captures better than its alternatives the distinctive
features of the technological and conceptual phenomenon that we are in-
terested in understanding. Stepping back and witnessing these acts of dig-
ital inscription with new eyes and a deliberately naive perspective, we will
uncover new questions, challenge stale historiographical certainties, and
produce better and more compelling histories.
Bibliography
Agar, Jon. The Government Machine: A Revolutionary History of the Com-
puter. Cambridge, MA: MIT Press, 2003.
Aspray, William, and Barbara M. Hayes. Everyday Information: The Evolu-
tion of Information Seeking in America. Cambridge, MA: MIT Press,
2011.
Bailey, Diane E., Paul M. Leonardi, and Stephen R. Barley. “The Lure of the
Virtual.” Organization Science 23, no. 5 (2012): 1485–1504.
Bassett, Ross Knox. To the Digital Age: Research Labs, Start-Up Companies,
and the Rise of MOS Technology. Baltimore: Johns Hopkins University
Press, 2002.
Beniger, James. The Control Revolution: Technological and Economic Origins
of the Information Society. Cambridge, MA: Harvard University Press,
1986.
Bimber, Bruce Allen. Information and American Democracy. New York:
Cambridge University Press, 2003.
Black, Maurice. “The Art of Code” (Ph.D. diss., University of Pennsylvania,
2002).
Borck, Cornelius. “Toys Are Us: Models and Metaphors in Brain Research.”
In Critical Neuroscience: A Handbook of the Social and Cultural Contexts
of Neuroscience, edited by S. Choudhury and J. Slaby, 113–34. Oxford:
Wiley-Blackwell, 2011.
Brock, David C., and Christophe Lécuyer. “Digital Foundations: The
Making of Silicon-Gate Manufacturing Technology.” Technology and
Culture 53 (2012): 561–97.
Brooks, Frederick. The Mythical Man-Month. Reading, MA: Addison-Wes-
ley, 1982.
Burks, Alice Rowe. Who Invented the Computer? The Legal Battle That
Changed Computing History. New York: Prometheus Books, 2002.
Campbell-Kelly, Martin. From Airline Reservations to Sonic the Hedgehog:
A History of the Software Industry. Cambridge, MA: MIT Press, 2003.
———. “The History of the History of Software.” IEEE Annals of the His-
tory of Computing 29, no. 4 (2007): 40–51.
———, and William Aspray. Computer: A History of the Information Ma-
chine. New York: Basic Books, 1996.
Ceruzzi, Paul. “Moore’s Law and Technological Determinism: Reflections on
the History of Technology.” Technology and Culture 46 (2005): 584–93.
———. Reckoners: The Prehistory of the Digital Computer, from Relays to
the Stored Program Concept, 1935–1945. Westport, CT: Greenwood
Press, 1983.
Chandler, Alfred, and James Cortada, eds. A Nation Transformed by Infor-
mation: How Information Has Shaped the United States from Colonial
Times to the Present. New York: Oxford University Press, 2000.
Chun, Wendy Hui Kyong. “On ‘Sourcery,’ or Code as Fetish.” Configurations
16 (2008): 299–324.
Cortada, James. Before the Computer: IBM, Burroughs, and Remington Rand
and the Industry They Created, 1865–1956. Princeton, NJ: Princeton
University Press, 1993.
Cringely, Robert X. Accidental Empires: How the Boys of Silicon Valley Make
Their Millions, Battle Foreign Competition, and Still Can’t Get a Date.
Reading, MA: Addison-Wesley, 1992.
Crowther-Heyck, Hunter. “George A. Miller, Language, and the Computer
Metaphor of Mind.” History of Psychology 2, no. 1 (1999): 37–64.
Dawkins, Richard. “Genetics: Why Prince Charles Is So Wrong.” The Times
(London), 28 January 2003.
Edgerton, David. “Innovation, Technology, or History: What Is the Histori-
ography of Technology About?” Technology and Culture 51 (2010): 680–
97, esp. 680.
Edwards, Paul. The Closed World: Computers and the Politics of Discourse in
Cold War America. Cambridge, MA: MIT Press, 1996.
———. A Vast Machine: Computer Models, Climate Data, and the Politics of
Global Warming. Cambridge, MA: MIT Press, 2010.
Ensmenger, Nathan. “The ‘Question of Professionalism’ in the Computer
Fields.” IEEE Annals of the History of Computing 23, no. 4 (2001): 56–
73.
———. “Letting the ‘Computer Boys’ Take Over: Technology and the Poli-
tics of Organizational Transformation.” International Review of Social
History 48, no. 11 (2003): 153–80.
———. “Software as History Embodied.” IEEE Annals of the History of
Computing 31, no. 1 (2009): 88–91.
———. The Computer Boys Take Over: Computers, Programmers, and the
Politics of Technical Expertise. Cambridge, MA: MIT Press, 2010.
———. “Making Programming Masculine.” In Gender Codes: Why Women
Are Leaving Computing, edited by Thomas J. Misa, 115–42. Hoboken,
NJ: John Wiley & Sons, 2010.
Friedman, Andrew, and Dominic Cornford. Computer Systems Develop-
ment: History, Organization, and Implementation. New York: John Wiley
& Sons, 1989.
Galison, Peter. “Computer Simulations and the Trading Zone.” In The Dis-
unity of Science: Boundaries, Contexts, and Power, edited by Peter Gali-
son and David J. Stump, 118–57. Palo Alto, CA: Stanford University
Press, 1996.
Gleick, James. The Information: A History, a Theory, a Flood. New York:
Pantheon Books, 2011.
Graham, Paul. Hackers & Painters: Big Ideas from the Computer Age. Sebas-
topol, CA: O’Reilly Media, Inc., 2004.
Grier, David Alan. When Computers Were Human. Princeton, NJ: Princeton
University Press, 2005.
Haigh, Thomas. “The Chromium-Plated Tabulator: Institutionalizing an
Electronic Revolution, 1954–1958.” IEEE Annals of the History of Com-
puting 23, no. 4 (2001): 75–104.
———. “Inventing Information Systems: The Systems Men and the Com-
puter, 1950–1968.” Business History Review 75, no. 1 (2001): 15–61.
Haraway, Donna. “Cyborg Manifesto: Science, Technology, and Socialist-
Feminism in the Late Twentieth Century.” In Simians, Cyborgs, and
Women: The Reinvention of Nature, 149–81. New York: Routledge, 1991.
Hashagen, Ulf, Reinhard Keil-Slawik, and Arthur L. Norberg, eds. History
of Computing: Software Issues. Berlin: Springer-Verlag, 2002.
Headrick, Daniel R. When Information Came of Age: Technologies of Knowl-
edge in the Age of Reason and Revolution, 1700–1850. New York: Oxford
University Press, 2000.
Heide, Lars. Punched-Card Systems and the Early Information Explosion,
1880–1945. Baltimore: Johns Hopkins University Press, 2009.
Henderson, Kathryn. On Line and on Paper: Visual Representations, Visual
Culture, and Computer Graphics in Design Engineering. Cambridge, MA:
MIT Press, 1999.
Hicks, Marie. “Only the Clothes Changed: Women Operators in British
Computing and Advertising, 1950–1970.” IEEE Annals of the History of
Computing 32, no. 4 (2010): 5–17.
Hofmann, Jeanette. “Writers, Texts and Writing Acts: Gendered User
Images in Word Processing Software.” In The Social Shaping of Technol-
ogy, 2nd ed., edited by Donald MacKenzie and Judy Wajcman, 222–43.
Philadelphia: Open University Press, 1999.
Johnston, Jessica R. Technological Turf Wars: A Case Study of the Computer
Antivirus Industry. Philadelphia: Temple University Press, 2008.
Kay, Lily. “Who Wrote the Book of Life? Information and the Transforma-
tion of Molecular Biology.” Science in Context 8 (1995): 609–34.
Latour, Bruno. “Social Theory and the Study of Computerized Work Sites.”
In Information Technology and Changes in Organizational Work: Images
and Reflections, edited by Wanda J. Orlikowski, Geoff Walsham, Matth-
ew R. Jones, and Janice I. DeGross, 295–307. London: Chapman & Hall,
1996.
———, and Steve Woolgar. Laboratory Life: The Social Construction of Sci-
entific Facts. Beverly Hills, CA: Sage Publications, 1979.
Law, John. “Notes on the Theory of Actor-Network: Ordering, Strategy, and
Heterogeneity.” Systems Practice 5, no. 4 (1992): 379–93.
Lécuyer, Christophe. Making Silicon Valley: Innovation and the Growth of
High Tech, 1930–1970. Cambridge, MA: MIT Press, 2006.
Levy, Steven. “A Spreadsheet Way of Knowledge.” In Computers in the Hu-
man Context: Information Theory, Productivity, and People, edited by
Tom Forester, 318–26. Cambridge, MA: MIT Press, 1989.
Light, Jennifer. “When Computers Were Women.” Technology and Culture
40 (1999): 455–83.
Lindsay, Jon R. “‘War upon the Map’: User Innovation in American Mili-
tary Software.” Technology and Culture 51 (2010): 619–51.
Lukoff, Herman. From Dits to Bits: A Personal History of the Electronic Com-
puter. Portland, OR: Robotics Press, 1979.
Lundstrom, David E. A Few Good Men from Univac. Cambridge, MA: MIT
Press, 1987.
Mahoney, Michael S. “Software as Science—Science as Software.” In His-
tory of Computing: Software Issues, edited by Ulf Hashagen, Reinhard
Keil-Slawik, and Arthur L. Norberg, 25–48. Berlin: Springer-Verlag,
2002.
———. “The Histories of Computing(s).” Interdisciplinary Science Reviews
20, no. 2 (2005): 119–35.
———. “What Makes the History of Software Hard.” IEEE Annals of the
History of Computing 30, no. 3 (2008): 8–18.
McKenna, Christopher D. The World’s Newest Profession: Management Con-
sulting in the Twentieth Century. New York: Cambridge University Press,
2006.
Medina, Eden. Cybernetic Revolutionaries: Technology and Politics in Allen-
de’s Chile. Cambridge, MA: MIT Press, 2011.
Mindell, David A. Between Human and Machine: Feedback, Control, and
Computing before Cybernetics. Baltimore: Johns Hopkins University
Press, 2002.
November, Joseph A. Biomedical Computing: Digitizing Life in the United
States. Baltimore: Johns Hopkins University Press, 2012.
Peláez, Eloína. “A Gift from Pandora’s Box: The Software Crisis” (Ph.D.
diss., University of Edinburgh, 1988).
Papert, Seymour. Mindstorms: Children, Computers, and Powerful Ideas.
New York: Basic Books, 1993.
Pinker, Steven. How the Mind Works. New York: W. W. Norton, 2009.
Savolainen, Reijo. Everyday Information Practices: A Social Phenomenologi-
cal Perspective. Lanham, MD: Scarecrow Press, 2008.
Simon, Herbert. The Sciences of the Artificial. Cambridge, MA: MIT Press,
1969.
Thompson, Kirsten Moana. “Scale, Spectacle and Vertiginous Movement:
Massive Software and Digital Special Effects in The Lord of the Rings.” In
From Hobbits to Hollywood: Essays on Peter Jackson’s Lord of the Rings,
edited by Ernest Mathijs and Murray Pomerance, 283–99. Amsterdam,
The Netherlands: Rodopi, 2006.
Tukey, John W. “The Teaching of Concrete Mathematics.” American Mathe-
matical Monthly 65, no. 1 (1958): 1–9.
Turkle, Sherry. The Second Self: Computers and the Human Spirit. Cam-
bridge, MA: MIT Press, 1984.
Williams, Michael. A History of Computing Technology. Washington, DC:
IEEE Computer Society Press, 1997.
Williams, Rosalind. “‘All That Is Solid Melts into Air’: Historians of Tech-
nology in the Information Revolution.” Technology and Culture 41
(2000): 641–68.
Winsberg, Eric. Science in the Age of Computer Simulation. Chicago: Univer-
sity of Chicago Press, 2010.
Wolfram, Stephen. A New Kind of Science. Champaign, IL: Wolfram Media,
2002.
Yates, JoAnne. Control through Communication: The Rise of System in Amer-
ican Management. Baltimore: Johns Hopkins University Press, 1989.
———. Structuring the Information Age: Life Insurance and Technology in
the Twentieth Century. Baltimore: Johns Hopkins University Press, 2005.