Audrey Watters

Introducing Tressie

4 min read

I was honored to get to introduce Tressie McMillan Cottom this week when she delivered the opening keynote at the Digital Pedagogy Lab Institute. Here's what I said:

When I was first asked to introduce the keynote today, I thought about wearing my Denver Broncos t-shirt to troll Tressie, who I haven’t seen since my team – go Broncos – beat her team, the Carolina Panthers, in the Super Bowl. See, I would have tied it all together though, something about Cam Newton and how society demands certain bodies – Black folks in this case – perform a certain kind of emotional labor alongside physical and intellectual labor, what that looks like not just in post-game interviews, but what that looks like in academia, what that looks like in a public talk.

I was also sorely tempted to tell you an anecdote from that one time we were together on the 17-hour flight from Johannesburg, South Africa, to Atlanta, Georgia and Magic Mike XXL was one of the in-flight movies. But the set-up is kinda long, and it perhaps requires that you have seen the movie and know a little about “Where Mike Got the Magic.” So I’ll save the story for the cocktail hour.

I actually want to be serious with this intro, because Tressie does some of the most seriously important work of anyone I know. In the last four years, her scholarship has become foundational to my own, as we work to analyze the systems and stories surrounding “skills,” “markets,” “certification,” and “schooling.”

I can tell you the first time I heard of Tressie McMillan Cottom. It was 2012. Tressie had written a response (or two) to an article published in The Chronicle of Higher Education by a right-wing pundit whose name isn't worth mentioning, and she had started a petition to have the pundit dismissed from the publication. I caught wind of this all on Twitter (because thankfully, I’m not in academia anymore and I needn’t subscribe to The Chronicle).

In the Chronicle article in question, this pundit argued for the elimination of Black Studies departments by viciously mocking and attacking the work of three doctoral students. The work of three female doctoral students. The work of three Black women.

Perhaps it’s a familiar story to us now: a publication hires someone it knows is going to say outrageous things. That person writes something outrageous. Outrage ensues. Outrage and virality. The publication then solicits articles, from the offender and the offended, in response – “We encourage you to weigh in!” – an attempt, let’s be honest, to extend not resolve the outrage. As the business model for online publishing increasingly depends on page-views, we get rage clicks. Hate reads.

And Tressie, then a doctoral student herself, named it. She named it for what it is – not just the baiting (link baiting, click baiting, race baiting), but “the institutional logic of racism.” The institutional logic of racism at work on the pages of the premier publication for higher education, one that echoes the institutional logic of racism in higher education.

The Chronicle of Higher Education is just one of the many, many gatekeepers in higher education. It’s the publication that faculty, staff, administrators, and yes graduate students are urged to turn to for the latest on the state of the institution, the disciplines, the politics, the future. It helps identify and shape the important issues, the important characters. The Chronicle, like all gatekeepers, carves out who belongs, whose scholarship – whose lives – matter. These gatekeepers distinguish, designate, and reinforce prestige.

Higher ed is, as Tressie’s work reminds us, a “prestige cartel.” (Her book Lower Ed: The Troubling Rise of For-Profit Colleges in the New Economy will be out in February.) This distinction, this stratification – “high” and “low” – coincides, overlaps with others – “real” and “fake,” “public” and “private,” “open” and “closed,” “Ivy” and the rest of us plebs, and perhaps central to our purposes here at this event, “offline” and “online,” “standardized” and “personalized.” The keywords of the new higher ed-tech economy – “innovative,” “disruptive,” “at scale” versus the old, the traditional, the outmoded, the irrelevant.

I’m honored today to introduce the Digital Pedagogy Lab Institute’s opening keynote, assistant professor of sociology at Virginia Commonwealth University Dr. Tressie McMillan Cottom – a model public scholar, openly and ferociously engaged in issues of education and justice. My friend…

Audrey Watters

Notes from The Real World of Technology

26 min read

Ursula Franklin passed away several weeks ago. Although I'd been exposed to her work via several (Canadian, feminist) technologists and scientists, I hadn't ever read this book. I have to say: it's the best book I've read on technology in a very, very long time.

technology has built the house in which we all live. The house is continually being extended and remodelled.

Technology, like democracy, includes ideas and practices; it includes myths and various models of reality. And like democracy, technology changes the social and individual relationships between us. It has forced us to examine and redefine our notions of power and of accountability.

technology as practice

Technology is not the sum of the artifacts, of the wheels and gears, of the rails and electronic transmitters. Technology is a system. It entails far more than its individual material components. Technology involves organization, procedures, symbols, new words, equations, and, most of all, a mindset.

Technology also needs to be examined as an agent of power and control, and I will try to show how much modern technology drew from the prepared soil of the structures of traditional institutions, such as the church and the military.

technology’s social impact. I myself am overawed by the way in which technology has acted to reorder and restructure social relations, not only affecting the relations between social groups, but also the relations between nations and individuals, and between all of us and our environment. To a new generation, many of these changed relationships appear so normal, so inevitable, that they are taken as given and are not questioned. Yet one can establish clear historical trends. In order to understand the real world of technology and cope with it, we need to have some knowledge of the past, as well as to give some thought to the future.

Central to any new order that can shape and direct technology and human destiny will be a renewed emphasis on the concept of justice. The viability of technology, like democracy, depends in the end on the practice of justice and on the enforcement of limits to power.

The historical process of defining a group by their agreed practice and by their tools is a powerful one. It not only reinforces geographic or ethnic distributions, it also affects the gendering of work.

The common practice that a particular technology represents, in addition to leading to an identification with culture and gender, can also lead to the “right” of the practitioners to an exclusive practice of the technology.

Another facet of the concept of technology as practice is the fact that the practice can define the content.

Work-related technologies make the actual practice easier.

control- and work-related technologies

holistic technologies and prescriptive technologies

Holistic technologies are normally associated with the notion of craft.

Using holistic technologies does not mean that people do not work together, but the way in which they work together leaves the individual worker in control of a particular process of creating or doing something.

It is the first kind of specialization, by product, that I call holistic technology, and it is important because it leaves the doer in total control of the process. The opposite is specialization by process; this I call prescriptive technology.

“division of labour”

a production method.

The amount of material found, and the knowledge that this constitutes only a small fraction of what was produced, assures us of the presence of a large, coordinated production enterprise. It was only when I considered in detail – as a metallurgist – what such a production enterprise would entail, that the extraordinary social meaning of prescriptive technologies dawned on me. I began to understand what they meant, not just in terms of casting bronze but in terms of discipline and planning, of organization and command.

When work is organized as a sequence of separately executable steps, the control over the work moves to the organizer, the boss or manager.

invention. In political terms, prescriptive technologies are designs for compliance.

Today’s real world of technology is characterized by the dominance of prescriptive technologies. Prescriptive technologies are not restricted to materials production. They are used in administrative and economic activities and in many aspects of governance, and on them rests the real world of technology in which we live. While we should not forget that these prescriptive technologies are often exceedingly effective and efficient, they come with an enormous social mortgage. The mortgage means that we live in a culture of compliance, that we are ever more conditioned to accept orthodoxy as normal, and to accept that there is only one way of doing “it.”

Any tasks that require caring, whether for people or nature, any tasks that require immediate feedback and adjustment, are best done holistically. Such tasks cannot be planned, coordinated, and controlled the way prescriptive tasks must be.

When successful, prescriptive technologies do yield predictable results. They yield products in numbers and qualities that can be set beforehand, and so technology itself becomes an agent of ordering and structuring.

The ordering that prescriptive technologies have caused has now moved from ordering at work and the ordering of work, to the prescriptive ordering of people in a wide variety of social situations.

“the digitalized footprints of social transactions,” since the technology can be set up not only to include and exclude participants, but also to show exactly where any individual has spent his or her time.

prescriptive technologies eliminate the occasions for decision-making and judgement in general and especially for the making of principled decisions. Any goal of the technology is incorporated a priori in the design and is not negotiable.

As methods of materials production, prescriptive technologies have brought into the real world of technology a wealth of important products that have raised living standards and increased well-being. At the same time they have created a culture of compliance.


Underlying the different uses of the concept of scale are two different models or metaphors: one is a growth model, the other a production model.

A production model is different in kind. Here things are not grown but made, and made under conditions that are, at least in principle, entirely controllable.

Production, then, is predictable, while growth is not.

choosing a particular university, following a particular regimen, will turn the student into a specifiable and identifiable product.

If there ever was a growth process, if there ever was a holistic process, a process that cannot be divided into rigid predetermined steps, it is education.

The real world of technology seems to involve an inherent trust in machines and devices (“production is under control”) and a basic apprehension of people (“growth is chancy, one can never be sure of the outcome”).

vernacular reality

extended reality: that body of knowledge and emotions we acquire that is based on the experience of others.

constructed or reconstructed reality. Its manifestations range from what comes to us through works of fiction to the daily barrage of advertising and propaganda. It encompasses descriptions and interpretations of those situations that are considered archetypal rather than representative. These descriptions furnish us with patterns of behaviour. We consider these patterns real, even if we know the situations have been constructed in order to make a particular pattern very clear and evident.

projected reality – the vernacular reality of the future.

today there is no hierarchical relationship between science and technology. Science is not the mother of technology. Science and technology today have parallel or side-by-side relationships; they stimulate and utilize each other. It is more appropriate to regard science and technology as one enterprise with a spectrum of interconnected activity than to think of two fields of endeavour – science as one, and applied science and technology as the other.

Today scientific constructs have become the model of describing reality rather than one of the ways of describing life around us.

As a consequence there has been a very marked decrease in the reliance of people on their own experience and their own senses.

the downgrading of experience and the glorification of expertise is a very significant feature of the real world of technology.

the message-transmission technologies have created a host of pseudorealities based on images that are constructed, staged, selected, and instantaneously transmitted.

Media images seem to have a position of authority that is comparable to the authority that religious teaching used to have.

As a community we should look at what the new technologies of message-forming and -transmitting do to our own real world of technology and democracy. This is why I have a sense of urgency to map the real world of technology, so that we might see how in our social imagination the near is disadvantaged over the far. We should also understand that this does not have to be so.

Viewing or listening to television, radio, or videos is shared experience carried out in private. The printing technologies were the first ones that allowed people to take in separately the same information and then discuss it together. Prior to that, people who wanted to share an experience had to be together in the same place – to see a pageant, to listen to a speech.

there are new, high-impact technologies and these produce largely ephemeral images. The images create a pseudocommunity, the community of those who have seen and heard what they perceive to be the same event that others, who happened not to have watched or listened, missed for good.

Since normally only a fraction of the pseudocommunity become members of the real and active community, the possibility of forming such groups may be greater in the case of broadly based international concerns that are “the far” for most viewers than in the case of specific problems of “the near.”

There is a lot of talk about global crises and “our common future.” However, there is far too little discussion of the structuring of the future which global applications of modern technologies carry in their wake.

Whenever human activities incorporate machines or rigidly prescribed procedures, the modes of human interaction change.

technical arrangements reduce or eliminate reciprocity. Reciprocity is some manner of interactive give and take, a genuine communication among interacting parties.

For example, a face-to-face discussion or a transaction between people needs to be started, carried out, and terminated with a certain amount of reciprocity. Once technical devices are interposed, they allow a physical distance between the parties. The give and take – that is, the reciprocity – is distorted, reduced, or even eliminated.

reciprocity is not feedback. Feedback is a particular technique of systems adjustment. It is designed to improve a specific performance.

Reciprocity, on the other hand, is situationally based.

It is neither designed into the system nor is it predictable. Reciprocal responses may indeed alter initial assumptions. They can lead to negotiations, to give and take, to adjustment, and they may result in new and unforeseen developments.

these technologies have no room for reciprocity. There is no place for response. One may want to speculate for a moment whether this technological exclusion of response plays a part in the increasing public acceptance of the depiction of violence and cruelty.

technologically induced human isolation:

even in the universe of constructed images and pseudorealities there still exists a particular enclave of personal directness and immediacy: the world of the ham-radio operator. It is personal, reciprocal, direct, affordable – all that imaging technology is not – and it has become in many cases a very exceptional early warning system of disasters. It is a dependable and resilient source of genuine communication. I am citing this example so as not to leave the impression that the technological reduction of meaningful human contact and reciprocal response is inherently inevitable.

the growth of prescriptive technologies provided a seed-bed for a culture of compliance.

Technology has been the catalyst for dramatic changes, in the locus of power.

Any task tends to be structured by the available tools.

Tools often redefine a problem.

The real world of technology is a very complex system. And nothing in my survey or its highlights should be interpreted as technological determinism or as a belief in the autonomy of technology per se. What needs to be emphasized is that technologies are developed and used within a particular social, economic, and political context. They arise out of a social structure, they are grafted on to it, and they may reinforce it or destroy it, often in ways that are neither foreseen nor foreseeable. In this complex world neither the option that “everything is possible” nor the option that “everything is preordained” exists.

A change in one facet of technology, for instance the introduction of computers in one sector, changes the practice of technology in all sectors. Such is the nature of systems.

I much prefer to think in terms not of systems but of a web of interactions.

When women writers speak about reweaving the web of life, they mean exactly this kind of pattern change. Not only do they know that such changes can be achieved but, more importantly, they know there are other patterns. The web of technology can indeed be woven differently, but even to discuss such intentional changes of pattern requires an examination of the features of the current pattern and an understanding of the origins and the purpose of the present design.

1740s, a very influential book was published by La Mettrie called L’Homme-machine

the discovery of the body as object and instrument of power led to a host of regimes of control for the efficient operations of these bodies, whether they were the efficiencies of movement, the measured intervals of the organization of physical activities, or the careful analysis and timing of the tasks bodies could perform, usually in unison.

It was into this socially and politically well prepared soil that the seeds of the Industrial Revolution fell. The factory system, with its mechanical devices and machines, only augmented the patterns of control. The machinery did not create them.

To plan with and for technology became the Industrial Revolution’s strongest dream. The totally automated factory – that is, a factory completely without workers – was discussed by Babbage and his contemporaries in the early nineteenth century.

While the eighteenth century exercised control and domination by regarding human bodies as machines, the nineteenth century began to use machines alone as instruments of control.

For the British manufacturers, machines appeared more predictable and controllable than workers. The owners of factories dreamt of a totally controlled work environment, preferably without any workers. If and where workers were still needed, they were to be occupied with tasks that were paced and controlled by machines.

Industrial layout and design was often more a case of planning against undesirable or unpredictable interventions than it was of planning for greater and more predictable output and profit.

a clearly perceived loss of workers’ control and autonomy. It was not resistance to technology per se so much as an opposition to the division of labour and loss of autonomy that motivated the workers’ resistance.

What the Luddites and other groups of the period clearly perceived was the difference between work-related and control-related technologies.

somehow I find no indication that they realized that while production could be carried out with few workers and still run to high outputs, buyers would be needed for these outputs. The realization that though the need for workers decreased, the need for purchasers could increase, did not seem to be part of the discourse on the machinery question. Since then, however, technology and its promoters have had to create a social institution – the consumer – in order to deal with the increasingly tricky problem that machines can produce, but it is usually people who consume.

Technology has changed this notion about the obligations of a government to its citizens. The public infrastructures that made the development and spread of technology possible have become more and more frequently roads to divisible benefits. Thus the public purse has provided the wherewithal from which the private sector derives the divisible benefits, while at the same time the realm from which the indivisible benefits are derived has deteriorated and often remains unprotected.

The global environmental destruction that the world now has to face could not have happened without the evolution of infrastructures on behalf of technology and its divisible benefits, and a corresponding eclipsing of governments’ obligation to safeguard and protect the sources of indivisible benefits. Whether the task of reversing global environmental deterioration can be carried out successfully will depend, to a large extent, on understanding and enforcing the role and obligation of governments to safeguard the world’s indivisible benefits.

Prescriptive technologies are a seed-bed for a culture of compliance.

Many technological systems, when examined for context and overall design, are basically anti-people. People are seen as sources of problems while technology is seen as a source of solutions.

the “technological imperative.”

whatever can be done by technological means, will be done.

the need for a credible long-term enemy.

the changes that technology has brought to the part of citizens in war preparation and warfare. Just as fewer and fewer unskilled workers are needed in a modern technological production system, a country now has little practical need for raw recruits to operate its modern technological destruction system. Abandoning compulsory military service is not so much a sign of peaceful intentions as it is a sign of galloping automation.

Military service from citizens is no longer a prerequisite for war. What is a prerequisite is the compulsory financial service of all citizens, well before any military exchange begins.

Planning, in my sense of the word, originated with prescriptive technologies. As prescriptive technologies have taken over most of the activities in the real world of technology, planning has become society’s major tool for structuring and restructuring, for stating what is doable and what is not. The effects of lives being planned and controlled are very evident in people’s individual reactions to the impingement of planning on them. The real world of technology is full of ingenious individual attempts to sabotage externally imposed plans.

A common denominator of technological planning has always been the wish to adjust parameters to maximize efficiency and effectiveness.

holistic strategies are, more often than not, intended to minimize disaster rather than to maximize gain.

planning as part of the strategy of maximizing gain, and coping as central to schemes for minimizing disaster.

the real world of technology denies the existence and the reality of nature.

the prediction of a senior official at IBM, in an article called “The Banishment of Paperwork.” He confidently forecast the total absence of paperwork in 1984: Computers, within two decades, would have become the sole medium of communication, while all that burdensome paper would have vanished from our desks.

Ivan Illich pointed out in his 1981 essay, Shadow Work, that prescriptive technologies, particularly those in the administrative and social-service sectors, produce the desired results only when clients – for instance, parents, students, or patients – comply faithfully and to the letter with the prescriptions of the system. Thus, advanced applications of prescriptive technologies require compliance not only from workers, but also from those who use the technologies or are being processed by them. Illich stressed the role of individual and group compliance by citizens in this process of making prescriptive technologies work.

as more and more of daily life in the real world of technology is conducted via prescriptive technologies, the logic of technology begins to overpower and displace other types of social logic, such as the logic of compassion or the logic of obligation, the logic of ecological survival or the logic of linkages into nature. Herbert Marcuse, in One Dimensional Man, speaks of this overpowering.

a “mechanical bride,” the term used by Marshall McLuhan to describe the relationship between car and owner.

It is aimed at creating an atmosphere of harmless domesticity around the new technology to ease its acceptance.

If one doesn’t watch the introduction of new technologies and particularly watch the infrastructures that emerge, promises of liberation through technology can become a ticket to enslavement.

The authors of this prognostication evidently assumed that the introduction of the sewing machine would result in more sewing – and easier sewing – by those who had always sewn. They would do the work they had always done in an unchanged setting. Reality turned out to be quite different. With the help of the new machines, sewing came to be done in a factory setting, in sweatshops that exploited the labour of women and particularly the labour of women immigrants. Sewing machines became, in fact, synonymous not with liberation but with exploitation.

What turns the promised liberation into enslavement are not the products of technology per se – the car, the computer, or the sewing machine – but the structures and infrastructures that are put in place to facilitate the use of these products and to develop dependency on them.

To recap: many new technologies and their products have entered the public sphere in a cloud of hope, imagination, and anticipation. In many cases these hopes were to begin with fictional, rather than real; even in the best of circumstances they were vastly exaggerated. Discussion focused largely on individuals, whether users or workers, and promised an easier life with liberation from toil and drudgery. Discourse never seemed to focus on the effects of the use of the same device by a large number of people, nor was there any focus on the organizational and industrial implications of the new technologies, other than in the vaguest of terms.

once a given technology is widely accepted and standardized, the relationship between the products of the technology and the users changes. Users have less scope, they matter less, and their needs are no longer the main concern of the designers. There is, then, a discernable pattern in the social and political growth of a technology that does not depend on the particular technical features of the system in question.

how teaching, research, and practice in most areas of science and technology follow essentially male patterns by being basically hierarchical, authoritarian, competitive, and exclusive.

Major facets of technology are related to prescriptive practices and thus to the development of instruments of power and control.

The great contribution of women to technology lies precisely in their potential to change the technostructures by understanding, critiquing, and changing the very parameters that have kept women away from technology.

What does it say about our society, when human needs for fellowship and warmth are met by devices that provide illusions to the users and profits to the suppliers?

as a response to loneliness, it seems to me deceitful and fraudulent.

the disregard that technical designers can have for the needs of operators. Typists not only got awkward machines, but they – and the telephone operators – also encountered the usual division of work that has become part of mechanization and automation. As the technologies matured and took command, women were left with fragmented and increasingly meaningless work.

The way of doing something can be “holistic” when the doer is in control of the work process. The way of doing something can also be “prescriptive,” when the work – whatever it might be – is divided into specified steps, each carried out by separate individuals. This form of division of labour, historically quite old and not dependent on the use of machines, is a crucial social invention at first practised in the workplace.

I hold that, in fact, we have lost the institution of government in terms of responsibility and accountability to the people. We now have nothing but a bunch of managers, who run the country to make it safe for technology.

I firmly believe that when we find certain aspects of the real world of technology objectionable we should explore our objections in terms of principle, in terms of justice, fairness, and equality. It may be well to express concerns as questions of principle rather than to try to emphasize merely pragmatic explanations – for instance, that objectionable practices may also be inefficient, inappropriate, or polluting. The emphasis on a pragmatic rationale for choice tends to hide the value judgements involved in particular technological stances.

When my colleagues in the field of cold-water engineering speak of “ice-infested waters,” I am tempted to think of “rig-infested oceans.” Language is a fine barometer of values and priorities. As such it deserves careful attention.

Let’s make a checklist to help in the discourse on public decision-making. Should one not ask of any public project or loan whether it: (1) promotes justice; (2) restores reciprocity; (3) confers divisible or indivisible benefits; (4) favours people over machines; (5) whether its strategy maximizes gain or minimizes disaster; (6) whether conservation is favoured over waste; and (7), whether the reversible is favoured over the irreversible?

redemptive technologies

the development and use of redemptive technologies ought to be part of the shaping of a new social contract appropriate for the real world of technology, one that overcomes the present disenfranchisement of people.

“protest and survive.”

“Let us understand, and on the basis of our common understanding, protest.” We must protest until there is change in the structures and practices of the real world of technology, for only then can we hope to survive as a global community.

If such basic changes cannot be accomplished, the house that technology built will be nothing more than an unlivable techno-dump.

many such communications have to be regarded as messages looking for receivers.

I have never liked the term cyberspace because it neither describes a space nor does its current use reflect the concepts of control and systems-design implied in the term cybernetics, after which the term cyberspace was patterned.

I got into real trouble once, when I suggested that the Internet could be looked at as one giant dump: people and organizations dump information in bits and pieces; they also retrieve whatever is of use and interest to them. What is found by the scavengers depends on where they dig, what was dumped, and what is considered useful or relevant enough to be retrieved. There is no pattern in the digging or reassembly, no one path through the dump, no compulsory reference to the source of the bounty. And since the Internet contains information rather than stuff, the same treasures, or junk, can be retrieved again and again.

measured time and experienced time.

The role of asynchronicity in unravelling social and political patterns without apparent replacement with other patterns cannot be overestimated.

Many people have experienced the asynchronous forms of labour and have felt their consequences; the impact often includes the lack of work-related solidarity and self-identification that can have profound social implications.

Women in particular have often treasured the opportunity to work asynchronously – getting a bit of writing done when the kids are asleep, sneaking in a slice of private life into their tightly structured existences. But I see a real difference between supplementing a rigidly patterned structure with asynchronous activities and substituting synchronous functions by asynchronous schemes.

The inhabitants of the City of Bits are still real live human beings, yet nature, of which humans are but a small part, appears to have no autonomous space in the bitsphere. There are no seasonal rhythms, no presence of the land nor the ebb and flow of individual lives, even though these are the synchronous patterns that have shaped culture and community throughout time and, through their patterns, have provided a source of meaning to people for many generations.

the difference between a mechanism and an organism.

the biosphere and the bitsphere

Within the biosphere, human beings have attempted to codify and transmit their understanding of the world around them by ordering their experiences into general schemes and structures. Myths, religion, and science have endeavoured to transmit knowledge and experience so ordered as to convey sequence and consequence as ordering principles. Learning to recognize such ordering principles has been traditionally part of growing up in a given society. Ordering schemes help us to evaluate and interpret new knowledge and experience.

One of the most striking attributes of the bitsphere, on the other hand, is the absence of structure.

Unfortunately, the new technologies have entered the realm of education largely because they were regarded as production improvements, promising better products and faster or bigger production runs, and not because they were deemed to offer enrichment to the soil. Thus it is not surprising that the electronic classroom raises the same types of problems and exhibits the same social and political difficulties that one encounters in the realm of work or governance in the real world of the new technologies.

the displacement of people by devices

When external devices are used to diminish the need for the drill of explicit learning, the occasion for implicit learning may also diminish.

As considerations of efficiency and cost-cutting shift the balance of synchronous and asynchronous classroom activities, the balance of explicit and implicit learning is changing. While the pool of information available to the students may increase, the pool of available understanding may not. This has considerable consequences for social cohesion and peace and deserves careful attention.

how and where, we ask again, is discernment, trust, and collaboration learned, experience and caution passed on, when people no longer work, build, create, and learn together or share sequence and consequence in the course of a common task?

where, if not in school and workplace, is society built and changed?

the practice of democratic governance is in grave question and the advancement of social justice and equality appears stalled in a labyrinth of random transactions. This does not have to be so. The interface of the biosphere and the bitsphere not only poses problems and precipitates crises but it offers new opportunities to advance the common good. It will take the collective thought, moral clarity, and strong political will of many people to move towards this goal rather than away from it.

This is a collective endeavour that no group or conglomerate can do on its own. Most of our social and political institutions are both reluctant and ill-equipped to advance such tasks. Yet if sane and healthy communities are to grow and prevail, much more weight has to be placed on maintaining the non-negotiable ties of all people to the biosphere.

Audrey Watters

This Is Not Fine

Credits: KC Green

Audrey Watters

Notes from Questionnaire

12 min read

Here are my notes from Evan Kindley's new book Questionnaire. I think this is an incredibly important book for those interested in the histories of educational testing as well as the futures of learning analytics. Hopefully I'll carve out time to write a longer review:

The fact must be faced: for many of us, under the right circumstances, filling out forms is fun.

The word “questionnaire” appears first in French, in its modern sense, in the mid-nineteenth century. Some of the word’s early usages suggest persistent associations with the Catholic practices of catechism and confession, as well as governmental inquisition and interrogation. (In the eighteenth century, the term “questionnaire-juré” described a torturer.)


the importance of blank “job-printed” forms to the rise of bureaucracy and the consolidation of the new capitalist economy in the nineteenth and twentieth centuries.

Blank forms, Gitelman argues, are the ultimate bureaucratic objects: bland, impersonal, utilitarian documents designed to help officials process and sort large groups of people.

The history of the questionnaire is the history of attempts to make interacting with such dreary objects more and more fun for more and more people.

people can actually enjoy interrogation by questionnaire,

The history of the questionnaire is thus also a history of psychological manipulation, and of salesmanship: a series of attempts to find the magic words that will open the heart of the public.

“they will be used only as data for general statistical conclusions.”

On Galton:

* a raft of dubious generalizations.
* a definite methodological success.

More than any other single scientific work, English Men of Science established the self-report questionnaire in the United Kingdom as a legitimate instrument for the collection of empirical data.

In his work on heredity, he took the first steps toward solving a major practical problem for the social sciences: how to convince people to overcome their disinclination to provide personal information about themselves.

he exploited financial instincts, offering cash prizes

The emphasis was on the generation of family heirlooms rather than of experimental data.

With this stratagem, Galton invented the baby book, a popular genre that continues to flourish today.

A combination of rationalism, progressivism, and narcissism drove the early development of the questionnaire.

The Victorians loved questionnaires because they pandered to their faith in science, their earnest desire to improve the world around them, and – most important, perhaps – their intense interest in the quotidian details of their own lives.

the mania for anthropometric questionnaires bears a curious similarity to another contemporary trend among the British middle class of the late nineteenth century: the vogue for confession albums, which were a popular parlor game in the 1870s and later.

Like the personal details that circulate on today’s social media, these revelations were not true confessions but symbolic tokens meant to be shared.

“le questionnaire de Proust.”

“the advantage of questionnaires, from a financial point of view, was that not one of the celebrities who agree to submit [answers] expect to be paid.” [Note: this is so very similar to the extraction of data via quizzes today. Lots of pageviews; little to no payout for participants and/or writers]

In 1905, the French psychologists Alfred Binet and Théodore Simon developed a scale to measure the intelligence of children aged three to twelve. Lewis Terman, a psychology professor at Stanford, revised it in 1916 to create the Stanford-Binet Intelligence Scales, which in turn provided the model for the Scholastic Aptitude Test (SAT), the first national standardized intelligence test in the United States, introduced in 1926.

Tests like the Binet-Simon, the Stanford-Binet, and the SAT, by contrast, were used for evaluative purposes, and thus had an immediate impact on the life chances of those who took them.

The immediate aims of the Alpha and Beta examinations were pragmatic: they allowed the Army to identify exceptional individuals who might be suited for officer training, and consign the lowest-scoring recruits to labor battalions and other menial posts. But the project also enabled psychologists to amass an unprecedented amount of anthropometric data on the American population.

The Alpha tests were far from what we would now call “culture-blind”: that is, what they measured was not “intelligence” (whatever that means) so much as familiarity with a specific cultural context.

In scientific terms, as measurements of intelligence or ability, such tests are virtually useless. Nonetheless, the study’s findings were almost immediately weaponized by the anti-immigrant nativist movement.

Post-Traumatic Stress Disorder (PTSD). The severity of the epidemic led the Army to experiment with more rigorous screening of recruits for psychological instability, the governing assumption being that only the mentally weak would “crack up” under the strain of combat.

the Woodworth Psychoneurotic Inventory,

Industrial, or “applied,” psychology came into its own as a field, taking its place alongside Frederick Winslow Taylor’s Scientific Management as a major influence on the culture of capitalist production.

Humm-Wadsworth Temperament Scale

testing was a management science, and, like Taylorism, it was often put to antiunion purposes.

Bernreuter Personality Inventory, the Worthington Personal History Blank, the Thurstone Personality Schedule, the Adams-Lepley Personal Audit, the Allport Ascendance-Submission Reaction Study, the Guilford-Zimmerman Temperament Survey

The Organization Man

Whyte’s book was a national bestseller, and it inaugurated a vicious cultural backlash against mandatory personality testing. Psychological tests at work began to seem like the epitome of totalitarian thought policing, and were thus susceptible to attack from both the left and the right as the 1960s wore on.

The 1964 Civil Rights Act made companies reluctant to use tests that might be shown to have a systematic bias against minorities. In 1966, Senator Sam J. Ervin Jr. of North Carolina convened a hearing on Privacy and the Rights of Federal Employees that specifically targeted personality inventories as an unacceptable invasion of privacy.

the steady drumbeat of scientific skepticism about its basic validity and value. Intelligence testing had been controversial from the beginning: it was opposed especially vociferously by anthropologists

confirmation bias.

In Mischel’s view, then, the fundamental premise of personality assessment – that individuals possess core psychological traits and attributes that remain consistent across different situations, contexts, and life stages – was simply wrong. All previous attempts to “test” for personality were based on a fundamental fallacy about human behavior, and should therefore be thrown out.

In the 1940s, Myers read an article in Reader’s Digest about the Humm-Wadsworth Temperament Scale entitled “Fitting the Worker to the Job.” The MBTI, modeled on the Humm and other industrial “people-sorters” but grounded in Jungian type theory as opposed to the categories of eugenic psychiatry, was conceived as a career-placement tool that would help employers identify the strengths of job candidates and individuals find their proper line of work.

Beginning in 1962, it was carried by the Educational Testing Service, the publishers of the SAT

Of all of the personality tests developed in the twentieth century – and there have been hundreds – the MBTI is the closest to the language of pop psychology and self-help.

“The Indicator’s unfailingly positive tone blends seamlessly … with our society’s emphasis on promoting self-esteem,” the journalist Annie Murphy Paul has noted.

Oxford Capacity Analysis (OCA)

Scientology founder L. Ron Hubbard

In the mid–1950s, publishers of personality tests began to require their customers to be accredited by the American Psychological Association, thus cutting Hubbard off from access to more legitimate scientific instruments. But it also allowed the church to shape the test to its own institutional requirements.

Ultimately, though, the responses given to these particular questions don’t matter very much, as it appears to be impossible to achieve a “good” score on the OCA.

the test was rigged to produce a negative result

Where the Myers-Briggs test flatters and protects those who take it, revealing to them their special psychological gifts, the Oxford Capacity Analysis is designed to tear your personality down, in order to rebuild.

“Your opinion of you,” then, is that you are a problem only Scientology can solve.

With the birth of the scientific opinion poll, Gallup attests, the long search for that Holy Grail of representative democracy – an accurate gauge of popular opinion – was finally reaching an end.

the semantics of interrogation

The whole project of opinion research is predicated on the assumption that people can tell you what they really think.

The rise of the personal questionnaire broadly parallels the rise of women’s literacy, which soared across class divisions in the late nineteenth and early twentieth centuries.

Questionnaires, then, could be mechanisms of psychological control, but also portals to self-reflection, instruments of what the women’s movement of the 1970s would call “consciousness-raising.”

Popenoe founded the American Institute of Family Relations (AIFR), the nation’s first marriage clinic, in Pasadena, California, in 1930.

“the JTA’s statistical assumptions and assessment protocols allowed men much greater deviation [from the norm] than women”: even the math behind the JTA was sexist.

the fundamentally conservative enterprise of marriage counseling made some accidental contributions to women’s liberation nonetheless.

Quizzes played an important role in defining this hypothetical individual.

To take a psychological test is to put your trust in science (or pseudoscience, as the case may be). To take a quiz is to put your trust in an omniscient, benevolent magazine editor. Both of them involve a sort of quasi-religious faith. It’s a type of faith based on familiarity, which can often shade into contempt without undermining the basis of the faith, and it has been essential to the development of passionate online fan bases for quizzes, personality tests, purity tests, and other questionnaire-based forms. Even if you don’t share your answers with anyone, you’ve given up a part of yourself to a higher authority: you have confessed.

It allowed questionnaires, freed from any requirement to be accurate, to be fun.

the questionnaire as the basic building block of their information architecture.

In the mid–1950s, at the apex of his fame as a marriage counselor, Popenoe collaborated with computer scientists at Remington Rand on the world’s first computer dating program.

eHarmony, founded in the year 2000 by Neil Clark Warren, a Christian marriage counselor in Popenoe’s adopted hometown of Pasadena. In its early years, it was affiliated with Focus on the Family’s James Dobson, who got his start in the 1970s as one of Popenoe’s assistants at the AIFR.

All computer dating programs are built on a quasi-eugenic premise: that the fitness of a potential mate can be determined objectively, thus allowing “inappropriate” sexual partners to be screened out.

Without subscribing to their racial theories, they share with Popenoe and Galton a belief that human qualities can be quantified and that, once this data is collected and correlated, a better social order can be engineered.

it is becoming increasingly clear that they do far more with their users’ personal data than use it to set them up. A case in point is OkCupid. While it is far from the most successful dating site in raw numbers, OkCupid has had perhaps the greatest influence on the style of contemporary Internet culture at large.

Data is data, and when enough of it is compiled, patterns of some kind will inevitably emerge.

That raw data is used to match OkCupid’s customers, but it’s also sold (as Paumgarten reported in 2011) to academic social scientists, and probably to other outside parties as well.

none of the people represented in this data set agreed to be part of a study, nor did they sign the informed-consent agreements that are prerequisites for any legitimate research on human subjects in the social sciences.

“siren server”

a powerful computer network with exclusive access to a specific type of data (in this case, answers to personal questions relating to dating preferences) and proprietary sorting algorithms to help make sense of it all.

the quality of the typical answer matters less than the quantity of total answers.

quizzes are still a consistent traffic driver for BuzzFeed

BuzzFeed has denied that it’s selling the user data generated by quizzes, or even collecting it beyond basic metrics like how many people have taken the quiz, whether they share it, and their final results.

As soon as you land on any BuzzFeed page, Barker notes, custom variations to the site’s Google Analytics code allow it to see whether you’ve arrived via Facebook, your age, gender, the country you’re currently in, and how many times you’ve shared BuzzFeed content in the past. In the particular case of quizzes, the site also records each “event” (i.e., each click on the page). “If you click ‘I have never had an eating disorder’” (an actual checklist item from the “How Privileged Are You?” quiz) “they record that click,” Barker writes. This means that, in theory at least, BuzzFeed is in possession of some extraordinarily sensitive information about their users.

they have both the technological capability and a strong economic incentive to do so.

The politics of Big Data are still up for grabs, though it’s difficult to believe that things won’t ultimately tilt in the direction of management rather than labor.

Audrey Watters


Credits: Jack Kirby

Audrey Watters

Notes from When We Are No More

28 min read

I'm working on a keynote with the phrase "Memory Machines" in the title, as this book has me thinking about the importance of personal memory, cultural memory, knowledge-making, and ed-tech.

Over forty thousand years ago, humans discovered how to cheat death. They transferred their thoughts, feelings, dreams, fears, and hopes to physical materials that did not die. They painted on the walls of caves, carved animal bones, and sculpted stones that carried their mental and spiritual lives into the future. Over generations we have created sophisticated technologies for outsourcing the contents of our minds to ever more durable, compact, and portable objects.

The carrying capacity of our memory systems is falling dramatically behind our capacity to generate information.

Every innovation in information technology, going back to ancient Mesopotamians’ invention of cuneiform tablets, precipitates a period of overproduction, an information inflation that overpowers our ability to manage what we produce.

Having more knowledge than we know what to do with while still eager to acquire more is simply part of the human condition, a product of our native curiosity.

massive machines that create, store, and read our memory for us.

What this mastery looks like and how we achieve it is today’s frontier of knowledge.

Digital memory is ubiquitous yet unimaginably fragile, limitless in scope yet inherently unstable.

the Declaration of Independence in full Track Changes mode

One data-storage company estimates that worldwide, web data are growing at a rate that jumped from 2.7 billion terabytes in 2012 to 8 billion terabytes in 2015. But nobody really knows—or even agrees how we should be counting bits.

The question had always been: “What can we afford to save?”

The question today is: “What can we afford to lose?”

We are replacing books, maps, and audiovisual recordings with computer code that is less stable than human memory itself.

Code is rapidly overwritten or rendered obsolete by new code.

Digital data are completely dependent on machines to render them accessible to human perception.

Two reigning misconceptions stand in the way of a happy ending to our experiment in reimagining memory for an economy of digital abundance. First is the notion that today’s abundance is a new phenomenon, unseen in human history, which began with computers and is driven by technology.

That was the radically transformative idea that the universe and all that exists is no more and no less than the material effect of material causes.


Four inflection points in particular precede and enable the scientific advances of the nineteenth century that inaugurated today’s information inflation: (1) the development of writing in Mesopotamia for administrative and business purposes, together with professional management of the collections; (2) the ancient Greeks’ development of libraries as sites for the cultivation of knowledge for its own sake; (3) the Renaissance recovery of Greek and Roman writings and the invention of movable type, which together helped to propel the West into the modern age; and (4) the Enlightenment of the eighteenth century, which refashioned knowledge into an action verb—progress—and expanded the responsibilities of the state to ensure access to information.

when science moved from the Age of Reason to the present Age of Matter

The computer is not an accurate model for the brain.

Memory is the entire repertoire of knowledge an animal acquires in its lifetime for the purpose of survival in an ever-changing world—essentially everything it knows that does not come preprogrammed with its DNA. Given the complexity of the world, memory takes a less-is-more approach.

We keep our mental model of the world up to date by learning new things. Fortunately, our memory is seldom really fixed and unchangeable.

data is not knowledge, and data storage is not memory.

facts are only incidental to memory.

memory is not about the past. It is about the future.

Human memory is unique because from the information stored in our brains we can summon not only things that did or do exist, but also things that might exist. From the contents of our past we can generate visions of the future.

This temporal depth perception is unique in Nature.

As we consider memory in the digital age, we will see how our personal memory is enhanced, and at times compromised, by the prodigious capacities and instantaneous gratifications of electronic information.

Collective memory—the full scope of human learning, a shared body of knowledge and know-how to which each of us contributes and from which each of us draws sustenance—is the creation of multiple generations across vastly diverse cultures.

culture, a collective form of memory, we create a shared view of the past that unites us into communities and allows large-scale cooperation among perfect strangers.

We recognize ourselves in the irrational yet compelling desire to breach the limits of time and space, to bear witness to our existence, and to speak to beings distant in time and space.

self-awareness, symbolic thought, and language.

Creation myths usually feature people who are dangerously curious.

in Paradise, there is no curiosity.

Culture is the collective wit by which we live.

Through a curious interbreeding of biblical theology and Greek thought, the West gradually stopped seeing knowledge as a threat to reverence and instead began to cherish it as a form of reverence in itself.

writing remained intrinsic to the process of accounting for goods and services and making public—publishing—that information.

cuneiforms represent a more powerful innovation in the deep history of memory than a technical solution. This innovation made the invention of writing not only possible, but also almost inevitable. It led to the creation of objects as evidence, capable of transcending the frailty of human memory and thwarting the temptation to shade the truth by holding people accountable.

objective witnesses that cannot lie or forget.

to manage economic assets, secure control over a ruler’s possessions, and extol his power.

The proliferation of tablets with valuable information led to the vexing questions of storage, security, preservation, and cataloging—an early instance of a Big Data management problem.


As a mechanism of adaptation, culture is far more efficient than biology. Genetic material is more or less fixed at the time of conception. The genome does not acquire new information from an animal’s experiences of life. Learning modifies the nervous system of an animal, but not its DNA.

Before globalization, there were thousands of ways of being human, each with its own language, dress, kinship systems, counting methods, and food ways.

Because we are by nature culture-making creatures, distinctions we like to draw between what is natural and what is artificial or man-made are illusory at best.

Any feeling of there being a gap between humans and Nature is itself a by-product of culture.

Extending the reach and longevity of knowledge became a distinct competitive advantage not only over animals, but also over rival Homo sapiens.

The strategic alliance between knowledge and power, record keeping and administering power

Autobiographical memory gives us a sense of who we are and provides continuity as we age.

Memory begins to focus less on learning new things than on integrating all that we have experienced and known to provide a sense of continuity between past and present selves.

This is memory’s task of retrospection, to integrate the knowledge that we have, to impute a sense of cause and effect to the events in our lives, and to offer a sense of meaning.

Culture provides the large-scale framework for memory and meaning. It aids in the creation of new knowledge, but it also acts as a filter that over time determines what is of long-term value from the social perspective.

Natural memory is designed to be labile, flexible, easily modified or written over to suit new environments.

Artificial memory is designed to be stable, fixed and unchanging, slow, and resilient, freeing up mental space for individuals to learn new things.

We are all born into a culture, specific to a time and place, that provides a wealth of ready-made knowledge and know-how for us to use in making our way in the world without delay.

how much of our individuality emerges from our native culture,

shared memory is the midwife of innovation and, paradoxically, accelerates the change in our environment.

Collective memory and the sheer power of knowledge accumulated over millennia both push us ahead and pull us from behind.

In periods of great instability, the past becomes more useful as we increasingly tap into the strategic reserve of humanity’s knowledge. Yet it is at moments like this when the past is most easily lost.

The cultural amnesia induced by their complete loss was not Caesar’s doing, but the work of many generations, Christian and Muslim, who felt no responsibility to care for pagan learning.

the astounding efficiencies of ink-on-paper writing were bought at the price of durability.

in terms of sheer durability, the technology for writing reached a peak five thousand years ago and has been going downhill ever since.

By the fifth century B.C., the Greeks had embarked on a novel enterprise, the concerted cultivation of knowledge for its own sake. In doing so, they made three contributions to the expansion of human memory whose effects are still playing out today. The first is the creation of mnemonic or memory techniques that tap into a profound understanding of how memory relies on emotion and spatialization, thereby predating contemporary neuroscience’s findings by twenty-five hundred years.

The second is the creation of libraries as centers of learning and scholarship, not primarily storage depots for administrative records. And third is recognition of the moral hazards of outsourcing the memory of a living, breathing, thinking, and feeling person to any object whatsoever.

By cultivating knowledge for its own sake, they raised the pursuit of beauty and harmony to a level as high as, or higher than, the pursuit of know-how to solve pragmatic problems.

two fundamental discoveries about how the brain forms memories through emotion and spatial thinking. Prizing the art of rhetoric as a civic virtue—democratic citizenship in action, as it were—the Greeks had to perform feats of memorization and recitation.

memory palaces.

part-for-the-whole substitution is termed “synecdoche,”

For reasons we do not fully understand, memory can be reinforced and amplified by using physical objects, whether it is a memory stone, a series of knots tied on fingers or into elaborate quipu, or merely an extension of the body itself.

We do not know why, let alone how, moderate physical movement stirs up the archives of the mind along with the circulation of the blood.

We may refer to the Internet as cyberspace, but its lack of material substance has distinct disadvantages when it comes to finding our way in its dense forests of data. We understand so very little about how real physical space affects memory and vice versa.

Context is spatial.

Until roughly 2000 A.D., if someone wanted access to information, they had to go to where the books and journals, maps and manuscripts were—the library.

a collection that supported scholarship, embedded in a temple to learning.

The collections were copied and managed by experts, studied and edited by other experts. The scholars were paid for their intellectual labor,

It does not matter how comprehensive and well tended a collection may be. If an item cannot be located on demand because it is out of order, misplaced, or incorrectly cataloged, it effectively does not exist.

The librarians of Alexandria could afford to solve the scroll-management problem by throwing a lot of cheap labor at it. But a better solution was a technically advanced format—the codex.

Until the present age, managing physical objects was the only way we managed knowledge.

an imperial library

As depositories of human memory, libraries became the symbol of man’s attempts to master the world through the gathering of all knowledge. No library was quite as ambitious as the one in Alexandria. It is the essential model for the library in the digital age.

according to the written testimony of Plato, Socrates warned that the invention of writing would lead to ignorance and, ultimately, the death of memory.

Once knowledge is transferred to a piece of paper, then it essentially leaves us and with that, Socrates argues, we no longer feel responsible for remembering it.

For Socrates, remembering is a moral action, touching the very substance of our being.

The art of memory was taught as a species of performance,

The very foundation of memory itself was understood to be emergent and performative—not fixed and forever, but coming into being under specific circumstances.

Expanding the scope of knowledge above and beyond a certain scale makes it impossible to achieve the single thing he thought mattered in life: to know thyself.

Fundamental to today’s anxiety about the future of memory is the lurking awareness that our recording medium of choice, the silicon chip, is vulnerable to decay, accidental deletion, and overwriting.

Without preservation, there is no access.

With every innovation in information technology that produces greater efficiency by further compressing data, librarians and archivists begin a frantic race against time to save the new media, inevitably more ephemeral.

By 1500, a mere four decades after printing presses began operations, between 150 and 200 million books flooded the circulation system of European culture.

He called his prose pieces essais, meaning attempts, tests, or perhaps experiments.

he shows us that sometimes the best way to understand ourselves is to reveal ourselves to others,

No institution failed more spectacularly than the papacy,

the papacy’s religious authority and reputation remained fast among the faithful until they encountered detailed (often printed) reports of moral corruption in the papal court, frequently accompanied by salacious drawings

People became disillusioned with the clergy at all levels. But they did not become less religious. On the contrary, it was an inflammation of religious passion that led to the reformation, rather than the dying off, of Christian faith.

The presses had destroyed the possibility of monopolizing channels of communication.

It took a few generations before people and parties became adept at controlling the presses so they could control the message.

he created his audience by giving them something new, something they did not even know they wanted.

As authorities and institutions fail, we are forced to decide for ourselves which sources are trustworthy and which are not. The question of what to believe becomes, almost imperceptibly, a question of who to believe.

A new genre was invented—the newspaper—to meet the growing appetite for novelty, information, and gossip.

“Enlighten the people generally, and tyranny and oppressions of body and mind will vanish like evil spirits at the dawn of day.”

The first libraries were collections of religious books, packed in the luggage of religious pilgrims from the Old World.

they were read with one goal in mind—personal salvation.

For Jefferson, the goal of reading was not salvation but freedom.

Jefferson was more an acquisitor than an annotator.

America’s inane tension between nostalgia and utopian futurism,

private collections are the vanguard of our collective memory, but their value is realized only when they pass into the public sphere.

institutions are important for functions that must persist over long durations of time. Their job is to slow us down, to add friction to the flow of thought, foster inertia, and carve out from the fleeting moment a place for deliberation.

why collectors are so valuable a part of an information ecosystem. The faster the present moves, the more valuable they become. Collectors historically have acted as the first-line defense against the physical loss of our cultural legacy. They collect and curate the artifacts of knowledge on our behalf. While the motives of individual collectors can vary between the poles of intellectual curiosity and personal vanity, great collectors have some larger purpose they wish to accomplish and to which they dedicate enormous amounts of time and treasure. They are the ones who keep the strategic reserve of memory rich, saving various fossils of extinct cultures and ensuring that the collective memory of mankind does not become a monoculture.

Before the Enlightenment, the idea of a right to access, let alone to a universal collection, would have been nonsensical. Now it is the default expectation.

The technologically advanced, data-rich world we live in today all devolves from one central discovery made in the nineteenth century: The universe was created in deep time, extends infinitely across deep space, and leaves a record of its own history in matter. The material universe is itself an archive and the Earth writes its autobiography in matter.

the enthusiastic pedantry of the graduate student.

Jefferson saw the category of Memory or History encompassing all matters intrinsic to the natural world and the result of natural processes.

What is past has now become prologue.

the power of predictions inherent in materialism that captured the public and scientific minds alike.

In hindsight we see Darwin’s proposal that human beings evolved from primates as singularly traumatizing.

As the universe got bigger, we got smaller.

The disestablishment of the church meant that the pursuit of science and learning was protected behind a cordon sanitaire from sectarian battles.

In a nation where there were many different and often competing religious persuasions, liberating the pursuit of knowledge from religious oversight seemed eminently practical as well as self-evidently moral.

From the perspective of memory, the most consequential effect of embracing materialism is its unquenchable appetite for information in its smallest increments—single data points—and as many of them as possible.


Evidence, in other words, is information available to all without prejudice.

It is not esoteric, subjective, or privileged information. Even if not literally to be introduced at court, information has forensic value when it is reliable, publicly available, and authentic—that is, being what it purports to be, not a false representation.

What counts for evidence is always culturally determined, as is its interpretation.

If philosopher is a term “too wide and lofty” for the likes of the assembled, then “by analogy with artist, we may form scientist” as one who is “a student of the knowledge of the material world collectively.”

Beginning in the 1830s, new technologies appear in rapid succession: image capture (the first daguerreotype was taken in 1839); sound recording (the first recording of a human voice was made in 1860 by Édouard-Léon Scott de Martinville and the first playback machine by Thomas Edison in 1877); and X-rays (discovered by Wilhelm Röntgen in 1895).

The nineteenth century was marked by a series of crises around physical and intellectual control of all the evidence streaming in. Emboldened by the dream of accelerating the rate of human progress and well-being, expanding our control over the natural world, and freeing ourselves from drudge labor, we went to work on documenting the world. We built more infrastructure to manage the documents, supported the growth of highly specialized fields of knowledge to keep pace with the incoming data, and trained cadres of skilled professionals who in turn spontaneously differentiated themselves, like Darwin’s finches developing differently shaped beaks. Technologies and tools coevolve with the ideas and cultural practices of their users. And then the users outgrow them and want more. Science itself cannot advance without ever finer tools of observation, measurement, experimentation, and the communication of these results to other scholars. (The World Wide Web was devised to speed communication of research results among collaborating scientists at distant sites.)

By the 1840s the forensic imagination had already penetrated popular culture. In 1841, Edgar Allan Poe published the first story of detection, a “tale of ratiocination.”

Their zealous, even ascetic, dedication and single-mindedness of purpose became the hallmark of the professional man (and eventually woman). Immersed in the data-dense environment of a crime scene, Sherlock Holmes always brought laser-like focus and purpose.

When Karl Marx described the “alienation of labor,” he was not just talking economic theory, but also of the increasing perception that laborers were losing a sense of autonomy. As we outsource more of the most intimate part of ourselves—our personal memory and identity—to computer code, the fear of losing our autonomy—the alienation of our data, so to speak—increases because in the digital age, only machines can read our memory and know what we know at scale. As we gain mastery over our machines, this anxiety will lessen. But it will never go away, for the trade-offs we make between our appetite for more knowledge and our need for autonomy and control will continue to keep us on the alert for unintended consequences.

Emboldened by the Enlightenment cult of reason, we saw curiosity no longer as a vice, but as a civic virtue.

The first failure is the interruption of long-term memory formation.

The second failure is the loss or disintegration of memory.

what we perceive in any moment is a combination of real-time perception and stored information, our memory of the world. No creature is able to process enough information in real time to react appropriately to events as they transpire.

Living creatures come with preprogrammed memory, the genome, that encodes the history of the species and provides full instructions on how to become an ant if you are born with ant genes, a marmot if with marmot genes, a human if with human genes.

One of the breathtakingly simple advantages of the cuneiform, scroll, or printed page was that the memories inscribed on them were not easily changed, overwritten, or erased. On the contrary, these durable objects acted in exactly the opposite way our brains work. If kept in reasonably good physical shape, the words and images on a piece of paper would not change one whit for hundreds of years, no matter how many times they were read. Digital memory operates much more like biological memory. It is not really fixed and is easily overwritten or updated without leaving much trace of the changes made.

Deep learning and creativity, on the other hand, rely on the transformation of one day’s intake of perceptions to something sustained over time, embedded within a network of existing associations.

As man-made physical objects, all these artifacts of recorded knowledge—maps and photos, books and magazines—exist on the same scale as the humans who created them. The digital does not.

we know the effect of accelerated processing time and of binary thinking in our everyday lives: We have simultaneously more information and fewer means to sort its value.

“It is clear that the brain is much more like a social network than a digital computer.” Memory and learning are investigated now as products of “the graded, analog, distributed character of the brain.” It turns out that the computer is not an accurate metaphor for the brain.

Most of what we learn bypasses our awareness altogether and goes straight to the emotional and instinctual centers of the brain.

From the time of the Enlightenment onward, Western culture has deemed reason a stronger and more prestigious form of intelligence than emotion. Reason thinks slowly, not intuitively, and is effective as an instrument used to exert human will. But it is not the font of empathy and fellow feeling that social life requires. Computers, on the other hand, can reason with stunning speed, but they cannot simulate human decision-making processes with equal speed because they are not emotional. They can learn to simulate our behaviors by assessing the outcomes of past choices to make probabilistic predictions (“people who liked this also liked …”), and often that is good enough.

It is easy to read S.’s life story as a cautionary tale about the temptation to save all data because our capacity for digital storage keeps growing. The quantity of data we amass does not by itself add up to a grand narrative. Good memory is not data storage. Forgetting is necessary for true memory.

We have created a technologically advanced world that operates at a breathless pace, driven by a culture that demands more innovation (“disruption”). But science tells us that this disruptive and accelerated pace is self-defeating because our bodies and minds still operate at the same tempo as they did in ancient Sumer.

We are a culture obsessed with facts—the intrinsic value of a fact. But our brains do not share this reverence for facts.

The past is a plural noun.

memory is not about the past, it is about the future.

Our perception always tends toward prediction: It anticipates what it is seeing.

If the great feat of memory is to construct a model of the world that approximates reality closely enough—however we do it—the genius of imagination lies in using that model to create alternative orders and models of reality. Memory records the world as so; imagination transposes it into the key of as if, transforming experience into speculation. That is why to lose one’s memory means losing the future. Because imagination is memory in the future tense.

Imagination in adults is quite different from what we find in children. It is more akin to conjectural thinking, the ability to predict based on incomplete information.

a child’s model of the world is full of enchantment and driven by desires.

It is an intrinsic property of human memory,

As the scientist Richard Feynman said, “Science is imagination in a straitjacket.”

The loss of collective memory is as devastating to cultural identity as the loss of personal memory was to Murdoch. In the wars of the last century, both civil and international, the destruction of cultural memory became a central strategy in subduing civilian populations.

How will society respond flexibly, inventively, optimistically to the increasing pace of change if we lose our imagination?

Outsourcing more and more knowledge to computers will be no better or worse for us personally and collectively than putting ink on paper. What is important in the digital age, as it has been for the print era, is that we maintain an equilibrium between managing our personal memory and assuming responsibility for collective memory.

In the twenty-first century that means building libraries and archives online that are free and open to the public to complement those that are on the ground.

By extracting invaluable information from our use data, they create algorithms that predict our desires and streamline production facilities that offer to fulfill them even before we can articulate them—the “if you like this, you will like that” magic of user-driven algorithms. On the one hand, these shortcuts to gratification work for us because they save us so much time. On the other hand, we end up not with more freedom of choice but less, and the results can be easily gamed without our knowledge.

The trade-off between choice and convenience is always there.

“slow thinking,” as opposed to instinctual reactions, “fast thinking.”

most of our personal digital memory is not under our control.

We view our Facebook pages and LinkedIn profiles as intimate parts of ourselves and our identities, but they are also corporate assets.

The fundamental purpose of recording our memories—to ensure they live on beyond our brief decades on Earth—will be lost in the ephemeral digital landscape if we do not become our own data managers.

The skills to control our personal information over the course of our lives are essential to digital literacy and citizenship.

Sound recording is a recent technology, the first recording made in 1860. Despite its youth, in many ways audio is far more vulnerable to decay and loss than parchment manuscripts that have survived for two thousand years.

The forensic imagination means that there are now almost limitless possibilities for the extraction of information from any piece of matter, no matter how fragile.

The new paradigm of memory is more like growing a garden: Everything that we entrust to digital code needs regular tending, refreshing, and periodic migration to make sure that it is still alive, whether we intend to use it in a year, one hundred years, or maybe never. We simply cannot be sure now what will have value in the future. We need to keep as much as we can as cheaply as possible.

Beyond the problem of sheer scale, there are formidable social, political, and economic challenges to building systems that effectively manage an abundance of data, of machines, and of their human operators. These are not technical matters like storage and artificial intelligence that rest in the hands of computer scientists, engineers, and designers. They are social. Digital infrastructure is not simply hardware and software, the machines that transmit and store data and the code in which data are written. It comprises the entire regime of legal and economic conditions under which these machines run—such as the funding of digital archives as a public good, creating a robust and flexible digital copyright regime, crafting laws that protect privacy for digital data but also enable personal and national security, and an educational system that provides lifelong learning for digital citizens. We need to be competent at running our machines. But much more, we need to understand how to create, share, use, and ultimately preserve digital data responsibly, and how to protect ourselves and others from digital exploitation.

in the digital age, the fundamental mission of libraries and archives to preserve and make knowledge accessible is at risk because there is no effective exemption from copyright law that covers the specific technical preservation needs of digital data.

It is unrealistic to assume that in market capitalism, publishers, movie studios, recording companies, and other commercial enterprises will preserve their content after it loses its economic and commercial value or becomes part of the public domain.

The World Wide Web is not a library. It is a bulletin board.

It will be a challenge to re-create the traditional public library online, because a public library exists in large part to provide access to contemporary copyrighted materials.

Only decades of living with digital memory will reveal how reading on a screen differs from reading on a page, how digital audio recording affects our acoustical sensibilities, and how the inescapable ubiquity of information that chases us rather than the other way around alters our habits of thought.

Our twin aspirations—to be open yet to protect privacy, to embed democratic values in our digital code to support the public good while fostering competition and innovation in the private sector—will clash repeatedly.

And so it is with our artificial memory. The more fragile the medium, the more redundancy we need. Nothing we have invented so far is as fragile as digital data.

The web has the scope of a comprehensive library, but it lacks a library’s rules of access and privacy.

Much web content is inaccessible behind paywalls and passwords. Readers leave vivid trails of where they have been online in their browser history.

To reinvent something like the Library of Congress or Alexandria online, we would begin with an Internet that provides easy access to information, make it searchable by something more comprehensive than Google, and add the crucial back end of a network of Internet Archives to ensure persistence of data. Readers and researchers would have use of cheap, ubiquitous fiber connection, regulated as a utility to ensure equitable access. The reading room of the public library, open to all and allowing privacy of use, would now be complemented by similar spaces on the Internet.

(Today a web page lasts on average for one hundred days before changing or disappearing altogether.)

Search engines, whose business is built from the ground up on the reuse of other people’s data, also stake their future on managing and preserving the data they harvest.

The Greeks saw imagination as a divine dispensation from ignorance, a gift from the goddess of memory.

it encourages the pursuit of curiosity for its own sake and democratizes it.

we will take advantage of outsourcing logical tasks to our machines to free up time for more imaginative pursuits.

Our machines will not grow a moral imagination anytime soon. They must rely on ours.

Today, we see books as natural facts. We do not see them as memory machines with lives of their own, though that is exactly what they are.

Audrey Watters

Notes from The Global Pigeon

8 min read

I haven't finished reading The Global Pigeon yet, but I'm starting to pull together some thoughts on pigeons for a keynote this fall. I'll update this post when I finish the book. The following are passages I've highlighted:

“matter out of place.”

Rather than seeking communion with nature, Carmine absconded to his rooftop with a bit of nature’s raw material and relished his power to sculpt the pigeons, through selective breeding and training, according to his will.

the “social self.”

how cross-species encounters can in fact be a constitutive feature of social life in the city.

the pigeon coop was simultaneously an embodiment of the men’s nostalgia for a lost world and an organizing principle of their present-day social relations and identity.

this ritual led pedestrians to interpret the pigeon flocks as rightful—even celebrated—residents of the squares, and that the decision in both cities to evict the seed vendors, redefine pigeons as “rats with wings,” and ban pigeon feeding was part of a larger political project aimed at tackling a host of “quality of life” issues.

the ways that encounters with pigeons mediated people’s experience of urban spaces.

The pigeon is a felicitous animal for exploring the social significance of cross-species urban entanglements.

pigeons have in effect become naturalized urban citizens.

Their presence on city streets is utterly pedestrian, in both senses of the word.

Commonly referred to as “rats with wings,” a label meant to characterize them as filthy vectors of disease,

[… pigeons, though it was decommissioned after mauling a pedestrian’s Chihuahua. …]

People have not always deemed pigeons “nuisance animals” or sought to reduce their numbers. And it is only in the last century or so that pigeons have come to be considered distinct from doves.

“Rock pigeons,” also called “rock doves” (Columba livia), were first domesticated about 5,000 years ago, and for millennia humans selectively bred them to meet a variety of material needs.

The legacy of these efforts is omnipresent: the gray pigeons with black-barred wings and iridescent neck feathers that occupy city streets worldwide today are rock pigeons’ feral descendants.

Pigeons partly domesticated themselves.

“reproductive magic,”

Their gregariousness and docility also made them fitting symbols of peace (their predators, hawks, stand for aggression).

In feudal times, pigeon meat and guano were deemed so valuable that the right to own a dovecote was restricted to nobility—no wonder, then, that so many of these ornate structures were toppled in the wake of the French Revolution.

Over time, humans bred larger and fatter varieties for food, and stronger and leaner varieties with enhanced “homing” instincts (called homers) that could carry messages over hundreds of miles.

Pigeon fancying was so popular, and the variation in shapes, sizes, and colors that fanciers had produced through centuries of selective breeding was so extraordinary, that Charles Darwin opened On the Origin of Species with an exhaustive genealogy of so-called “toy” pigeon breeds.

“Everybody is interested in pigeons.”

The pigeons that occupy our sidewalks never existed in the wild. They are descendants of escaped domesticated pigeons that were imported to the United States, Europe, and elsewhere centuries ago.

French settlers who introduced the rock dove to North America in the early 1600s, primarily for consumption.

Society, then, has abetted an animal whose niche is one designed to be the exclusive habitat of humans: the sidewalk.

nonnative, feral birds—neither purely wild nor domestic—now confront humans as our own historical detritus

Synanthropes like pigeons challenge the conventional notion that urbanization has insulated people from contact with animals and nature.

the social experience of animals.

Animals were central to totemic belief systems not primarily because they embodied nature but because they were useful symbols for expressing the relationship between self and society.

Conceiving of nonhumans as separate from society—the corollary of Nature Lost—precludes a sociological understanding of how they are incorporated into contemporary social life.

Animals become pets or pests through social processes of interaction and classification.

While sociologists recognize that our sense of self is created through interaction, prevailing accounts of the “social self” neglect the role that nonhumans play in its construction.

we most enjoy having nature in our midst when we can exercise our “impulse to reduce—and thereby, order and control” it.

The edges of the city and nature continually rub against, and run over, each other like tectonic plates. The interaction may be smooth, or tremors may result; and the fault line can widen, narrow, and shift as an effect of the encounter. By straddling the fault line, we can begin to understand how the borders and contours of urban experience are shaped through cross-species encounters.

how animals can become part of what sociologist Erving Goffman called the “interaction order”36 of public space, and how social contexts structure whether or not people welcome the presence of the “wild” in city streets.

the capacity of pigeons to recognize regular feeders and coax unwitting park visitors into feeding them, behaviors that reflected the birds’ adaptation to the demands and opportunities of city living.

pigeons are “able to learn quickly from their interactions with human feeders” on the street and “use this knowledge to maximize the profitability of the urban environment,” discriminating between friendly feeders and hostile pedestrians and adopting begging strategies that elicit food from strangers.

the social significance of pigeon feeding.

pigeons, like chickens, are naturally inclined to scavenge for food by pecking the ground—not flowers, trees, and shrubs (which is why they walk rather than hop).

However different its intentions, the pigeon still confronts the feeder as an active agent, what philosopher Martin Buber would call a “thou,” “a truly subjective other whose immediate presence is compelling.”

a subtle tension at work, then, when one enters “open regions” such as parks. By putting oneself in a public place, the individual is open to the desirable possibility of engaging in sociable encounters with strangers and open to unwanted social entanglements or even social isolation. We are made vulnerable.

born out of a situational response to solitude in a moment of unstructured time.

While space refers to an area’s physical properties, place refers to its social meanings. Places embody history.

The mythical connotations of pigeon feeding in the square, and the ways that the “socialized” birds provided tourists with spontaneous and novel interactions, endowed the Piazza San Marco birds with a magical quality.

The pigeons’ alleged filth was tied to the filth and congestion that the city saw as choking the square.

By witnessing and participating in the feeding sessions over the next two weeks, I saw that these episodes were a time to eulogize, and briefly reconnect with, a place that no longer existed.

the Pigeon Action Group.

Collective memory haunted the space, fostered by diehard feeders, pigeons still tame enough to land on visitors, and the countless images of pigeons in the square still circulating in the media and literature.

A major part of how pigeons were framed as a problem in both squares was by being hitched to the mayors’ “quality of life” agenda, which sought to sanitize and bring greater order to public space.

1966 New York Times article,

‘a rat with wings.’”

our social and moral evaluations of animals are contingent on where they are found.

humans work to ensure that animals stay in their “proper place.”

Drawing on the anthropological insights of Mary Douglas, Philo claims that animals that “transgress” the “socio-spatial order” that humans have constructed around them become interpreted as “matter out of place.”

Like weeds in the cracks of pavement, pigeons represent chaotic, untamed nature in spaces designated for humans.

Their metaphorical “pollution” of city streets becomes crystallized through their link to humanity’s literal pollution—trash.

Part of our aversion to pigeons, then, stems from cultural insecurities about proximity to dirt and impurity. Mary Douglas argued, “In chasing dirt, in papering, decorating, tidying, we are not governed by an anxiety to escape disease, but are positively re-ordering our environment, making it conform to an idea.”

The pigeons of Trafalgar Square and Piazza San Marco are objectively different than most other street pigeons—they have been tamed.

In the process, they have become fully dependent on people for food and have stopped scavenging. Cultural products in a literal sense, their habits and their large flock sizes are a result of ritual human practices that engender a tradition. Most of the birds cannot survive without interacting with people, leading animal rights activists in both cities to ask the thorny question of whether people owe anything to these animals that they have made into urban scavengers. Once granted a space and a food supply, do these animals deserve a place in the city?

Audrey Watters

Wow. I'm really not into Known anymore, am I
