Waste flow: A lunchtime seminar with Myra Hird

by Michele Acuto

This week’s “Waste Flow” lunchtime talk at Saïd Business School was presented by Myra Hird from Queen’s University, where she has been heading a research project on Canada’s waste flow. As she reminded us, Canada is today the largest producer of waste per capita, and now faces serious challenges as this refuse is mostly landfilled when not exported to other countries.

Myra’s talk presented us with a series of considerations on the definitional complexities of waste, the technology of landfilling and leachate management, as well as the twin forgetting/remembering process at work in the everyday social construction of waste. Drawing on sociological and anthropological literature, snapshots from around the world like the infamous “flying toilets” phenomenon in slums, field research at Canadian landfills in Kingston and, not least, Futurama and Fight Club clips, Myra began by teasing out the challenges of defining what is “waste” and what is “wasting”. Associated with both tangible and intangible resources, wasteful consumption has become a daily obligation of society. Yet, as the phenomenon of ‘garbology’ tells us and as Myra reminded us, trash is also an epistemological resource charged with meanings, identities and multiple values, both economic and cultural.

The talk then proceeded to some closer observations on the construction of landfills and Myra’s work in collaboration with Queen’s Civil Engineering professor Kerry Rowe. She focused on leachate, the heterogeneous material product of the guaranteed failure of landfilling, which spoke to the uniqueness of the mix of garbage contained in each landfill, as well as to the inherent impossibility of fully controlling and containing waste. This inspired not only considerations on engineering landfills and liner systems but also a broader point that Myra presented us with: waste management in developed countries like Canada is today a practice of domestication and forgetting aimed at containing and determining something that, as waste, is inherently undetermined.

These considerations led to the last section of the talk, where Myra moved on to the ethical issues embedded in this vision of ‘mastery’ of waste, as opposed to ‘remembering acts’ aimed at countering the removal of waste from everyday experience and at making waste management, as much as landfills, public. This theme sparked a lively discussion among the dozen or so participants in the talk, which began with a variety of considerations on the aboriginal dimension of the Waste Flow project and proceeded to the challenges of experiencing waste against forgetting. Along with the difficulties of bringing an aboriginal sphere into research, and the often autobiographical bases of waste research, the Q&A raised issues about the capacity of ‘staying’ with garbage to remove the ‘waste’ quality of refuse (as with the plastic bottle that ‘comes back to life’), as well as the use of waste to produce something new, as in the case of upcycling.

The final challenges for waste research then emerged in the question of dealing with the holistic nature of waste: not only the popularized household garbage, but also less tangible and yet crucial elements like carbon emissions, or even the unavoidable if not necessary presence of useful natural waste like oxygen. As the talk and the discussion ultimately demonstrated, waste is indeed a largely indeterminate and complex participant in our everyday experiences. And, as expected, the group did produce a relevant amount of waste by consuming a healthy quantity of sandwiches, juice and packaged water during the seminar. Myra, at least, had brought her own recyclable water bottle.

Human and Non-human Animals

by Natalie Porter

This week’s situated seminar on human and nonhuman animals took place at the Oxford Museum of Natural History, where we discussed John Berger’s “Why Look at Animals?” (1980) and Gail Davies’ “What is a Humanized Mouse?” (Body & Society, forthcoming).

Amy Hinterberger explained to us that the museum has been an important site for both entrenching and dismantling nature-culture and human-animal distinctions. Established in 1855, the museum sought to promote the study of natural history in a curriculum weighted toward the humanities. Thirty years later, the Pitt Rivers ethnological collection was erected in an adjacent building. Proximate yet distinct, these spaces reflected a nineteenth-century emphasis on separating the domain of God (nature) from that of man (anthropology).

But this separation has proven difficult to uphold, and as early as 1860 the museum became a venue for reformulating the relationship between humans, animals and the divine. Here, in a debate on Darwinian evolution, Thomas Huxley famously told Bishop Samuel Wilberforce that he had:

No need to be ashamed of having an ape for his grandfather, but that he would be ashamed of having for an ancestor a man of restless and versatile interest who distracts the attention of his hearers from the real point at issue by eloquent digression and skilled appeals to religious prejudice.

One hundred and fifty years later our relationship to nonhuman animals remains open to critical reflection and reformulation. In Berger’s piece we find nostalgia for an “unspeaking companionship” between humans and animals, which he suggests has been lost in zoological displays. Berger promotes anthropomorphism as a means to recapture proximity between species, and to bring animals back from the margins. Davies considers anthropomorphism of a different kind. In her investigation of human-animal chimeras, humanization is less about recuperating lost relationships than establishing new ones. The humanized mouse is a species remade, an experimental object that reproduces the human immune system for biomedical research and drug testing.

Our discussion touched on themes of likeness and unlikeness, space and context, and species mixing at different scales. Davies shows how translational medicine maximizes overlaps between species to create opportunities on multiple scales. Humanized mice promise everything from personalized medical therapies to global research collaborations. This seems a departure from nonmedical sectors (sanitation, agriculture), where human-animal differences are maximized for productive ends.

Given this diversity, we questioned the utility of Berger’s anthropomorphism as a way to bring animals back from the margins. How do discourses of human rights and animal rights map onto each other? And how might placing animals in the role of ‘victim’ occlude their capacity to resist and respond? Indeed, anthropomorphism suggests an asymmetry in getting to know the other. Are there other ways of engaging with animals that do not involve recourse to humanity? Yes, but we should be wary of efforts to capture any pure or untrammeled animal aura. After all, are dolphins in petting pools less authentic than their ocean-dwelling relatives? Should meerkats be indifferent to human observation?

Questions surrounding the terms of engagement between species resonate with efforts to define relationships between observer and observed. Here, examples from artificial intelligence may prove instructive. In the same way that humans have been distinguished from animals by their capacity for language, the threshold of AI hinges on a machine’s ability to create discourse. But does an emphasis on communication again risk asymmetry, wherein estimations of proximity are conditioned by human perceptions of the other?

Finally, we asked how useful human-animal distinctions are with regard to chimeras, where species mixing occurs at the cellular scale. Humanization here manifests in biology rather than behavior. Likeness and unlikeness is mediated not by language or ancestry, but rather by cellular lineage and design. What does this mean, then, for the separation between natural history and anthropology? Will the humanized mouse end up next to the strokeable pony, or alongside the ethnological artifact?

This session was part of the ongoing reading group Encountering Science and Technology Studies: Situated Seminars. Rather than discussing readings in the confines and comfort of a seminar room, we immerse ourselves in locations that speak to the issues at hand. For upcoming sessions, please check the programme.

Upcoming talk: Myra Hird on “Waste Flow”

by Steve Woolgar

We are very fortunate that Myra Hird is visiting next week. She will give a lunchtime talk on “Waste Flow” here at Saïd Business School. Please sign up with Bethsheba McGill if you would like to attend.

Waste Flow

Myra Hird, Queen’s University, Canada
Wednesday, 20th June, 12.30-2pm
Seminar Room 2, Saïd Business School, OX1 1HP

Waste Flow is an interdisciplinary research project concerned with three central questions: what is waste; what do we do with our waste; and, what is our waste future? This talk focuses on the first question. From both local and global perspectives, waste is a relative term. Some claim there is no waste because humans never actually get rid of anything: things we discard are transformed into other things. Landfill, nuclear and other waste may be out of sight, but its materiality variously resists, flows, and transforms into other substances in deep time. So in asking the question, what is waste? we critically consider what it means to identify certain entities as discardable and discarded, to forget about waste, and to bring waste into view through an ethic of vulnerability.

About the speaker: Myra J. Hird is Professor and Queen’s National Scholar in the School of Environmental Studies at Queen’s University (Canada). She is the author of seven books and some fifty articles and book chapters on topics related to science and technology studies. Hird’s current project concerns waste management and the environment.

An Algorithmic Walk

by Morten Jensen Øllgaard

This week’s seminar challenged the participants to devise an algorithm that could act as a tour guide for the situated STS seminar. Our first instruction was to meet at the Martyrs’ Memorial, at the intersection of St Giles, Magdalen Street and Beaumont Street in Oxford. We managed to meet somewhere on the south side of the Memorial at the usual time of 4.30pm last Thursday.

Google’s obscure algorithm

First, Malte Ziewitz gave us a short briefing on his research on the practices of search engine optimisation and explained how an entire industry has developed thanks to Google’s obscure algorithm. The complexity of the algorithm running Google’s search engine makes it difficult for ordinary people and businesses to figure out what the algorithm actually does and how to achieve a good ranking on Google. Given the importance of a good ranking, many businesses hire consultants who specialize in search engine optimisation to help them perform better in various searches.

I decided to see if I could find out just how big a share of the search engine market Google has, trying to understand the importance of a good ranking on Google. I ‘googled’ “search engine market share” and got 55,400,000 results in 0.18 seconds! What I could discern from the different homepages I found via Google was that Google’s global market share is probably more than 80%, whereas Google’s market share in the U.S. is approximately 65% (some uncertainty arises here because the Americans tend to mix up what is global and what is American). This little exercise made me realise the difficulty of verifying these numbers, as the many different sites that turned up in my search used different numbers and statistics. I must also admit that I did not care to go beyond the first page of my Google search, anticipating that I would not find a more authoritative answer among the remaining 55,399,990 results, just more numbers. (Actually, I cannot remember the last time I went beyond the first page of a Google search…) In this case Google’s algorithm ranked certain homepages over others, so the question is: how should we evaluate this algorithmic intervention? Was it good, bad or neither?

Can algorithms be wrong?

On the seminar’s reading list was an interesting blog post by Tarleton Gillespie, discussing whether algorithms can be wrong (link). The blog post uses the dispute over why Occupy Wall Street did not make Twitter’s trends list as a starting point for discussing the potential political and moral consequences of algorithmic interventions. The question is: was Occupy Wall Street censored and deliberately removed from the trends list, or was it simply not trendy according to Twitter’s algorithmic definitions? We do not know, as Twitter has declined to disclose its trend algorithm, both to protect its business and to keep people from manufacturing trends (that are not real trends, whatever that is). Would it have made a difference if Occupy Wall Street had made it onto Twitter’s trends list? The people behind Occupy Wall Street seem to think so.
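One property often attributed to trend detection (an assumption here, since Twitter’s actual algorithm is undisclosed) is that it rewards sudden spikes in volume rather than sustained high volume. A toy, velocity-based sketch of such a detector might look like this:

```python
def trending(counts_prev, counts_now, threshold=3.0):
    """Toy 'trend' detector (illustrative only, not Twitter's).

    A topic trends when its current volume is a large multiple
    of its recent baseline, so a sudden spike trends while a
    steadily high-volume topic may never appear on the list.
    """
    trends = []
    for topic, now in counts_now.items():
        prev = counts_prev.get(topic, 1)  # unseen topics get a tiny baseline
        if now / prev >= threshold:
            trends.append(topic)
    return trends

# A long-running, high-volume topic vs. a novelty spiking from nowhere:
prev = {"#ows": 900, "#meme": 10}
now = {"#ows": 1000, "#meme": 100}
print(trending(prev, now))  # → ['#meme']
```

Under this toy rule, a hashtag with consistently high volume never “trends”, while a novelty spiking from a low baseline does, and no censorship is required to produce that outcome.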

I found the comments attached to Gillespie’s blog post to be both very interesting and entertaining, as they represent different stances on the issue. In general, the commentators agree that the ‘algorithm reality’ is a particular perspective or view on what is trendy, but their agreement ends there. One comment suggests that Twitter’s algorithm is a messy and complex piece of software developed over the years, and that consequently no one should be held accountable for its actions. Some discuss the technical possibilities of constructing so-called ‘open algorithms’ that promise less bias, abuse and misunderstanding. Others do not see what the fuss is all about, as they find Twitter’s trends to be mundane and unimportant. Finally, there is one person claiming it is all one big conspiracy, as Twitter, Facebook, Google etc. are all controlled by the wealthy elite, who do whatever they can to reduce the visibility of anything they do not like. This commentator offers us a red truth pill and points to parts of the Internet that do not show up in Google’s algorithmic reality.

The victory of the minimax algorithm

The second item on the reading list was an article by Nathan Ensmenger titled “Is chess the drosophila of artificial intelligence? A social history of an algorithm”. The paper explores the link between the game of chess and the development of artificial intelligence (AI). Starting in the 1970s and up to the defining moment when Deep Blue defeated Garry Kasparov in 1997, Ensmenger demonstrates how chess became the experimental technology of AI research practices. In that respect chess is similar to Drosophila, which was the experimental technology of the genetic sciences.

Both the game of chess and the record-keeping communities of chess players turned out to be a good match for AI researchers. Chess was perceived to be a game that requires some “thinking”. At the same time, it is a finite game with a finite number of positions and moves, ensuring the game will eventually end in a conclusive way (win, draw, or loss). In the early stages of the development of the chess computer, Ensmenger tells us, there were two competing algorithmic principles. The ‘Type-A’ algorithm, also called the minimax algorithm, uses a brute-force method; the competing ‘Type-B’ was considered more “human”, as it used heuristics to make decisions, trimming the decision tree by privileging certain branches over others. In the end, the minimax algorithm prevailed despite being considered an inaccurate reflection of the way human beings play chess. However, the minimax algorithm turned out to be the fastest way to reach the goal set by the AI community: to beat the best human chess player. In short, Ensmenger’s point is (a) that it could have been otherwise and (b) that software algorithms like minimax or Twitter’s trends and Google search are parts of heterogeneous environments, so it is meaningless to isolate these algorithms from their social, economic, and political contexts.
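The brute-force principle behind the Type-A approach can be sketched in a few lines. This is an illustrative minimax over a hand-built game tree, not a chess engine; the tree and its terminal scores are invented for the example:

```python
def minimax(node, maximizing):
    """Exhaustively evaluate a game tree (Type-A, brute force).

    A node is either a number (a terminal score, from the
    maximizer's point of view) or a list of child nodes.
    """
    if isinstance(node, (int, float)):  # terminal position
        return node
    values = [minimax(child, not maximizing) for child in node]
    return max(values) if maximizing else min(values)

# Illustrative tree: two moves for the maximizer, each answered by
# the minimizer, who always picks the reply worst for the maximizer.
tree = [[3, 12], [2, 4]]
print(minimax(tree, maximizing=True))  # → 3
```

The point of the sketch is the exhaustiveness: unlike the heuristic Type-B approach, nothing is trimmed, every branch is visited, which is exactly why minimax scaled so well with raw computing power.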

The good walking algorithm

Both Gillespie and Ensmenger conclude that we need to develop a language and methodologies for studying and speaking about algorithmic interventions. It was against this background that we were asked to discuss and devise an algorithm that could act as a tour guide for our situated seminar.

Taking a walk around Oxford for about an hour seems a simple task, but having to put it in algorithmic form was a reminder that practical everyday activities like walking entail things that are difficult to articulate, especially in an algorithm. This ethnomethodological lesson was a frustrating one, as it turned out to be quite hard to create an operational algorithm suitable for our purpose.

A complicated crossing

Our discussion centred on two main requirements: (a) the algorithm had to provide a decision rule that could produce a definitive output, and (b) it had to be something we could remember. We came up with a walking algorithm consisting of two main components:

  1. The intersection rule: We would toss a coin at every intersection we encountered. If the result was heads, we would walk to the right; tails, and we would walk to the left.
  2. The pub rule: We would enter the third pub encountered on our path.
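The two rules above can be written down as a small decision procedure. This is a hypothetical rendering of the seminar’s algorithm, not something we actually ran on the day; the event names and state are invented for the sketch:

```python
import random

def next_step(event, state, rng=random):
    """Apply the walk's two rules to a street event.

    event: "intersection" or "pub"; state counts pubs passed.
    Returns the instruction for the walkers.
    """
    if event == "intersection":
        # The intersection rule: a coin toss decides the turn.
        return "turn right" if rng.random() < 0.5 else "turn left"
    if event == "pub":
        # The pub rule: enter the third pub encountered.
        state["pubs"] += 1
        return "enter pub" if state["pubs"] == 3 else "walk on"
    return "walk on"

# A short imaginary stretch of the route:
state = {"pubs": 0}
for event in ["intersection", "pub", "pub", "intersection", "pub"]:
    print(event, "->", next_step(event, state))
```

Even this toy version surfaces the interpretive work the seminar ran into: the code silently assumes someone has already decided what counts as an “intersection” and what counts as a “pub”.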

Other rules were suggested, but they were either non-operational or conflicted with other interests. The application of the walking algorithm’s rules was subject to some discussion. For instance, there was some debate over what constitutes a proper intersection. The different interpretations were clearly influenced by the different interests of the participants. Some wanted to attend a lecture at 6pm at Green Templeton College, nobody was interested in walking all the way to Banbury, and all of us wanted to experience an algorithmic walk. Whether it was conspiracy, luck, or just really good design, the algorithm managed to devise a route that took us around the city and ended at a pub, leaving enough time to have a beer and be at Green Templeton by 6pm.

The algorithmic route

Here is the route devised by the walking algorithm:

The route

I can warmly recommend taking an algorithmic walk if you get the chance, as it is good fun and interesting at the same time.

This session was part of the ongoing reading group Encountering Science and Technology Studies: Situated Seminars. Rather than discussing readings in the confines and comfort of a seminar room, we immerse ourselves in locations that speak to the issues at hand. For upcoming sessions, please check the programme.