Sunday 7 November 2021

When the anti-traditionalists love tradition

 


One of the several interesting moments in Johanna Hanink's piece 'A New Path for Classics' (which originally appeared under the title 'If Classics Doesn't Change, Let it Burn') comes when the Brown University Classics professor shifts from her critique of the field to 'a new generation of classicists' who, she says, are 'turning away from the triumphalist “Western civ” model' and 'becoming better in tune with the world’s shifting realities.' 'The nation of Greece,' she goes on,

has recently been looking to allies beyond Europe. It is forging new economic and cultural links with China in a partnership based, at least rhetorically, on the idea that nations with ancient pedigrees understand each other. The same general premise also underpins the Ancient Civilizations Forum, a cultural initiative with nine member countries in regions that were “cradles of ancient civilizations.” The forum casts antiquity as a potential source of soft power for modern nations.

There are a few points that might be raised here, from the wisdom of democratic nations cozying up to a communist dictatorship, to 'the idea that nations with ancient pedigrees understand each other.' What I want to highlight in this post, though, is the strangeness of Hanink appealing to 'nations with ancient pedigrees' at all. After all, in the rest of her article she several times casts doubt on the claim that ancient Greece and Rome were the 'foundation of Western civilization,' a 'fairytale Western origin story' that, she declares, she has 'no more patience for.' She also links approvingly to Kwame Anthony Appiah's essay 'There is no Such Thing as Western Civilization.'

Now, admittedly, it could be the case that the narratives of continuity in Chinese history are correct (though the role of the Chinese state in constructing such narratives should give us pause), and every possible story about Western Civilization is simply false. But there is, I would submit, something quite strange, even less than perfectly consistent, about the position that some narratives of cultural continuity have to be insistently deconstructed while others can be routinely greenlighted. 

Appiah's idea that the Western tradition doesn't really exist is only one step on from a general scepticism towards tradition that has had a place in the academy for quite some time now. The work that, as far as I can tell, planted the seed for this particular intellectual sub-tradition was Hobsbawm and Ranger's 1983 volume The Invention of Tradition, which discussed examples of traditions being more or less made up out of whole cloth, or at least reified to an extent that the actual historical facts didn't quite support.

Hobsbawm and Ranger were no fools, and the phenomenon they pointed to was a real one. Traditions, from the Christmas Day address of the British monarch (inaugurated in 1932) to the Super Bowl (1966), are often more recent than we think. There's no doubt that some claims about tradition are more or less (sometimes entirely) mendacious, and that they often serve present-day interests and structures of power.

But there's a risk of going too far the other way - a risk that, it seems to me, is especially perilous in contexts where the idea we're reacting to is associated with a political Other. Though some traditions - even many - may be invented, or at least patched up, we probably shouldn't conclude from this that all traditions are just made up, and there's no such thing as long-term cultural continuity at all.

One problem with arriving at this kind of conclusion is that talking about traditions and cultural lineages - even ones that reach back into ancient times - can often turn out to be very useful. Unsurprisingly, progressive historians and classicists sometimes find them useful too, as Hanink seems to in her talk of 'nations with ancient pedigrees.' And there are other examples.

Classicists who are in favour of transferring the Elgin Marbles to Greece, for example, usually argue for this partly on the basis that contemporary Greeks are, in some sense, the heirs (even the compatriots) of Pericles and of the classical city-states. Feminist classicists like Mary Beard are often partial to tracing a 'long tradition' of misogyny back to Greece and Rome. Somewhat further afield, Christianity is seen as having imposed repressive norms to do with gender and sexuality on cultures - in Polynesia, for example - that, it's claimed, had a tradition of more openness and flexibility in these areas of life.

My purpose here, again, isn't to contest any of these claims, or even to discuss them in any depth; at first glance, they all seem to have some plausibility. It's simply to point out the strangeness I noted above. If it's true - or even plausible - that there's a tradition of misogyny that can be traced back to ancient Greece, isn't it equally plausible that we can trace a tradition of democracy back to ancient Greece as well? And yet my sense is that the idea of Athens as the cradle of democracy is less popular among American classicists than it used to be.

I noted above that it could be that narratives of Chinese (or Greek) cultural continuity are simply more accurate than notions of 'Western Civilization.' It could be argued, for a start, that 'Western Civilization' is a more amorphous concept than Chinese civilization, or than Greece. In fact, something along these lines often is argued; although usually this is done by simply stressing the unmanageable multi-facetedness of Western Civilization, not by bolstering a sense of the continuities in national histories. (That, in itself, somewhat gives the game away.)

My own sense is that while we shouldn't uncritically accept stories about tradition (especially ones that stretch over huge spans of time - stories about apostolic succession and dharma transmission both come to mind), we also can't simply assume that all claims about tradition are false. Most of us recognize, I think, that some sort of long-term handing down of ideas and life ways is possible. (Indeed, I would go so far as to describe it as a fairly obvious feature of human society.) 

Whether particular claims about particular traditions are true is something that is best worked out on a case-by-case basis; and I would be the first to admit that there are, for example, simplistic narratives of Western Civilization and the transmission of democracy that don't stack up when set against the messy nuances of historical fact. 

My plea here, I suppose, is simply that we engage each other on a level playing field when it comes to teasing out which aspects of which narratives are sanctioned by the evidence, and which ones aren't. We can't, I think, make sweeping deconstructionist critiques of some narratives ('all narratives are constructed,' 'we must take care not to reify notions of "tradition,"' and so on) while exempting others from that style of scepticism. At least, we can't if we don't want those listening in on our debates to conclude that we're conducting ourselves in a less than perfectly consistent and even-handed manner.







Saturday 4 September 2021

Classical Americana

 



Among the many claims about the field of Classics made in the New York Times' lengthy profile of Princeton Classics professor Dan-el Padilla Peralta was that “Classics as we know it today is a creation of the 18th and 19th centuries.” As has already been pointed out, it's a claim that only holds up if we allow the phrase 'Classics as we know it today' to do a lot of work. Classics as an academic field within a modern, research-intensive university system may well owe its origins to the emergence of the research university in 19th-century Germany. But Europeans have been turning back to the classical cultures of the Greeks and the Romans pretty much since Rome fell.

So much is clear from more or less every stage of European history, from the Dark Ages, when monks kept the classical literary tradition alive by copying Greek and Latin manuscripts; through the Renaissance, when humanists like Politian translated and emulated classical writings; to the Enlightenment, when, as Rachel Poser, the author of the New York Times piece, puts it, “a sort of mania” for Greece and Rome took hold of the intellectual classes. The idea that Europeans simply invented the classical tradition out of whole cloth in the Renaissance is, in other words, contradicted by virtually the entire history of European culture before then.

It might be objected at this point, as a last-gasp effort to save Poser from what seems like a fairly obvious clanger, that her focus was on the United States, not Europe. 'How these two old civilizations became central to American intellectual life,' she writes (with my added emphasis), 'is a story that begins not in antiquity, and not even in the Renaissance, but in the Enlightenment.' So perhaps her hypothesis isn't that a relationship with Greco-Roman culture became central to European culture only in the 18th and 19th centuries (after all, that's plainly wrong), but simply that Greece and Rome only became big in American culture during the Enlightenment.

Now, that's obviously true, but it's true for what should have been a fairly obvious reason. Greco-Roman culture only became baked into American culture and institutions during the Enlightenment because that's when the United States developed into an independent nation. In other words, there was no United States of America before 1789, and there weren't even any permanent Anglo settlements in North America until 1607. Depending on where precisely you place the Enlightenment, that makes it more or less inevitable that the Greek and Roman classics couldn't have had a big presence in American life before then.

But still, wouldn't the kind of engagement with Greek and Roman history that is on show in the Federalist Papers have been impossible without the Enlightenment? When colonial Americans did wrestle with the Greco-Romans, didn't they always do so in an Enlightenment vein? Well, no. According to Eric Adler, Greek and Latin formed a large part of the curricula of the colonial colleges. And they were taught in a way that reflected two traditions, humanism and scholasticism, that stretch back well before the Enlightenment. 

They were, in fact, taught largely in a Christian vein. This brings us to a fact about Western cultural history that's so obvious that we are, it would seem, no longer capable of seeing it. The Europeans who settled North America in the Early Modern period were, almost to a man, Christians; and Christianity is a religion with roots in the ancient Mediterranean world. When the British settled North America, their religious and educational elites brought the study of Greek and Latin with them in large part because many of their holy texts (the Gospels; influential translations like the Septuagint and Jerome's Vulgate; the works of the Church Fathers) were written in those languages.

We can, as it happens, actually test the thesis that, had Europeans come to the Americas before the Enlightenment, they would have brought an interest in the Greco-Roman classics with them. We can do this because, as you might be aware, Europeans (mainly Spanish and Portuguese) did actually arrive in the Americas well before anyone's starting-point for the Enlightenment. And, sure enough, if we look at Spanish accounts of their encounters with local people, we find references to Greek and Roman historiography. Many of these accounts were, of course, written by priests of the Christian religion, a religion which, it bears repeating, was a central part of the Europeans' cultural inheritance and which has its origins in the Greco-Roman world. 

The real reason that Greece and Rome play such a significant role in the culture of the nascent United States, then, should be fairly obvious. Early Americans (at least the educated ones) talked about Greece and Rome because they came from Europe, and European culture was rooted in the Greco-Roman past. European settlers in the Americas, from Quebec to Buenos Aires, brought European culture with them, and European culture had a significant classical component. The specifically Enlightenment style of classicising engaged in by men like Georgia founder James Oglethorpe, whose utopian schemes were influenced by Plato, was simply the latest wave of classical influence. That it involved engagement with the classics was, in itself, more of a continuation of pre-existing European cultural habits than anything fundamentally novel.

All of this should, to repeat, be fairly obvious to anyone who knows anything about global cultural history. But the power of the ideology that currently has a stranglehold on US colleges is considerable; so considerable, in fact, that it leads college professors to write statements that any layman can see are plainly false. For Rebecca Futo Kennedy of Denison University in Ohio, for example, modern Americans 'are no more or no less the heirs of the ancient Greeks than they are the heirs of ancient China.'

In fact, Americans are clearly more indebted to ancient Greece than they are to ancient China. Their monumental architecture is Hellenizing in style; banks and state capitols look vaguely like the Parthenon, not the Foguang Temple. Their political system was created by men who drew on and discussed Polybius and Plutarch, not Confucius and Shang Yang. The higher registers of their main language, English, are full of borrowings from Greek, not Mandarin. And the holiest text of the dominant religion, Christianity - the New Testament - is written in Koine Greek, not Old Chinese.

Once again, virtually everyone who hasn't spent too much time around a modern American Humanities department knows this, and knows why this is the case. Americans are more indebted to ancient Greece than to ancient China because the United States was created largely by Europeans; and European culture, for stunningly obvious reasons, has always been more indebted to ancient Greece than to ancient China. 

But this simple fact destroys one of the central contentions of the extreme 'social justice' approach to Classics profiled in Poser's New York Times piece. This contention, as Poser describes it, is that ideas about the classical tradition 'cannot be separated from the discourses of nationalism, colorism and progress that were taking shape during the modern colonial period, as Europeans came into contact with other peoples and their traditions.' 'Enlightenment thinkers created a hierarchy with Greece and Rome, coded as white, on top, and everything else below,' she goes on. And she quotes Harvard professor Paul Kosmin: 'That exclusion,' he tells her, 'was at the heart of Classics as a project.'

This is wrong-headed in any number of ways. For a start, as the cognitive scientist Steven Pinker has had to remind historians, the idea that exclusion, inter-group violence, and slavery were products of the Enlightenment gets things exactly the wrong way round. One of the things that most clearly distinguishes Enlightenment and post-Enlightenment cultures is precisely their relative inclusivity and pacifism, not to mention their distaste for slavery, an eccentricity that in itself sets them apart from virtually all previous civilisations.

But the simplest way we can see that the radical revisionist view is wrong is by going back to the obvious facts about Western cultural history that I've referred to in this post. The new would-be orthodoxy is that the classical tradition 'cannot be separated' from ideas that took shape in the modern colonial period. In fact it can, as a simple glance at the classical tradition before the modern colonial period makes clear. 




Sunday 27 June 2021

Locked in?


Whence Woke? There are many theories. Though some, like Lindsay and Pluckrose, stress the movement's origins in predominantly French political theory, many point to its striking predominance in English-speaking countries, a.k.a. the Anglosphere - and even Pluckrose and Lindsay acknowledge the origins of certain key concepts, such as intersectionality, in US academia. This is especially interesting - if that is the word - considering the Anglosphere's prominence in the history of liberalism. How did such an illiberal way of thinking grow out of what looked like such liberal soil?

One possibility is that Wokeism developed for its own reasons, but was then spread around the world (at least in the first, and hopefully last, phase of the pandemic) by the English language and the networks and lifestyles that go with it - rather as world-beating rates of obesity have spread from Austin to Auckland with the diffusion of car-focused suburbs and fast-food joints. Another possibility is that it's something else in 'Anglo-Saxon' culture that has done most of the work, the most common culprit being not the flaxen moustaches but the residual habits of Protestant Christianity, with its predilection for puritanism and witch-hunts. 

The point of this short post is just to add another idea to the mix - probably a bad one, but, well, that's what blogs are for. Intellectual historians - and intellectuals tout court - have a tendency to overstate the influence of highfalutin philosophical ideas on world history, and I'm well aware that's a danger here. So I offer this as just one more factor that may have played a role in the deep history of this new form of extremism.

The hypothesis is just that the philosophical tradition of empiricism, long strong in English-speaking cultures, may have had a hand here. Philosophers like Locke and especially Hume argued that what we know comes overwhelmingly (even, perhaps, entirely) from our senses. This remained a tendency in English-speaking philosophy right up to the time of A. J. Ayer (a disciple of Hume) and Bertrand Russell.

Locke and Hume and Berkeley argued, against continental 'rationalists' like Leibniz, that innate faculties (e.g. reason) played a relatively small part in how we came to understand the world. The debate involved famous puzzles like what would happen to a blind man who was suddenly given the ability to see. Would he simply take in knowledge of his surroundings like the rest of us, or would he be somehow cognitively unprepared for all the new information coming his way? (The answer, it turned out, was the latter.)

Part of Kant's contribution, of course, was to try to reconcile these two traditions: we understand the world, he suggested, by taking in evidence according to certain in-built schemas. Though empiricism retained a role in Kant's brand of idealism, the radical empiricism of the likes of Hume had clearly been left behind.

The problem for radical empiricists of various stripes since Kant has been our growing knowledge of human development and psychology. Aristotle and Spinoza had both intuited that different beings have different inborn tendencies, though neither of them quite understood why. Now we have a much better idea: we act in typically human ways (and even in typically male and female ways) to a large extent because of our genetics. (And the same can be said of cats and bears and flies and jellyfish.)

Some writers still like to warn about the dangers and wrong-headedness of 'essentialism,' but, of course, essentialism isn't always wrong. We expect humans to act in certain ways (not like rocks, say, or goldfish or starfish) because we attribute (consciously or not) a humanness to them. We think they - we - have some mysterious human essence. And we're right. Except that it's steadily becoming less mysterious.

The later Wittgenstein, who can be read as a kind of born-again fundamentalist empiricist, tended to want to dissolve human tendencies and actions into 'forms of life,' even to the extent of seeming to say that internal mental states could be read off outward actions. What more aggressive empiricist invasion of the private sources of innatism could there be?

Psychological behaviorists followed this lead. Chomsky cut his teeth criticizing them, in particular by pointing out that language seemed to have an innate aspect to it. Children across the world seemed to be born with a 'language instinct.'

Since Darwin, Mendel, and the neo-Darwinian synthesis of evolutionary theory and genetics, we've had a pretty good idea of how this works (even if the details have turned out to be far more complicated than we expected). Our genes encode certain inherited information, and this includes tendencies towards certain behaviours. We can even estimate the proportion of certain traits that are genetic as opposed to environmental (though lay people tend to underestimate the extent of the genetic influence that scientific studies support). 
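To make the arithmetic of such estimates a little more concrete, here is a minimal sketch of one classic approach, Falconer's formula from twin studies; the post doesn't name any particular method, and the correlation values below are invented purely for illustration.

```python
# Heritability via Falconer's formula: h^2 ~= 2 * (r_MZ - r_DZ),
# where r_MZ and r_DZ are the trait correlations for identical (MZ)
# and fraternal (DZ) twin pairs. The numbers here are made up.

def falconer_heritability(r_mz: float, r_dz: float) -> float:
    """Rough estimate of the genetic share of variance in a trait."""
    return 2 * (r_mz - r_dz)

if __name__ == "__main__":
    r_mz = 0.75  # hypothetical correlation between identical twins
    r_dz = 0.45  # hypothetical correlation between fraternal twins
    print(f"Estimated heritability: {falconer_heritability(r_mz, r_dz):.2f}")  # -> 0.60
```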

One of the most obvious features of the Woke culture on university campuses is the hostility (among many other sorts of hostility) towards ideas about human nature. Most of the current elite are outspoken 'blank-slatists,' preferring to believe that we are born as blank slates for our environments to write on, rather than the largely pre-designed, if highly responsive, robots we more closely resemble. 

The sources of this hostility are multiple and have been written about extensively elsewhere. It's my suggestion here, though, that the Anglo-Saxon tradition of philosophical empiricism may be among the roots of this reactivity. Even if we have plenty of good evidence - overwhelming evidence, at this point - that our behaviours are strongly influenced by our genetic essences, there's a strong tendency among English-speakers to want to treat humans as random streams of sense-perceptions. If this is at all right, it's another way (along with Puritanism) in which Wokeism emerges, not as a cosmopolitan revolutionary movement, but as a peculiarly reactionary brand of Anglo-Saxon traditionalism.



Sunday 10 January 2021

New republics


There was a time when libertarians fantasized about starting their own countries on floating platforms. Now anyone can do it from home, on a digital platform. Some of these new countries are quite big. Facebook now has 1.7 billion users, more citizens (if you want to look at it that way) than China, and more adherents than Catholicism. 

If you're thinking that websites like these are just games ('Twitter isn't real life' etc.), you'd be right. But games are a serious business. Game designer Reed Berkowitz recently explored how QAnon might have grown out of the same sorts of incentives that are generated by live-action role-playing and alternate reality games. People are given a quest (Who is Q?) and the motivation to want to complete it (this will show you the way things really are). They build a world around them without a graphics card in sight.

This may sound like a new idea, or at least an eccentric one, but it's actually a highly familiar one in political science and economics, at least in the branch of economics known as the New Institutional Economics (NIE). The Nobel laureate Douglass North defined institutions as 'the rules of the game' in a given society. He may have meant it as a metaphor, but there's no reason why the rules of the game can't be the rules of a literal game.

Or, say, the ways a website is set up - its terms of service, its modalities, its incentive structures. The differences between social media sites might seem like just a matter of choosing different products - Twitter allows you less space per message; this bus company lets you drink coffee onboard. But there's more going on. The different sites have a vibe, a style, even (by now) a history - and we act differently when we're on (or in) them.

That's partly because of the multiplication of incentives within these worlds. I saw a link to Berkowitz's piece on Justin Murphy's Twitter feed. Murphy left academia, as he says, 'to spend more time on research and teaching,' and he seems increasingly interested in online learning not just as a site for educational content, but as a site for educational incentives. That last thing was what MOOCs were lacking. Murphy will send you an email a day (if you want him to) to help you learn R (What is R?). Khan Academy and other sites give you points, badges, etc. so that you keep levelling up in the game of Knowledge.

Meanwhile, in the world of real games (whatever that means), video games are apparently getting longer and increasingly nesting 'micro-transactions' within them to get you to pay up for a snazzier helmet, a deadlier weapon, or a more interesting adventure. (From what I can tell, this is one of the complainers' main complaints about the beautiful Assassin's Creed Odyssey.) We're long past the point at which companies started selling real-world products in digital universes like Second Life (in fact, that's apparently something Reed Berkowitz used to do for a living). We're surely not far from Borgesian 'games' that electro-shock your brain into suffering or ecstasy as you proceed bravely through their new world.

But we're off on a side-quest now, so let's go back to something I wrote before. The different social media sites that already exist, I said, 'have a vibe, a style, even (by now) a history - and we act differently when we're on (or in) them.' If this reminds you of countries, well, that was sort of the point. Diving into an online community is kind of like visiting a foreign country and immersing yourself in its exotic ways, its alien norms.

The reason countries used to be so different was that they had different pasts that led to different institutions, norms, customs - a different cultural infrastructure that in turn helped shape the way people were. I'm old enough to remember a time when people would talk openly about the way different countries (and their inhabitants) were, and even if some of this was ignorant or over-confident, some of it seemed about right. It had to be, in a sense - if changing laws or customs has any power to change behaviours, countries with different laws and customs should have a different vibe.

Of course, they still do, to a great extent, and this should remind us of something. The kinds of incentive structures nations have built up over the centuries are pretty formidable, and some of the axes they wield (like laws) ultimately trump the structures set up by social media companies. Twitter can oust Trump from its platform, true, but Twitter still ultimately exists at the pleasure of the US and other governments. Nation-states aren't going away any time soon, and their age-old institutions and norms continue to shape us in profound ways.

Still, sites like the United States of America are starting to seem a bit last century. If it's incentives and information that shape people, especially as they move like questers through whatever spaces open before them, trying to make meaning out of their lives - if that's what is forming individuals and communities - Arizona and Invercargill are really no match for the world wide web. What we'll have - what we to some extent already have - will be new republics shaped online, with new compatriots that are as different from, and sometimes as hostile to, each other as Spaniards and Swedes were in 1634.

The only difference is that this time, the citizens of these new republics - dressed differently, speaking different and mutually unintelligible languages, worshiping different gods - won't be separated by channels or ranges, but will be living side-by-side. And when fighting breaks out, as it has already started to, old-timers speaking of 'internal' or 'civil' war won't seem alarmist, but just quaint. 








Saturday 14 November 2020

How the attention economy is sucking our will to platform

 

People (like the economist Ashley Hodgson) have long been talking about 'the attention economy.' When I first heard about it, it sounded liberatory and utopian: with the advent of the internet, the theory went, people would be paid as a kind of tribute for work they'd chosen to do, rather as you might leave coins for a busker.

Nowadays, the attention economy has more dystopian overtones. A friend of mine from college worked for years for a company that built super-computers to calculate the value of bits of space on the internet and bid against other super-computers in instantaneous online auctions. Apparently that kind of thing is going on all the time, humming along in vast rows of air-conditioned calculation.
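For what it's worth, the 'instantaneous' bidding my friend described is typically some kind of sealed-bid auction run in milliseconds. The sketch below shows a generic second-price auction, one common design in real-time ad bidding; I'm not claiming this is what that particular company built, and the bidders and prices are invented.

```python
# Toy second-price (Vickrey) auction for a single ad impression.
# The highest bidder wins, but pays the second-highest bid.
# Bidder names and bid values are invented for illustration.

def second_price_auction(bids: dict) -> tuple:
    """Return (winning bidder, price paid) for a set of sealed bids."""
    ranked = sorted(bids.items(), key=lambda kv: kv[1], reverse=True)
    winner = ranked[0][0]
    price = ranked[1][1] if len(ranked) > 1 else ranked[0][1]
    return winner, price

if __name__ == "__main__":
    bids = {"bidder_a": 0.42, "bidder_b": 0.57, "bidder_c": 0.31}  # $ per impression
    winner, price = second_price_auction(bids)
    print(f"{winner} wins and pays ${price:.2f}")  # -> bidder_b wins and pays $0.42
```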

And more than that: basically everyone in the developed world now, from big tech companies to online sex workers, is trying to get your attention. (Me too, sort of.) OK, they may not be aiming at your attention in particular, but, in general, the more attention the better.

Of course, that was always sort of the case. Vendors in markets from time immemorial have shouted at passers-by to try to get them to buy their goods. Then there was advertising, which obviously didn't start with the internet. Getting someone's attention has always been the first step to getting some of their money.

As with other aspects of life, in some ways what the internet has done is simply speed everything up and expand it to a global scale. But social media especially has also introduced a new form of currency in the form of likes, followers, and so on. In the world of the bird app, or of the codex of visages, it's the man with many followers who's king.

And online publications like Quillette and Vox have, obviously, gained influence (and income) by gaining re-tweets rather than by selling collections of their articles in the form of glossy magazines. This has led to one of those little features of online life that runs up against the norms of anybody with a liberal education from more than 10 years ago: people refusing to share links to a piece they then condemn.

The reason that seems so weird is that there used to be a strong norm in intellectual life that you didn't hide books or articles or try to stop their circulation. Imagine if I'd said to my fellow students 15 or 20 years ago: 'I've read this book I strongly disagreed with but which has been influential; I therefore won't name it, and in fact I've withdrawn it from the library and hidden it under a bush so that it won't circulate further.' They would have thought I was bonkers, not to mention that I was curtailing open debate and infringing on their own freedom to read what they saw fit.

I do think all these points still hold today for those who refuse to link to pieces they hate; but I can also see where they're coming from. They're right that clicks help websites, in a way that lending someone a book didn't help Penguin or Anthony Kenny. They're anxiously aware of the importance of attention in the new ecosystem we now live in - and they're keen to deny that oxygen to their enemies.

All of this might, to some extent, help explain the recent vogue for de-platformings - that is, preventing people from hearing someone speak on campus because you don't like them or what they have to say. This kind of thing is yet another phenomenon that even people as young as this blogger tend to find pretty peculiar - it's another thing that, I think, most of my fellow undergraduates in the first few years of the millennium would have seen as obviously not the way to behave.

One way of looking at what's going on with de-platformings is to think about what the de-platformers think they're doing. One of the things they think they're doing, I would submit, is akin to not linking to a Quillette article. They see their campuses like their chirrup or mug-scroll feeds, and they don't want them to contribute to funnelling more attention towards Christina Sommers (or whoever). 

I remember hearing one of the bullies who tried to shut down Sommers' talk at Lewis & Clark Law School saying something like 'You already know what she's going to say from YouTube.' Again, that student was still wrong to act in the repressive way she did - Sommers still had a right to speak, and the students to hear her - but I think I now understand a bit more about why she was acting as she was.

Back in the day, Bjorn Lomborg (or whoever) coming to speak was interesting partly because you got something you didn't get from reading his articles or books. Nowadays, you can easily access recordings of public intellectuals online. But the mention of YouTube, I think, also suggests that the student was thinking of Sommers' appearance very much in internet terms. She didn't see the talk as a source of ideas or as an experience - she saw it as a kind of bid in a game whose point is to amass the most attention-chips. She saw it as she might have viewed a fellow student sharing a Sommers YouTube talk on social media. 

Christina Sommers giving a talk, Bruce Gilley publishing an article - back in the last millennium, the obviously correct reaction to such things, for most sane individuals, even those who disagreed with them, would have been: not much. Maybe they would have gone along and asked a critical question; maybe they would have written a letter to the student paper. Other people paying attention to such things didn't seem much of a threat.

In the online world, though, especially on social media, life is a high-stakes (OK, low-stakes, but it feels high-stakes) battle for attention. Attention accruing to your ideological rival empowers them and thus seems to threaten your own views and values. This economy of attention has become a kind of vortex, not only sucking previously rather somnolent groups like classicists into it, but also exerting its sucking effect on what's left of the offline world. The online economy of attention is sucking at our universities like a horrific hair-cutting 'solution'; and it's sucking at our will to let people explore ideas.


Saturday 26 September 2020

Be with us now

 


In the depths of the lockdown, in the middle of my fortnight's stay at a quarantine hotel, I saw my friend. He was standing there at the end of my bed. He was smiling, and exuding the same bonhomie as ever. But the feeling I had upon seeing him wasn't joy. Why not? Because he had died a few months earlier. 

If I were living in a less rationalistic culture - any culture other than the one I do live in - I have no doubt I would be talking about that episode as the visitation of a ghost, spirit, or angel. As it is, I'm more inclined to believe it was a dream. Though maybe a dream of a particular sort, born of particular circumstances.

I'm talking about lockdown dreams, the particularly vivid dreams that people have been reporting after weeks of being cooped up at home or in a hotel, sometimes without seeing another living person for weeks on end. These dreams come in different shapes and sizes, and not all of them involve people, but the ones that do suggest an obvious explanation. Are these dreams the result of our brains' effort to make up for the lack of human contact by providing us with the images of our friends?

It's interesting to me that, in the same period that I had the dream I mentioned, I was also praying to Mary with the rosary, something I'd never done before in my life (I've never been a Catholic, and I'm not one now). There were other reasons for that (I'd just been in a city with some beautiful Catholic churches, where I'd been exposed to and drawn to the practice), but it has struck me that it is a style of meditation that involves, first and foremost, calling upon a figure, a personality, a person.

Prayer, of course, often works in this way. Christians call upon God, Jesus, Mary, and various saints. Muslims call upon Allah. Buddhists call upon the Buddha and numerous bodhisattvas and spirits (and sometimes even visualise them as a form of meditation). Ancient Greeks who were ailing would call upon the healing god Asclepius and then go to sleep in one of his sanctuaries, where he would appear to them in dreams.

There are many reasons why people pray, but one may just be loneliness. We want another presence in the room, in our lives, for the night. In a sense, religious activity is a way of inviting people over, for dinner, say, and is often figured as such - the Greeks imagined the gods enjoying the smoke from their sacrificial feasts, and the Christian Eucharist re-enacts the Last Supper, seeing Christ as really (or symbolically) present once again. 

The many different forms of religious ritual obviously imagine different sorts of togetherness with different supernatural guests. And, as with ordinary guests, we may want to invite them over for different reasons. We may want to invite over someone powerful and reassuring, someone who will allow us to sleep with some sense of safety. We may want a mother-figure to smile down on us and tell us everything will be alright. We may want a raucous fellow-reveller like Dionysos.

None of this is to suggest that sending out invitations of this sort is necessarily a silly thing to do, even if we don't happen to believe that any of the guests are really going to be there. Whether or not we find it silly may, in any case, in some sense be neither here nor there. It may simply be something we humans do during lockdowns, in the desert, in the hour of our death. We find other ways of having our friends over, other ways of seeing them. 



Friday 14 August 2020

Harsh but unfair

 


I remember reading once, in a book on Athenian law (perhaps this one), that anthropologists had observed that in societies where criminals were less likely to be apprehended, penalties were harsher. It made sense; after all, modern developed countries, with their highly developed surveillance technology, have (by historical standards) strikingly lenient punishment regimes; pre-modern ones, by contrast, which had zero or only rudimentary policing, had more of a tendency to turn to the gallows, the guillotine - or the gulp of hemlock.

The observation came back to me recently in connection with the current vogue for 'cancelling.' The frequency of this phenomenon has been questioned, but what seems to concern many people isn't necessarily how widespread it is, but how harsh the punishments can be. A disabled grandfather is sacked for sharing a comedy sketch. A researcher loses his job for re-tweeting a study about the effectiveness of peaceful compared with violent protest. And all the while, not-especially-controversial views and tame jokes elicit the kind of fury that used to be reserved for blood feuds. 

Given the many instances of such 'cancellations' that have occurred, it might seem strange that a good few people continue to insist that the whole phenomenon is made up. But there might be a way of explaining both why they think that and why some of these same people engage in such disproportionately harsh punishings of individuals who violate their norms.

The reason they think the free speech crisis isn't really a crisis is partly because they see people saying things they dislike all the time. That's been one of the effects of the explosion of social media: whereas twenty years ago you wouldn't often be exposed to views from outside your thought-world, and you'd have to put in some work to have your views broadcast, now it's easy to post things and even easier to see things others have posted. 

If you have narrow parameters for what ideas are acceptable, it follows that you're likely to see quite a lot of what are to you unacceptable ideas. Twitter must be terrifying - all those people saying things you think are terrible! What's more, most of them are getting away scot-free.

The temptation, then, is to make an example of anyone you are in a position to punish, pour décourager les autres. This is what ancient societies were up to as well. It makes sense, especially if you consider the point of view of the potential criminal.

You can look at risk as the combination of how likely a bad thing is to happen, and how bad it will be if it does. You may not be that likely to fall off the cliff if you go right up to the edge, but if you slip you'll die, so why risk it? If you're in a society without a functioning police force, the chance you'll be apprehended for doing something bad is pretty low. One way for the state to increase the risk you face (and hence deter you from wrongdoing), is to increase the penalty you risk facing. You think you probably won't get caught, but if you do...
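To spell out the arithmetic behind that intuition: if the expected cost of wrongdoing is roughly the chance of being caught multiplied by the penalty, then a society that can rarely catch offenders can keep the expected cost constant by raising the penalty. A toy illustration, with numbers invented purely for the sake of the example:

```python
# Toy deterrence arithmetic: expected cost of an offence
# = probability of being caught * penalty if caught.
# All figures below are made up for illustration.

def expected_cost(p_caught: float, penalty: float) -> float:
    return p_caught * penalty

# Well-policed society: likely to be caught, mild penalty.
print(expected_cost(p_caught=0.8, penalty=10))  # -> 8.0
# Poorly policed society: unlikely to be caught, harsh penalty.
print(expected_cost(p_caught=0.1, penalty=80))  # -> 8.0
```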

The temptation to make an example of someone might be especially great when there are artificial barriers in the way of punishing other people who are up to the activity you dislike. For example, if a lot of the people saying things you find unacceptable are represented by anonymous Twitter accounts. Or if there's been a state amnesty saying you can't punish any of the members of a tyrannical junta.

That last thing, of course, is what happened in Athens after the murderous regime of the so-called Thirty Tyrants. Once the democracy had been restored, there seems to have been an agreement not to prosecute anyone involved with the Thirty, except for the Thirty themselves (some of whom had already been killed in the process of restoring the democracy). (What exactly the amnesty required is, like most things in ancient history, a little bit controversial.)

In 399, only four years after the Thirty had been toppled, the philosopher Socrates, who had links to some of the Thirty (including Critias, one of the more extreme members), was executed on a vote of a popular jury. Why? It's complicated; there were lots of factors that led to that outcome, including the way he went about defending himself (if that was even what he was up to) in court. 

But one possibility is that his prosecutors indicted him on a charge of inventing new gods and corrupting the youth precisely because they couldn't prosecute him for what they were really angry at him for - the actions of the Thirty. And they also couldn't prosecute many of the people who they knew had collaborated with the Thirty. Nor could they prosecute Critias and others who were already dead. But Socrates was there, still going about his business asking irritating questions in public...

Note that the theory, if it's right, explains not only the excessiveness of the punishings but also the way they have of mistaking their object. At least, it looks an awful lot like all of the guilt for something is being loaded onto the back of one unfortunate person who happens to be in the wrong place at the wrong time. That, of course, is another phenomenon that's familiar to anthropologists: scapegoating.

One way to stop this sort of thing, as you might guess, is to get better at apprehending wrong-doers. But it's very questionable in cases like the ones mentioned above (sharing comedy sketches and so on) whether anyone's done anything wrong at all. Another way is to reduce narrow-minded people's exposure to views they find distasteful. 

Doing that by force would be wrong (people should be free to go on social media, of course), but it might be advisable, considering the kinds of moral risks involved, for some people to think twice about the amount of time they spend online. In other words, if you can't deal with different ideas, it might be best just to stay off Twitter. Otherwise you might find yourself with a cup of hemlock in your hand - handing it to an innocent person.