24/07/2020

Society Is A Broadband Network







In the late 1920s, the political philosopher and jurist Carl Schmitt, who would later join the Nazi Party, developed a theory of democracy that aimed to improve on the liberal version. In place of elections, representatives and parliaments, all talk and gutless indecision, Schmitt appealed to the one kind of expression that people can make for themselves: acclamation. The public should not be expected to deliberate or exercise power in the manner that liberals hoped. But they can nevertheless be consulted, as long as the options are limited to ‘yea’ or ‘nay’. The public can ‘express their consent or disapproval simply by calling out’, Schmitt wrote in Constitutional Theory (1928), ‘calling higher or lower, celebrating a leader or a suggestion, honouring the king or some other person, or denying the acclamation by silence or complaining’. ‘Public opinion,’ he continued, ‘is the modern type of acclamation.’

A host of new instruments were developed to capture this ‘modern type of acclamation’, though few of them held much interest for Schmitt. Representative sampling was designed by statisticians in the 1920s, making it possible for social scientists to discover the attitudes of millions of people by surveying just a small – but mathematically representative – fraction of them. A new industry of opinion polling, audience research and market research grew over the course of the 1930s, led by companies such as Gallup. The question of whether ‘the people’ favoured or disfavoured a particular policy or institution became a matter of intense political and public interest. Other new methods included focus groups and clunky mechanical interfaces by means of which participants would register their opinion of a song, advertisement or film as they were witnessing it. This new research industry operated largely within the parameters proposed by Schmitt. The topics and questions would be determined by whichever authority – commercial or political – was looking for answers. The respondents had the status of an audience, cheering or booing, agreeing or disagreeing, depending on what was dangled in front of them. In a plebiscitary democracy, power lies with the person who designs the questions.

About seventy years later, a new set of innovations arrived. The news aggregator website Reddit was launched in 2005, allowing users to share links with one another by means of a feature that echoed Schmitt’s vision of a people ‘calling higher or lower’: contributions could be ‘up-voted’ or ‘down-voted’ by other users, determining their prominence on the site. In 2008, an analogous technology was introduced to the political arena. That year’s televised debates between the US presidential candidates were accompanied by an onscreen ‘worm’ reflecting the sentiments of a sample of undecided voters, fluctuating in real time over the course of the broadcast. The fortunes of the Republican candidate, John McCain, took a dive during the second debate in Nashville, when an off-the-cuff reference to his opponent, Barack Obama, as ‘that one’ caused a sudden surge of negative opinion, visible to TV audiences across America.

The rapid expansion and consolidation of social media platforms led by Facebook has driven the logic of the ‘worm’ into everyday life. In the shadow of the ubiquitous ‘like’ button, however, the alternative to enthusiasm is often – as Schmitt anticipated – ‘silence or complaining’. Photographs, restaurants, research papers, songs, products or opinions are compared on the basis of their relative numbers of ‘likes’. On Facebook, Twitter and Instagram, there is no equivalent of a ‘down-voting’ button (though there is on YouTube). Negative opinion is expressed either through a sheer absence of acclaim, or through outbursts of denunciation, which other users may in turn wish to ‘like’ or share.

The radical difference between the infrastructure overseen by Mark Zuckerberg today and the one rolled out by George Gallup in the 1930s is that we can all now potentially act as the pollster. Here’s my dog: like or dislike? Donald Trump is a fascist: agree or disagree? This is not the idealised classical or liberal public sphere of argument and deliberation, but a society of perpetual referendums. The perennial question, when it comes to so much up-voting and down-voting, is who can be bothered to ‘vote’ at all. The passionately positive and the passionately negative can usually be relied on to take part.




Some of this can be attributed to consumerism. A society that bestows sovereignty of choice on consumers faces two immediate problems. First, there is the business challenge of anticipating and influencing the exercise of that sovereignty. What do consumers want? Surveys and focus groups were among the tools developed in order to help mass producers tailor their products – and advertisements – to the desires of their target market. Opinion polling simply extended this method to the ‘sale’ of politicians and policies. The emergence of huge platforms, such as Facebook and Google, in the 21st century vastly expanded and fine-tuned this science of taste, but didn’t substantially alter its strategic objectives.

Second, how do we, the consumers, cope with the burden of this sovereignty? How do we know what’s ‘good’ and what’s ‘bad’? What if, confronted with a flood of ads, campaigns, trailers, logos and billboards, I still don’t know what I like? This is where star ratings, endorsements and marks out of ten come in handy. In a society of excessive choice, we become reliant on what the French sociologist Lucien Karpik has described as ‘judgment devices’, prosthetic aids which support us in the exhausting labour of choosing and preferring. Karpik studied such comfortingly analogue examples as the Michelin restaurant guide. Today we are inundated with quickfire judgment devices: Tripadvisor, Amazon reviews, Trustpilot, PageRank and all the other means of consulting the ‘hive mind’. The scoring systems they deploy are crude, no doubt, but more subtle than the plebiscitary ‘yes’ or ‘no’ imagined by Schmitt and now hardwired into many social media platforms.

The tyranny of binary opinion isn’t just a symptom of consumerism, but also an effect of the constant flow of information generated by the internet. It is not for nothing that, in the age of the digital platform, we use liquid metaphors of ‘feeds’, ‘torrents’ and ‘streams’ to describe the way images, sounds and words surround us. In the midst of an online experience of one sort or another, clicking a button marked ‘like’ or ‘dislike’ is about as much critical activity as we are permitted. For services such as Netflix or Amazon, the design challenge is how to satisfy customers’ desires with the minimum of effort or choice, largely on the basis of what they have liked – or not – in the past.

The unceasing pursuit of audience ‘acclaim’, in the form of rapid, real-time feedback, bleeds into the sphere of cultural production. Talent shows are evidence of what happens when the plebiscitary form is extended to entertainment: singing and dancing become contests, tests of vocal and bodily agility that eventually result in everyone straining for the same sound and look. Platforms such as Twitter and Instagram have a similar effect on the presentation of the self, where the goal is to win plaudits for instantly impressive slogans and iconography. Chunks of ‘content’ – images, screengrabs of text, short snatches of video – circulate according to the number of thumbs up or thumbs down they receive.

It is easy to lose sight of how peculiar and infantilising this state of affairs is. A one-year-old child has nothing to say about the food they are offered, but simply opens their mouth or shakes their head. No descriptions, criticisms or observations are necessary, just pure decision. This was precisely what Schmitt found purifying in the idea of the plebiscite, that it cut out all the slog of talking. But a polity that privileges decision first and understanding second will have some terrible mess to sort out along the way. Look at what ensued after 46 million people were asked: ‘Should the United Kingdom remain a member of the European Union or leave the European Union?’

Acclaim and complaint can eventually become deafening, drowning out other voices. It’s not only that cultural and political polarisation makes it harder for different ‘sides’ to understand one another, although that is no doubt true. It makes it harder to understand your own behaviour and culture as well. When your main relationship to an artefact is that you liked it, clicked it or viewed it, and your main relationship to a political position is that you voted for it, what is left to say? And what is there to say of the alternative view, other than that it’s not yours?

In June, the right-wing think tank Policy Exchange responded to the fresh wave of public interest in Britain’s violent colonial past by announcing a new research project, History Matters. It was launched alongside a call for people to ‘share their experiences and concerns about the ways in which history is being politicised, and sometimes distorted’. The launch also featured the results of a poll revealing ‘public concern over the rewriting of British history’, which included the revelations that 71 per cent of people oppose the Cenotaph being ‘used as a focal point for demonstration, vandalised or desecrated’, while 67 per cent of people oppose Churchill’s statue ‘being spray-painted with graffiti’.

At a moment when institutions at the core of British public life, from the Bank of England to the British Museum, from Oxford University to Lloyd’s of London, are opening themselves up to dialogue about their history and current arrangements, a project like History Matters flattens discourse into questions of ‘for or against’. The poll is a litany of idiotic questions, with binary choices between equally idiotic answers. Do you think British history is ‘something to be proud of’ or ‘something to be ashamed of’? Do you think ‘even if the historical figure used wealth gained from the slave trade for public benefit, their statues should no longer be allowed to stand’ or ‘it is unfair to make judgments about people in the past based on today’s values’? Sorry, those are the options. Hurry up and choose. Statues are themselves a way of ossifying acclaim and it’s not surprising that they become the focus of these divisions. Context is important. The statue of the former Manchester United manager Alex Ferguson outside Old Trafford wouldn’t go down so well – with Manchester City fans in particular – if it were placed anywhere else in Manchester. But by and large, the attempt to constrain how future generations allocate acclaim deserves to fail.

Policy Exchange protests that ‘history has become the focus of a new culture war [which] started on the political fringes’. They show no sign of wanting to end it. Once history itself becomes a matter of plebiscitary decision, we are assigned to cultural camps that we had no hand in designing, and whose main virtue is that the other camp is even worse. One stupid position (‘You can’t judge the past by the standards of the present!’) presumes its only marginally less stupid opponent (‘We must judge the past by the standards of the present!’). This turns an opportunity to address the myopia of the history curriculum and present the public with the complexities of their history into a matter of taking sides. The past becomes one more product to acclaim or decry.

The right understands how to play this ‘culture war’: identify the most absurd or unreasonable example of your opponents’ worldview; exploit your own media platform to amplify it; articulate an alternative in terms that appear calm and reasonable; and then invite people to choose. It isn’t all one-way traffic, of course. There is no shortage of progressive and left-wing opinion on social media aimed primarily at harming conservatives by misrepresenting them. One difference is that the left isn’t in control of the majority of the newspapers, though its opponents accuse it of controlling much else, from the BBC to universities.

The dilemma facing campaigns for justice is when to engage in such ‘wars’, or whether to do so at all. It’s hard to deny that focused efforts such as Rhodes Must Fall have had a rallying effect, while the evolution of Black Lives Matter would be unthinkable without the forms of ‘acclaim’ and ‘complaint’ that social media is so effective at propagating. The reason racism is being discussed by broadcasters, politicians and historic institutions as never before is largely thanks to publicity tactics that start with a smartphone video of an act of police violence and scale up from there. The challenge is to avoid conflating tactics with goals, as if movements for justice were solely concerned with imagery, reputations and statues. Conservatives and media outlets share a common interest in restricting politics to the level of sporting spectacle, occupying the space where other forms of inquiry and understanding might occur.

The outcome​ of all this is a politics with which Schmitt’s name is commonly associated, one that reduces to a base distinction between ‘friend and enemy’. The distinction itself is what counts, not whatever fuels or justifies it. From Schmitt’s grim perspective, the friend-enemy distinction is ultimately realised in the question: who am I prepared to kill and who am I prepared to die for? We are very far from this with regard to statues and national icons. Instead, the friend-enemy distinction has become a new type of ‘judgment device’, in which my preferences and tastes are most easily decided by the fact that they’re not yours. Things which you hate must ipso facto be good. It becomes embarrassing or even shameful to appreciate something, if the ‘wrong people’ are also praising it.


‘Tribalism’ and ‘populism’ have come in for plenty of stick over the past five years, especially from those at the liberal centre, who feel they are being squeezed out of discussions (and representation in elections). Some liberals still hope that the Covid-19 pandemic will re-establish a common political ground, within which debate will be had and evidence respected. The risk in framing things this way is that it places too much faith in a supposed political spectrum, its centre an Archimedean point of objectivity. But the centre can get dragged around. As Donald Trump demonstrated when praising ‘very fine people on both sides’ of the clash in Charlottesville in 2017, the notion of ideological equilibrium can be manipulated to the benefit of extremists.


The problem right now – exacerbated by the circumstances of the pandemic – is that when the past is the object of political conflict, the result is a tribunal convened to determine, once again, a binary question: guilt or innocence. Where the study of history might seek to discover, explain and understand, possibly to facilitate judgment, our current moment demands decision first, study later (if at all). Many conservatives imagine that when the British Empire and colonialism are taught in schools, it’s in order to spread shame or seek revenge, not because they are central features of the political, economic and social history of the past five hundred years. No doubt there are some activists, maybe even some scholars, whose primary relationship to this material is a passion for condemnation, but to abandon such an opportunity for public education on those grounds represents a terrible loss of nerve. What Britain sorely needs is not self-love, or self-hatred, but self-knowledge.



Melanie Klein identified as ‘splitting’ the psychic process whereby the self, unable to accommodate its own ‘bad’ aspects, projects them onto others. Terrified that one might be entirely and exclusively guilty, one adopts a position of exaggerated innocence and virtue, while attributing total and irredeemable badness elsewhere. Examples of this in the current ‘debate’ (if that’s what it is) are ubiquitous. Fearful of having to face up to an unbearable national guilt, the right projects its anxiety onto a culture of violent ‘wokeness’ which it claims is pulling society apart. Boris Johnson, a guilt-shedding maestro, derives his over-elevated status in public life precisely from his ability to accept no responsibility for anything he (or anyone else) has said or done.

The left, especially its more ‘online’ sections, suffers from its own version of this syndrome. Alongside sophisticated critiques of structural racism, renewed attention to the racialised and colonial foundations of global capitalism, and the increasingly detailed policy agenda of Black Lives Matter, eyeballs are invariably dragged towards the public shaming of unapologetic nationalists. Given that many of these targets thrive on outrage and provocation (otherwise known as trolling), this is hardly a good use of anyone’s time, but it provides further opportunities to ‘split’ off guilt from innocence. The online public sphere remains intoxicated by the prospect of the unambiguous baddie, whose condemnation will absolve others of all sin.

What is obstructed by such patterns of behaviour is a realisation that is integral to psychological maturity, as well as to many of the most important works of 20th-century social theory, from Max Weber to Hannah Arendt to Michel Foucault: guilt and innocence are rarely as easily distinguishable as we might like them to be. This is what it means for a problem to be systemic. Bad things don’t happen simply because bad people intend them; and good people often play an integral part in terrible political acts and institutions. To recognise as much is not to agree with Trump when he says there are ‘fine people on both sides’, but to make space for a politics that doesn’t start out with sides to be ‘up-voted’ or ‘down-voted’, and for a relationship to the past that refuses to be narrowed to manufactured media battles over Churchill’s statue.


Who am I prepared to kill? By William Davies. London Review of Books, July 2020.








What is society? The most notorious answer we’ve been given in the last forty years was a triumphant negation, uttered by Margaret Thatcher in an interview with Woman’s Own magazine in 1987: ‘There is no such thing!’ The left has ensured that Thatcher’s words have not been forgotten; the right has occasionally sought to remind people of her next sentence: ‘There are individual men and women and there are families.’ But does anything connect those individual men and women with those families?


The term ‘social’ is ‘the weasel-word par excellence’, Thatcher’s intellectual inspiration Friedrich Hayek wrote in 1979. ‘Nobody knows what it actually means.’ Hayek was in no doubt that patterns emerged in the behaviour of populations, and might eventually lead to a form of large-scale self-organisation. The best way of ensuring this happened was to build a communications infrastructure that would make it possible for millions of people to share information in real time. In Hayek’s view, that infrastructure was the price system of a free market. As the Covid-19 crisis was beginning to take hold in the US, the New York Times carried a story about a Tennessee-based online reseller, Matt Colvin, who had stockpiled 17,700 bottles of hand sanitiser in order to exploit surging demand, only to have his account suspended by Amazon. Colvin’s actions prompted widespread disgust. But wasn’t he just responding to price signals? Or does the ‘weasel-word’ mean something after all?




Perhaps the boldest conceptual vision of society belongs to Émile Durkheim. In Suicide: A Study in Sociology (1897), he argued that society was a ‘social fact’, which couldn’t be reduced to matters of psychology or economics. Individuals were constrained and shaped by trends and norms that were manifest only at a macro-level. By tracking variations in the suicide rate over time and across nations, Durkheim demonstrated that its level didn’t only have to do with economic welfare or individual choice. ‘At each moment of its history ... each society has a definite aptitude for suicide.’

The task of sociology, Durkheim believed, would be to study this kind of ‘social fact’, and its principal evidence would be statistics. Sociology, as he conceived it, was joining a political project that was already well underway (nowhere more so than in France) to analyse the nation in terms of measurable quantities: births, deaths and causes of death. Suicide was a useful case study for the sociologist, partly because it appeared on the face of it to be such a solitary phenomenon, but also because there was already plenty of international data on the topic. Not only could statistics reveal the various aggregates that make up a society, but, more important, they could identify the invisible norms binding us all together: the averages.

We are all Durkheimians now. Every day the headlines are dominated by announcements of the latest aggregates and averages from nations around the world. Statistics swirl about on social media, as people form speculations on the basis of their own mental arithmetic: what’s 1 per cent of this or 15 per cent of that, and who’s to say the 1 per cent isn’t actually 3 per cent, and why isn’t the South Korean average the same as the Italian one? But at their root, statistics are a combination of state-led data collection and probabilistic modelling. Demographic averages offer little security to the individual, unless they are accompanied by widespread solidarity and the sharing of risk.

None of this could be further from the idea of society that propelled Boris Johnson to power, and which is all but useless to him now. Brexit was fuelled by a desire for the society of a collectively imagined past, for a ‘nation’. While Remainers spoke of GDP and other macroeconomic indicators, Leavers offered the cultural symbols of a community that had supposedly been dissolved by multiculturalism and globalisation, and by their over-educated spokespeople. Yes, it was about stronger borders, but just as important was the right to be proud of flags, Britain and England. Nations promise plenty of solidarity, just not for everyone. Brexit didn’t need a committed Brexiteer at the helm; it just needed someone who was unapologetic about the collateral damage it would cause. A columnist with a dim regard for facts was the perfect person to execute a project whose chief aspect was imaginary.

Watching Johnson at recent press conferences, flanked by the government’s chief medical officer, Chris Whitty, and its chief scientific adviser, Patrick Vallance, you saw a man struggling with his every instinct. For decades a mischievous smirk, a joke here, a hair-ruffle there, have been enough to make newspaper editors, interviewers, Have I Got News for You audiences and cabinet colleagues putty in his hands. Now the man who hoped to be remembered for having Got Brexit Done is suddenly forced to take charge of managing a lethal epidemic. (Even so, he can’t quite shake the habit of a lifetime. On a conference call with the CEOs of sixty manufacturing businesses, urging a collective effort to produce more ventilators for the NHS, Johnson reportedly referred to the plan as ‘Operation Last Gasp’.)

The imagined community of Britain or England is on hold for the foreseeable future. While Britain shifts hesitantly onto a ‘war-footing’, the cultural and economic divides that split the nation in two in the summer of 2016 have been suspended, save for the self-separation of a privileged few who are able to escape to a remote island or hunker down in the country pile for a few months. The generational divide is the one that still counts above all, but it appears in a very different light now compared with just a few weeks ago.

What is society, then, to the likes of Whitty and Vallance, the men whom Johnson is said to be obeying so loyally? Ultimately it is a network, made up of billions of interconnected nodes. You can try to impose a nation on this network, as Donald Trump has done with his travel bans and his maniacal effort to buy a vaccine for exclusive use within the United States, but networks are governed by mathematical, not sovereign, laws. Society conceived as a network isn’t about aggregates or averages, but is a complex system through which trends, behaviours, memes, information and infections travel. There is nothing distinctively human or political about the laws of networks: as dots on a vast network map, we are no different from slime mould or animals – we become a herd.

This worldview also has a long intellectual tradition, dating back to Durkheim’s contemporary and critic Gabriel Tarde. From this perspective, society is a pattern formed from billions of interpersonal connections. Understand the micro-social networks – families, schools, pubs, workplaces and so on – and, with sufficient computational firepower, you can build up an image of the macro-system that emerges. In the 1930s, the social psychologist Jacob Moreno pioneered the mathematical study of social networks (initially known as ‘sociometry’), developing primitive maps and models of how interpersonal connections produce larger systems. With advances in software during the 1970s, the field of ‘social network analysis’ was born at the intersection of sociology, psychology and computer science.




Viewing the world in terms of networks was all the rage in the 1990s. Manuel Castells’s The Rise of the Network Society (1996) provided the most ambitious sociological account of why networks had become the dominant organisational form of the age. The concept of the network seemed suited to the densely integrated, multipolar world of globalisation. Airport bookshops filled up with titles promising to tell you all you needed to know about ‘emergence’, ‘power laws’ and ‘virality’ in the network age, with Malcolm Gladwell’s The Tipping Point (2000) at the top of the pile. The sudden enthusiasm for these ideas was partly a side effect of the internet’s arrival in everyday life, but also a reflection of their individualist foundations. Society, to a network theorist, is what emerges after everyone is left to go about their own private business. Thinkers such as Bruno Latour and Gladwell agreed with Thatcher up to a point: there is no such thing as society – there are nodes and there are links.

The injunction emerging from this worldview is that we should recognise the disproportionate potential of the small and marginal changes that often go unnoticed. The micro and the macro are brought together in a new and unpredictable intimacy. There is some ground for paranoia here. Networks can be completely overhauled by minor events that begin on their fringes. As the economist Branko Milanović recently tweeted, ‘the most influential person of the 21st century (so far): A Hubei farmer’. Gladwell’s book became required reading for market researchers, who unleashed new techniques of ‘viral marketing’ and ‘coolhunting’. This deference to the macro-potential of micro-changes also lay at the heart of ‘nudging’, the term coined by Richard Thaler and Cass Sunstein for scarcely noticeable government interventions that alter individual behaviour with minimal effort, cost or constraint, but significant social benefit. One of the best-known examples in Thaler and Sunstein’s Nudge (2008) is the flies painted on urinals in Amsterdam’s Schiphol airport: men unthinkingly aim at them, reducing the amount of urine that ends up on the floor.

David Cameron’s government, hungry for a new political idea but reluctant to rock any ideological boats, was quick to seize on ‘nudges’, along with its ‘Big Society’ vision of volunteering and social enterprise. What these things had in common was a dedication to moral and civic responsibility that didn’t require much of the state in either a fiscal or a regulatory sense. With a few tweaks here and there, society would be magicked into being, without costing any money or requiring any central planning. ‘There is such a thing as society,’ Cameron said. ‘It’s just not the same thing as the state.’ The Behavioural Insights Team (often known as the ‘nudge unit’) was born in 2010, before being spun off as an independent company in 2014.

It is unclear precisely how much influence the nudge unit has exerted over Britain’s response to the pandemic, but there has been widespread unease with the government’s comparatively relaxed approach. At first, its policy seemed heavily dependent on behavioural insights that warned against making drastic interventions too early, for fear that ‘fatigue’ would set in and people would drift back into social contact. In an interview with BBC News on 11 March, David Halpern, the psychologist who runs the nudge unit, made the alarming claim that the government was deliberately allowing infections to spread, so as to create ‘herd immunity’ among the young and healthy majority of the population, who could maintain regular working and family lives. Uniquely among governments around the world, the UK government appeared to be taking the view that you can’t defend society by obliterating social connectivity.


The government and its advisers came in for plenty of criticism for this laissez-faire approach, implicitly from the World Health Organisation, and explicitly from the Lancet and numerous prominent epidemiologists. Many believed that it was complacent in the extreme to rely on social network modelling and behavioural science, when epidemiological science suggested that the correct response was to enforce a complete shutdown of all social gatherings. With schools and public venues of all kinds still open, #BorisTheButcher began trending on Twitter and the allegation circulated that Britain was putting its economy above the lives of its citizens. While government experts were weighing up the various emergent side effects of government action (who looks after a doctor’s children once the schools are closed?), much of the public just wanted the government to act.

Its stance appeared to shift radically within a few days of Halpern’s interview, with the publication of a report by the Imperial College Covid-19 Response Team, which has been advising the government. The report confirmed what critics had been saying, that the only way of avoiding catastrophic loss of life and the swamping of the health system (its modelling had the demand for intensive care beds peaking at eight times capacity) would be to minimise social contact wherever possible. But it also warned that the only guaranteed way out of a broad social lockdown was a vaccine, which is probably at least 18 months away. You can blame a decade of Conservative administrations for the low level of intensive care beds and ventilators available in this country (compared to similar economies), and you can blame capitalism for the fact that most people depend on wages to live, but there was a certain sociological frankness in the nudgers’ judgment that a society such as Britain’s (whose state has sought to outsource, marketise and incentivise at nearly every opportunity over the past forty years) couldn’t be suddenly switched off without dire consequences for human welfare.

With Britain heading towards a shutdown, lasting who knows how long, it will quickly become evident how difficult it is to sustain society without everyday sociality. The triumph of the Thatcherite and Hayekian vision meant that we ended up with a ‘flexible’ economy in which a large number of people are entirely reliant on the near-term vagaries of the labour market for their day-to-day survival, with neither savings nor state guarantees to provide any back-up when that market crashes. Wages, rent, credit card repayments and everyday consumption are locked into their own ‘just-in-time’ supply chain, which is stressful enough even when it’s up and running. Having spent decades overhauling the welfare state to promote a more entrepreneurial, job-seeking, active populace, driven by an often punitive conditionality, Britain has little to fall back on when the most urgent need is for everybody to stay at home. The class divide between rentiers (those who accrue income without having to do very much) and the rest has immediately grown starker.



As everyone looks around anxiously in search of their ultimate backstop, we are witnessing a collision between rival ideologies of society. Communities look desperately to the state, while the state looks hopefully to communities. Who’s to say how many desperate young men, seeing the impotence of both, will instead turn furiously to the ‘nation’ as their last resort? If there is one institution that has stood as the symbol of society throughout most British people’s lives, it is the NHS. Nobody expects the safety net that it provides to hold adequately over the next three months. At some point something new will be born, for better or worse. Until that moment, society is a broadband network.

Society as a Broadband Network. By William Davies. London Review of Books, April 2020.








Why do so many of us no longer trust experts, facts and statistics? Why has politics become so fractious and warlike? And how can the history of ideas help us understand our present? In this episode Professor Will Davies, author of Nervous States: How Feeling Took Over the World, speaks to Carl Miller about the long history of how societies based on facts and reason were built and why they are now unravelling before our eyes.


Intelligence Squared+, May 26, 2020







William Davies, author of The Happiness Industry, which assesses the relationship between consumer capitalism, big data and positive psychology, talks with Chris Hoff about the effects of the science of well-being becoming entangled with economics and technology, and what this might mean for psychologists and therapists.

The Radical Therapist, June 29, 2020.





















