If its recent record is any indication, Winston Churchill might have been wrong about democracy.
There is no more satisfying description of democracy than Winston Churchill's declaration that it "is the worst form of government except all those other forms that have been tried from time to time." Among compliments, backhanded ones are the loveliest, first making a show of retreating and then, like a boomerang, returning to hand. One cannot deny that democratic electorates occasionally lurch into unfortunate decisions, but Churchill consoles us that other systems are prone to worse. (He also thereby consoles himself two years after being turfed out of office by an electorate asking, "Yes, but what have you done for us lately?") The theoretical case for democracy is not undermined by such wayward episodes because the system's excellence is comparative, not absolute. Democracy embodies the virtue of moderation in its self-equilibrating disposition to retreat from perils that hurl other political systems off the rails.
I have long thought of myself as a Churchillian democrat. I do not believe in the innate wisdom of the masses or in a Rousseauian "general will" that unerringly directs the affairs of a politically engaged citizenry. Rather, democracy's chief achievement is that it makes it possible to peaceably throw the old rascals out. Mistakes are not eliminated but more or less rectified. There is nothing heroic about the regular alternation of office among contending parties, but it is better than the various despotisms that would otherwise arise. And perhaps some heroism is to be seen, after all, in the pattern of democracies in times of crisis bringing to the fore noble exemplars, such as Washington in the 18th century, Lincoln in the 19th, Franklin Roosevelt in the 20th—yes, and Winston Churchill up from the wilderness just in time to save Britain.
Sometimes democracies don't merely sputter; sometimes they fail disastrously. As Thucydides informs us in his History of the Peloponnesian War, Athenian democracy under the stress of a long war with Sparta bungled things with the imperial overreach of the Sicilian campaign (and then, as Plato reports, compounded the error with the trial and execution of Socrates). German democracy a decade after the Great War produced one unsatisfactory government after another until in 1933 it elevated Adolf Hitler to the chancellorship.
As distressing as these events are, they do not seriously undermine the Churchillian dictum. The former episode was very, very long ago, and the latter occurred in a country that had only superficially put on the vestments of democracy over its traditional autocratic garb. These and similar cases constitute evidence that democratic theorists need to analyze carefully. But they do not disconfirm the conclusion that, among imperfect constitutions, democracy is the least bad.
My Churchillian inclinations have, however, recently taken a buffeting. In 2016, the electorate of Churchill's own country defied pollsters and enlightened opinion to endorse withdrawal from the European Union. In the same year, the Philippines brought to the presidency Rodrigo Duterte, a strongman who, to the general approval of the citizenry, has superintended the slaughter of thousands of individuals alleged to have some connection to traffic in drugs. Brazil recently elected a president with similar views about the merits of extrajudicial executions and who looks back longingly at his country's days of military dictatorship. Hungary, liberated within living memory from jackbooted totalitarianisms, elected and then re-elected its own strongman, Viktor Orbán, who explicitly praises "illiberal democracy." Poland has done similarly. In the postwar era, Turkey has been championed as the exemplary Muslim state, a staunch member of the North Atlantic Treaty Organization and, in between the occasional military coup, a tolerant, open democracy. However, popular elections have repeatedly endorsed Recep Tayyip Erdogan, first as reforming prime minister and now as a caliph-in-waiting who smashes all other centers of power. Italy, always whimsical in its political ways, has recently returned a government of left-populists combined with right-populists. And then there was a certain U.S. 2016 election.
This is by no means a complete list of recent democratic deviations, and admittedly it omits some cheering upsets: a French government that at a stroke sidelined all the tired, established parties; Malaysia ejecting the party that had ruled uninterruptedly for six decades since independence. Nonetheless, and with all due qualification, something more may be disturbing democratic waters than the usual tides and eddies. Is there any call for Churchillians to be worried?
In a word, yes.
I fear that features endogenous to contemporary democracy create a propensity toward decline and, unless checked, decadence. I shall be very pleased to be shown that this judgment is based on hasty observation and faulty reasoning. But first a primer in the anatomy of democracy.
Since antiquity, it has been well-understood that democracies, more than any other form of rule, are susceptible to the disease of demagoguery. A would-be leader with fire on his tongue can capture, at least for a while, the rapt allegiance of the citizenry. When democracy was reborn in modern times, its architects, knowing this, tried to immunize it from the demagogic disease by imposing republican structures, two in particular: First, rule of the people is exercised through elected representatives rather than via a vote of the whole. Second, governance is not unitary but rather exercised through a division of powers such as a separate legislature, judiciary, and executive.
These two attributes admit of endless variation, and beyond them other features may provide a regime its distinctive form, such as a written constitution, an authoritative listing of rights, a tradition of customary law, and so on. To qualify as a democracy, however, rule must in some way be founded on the expressed and regularly re-expressed will of the people via the ballot box.
Theorists have identified two potentially undermining flaws in this model: rational ignorance and rational abstention. The former means it is almost never in voters' direct material interest to become better informed about the candidates and issues competing for their support. Time invested in political investigation is costly; it expends energy that could have been used in alternative pursuits. Moreover, study may simply confirm one's original untested hunches, in which case the effect on the direction of one's vote is nil.
Finally, even in those cases in which study leads one to vote more shrewdly, in an electorate with hundreds, let alone hundreds of thousands, of other voters, it is exceedingly unlikely that one's own ballot will swing the election. (Even in the exceedingly rare case in which it does, the likelihood that one may be wrong about which candidate best serves one's interests is non-negligible.) For these reasons it is almost always more profitable to study which car to buy, which stock to invest in, which video to view, or which person to marry than to bother unduly about which candidate to back.
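The improbability behind that claim can be sketched numerically. Under a simple binomial model (a textbook illustration, not the author's own calculation), one's ballot matters only if all the other voters split exactly evenly:

```python
import math

def tie_probability(n_other_voters: int, p: float = 0.5) -> float:
    """Probability that n other voters split exactly evenly, so that one's
    own ballot breaks the tie. Simple binomial model: each other voter
    independently votes 'yes' with probability p. Computed in log space
    to avoid floating-point underflow for large electorates;
    n_other_voters must be even for a tie to be possible."""
    n, k = n_other_voters, n_other_voters // 2
    log_prob = (math.lgamma(n + 1) - 2 * math.lgamma(k + 1)
                + k * math.log(p) + (n - k) * math.log(1 - p))
    return math.exp(log_prob)

# Illustrative (not empirical) electorate sizes:
print(tie_probability(100))              # ~0.08: plausible in a committee
print(tie_probability(1_000_000))        # ~0.0008, and that assumes a
                                         # perfectly even electorate
print(tie_probability(1_000_000, 0.51))  # a mere 51/49 lean makes
                                         # decisiveness vanishingly unlikely
```

The specific numbers are assumptions chosen for illustration; the point is how quickly the probability of being decisive collapses as the electorate grows or leans even slightly to one side.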
To the extent electors are rational, they will accumulate shockingly little political knowledge. By similar reasoning, individuals find themselves with much less reason to take time and effort to vote than to use that time for more productive activity. The average voter has only a hazy understanding of what is at stake and has almost no chance of being decisive on the outcome anyhow. To the invitation to exercise the franchise the rational individual will respond: Thanks, but no thanks.
Rational ignorance and rational abstention, though, can seem most conspicuous by their absence. In some countries—e.g., Australia—voting is mandatory. In the United States it is not, yet at every election tens of millions of citizens regularly leave the comfort of their homes to cast a ballot. Leaving aside the degree of their wisdom in matters political, it is incontestable that they voluntarily consume enormous quantities of political (mis)information via newspapers, 24-hour cable news networks, Facebook posts, and disputation with the man on the next barstool. Pundits, being pundits, will declaim that this is not nearly enough. Nonetheless, even average voter consumption of political information is orders of magnitude greater than would be predicted by an economically grounded account of voter rationality, the branch of political economy that goes by the name public choice theory. That theory is embarrassingly unable to answer satisfactorily the questions: Why do individuals bother to vote? Why do they direct so much of their consciousness to political matters?
No doubt there are many correct partial answers. Some people misjudge the likelihood of their own votes tipping the balance. Some have been propagandized by their high school civics teachers into believing that they have a moral duty to cast informed ballots. Others fear being chastised by friends or family for displaying civic laziness. Not inconsistent with any of these explanations but transcending them is that most voters want to "have a say." That is, voting is not so much a calculated effort to bring about political outcomes as it is an expressive act valued in its own right. Always and everywhere we are a species that relishes admiring and deploring, supporting and opposing, sometimes instrumentally but also for its own sake.
From our Paleolithic ancestors' painting the caves of Lascaux through graffiti on city walls and public restroom stalls and now via endlessly accumulating Twitter storms, we project ourselves onto the world. All political activity does so, but what is distinctive about voting as opposed to running for office or lobbying in support of a bill is that it is almost entirely expressive. It is more like cheering on a sports team than it is like buying or selling in the market. The fan devoutly desires a particular outcome, but she does not scream at the television set because she believes it will enhance that outcome's likelihood.
Nor are voters indifferent as to who wins or loses, else they would have other uses for their time, but their expressions of support are not primarily instrumental. Voting booths are by no means the only venue in which support is expressed, but unlike taverns or social media networks, they are a solemnly institutionalized forum in which matters of great public moment are determined. As a voter you partake of the emotional benefits that come from civic engagement even if you know you aren't deciding the outcome.
It is a commonplace of political science that individuals tend to vote their interests. Nothing said above denies that truism, but it does indicate the need to distinguish between material interest and expressive interest.
To a considerable extent these coincide. I am a teacher, so policies that benefit teachers benefit me. I therefore am apt to support them as a matter of prudent self-regard. But it is also likely, indeed true, that I regard education as a practice that merits social esteem. Regardless of the extent to which this attitude has been nurtured by my own career path, it holds out the opportunity of expressive returns not linked to my bottom line.
I will sincerely and unreflectively do things like cite Socrates on the worthlessness of an unexamined life, thereby reaping the emotive reward of standing up for my values. But although material and expressive interests largely coincide, it is crucial for understanding the workings of democracy to recognize that sometimes they do not. That is true for the generic form of republican democracy, but it is especially salient, I suspect, for understanding democratic distemper in the second decade of the 21st century.
Some expressive acts are very costly. Telling your boss to her face what you think of her might be enormously gratifying but might also carry consequences you prefer not to bear. So instead you keep your mouth shut or inaudibly mumble to yourself. One generic feature of casting a ballot in a large-number electorate is that it is very inexpensive. Once you actually bestir yourself to visit the polling station, all you give up by your ballot is the opportunity to have voted for the other side; influence over outcomes is virtually a nonfactor. This means that expressive acts that in other environments come at a high price can be available almost cost-free in the voting booth.
For example, to express support for the cause of the poor by putting $100 in the Salvation Army kettle may be in accord with one's charitable ideals, but unfortunately it costs $100 of alternative consumption forgone. On the other hand, to vote for a measure that would tax one $100, with the proceeds going to the poor, costs not $100 but only the opposing vote forgone, which, when calculated in expected value terms, is apt to be worth only a small fraction of a cent.
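The expected-value arithmetic behind that "fraction of a cent" can be made explicit. The odds below are a hypothetical assumption for illustration, not a figure from the text:

```python
# Expected cost to a voter of supporting a $100 tax on herself.
# If the measure passes, she pays the $100 regardless of how she voted,
# so what her *ballot* costs her is $100 discounted by the (assumed,
# illustrative) probability that her ballot is the decisive one.
tax_dollars = 100.0
p_decisive = 1 / 10_000_000   # hypothetical odds of casting the tie-breaker
expected_cost_cents = tax_dollars * p_decisive * 100
print(f"{expected_cost_cents:.4f} cents")  # prints "0.0010 cents"
```

A thousandth of a cent buys the expressive satisfaction of standing with the poor, which is the asymmetry the passage above turns on.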
Lofty moral ideals are a natural for ballot booth expression. But so too are animosities. As noted previously, face-to-face confrontations can prove costly. To vote against the despised Other may provide considerable expressive satisfaction at almost zero expense, especially under conditions of ballot anonymity. (This is one advantage that voting still possesses over many social media ejaculations.) Accordingly, we can expect under standard conditions of democratic voting greater extremes of both benevolence and malevolence than tend to come to the fore when individuals bear substantial costs for their own expressive activities.
A further corollary is that individuals will be more likely to visit the polls when expressive returns in one or another direction promise to be high than when only undramatic business as usual is in the offing. Actors further up on the political food chain, such as candidates and party operatives, will be aware of this calculus and therefore will maneuver to secure votes by appealing not only to material interests but also to emotively rich factors. Because political competition is essentially a zero-sum game, when one side raises the expressive ante it is incumbent on the other to follow along. Doing so simultaneously addresses the rational abstention problem—staying home becomes less attractive if a visit to the polls allows people to satisfyingly vent their spleens—and also provides reasons to direct the vote this way rather than that.
Indirectly, that heightening of emotion also responds to rational ignorance. Human beings tend to spend more time and effort learning and thinking about that which is expressively meaningful to them (for example, the averages of the players on their beloved team) than about that which may be significant for their material interests but engages their emotions only lightly (for example, the preferred policies of local sewage board members).
My hunch is that in recent democratic politics across much of the world, expressive stakes have tended to tip increasingly away from the positive and toward the negative. That is to say, the joys of providing a comeuppance to deplorable Others supplant beneficence or disinterested moral yearnings as the dominant propulsion behind ballots. Moreover, there is reason to fear that this is not a temporary aberration that will soon be righted but rather a long-wave alteration in democratic propensities that will create an increasingly hostile environment for national and, especially, international comity. It would be a matter of enormous relief to me if this hunch is mistaken, but wishful thinking is not evidence.
The first and most obvious evidence for this fear comes from the outcomes themselves. It is not terribly uncommon for election campaigns to end in upsets. In American history, the 1948 defeat of Dewey by Truman has become a classic, in no small measure because of the Chicago Tribune's embarrassingly premature headline ("Dewey Defeats Truman"). Churchill's own defeat three years earlier was of a similar order. One established party more or less surprisingly supplants another, and business goes on. Not so for recent events. Britain's half-century-long integration with Europe is jarringly thrown into reverse. For decades Italy had seemed to occupy the frontier of political whimsy, and yet it now leaps miles further along into the surreal. Turkey, Poland, and, yes, the United States confuse and confound. The observer is not merely surprised but aghast. Tectonic political plates have shifted under one's feet.
Second, insularity is in the ascendancy. One way to lose an election is to be less vigorous than one's opponent in decrying the baleful influence of outsiders or, even worse, their attempts to become geographical insiders. Xenophobia is, of course, nothing new in political practice. The dynamics of Us vs. Them is repeated endlessly, both in war and in peace. Often this phenomenon proves tragically negative-sum. After two world wars in the 20th century (and one Cold War that flamed dangerously hot at its fringes), electorates and their governments increasingly replaced their animosity toward the Other with embrace of practices that promised mutual gain. The Europe that had twice torn itself asunder formed first a trading bloc and then a community. Its free countries joined the United States in a defensive alliance that continues to endure. Tariff barriers were systematically lowered via the General Agreement on Tariffs and Trade and then massaged further by the World Trade Organization. Polities freely voted to tax themselves to provide aid to impoverished developing countries.
Recently, though, centrifugal forces have reasserted themselves. All across the world of rich democracies, even unto the pure lands of Scandinavia, resentment against would-be entrants gains political potency. It is important to note that while some of these moves seem to promise material advantage, many do not. Rather, the beggar-thy-neighbor impulse takes precedence even over one's own bottom line.
Third, rejection of foreigners is accompanied by a similar rejection of domestic experts, hereafter known as "so-called experts." For whatever statistics and analysis they present, equal and opposite items can be put forth to counter them. The technical term for these is "alternative facts." So, for example, when business leaders and economists laden with many advanced degrees declare that exit from the European Union will lower Britain's national income, other "experts" retort that hundreds of millions of additional pounds will be made available to the National Health Service. Voters are free to pick whichever set of facts fits their expressively favored choice.
At the turn of the century, postmodernist relativism enjoyed currency only within the rarefied air of the humanities departments of elite universities. Altogether unexpected was that it would come to capture the masses. Not long ago, democracies were disproportionately influenced by the considered opinion of the nation's elites. For better or worse, deference has turned to disdain. It becomes hard to utter the phrase "best and the brightest" without irony. If one aspect of democracy is acknowledgment of equality among citizens, then assigning equal epistemic credit to the opinions of all may be the ultimate progression of the democratic ethos. When high school ne'er-do-wells constitute a majority, why should they not have a turn at defining reality? I freely admit that I find this prospect appalling, but as the holder of advanced degrees I would, wouldn't I?
A fourth reason to fear that these changes in democratic functioning are likely to linger is that technological developments—I refrain from calling them advances—have thrust credentialed opinion makers from the commanding heights of the communication infrastructure. With the coming of the internet and the social media that surf upon it, the balance of contending forces has changed, sometimes decisively. Those with an interest in getting out a message that contends against the respectable consensus have far more opportunity to do so than in the era of furtive street corner exchanges and samizdat. Epistemological priorities have been reversed; instead of interested parties having to form their platforms in light of the facts that have been disseminated, "facts" are now manufactured to order and communicated instantaneously. Democracy has never been viewed by theorists as best suited for careful analysis prior to decisive action, but under contemporary conditions, democratic populaces operate less as a deliberative community than as a conditioned reflex.
Individuals across the social spectrum are more able than ever to avail themselves of data and considered opinion that were once the purview of an educated elite. Critics might argue that this is not a negative in assessing the success of rule by the people—rule by the whole people. I admit my own inability to make social media my friend may have biased me in a discreditably reactionary direction. I do admire and approve the way in which modern communications have lubricated the efforts of people endeavoring to remove oppressive boots from their throats, such as during the all-too-brief days of the Arab Spring. Nonetheless, it is not at all obvious that the overall tendency of Facebook, Twitter, and the rest has been for the good. It ought to be recalled that the greatest previous eruption of voices celebrated in our tradition is the Tower of Babel.
Very possibly these fears are overstated. Generalizing from a limited number of cases is hazardous. Then again, the number has not really been so limited. To be sure, it is statistically normal to see occasional bunchings of anomalous cases, whether hurricanes, 50–1 shots galloping home first, suicides, or political upsets. It is not, however, merely the numbers but what can be discerned as underlying these episodes that ought to give one pause. Economic, political, and epistemic disturbances have grown in force and show no signs of abating. This is not a call to raise the white flag, but any end-of-history triumphalism based on very recent experience is out of place. Although hindsight flattens out the terrors of previous days, the rise of dictators in the 1930s was not so very long ago, and not until 1989 did totalitarianism offer clear indications of being on its way out.
Moreover, unlike during these earlier periods, a competitor presents itself to many as an appealing alternative to democratic liberalism. Compared to Nazi Germany, China seems positively cuddly, and its ratio of millionaires to gulag prisoners is many times greater than the Soviet Union ever achieved. As democracies fray further around the edges, attraction to the Chinese model and other illiberalisms can be expected to wax. It gives me no pleasure to conclude with an admission that even we Churchillians have grounds for concern that the 21st century may not be kind to our cause.