Thursday, September 29, 2016

Courage and free speech


Throughout human history, there have been individuals ready to risk everything for their beliefs

August Landmesser refuses to perform the Nazi salute during a visit to the Hamburg shipyards by Adolf Hitler in 1936. Photo courtesy Wikipedia
Timothy Garton Ash is a historian and writer. He is professor of European studies at the University of Oxford, Isaiah Berlin Professorial Fellow at St Antony’s College, and a senior fellow at the Hoover Institution at Stanford University. He is also the director of the Free Speech Debate. His latest book is Free Speech: Ten Principles for a Connected World (2016).
Published in association with Free Speech Debate, an Aeon Partner. Edited by Nigel Warburton.
‘Nothing is more difficult,’ wrote the German political essayist Kurt Tucholsky in 1921, ‘and nothing requires more character, than to find yourself in open contradiction to your time and loudly to say: No.’ First of all, it is intellectually and psychologically difficult to step outside the received wisdom of your time and place. What has been called ‘the normative power of the given’ persuades us that what we see all around us, what everyone else seems to regard as normal, is in some sense also an ethical norm.
Numerous studies in behavioural psychology show how our individual conviction of what is true or right quails before the massed pressure of our peers. We are, as Mark Twain observed, ‘discreet sheep’. This is what John Stuart Mill picked up on when he wrote in On Liberty (1859) that the same causes that make someone a churchman in London would have made him a Buddhist or a Confucian in Beijing. The same truth is gloriously captured in the humorous song ‘The Reluctant Cannibal’ (1960) by Michael Flanders and Donald Swann, in which a young cannibal revolts against the settled wisdom of his elders and declares that ‘eating people is wrong’. At the end of the song, one of the elders exclaims, to huge belly laughs all round: ‘Why, you might just as well go around saying: “Don’t fight people!”’ Then he and his colleagues cry in unison: ‘Ridiculous!’
Yet norms change even within a single lifetime, especially as we live longer. So as elderly disc jockeys are arrested for sexual harassment or abuse committed back in the 1960s, we should be uncomfortably aware that some other activity that people regard as fairly normal now might be viewed as aberrant and abhorrent 50 years hence.
To step outside the established wisdom of your time and place is difficult enough; openly to stand against it is more demanding still. In Freedom for the Thought that We Hate (2007), his fine book on the First Amendment tradition in the United States, Anthony Lewis quotes a 1927 opinion by the Supreme Court Justice Louis Brandeis, which Lewis says ‘many regard as the greatest judicial statement of the case for freedom of speech’.
The passage Lewis quotes begins: ‘Those who won our independence… believed liberty to be the secret of happiness and courage to be the secret of liberty.’ This is magnificent, although it also illustrates the somewhat self-referential, even self-reverential, character of the modern First Amendment tradition.
Lewis cites Brandeis, who credits this thought to the 18th-century founders of the US. But those founders would have been well aware that they got it straight from Pericles’ funeral oration during the Peloponnesian War of the fifth century BCE, as reported – if not invented, or at least much improved upon – by Thucydides. ‘For you now,’ Thucydides’ Pericles admonishes his ancient Athenian audience, after praising the heroic dead, ‘it remains to rival what they have done and, knowing the secret of happiness to be freedom and the secret of freedom a brave heart, not idly to stand aside from the enemy’s onset.’
More directly, the US tradition of courage in the defence of free speech draws on the heritage of the 17th-century English. People such as John Lilburne. In 1638, while still in his early 20s, Lilburne was found guilty by the Star Chamber court of helping to smuggle into England a tract against bishops that had been printed in the Low Countries. He was tied to the back of a cart on a hot summer’s day and unremittingly whipped as he walked with a bare back all the way from the eastern end of Fleet Street to Westminster Palace Yard. One bystander reckoned that he received some 500 blows that, since the executioner wielded a three-thonged whip, made 1,500 stripes.
Lilburne’s untreated shoulders ‘swelled almost as big as a penny loafe with the bruses of the knotted Cords’, and he was then made to stand for two hours in the pillory in Palace Yard. Here, in spite of his wounds and the burning sunshine, he began loudly to tell his story and to rail against bishops. The crowd was reportedly delighted. After half an hour, there came ‘a fat lawyer’ – ah, plus ça change – who bid him stop. The man whom the people of London had already dubbed ‘Free-Born John’ refused to shut up. He was then gagged so roughly that blood spurted from his mouth. Undeterred, he thrust his hands into his pockets and scattered dissident pamphlets to the crowd. No other means of expression being left to him, Free-Born John then stamped his feet until the two hours were up.
As an Englishman, I find particular inspiration in the example of Free-Born John, and those of all our other free-born Johns: John Milton, John Wilkes, John Stuart Mill (and George Orwell, a free-born John in all but name). More broadly, there is no reason to understate, let alone to deny, a specifically Western tradition of courage in the advancement of free speech, one that can be traced from ancient Athens, through England, France and a host of other European countries, to the US, Canada and all the liberal democracies of today’s wider West. But it would be quite wrong to suggest that this habit of the heart is confined to the West. In fact, there have been rather few examples of such sturdy defiance in England in recent times, while we find them in other countries and cultures.
Consider, for instance, the Chinese dissident Liu Xiaobo. Liu was sentenced to 11 years’ imprisonment in 2009 for ‘subverting state power’. Both his written response to the charges against him and his final speech in court are, like many of his earlier writings, lucid and courageous affirmations of the central importance of free speech. He definitely does not draw only on Western traditions. For example, in his book No Enemies, No Hatred (2012), he quotes a traditional Chinese 24-character injunction: ‘Say all you know, in every detail; a speaker is blameless, because listeners can think; if the words are true, make your corrections; if they are not, just take note.’
After paying a moving tribute to his wife (‘Armed with your love, dear one, I can face the sentence that I’m about to receive with peace in my heart’), Liu looks forward to the day ‘when our country will be a land of free expression: a country where the words of each citizen will get equal respect, a country where different values, ideas, beliefs and political views can compete with one another even as they peacefully coexist’. The judge cut him short in court before he had finished speaking, but free-born Xiaobo, like free-born John, still got his message out. In his planned peroration, Liu wrote: ‘I hope that I will be the last victim in China’s long record of treating words as crimes. Free expression is the base of human rights, the root of human nature and the mother of truth. To kill free speech is to insult human rights, to stifle human nature and to suppress truth.’
Liu was by this time famous, and that great speech made him more so. He was awarded the Nobel Peace Prize in 2010. But perhaps the most inspiring examples of all come from people who are not famous at all: so-called ordinary people doing extraordinary things. People such as the Hamburg shipyard worker who, at the launch of a naval training vessel in 1936, refused to join all those around him in making the Hitler salute. The photograph only achieved wide circulation on the internet more than 60 years later. There he stands amid a forest of outstretched arms, with both his own firmly folded across his chest, a portrait of stubborn worker’s pride. His name was August Landmesser. He had been a Nazi party member but was later expelled from the party for marrying a Jewish woman, and then imprisoned for ‘dishonouring the race’. After his release, he was drafted to fight in the Second World War and never returned.
Again, such moments are emphatically not confined to the West. During the Arab Spring of 2011, a ‘day of rage’ was proclaimed by dissidents in Saudi Arabia. Faced with a massive police presence at the appointed location in the country’s capital Riyadh, almost nobody showed up. But one man, a strongly built, black-haired teacher called Khaled al-Johani, suddenly approached a group of foreign reporters. ‘We need to speak freely,’ he cried, with an explosion of pent-up passion. ‘No one must curb our freedom of expression.’ A BBC Arabic service film clip, which you can watch on YouTube, shows a tall secret policeman, in white robes, headdress and dark glasses, looming in the background as he snoops on al-Johani’s speech. A little further away, armed police mutter into their walkie-talkies. ‘What will happen to you now?’ asks one of the reporters, as they escort the teacher back to his car. ‘They will send me to prison,’ al-Johani says, adding ironically: ‘and I will be happy.’ He was subsequently condemned to 18 months’ imprisonment.
In many places, we can find monuments to the Unknown Soldier, but we should also erect them to the Unknown Speaker.
This is an edited extract from ‘Free Speech: Ten Principles for a Connected World’ by Timothy Garton Ash © 2016, published by permission of Atlantic Books.


The Plot to Put Conceptual Art on ‘Melrose Place.’




Photo: The conceptual artist Mel Chin, who helped form a group to supply art and props with coded cultural messages on “Melrose Place.” On Friday, at the Red Bull Studios New York, 100 objects from the committee’s work go on display in an exhibition. Credit George Etheredge for The New York Times

Twenty years ago, the conceptual artist Mel Chin cold-called the offices of “Melrose Place,” Aaron Spelling’s wildly popular prime-time soap opera, with a proposition. What if a task force of artists supplied free artworks and props for the show’s apartment-complex set, with coded cultural messages on pressing topics like reproductive rights, American foreign policy, alcoholism and sexual politics?
Deborah Siegel, the show’s set decorator, listened to this absurd offer and had an instant reaction. “I thought it sounded really interesting,” she said in a recent interview. “So I met with him.”
This was the beginning of a conceptual artist’s dream, an ongoing intervention into the very heart of American mass culture. In late 1995, Mr. Chin and a team of 100 mostly unknown artists, called the Gala Committee, began a two-year experiment, placing objects on the set of “Melrose Place.” They took their cues from scripts provided in advance and in some instances worked with the writers to modify plot lines and develop characters.
On Friday, at the Red Bull Studios New York in Chelsea, 100 objects from the committee’s work go on display in the exhibition “Total Proof: The Gala Committee 1995-1997.”

Photo: “RU 486 Quilt,” which appeared on the show. Credit GALA Committee

The exhibition, through Nov. 20, will be, appropriately enough, a rerun. Viewers of “Melrose Place” saw a version of it in April 1997, in a television episode featuring an actual exhibition at the Museum of Contemporary Art in Los Angeles, “Uncommon Sense,” which included many of the works produced for the set.

Photo: Heather Locklear and Rob Estes in “Melrose Place.” An episode in 1997 featured an exhibition at the Museum of Contemporary Art in Los Angeles, which included works made for the set. Credit CBS Television Studios

In it, Heather Locklear, as the hard-charging advertising executive Amanda Woodward, has just taken on the museum as a client and brings her love interest, Kyle McBride (Rob Estes), to the opening for a stimulating evening of art talk.
Much of it takes place in front of a Ross Bleckner-like painting that alludes to the American bombing of Baghdad. That work was ordered by Carol Mendelsohn, the show’s head writer. This fictional opening, filmed two weeks before the museum’s opening, was one of the great meta moments in television history.
Mr. Chin is by now a well-known figure, a skilled organizer of socially provocative works that can last for years. In a recent project typical of his approach, “The Tie That Binds,” he used native plants to create eight drought-resistant gardens along the Los Angeles River. Visitors were invited to take away a blueprint for one of the gardens and replicate it at home, furthering the cause of water conservation.
The “Melrose Place” idea began when Mr. Chin was shuttling back and forth between the University of Georgia, where he held a temporary professorship, and the California Institute of the Arts, where he was conducting a workshop. “We discussed pop culture and Hollywood,” said Valerie Tevere, one of his Cal Arts students and now an associate professor of art at the College of Staten Island. “How might artists work with TV. What sort of things could happen?”
Mr. Chin had never heard of “Melrose Place.” “I was not watching much television at the time,” he said in a recent interview at Red Bull Studios.
But if he was not watching, he was thinking, prompted by Julie Lazar, the director of experimental programs at the Museum of Contemporary Art, and Tom Finkelpearl, a guest curator and now New York’s commissioner of cultural affairs, who approached him to take part in “Uncommon Sense.”
Mr. Chin recalled that while on a flight from Atlanta to Los Angeles, he looked out the window and thought “Los Angeles is in the air.” The city existed in the trillions of electronic impulses its residents sent through the atmosphere and around the world, transmitting social content and cultural symbols. “Our world is transformed by covert information, political messages,” Mr. Chin said. “How would that work if it was art?”
Back home, Mr. Chin watched as his wife, Helen Nagge, flipped the remote and stopped on an arresting image. “I saw this large blond face filling the screen, with blue eyes,” he said. It was Ms. Locklear. “When she moved, there was a painting behind her, and I said, ‘That’s the gallery.’”

Photo: Another work, “Think of the Re-runs.” Credit GALA Committee

Mr. Chin began assembling his troops. The name GALA fused the abbreviations for Georgia and Los Angeles, but eventually the committee absorbed dozens of artists around the country.
The team included students; professional artists; a media scholar (Constance Penley of the University of California, Santa Barbara); and an actual fan of the show, Mark Flood, an old friend of Mr. Chin’s from his native Houston.
Mr. Flood wondered aloud whether the project amounted to a sellout. Mr. Chin told him, “We’re not selling anything, we’re getting in.”
Frank South, an executive producer for the show, and Ms. Mendelsohn decided not to mention the project to Mr. Spelling or the network brass. Eventually, word leaked out. In 1997, The New Yorker ran a Talk of the Town article, “Agitprop,” timed to the opening of “Uncommon Sense.” Mr. South said, “I was busted.”
Mr. Spelling, tickled at the idea of seeing “Melrose Place” in the museum world, took the news well. “Just don’t do anything to hurt the show,” he told his charges.

Photo: Another work that was featured on “Melrose Place.” Credit GALA Committee

In early 1996, with the series in its fourth season, the artwork began to arrive, first in a trickle, then in a flood. As a safe-sex message, committee members designed “Safety Sheets” for the manipulative, womanizing Dr. Peter Burns: bedsheets in an all-over pattern of cylindrical shapes that, on close inspection, turned out to be unrolled condoms.
When Alison Parker (Courtney Thorne-Smith) became pregnant, the GALA Committee made her a quilt appliquéd with the chemical symbol for the morning-after pill RU-486. “One of the things we wanted to do was to respond to the fact that in network TV, no matter how strong you are, you cannot have an abortion,” Ms. Penley said. “You either have the baby, or you fall down the stairs. We wanted to put reproductive choice back on network TV.”
One of the sneakier placements — the committee referred to them as “product insertion manifestations” — came from the Cal Arts workshop. When Michael Mancini, a character played by Thomas Calabro, visits a hot-sheet motel, he sees the clerk reading “Libidinal Economy,” a work by the French poststructuralist Jean-François Lyotard.
“Total Proof,” organized by Max Wolf with Candice Strongwater, takes its title from an altered photograph of the bombing of the Alfred P. Murrah Federal Building in Oklahoma City in April 1995, with the damage reworked by the artists to mimic the shape of an Absolut vodka bottle. The work was initially deemed too disturbing to appear on the show, but somehow it ended up, in plain sight, on a wall at D&D Advertising, Amanda’s company.

Photo: The work “Cause and Effect Rain Coat.” Credit GALA Committee

As the television project gathered steam, the producers turned to the committee to help invent the character of Samantha Reilly, an artist who, after graduating from the Rhode Island School of Design, heads out to Los Angeles and moves into the Melrose Place complex. Ms. Mendelsohn was flown out to Kansas City to brainstorm with 10 women on the committee who became known as the Sisters of Sam.
“We thought, she could be a Cindy Sherman, or a Kiki Smith, or a Barbara Kruger,” said Ms. Penley, who envisioned a feminist conceptualist. But the producers demanded paintings in the David Hockney mode, with bright pastels.
“They said, ‘Because the camera loves those colors,’” Mr. Chin recalled.
Hijacking the concept, the Gala Committee turned out a series of cheery-toned paintings on the theme of violence and death in Los Angeles.
The Gala Committee called it a day after the museum episode, but the series continued until May 1999. In a half-serious statement for a sale of many of the artworks at Sotheby’s, Mr. Chin summed up the great intervention as the catalyst for “a profoundly radical transformation of worldwide art, entertainment, communication and government.”
The reality was somewhat less dramatic. “We were exhausted, basically,” Mr. Chin said. “It was very stressful, producing on deadline. The potentiality and the pictorial reality had been enlarged, so we decided to stop there. It was time to release it to the world. And think of the reruns.”


Wednesday, September 28, 2016

How organisations enshrine collective stupidity









Stupefied

How organisations enshrine collective stupidity and employees are rewarded for checking their brains at the office door

Yes! Photo by Westend61/Stefan Kranefeld/Gallery Stock
André Spicer is professor of organisational behaviour at the Cass Business School at City, University of London, where he specialises in political dynamics, organisational culture and employee identity. His latest book, together with Mats Alvesson, is The Stupidity Paradox: The Power and Pitfalls of Functional Stupidity at Work (2016).


Brought to you by curio.io, an Aeon partner. Edited by Sam Haselby.


Each summer, thousands of the best and brightest graduates join the workforce. Their well-above-average raw intelligence will have been carefully crafted through years at the world’s best universities. After emerging from their selective undergraduate programmes and competitive graduate schools, these new recruits hope that their jobs will give them ample opportunity to put their intellectual gifts to work. But they are in for an unpleasant surprise.
Smart young things joining the workforce soon discover that, although they have been selected for their intelligence, they are not expected to use it. They will be assigned routine tasks that they will consider stupid. If they happen to make the mistake of actually using their intelligence, they will be met with pained groans from colleagues and polite warnings from their bosses. After a few years of experience, they will find that the people who get ahead are the stellar practitioners of corporate mindlessness.
One well-known firm that Mats Alvesson and I studied for our book The Stupidity Paradox (2016) said it employed only the best and the brightest. When these smart new recruits arrived in the office, they expected great intellectual challenges. However, they quickly found themselves working long hours on ‘boring’ and ‘pointless’ routine work. After a few years of dull tasks, they hoped that they’d move on to more interesting things. But this did not happen. As they rose through the ranks, these ambitious young consultants realised that what was most important was not coming up with a well-thought-through solution. It was keeping clients happy with impressive PowerPoint shows. Those who did insist on carefully thinking through their client’s problems often found their ideas unwelcome. If they persisted in using their brains, they were often politely told that the office might not be the place for them.
One new recruit who faced this problem was Jack. After years at graduate school, he was a specialist in corporate governance. Hoping to use his expertise to make a difference in the real world, he joined a large consulting firm. He quickly found that he was working on a range of projects that had absolutely nothing to do with his expertise. Even though he presented to clients as a global expert, he knew little more than what he found in a few minutes searching the company intranet. He learned that his main job was to make a good impression with the client, not to solve their problems. He knew that, if he actually tried to use his expertise in a meaningful way, his superiors would not be happy.
For more than a decade, we’ve been studying dozens of organisations such as this management consultancy, employing people with high IQs and impressive educations. We have spoken with hundreds of people working for engineering firms, government departments, universities, banks, the media and pharmaceutical companies. We started out thinking it would be the smartest who got ahead. But we discovered this wasn’t the case.
Organisations hire smart people, but then positively encourage them not to use their intelligence. Asking difficult questions or thinking in greater depth is seen as a dangerous waste. Talented employees quickly learn to use their significant intellectual gifts only in the most narrow and myopic ways.
Those who learn how to switch off their brains are rewarded. By avoiding thinking too much, they are able to focus on getting things done. Escaping the kind of uncomfortable questions that thinking brings to light also allows employees to side-step conflict with co-workers. By toeing the corporate line, thoughtless employees get seen as ‘leadership material’ and promoted. Smart people quickly learn that getting ahead means switching off their brains as soon as they step into the office.
We found many ways that all kinds of organisations positively encouraged intelligent people not to fully use their intelligence. There were rules and routines that prompted them to focus energies on complying with bureaucracy instead of doing their jobs. There were doctors who spent more time ‘playing the tick-box game’ than actually caring for patients; teachers who spent more time negotiating new bureaucratic procedures than teaching children. We met Hans, a manager in a local government agency: after a visit from a regulator, his office received a list of 25 issues in need of improvement. So Hans’s agency developed 25 new policies and procedures. The result: the regulator was happy, but there was no change in actual practice. Such stories showed us how mindless compliance with rules and regulations can distract people from actually doing their jobs. The doctors, teachers and government officials all knew that the rules and regulations they spent their days complying with were pointless diversions. However, they chose not to think about this too much. Instead, they just got on with ticking the boxes.
Another significant source of stupidity in firms we came across was a deep faith in leadership. In most organisations today, senior executives are not content with just being managers. They want to be leaders. They see their role as not just running their business but also transforming their followers. They talk about ‘vision’, ‘belief’ and ‘authenticity’ with great verve. All this sounds like our office buildings are brimming with would-be Nelson Mandelas. However, when you take a closer look at what these self-declared leaders spend their days doing, the story is quite different.
No matter how hard you search, there is little – if any – leadership to be found. What most executives actually spend their days doing is sitting in meetings, filling in forms and communicating information. In other words, they are bureaucrats. But being a bureaucrat is not particularly exciting. It also doesn’t look very good on your business card. To make their roles seem more important and exciting than they actually are, corporate executives become leadership addicts. They read leadership books. They give lengthy talks to yawning subordinates about leadership. But most importantly, they attend many courses, seminars and meetings with ‘leadership’ somewhere in the title. The content of many of these leadership-development courses would not be out of place in a kindergarten or a New Age commune. There are leadership-development courses where participants are asked to lead a horse around a yard, use colouring-in books, or build Lego – all in the name of developing them as leaders.
At least $14 billion gets spent every year on leadership development in the US alone, yet according to researchers such as Jeffrey Pfeffer at Stanford, it has virtually no impact on improving the quality of leaders. In our own research, we found that most employees in knowledge-intensive firms didn’t need much leadership. People working at the coalface were self-motivated and often knew their jobs much better than their bosses did. Their superiors’ cack-handed attempts to be leaders were often seen as a pointless distraction from the real work. George, a manager in a high-tech engineering firm, told us he saw himself as a very ‘open’ manager. When we asked his subordinates what he actually did, they told us that he provided breakfast in the morning and ran an annual beer-tasting.
Another particularly rich source of stupidity in organisations is the deep belief in the power of brands. Many organisations seem to assume that, just by changing the signage, it’s possible to transform the entire company. Sadly, this is almost always wishful thinking on the part of senior executives. We saw costly rebranding initiatives that involved changing the logo of an organisation, but little else. The University of Western Sydney spent millions to transform itself by becoming ‘Western Sydney University’. The Australian Opera also underwent a costly rebranding process to become ‘Opera Australia’. National Bank of Australia hoped to overhaul itself by becoming ‘National Australia Bank’.
Often, this fascination with branding can be little more than a distraction. In one company we studied, we met a group of marketing executives whose job it was to sell a range of products including toothpaste. Naturally, they were very enthusiastic about the magical power of branding. One executive told us ‘you live and die’ by your brand. But when we asked them more about what actually mattered in selling toothpaste, we were told that consumers will ‘just pick anything on a shelf that’s on promotion’ and that ‘people aren’t really interested in toothpaste’. All that counted, they admitted, was the price.
In many organisations, a fascination with branding can become a dangerous distraction. A few years ago, senior figures in the Swedish armed forces decided to run a large rebranding exercise. Unfortunately, this meant they had to cancel some military exercises. When the rebranding initiative was introduced, a commander said: ‘You have to break eggs to make an omelette. It is clear that some will think it is tough along the way, but it will be a damn fine omelette.’ After millions were spent replacing everything from signs to tableware, the top official in the military admitted that the rebranding initiative had been a mistake. It was quietly dropped, but not before creating a significant amount of resentment.
We found another particularly tragic case of rebranding at British Airways in the late 1990s. Following a strategic change, senior executives decided to make the company more globally oriented. To do this, they rebranded British Airways as ‘The world’s favourite airline’ and replaced the Union flag on its planes’ tailfins with ‘world art’ designs. The change sparked widespread public outcry: even the former prime minister Margaret Thatcher entered the fray, covering a model plane featuring the new designs with her handkerchief. In only a matter of weeks, the airline reverted to its old livery. Although little had changed, millions had been spent in the process.
Another big driver of stupidity in many firms is the desire to imitate other organisations. As Jan Wallander, the ex-chairman of Sweden’s Handelsbanken, said: ‘Business leaders are just as fashion-conscious as teenage girls choosing jeans.’ Many companies adopt the latest management fads, no matter how unsuitable they are. If Google is doing it, then it’s good enough reason to introduce nearly any practice, from mindfulness to big-data analytics.
But often there are very weak reasons for following ‘industry best practice’. For instance, when the Swedish armed forces decided to start using Total Quality Management techniques, some officers naturally asked: ‘Why?’ The response: ‘This is presumably something we benefit from, since this is what they do in the private sector.’ In other words, we should do it because others are doing it.
But adopting ‘best practice’ often has little or no impact. One study of oil and gas companies found that they would introduce diversity programmes that had little impact on making people more tolerant. One employee commented: ‘It really is a feel-good exercise. You know we can all feel good that we are this happy multi-coloured family – that’s going to bring in all this money for the firm. The truth is quite another matter.’
Sometimes, following industry best practice can result in worse outcomes. An example of this is companies giving ever-higher pay to their chief executive officers. One analysis found that US companies would pay above-average salaries for top new appointments in the hope of attracting above-average candidates. But, ultimately, the high pay had no impact on a firm’s performance. All it did was ratchet up the amount that companies across the economy were willing to spend on senior executives.
One last source of corporate stupidity we came across was company culture. Often, these cultures imprison employees in narrow ways of viewing the world, such as the common obsession with constant change. One hi-tech company we studied was very enthusiastic about change, and launched new change initiatives every few years, often with little or no real results. The programme would be launched with great fanfare, but not much happened next. Everyone seemed to think that someone else was responsible for creating change. And when it became clear that nothing substantive was changing, senior executives dropped the initiative and moved on to the next fashionable change programme without learning anything.
Many corporations create an unwavering focus on the present. In Moral Mazes (2009), a study of the culture in one large US corporation, Robert Jackall found that managers would frequently say things such as: ‘Our horizon is today’s lunch’ or ‘I know what you did for me yesterday, but what have you done for me recently?’ This extremely short time-horizon meant that managers would spend their days trying to claim responsibility for projects that had been seen to succeed and dodge responsibility for failures.
A culture of unflappable positivity is also popular with many companies. In an IT consultancy we studied, employees were constantly told: ‘Don’t bring us problems, only bring us solutions.’ This upbeat message aimed to create a happy workplace. But one consultant who knew the firm well was not so sure. When we asked him to describe the company, he told us: ‘It’s not a firm, it’s a religion.’ Employees’ sincere belief in being positive all the time meant that when genuine problems without an obvious solution appeared, they were overlooked. When the economy went through a large downturn, the company’s upbeat attitude left employees unable to make necessary changes until it was too late.
At the outset of our research, we suspected that organisational life would be full of stupidities. But we were genuinely surprised that otherwise smart people would go along with collective stupidity, and be rewarded for doing so. Mindlessly following rules and regulations – even if they were completely counterproductive – meant that professionals would be left alone. Using empty leadership talk would get ambitious people promoted into positions of responsibility. Copying other well-known organisations meant a firm could be seen as ‘world-class’. Launching branding initiatives meant that executives could focus on the easier work of manipulating surface images and avoid the much messier realities of organisational life. Following deep-seated corporate cultures often meant employees could be seen as committed organisational citizens while overlooking festering problems.
Although corporate mindlessness comes with some big pay-offs, we also noticed it could be very costly. When smart people stopped fully using their intelligence, they would often overlook mistakes. Usually, this wouldn’t matter: large organisations offer plenty of places to hide mistakes. What’s more, people in corporations have short attention spans. Perpetrators of blunders will likely have moved onwards (often upwards) before their mistakes become obvious. ‘Always try to outrun your mistakes’ was one middle-manager’s key career advice.
However, there are times when it’s impossible to hide the rotten fruits of the collective stupidity. This is what happened at Nokia. Between 2007 and 2013, managers at the telecommunications firm were encouraged to be relentlessly positive. One middle-manager described how ‘if you were too negative, it would be your head on the block’. As a result, employees wanted to give senior managers only ‘good news’ but ‘not a reality check’. Naysayers found their divisions starved of resources, while upbeat corporate yes-men were given ever more responsibility. When there was a genuine problem with Nokia’s new smartphones, developed to compete with Apple’s iPhone, few dared to speak up. This meant that senior management took more than a year to realise they were on a losing streak. By that time, Apple and Samsung were well on their way to dominating the smartphone market.
This cautionary tale reminds us that, although acting stupid can come with some significant short-term rewards such as popularity and promotion, it also comes with longer-term risks. This suggests a dose of stupidity at work is like most things: good in moderation.
Acting stupid at work is a subtle art. If you underdo it, people will suspect you are putting on an act. If you overdo it, they will start to think you are a liability. However, there are some tactics that skilled practitioners of corporate stupidity use to get it just right.
One of the most common tactics is doing what everyone else is doing, even if it is wrong. If your competitor introduces a new strategy, do the same – no matter how wrong-headed it might be. If another competitor starts a Total Quality Management initiative, follow suit. It’s often advisable to copy iconic companies such as Google – even if you are in an entirely different industry. If you call it ‘best practice’, you might be hailed as a genius. When it goes wrong, you can say: ‘Well, everyone got it wrong.’
In a world where stupidity dominates, looking good is more important than being right. Advanced practitioners of corporate stupidity often spend less time on the content of their work and more on its presentation. They know that a decision-maker sees only the PowerPoint show and reads just the executive summary (if they’re lucky). They also realise that most stupid ideas are routinely accepted when they’re presented well. Decision-makers will likely forget much of the content by the time they walk out the door. And when things go wrong, they can say: ‘They didn’t read the fine-print.’
Negotiating corporate stupidity also requires assuming that the boss knows best. This means doing what your boss wants, no matter how idiotic. What is even more important is that you should do what your boss’s boss wants. You will look like you are loyal and it will save time arguing for your position. When things go wrong, you can blame your boss.
Working in a stupefied firm often means blinding others with bullshit. A very effective way to get out of doing anything real is to rely on a flurry of management jargon. Develop strategies, generate business models, engage in thought leadership. This will get you off the hook of doing any actual work. It will also make you seem like you are at the cutting edge. When things go wrong, you can blame the fashionable management idea.
Being overly opportunistic is also advisable. Most individuals can easily fool themselves into believing anything if it benefits them. When people are paid enough, they will believe almost anything. So if you need to justify your commitment to a stupid course of action, just make sure that everyone knows you’re only doing it for the money. That way, when things go wrong, you can blame the incentive structure.
The final piece of advice for any practitioner of corporate stupidity is to keep moving. It is vital to avoid being landed with your own mistakes. Take the glory that comes from short-term success and move on before you’re saddled with any longer-term costs. That way, when things go wrong, someone else is left to clean up the mess.
For the past two decades, management theorists have been convinced that organisations succeed or fail on the basis of their specialised knowledge. However, our close look at the corporate world showed quite a different picture: many large corporations seemed overrun by stupidity. What’s more, this stupidity is not just the accidental result of a few corporate buffoons. It is often intentionally created. This is much more than taking advantage of the various inbuilt cognitive biases with which behavioural economists are so obsessed. Rather, it involves organisations purposefully creating a kind of collective mindlessness.
We saw firms going out of their way to block employees from reflecting on their assumptions, to discourage them from thinking about their substantive goals, and to impede them from giving or asking for justifications for their decisions and actions. By doing this, organisations often create functional outcomes both for individuals (such as career progression) and the whole organisation (such as the ability to avoid conflict and focus on common goals). While these favourable outcomes dominate in the short term, collective stupidity can create dysfunction in the longer term, including a lack of learning and an imperviousness to mistakes. Perhaps management thinkers need to stop clinging to knowledge-based theories of organisations and start developing a stupidity-based theory of how organisations are run.