Say what you like about Benefits Street; it pulls in the viewers — over five million of us. Monday’s final episode will probably draw even more to their tv screens. So, while people pontificate about the morality of the show, remind yourself of one thing: this is entertainment, not sociology. Just like the comedy Shameless and the soaps, it is made to amuse rather than educate us. Not that tv shows intended to entertain can’t also enlighten us: quality dramas and comedies do both. But first and foremost, Benefits Street is designed to distract us in an agreeable way. This is not a slice of life: more of a sliver — a very narrow and entirely unrepresentative sample of a handful of people who live in a street of about a hundred dwellings, whose residents are no doubt the least amused viewers in the country. This is good television, but terrible social science: if you think it offers a window onto the country’s lumpenproletariat, you’re hopelessly misled. The stars of this show are a minority of a minority of a minority, a highly skewed sample (notice how few British Asians appear in the show). The people are uninhibited, unafraid and take a certain pride in their deprivation. And we love watching them. They afford us the opportunity to tut-tut, express our disapproval and apportion blame for the decline of modern Britain. And there is pleasure in this. How we love to judge others. Call it “poverty porn” if you like. We’re all guilt-free voyeurs.
Why do we know so much yet so little about Nigella Lawson (above), the domestic goddess who brought sensual pleasure to the mundane domestic chore of cooking? It seems only yesterday she was a regular in our homes. Now, we’re not so sure we know her at all. While neither she nor her ex-husband Charles Saatchi is in the dock, for the past few days they have been the centre of attention: she and her marriage have been opened up for public inspection. The court case, which started last month, is actually the trial of the celebrity couple’s two personal assistants, the Italian sisters Elisabetta and Francesca Grillo, who are accused of fraudulently using the credit cards of Saatchi’s private company. But the Grillos have barely been considered: all attention is on Nigella. Every day, it seems, we have learned something new about her private life, and each revelation has systematically disabused us of any fanciful notion that we knew the first thing about her. It started last June with the release of a picture of her being, it seemed, throttled by Saatchi. Since then, her relationship with Saatchi has been methodically cut up and its internal parts displayed.
Celebrity relationships have been dissected publicly before, of course; usually as they happen. Brangelina and Bennifer were narratives the media asked us to share. But the analysis of Nigella’s multi-dysfunctional relationship — Nigella characterized it as “intimate terrorism” — is particularly enlightening: it’s taught us how little we really know about people to whom we feel close. This is a condition of today: we assume we know people from the two-dimensional images we see of them. Let me explain how this happened.
During the second half of the twentieth century, television transformed the way we thought and behaved. It affected the way we relaxed, the way we learned, the way we communicated. The complete cultural landscape was transfigured by television, to the point where we scarcely notice its presence. So much of what we know about the world is gleaned from tv that we find it hard to imagine where else we might learn about some event or other. The internet has, of course, emerged as an alternative. It barely needs stating that Nigella, or the celebrity culture she inhabits, wouldn’t have been possible without television. Prior to its acceptance as a domestic appliance in the 1950s, we knew prominent figures mainly by their names, or through artists’ impressions, still photographs or newsreels shown at the movies. “Television, bringing famous faces and sounds into our homes, has created different kinds of celebrity,” writes the social psychologist David Giles in his book Illusions of Immortality.
Television brought with it intimacy: we were able to see moving images and hear voices — in our own homes. It also brought replication: those images and sounds were not just one-offs; they could be repeated time and again, exposing us to the famous in a way that stirred us to new interest. We saw people who were previously remote and perhaps unknowable as ordinary humans with the same kinds of mannerisms, faults and maybe foibles as the rest of us. Giles argues that the proliferation of media, specifically television, in the late twentieth century expanded the opportunities for people to become famous. In material terms, there were more tv screens on which they could appear and become known. Viewers could not only see and hear a new array of people: they could almost reach out and touch them. In a way, they could almost swear they knew them. The more they felt they knew them, the more they became entranced.
Giles invokes a term from a 1956 article in the journal Psychiatry to capture the emerging relationship between tv figures and viewers: “parasocial interaction.” The 1950s was the decade of growth for television: at the start, few households had a tv; by the end, over 90 percent of households in the USA and 70 percent in the UK had at least one set. Viewers were forming unusual attachments. They were developing “friendships” with television characters, some fictional and others real (like announcers, or weather forecasters). They also “hated” some of them. Familiarity led to a sense of intimacy. Viewers actually thought they knew the figures they saw on their screens. They interacted with them parasocially. The relationships were, and still are, strictly one-way. It’s called parasocial because para means beyond, as in paranormal. The attachment might only have been as strong as a beam of light from a cathode ray tube. Yet it was experienced as strong and meaningful. Viewers actually felt they knew people they had never met, probably never seen in the flesh and who knew nothing of their existence. So there is no actual interaction (inter means between): it’s one-way. This doesn’t stop viewers feeling there’s a genuine interaction. In this sense, it’s an interesting term that captures the way we think and feel about people we don’t know and who don’t know us, but who sometimes unwittingly and unknowingly move us to act, occasionally in erratic and irrational ways.
Since 1999, when she appeared on our screens in Nigella Bites, the domestic goddess has been in our homes, our kitchens even. This is, I think, the reason why we assumed we knew her so well: seeing celebs on stage, in movie roles or even in the contrived circumstances of reality tv shows is one thing; but seeing them in the kitchen, doing nothing more sensational than preparing food, contributes to an especially close parasocial relationship. @elliscashmore
Coronation Street’s reputation as Britain’s premier soap is based on its preparedness to take on divisive social issues and avoid crass simplifications. It is currently featuring euthanasia. In the itv drama, Hayley Cropper, the transsexual character played by Julie Hesmondhalgh, has just been diagnosed with inoperable pancreatic cancer, with six months’ life expectancy. Last week she told her husband Roy, the singular, thoughtful café owner played by David Neilson, that she intends to take control of what’s left of her life and die when and in circumstances of her own choosing. She sees the alternative as a nightmarish descent into morphine-induced purgatory where visions of her past life as a man will return. Her usually sympathetic husband surprises her with his response. At first, he assures her that she will change her mind, then resigns himself to the prospect of being complicit in her termination. He accuses her of being selfish by depriving him of every last available second he cherishes with her. Reminding her of the palliative care she will receive, he tries to convince her that she won’t necessarily suffer from the confusion that sometimes results from pain relief. “I’d rather forgo the goodbye if it meant you weren’t suffering,” he tells her. With no resolution, Hayley collapses and is admitted to hospital with an infection. As usual, Corrie’s writers have handled a sensitive moral dilemma with care, never reducing the issue to pat answers. The viewer completely understands Hayley’s concerns about losing control as the cancer debilitates her, while feeling the impact of her decision on her loving husband, who himself attempted suicide in the past.
The drama makes the debate complicated, tangled and racked with competing emotions. Euthanasia elicits all manner of emotion. At one level, it seems an individual has every right to end his or her life however they wish, particularly if the alternative is prolonged suffering of some kind. Loved ones are forcibly reminded of the preciousness of life and the possible catastrophic impact on their own lives. Palliative care, we are often assured, has come on in leaps and bounds in recent years, and the kind of torment Hayley anticipates is unlikely to happen. All the same, should a person be forced to accept care against his or her will? Should they be compelled to live with medication until the disease reaches its inevitable conclusion? Or should they be allowed to decide for themselves? Under English law, all adults have the right to refuse medical treatment, as long as they have sufficient capacity (the ability to use and understand information to make a decision).
Other countries have different rules. Active euthanasia is currently legal only in Belgium, Holland and Luxembourg, for example. New Zealand’s recently withdrawn End-of-Life Choice Bill would have allowed adult residents to access medical assistance in hastening their death if they met certain criteria. Patients and their families would have been required to reach a clear understanding of all of their options through carefully guided conversations with their medical practitioners. By making euthanasia legal, the law would have allowed family members to discuss options and support each other, rather than hold clandestine conversations and face prosecution for following a loved one’s requests. So the pro-euthanasia or “right-to-choose” argument is not for totally unrestricted choice in one’s own suicide, and its guiding morality is that, while dying is not a good thing, it’s often preferable to suffering and the loss of dignity diseases sometimes entail. Those who oppose assisted dying are usually guided by a moral compass that comes from their faith. No one, they argue, has the right to “play God,” and there is something profoundly irreligious about an arrangement that permits exactly this. Most religions regard the preservation of human life as one of the supreme moral values, though there is no complete unanimity in, for example, Buddhism, Sikhism and some areas of Christianity. Corrie will, again, set an agenda for a debate about a subject that will continue to arouse great emotions. It is to the show’s credit that it can integrate issues of great complexity into its narrative.
The X Factor has turned into torture porn – that’s the film genre that specializes in exposing audiences to the wilfully cruel and sadistic infliction of pain, suffering and humiliation on others. I’m exaggerating a bit: torture porn, as exemplified in movies such as Hostel, Vile and the Saw series, is intentionally about hurting people. The films attract audiences who share a taste for watching others in pain. The X Factor, by contrast, is supposed to be a talent contest. But it now seems to appeal to the torture porn sensibility: its viewers might once have been drawn to the singers and allowed themselves the indulgence of laughing at the manner in which the judges expressed their disapproval. But the current series seems vicious: the entertainment value of others’ pain has been foregrounded to the point where the singing is almost incidental.
A few weeks ago, contestant Hannah Sheares and two friends auditioned as Daisy Chain, a band, only to be told that Hannah herself was passable, but her friends were useless and would have to be dumped. Presumably forgetting that bands like the Supremes, the Three Degrees and Destiny’s Child all did pretty well with a strong lead and two backing singers, the judges offered Hannah the chance to progress as a solo performer. Amid much crying, she did so and lost her friends. “We don’t talk any more,” said Hannah, stating the obvious. She was eliminated from the show a few weeks later. When the panel gave the same choice to another band, the trio refused, though a week later the lead singer mysteriously re-appeared minus her two friends, suggesting she had been persuaded. It’s not the first time the show has made enemies out of friends, and it could be argued that the choice always remains with the contestants. Yet it seems a peculiarly vicious and unnecessary way of filtering out “talent” and, if we are honest, the way in which the camera dwells on the breakups suggests the producers think we enjoy becoming voyeurs. Maybe they are right.
The X Factor is not just a television show, it’s a cultural phenomenon. There has never been anything quite like it in the history of television. Starting in 2004, it has launched the careers of Leona Lewis, Alexandra Burke and, of course, One Direction (about whom I blogged a few weeks ago). It has also given career boosts to panellists, particularly Cheryl Cole, Nicole Scherzinger and Tulisa Contostavlos. Its viewing audience is barely believable. Over the years it has regularly snagged 40% of the total audience share and, even in slumps, draws in 10 million viewers. At its historic high point in 2010, 17.2 million tuned in to watch Matt Cardle triumph – that’s over 27% of the total population of the UK. It’s perfectly in sync with today’s culture, inviting audiences to vote using their phones and to tweet, text and engage fully with social media. In a sense it offers a perfect cultural democracy. But, as the show morphs from a talent contest to an all-purpose entertainment platform, its benign character has changed. It is now a heartless, insensitive and callous psychodrama in which astringent is poured on open wounds.
Like the torture porn filmmakers, the X Factor producers would probably shrug and say, “That’s what the audience wants.” They have a point: no one points a gun at the heads of 10 million telly watchers and demands they stay glued to their screens every Saturday and Sunday, any more than filmgoers are scooped up from the streets, strapped into place and forced to watch people having limbs cut off without anaesthetic. Viewers not only want to watch, they feel entitled to watch the slaughter and the human response that accompanies it. Just as the torture porn fan delights in witnessing the pleading, the whimpering and, best of all, the sobbing, so the X Factor fan enjoys the privilege of observing human emotion at its most painful. We can identify with the rejected wannabes to whom winning would mean “everything,” and this confers its own empathic rewards. Living in a celebrity culture, we realize that fragile hopes of instant fame are popular currency. But the real bonus is that we can also identify with the torturers … I mean, the judges: the power to grant someone’s wildest dreams or consign them to oblivion is something viewers have never had, and probably never will have. But by aligning themselves with Sharon or Louis as they traumatize young hopefuls and reduce them to incoherent losers, they get to identify with the powerful too. And the best bit is this: no one feels bad about it. There may be a brief moment of sorrow as the losing contestant blubs inconsolably and either promises to come back stronger or just to go back to stacking shelves at the supermarket, but it passes as soon as the next TalkTalk commercial arrives. The perverse empowerment offered by the show is too good to risk undermining with sympathy.
Now the filmed sequences are over, we are into the “live” shows, and audiences will bear witness to exhibitions of inconsolable distress as the judges deliver their agonizingly prolonged verdicts (“I’m gonna say … ” followed by a 10-second wait). Years ago, we might have felt uncomfortable and switched channels. Who takes pleasure not just in other people’s distress, but in their shameless, often excruciating public display of that distress? I know the answer to this question. So do you.
Britain in turmoil: high unemployment, nationally owned companies being privatized, strikes threatened in all the major industries, a government that is never quite sure it enjoys the support of the nation. Sound familiar? It should be – at least if you watch Downton Abbey and Peaky Blinders. The current Downton Abbey starts in 1922 and Peaky Blinders is set just after the end of the 1914-18 war. The interwar period provides prime material for dramatists, and viewers love the gas-lit cobbled streets, the horse-drawn carriages, the born-to-rule masters and their obsequious servants. Don’t get me wrong: I’m not embracing nostalgia. Part of the appeal of these programmes lies in how they spirit us back to a time when things were so much worse than they are today. British society might have been a blissful idyll unspoilt by progressive notions like sex equality or workers’ rights, but mostly it was a joylessly harsh and merciless place where the spectre of the workhouse was never far away.
Before the First World War, the class structure seemed like a permanent feature of British society. Permanent because the industrial working class expected so little. The rulers who lord it over Downton enjoy their privilege by consent: the working class approved of their masters’ rights and, by implication, their own lowly position in the natural order of things. By the end of the war, the workers were not so easily placated. Four years of conflict, with the loss of 956,703 British lives, changed things. Industrial disputes became commonplace, and radical politics centred on the emergent Labour Party, which was to form its first government in 1924. When unemployment crept towards the two million level, trades unions called for militancy, building eventually to a nationwide general strike in 1926. In Peaky Blinders, we see communist Freddie Thorne preparing for a revolution that never materialized. His friend Tommy Shelby calls him a fantasist and opts for a practical if deviant way of life in which the rewards are more tangible and immediate. We see him and his gang extorting money from Birmingham publicans, shopkeepers, and bookies, or using their razor-lined caps to slice open enemies’ faces.
Downton is supposed to be only a hundred or so miles north of Brum in Yorkshire, but it could be on a different planet. True, we see the working class again, but this time they are domesticated, waiting hand and foot on the landed gentry, taking orders, no matter how unreasonable, with a smile, bowing and scraping, their role in life to cater for every need of their social and moral superiors. The enchantment of Downton Abbey lies partly in its plausible depiction of Britishness, replete with class distinctions, meticulously observed prejudice and downright snobbery. Looking backwards from the 21st century, these practices seem both elegantly civilized and cruelly archaic.
This was a time when women were renegotiating their social status. Long paralyzed politically, they were awarded voting rights by the legal reforms of 1918, after an often-painful campaign by suffragettes. The extension of the franchise reflected changing, though not altogether enlightened, attitudes towards women. For decades before the War, manliness was synonymous with moral goodness as well as physical health, and vestiges of this are apparent in both shows: women are always peripheral to the main narrative and either support, encourage or express their appreciation.
The dramas’ embodiments of gender irruptions take the form of Lady Mary, who unexpectedly and unconventionally decides to play an active role in the running of the Downton estate, and Polly Gray, a Brummie Valkyrie, who masterminds criminal operations for all-male gangs. Birmingham, by the way, never looked so gorgeous on screen, shot through a smoky industrial haze with sprays of furnace sparks decorating its streets like firework displays (compare with Martin Scorsese’s New York in his Gangs of New York).
The 1920s were known in America as the “Jazz Age,” a period characterized by carefree hedonism, wealth, freedom and the kind of exuberance we find in the novels of F. Scott Fitzgerald and, of course, the movies based on his work, The Great Gatsby being the most famous. There’s no evidence of this at Downton yet, but it promises to arrive over the coming weeks with two new characters: a breezy relative from Scotland and the show’s first black character, who is a jazz musician.
The narratives of both dramas are congruent with both the laissez-faire doctrine of individual action unrestricted by government interference and a conception of masculinity in which the vigorous, physical pursuit of goals is the ideal. Both chime with free market ideology. The gangsters of Peaky Blinders are almost character studies in cutthroat capitalism: individualistic, operating privately and driven solely by profit; they are friends of anyone with whom they can do business, regardless of political or any other kind of affiliation. So we see them dealing with the IRA, the British government, other gangsters – anyone who can be used to turn a penny.
If Peaky Blinders is inspired by, if not based on, reality, it is also ludicrously fantasticated: whenever it threatens to become believable, there is a comically false Brummie accent, a contemporary phrase or an outfit that is so glaringly out-of-place that the drama’s plausibility, weak at best, crumbles. This is a pity, because the subject matter and the social context are so promising and the issues seem so fresh. The worst offence of Downton Abbey, apart from the clichéd characters and its social Manichaeism, dividing upper from working class as if Britain still had a strictly two-class structure with no emergent bourgeoisie, is its uninspired inclusion of a comic character in the form of the Dowager Countess of Grantham, who might, in another script, be a vile, imperious, heartless matriarch who looks down on … well, everyone apart from the upper class. In Downton, she comes across as a basically well-meaning, if misguided and deeply implausible, figure with a nice way of putting people down.
You have a Twitter account, right? Silly question: of course you have. You and about 200 million others, at the very least. The microblogging phenomenon has only been around since 2006, but, in a sense, it seems to compare with television and the internet as media innovations that changed the way we spend our waking hours. A couple of weeks ago, I wrote about how conceptions of privacy had changed so drastically in recent years and how, in many ways, we no longer understand our private lives in the same way as previous generations. Twitter has played no small part in this: along with Facebook, it has turned private lives inside out – encouraging us to reveal details of our lives that other people find either fascinating or slightly less than fascinating, but never, it seems, dull. We devour information about what other people are doing or thinking or intending to do. What people are doing at any given moment may not seem very important, but we value this kind of information. Twitter has enriched millions of lives. Can it enrich them even more – this time with hard cash?
Twitter has filed paperwork for what the world of finance calls an initial public offering (IPO), which means it will invite anybody with enough money to buy a chunk of the company. Appropriately, it announced the news in a tweet. This means that you or I or anyone else can become a part owner of a company that is already part of our lives and engages all of us for at least a portion of our day. Last year, Facebook floated on the stock exchange. Its founder Mark Zuckerberg had put this off, probably fearing that he would have to surrender his hoodie credentials and become a corporate head, answerable to his shareholders. Zuckerberg seemed concerned that people would stop thinking of Facebook as a cultural service provider and start thinking of it as a profit-driven business. The initial price per Facebook share was $38 (£24); it is now about $44. A 15.7% increase in 18 months is not bad, despite a rocky start. Encouraged by this, Twitter is following its networking cousin into the market.
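For the numerate, the arithmetic behind that percentage is simple enough. A quick sketch, using the rounded prices quoted above (so the result is approximate rather than exact):

```python
# Percentage gain on a Facebook share bought at the IPO,
# using the rounded figures cited in the text.
ipo_price = 38.00    # offer price per share, as quoted above
later_price = 44.00  # "about $44" at the time of writing

gain = (later_price - ipo_price) / ipo_price * 100
print(f"{gain:.1f}% increase")  # roughly 15.8% on these rounded prices
```

With the exact prices of the day, the figure lands close to the 15.7% cited; the rounding to whole dollars accounts for the small difference.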
Twitter seems to be everywhere, all of the time: people are always tweeting, or reading tweets, or retweeting. But there are actually three times as many Facebook users as tweeters. Twitter’s revenue is also lower than Facebook’s was in 2011, Facebook’s final year as a private company. So there are bound to be suspicions when it arrives on the market. All the same, the sheer prominence of Twitter in contemporary culture will persuade investors that this is a company with a future. Won’t it?
Twitter has been revolutionary. But the question investors will ask themselves is: will it be revolutionary like television, or revolutionary like ITV, Britain’s first commercial television company? Since the 1950s, tv has grown into arguably the most influential innovation of the twentieth century (I’ll accept a counterargument from advocates of the internal combustion engine). It’s adapted to changing environments and, in the process, changed us in myriad ways. ITV launched in 1955. The BBC was the national public broadcaster and, as such, was funded by licence payers, not advertising. ITV’s remit was more populist, and it operated as a commercial organization, charging for advertising spots between its programmes. It shared the market with the BBC until 1964, when BBC2 came into being. ITV’s market share was shaved a little in 1982 when Channel 4 arrived, but in the late 1980s and 1990s it was thrust into an open market with any number of digital and satellite channels all competing for advertisers. And it’s struggled ever since. So what is Twitter? A unique medium like television, or a service that has got the market to itself, but only for the moment?
Will we all be tweeting in five years? Probably. But in ten? And beyond? Twitter may be, like television, a medium that morphs with cultural changes, or it may be just one service provider that has caught the zeitgeist – and zeitgeists change. Twitter has no doubt already started planning for this possibility. For example, it has the video-sharing app Vine and Twitter Music, the music-discovery service. It will probably launch new services. But the product Twitter will be putting up for sale is actually you. OK, you see yourselves as users, but, as far as advertisers are concerned, you are potential customers. 200 million customers is an attractive market for advertisers. Unlike tv stations, Twitter has no portfolio of programmes, such as Corrie, The X Factor or Downton Abbey (all ITV, of course): it just has people who like tweeting and are liable to be influenced by ads.
At the moment, none of us minds the occasional pop-up; after all, it’s a free service. Twitter currently reckons it pulls in about 380 million quid a year from advertising. But, once on the stock market, Twitter will be under pressure from shareholders to pull in as much revenue from advertisers as it can and this could affect the experience that tweeters currently find so engaging. Would Twitter dare risk alienating users with too many ads? This is essentially the question potential investors will be asking themselves.