Ellis Cashmore - Author at 51Թ
Fact-based, well-reasoned perspectives from around the world
Tue, 21 Apr 2026

Is Trump Just Pretending to Be Mad?
Mon, 20 Apr 2026

The post Is Trump Just Pretending to Be Mad? appeared first on 51Թ.

Russian President Vladimir Putin launched Russia’s invasion of Ukraine on February 24, 2022. Donald Trump assumed the US presidency on January 20, 2025. Would Putin have risked a years-long war if Trump had been in the White House at the time of his attack?

As journalist Janan Ganesh recently wrote in the Financial Times, “Trump is the one US president elected this century under whose watch Russia has not launched a foreign invasion. Putin attacked Georgia under George W Bush, Crimea under Barack Obama and Ukraine entire under Joe Biden.” The pattern is suggestive, if not conclusive.

Calculated unpredictability

Many observers have portrayed Putin as a deranged autocrat bent on restoring a lost empire, surrounded by subservient aides too intimidated to challenge him. Yet, over time, his behavior has come to seem grimly legible. His aims are extreme but comprehensible; his methods brutal but procedural. He is consistent — consistently malign, perhaps, but consistent.

Trump, by contrast, presents a different figure. He is erratic, self-contradictory and prone to sudden shifts in manner and position. Where Putin’s menace is intelligible, Trump’s is properly inscrutable. And that difference raises the possibility that not knowing someone may itself function as a form of power.

This is the essence of what Ganesh calls Madman Theory. More a strategy than a theory, it involves a political leader deliberately cultivating the appearance of irrationality so that neither adversaries nor allies can reliably anticipate responses. All they can do is act with caution. The leader need not be mad, but others must not be certain. The seed of doubt is crucial.

Ganesh illustrates this with former US President Richard Nixon, who presided over the nation from 1969 to 1974, a period overshadowed by the Vietnam War. Nixon’s Secretary of State, Henry Kissinger, was tasked with negotiating American withdrawal. Nixon, with Kissinger, incubated a stratagem: to convey to North Vietnam that the president was unstable, beyond even Kissinger’s control. Thus, Kissinger could imply in negotiations that Nixon might take extreme measures — even nuclear ones — regardless of advice.

Picture it: After hours of high-level talks with Vietnamese negotiators Lê Đức Thọ and Xuân Thủy, Kissinger concludes, “Excellent. I’ll take this back to the president. But honestly—he might throw it in the trash.” The goal was simple: force concessions by making the consequences of resistance unknowable and potentially catastrophic.

This was Madman Theory in its purest form: calculated unpredictability. Nixon himself did not appear overtly unhinged to the American public, at least not before the Watergate scandal. But he wanted adversaries — and even allies — to believe that he might be. The performance of instability was designed to create leverage.

Yet even here, the results were ambiguous. The war dragged on; the costs were immense; the strategy failed to produce decisive gains. As Ganesh observes, the problem is built into the logic: If the threat is too extreme, it lacks credibility; if carried out, it becomes catastrophic.

Genghis Khan, Stalin, Hitler and Marcos

In the 1532 political treatise The Prince, philosopher Niccolò Machiavelli argued that rulers must sometimes act immorally, inconsistently and against expectation. It’s safer to be feared than loved, if one can’t be both. Appearances matter. Machiavelli understood the uses of terror, ambiguity and deliberate inconsistency. He admired Cesare Borgia, whose ruthlessness and unpredictability helped secure his power.

While Machiavelli didn’t cite Genghis Khan, the latter embodies many of these principles. Khan’s backdrop was the tribal, wind-scoured steppe of Inner Asia in the late 1100s. His reputation for sudden, overwhelming violence was not incidental to his success — it was central. Cities that resisted could expect annihilation; those that submitted might be spared.

The effect was psychological as much as military. Opponents did not merely calculate their chances; they confronted the possibility of total destruction at the hands of a leader who seemed to operate beyond restraint. Whether Genghis Khan was irrational is beside the point; others behaved as if he might be.

Joseph Stalin operated in a very different setting: the bureaucratic, industrializing Soviet state of the 20th century, where power was exercised through institutions, purges and fear. As General Secretary of the Communist Party, he inherited the machinery of Vladimir Lenin but took it in a direction few anticipated. His Five-Year Plans transformed the economy at immense human cost, while purges eliminated enemies, real and imagined. Even fellow revolutionaries were not safe: Leon Trotsky was exiled and eventually assassinated.

Crucially, Stalin’s rule was characterized not just by brutality but by unpredictability. Decisions appeared arbitrary; loyalties could reverse overnight. No one could be certain of the limits, because there were none.

Adolf Hitler followed a different path but produced a similar effect. His rise depended on fusing charismatic authority with national identity. Once in power, he defied conventional constraints. His impulsive, ideological and often strategically baffling leadership confounded allies and enemies alike. There were assassination attempts, even from within his own ranks, yet he retained intense loyalty from figures such as Luftwaffe Commander-in-Chief Hermann Göring.

In all of these cases, the pattern is evident: power reinforced by the perception that the leader might act beyond reason, beyond norms, beyond comprehension. The onus is on the leader to sustain that image — psychopath, megalomaniac, obsessive — but never a rational, calculating figure. Madman Theory doesn’t depend on whether the image is true or make-believe. What matters is that others believe it. And the others include followers, friends and, most importantly, enemies.

A more modern and very different case is Imelda Marcos, the former First Lady of the Philippines. She doesn’t belong in quite the same category as the great tyrants of the 20th century, nor did she wield power in the same way. Yet her public persona introduced a distinct, perhaps unique, form of unpredictability. Her rule, alongside President of the Philippines Ferdinand Marcos, was marked not only by abrupt interventions but by extraordinary extravagance. Where others projected menace through violence, Marcos added theater: excess itself became a signal, unsettling in its disregard for restraint. She famously owned thousands of pairs of shoes, bought perfume by the gallon and once splurged $7 million on jewelry.

This was not madness in any clinical sense: It was performative extravagance, an idiosyncratic form of power that kept her followers in awe. Madman Theory need not be fully realized; even its partial expression can shape how others respond.

Trump and the value of uncertainty

And so to today — and a plan that seems crazed, until it starts working. Trump has certainly unnerved the world. His expletive-loaded posts alone betray the lack of dignity and respect typically associated with world leaders. Then there is his penchant for fabrication, and his unfulfilled threats. Often, his hyperbole is excused as mere bluster, but is that all it is? Trump’s impulsivity seems almost too outlandish to be genuine. Surely no human being, never mind an elected president, could think and behave so preposterously. Surely.

To be fair, he is not the only political leader to unsettle observers. Putin’s prosecution of the Ukraine war has raised persistent concerns about escalation. North Korean Supreme Leader Kim Jong Un has long alternated between provocation and restraint, keeping adversaries uncertain whether they face calculated brinkmanship or something less controllable. Trump belongs, at least in part, in this company.

So, is Trump’s apparent madness real or strategic? His record allows for both interpretations. Allies are praised, then rebuked, sometimes in language that veers from jocular to incendiary. Even high-stakes diplomacy is reduced to the language of “deals,” as if geopolitical conflict were a used car sale.

Other politicians surely suspect that some of Trump’s departures from established norms are so aberrant that he cannot actually mean them. But even if they suspect design, they remain uncertain.

It’s not difficult to imagine how this might shape decision-making in Moscow. Let’s return to the question raised in our opening paragraph. Picture Putin at his long table in the Kremlin, advisors gathered at a careful distance.

“We should anticipate what Trump will do,” he begins.

“Nothing,” one replies. “Ukraine is not part of NATO.”

“You think a technicality constrains him?” another asks. “He may arm them heavily. He may even place American missiles within reach of Moscow.”

“That would be crazy.”

“And…?”

Silence.

That silence is the point. The problem with a “madman” is not that he will act, but that he might — and that no one can be sure how far he would go. Faced with such uncertainty, even a nerveless leader may hesitate.

This doesn’t mean Trump has consciously revived Nixon’s strategy. The simpler explanation, that he is erratic rather than strategic, remains plausible. Nor does it mean Madman Theory always works; history offers as many warnings as endorsements.

But it does suggest an answer to the opening question. Would Putin have invaded Ukraine if Trump had been in the White House? Probably not. Not because Trump would certainly have acted, but because Putin couldn’t have been certain that he wouldn’t.

[Ellis Cashmore is the author of .]

[ edited this piece.]

The views expressed in this article are the author’s own and do not necessarily reflect 51Թ’s editorial policy.

Would Michael Jackson Have Survived in the #MeToo Era?
Fri, 10 Apr 2026

Non omnia quae mortua sunt, mortua manent — not all that is dead remains dead.

Michael Jackson died in 2009, steeped in debt. But he certainly didn’t remain dead; a reinvigorated Jackson was restored to life. His record sales spiked, a movie deal was done and, within a year, Jackson made $275 million — more money than any other musician or actor, dead or alive, over the previous 12 months. Other lucrative events included a Cirque du Soleil production and a hit Broadway show, which together brought in over $3 billion in earnings.

But the spectral Jackson also had detractors who refused to let the allegations fade, even after Santa Barbara County Superior Court cleared him of sexual molestation charges in 2005. Suspicions of an unwholesome side to Jackson surfaced as early as 1993, when screenwriter Evan Chandler accused him of abusing his son, Jordan Chandler. A legal settlement the following year prevented this from damaging Jackson’s then-flourishing career. (The settlement limited what could be depicted about the issue artistically and, for a while, imperiled the planned biopic film.)

Head in a lion’s mouth

Less than a year after the allegations and settlement, Jackson married Lisa Marie Presley, daughter of the world-famous musician Elvis Presley. For years before the marriage, Jackson’s androgynous presentation, high voice, lack of tabloid-documented romantic history and unusually childlike persona had prompted speculation about his sexuality. Gossip columns periodically asked whether he might be gay or asexual. These unsubstantiated rumors circulated widely and gained impetus from the settlement, making the Presley marriage appear as a validation of his heterosexuality.

Jackson and Presley separated in 1996. That same year, only months after finalizing the divorce, Jackson married nurse Debbie Rowe, with whom he had two children. A third child by an unknown mother followed in 2002.

Exactly what was on Jackson’s mind when he agreed to appear in a documentary fronted by journalist Martin Bashir is unclear. If he was trying to improve his public image, it was a catastrophic mistake. Bashir had earlier interviewed Princess Diana and, while it wasn’t clear at the time, used deceptive methods to persuade her. By the time he agreed to Bashir’s request to film him, Jackson had spent over 20 years in the unforgiving glare of showbusiness. Any claim to ingenuousness about media exposure was difficult to sustain. Jackson’s decision was rather like starving a lion for a few days and then putting his head in its mouth.

Jackson talked about regularly having sleepovers with children, including a young cancer patient named Gavin Arvizo. Bashir ended the HBO program with his verdict on Jackson’s home, known as Neverland Ranch: “A place where his enormous wealth allowed him to do what he wanted, when he wanted, how he wanted.” The New York Times described Jackson as “creepy, but almost touching in his delusional naïveté.”

The program screened in February 2003. That December, Santa Barbara County District Attorney Tom Sneddon charged Jackson with committing lewd and lascivious acts with a child under the age of 14.

In 2005, Jackson stood trial; the jury heard allegations that he had abused a 13-year-old boy and exposed him to “strange sexual behavior” during visits to Neverland Ranch. But the jurors ultimately concluded the prosecution had not proved its case beyond reasonable doubt. Jackson was exonerated of all charges, walked from court an innocent man and remained legally so for the rest of his life. Innocent, that is, in a legal sense: Rumors persisted up to and beyond his death in 2009.

Where there’s smoke…

On October 5, 2017, The New York Times published a report detailing allegations of sexual harassment against Hollywood film producer Harvey Weinstein. Among those who spoke publicly were actors Rose McGowan and Ashley Judd. The revelations triggered a cascade of accusations against Weinstein that culminated in his arrest and, in February 2020, his conviction for felony sexual assault and a sentence of 23 years’ imprisonment. Weinstein maintained his innocence.

The significance of the case extended far beyond one powerful producer. For decades, stories circulated in Hollywood about men who traded professional opportunities for sexual favors, the notorious “casting couch” becoming shorthand for a system of exploitation long acknowledged but rarely challenged.

More than a decade earlier, in 2006, activist Tarana Burke had begun using the phrase “Me Too” to support survivors of sexual abuse. After the Weinstein revelations, the phrase was repurposed as a global hashtag and rallying cry. What followed was one of the most consequential cultural shifts since the rise of the women’s liberation movement in the 1970s.

Within a year, hundreds of prominent men across politics, entertainment and media faced allegations of sexual misconduct. Some were prosecuted, many were not. Yet formal verdicts often mattered less than public judgment. Careers ended, projects met cancellation and reputations collapsed even in the absence of criminal convictions.

A new principle seemed to have taken hold: Accusations alone could be enough to remove powerful men from positions of influence. The informal tribunal of public opinion proved faster, harsher and often more decisive than the courts. Guilty or innocent no longer seemed to matter. The adage, “where there’s smoke, there’s fire,” became a serviceable rule of thumb.

Not guilty. So?

Now, reimagine the Jackson episodes I described earlier. In the post-Weinstein world, a settlement may still resolve a dispute legally, but it does not always relieve the defendant of blame, even when the out-of-court agreement involves no admission of liability.

The most dramatic illustration of this occurred in 2022: then-Prince Andrew’s settlement with trafficking survivor Virginia Giuffre, who had accused him of sexual assault. Andrew paid an undisclosed amount and donated a sum to a charity. He avoided a trial, but invited a blizzard of innuendo. Further investigations into his relationship with the disgraced financier Jeffrey Epstein pushed Andrew into an inescapable corner. King Charles III stripped him of his titles, relieved him of his royal duties and made him an unwilling symbol of privileged depravity.

In 1994, Jackson’s global popularity was comparable to Taylor Swift’s today. His albums Off the Wall, Thriller and Bad had established him in the same class as Elvis and the Beatles. His video, Michael Jackson’s Thriller, remains a classic of its genre. None of the disorienting strangeness of later years had yet appeared and Jackson, like his peer, Madonna, enraptured audiences everywhere.

His prodigious popularity would have been a defense against cynics who suspected the settlement disguised indecent tendencies. Of course, Jackson never had to contend with social media, as he would in the #MeToo world; that in itself could have wrecked his reputation. But it’s conceivable, even likely, that the immense adoration he commanded would have been powerful enough to sustain him. The 2003 charges, however, were unexpected and uncontainable.

Remember: Jackson was eventually acquitted on all seven counts of child sexual abuse and two counts of administering an intoxicating agent. But, as we know, the legal precept “innocent until proven guilty” lost purchase in the wake of the Weinstein case. In 2022, actor Johnny Depp won damages from his former wife, actress Amber Heard, who had accused him of domestic abuse. But he lost his role in Disney’s Pirates of the Caribbean franchise and, as an on-screen actor, has only appeared in 2023’s Jeanne du Barry since. Actor Kevin Spacey was first accused of sexual assault in 2017, was found not guilty of sexual offenses at a criminal trial in 2023 and has since faced a separate civil case. In these cases, the actors were accused wrongly, but offers for dramatic roles dried up.

The probability is that Jackson too would have been canceled, his legal innocence overridden by a verdict reached in the less formal but far more potent tribunal of culture. In 2005, when he was cleared, the shadow of the allegations was troubling but not fatal. Of more immediate concern was his extravagant lifestyle, which left him with colossal debts — estimated at his death to be “more than half a billion dollars.”

Reissues of earlier albums kept public interest alive, but Jackson himself became a recluse. So, when he announced his first live concerts in 15 years in 2009, it seemed to confirm he needed money. A two-month residency at London’s O2 Arena was thought to be enormously lucrative. When the concerts sold out and tickets were resold online for $10,000, more dates were added. At 50, Jackson seemed to be on the verge of making an improbable but spectacular comeback. In preparation, he threw himself into an exhausting rehearsal schedule. But as we now know, the concerts never took place.

Within three months of the announcement, Jackson was found dead at his Los Angeles home. The death was ruled a homicide and his personal physician, Conrad Murray, was convicted of involuntary manslaughter in 2011. He had given Jackson a lethal dose of propofol, a powerful anesthetic. Jackson, the public soon found, was also a habitual user of painkillers such as OxyContin and Demerol.

In 2019, an HBO documentary, Leaving Neverland, featured graphic testimony by two men, Wade Robson and James Safechuck, who alleged Jackson abused them as children. A less publicized claim followed when five members of the Cascio family, longtime friends of Jackson, claimed that Jackson groomed and abused them over decades, beginning when they were children. Jackson’s estate quietly paid the accusers $2.5 million.

Would a middle-aged Jackson, apparently scarred by the unproven accusations, beleaguered by debt and at least 12 years past his peak, have been offered a lucrative assignment in London and sold out? It’s not unthinkable, but fanciful just the same. Like film producers, concert promoters would tend to treat even lightly soiled A-listers with caution. AEG Live, the prospective promoter of the London “This Is It!” concerts, as they were called, would probably not have taken the gamble; in the event, the promoters were well insured.

Would Jackson have lived?

Paradoxically, the #MeToo environment could have saved Jackson’s life. Were promoters disinclined to book him and record labels reluctant to offer contracts, he would have been forced to adjust his profligacy and restructure his debts. He still had income from his valuable investments in music publishing.

Perhaps he would have yearned for the buzz of live music and the entertainment industry in which he had been involved since he was six. Yet he would have had the support and comfort of his three children, growing into adolescence, around him (all three children are now in their 20s). He might still have relied on pharmaceuticals to get a night’s sleep, but not the intravenously administered nightly cocktail that ultimately killed him.

So, would Jackson have survived in the #MeToo climate? In a professional sense, no. He would have been quietly ushered toward showbiz oblivion, living — probably to the present day, when he’d be 67 — and remembered as a great but seriously flawed megastar. But he would have remained alive. The memory of the scandals would surely have receded, the music endured and the image of the prodigy turned global icon might gradually have eclipsed lingering suspicions.

Instead, his unexpected death froze the argument in place. Neither vindicated nor condemned, Jackson remains suspended between genius and tormentor, a figure whose legend is inseparable from the perhaps unanswerable questions that still surround him.

[Ellis Cashmore is the author of .]

[ edited this piece.]

The views expressed in this article are the author’s own and do not necessarily reflect 51Թ’s editorial policy.

Social Media Addiction is NOT Addiction
Thu, 09 Apr 2026

A Los Angeles jury recently held Meta and Google liable in a landmark US legal case, which found that social media platforms such as Instagram and YouTube are designed to be addictive to children. Addictive. What exactly does this mean? That engagement with these platforms produces a form of mental and physical dependence comparable to substance use? Not quite. More often, it appears to mean little more than intense, even habitual, engagement — something closer to enthusiasm than addiction in any strict sense.

Separating dependency from addiction

This distinction is crucial. Over the past three decades, social scientists have increasingly preferred the term dependency to addiction because it implies reliance without necessarily involving the biophysical changes that render an individual unable to function without a substance. A person may be dependent on shopping, sex, gambling or even social media and yet retain the capacity to stop; willpower, however strained, remains in force.

Addiction, by contrast, denotes something altogether more demanding: a condition in which repeated exposure produces physiological changes that diminish or even override volition. At that point, willpower alone is no longer sufficient. A heroin user, for example, doesn’t simply choose to continue using; their body itself has adapted to the drug in ways that make cessation profoundly difficult.

Yet the distinction is usually forgotten. “Addiction” has migrated from the clinic into everyday language, where it’s used to describe practically any activity repeated with gusto — even habitually eating chocolate. The conflation of dependency and addiction has consequences: What was once a term reserved for conditions involving physiological dependence and withdrawal has been repurposed to capture patterns of behavior that are, at root, voluntary, even if strongly incentivized.

Medicalization steps in

Not all habitual behavior is suspect. Many recurrent practices, such as attending church, are undertaken routinely and even ritualistically, without fresh deliberation on each occasion. Yet they’re widely regarded as beneficial, meaningful and socially valuable. So habit, in itself, is not pathology.

This is not merely linguistic drift; it reflects a deeper transformation in how we understand human conduct. As medical sociologist William C. Cockerham argues, health and illness are not simply biological facts but are shaped by social organization and institutional authority, especially that of the medical profession. Over time, behaviors once regarded as routines, preferences or even vices have been reclassified as conditions requiring diagnosis and possibly treatment. The expansion has been incremental, almost imperceptible, but its cumulative cultural effect is immense: Medicine now lays claim to areas of life that would once have been considered far beyond its remit.

Earlier critics warned of precisely this development. Writing in the 1970s, they argued that medicine was extending its jurisdiction beyond disease into the management of everyday behavior. At the time, such concerns appeared overstated. After all, the medicalization of conditions such as alcoholism, depression and anxiety brought undeniable benefits: stigma was reduced, sufferers were encouraged to seek help, and treatments — sometimes pharmacological — became widely available.

Few would wish to reverse these gains. In particular, athletes prone to mental health conditions were emboldened to talk openly about them, feeling no more shame than they would about a cruciate ligament injury.

But success has brought unintended consequences. The more effective medicalization has been in rendering suffering visible and treatable, the more tempting it has become to apply the same model to behaviors that do not share the same underlying characteristics. The analogy between physical and behavioral conditions was initially a useful heuristic; it has since hardened into equivalence. We no longer recognize that certain patterns of behavior resemble addiction; we say they are addictions.

Gambling vs social media “addiction”

Consider gambling. Once understood as a form of risk-taking or recreation, it was always known to become excessive, even ruinous. Today, it is routinely diagnosed as a disorder. Yet close examination of gamblers’ own accounts suggests a more complicated picture. Far from describing themselves as helpless or compelled, many interpret their gambling in terms of anticipation, strategy and reward — both intrinsic and extrinsic. They understand the risks and persist not because they can’t stop but because the activity itself is experienced as meaningful and pleasurable. The label “problem gambler” is applied mostly when losses accumulate; when fortunes reverse, the same behavior attracts admiration, not diagnosis. The boundary between pathology and normality, in other words, is contingent on context.

This reveals a tension at the core of contemporary medicalization. If a pattern of behavior is deemed pathological primarily when it leads to undesirable outcomes, the diagnosis risks becoming retrospective: It’s a way of explaining failure rather than identifying disease. What’s presented as compulsion may, in many cases, be persistence in the face of risk, sustained by the intermittent rewards that make activities such as gambling so thrilling and attractive.

The same logic supports the claim that social media is addictive. Platforms such as Instagram and YouTube are undoubtedly designed to capture attention. They lead users through cycles of anticipation and reward (likes, comments, new content) that encourage repeated engagement.

But repetition, even intense repetition, is not proof of addiction. It’s proof of reinforcement. Users return time and again because the experience is satisfying and because participation is embedded in the social environments they belong to. What seems to outsiders to be solitary behavior is, in reality, social interaction in the 21st century. To disengage is not simply to exercise willpower; it is, in many cases, to withdraw from a network of relationships, information and recognition.

Remember, “social media addiction” doesn’t appear as a formally recognized disorder in standard psychiatric classifications such as the Diagnostic and Statistical Manual of Mental Disorders (DSM-5-TR). That absence reveals a great deal: Courts and those cavalierly using the term “social media addiction” are effectively referencing a medical condition that lacks clinical recognition.

Decisions and diagnoses

Equally striking is how rarely young people themselves are taken seriously in this debate. Parents, clinicians, policymakers and now courts speak with confidence about the harms of social media, often without reference to the experiences of those who use it most. Research, including large-scale studies, suggests a more nuanced reality: Young users are typically aware, reflexive and capable of articulating both the rewards and risks of their online lives. The vast majority do not experience their engagement as detrimental, but as integral to their social life: This is just the way they communicate nowadays.

None of this denies that online harm exists. Some users, particularly younger and more vulnerable ones, may experience anxiety, distress or diminished wellbeing as a result of their online interactions. But harm alone is not a sufficient basis for medical classification. The critical question is whether such patterns of behavior are better understood as disorders of the individual or as features of a social world in which digital interaction has become not only commonplace, but fundamental.

The recent legal judgments against technology companies suggest that the response is increasingly being framed in medical terms. By accepting the language of addiction, courts risk reducing a social phenomenon to a clinical condition, one that implies compulsion where there may instead be choice, habit and human agency. The consequences are not trivial. Once behavior is defined as an addiction, responsibility shifts from user to platform and potentially to government.

[Ellis Cashmore is a co-author of .]

[ edited this piece.]

The views expressed in this article are the author’s own and do not necessarily reflect Fair Observer’s editorial policy.

The post Social Media Addiction is NOT Addiction appeared first on Fair Observer.

Madonna — Diva Provocatrix /culture/madonna-diva-provocatrix/ /culture/madonna-diva-provocatrix/#respond Mon, 16 Mar 2026 12:39:06 +0000 /?p=161264

“I think the most controversial thing I’ve ever done is to stick around. I have seen many stars appear and disappear, like shooting stars. But my light will never fade.”

So says Madonna, with a measure of defiance. She’s someone who understands that endurance, not provocation, is her greatest transgression. She is now 67: For more than four decades, she’s offended religious leaders, unsettled moral guardians and insulted polite society. Yet none of those affronts has proved as subversive as her refusal to exit quietly. In a culture organized around novelty and replacement, she’s managed to weaponize longevity.

Madonna’s career might be seen as a sequence of calculated shocks: The wedding dress writhing of “,” the supposedly sacrilegious imagery of “,” the BDSM themes of . A notable biography of her is subtitled . But her subversive moments, however incendiary at the time, were ephemeral. If anything, her most renegade accomplishments often went relatively unnoticed. Like earning $50 million (£26.7 million), a record for a female singer in 2004. Or selling more than 400 million records, including albums, singles and digital. Grossing more than $1.3 billion from her tours, another record. In 1992, she signed a then-unprecedented $60 million deal with Time Warner.

But what really distinguishes Madonna is not the intensity of any single provocation or her prodigious earnings but the cumulative force of her continued presence. She’s outlived her critics, her imitators and many of her contemporaries. The real scandal is not what she did but that she survived so long.

Her endurance matters not simply because it is unusual but because it allowed her cultural experiment to take place. Over decades, Madonna tested the limits of exposure, turning private life into public performance until the distinction between the two appeared to dissolve. What started as provocation became a template for modern celebrity.

The zeitgeist

In February, she sat in the front row at Dolce & Gabbana’s Milan Fashion Week , her arms wrapped around her knees, heavily tinted glasses shading eyes that have seen nearly every iteration of fame in the modern era. Leather gloves accessorized her black outfit, a theatrical flourish that harked back to her Erotica of 1992–93 (gloves, corsets and leather were part of the visual vocabulary she borrowed from fetish subcultures and, in that tour, repurposed for public consumption.) Across the mirrored runway, models twirled in lace and pinstripes, reflecting Madonna’s many incarnations of the past.

To call Madonna a diva is almost tautological. She is the very definition of one: a world-renowned singer famed for her volatile temperament and for being notoriously difficult to please. Formidable, demanding, exacting, she’s a force as likely to exhaust collaborators as she is to enchant audiences.

Her epigones and successors — Taylor Swift, Lady Gaga, Beyoncé and Ariana Grande included — entertain, enchant, influence and inspire, yet all seem anodyne next to Madonna. None has matched her performative ferocity, her willingness to court scandal and alchemize controversy into precious metal. Forty years in, Madonna remains unrepentant, uncontainable, unyielding, the center of attention. She may no longer shape the zeitgeist on her terms, but she remains part of it.

In the 1980s, the world was barely aware of cellphones, the internet was inconceivable and social media was something English novelist H.G. Wells might have dreamt up. Madonna arrived in this landscape as a wannabe dancer who soon learned how to take the cultural pulse. She figured out that the press (as it then was) could either proclaim or annihilate her, and that audiences rewarded artists who aroused as well as entertained them, who provided spectacle as well as song and dance. She decided to combine them all. In doing so, she did more than respond to a shifting world; she helped catalyze a further shift, scandalizing at every opportunity and dissolving the binary between private and public.

The experiment

Madonna Louise Ciccone moved to New York in 1978, a 20-year-old with nothing but ambition and a few borrowed instruments. She danced, drummed and sang with local bands before releasing her debut single “” in 1982 and her first album, , in 1983. By 1984, her second album, , produced by Nile Rodgers, cemented her international status. The video for the title track and her performance at the MTV Video Music Awards in a wedding dress, simulating masturbation, was a foretaste of what was to come.

In 1985, few could imagine a woman deliberately inducing scandal and usually achieving the results she desired. Madonna’s real innovation lay in recognizing something earlier entertainers had missed: Scandal had changed its meaning. No longer necessarily career-ending — as it had been in the cases of Roscoe Arbuckle, Ingrid Bergman and Errol Flynn — controversy had become a resource. Madonna didn’t provoke randomly; she choreographed provocation, each gesture and outfit a calculated engagement with public sensibilities. Audiences, she seemed to conclude, actually enjoyed being outraged: the surge of anger, shock and indignation was oddly satisfying. This may appear obvious today. In the 1980s, it was radically contrarian.

Her 1989 album marked what might have been a Eureka! moment. Madonna appeared to sense that audiences would demand ever more from stars. This was before MTV’s The Real World launched in 1992, allowing viewers to eavesdrop by watching what became known as reality TV. Madonna seems to have arrived at a broadly similar conclusion: Audiences were turning into peeping Toms.

Her ambition was not to shock for its own sake, but to maintain attention by disclosing more and more of what once passed as a private life — and without inhibition. Madonna became, in essence, her own living experiment in making her personal life open to inspection. Before her, entertainers like Elizabeth Taylor had, in the 1960s, allowed private lives to seep into public view via a more cautious media, but such exposure was rare, sensational and delivered to surprised audiences by the then-nascent paparazzi. Madonna made it a career strategy, presenting her personal self as indistinguishable from her stage persona and inviting audiences to witness. Not just witness: Audiences were encouraged to judge her; condemning Madonna was integral to her success.

Like Semtex

The 1990s solidified Madonna’s role as a cultural provocateur. The film Madonna: Truth or Dare (1991) documented her tour with unprecedented candor, offering glimpses into backstage rivalries, rehearsals and intimate moments, all alongside the theatricality of her onstage performances. It predated reality television by years, yet already anticipated its voracious appetite for the minutiae of celebrity.

Around the same period, her book Sex and the album Erotica pushed boundaries of sexual representation, blending performance, fetishism and artifice. She intentionally offended, proving unequivocally that scandal was like Semtex: a powerful explosive, but a pliable one that, handled carefully, could be molded into different shapes. In the years that followed, Paris Hilton and Kim Kardashian corroborated this when they appeared in sex tapes that would have ruined show business careers in earlier times.

Yet Madonna’s influence went beyond shock and outrage. Critics recognized her as a harbinger of postfeminist performance: She demonstrated how a woman could be sensual, assertive, ambitious and aggressive while curating her image in a way that conferred power. From this perspective, being sexy was a form of empowerment. Madonna’s conquests were both commercial and symbolic, reframing what it meant to be a female entertainer in a male-dominated industry. Her affectations, from the pink cone bra to platinum blonde hair, were signifiers of her autonomy.

By the mid-1990s, Madonna was both a diva in the operatic sense and a pioneer in media literacy. Her aforementioned 1992 renegotiation with Time Warner secured her own record label. She remained a polarizing figure: The world alternately praised and disparaged her, keeping her relevant. She had transformed scandal into art and fame into an instrument of social influence. The celebrity landscape she helped sculpt is what we see all around us today.

Even into the 2000s and 2010s, Madonna’s career reflected a Darwinian adaptability to changing environments. Her kiss with Britney Spears at the 2003 MTV Video Music Awards sparked a viral debate, raising questions about bisexuality. Later tours and albums demonstrated a willingness to collaborate with younger artists while retaining her signature sound. Her postfeminist sensibilities, rooted in self-expression and independence, carried through to her later albums and public appearances. At the 2023 Grammys, she responded to critics, accusing them of “ageism and misogyny.”

Diva provocatrix

Today, Madonna’s presence at Milan Fashion Week is emblematic of both her longevity and her continued authorship of the fame narrative. She’s still a model for what it means to inhabit the public sphere on one’s own terms. Unlike many successors, she hasn’t become her own tribute act. She’s refused to trade on nostalgia and strives to remain relevant. A figure whose demands, exacting nature and unyielding vision have shaped not only the entertainment industry but the very ways in which audiences understand and appreciate spectacle, Madonna is a reminder of the way we live — vicariously, voyeuristically, derivatively and by proxy.

Her legacy is inseparable from the media she mastered and, to be fair, was mastered by. Madonna didn’t merely reflect social and technological changes — she anticipated them, attempted to manipulate them and tried to force the world to respond. It did: From MTV to social media, from the controversy of Like a Prayer to the candor of Truth or Dare, she engineered a dialogue with audiences that has altered our relationship to celebrities. Many will not think this is such a good thing.

Madonna belongs in the same pantheon as Maria Callas (1923–77), Judy Garland (1922–69) and Barbra Streisand (b. 1942), all imperious figures feared as much as revered for their exacting standards and refusal to accept reality when it failed to conform to their visions. Like them, Madonna has attracted detractors as well as worshippers, her difficulty inseparable from her distinction. Yet she added something new to the tradition: Madonna was not simply a diva but a diva provocatrix, a performer who treated outrage as an artistic medium. While there are many contemporary stars of immense wealth and visibility, none appears willing — or permitted — to embody the risk, volatility and sheer force that once defined the type. Perhaps Madonna truly is the last of them.

[Ellis Cashmore is the author of ]

[ edited this piece.]

The views expressed in this article are the author’s own and do not necessarily reflect Fair Observer’s editorial policy.

The post Madonna — Diva Provocatrix appeared first on Fair Observer.

Diana’s Ghost Haunts Britain’s Royals /culture/dianas-ghost-haunts-britains-royals/ /culture/dianas-ghost-haunts-britains-royals/#respond Wed, 25 Feb 2026 17:08:23 +0000 /?p=160977

She wasn’t there, but her presence was undeniable. The Andrew Mountbatten-Windsor debacle has erupted in a way that would have been unthinkable without Diana, Princess of Wales: Her willingness to draw the world’s media into her confidence and share her life changed both the way royals treated the media and the media’s methods of covering an institution they had handled with excessive care for decades.

Since the then-Prince’s decision to grant an interview to BBC television in 2019 to discuss his relationship with Jeffrey Epstein, his life has been sliced open and examined forensically. The interview caused a reputational cataclysm, making Andrew appear aloof and indifferent. After that, the media have examined and interpreted his every gesture and treated every silence as evidence. The police have acted decisively and pitilessly. Where once the public would have looked away to avoid witnessing the impropriety, they have glared intently and without inhibition.

There’s no protective shield of deference, no instinctive reluctance to look too closely. Instead, there’s a degree of disclosure that would once have been unthinkable: Continuous, intimate and often unforgiving. The House of Windsor, once cushioned by mystique, is now consumed as spectacle — global spectacle. This transformation didn’t occur suddenly, nor can it be attributed to a spontaneous change in journalistic policy, the rise of the tabloids or global satellite broadcasting, though all these contributed to the cultural shift of the 1980s.

This transformation occurred when Diana appeared. Young, photogenic and emotionally legible, she didn’t merely join the royal family; she altered its relationship with the media and thus its visibility. When that changed, so did everything else. The protocol unraveled, and the monarchy has struggled to manage ever since.

Royal mystique

Before Diana, the royal family was presented like characters in a Noël Coward play: elegant, composed and emotionally self-contained. They were visible but inaccessible; ever-present yet remote; simultaneously touchable and untouchable. The media reported on ceremonies, births and funerals, but rarely intruded on private emotional affairs. Royals were not expected to reveal themselves. Their authority depended, in large part, on their opacity and mystique. They were less individuals than personifications of majesty.

Elsewhere, however, a new and more invasive form of journalism had begun to develop. In postwar Italy, freelance photographers adopted aggressive tactics to capture candid images of famous figures, most notably Elizabeth Taylor, whose life the media turned into a scandalous spectacle for audiences around the world in the 1960s. The paparazzi, as they came to be known, transformed the relationship between public figures and the media. Privacy became provisional, subject to negotiation or violation depending on commercial value. Yet Britain’s royal family remained largely insulated from this development. Even the publication of photographs showing Princess Margaret in intimate circumstances with Roddy Llewellyn in 1976 represented a disturbance or a crack in the royal mystique — depending on perspective. The monarchy absorbed the shock and resumed its usual stately equilibrium.

Diana’s arrival coincided with wider cultural changes that would make such equilibrium impossible to sustain. The 1980s witnessed the rapid expansion of celebrity culture, fueled by global television, mass-circulation magazines and a growing appetite for personality-driven narratives. Fame itself was becoming democratized and commodified. Diana entered royal life not as a seasoned media strategist but as a young ingénue whose emotional openness aligned, perhaps unwittingly, with this newly developing environment. The traditional reserve of royalty was alien to her: She allowed audiences glimpses of vulnerability, loneliness, uncertainty and emotional wounds, all the time offering a new kind of pleasure — guiltless eavesdropping.

Her closest counterpart was not another royal but iconic pop star Madonna, whose ascent during the same decade exemplified a new kind of fame built on continuous exposure, uninterrupted scandal and perpetual reinvention. Madonna’s pursuit of attention seemed strategic, while Diana’s usually appeared reactive. Both women thrived by making common cause with a media that rewarded accessibility and a certain narrative tension. Both blurred the boundary between private experience and public performance. Diana didn’t overwhelm the media with drama and narrative. But by making herself visible and accessible, she normalized a new conception of the monarchy: still an august institution, but one that could be seen and understood through the same interpretive lens as celebrity.

Fairytale

Diana’s marriage to the then-Prince Charles was presented explicitly as a fairytale, not as retrospective embellishment but as contemporary cultural framing. On its wedding-day front page, the Daily Mirror described the occasion as “the fairytale wedding,” while publishers quickly consolidated the narrative in longer form, including a 1982 biography of Diana subtitled .

When the marriage began to unravel, the media did not abandon this narrative so much as invert it. Headlines lamented that “the fairytale is over,” preserving the story’s mythic structure even as its emotional valence shifted. Diana remained the innocent protagonist, while Camilla Parker Bowles (now Queen Camilla) — cast as “the other woman” — assumed the role of antagonist. The monarchy had been translated into the language of folklore.

Diana’s own actions reinforced this construction. Her willingness to cooperate with journalists, to communicate indirectly through carefully timed disclosures and, ultimately, to submit to the now-notorious Panorama interview with Martin Bashir in 1995 marked a decisive break with royal precedent. No member of the royal family had ever spoken so candidly, or so publicly, about intimate emotional pain. The interview did more than reveal personal suffering: It redefined expectations of the monarchy. Audiences no longer saw the royals as protected.

Bashir, it was later learned, had procured the Diana interview using ethically questionable methods. He would later conduct a similarly revealing interview with Michael Jackson, another global figure whose life became inseparable from media scrutiny. While there is no moral equivalence between the two interviews, taken together, they suggest that, by the mid-1990s, royalty and celebrity occupied the same symbolic terrain. Both were subject to the same processes of exposure, interpretation and commodification. Diana stood at the center of this convergence. She was not merely its most visible casualty but its most consequential catalyst.

Her death in 1997 marked the end of her life but not the end of her influence. If anything, her absence intensified her symbolic presence. The extraordinary public grief that followed revealed the depth of emotional investment she had inspired. Millions mourned not simply a princess but a figure they felt they knew intimately. The monarchy, by contrast, appeared uncertain, its traditional reserve suddenly out of step with public expectation. The institution that had once defined the terms of its own visibility now struggled to respond to forces beyond its control.

The logic of celebrity

In the decades since her death, the media environment she helped shape has expanded and intensified. The rise of digital platforms has accelerated the circulation of images and narratives, while audiences have become active participants in the construction and dissemination of scandal. The royal family now exists in a system that rewards exposure and punishes concealment. Transgression is both condemned and consumed. Public figures are elevated, scrutinized and, when they falter, subjected to ritualized humiliation.

This dynamic has affected Diana’s sons. Prince Harry has adapted to the logic of celebrity, relocating to the United States and engaging directly with media institutions that his mother helped legitimize as sites of royal storytelling. His wife, Meghan, Duchess of Sussex, brought with her an understanding of media culture shaped outside the constraints of monarchy. Together, they have navigated a world in which royal identity is still a constitutional status, but one that amplifies narrative consequence.

While some royals have adapted and evolved in the ecosystem, others have fared less successfully. Andrew, once secure within the protective structures of royal life, has found himself exposed to the same unforgiving scrutiny faced by disgraced celebrities in other fields. His fall from public grace illustrates the extent to which royal status no longer guarantees insulation from reputational collapse. Despite maintaining his innocence, Andrew has been treated not as a prince apart, but as a public figure subject to the full force of media scrutiny and legal process.

In February, Mountbatten-Windsor was taken into custody by police following a raid at his Sandringham home, the episode captured by photojournalists. He is the first senior member of the royal family to be detained by authorities in circumstances of this kind since Charles I was taken prisoner in 1647.

The death of Elizabeth II removed the last enduring link to the era before this transformation. Her reign had provided continuity and an element of stability, preserving the appearance that the monarchy existed above media attention. Her successor, Charles — Andrew’s brother, of course — now presides over an ancient institution that must operate in a modern cultural landscape, one in which visibility is a necessity and can be a liability.

Diana’s legacy lies in the terrain she transformed: Her influence continues to shape how monarchy and media interact. The manner in which she conducted her life and her relaxed relationship with journalists meant that the distance between the monarchy and the media would diminish during her life and keep diminishing after her death. The consequences of this change continue to unfold. 

Would a more deferential media even approach a subject that could have alienated consumers as easily as it could have excited them? Andrew was never the most popular figure in the royal family, but some could have bridled at the sensationalism afforded his apparent errancy. It’s doubtful that a police force in earlier times would have whisked Andrew away from his home to a police station for questioning and returned him home in a manner befitting a bank robbery suspect. These are hypotheticals, but not unanswerable: No, in all cases. It’s difficult to imagine the Mountbatten-Windsor scandal unfolding as it has before, say, 1990. 

Audiences today are fascinated by rule-breaking but equally by its baleful consequences. Our curiosity isn’t natural but cultivated, and nowadays participatory, sustained by social media tech that allows constant observation and interaction. The royal family, once insulated by reverence, now exists as a permanent object of scrutiny, its struggles consumed as cautionary parables but, more usually, as plain entertainment. We’re enthralled by the prospect of an English prince entangled in an international web of patriarchal exploitation and leaked documents on investment opportunities.

Diana may be gone, but the conditions she helped create remain. She altered not only the monarchy’s relationship with the media but the public’s relationship with the monarchy. And perhaps the monarchy itself. The House of Windsor no longer exists as a realm apart. It is part of the same unforgiving system that governs all modern fame. Andrew’s case illustrates the final consequence of Diana’s revolution: monarchy no longer stands apart from celebrity culture. It operates inside it — exposed to its volatility, dependent on its visibility and vulnerable to its judgments.

[Ellis Cashmore is the author of , published by Bloomsbury.]

[ edited this piece.]

The views expressed in this article are the author’s own and do not necessarily reflect Fair Observer’s editorial policy.

The post Diana’s Ghost Haunts Britain’s Royals appeared first on Fair Observer.

TRUMP vs. the BBC /world-news/us-news/trump-vs-the-bbc/ /world-news/us-news/trump-vs-the-bbc/#respond Thu, 19 Feb 2026 14:08:37 +0000 /?p=160878

When a sitting or former US president sues a media organization, it’s big news. When they sue the British Broadcasting Corporation for $10 billion, it’s something else, closer to a geopolitical spectacle than a legal action.

Florida judge Roy K. Altman has set a February 2027 trial date for US President Donald Trump’s defamation lawsuit against the British public service broadcaster. The claim centers on an episode of the BBC current affairs program Panorama, titled “Trump: A Second Chance?” The episode edited together two passages of Trump’s speech on January 6, 2021, in a way that appeared to suggest he had directly urged his supporters to march on the US Capitol and “fight like hell.”

Trump has sued American outlets before, and his record is mixed. In 2024, ABC News settled a lawsuit after anchor George Stephanopoulos inaccurately described the E. Jean Carroll verdict as a finding of “rape” rather than sexual abuse under New York civil law. The settlement reportedly included a multimillion-dollar payment toward Trump’s future presidential library and legal fees. In 2025, CBS News and its parent company, Paramount Global, also reached a financial settlement over a 60 Minutes segment Trump claimed was misleading.

But other suits have failed. In 2023, a federal judge in Florida dismissed a $475 million defamation claim against CNN over its use of the phrase “the Big Lie.” A separate multibillion-dollar action against The New York Times met a similar fate. American courts have repeatedly emphasized the high constitutional threshold for public figures alleging defamation. Two settlements, two dismissals. A 2–2 record, if we want to keep score.

But this is different. The BBC is not a partisan cable network in the crowded US market. It is a century-old British institution, funded primarily by a license fee, chartered to inform and educate as well as entertain. It does not allow advertising. The BBC is woven into the cultural fabric of the United Kingdom and regarded internationally as the Rolls-Royce of broadcasting.

That is what makes this case extraordinary. It’s not simply Trump versus another newsroom. It is Trump versus a totem of British civic life. And the near-theatrical $10 billion figure signals that this is about much more than compensation. It’s about what or who has authority, power and legitimacy on a global stage.

Error of judgment

The BBC has already conceded that the program spliced together two segments of Trump’s speech, delivered nearly an hour apart, without making that clear to viewers. The effect was to compress his rhetoric into a single, more incendiary sequence. Critics argue that the edit omitted a crucial line in which Trump urged supporters to protest “peacefully.”

After an internal uproar and the leak of a critical document by Michael Prescott, a former advisor on editorial standards, the BBC apologized. Its chair, Samir Shah, described the edit as an “error of judgment.” Director General Tim Davie accepted responsibility before stepping down amid the wider turbulence. Deborah Turness, chief executive of BBC News, also departed.

Crucially, though, the corporation stopped short of admitting liability. It offered no damages. And it strenuously denied malicious intent. So, when Trump’s lawyers escalated the matter into a multibillion-dollar suit filed in Florida, the BBC challenged the court’s jurisdiction, arguing that the program was neither produced nor broadcast in Florida and was not available there via its streamer BritBox as alleged. Judge Altman rejected attempts to delay discovery; the case will now proceed.

Could the broadcast genuinely have damaged Trump’s checkered reputation? That is the legal nub of the matter. Defamation law in the United States, especially for public figures, sets a prohibitively high bar. A claimant must show not only falsity but “actual malice”: knowledge of falsity or reckless disregard for the truth. Trump’s legal team argues the edit was “intentionally and maliciously” misleading. Note: Intentionally. The BBC says it was a mistake, now acknowledged.

There are precedents for media organizations paying dearly for editorial lapses. But there are few, if any, precedents for a British public broadcaster facing a $10 billion claim in an American court over a documentary edit. And, in this instance, timing matters because the BBC doesn’t enter the fray in good financial health.

Under pressure

Even without Trump, the BBC is under pressure. The broadcaster is pursuing savings of up to (over $860 million) over three years. License fee revenues are falling as households move toward streaming platforms and social media. Around fewer British households paid their license fee in the last reported year. Departments are bracing for cuts. Outsourcing is inevitable.

Public service broadcasters were never designed to absorb shocks of this magnitude. Unlike commercial rivals, the BBC doesn’t rely on advertising or subscription revenue. It’s funded by a compulsory license fee whose legitimacy is periodically contested in Parliament and in public debate. It is right now.

In that context, a $10 billion liability — even a fraction of it — would not be an ordinary line item. It would be an existential catastrophe. (While the BBC isn’t in literal “debt” like a business with a balance-sheet liability that must be paid off, it is running deficits and facing revenue shortfalls and operating pressures that are forcing cost cuts and license fee increases.)

Which brings us to the first conjecture.

What if Trump wins?

Bookmakers, were they to set odds, would likely price a full $10 billion victory as 25/1, maybe 33/1 at most. The legal hurdles are formidable. Yet imagine, for argument’s sake, that Trump prevails and secures a judgment on that scale.

The immediate consequence would be seismic. The BBC’s annual budget is roughly £5 billion ($6.8 billion). A damages award of $10 billion (about £7.5 billion) would eclipse its annual income. Even a significantly reduced award could destabilize the corporation’s finances, potentially forcing emergency government intervention or radical restructuring.

The reputational damage would be devastating, too. For a broadcaster that trades on credibility, reliability and impartiality, a court finding of malicious defamation would undermine its moral authority at home and abroad. Critics who already question its impartiality, neutrality and objectivity — and there are plenty in the UK and elsewhere — would feel vindicated. Politicians skeptical of the license fee would gain leverage. Calls to privatize, allow advertising or dismantle the corporation completely would intensify.

For Trump, by contrast, victory would be nectar. He has long depicted mainstream media as hostile and dishonest. A courtroom triumph over, of all broadcasters, the BBC would validate his narrative to a global audience. It would bolster his standing among supporters who see him as a victim of elite institutions. It would inflate his already considerable self-belief.

And there is a longer-term implication: A Trump win would signal to media organizations worldwide that editorial misjudgments, even acknowledged and corrected, can carry calamitous financial risk. The effect could be sobering. Investigative journalism, already expensive and fraught, might grow more cautious. Legal departments would gain power. Editors would hesitate. The media would be domesticated.

That might please those who see the media as already too powerful and untouchable. It would trouble those who value the media’s autonomy and ability to criticize without fear.

What if the BBC wins?

The alternative is less obvious but still significant. Suppose the court finds no defamation — perhaps that the edit, while an error, did not meet the standard of actual malice. The BBC would emerge legally vindicated. Bookies might price this as evens, perhaps 5/6 (meaning you stake $6 to win $5 if the case is thrown out).

A “victory” for the Beeb would not, however, bring a reward of $10 billion from Trump. Nor would it remove the BBC’s structural financial problems. License fee decline would continue. Savings targets would still loom. Pride and honor might be restored, but balance sheets would not be any healthier.

For Trump, defeat would sting. By February 2027, he will be 80 years old and approaching the end of a second term in office. Attention will be shifting to the next presidential contest, which is scheduled for November 7, 2028. The Republican Party will be thinking about succession and electability.

In July 2024, gunman Thomas Matthew Crooks attempted to assassinate Trump, grazing his ear with a bullet — an event that underscored how deeply the now-president divides American society. A courtroom loss would not change his polarizing potential or end his overall influence. His capacity to command loyalty and shape narratives, as well as antagonize detractors and engender hatred, would remain formidable. But failure would pierce the aura of inevitability that has often surrounded him. For a leader who reduces complex events to deals or no-deals, a public defeat against a foreign broadcaster would be an unequivocal disaster. Not just defeat, but humiliation.

Would it be transformative? No. The BBC would continue to struggle financially. Trump would continue to dominate attention for at least the remainder of his tenure. Yet the symbolism would matter. It would reaffirm the resilience of established media institutions against political assault. It would remind would-be litigants that courts are not just campaign platforms.

Beyond damages

Strip away the legal briefs and this case is about something larger: the collision between a populist politician who thrives on confrontation and a public broadcaster that embodies an older model of civic rectitude.

Trump has built a career on challenging institutions, including courts, universities, newsrooms and intelligence agencies. The BBC represents a particularly attractive target: foreign, publicly funded, proud of its editorial standards, perhaps even haughty about the global prestige it still enjoys after over a hundred years of broadcasting.

The corporation, for its part, is navigating a media environment transformed by YouTube, Netflix, TikTok and myriad streaming services. It’s pruning costs while trying to maintain global reach. It can ill afford complacency at the moment. The Panorama edit was, by its own admission, a lapse. In an era of forensic scrutiny, lapses can be expensive.

What happens in that Florida courtroom in 2027 will reverberate far beyond the litigants. A Trump victory could reshape the risk calculus for journalism worldwide. A BBC victory would help stabilize an institution under strain and reinforce the legal protections that enable robust reporting.

Either way, this is emphatically not routine litigation. It’s a clash of reputations: one personal and political, the other institutional and national. When the gavel falls, the consequences will extend well beyond damages.

[Ellis Cashmore is the author of , now in its third edition.]

[ edited this piece.]

The views expressed in this article are the author’s own and do not necessarily reflect 51Թ’s editorial policy.

The post TRUMP vs. the BBC appeared first on 51Թ.

Should FIFA Pull the World Cup Out of the US?

Tue, 03 Feb 2026

Recently, former FIFA president Sepp Blatter shared his views on the appropriateness of the USA as a host of association football’s quadrennial tournament. In a social media post endorsing comments from Swiss reformer Mark Pieth, Blatter urged fans not to travel to the United States for the 2026 event, warning that the social climate there “hardly encourages fans to go.” Blatter cited recent fatal shootings by federal immigration agents in Minneapolis, Minnesota — Immigration and Customs Enforcement (ICE) agents killed two US citizens on January 7 and 24 — as emblematic of a society in crisis and a political administration unconcerned about civil liberties. Blatter has a point.

Football’s unifying spectacle?

For a sports event established and still trading on the rhetoric of world peace and unity, such a controversial intervention from a past FIFA leader is unprecedented and extraordinary. FIFA, football’s global governing organization, has long insisted that politics should be divorced from sports. Its statutes historically prohibited expressions of political opinion on the pitch, and its public posture has been to allow host nations’ flag-waving patriotism while disavowing partisan disputes and domestic politics. But the context of what has been happening in the US has now made that firewall look flimsy.

The US is co-host for the 2026 World Cup alongside Canada and Mexico. Eleven US cities, from Atlanta, Georgia, to Seattle, Washington, will stage 78 of the 104 matches, including the showpiece final in July at MetLife Stadium in East Rutherford, New Jersey. The US considers what Americans call soccer a rising sport and a marquee opportunity to showcase itself globally. (The word soccer originally derived from a slang abbreviation of the “-soc-” in association football.)

Yet, as many know, the country hosting these games is currently deeply fractured. US President Donald Trump’s aggressive extrajudicial immigration enforcement operation centered in Minnesota has seen more than 3,000 arrests and multiple deaths. The killing of Alex Pretti, an intensive care nurse, by a federal agent while filming immigration enforcement sparked massive protests and led Minnesota Governor Tim Walz to mobilize the National Guard.

On January 21, the Eighth US Circuit Court of Appeals lifted an injunction that had restricted immigration agents from arresting peaceful demonstrators. Civil liberties groups denounced the move as a violation of First Amendment rights in a tournament host state. And in what Minnesota activists called the first general strike in eight decades, hundreds of local businesses shut down and thousands demonstrated in protest at the ICE raids and detentions.

In this volatile geopolitical context, Blatter’s announcement sounds like a lot more than rhetorical noise. It seems quite rational and reflects international concern that the US’s civil rights crisis, a colossal breakdown in trust between citizens and federal authority, could disfigure what is meant to be football’s greatest unifying spectacle.

FIFA’s stance on politics, then and now

For decades, FIFA’s stance was that football and politics don’t mix; players couldn’t display political messages, and host countries were told that their internal affairs did not concern world football. In the same way, the International Olympic Committee maintained for decades that sports and politics should remain strictly separate. Politicians were tolerated on pitch sidelines, but political protest was not, and players were strenuously discouraged from publicizing their beliefs. That principle stretched credulity at times, especially in 2018 and 2022, when the World Cup was staged in Russia and Qatar — countries that had both been condemned for their poor human-rights records. Yet FIFA’s public neutrality remained the default position.

With the infamous murder of George Floyd in the US in May 2020 and the ensuing global movement for racial justice, FIFA and its affiliates were obliged to shift. The organization embraced inclusiveness and equality as new principles, endorsing anti-racism campaigns, advancing women’s football and signaling support for LGBTQ+ rights. These themes have become central to FIFA’s brand: Women’s tournaments are positioned explicitly as platforms for empowerment and social inclusion, and equality initiatives are now integral to FIFA’s corporate identity.

While it would have been inconceivable as recently as ten years ago, FIFA, in 2021, approved of players taking the knee before games. The move divided opinion, but players, especially in England’s Premier League, appeared to welcome the gesture of defiance against racism and perhaps other forms of bigotry. The gesture no longer takes place, at least not on a regular basis.

This evolution was initially tactical: It was a sensible and timely response to the changing zeitgeist of global social movements. Other sports similarly abandoned their bans on politics and social affairs. But in football, the change has morphed into something more doctrinal. FIFA’s embrace of diversity, equality and inclusivity is no longer just an add-on feature of the game; it’s part of its credo. That’s why critics argue FIFA can no longer convincingly separate sport from politics when a host nation’s domestic policies contravene the very values FIFA now claims to uphold. Consistency disappears.

And now the contradiction is visible. The question that’s being asked is: How can FIFA champion equality, anti-racism and inclusiveness on one hand, while blithely staging its flagship event in a country where civil unrest stems from bitterly contested enforcement of immigration policy, and where critics claim human rights are being ignored?

FIFA President Gianni Infantino has close personal ties with Trump, a fact that has not helped the organization’s credibility on this matter. Under Infantino’s leadership, FIFA offered Trump a widely-ridiculed “peace prize” at the World Cup draw. It was a supinely sycophantic act that introduced geopolitics into an otherwise sporting occasion.

Analysts inside and outside football circles worry that FIFA’s seeming acquiescence could reduce the sport to a mere appendage of realpolitik, meaning that football’s moral capital can and will be exploited by host governments for their own agendas in this and future World Cups.

Protest at the World Cup?

The real question isn’t whether fans can watch 48 teams compete in 104 matches: They will. The tournament infrastructure is already being built, tickets are selling and stadiums are being prepared. But what moral character will the World Cup have?

Trump wants the World Cup in the US. It has more symbolic importance than even the Olympics. It attracts far more attention and lasts longer than a G7 summit. Heads of state will congregate, as will all manner of high-profile politicians, dignitaries and other kinds of celebrities. The world’s media will attend. Live crowds from everywhere will number in the millions. The hosts will entertain the visitors with pomp and spectacle. But, for all his braggadocio, Trump must surely realize there are risks.

A global sporting event set against a backdrop of civil strife could become a stage for protest rather than celebration. Already, American sports arenas have seen anti-immigration and anti-ICE demonstrations spread into professional games. If similar activism manifests during the World Cup (for example, protests at venues, international delegations refusing to sing anthems, public backlash over entry bans or travel restrictions, etc.), politics could overshadow the spectacle.

Imagine Senegalese fans barred from entering certain US cities. Or headline-making protests in Los Angeles and New York City on match nights. Or even teams and fans using the platform to draw attention to human rights concerns. None of these scenarios is likely, but none is wildly far-fetched, either, given the current social context and the US’s political direction of travel. FIFA’s administrative muscle in demanding guarantees from host governments has limits. US immigration policy is set by a president emboldened by his base and seemingly deaf to international opinion.

Critics of FIFA’s current leadership also suggest that a stronger president, i.e. one not beholden to Trump, might have already relocated World Cup matches entirely. It would be logistically challenging and financially ruinous, but not impossible. Seasons were shifted for Qatar 2022; matches have been relocated for other tournaments. Yet FIFA’s economic interests and commercial contracts make such a major move barely conceivable, especially considering the US media’s influence on FIFA’s World Cup finances.

So, the Cup will go ahead. But should it?

Pros vs. cons: should or shouldn’t we cancel?

Why we should not hold the World Cup in the US:

  1. Moral inconsistency. FIFA’s exalted values of inclusivity and equality ring hollow if the host nation’s core policies violate those principles.
  2. Civil unrest risk. The US is currently facing large-scale protests and crackdowns that could spill into tournament dates and locations, perhaps necessitating heavy security presences at all games.
  3. Safety concerns. Recent federal shootings and lifted protest protections raise international safety questions.
  4. Political exploitation. The World Cup might inadvertently become a tool for partisan agendas, diminishing football’s unifying power.

Why the World Cup should go ahead in the US:

  1. Unmatched infrastructure. Neither co-host, Canada nor Mexico, could alone shoulder the logistical and financial burden on such short notice.
  2. Football’s universality. Football transcends politics; millions will watch and enjoy without engaging with the host nation’s controversies.
  3. Economic and cultural opportunity. The tournament could still grow the game in the US and foster youth engagement, long-term.
  4. Risk of precedent. Canceling sets a precedent where sports events hinge on transient political climates, forcing every future host to meet subjective moral thresholds.

FIFA’s dilemma

The 2026 FIFA World Cup presents a dilemma: a global festival of sport scheduled in a nation currently racked by deep political fissures that show no signs of repair and could worsen over the next few months. Blatter’s call for a boycott echoes wider anxieties about the US’s social cohesion, and the ethical contradictions between FIFA’s stated values and its hosting choices are more conspicuous than ever.

Association football has long been more than a sport. It’s a global cultural force that can and often does mirror global tensions. The World Cup this year will test whether the sport can stay above geopolitics, or whether politics and justice will impose themselves on football’s grandest stage. The strong likelihood is that the tournament will go ahead. But for 39 days in June and July, the world of football and perhaps the rest of the world will hold its breath.


[Ellis Cashmore is the author of , now in its third edition.]

[ edited this piece.]

The views expressed in this article are the author’s own and do not necessarily reflect 51Թ’s editorial policy.

Sexual Exploitation — Why Leaving Is Not So Simple

Thu, 22 Jan 2026

“I had no voice, no choice.”

These are the words of a woman who was, for more than 30 years, coerced into having sex with multiple men, while the man who controlled her photographed and filmed the encounters. She was threatened if she refused, though precisely how was never fully spelled out in court. The abuse followed a grim routine: cars, hotels, secluded woodland. Several times a week. Year after year. Decade after decade.

This is not a Victorian melodrama or a cautionary tale from a distant culture. It’s a contemporary case, recently tried, exhaustively evidenced and adjudicated. The man responsible has now been imprisoned for life. The woman, finally free, says she no longer knows who she is.

It’s difficult to read this without a sense of disbelief. Not because such abuse exists; that is depressingly and appallingly beyond dispute. But because of its duration and apparent invisibility. How does one person compel another adult to engage in acts they find abhorrent, repeatedly, for over 30 years, without chains, drugs or physical confinement? How is this possible?

A perverse Stockholm Syndrome

My initial temptation was to reach for explanations that preserve assumptions about human autonomy. Perhaps the woman was dependent on drugs, and her tormentor controlled her supply. Or maybe she suffered from untreated mental illness or severe cognitive impairment. In both scenarios, she was, in some sense, incapable of understanding what was happening to her and thus not inclined to do anything to change it.

These explanations are not frivolous. They reflect an intuitive need to anchor such cases in obvious forms of vulnerability. But, in this instance, they don’t work: No evidence of drug dependency was introduced at trial; no diagnosis of learning disability was advanced. The court proceeded on the basis that this was a woman who, in formal terms, was a sentient adult capable of consent and yet whose consent was somehow rendered meaningless or, at best, ineffectual.

The real force of the case is its apparent ordinariness: Nothing about it depends on extraordinary pathology. The abuse didn’t happen in a basement or a makeshift dungeon. It took place in spaces that were mundane, transient and socially transparent: cars, hotel rooms, countryside lay-bys. The perpetrator didn’t need constant violence or even the threat of violence. He needed time, routine patterns and control over consequences.

This is why this case is so unsettling. It doesn’t even let us reassure ourselves that freedom, once established, is lasting or self-perpetuating. It forces us to confront the possibility that freedom can be taken away, gradually, invisibly and without spectacle. And right under our noses — so we don’t notice it vanishing. 

We hear much of Stockholm Syndrome, in which people who are held captive, over time, become comfortable with their captivity and even identify positively with those who hold them. It’s a perverse development, perhaps, but in the process, the captives surrender what once passed as their power to speak, act or even think as they want; they give up their volition.

Agency

This brings us to the concept of agency, a term that does a great deal of heavy lifting in contemporary discussions of women’s lives. We are frequently reminded that women have agency. They choose. They decide. They act. The insistence on agency has been politically necessary, a corrective to fallacies that portrayed women as passive, dependent or merely responsive to men.

But there’s danger here. When agency is treated as a universal possession rather than a socially conferred capacity, it loses its analytical edge. Worse, it becomes accusatory: If women have agency, then failure to act can begin to look like failure of will, judgment or even courage.

Agency, properly understood, is not an inner resource that individuals carry with them regardless of circumstance. It is a condition created and sometimes withdrawn by cultural, institutional and relational environments. Those environments distribute possibilities unevenly. They make some actions thinkable and others unthinkable; some exits imaginable and others pipe dreams.

In the case I’ve outlined, the woman did not simply “fail” to leave. She occupied an insular social world in which resistance carried consequences she believed she could not survive, while compliance became the least damaging option available. Over time, that world was normalized. Her abuse was “normal.” The idea of escape was unthinkable.

If the concept of agency is to remain socially and politically useful, it must be capable of accounting for this. Otherwise, it risks becoming a slogan rather than an analytic tool.

Abuse disguised as intimacy

It would be comforting to treat this case as a grotesque anomaly. But it is not without precedent. Its most fabled fictional rendering is George du Maurier’s 1894 novel Trilby, in which the controlling and mesmeric manipulator Svengali wields a sinister power over a young Parisian orphan girl. There are more recent, real-life cases.

Dominique Pelicot drugged his wife repeatedly and arranged for strangers to have sex with her, recording and photographing the assaults. Around 70 men were eventually implicated. The case shocked France not just for the abuse, but for how long it was orchestrated without detection (2011–20).

In another case, a man reportedly drugged and filmed his wife over 15 years, using secrecy and routine to sustain his long-term control. South Korea’s notorious Nth Room case involved victims who were coerced into recording sexual acts, often under threats or blackmail. One of the most harrowing cases took place in a Mennonite colony in Bolivia: In 2009, a group of men were rounded up and convicted of the rape and sexual assault of 151 women and girls, including small children.

Some of the cases involved drugs and physical assaults; others involved women being “shared” in ostensibly intimate relationships, their compliance sustained through intimidation, humiliation and the threat of exposure rather than brute force. What they shared was not violence alone or even its threat, but length of time. The violations were repeated over and over until they became routinized, eventually regular features of the social landscape. The victims probably appeared to outsiders as complicit in their own exploitation, and this is precisely why intervention failed to materialize.

I’m making a deeply uncomfortable observation, and it must be handled carefully. To say that a victim becomes implicated in their captivity is not to say they desire it, still less endorse it. It is to recognize that survival in unusually constrained circumstances often requires forms of cooperation that, from the perspective of outsiders, resemble consent.

Indeed, a common rhetorical question directed at victims of domestic violence is the blunt and accusatory: “Why didn’t she leave?” Rape victims are often subjected to a similar, implied blame. Their actions are anatomized after the fact: if she froze, if she didn’t scream or if she refused to fight back, she’s assumed to have somehow induced the assault. Women already face deep-rooted scepticism when reporting sexual violence. Often, that scepticism is loaded with assumptions about consent and by narrow expectations of how a “real” victim ought to behave.

This dynamic is not confined to heterosexual relationships. Comparable patterns can be observed in same-sex relationships, in cults, in abusive workplaces and in situations where women exercise power over men. Even consensual BDSM relationships, when viewed without context, can appear indistinguishable from exploitation to outsiders. The difference lies not in surface behavior, but in the presence or absence of exits.

Some readers may have seen Harry Lighton’s recent film Pillion (2025), about a queer relationship in which one man becomes “happy” (his word) to operate not just submissively but servilely, while outsiders, like his mother, recoil at the apparent abuse he’s prepared to take. Abuse does not always announce itself as “abuse.” Sometimes it looks like accommodation or habit. Or, even more unfathomably, intimacy.

Captivity and freedom

The final, and perhaps most troubling, implication of the main case is that many similar situations may — no, I should be clearer, will — never come to light at all. The idea that victims eventually realize what’s happening to them and leave is a consoling piece of fiction. After decades of routine coercion, there is usually no epiphany waiting to happen. Just a continuation.

The woman at the center of this case didn’t wake up one morning after 30 years and have a lightbulb moment in which it dawned on her that what was happening to her was just plain wrong. The conditions that had shaped her life for so long didn’t disappear. What actually changed was not her clarity of vision, but the collapse of the structure that had contained her.

Sociologists use the term “resocialization” to describe the process by which moralities, norms and identities acquired in childhood and adulthood are displaced, often following a momentous or disorienting experience.

Such replacements are inherently fragile. The conception of reality they sustain can be destabilized by even fleeting encounters with alternative ways of understanding the world. It’s likely that those subjected to long-term exploitation have their casual social contacts quietly restricted. A conversation in a shop, a bar or a workplace may be enough to unsettle a relationship whose assumptions are otherwise rarely questioned. In this case, the woman’s bond with her tormentor was eventually broken, perhaps through just such unguarded encounters that allowed her, for the first time in decades, to see her situation anew.

That’s why these cases should stop us in our tracks. Not because they are shocking or hideous, but because they expose the fragility of assumptions we prefer not to question. Agency is real, but remember: it is also uneven. Freedom exists, but it always has limits. And some forms of captivity are so thoroughly normalized that they persist for a lifetime without ever being known.

The woman now says she is free. Perhaps. It’s a beginning. But the more pressing task is collective: to develop ways of thinking about power, coercion, consent and, most critically, agency that are capable of recognizing such situations before they harden into decades.

[Ellis Cashmore is the author of ]

[ edited this piece.]

The views expressed in this article are the author’s own and do not necessarily reflect 51Թ’s editorial policy.

YouTubing Reality

Tue, 06 Jan 2026

Say “wardrobe malfunction,” and anyone old enough will immediately picture Janet Jackson at the 2004 Super Bowl halftime show. As Justin Timberlake sang, “I bet I’ll have you naked by the end of this song,” he tugged at Jackson’s costume and, for a fraction of a second, exposed her right breast to 114 million viewers. If you weren’t watching live, you missed it, and in those days, missing it meant it was gone. Except, in this case, it wasn’t.

About a year later, three young PayPal employees, Chad Hurley, Steve Chen and Jawed Karim, were still talking about it, lamenting that there wasn’t an easy way to replay the moment. It was 2005. Social media barely existed. They decided they could build a platform where people could upload, store and share video clips easily. Within months, the world was talking about YouTube.

Janet’s “malfunction” may have brought personal embarrassment and corporate panic; it also hurt her career, but it helped catalyze a revolution, though one I confess I never saw coming. Discussing the new, oddly named YouTube on Britain’s Sky News, I pointed out that it was becoming a repository for quirky videos, like a kitten grappling with a ball of wool, and would continue to grow, but how many quirky vids did we need? YouTube was only just beginning, though. From a collection of amateur uploads, it became the world’s dominant media treasury.

Today, YouTube has eclipsed Netflix as the service that audiences spend the most time watching, accounting for 13.1% of all TV viewing, compared to Netflix in second place with an 8.7% share, according to Nielsen. That means more people watch YouTube than Disney, Amazon Prime or any legacy broadcaster. It is the biggest podcast platform. It shapes music consumption. It is encroaching on live sport. And by 2029, it won’t merely be streaming Hollywood: it will be part of it.

Zeitgeist in a bottle

Every so often, a company comes along that seems to have captured the zeitgeist in a bottle, like Nike. It found a niche in the sports apparel market in the 1970s, then recreated that market so that sneakers, tracksuits and baseball caps were no longer just for sports. I remember teaching a class at a US university where I worked in the 1990s and challenging anyone not wearing or carrying one piece of Nike apparel. Everyone was.

Likewise, Apple’s success was not just about elegant hardware but experience architecture: intuitive interfaces, seamless ecosystems and signature aesthetics that made its products desirable. Like Nike, Apple didn’t just capture market share: it set expectations for what technology should be and, in the process, built a new Apple market. Every new product it launches is an event in itself. In both cases, these companies expanded far beyond commerce. They formed new habits. Traditional market preferences ceased to be individual choices and became more like cultural defaults.

YouTube did not design and make products, nor did it create anything in particular. Its power lay in how it framed and presented what people watch and now expect from the media. Uploading a video of a toddler biting his older brother’s finger, or an animated, pixelated cat with a Pop-Tart body leaving a rainbow trail, set to a repetitive, catchy tune, might not register many views without the imprimatur of YouTube. With it, both became internet sensations.

Gangnam Style 

YouTube’s early aspirations were modest: it simply provided an online warehouse for people to share their videos. But 18 months after its inception, Google saw its wider potential and acquired YouTube in what then seemed a prodigiously high $1.65 billion (£884 million) all-share deal. It effectively made YouTube one of the fastest-rising internet ventures ever. That was 2006; a corporate reorganization in 2015 brought YouTube under the ownership of Alphabet, a new parent company.

In 2007, Google established the fundamentals of an unusual but effective internet business model. Instead of merely hosting and sharing videos created by amateur enthusiasts, it allowed producers to monetize their videos with advertising revenue. If a video failed to attract significant views, the producer earned nothing; if it made an impact, YouTube shared the advertising income and took a commission. YouTube called it the Partner Program.

By 2010, three things had happened: YouTube had matured into a stable, reliable, global streaming platform; smartphones had arrived, making video portable; and video had become frictionless to share. As a result, billion-view counts became possible. YouTube’s first one was “Gangnam Style” by Psy in 2012. The global rise of K-pop followed later in the decade. Such a phenomenon would have been impossible earlier: in the 1990s and early 2000s, broadband penetration was still limited, mobile video didn’t exist and neither did social media amplification.

Around this time, YouTube invested in original channels with a $100 million initiative to bring professionally produced content to the platform. This was an early step toward widening content beyond user uploads and an indication that YouTube had designs on the established television market: It rolled out 60 new channels, none of them owned by YouTube, but all producing original content. YouTube claimed 20 of the new channels generated more than a million views. But YouTube remained a distribution outlet rather than a production company.

This meant comparing YouTube with the likes of the National Broadcasting Company (NBC), Disney or Amazon was like comparing apples and oranges. NBC and practically every other major media company in history commission, finance, curate and sometimes produce original programs themselves. These can be expensive.

AMC’s Breaking Bad (2008–2013) was considered mid-to-high end at about $3-5 million per episode, but values have risen markedly in recent years. When HBO’s Game of Thrones began in 2011, each episode cost $3-6 million. By the time of its conclusion in 2019, this had risen to a reported $15 million. Netflix’s Stranger Things (2016–present) is comparable. YouTube stayed resolutely out of production and focused only on hosting.

Killing time and spending time

This encouraged critics not exactly to dismiss YouTube, but to contemplate its limitations. Ted Sarandos, Netflix co-chief executive, issued a put-down when he drew the distinction: “There’s a difference between killing time and spending time.” Sarandos pointed out, “We’re in the how-you-spend-time business.” He considered much of YouTube’s creator-made video to be “snackable” content, compared with the professionally made shows and films available on his service.

Thing is: what if some of those “snacks” are like caviar-topped blinis or wagyu-and-white-truffle sliders — tiny, yes, but impossibly rich, intensely pleasurable and probably more desirable than the heavy multicourse banquets Netflix serves up? In other words, brevity doesn’t mean inferiority. Sometimes the smallest bites stay with us the longest.

The trouble for Netflix and indeed for all other major media corporations is that media appetites have evolved. A growing share of viewers actually want lots of small, intensely flavored portions instead of a sit-down multi-course feast. They don’t lack attention; they just don’t want to surrender their concentration for hours at a time. YouTube doesn’t fight that reality; if anything, it feeds on it.

Consider: audiences have been entertained since the rise of cinema in the 1920s by narratives that demand attentiveness for up to two hours, often longer. Film itself was modeled on plays, which in the 18th and 19th centuries were typically 2-3 hours long. Ancient Greek plays were often shorter, unless they were performed at festivals, in which case they could last for days. So, television has absorbed a cultural form that’s at least 2,500 years old. Few other aspects of culture are so enduring.

It surprises practically anyone over the age of 20, but long-form storytelling with a multiepisode arc, character development and a conclusion may no longer have appeal. No, let me be blunter: it may now be boring. The short-form, hook-driven snippets provided by YouTube may be the preferred format. YouTube videos can be full of highlights from sports, concerts and potentially anything; they can be prescriptive, demonstrating how to fix things; they can be reactions to practically any event, good or bad. And, perhaps surprisingly, YouTube mostly escapes the censorial criticism usually directed at Meta, X and TikTok. It has, however, attracted concern over algorithms, misinformation and child safety, plus conspiracy rabbit holes, extremist content and ostensibly children’s cartoons that turn out to be nothing of the sort.

The media, or “mass media” as it was known in the early 20th century, dictated rather than cultivated social tastes, dispositions and appetites. Television introduced substantial changes, particularly in discovering and satisfying our passion for quiz and talk shows and, at intervals, live sports, music videos and reality TV. Streamers have broken the linear stronghold, allowing viewers to choose when they watch, and devices let them choose where. Interactive formats will soon invite viewers to participate in creating their own characters and plots and deploy AI to turn them into drama. The signs are that Gen Z has grown weary of traditions and wants an altogether different experience. Sarandos and other media execs are betting that this is temporary and that maturity will restore more familiar preferences. Cultural taste is rarely so obliging.

[Ellis Cashmore’s is published by Routledge]

[ edited this piece.]

The views expressed in this article are the author’s own and do not necessarily reflect 51Թ’s editorial policy.

The post YouTubing Reality appeared first on 51Թ.

]]>
/business/technology/youtubing-reality/feed/ 0
Brigitte Bardot: Beauty, Bigotry and the Complexity of Legacy /culture/brigitte-bardot-beauty-bigotry-and-the-complexity-of-legacy/ /culture/brigitte-bardot-beauty-bigotry-and-the-complexity-of-legacy/#respond Thu, 01 Jan 2026 14:15:04 +0000 /?p=159975 Can we take pleasure in the art of someone we know has committed deeds we now regard as despicable? And even if that artist once enchanted us, can we ignore the bigotry that may have been festering for decades? For over 20 years, Brigitte Bardot was unquestionably the most celebrated object of heterosexual male desire,… Continue reading Brigitte Bardot: Beauty, Bigotry and the Complexity of Legacy

The post Brigitte Bardot: Beauty, Bigotry and the Complexity of Legacy appeared first on 51Թ.

]]>
Can we take pleasure in the art of someone we know has committed deeds we now regard as despicable? And even if that artist once enchanted us, can we ignore the bigotry that may have been festering for decades? For over 20 years, Brigitte Bardot was unquestionably the most celebrated object of heterosexual male desire, not just in cinema but anywhere. Yet for much longer, Bardot, who died at 91, embodied both sweetness and cruelty in roughly equal proportions.

At the height of her global adoration, Bardot was revered in France. President Charles de Gaulle famously remarked that her contribution to French exports rivaled that of French automobile manufacturer Renault. From 1969 to 1972, she served as the model for Marianne, the symbol of French liberty and the Republic. And in 1985, she was awarded France’s highest honor, the Légion d’Honneur. As her looks faded and film roles dried up, however, she morphed into a very different kind of figure — one some continued to adore, but others found deeply repugnant.

And God Created Woman

A Parisian by birth, Bardot grew up in a Roman Catholic family. Her father was an industrialist; her mother, a dance enthusiast, enrolled Brigitte in classes. She performed well enough to gain admission to the National Superior Conservatory for Music and Dance in 1947. As a teenager, she began modeling for fashion magazines. She married film director Roger Vadim in 1952 and made her screen debut in Jean Boyer’s Le Trou Normand that same year. Among her early appearances was a small role in the British comedy Doctor at Sea (1955).

The following year came And God Created Woman (Et Dieu… créa la femme), Bardot’s breakthrough. A decade before Raquel Welch in One Million Years B.C., Bardot played an uninhibited 18-year-old, widely labelled a “nymphet” at the time. She soon acquired the epithet “sex kitten.” This was the mid-1950s, when the term “objectification” was used almost entirely in literary criticism rather than feminist discourse. (As in “the author objectifies the character’s emotions perfectly.”)

Nothing succeeds like scandal, and the 1950s were no exception. Religious and censorship groups, especially in the United States, were outraged by Bardot’s on-screen sexuality — outrage amplified when it emerged she was having an affair with her married costar Jean-Louis Trintignant. Parallels with her contemporary Elizabeth Taylor, whose affair with Richard Burton made her tabloid prey, are obvious. Both women were vilified.

Bardot herself acknowledged that her fame rested more on image than craft: “I started out as a lousy actress and have remained one,” she once said. Perhaps this is why she turned briefly to singing. In 1967, she recorded Je t’aime… moi non plus with her then lover Serge Gainsbourg — a provocative song that seemed to eavesdrop on lovers in flagrante. Bardot asked for the track to be withdrawn; the version released — and made notorious — was recorded later by Gainsbourg and Jane Birkin in 1969.

Bigotry

In the 1970s, Bardot turned to animal welfare. A committed vegetarian, she fulminated against what she regarded as needless cruelty in practices ranging from seal-culling to the horsemeat trade. Halal slaughter, which does not require stunning, became one of her particular targets.

She campaigned vigorously, founding the Brigitte Bardot Foundation for the Welfare and Protection of Animals in 1986 and raising substantial funds, including half a million dollars from auctioning her jewelry. She wrote protest letters to world leaders, including China’s Jiang Zemin and Denmark’s Queen Margrethe II. Today, celebrities regularly wander into ethical and political terrain; at the time, Bardot’s activism felt more startling. And she didn’t stop there. In 1992, she married Bernard d’Ormale, a former adviser to Jean-Marie Le Pen of the far-right Front National, and began publishing her views.

Bardot repeatedly attacked France’s immigration policy, with particular hostility toward Muslims. She was convicted five times of inciting racial and religious hatred. Like Taylor, Bardot was an independent, self-willed woman who cared little whether she was loved or hated. Unlike Taylor, Bardot veered into far-right politics and unapologetic Islamophobia. As the only French celebrity openly to defend the far right, she must have realized how much this damaged her reputation. Perhaps this motivated her decision to live her final years reclusively in St Tropez, where she died.

Arts and artists

So how do we parse La Bardot? In the 1960s, she had few, if any, rivals as a symbol of female sexual liberation, and at the time, this felt revolutionary. Simone de Beauvoir called her “the locomotive of women’s history.” Today, many would say she also played too willingly into male fantasy. Her animal-welfare campaigning achieved real legislative impact; her bigotry caused real harm. Both statements are true.

Can anyone who lived through the 1960s remember Bardot simply as the cinematic siren without wondering whether the bigotry was always there, merely lacking expression? And even if it was, would it have provoked the same response then as now? Simply “backtracking,” as some have done, is an easy option. The harder question is whether we can meaningfully separate art (in the broadest sense) from the artist. Bardot emerged at a time before women were widely encouraged to free themselves from domestic subservience. Betty Friedan’s The Feminine Mystique appeared only in 1963, yet the values she later embraced leave her firmly on the wrong side of history.

Perhaps the more unsettling truth is that Bardot forces us to confront our own complicity. We didn’t merely consume her image; we helped manufacture it. The erotic fantasy she embodied was fed by studios, journalists, politicians and audiences who thrilled in her transgression while quietly accepting the cultural status quo that made such fantasies necessary. Bardot didn’t invent misogyny any more than she did spectacle, but she allowed herself to be a convenient vessel. In reassessing her, we are not only re-evaluating an individual’s life; we are examining the culture that first exalted her, then recoiled in horror when she revealed that the conservatism and cruelty embedded in that culture were hers as well. We arrive at similar conclusions when listening to Michael Jackson, R. Kelly or Sean “Diddy” Combs.

[Ellis Cashmore is the author of the book .]

[ edited this piece.]

The views expressed in this article are the author’s own and do not necessarily reflect 51Թ’s editorial policy.

The post Brigitte Bardot: Beauty, Bigotry and the Complexity of Legacy appeared first on 51Թ.

]]>
/culture/brigitte-bardot-beauty-bigotry-and-the-complexity-of-legacy/feed/ 0
Why Children Kill: The Incomprehensible Death of Rob Reiner /culture/why-children-kill-the-incomprehensible-death-of-rob-reiner/ /culture/why-children-kill-the-incomprehensible-death-of-rob-reiner/#respond Tue, 23 Dec 2025 13:45:00 +0000 /?p=159814 Rob Reiner and his wife, Michele Singer Reiner, were killed in their upscale LA Brentwood home. Were they robbed? Did they owe money? Had they upset the wrong people? No, their son, 32-year-old Nick Reiner, has been charged with two counts of first-degree murder. Their son? Surely not. Sons love their parents: they’re the people… Continue reading Why Children Kill: The Incomprehensible Death of Rob Reiner

The post Why Children Kill: The Incomprehensible Death of Rob Reiner appeared first on 51Թ.

]]>
Rob Reiner and his wife, Michele Singer Reiner, were killed in their upscale LA Brentwood home. Were they robbed? Did they owe money? Had they upset the wrong people?

No, their son, 32-year-old Nick Reiner, has been charged with two counts of first-degree murder.

Their son? Surely not. Sons love their parents: they’re the people who raised them, took care of them, encouraged them. Rob presumably helped Nick find a role in the film industry, didn’t he?

Apparently not. In fact, Nick was heavily into substance abuse even in high school. He was homeless for part of his teenage years.

All the same, Rob was worth about $200 million.

Well, it looks like Nick didn’t see any of it. There was certainly no love lost between them.

The more this case seems to reveal, the more it mystifies.

How could it happen?

Children killing their own parents is more common than we care to admit. The act even has a name: parricide, from the Latin parricidium. In the United States, parricide is typically estimated to account for around 2% of all homicides. That may sound insignificant, but it translates into several hundred cases each year. In Australia, the proportion appears higher still. A recent Australian Institute of Criminology analysis drawing on 35 years of National Homicide Monitoring Program data suggests that a notably higher share of homicides involve the killing of one or both parents, with younger offenders disproportionately represented.

Last year in England, Virginia McCulloch, now 36, poisoned her father by crushing prescription medication into his drink, then beat her mother with a hammer and stabbed her repeatedly with a kitchen knife bought for the purpose. She kept the bodies inside the family home for four years before police finally discovered them.

There have been many others. In Auckland, New Zealand, in 1998, teenage brothers Matthew and Tyler Williams killed both their parents. In Port St. Lucie, Florida, Tyler Hadley beat his parents to death after they refused to let him host a party at the family home. In Albuquerque in 2013, 15-year-old Nehemiah Griego murdered his parents and three younger siblings. In Broken Arrow, Oklahoma, in 2015, brothers Robert and Michael Bever killed their parents and three siblings in a mass stabbing.

Some cases take a less direct route. Jennifer Pan did not kill her parents with her own hands but hired hitmen to do it for her in Ontario, Canada. She was sentenced to life imprisonment for first-degree murder and attempted murder, a case built not around rage but around deception and long-simmering family conflict.

The most infamous example of all remains the Menendez case. In 1989, Lyle and Erik Menendez shot their parents dead in their Beverly Hills mansion. More than three decades later, the case still divides opinion: abuse or entitlement, trauma or greed? Both men remain in prison, their claims rehearsed endlessly, their crime still resisting consensus explanation. Their request for a retrial was denied.

Suffocating families

Parricide is arguably the most puzzling form of unlawful killing. Greed, revenge, jealousy, honor, crimes of passion — these are familiar categories we invoke to make murder intelligible. The killing of one’s parents fits uneasily into any of them. Unless, of course, we are content with stand-bys such as evil or psychosis — explanations that appear to clarify everything while explaining almost nothing at all.

One of the sternest critiques of the modern nuclear family was that of British psychiatrist R.D. Laing, who challenged the comforting assumption that this arrangement is necessarily wholesome or beneficial to children. Laing’s work on the relationship between family life and mental distress offers one way of making sense of contemporary parricide.

For Laing, our image of the family has been pasteurized. His own account is far more poisonous. The family, he argued, is often a multidysfunctional amalgam from which children sometimes emerge bruised, if not permanently damaged, emotionally and cognitively, and sometimes physically. Families impose roles, identities and expectations on individuals in ways that can generate anxiety, distress, schizophrenia and what we now euphemize as “mental health issues.”

Children, on Laing’s account, can experience family life as a suffocating form of captivity, a space from which there appears to be no legitimate escape. From this perspective, parricide can be read as a violent act of liberation or self-assertion. In many of the psychodramas we have seen in recent decades, Laing’s approach has undeniable relevance.

And yet a question nags: If Laing is even half right, why is parricide not vastly more common?

This question forces a reversal of the way criminology typically proceeds. Instead of asking why some children kill their parents, we might ask the more neglected question: why do so few do so? Why isn’t parricide widespread?

Why do we follow rules?

This is not a radical new way of thinking. Philosopher Thomas Hobbes (1588–1679) famously began from the assumption that human beings are driven principally by self-interest and fear, concluding that the natural condition of humanity is conflictual: life, left to itself, is “nasty, brutish and short.” Society, in this view, is not natural but artificial: a scaffolding of rules, norms and institutions assembled to make coexistence possible among fundamentally self-seeking individuals that even Darwin would have thought were in survival mode. Over time, we internalize these rules, learn to behave conventionally and, in short, adapt to our environment. Mostly.

The disquieting implication of Hobbes’s argument is that we are not merely capable of harmful behavior but inclined toward it. Why, then, do most of us refrain? Writing in the mid-twentieth century, American criminologist Travis Hirschi had an answer. Hirschi turned criminology inside out by asking not why people offend, but why they do not. His answer was deceptively simple and brilliantly contrarian: conformity is learned and maintained through social bonds that tie individuals to conventional life, the “real world.”

Hirschi identified four bonds: attachment, commitment, belief and involvement. The most important of these is attachment, especially to parents, but also to peers and other significant figures who signpost our way through life. Some of these we know personally; others we know only through the media (Hirschi didn’t recognize what we now call parasocial relationships, but these surely complement his analysis).

As people mature, they invest in conventional pursuits: education, careers, relationships, reputations. These investments create stakes in conformity. When attachments are strong and investments substantial, rule-breaking becomes costly. When they are weak, slack or broken, the probability of transgressive behavior increases sharply. Approached this way, parricide is not the starting point of analysis but the endpoint of failure: a situation in which attachment has slackened, investment has failed to materialize and the restraining power of social bonds has become ineffective.

Details of the Reiner killings are sparse. It’s believed Rob and Michele died from stab wounds. Nick has been charged with two counts of first-degree murder in the killing of his parents, along with a special allegation that he used a dangerous weapon, a knife. The additional allegation could mean a longer sentence. The accused has not yet been tried. But what has emerged about Nick Reiner’s life is relevant.

By his own account, he struggled with substance abuse from his teenage years and was homeless for a period while still in high school. This is extraordinary given his background. Nick Reiner was not raised on the margins of society but at its cultural center: the son of one of the world’s most successful and influential film directors, a man whose personal wealth was estimated at around $200 million. Homelessness and chronic instability are not what we conventionally associate with such circumstances.

It is difficult to avoid the conclusion that this points to a serious breakdown in the father–son relationship. Material resources were clearly available, but material resources alone do not constitute attachment, guidance or emotional investment. In Hirschi’s terms, the crucial bonds that secure individuals to conventional life appear to have been weak, fractured or absent altogether.

Intimidated rather than inspired

Hollywood offers numerous counterexamples. Kirk Douglas famously mentored and supported his son, Michael Douglas. Janet Leigh and Tony Curtis passed on both professional insight and a sense of vocation to their daughter, Jamie Lee Curtis. Tom Hanks’s son, Colin Hanks, has spoken about the influence of his father’s presence and expectations. Liza Minnelli distinguished herself as Judy Garland’s daughter. In each case, success was accompanied by transmission: of norms, aspirations, confidence and commitment.

In the Reiner case, that transmission appears to have stalled. The mechanisms Hirschi identified — attachment, investment, belief and the gradual accumulation of stakes in conformity — seem not to have taken hold. Instead of developing commitments that might restrain destructive impulses, Nick Reiner appears to have drifted, becoming increasingly detached from both parents and from the conventional social world more broadly.

Seen through this lens, parricide is not explained by a single motive such as greed, rage, psychosis or even evil, but by the collapse of social controls that ordinarily make such acts unthinkable. When attachment weakens, when investment in a conventional future never materializes, the prohibitions that bind most people lose their force. What remains is not conformity strained to breaking point, but no inclination to conform at all.

The more difficult question persists: why did those attachments weaken in the first place? Unless we fall back on clichés about “addictive personalities” or speculative talk of unconscious death wishes, attention inevitably turns to the parent–child relationship itself. Having a pre-eminent and successful parent does not automatically confer advantage. Fame and fortune do not guarantee emotional availability, guidance or affirmation.

Some children flourish in the shadow of achievement. Others manifestly do not. Many children of famous parents never follow them into public life and remain largely anonymous; not because they lack opportunity, but because comparison itself can be disabling. They grow up measured against an impossible standard, intimidated rather than inspired, overpowered rather than supported. The inheritance is not confidence but pressure; not direction but exhaustion.

Seen this way, Nick Reiner’s trajectory (early substance abuse, homelessness, drift) is not anomalous but symptomatic. It suggests both the presence of privilege and the failure of transmission of encouragement, expectations and a sense of belonging. When that transmission fails, Hirschi’s bonds do not form. Attachment loosens, commitment never consolidates and the ordinary moral restraints that govern behavior lose their grip.

[Ellis Cashmore’s is published by Routledge]

[ edited this piece.]

The views expressed in this article are the author’s own and do not necessarily reflect 51Թ’s editorial policy.

The post Why Children Kill: The Incomprehensible Death of Rob Reiner appeared first on 51Թ.

]]>
/culture/why-children-kill-the-incomprehensible-death-of-rob-reiner/feed/ 0
Could Netflix Win the Deal but Lose the Media War? /business/could-netflix-win-the-deal-but-lose-the-media-war/ /business/could-netflix-win-the-deal-but-lose-the-media-war/#respond Wed, 17 Dec 2025 14:32:43 +0000 /?p=159674 Traveling home from London after a conference recently, I fell into conversation with four fellow passengers, all 16-year-old high school students. We talked about several subjects, including Cristiano Ronaldo, about whom I was then writing a piece for 51Թ. I noticed one young man looked at my newspaper as I might look at a… Continue reading Could Netflix Win the Deal but Lose the Media War?

The post Could Netflix Win the Deal but Lose the Media War? appeared first on 51Թ.

]]>
Traveling home from London after a conference recently, I fell into conversation with four fellow passengers, all 16-year-old high school students. We talked about several subjects, including Cristiano Ronaldo, about whom I was then writing a piece for 51Թ.

I noticed one young man looked at my newspaper as I might look at a 1990 Chevrolet: Impressed that the owner had kept it roadworthy, but curious why they hadn’t traded it in for a newer model. 

After a while, I asked, “Do you guys watch any television?” They all shook their heads, one waving his phone to show their favored hardware. “I mean content,” I added. “Oh yeah, plenty of shows. But we like to watch when we feel like it.” “What do you do for news? CNN? SkyNews?” They all shook their heads. “TikTok, Instagram …” said one.

End of legacy media

My generation and the students’ parents grew up in a world where TV was prevalent. Avid viewers adjusted their evenings in accordance with schedules. We now call it linear TV: channels screen programs at a certain time. That arrangement won’t last much longer. As Generation Z matures, the channels will find viewing figures dwindling and advertisers disappearing. So, why on earth does a streaming service that has risen to power by offering flexibility in viewing want to buy a traditional, or legacy, media company? Surely, it would be a retrograde step. Or would it? (By legacy media, I refer to newspapers, TV, radio and film that dominated before the arrival of the internet, and which conveyed their contents to consumers but provided no opportunity for interactive participation.)

Hollywood’s boardrooms and streaming executives have recently been involved in a power struggle worthy of Succession. Netflix offered $72 billion for Warner Bros. Discovery’s (WBD) studio and streaming assets. Paramount also wanted WBD, so it countered with a hostile all-cash bid of $30 per share, valuing the transaction at more than $108 billion.

Beyond the Machiavellian maneuvering and humongous sums lies another narrative. Two companies are battling for dominance of an industry that increasingly resembles the Sistine Chapel with no worshippers: magnificent, epoch-making, still a thing of wonder — but belonging to a different age. Legacy media has status, gargantuan libraries and brand equity built over a century. But younger generations have deserted it. 

They curate what they watch, rarely engaging with mainstream news, selecting only the drama they want. And for Gen Z especially, the smartphone provides the primary portal; everything else is background noise. The unavoidable truth is this: the grip that television and its mass media forerunners have held over our imaginations for over a century has been unclasped.

Shaping habits

Television and the advertising-driven business model on which it was founded once reigned supreme, holding an almost mesmeric power over audiences and shaping popular taste, opinion, attitude and behavior. Its homogenizing effect on audiences justified its description as the mass media. Its precursors, newspapers and radio, had reach and immediacy, penetrating millions of homes, but not the same spellbinding power of TV.

In the middle of the 20th century, television arrived and quickly became the preeminent medium, capturing audiences like nothing else in history. Broadcast schedules didn’t just influence; they dictated daily rhythms, from evening news to Saturday night programming, establishing television as a central institution in social life. The media was no longer a segment of life; it became comprehensive, guiding perceptions, shaping habits and commanding the attention of near-whole populations. Entertainment and advertising became intertwined, forming a commercial and cultural attachment that remains a defining feature of media power today.

Television set agendas, won political elections, dramatized wars and sometimes scandalized audiences; no other social institution has ever shaped collective human thought and action so compellingly. It scripted narratives, created memories and for decades served as the default interpreter of reality, bringing historical events like the moon landing (1969), the funeral of Princess Diana (1997) and the assassination of John F. Kennedy (1963) to our living rooms. 

Then the landscape began to move. In the early 1980s, cable television introduced new technology that sliced mass audiences into narrow segments, weakening the cultural unity that the big networks once created and commandeered. ESPN started in 1979, with CNN and MTV launching over the next two years. Telecommunications satellites pushed the shift further: Viewers could choose from channels originating anywhere, offering round-the-clock news, sports, movies and later, pay-per-view.

Often overlooked, but enormously powerful in changing sensibilities was the VCR, short for video cassette recorder, a piece of technology that allowed viewers to record programs and play them back whenever they wished. They could also build their own libraries of programs. The ability to choose when and what to watch seemed like a minor innovation at the time, though it turned out to be revolutionary.

Choice became the decisive force in media consumption. That became obvious when a California DVD-by-mail company called Netflix took the next step: In 2007, it began delivering video through a new-fangled system called the internet. Within a decade, streaming was no longer a novelty; it was a different architecture for global entertainment. The era of the mass media was gone.

The media in 2030

Today, TV no longer commands the mass audiences it did in the last century. As a result, the legacy media are weakened, and the once-mighty behemoths have become acquisition targets. Netflix and Paramount are not fighting for the future of television. If anything, they want its past: access to WBD’s vast libraries (of shows, such as Friends and The Big Bang Theory), franchises (including the DC Universe and the Harry Potter series) and a subscriber base (of nearly 130 million), assets that can be leveraged to maintain relevance and global reach as audiences continue to fragment. Even a fragmented audience is crucial, of course. So far, streamers’ primary source of income is subscriptions, while traditional media depends on advertising revenue. Both models need viewers.

The WBD deal is sure to be only one of a series that will reconfigure the media. So, how will things look in 2030? The first point we should understand is that streaming is today, not tomorrow. Just as radio and TV once seemed to be both present and future, streaming too will eventually become the past. TV was like a default setting for populations in the late 20th century, but audiences now have other distractions, like TikTok, gaming, messaging and AI-enabled apps. But there will soon be something else, if only because young audiences treat the media as interchangeable, temporary and disposable.

The Big Tech companies could make moves to buy major studios. But why would they? Apple TV+ and Amazon Prime give them a presence in the TV market without the liabilities that come with a legacy institution. Meta shows little interest. This doesn’t rule out an incursion: If studios become cheap enough, one of these companies might pounce, though not because the content is valuable. More likely, they would value the distribution rights, trademarks and back catalogs that could serve their broader ambitions.

The media has found ways to outwit or circumvent death before, of course. Newspapers are still with us after well over a century. Radio has listeners and linear TV has a Darwinian knack of adapting to new environments. The WBD deal may provide a clue as to how it will try to adapt again.

Netflix carries significant debt (as much as $75 billion if the deal goes through) but has a matchless global subscriber base of over 300 million. Yet it commands less US viewing time than YouTube. It now needs premium libraries to complement its original content and maintain growth momentum. Paramount, in its effort to raise more capital to acquire WBD, may have to ask its Middle East sovereign wealth fund backers — Saudi Arabia, Abu Dhabi and Qatar — to increase their equity contribution from the existing $24 billion, meaning that one of the world’s largest media conglomerates would be partly owned by Gulf state interests.

The new entity that emerges will likely be a hybrid that moves away from reliance solely on subscriptions or advertising. If Netflix prevails, it would probably integrate the WBD library into a premium, global subscription ecosystem to tighten its hold on the direct-to-consumer market. It would also choke to death the already terminally ill cinema chains.

If Paramount prevails, the new entity will be more familiar, though it will have to take a more diversified, multitiered approach, balancing ad-supported global broadcast networks with targeted streaming subscriptions in an attempt to combine traditional media access with modern digital strategy. The Gulf state funds have relinquished governance rights, we are told; so they should have no influence on content.

History teaches that the media is never just entertainment. It shapes thought, behavior, habits, values and even relationships. The latest developments in Hollywood are the current stage of a process that began with the burgeoning newspaper industry of the late 19th century. Now, the scale is global, the pace unrelenting and the stakes higher than ever.

For a generation whose waking hours are increasingly mediated through screens, the winners and losers in this corporate drama will define not just the future of entertainment, but the contours of contemporary life itself.

[Ellis Cashmore’s is published by Bloomsbury.]

[ edited this piece.]

The views expressed in this article are the author’s own and do not necessarily reflect 51Թ’s editorial policy.

The post Could Netflix Win the Deal but Lose the Media War? appeared first on 51Թ.

]]>
/business/could-netflix-win-the-deal-but-lose-the-media-war/feed/ 0
Cristiano Ronaldo in the New Media Ecosystem /culture/cristiano-ronaldo-in-the-new-media-ecosystem/ /culture/cristiano-ronaldo-in-the-new-media-ecosystem/#respond Sat, 13 Dec 2025 12:22:19 +0000 /?p=159619 In February 2025, during a Spanish television interview, Cristiano Ronaldo declared, “I’m the best player in football history. I haven’t seen anyone better than me.” Was this a gauche, egotistical indiscretion or the knowing remark of a consummate self-publicist with brilliant intuition for grabbing the world’s attention? Ronaldo is an athlete par excellence, and his… Continue reading Cristiano Ronaldo in the New Media Ecosystem

The post Cristiano Ronaldo in the New Media Ecosystem appeared first on 51Թ.

]]>
In February 2025, during a Spanish television interview, Cristiano Ronaldo declared, “I’m the best player in football history. I haven’t seen anyone better than me.” Was this a gauche, egotistical indiscretion or the knowing remark of a consummate self-publicist with brilliant intuition for grabbing the world’s attention?

Ronaldo is an athlete par excellence, and his prodigious skills are the reason we first knew of him. But there have been other exceptional football players, all of whom distinguished themselves as the world’s best. Pelé in the 1960s, Johan Cruyff in the 1970s, Diego Maradona in the 1980s, Lionel Messi in the 2010s: Each redefined what seemed possible in their era. Ronaldo probably does not eclipse them in skills. What makes him singular is less what he does with the ball and more the way he has adapted to a media ecosystem fundamentally different from the one that shaped his predecessors and one that’s still morphing.

A niche of his own

No one disputes the plaudits and prizes Ronaldo has won in his 23-year career. From his early years at Sporting Lisbon to Manchester United, Real Madrid, Juventus, a return to United and his current club, Al-Nassr in Saudi Arabia, his career has been defined by superlatives and record-breaking achievements. Over 1,100 professional goals, five Union of European Football Associations (UEFA) Champions League victories, five Ballons d’Or (the highest honor in association football) and consistently high-quality performances at international tournaments have validated his athletic preeminence. Yet he has combined these achievements with a string of red cards in professional football, including a recent one in November 2025 when playing for Portugal against the Republic of Ireland.

FIFA’s decision to commute his mandatory three-match suspension to a single game (an extraordinary act of leniency for an offense normally punished without exception) underscores just how singular his status has become. The rationale was never stated explicitly, but it seems obvious: Ronaldo is not just another player to be disciplined. His presence lifts broadcast audiences, commercial interest and international attention. FIFA chose to protect that asset. Purists see it as an affront to fairness. Yet the episode contains a deeper truth: Ronaldo now occupies a niche of his own, where regulations bend to accommodate him.

$4 million a week and worth every penny

“Ronaldo is the only foreign player worth what he earns because of the global exposure he brings to the league and the country,” said the former Saudi sports minister in December 2025. Ronaldo’s annual salary at the time was $211 million, or roughly $4 million a week. It was an unusually frank acknowledgment of where Ronaldo’s real value lay: not in scoring goals or winning trophies, but in drawing “global exposure.”

Exposure — by which I take the Prince to mean media attention — has become a kind of currency. Its social effects are tangible: Status, credibility, influence and, of course, money. Saudi Arabia has plenty of the latter but craves the first three. Ronaldo has all four and, through cultural osmosis, transfers them to the kingdom. His public persona is enacted and reproduced with every appearance, every goal, every celebration; every social media post, sponsorship, endorsement, charitable gesture and interview. Everything he does and says in public is covered by the media, making his presence unmissable.

In 2024, Ronaldo became the first celebrity to reach 1 billion followers across all social media platforms. He surpasses the global fan base of most football clubs and leagues. Engagement metrics, including likes, shares and comments, translate directly into cultural leverage. Each post, whether promoting a product or simply capturing a moment, is simultaneously entertainment, advertisement and signal. This gives Ronaldo unprecedented exposure for an athlete and helps justify his salary at Al-Nassr.

His Al-Nassr contract is only one of his income streams. Ronaldo has become a multifaceted enterprise independent of his on-field endeavors. He maintains a lifetime deal with Nike, endorses a wide range of consumer products and lends his name to CR7-branded hotels, gyms, underwear and lifestyle merchandise. Non-fungible tokens (NFTs) and digital collectibles extend the brand into virtual economies. Sponsorships and partnerships are not incidental to football; they are part of the Ronaldo performance, each reinforcing his presence or, more accurately, his omnipresence.

A Netflix documentary, a recent visit to the White House, endless international interviews and, of course, football create the impression that Ronaldo is near-ubiquitous. Controversies such as disciplinary incidents or personal-life events that might harm others’ reputations often turn into a kind of felix culpa for him: missteps that, instead of diminishing him, circulate as fresh currency. In 2020, he became the first active team-sport athlete to surpass $1 billion in career earnings.

Digital transition

Ronaldo’s ascent coincided with the most significant shift in sports media consumption since television emerged in the 1950s. Over the past 20 years, the grammar of viewing has changed from a linear, nonparticipative form of spectating to a multidimensional, interactive, endlessly shareable experience. (Linear media refers to scheduled broadcasts chosen by the broadcaster, not the viewer.)

The shift is no longer approaching: it is here. Generation Z and late Millennials do not necessarily watch sports when channels air events. They use phones and tablets to access a continuous stream of clips, memes, highlights, reactions and dialogue across several platforms. This is more than viewing: it is engagement, an interactive feed unfolding in real time among millions of simultaneous observers.

The prologue to this shift was 2005, when YouTube launched. At first, few foresaw its potential. Today, YouTube viewers watch countless hours of sports videos every day. By the early 2010s, Twitter had become a global exchange where fans shared opinions and reactions. Then in 2016, TikTok crystallized the micromoment: a celebration, an injury, a training ground spat; each became a global event.

Direct-to-consumer streaming, from DAZN to Amazon Prime, accelerated the move away from orthodox sports broadcasting. Disney’s recent negotiations over broadcasting rights in the UK and Ireland signal yet another break in the old model. The idea of watching sport exclusively on a big screen is passé. The iPhone has, after all, been with us since 2007.

As a result, athletes no longer appear only in scheduled broadcasts. Their presence becomes serial, fluid and continuous. The athlete exists in multiple modes at once: Physically on the pitch, visually on Instagram, narratively on YouTube, commercially on sponsorship platforms, memetically on TikTok and symbolically through fan-created content.

This is where Ronaldo becomes distinct from even the most illustrious athletes who came before him. The arc of his career coincided with the transformation of the media environment. The post-2010 ecosystem wants and rewards perpetual visibility. Did Ronaldo intuitively understand this? Or was he coached by advertisers, clubs and corporations? Was his boast (“I’m the best player in football history”) a careless slip or a calculated flourish?

360° Global exposure

Ronaldo might have been optimized for the digital age. Not just because he conjures clip-friendly moments that George Balanchine might have choreographed, but because he contrives to pull in the media even by doing nothing. Moments entirely outside football have repeatedly renewed his visibility: the 2009 Las Vegas allegations that circulated globally for years without a conviction; the Euro 2020 incident, when a momentary gesture at a press conference wiped billions from the drink company’s share price; and the 2022 television interview, a piece of celebrity theater that dominated news cycles far beyond sport. Even the museum devoted to Ronaldo in Madeira, the Instagram follower milestones and his move to Saudi Arabia, reported less as a football transfer than as a global media event, have kept him in the media’s scope.

Ronaldo generates global exposure on a scale few institutions, let alone individuals, can match. What distinguishes him further is how he has expanded his visibility beyond football’s natural boundaries. His Instagram following — the largest in the world — is not made up only of football fans. His audience is demographically different from those of Lionel Messi or Kylian Mbappé: broader, more diverse, far less dependent on his football. This is why, in 2022, he could transfer to Saudi Arabia (then a distant outpost of football) at 37 and retain, and probably expand, his global presence.

His club, Al-Nassr, gains the world’s attention not because Ronaldo scores goals, but because he appears. Remember the ex-Saudi Sports Minister’s comment on Ronaldo’s value being measured in terms of  “the global exposure he brings.” In digital culture, visibility is not a by-product; it is the product.

They say success has a thousand fathers while failure is an orphan. Success on the scale of Ronaldo’s doesn’t have a single origin: It is the result of the efflorescence of global sports broadcasting in the 2000s and the myriad changes in the media landscape that made some athletes 360° global celebrities. Nike, TAG Heuer and the other brands he endorses have positioned him expertly, so it’s impossible for him to go unnoticed. Collectively, they are the ecosystem in which the Ronaldo brand thrives.

[Ellis Cashmore’s is published by Routledge]

[ edited this piece.]

The views expressed in this article are the author’s own and do not necessarily reflect 51Թ’s editorial policy.

The post Cristiano Ronaldo in the New Media Ecosystem appeared first on 51Թ.

]]>
/culture/cristiano-ronaldo-in-the-new-media-ecosystem/feed/ 0
Australia’s Idiotic Social Media Ban /business/technology/australias-idiotic-social-media-ban/ /business/technology/australias-idiotic-social-media-ban/#respond Sat, 22 Nov 2025 12:15:40 +0000 /?p=159245 “More moral panics will be generated … our society as presently structured will continue to generate problems for some of its members … and then condemn whatever solution these groups find”  —  Stanley Cohen, Folk Devils and Moral Panics (1972) Cohen might have been writing about Australia in 2025. By banning every child under 16… Continue reading Australia’s Idiotic Social Media Ban

The post Australia’s Idiotic Social Media Ban appeared first on 51Թ.

]]>
“More moral panics will be generated … our society as presently structured will continue to generate problems for some of its members … and then condemn whatever solution these groups find” 

—  , Folk Devils and Moral Panics (1972)

Cohen might have been writing about Australia in 2025. By banning every child under 16 from social media — the world’s first such prohibition, due to take effect on December 10 — the Australian government is not protecting youth. It is spooking its own population, provoking widespread anxiety and amplifying scrutiny of teenage behavior.

In attempting to regulate digital life, policymakers have sparked the very fears they claim to contain. This is textbook moral panic, in which misconceived legislative overreaction has generated attention, consternation and, of course, resistance. There are bound to be unintended consequences.

Rationale

Australia’s legislation is the culmination of a year-long political build-up of concern over online harms, including cyberbullying, sexual predation, self-harm content, algorithmic manipulation and addictive scrolling. Ministers sold the new legislation as a lifeline for parents. Prime Minister Anthony Albanese puzzlingly insisted the law is about “letting kids be kids.” Communications Minister Anika Wells added that parents deserve “peace of mind.”

Publicized cases of teenage suicide linked to online abuse, combined with national apprehension about the wider digital world’s opacity, created an open goal for decisive intervention. But the intervention is as crude as it will be ineffective.

Nine platforms are affected: Facebook, Instagram, TikTok, Snapchat, Threads, X, YouTube, Reddit and Kick. They must block new accounts for under-16s and deactivate existing ones. Noncompliance carries fines of up to 49.5 million Australian dollars ($32 million).

Platforms had initially protested, warning that mandatory age verification would be intrusive, inaccurate and pretty easy for a teenager to circumvent. The compromise relies on behavioral age-estimation tools, using engagement metrics such as “likes,” with third-party age-assurance apps invoked only for disputes. Teens will receive notices inviting them to download their data, freeze accounts or lose them entirely. The government reckons the measure is fail-safe. 

Interestingly, public opinion largely agrees: A poll last November found that 77% of Australians over 18 support the ban. Internationally, the legislation is being watched closely: New Zealand is considering similar restrictions, Florida has passed a comparable law and European countries are experimenting with age limits on social media.

Australia has become a global crucible, potentially setting a precedent for future restrictions elsewhere. Yet it is unlikely that such a contentious measure would receive comparably emphatic support abroad: Analogous research from the US and Europe reinforces the sense that Australia is out of step with global opinion. A smaller share of Americans favor banning children under 16 from using social media platforms, while fewer Brits aged 18–27 would support such a ban than the 50% who would oppose it.

Forbidden fruit

The ban rests on a naïve assumption: that teenagers will quietly accept exclusion. History suggests otherwise. Adolescents grow up in a culture in which a ban is not so much a prohibition as a challenge. You don’t have to be familiar with to know that anything becomes more desirable once it’s not allowed. It’s called forbidden fruit.

Young people are wired for risk-taking and boundary-pushing, culturally inclined to resist adult overreach and technologically literate enough to bypass nearly any restriction. Cohen’s spiral is already becoming evident: officialdom suppresses, youth respond by circumventing and media attention magnifies both behavior and, by implication, anxiety.

Every generation of adults seems either to forget or ignore what youth entails. This is a developmentally crucial period: experimentation, novelty-seeking and testing limits are essential to forming adult judgement (or at least they were for me). Social media is not simply the communication toy adults assume it to be: It is an organic space, a venue for the formation of identities, connecting with peers and performativity — by which I mean presenting oneself to audiences. Policymakers’ assumption of adolescent passivity (that young people are childlike innocents who need to be insulated from “danger”) is patronizing and just plain wrong.

Savvy teenagers are inevitably going to find ways around blocks using virtual private networks (VPNs), multiple accounts, peer sharing or app workarounds. Attempts at enforcement will generate not compliance, nor even frustration, but clandestine use, probably promoting the very thing the Australian government is trying to curb. The ban, while intended as a protective measure, will inadvertently amplify attention, defiance and risk.

Australia’s discourse around the online dangers of youth often exaggerates risk while underestimating teens’ capacity for ingenuity and critical engagement. Social media is an uneven terrain: simultaneously treacherous and empowering, unintelligent and educational. By understanding it only as a hazard in the hands of the young, policymakers manufacture fear and fuel anxiety, rather than addressing specific harms in a targeted manner.

Wonderworld

Let me declare an interest: as I see it, the internet has introduced us — and I mean everyone with access to a functioning keyboard — to a wonderworld. It might at times appear dystopian, but it is a beguiling, exploratory, shapeshifting encyclopedia-cum-almanac that fascinates us and will continue to fascinate, no matter how hard misguided politicians try to put young people off.

What Australian legislators have ignored is the immense educational and cultural value of social media and the broader internet. For many adolescents, these platforms are not booby-traps but jetpacks to the stars, taking them to places where they can explore identity, pursue interests and access knowledge unavailable in school.

YouTube hosts Massachusetts Institute of Technology (MIT) lectures on physics, creators offer tutorials from Seoul to São Paulo and online communities nurture everything from coding to calligraphy. Teenagers today learn, connect and experiment in ways literally unimaginable to previous generations.

For all the scares surrounding it, social media is not merely a funfair of distraction; it is a gargantuan archive of human knowledge, a site of peer support, creative collaboration and social cohesion. Adolescents do not merely consume content; they negotiate, reinterpret and contribute. The internet has become a vast, decentralized educational system that surrounds and inhabits us. To cordon off adolescents from this is not protection; it is denial, cutting them off from resources essential to their development.

We humans have historically reacted to new technologies with suspicion: the telephone was once accused of distracting women from productive endeavors (like housework); radio of corrupting the young; television of shortening attention spans; film of unleashing delinquency. Every trepidation now seems ludicrous. The hostility to social media follows the same script: a mix of fear of novelty, fondness for stability and conviction that younger generations must be defended from innovation.

Australia’s ban will do little to stop young people from navigating the wonderworld. It will only make that navigation more secretive, more fragmented and potentially more hazardous. In attempting to “let kids be kids,” lawmakers risk stunting the curiosity so integral to growing up. As Stanley Cohen warned in 1972, “Moral panics, once launched, develop a life of their own, becoming more about the panic than the actual event that started it.” Australia is about to learn this.

[Ellis Cashmore is co-author of (Macmillan).]

[ edited this piece.]

The views expressed in this article are the author’s own and do not necessarily reflect 51Թ’s editorial policy.

The post Australia’s Idiotic Social Media Ban appeared first on 51Թ.

]]>
/business/technology/australias-idiotic-social-media-ban/feed/ 0
A Century of Royal Scandal: Britain’s Shocking Monarchy /interactive/a-century-of-royal-scandal-britains-shocking-monarchy/ /interactive/a-century-of-royal-scandal-britains-shocking-monarchy/#respond Thu, 20 Nov 2025 08:13:28 +0000 /?p=159201 The post A Century of Royal Scandal: Britain’s Shocking Monarchy appeared first on 51Թ.

]]>

The post A Century of Royal Scandal: Britain’s Shocking Monarchy appeared first on 51Թ.

]]>
/interactive/a-century-of-royal-scandal-britains-shocking-monarchy/feed/ 0
The Timeline of Jeffrey Epstein /interactive/timeline-jeffrey-epstein-23949/ Tue, 11 Nov 2025 21:09:17 +0000 /?p=80051 This timeline by Ellis Cashmore, author of “Kardashian Kulture,” explores the Jeffrey Epstein case over the years.

The post The Timeline of Jeffrey Epstein appeared first on 51Թ.

]]>

The post The Timeline of Jeffrey Epstein appeared first on 51Թ.

]]>
Is Taylor Swift Really as Great as Shakespeare?* /culture/is-taylor-swift-really-as-great-as-shakespeare/ /culture/is-taylor-swift-really-as-great-as-shakespeare/#comments Sat, 08 Nov 2025 14:14:48 +0000 /?p=159035 To those who pay attention to the zeitgeist, Taylor Swift is the defining person of the century, not so much an echo from the past or a harbinger of the future as the pure distillation of the present. She reflects others’ lives and identities — restless and shifting — while her own life is mediated… Continue reading Is Taylor Swift Really as Great as Shakespeare?*

The post Is Taylor Swift Really as Great as Shakespeare?* appeared first on 51Թ.

]]>
To those who pay attention to the zeitgeist, Taylor Swift is the defining person of the century, not so much an echo from the past or a harbinger of the future as the pure distillation of the present. She reflects others’ lives and identities — restless and shifting — while her own life is mediated and molded by her fans, even as it unfolds. Those fans are not just fans: they are participants.

To everyone else, she’s simply a woman who writes songs and somehow rules the world.

Rarefied group

When, in October, Swift’s The Life of a Showgirl sold millions of copies in its first week (the biggest debut since sales tracking began in 1991), industry watchers snatched at superlatives. “The biggest moment in the sales history of the business,” Hits Daily Double declared, crediting Swift’s “ability to be ubiquitous in this ultra-segmented media era.” The Financial Times placed her among a “rarefied group” of Elvis Presley, The Beatles, Michael Jackson and Madonna, all artists whose commercial power reshaped popular culture.

It’s an understandable and perfectly justifiable comparison. Swift’s achievements, in purely quantitative terms, are breathtaking. Her 12th studio album has outsold everything in memory: Adele’s 25, Beyoncé’s Renaissance, Billie Eilish’s Happier Than Ever. Her Eras Tour has grossed more than $2 billion, the highest in history. She has turned record sales, an unfashionable metric nowadays, into a revivalist practice, with fans clasping vinyl, CDs and limited “variants” as if they were sacred objects.

But metrics alone don’t make for greatness. Swift outsells anyone in history, but can she hold her own with the titans of pop music in terms of cultural significance? The more difficult question is whether that significance belongs to an entirely different order: not alongside Elvis and Madonna, but with the likes of William Shakespeare and Pablo Picasso.

Invention and translation

Elvis Presley poured American blues music into the blender in the 1950s, and it came out pureed, its African American essence hardly discernible. Elvis added a splash of vanilla and served it up to white audiences. He still channeled enough raw energy to draw the wrath of broadcasters, many of whom accused him of corrupting America’s youth with “jungle music.” But, while he frightened adults, he electrified teenagers. And, as we know, nothing electrifies teenagers like a ban. The more stations that banned his music, the more young people wanted to hear it.

Few artists have ever embodied rebelliousness so convincingly as Elvis: his transgressive music and voluptuous on-stage presence made him unsettling. But not for long: by 1960, after two years’ service in the US Army, he returned as a domesticated entertainer, his projects safe, formulaic and designed for the mass market. 

The Beatles did something similar but in another register. They not only re-defined the idea of a group — self-writing, self-performing, British — but evolved so rapidly that they seemed to anticipate every new direction of the Swinging Sixties’ imagination. They moved from the innocent peppiness of “She Loves You” in 1963 to the dreamscapes of “Tomorrow Never Knows” in 1966 to the meditative Indian-influenced “Within You Without You” of 1967. By 1970, when they announced the breakup of the band, the Beatles had expanded the meaning of pop music and helped establish it as a legitimate art form.

Michael Jackson was the first global star in the post-Civil Rights era. And, of course, he was African American. This is not quite the same as being black: for many, he wasn’t black enough. He straightened his hair, lightened his skin, dressed in designer chic and escorted (and married) white women. Aesthetically, Jackson fused the visual and musical into spectacle using the then-emerging medium of the music video. 

His groundbreaking 14-minute Michael Jackson’s Thriller video was given an unheard-of global premiere on December 2, 1983. It didn’t just sell albums; it remade MTV, fashion and choreography. It’s also worth noting that MTV had a reputation for favoring white artists: Jackson wasn’t the first African American to appear on the channel’s playlists, but he became a regular.

So did Madonna, who, like Jackson, made a specialty of breaking taboos, in her case mostly relating to sex, gender and religion. Also like Jackson, she emerged when the video was becoming as important as the record, and hence, pop music was morphing into a new type of theater. Madonna proved expert in weaponizing the video, using visual narrative to provoke, annoy and, whenever possible, outrage as many people as possible. In the process, she challenged a culture still governed by patriarchal sensibilities. She was an original: Strategic, self-conscious and protective about her own image.

Can Taylor Swift claim a similar level of originality to Madonna or, indeed, any of the others? Probably not in content or presentation. She extends rather than creates. Her albums heave with heartache, unrequited love and breakups. Critics have accused her latest, Life of a Showgirl, of recycling melodies and even borrowing outright from earlier songs. The Guardian described the album as “dull razzle-dazzle.” Her detractors have described it as regression, a product of “brand maintenance.” Lyrically, her confessions no longer catch the listener unaware, and her themes seem familiar.

Originality today may be defined less by invention than by translation: the ability to convert private feeling into public industry. Swift’s distinction is her capacity to industrialize emotional and spiritual crises while preserving the illusion of intimacy. She sings to millions, yet each listener feels personally addressed, as if a note to the self had become the basis of an entire cultural economy.

Her contemporaries Lana Del Rey, Hozier and Hayley Williams (of Paramore) all command admiration for their lyricism and emotional range. We might also add Joni Mitchell, whose career began in the 1960s, and who is revered artistically and rhapsodized over by successive generations. But while critics as well as devotees laud all these artists, none has been validated so fulsomely as Swift.

Finding a place for Swift in the canon of pop music is hardly contentious. Of course, others might nominate Bob Dylan, Bruce Springsteen, David Bowie or Prince — each pivotal in their way. Swift appears to represent something different: the fusion of authorship, entrepreneurship and emotional connection on an industrial scale, and perhaps she deserves a place in a different, altogether more magnificent canon.

Transformative

In 2015, the musician Ryan Adams recorded an homage to Swift’s 1989 and concluded: “Taylor Swift is like Shakespeare.” Within a few years, the comparison became unremarkable.

“We compare her to Shakespeare all the time,” acknowledged Elizabeth Scala, a professor at the University of Texas, where a course analyzed Swift’s lyricism and narrative composition. Another scholar explored formal and thematic continuities between Swift’s songwriting and Shakespearean conventions in a study subtitled The Poetic and Musical Genius of Taylor Swift, noting how the emotional subtlety, wordplay and storytelling in Swift’s work demand interpretive attention in ways reminiscent of classical literature.

A writer for NSS magazine believed Swift might have taken cues from another creative torchbearer: “Pablo Picasso had long foreseen the endless lucrative potential of inserting personal stories into works of art.” Lucian Grainge, the CEO of Universal Music Group, also likened her to the prolific painter and sculptor: “Imagine Picasso painting something that he painted a few years ago, then re-creating it with the colors of today,” said Grainge about Swift in 2023, after she decided to re-record her early music note-for-note.

As part of the Maryland Symphony Orchestra’s 2024–2025 Lecture Series, the orchestra’s conductor explored “how both [Ludwig van Beethoven and Swift] use music to navigate the complexities of human emotion, frame stories of struggle and resolution, and inspire listeners to reimagine what’s possible.”

Swift is now a subject of serious academic study, her work treated with the critical solemnity traditionally reserved for canonical poets, playwrights and transformative artists. There is a recognition that Swift’s capacity to shape narrative, emotion and audience engagement distinguishes her not just as a pop icon, but as a transformative figure in contemporary culture.

Now, how about in 134 years’ time? Will people still be listening to her music? I ask because I noticed a new production of one of Henrik Ibsen’s plays has just opened. The play was first performed in 1891. Ibsen died in 1906, and he remains one of the world’s most performed playwrights, second only to Shakespeare in frequency. I’ll start my answer with another question.

Unstable or visionary?

Would the audience that attended the Prague premiere of Wolfgang Amadeus Mozart’s Don Giovanni in 1787 have wondered whether they had heard the most thrilling, heart-stoppingly beautiful opera ever created, and whether the gods had blessed the composer with gifts of unique brilliance? Probably not: More likely they, like today’s viewers of reality TV, were just enjoying it.

Picasso came to public attention in Britain with a joint exhibition of his and French artist Henri Matisse’s work at London’s Victoria & Albert Museum in 1945, when he was 64, but his unique virtuosity wasn’t widely recognized until 1960, 13 years before his death. Even Shakespeare was acknowledged in his own day as only one of several proficient playwrights. Some even suspect the Bard took credit for the work of others who, for various reasons, preferred not to be named as the authors of their plays. It was the Romantic writers of the nineteenth century who hailed him as a transcendent artist, a sentiment shared and embellished by successive generations.

Artistic brilliance is rarely acknowledged in its own time. Most artists we now recognize as great labor in obscurity, their originality so unfamiliar that contemporaries mistake it for quirkiness or lack of ability. Austrian composer Franz Schubert, who died at 31, was known mainly as a songwriter to a small circle of friends; the vast symphonic and chamber works that later secured his monumental status went largely unheard until long after his death.

Vincent Van Gogh sold almost nothing in his lifetime and was regarded as unstable rather than visionary. Beethoven, by contrast, was an exception to this tendency: He was acknowledged, debated and even revered while still alive. Even so, his audiences may not have understood the full extent of his innovation. But they recognized his stature, sensing — rightly — that they were witnessing a kind of greatness seldom granted to the living.

Had Swift (b. 1989) delivered one of her songs about self-loathing and disordered eating, such as “Lavender Haze” or “You’re On Your Own, Kid,” in 1824 (when Beethoven’s Ninth Symphony premiered in Vienna) rather than 2022 (the year her album Midnights, on which these tracks feature, was actually released), would there have been the response of rapture and veneration there was in the 21st century? Doubtful.

Greatness

Greatness is an attribution: the act of ascribing a quality to a person as though they possessed it as a characteristic. As such, it’s a feature of a relationship rather than a property: if audiences recognize greatness, it becomes greatness. Shakespeare and the others are great not simply because of innate talent, but because generations of people recognized, celebrated and responded to their work as extraordinary.

Their audiences agreed, explicitly or implicitly, that the art they produced could only have been created by someone exceptional. This shared belief is precisely what conferred the status of greatness. Without the collective acknowledgment, the works might have been admired, but the creators would not have been idolized.

If greatness is an ascription rather than an innate quality, then the question of Swift’s stature becomes less about her talent and more about consensus. Shakespeare, Beethoven and Picasso endure not because their genius is self-evident, but because history has agreed upon it. They occupy the canon — that institutional space where value, once conferred, becomes permanent. The canon stabilizes reputations: within it, artists may be reinterpreted, critiqued, even periodically dethroned, but never truly ejected.

If we extend this relativistic logic to Swift, she already qualifies as great — vast audiences and critical establishments have ascribed greatness to her. The unresolved question is whether she will achieve the same permanence: whether the agreement that now sustains her brilliance will endure long enough to secure her a lasting place among the cultural immortals.

Which returns us to our earlier question: Will audiences still listen to her in the 23rd century? Ibsen’s plays are masterpieces of literary realism, exploring psyche, emotion and society, timelessly relevant subjects. Swift is known for delivering compositions on love, heartbreak and self-discovery, all universal tropes that will resonate in the future. Yet her lyrics, intelligent and culturally acute as they are, reflect 21st-century moods more than inquiries into the human condition.

So, while she commands devotion now, her work lacks the dramatic architecture that keeps masterpieces living. Ibsen’s plays in particular seem perennially alive because they expose the moral tensions, social hypocrisies and psychological constraints that remain recognizable across centuries; they invite reinterpretation in every era, and this is another key variable.

Successive generations adapt and modify work without losing its significance. Swift already reinterprets her own catalog: The Taylor’s Version re-recordings, along with her acoustic rearrangements and live variations on her hits, show her work’s flexibility and capacity for renewal. But, unlike classic plays or symphonies, most of her songs are unlikely to be covered, adapted or even sampled for years to come, and this may limit their afterlife beyond her own performances.

Swift has already achieved greatness: We, her audience, agree on that. She can hold her own in the esteemed company of Elvis and co. And her material may well be reimagined for decades to come. But is there enough profundity and discernment to guarantee her a place in the artistic canon?

Predicting the future of music preferences is impossible. The ever-changing nature of art and entertainment makes it difficult to determine whether Taylor Swift’s music will remain popular next year, let alone in the 23rd century. And yet her colossal impact on the music industry and her ability to adapt to changing times will surely ensure her lasting popularity, though perhaps not her place in the artistic canon.

[Ellis Cashmore’s latest book is published by Bloomsbury.]

[ edited this piece.]

The views expressed in this article are the author’s own and do not necessarily reflect 51Թ’s editorial policy.

The post Is Taylor Swift Really as Great as Shakespeare? appeared first on 51Թ.

]]>
/culture/is-taylor-swift-really-as-great-as-shakespeare/feed/ 1
When the Spell Breaks for Taylor Swift and J.K. Rowling /culture/when-the-spell-breaks-for-taylor-swift-and-j-k-rowling/ /culture/when-the-spell-breaks-for-taylor-swift-and-j-k-rowling/#respond Wed, 15 Oct 2025 13:24:40 +0000 /?p=158642 Everything celebrities do, say or even think is subject to scrutiny. Not just their official output, like albums, concerts or books, but even their ideas, beliefs and values. The more acclaim they get, the greater the scrutiny. So, any celeb expecting not to receive a mega-tsunami of criticism at some point is in for a… Continue reading When the Spell Breaks for Taylor Swift and J.K. Rowling

The post When the Spell Breaks for Taylor Swift and J.K. Rowling appeared first on 51Թ.

]]>
Everything celebrities do, say or even think is subject to scrutiny. Not just their official output, like albums, concerts or books, but even their ideas, beliefs and values. The more acclaim they get, the greater the scrutiny. So, any celeb expecting not to receive a mega-tsunami of criticism at some point is in for a rude awakening. Condemnation and praise are both part of the celebrity equation. 

Recently, two of the most influential women in the world have been under fire, not for the first time, but in a way that suggests they are now targets. We shouldn’t be surprised: No one reaches the altitude of American singer Taylor Swift or wields such cultural sway as British author J.K. Rowling without morphing into what Australians call a tall poppy: a person who is conspicuously successful and whose success frequently attracts envious hostility. All the same, Swift and Rowling must feel like they’ve been mugged by fate.

Death of a Showgirl

When Taylor Swift’s latest album, The Life of a Showgirl, dropped at midnight Eastern time — or five o’clock in the morning for British listeners — on October 3, it wasn’t so much a release as a global liturgy. Social media went dark for a moment, then exploded. Streaming platforms groaned under the weight of first-hour demand. The scale of anticipation was comparable to the premiere of Michael Jackson’s “Thriller” video on December 2, 1983. In the streaming age, that kind of mass simultaneity is almost extinct. Swift revived it and succeeded in turning a piece of art into an event, an instant that millions of people felt compelled to witness in real time.

Yet within hours, the mood shifted. Critics, fans and detractors converged on X and TikTok with competing interpretations and barbed put-downs. The language was conspicuously moral: her lyrics were “shallow,” her tone “smug,” her persona “calculated.” Others countered that the album’s apparent simplicity disguised a deft portrait of emotional exhaustion and self-awareness. Swift, they insisted, was no longer merely chronicling a relationship: She was deliberately dramatizing being misread. In doing so, she invited the very scrutiny that she’d previously escaped.

The sheer intensity of the reaction tells us something about Swift’s position in the cultural universe. She no longer belongs solely in the lineage of pop icons like Elvis, Madonna or the Beatles, all artists who revolutionized their genres. Swift has become a creative force whose work operates in the same register as Mozart, Picasso or even Shakespeare: Endlessly deconstructed, interpreted and repurposed by an audience that insists on seeing itself not as fans but as integral parts of her output.

Swift’s lyrics are densely plotted, multi-charactered, imaginatively textured and woozily allegorical ruminations on love, moods and, it seems, pretty much everything else on the emotional spectrum of young women. Her songs are no longer personal confessions; they are public texts, polysemous and perpetually rewritten by collective opinion. But there are consequences.

Such scrutiny is both the cost and confirmation of greatness in the twenty-first century. Swift’s dominance has made her music inseparable from the moral, political and aesthetic debates that surround it. Each album now functions as a kind of referendum: Not just on her greatness, but on our own cultural values, such as sincerity versus irony, feminism versus fame, authenticity versus performance. The reaction to this latest release shows how fully celebrity has replaced theology — a system of belief and disbelief, faith and heresy, played out on a global scale.

The statue that came to life

Like Swift, her fellow traveller among masters of modernism, J.K. Rowling has become an icon: One with the perverse sway of the infallible that often, or perhaps always, comes with that status. Once the conjurer of a fictional universe filled with spells and sorcery, Rowling has since re-emerged as something stranger: a cultural contrarian who has left behind the safe realm of her own imagination to enter the more perilous environment of public morality. 

Where Swift’s lyrics are dissected for hidden clues to her private loyalties, Rowling’s tweets and interviews are parsed like forbidden texts. Both women inhabit an arena where art and identity, opinion and orthodoxy collide — and where the penalties for misinterpretation can be swift, severe and global.

The dispute between Rowling and her onetime star, Emma Watson, has the character of a fabulous myth: A Pygmalion whose statue refused to stay marble and acquired a life of its own. Rowling didn’t simply write Hermione Granger; she gave humanity to the actor who embodied her. Watson’s intelligence, integrity and, more problematically for Rowling, activism read almost as living testaments to the author’s original conception. The trouble is that those testaments now echo far beyond the author’s reach. What began as a relationship of artistic possession has evolved into one of personal opposition: a prolific author confronted by the independent will of her own invention.

Rowling has often spoken of the intimate, even mystical relationship between author and character, but in Watson’s case, that bond is not just fiction. Rowling, it seems, didn’t merely write Hermione Granger: she effectively conjured into public life the actor who embodied her. Watson’s mind, fame and public presence were once inseparable from the role Rowling imagined. Small wonder, then, that when the two women diverged on questions of gender and identity, the disagreement carried the emotional heft of filial rebellion.

For Rowling, the heated exchange over womanhood has become a moral, not merely a semantic, war of words. Her insistence on the primacy of biological sex has alienated many former admirers, including Watson, who has championed a more inclusive view of gender. What might once have been a civil philosophical disagreement has hardened into a symbolic war, each side standing for a different conception of what it means to be a woman. Rowling, cast by her critics as an unrepentant transphobe, now wears her ostracism almost as proof of principle; Watson, now recast as the conscience of a younger, more progressive gender-fluid generation, responds with quiet but unmistakably angry defiance.

Victims of greatness

If there is tragedy as well as fascination in this, it lies in the destructive symmetry. Rowling, the author-cum-deity, breathes life into a character; the character’s embodiment grows up, acquires autonomy and ultimately challenges the authority of her maker. What began as a story about differences of opinion is now a kind of parable about power: who gets to define reality, the storyteller or the world she helped shape? What could be more entertaining for the legions of Harry Potter fans who have followed Rowling’s fiction like acolytes?

Two peerless, pre-eminent women, one a writer who conjured a universe anew from her imagination, the other a singer who built a mythology from her own experiences. Each has spent years, even decades, in a state of near-unchallenged ascendancy. Their output has defined eras; their names are signifiers for excellence and female empowerment. Both women have enjoyed the kind of global adulation that borders on worship — a condition that’s as intoxicating as it is precarious. When the applause is quieted and the disciples turn inquisitors, the criticism probably feels like betrayal.

Swift and Rowling aren’t victims of failure but of magnitude, eminence and greatness: when someone’s influence inflates so vastly, every expression, whether a lyric, a tweet or a throwaway remark, becomes public property. Swift’s heartfelt confessions and Rowling’s moral certitudes are annotated, pored over and endlessly reinterpreted as if they were theological texts. In a sense, they’ve become mirrors in which millions search for their own reflections.

At 35, Swift stands at the point where youthful omnipotence gives way to experience; Rowling, nearly 20 years her senior, inhabits a later stage of that same parabola. Both now confront an unfamiliar adversary: The scrutiny that attends not decline but grandeur. Each faces a culture newly suspicious of perfection and probably eager to unmask hitherto flawless idols.

[Ellis Cashmore’s latest book is published by Routledge.]

[ edited this piece.]

The views expressed in this article are the author’s own and do not necessarily reflect 51Թ’s editorial policy.

The post When the Spell Breaks for Taylor Swift and J.K. Rowling appeared first on 51Թ.

]]>
/culture/when-the-spell-breaks-for-taylor-swift-and-j-k-rowling/feed/ 0
Why Do the Beckhams Still Fascinate Us? /culture/why-do-the-beckhams-still-fascinate-us/ /culture/why-do-the-beckhams-still-fascinate-us/#respond Wed, 08 Oct 2025 14:34:09 +0000 /?p=158495 German philosopher Georg Wilhelm Friedrich Hegel believed that history unfolded in stages. Celebrity history seems to obey a similar pattern. The first stage is thrilling: novelty, scandal and surprise. The second is saturation: omnipresence in every medium, endlessly discussed and dissected. Then comes the fateful third stage: oblivion. The public forgets, or at best remembers… Continue reading Why Do the Beckhams Still Fascinate Us?

The post Why Do the Beckhams Still Fascinate Us? appeared first on 51Թ.

]]>
German philosopher Georg Wilhelm Friedrich Hegel believed that history unfolded in stages. Celebrity history seems to obey a similar pattern. The first stage is thrilling: novelty, scandal and surprise. The second is saturation: omnipresence in every medium, endlessly discussed and dissected. Then comes the fateful third stage: oblivion. The public forgets, or at best remembers faintly, why it was ever fascinated.

A presence without a role

David Beckham resists this dialectic. At the turn of the millennium, he dominated front pages — for his goals and his blunders; his haircuts and his alleged affairs; his restless moves across Europe and the United States, from Manchester to Madrid, Los Angeles to Milan and Paris. Retirement should have been the prelude to irrelevance. 

Instead, Beckham remained, appearing in commercials, at sports tournaments and even in the solemn queue to pay his respects at Queen Elizabeth II’s lying-in-state. Far from vanishing, he has become something stranger — a presence that endures without a role.

When Beckham retired from competitive football in 2013, no one seriously expected him to follow the familiar path of retired athletes and become a pundit, a manager, a serial reality television participant or even a bit-part film actor like Vinnie Jones. But equally, no one imagined he would prolong the spell first cast in the late 1990s, when he emerged as football’s first bona fide celebrity brand and the fiancé of “Posh Spice,” real name Victoria Caroline Adams (now Lady Beckham), from the all-conquering Spice Girls.

Yet the spell endures. Beckham has developed into an unusual, perhaps unique figure. Never in the history of celebrity has there been someone who remains a regular media fixture while doing nothing beyond appearing.

American socialite Paris Hilton might seem like a precedent. Yet Hilton released an album, starred in several films and published a book. Businesswoman Kim Kardashian and her clan industrialized their presence through reality television, social media and orchestrated moments — from being robbed at gunpoint in Paris, France, to wearing actress Marilyn Monroe’s historic “Happy Birthday, Mr. President” gown at the Met Gala.

Beckham, by contrast, has been scandal-free for years. Even when he looked headed for trouble in 2022, serving as a £125 million ($166 million) ambassador for Qatar’s World Cup despite the country’s criminalization of gay sex, he somehow avoided lasting damage. Earlier in his career, both he and Victoria had openly supported LGBTQ+ rights. But the controversy simply dissipated.

Beckham has no peers. For 12 years, he has remained fascinating without a role to explain his fascination. Other “presences” like actors George Clooney, Brad Pitt and Pierce Brosnan sell watches, coffee or cologne, but their anchoring role is cinema. Stop the films and the endorsements would surely dry up. Beckham has no other role. He simply appears.

Developing the brand

In 2000, Polity Press asked me to write a book on Beckham, exploring how an athlete with no political views — or even opinions on much at all — could become a global icon. My book, Beckham, was a rush job but sold well enough to earn a second edition.

In that edition, I argued that his ascent owed much to Victoria. While her bandmates enjoyed the moment, she studied the dark arts of celebrity. Like the sculptor Pygmalion of Greek myth, she fashioned raw material into a figure the world could love. For a while, there wasn’t just one Beckham: There were innumerable versions in the imaginations of millions. And, because he rarely spoke, he never disappointed.

In 2003, Beckham left SFX (sports agents) for British entrepreneur Simon Fuller’s 19 Entertainment, home of the Spice Girls. As one report put it at the time: “Victoria believes Mr Fuller, who propelled her to fame, can develop the Beckham brand on a global scale.” No one can doubt that he succeeded, perhaps beyond his or Victoria’s wildest dreams, though surely neither expected the barely believable developments after David retired.

One lesson I am sure Victoria learned with the Spice Girls was how to ration yourself. The band was a prolific and indiscriminate endorser; they appeared in countless print and on-screen ads and licensed their name to products. Eventually, there was overexposure, inclining both advertisers and consumers to turn away. Beckham endorsed plenty of products, but they were usually handpicked in a way that enhanced his persona: underwear, whiskey or cologne, for example. And different ads appeared in different territories so that consumers never tired of seeing his image.

Victoria, meanwhile, had her sights on becoming the head of a serious fashion house. In 2008, she launched her first collection. It grew into Victoria Beckham Holdings, a structure of interlinked companies financed by family money, Fuller’s management group and later a private equity partner. The business has expanded from clothing into handbags, footwear and beauty lines, though it’s required repeated capital injections to cover consistent losses — over £60 million (over $80 million) since inception — in order to keep inventory, production and expansion going.

Her husband had no comparable travails. His own commercial structure, DB Ventures Ltd (incorporated in London in 2014), was created explicitly to maximize revenue from endorsement contracts with the likes of Adidas, H&M, Armani, Haig Club whiskey, Tudor watches and others. The sums have been vast: In 2016, DB Ventures reported profits of over £39 million (over $52 million); by 2019, its annual revenues topped £45 million ($60 million), so that even in retirement, Beckham regularly ranked among the world’s highest-paid athletes. 

Where Victoria had to build a fashion brand almost from scratch, David simply monetized his name and the world kept buying. In 2022, Authentic Brands Group (ABG) acquired a stake in DB Ventures for approximately $269 million. This strategic partnership aimed to leverage ABG’s global reach and brand development expertise to expand Beckham’s brand across new verticals and territories, including Europe, the Middle East and Asia-Pacific.

New audiences

The true marvel of Brand Beckham isn’t just how David and Victoria have interwoven their various ventures into a cohesive empire, but how they’ve sustained it across generations. For Generation Z and many Millennials, David is primarily known as a ubiquitous figure in advertisements, a far cry from the footballer who once captivated the world with his on-field prowess. 

These younger audiences won’t recall his red card at the 1998 World Cup, the subsequent vilification or his eventual redemption and return to the pitch. They might not remember the tabloid frenzy over his fashion choices, like the infamous sarong, or the alleged affair that rocked his marriage. Yet, despite this, their interest in the Beckhams remains palpable.

This intrigue culminated in Netflix’s acquisition of a 2023 documentary series, Beckham, made by the Beckhams’ production company, Studio 99, in a deal Forbes suggested ran to many millions of dollars. The series offered an “intimate” look into David’s life, from his rise in football to his ventures in business and media. Despite initial reservations (David said he “hated almost every moment” of filming), the series resonated with audiences, probably in the same way Netflix’s Meghan & Harry series succeeded. It garnered a high approval rating on Rotten Tomatoes and was nominated for five Primetime Emmy Awards, including Outstanding Documentary or Nonfiction Series.

Critics were largely favorable, with The Guardian describing it as “a candid, riveting truth about the footballer’s life.” Others noted its glossiness, with Vulture complaining that it felt “too neat to feel honest about the complicated life we watched play out.”

The success of the documentary yielded tangible financial benefits. In 2024, David Beckham’s business empire saw profits surge, leading to a reported $57 million payday. The Sunday Times Rich List for 2025 reported that the combined net worth of David and Victoria was £500 million (approximately $636 million).

David Beckham attending the Inter Miami CF vs. Houston Dynamo US Open Cup Final.

Defying logic

Buoyed by Beckham’s triumph, Netflix commissioned a three-part documentary series, Victoria Beckham. Some will groan that the Beckhams are becoming another Prince Harry and Meghan Markle, or even the Kardashians reincarnate. Whether you find them grim, hilarious or neither, there’s no doubt that the Beckhams have worked, reworked and double-reworked their magic for this century. They have adapted to new audiences and platforms in ways that defy logic.

Celebrities with enduring popularity usually have a body of work that continues to beguile audiences long after their prime — think Marilyn Monroe, David Bowie or Audrey Hepburn. Yet David’s football career ended 12 years ago and Victoria was never the most prominent member of the Spice Girls. Most reality television breakout stars, by contrast, enjoy intense but fleeting fame; few remain culturally relevant beyond a decade.

The Beckhams, on the other hand, have created an ecosystem that consistently delivers what audiences want while staying attuned to cultural trends even when others have written them off. In a landscape where fame is often ephemeral, their ability to sustain attention across generations is a feat both remarkable and almost defiant.

[The 2nd edition of Ellis Cashmore’s Beckham was published by Polity in 2004. His latest book is published by Bloomsbury.]

[ edited this piece.]

The views expressed in this article are the author’s own and do not necessarily reflect 51Թ’s editorial policy.

The post Why Do the Beckhams Still Fascinate Us? appeared first on 51Թ.

]]>
/culture/why-do-the-beckhams-still-fascinate-us/feed/ 0
Why Do Americans Want To Buy Premier League Football Clubs? /world-news/us-news/why-do-americans-want-to-buy-premier-league-football-clubs/ /world-news/us-news/why-do-americans-want-to-buy-premier-league-football-clubs/#respond Mon, 06 Oct 2025 13:40:13 +0000 /?p=158467 In association football, the world’s a safari park where wild animals (the players) are kept in the open and may be observed by visitors (fans) driving through. The owners of the areas of parkland charge visitors and pay the usual overheads plus a handsome bonus for the beasts. This isn’t quite as businesslike as it… Continue reading Why Do Americans Want To Buy Premier League Football Clubs?

The post Why Do Americans Want To Buy Premier League Football Clubs? appeared first on 51Թ.

]]>
In association football, the world’s a safari park where wild animals (the players) are kept in the open and may be observed by visitors (fans) driving through. The owners of the areas of parkland charge visitors and pay the usual overheads plus a handsome bonus for the beasts. This isn’t quite as businesslike as it sounds. In fact, it appears absolutely inane when you realize the outgoings exceed the money generated: What sounds like a potentially profit-making enterprise actually loses money hand over fist.

This painfully contrived metaphor leads me to the question I want to answer in this piece. While safari parks are, to my knowledge, well-run endeavors that turn a pretty penny, football clubs are decidedly not. So why would anyone in their right mind, and with plenty of money, want to (to prolong the excruciating metaphor) put their head in a lion’s mouth?

Money-burning on purpose?

Half of England’s Premier League clubs are now in American hands. And yet most of these clubs lose tens of millions of pounds every year. Are football clubs businesses designed to burn money? It certainly seems like it.

Still, it’s a mostly free world and people are entitled to lose their lucre if they wish. Sheikh Mansour bin Zayed Al Nahyan, the deputy prime minister of the United Arab Emirates, wishes. In 2008, he bought the then-unfashionable Manchester City and used his money to turn the team into world-beaters. Newcastle United is also owned by Gulf state interests, in this case Saudi Arabia’s Public Investment Fund. Outside England, Paris Saint-Germain is owned by Qatar Sports Investments. All three clubs lose money, though the losses are comparatively small by Gulf standards. In all three cases, the acquisitions are like you or me treating ourselves to a new high-end television set; we’d have to pay the streaming subscriptions and the electric bills, but watching it is bound to give us a lot of comfort over the years.

The American stampede to buy English clubs needs a different explanation, though. American owners don’t treat clubs like a Cézanne painting, a blue diamond or a 1945 Domaine de la Romanée-Conti wine that sells for about $600,000 a bottle. Possessing irreplaceable artifacts like these is rewarding. Football clubs are valuable but replaceable. Any prospective buyer would surely make a comprehensive appraisal, balance a club’s assets and liabilities, evaluate its commercial potential and conclude it’s a money pit. And then buy it. Why?

I’ll take Manhattan

Let’s say you had enough money — or could at least borrow enough at an affordable interest rate — to buy property in Chelsea, London, or Manhattan, New York. Both are top-tier markets where prime buildings fetch $2,000–$4,000 per square foot, with ultra-rare assets climbing far higher. Real estate is scarce and rents are sky-high.

But you get a chance. You don’t want to live there, but you know the property will rent easily and at a strong rate. The rent, steep as it is, doesn’t cover the interest you’re paying on the loan (let’s say you pitched in 30% yourself and borrowed the balance). So you know in advance that the property is going to lose money. But in ten or 15 years, that property could have quadrupled, maybe quintupled, in value. Scarcity, inflation and global demand take care of that. You bought for $2 million. You sell at $8–10 million, meaning an annualized return of roughly 10–15% per year, compounded — ignoring transaction costs and taxes.
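As a rough sanity check of that 10–15% figure, the compound annual growth rate implied by the hypothetical numbers in the example above (buy at $2 million, sell at $8–10 million after 10–15 years; these are illustrative figures from the text, not market data) can be computed in a few lines of Python:

```python
def cagr(buy_price: float, sell_price: float, years: float) -> float:
    """Compound annual growth rate, ignoring transaction costs and taxes."""
    return (sell_price / buy_price) ** (1 / years) - 1

# Best case: 5x appreciation over 10 years
best_case = cagr(2_000_000, 10_000_000, 10)
# Worst case: 4x appreciation over 15 years
worst_case = cagr(2_000_000, 8_000_000, 15)

print(f"best case:  {best_case:.1%}")   # prints "best case:  17.5%"
print(f"worst case: {worst_case:.1%}")  # prints "worst case: 9.7%"
```

The quoted 10–15% range sits between those two extremes, so the back-of-envelope claim holds up.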

You could enhance the value of the property further by renovating it or making cosmetic or even structural changes, like installing a basement pool. And perhaps a Hollywood A-lister takes a shine to the place and rents it for a period every year, giving the place a certain je ne sais quoi (indescribable quality).

Purchasing Chelsea or Manhattan property at a loss with the expectation of long-term appreciation has a logic and American investors have applied the same logic to Premier League clubs, sometimes with more prodigious appreciation. Take the late businessman Malcolm Glazer’s acquisition of in 2005, for example. At the time, United was already a highly profitable club, yet Glazer bought it with a strategy that blended asset-building with financial leverage: using the club itself to secure loans and restructure ownership. We needn’t dwell on the complexities of his approach here, but the key point is clear: Even a well-run, profitable club can be acquired with an eye to long-term capital growth rather than immediate operating returns.

Today, his sons continue to pursue the same basic approach. They spend vast sums on salaries to attract top players, betting that the club’s global brand and commercial clout will keep appreciating over time, even if on-field success is intermittent. The purchase price in 2005 was £790 million; the club’s estimated value is now put at $6.6 billion, representing a roughly five-and-a-half-times increase.

By contrast, Liverpool under Fenway Sports Group (FSG) and Chelsea under businessman Todd Boehly and US private equity firm Clearlake Capital (which raises money from investors to buy, restructure and eventually sell companies at a profit) have applied a slightly different but related logic. Both clubs deliberately absorb operational losses to build value, investing heavily in players, infrastructure and global brand recognition. Unlike Manchester United, whose brand was already enormous when acquired, Liverpool and Chelsea required both performance and marketing strategies to enhance their appeal. The results have largely justified the strategy: Liverpool has prospered under FSG, while Chelsea, after selling its women’s team last year, at least broke even. But in both cases, operating losses remain the norm.

Liverpool cost FSG £300 million in 2010; the club is now valued at £3.6 billion ($5.4 billion). Boehly and Clearlake bought Chelsea for £2.5 billion (plus £1.75 billion in committed investment) in 2022. So far, the club has appreciated only modestly, by about 8%, to roughly $3.25 billion.

Remember: Football clubs may not have much liquidity, but they have predictable assets with substantial resale value. These include stadiums, ticketing, media rights and, of course, the contracts of the players. A single contract for one player could be worth tens of millions on the transfer market. Transfers involve the trading of contracts rather than players.

In short, whether it’s a trophy penthouse in Manhattan or a Premier League club, the American playbook is consistent: accept short-term losses in exchange for long-term asset appreciation, with the added bonus that global visibility and prestige can compound the perceived value just as celebrity tenants elevate a property’s cachet.

What about Bournemouth?

Manchester United, Chelsea and Liverpool are, in a sense, proven commodities. They’re situated in large conurbations, have global fan bases and international reach; they are all leading clubs.

What about Bournemouth? Based in a small resort town on the south coast of England with a population of less than 160,000 and no brand recognition to speak of, it seemed an odd acquisition when, in 2022, American businessman Bill Foley bought 100% of the club for a reported £100 million (over $135 million). He integrated it into his Black Knight Football Club (BKFC) group, which has clubs in Portugal, Scotland, New Zealand and elsewhere. Foley has interests in a great many sports clubs besides.

At first glance, the asset-building model seems less certain here. Trophies and worldwide sponsorship deals are unlikely to drive rapid value appreciation. Yet Foley’s strategy may not rely solely on global brand recognition. By maintaining Premier League status, optimizing matchday and broadcast revenues and integrating Bournemouth into BKFC’s growing network of clubs, he can create value incrementally. The club becomes a scarce, transferable asset: small, manageable and with room for structural or operational improvements — akin to buying a well-located property in a modest town and enhancing it gradually. Even without the glamor of top clubs, a decade from now, Bournemouth could yield a tidy profit for its American owner.

Bournemouth isn’t the only minor club to attract interest, either: A football club in Wrexham, a tiny Welsh mining town (population 45,000), was snapped up by Canadian and American actors Ryan Reynolds and Rob McElhenney, respectively, for £2 million ($2.7 million) in 2021. In June 2025, Reynolds and McElhenney considered selling a stake at a valuation of up to £350 million ($415 million). Birmingham City, an ailing, underachieving club in a big city, was bought by a consortium that included former National Football League star Tom Brady.

Short-term loss, long-term profit

Ultimately, what links all these acquisitions is a willingness to absorb short-term losses in pursuit of long-term value. Whether it’s a global powerhouse like Liverpool or Chelsea, a historically profitable but leveraged club like Manchester United or a modest, out-of-the-way club like Bournemouth, the logic is the same: These are tangible assets whose worth can grow substantially over time even if the immediate cash flow is negative.

For American owners, the allure lies not just in potential financial returns, but also in prestige, global influence and, stretching a point, the opportunity to shape a sporting legacy. Football really is like a safari park. It may appear chaotic, menacing and treacherous, and perhaps it is. But the reserve appreciates in value, and that perilous wilderness is also a thriving asset.

[Ellis Cashmore’s “” is published by Bloomsbury.]

[ and edited this piece.]

The views expressed in this article are the author’s own and do not necessarily reflect Fair Observer’s editorial policy.

The post Why Do Americans Want To Buy Premier League Football Clubs? appeared first on Fair Observer.

]]>
/world-news/us-news/why-do-americans-want-to-buy-premier-league-football-clubs/feed/ 0
Anatomy of the Mushroom Murders /culture/anatomy-of-the-mushroom-murders/ /culture/anatomy-of-the-mushroom-murders/#respond Tue, 16 Sep 2025 13:12:23 +0000 /?p=157794 It is a horror story without monsters or demons, though evil is certainly present. In the absence of an intelligible motive, Erin Patterson’s alleged triple-murder of her in-laws has elicited incredulity and the familiar fallback of “evil.” This old bromide stands in for a cogent explanation, but psychologists have remained silent, journalists are waiting for… Continue reading Anatomy of the Mushroom Murders

The post Anatomy of the Mushroom Murders appeared first on Fair Observer.

]]>
It is a horror story without monsters or demons, though evil is certainly present. In the absence of an intelligible motive, Erin Patterson’s alleged triple-murder of her in-laws has elicited incredulity and the familiar fallback of “evil.” This old bromide stands in for a cogent explanation, but psychologists have remained silent, journalists are waiting for inspiration and even the lawyers prosecuting Patterson failed to explain her motives. I’ll try. But first, let me describe what is, after all, an extraordinary sequence of events.

Erin Scutter worked for RSPCA Australia (an animal welfare organization) in Melbourne when, in the early 2000s, she met Simon Patterson. They married, and she took his surname. The couple moved to Perth, a city on Australia’s west coast. Erin had earlier inherited 2 million Australian dollars ($1.32 million). She had a son with Simon and enjoyed cordial relations with her in-laws, Gail and Don Patterson. Later, the couple moved to the state of Victoria, ostensibly to be nearer his family. Erin gave birth to a second child but soon lost both her parents to cancer.

Around 2015, Erin and Simon separated amicably, sharing custody of their children. They remained on friendly terms, even vacationing together. But in 2022, Simon filed tax returns listing himself as single, which reduced Erin’s government child support payments. He claimed it was an accounting error, but it is at least possible that Erin blamed him and bore a grudge.

Between November 2021 and September 2022, Simon was hospitalized three times with severe gastrointestinal issues. Physicians never identified the cause, though the symptoms appeared consistent with ingestion of rat poison. Erin maintained friendly relations with both Simon and his family.

In July 2023, Erin invited Simon’s parents, Gail and Don, and his aunt and uncle, Heather and Ian Wilkinson, to lunch at her home in Leongatha. Simon declined. Patterson, who reportedly told her guests she had ovarian cancer, served beef Wellington. All four guests were later admitted to the hospital with gastro-like symptoms. Gail and Heather died first, followed by Don. Only Ian survived.

Police searched Erin’s home and questioned her. In November 2023, she was arrested and charged with three counts of murder and one of attempted murder. She pleaded not guilty, claiming the deaths were a tragic accident. But the jury concluded she had laced the food she served with Amanita phalloides, better known as death cap mushrooms, and found her guilty. The judge sentenced her to life imprisonment (though one imagines she won’t be trusted with kitchen duty).

So far, no one has satisfactorily answered the question: Why would an apparently ordinary woman commit such an extraordinary act of familial homicide?

Why? Why not?

Let me start by turning the question inside out: Why wouldn’t Patterson, a supposedly ordinary woman, kill her relatives? She may have harbored resentment toward her estranged husband after what he called an accounting error reduced her income. Perhaps she didn’t rage at him or his family openly, but silently held a simmering grievance. Rage can be expressed in different ways.

Criminologist Travis Hirschi’s Social Control Theory begins from an unusual premise: People commit crimes not because of irresistible urges, but because the restraints that usually check behavior have weakened. Bonds of attachment, commitment, involvement and belief ordinarily fasten us to society and restrain our behavior.

In Patterson’s case, many of those bonds appear weakened. Her marriage had collapsed. Trust in the extended family was frayed. She’d allegedly engaged in deception, by which I mean fabricating a cancer diagnosis. These are signs of someone unmoored from the attachments and commitments that inhibit transgression. Social Control Theory doesn’t reduce her actions to pathology: It suggests how crime becomes possible when the ordinary prohibitions of social life lose their hold.

It’s conceivable that Patterson suspected that Simon, though estranged and living independently, had met another woman. There is no evidence of this, but even the belief could have shaped her sense of entrapment — hemmed in by disappointment, estrangement or disrespect. The fantasy of removing obstructive relatives may have seemed like a reasonable solution to otherwise insoluble pressures. The weakening of control can’t explain the actual transgression, but it frames it as a distorted response to unbearable experiences.

Unnatural born killers

Killers are not born with murderous intent. They acquire techniques, rationalizations and cues that normalize deviance. Crime is learned behavior. People adopt definitions favorable to lawbreaking through their interactions with others. For Patterson, these lessons may not have come from a criminal underworld, but from subtler sources, like television, books, even casual conversations. Poisoning with mushrooms requires familiarity: lethality, preparation, dosage. Anyone versed in Agatha Christie’s novels knows how cues abound in popular literature. Knowledge, once acquired, makes the step into action conceivable.

Sociologist David Matza’s concept of “drift” adds another layer. Matza argued that people don’t set out to become criminals. (There are exceptions, as anyone familiar with the first line of filmmaker Martin Scorsese’s Goodfellas knows: “As far back as I can remember, I always wanted to be a gangster.”) They drift into deviance, oscillating between conformity and transgression. At times of loosened social bonds or weakened supervision, opportunities for deviance open up and individuals rationalize their acts as temporary departures from the norm.

The Patterson case fits this unsettling model. She may not have begun with a firm resolve to kill, but with smaller transgressions — deceits, manipulations, fantasies. Over time, these slid toward a point where serving poisoned food no longer felt unthinkable but almost natural, even normal. Drift explains the gradual erosion of moral boundaries that can culminate in extraordinary violence.

None of these accounts alone captures Patterson’s motivation. But together they suggest a convergence: weakened social bonds, perceived strains, learned definitions of deviance and a slow slide into moral suspension. This does not yield a neat motive — revenge, resentment or liberation may all have played roles — but it situates the crime in broader social dynamics. What looks incomprehensible becomes, from a sociological perspective, an intelligible sequence of disintegrating bonds, blocked goals, deviant learning and drift toward transgression.

Enduring fascination

If the causes of the crime lie in subterranean processes, the spectacle it created belongs to a different realm. The “mushroom murders,” as they’re colloquially called, were not just a local tragedy. They became global news, followed in real time by audiences far beyond Australia, and soon the subject of a drama series. Why has this case captivated the world?

Since the 19th century, crime has been a staple of mass journalism. The Jack the Ripper murders of 1888 made East London the focus of global headlines and established a template: Lurid crimes, mysterious motives and an insatiable public appetite for detail. The mushroom murders fit into that lineage.

They contained all the elements of narrative drama: family betrayal, exotic poison, survival and death, deception and courtroom revelation. A Sunday lunch, usually a picture of domestic normality, became the setting for spectacular horror. Journalists know instinctively that such juxtapositions of the banal and the grotesque guarantee readership. So do the scriptwriters of the British drama Midsomer Murders, in which charming villages in rural Oxfordshire, England, become the scenes of macabre killings.

The 21st century has seen an explosion of true-crime culture. Streaming platforms, podcasts and documentaries have turned real cases into serialized entertainment. The mushroom murders, with their unusual method and compelling characters, were perfect raw material for this ecosystem. Millions followed the daily updates, not only in Australia, but worldwide, as though consuming a live drama. ABC’s decision to dramatize the case in a television series is less an aberration than the logical next step in a global appetite for crime stories.

Why does crime, especially gruesome crime, hold such enduring fascination? Partly it reassures: By observing the extraordinary, we confirm our own normality. Partly it excites: Transgression, especially in the domestic sphere, exposes the fragility of everyday order. A family lunch is supposed to embody familiarity, friendship and safety. Turning it into an occasion of mass poisoning shatters those assumptions and forces us to ponder what we ordinarily suppress.

We are also drawn to questions of motive. When killers act from greed or desperation, their behavior is explicable, even if repellent. But when motives remain opaque, as in Patterson’s case, curiosity intensifies. The absence of explanation makes the story more haunting. Media interest feeds on that vacuum, replaying details in the hope that a rationale might surface.

Finally, the globalization of media ensures crimes no longer stay local. Satellite news, digital platforms and social media amplify cases that once would have occupied only regional headlines. The mushroom murders became a global spectacle not only because they were sensational, but because the global infrastructure now exists to circulate them instantly. In that sense, the case reveals as much about us and our contemporary media ecology as it does about Patterson.

[Ellis Cashmore’s “” is published by Bloomsbury.]

[ edited this piece.]

The views expressed in this article are the author’s own and do not necessarily reflect Fair Observer’s editorial policy.

]]>
/culture/anatomy-of-the-mushroom-murders/feed/ 0
In Memoriam: Giorgio Armani and the Rise of Designer Culture /culture/in-memoriam-giorgio-armani-and-the-rise-of-designer-culture/ /culture/in-memoriam-giorgio-armani-and-the-rise-of-designer-culture/#respond Sat, 06 Sep 2025 14:12:02 +0000 /?p=157613 Giorgio Armani’s death at the age of 91 marks the passing of more than a fashion designer. Armani personified a cultural shift in the 1980s, a decade when the idea of style became inseparable from identity, social aspiration and a form of social drama. To wear Armani, or at least, clothes that resembled Armani, was… Continue reading In Memoriam: Giorgio Armani and the Rise of Designer Culture

The post In Memoriam: Giorgio Armani and the Rise of Designer Culture appeared first on Fair Observer.

]]>
Giorgio Armani’s death at the age of 91 marks the passing of more than a fashion designer. Armani personified a cultural shift in the 1980s, a decade when the idea of style became inseparable from identity, social aspiration and a form of social drama.

To wear Armani, or at least, clothes that resembled Armani, was to make a statement about taste, achievement and modernity. His minimalism, sophistication and attention to detail transformed clothing into an instrument of social expression and lifestyle into a theatrical performance. In this sense, Armani exemplified how culture itself became a stage, influencing not just what people wore but how they thought, felt and, indeed, lived.

Style democratized

Born in Piacenza, south of Milan, in 1934, Armani was the second of three children. He dropped out of medical school and moved to Milan in the 1950s, joining the design team at the luxury department store La Rinascente as a window dresser. Despite lacking formal fashion training, he honed an instinct for style and proportion that would define his work. After a period designing for Nino Cerruti’s Hitman line, he founded his own fashion house in 1975 with architect and partner Sergio Galeotti.

Armani’s philosophy was simple but, in its own way, revolutionary: Suits should be easy to wear, comfortable and full of swagger. His unstructured take on men’s jackets broke with the fitted tailoring of the French designers who dominated men’s fashion in the 1960s and 1970s, creating a more relaxed style, understated yet aspirational.

Armani’s breakthrough came with American Gigolo (1980), written and directed by the American screenwriter Paul Schrader. American actor Richard Gere’s performance was inseparable from the Armani wardrobe he wore, demonstrating for the first time how clothing could project an identity on screen. Armani became synonymous not only with taste, but also with a new kind of social visibility: Style as social capital, enacted through a carefully constructed persona.

This was reinforced by Armani’s move to Los Angeles in 1983: He was the first designer to open an office there with the explicit goal of dressing Hollywood actors and other celebrities. The likes of Michelle Pfeiffer, Jodie Foster and Anjelica Huston became ambulant advertisements for Armani, updating the historical link between Hollywood glamor and fashion. But this was not unattainable haute couture: Potentially, anyone could dress in designer clothes and carry them off. Style was, in a way, democratized.

The diffusion of Armani’s style went beyond the red carpet. Television shows such as Miami Vice turned designer clothes into an aspirational shorthand for taste, success and lifestyle. The designer wardrobes and Italian cars of James “Sonny” Crockett and Ricardo “Rico” Tubbs were visible markers: The clothes didn’t simply cover the actors; they validated a way of living. The cop drama used some Armani clothes, but also drew on his rivals, Hugo Boss and Gianni Versace, among others, as well as his former mentor Cerruti. All were described with the key adjective, designer.

The 1980s was the designer decade. The term itself evoked the kind of sophistication and social distinction, in Pierre Bourdieu’s sense of taste as a marker of social differentiation, to which the cognoscenti aspired. Clothing, media and even household products coalesced into a new social ecosystem, one in which image, style and personal branding could speak more articulately about people than their class, ethnicity, background or anything else.

Armani hotels, restaurants, homeware and beauty lines exemplified the integration of fashion into lifestyle. American rock band Blondie’s lyric “roll me in designer sheets” on “Call Me” (on the American Gigolo soundtrack) echoed this: A commonplace bedroom item could signify taste and cultural capital, merely by its label.

Liquid modernity and the cultural turn

We can view Armani’s significance through the lens of Pierre Bourdieu, whose concepts of habitus and distinction help explain how style operates as a social display. Habitus describes the ingrained dispositions and practices that structure how individuals move through society. Distinction refers to how taste functions as a marker of social differentiation.

Armani’s restraint, subtly signaling elegance without ostentation, exemplifies distinction in action: The wearer communicates the social capital they’ve accumulated and the taste they’ve acquired; they can project identity and status through clothing. (Social capital refers to the value and prestige associated with certain cultural markers, such as designer clothing.)

In the context of the 1980s, this aligns with Zygmunt Bauman’s notion of liquid modernity. Social categories were becoming less fixed, lifestyles more flexible and identity pluralized, so we could swap and change how we thought about ourselves and how we wanted others to regard us as we passed from context to context. Armani’s clothes allowed individuals to navigate this fluidity: An Armani suit, shirt or gown was more than a garment for display. It allowed wearers to inhabit the person they wanted to be.

The aforementioned American Gigolo and Miami Vice dramatized this. They showed a society in which visible image was not incidental to social life, but absolutely central, a society where style articulated ambition, mobility and cultural distinction. Scholars of media and fashion have long recognized such screen fictions as templates for the diffusion of designer culture, demonstrating how values and aesthetics could circulate through screens and into everyday life. Armani’s impact was not merely sartorial but social: He participated in the creation of a mediated culture in which taste and lifestyle became determinants as well as reflections of identity and social status.

Armani’s rise coincided with the moment of what academics later called the “cultural turn.” Until then, the prevailing view was that the economy (the production of goods and the circulation of money) was the decisive force shaping society. The cultural turn reframed this, showing that culture, style and everyday practices could be just as powerful in shaping how we live.

Here, Bourdieu and Bauman converge: The fluidity of liquid modernity was navigated through acts of taste, and Armani provided a kind of grammar for those acts. His influence was thus structural: He did not simply clothe bodies, he helped articulate a society increasingly defined by image, style and the symbolic markers of distinction.

Beyond clothing

Armani crystallized a moment when culture itself was acknowledged as a structuring force in society. The 1980s cultural turn represented a broader shift in social life and the way we study it. Aesthetic choices, media and consumption became potent instruments of identity, aspiration and social negotiation. Armani was part of this shift. Through his designs, his brand and, indeed, his cultural presence, he created a method for the accrual of social and cultural capital.

In this sense, Armani was not only a designer but a cultural agent, a figure whose influence illuminates how performance, taste and social navigation were central to modern life. His passing marks the end of an era, but the culture he helped codify continues to shape how we dress, present ourselves and understand social distinction today.

[Ellis Cashmore’s “” is published by Bloomsbury.]

[ edited this piece.]

The views expressed in this article are the author’s own and do not necessarily reflect Fair Observer’s editorial policy.

]]>
/culture/in-memoriam-giorgio-armani-and-the-rise-of-designer-culture/feed/ 0
Ken Loach: Auteur as Agent Provocateur /culture/ken-loach-auteur-as-agent-provocateur/ /culture/ken-loach-auteur-as-agent-provocateur/#respond Tue, 19 Aug 2025 13:59:32 +0000 /?p=157253 In a morbidly ironic turn of events, English filmmaker Ken Loach’s latest public stand came almost at the same time as the death of Ray Brooks, the actor who played in Cathy Come Home (1966), Loach’s breakthrough television drama. Last weekend, Loach refused to accept the 2025 Gran Premio Torino Lifetime Achievement award at the… Continue reading Ken Loach: Auteur as Agent Provocateur

The post Ken Loach: Auteur as Agent Provocateur appeared first on Fair Observer.

]]>
In a morbidly ironic turn of events, English filmmaker Ken Loach’s latest public stand came almost at the same time as the death of Ray Brooks, the actor who played Reg in Cathy Come Home (1966), Loach’s breakthrough television drama. Last weekend, Loach refused to accept the 2025 Gran Premio Torino Lifetime Achievement award at the Turin Film Festival. The Turin dispute centered on claims that cleaning and security staff at the city’s National Film Museum, the festival’s parent body, had suffered wage cuts and dismissals after the organization outsourced their services.

Loach, scheduled to receive the award and attend screenings, withdrew with “great regret,” saying he couldn’t accept a prize “while the most vulnerable have lost their jobs for their opposition to a pay cut.” The museum insisted the outsourcing followed legal tendering rules and that it bore no responsibility for contractors’ staffing decisions. Loach was not convinced and remained unmoved. He is, after all, as much an agent provocateur as an auteur.

Art and activism

To anyone who has followed his career, the gesture was no surprise. Loach has always been a political filmmaker for whom art and activism are the same thing. Born in the English Midlands town of Nuneaton in 1936, the son of an electrician and a factory worker, Loach studied law at Oxford but gravitated toward theater. After a stint in the Royal Air Force, he joined regional repertory companies.

He later moved into television direction at the British Broadcasting Corporation (BBC), working on the police series Z Cars before finding his stride with The Wednesday Play. This drama anthology gave Loach the chance to develop his distinctive blend of improvisation, non-professional acting and social realism.

Cathy Come Home was game-changing television: A harrowing portrayal of a young family’s slide into homelessness, watched by over 12 million viewers, the drama caused public outrage, provoked debate in Parliament and is credited with helping to launch the housing charity Shelter. Loach’s subsequent films, from Kes (1969) to Riff-Raff (1991) to I, Daniel Blake (2016), have consistently foregrounded the lives of the working class, the unemployed and the dispossessed, often tackling issues of welfare, labor rights and social injustice.

Alongside critical acclaim — two Palme d’Or wins, British Academy Film Awards (BAFTAs) and numerous festival honors — Loach has attracted political enemies and predictable accusations of bias. Yet he has never softened his tone to widen the appeal of his work. The Turin boycott is consistent with a lifelong pattern: A refusal to compartmentalize the personal, the political and the artistic. In Loach’s oeuvre, the filmmaker’s moral responsibility extends beyond the set or editing room, reaching into choices about how and from whom to accept recognition.

Social realism as weaponry

People often describe Loach’s films as “social realist,” but that term can sound clinical, even dry, until you see his art. Social realism in its art-historical sense emerged in 1930s America, particularly in painting and photography, as a way to depict everyday working-class life with brutal, unvarnished honesty, often with an undertone of protest and sometimes with obvious statements of it.

Loach introduced that spirit into British television and cinema, drawing on the post-war British realist tradition, but stripping away the theatrical flourishes. For Loach, social realism was not merely a style; it was a political instrument.

The technique begins with casting. Loach favored, and indeed still favors, non-professionals or actors with lived experience of the worlds they’re portraying. He hands out scripts piecemeal to encourage “workshopping”: spontaneous, unguarded, perhaps random reactions. This method is as much about preserving authenticity as it is about subverting artifice.

Loach didn’t simulate the memorable scene in Kes where schoolboys are beaten on the palms: He directed real punishment to elicit a believable reaction. It’s a tactic that’s drawn criticism for its toughness, but one that reveals Loach’s priority: Truth over comfort.

Then there’s his visual language. Loach rejects grandiose camera movements, studio lighting and manipulative musical cues. The camera sits quietly in the room, letting characters breathe, while natural light and ambient sound anchor the scene in a recognizable world. The effect is immersive without being ostentatious. He doesn’t invite anyone to admire the cinematography: Everyone should inhabit the moment.

The didactic intent is never far from the surface. Cathy Come Home wasn’t just a drama: It was a reminder about homelessness, as urgent as any charity appeal. Which Side Are You On? put striking coalminers on national television speaking their own words. I, Daniel Blake (2016) offered a human face to the bureaucratic cruelty of welfare reform, prompting headlines and policy debates. For Loach, the point of realism is not just to depict suffering, but to link it as if by a causal chain to power and to mobilize audiences to think and, better still, act.

This approach situates Loach in a long lineage of artists who have weaponized their medium. Diego Rivera’s murals in Mexico blended portraiture and revolutionary history to inspire workers. Bertolt Brecht used theater to jolt audiences out of passive consumption and into an alarming awareness. Jacob Riis’s late-19th-century photographs of New York tenements were both journalism and advocacy. In each case, the art was inseparable from the politics, not just in content but in the method of production and the intended effect on the viewer.

Loach has often said he mistrusts “neutral” art, arguing that inaction is itself a political stance. For him, film is not a mirror held up to society: It is more like a can-opener, something to pry open the public conscience. This belief explains both his cinematic style and his readiness to walk away from accolades when they clash with his principles. In his hands, social realism becomes a two-edged tool: A faithful witness to lived experience and a prod to collective action. 

Like other social realists, Loach measures artistic success not in aesthetic terms alone but by the extent to which it unsettles the status quo. 

Can we separate the art from the artist?

The Loach–Turin episode prompts an age-old question in cultural criticism: Should we, or even can we, separate an artist’s work from the artist’s beliefs, politics or personal conduct? The temptation is to imagine art as an independent, free-floating entity, available for private enjoyment, detached from the person who produced it and the circumstances in which it was created. But it’s a temptation we should resist, especially in Loach’s case: The separation is almost impossible anyway, not because his private life seeps into his work, but because the work is his politics.

Unlike, say, Richard Wagner, whose operas can be and often are staged without reference to his antisemitism, or Pablo Picasso, whose cubist innovations can and do receive acclaim without endorsing his behavior toward women, Loach integrates his convictions into every one of his films in such a way that they are inescapable.

The characters, stories and even the austere visual style are political choices. To appreciate I, Daniel Blake without engaging with its critical evaluation of the British welfare bureaucracy is to miss its central purpose and render the experience of watching the film meaningless. Loach would argue this is not art despite the politics, but art because of the politics.

Yet the art world offers many examples where this boundary between creator and creation is problematic. Michael Jackson’s music continues to fill airwaves and dance floors despite allegations about his conduct, and he still commands the adoration of legions. Roman Polanski, convicted in the US in 1977 of a crime involving a 13-year-old girl, still inspires respect for films such as Chinatown and The Pianist. Wagner’s operas, infused with elements of German nationalism and antisemitic undertones, were beloved by Hitler, yet they remain integral to the classical canon.

In each case, audiences, critics and institutions face the daunting task of deciding whether appreciating art implies endorsing its maker and whether they can separate art’s aesthetic value from the moral or political failings of its creator.

What distinguishes Loach from many in this company is that his films demand that the audience join him on his home political ground. You can listen to a Jackson song without thinking about his life, or wander through a Picasso retrospective focusing purely on color and form. Few cineastes would dispute that much of Polanski’s work is magnificent.

But with Loach, it’s impossible to uncouple art from politics. Rejecting his worldview while embracing his work is like admiring a beautiful tapestry while knowing the dye that colored it poisoned the workers who made it: The latter defiles the former. If you don’t go along with Loach’s moral arguments, you can’t treasure his films. Politics is not part of the art: It is the art.

This plaiting clarifies why Loach has both fervent admirers and equally fervent detractors. Those who share his concerns about inequality, austerity and war find his films rewarding and urgent. Those who don’t, or who disagree with his methods or conclusions, find them tendentious. But perhaps that polarity is part of his artistic legacy: To force the question of whether art should comfort the audience, or agitate it.

Loach’s work reminds us that the debate about “separating art from artist” is not binary. For some creators, separation is a plausible strategy. For others, Loach in particular, the art and the artist are fused inseparably. Whether that makes his films more admirable or more troublesome depends entirely on the viewer’s willingness to engage with or reject the world he so uncompromisingly presents.

[Ellis Cashmore’s “” is published by Bloomsbury.]

[ edited this piece.]

The views expressed in this article are the author’s own and do not necessarily reflect 51Թ’s editorial policy.

The post Ken Loach: Auteur as Agent Provocateur appeared first on 51Թ.

Bill Clinton, Monica Lewinsky and the Politics of Spectacle /world-news/us-news/bill-clinton-monica-lewinsky-and-the-politics-of-spectacle/ /world-news/us-news/bill-clinton-monica-lewinsky-and-the-politics-of-spectacle/#respond Sat, 09 Aug 2025 14:32:35 +0000 /?p=157116 Politics changed forever 27 years ago. No election, assassination or international summit marked the shift. No tanks rolled, no walls fell. Yet a transformation occurred, not in America’s laws or institutions, but in how power was experienced, watched and consumed. Politics shed its sacred aura, became disconcertingly familiar and began to feel unmistakably like the… Continue reading Bill Clinton, Monica Lewinsky and the Politics of Spectacle

The post Bill Clinton, Monica Lewinsky and the Politics of Spectacle appeared first on 51Թ.

Politics changed forever 27 years ago. No election, assassination or international summit marked the shift. No tanks rolled, no walls fell. Yet a transformation occurred, not in America’s laws or institutions, but in how power was experienced, watched and consumed. Politics shed its sacred aura, became disconcertingly familiar and began to feel unmistakably like the kind of entertainment we were used to watching on television.

On August 17, 1998, after months of denials, US President Bill Clinton admitted to a grand jury: “I did have a relationship with Ms. [Monica] Lewinsky that was not appropriate. In fact, it was wrong.”

Sex scandals in American politics were certainly nothing new. President John F. Kennedy’s affairs remained whispered rumors, never televised. Gary Hart, daring reporters to follow him, sank like a stone when they did. Even Clinton himself had navigated earlier allegations from women, namely Gennifer Flowers and Paula Jones, that might have ended another politician’s career. But Monica Lewinsky was a different proposition. She wasn’t merely another woman; she was the central, unwitting protagonist in an international psychodrama.

What set her affair with Clinton apart wasn’t the sex, juicy as that was. It was the unprecedented, raw access: the leaked transcripts, the damning voicemail, the infamous navy blue dress. This wasn’t just a scandal; it was a high-definition spectacle, delivered directly to every household in real time.

Accidental celebrity

Clinton made history by becoming the first sitting president to testify before a grand jury as the target of a criminal investigation. The questions were deeply personal and, at times, vulgar; the setting borderline surreal. Beamed from the White House via closed-circuit TV, Clinton answered prosecutors’ questions with lawyerly evasion and painstaking, almost excruciating, phrasing.

In one memorable exchange, the prosecutor asked: “Mr. President, do you understand that the statement that there ‘is’ no sexual relationship, an improper sexual relationship, or any other kind of improper relationship, could be false if indeed there was one, even though it’s in the past?” Clinton’s convoluted response became an instant cultural touchstone: “It depends on what the meaning of the word ‘is’ is. If the—if he—if ‘is’ means is and never has been, that is not—that is one thing. If it means there is none, that was a completely true statement.”

Following his grand jury appearance, Clinton delivered a televised address to the nation. It was short, stiff and heavy with legalisms. He admitted the relationship had been “not appropriate” and that he had misled people, “including even [his] wife.” He appeared unsettled yet spoke with an underlying defiance. The nation and indeed the world remained transfixed, unsure how to feel — disgusted, tantalized or simply impressed by Clinton’s audacious bravado.

Three days later, on August 20, American cruise missiles struck targets in Sudan and Afghanistan. Officially a response to the East Africa embassy bombings, Operation Infinite Reach was immediately dubbed a distraction. Jokes were made comparing these events to the previous year’s comedy film, Wag the Dog; in the film, a government spin doctor (Robert De Niro) and a Hollywood producer (Dustin Hoffman) work to fabricate a war in Albania to distract the public from a presidential sex scandal. It was, perhaps, the first time in history a significant international military action found itself relegated to a mere footnote in a domestic sex scandal.

What held this entire spectacle together, making it so utterly compelling, was Clinton himself. He wasn’t imposing like President Ronald Reagan, patrician like President George H.W. Bush or saintly like President Jimmy Carter. Clinton was fundamentally different. He possessed the easy manner of a man you might chat with in a Walmart checkout line — someone seemingly knowable, perhaps even someone who might flirt with you. His flaws, his all-too-human messiness, ironically, made him the first truly relatable president. That quality, once unthinkable in a commander-in-chief, now became an unexpected asset.

The age of the spectacle

By the end of that August, America’s political culture had undergone a quiet yet profound and lasting transformation. The presidency, once associated with distance and solemn dignity, had become a pivotal component in the nation’s entertainment machinery.

By the late 1990s, America was already a nation expertly “trained in watching.” Talk shows routinely blurred the line between confession and performance. Paparazzi relentlessly pursued not just film stars but, increasingly, television personalities. Shows like The Jerry Springer Show packaged dysfunctional families as primetime entertainment. Stores now offered more than groceries — they stocked America’s new unholy secular scriptures: glossy weekly gossip magazines like People and National Enquirer. Into this readied landscape stepped Lewinsky: intern, lover, national punchline and, ultimately, a reluctant protagonist in the most-watched real-life soap opera the world had ever seen.

But to grasp how Monica became Monica™ — a name that, for a time, needed no surname — we need a brief glance at the preceding cultural landscape. Few figures shaped that terrain more dramatically than Madonna. Throughout the 1980s and ‘90s, the diva transcended mere pop stardom; she was a cultural agent provocateur who taught audiences how to look, how to stare and, crucially, how not to look away.

She turned taboo into a trending topic years before hashtags even existed. Whether on stage, publishing her explicit 1992 book, Sex, or using the word “fuck” repeatedly on the Late Show with David Letterman, Madonna didn’t just push boundaries — she dissolved them. More significantly, she made it respectable, even desirable, to gaze intently… and to enjoy the spectacle.

By the time Clinton’s affair was exposed, the public was ready. What once might have been muttered discreetly became common watercooler chat. And the media, by then no longer deferential gatekeepers but increasingly predatory content chasers, knew how to satisfy the appetite for tittle-tattle. Monica™ was like a gift from heaven.

Clinton’s scandal wasn’t merely covered; it was serialized. It possessed a clear structure, escalating suspense, compelling secondary characters (like civil servant Linda Tripp and attorney Kenneth Starr) and even unexpected wardrobe plot points. Lewinsky’s semen-stained blue Gap dress transcended mere evidence, as did a cigar Clinton used as a sex aid. They became pervasive cultural references, almost sacred objects in a new age of scandal. The narrative had sex, power, concealment, betrayal and a president who, with every denial, seemed only to get more intriguing.

In an earlier era, shameful exposure meant indelible disgrace, dishonor and often everlasting stigma. But shame was in the process of being redefined. It might still have felt temporarily humiliating, but it carried no lasting loss of respect or esteem, and the disgrace was far from indelible: It was quickly effaced. With the rapid ascendance of celebrity culture, shame seemed oddly out of place. Becoming famous by any means necessary was quickly becoming a legitimate career aspiration, and shame, at times, was simply accepted as collateral damage.

Lewinsky became an accidental celebrity: a woman who, by her own later account, lost not just her privacy but her “reputation and dignity and … almost [her] life.” Clinton, meanwhile, seemed to float above it all, protected less by institutional power than by his sheer attractiveness, an undeniable charisma and an audience seemingly too rewarded by his very human antics to abandon him.

It’s easy to categorize the scandal as purely political, and of course, it did have political consequences. But at its heart, it belonged less to Washington, DC, than to global popular culture. The public wasn’t shocked by what Clinton did; it was utterly captivated by the unprecedented access. People were allowed to watch it all unfold. The real revelation wasn’t about morality; it was about media. The affair didn’t signal the fall of a president; it heralded the rise of the culture of spectacle.

Scandal fatigue

“If you can’t trust the president to tell the truth, who can you trust?” an incredulous reporter asked. But for much of the public, that question entirely missed the point. By then, Clinton was no longer being measured by old-fashioned virtues like trustworthiness or reliability, but by his performance.

Remarkably (perhaps), his approval ratings spiked after he admitted to the Lewinsky affair. This wasn’t despite the scandal: it was, in a perverse way, because of it. His transgression became fused with his relatability, even his disarming authenticity. The public was so exhausted by the continual prurient allegations against the president that what might have started as shock or indignation became an agreeable distraction. “Scandal fatigue” was the term used to describe the cultural desensitization.

He lied, he squirmed, he strangled grammar (as demonstrated previously when he defined the word “is”). But he did it all in plain sight. For a public raised on The Oprah Winfrey Show, The Geraldo Rivera Show and the confessional stylings of reality TV, that transparency almost felt honest. (Today, of course, we are all habituated to US presidents who lie, squirm and strangle grammar.)

Lewinsky, meanwhile, was publicly and savagely destroyed. “I was patient zero of losing a personal reputation on a global scale,” she reflected years later, keenly aware of the Internet’s embryonic yet devastating role in her humiliation. Her name became a cipher for shame, a global punchline in a thousand late-night monologues. Yet, in time, she courageously reclaimed her voice, emerging not as an object of scandal but as a speaker, writer and activist against cyberbullying. If Clinton represented the survival of political power through personal disgrace, Lewinsky came to represent something arguably more modern and profound: the possibility of a woman surviving a potentially global scandal and, in the process, discovering agency.

The end of privacy

Perhaps the most enduring legacy of August 1998 wasn’t political or purely personal. It was cultural: the irrevocable departure of the concept of a “private life” for public figures and, eventually, for virtually everyone. Clinton’s affair and the ravenous media machinery it cranked into life were features of a nascent era in which visibility became permanent, intimacy became endlessly shareable and secrets became monetizable. And everyone was left asking and answering a question: If the most powerful man in the world couldn’t conceal an affair, who the hell could?

Fast-forward to July 2025. At a concert performed by rock group Coldplay in Foxborough, Massachusetts, the jumbotron’s kiss-cam pans to a couple sharing what appears to be an intimate moment. The image flashes on massive screens across the stadium. The woman recoils, visibly embarrassed, as she realizes she’s been caught on camera. Coldplay frontman Chris Martin even comments on the scene. Within hours, the video of the brief encounter goes viral across social media. Reddit threads speculate wildly about a potential affair as TikTokers frantically try to identify the pair. X explodes with memes. No one, anywhere, pauses to ask if the exposure is fair or proper. The story isn’t about morality.

That fleeting moment, brief yet dramatic and seemingly random, is connected to August 1998 by a kind of molecular chain. It serves as a gentle reminder that the rules, such as they were, have fundamentally changed. There is no on-stage versus off-stage anymore. No quiet corner of life remains immune to broadcasting. There is no longer true privacy. We are all potentially “that woman” or “that man” now — framed, packaged and offered for the casual delectation of anyone. We are all shareable now. And today, we are so accustomed to it, we don’t notice. And, if we did, large demographics wouldn’t care. Generations Y and Z are products of the post-private era.

Clinton was the first president of that era. He was a politician who smudged the demarcation lines between statesman and spectacle, between leadership and sheer showmanship. He didn’t fall from grace so much as slide into a new kind of fame, the kind in which the fall itself was an essential part of the entertainment. The sleazy kind.

Lewinsky, more than anyone, bore the cost. She didn’t crave celebrity status; it was affixed to her. The affair, the dress and the endless denials weren’t just political moments. They were cultural markers, showing the world that no one, not even the president of the US, is exempt from unwelcome, permanent exposure.

[Ellis Cashmore’s “” is published by Bloomsbury.]

[ edited this piece.]

The views expressed in this article are the author’s own and do not necessarily reflect 51Թ’s editorial policy.


Is Diane Abbott Right? /politics/is-diane-abbott-right/ /politics/is-diane-abbott-right/#respond Fri, 18 Jul 2025 15:06:43 +0000 /?p=156852 In April 2023, Diane Abbott, the UK’s first Black female Member of Parliament (MP), wrote a letter to The Observer that caused a political storm. In it, she distinguished the racism experienced by “people of color” from the prejudice suffered by Jewish, Irish and Traveller communities. Her words were blunt: “They undoubtedly experience prejudice. This… Continue reading Is Diane Abbott Right?

The post Is Diane Abbott Right? appeared first on 51Թ.

In April 2023, Diane Abbott, the UK’s first Black female Member of Parliament (MP), wrote a letter to The Observer that caused a political storm. In it, she distinguished the racism experienced by “people of color” from the prejudice suffered by Jewish, Irish and Traveller communities. Her words were blunt: “They undoubtedly experience prejudice. This is similar to racism and the two words are often used as if they are interchangeable. It is true that many types of white people with points of difference, such as redheads, can experience this prejudice. But they are not all their lives subject to racism.”

The backlash was immediate and severe. Sir Keir Starmer, leader of the Labour Party, called the letter “antisemitic.” Abbott was suspended and lost the party whip, effectively removing her from the parliamentary Labour group. She later apologized “unreservedly” for the letter, withdrew its content and said the drafting had been an “initial version” sent in error. Despite this, she has remained outside the Labour fold.

The controversy raises a deeper question: Was she justified in distinguishing between different forms of discrimination? The assumption behind the criticism is that all forms of racism and prejudice are equal and must be treated as such. But Abbott’s argument, clumsily expressed, perhaps, suggests that racism is far from a uniform phenomenon. It can take many forms, some structural, others cultural, some visible, others hidden. The character of racism shifts according to time, place and other contextual variables. We might properly refer to the plural racisms.

Abbott’s argument was not that antisemitism, anti-Irish or anti-Traveller prejudice were not real, damaging or impactful, but that racism directed at people of color, particularly Blacks and South Asians, is historically different in Britain, persists in surreptitious yet systemic ways and continues to have consequences. The question is not whether Abbott’s wording was offensive. It is whether her central claim has merit.

Race, racism and visibility

Since the postwar era, Britain has seen significant immigration from its former colonies in Africa, the Caribbean and South Asia. These migrants were visibly different from the white majority and quickly became targets of both social hostility and institutional exclusion. Even state policy itself enacted what was, in the 1960s, called racialism or racial discrimination. Housing policies, police practices and labor market discrimination routinely disadvantaged them. This was not simply personal animus. It was systemic and legal. At least until 1965, when the Race Relations Act outlawed “racial discrimination.” The legislation’s scope was expanded in 1968 and, again, in 1976.

Abbott’s point is that this form of discrimination was not just prejudice: it was racism, specifically designed to maintain a racial hierarchy with white Britons at the top. The “weaponization” of the spurious concept of race, as Abbott implies, was a powerful tool for dispossessed whites seeking scapegoats for their misfortunes, as unemployment rose.

Abbott’s key distinction is visibility. Jews, Irish and Travellers have faced serious and sometimes violent discrimination in British history. But, for the most part, they were not physically distinguishable from other white Britons. This did not make their suffering any less real, but it did make it easier, in some circumstances, to “pass” or blend into mainstream society. Black and South Asian people had no such option. They were and remain instantly legible as Other in a society that codes whiteness as normal. 

This visibility has and continues to have profound consequences. Racial profiling, media stereotyping, criminalization, over-policing and underrepresentation are all facilitated by the ability to mark people visually. Discrimination against Jews or Irish people has typically taken different forms, often cultural, religious or ethnic rather than strictly racial. The Holocaust and pogroms across Europe demonstrate that antisemitism can be just as deadly. But the mechanisms of marginalization differ.

Abbott’s error

Abbott’s error was in appearing to rank oppressions, a dangerous misstep given the severe and distinct forms of prejudice faced by various groups. Yet her underlying insight holds considerable weight: racism directed at people of color, particularly those of African and Caribbean descent, in Britain was historically and globally justified by theories about biological difference.

These theories were based on the premise that the world’s population was divisible into distinguishable groups called “races,” a concept then used to enforce a strict hierarchy. This racialized thinking gave anti-Black discrimination a unique, pervasive force — deeply embedding itself in institutions, legal frameworks and social structures over centuries, spanning slavery, colonialism and post-colonial migration.

Discrimination against Jews or the Irish, while undeniably severe and sometimes involving racialized caricatures and theories of their “inferiority” at certain historical junctures, did not develop with the same epic duration or global systemic reach as the ideologies underpinning the transatlantic slave trade and its enduring legacies.

The visible and indelible nature of Blackness, forged under conditions of chattel slavery and subsequent systemic oppression, solidified a distinct form of racial hierarchy that made it virtually impossible to escape through assimilation, distinguishing it from the historical prejudices faced by other white ethnic groups who, over time, often achieved greater integration.

The politics of antisemitism

Antisemitism is not just a relic of the past. In Britain, the US and across Europe, it remains a live threat, as evidenced by the recent spike in antisemitic incidents during periods of the Middle East conflict. The Labour Party, under Jeremy Corbyn’s leadership, was beset with allegations that it had tolerated or failed to respond adequately to antisemitism in its ranks. The 2020 Equality and Human Rights Commission (EHRC) report concluded that Labour had breached equalities law, a damning indictment that forced internal reform and public apologies.

This context partly explains the reaction to Abbott’s letter. Her remarks were interpreted as minimizing the suffering of Jewish people, undermining the seriousness of antisemitism and echoing the logic of those who sought to relegate it to a subsidiary type of bigotry. The optics were politically abominable, not least because Labour was attempting to rebuild trust with Jewish communities under Starmer’s leadership.

Yet, in this hypersensitive political climate, details risk being lost. Abbott did not deny that Jews face prejudice. Her argument was not dismissive; she tried (in an admittedly imprecise and perhaps inadvertently provocative way) to distinguish between the types of discrimination different groups confronted. Her critics responded as if any deviation from the view that all prejudice is equal is itself a form of bigotry.

But this homogenizing approach obscures, rather than clarifies, understanding. Racism is not a single experience or a uniform category. It is historically contingent, culturally shaped and unevenly distributed. We might properly use the plural racisms.

No stranger to controversy

Abbott is no stranger to controversy. As a pioneering Black politician, she has often been held to impossible standards, facing racist abuse pretty much throughout her career. According to Amnesty International, she has the unenviable distinction of receiving more online hate than any other female politician. Her suspension from Labour can’t be separated from this longer history of scrutiny, marginalization and misrepresentation.

Yes, Abbott’s letter was inelegant in its expression. But her fundamental argument that racism directed at people of color in Britain has a different genealogy and impact than other forms of prejudice deserves more than disciplinary action. It deserves critical discussion. There is a risk that, in policing speech about racism and other bigotries, we enforce a new orthodoxy that brooks no difference of analysis or perspective.

More broadly, the episode reflects a troubling shift in British political culture: Away from thoughtful engagement and toward thoughtless punishment. Rather than explore whether Abbott’s claims had intellectual, historical or empirical grounding, party leaders moved hastily to condemn and exclude. This may have restored short-term political capital, but it did little to advance public understanding of racism or, for that matter, anti-racism.

Abbott was readmitted to the parliamentary party more than a year later. Her case has become, in a sense, symbolic, not just of Labour’s internal politics, but of the wider struggle over how racism is defined, by whom and to what ends. And how we should respond to it, even in its vestigial forms. In a society still challenged by its colonial past and present inequalities, these questions can’t be settled by fiat.

So we return to the original question: Was Diane Abbott right? No, not in every word. But in her central claim that racism against people of color in Britain has distinct historical roots and lived consequences, she was not wrong either. While all forms of discrimination are harmful and abhorrent, Abbott’s thoughts prompt an uncomfortable examination of the precise language we use to describe interlocking systems of oppression. 

[Ellis Cashmore’s “” is published by Bloomsbury. One of his earlier books is “”.]

[ edited this piece.]

The views expressed in this article are the author’s own and do not necessarily reflect 51Թ’s editorial policy.


Rock Music is Sickening, Disgusting, Filthy, Repugnant, Unpleasant and Odious – That’s the Point /more/environment/rock-music-is-sickening-disgusting-filthy-repugnant-unpleasant-and-odious-thats-the-point/ /more/environment/rock-music-is-sickening-disgusting-filthy-repugnant-unpleasant-and-odious-thats-the-point/#comments Thu, 03 Jul 2025 12:58:40 +0000 /?p=156128 Rock ’n’ roll didn’t simply shake the cultural landscape; it caused an earthquake — a sudden rupture whose aftershocks reverberated from the 1950s onward. At its epicenter was Elvis Presley. A white Southerner who borrowed, filtered and, perversely, embodied black musical traditions, including gospel, blues and even swing, Elvis made palatable (and sellable) what white… Continue reading Rock Music is Sickening, Disgusting, Filthy, Repugnant, Unpleasant and Odious – That’s the Point

The post Rock Music is Sickening, Disgusting, Filthy, Repugnant, Unpleasant and Odious – That’s the Point appeared first on 51Թ.

Rock ’n’ roll didn’t simply shake the cultural landscape; it caused an earthquake — a sudden rupture whose aftershocks reverberated from the 1950s onward. At its epicenter was Elvis Presley. A white Southerner who borrowed, filtered and, perversely, embodied Black musical traditions, including gospel, blues and even swing, Elvis made palatable (and sellable) what white America had previously either ignored or condemned. His voice hinted at the sensuality of the Black church; his hips were denounced as pornographic. But Elvis’s real subversion was racial. Here was a white man singing like he was Black and, worse, moving like it too.

Sam Phillips, the owner of Sun Records, for whom Elvis recorded his early material, is often credited with musing: “If I could find a white man who had the Negro sound and the Negro feel, I could make a billion dollars.”

In postwar America, conformity was the air people breathed. It was the era of Levittown suburbs, Chevrolet Bel Airs and nuclear family orthodoxy. That conformity, starchy, restrictive and suffocating, has been memorably captured in Richard Yates’s novel Revolutionary Road (1961), later adapted into a film featuring Leonardo DiCaprio and Kate Winslet, which explored the desperate yearning for escape beneath the surface of middle America in the 1950s.

Civil rights were barely on the horizon; feminism was still second-wave future tense. Against this backdrop, rock ’n’ roll didn’t just sound unlike anything else: it felt transgressive. And it didn’t pop out of a cultural vacuum. Film had already incubated the youth rebellion. Asked what he was rebelling against in the film The Wild One, Marlon Brando replied: “What have you got?”

James Dean died in a car crash in 1955 at age 24, after starring in Rebel Without a Cause. His death cemented his legend as the ultimate symbol of tragic youth, living fast and dying young. But it was in rock music that young people discovered a new kind of rebellion: they didn’t just listen to it; they danced to it, wore it and shrieked at the bands that played it. Rock music both surrounded and penetrated them.

The outrage that greeted Elvis’s performances (particularly his shamelessly “indecent” hip swivel: he was known as “Elvis the Pelvis”) was the genre’s original moral panic. Parents feared their children would be corrupted by Elvis and the “jungle music” he purveyed. That was the point. Rock was supposed to worry people. 

And what made it dangerous wasn’t the music alone: it was the race politics hidden in plain sight. In this sense, the genre was born already in disguise: Black art, white faces, sold as new. What Elvis launched wasn’t just a sound or even a look: it was a method, a way to disguise rebellion as pleasure.  

Dylan and the politics of protest

By the 1960s, rock had swapped its leather jacket and drainpipe bluejeans for placards and newsboy caps. As America agonized over the nuclear bomb, civil rights and the Vietnam war, a generation of musicians appeared. They were a bit like troubadours, but their aim was not just to entertain, but to educate. Enter Bob Dylan. He didn’t invent the genre that became known as protest music, but he gave it poetic license. His songs didn’t chronicle the world: they pointed angry fingers at the likes of the “Masters of War.” Dylan didn’t merely lament conflict; he condemned the military-industrial complex with a biblical fury. His “The Times They Are A-Changin’” became the anthem of a generation.

Dylan’s influence remains today: He showed that a rock lyric could be philosophical, elliptical and occasionally incomprehensible. But it could still resonate. The more obscure his lyrics became, the more they seemed to capture the zeitgeist.

Unlike the comparably important output of Motown (more of which below), there was no optimism in Dylan’s folk-rock hybrid: it encouraged discomfort if not downright rage. It told uncomfortable truths. He wasn’t alone. Artists like Joan Baez, Joni Mitchell and, eventually, John Lennon, aligned rock with anti-establishment causes, such as anti-war, anti-nuclear, pro-civil rights and, in Lennon’s case, the dissemination of love.

But there was something else going on: the medium itself was becoming more openly oppositional. It wasn’t just the lyrics; it was what we now call attitude – some genres of rock became truculent, angry and uncooperative. Rock became self-consciously untamable and immovably defiant. 

Rock concerts morphed into political assemblies with guitars. In Pierre Bourdieu’s terms, rock was asserting its own “cultural capital”, inverting what respectable taste looked and sounded like.

And of course, this politicization of sound generated its own backlash. From the FBI’s surveillance of folk singers to the Reagan-era culture wars, protest music was never left alone. But the mere existence of such responses proved the point: rock could provoke, and was willing to. It might not change policy, but it could change the way people looked at the world, which some might argue is a necessary precursor to political change. It urged fans to think and argue. It still does.

Motown, Hip-Hop and today’s outrage

If rock in the 1960s screamed defiance, Motown in the same decade softly murmured it. Berry Gordy’s Detroit hit factory crafted a pop-funk blend that explicitly avoided politics or any kind of social issue. Gordy’s genius (and, for some, limitation) was to make music that white audiences couldn’t resist. Artists like Marvin Gaye and The Supremes broke barriers, but at a price: no overt mention of civil rights, no protest, no bucking the system. Respectability was the Trojan horse.

Yet this silence was strategic: it was part of Gordy’s master plan. He demonstrated that black culture could “cross over” and flourish in, if not dominate, the mainstream. Eventually, cracks formed in its apolitical façade. Marvin Gaye’s inimitable What’s Going On (1971) was a turning point. A protest against the Vietnam war and an elegiac cry for peace, it was unmistakably political.

Still, by the 1980s, it was another genre that picked up the cudgel of confrontation: hip-hop. Unlike rock or Motown, hip-hop didn’t bother with subterfuge. Emerging from the ruins of post-industrial cities, it voiced fury, detachment, pride and a different type of community, closer to tribes than families. Public Enemy’s “Fight the Power” was both a song and a call to arms. The message behind N.W.A.’s “Fuck tha Police” was self-explanatory. Hip-hop didn’t flirt with outrage: it pursued it. And it succeeded. Politicians, parents and police responded predictably: with bans, censorship and surveillance, all of which paradoxically made hip-hop more relevant.

Kanye West, in many ways, was and is both culmination and mutation. Early in his career, he revived the politically aware rapper: “Jesus Walks”, “All Falls Down” and “Diamonds from Sierra Leone” offered critique and analysis-of-sorts. But then he seemed to lose interest and turned inward: flirting with Trump, describing slavery as “a choice” and dissolving the boundary between art and spectacle. His brand of provocation blurred the lines between dissent and bigotry.

Today, whether through Kendrick Lamar’s Pulitzer-winning meditations on black trauma or the performativity of artists like the British duo Bob Vylan, the tradition continues, reminding us that the purpose of such music is not always harmony but conflict, collision and confrontation. As ever, it is unpleasant, filthy and repugnant to some. That, after all, is the point.

Kraken wakes

The recent controversy — still raging as police investigate Bob Vylan’s anti-IDF chant — reminds us that rock’s restless and unruly spirit never quite disappears. Like the Kraken, the legendary sea monster, it just lies dormant until someone dares to wake it. That a neo-punk duo like Bob Vylan could provoke such political and media uproar with a few shouted words speaks not only to the raw power of performance, but to the enduring unease society feels when music stops entertaining and starts accusing.

Critics argue the chant “incites violence”. Yet the legal bar for incitement is set high: it requires intent, imminence and, crucially, actual disorder. No such consequences have emerged. What has emerged is something more familiar: moral panic. Just as Elvis’s hips, Dylan’s lyrics, and N.W.A.’s defiance once stirred fears of chaos, today’s outrage says more about the public’s need for containment than any actual threat.

If anything, the Glastonbury episode proves the point made across decades: rock and its successors exist to disrupt, offend, confront and get people’s backs up. It challenges the status quo or, to paraphrase Brando, “what you got”. And in doing so, it affirms its place in a tradition that stretches back through the history of cultural resistance. Sanitized pop will always dominate the charts. But when a song, a chant, or a moment on stage still has the power to rattle institutions, from the BBC to the police, it’s a healthy sign that the music, however filthy or odious, still matters.

*[Ellis Cashmore’s “” is published by Bloomsbury.]

[ edited this piece.]

The views expressed in this article are the author’s own and do not necessarily reflect 51Թ’s editorial policy.

The post Rock Music is Sickening, Disgusting, Filthy, Repugnant, Unpleasant and Odious – That’s the Point appeared first on 51Թ.

]]>
/more/environment/rock-music-is-sickening-disgusting-filthy-repugnant-unpleasant-and-odious-thats-the-point/feed/ 1
The Andrew Tate Myth: In Reality, He Has Only Limited Influence /world-news/the-andrew-tate-myth-in-reality-he-has-only-limited-influence/ /world-news/the-andrew-tate-myth-in-reality-he-has-only-limited-influence/#respond Thu, 26 Jun 2025 13:06:15 +0000 /?p=155961 Andrew Tate is often described as one of the most influential figures in the world today. Unlike global icons such as Taylor Swift, Khloé Kardashian or Greta Thunberg, whose influence is rooted in creativity, celebrity or activism, Tate’s notoriety centers on his perceived role in propagating toxic masculinity. He is regularly held responsible, often singularly,… Continue reading The Andrew Tate Myth: In Reality, He Has Only Limited Influence

The post The Andrew Tate Myth: In Reality, He Has Only Limited Influence appeared first on 51Թ.

]]>
Andrew Tate is often described as one of the most influential figures in the world today. Unlike global icons such as Taylor Swift, Khloé Kardashian or Greta Thunberg, whose influence is rooted in creativity, celebrity or activism, Tate’s notoriety centers on his perceived role in propagating toxic masculinity. He is regularly held responsible, often singularly, for advancing a regressive model of manhood. But how accurate is this portrayal? Does Tate’s influence merit the level of scrutiny it receives?

Measuring impact

To explore these questions, we surveyed 1,100 men across a broad age spectrum, primarily in the United Kingdom, the United States and Australia. The results challenge common assumptions.

Only 7% of respondents reported being influenced by Tate in any meaningful way. Among this small group, many described their response as critical rather than imitative. Over three-quarters (76%) found his views on women offensive or harmful, and while 84% agreed that “toxic masculinity” exists, few recognized it in themselves or associated it with Tate. Just 4% described his ideas on masculinity as insightful.

Tate presents a composite identity: misogynist, alpha male, free speech martyr, self-styled philosopher and walking embodiment of the supposed masculinity crisis. He has been described as a figurehead for a backlash among men unsettled by shifting gender norms and the erosion of traditional roles. His message is argued to appeal to those who feel disoriented by feminism, gender fluidity and the growing visibility of women in public life.

In that sense, Tate is not just out of step with the zeitgeist—he’s proudly backward-looking. His retrograde vision is part of his marketing strategy: a fantasy of regression, in which men rule by birthright and women willingly cleave to subservience.

A manufactured Messiah

Tate, a former professional kickboxer born in Chicago and raised in the UK, entered public consciousness after appearing on the reality show Big Brother in 2016, where he was subsequently removed from the house. His prominence grew through provocative online content, amplified on platforms like TikTok and further fueled by extensive media coverage.

This promoted a version of masculinity based on getting rich, building physical strength and retaliating against contemporary feminism (which, in this context, we understand as the advocacy of women’s rights based on the premise of gender equality). He allegedly became wealthy by selling access to his “Hustlers University,” which claims to teach subscribers how to make money online.

In 2022, Meta banned him from Facebook and Instagram. Shortly after, Hope Not Hate’s research director, Joe Mulhall, described him as a “genuine threat to young men, radicalizing them towards extremism, misogyny, racism and homophobia.” TikTok and YouTube later followed suit, banning his content and inadvertently increasing his notoriety. Today, he has around 10 million followers on X (formerly Twitter).

Garrulous and always ready to offend, Tate became a magnet, attracting interest from everywhere, including Romania, where he and his similarly regressive brother were arrested on suspicion of human trafficking and rape, charges they both deny.

Tate has been labelled, in various formulations, the embodiment of toxic masculinity by outlets including The Washington Post, The Sunday Times (South Africa), The Irish Independent and The Guardian. Yet such designations raise a central question: is Tate genuinely shaping minds, or has he become a symbol for broader anxieties?

Limited influence, strong rejection

Our research suggests Tate’s real-world impact is limited. Only a small proportion of men reported any influence, and even then, many described a kind of inverse effect. One participant noted that encountering Tate’s views increased his willingness to confront misogyny: “His influence means I’m more likely to take a stand when I hear those views. So perhaps I have been influenced by Tate—but in the opposite direction.”

Others echoed this sense of reactive engagement: “I feel a more direct responsibility to challenge misogyny than I did before the manosphere existed. Previously, if one of my friends or someone at work said something ignorant about men or women, I’d probably let it go. Now, I feel it’s more important to say something—even if it makes people uncomfortable.”

Far from adopting Tate’s ideology, a significant proportion of men describe their response as deliberately oppositional. “I’m more determined than ever to be an ally to women—just to spite men like Andrew Tate,” said one participant. Similarly, another man stated plainly: “It’s made me a much stronger advocate for female causes.”

The majority of respondents found Tate’s views harmful. Even those who rejected his ideas acknowledged their potential to influence others. One participant observed: “Tate believes men should have authority over women in relationships—including controlling how they dress, who they speak to and what they do. He also thinks women bear some responsibility for being sexually assaulted. It’s completely misogynistic and toxic and just shows how out of touch he is in the 21st century.”

Indeed, the danger may lie less in what Tate says than in what he allegedly does. In April 2025, four women filed a civil lawsuit against Tate alleging sexual violence and coercive control—a pattern of behavior intended to dominate and isolate. He denies all allegations.

A crisis questioned

Toxic masculinity is often invoked as both cause and symptom of a broader crisis in masculinity. It bears resemblance to “hegemonic masculinity,” a phrase coined in the 1980s by Australian sociologist R.W. Connell (now Raewyn Connell) to describe a form of masculinity culturally idealized by the likes of Arnold Schwarzenegger and Sylvester Stallone and associated with strength, power and control. The new equivalent is coupled with a crisis captured by the observation that, in the USA and UK, boys are more likely to own a smartphone than to live with their biological father.

As one participant in the survey put it: “Many younger lads find it hard to find their place in the world, especially when it comes to relationships. That’s always been the case. Tate tells them it’s not their fault—and that the way to overcome it is by acting dominant and treating women not as equals, but as possessions.” Another was more direct: “He’s brainwashing boys, young men and adult males who should know better.”

Although 84% of respondents acknowledged the existence of toxic masculinity, few saw it as personally relevant. The concept appears widely accepted but weakly internalized—more a matter of cultural script than personal conviction.

A cultural mirror

The story of Andrew Tate may reveal more about the cultural context in which he operates than about the man himself. Like Hans Christian Andersen’s fable of The Emperor’s New Clothes, the discourse around Tate reveals how narratives can gain authority through repetition rather than evidence. It captures a timeless social truth about the power of groupthink, fear of dissent and a tendency to uphold a fiction for fear of exclusion or ridicule.

A consensus has formed, amplified by media, educators and policymakers. Young men, we are told, are in thrall to a pernicious ideology perpetrated by Tate. But our research suggests a different reality: Most young men reject Tate’s views and don’t see themselves in this alleged crisis. Like the emperor’s subjects, many may privately dissent from the prevailing narrative but remain silent, assuming everyone else sees what they do not. 

In that sense, Andrew Tate functions less as a catalyst and more as a mirror, reflecting broader concerns about gender, authority and social change. His influence may lie not in what he says, but in how society reacts to him.

[edited this piece.]

The views expressed in this article are the author’s own and do not necessarily reflect 51Թ’s editorial policy.

The post The Andrew Tate Myth: In Reality, He Has Only Limited Influence appeared first on 51Թ.

]]>
/world-news/the-andrew-tate-myth-in-reality-he-has-only-limited-influence/feed/ 0
Is the Manosphere Myth, Media or Reality? /culture/is-the-manosphere-myth-media-or-reality/ /culture/is-the-manosphere-myth-media-or-reality/#respond Thu, 08 May 2025 13:48:34 +0000 /?p=155472 They have little hope, plenty of hate and scant traces of what most would consider humanness. Something appears to have infiltrated the minds of young men, reminding them that they should think and behave as men — men as defined not by feminists, gender benders or purveyors of wokeism, but as “real men,” restored to… Continue reading Is the Manosphere Myth, Media or Reality?

The post Is the Manosphere Myth, Media or Reality? appeared first on 51Թ.

]]>
They have little hope, plenty of hate and scant traces of what most would consider humanness.

Something appears to have infiltrated the minds of young men, reminding them that they should think and behave as men — men as defined not by feminists, gender benders or purveyors of wokeism, but as “real men,” restored to a state that once prevailed before reformers began to pervert what nature intended. Websites, blogs and social media apps serve as conduits for these young men to express their views, particularly their hostility toward women and the rights they have claimed in recent decades. The collective enterprise of these young men is known as the “Manosphere.”

The term itself was first used in 2013, in a piece titled The Manosphere: A New Hope For Masculinity, though its associations with misogynistic violence, sexual assault and online harassment came later.

The Manosphere seems to sit incongruously on a cultural landscape reshaped by the #MeToo movement, gender fluidity, transgender visibility and a growing commitment to diversity, inclusivity and equality — a landscape where traditional notions of masculinity were expected to evolve rather than reassert themselves. The zeitgeist appeared to be leaving behind what was once called “hegemonic masculinity” — an ideal emphasizing power, strength, dominance, competitiveness, emotional suppression, heterosexuality and authority over women.

So, why has the Manosphere sprung into being? Who or what is to blame? Is it genuinely threatening, or are we exaggerating its importance?

The red pill

Before I address these questions, let me clarify our subject. The Manosphere is a collection of online communities where “red pill” acolytes gather to discuss and promote what they consider to be an awakened understanding of gender dynamics and masculinity.

The term “red pill” refers to a process by which someone’s worldview is transformed, often dramatically, revealing a hidden and sometimes disturbing truth about the world. (The metaphor originates from the 1999 film The Matrix, in which a blue pill maintains a conventional conception of reality while a red pill exposes a more unsettling truth.) In cyberspace, multiple versions of reality coexist; populist politics and neo-mystical abstractions compete, offering different paths to what red pill men believe is the concealed truth about manhood.

In these spaces, like-minded young men share personal experiences, validate each other’s views and sometimes just challenge liberal or progressive narratives on sex, gender and other matters. The range of subjects up for discussion is vast. Manosphere devotees pontificate about geopolitical issues as easily as they issue personal advice.

Andrew Tate

Cultural change always stirs pushback. From the moment feminism began to gain traction in the late 1960s, the traditional conception of men as leaders, providers, protectors and breadwinners came under threat. For many men who rejected overt sexism and the strict traditional male role, sharing domestic responsibilities was acceptable. Yet, in recent years, certain figures have emerged who advocate a return to an earlier era, one in which men wielded overwhelming power and purpose.

Several prominent voices have critiqued the changing gender hierarchies and called for a reversion to traditional arrangements. The Canadian academic Jordan Peterson is influential in this space. American Paul Elam argues that “most of the discrimination is faced by men.” Daryush Valizadeh, known as Roosh V, brands himself a “neo-masculinist,” and Rollo Tomassi is the author of The Rational Male. Yet, without doubt, the most influential presence in the Manosphere is Andrew Tate, the former kickboxer, who has become the most persuasive proponent of hegemonic masculinity in the digital age.

Tate has amassed a huge following of young men, all drawn to his rejection of recent cultural change and his support for a return to “traditional” values. Together with his brother Tristan, he is currently facing allegations of trafficking minors, sexual intercourse with a minor and money laundering, charges he denies. His social media presence and legal scrapes have made him a globally renowned, if notorious, figure.

The popular view is that Tate’s influence reflects the appeal of hegemonic masculinity in an era where some men feel disempowered, marginalized and adrift. His popularity also underscores the power of social media algorithms in amplifying polarizing figures and the receptivity of younger audiences searching for some sort of identity. In another era, someone like Tate might have been dismissed as a psycho-chauvinist or an unreconstructed misogynist; today, he stands as the archetypal influencer.

Tate was referenced in the Netflix series Adolescence, which dramatizes the arrest of a 13-year-old boy who fatally stabs an older schoolgirl for mocking him as an “incel” (involuntary celibate). The drama, written by two middle-aged men, seeks to raise points about the Manosphere. The boy, for instance, betrays barely concealed anger during an interview with a female psychologist and seems to harbor an understanding of sex that is divorced from sexual attraction, feelings, intimacy or actual sexual activity.

Moral panic?

Some might assume that Tate possesses an almost clairvoyant ability to deliver a scabrous moral lesson about liberal hypocrisy masking a “real” reality. In this respect, he resembles other gender reactionaries of the past. Yet, his ability to influence young men prompts a pertinent thought: Why are they letting him in? Or are they?

Two contradictory conclusions emerge. If young men are embracing him, it may be because they sense personal powerlessness or detachment — perhaps even experiencing isolation or estrangement from society. Stripped of male role models, possibly due to the prevalence of female-headed single-parent families, they turn to figures like Tate who promise an alternative world in which men dominate and where authenticity is reclaimed.

Conversely, if the Manosphere is not as extensive and formidable as the popular media suggests, it wouldn’t be the first time that the media’s focus on a potentially troublesome pattern of behavior has precipitated widespread fear or anxiety, spiraling into a moral panic. Historically, such apprehensions have centered on youth subcultures, Satanism and video games, among other things. Media coverage amplifies fears until a self-fulfilling prophecy takes hold. In recent years, TikTok and cellphones have undergone similar treatment from the media. (In the UK, the reported prevalence of zombie knives precipitated a panic, resulting in a government ban on the weapons.)

It’s possible that the Manosphere is more a media figment than an actual movement. If so, we may be nurturing a fictitious monster, only to become ensnared in its myth. There is little concrete evidence to support its existence, though Tate’s hold on the popular imagination is evident in his prodigious metrics on Google searches and social media — especially on platforms like TikTok, where his content has garnered billions of views, suggesting a vast audience that likely includes many young men (YouTube banned Tate for his “hate speech” in 2022).

I have posed questions for which there are no definitive answers. There are no membership cards for the Manosphere, and few would openly identify as misogynists.

The Manosphere, like the web itself, is diffuse, pervasive and largely anonymous. It remains unclear how many of those who encounter manospheric material truly embrace its worldview. Many might be merely curious. Are they “lost boys” or bitter, hateful teenagers capable of genuine malice? Or are they phantoms, conjured by an overactive media’s imagination?

[Ellis Cashmore is the author of numerous books.]

The views expressed in this article are the author’s own and do not necessarily reflect 51Թ’s editorial policy.

The post Is the Manosphere Myth, Media or Reality? appeared first on 51Թ.

]]>
/culture/is-the-manosphere-myth-media-or-reality/feed/ 0
Sean “Diddy” Combs: Race, Gender and a New American Dilemma /history/sean-diddy-combs-race-gender-and-a-new-american-dilemma/ /history/sean-diddy-combs-race-gender-and-a-new-american-dilemma/#respond Mon, 28 Apr 2025 13:30:52 +0000 /?p=155352 Sean “Diddy” Combs will stand trial on federal sex trafficking and racketeering charges on May 5. He may or may not be guilty of the several crimes for which he is charged. But his case, like many of those before him, offers a revealing mirror. In it, we see the tensions of two powerful cultural… Continue reading Sean “Diddy” Combs: Race, Gender and a New American Dilemma

The post Sean “Diddy” Combs: Race, Gender and a New American Dilemma appeared first on 51Թ.

]]>
Sean “Diddy” Combs will stand trial on federal sex trafficking and racketeering charges on May 5. He may or may not be guilty of the several crimes for which he is charged. But his case, like many of those before him, offers a revealing mirror. In it, we see the tensions of two powerful cultural reckonings: #MeToo and Black Lives Matter (BLM) intersect in the body and biography of one man. Whether that intersection yields justice, contradiction or further division will depend not only on evidence but on how honestly society can confront its own myths, prejudices and sense of piety.

A hugely successful black man with colossal earnings, a storied career, a following of millions and limitless respect suddenly falls foul of the law. He is indicted for sex trafficking, transporting individuals across state lines for the purposes of prostitution, racketeering and other federal charges. If convicted, he could face decades in prison — the most serious charges carrying a maximum sentence of life imprisonment. Remind you of anybody?

Black celebrity pastiche

Last September, Combs, the musician-cum-music mogul, was accused in a federal indictment of having used his billion-dollar business empire to abuse, threaten and traffic women in order to “fulfill his sexual desires.” He was denied bail and ordered to remain in custody. He denied all allegations.

It feels like a dispiriting pastiche. The names change — Michael Jackson, Mike Tyson, R. Kelly, Bill Cosby — but the pattern remains hauntingly familiar. A black man ascends to global superstardom, achieving wealth, prestige and, sometimes, cultural influence unmatched by his white contemporaries, only to be dragged down by accusations of criminal and moral transgression. In many cases, including that of 54-year-old Combs, now facing even more recent allegations culminating in a superseding indictment, the legal and cultural forces surrounding the individuals seem to involve more than just whether or not they are guilty: They invite us to stare at the fraught intersection of racial and gender politics in American life.

The dilemma of black success

In a broader sense, the cases of these famous black men, and Combs’s in particular, pose a deeply troubling dilemma. Not the “American dilemma” described in an eponymous study in 1944, but a new version that poses an impossible choice, not merely about the guilt or innocence of these men, but about whether their very success creates a tension in American culture.

Their status as cultural icons, combined with their blackness and gender, often leads to a distorted reckoning, where their very ascent is seen as an affront to the social order. Combs’s trial, then, is not just about legal consequences but about how America grapples with the uncomfortable reality of black success — and how quickly that success can turn into perceived transgression.

The charges against Combs, which include trafficking, coercion and operating a criminal enterprise, are grave and properly demand careful legal inspection and adjudication. Yet the surrounding discourse bears an unnerving resemblance to the moral dramas witnessed in the cases of Jackson et al. As with those cases, the spectacle of an idol with feet of clay is being played out in the media. These kinds of cases are manna for them: They can elicit voyeuristic satisfaction among consumers. The moral tone is familiar: Less a function of law enforcement than of culture enforcing its own unwritten norms, with African-American men disproportionately cast in the role of transgressors.

At issue is not whether Combs is guilty, but whether his public reckoning reflects a consistent application of justice or a culturally overdetermined and so disproportionate response shaped by America’s unresolved anxieties about race, power and masculinity. In other words, we’re usually left to ponder what exactly is being pursued: justice or revenge?

#MeToo vs. BLM

#MeToo and BLM are social movements that have helped redirect the zeitgeist. For the most part, they complement each other. But only for the most part: There’s a space where the legacy of white supremacy and the evolution of feminist critique collide, often with conspicuously successful black men as the flashpoint.

#MeToo, of course, has revolutionized the way societies regard sexual misconduct, shifting focus from the individual actions of men to the cultural arrangements that allow abuse to thrive. But inside that broader analysis, questions of race are often marginalized. When white men are accused, the script usually centers on individual or psychological problems. When black men are implicated, the narrative can take on a different charge, one that not-so-subtly reinforces age-old stereotypes of hypersexuality and lack of self-control. These stereotypes stretch back to slavery and Reconstruction and were deployed during the Jim Crow era to justify lynching and segregation.

Even now, when legal structures claim neutrality, the popular imagination still operates with encoded biases. In the Combs case, the accusations (sex trafficking, violence, coercion) seem to awaken these lingering myths. Combs’s persona as a brash, extravagant mogul flaunting wealth, power and women is now being reframed as a symptom of his predatory predilections. Historian Ed Guerrero has written on how the media’s framing of cases like Combs’s both reflects and perpetuates historic biases, contributing to a cycle that affects the lives and careers of black men in the public eye.

Whether or not the charges are substantiated, the imagery evokes a familiar template. The ancient idea of the droit du seigneur — a feudal lord’s alleged right to have sex with a vassal’s bride — has modern echoes in today’s celebrity culture, where power can and, as we know, does enable abuse. But its application isn’t consistent: It becomes racially charged when the “lord” in question is a black man who defies every expectation set by a white-dominated society.

Black success is conditional

In one of my books, The Destruction and Creation of Michael Jackson, I argue that Jackson’s very existence challenged categories of race, gender and even age, and that the cultural and media backlash he suffered was not just about his actions, but about his symbolic or even imagined transgressions. In a similar vein, Combs, who has reinvented himself several times (Puff Daddy, Diddy, Love), amassed a fortune from music, fashion and liquor empires, and cultivated a public identity of invulnerability. Forbes estimated his wealth at $90 million last year. He, perhaps more than any of the other figures mentioned so far, has pushed the boundaries of what a black man could legitimately represent.

His 2023 civil case was settled for an undisclosed sum amid accusations of violent abuse and descriptions of parties known as “freak offs,” which went on for days and involved the coercion of women into sex. It came after years of whispers, lawsuits and accusations. The case led to more allegations and denials culminating, we anticipate, in May. This may represent a legal reckoning, but it also invites reflection on how his public identity may have made him an irresistible target.

None of this is to exonerate wrongdoing. R. Kelly’s convictions, Cosby’s civil suits and criminal conviction (later overturned), and Tyson’s conviction for rape all involved credible accusations and, in some cases, overwhelming evidence. Yet the broader pattern invites interrogation. Is America more eager to believe the worst about black men? And does the celebration of their success carry an unspoken caveat: that they must eventually be humbled, exposed or destroyed?

BLM arose in response to state-sanctioned violence, racial profiling and a justice system that often discounts black lives. While its origins lay in police brutality, its implications stretch into every institutional domain, including the courts, the media and public opinion. The movement highlighted the inescapable fact that racism is not just about overt acts of hatred, but about the cumulative effect of double standards, implicit biases and institutional neglect. In this light, the Combs case must be understood not only in terms of personal accountability but also as part of the larger cultural script that BLM sought to disrupt.

That script tells us this: Black success is conditional. It is tolerated but never fully embraced; admired, yet never quite trusted. And when a successful black figure is accused, the rush to judgment is faster, the appetite for spectacle keener and the desire for ruin more intense.

Combs’s story isn’t just about his alleged crimes: It’s about the kinds of narratives society is prepared to believe. His destiny will become part of a historical arc that has often positioned the successful black man as a threat once he ascends too far. Whether in sports, entertainment or public life, the American imagination appears more comfortable with black male success when it is controlled; when it can be reclaimed or punished.

[ edited this piece.]

The views expressed in this article are the author’s own and do not necessarily reflect 51Թ’s editorial policy.

The post Sean “Diddy” Combs: Race, Gender and a New American Dilemma appeared first on 51Թ.

]]>
/history/sean-diddy-combs-race-gender-and-a-new-american-dilemma/feed/ 0
Russians — The Great, the Gifted and the Terrible /interactive/russians-the-great-the-gifted-and-the-terrible/ /interactive/russians-the-great-the-gifted-and-the-terrible/#respond Sat, 29 Mar 2025 11:58:17 +0000 /?p=155029 The post Russians — The Great, the Gifted and the Terrible appeared first on 51Թ.

]]>

The post Russians — The Great, the Gifted and the Terrible appeared first on 51Թ.

]]>
/interactive/russians-the-great-the-gifted-and-the-terrible/feed/ 0
FIFA Under Fire: Trump’s Transgender Ban Sparks Dilemma /politics/fifa-under-fire-trumps-transgender-ban-sparks-dilemma/ Mon, 17 Feb 2025 14:13:03 +0000 /?p=154576 The impact of Donald Trump’s executive order banning transgender athletes from participating in women’s sports will be felt by every sports governing organization, most forcefully by FIFA. Association football (soccer) is the most popular sport in the world, and it is run by arguably the most powerful regulatory apparatus in history. Non-Americans may not know… Continue reading FIFA Under Fire: Trump’s Transgender Ban Sparks Dilemma

The post FIFA Under Fire: Trump’s Transgender Ban Sparks Dilemma appeared first on 51Թ.

]]>
The impact of Donald Trump’s executive order banning transgender athletes from participating in women’s sports will be felt by every sports governing organization, most forcefully by FIFA. Association football (soccer) is the most popular sport in the world, and it is run by arguably the most powerful regulatory apparatus in history.

Non-Americans may not know the meaning of an executive order: It is an official directive issued by the President to federal agencies and departments and has the force of law. The ban on transgender athletes is US policy, but its effects will be felt everywhere. A number of sports organizations, including those that govern swimming, golf and even chess, have already banned transgender women from competing in female events if they have passed through male puberty. The National Collegiate Athletic Association (NCAA), the US’s governing body for collegiate sports, reacted immediately, banning transgender women from competing in women’s sports.

Inclusivity and the World Cup

But FIFA is sure to challenge Trump’s ruling. The National Women’s Soccer League (NWSL) is the top-tier professional women’s soccer league in the US and operates under the jurisdiction of the United States Soccer Federation (USSF), which is a member of FIFA. As one of the world’s major sports governing bodies to have pledged itself to inclusivity and against discrimination, FIFA will be deeply compromised by the transgender ban. The NWSL currently permits athletes to participate in accordance with their gender identity, provided their testosterone levels are within typical limits for female athletes. These guidelines will presumably be superseded by the new restrictive provisions.

That’s only one of FIFA’s difficulties: equally vexing is its commitment to holding its quadrennial World Cup competition in the USA, Canada and Mexico. FIFA faced criticism for granting hosting rights for the 2034 World Cup to Saudi Arabia, where homosexual relations are outlawed and punishable under the law. That criticism will seem mild compared to the condemnation that will surely follow if FIFA remains silent on Trump’s prohibition, which seems to undermine every feature of FIFA’s credo. Some will argue it is hypocritical to stage an event that symbolizes inclusivity in a territory where inclusivity is now sneered at.

Trump’s common sense

Since becoming president, Trump has ordered an end to federal government diversity efforts, including some dating back to Lyndon Johnson, and may expel transgender people from the US military. Trump blamed diversity, equity and inclusion (DEI) policies for the collision of a commercial jet and military helicopter that killed 67 people just outside Washington in January. It was his “common sense” assessment rather than an evidence-based evaluation. The same common sense informs many of Trump’s early initiatives. On his first day in office, he signed an order calling for the federal government to define sex as “only male or female,” based on reproductive cells. This definition should be reflected on all official documents, such as passports.

Even the title of the transgender order echoes Trump’s version of good sense and sound judgment: “Keeping Men Out of Women’s Sports.” Anything other than Trump’s understanding is dismissed as dogma or fanaticism: an earlier Trump order bears the insistent title “Defending Women from Gender Ideology Extremism and Restoring Biological Truth to the Federal Government” and prescriptively instructs the federal government to remove “all radical gender ideology guidance, communication, policies, and forms.”

All this jars with global trends that have affected many parts of the world since the rise of the #MeToo movement. Common sense, at least as Trump defines it, is a kind of knowledge that seemed perfectly serviceable 40 or 50 years ago.

Women’s football — an LGBTQ+ platform

Over recent years, FIFA has positioned itself as a champion of inclusivity, stopping short of activism but relaxing its strictures against mixing the association football it governs with social, cultural and political affairs. For example, following the killing of George Floyd in 2020 and the ensuing protests, FIFA permitted football players to take a knee in shows of support for Black Lives Matter before games. Its effective elevation of the women’s game to the most popular female sport in the world has drawn admiration.

Women’s football is arguably the most effective crusader for LGBTQ+ rights in the world, perhaps eclipsing Stonewall, ILGA World and Outright International (remind yourself what the T in LGBTQ+ stands for). FIFA has symbolized its commitment by endorsing players and sometimes whole teams who wish to display their loyalties by wearing rainbow colors. Both female and male teams have worn rainbow armbands and shoelaces to exhibit their moral positions. Football as a sport stands squarely on the right side of history. It is barely imaginable that FIFA will stray to the other side.

What will FIFA do next?

World sport has no uniform policy on transgender athletes. Eligibility rules differ from sport to sport and from country to country. The International Olympic Committee (IOC) has a laissez-faire framework that allows for sport-specific eligibility criteria. It, too, will be challenged to respond to Trump’s initiative, but not nearly as much as FIFA. Association football has so far managed to steer clear of major controversies. The organization’s existing gender verification regulations, established in 2011, simply state that only men are eligible to play in men’s competitions, and the same applies to women. In 2022, following policy changes in other sports, FIFA announced it was reviewing its gender eligibility regulations in consultation with expert stakeholders. No updated policy has yet been published. In the absence of explicit guidance from FIFA, some leagues developed their own policies. Spain, for example, has a team comprising only transgender players.

Now, FIFA must confront Trump’s ban and decide whether or not to oppose it. It’s conceivable that American teams could face exclusion from international tournaments if US sports organizations are unable to field teams that comply with more inclusive international rules. But this is massively complicated by the fact that games at the 2026 FIFA World Cup are scheduled to take place in the USA, as well as Canada and Mexico. A robust response would be to threaten to rearrange games scheduled for New York, Dallas, Atlanta and elsewhere in the USA. But it would be a logistical nightmare and, in any case, media groups would protest. Ridiculous as it seems, FIFA could disqualify the US team from the competition. Trump himself would probably intervene and threaten FIFA.

FIFA can hardly avoid becoming involved in the furor. It will express misgivings about the ban and emphasize the organization’s continuing commitment to inclusivity. It may allow individual players or entire national teams to stage protests or articulate their disagreement with the order. It could even endorse some sort of protest at the World Cup, though this is unlikely. In 2022, England team captain Harry Kane was prevented from wearing a rainbow armband, presumably to avoid embarrassing Qatar, where the World Cup tournament was being held. FIFA clearly did not wish to upset the tournament hosts.

Monstrous dilemma

Yet, if FIFA needed to bare its teeth, now is the time: Transgenderism is likely to be the single most intensely debated issue in sports over the next decade or so. The arguments on both sides are persuasive: Women complain that the hard-earned advances they have made in sports since the 1990s are under threat because athletes assigned male at birth are allowed to compete against natal females. Athletes who have experienced gender dysphoria and transitioned in a way they feel reflects them intellectually and emotionally complain that they are excluded from competition or forced to compete in a hybrid class. For example, the New York City Marathon introduced a non-binary division for runners who do not identify as either men or women. There are other variations in other sports.

FIFA faces a monstrous dilemma. It would probably love to reassert its position as sport’s most enlightened, progressive and reformist governor. But the first of 104 games that will comprise the next World Cup will take place on June 11, 2026, so any threats are bound to appear empty.

The next women’s World Cup is not until 2027. There is likely to be change between now and then, but if there isn’t and the ban remains in place, the USA will not have a team in Brazil: It will either withdraw voluntarily or be disqualified. Women’s football is more activist and a lot less conciliatory than its male counterpart and will use Trump’s ban to dramatize the transphobia it opposes, along with any other form of bigotry.

[Ellis Cashmore’s new book (with Kevin Dixon and Jamie Cleland) will be published in March.]

[ edited this piece.]

The views expressed in this article are the author’s own and do not necessarily reflect 51Թ’s editorial policy.

The post FIFA Under Fire: Trump’s Transgender Ban Sparks Dilemma appeared first on 51Թ.

]]>
Is Gambling Addiction Really an Addiction? /more/science/is-gambling-addiction-really-an-addiction/ /more/science/is-gambling-addiction-really-an-addiction/#respond Tue, 31 Dec 2024 13:18:08 +0000 /?p=153933 Last September, Le Monde reported a surge in online sports betting in Brazil: In the first seven months of 2024, approximately 25 million Brazilians began participating in online betting, with an average of 3.5 million new bettors each month. Gambling’s sudden growth in popularity raised concerns about its impact on consumer spending and financial well-being.… Continue reading Is Gambling Addiction Really an Addiction?

The post Is Gambling Addiction Really an Addiction? appeared first on 51Թ.

]]>
Last September, Le Monde reported a surge in online sports betting in Brazil: In the first seven months of 2024, approximately 25 million Brazilians began participating in online betting, with an average of 3.5 million new bettors each month. Gambling’s sudden growth in popularity raised concerns about its impact on consumer spending and financial well-being. A survey by the research organization revealed that 51% of Brazilians used money intended for savings to place bets.

In December 2024, the UK’s National Health Service (NHS) announced that referrals for gambling addiction shot up almost 130% between April and September, prompting the NHS national director for mental health to say: “Addiction is a cruel disease that can take over and ruin lives. NHS England has almost doubled the number of specialist clinics available in the space of a year.”

By contrast, Brazil has not structured its healthcare system to accommodate any problems arising from the spike in gambling. The country doesn’t officially recognize gambling addiction. It is by no means alone: Several other countries, including Kenya, Ukraine and the Philippines, allow legal gambling but don’t recognize gambling addiction as a medical condition. The US, Sweden and Australia are among the countries that accept gambling addiction as a treatable condition. But are they right?

The history of gambling and its enemies

Betting money on games of chance is as close to a cultural and historical universal as you can get. The earliest known dice date back to 3000 BCE, discovered in archaeological sites of the Indus Valley Civilization (modern-day Pakistan and northwest India) and ancient Mesopotamia. Gamblers probably bet on games of chance or even early board games.

Card games became popular in medieval Europe, though the emergence of organized sports from the eighteenth century onward provided a new landscape for gambling. Prizefighting and horseracing prospered because of “the fancy,” a following of aficionados who gambled enthusiastically (over time, “fancy” evolved into “fans”). A combination of human curiosity, acquisitive impulses and an ability to think probabilistically maintained our interest in gambling.

People during the Industrial Revolution of the late eighteenth and nineteenth centuries viewed gambling through a moral prism. The Salvation Army, founded in London in 1865, the Women’s Christian Temperance Union, founded in Ohio in 1874, and the Methodist Church were religious organizations that opposed gambling, decrying it as sinful and a product of individuals’ moral failings or a more general moral decay.

Moral condemnation softened in the twentieth century as lotteries, casinos and, in Britain, the football pools normalized gambling, making it respectable. Britain’s Betting and Gaming Act of 1960 significantly liberalized gambling. At that point, gambling was framed as a pursuit which, if followed zealously, could lead to ruin or, conversely, riches. It lay outside the scope or concerns of medicine.

That changed in 1980 when the American Psychiatric Association (APA) formally classified “Pathological Gambling” as a mental disorder in the third edition of the Diagnostic and Statistical Manual of Mental Disorders (DSM-III). In 2013, this organization reclassified it as “Gambling Disorder” in the DSM-5, and categorized it alongside substance-related and addictive disorders.

Medicalization

The expansion of medical authority and the categorization of what were once non-medical issues as medical problems is called medicalization, a process driven by the power the medical profession has accumulated to define a wide range of experiences and practices as medical issues. In this way, the medical profession has widened its jurisdiction by reconceptualizing conditions that have origins in social and cultural circumstances as medical problems requiring professional intervention and treatment. Conditions like ADHD (Attention Deficit Hyperactivity Disorder) and alcoholism have been medicalized. The medical profession, through its ability to regulate itself and define what constitutes illness, has shaped modern understandings of health, illness and normality.

For example, Body Dysmorphic Disorder was first included in the DSM-IV (published in 1994) under the heading of Somatoform Disorders. In the DSM-5 (2013), hoarding disorder was added as a distinct condition. Other conditions were near-misses: Sex addiction was proposed for the DSM-5 but not included. And, while oniomania (compulsive shopping) has been recognized as a behavior of concern, it has never formally been classified as a standalone disorder in the DSM, though it is sometimes considered a manifestation of Obsessive-Compulsive Disorder (OCD).

These conditions resemble more traditional addictions, compulsions that instigate biophysical changes in the human body and brain. But they aren’t the same: Addictions to, for example, alcohol, nicotine or opioids differ from compulsive behaviors (like shopping or exercise addiction) that don’t involve identifiable physiological processes and biochemical markers.

As recently as the 1990s, we weren’t sure whether gamblers who lost consistently and occasioned hardship to themselves as well as their families deserved blame or community sympathy. Now we know: it is the latter. Gambling addicts, sometimes known as problem gamblers and occasionally compulsive gamblers, are afforded patient status and treated accordingly. They are not credited with volition, by which I mean the faculty or power of using one’s will, or agency, that is the capacity to act in a way that produces a desired effect. Instead, they are invalided and confirmed as having an illness. Poor decision-making is rendered a pathology.

Rational gamblers

Gambling is a social activity, drawing people from diverse backgrounds together to pit their wits and sagacity against one another. No one is forced to engage and, despite arguments that there is a compulsive element to gamblers’ behavior, there is ultimately a question behind placing a bet: “Will I win?” The answer determines the action. In 2013, my erstwhile colleague Jamie Cleland and I conducted a modest project with 2,500 self-proclaimed gamblers. The results challenged what we called “the myth of the gambling addict” and supported a model of typical sports bettors as rational decision-makers who understand the odds and the technicalities of betting, rather than helpless victims unable to control their compulsions.

Gambling is a rewarding activity even if the gambler loses money: The gratification is in the frisson of excitement it confers. Labeling some gamblers “compulsive” is misleading: They’re not driven to gamble by overpowering forces but by the prospect of being thrilled. Even when they realize the damaging consequences of losing, they choose to gamble from a range of possibilities. Encouraging them to think of themselves as something other than volitional agents invites a kind of surrender.

Sanitization

Addiction has been sanitized to the point where people have assimilated it into their self-conceptions and believe they’re helpless to resist. One writer concedes she is, or was, addicted, in her case to social media apps. She recently reflected on “the behavioural conditioning that I’d unconsciously consented to since getting aged 13” and its consequences: “I couldn’t go 15 minutes without reaching for my phone, and the disappointment would surge each time I realised I couldn’t get that instant dopamine hit.” (Dopamine is a neurotransmitter — a chemical messenger in the brain — that’s associated with feelings of pleasure and the reinforcement of behaviors. While there’s direct and persuasive evidence that Class A drugs, like cocaine and methamphetamine, stimulate the release of dopamine and consequent habit formation, there’s no compelling proof that social activities like gambling or engaging with social media work through comparable mechanisms. The fulfillment derived from these activities is unlikely to be biochemical.)

Some might even exploit the sanitization. Consider the former financial manager for the NFL’s Jacksonville Jaguars who stole $22 million from the team and then sued FanDuel for $250 million, claiming the betting company preyed on his gambling addiction by extending him more than $1.1 million in gambling credits.

There are undeniably gamblers who have problems, but the sources of those problems probably lie outside the sphere of gambling and are unlikely to be addressed, less still solved, by medical or therapeutic means. Interventions rely heavily on counseling or behavioral therapy. They’re probably not addressing an underlying medical cause, if only because there isn’t one. Self-restraint, impulse control and improved decision-making are the kinds of objectives achievable without medical diagnoses and the admission of addiction it implies.

Addictions have become so prevalent that practically any behavior with undesirable outcomes that’s repeated without modification is likely to be called addictive. About 30% of drunk-driving offenders in the US are estimated to reoffend, continuing to drink and drive even after facing legal consequences. In the UK, a similar pattern of recidivism is emerging. No one suggests drivers can be addicted to driving under the influence. Yet.

Some years ago, the term “dependence” seemed poised to replace “addiction.” This described the state of relying on or being controlled by something or someone and had no clinical or pathological implications, focusing instead on how circumstances and cultural contexts shape behavior.

“Addiction” is easier on the intellect: It is a definable condition with clear boundaries, usually rooted in biology or psychology, offering a simple way to understand behavior that might otherwise be complex and opaque.

[Ellis Cashmore’s “” is published by Bloomsbury.]

The views expressed in this article are the author’s own and do not necessarily reflect 51Թ’s editorial policy.

The post Is Gambling Addiction Really an Addiction? appeared first on 51Թ.

]]>
/more/science/is-gambling-addiction-really-an-addiction/feed/ 0
Ten Reasons Saudi Arabia Should Host the 2034 FIFA World Cup Finals /world-news/ten-reasons-saudi-arabia-should-host-the-2034-fifa-world-cup-finals/ /world-news/ten-reasons-saudi-arabia-should-host-the-2034-fifa-world-cup-finals/#respond Fri, 20 Dec 2024 12:58:06 +0000 /?p=153778 FIFA, the world governing organization of association football (soccer), recently announced that its quadrennial tournament, the World Cup, will be staged in Saudi Arabia in 2034. The birthplace of Islam in the 7th century, Saudi Arabia, which occupies most of the Arabian peninsula, became an independent kingdom in 1932 and, after the end of World… Continue reading Ten Reasons Saudi Arabia Should Host the 2034 FIFA World Cup Finals

The post Ten Reasons Saudi Arabia Should Host the 2034 FIFA World Cup Finals appeared first on 51Թ.

]]>
FIFA, the world governing organization of association football (soccer), recently announced that its quadrennial tournament, the World Cup, will be staged in Saudi Arabia in 2034.

The birthplace of Islam in the 7th century, Saudi Arabia, which occupies most of the Arabian peninsula, became an independent kingdom in 1932 and, after the end of World War II, grew to become a major economy, revolutionized by the exploitation of the area’s oil resources. It is the world’s second top oil producer after the USA, accounting for 13.2% of the world’s oil. Saudi Arabia (population 31,500,000) is ranked the 18th richest country in the world.

But there are strong objections, which seem to crystallize around four main concerns. The kingdom’s human rights record, which includes issues such as the suppression of dissent, lack of freedom of expression and use of capital punishment, is often raised.

Like other Gulf states, Saudi Arabia has faced allegations of exploitative labor practices, particularly involving migrant workers and, despite promises of reform, questions about workers’ conditions during the preparation for such events persist.

Homosexuality is illegal in Saudi Arabia, and same-sex relationships are punishable by imprisonment, flogging or even the death penalty under Sharia law. This contrasts sharply with FIFA’s championing of LGBTQ+ rights and inclusivity.

Arguably, the most powerful objection is Saudi Arabia’s subjugation of women. The kingdom now allows women to participate in the workforce and drive cars unaccompanied, but guardianship laws, which require women to obtain permission from male relatives for many activities, and the limited representation of women in leadership positions reflect deep-seated social inequality. Despite this, I believe Saudi Arabia is an appropriate host and offer ten reasons why.

1. Promoting ethical labor practices

Saudi Arabia’s World Cup preparations will involve many large infrastructural projects, and FIFA’s oversight should ensure these adhere to global standards. Over the next decade, FIFA’s inspection teams will monitor construction sites to safeguard workers’ rights, promote ethical labor practices and insist on compliance with its own standards. This decade-long timeline gives Saudi Arabia an opportunity to demonstrate its commitment to improving working conditions, addressing past concerns and setting new benchmarks for fairness and safety. By making transparency and compliance a condition, FIFA can leverage its influence to leave a lasting legacy of ethical labor reform in the region.

2. A wider conception of inclusivity

FIFA’s stated mission is to celebrate cultural diversity. This presumably means the organization is prepared to embrace different cultures, regardless of whether their values and norms differ from Western equivalents. But FIFA’s adoption of inclusivity as an animating principle is, at present, limiting: It effectively excludes nearly a quarter of the world’s population, who subscribe to Islam. For this group (numbering about 1.9 billion), same-sex relationships are a sin and women are not equal to men. As such, Muslims’ fundamental beliefs contrast with FIFA’s commitment to LGBTQ+ rights and women’s status in terms of rights and opportunities. FIFA has approved of players wearing rainbow colors and promoted women’s football to signify its resolve. By selecting Saudi Arabia, FIFA may broaden its conception of inclusivity by welcoming nations with different and possibly conflicting religious beliefs.

3. Productive dialogue on LGBTQ+ rights

Hosting the World Cup in Saudi Arabia will surely promote dialogue about differences in approaches to LGBTQ+ rights. No one is naïve enough to believe Islam will change dramatically, if at all. But there is at least the possibility that religious and cultural differences can be addressed in a respectful and constructive manner. While significant cultural gaps exist, the visibility of LGBTQ+ issues during the event could encourage awareness and sensitivity, promoting incremental progress. The World Cup’s traditional role as a unifying force could highlight the importance of diversity and inclusion.

4. Advancing women’s rights

Saudi Arabia has made some strides in improving women’s rights, and hosting the World Cup could accelerate this progress. The event’s global spotlight will encourage the kingdom to further expand opportunities for women in sports and beyond. Recent reforms, such as the introduction of women’s sports leagues, indicate a willingness to evolve. The World Cup’s emphasis on equality and inclusion would act as a stimulus, pushing for greater gender parity in sports while inspiring young Saudi women to break barriers and participate fully in social change.

5. Women’s rights in other Islamic territories

While it’s a lofty ambition, the World Cup in Saudi Arabia could also catalyze deeper global dialogue on women’s status in Islamic societies. While the kingdom has made progress, significant cultural and religious restrictions remain. By hosting the tournament, Saudi Arabia would face international expectations to showcase advancements in women’s rights. This external pressure, combined with internal aspirations for modernization, could foster more material changes, providing a platform for discussions about balancing tradition with contemporary gender equality. This sounds quixotic but the World Cup could help redefine how women participate not only in sports but in wider society.

6. Only Gulf States can afford global sports tournaments

World Cups and Olympic Games are increasingly expensive to stage, and by 2034, only a handful of nations may possess the resources or the political will to host such massively costly events (Qatar is estimated to have spent a record sum on the 2022 World Cup). Saudi Arabia’s substantial financial capacity makes it an ideal candidate to sustain these costs and one of only a handful of countries prepared to do so. This pragmatic adaptation reflects the new reality of global sports, where Gulf States are becoming central hubs for high-profile events (see 10, below). FIFA’s decision acknowledges this reality, ensuring that the World Cup remains a sustainable and spectacular global celebration despite mounting financial challenges. After 2034, countries outside the Gulf may not be able to afford the World Cup or, for that matter, the Olympic Games. Saudi Arabia, together with Qatar and the United Arab Emirates, may become permanent homes.

7. “Sportswashing” is a misnomer

Critics often accuse Gulf States of using sports to improve their international image, a practice known as “sportswashing.” Yet hosting high-profile events inevitably has exactly the opposite effect, drawing global media attention to a country’s human rights record. By selecting Saudi Arabia, FIFA will guarantee that critical issues — such as labor rights, freedom of expression and gender equality — remain in the media. This scrutiny will put pressure on the host nation to address its limitations, leveraging global attention to drive meaningful change or face the consequences of bad publicity. The World Cup’s visibility thus becomes a tool for accountability rather than mere optics, or image management.

8. Saudi Arabia will build state-of-the-art stadiums

The stadiums built for the Qatar World Cup in 2022 received widespread acclaim for their innovative design and advanced technology. Saudi Arabia is likely to follow the pattern, constructing state-of-the-art venues that will no doubt set new standards for sports infrastructure. These facilities would serve not only the World Cup but also future sporting and cultural events, providing lasting value for the kingdom and the broader region. By investing in cutting-edge infrastructure, Saudi Arabia would ensure a world-class experience for players, fans and broadcasters alike, leaving a legacy of excellence in global sports.

9. Growth of the Saudi Pro League

The Saudi Pro League has not yet emerged as a significant player in global soccer, even though it now boasts several world-class players like Cristiano Ronaldo and Neymar. But, by 2034, this competition could rival the English Premier League, Serie A and La Liga, showcasing top-tier talent and competitive matches. Hosting the World Cup could solidify Saudi Arabia’s position as a global soccer hub, drawing attention to its domestic league and boosting its credibility. Increased investment in local clubs and player development would further elevate the Pro League, creating a sustainable ecosystem for soccer within the region.

10. The tectonic plates of sports are shifting

The Gulf States have made their intention signally clear: They want to be sports’ center of gravity. They have monopolized world heavyweight boxing title fights, created a golf tour to rival the PGA, staged F1 Grands Prix and hosted an ATP Tennis Open. It’s possible that Qatar will petition for a tennis Grand Slam that will rival Wimbledon. Fans may balk at the idea, grumbling that there is no natural tradition of sports in these areas. But the clink of coin can be heard everywhere. No one knows for sure why the Gulf states want to “own” professional sports. They lose prodigious amounts of money on it. There is a certain cachet in staging prestigious sports events, for sure; but do the wealthy territories need status, distinction and acclamation? The nearest we can get to an answer is another question: Why does the billionaire art collector David Nahmad want the largest collection of Picasso paintings in the world? He currently has about 300 works and has said, somewhat inscrutably, that his artworks are “as dear to him as children.”

[ by Ellis Cashmore, Kevin Dixon and Jamie Cleland will be published in March 2025.]

The views expressed in this article are the author’s own and do not necessarily reflect 51Թ’s editorial policy.

The post Ten Reasons Saudi Arabia Should Host the 2034 FIFA World Cup Finals appeared first on 51Թ.

What Happens When We Ignore Genuine Mental Illness? /region/europe/what-happens-when-we-ignore-genuine-mental-illness/ /region/europe/what-happens-when-we-ignore-genuine-mental-illness/#respond Tue, 03 Dec 2024 12:32:52 +0000 /?p=153542 In recent years, the prevalence of mental health issues has been magnified by the number of entertainers and athletes who are living, or have lived through, such issues. Prominent examples include Justin Bieber, Simone Biles, Naomi Osaka, Selena Gomez and Tyson Fury. Over one in five American adults are estimated to suffer from diagnosable mental… Continue reading What Happens When We Ignore Genuine Mental Illness?

The post What Happens When We Ignore Genuine Mental Illness? appeared first on 51Թ.

In recent years, the prevalence of mental health issues has been magnified by the number of entertainers and athletes who are living, or have lived through, such issues. Prominent examples include Justin Bieber, Simone Biles, Naomi Osaka, Selena Gomez and Tyson Fury. Over one in five American adults are estimated to suffer from diagnosable mental health conditions, with people aged 18–25 experiencing them at much higher rates — nearly 34% — than other demographics. The rates are broadly comparable in the United Kingdom.

But mental health issues were not ascribed to a now-infamous unnamed woman from Cheshire, England. This woman trapped her baby in an underbed drawer for nearly three years, keeping her alive by feeding her with a milky breakfast cereal through a syringe. She afforded her child no medical care or proper food and did not permit her to leave the drawer for long periods. The woman had other children apart from this one; the number of children and their ages were not disclosed.

The hidden child was discovered only by accident, when the mother’s partner used the bathroom and heard noises in her bedroom. The child was suffering from malnutrition, dehydration and a cleft palate.

When questioned, the mother revealed that the baby girl had been born in a bathtub at her home in March 2020. She didn’t tell the father, as they had an abusive relationship. Instead, she kept the baby a secret from him and the authorities. So, the child was never provided with medical attention nor even registered at a register office. There was no legal record of the birth. Perhaps the most chilling court testimony came from a caregiver now looking after the child, who said the three-year-old girl, once recovered, had needed to be taught to smile and “didn’t know what was.”

The court’s neglected options

The woman’s defense attorney claimed her mental health, a volatile relationship with the abusive father and the Covid-19 lockdown had combined to create an “ of circumstances.” Regardless, the court sentenced her to seven-and-a-half years in prison.

Under provisions of the UK’s Mental Health Act 1983, if a defendant is found to be suffering from a mental disorder at the time of the offense, they can be sentenced to hospitalization rather than prison. The court might have sent the defendant to a secure psychiatric hospital if it deemed her unfit for a prison environment due to her mental condition. There were other options.

In England, if the court determines that a defendant’s mental health issues are present but not severe, it may issue an order permitting the individual to receive psychiatric treatment and supervision while living in the community rather than serving a prison sentence. In some exceptional cases, the defendant can be found not guilty by reason of insanity if a mental disorder is considered to have prevented them from understanding the nature or consequences of their actions at the time of the offense. This is not the same as merely having a mental health condition: It denotes an inability to comprehend the criminality of one’s actions.

None of these options was taken. The verdict’s implication is that the court considered the woman to be of sound mind, in possession of her faculties and capable of thinking clearly. This strikes me as, in its own way, every bit as bewildering as the woman’s horrifyingly transgressive behavior. At a time in history when celebrities habitually claim to suffer anxiety, distress and miscellaneous other ailments associated with mental illness and are readily believed, how is it possible to conclude the woman is compos mentis (having control of one’s mind)?

There is scant evidence of the woman’s motivation. During an interview with police, she said she had not known she was pregnant and was “really scared” of giving birth. Remember, she already had children. She added that the underbed drawer was never closed and that the child did not remain in it at all times. But the girl was “not part of the family.” Puzzlingly, none of her other children reported the extraordinary presence of the child in the drawer.

Comparable cases

As uniquely grotesque as this case is, it resembles several other instances of extreme cruelty, the most notorious being the case uncovered in Amstetten, Austria in 2008. In this gruesome affair, Josef Fritzl kept his daughter Elisabeth locked in a cellar from age 18 to 42. During her time in captivity, Fritzl raped her thousands of times, fathering seven children with her. Fritzl was jailed for life by a court in 2009, but he spent the time in a psychiatric institution until 2024, when he was diagnosed with dementia.

That’s not all. David and Louise Turpin imprisoned and abused their 13 children at their home in Perris, California. The couple was exposed in 2018 when one child, 17-year-old Jordan Turpin, escaped and called the police. They pleaded guilty to torture and were sentenced to life in prison. There was no indication that the court found the parents to be suffering from significant mental health issues that would have mitigated their sentences.

Cases of cruelty to children by parents and stepparents are grimly repetitious. A ten-month-old boy was murdered by his parents, Stephen Boden and Shannon Marsden, in Chesterfield, Derbyshire in 2020. An eleven-year-old boy was tortured and killed by his stepmother in Placerville, California, also in 2020.

In 2021, a 17-year-old girl was found in Floreat, Western Australia and admitted to Perth Children’s Hospital in Nedlands. She was severely malnourished, infantilized and had been kept captive by her parents, both women. The girl weighed under 62 lbs, well below the healthy range for a young woman of her age: 105–150 lbs. The girl was homeschooled and allowed limited interaction with peers at dance school. The parents will undergo psychological assessments before sentencing in January 2025.

Sources of mental illness

All these cases elicit our incredulity. It’s difficult to believe, let alone understand, behavior that causes pain and sustained suffering to children at the hands of the very people who bore them. Explaining it in terms of the social circumstances of the torturers and killers is a tall order. However, we can sometimes discern patterns of intimate partner violence, coercive control and other kinds of domestic abuse, compounded by relative cultural deprivation and the failure of care organizations.

These are the kind of social conditions under which mental illness develops. Dysfunctional families, traumatic events, convulsions and conflicts are all potential triggers. Mental health maladaptation has its source in circumstances, but it manifests in a way that demands a particular response. Locking people up is a crude rejoinder.

In other words, mental illness, disorder or, to fall back on today’s favored term, issues, have their origins in social experiences. But they express themselves in thoughts and actions that persuade us they are purely individual properties. Perversely perhaps, mental illness often coexists with a kind of rationality: People who harm or kill children typically employ manipulation, intimidation and isolation, all of which require some degree of planning and consideration of what’s likely to happen in the future. The perpetrators mentioned so far and, indeed, all other known or unknown child tormentors and killers behave in accordance with reason and even logic. This does not mean they are mentally well: They are not. They do have mental problems.

This should make us reflect when we say, “mental health issues.” Obviously, this is a kaleidoscopic term, not a description of a single malady. It is a constantly changing pattern or sequence of experiences and states. Describing perpetrators of violent crimes against children as “monsters” is trite and misleading. Their actions may appear inhumanly cruel and violate every known assumption we harbor about loving family relationships. But they are unmistakably, harrowingly human and betray facets of family life we prefer to deny.

Every way I think about the hideous case at the center of this piece, I arrive at the conclusion that the woman, now presumably serving her seven-and-a-half years in prison, is not mentally well. And I mean genuinely. Her punishment seems more of a sacrifice than corrective or reparative action. 

We blithely use “mental health issues” to describe the relatively mild discomforts of celebrities yet avoid applying the term to people who clearly are mentally unwell and often in dire need of treatment. My argument in no way removes the woman’s actions from what they are: abhorrent, sickening and unutterably loathsome. This should not preclude recognition that the perpetrator is afflicted, nor closer examination of the sources of her affliction.

[Ellis Cashmore’s “” is published by Bloomsbury.]

[ edited this piece.]

The views expressed in this article are the author’s own and do not necessarily reflect 51Թ’s editorial policy.

Quincy Jones, Michael Jackson and Off the Wall /culture/music/quincy-jones-michael-jackson-and-off-the-wall/ /culture/music/quincy-jones-michael-jackson-and-off-the-wall/#respond Thu, 14 Nov 2024 13:27:03 +0000 /?p=153035 What would have happened if John Lennon hadn’t met Paul McCartney at the Woolton Parish Church Garden Fete, Liverpool in 1957? Or if director Brian De Palma hadn’t introduced Martin Scorsese to his friend, Robert De Niro, in 1973? Or if Anni-Frid Lyngstad hadn’t, in 1969, sung at Sweden’s Melodifestival where she met Benny Andersson… Continue reading Quincy Jones, Michael Jackson and Off the Wall

The post Quincy Jones, Michael Jackson and Off the Wall appeared first on 51Թ.

What would have happened if John Lennon hadn’t met Paul McCartney at the Woolton Parish Church Garden Fete, Liverpool in 1957? Or if director Brian De Palma hadn’t introduced Martin Scorsese to his friend, Robert De Niro, in 1973? Or if Anni-Frid Lyngstad hadn’t, in 1969, sung at Sweden’s Melodifestival where she met Benny Andersson and started a collaboration that would lead to the formation of ABBA? No one can say, but there seemed a divine providence at play in all those rendezvous; as there was when Michael Jackson met Quincy Jones in 1978.

In honor of Jones’s passing on November 3, 2024 at the age of 91, I’d like to retell the story of this groundbreaking partnership.

Something in my head

Jones was on the film set of The Wiz, a film version of a Broadway musical that retold the 1939 film The Wizard of Oz with an all-black cast. Diana Ross played Dorothy, originally Judy Garland’s role. Jackson, part of the Jacksons, was also in the film. At the time, he was a known commodity, but far from the world-renowned figure he became.

Director Sidney Lumet was a friend of Jones’s and wanted the composer/producer to provide orchestral gravitas for The Wiz’s soundtrack. Jones wasn’t impressed by the musical, but apparently felt he owed Lumet a favor or two. He and Jackson didn’t know each other before the film but struck up a serviceable working relationship. Jones later told The Hollywood Reporter’s Seth Abramovitch that he remembered Jackson approached him with a task: “I need you to help me find a producer,” he said. “I’m getting ready to do my first solo album.” (Truthfully, he had made four previous solo albums.)

The two men discussed the possibility of renewing that relationship on the projected solo album, for which Jackson had already written three songs. Jones became curious about how Jackson was able to write songs without a musical instrument. According to Time’s Steve Knopper, the exchange went something like: “I hear something in my head. I make the sounds with my mouth.”

On hearing this, Jones grew interested. “There’s an instrument that can make the sounds you want. I can write anything down on paper,” Jones replied. “If you can hear it, I can write it down.” We’ll never know whether Jackson’s career would have soared and crackled like a rocket or merely hissed like a squib had Jones not been intrigued and agreed to work on the mooted album.

Transformation

All the same, inviting Jones to take the weighty role of producer carried some risk. Like any entertainer, Jackson must have been aware of audience expectations, which had been sharpened to a point by the then-popular Philadelphia Sound and the Saturday Night Fever disco that captivated the public in the mid-1970s. The sweet-sounding Jacksons were perfect for the late 1960s and early 1970s. But against a background of Sylvester’s thumping synths and Chic’s twanging bass lines, the brothers sounded tame and, perhaps worse, quaint.

The last thing Jackson wanted at this pivotal stage in his professional life was to sound old-fashioned. So, Jones, for all his mastery, wasn’t an obvious choice. He was 45 in 1978. Five years earlier, he had produced an Aretha Franklin album that lacked her gutsy blues quality and hadn’t overly impressed critics or consumers. His own double album had been released to little impact in 1976.

Somehow, Jackson became convinced Jones could provide him with the kind of transformative makeover he wanted. Perhaps it was a compelling incongruity, like casting Charlize Theron as prostitute-turned-serial-killer Aileen Wuornos in Patty Jenkins’s 2003 film, Monster. It looked so odd, it might just work. Known for her glamor, Theron gained weight, wore false teeth and turned herself into a believable Wuornos. Jones seemed such an unusual producer for Jackson’s project, it too might yield something surprising.

George Benson, a guitar prodigy who grew to prominence with his distinctive style of soul-infused jazz, once reflected on his own particular relationship with Jones. For years, Benson was discouraged from singing by his record company. Jones produced his breakthrough album in 1980 and issued contradictory advice. “Quincy Jones looked at me and said: ‘I know you better than you know yourself.’ This made me feel angry, though I didn’t say anything. But he was pushing me to do things that didn’t come naturally to me,” Benson told the Financial Times’ David Cheal. “He was always pushing me to do things. He persuaded me to sing in a way that didn’t feel comfortable.”

Once outside his comfort zone, Benson sang in the unnatural way Jones suggested and the process yielded a record. “And it was a smash,” he said. The album won him three Grammys in 1981. Jackson never said Jones pushed him in the way Benson described, though the product of the collaboration suggests Jackson might also have been displaced from his comfort zone — with similarly agreeable results.

Life ain’t so bad at all

Those results were well-received, though not ecstatically. Rolling Stone’s Stephen Holden called their album, Off the Wall, “A slick, sophisticated R&B-pop showcase with a definite disco slant … A triumph for producer Quincy Jones as well as for Michael Jackson.”

There was disagreement over Jackson’s voice. New Republic’s Jim Miller wrote that, “Jackson’s voice has deepened without losing its boyish energy. He phrases with delicacy, sings ballads with a feather touch.” But the Los Angeles Times’s Dennis Hunt cautioned, “The adolescent frailties that linger in Jackson’s voice are nagging enough to, if uncontrolled, undermine good material and production.” In the end, though, he commented, “Thanks to producer Quincy Jones, that didn’t happen here. The result is one of the year’s best R&B albums.” Presciently, Hunt wondered, “Is it possible that he’s outgrown the Jacksons?”

Between them, Jackson and Jones captured the audacity of a notionally prosperous, upwardly mobile African-American population. They were willing to take risks, avoiding a disco saturation but absorbing enough of the euphoria that animated dancefloors around the world. They added lush arrangements that might, with another artist, have sounded too sickly, or worse, clichéd. Here, they sounded innovative and sophisticated.

Even the cover art radiated aplomb: 21-year-old Jackson was wearing black tie, tuxedo and loafers. He seemed to be searching for something. His right to be free from his brothers? Or family, perhaps? Or more likely, self-validation: with Jones, he seemed to discover a license to be a fully-fledged independent artist.

Off the Wall (1979).

Sure, he had released four solo albums before. But none came close to Off the Wall in terms of artistry and imagination — and maybe irony. The expression “off the wall” meant unusual or strange, and the chorus of the title song was, “Life ain’t so bad at all if you live it off the wall.”

Reviews for The Wiz bore no resemblance to the warm approval Off the Wall had drawn. Time expressed the film critics’ consensus in its review, headlined “Nowhere Over the Rainbow.”

Off the Wall is regarded as a classic. It won a Grammy in 1980 and multiple American Music Awards in the same year, and was inducted into the Grammy Hall of Fame in 2007. It spawned four Top 10 singles on the Billboard Hot 100, including “Don’t Stop ‘Til You Get Enough” and “Rock with You” — both number ones.

Yet, as often happens when two artists collaborate and produce a creation for the ages, they had a falling out.

The sour aftermath

“Mr. Jones and Mr. Jackson had worked together for years, forging one of the most productive and profitable relationships in pop music,” The New York Times’s Colin Moynihan wrote. “The two worked together on albums … that sold tens of millions of copies and catapulted Jackson — already famous from his days in the Jackson 5 — into superstardom.” And yet, years after Jackson’s death, Jones found himself in court, head-to-head with the Jackson family.

They had continued to work together. Jones produced two more Jackson albums: 1982’s Thriller, which became the best-selling album of all time; and 1987’s Bad, which was the first album to have five consecutive singles reach number one on the Billboard Hot 100. Cumulatively, the three albums have sold over 100 million copies.

The three albums were made and released at a time when music videos were hitting their stride and practically every record had a short companion movie. Jackson and Jones never fought or, as far as we know, even argued. But the release of Kenny Ortega’s 2009 film, This Is It, a documentary feature based on rehearsal footage shot while Jackson was preparing for his proposed comeback in 2009, brought conflict. Jackson died on June 25 of that year and never made a comeback, of course.

Songs originally produced by Jones were included in the film’s soundtrack album. Jones filed suit against the Jackson estate, claiming, as Rolling Stone’s Miriam Coleman reported, “under the contracts, he [Jones] should have been given the first opportunity to re-edit or re-mix any of the master recordings and that he was entitled to producer credit for the master recordings, as well as additional compensation if the masters were remixed.” Obviously, no one could have foreseen how such an opulently smooth album could lead to legal convulsions decades later.

In 2013, Jones claimed Sony Music Entertainment and Jackson’s estate owed him close to $30 million in royalties for edits and remixes of music he produced with Jackson during their collaboration. Four years later, in 2017, a jury in Los Angeles County Superior Court decided that Jones had not been sufficiently rewarded by the Jackson estate for the use of records he had produced and which were featured in This Is It. The court awarded him $9.4 million.

Three years later, a California appellate court reduced this to $2.5 million, the amount due to Jones for the use of his master recordings and other fees. It seemed a bitter conclusion to a relationship that, in many ways, remolded Jackson into a legitimate icon. While Jones maintained his dispute was not with Jackson himself, journalist Martín Macías, Jr. quoted the Jackson estate’s attorney as saying: “Quincy Jones was the last person we thought would try to take advantage of Michael Jackson by filing a lawsuit three years after he died asking for tens of millions of dollars he wasn’t entitled to.”

Jones too seemed to turn vindictive. While he’d enjoyed an amicable relationship with Jackson over many years in the 1970s and 1980s, he later reflected, “He [Jackson] was as Machiavellian as they come.” In a 2018 interview with Vulture’s David Marchese, he declared, “Michael stole a lot of stuff,” meaning his compositions incorporated passages from other artists’ music.

It was a sour end to an artistic collaboration that ranks with the greatest of modern times. Nothing will, in practice, diminish the significance of Off the Wall. It is established in pop music’s pantheon. For all his colossal contribution to music, Jones’ elemental role in the creation of Jackson’s album will be his defining achievement.

[Ellis Cashmore’s “” is published by Bloomsbury.]

[ edited this piece.]

The views expressed in this article are the author’s own and do not necessarily reflect 51Թ’s editorial policy.

Swapping Sex: A Timeline of Transgender Trailblazers /interactive/swapping-sex-a-timeline-of-transgender-trailblazers/ /interactive/swapping-sex-a-timeline-of-transgender-trailblazers/#respond Mon, 11 Nov 2024 11:47:53 +0000 /?p=152991 The post Swapping Sex: A Timeline of Transgender Trailblazers appeared first on 51Թ.

Why Don’t More Children Kill Their Parents? /culture/why-dont-more-children-kill-their-parents/ /culture/why-dont-more-children-kill-their-parents/#respond Sun, 03 Nov 2024 09:22:22 +0000 /?p=152865 In 1958, sixteen-year-old William Arnold asked his parents for permission to use the family’s car. He wanted to go to the movies. When his father refused, he took a rifle, shot both parents dead and buried them in a shallow grave in the backyard of their home in Omaha, Nebraska. He was sentenced to two… Continue reading Why Don’t More Children Kill Their Parents?

The post Why Don’t More Children Kill Their Parents? appeared first on 51Թ.

In 1958, sixteen-year-old William Arnold asked his parents for permission to use the family’s car. He wanted to go to the movies. When his father refused, he took a rifle, shot both parents dead and buried them in a shallow grave in the backyard of their home in Omaha, Nebraska. He was given two life sentences in the Nebraska state penitentiary. He served only eight years before escaping.

It seemed an extraordinary crime, though, in a sense, it was less extraordinary than it first appears: children kill their parents more often than readers might suppose. It’s called parricide, and the most recent instance emerged in Great Baddow, Essex, in England, where Virginia McCullough, now 36, poisoned her father with prescription medication that she crushed and stirred into his drink and, in the attorney’s words, “beat her mother with a hammer and stabbed her multiple times in the chest with a kitchen knife bought for the purpose.” This happened four years ago. She stored the putrefying corpses at the family house until they were discovered by police.

There are other comparable murders in recent times. 1998, Auckland, New Zealand: Matthew and Tyler Williams, aged 14 and 13, killed their parents. 1989, Beverly Hills, California: Lyle and Erik Menendez killed both parents. 2011, Port St. Lucie, Florida: Tyler Hadley pummeled both his parents to death after they refused to let him host a party at the family home. 2013, Albuquerque, New Mexico: 15-year-old Nehemiah Griego killed his parents and three younger siblings. 2015, Broken Arrow, Oklahoma: Robert and Michael Bever murdered their parents and three siblings in a mass stabbing. There have been related cases, for example, that of Jennifer Pan, who hired assassins to kill her parents in Ontario, Canada: She was sentenced to life in prison after being convicted of both first-degree murder and attempted murder.

Parricide in history

Abhorrent and unnerving as parricide strikes us, it was prevalent in the medieval era (specifically, the 11th–14th centuries). Disputes over succession and land ownership were usually the source of dynastic violence: Younger children, ambitious or desperate for power but blocked by their parents, killed fathers, sometimes mothers and occasionally siblings in their pursuit of control.

Even in territories dominated by cultures that encouraged honoring, respecting and loving parents, and norms that emphasized filial duties, there are examples of children either killing or trying to kill their parents. Most famously, Aurangzeb, the sixth Mughal emperor of India (1658–1707), imprisoned his father, Shah Jahan — famous for building the Taj Mahal in memory of his wife — and killed many of his male relatives in his rise to power. While technically not a case of parricide, Byzantine Emperor Constantine VI imprisoned his own mother, Irene, who later retaliated by having him deposed and blinded.

Historical cases of parricide are often intelligible in terms of ancestral struggle, though we should also remember that current conceptions of the family, as a cohesive, supportive natural unit in which love, caring and unselfishness are taken as natural, are products of a relatively recent understanding of the conjugal family. Earlier forms of the family tended to be different.

Multi-dysfunctional families

One of the most ferocious critiques of the family, particularly the modern nuclear family, was that of the psychiatrist R. D. Laing, who challenged the assumption that the arrangement was wholesome and beneficial to children. Laing’s argument about the impact of family relationships on mental health offers a way of comprehending contemporary parricide. For Laing, our image of the family has been pasteurized. His own account is more adulterated: The family is often a multi-dysfunctional amalgam from which children sometimes escape bruised, if not permanently damaged emotionally and cognitively (and sometimes physically). The family imposes roles, identities, and expectations on individuals in ways that can lead to anxiety, distress, sometimes schizophrenia and what we today euphemize as “mental health issues.”

Children, on Laing’s account, sometimes experience a suffocating sense of captivity and believe there is no escape from family demands. Parricide, from this perspective, would be a violent attempt at liberation or self-assertion. In all of the cases in recent history, Laing’s approach appears to have relevance.

So, why do some children embark on the putatively forbidden path while others think about it and then withdraw, just in time to leave home feeling virtuous? In other words, if Laing’s account is even half-accepted, why isn’t parricide more widespread? Here I invite readers to ask a question usually ignored by criminologists and other social scientists. Not, “Why do children kill their parents?” but, “Why do so few children kill their parents?”

Why isn’t there more parricide?

What if we tried to explain conformity instead of spectacularly conspicuous divergences from socially accepted standards? Philosopher Thomas Hobbes (1588–1679) did exactly this, of course, his aphorism being “The life of man [is] nasty, brutish and short.” Human beings are driven principally by selfish concerns, the fear of extinction being the primary one. So, the “natural” condition of humanity is warlike: Society, as we know it, is an artificial apparatus to accommodate the coexistence of divergent, self-seeking individuals. We frame rules, laws and norms so that over time, we become conventional, behaving in a way that meets others’ expectations. Mostly.

The disorienting implication of Hobbes’ thoughts is that we are all not just capable of but have an inclination or natural tendency to behave in a way that serves our own interests, no matter what the cost to others. Why don’t we, then? Travis Hirschi, an American criminologist born in the 1930s, supplied an answer: We learn to conform and tend to remain compliant with rules by forming affiliations that secure us to conventional society.

In Hirschi’s social bond theory, individuals are stitched into conventional life in four ways: attachment, investment, beliefs and reputation. The most important one is attachment to parents, peers and other people who matter to us in some way. Hirschi also believed that as we mature, we invest in society, specifically the years we spend in formal education and in pursuing our careers and starting a family. In many cases, individuals acquire a reputation that they try to maintain or enhance. In other cases, individuals fail to reach the standard, status or rank they had been aiming for. Our attachments prevent us from breaking rules or norms. When they are loose, slack or broken, the probability of transgressive behavior becomes pronounced.

The fathomless McCullough case

We know little of the McCullough case at the moment, though Hirschi’s theory gives it a shape. But there are other perplexities: Virginia, the self-confessed killer, had three sisters, none of whom has been implicated. During the investigation and trial, there were no allegations against these siblings for complicity or involvement in the murders or the subsequent concealment of the bodies.

It seems the siblings were unaware of their parents’ deaths and had somehow been reassured by Virginia that their parents were either unwell or away from home. Virginia contrived an incredible and — at least to this writer — implausible series of excuses for years. Statements from the siblings during the trial suggest they trusted Virginia.

Even in the context of other cases of parricide, the McCullough killings are staggering. The actual killings are explicable in terms of misfiring family dynamics and the failure of at least one family member to form meaningful bonds with wider society. But storing what must have been two rotting cadavers at the family home for four years without arousing the suspicions of neighbors, care agencies or other family members takes some fathoming. This is a case that is destined to confound us for years.

[Ellis Cashmore’s “” is published by Bloomsbury.]

The views expressed in this article are the author’s own and do not necessarily reflect 51Թ’s editorial policy.

Does Taylor Swift Want To Be a Genuine US President? /world-news/us-news/does-taylor-swift-want-to-be-a-genuine-us-president/ /world-news/us-news/does-taylor-swift-want-to-be-a-genuine-us-president/#respond Sat, 21 Sep 2024 12:47:42 +0000 /?p=152357 Imagine cleaning out your basement, finding what appears to be a charming but unremarkable painting, then scratching its surface to discover a Frida Kahlo self-portrait beneath. In 2012, Taylor Swift was a prominent country music artist with crossover appeal, but not a major force in entertainment. Then came the Red album and the genius began… Continue reading Does Taylor Swift Want To Be a Genuine US President?

Imagine cleaning out your basement, finding what appears to be a charming but unremarkable painting, then scratching its surface to discover a Frida Kahlo self-portrait beneath. In 2012, Taylor Swift was a prominent country music artist with crossover appeal, but not a major force in entertainment. Then came the Red album and the genius began to appear. Comparisons with Mozart are now more commonplace and understood, and universities teach courses on her. She occupies the same kind of status as Madonna and Michael Jackson in the 1980s and 1990s and, earlier, Elvis Presley and the Beatles. The Kahlo is now visible. Is there yet another layer?

Swift’s recent endorsement of United States presidential candidate Kamala Harris may conceal more than it reveals. After all, everyone knew her political allegiances lay with the Democrats; none of her Instagram followers or anyone else would have been surprised that she wants Harris to win the forthcoming election. Maybe the endorsement is something more: advance notice that Swift intends to become a political presence. If so, she could run for president in 2028. By then, she’ll be 39 years old. John F. Kennedy was 43 when he was elected in 1960, making him the youngest elected president in US history.

A new day?

Preposterous as it sounds, remember: In May 2015, Donald Trump was known principally for the NBC television show The Apprentice, which he had fronted since 2004. He’d made his political views well known, taking out full-page advertisements in The New York Times and The Washington Post criticizing US foreign policy in 1987. In 1999, Trump briefly explored running for the Reform Party’s nomination for president in the 2000 election, though he ultimately withdrew.

So when Trump announced his candidacy as a Republican in June 2015, it came as an outrageous surprise. He’d never held political office of any kind. Only one other president had been elected without prior political experience: Dwight Eisenhower, whose service as Supreme Commander of the Allied Expeditionary Force in Europe during World War II provided him with skills that translated well to the presidency. He served two terms, from 1953 to 1961.

Eisenhower was a product of a different age in US politics. Trump is very much part of an age when the US struggles with a political bipolarity: Policy vs passion, logic vs emotion, wisdom vs relatability. Politicians are elected as much for celebrity appeal as leadership capability. Voters seem ready to believe they are much the same thing. How otherwise can we explain Trump’s success in 2016?

Two years after Trump’s election, Oprah Winfrey seemed poised to turn the 2020 election into a showbusiness extravaganza when she hinted she was thinking about running for president. At least, that was the inference from her speech at the Golden Globes. “A new day is on the horizon,” she prophesied. In 2018, Oprah was at her persuasive peak. She was arguably the single most influential person in the world and would have made a formidable contender, despite her political inexperience. Oprah was a rare celebrity, praised for her moral authority, venerated for her inspiration and respected for her support of countless women. She seemed kissed with purpose — her destiny was surely the White House.

Trump actually named Oprah as a possible running mate when he was putting himself forward with the Reform Party in 1999; it’s doubtful she would have been interested.  She settled into a kind of trusted advisor role, dispensing wisdom and assistance without showing any ambition for power. Today, Oprah has lost her momentum, though her coruscating endorsement of Harris was a reminder of her presence. She remains an interested party.

Public face and private life

Traditional politicians like senators and governors have, in recent years, lost immediacy. They project personae and exude authority in a carefully stylized and practiced manner, using the media in much the same way Bill Clinton (president 1993–2001) or George W. Bush (president 2001–2009) did. By contrast, figures from entertainment know how to make themselves believable. They engage audiences by sharing ostensibly private insights and the experiences that shape or scar them.

Swift, like other celebs, makes no attempt to separate her public face from her private life. She surpasses arguably every artist in history in her ability to share personal experiences through her music. Her fans wax lyrical about how that music speaks to them personally, with insight and vision. Many of her fans are too young to vote now, but they won’t be in four years.

Some readers will think I’ve stumbled Lewis Carroll-like down a rabbit hole leading to a land of magic and strange logic. I remind them that in 2016, Trump won the presidency with 304 Electoral College votes to Hillary Clinton’s 227. He may yet be re-elected. Swift will not feel intimidated by her lack of political worldliness, sophistication or practical knowledge. After all, Trump had none of these advantages either.

In 2018, Swift publicly supported Democrats in her home state of Tennessee, causing a surge in voter registrations, especially among young people. It was the first sign of political engagement among her fans. The following year, she spoke out in favor of the Equality Act. In her 2019 music video for “You Need to Calm Down,” she promoted a petition supporting the act. She was an active supporter of the Black Lives Matter movement as well.

So perhaps it makes sense for her to maintain her position on the sidelines and encourage advocates, but without risking what could be a damaging misstep. A-listers like Barbra Streisand and George Clooney have stayed in their own dominion while earnestly making their political preferences heard. This would be Swift’s safest choice. After all, you can have too much of a good thing and no one in history has ever been as ubiquitous, audibly as well as visibly. Could audiences just get sick of her?

One of the verities of celebrity culture is that it values change, freshness and novelty. Swift has been on top longer than most. Maybe she recognizes this herself and is already plotting a segue into politics. It’s not the logical move: that would be a sidestep into movies. Not that film is without perils: Madonna crashed in cinema as spectacularly as she had succeeded in music.

The sanest thing to happen to the US

Celebrity times demand celebrity politicians — or politicians who are prepared to greet Oprah’s “new day” and entertain as much as govern. In showbusiness, Swift has reached Parnassian heights: astral record sales, unsurpassable box office and unbelievable social media followings. Artistically and commercially, she is at her zenith, cleverly integrating critiques of patriarchy into her songs when she conveys how even unmistakably successful women are still liable to run into misogyny.

But is it all just too trivial? The state of the world is grim and nothing Swift does will change that right now. But the winds are blowing in her direction: The post-Harvey Weinstein tremors have destabilized patriarchy and the #MeToo movement remains a force. Would Sean Combs have met with instant condemnation and been declared persona non grata were the allegations against him known ten years ago? Censured, castigated, deplored, perhaps; but probably not canceled, as he surely will be. The historical privileges of manhood are disappearing.

Will Swift feel like culture-hopping from music to politics? It may be a leap too far, but no one can ignore her unstoppable influence. Much, I believe, depends on the outcome of the November election. If Harris wins, Swift will devote more time to championing her, perhaps closing the distance between herself and the Democrats, but not maneuvering into the political mainstream. If Trump wins instead, Swift may take the leap of faith and embrace the impossible, as giddily disturbing as this sounds today. Given modern America’s history, Swift’s leap could be the sanest thing to happen to the US.

[Ellis Cashmore is the author of  and .]


The views expressed in this article are the author’s own and do not necessarily reflect 51Թ’s editorial policy.

Voters Want Politicians Like Trump and Harris to Be Celebrities /politics/voters-want-politicians-like-trump-and-harris-to-be-celebrities/ /politics/voters-want-politicians-like-trump-and-harris-to-be-celebrities/#respond Wed, 04 Sep 2024 11:35:43 +0000 /?p=152121 “How has the national debt personally affected each of your lives? And, if it hasn’t, how can you honestly find a cure for the economic problems of the common people if you have no experience of what’s ailing them?” Republican candidate George W. Bush stood and started to answer this question before the chair interrupted… Continue reading Voters Want Politicians Like Trump and Harris to Be Celebrities

“How has the national debt personally affected each of your lives? And, if it hasn’t, how can you honestly find a cure for the economic problems of the common people if you have no experience of what’s ailing them?”

Republican candidate George H. W. Bush stood and started to answer this question before the chair interrupted him and warned he was digressing. “Help me with the question,” he requested after getting tongue-tied. The questioner wanted to know how he was personally affected. Democratic candidate Bill Clinton took his turn to answer. He stood, walked toward the audience and spoke, not to the audience but to the woman who had asked the question. He motioned to her, his eyes fixed on hers. “In my state, when people lose their jobs, there’s a good chance I’ll know them by their names.”

It was a transformational moment in politics. Of course, we didn’t know it at the time, but on October 15, 1992, at the University of Richmond’s Robins Center, politics changed. The hapless Bush was aloof and seemed almost contemptuous while Clinton interacted relaxedly with the audience without feints or deviations. It was as if he was having private conversations that could be heard, not overheard.

Outside politics, cultural change was turning us all into voyeurs. I don’t mean that people started to take an unwholesome pleasure from watching others engaged in sex or suffering in some way (although some might have). No, the new voyeurism involved the guiltless enjoyment of observing or eavesdropping on private conversations and discovering intimate details of others’ lives, particularly through television and, later, social media. This reflected a growing fascination with the personal and often unfiltered experiences of others. We called it curiosity. It soon extended into politics.

Political celebrities who seem like real people

Celebrity culture was, for many, a Trojan horse: Innocuous-looking enough to allow into our lives but baleful in its consequences. Our captivation with the lives of other people seems perfectly natural now. But it wasn’t in the 1970s. The misleadingly inoffensive horse entered in the 1980s, so that by the early 1990s, it had already taken up residence. Impatient with entertainers who were cautious about sharing details of their private lives, audiences wanted everyone to be like Madonna: unsparing in their distribution of the minutiae of their lives.

Audience appetite was for real people —  not the disproportionately impersonal and untouchable godlike characters who dominated public life for most of the 20th century, but people who resembled the other people they were supposed to entertain. 

This affected politicians. It seems laughable that we once looked up to them. For most of the 20th century, they were guardians in a benevolent moral and ministerial sense. The electorate admired, respected and, in some cases, idolized these near-transcendent beings. By the 1990s, however, audiences no longer admired politicians from afar; they wanted close-ups. What’s more, they demanded access to their private lives, blurring the lines between public service and entertainment.

Clinton seemed to understand the power of ordinariness. The folksy, down-to-earth charm that characterized him and allowed him to face several accusations of impropriety and an impeachment with equanimity made him one of the most popular presidents in history.

Clinton’s kind of ordinariness became a valuable resource. Audiences responded to politicians who mirrored themselves: They may have had more power, authority, status and attention; they may even have led more opulent lifestyles; but, unlike politicians of earlier eras, the new breed could and probably should exhibit the same kinds of flaws and problems as the people who followed them. So, Clinton’s sex scandals, far from being a source of damnation, worked like a celebrity benediction. There had been sex scandals before, but never anything approaching Clinton’s triple obloquy. The media, which by the early 1990s were ravenous for scandal, covered it extensively.

Bush’s struggle to connect with the audience contrasted starkly with Clinton’s approach, highlighting a shift in what Americans began to value in their leaders. George W. Bush followed Clinton to the White House. He was prone to gaffes, making him the object of parody and criticism, especially after the 9/11 terrorist attacks and the subsequent US invasions of Afghanistan and Iraq.

By contrast, Bush’s successor Barack Obama masterfully balanced the demands of celebrity culture with a scandal-free image, projecting the persona of a cool president. He had suaveness, eloquence and an uncommon ability to connect with a broad range of people, from appearances on talk shows to a readiness to share his musical tastes (he was known to favor Beyoncé, Tyla and Kendrick Lamar).

Harris, Trump… and Oprah

Obama’s successor, Donald Trump, entered politics as a fully formed celebrity in a similar way to President Ronald Reagan and California Governor Arnold Schwarzenegger — all three were well-known entertainers before their forays into politics. Trump hosted The Apprentice for 14 seasons from 2004 till 2015, so, by the time he won election in November 2016, he was an established figure in the media and popular culture.

Trump may have lacked Clinton’s magnetism and Obama’s relatability, but he could challenge both with his sex scandals and ability to dominate the news cycle. He had little experience in public office but was adept at maneuvering the media. Perhaps he still is. But is his audience still excited? Or are we witnessing Trump fatigue?

Audiences like novelty, freshness and new personalities. If Trump’s celebrity appeal begins to wane, Kamala Harris emerges as a pristine face in American politics. Despite being vice president since 2021, she’s relatively unknown. She’s probably the least-known nominee in living memory. She didn’t even benefit from the exposure of going through primaries. Ironically, this might not be such a bad thing.

Her paradigm will surely be Oprah Winfrey. A proven kingmaker since her pivotal “We need Barack Obama” speech in Des Moines, Iowa on December 8, 2007, Oprah has already given Harris her endorsement.

As far as I’m aware, there is no celebrity equivalent of osmosis in which style, knowledge and appeal can pass from one person to another. If there were, Harris should learn how it works. Harris’ campaign already has an Oprah feel: The “Joy” theme is confection, though not meaningless confection: It suggests Harris will, if elected, be a person who brings great pleasure and happiness — as celebrities often do.

The most amusing political spectacle in history

It seems frivolous to discuss celebrity culture in the solemn context of politics. But let’s face it: politics is no longer solemn. The dignity that once seemed to ennoble politicians has vanished and whatever they say seems glib or, at best, rehearsed. Small wonder that audiences expect value-for-money entertainment from politics. Politicians, at least the successful ones, know this and often respond in a way that elicits a reaction. Trump has an intuitive grasp of this: His bombastic statements and bumptious behavior guarantee him an expectant audience and a breathless media. His dismissal of a miscellany of accusations with a shrug gives him a certain sheen. He also recruits established showbusiness stars, sometimes to their chagrin (Abba asked Trump to stop playing their music at his rallies).

Like everything else, politics changes. Some might despair at the prospect of politics succumbing to trashy and meretricious celebrity culture. But voters demand it: They want politicians who are as imperfect as they are, empathic enough to be relatable, unpredictable in a way that keeps everyone curious and, above all, entertaining. And, if they’re not, they’re gone: There are plenty of politicians with presidential aspirations who rose to prominence but not for long. Who remembers Deval Patrick, Jim Gilmore or Lincoln Chafee — all hopefuls from recent political history?

Voters are accustomed to being entertained by all manner of celebrity, some weaponized with talent, others just disposable and quickly forgotten. Harris and Trump both want to convince voters that they’re not celebrities but serious politicians. That means much of the campaign will be about trying to command the media’s attention and shape the way it presents the candidates, whether as impressively august with superabundant leadership skills or just pretenders. This guarantees the campaign will deliver a theatrical, extravagant and probably the most amusing political spectacle in history.

[Ellis Cashmore is the author of Celebrity Culture, now in its third edition.]

The views expressed in this article are the author’s own and do not necessarily reflect 51Թ’s editorial policy.

This Is What Makes Celebrity Couple Drama Interesting to Us /culture/this-is-what-makes-celebrity-couple-drama-interesting-to-us/ /culture/this-is-what-makes-celebrity-couple-drama-interesting-to-us/#respond Sat, 24 Aug 2024 15:20:23 +0000 /?p=151952 “Jennifer Lopez and new flame Ben Affleck kissed, cuddled and made goo-goo eyes at each other for hours yesterday as the Latina lovely was feted at a surprise birthday party.” So reported the New York Post on July 25, 2002. It was the first of countless stories about the couple known sometimes-affectionately as “Bennifer.” Twenty-two… Continue reading This Is What Makes Celebrity Couple Drama Interesting to Us

“Jennifer Lopez and new flame Ben Affleck kissed, cuddled and made goo-goo eyes at each other for hours yesterday as the Latina lovely was feted at a surprise birthday party.” So reported the New York Post on July 25, 2002. It was the first of countless stories about the couple known, sometimes affectionately, as “Bennifer.”

Twenty-two years later, the news broke: Bennifer is over. In the interim, there had been an engagement, two marriages (to other people), five children, more than 18 new fragrance endorsements, a few box office bombs, several spells in rehab and an Oscar. And, for a while, the kind of media delirium that produces headlines like “BEN AND JEN: BODY LANGUAGE: WHAT IT MEANS,” “BEN DEFINITELY WEARS THE PANTS” and “TELLS OF NIGHT WITH BEN.” Perhaps the most memorable was “BEN AND JEN SAY ‘NOT YET.’” In September 2003, Lopez visited her spiritual guide, spent two hours with her, then announced she was calling off her hugely publicized wedding to Ben Affleck. So the most recent breakup conjures a sense of déjà vu.

Here’s my question: Why? No, not why does this pair keep getting together, splitting up and then kissing-and-making-up before parting again? The more interesting question is: Why on earth are we so fascinated by them? For that matter, why are we fascinated by celebrity couples and their endless caprice?

Taylor-Burton: The beginning of celebrity couple coverage

Precedents can be found in the life of Elizabeth Taylor, whose combustible affair with Richard Burton imploded in 1974, after 12 years, only to regenerate itself in 1975. They married each other for a second time, but this marriage ended in less than a year. Taylor’s volatile romance is customarily considered the first modern celebrity coupling in the sense that it was copiously covered by the media. Because of this, it effectively promoted audience interest in how the other half love.

The Taylor-Burton amorous entanglement was a commodity — open, visible, public — compared to, for example, Ava Gardner’s erratic but essentially private romance with Frank Sinatra in the same period. With Gardner, the media were made to work for their stories.

Taylor, probably more than Burton, practically handed out press packs. Their relationship was a romance in the golden age of the American dream factory. As such, it was glitzy, glamorous and, at times, gaudy. There might have been some hesitance, perhaps even reluctance to stampede into Gardner’s and Sinatra’s private lives, especially as there were spouses and, more importantly, children to consider. Were the media likely to contribute to marital disharmony and even the sadness of innocent children merely by reporting the relationship? Taylor removed those kinds of uncertainties. She practically directed events, which involved double-home-destruction on a catastrophic scale.

Taylor, like Gardner, reminded the world that women could be and often were prime movers in relationships. Sinatra went on to become one of the preeminent entertainers of the 20th century. But during the marriage (1951–1957), Gardner, not he, was the main attraction. One journalist once asked her why she stayed with the 119-pound Sinatra. Gardner replied: “Well, I’ll tell you — nineteen pounds is cock.”

Similarly, Taylor was the force field that pulled in media from all over the world. Being the consummate Hollywood star — Burton had learned his art on the stage — Taylor knew the value of ostentatiousness. She behaved as if she were always in front of a camera. She usually was.

Tabloids and the new voyeurism

There was nothing comparable until 1999, when Jennifer Aniston and Brad Pitt appeared together at the Emmys and announced a relationship that was, for all intents and purposes, conducted in front of cameras. This included a lavish Malibu wedding in July 2000. The marriage lasted until 2005, by which time J.Lo’s epic relationship with Affleck had become known, had taken over as the celebrity coupling du jour and, in time, supplied a narrative of Homeric proportions.

There were other breakups that took the entertainment world by storm: Britney Spears and Kevin Federline in 2006. Justin Timberlake and Cameron Diaz in 2007. But Lopez and Affleck was epochal: It characterized a period when the media’s interest in the unappetizing areas of celebrity life was rising and audiences gave their approval to the increased coverage. One way they did this was by buying tabloid magazines.

Sales of the likes of Us Weekly, People and Star have slipped in recent years as social media has become the main conduit of celebrity gossip. But their impact in the early 2000s was appreciable and played no small part in cultivating our near-voyeuristic interest in glamorous couples. It could be plausibly argued that there was little new in this. Some might maintain that audiences had long been attracted to dreadful experiences while they remained at safe distance. Living through awful times vicariously may have its rewards: Just imagining how others feel rather than actually feeling is a pain with its own analgesic properties.

The decision by Aniston and Pitt to split and Pitt’s subsequent relationship with Angelina Jolie was the affair that shook tabloid journalism. It alerted editors that audiences enjoyed learning how people who otherwise led charmed lives were just as susceptible to the same painful ordeals and privations as anybody else.

This is part of the reason for our prolonged captivation with Lopez and Affleck and, to a lesser extent, other celeb couples. We might envy their lifestyles and adulation. We might even engage in wish-fulfillment and imagine what the world must be like with an A-list partner. Yet, there is gratification in learning that even the world’s most fabulous couples experience mundane squabbles and domestic discord, reminding us that beneath the glamor, they too are just as human as we are.

Performative coupledom and authenticity

That’s not the only reason we’re drawn to celebrity couples. Harper’s Bazaar writer Marie-Claire Chappet uses the term “performative coupledom” to describe the way many couples like J.Lo and Affleck present themselves to the media for our delectation. Chappet argues that celebrity couples are not passive recipients: They pull out as many stops as they can to pique the media’s curiosity. Coupledom can be a valuable and highly commodifiable item.

Chappet also suggests there is a kind of synergy in performative coupling. “Just look at Ben Affleck and Jennifer Lopez,” she writes, “both huge stars whose wattage flickered all the brighter once they got back together. In fact, in many ways, this couple are the ultimate embodiment of this trend.” The colossal coverage given the latest breakup underlines her point.

Neither party swept gracefully upwards after the 2003 breakup. Affleck had scored a triumph with his Oscar-winning film Argo, but had featured in flops, too. He struggled with alcohol dependency and had at least three spells in rehab. Lopez’s career also seemed to drift downwards when she appeared on the television series American Idol. But to her dubious credit, her Super Bowl halftime show appearance in 2020 elicited 1,312 complaints from viewers to the Federal Communications Commission (FCC). She was 50 years old at the time and most of the complaints were about the sexual explicitness of her performance. The latest rift will surely regenerate interest in the ill-starred duo.

No celebrity couple is perfect. Even the best-matched partnerships hit unexpected and often hidden snags, obstacles that complicate or even destroy relationships. If a couple is seen as just too good to be true, the adage kicks in: It usually is. Celebrity couples must have the imprimatur of genuineness to captivate us. This means extremely short affairs, like Kim Kardashian’s 72-day marriage to Kris Humphries, are dismissed as stunts. Or, in the case of Britney Spears, whose marriage to Jason Alexander lasted 55 hours, they’re viewed as false starts.

The seeming contradiction between an authentic relationship and performativity is smoothed over by audiences who like to see people at their best and worst. Today’s celebrity-savvy audiences suspect staging here and there and accept it. They are celebrities, after all. But couples must humanize themselves and remind audiences of their authenticity with everyday emotions, quarrels and fall-outs that serve to maintain captivation. An occasional rage helps, too.

J.Lo and Affleck may be waving goodbye to each other, but they might just as well be waving a banner bearing the slogan, “This is our pitch for immortality.” Individually, they’re probably worth a lot less than they are together. But even breaking-up unites them as far as the media and its audiences are concerned. The heartbroken pair appear to be marching toward celebrity immortality. Meanwhile, we wait for the reconciliation.

[Ellis Cashmore is the author of Celebrity Culture, now in its third edition.]


The views expressed in this article are the author’s own and do not necessarily reflect 51Թ’s editorial policy.

Sex in (and Out of) the White House /multimedia/sex-in-and-out-of-the-white-house/ /multimedia/sex-in-and-out-of-the-white-house/#respond Tue, 20 Aug 2024 13:03:03 +0000 /?p=151875 The post Sex in (and Out of) the White House appeared first on 51Թ.

The Tale of the Boy Who Cried “Racism!” /culture/the-tale-of-the-boy-who-cried-racism/ /culture/the-tale-of-the-boy-who-cried-racism/#respond Sun, 28 Jul 2024 12:34:45 +0000 /?p=151433 The French Football Federation recently announced its intention to file a legal complaint over “racist and discriminatory remarks” made by Enzo Fernández and other Argentinian football players. Fernández had shared a video on Instagram featuring him and his teammates singing about the rival players, specifically those of African heritage. “They play for France, but their… Continue reading The Tale of the Boy Who Cried “Racism!”

The French Football Federation recently announced its intention to file a legal complaint over “racist and discriminatory remarks” made by Enzo Fernández and other Argentinian football players. Fernández had shared a video on Instagram featuring him and his teammates singing about the rival players, specifically those of African heritage. “They play for France, but their parents are from Angola. Their mother is from Cameroon, while their father is from Nigeria. But their passport: French,” sang the artless athletes.

Possible overtones?

Invited to respond, Argentinian President Javier Milei and Vice President Victoria Villarruel shrugged and said Fernández was just being truthful. Aurélien Tchouaméni and several other players on the French national team are of Cameroonian descent. Ousmane Dembélé is of Senegalese, Mauritanian and Malian descent.

Days later, football fans in Argentina were repeating the chant. Fernández was investigated by association football’s world governing organization, FIFA, which has prioritized the fight against racism in the sport. The players can be suspended for up to 12 matches if the chant is found to be racist.

Is it racist?

I asked a Spanish-speaking friend for a translation of the comments, and he confirmed the above is accurate. He reckoned the chant had racist “overtones,” meaning it implied that to be properly French, you had to be white. I accept there were overtones. I also accept that the verse was derogatory and insulting to France’s black players. But I am still not convinced this is racism. Then again, racism itself changes.

The myth of race

In 1950, UNESCO published a significant report titled “The Race Question.” This report was one of the first major efforts to expose the scientific invalidity of race as a biological concept. It concluded that “for all practical purposes, ‘race’ is not so much a biological phenomenon as a social myth.”

Despite its mythic status, no one doubted the devilish concept’s potency. “Racism” referred to thoughts and theories predicated on the validity of “race” and the corresponding assumption that the human population was divided naturally into a hierarchy, with whites permanently at the top.

“Racialism,” on the other hand, described language or behavior that reflected those beliefs. So, racialism, or racial discrimination as it was often called, was obviously much more damaging to groups conceived as lower in the purported hierarchy. Anti-discrimination laws and policies were designed to manage racialism rather than educate people.

During the 1980s, the terms racism and racialism converged in academia, public discourse and policy discussions. “Racism” increasingly described both the belief in racial superiority and the resultant discriminatory behaviors. The focus shifted to recognizing that racist beliefs and actions were part of a larger, interconnected complex of injustice and subjugation.

Institutional racism

The term “institutional racism” was first used by Stokely Carmichael (later known as Kwame Ture) and Charles V. Hamilton in their influential 1967 book Black Power: The Politics of Liberation. Over time, the term became closely associated with the UK’s Macpherson Report on the death of Stephen Lawrence, published in 1999. In this report, institutional racism was defined as “the collective failure of an organization to provide an appropriate and professional service to people because of their color, culture or ethnic origin. It can be seen or detected in processes, attitudes and behavior which amount to discrimination through unwitting prejudice, ignorance, thoughtlessness and racist stereotyping which disadvantage minority ethnic people.”

According to the report, institutional racism is not only about overt acts of racism but also about the more subtle and systemic practices that lead to unequal treatment — what are now known as microaggressions. Institutional racism and plain racism were soon used interchangeably to mean widespread discrimination.

The parameters have shifted so that the concept of “race” is no longer germane. In 2018, for example, many people from Wales felt they were discriminated against on the grounds of national identity. Under the UK’s equality legislation, these concerns could be considered justified. The Welsh were a “protected group.” The defining feature of racism, in this conception, is not “race” but vulnerability to discrimination.

The Boy Who Cried “Wolf”

The benefits of categorizing racism in this way are many. Groups that have been treated wrongfully or prejudicially, be that presently or historically, are protected by law and can use the emotively powerful claim of racism in their defense. Offenses motivated by a victim’s supposed ethnicity, nationality, religion, sexual orientation, disability or similar characteristics are now grouped collectively as hate crimes. The defining characteristic is the perpetrator’s intention, not the victim’s attributes. Even a claim of a racist attack on a cisgender, fully abled, white male heterosexual has merit.

But there are dangers, the most obvious one captured by the phrase “cry wolf.” The fable of the tricksy shepherd boy who playfully misleads people with false cries of “Wolf!” is illuminating. When a wolf actually does appear, others are so used to the boy’s stunts that no one takes notice. Repeatedly claiming “racism” calls attention to an unpleasant and widespread presence, but may also devalue such claims. The enlargement of the concept to cover all manner of discrimination tends to trivialize racism in the form it once had.

Racism has disfigured America’s history from the 17th century and Europe’s from the 1950s. It has provoked slave uprisings, riots, protest marches and other forms of civil disobedience. Torture, mutilation and death have been its grimmest byproducts. To cluster these sins under the same rubric as microaggressions against the Welsh lessens their significance in the eyes of many.

Racism in the Fernández case

I am certainly not condoning the behavior of Fernández and his teammates. It was not just careless, but wrongheaded, pernicious, arguably defamatory and possibly malicious. France’s black players were subject to abuse on social media following their World Cup defeat to Argentina in 2022, so these kinds of irresponsible deeds can have consequences. But was it racist?

Fifty years ago, no. Thirty years ago, still no. In fact, in 1998, France won the FIFA World Cup with a multicultural team that included Zinedine Zidane, Patrick Vieira, Lilian Thuram and Marcel Desailly, among others. Had Fernández’s video been released then, it likely would have been ridiculed and dismissed as a case of “sour grapes.” But today we err on the side of assuming malignancy.

The impact of racism has been diluted by our eagerness to recognize it in any situation in which hatred of particular groups is involved. This is not a bad thing and in a great many instances, there has been a racist component buried among other sordid motivations. Yet the danger lies in spurious attributions. Some offenses, even hate crimes, are not impelled by spurious beliefs about race and should be treated as conceptually distinct.

None of this excuses Fernández et al. But perhaps we should laugh at their idiocy and childlike attempts to make fun rather than dignify them — which is what we do when we endow them with serious motives.

[Ellis Cashmore is the editor of]

[ edited this piece.]

The views expressed in this article are the author’s own and do not necessarily reflect 51Թ’s editorial policy.

The post The Tale of the Boy Who Cried “Racism!” appeared first on 51Թ.

]]>
/culture/the-tale-of-the-boy-who-cried-racism/feed/ 0
MURDER MOST FOUL /multimedia/murder-most-foul/ /multimedia/murder-most-foul/#respond Thu, 25 Jul 2024 10:16:54 +0000 /?p=151398 The post MURDER MOST FOUL appeared first on 51Թ.

]]>

The post MURDER MOST FOUL appeared first on 51Թ.

]]>
/multimedia/murder-most-foul/feed/ 0
Sociology of the Olympics /multimedia/sociology-of-the-olympics/ /multimedia/sociology-of-the-olympics/#respond Thu, 18 Jul 2024 11:44:15 +0000 /?p=151306 The post Sociology of the Olympics appeared first on 51Թ.

]]>

The post Sociology of the Olympics appeared first on 51Թ.

]]>
/multimedia/sociology-of-the-olympics/feed/ 0
Do Celebrity Endorsements Help or Hurt Politicians? /world-news/us-news/do-celebrity-endorsements-help-or-hurt-politicians/ /world-news/us-news/do-celebrity-endorsements-help-or-hurt-politicians/#respond Sat, 13 Jul 2024 10:34:49 +0000 /?p=151088 “I am not here to tell you how to think,” Oprah Winfrey told a 10,000-strong crowd at the Iowa Events Center in downtown Des Moines. “I am here to tell you to think.” It was December 2007, eight months before Barack Obama was selected as the Democratic presidential candidate and 11 months before he won… Continue reading Do Celebrity Endorsements Help or Hurt Politicians?

The post Do Celebrity Endorsements Help or Hurt Politicians? appeared first on 51Թ.

]]>
“I am not here to tell you how to think,” Oprah Winfrey told a 10,000-strong crowd at the Iowa Events Center in downtown Des Moines. “I am here to tell you to think.” It was December 2007, eight months before Barack Obama was selected as the Democratic presidential candidate and 11 months before he won the US presidency.

It was the most potent celebrity endorsement of a political candidate in history. Distancing herself from partisan politics, Oprah said she was acting out of a sense of obligation: “I feel compelled to stand up and speak out for the man who I believe has a new vision for America.”

She closed with gravity, drawing on Ernest J. Gaines’s 1971 novel The Autobiography of Miss Jane Pittman, which tells the life story of a woman born into slavery at the end of the American Civil War. The book recounts how each time a new baby was born, its mother would take it to Jane Pittman, who would hold the baby John-the-Baptist-like and wonder aloud whether the child would be the deliverer of black people: “Is you the One?” Oprah refined the grammar, changed the context and answered affirmatively that Obama was indeed The One.

Rarely, if ever, has a single affirmation been so consequential: It was less an endorsement, more a proclamation. But is a thumbs-up from rapper-turned-country music star Kid Rock going to make much difference to Donald Trump’s chances in this year’s presidential election? For that matter, is anyone’s endorsement going to make an impact? I can think of one, but more on that later.

More than entertainers

Politicians have attracted endorsements from popular entertainers since the 1950s. Republican President Dwight Eisenhower, in 1952 and 1956, recruited the likes of Bing Crosby and Ethel Merman at a time when the popular assumption was that Hollywood stars were communist sympathizers.

Frank Sinatra re-recorded “High Hopes” complete with the line “Vote for Kennedy” as part of JFK’s successful presidential campaign in 1960. Around the same time, Britain’s Labour Party leader Harold Wilson received conspicuous support from the Beatles. Twenty years later, Sinatra donated $4 million to Republican Ronald Reagan’s successful presidential campaign.

Jane Fonda threw her weight behind Democrat George McGovern in the 1972 presidential campaign. Fonda’s endorsement aligned with her opposition to the Vietnam War.

Celebrities, including athletes, have been conspicuous in every postwar US presidential campaign, though basketball star and shoe endorser Michael Jordan famously remained absent from a Senate race in 1990 explaining — when invited to endorse Harvey Gantt, an African-American Democratic candidate in North Carolina — “Republicans buy sneakers too.”

Bill Clinton garnered support from celebrities, including Barbra Streisand and Whoopi Goldberg, during his presidential campaigns in 1992 and 1996. The value of Michael Jackson’s endorsement was arguable: While Jackson was an immensely popular and influential figure with a vast global fanbase, he faced allegations of child sexual abuse. (He was eventually cleared.)

Since Clinton, celebrity endorsements have become a required part of presidential campaigns. The 1990s witnessed an expansion of the roles of showbusiness entertainers: Perhaps they felt the need to demonstrate they were more than entertainers and held solid beliefs, values and commitments. Politicians enthusiastically gave them a platform, and what evidence there is suggests they benefited.

Risky business

In 2016, Trump counted Mike Tyson and Kanye West among his celebrity endorsers. While Tyson was a convicted rapist, having African Americans among his cohort presumably lent Trump credibility among blacks. Black voters make up about 11–12 percent of the US electorate and Trump lobbied for their votes, though he managed only a small share of black votes in both 2016 and 2020.

While West, or Ye, as he prefers, had previously favored Democrat candidates, his approval couldn’t have done Trump any harm. Today, Ye is kryptonite (the fictional green mineral that weakens Superman). His flip-flopping was one thing, but his antisemitic outbursts in 2022 persuaded sportswear manufacturer Adidas that it should cancel his best-selling “Yeezy” line, valued at $250 million per year.

Adidas’s experience with Ye may have chastened political candidates. Popular, black and seemingly multitalented — he designed his own clothes range — Ye imploded with an unexpected stream of invectives. He did have a history — having described slavery as a “choice” in 2018 — so Adidas must have known he was a risk. As are many other celebrities, of course. Many rose to prominence after scandals and know how to ride them like surfers conquering waves, transforming controversy into a vehicle for even greater fame.

Consumer culture

Endorsements have been integral to consumer culture, which began properly in the economic prosperity following the end of World War II in 1945. Hollywood stars appeared in advertising campaigns, and their effect on sales was encouraging enough to persuade ad agencies to pay for their services.

Today, they pay mightily: in 2015, LeBron James signed a multi-year deal with Nike valued at $1 billion. James used his platform, including social media and public appearances, to express his support for Joe Biden and his running mate, Kamala Harris, in 2020. Political candidates don’t pay endorsers, of course.

The value of celebrities to advertisers is reflected in sales: Some individuals, including Oprah, Jordan, George Clooney and Jennifer Aniston, can pitch for almost anything and make it sell. On the other hand, Rihanna didn’t work for Nivea, which dropped her in 2012. Often, the relationship is symbiotic, with celebrities enhancing their reputations by associating themselves with popular brands.

However, selling things, inanimate material objects, is one thing; selling living sentient beings is another. Politics, like every other aspect of society, has been penetrated by celebrity: Votes are cast as much for people as what those people stand for. Ideals, values, policies and commitments will always feature in the mix when voters decide. As will relatability: Politicians strive to make voters think they share their concerns, identify with their problems and understand their feelings. When they can’t do it, they hope their endorsers can.

Convictions or self-aggrandizement?

Oprah was so influential she shooed off disbelief. Her blessing was strong enough to convince, even empower, voters. But she was extraordinary. Other celebs elicit a more cynical reception: Voters suspect them more than respect them.

I have only inference and extrapolation to back up my claim. A recent study, in which I was involved, centered on sports fans’ reactions to athletes, clubs, sponsors or entire sports leagues that push boundaries and make pronouncements on causes, such as war, racism and LGBTQ+ issues. A swath of fans detected that the sermonizing was largely self-aggrandizement, as if saying, “We want you to take us seriously and accept that we truly believe in this cause [whatever it is].” If their gestures and pronouncements do little else, they prove athletes know how to read the room: They are aware of voguish attitudes and values and adapt themselves to suit them.

It may be fallacious to use the same logic for voters. Or it may be instructive. If the latter, celebrities see elections as pretexts for posturing and, ever-eager to provide an illusion of depth to further their ambitions, they offer their support. In this sense, presidential elections offer painless opportunities to burnish a celeb’s profundity. At least, if we follow the logic. Joe Biden’s alarming performance in front of 51 million American viewers recently may give prospective endorsers pause for thought. How much burnish is there in associating with a faltering politician?

What about Taylor?

The endorsements ringing for Biden sound like cracked bells: Barbra Streisand, Julia Roberts, George Clooney (since retracted, however) and others, including Robert De Niro, have all made their allegiances known before. Apart from the aforementioned Kid Rock, Donald Trump has only a handful of celebs, most of pensionable age, in his corner.

The unique figure in modern cultural history is, of course, Taylor Swift. She bridges many gaps: between pop and art, poignancy and jubilation, intensity and matter-of-factness. Is the gap between entertainment and politics one she aims to traverse? With hundreds of millions of followers on Instagram, she’s not hard to imagine running for the presidency herself. There’s even speculation about her political ambitions. In the meantime, no human being has more influence. Her endorsement would match Oprah’s.

Some celebs have genuine convictions and nail their colors to the mast without considering whether publicizing their political preferences will affect their careers. Others are primarily concerned with boosting their reputations. I sense that voters think they are all in the latter camp. So, why are politicians so keen on having them in their corner?

Oprah and Taylor are sui generis: They are both unique, albeit in their different ways and capacities to galvanize voters. No one else presently comes close and, while this year’s presidential candidates clearly welcome support from any quarter, the support of celebs is probably worthless and, if the message of our skeptical sports fans is any gauge, counterproductive.

[Ellis Cashmore’s latest book is, 3rd edition.]

The views expressed in this article are the author’s own and do not necessarily reflect 51Թ’s editorial policy.

The post Do Celebrity Endorsements Help or Hurt Politicians? appeared first on 51Թ.

]]>
/world-news/us-news/do-celebrity-endorsements-help-or-hurt-politicians/feed/ 0
A Sociologist’s Perspective on the Olympics and EURO2024 as Protest Platforms /world-news/a-sociologists-perspective-on-the-olympics-and-euro2024-as-protest-platforms/ /world-news/a-sociologists-perspective-on-the-olympics-and-euro2024-as-protest-platforms/#respond Fri, 07 Jun 2024 11:02:30 +0000 /?p=150487 For as long as anyone can remember, the only certainty about sports and politics is that they should not mix — yet they do. The subject provokes piousness from traditionalists who argue for sports’ purity of spirit and all the neutrality this implies. But it also excites the rebel imagination. What better showcase for a… Continue reading A Sociologist’s Perspective on the Olympics and EURO2024 as Protest Platforms

The post A Sociologist’s Perspective on the Olympics and EURO2024 as Protest Platforms appeared first on 51Թ.

]]>
For as long as anyone can remember, the only certainty about sports and politics is that they should not mix — yet they do. The subject provokes piousness from traditionalists who argue for sports’ purity of spirit and all the neutrality this implies. But it also excites the rebel imagination. What better showcase for a cause than a major sports event?

On June 14, Germany will host one of the two biggest sports tournaments of 2024. EURO2024, as it’s called, is association football’s second-biggest men’s event after the quadrennial FIFA World Cup. In July, the Paris Olympics will follow. In the absence of a deus ex machina, both tournaments will take place while military conflict rages in Ukraine and Gaza. Will either or both sports events become platforms for protest against the wars?

The wars have prompted almost continuous remonstration of one kind or another, primarily in support of a ceasefire, around the world. University campuses, embassies and streets have been sites of protest. The recent Eurovision Song Contest in Malmö, Sweden, provided an attractive showcase. On the day of the competition’s grand final, 10,000–12,000 people gathered on the central Stortorget square of the Swedish host city before marching toward the contest venue, waving Palestinian flags and shouting “Eurovision united by genocide” — a play on the contest’s official slogan, “United by music.” Earlier, there had been a more modest pro-Israel demonstration. Neither side missed the golden opportunity for widespread publicity.

Eurovision draws a formidable 162 million TV viewers, who will have been aware of the railing. But this figure is eclipsed by the viewers who watch football. A cumulative 5.23 billion watched the 2022 edition of the European Football Championship, according to the Union of European Football Associations — that’s nearly 122 times the combined population of Ukraine, Israel and Palestine. Any march, blockade, sit-down or exhibition is likely to be seen worldwide.

Sports and politics have a long history together

Despite sports administrators’ refusal to acknowledge it, the affinity between sports and politics is undeniable. A political ideal inspired the modern Olympic games: Their creator, Pierre de Coubertin (1863–1937), reimagined the ancient Greek religious, literary, musical and athletic festival in stripped-down form — a good-natured competition between nations with substantial symbolic value. Having witnessed the Franco-Prussian War (1870–1871), the rise of nationalism and militarism, colonial conflicts and the events that would eventually lead to World War I, de Coubertin suspected a multi-sports event could bring nations together. So, a large part of the games’ remit was to counterbalance the gathering forces in late nineteenth-century Europe.

The 1900 Paris Olympics, integrated into the Exposition Universelle, an international event showcasing technological and cultural achievements, would have encouraged de Coubertin, an enthusiastic propagandist for world peace. He was less encouraged by the 1936 Berlin games in the year before his death. The Berlin tournament was an effective showcase for the Nazis’ administrative expertise and competence: It staged arguably the most successful sports tournament up to that point in history, featuring 49 nations. The games were also intended to promote the destructive ideology of an “Aryan race.”

Sports have also been deployed as a conduit of opposition and, at times, at least appeared to influence social and political change. Many people credit the international sporting boycott of apartheid-era South Africa (from 1964 to 1992) with helping to end segregation and bring about the rise of the African National Congress (ANC), led by Nelson Mandela, in 1994.

In 1977, Commonwealth nations meeting in Gleneagles, Scotland, agreed to exclude South Africa from international competition. The ban effectively froze South Africa out of major sports and turned it into a pariah state. Teams and individuals refrained from visiting or competing against the country, although not all observed the ban.

It is satisfying to believe sports, activities that ostensibly promote unity of action and feeling, played a part in ending a regime based on racist separation and abominable inequality. But there’s no hard evidence to corroborate this unless we rely on conjecture and inference. On the other hand, the boycott certainly did not harm the anti-apartheid movement.

Dramatic protests by athletes

Disruption and mayhem can catalyze new friendships and insights, like breaking eggs to make omelets. At the 1968 Olympics in Mexico City, two African American athletes dramatically revealed their disdain for the US by bowing their heads and raising their gloved fists defiantly while on the victory rostrum. Tommie Smith and John Carlos are now hailed as fearless pioneers who changed the world’s perception of the American Dream. However, they were condemned and expelled from the games at the time.

Cultural rehabilitation came slowly, and the “black power salute,” as it became known, is now regarded as a totemic moment in the history of the modern USA. It’s tempting to exaggerate its impact, but the symbolic demonstration of resistance has become critical over the decades. Smith and Carlos captured the rebellious mood of the 1960s when much of the USA was affected by civil uprising.

Similarly, Colin Kaepernick’s protest in 2016 engaged a nation horrified by the deaths of two black men on consecutive days in July in different parts of the USA. Police officers fatally shot both Alton Sterling and Philando Castile, the former in Louisiana and the latter in Minnesota. In August, Kaepernick, then playing for the National Football League’s (NFL’s) San Francisco 49ers, refused to stand during the playing of the American national anthem. “I am not going to stand up to show pride in a flag for a country that oppresses black people and people of color,” Kaepernick told NFL Media in 2016. He dramatized his stand further when he dropped to one knee during the anthem. It synced perfectly with the Black Lives Matter (BLM) movement that had emerged three years earlier and set off a chain reaction.

Sports brings many benefits — is world peace one of them?

Over the following years, European football embraced the knee gesture and encouraged observance before games. Other sports were not so keen. Thomas Bach, president of the International Olympic Committee, warned athletes against political protests, calling on them to avoid “divisive” statements that could overshadow the world’s biggest sporting event. “The podium and the medal ceremonies are not made … for a political or other demonstration,” said Bach prior to the Covid-delayed Tokyo games in 2021.

One US shot-putter, who is queer, fashioned her own protest as she collected her silver medal, crossing her arms to represent, in her words, “the intersection of where all people who are oppressed meet.” Eleven other Formula One drivers joined Lewis Hamilton as he took a knee before the start of the Styrian Grand Prix in Austria.

Just Stop Oil, a British environmentalist group that opposes the use of fossil fuels, spectacularly ambushed the World Snooker Championships in Sheffield, England, in 2023, leaping on the baize-covered tables and releasing a cloud of orange powder that disrupted the competition and provided impressively colorful images for the media. The same group staged a less publicized demonstration at Wimbledon in the same year. Earlier this year, hundreds marched to the World Athletics Indoor Championships venue in Glasgow, Scotland, to protest the Gaza conflict. Palestine players wore keffiyehs (Bedouin Arab headdresses) when they entered the field against Australia in November 2023.

The toxin of the wars in Ukraine and Gaza has by now envenomed the political atmosphere in much of the world, and opposition to the wars manifests in rallies and marches somewhere practically every day. In this cultural climate, it would be unusual if EURO2024 and the Olympics’ Stade de France did not become protest sites. No one would be caught by surprise. Almost everyone can foresee at least one disruption to the competition. Most fans won’t encourage it, but these are exceptional circumstances in which to pursue what are, after all, trivialities. What’s a trophy or a medal in the context of widespread bloodshed?

Sports have no real reason to exist at all. They won’t save the planet, cure chronic disease, end social inequality or deliver peace on earth. Only fantasists believe campaigners for world peace can bring an end to the two military conflicts. Even concerted demonstrations from fans, players, teams and even organizers are unlikely to make impressions on the perpetrators of war. Like most political protests, their impact would be, at best, part of a cumulative dissent. And, at worst, futile. But is futility such a bad thing? Isn’t any form of protest better than no protest at all?

[Ellis Cashmore is a co-researcher of the study “Will EURO2024 struggle to keep war protests out of football?” published in Soccer & Society.]

[ edited this piece.]

The views expressed in this article are the author’s own and do not necessarily reflect 51Թ’s editorial policy.

The post A Sociologist’s Perspective on the Olympics and EURO2024 as Protest Platforms appeared first on 51Թ.

]]>
/world-news/a-sociologists-perspective-on-the-olympics-and-euro2024-as-protest-platforms/feed/ 0
FO° Talks: Celebrity Culture: More than a Figment of our Imagination /video/fo-talks-celebrity-culture-more-than-a-figment-of-our-imagination/ /video/fo-talks-celebrity-culture-more-than-a-figment-of-our-imagination/#respond Sun, 24 Mar 2024 08:43:52 +0000 /?p=149156 Ellis Cashmore, a professor of sociology, currently at Aston University is an expert when it comes to why we (the public) are so fascinated with celebrities. He has penned volumes with titles such as The Destruction and Creation of Michael Jackson, Elizabeth Taylor and Celebrity Culture. Cashmore defines celebrity culture as “our tendency to have… Continue reading FO° Talks: Celebrity Culture: More than a Figment of our Imagination

The post FO° Talks: Celebrity Culture: More than a Figment of our Imagination appeared first on 51Թ.

]]>
Ellis Cashmore, a professor of sociology currently at Aston University, is an expert when it comes to why we (the public) are so fascinated with celebrities. He has penned volumes with titles such as The Destruction and Creation of Michael Jackson, Elizabeth Taylor and Celebrity Culture.

Cashmore defines celebrity culture as “our tendency to have our values, practices and habits affected by figures who have risen to prominence (some would say undue prominence because it’s not proportionate to their accomplishments).” Notice that celebrities are referred to as figures and not people. Cashmore makes the distinction because celebrities are more products of our imagination than they are the flesh-and-blood person behind the fame. By existing in our imagination, they are independent of time and space and can be anything we want them to be. 

But what separates a celebrity from an ordinary person?

When asked if an ordinary person could become a celebrity, Cashmore replied, “Not without the help of a legion of followers.” Despite many, many people trying to grow a following online, the vast majority fail at becoming a true celebrity. Every so often someone does manage to pull it off by doing something crazy, but this fame is often fleeting. Not many have the staying power within our imaginations.

The deciding factor of celebrity status is, ironically, us, the public. Anything can make someone a celebrity provided we find it interesting; it is our perception of them that makes them interesting. What makes a famous person a celebrity is the public: We turn them into celebrities.

A normal person can become a celebrity in a short amount of time provided they get national attention. They simply have to occupy people’s minds (for example, winning the lottery or a reality TV show).

Before the year 2000, movies and TV were the established methods of gaining fame, but now we all carry phones with us. We essentially carry celebrities with us and can summon them for our entertainment at a moment’s notice.

Some people, like Paris Hilton and Kim Kardashian, caught onto this trend early. They realized that social media were not just a presence but a force. It doesn’t matter what you say or do as long as the media notice you. Paris Hilton was more than “famous for being famous”; she was famous for appearing. The media trailed her because we were interested in her.

Kim Kardashian saw this and realized she could make it better by using social media and she exploded in popularity. Kim started as an assistant for Paris, but eventually eclipsed her.

Nearly any kind of notoriety can transform into stardom. Oscar Pistorius, who was already famous to a degree as a Paralympian, became infamous after shooting his girlfriend, Reeva Steenkamp. This rocketed him from being known within the sports world to international stardom. Pistorius’s audience mushroomed because people who weren’t interested in running were intrigued by the murder trial. It’s a combination of our fascination with killing as well as how much we enjoy seeing the rise and fall of our celebrities.

Another key ingredient of being a celebrity is appearing to be relatable — this is key to the transformation into celebrity culture. It’s like “the larger-than-life characters have come down to earth,” Cashmore explains. Our affection for celebrities is rooted in our love of how ordinary they are.

How does celebrity culture affect us?

Celebrity culture is “inescapable” and “a defining aspect of culture today,” whether we like it or not. Cashmore clarifies, “The main way is that it affects the way we spend our money. We can’t untangle celebrity culture from consumer culture.”

Celebrity culture encourages us to buy things that we don’t need but that we want. It exists not just to sell us specific products but to advertise a way of life in which we are rewarded for owning the commodities we see celebrities have. While this encouragement may not come in the form of overt endorsements, we do our best to mimic celebrities.

While Cashmore asserts that celebrity influence is often overestimated, there is a chance that this could change in the future. 

We live in a time when people who are famous for neither their leadership nor anything related to politics can still earn a political reputation (think Arnold Schwarzenegger or Donald Trump). It seems that simply being known is half the battle in politics. As long as you can provoke strong emotion, you’re in business. The worst thing that can happen to a celebrity is that people stop caring.

Cashmore noted, “I wouldn’t put it past Kim Kardashian or Taylor Swift to someday make the transition to the political playing field.”

How long can we expect celebrity culture to last? Cashmore reminds us that “a change is hardly visible on the horizon, let alone an end. Celebrity culture is here to stay, it seems.”

[wrote the first draft of this piece.]

The views expressed in this article/video are the author’s own and do not necessarily reflect 51Թ’s editorial policy.

The post FO° Talks: Celebrity Culture: More than a Figment of our Imagination appeared first on 51Թ.

]]>
/video/fo-talks-celebrity-culture-more-than-a-figment-of-our-imagination/feed/ 0
What Does “Mental Illness” Mean in a Murder Case? /world-news/what-does-mental-illness-mean-in-a-murder-case/ /world-news/what-does-mental-illness-mean-in-a-murder-case/#respond Fri, 02 Feb 2024 12:17:47 +0000 /?p=147959 Diminished responsibility. The recent Valdo Calocane case has driven the term into our consciousness. Calocane, a 32-year-old male, fatally stabbed Barnaby Webber and Grace O’Malley-Kumar, both aged 19, and Ian Coates, a school caretaker, in Nottingham, England, on June 13, 2023. He also drove a stolen van at three pedestrians. Calocane was charged with murder… Continue reading What Does “Mental Illness” Mean in a Murder Case?

The post What Does “Mental Illness” Mean in a Murder Case? appeared first on 51Թ.

]]>
Diminished responsibility. The recent Valdo Calocane case has driven the term into our consciousness.

Calocane, a 32-year-old male, fatally stabbed Barnaby Webber and Grace O’Malley-Kumar, both aged 19, and Ian Coates, a school caretaker, in Nottingham, England, on June 13, 2023. He also drove a stolen van at three pedestrians. Calocane was charged with murder and three counts of attempted murder. He was a dual Guinea-Bissau/Portuguese national with settled status in the UK and an engineering graduate from the University of Nottingham.

Over the next several months, it emerged that Calocane had been known to mental health services since 2020 and had been prescribed treatment. Police also pursued but did not arrest him for allegedly attacking two people weeks before the stabbings.

While in custody awaiting trial, Calocane was transferred to a “secure hospital setting” and assessed by forensic psychiatrists. Forensic psychiatrists have expertise in both psychiatry and the legal system. Their work involves conducting evaluations to assess issues like competency to stand trial, criminal responsibility and other mental health-related aspects of legal cases.

The forensic psychiatrists’ conclusions were presented to the judge. He declared himself “satisfied” that Calocane was suffering from paranoid schizophrenia and converted the charge to manslaughter on the grounds of diminished responsibility.

“Malign forces”

Paranoid schizophrenia is a subtype of schizophrenia, a severe and chronic mental disorder. The Diagnostic and Statistical Manual of Mental Disorders, Fourth Edition, better known as the DSM-IV, characterized it by “preoccupation with one or more delusions or frequent auditory hallucinations.” (The Fifth Edition, however, no longer recognizes “paranoid schizophrenia” or other subtypes.)

Calocane experienced paranoid delusions in which he believed he was being targeted by “malign forces” and agencies such as MI5 (Britain’s domestic counterintelligence agency) which were controlling his thoughts and behavior. The symptoms apparently began in 2019.

Auditory hallucinations (i.e., hearing voices) reinforced his beliefs. Calocane’s thinking seemed muddled, and it’s possible that his inability to distinguish between reality and delusions impacted his judgment. He may not have lacked the ability to distinguish between right and wrong, but his condition could have affected his ability to assess situations accurately. If this sounds unclear, that’s because it is. This is why the legal system accepted diminished responsibility and committed Calocane to a medical facility where he would presumably receive psychiatric treatment. But it was a controversial decision.

Had the forensic psychiatrists not persuaded the judge, he would almost certainly have imposed a lengthy prison sentence, probably life. Instead, the judge accepted Calocane’s guilty plea of manslaughter and handed him a restricted hospital order. In the UK, if judges determine that an offender poses a danger to the public, they can invoke section 41 of the Mental Health Act and commit the offender to an indefinite period in a special hospital where physical security arrangements are the equivalent of a prison’s. (There are nearly 8,000 people currently living under such conditions in the UK. Historically, the Moors murderer Ian Brady, Peter Sutcliffe aka the Yorkshire Ripper, and infamous gangster Ronnie Kray, all spent periods of their sentences in high-security hospitals.)

The parents of Webber and O’Malley-Kumar were understandably enraged by what they regard as leniency shown to their children’s killer. As far as they were concerned, it was murder, and Calocane should have been charged accordingly. Webber’s mother Emma declared that “true justice has not been served” and accused the police chief of having blood on his hands.

“This man [Calocane] made a mockery of the system and he has got away with murder,” added Coates’s son.

The medical model of mental illness

The case forces us to think about mental health, but not in the almost-comfortable way we ordinarily turn it over in our minds. Rock stars, athletes and other celebrities habitually solicit public sympathy by professing their so-called mental health issues, typically undiagnosed and self-treated.

According to the World Health Organization (WHO), 301 million people worldwide suffer from anxiety disorders and over 280 million have depression. The WHO estimates that 1 in 8 people worldwide suffers from some mental disorder. Most of these people are functional in the sense that they hold down jobs and get through their days, perhaps with the help of selective serotonin reuptake inhibitors (SSRIs) like Prozac or tricyclic antidepressants like Elavil.

Graver forms of mental illness such as schizophrenia have typically been treated with antipsychotic drugs since the early 1950s. The first known antipsychotic medication was chlorpromazine. Care providers also employ non-pharmacological methods such as cognitive behavioral therapy and group therapy, but medication adherence is crucial for managing symptoms. It seems Calocane had refrained from taking medication, presumably antipsychotic drugs.

In the 1970s, psychiatry underwent a revolution. Mental illness came to be considered distinct and separate from physical health. Practitioners rejected the “medical model” which assumed that mental illness always had a physical basis. They considered it a crude, reductive simplification that was easy on the intellect, but of limited value in understanding. Unlike physical ailments that may have visible symptoms, mental health issues were more elusive, making diagnosis and treatment challenging. The subjective nature of the mental added another layer of complexity.

I learned from lodestars of the movement like Thomas Szasz (1920–2012), R.D. Laing (1927–1989) and Thomas Scheff (b.1929) during my own undergraduate studies. Each of them offered ways of analyzing mental illness as the result of experiences in social contexts, whether the family or large-scale institutions. All emphasized the importance of response, reaction and cultural labeling in affecting our understanding and treatment of people considered, rightly or wrongly, to be mentally ill. These and other sociologically inclined scholars were critical of the medical model’s indifference to social influences.

Yet treating mental illness as a disease, even metaphorically, has proven both intellectually appealing and practically favorable. The catalytic effect of Prozac (fluoxetine) in boosting the popularity of the medical model shouldn’t be underestimated. The US Food and Drug Administration (FDA) first approved it for prescription in 1987. Since then, Prozac has been widely prescribed for the treatment of depression, obsessive-compulsive disorder, panic disorder and several other mood disorders. This popularity led to the proliferation of other SSRIs.

Despite the simplification involved, approaching mental illness as analogous to physical sickness has yielded colossal benefits. As well as removing much, if not all, of the stigma traditionally associated with mental illness, it has facilitated more open discourse and, by implication, enhanced inclusiveness. No one today feels embarrassed by declaring themselves to be experiencing mental health issues — “issues” now having replaced “problems,” “difficulties” or “troubles.”

Also, by considering mental health in the broader framework of illness, society is compelled to recognize the interconnectedness of mental and physical well-being. This approach promotes a more holistic understanding of health. But the medical model, though serviceable, should be approached with caution. Mental illness is analogous to, not the same as, physical illness. We need to respect the unique features of mental health conditions, especially when it comes to a case like Calocane’s.

Liquid society

Unlike medical practitioners, sociologists like me tend not to see the world in terms of types or categories; rather, we see it as sequences of moments in perpetual flux. Sociologist Zygmunt Bauman’s concept of “liquid modernity” emphasized the fluid and uncertain nature of relationships, institutions and other social phenomena.

While Bauman didn’t study mental illness, his approach would probably emphasize constantly changing patterns rather than identifiable, diagnosable types. Recently, mental health professionals have come to visualize mental health as lying on a spectrum. Yet even this seems overly reductionist. One may be at one end of a spectrum or the other, but the spectrum itself does not change. Instead, imagine mental illness as a kaleidoscope — a mix of changing elements, confused, chaotic and unpredictable.

The mental illness may be permanent, temporary or sporadic. Its causes or antecedents may lie in physical injury or decay, or they may be congenital (present from birth). A neurobiological model of mental illness would posit that mental disorders are primarily caused by physiological factors, such as neurotransmitter imbalances in the brain. According to this perspective, conditions like depression, anxiety, schizophrenia and others would result from disturbances in the functioning of neurotransmitters.

On the other hand, the causes may lie outside the individual in social experiences such as poverty, geographical dislocation or cultural disengagement. Interpersonal relations convulsed by, for example, unemployment, bereavement, the departure of a loved one or any radical change in circumstances can give rise to trauma.

Given the scope, scale and complexity of the phenomena we group together as “mental illness,” the treatment options we have available seem limited. In the early to mid-20th century, the practice of treating mental illness with physical surgery, known as psychosurgery, gained prominence. This was the period of lobotomy, which involved severing or damaging the connections between the prefrontal cortices (which govern higher-order cognitive functions) and the rest of the brain.

While lobotomy is no longer practiced, some surgical interventions, like deep brain stimulation (DBS), are still used for conditions like treatment-resistant depression or OCD. (DBS involves implanting electrodes into certain brain regions and is considered a last resort when other treatment options have failed.) But the trend has been away from invasive methods and toward medication and psychotherapy.

So, how much confidence should we have in British Prime Minister Rishi Sunak when he assured Calocane’s victims’ families that “we will get the answers”? Do we even know the questions? The Calocane case challenges us to square circles when we don’t know whether a circle means a figure consisting of points equidistant from a center, or a curved upper tier of seats in a theater. In other words, when you and I talk about “mental illness,” we don’t even know whether we are thinking about the same thing. How could we begin to talk about the diminished responsibility “mental illness” supposedly implies?

If Calocane resisted taking prescribed medication, was he exercising freely willed choice? Or was he behaving in accordance with delusions or auditory hallucinations? Can he thus be held responsible? At the moment, the answer is “no,” or at least, “not completely.” This may change when Attorney General Victoria Prentis completes her review of the case.

Still, she will discover that the case poses problems incapable of an indisputable solution. No matter how many sides to the argument Prentis considers, her conclusion will be controversial. Mental health resists pat answers; it just offers tougher questions.

[Ellis Cashmore’s most recent book is .]

[ edited this piece.]

The views expressed in this article are the author’s own and do not necessarily reflect 51Թ’s editorial policy.

The post What Does “Mental Illness” Mean in a Murder Case? appeared first on 51Թ.

]]>
/world-news/what-does-mental-illness-mean-in-a-murder-case/feed/ 0
What Makes a Child Murder Another Child? /world-news/what-makes-a-child-murder-another-child/ /world-news/what-makes-a-child-murder-another-child/#respond Thu, 18 Jan 2024 13:23:00 +0000 /?p=147583 Like a monster serially rebirthed, child murder appears after periods of decline. It unfailingly strikes terror and panic into entire societies. The murder of children by other children is a crime that transcends time, space, logic and other finitudes that criminologists are used to. There is a dreadful counterintuitive senselessness about children who kill other… Continue reading What Makes a Child Murder Another Child?

The post What Makes a Child Murder Another Child? appeared first on 51Թ.

]]>
Like a monster serially rebirthed, child murder reappears after periods of decline. It unfailingly strikes terror and panic into entire societies. The murder of children by other children is a crime that transcends time, space, logic and other finitudes that criminologists are used to. There is a dreadful counterintuitive senselessness about children who kill other children: Motives are either absent or barely intelligible. The gain or reward the murderers take from the deed is just not available to the senses. At least, the senses of most sentient beings.

On February 11, 2023, two teenagers murdered 16-year-old Brianna Ghey by stabbing her in Culcheth, England. The killers have been called wicked, devilish and, most regularly, evil. None of these descriptions is in the least bit convincing. People just keep gasping, “Why did they do it?” The nowadays fashionable and utterly simplistic label “hate crime” is easy on the intellect but of no use to understanding.

Two youths’ macabre experiment

Brianna was born male in Birchwood, in the north of England. She had not undergone any gender reassignment surgery but identified, dressed and referred to herself as a woman at the time of her murder.

Her killers were both 15 at the time of the murder. Labeling their motive as transphobia is convenient, but misleading. One of the killers, a male known in court as Y, had a grim curiosity that led him to use dehumanizing language suggesting a dislike or prejudice against transgender people. “It’s a boy,” Y wrote in response to a message from his female accomplice, X, in which she referred to Brianna as “she.”

“Really all I want to see is what size dick it had,” wrote Y, revealing a grisly fascination. Britain’s Cheshire police announced that they had “no information or intelligence to suggest it was a hate crime,” though the view was not shared by many, especially not by members of the LGBTQ+ community who have been holding vigils.

In a perverse way, the manner in which the young killers planned the murder betrayed maturity. Their text messages seemed to indicate that there were four other potential victims as subjects in what might have been a macabre experiment. X messaged, “Let’s just stab her. It’s more fun.” The male Y answered,  “I want to see if it will scream like a man or a girl.” (Note the persistent use of “it”.)

The two killers went to the trouble of befriending Brianna, feigning a state of mutual trust. They arranged to meet her in a lonely country park in Culcheth. Then they stabbed her 28 times in the head, neck and body. They put a certain level of thought into their method before doing the deed. But the question remains: Why did they do it?

The Bell and Bulger cases

At a certain age, children are intrigued by things beyond their comprehension. It sickens us, but there are children who want to experience the sensation of killing. Most dismember daddy longlegs or trample on small animals for no better reason than that they can. But others have more exacting curiosities.

The UK’s paradigm case came in 1968, when 11-year-old Mary Bell strangled two boys, aged three and four, “solely for the pleasure and excitement of killing,” according to the judge. Like X, she had an accomplice. Mary’s accomplice was eventually acquitted, leaving her as the only culprit. (She was named publicly, as X and Y will be in due course.) Mary was sent to a special security unit and released on license in 1980, when she was given a new identity. It is believed she now has a daughter. There is no way of knowing how accurate the judge’s assessment was, but maybe she really was seeking a depraved thrill.

The case was echoed in 1993 when two 11-year-old boys killed James Bulger, aged two — read that again: two. The killers dropped the child on his head and repeatedly punched and kicked him, at one point forcing batteries into his mouth, before hitting him on the head with a 22-pound iron bar. There were so many injuries that the coroner could not determine the precise cause of death. The killers were named as Jon Venables and Robert Thompson, and they were sentenced to a juvenile offenders’ facility. They were released in 2001, given new identities and granted legal anonymity for life.

Violence and the mentality of children

Morality is not innate: We learn to distinguish between right and wrong, probably from the age of four or, possibly, even earlier when we begin to internalize values and accept them as part of a natural order. From about six, children develop a more nuanced sense of morality and, later, learn to apply moral reasoning — that is, to discern rightness or wrongness in specific social settings. Empathy arrives any time after about eight. In all three cases discussed here, the ability to understand and share the feelings of another appears to be lacking. Or is it? It is difficult to believe any of the killers were not aware that they would be causing physical distress and, in the Bulger case, acute pain, to their victims.

We shouldn’t be surprised to learn how easily they subordinated this awareness to their own narrowly selfish interests. Adults do it all the time. Otherwise, there would be no crime. I am not arguing that child murder mirrors what goes on in society generally. But crimes of violence, or crimes that involve violence, probably require subjectively depriving a person or group of people of some human qualities.

We are taken aback by the ferocity of children who kill, but adults do similarly. Children, like adults, victimize their peers. A substantial share of child sex abuse incidents in England and Wales, to take another example, are perpetrated by children. This is the recurringly reborn “monster” I mentioned earlier: it’s a cruel and daunting creature, and we don’t know where it comes from or how it can be defeated.

Young killers pick on children as victims, not because they hate other children, but because they are convenient victims. Jon Venables and Robert Thompson were 11; they would hardly be in a position to kill an adult or anyone else who could fight back. Children can, however, and sometimes do kill adults too. In 2000, three teenage girls stabbed Sister Maria Laura Mainetti to death in Chiavenna, Italy. Like other victimizers, they attack whom they can.

Brianna Ghey was killed because her killers believed she would not provide resistance. Dehumanizing her was part of the method, not the motive. So, what is the motive? The same as it was for Bell and the others, perhaps.

Children live in a world in which arguments, in or outside the school playground, are usually settled by some form of violence. Some even witness arguments settled at home by the same means. They learn, too, that larger arguments are settled, or left unsettled, by violence.

This is not the place to go into the socio-psychological dynamics of violence or how global conflict affects the mentality of children. But we should at least recognize that the violence that horrifies us so much when perpetrated by children and adolescents is not so different from the behavior to which we are habituated. After all, no state can exist without a police force or its equivalent in armed might. Ultimate violence may not be used frequently, and official violence is deliberately under-emphasized, but it is the ultimate foundation of any social order.

Mercifully, child-on-child killings are rare. So rare that they appear extraordinary. Children are inspired to use violence by all manner of inescapable influences. They, like adults (and I mean all of us), experience the temptation to hurt others and maybe even kill them. But they and we exercise effective control over these impulses. Our sense of morality usually kicks in and, when it doesn’t, the threat of long-term imprisonment is always lurking.

But it is not so for some children. Their sense of right and wrong is either underdeveloped, buried beneath other sensibilities or distorted by what they’ve witnessed at home or in their social environments. They lack the inhibitions that stop others. Or their curiosity is strong enough to overpower those inhibitions.

[Ellis Cashmore’s most recent book is]

The views expressed in this article are the author’s own and do not necessarily reflect 51Թ’s editorial policy.

The post What Makes a Child Murder Another Child? appeared first on 51Թ.

]]>
/world-news/what-makes-a-child-murder-another-child/feed/ 0
Sex and Sports /multimedia/sex-and-sports/ /multimedia/sex-and-sports/#respond Sat, 25 Nov 2023 12:20:44 +0000 /?p=146381 The post Sex and Sports appeared first on 51Թ.

]]>

The post Sex and Sports appeared first on 51Թ.

]]>
/multimedia/sex-and-sports/feed/ 0