It’s Only Words (To Take Your Heart Away?)

Introduction

“The Impotence of Being Earnest”

I believe it was Søren Kierkegaard who once said, “If you label me, you negate me”.  Already, I’m sensing you’re rolling your eyes at the audacity of my quoting a 19th-century Danish philosopher without any warning.  “Oh no, here we go.  What an absolute [insert insult of choice].”  Hold on a moment, though.  Wouldn’t your chosen term of abuse be a label, meaning you’ve rather spectacularly missed our Scandinavian friend’s point?

Søren Kierkegaard.  Photo: Wikipedia

You’d also be falling into the trap I’ve laid for you.  It was a pretentious quote but not entirely in the way it seems.  I’m pretending to quote Kierkegaard but it’s actually a line from the 1992 film ‘Wayne’s World’, a far less academically significant (and therefore more socially acceptable) source.  If I’d said I was quoting “Wayne from ‘Wayne’s World’”, would that have made the label any more flattering?  Would it negate me any less if it were?

In recent years, as the public discourse in the UK, the US and elsewhere seems to have grown more adversarial and unsophisticated, I’ve found myself reminded more and more of Wayne Campbell’s unsuccessful chat-up line.  Maybe I am a little over-sensitive to the choice of words used by anyone in power whose intentions are unclear – or conversely, as seems increasingly to be the case, perhaps most people aren’t sensitive enough.

The Scientific Case

“Of Course It’s In Your Head – Why Would That Mean It’s Not Real?”

For over a century, the disciplines of Psychology (the study of the mind) and Linguistics (the study of language) have found themselves frequently intertwined.  The central argument that has always drawn these two distinct areas of study together is this: Language determines Thought.  It’s important to say that I’m in no way posing as an expert in either field.  I did a little Cognitive and Social Psychology at university and for a year, I lived next door to a linguist – and no, he wasn’t a cunning one.  I can google ‘psycholinguistics’, read about Piagetian cognitive determinism and name-drop the Sapir-Whorf hypothesis but I won’t pretend to understand any of it fully, and it won’t make much difference to my basic premise if neither you nor I do.  It’s just important that you keep in mind the widely-supported proposition that the words we use and hear used go some way – perhaps a long way – towards influencing the way we think.

I have, however, worked in Marketing for over twenty years so it’s fair to say that I’ve written enough copy to know how to use language to seek acceptance and approval from the reader – the keys to being able to persuade them.  I’ve been to seminars in which far more experienced wordsmiths than I am have forensically deconstructed their craft as part science, part art – often a dark art.  We’re all consumers of products – which makes us all consumers of advertising.  It shouldn’t be hugely controversial to suggest we’re all to some extent aware when others are trying to change the way we think about something, and yet the practice must still work remarkably well for it to continue to exist.  If you don’t just love a McDonald’s meal but you’re “lovin’” the experience of going there, that word has influenced, possibly even determined, your thought.  Yes, it’s a free country and you may have been free to make the choice to visit the ‘Golden Arches’ but how free were you to arrive at that thought?  Consequently, if 300 million people never hear the name ‘Hillary’ without it being prefixed by the word ‘crooked’, what is that going to do to the unconscious opinion a large swathe of them hold of her?  It’s what psychologists (and marketers) call the ‘mere-exposure’ effect.

The Historical Case

“Time’s Arrow”

What about when the same techniques are applied even more nefariously?  Let’s not mess about here: I’m going to go all ‘Godwin’s Law’ at this point and use one of the most chilling, notorious, shameful examples of persuasive writing – just to prove that it actually happened: “Arbeit Macht Frei”.  It’s German and it translates as “Work Makes You Free”.  Have you remembered where you’ve seen it yet?  It was (and still is) written above the gates at the Auschwitz-Birkenau concentration camp, possibly the site of the very worst of humanity’s depravity.  The context is clear: those who were sent there, arriving in a state of fear, were met with a message of promise and a condition.  “You may feel trapped and persecuted right now but if you work hard here, you’ll actually be free”.  Even without knowing what happened next, it’s an incredibly shocking attempt at a strapline.  When you then consider that “Freedom” seems to be a deliberate euphemism for the reader’s impending death, it’s breathtaking in its soul-crushing brutality.  The real lesson this example teaches us is not just that it’s a fairly crude attempt at thought control but that such a crude phrase was used so brazenly, so utterly cynically.

Meanwhile, back in the 21st century, where we all feel we know better and could never possibly return to those sick, twisted days, there’s a small, nagging suggestion that we may not be as wise as we think.  Arguably, we’re still happy to support those who would use our language against us, so have we really learned from our species’ mistakes – and is our complacent belief that we have done so aiding and abetting the aspiring thought-controllers of the future?

The Literary Case

“The Right To Tell People What They Do Not Want To Hear”

At the forefront of the effort to ensure that the horrors of totalitarianism must never be revisited was, of course, George Orwell.  In ‘Nineteen Eighty-Four’, his scathing critique of the subject, he shrewdly included the vital role that the distortion of language could play as a means to facilitate and perpetuate an all-powerful state.  “Newspeak”, the name given to the dangerously re-defined, state-approved form of language, was the means by which concepts such as “doublethink” (the capacity to hold two contradictory beliefs simultaneously and accept both as true) could exist.  Logically, it seems perverse to assert a patently self-contradictory statement such as “War is Peace” but the practice of doublethink, delivered in the approved guise of newspeak, would eventually compel Orwell’s oppressed inhabitants of “Airstrip One” to agree that it must be the case.

You may think this is all a little extreme and scare-mongering but the context is vital.  The book was written in 1948, when propaganda had just been a huge part of the war effort on all sides.  It had long been understood that to control what is believed to be “the truth” is to control a war effort and, by extension, a war – the famous quote about truth being “the first casualty” of war is anecdotally attributed to the Californian Governor, later Senator, Hiram Johnson, in about 1918.  Orwell’s genius suggestion was that by maintaining a perpetual state of war, his totalitarian regime was able to keep permanent control of truth itself.

Today, it’s a rather sad irony that, rather than his masterpiece and its darkest ideas being fully understood by all, they have, for many, become trite buzzwords from TV shows in which mildly perilous situations occur – an undesired form of newspeak, you might conclude.  Viewers of ‘Big Brother’ and ‘Room 101’ may know of the Orwellian connection but, without having read the book, can have no grasp of the gulf that exists between the plight of Winston Smith and that of the guests of Frank Skinner or Davina McCall et al.

The Semantic Case

“Warning: Implicit Content”

As our friend Søren would no doubt agree, the problem with labels is that they take just one of the many ways to describe a person or a group and use it wholly to define them, replacing natural complexity with a simplistic shorthand.  Too often, we don’t really notice it happening when others use labels and we rarely notice when we do it ourselves.  Labels strip out context and nuance and allow unspoken connotation to fill in the gaps.  It takes effort to realise that there’s more to the simplistic description than is being made explicit, and it’s too easy to derive a wider, unsaid, implied meaning.

The other problem is that we all carry many applicable labels at all times: some present us positively, many don’t.  I’m a father, a husband, a dog-owner, a tax-payer, a voter and a graduate, all of which I would hope sound like good things to be.  I’m also an SUV-driver, a cyclist, a caravanner, a Libran and a football fan, descriptions which do not always meet with universal popularity and can be used, in isolation, to undermine.  Furthermore, depending on your particular perspective, my applicable geographic labels of Lancastrian, Northerner, Englishman and Briton may or may not attract positive acclaim.  Subjectivity relative to an audience hugely affects the positivity, or otherwise, of a label.  If someone wanted to create antipathy towards me from a Yorkshire audience, guess which label would be most useful in achieving that aim?  What if the audience was from London?  Or Wales?  Or Germany?  Labels make it easy to discredit and are too easily met with unquestioning acceptance.

The Pragmatic Case

“That’s No Way To Go, Does Your Mother Know?”

To a certain extent, none of the above should be that surprising.  Most parents will recognise the important distinction between the justifiable chastisement “you’ve behaved stupidly” and the altogether more dismissive “you’re stupid”.  We take care not to label children when they err because it’s unfair and it sets a poor example – yet we seem to forget all that when it comes to the behaviour of adults.  Anyone can behave idiotically.  It’s a complicated world – so we tend to simplify idiocy by attributing it at the individual rather than the event level.

Social psychologists have observed from studies that, when judging others, people tend to weight “dispositional factors above situational factors”.  Mothers have long discouraged their children from taking such a disposition-centric view by encouraging the more situationalist “they can’t help it and probably didn’t mean it”.  When we grow out of childhood, such guidance shouldn’t need to change – but as we become more hard-bitten by life experience, it comes to seem like advice more appropriate for children.

The Logical Case

“Therefore, My Dog Is A Cat”

There’s also the issue of flawed logic to consider.  Mathematicians have long known about something called the Conditional Probability Fallacy – a logical trap in which, because one thing implies another, the reverse implication is assumed to hold as well.  As a species, we seem to be innately disposed to accept certain binary truths and it’s logical for us to attempt to apply that trusted model wherever we see two states in a relationship.  “Darkness equals night” so it’s obviously equally true that “night equals darkness”.  The fallacy arises when such a relationship between the two states is merely implied, hence: “All fathers are male” – so all males are fathers?  A simple logical ‘sense check’ is often enough to debunk the flawed conclusion here – our own experience tells us it’s obvious that the inverse cannot be true.
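
For readers who like their fallacies in symbols, here’s a minimal sketch of the same trap in the conditional-probability notation mathematicians use, applied to the fathers example above (P(A | B) reads “the probability of A given B”):

```latex
% A sketch (assumes \usepackage{amsmath}): the two directions of a
% conditional are not interchangeable.

% "All fathers are male": knowing someone is a father makes it
% certain that they are male.
\[ P(\text{male} \mid \text{father}) = 1 \]

% Bayes' theorem relates the two directions of the conditional:
\[
P(\text{father} \mid \text{male})
  = \frac{P(\text{male} \mid \text{father}) \, P(\text{father})}{P(\text{male})}
  = \frac{P(\text{father})}{P(\text{male})}
\]

% Fathers are a strict subset of males, so P(father) < P(male),
% and the reversed conditional falls strictly below 1.
\[ P(\text{father} \mid \text{male}) < 1 \]
```

The two conditionals only coincide when P(father) = P(male) – that is, when the two groups are actually one and the same – which is precisely the assumption the fallacy smuggles in.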

What if there’s insufficient personal experience to undermine the proposition?  What if the intricacies of such logical traps are exploited before an audience largely unaware of their existence?  Can we be conned, en masse, merely by implication?  For example, it’s easy to imagine the suggestion raised by the logical relationship “All jihadists are Muslim” – so are we being invited by anyone who asserts this point to conclude that “all Muslims are jihadists”?  Why is their religion suddenly important in this context, anyway?  Where is the consistency with other descriptions of terrorists?  When the UK was beset by horrifying attacks by the IRA, a supposedly exclusively Catholic Irish Republican militia, they were never described as “Christian terrorists”.  Is it fair to surmise that there’s a reason for such inconsistency?  Is there a justification for it?

The Legal Case

“You Can’t Handle The Truth!”

We trust our politicians and news outlets to deal in the truth but from a legal standpoint, that’s only a third of the requirement.  Any witness in a court of law – arguably the arena where words matter most – must swear to tell “the truth, the whole truth and nothing but the truth”.  We’ve heard this seemingly quaint legalistic phrase so often that its incredibly profound meaning tends to be lost.  It’s of huge significance that there are three strands of truth in this well-worn saying and they absolutely do not mean the same thing repeated twice merely to add gravitas.  Logically speaking, there are three very distinct requirements to be met by this oath.  Firstly, there’s “the truth”: is X factually correct, yes or no?  Secondly, “the whole truth” requires the statement to be comprehensive: does it explain all facets raised by the question truthfully, or does it omit elements that are also true but inconvenient to include?  Finally, “nothing but the truth” strips away misleading detail: does the answer include other, spurious information, implied by its inclusion to be equally true and relevant?
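
One way to see that the three strands really are logically independent is to sketch them in set notation – a hypothetical formalisation of my own, not anything you’ll find in a statute:

```latex
% A hypothetical formalisation (assumes \usepackage{amsmath}):
% let S be the set of propositions the witness asserts,
% T the set of true propositions, and
% R the set of facts relevant to the question asked.

% "the truth": everything asserted is true.
\[ S \subseteq T \]

% "the whole truth": no relevant truth is omitted.
\[ (T \cap R) \subseteq S \]

% "nothing but the truth": nothing spurious or irrelevant is included.
\[ S \subseteq R \]
```

A witness can satisfy any one of these inclusions while violating the other two – telling only truths, say, while omitting the inconvenient ones and padding the answer with irrelevancies – which is exactly why the oath insists on all three.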

In law, circumstantial evidence is treated with caution and given limited weight.  In the media and, far too often, in public debate, little distinction is drawn between material and immaterial fact.  One provides insight into a story, the other adds innuendo.  Guess which of the two additions tends to be the more commercially attractive?

There’s a reason that physical representations of Justice are traditionally depicted as ‘blind’, her eyes generally obscured by a blindfold.  It’s precisely because the Law is expected to ignore such spurious details as may be supposed just by looking at a person (i.e. “nothing but the truth”).  A verdict must be based solely on the facts presented, in the expectation that they are exhaustive and untrammelled by concoction, regardless of wealth, power or any other supposedly irrelevant factor of those on either side.  No politician or news outlet is bound as strictly to these principles and, by extension, their ability to convey what might be termed ‘absolute truth’ is inevitably inferior.

The Digital Case

“A Binary Expression”

In an ever-more inter-connected world, words travel further and elicit more words of riposte from more respondents than ever before.  With such inordinate possibility and reach, has humanity used the adolescent phase of the internet principally to broaden its mind and further its understanding?  Sadly, the evidence suggests that, in the main, it hasn’t.  Indeed, we’ve tended to deal with the exploding plurality of opinion and viewpoint most commonly by retreating to the comfort and solace of people with whom we most agree, like disparate prehistoric tribes retreating to their various, demarcated caves.

In our ‘echo chambers’, our digital ghettos, we appear to be doing what social psychologists have always observed in group dynamics: emphasising intra-group similarities and highlighting inter-group differences, like opposing sets of football fans.  Here again, language is a useful stick – striking a drum to emphasise unity and beating those to whom that unity does not apply.  With all the zealotry of the Spanish Inquisition, those who are judged to be heretical to the orthodoxy of one side or the other are denounced as ‘snowflakes’, ‘libtards’, ‘fascists’, ‘leftists’, ‘Blairites’ or ‘TERFs’, to name but a few epithets.  Similarly, the mere mention of these terms of heresy is sufficient to remove any further right to be heard in explanation or mitigation, like the man branded a blasphemer in the always-relevant ‘Life of Brian’.  In short, the process of labelling doesn’t just negate individuals in these circumstances – it defenestrates their credibility altogether.

A clear example of the ease with which negative labels can proliferate in the digital age is the much-discussed ‘Centrist Dad’.  Like any other label, it is principally designed to trivialise and undermine a particular assumed set of views, but it would appear to take the principle a stage further.  To its proponents, the term generally represents a frustration at a perceived lack of the radicalism they believe is necessary, a dismay at a supposed reliance on much of the status quo.  Aside from the implied sexism and ageism of the term, it is essentially a disapproval of ‘Centrism’.  The trouble with this term is that it is only really defined by that which it is not – radical leftism or indeed rightism – rather than by anything it can be said to be.  Centrism is therefore analogous to atheism, which is defined merely as the absence of a belief system rather than an ‘active’ position in and of itself.  So-called ‘centrists’ thus find themselves defined more by a set of values they don’t hold than by any they demonstrably do.  This appears to be clear with-us-or-against-us posturing – and history holds dark warnings for that kind of simplistic tribalism.

And then there’s the media in the digital age.  As with any other consumer product, proliferation has led to a huge increase in news providers, each subsisting on ever-narrower niches of audience type.  Breakfast cereal may flourish in a market offering more choices, each tailored more closely to a more specific clientèle, but it seems questionable whether news should operate in this way.  British newspapers have always represented a fairly diverse range of readers but the reporting of facts generally took precedence over the in-house interpretation of their significance, and so the Guardian and the Telegraph, while ideologically opposite, would report essentially the same stories, albeit differently paginated and analysed according to their (and their readers’) politics.

When the world wide web was barely a twinkle in Tim Berners-Lee’s eye and I was but a sixth-form student, I tended to spend my Monday mornings trying to avoid doing my Maths A-Level homework by reading each of the day’s newspapers in the library.  Today, I believe that the appreciation it gave me of the role of a broad yet largely responsible media landscape was the best education I received at that time, consistently far more meaningful than my questionable ability to perform differentiation from first principles or indeed identify a Poisson distribution.  As a result, I find myself duly dismayed and alarmed at the willingness of a legitimised partisan press to use the language of their own tribal agenda, abandoning the media’s traditional role of observer and analyst in order to become a participant.  Most depressingly, those outlets that attempt to retain a vestige of objective detachment are now being demeaned by the dismissive label “Mainstream Media”.  Somehow, this is what we seem to believe to be progress.

Words are also being used to reinforce another growing social trend: the rise of simplicity – or, as Stephen Fry describes it, the infantilism – of debate.  Nuance seems to be beyond the grasp of many, brought up on oversimplified phone-in radio debates, and most issues find themselves being reduced to saccharine Good/Bad questions.  Is this helpful when debating the most difficult questions we face?  Complexity is – or should be – an inherent component of any far-reaching question.  For that reason, the answer is not simple and anyone who claims otherwise is likely to be doing so for expedient reasons.  BAD! – with a commensurate level of qualifiers…

The Evolving Case

“If You Tolerate This, Then Your Children Will Be Next”

There have always been ways for the unscrupulous in power to self-aggrandise or denigrate those with whom they would disagree and, as Orwell and many others have shown, the attempt to distort the meaning of words to suit an agenda is a recurrent one.  Of course, they aren’t all as pernicious as newspeak.  Some methods are older and simpler than others but they’re all employed with the same aim in mind – to influence our perception.

The old favourite among politicians is to speak with such eloquence and articulacy that most people won’t stop to wonder if they’ve been lied to.  It’s therefore no surprise that when everybody’s favourite Victorian throwback MP, Jacob Rees-Mogg, was invited to admit that his investment company’s stance on the financial uncertainties of Brexit seemed at odds with his stated political position, he merely brushed the matter off by suggesting there was “terminological inexactitude” in the assertion (a phrase first adopted by Churchill in 1906 to circumvent the prohibited practice of accusing another member of lying in Parliament).  Just as he expected, the country sniggered at the archaic delivery and chose largely to overlook the suspicion that ‘Jaunty Jacob’ had been putting his money somewhere other than where his mouth was.

I suspect you’ve been waiting for the next bit.  I hope you’ll agree I’ve tried to restrain myself from returning to the Trump well of lexical chicanery thus far, but I’d hate to disappoint you, so here it is: probably his most egregious example.  When an earlier video emerged, during the 2016 US election, of Donald Trump’s startling boasts of what he felt his fame allowed him to do and say around women, the matter was raised at the second televised Presidential debate.  Famously, his main defence was “It’s just words, folks. It’s just words”, an astounding attempt to discredit the accusations against him by simply dismissing the importance of words – the very currency with which he was attempting to ascend to the office he holds today.  Protectors of the power of language were horrified – notably JK Rowling, who later tweeted that “If they [words] don’t matter, we’re all lost” – and were then further horrified to note that a wide section of the American public was happy to accept the explanation uncritically.  Once again, what shocks most about the abuse of language is not the sophistication of its most refined examples but the brazenness with which its crudest versions are employed.

Finally, it’s important to note that the powerful do not have a monopoly on distorting language to further their own causes.  Recent years have seen those with less power adopting a similar technique – with concerning implications.  No-one should want to undermine the experience of minorities in their struggle to gain equal recognition and representation but, to a language purist, it’s equally unedifying to see certain groups explaining their experiences and situations with the phrase “my truth”.  It’s understandable that the assertion is that their perspective on issues is different and needs to be more widely understood, but can it be right that the word ‘truth’, which is supposed to be an absolute, can now be treated like any common noun and be claimed by a possessive?  This isn’t merely a point of grammar but one of meaning itself.  Surely there can only be one truth, however many interpretations of it there may be.  If we complicitly downgrade the term ‘truth’ to mean little more than ‘opinion’, aren’t we devaluing the very concept of truth itself?

Such concerns are brought into sharper relief when, inevitably, the language of the powerless is appropriated by the powerful.  During the furore that surrounded Serena Williams’ conduct at the 2018 US Open Final, opinions swirled that she was either a powerful, millionaire athlete or the victim of sexism and racism, depending upon the level of support or condemnation being proclaimed.  Headlines such as “Serena Williams is being punished for speaking her truth” legitimise the concept in the vernacular and will offer those who seek to further themselves by factual obfuscation another useful tool for achieving it.

No wonder we’re being described as living in a “Post-truth” world, but the very existence of such a phrase is, to my mind, hardly a portent of promise.  If nothing is going to mean anything anymore, shouldn’t we be more worried that the most basic principles that anchor our hard-won rights are under the same threat of being erased?