Diana, (nearly) twenty years on… grief tourism and emotional performativity

Diana, Princess of Wales (née Spencer) has been in the news this week… well, she’s in it most of the time for various reasons, but this time the 20th anniversary of her fatal car crash in Paris is fast approaching. It caused me to, if you’ll excuse the pretentious term for a moment, reflect on what that even meant for me at the time, and since… no, no, please, stay with me, please. This isn’t going to be one of those posts.

See, I remember the day fairly well. And I also very clearly remember exactly how I felt…

…I felt nothing.

I didn’t care. And I still don’t, really.

No, no, please, again, stay with me… I’m not saying this because I’m one of those people, either. You know the kind: the one who so does not care about [insert today’s top story] that they have to tell you three times a day how much they totally don’t care about it. I’m simply saying that the event had very little salience to me. I had no basis on which to build the required level of “care”.

I was young in 1997, but I wasn’t even born in 1981 when Diana rose to fame and married Prince Charles. I was barely sentient for most of her good works afterwards (and, yes, I objectively admire her work and charitable contributions) but they have absolutely no effect on me on an emotional level. Her charitable efforts were only something that, literally, happened to other people. Her name was something that appeared on the news, like “Bosnia” or “Nelson Mandela”; something that got in the way of me watching Power Rangers. She was a thing, certainly, but I had no deep connection to her as a person or as a concept.

Her death elicited nothing but a shrug from me at the time. And it still does, to be honest. Yes, I do, objectively speaking, understand her significance and hold a positive impression of her today, but that’s come to me in the context of her being an historical figure – in the same broad category as Henry VIII, Winston Churchill, or Emmeline Pankhurst. There was no true, immediate emotional connection for me then, and there definitely still isn’t. I have had, and continue to have, no salient value attached to Diana as a person or in the abstract.

Then we come to what happened that week at school.

Or, more specifically, what happened with one RE teacher who thought it would be a fabulous idea to make a lovely, colourful tribute board using the traditional materials of young schoolchildren. We were asked to write tributes and cards to Diana, expressing our sad feelings about the tragedy, and place them on a display in the school corridor. Looking back, it seems quite strange that a run-down school in a working-class mining town would memorialise a central figure of the Royal Family and the upper-class South, who rose to prominence in the decade when the Establishment was busy systematically destroying and disenfranchising the area – but hey, they must have thought it was a good distraction and good for us. Or something like that.

But I literally didn’t care. I couldn’t care, perhaps. I had the same feeling (or lack thereof) I just described, at length, above. I might not have known those words, I might not have had the confidence to express it out loud, but I definitely felt it – and even at that age I had the meta-cognitive presence of mind to know it.

I was asked to express my feelings, and I genuinely didn’t have any.

This wasn’t good enough for that particular teacher, though. I had to have feels, or I had to try to have them. Because I must have felt them, they were sure of that – so it would be good for me to express them! I said I didn’t actually care (or words to that effect) and was told that I did, and that I had to express it.

So I did it…

Well… I faked it.

And, at the end of the day, it wasn’t that hard. I faked a few feelings. I wrote something that was, apparently, quite poignant for a pre-teen. I was complimented for it being so true and expressive. And that was it. It was so easy to fake emotional engagement.

I think that experience has shaped my life far more than Diana’s death itself ever could. I now had direct experience of faking feelings, expressing grief where there was none, and ever since I have been utterly distrusting of grief. To this day I’m uncomfortable with seeing other people do it publicly over people they don’t know or had no connection to. I remember watching the funeral and not quite understanding why people were throwing flowers and crying. They didn’t know her, they had no good reason to feel like that. At least, that many people couldn’t possibly have a good reason to. If I could fake it, so my reasoning went, surely they were faking it, too.

That’s been my default assumption ever since. I’ve distrusted public displays of grief. It’s even spilled out into distrusting my own feelings. How can I tell that it’s genuine? How do I know they’re not just performing because they think they ought to? How can I tell that they really feel sad, and don’t just think they’re sad because they have a much deeper, and much more demanding, belief that they should feel sad?

I can’t. I could just trust them… but if I can fake it, then…

It’s left me cold and cynical. This idea of taking a tourist-like break into the grief of others only got worse after 1997. People flocked to places where children had been murdered, cried for the news cameras and left flowers and cuddly toys…

Okay, rapid aside. It bugs me that those cuddly toys would almost certainly have just been binned and sent to landfill after being left out in the elements for a week. It bugs me that these inanimate objects never got to fulfil their purpose of comforting or entertaining a living child. It bugs me that I have a greater emotional attachment to inanimate objects with cute faces than I do to actual humans. There, I’ve finally admitted that in writing. Moving on…

…and as the internet grew into a thing, these displays could be made even more easily. Now you just have to comment “Hashtag-RIP!” or “you’re in our thoughts”.

Sorry, another one. Okay, I get that, should you feel pushed for what to say, as I was in 1997, a ready-made, widely acknowledged phrase is a good thing to have – it’s a nice default for when you don’t know what to say… but I do think that silence is a more honest response, if silence is all you can come up with when pressed for something original. Anyway, moving on again…

To me, it comes across as a performance. You’re there, making sure you say the right things from a select set of pre-approved terms and phrases, commenting on the death of someone you never met, someone you probably didn’t care about before, and someone you almost certainly would never have heard of if they’d died in some other way. And… with it all being online these days, how can anyone know that the sentiment is anything other than a performance? How do we know they’re really keeping anyone “in their thoughts” and not just fucking off to play another round of Candy Crush, thinking about dinner and anything but what they said they were thinking of?

I could trust them, but then again it’s so easy to fake, I know that too well.

So, a few years ago, someone was murdered (I won’t say who, and I apologise if this comes across as horribly blasé). The usual messages cropped up… “RIP”, “This is so sad”, and other delightfully original pithy phrases… but… this was different for me – because one of the victim’s friends cried in my arms for the best part of an evening. I’m not sure we got that much sleep that night. I was only two degrees of separation from the victim. Only a prior commitment and a long distance stopped me from literally being at their funeral. After that, the “supportive” messages felt so hollow to me.

I have to emphasise the “to me” part of that, though. I am emphatically not saying that the family and friends shouldn’t feel comfort from those messages if that’s how it makes them feel… but it’s not for me. I felt that grief merely second-hand – and it still wasn’t enough for me to truly comprehend the loss. I was in that category of someone who didn’t even know the victim existed; I didn’t feel I had the right to feel grief-stricken or to pour it out publicly in a great competition of who can feel saddest and buy the most flowers and write the most pre-approved buzzwords in a Facebook comment.

So what about people who only read about it via BBC News, and probably couldn’t remember the victim’s name if asked today? Did they really feel something? Or did they feel the need to perform as if they did? Are these comments now only Pavlovian responses to the news?

Maybe not… maybe they’re real connections. But if I can fake it so trivially and so easily…

I felt first-hand, back in 1997, how easy it was to perform grief. I felt first-hand the pressures people are put under to show it. I’ve seen and felt the pressures we’re under to feel something, and the stigma of how cold it seems not to – and how afraid we all seem to be of just throwing our hands in the air and saying “look, it’s a bad thing, yes, but I have no emotional connection to this, I can’t feel anything about it and I respect the truly grief-stricken too much to lie about that”. I actively lied to the world about feeling something I didn’t because of those pressures. There was no benefit to it – no-one gave a shit what I wrote; even I can’t properly remember the words I put down. If I hadn’t done it, the world would be no worse off. If I’d done a better job of it, the world wouldn’t be any better. I did it in response to social pressure, and social pressure only… and I don’t think I’m special in doing that.

Do I have an actual point to this? No, I suppose I don’t. I don’t want a massive call to arms that says “NO MORE PUBLIC GRIEF!” – I’m not writing for the Guardian, here. If you want to, go on, do it. If you take comfort from the support of thousands of people you don’t know, take it – if you’re in that position you need all the support you can muster. If you take comfort from expressing it, go ahead.

Just that, if I die, and all you can come up with is “#RIP”, know this: I’ll be back to get you.

The UK Election – John Oliver Style…

Welcome, welcome, welcome… I’m absolutely not John Oliver, but this will be the best impression I can muster via text alone. We start with the United Kingdom…

[Image: a map of Westeros photoshopped into western Europe]

…a country you think about so little you didn’t realise that wasn’t the United Kingdom, that was Westeros from Game of Thrones crudely photoshopped into western Europe. This is the United Kingdom.

[Image: a map of the United Kingdom]

The UK has recently finished the count in a snap general election to decide on a new Prime Minister and new ruling government. There were some ups and downs in typically British laugh-first-ask-questions-later fashion. At one point we were treated to, and yes this is genuine, a man known only as Lord Buckethead standing against the Prime Minister in her own constituency as a “strong, but not entirely stable” protest vote.

I look at that photo and can hear the dum-dum-dum of the Imperial March from Star Wars, but I don’t know who it’s playing for.

Unfortunately for incumbent PM Theresa May, while she did manage to defeat Lord Buckethead, the rest of the evening did not go well at all: the vote returned a hung parliament. This means that no one party controls an outright majority, so none can form a government on its own. And so, after seven weeks of scaring the public that opposition leader Jeremy Corbyn would form a “coalition of chaos”, May has had to reach across the Irish Sea to the Democratic Unionist Party, the DUP, to form a coalition of her own.

Now, at this point most people in England, Scotland and Wales simply went… “who?” and were forced to learn as much about the DUP in the space of three hours as they possibly could.

Holy shit!

The DUP are, amongst other things, associated with young earth creationists and climate change denialists, and are both anti-LGBT and strictly anti-abortion, even in cases of rape and incest. So reaching out to the few elected DUP members of parliament was clearly an act of some desperation for Theresa May, whose party under David Cameron tried to bill itself as a somewhat pro-environment, pro-LGBT and progressive affair.

To understand how the UK got to this point, and to play a bit of catch up, we need to go back about ten years to the resignation of Tony Blair. You may remember him from such things as super-awkward attempts to make politics cool…

[Image: Tony Blair playing a guitar]

…his newborn son’s role in the MMR-vaccine controversy of the early 2000s, and, of course, sexing up a dossier on Iraq’s ability to launch weapons of mass destruction as a pretext to taking the United Kingdom and the United States into a war because, in terms best described with Pulp Fiction metaphors, Tony Blair was the Gimp to George W. Bush’s Zed.

After Blair’s resignation under a wave of controversy, the position was inherited by his long-term friend and political ally Gordon Brown, a man for whom the words ‘dour’ and ‘lacklustre’ were specifically invented, and whose attempts at smiling still haunt the dreams of those who were children in the 00s and are now permanently traumatised adults:

[Image: Gordon Brown smiling]

“PLEASE DEAR GOD NO! I’LL VOTE FOR YOU I’LL VOTE FOR YOU! JUST NEVER DO THAT AGAIN PLEASE!”

Now, there is nothing wrong with UK leaders simply inheriting the position. Constitutionally, the Prime Minister of the UK is whoever can command a majority in Parliament – in practice, the leader of the largest party. But it does leave a bit of a sour taste in the mouth when one hasn’t actually stood for an election. So when 2010 rolled around and Gordon Brown stood for election, it was a big deal. A lot rested on Brown’s ability to win that election… which he utterly failed to do.

There was a disastrous combination of Tony Blair’s poisoned legacy, a perceived mishandling of the credit crunch and financial crisis of 2008, and Brown himself being caught on a hot microphone calling Gillian Duffy, a Rochdale pensioner he’d spoken with about immigration while on the campaign trail, a “bigoted woman”. All this left him pretty much unelectable. It was as if, after years of training a dog to do basic maths by tapping out the numbers, when it finally came to the talent show final it just sat there, defecated on the stage and barked “bigot!” at the audience.

[Image: a dog doing maths]

“NO, FIDO! WE’VE BEEN OVER THIS BEFORE! JUST TAP YOUR PAW THREE TIMES! BAD DOG!”

Now, to be fair to Brown, the defeat here wasn’t a complete humiliation. 2010 saw the UK enter a hung parliament situation, the kind it’s in today, and the Labour Party’s main rivals, the Conservatives, still couldn’t form a majority on their own. So, the Conservatives reached out to the UK’s third party, the Liberal Democrats, and this man, Nick Clegg:

[Image: Nick Clegg]

Clegg is a man who is perfectly, acceptably normal on the face of it but, while you can’t put your finger on why, probably has something wrong with him. He’s like that cousin who occasionally comes to dinner, who you get on with, and like, and agree with, and who is wholly charming, and erudite, and intelligent, but almost certainly masturbates with both hands and can only climax while looking at cleaning products. Or a work colleague you respect enormously, would do anything for and would even be happy for if they got a promotion instead of you, because “Go you, Nick, you deserve it!”… but who then goes to a bar and orders a Bud Light Lime.

Now, the weirdness here should be self-explanatory: a coalition between the Conservatives and Liberal Democrats appears to be the weirdest flavour combination since Ben and Jerry’s introduced Peanut Butter and Sweetcorn, with a core of cheap hotdog. Sure, you could technically eat it and survive, but… really? Just… really? You want to eat that?

And that continued into 2015 as an uneasy peace between the two ideologically mismatched parties. The UK then held another scheduled election. This time, the Conservatives reached the threshold for a majority, and reigned as a full government. Meanwhile, the Liberal Democrats were destroyed completely, losing most of their seats to a combination of Labour and the Scottish National Party as revenge for leaping into bed with the devil, amongst other things. So the Conservatives were expected to have plain sailing from then on… and then this happened.

In fairness, it’s easy to see in hindsight that this was the equivalent of David Cameron cycling along merrily and then jabbing a metal rod into the spokes of his front wheel, before toppling into a ditch full of horse manure. But at the time it was a shrewd strategy to try to silence anti-EU members of his party, and to prevent his voters from defecting entirely to UKIP, the United Kingdom Independence Party – a party whose name will remain forever a huge slap in the face for people actually fighting for real independence.

Obviously, the EU referendum went badly for Cameron and he resigned in a shock announcement, declaring that whoever led the country out of the EU, it wouldn’t be him. This opened up a leadership contest to choose the next Prime Minister. It would be something of a repeat of Tony Blair stepping down to leave Gordon Brown in charge – whoever replaced Cameron would have to do a very convincing job of it to keep that sour taste of “unelected Prime Minister” out of the mouths of the electorate.

So, who did the Conservatives have to choose from?

[Image: the 2016 Conservative leadership candidates]

Well, first, there was Michael Gove, a man who was such an incompetent education secretary that teachers now literally use his name as a verb meaning ‘to bluster into a situation you have no experience in and fuck everything up’. Boris Johnson, one of those inflatable flailing tube-men you see outside used car dealerships, who gained sentience during a Weird Science ritual. And there was also Andrea Leadsom, a woman so unknown that even her supporters had to check Wikipedia every day to see who she was.

But it turned out there wouldn’t even be a leadership election anyway, as those other three candidates eventually dropped out. Johnson and Gove stabbed each other in the back pretty much at the first hurdle, while that… Other One dropped out after saying May was unqualified to lead the country because she didn’t have children. Theresa May won by default – hashtag-itsokaytonothavechildren, hashtag-feminism. Yes, she won the same way that a half-blind, half-deaf octogenarian with no thumbs would win at Mario Kart: the other three players acted like stoned toddlers who took one look at Rainbow Road and just fell over giggling.

But… meanwhile, behind all of that craziness, there was this man, Jeremy Corbyn, who was elected as leader of the official opposition, the Labour Party, in the aftermath of the 2015 election.

[Image: Jeremy Corbyn]

Imagine Corbyn as something of an Obi-Wan Kenobi-like figure – if Obi-Wan explicitly said he would refuse to push the button on his lightsaber, and instead sat down to discuss a peace deal with Darth Vader over tea, only for the photos to emerge years later and deride him as a Sith sympathiser. Or, perhaps closer to home, he’s like a less angry Bernie Sanders, whose stunningly sober vices include gardening in his allotment and studying manhole covers. No, really.

Corbyn was something of a disaster for Labour initially. Despite having strong popular support from the Party’s membership, he faced almost constant criticism from Labour MPs who, only a year into his time as leader, even launched a coup against him: he lost a vote of no confidence among his MPs, forcing a second leadership election – which he won again, by a similar margin. Labour were in disarray for nearly two years following the 2015 election. It was as if the stoned toddlers from before had simply given up playing Mario Kart at all, and just started hitting each other with the controllers because, well, why not, we don’t need to play Mario Kart for another five years anyway, let’s have some fun hitting each other instead. For some time, it seemed like Labour’s prospects of political success were in the sewers – which, ironically, might be fine with Corbyn, because he likes staring at manhole covers.

Further Labour in-fighting seemed like a dead cert for the next few years. And that’s where we find yet another bit of complexity in the story, in the form of the Fixed-term Parliaments Act – a title of a law so boring that not even Tony Blair could sex it up. In short, the act fixes UK parliamentary terms at five years, calling for an election on a fixed five-year cycle, as opposed to the previous system, which was near enough officially “I dunno, whenevs, bruh?” There was, however, an out from this – if two-thirds of MPs voted for an early election, one could be called at any time.

No-one, quite literally no-one, thought it would happen, though. Theresa May said there wouldn’t be one – the first of many, many U-turns in 2017 – and most people thought Labour would be clinically insane to go along with it since, in their drunken state of perpetual in-fighting, the only end result would be the complete decimation of their party at the polls. But, somehow, it happened anyway and the election was called.

Why it was called is possibly even more complicated. On the one hand, Theresa May was insistent it was to give her the mandate to tackle the Brexit negotiations exactly as she wanted. On the other hand, many of her MPs were facing police investigations into electoral expenses fraud. Now, to be fair, those investigations mostly ended without any charges being levelled against the MPs in question, and the police said their mis-spent expenses were mistakes, not intentional fraud, but the timing of the election has been considered suspect in light of the scandal.

And this is where it gets difficult to really appreciate and follow exactly what has happened since the General Election was called and the result came in. British politics has almost, but not quite fully, inverted. Now, to be fair, Labour still haven’t won. They won’t be in government. There’s no indication that Jeremy Corbyn will ever be Prime Minister as it stands. He can attempt to form a minority government if Theresa May’s negotiations with the DUP fall through, but most likely that would take another election, and probably one pretty soon – and there seems to be very little appetite going around the UK to go through this shitfest yet again. But let’s look at a quick run-down of the things that have happened in only the last few weeks.

Firstly, Theresa May and the Conservatives basically shot themselves in the foot by targeting their own base – that is, wealthy, white old people – threatening to force pensioners to sell their houses to cover the costs of dementia care, as well as taking away benefits including the winter fuel allowance. Both pledges became so toxic they became the first flagship manifesto promises to be broken before an election had even taken place. Then Theresa May refused to take part in any debates, televised or otherwise, and mostly hunkered down in planned and controlled photoshoots with selected party faithful rather than the general public. Then, during one televised debate, Home Secretary Amber Rudd – standing in for Theresa May – asked the audience to judge the Conservatives on their record, getting the biggest outright laugh of the evening. The second biggest laugh possibly went to this man, Tim Farron, a pea-on-a-cocktail-stick crossed with a children’s TV presenter, when he used his final debate speech to urge the audience to go and make a cup of tea and change the channel to the Bake Off instead of listening to Amber Rudd’s closing remarks:

[Image: Tim Farron]

And then there was the entire Conservative policy platform and manifesto, which was widely mocked for being very light on detail, or substance of any kind, but very firm on the words “Brexit” and “strong and stable” – as if those were the only words they had available at the time, because their austerity measures stopped them from buying new ones.

At the other end of the political spectrum, a minor miracle occurred: the stoned toddlers in Labour stopped hitting each other with video game controllers long enough to snap to attention and get their act together, resulting in a huge surge in the opinion polls that put them neck-and-neck with the Conservatives by election day on June 8th. Then, when the votes were counted, Labour had won big, the Conservatives had lost a little, and the Scottish National Party had lost big, with their votes splitting apparently at random between the Conservatives and Labour. And that’s without getting into UKIP, which may no longer exist by the next election, even if it comes really soon, as it won zero seats. And it’s also without mentioning the small gains by the Liberal Democrats, who recovered slightly from the beating they got in 2015. And that’s also without getting into how Labour managed their turnaround in the face of a very hostile media that has been widely criticised for not giving Corbyn a fair hearing.

And while all this was going on, Britain faced two major terrorist attacks that left dozens dead in two major cities.

The whole thing has left Britain a confuddled, weary mess, much like it was in 2010. And there is now a lot of uncertainty, perhaps even more than last time. The deal between the Conservatives and the DUP seems strained, yet very casual, so no-one can say how strong and stable it will be in the end. Theresa May was looking to shore up her majority and run the country for five years, but has instead been humbled, and there are already calls for her to resign, opening up the Party leadership to many of those who fell flat on their faces the last time. Meanwhile, the fallout this will have for the Brexit negotiations is completely unknown, and the UK’s departure from the European Union is even more up in the air than before.

So, the question is: how much more of this can Britain take?

And now, this…

Thanks, but I’d rather not have the cash…

Warning, British politics ahead…


Today, another fearmongering piece of toilet paper came through the letterbox, proudly proclaiming that I need to be shit-scared of the following…

[Image: the leaflet’s tax warning]

My income tax cut is under threat should I vote Labour in the coming weeks!

Factual dubiousness aside – the Labour tax plan doesn’t suggest massive income tax rises for that many people – let’s assume it was literally true. Let’s assume that the last tax cut (the raise in the tax-free allowance this year) will actually be reversed. Let’s look at the tax cut I’ve been granted this year under the Conservatives.

My pay slip shows this as about £11.60 – you can now use a tax calculator and the nationally-agreed pay scale for academic-related activities to deduce how much I earn if you like.

Now, the question I would like to pose to Conservatives – either official representatives or their voters – seriously, what the fuck am I meant to do with this?

[Image: the tax cut in question]

It’s not enough to buy private health insurance. That’s £93 per month on average.

It won’t pay off my student loan much faster. £10 a month would clear it in about 83 years, by a rough estimate – working below, for the sceptical.

It won’t pay for private school for any potential kids I have. That’s a grand a month on average.

It won’t get me a car or fuel it, and it won’t exactly grow into a massive retirement fund.

I doubt I could employ someone and start a business. I might be able to pay someone to mow the lawn for me with it, which I’m sure will fix the economy pronto.

Hell, at best I figure it’ll buy me two pints of Brew Dog’s fizzy-piss-flavoured craft beers per month!

Maybe I’ll just up my charitable contributions by £10; that’s probably the best use I can think of for it.
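About that student loan claim: here’s the back of my envelope. A minimal sketch, assuming a balance of roughly £10,000 and purely illustrative interest rates – the real loan terms (and my actual balance) will differ:

```python
# Rough amortisation of a student loan at £10/month.
# Assumed figures: ~£10,000 balance; the interest rates are illustrative only.

def months_to_clear(balance, monthly_payment, annual_rate):
    """Count the months until the balance hits zero; None if it never does."""
    monthly_rate = annual_rate / 12
    months = 0
    while balance > 0:
        interest = balance * monthly_rate
        if monthly_payment <= interest:
            return None  # the payment doesn't even cover the interest
        balance += interest - monthly_payment
        months += 1
    return months

print(months_to_clear(10_000, 10, 0.0))    # 1000 months, i.e. ~83 years
print(months_to_clear(10_000, 10, 0.012))  # None: at 1.2%+ it never clears
```

So “about 83 years” is the optimistic, zero-interest case; with any real interest rate, the tenner mostly just evaporates against the interest.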

“Ah, but be grateful!” you might say, “it’ll mean a lot to people on minimum wage!” Maybe, sure. Maybe it’s a lot when you’re super poor – I’ve been there. But even then, it was a fraction of rent, a fraction of commuting costs, and barely a week’s worth of ramen. But I should still be grateful, because it’s rewarding me as a member of a Hard Working Family™.

But, more pressingly, what has this wonderful sum of money cost me? Let’s rattle a few things off that have been going on under neoliberal rule in the UK…

The wages in my sector have stagnated or even reversed in real terms in the last few years. The students I teach are now paying more (about 10x what even my early-millennial ass did) for less return as graduate wages collapse. Education is being undermined. Children in school are having their meals threatened. The disabled are being forced into tests to see if they’re fit to work – and then dying afterwards, as if the system was designed for it. We’re going to ruin the economy by giving the middle finger to our nearest neighbours because Strong and Stable. Rents are going up, uncontrolled. Funding for local services is going down, meaning rises in council tax rates alone will eat up anything income tax cuts grant me. They’re punishing children who dared to be born third and poor. They’re dismantling the NHS, replacing it with a health pseudo-service whose principal aim is to turn government subsidy into private profit – as rail privatisation has done repeatedly, with fares rising above inflation for over a decade now. And I’m sure the list continues, and people with more knowledge than me can be more precise about it.

All of this has cost me, and will keep costing me, the equivalent of hundreds a month. If you want to put a figure on it, a back-of-an-envelope calculation for my loss in real-terms earnings since the Conservatives took office is around £200 a month, at least. The crash in the pound after that referendum cluster-fuck wiped even more off my earnings compared to my overseas colleagues. Cheers guys, the extra tenner almost makes up for it! The price of living is going up and up – and they want me to get down on my knees and worship them for the extra tenner they granted me? That cut, which sounds like it could go somewhere useful, has been eaten up ten times over just by the shit they’ve caused granting even bigger cuts to the far more exceedingly wealthy.
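If you want to check that back-of-an-envelope number, here’s a minimal sketch of the compounding. The salary, pay-rise and inflation figures below are assumptions for illustration, not my actual numbers:

```python
# Back-of-envelope real-terms pay erosion: pay rising slower than inflation.
# All inputs are illustrative assumptions, not real figures.

salary = 35_000      # assumed gross salary, £/year
pay_rise = 0.01      # assumed average annual pay rise (1%)
inflation = 0.025    # assumed average annual inflation (2.5%)
years = 7

# Fraction of purchasing power lost after `years` of below-inflation rises.
real_change = ((1 + pay_rise) / (1 + inflation)) ** years - 1
monthly_loss = -real_change * salary / 12

print(f"real-terms change: {real_change:.1%}")  # about -9.8%
print(f"monthly loss: £{monthly_loss:.0f}")     # about £286 a month
```

Tweak the inputs however you like; anything in that ballpark dwarfs £11.60.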

“OH GREAT BENEVOLENT THERESA MAY AND YOUR WISE CABINET! I AM UNWORTHY OF YOUR GENEROSITY!!!!”

Can they just fuck off with their shit, already?

The Chemistry of ‘Yes Minister’

I’m writing a book. Well, it’s more like 20+ blog posts strung together, all on the theme of how science is depicted in the media. More than just factual errors, I’m interested in how science and scientists are portrayed and how that influences people’s opinions of science as a process. This includes looking at fictional depictions. Here’s one sub-essay from it – looking at a particular episode of the comedy series Yes Minister.

That has science in it? Yes, it does. Let’s have a look…


Yes Minister is one of the most seminal British political satires, remaining profoundly relevant even decades after it first aired. The series centres on the ineffective and self-aggrandising Minister for Administrative Affairs, Jim Hacker, the somewhat-Machiavellian civil servant, Sir Humphrey Appleby, and their fight over who truly runs their department. Given the biting political satire on the nature of the British government, its traditions and sprawling bureaucracy, it’s possibly the last place you’d expect to see some serious depictions of science in fiction. Yet, nestled amongst the normal political spoofs of self-preserving civil servants fighting against self-serving politicians, is one of the most striking bits of satire on science with respect to political policy.

The second series episode The Greasy Pole, named after a Benjamin Disraeli quote on the nature of climbing the “greasy pole” of political promotion, focuses on the minister trying to give the go-ahead to the British Chemical Corporation’s (BCC) manufacture of propanol in Merseyside. It opens with Sir Humphrey talking with the chairman of the BCC, and offering assurances that his minister will find no objection to their new chemical plant – after all, it’s Sir Humphrey who truly runs the department whenever he can befuddle the minister into agreeing with everything he proposes.

Any standard essay delving into science on television would stop here to examine some chemical facts and inaccuracies brought up in the opening exchange. Yes Minister gets some of this almost hilariously wrong. In the real world, propanol is a short-chain alcohol, only a single carbon unit longer than ethanol, the compound that gets you drunk and gives you a hangover afterwards. The TV show, in addition to pronouncing it with shortened vowels not common to modern chemistry, seems to imply that it’s some mysterious chemical or medical drug, and a complex concoction that requires dioxin to produce. Similarly, “dioxin” isn’t an individual chemical, per se, but a conventional name for a wider class of compounds known as “dioxin-like compounds” (DLCs). DLCs are known to be very persistent pollutants, amongst the most toxic and dangerous chemicals commonly used. The most basic DLC is known as 1,4-dioxin, but there are a few hundred derivatives with this structure at their centre. We’ll talk about the most likely candidate the fictional characters are referring to in a moment.

The mild factual errors continue. In Yes Minister, the British Chemical Corporation have shown that they can produce propanol not with dioxin, but with meta-dioxin, which they claim is inert and safe compared to the deadly dioxin. Trouble is, there is no such thing as meta-dioxin – the closest real DLC that would match that name is known as 1,3-dioxin. There is also metadoxine, which, despite its incredibly similar name, is something else entirely.

[Image: dioxin structures]

Meta in chemistry refers to a position around a hexagonal ring of six atoms. The closest position (to a particular reference position) is known as the “ortho” position; the site opposite the reference position is the “para” position; and the one in between is the “meta” position. Ortho-, meta– and para– compounds, therefore, are similar chemicals where a particular group of atoms has been moved around the ring to different positions. “Metadioxin” makes a small amount of sense as a name, but can’t tell us what group of atoms is “meta” to what other group of atoms – unless we reasonably assume it means the two oxygen atoms, in which case we arrive at either 1,3-dioxin or an isomer of 1,4-dioxin that hasn’t been made before.
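If you want to see those three arrangements concretely, here’s a quick sketch using the dichlorobenzenes – the textbook example – rather than any dioxin. This is my own illustration (it assumes the RDKit cheminformatics library is installed), not anything from the episode:

```python
# Ortho/meta/para illustrated with the three dichlorobenzene isomers.
# Requires RDKit (pip install rdkit).
from rdkit import Chem

isomers = {
    "ortho (1,2-)": "Clc1ccccc1Cl",    # Cl groups on adjacent ring carbons
    "meta (1,3-)":  "Clc1cccc(Cl)c1",  # one carbon between the Cl groups
    "para (1,4-)":  "Clc1ccc(Cl)cc1",  # Cl groups directly opposite
}

for name, smiles in isomers.items():
    mol = Chem.MolFromSmiles(smiles)
    # Same formula, same bonds - only the relative positions differ.
    print(f"{name}: {Chem.MolToSmiles(mol)}, {mol.GetNumAtoms()} heavy atoms")
```

Same atoms in every case; the prefix only tells you where they sit relative to each other – which is exactly the information “metadioxin” fails to pin down.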

The confusion around the identity of “metadioxin”, however, is mostly a side effect of using “trivial” or “non-formal” names for chemicals. When you see long, convoluted names that sometimes span more than one line of text, complete with brackets and numbers (and we’ll come across one shortly), this is a sign that the name is systematic – it follows rules that allow you to construct the exact structure of the compound from knowing only the name. Without this (and the knowledge to unpack the name into the structure) a trivial name can be very misleading, verging on complete meaninglessness, leaving us at a complete loss as to the identity of the compound.

This brings us nicely to where Yes Minister gets its depiction of science absolutely correct: no-one in the TV show knows anything about the chemistry at hand and, as they say, hilarity ensues. The characters lack the ability to evaluate the safety of metadioxin and propanol themselves, and can barely even identify the compounds in the first place. At one point in the episode, the minister, a local MP and two civil servants try to talk their way around the decision while attempting not to betray their utter ignorance of the subject at hand. When asked what “meta” means, Sir Humphrey uses his education in classics to describe it as “after” dioxin, or “beyond” dioxin, from the literal Greek meaning of “meta” – as opposed to the positional nomenclature described above. Even the definition of “inert” evades the small group – they conclude that it must mean that the compound isn’t “ert”, whatever “ert” means. “Inert”, of course, means unreactive, or harmless – nitrogen or argon are considered “inert” gases, for example, and the term is perhaps one of the more common pieces of chemical vocabulary. But the fictional scene is a good example of how apparently common jargon for a specialist might elude a non-specialist audience. The show depicts people making an important political decision about chemistry while knowing nothing of the science.

[Image: a scene from Yes Minister]

This isn’t too far removed from events that have happened in reality – the UK’s Psychoactive Substances Act 2016 was widely criticised for not having a rigorous definition of “psychoactive”. Drug enforcement legislation up until that act was based around banning specific chemicals, which can be codified into law fairly easily. That posed a clear problem: what about new drugs that weren’t legislated against? So-called “legal highs” were major news stories throughout the late ‘00s and early ‘10s, often with dubious factual accuracy. So the government aimed to ban them in advance, by banning any substance that ‘affects the person’s mental functioning or emotional state’. But such a broad, and medically meaningless, pre-emptive strike by legislation caused a lot of confusion – taken literally it would ban countless substances, from alcohol and caffeine to the incense used by churches across the country. The government would struggle to implement the legislation.
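The legislative problem maps quite neatly onto a programming one, if you’ll forgive a toy analogy of my own (the substance names and functions below are invented for illustration, not taken from the act itself):

```python
# Two ways to legislate: enumerate banned substances (easy to apply, but
# always a step behind the chemists), or ban by effect (future-proof in
# theory, but resting on an undefined predicate).

BANNED = {"mephedrone", "methoxetamine"}  # the pre-2016 approach: a list

def banned_by_list(substance: str) -> bool:
    # Decidable and consistent - but any new, unlisted compound passes.
    return substance in BANNED

def affects_mental_state(substance: str) -> bool:
    # The 2016 act's test, roughly paraphrased. Taken literally it catches
    # caffeine, alcohol and church incense, and there is no rigorous
    # definition of "psychoactive" with which to implement it.
    raise NotImplementedError("no rigorous definition of 'psychoactive'")

def banned_by_effect(substance: str) -> bool:
    # Future-proof on paper; unenforceable in practice.
    return affects_mental_state(substance)

print(banned_by_list("mephedrone"))         # True - it's on the list
print(banned_by_list("novel-compound-x"))   # False - the loophole
```

The old approach fails on novelty; the new one fails on definition – and the act chose the second failure mode.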

What do the fictional characters have to go on when talking about chemistry, if not an education in the subject? The politicians and civil servants only have two things they’re sure of:

  • The US Food and Drug Administration have given the fictional metadioxin a clean bill of health. There’s enough precedent for them to go ahead without worry.
  • Protests are erupting in Merseyside about the process – protesters don’t want the “dangerous” metadioxin in their backyard.

This is where the show seems to have done its homework in depicting representations of science in the public eye, and the name “metadioxin” becomes more consequential than you might expect. Just as with the “nuclear” in “nuclear magnetic resonance imaging”, the name alone has struck a chord with public perception. Chemical names can sway people because their familiarity and unfamiliarity work in tandem. Vani Hari, aka “the Food Babe”, uses the mantra that you shouldn’t eat a chemical if you can’t pronounce it. This makes absolutely no scientific or chemical sense, but the mantra is simple enough that people genuinely do follow it, assuming it must be a good idea – it seems fine as a heuristic. Names and guilt by association, therefore, do affect people’s judgement of chemicals in the real world – Yes Minister is a little heavy-handed about it, but that’s expected in satire. It’s certainly not wrong.

The episode makes it clear that the protesters have a rationale for being sceptical of the British Chemical Corporation, and an origin for their fears over the mere name “dioxin”. A key plot point of the episode is the Seveso disaster, which is where their knowledge of “dioxin”, apparently exclusively, comes from. Unlike the “propanol” and “metadioxin” of the episode, this event is very, very real. The disaster occurred on July 10, 1976 (only five years before the episode aired, so still in recent memory at the time) about 20 km north of Milan, Italy. It exposed thousands of people to 2,3,7,8-tetrachlorodibenzo-p-dioxin (TCDD) – and this is one of those long systematic names that lets us identify the compound precisely. Although the much simpler 1,4-dioxin is the basis for the word, TCDD is possibly the dioxin as far as public perception and environmental harm are concerned.

TCDD gets its infamy from both the Seveso disaster and from being a contaminant in “Agent Orange”, the military herbicide used in Vietnam to destroy crops and foliage. TCDD in Agent Orange has affected the health of around a million Vietnamese, causing birth defects, cancers and skin reactions. Then, in 1976, the accident near Seveso released around six tonnes of TCDD into the atmosphere, which settled into the surrounding area, affecting a population of over 100,000 people. The exposure immediately killed around 3,000 animals, and over the next two years 80,000 were slaughtered to keep TCDD out of the food chain. About a quarter of the 1,600 people examined in the zone closest to the chemical release suffered extreme skin lesions and inflammation, and 15 children were hospitalised. The political aftermath was significant, and Seveso gives its name to the European Union directives governing safety at chemical sites.

Seveso, however, is small fry compared to one chemical disaster that was yet to come. Three years after this episode of Yes Minister aired, a leak of 40 tonnes of methyl isocyanate in Bhopal, India, would kill around 4,000 people – just over 2,000 of them in the immediate aftermath – and harm around half a million. Other estimates put the direct death toll at around 8,000 for the weeks following the Bhopal disaster.

Despite the immediate reaction of the pro-science crowd to say that chemophobic protesters have nothing to worry about, the protesters’ fears are based in very real, and very deadly, precedents, and need to be understood and accepted before we can begin reassuring them about the stuff that definitely is safe. Or, at least, safer.

So politicians, just as depicted in this episode of Yes Minister, have a very fine line to tread. On the one hand, they have evidence that they may not fully appreciate or understand; on the other, they have public opinion that they very much do understand. The Rt. Hon. Jim Hacker, therefore, has a tough decision to make. Bring jobs and industry to a constituency by opening the chemical plant, and trust the evidence of scientists and chemists; or kowtow to public fears and make the popular decision to shut it down – ensuring both that the local MP is re-elected and that his own position in government is preserved. Even Sir Humphrey, who throughout the episode sings the praises of the chemical industry and is keen for the Minister to give the go-ahead for the plant, lacks the chemical know-how to make a convincing case.

The mostly-accurate, albeit cynical, depictions of science and policy continue. Faced with an FDA report that they can’t quite trust, the British government commission their own report from the Cambridge academic Professor Henderson. The fictional “Henderson Report” is expected to follow the FDA’s conclusion and clear metadioxin for use – as a result, the Minister has no reasonable grounds to make the politically expedient decision to block the chemical company. He could, however, attempt to circumvent the report. They could fail to publish it (which Sir Humphrey emphatically denies is equivalent to “suppression”) or they could discredit it – which they can do without even reading it, by issuing blanket statements such as “it leaves questions unanswered” or “the conclusions have been questioned”. The suggestions to personally discredit Professor Henderson himself will also seem familiar – they echo tactics used against Dr David Kelly, a biological weapons expert who (conspiracies about assassination aside) was pressured by the government over his involvement with the media and driven to suicide, and Professor David Nutt, who was forced out of his position as a government adviser because his evidence on the relative harms of different drugs went against government policy.

In the end, Hacker takes the politically self-serving decision and blocks the chemical plant, after putting pressure on Professor Henderson to add a note of caution to the report’s conclusion, even though the evidence in the body of the report couldn’t back it up. He’s hailed as a hero by the protesters, and might have secured re-election for his party’s MPs in the area – all at the expense of any scientific evidence. We can’t take self-confessed satire as a literal depiction, but we can see it reflect accurate attitudes to science, politics and protest. Decades after The Greasy Pole first aired, we can see how it prefigured things such as the MMR/autism hoax or climate change denial in politics. The scientists work in the realm of fact, the people work in the realm of heuristics and emotion – relying on the familiarity of names and words to inform them – and politicians can find themselves torn between the two, with only their own interests and experiences to guide them.

This shows that some depictions of science in fiction need to be looked at a little more deeply. Their factual errors may amuse science enthusiasts, who can claim a brief moment of superiority over the ignorance of the writers, but these errors are merely superficial. Below the simple assertions of scientific fact lie depictions of how science works as a process and how it interacts with and influences people. Some are accurate; others are satirical exaggerations. Indeed, the deeper meaning and satire might not be so profound if the superficial facts were entirely correct.

Everything is a Terrorist’s Friend…

Today’s Daily Mail front page… Jesus fucking wept… Anyway, here’s a link to some snark from the New Statesman on the subject. I want to offer my own below.

Here’s a list of all the things that are friends of terrorists…

It took me two minutes to find a car that could be used to mow people down in an intentional attack

THE INTERNAL COMBUSTION ENGINE, THE TERRORISTS’ FRIEND…

I notice that the suspect also breathed a considerable amount of oxygen in the run up to the attack

RESPIRATION, THE TERRORISTS’ FRIEND…

It takes approximately two minutes to conceive a child, a child that could grow up to read a terror manual

SEXUAL REPRODUCTION, THE TERRORISTS’ FRIEND…

Look at all the terror attacks that rely on substances reacting to form other substances, from fuel burning in engines to bullets firing to bombs exploding

CHEMICALS, THE TERRORISTS’ FRIEND…

We found a terrorist manual printed in PDF format online

UNICODE, THE TERRORISTS’ FRIEND…

We found a map of parliament that showed exactly where it was in London, enabling us to go bomb it if we wanted

THE LONDON A-Z STREET ATLAS, THE TERRORISTS’ FRIEND…

We noted that the attacker had not yet been killed by a tumour; we have to wonder why they were spared this disease

CANCER, THE TERRORISTS’ FRIEND…

Yesterday, it took the Mail two minutes on the web to find a terrorist manual

TCP/IP PACKET SWITCHING NETWORKS, THE TERRORISTS’ FRIEND…

It takes seconds to punch someone in the face, potentially killing them immediately.

VOLTAGE GATED CALCIUM CHANNELS LEADING TO MUSCLE CONTRACTION UPON RECEIPT OF A NERVE IMPULSE COMBINED WITH PRE-EXISTING HEAD INJURIES THAT MAY BE SENSITIVE TO SUDDEN IMPACTS, THE TERRORISTS’ FRIEND…

If I had the time, I’m sure it’d be worth photoshopping these onto a front cover, but it’s the fucking Mail, I’m not sure they’re worth the damn effort.

5 Things They Don’t Tell You About Teaching in Higher Education…

[Image: a university lecture]

Have you ever considered a career in teaching? Does it sound totally great but the concept of a PGCE and a month’s mandated nose-wiping in a Primary School turn you off? Would you rather teach people you can be cynical and sarcastic to? Then try HE!

(note that this is primarily cathartic cynicism, it’s still a good job, and where I’ve highlighted problems below I do have solutions – at least for the things in my control – but maybe for another time)


It’s long been thought that the difference between teaching at school and at university is that school pupils are forced to be there – as subjects become more optional, attitudes improve. I do think that’s true, and it’s part of what makes teaching in further and higher ed much more attractive. When students want to learn because they intrinsically value it, they’re great to teach, and this is backed up by (decent) research in education and psychology.

Except…

1. The students don’t want to be there

Once, long ago, students came to university because they had a passion for the subject – although this tended to correlate strongly with being wealthy and white for various reasons, that’s beside the point for now. People would happily come to university for the privilege and sheer honour of sitting in a stuffy room and listening to an academic talk endlessly about their area of expertise (a minor exaggeration, I’m sure). After all, it didn’t matter if you didn’t understand it; you just went to the library to learn it again properly before hitting the subsidised alcohol.

But that’s very much changed now.

We can blame the new fees regime, sure, but there’s been a broader cultural shift in what university is actually for – or, at least, seen to be for. It’s now a continuation of school, it’s just “what you do”, and if you don’t go to university you’re seen as a failure. Whether this comes from employers demanding “any degree” for jobs that don’t warrant it, wider society now valuing education for its own sake, or even direct bullshit-expectations from parents, students have to go to university. Students now scramble onto difficult STEM courses because they’re offered through clearing, but do so with a lack of maths qualifications and an interest in the subject that comes exclusively from being told “don’t do Art History or English, it’s a waste of time”. The expectation is “university”, as opposed to “physics” or “biochemistry”.

The end result is that students don’t really want to be taught by you. They see university as a 3-4 year prison sentence they must serve before they can graduate and get a decent job with more money – a fallacy, given that graduate wages are rapidly collapsing. Students increasingly see only the extrinsic value in the subject – the degree, the stepping stone to the next thing. You’re there to tell them how to pass the exam so that they can be graded, graduate with a 2:1, and put off deciding what they want to do with their lives for a bit longer. The effect this has on their motivation is just as bad as it is on school pupils who are “forced” to stay in school way past the point where they care about it.

It’s not universal, but it applies to a large enough fraction of students to make the job much harder than it needs to be. In fact, the job is already hard enough given that…

2. There is no training (that’s of any use)

Surely, the person standing up to lecture you has been taught how to do it effectively, right? And when someone is organising a tutorial, they’ve been told how to structure the session, respond to queries, and their notes on the questions contain an extensive troubleshooter and FAQ?

Nah.

You’re pretty much thrown into it with nothing if you decide to go on a teaching route. You’ll go into the lab for the first time to supervise 100 or so 18-21 year-olds and know nothing of the practicals. You’ll have a group of 6 in a tutorial and you won’t have had the chance to practice what you’re going to do with them. You’ll turn up to a lecture and this is the first time you’ll have given a presentation where the audience’s comprehension of what you’re about to say actually matters.

Now, this isn’t to say you’re ultimately terrible at it. Junior academics usually have to present to a lecture theatre (their research and proposals) before they’re employed. The ones that don’t get the job are the ones that fail to realise this isn’t a presentation, it’s an audition. As a result, anyone employed in that position can at least speak clearly and won’t fidget and mumble their way through a lecture series. But that’s it; that’s the main bit of quality control, and it only weeds out the physically incapable. Barring a yearly peer review (usually precipitated when the one person who cares decides to organise it), which focuses mainly on ticking a few boxes along the lines of “were you any good? Yeah, whatever”, there’s little to no culture of review or quality control in HE teaching. Responses from student feedback are, generally speaking, either useless (ranging from “they were fine” to needlessly personal insults) or unrepresentative (a response rate of 5% is good), so they can’t be used to improve teaching and direct your own development as a teacher.

Well, there is some training. But it doesn’t involve how to deliver or develop a curriculum, or make sure that your ideas are understood. Instead, you’ll get taught “learning styles” (largely debunked as hokum) or Kolb’s learning cycle (I’ve yet to find a use for it) and countless over-complicated words that really do nothing but state the obvious. You’ll hear about “travelling theory”, where you treat your “subject as a terrain to be explored with hills to be climbed for better viewpoints with the teacher as the travelling companion or expert guide”. This all sounds lovely and poetic and makes some abstract high-level sense, but it doesn’t really help you teach someone how to normalise a wavefunction or integrate a rate equation. And the diagrams – be sure to always call them “models” – the bloody diagrams that mean nothing but will make your eyeballs bleed. Bloom’s taxonomy (or at least the cognitive domain of it) might be useful when you’re writing exam questions, but that’s it. Make sure you use “conceptions” and “discourse” a lot when it comes to writing your essay to prove you’ve learned this stuff.

The only useful thing I got out of a year’s worth of workshops and coursework was a half-hour session on vocal health – because talking your bollocks off to 200 people for 45 minutes is harder physical work than it seems. That was great; and something I appreciated more than most, thanks to being married to a pro vocalist who has schooled me in the theory of that for over a decade.

Anyway, why the “training” sucks segues nicely into the next bit. You’re not really being trained to teach, exactly…

3. If you want recognition, be prepared to do something useless

Teaching is a largely thankless task in higher education. This sounds a bit weird if you think of university primarily as an educational institution, yet, it makes perfect sense if you think of them as academic institutions designed to generate research. Teaching doesn’t generate headlines, it doesn’t bring in millions in grant money, and it will get you a new building only once in a blue moon when the university finally listens to the 800th email saying “the teaching labs are about to fall down and kill people!” (because “they’re too small to fit the students you demand we should take” doesn’t get the job done).

This is slowly changing, though. We can blame the fee regime for this. Students now make up the majority of funding for universities, and with the Teaching Excellence Framework around the corner, the higher-ups are taking it seriously.

Except…

The training and recognition don’t reward good teaching, they reward talking about good teaching. Hopefully, I shouldn’t need to hammer home that these aren’t the same thing.

Consider what you need to do for an HEA fellowship, for example. You need to write essays and take part in continuous personal development (CPD), but few of those are ever based around your actual teaching (you have to write a case-study of your own teaching, but the actual aim is to analyse it using the bullshit you learned in your ‘training’ workshops). As a result, the people who get published in the educational literature, and so make a name for themselves as ‘good’ teachers, are the ones who write things like “Conceptions of Student Learning: A New Model Paradigm For Higher Education” and then proceed to yank four student types out of their arse and call them “Square Thinkers” and “Circle Thinkers” and “Triangle Thinkers” and “Squiggle Thinkers”, each described with Barnum statements and no real evidence, and then try to say something profound like “you should make your tutorial group of Squares, Circles, Triangles and Squiggles”. I’m not naming names, but this actually happened once.

So if you can guff around and talk crap about teaching and learning, and make it sound complicated and theoretical and academic, you could very easily find yourself en route to a very cushy academic job in an education department.

Alternatively, you can innovate. Innovation is something I won’t bash outright, but innovation for the sake of innovation is the enemy. Want a teaching award? Start a Twitter account! Send out homework assignments via Snapchat! Get into a packed lecture theatre and do explosions with your students – don’t bother telling them why they explode and how to stop it, that might be useful to them, and that’s boring. Experiment with keeping your office door open! Do EBL and PBL and use the word “constructivism” a lot! Add your students on Facebook! Tear up the rule book because you’re cool and wait for lavish praise to fall upon you!

If you’re a softly-spoken lecturer who stands at the front and just talks – calmly, rationally, and with a clear message – the students will go away knowing a lot about the subject. But that sort of crap doesn’t get you an award or a promotion. (Before you think this just sounds like bitterness on my part: I’m not actually this kind of lecturer myself.)

Anyway, you can avoid most ‘training’ sessions, except the most important one, which they probably won’t tell you about…

4. You need to learn mental health first-aid

So, cynicism aside for a moment: if you want to work with students, seriously, learn mental health first-aid. Believe me, there’s a lot here that “common sense” won’t get you through, so you need to be taught it properly by someone who knows what they’re doing. It’s difficult to deal with, but it’s something you will inevitably deal with, and it may even take up a measurable chunk of your time (which can’t be directly assigned to the Work Allocation Model, of course).

Why is this important and potentially time-consuming?

Look above at all the crap students have to deal with. Under pressure to perform from their parents, locked into a course they hate by the expense and the fear that they’ll never pay back these objectively ridiculous fees, surrounded by staff who would rather be writing their next Science paper than answering questions on thermodynamics, faced with lab work that’s almost designed to overload their working memory… and then panicked that they haven’t learned anything from the young, hip and trendy ones who are telling them to check their Twitter feed for tutorial announcements.

All that on top of being young, a bit dim, unsure… by the gods, the list goes on. It is a perfect recipe for a mental breakdown. And this is strikingly common, and not just restricted to the stereotype of the emo goth girl who broke up with her boyfriend. Anyone who comes into your office could break down in tears at a moment’s notice.

I really don’t talk about this often, so I’ll get it over with in a single quick-fire list: in a few short years I’ve had students on anti-depressants, undergoing CBT, having panic attacks in labs, disclosing that they’d been sexually assaulted, having been mugged, saying their family has just imploded, discovering they’re dyslexic, passing out in an exam and waking up in hospital, passing out in a laboratory, passing out in my office…

This is serious fucking business. We’re not there to be therapists – we shouldn’t take on that role – but university counselling services are stretched thin, underfunded (relative to the need for them), and only really available as palliative care rather than preventative care. As a result, we often have no choice. If you take a teaching-track route into HE, you’re likely to be in close contact with students far more often than your research-focused counterparts, you’ll be seen as more approachable because of it, and you’ll deal with this whether you like it or not.

Maybe you want to stay in research over teaching, because…

5. We don’t know if it’s going to become a dead-end or not

As recently as 5-6 years ago, a teaching track in a university was a dead-end. Teaching staff were recruited as cheap and easy plugs for jobs that senior academics didn’t want to do. They don’t want to spend 6 hours on their feet in teaching labs. They don’t want to blow 4 hours a week on tutorials. They’ll put up with a lecture course if it’s the only one they have to teach that term and they don’t have to do anything but stand and talk. And so, teaching-focused staff were born – costing only as much as a postdoc to employ, capable of absorbing much of the abuse students generate, and with copious free time to be loaded up via the “any other duties” bit of the job description.

But there was no promotion track. There’s no way, as a teaching-focused academic, that you can write and bring in a 6-7 figure grant. There’s no way, as someone who doesn’t run a research group, that you can realistically publish a high-impact paper. And so there was no way a university or department could reward you for it.

This has, however, mildly improved. There are now promotion criteria, there are pathways to senior positions, and – even if it is as rare as astatine – you can get a tenured professorship purely on teaching. Some places are even slowly unpicking the distinction between teaching- and research-focused staff, allowing you to hold the title of “lecturer” officially – ironically, “lecturer” usually means you do less lecturing than the people without the title. This is all fabulous, of course. Finally, universities are recognising that students bring in a load of cash, and so the staff who teach them stuff might be worth investing in.

But.

There’s always a ‘but’.

The UK is slowly moving over to the United States’ model in, well, every area, really – and this includes HE. We’re going to privatise our healthcare, prisons and welfare, and we’re going to hike higher education fees until they’re inaccessible to all but the most advantaged. We also run the risk of paying staff less, exploiting the eagerness of younger researchers and teaching staff to take poorly-paid positions for a 1-2% shot at the big time. The US runs on a frankly appalling system of “adjunct” professors – usually newly-minted PhDs, typically paid per class they teach. The end result is that many of them teach classes at multiple institutions, often with long commutes in between, and are paid only for their contact hours. Once you factor in the travel time between jobs, the marking, the grading, the course development and other sundry overtime, the wages can work out at below minimum wage. Yet the system persists because people feel they have no other choice – and they’d be right; it is their only choice.
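
To make that concrete, here’s a back-of-the-envelope sketch in Python. Every figure in it is an assumption I’ve made up for illustration, not a sourced statistic – but it shows how quickly the headline hourly rate collapses once the unpaid hours are counted:

```python
# Back-of-the-envelope: an adjunct's effective hourly rate.
# Every figure below is an illustrative assumption, not a sourced statistic.

pay_per_course = 2_400        # $ per course, per semester (assumed)
weeks = 15                    # semester length in weeks (assumed)

contact = 3 * weeks           # hours actually in the classroom (3 hrs/week)
prep_marking = 6 * weeks      # prep, grading, email: ~2 unpaid hours per paid one
commute = 6 * weeks           # shuttling between two or three campuses

paid_only_rate = pay_per_course / contact
real_rate = pay_per_course / (contact + prep_marking + commute)

print(f"Looks like ${paid_only_rate:.0f}/hr; works out at ${real_rate:.2f}/hr")
# -> Looks like $53/hr; works out at $10.67/hr. Squeeze the per-course fee
#    or add more commuting, and it heads towards (or under) minimum wage.
```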

Is the UK heading that way, too? Maybe, maybe not. I don’t know. On the one hand, we’ve seen staggering improvements in the respect you get for teaching in HE; on the other, we could revert to the US model at a moment’s notice if the suits in charge notice that it’s cheaper to pay some young pup £3,000 to teach a class than it is to pay someone £30,000 to be full-time and teach only 4-5 of them.
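
And that arithmetic is brutally simple from the bean-counter’s side. A minimal sketch using the figures above, plus an assumed 25% employer on-cost (pension, National Insurance) that I’ve added for realism:

```python
# The suits' per-class arithmetic. The £3,000 fee, £30,000 salary and the
# 4-5 class load come from the text above; the 25% on-cost is my assumption.

fee_per_class = 3_000
full_time_salary = 30_000
on_costs = 0.25               # assumed employer pension/NI overhead
classes_per_full_timer = 5

full_time_cost_per_class = full_time_salary * (1 + on_costs) / classes_per_full_timer
print(f"Casual: £{fee_per_class:,}/class vs full-time: £{full_time_cost_per_class:,.0f}/class")
# -> Casual: £3,000/class vs full-time: £7,500/class. Less than half price,
#    provided you ignore everything done outside the lecture theatre.
```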

I’ve seen an increase in teaching positions advertised as “term-time only”, which pro-rate down to quite a low salary for a year’s work, meaning you’ll need a temp or part-time job to keep you going over the long summer. But, more importantly, term-time-only contracts and per-class contracts rob universities of the chance to do any development work. Most teaching-lab experiments were cutting edge back in the 60s, some lecture courses haven’t been updated since the 90s, and the intro courses given to first years are still the same tired old things despite evidence that flipped delivery would improve them. No one can fix that unless teaching-focused staff are given the time, respect, and clout to develop – and that means employing them full time, even over the Christmas, Easter, and summer breaks. If the worrying trend of employing them for their contact hours only continues, we’ll lose any chance of curriculum development or review by people who actually care about effective teaching.
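
If you’re tempted by one of those adverts, do the pro-rata arithmetic before you sign. A rough sketch, with an assumed full-time-equivalent salary and week count:

```python
# "Term-time only" pro-rata sketch. Both figures are assumptions for
# illustration; check the actual FTE salary and weeks on any real advert.

fte_salary = 30_000           # advertised full-time-equivalent salary (assumed)
paid_weeks = 32               # two teaching terms plus exam/marking weeks (assumed)
weeks_in_year = 52

actual_pay = fte_salary * paid_weeks / weeks_in_year
print(f"Advertised FTE: £{fte_salary:,} -> actual pay: £{actual_pay:,.0f}")
# -> Advertised FTE: £30,000 -> actual pay: £18,462, with an unpaid summer
#    in which no one is paid to overhaul those 1960s lab scripts.
```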

So there’s a lot of work being put in to make the position respectable. But it’s likely that the walking suits earning 10x what I ever will won’t like that, and will reverse the entire thing into a ditch.

We need to talk about “common sense”…

No two words in the English language have done more damage to the cause of human rationality than “common sense”.

(At least, I would like to simply assert that as some opening rhetoric/hyperbole, as quantifying that last sentence to prove it might be a little difficult and more trouble than it’s worth.)

Everyone’s heard of “common sense” before. There are 62 million results for it if you bung it into Google, and it’s probably been spat in your face since forever.

“We need more common sense!” you’ll hear from politicians as they begin to dismantle complex laws built up over time; “you have no common sense!” you’ll read in a backwater comments section, as if it refutes actual studies and research; “this is just common sense,” says someone in support of their own argument.

But what does it mean? Let’s run to Wikipedia and grab the first line:

Common sense is a basic ability to perceive, understand, and judge things that are shared by (“common to”) nearly all people and can reasonably be expected of nearly all people without need for debate.

Okay, not bad. We can all agree on that. And if you asked anyone who made an appeal to “common sense” this is probably what they’d cite as their definition.

Hopefully, though, you shouldn’t take too much convincing to know that there’s a big difference between what people say that they mean, and how they act out what they mean. Someone who is adamant that Jesus will protect them at all times will still look both ways when crossing a busy road. Someone with a stated mistrust of science and doctors will still knock back aspirin and paracetamol like it’s going out of fashion to cure their headache. Someone will proudly proclaim “I’m not a racist!” and then proceed to cite The Bell Curve and argue that Asians are all good at maths and that can’t be racist because it’s a positive thing. An intersectionalist will decry overgeneralisations, racism, assumptions, and promote the need to assess everyone as an individual that’s a sum of all their experience – and then rant about how terrible white people all are. The list goes on.

So it shouldn’t take too much imagination to realise that when someone says that “common sense” means “something obvious to all of us”, they might not really use it that way.

How is it used? As in, how is it actually used by the people who say it, and what do they want to achieve by saying it? Words themselves don’t have inherent meaning, but they do have use-cases and an effect intended by their users.

It doesn’t take long to search a right-wing tabloid for examples of the phrase. That hardened bastion of “common sense”, The Daily Mail, is full of them. Lord Falconer proposes “common sense” human rights – which doesn’t seem to suggest anything; it’s just an excuse to throw out other approaches. “She’s intelligent but doesn’t have common sense” – again, used to dismiss someone or some other way of thinking. “Insult to common sense” – said about laws to stop workplace harassment.

These are random-ish examples, but in none of them is there much reasoning to be found that’s “common to everyone”, as the definition suggests. There’s plenty of “common to everyone who thinks like me”, of course, but not much common to everyone. Everyone who thinks like a Daily Mail reader will certainly think lewd and sexist comments towards bar staff are acceptable, because “common sense” says that’s fine – it’s not really lewd and sexist, after all. But I don’t agree with that conclusion; it’s not obvious nor self-evident to me “without need for debate”. I think it’s fine to stick a recommendation, regulation or law in place that says “that’s not acceptable”. That’s why laws exist – to tell people that something isn’t acceptable when they may well think it is. Do I “lack common sense”, or do I just have a different view of the world and want to make it better for people? It seems the “without need for debate” part of that Wikipedia definition should be bold, italic, underscored and in a much larger font size than the rest.

I’ll leave you alone to stick things like “climate change” and “common sense” into Google to fill out those examples for yourself – it’s just too depressing otherwise.

Let’s look at another particularly insidious example. You’ve probably read a few things like this before – “Common Sense Died Today!” reads the usual headline, though there are many variants.

I want to use this example to convince you that while “common sense” supposedly means “something obvious, agreed upon by everyone”, it’s really used to mean “this heuristic that I’ve used for years and will now uncritically apply to a new situation whether it’s applicable or not” – in addition to the “we won’t debate this” part.

This reflects how humans actually think – our brains are huge stores of prior experience, and one of the reasons we can think and act so quickly (and sometimes efficiently) is that we look to those old experiences to help us deal with new ones. I don’t need to calculate the exact way to move my spine to counter-balance on one foot every time I walk – my brain simply looks it up from prior experience. That’s what things like “practice” lead to.

“Common sense” sounds like it fits this heuristic-based description quite well, but all good skeptics and rationalists should recognise that doing exactly as you did before sometimes isn’t a good idea. It might work in 90% of situations, but applying those hard-learned rules to the remaining 10% could end pretty badly. Yes, we can live with those odds – a few screw-ups is a small price to pay for efficiency the majority of the time – but I want to argue that if the only reason and rationale you can give is “common sense” or “we’ve always done it like this”, then you’re far, far more likely to be operating in that 10% of situations where it’s not going to work. Because, of course, people who are right can simply give the reason they’re right – they don’t need to say “this is just common sense” and refuse to elaborate.

Anyway… back to the “death of common sense” trope. This is a very common meme, and I want to unpick the version I linked to in particular, and hopefully convince you that it is, in fact, utter nonsense…

Common Sense lived by simple, sound financial policies (don’t spend more than you can earn)…

Right from the off we have something that’s actually very much wrong – or at least painfully oversimplified to the point of being effectively useless. Everyone has to accrue some form of debt to get things done. No-one really buys a house in cash, few even buy a car outright – we all spend more than we earn, usually as a form of investment so we might earn more later. In fact, businesses run on this principle pretty much exclusively.

Reports of a 6-year-old boy charged with sexual harassment for kissing a classmate…

This is uncited bollocks, since it’s unlikely anyone could charge a minor with sexual harassment – not because of “common sense”, but because minors don’t have the legal capacity to be responsible for their actions. But mostly: damn fucking right that’s sexual harassment, you stupid fuck.

It declined even further when schools were required to get parental consent to administer sun lotion or an aspirin to a student…

Further demonstrable nonsense because, get this, you’re their teacher, not their doctor. It sounds like great “common sense” to just hand out painkillers to kids, but you don’t know their medical history. You don’t know what else they’re taking that could react with it. You don’t know their allergies – and if you’re dealing with a school kid, you can’t trust them to reliably tell you. What? The kid complains of a headache and wants the teacher to give them paracetamol – are you sure they didn’t take some before arriving at school? They go to a lesson an hour later and ask for more… before long this “common sense” has caused liver failure in a kid.

Another version of this bit talks about elastoplasts – yeah, great, risk giving someone a dangerous reaction to latex because your “common sense” overruled actual medical responsibility.

…but could not inform parents when a student became pregnant and wanted to have an abortion.

Also not true. Generally speaking, teachers have no legal obligation to inform parents… nor do they have a legal obligation of confidentiality. But are you trying to imply that “common sense” dictates you must go behind the back of a student you have a duty of care over, in order to tell their parents something without discussing it with the student first? I wouldn’t be seen dead doing this “common sense” thing.

Common Sense finally gave up the will to live, after a woman failed to realize that a steaming cup of coffee was hot. She spilled a little in her lap, and was promptly awarded a huge settlement.

Randy Cassingham’s True Stella Awards is no longer active, but catch the page while it’s still up to learn why this one is true but misreported. The full list of other nonsense lawsuits that have been outright fabricated can be found here, in case you spot one in a variant of this meme.

Okay, so a lot of the above is wrong on a factual basis – yet the points still sound, to a degree, intuitively correct. People will certainly go around pretending some of them seem reasonable. But reality is far from intuitive. First-aid (particularly mental health first-aid) is hugely counter-intuitive. “Oh, you’ve broken your arm, clearly… here, give it to me while I yank it about and put it into a splint… stop screaming! It’s just common sense that you need this!!” – erm, no; if someone has broken a bone and they’re cradling it, it’s already in just about the most stable and comfortable position you can get it in. So much for common sense. “You want to self-harm? NO! Stop that! You’ll hurt yourself!!” someone will scream, citing that it’s “common sense” to protect someone – whereas the reality is that if you remove someone’s stabilising mechanism, you’re likely to do more harm in the long run. Someone might say “but you’re describing common sense!” – but, no, I’m not. This isn’t common. These things are genuinely counter-intuitive. They’re not well-known facts. And I certainly won’t defend them by saying “it’s just common sense”.

Exceptions come along far more regularly than you might expect – and, remember, if your main reasoning is intuition and “common sense”, then you’re probably dealing with one of those exceptions.

To go back and answer a previous question – how is it used? What is the intention behind someone saying “common sense”? Far from being an example of reason, it seems to exist principally to rebuke it – screw your argument, it’s common sense! It’s used to defend the status quo – a six-year-old kissing a classmate without their consent isn’t sexual harassment, common sense says so! It’s there to shut down reason, with the implicit assumption that anyone proposing something new lacks the fabled “common sense” and so is, deep down, just stupid.

So, instead of appealing to “common sense” to defend something, how about appealing to an actual reason? You should be able to give one if you happen to be right. After all, isn’t that just common sense?