Everything is a Terrorist’s Friend…

Today’s Daily Mail front page… Jesus fucking wept… Anyway, here’s a link to some snark from New Statesman on the subject. I want to offer my own below.

Here’s a list of all the things that are friends of terrorists…

It took me two minutes to find a car that could be used to mow people down in an intentional attack

THE INTERNAL COMBUSTION ENGINE, THE TERRORISTS’ FRIEND…

I notice that the suspect also breathed a considerable amount of oxygen in the run up to the attack

RESPIRATION, THE TERRORISTS’ FRIEND…

It takes approximately two minutes to conceive a child, a child that could grow up to read a terror manual

SEXUAL REPRODUCTION, THE TERRORISTS’ FRIEND…

Look at all the terror attacks that rely on substances reacting to form other substances, from fuel burning in engines to bullets firing to bombs exploding

CHEMICALS, THE TERRORISTS’ FRIEND…

We found a terrorist manual printed in PDF format online

UNICODE, THE TERRORISTS’ FRIEND…

We found a map of parliament that showed exactly where it was in London, enabling us to go bomb it if we wanted

THE LONDON A-Z STREET ATLAS, THE TERRORISTS’ FRIEND…

We noted that the attacker had not yet been killed by a tumour; we have to wonder why they were spared this disease

CANCER, THE TERRORISTS’ FRIEND…

Yesterday, it took the Mail two minutes on the web to find a terrorist manual

TCP/IP PACKET SWITCHING NETWORKS, THE TERRORISTS’ FRIEND…

It takes seconds to punch someone in the face, potentially killing them immediately.

VOLTAGE GATED CALCIUM CHANNELS LEADING TO MUSCLE CONTRACTION UPON RECEIPT OF A NERVE IMPULSE COMBINED WITH PRE-EXISTING HEAD INJURIES THAT MAY BE SENSITIVE TO SUDDEN IMPACTS, THE TERRORISTS’ FRIEND…

If I had the time, I’m sure it’d be worth photoshopping these onto a front cover, but it’s the fucking Mail, I’m not sure they’re worth the damn effort.

5 Things They Don’t Tell You About Teaching in Higher Education…


Have you ever considered a career in teaching? Does it sound totally great, but does the concept of a PGCE and a month’s mandated nose-wiping in a Primary School turn you off? Would you rather teach people you can be cynical and sarcastic to? Then try HE!

(note that this is primarily cathartic cynicism, it’s still a good job, and where I’ve highlighted problems below I do have solutions – at least for the things in my control – but maybe for another time)


It’s long been thought that the difference between teaching at school and at university is that pupils at the former are forced to be there – as subjects become more optional, attitudes improve. I do think that’s true, and it’s part of what makes teaching in further and higher ed. much more attractive. When students want to learn because they intrinsically value it, they’re great to teach, and this is backed up by (decent) research in education and psychology.

Except…

1. The students don’t want to be there

Once, long ago, students came to university because they had a passion for the subject – although this tended to correlate strongly with being wealthy and white for various reasons, that’s beside the point for now. People would happily come to university for the privilege and sheer honour of sitting in a stuffy room and listening to an academic talk endlessly about their area of expertise (a minor exaggeration, I’m sure). After all, it didn’t matter if you didn’t understand it, you just went to the library to learn it again properly before hitting the subsidised alcohol.

But that’s very much changed now.

We can blame the new fees regime, sure, but there’s been a broader cultural shift in what university is actually for – or, at least, seen to be for. It’s now a continuation of school, it’s just “what you do”, and if you don’t go to university you’re seen as a failure. Whether this comes from employers demanding “any degree” for jobs that don’t warrant it, wider society now valuing education for its own sake, or even direct bullshit-expectations from parents, students have to go to university. Students now scramble onto difficult STEM courses because they’re offered through clearing, but do so with a lack of maths qualifications and an interest in the subject that comes exclusively from being told “don’t do Art History or English, it’s a waste of time”. The expectation is “university”, as opposed to “physics” or “biochemistry”.

The end result is that students don’t really want to be taught by you. They see university as a 3-4 year prison sentence they must serve before they can graduate and get a decent job with more money – a fallacy given that graduate wages are rapidly collapsing. Students increasingly see only the extrinsic value in the subject – the degree, the stepping stone to the next thing. You’re there to tell them how to pass the exam so that they can be graded and graduate with a 2:1 and put off deciding what they want to do with their lives for a bit longer. The effect this has on their motivation is just as bad as it is for school pupils who are “forced” to stay in school way past the point where they care about it.

It’s not universal, but it applies to a large enough proportion of students to make the job much harder than it needs to be. In fact, the job is already hard enough given that…

2. There is no training (that’s of any use)

Surely, the person standing up to lecture you has been taught how to do it effectively, right? And when someone is organising a tutorial, they’ve been told how to structure the session, respond to queries, and their notes on the questions contain an extensive troubleshooter and FAQ?

Nah.

You’re pretty much thrown into it with nothing if you decide to go on a teaching route. You’ll go into the lab for the first time to supervise 100 or so 18-21 year-olds and know nothing of the practicals. You’ll have a group of 6 in a tutorial and you won’t have had the chance to practice what you’re going to do with them. You’ll turn up to a lecture and this is the first time you’ll have given a presentation where the audience’s comprehension of what you’re about to say actually matters.

Now, this isn’t to say you’re ultimately terrible at it. Junior academics usually have to present to a lecture theatre (their research and proposals) before they’re employed. The ones that don’t get the job are the ones that fail to realise this isn’t a presentation, it’s an audition. As a result, anyone employed in that position can at least speak clearly and won’t fidget and mumble their way through a lecture series. But that’s it, that’s the main bit of quality control that weeds out the physically incapable. Barring a yearly peer review (usually precipitated when the one person who cares decides to organise one), which focuses mainly on ticking a few boxes along the lines of “were you any good? Yeah, whatever”, there’s little to no culture of review or quality control on HE teaching. Responses from student feedback are, generally speaking, either useless (ranging from “they were fine” to needlessly personal insults) or unrepresentative (a response rate of 5% is good), so they can’t be used to improve teaching or direct your own development as a teacher.

Well, there is some training. But it doesn’t involve how to deliver or develop a curriculum, or make sure that your ideas are understood by people. Instead, you’ll get taught “learning styles” (largely debunked as hokum) or you’ll be taught Kolb’s learning cycle (I’ve yet to find a use for it) and countless over-complicated words that really do nothing but state the obvious. You’ll hear about “travelling theory”, where you treat your “subject as a terrain to be explored with hills to be climbed for better viewpoints with the teacher as the travelling companion or expert guide”. This all sounds lovely and poetic and makes some abstract high-level sense, but doesn’t really help you teach someone how to normalise a wavefunction or integrate a rate equation. And the diagrams – be sure to always call them “models” – the bloody diagrams that mean nothing but will make your eyeballs bleed. Bloom’s taxonomy (or at least the cognitive domain of it) might be useful for writing exam questions, but that’s it. Make sure you use “conceptions” and “discourse” a lot when it comes to writing your essay to prove you’ve learned this stuff.

The only useful thing I got out of a year’s worth of workshops and coursework was a half-hour session on vocal health – because talking your bollocks off to 200 people for 45 minutes is harder physical work than it seems. That was great, and something I appreciated more than most, thanks to being married to a pro vocalist who has schooled me in the theory of that for over a decade.

Anyway, why the “training” sucks segues nicely to the next bit. You’re not really being trained to teach, exactly…

3. If you want recognition, be prepared to do something useless

Teaching is a largely thankless task in higher education. This sounds a bit weird if you think of universities primarily as educational institutions, yet it makes perfect sense if you think of them as academic institutions designed to generate research. Teaching doesn’t generate headlines, it doesn’t bring in millions in grant money, and it will get you a new building only once in a blue moon when the university finally listens to the 800th email saying “the teaching labs are about to fall down and kill people!” (because “they’re too small to fit the students you demand we should take” doesn’t get the job done).

This is slowly changing, though. We can blame the fee regime for this. Students now make up the majority of funding for universities, and with the Teaching Excellence Framework around the corner, the higher-ups are taking it seriously.

Except…

The training and recognition don’t reward good teaching, they reward talking about good teaching. Hopefully, I shouldn’t need to hammer home that these aren’t the same thing.

Consider what you need to do for an HEA fellowship, for example. You need to write essays and take part in continuing professional development (CPD), but few of those are ever based around your actual teaching (you have to write a case study of your own teaching, but the actual aim is to analyse it using the bullshit you learned in your ‘training’ workshops). As a result, the people who get published in the educational literature, and so make a name for themselves as ‘good’ teachers, are the ones who write things like “Conceptions of Student Learning: A New Model Paradigm For Higher Education” and then proceed to yank four student types out of their arse and call them “Square Thinkers” and “Circle Thinkers” and “Triangle Thinkers” and “Squiggle Thinkers”, each described with Barnum statements and no real evidence, and then try to say something profound like “you should make your tutorial groups out of Squares, Circles, Triangles and Squiggles”. I’m not naming names, but this actually happened once.

So if you can guff around and talk crap about teaching and learning, and make it sound complicated and theoretical and academic, you could very easily find yourself en route to a very cushy academic job in an education department.

Alternatively, you can innovate. Innovation is something I won’t bash outright, but innovation for the sake of innovation is the enemy. Want a teaching award? Start a Twitter account! Send out homework assignments via Snapchat! Get into a packed lecture theatre and do explosions with your students – don’t bother telling them why they explode and how to stop it, that might be useful to them, and that’s boring. Experiment with keeping your office door open! Do EBL and PBL and use the word “constructivism” a lot! Add your students on Facebook! Tear up the rule book because you’re cool and wait for lavish praise to fall upon you!

If you’re a softly-spoken lecturer who stands at the front to just talk – calmly, rationally, and with a clear message – the students will go away knowing a lot about a subject. But that sort of crap doesn’t get you an award or promotion. (Before you think this just sounds like bitterness on my part, you should know I’m not actually that kind of person.)

Anyway, you can avoid most ‘training’ sessions, except the most important one, which they probably won’t tell you about…

4. You need to learn mental health first-aid

So, cynicism aside for a moment, if you want to work with students, seriously, learn mental health first-aid. Believe me, there’s a lot that “common sense” won’t get you through here so you need to know it and get taught by someone who knows what they’re doing. It’s difficult to deal with, but it’s something you will inevitably deal with and may even take up a measurable chunk of your time (which can’t be directly assigned to the Work Allocation Model, of course).

Why is this important and potentially time-consuming?

Look above at all the crap students have to deal with. Under pressure to perform from their parents, locked into a course they hate by the expense and the fears that they’ll never pay back these objectively ridiculous fees, surrounded by staff who would rather be writing their next Science paper than answer questions on thermodynamics, faced with lab work that’s almost designed to overload their working memory… and then panicked that they haven’t learned anything from the young, hip and trendy ones that are telling them to check their twitter feed for tutorial announcements.

All that on top of being young, a bit dim, unsure… by the gods, the list goes on. It is a perfect recipe for a mental breakdown. And this is strikingly common, and not just restricted to the stereotype of the emo goth girl who broke up with her boyfriend. Anyone who comes into your office could break down in tears at a moment’s notice.

I really don’t talk about this often, so I’ll get it over with in a single quick-fire list: in a few short years I’ve had students on anti-depressants, undergoing CBT, having panic attacks in labs, admitting to being sexually assaulted, having been mugged, saying that their family has just imploded, discovered they’re dyslexic, passed out in an exam and woke up in hospital, passed out in a laboratory, passed out in my office…

This is serious fucking business. We’re not there to be therapists – we shouldn’t take on that role – but university counselling services are stretched thin, underfunded (by comparison to their need), and are only really available as palliative care rather than preventative. As a result, we often have no choice. If you want to take a teaching-track route into HE, you’re likely to be in close contact with students far more often than your research-focused counterparts, you’re going to be seen as more approachable because of it, and you’re going to deal with this whether you like it or not.

Maybe you want to stay in research over teaching, because…

5. We don’t know if it’s going to become a dead-end or not

As recently as 5-6 years ago, a teaching-track in a university was a dead-end. Teaching staff were recruited as cheap and easy plugs to do jobs that senior academics didn’t want to do. They don’t want to spend 6 hours on their feet in teaching labs. They don’t want to blow 4 hours a week on tutorials. They’ll put up with a lecture course if it’s the only one they have to teach that term and they don’t have to do anything but stand and talk. And so, teaching-focused staff were born – costing only as much as a postdoc to employ, capable of absorbing much of the abuse students generate, and having copious free time to load up with that “any other duties” bit of the job description.

But there was no promotion track. There’s no way, as a teaching-focused academic, you can write and bring in a 6-7 figure grant. There’s no way, as someone who doesn’t run a research group, you can really publish a high-impact paper. And so there was no way that a university or department could reward you for it.

This has, however, mildly improved. There are now promotion criteria, there are pathways to get to senior positions, and – even if it is rare as astatine – you can get a tenured professorship purely on teaching. Some places are even slowly unpicking the distinction between teaching- and research-focused staff, allowing you to hold the title of “lecturer” officially – ironically, “lecturer” usually means you do less lecturing than the people without it. This is all fabulous, of course. Finally, universities are recognising that students bring in a load of cash, and so the staff to teach them stuff might be worth investing in.

But.

There’s always a ‘but’.

The UK is slowly moving over to the United States’ model in, well, every area, really – and this includes HE. We’re going to privatise our healthcare, prisons and welfare, and we’re going to hike higher education fees to make them inaccessible to all but the most advantaged people. We also run the risk of paying staff less, exploiting the eagerness of younger researchers and teaching staff to take poorly-paid positions for a 1-2% shot at the big time. The US runs on a frankly appalling system of “adjunct” professors – usually newly-minted PhDs who are paid per class they teach. The end result is that many of them teach classes at multiple institutions, often with long commutes between, and are paid only for the hours spent teaching. Once you factor in the travel time between jobs, the marking, grading, course development and other sundry overtime, the wages work out at just below minimum wage. Yet the system works because people feel they have no other choice – and they’d be right, that is their only choice.

Is the UK heading that way, too? Maybe, maybe not. I don’t know. On the one hand, we’ve seen staggering improvements in the respect you get for teaching in HE; on the other hand, we could revert to the US model at a moment’s notice if the suits in charge see that it’s cheaper to pay some young pup £3,000 to teach a class than it is to pay them £30,000 to be full-time and only teach 4-5.

I’ve seen an increase in teaching positions advertised as “term-time only”, which pro-rata down to quite a low salary for a year of work, meaning you’ll need a temp or part-time job to keep you busy in the long summer. But, more importantly, term-time-only contracts and per-class contracts rob universities of the chance to do any development work. Most teaching-lab experiments were cutting edge back in the 60s, some lecture courses haven’t been updated since the 90s, and intro courses given to first years are still the same tired old things despite evidence that flipped delivery would improve them. No one can do that unless teaching-focused staff are given the time, respect, and clout to develop – and that means employing them full time, even over the Christmas, Easter, and summer breaks. If the worrying trend to employ them for their hours only continues, we’ll lose any chance of curriculum development or review by people who actually care about effective teaching.

So there’s a lot of work being put in to make the position respectable. But it’s likely that the walking suits earning 10x what I ever will won’t like that, and will reverse the entire thing into a ditch.

We need to talk about “common sense”…

No two words in the English language have done more damage to the cause of human rationality than “common sense”.

(At least, I would like to simply assert that as some opening rhetoric/hyperbole, as quantifying that last sentence to prove it might be a little difficult and more trouble than it’s worth.)

Everyone’s heard of “common sense” before. There are 62 million results for it if you bung it into Google, and it’s probably been spat in your face since forever.

“We need more common sense!” you’ll hear from politicians as they begin to dismantle complex laws built up over time, or “you have no common sense!” you’ll read in a backwater comments section as if it refutes actual studies and research, or “this is just common sense” says someone in support of their argument.


But what does it mean? Let’s run to Wikipedia and grab the first line:

Common sense is a basic ability to perceive, understand, and judge things that are shared by (“common to”) nearly all people and can reasonably be expected of nearly all people without need for debate.

Okay, not bad. We can all agree on that. And if you asked anyone who made an appeal to “common sense” this is probably what they’d cite as their definition.

Hopefully, though, you shouldn’t take too much convincing to know that there’s a big difference between what people say that they mean, and how they act out what they mean. Someone who is adamant that Jesus will protect them at all times will still look both ways when crossing a busy road. Someone with a stated mistrust of science and doctors will still knock back aspirin and paracetamol like it’s going out of fashion to cure their headache. Someone will proudly proclaim “I’m not a racist!” and then proceed to cite The Bell Curve and argue that Asians are all good at maths and that can’t be racist because it’s a positive thing. An intersectionalist will decry overgeneralisations, racism, assumptions, and promote the need to assess everyone as an individual that’s a sum of all their experience – and then rant about how terrible white people all are. The list goes on.

So it shouldn’t take too much imagination to realise that when someone says that “common sense” means “something obvious to all of us”, they might not really use it that way.

How is it used? As in, actually used by the people who say it, and what do they want to achieve by saying it? Words themselves don’t have inherent meaning, but they do have use-cases and an intended effect by the users.

It doesn’t take too long to search a right-wing tabloid for examples of the phrase. That hardened bastion of “common sense”, The Daily Mail, is full of them. Lord Falconer proposes “common sense” human rights – doesn’t seem to suggest anything, just uses it as an excuse to throw out other ways. “She’s intelligent but doesn’t have common sense” – again, to dismiss someone or some other way. “Insult to common sense” – said about laws to stop workplace harassment.


These are random-ish examples, but in each one there’s not much reasoning to be found that’s “common to everyone” as the definition suggests. There’s a lot of “common to everyone who thinks like me” as a matter of course, but not much common to everyone. Everyone who thinks like a Daily Mail reader will certainly think lewd and sexist comments toward bar staff are acceptable, because “common sense” says that’s okay, as at least it’s not really lewd and sexist. But I don’t agree with that conclusion; it’s not obvious nor self-evident to me without need for debate. I think it’s fine to stick a recommendation, regulation or law in place that says “that’s not acceptable”. That’s why laws exist – to tell people that something isn’t acceptable when they may well think it is. Do I “lack common sense”, or do I just have a different view of the world and would like to make it better for people? It seems that the “without need for debate” section of that Wikipedia definition should be bold, italic, underscored and in a much larger font size than the rest of the definition.

I’ll leave you alone to stick things like “climate change” and “common sense” into Google to fill out those examples for yourself – it’s just too depressing otherwise.

Let’s look at another particularly insidious example. You’ve probably read a few things like this before – “Common Sense Died Today!” reads the usual headline, though there are many variants.


I want to use this example to convince you that while “common sense” supposedly means “something obvious agreed upon by everyone”, it’s really used to mean “this heuristic that I’ve used for years and will now uncritically apply to a new situation whether it’s applicable or not”, in addition to the “we won’t debate this” part.

This reflects how humans actually think – our brains are huge stores of prior experience, and one of the reasons we can think and act so quickly (and sometimes efficiently) is that we look to these old experiences to help us deal with new ones. I don’t need to calculate the exact way I need to move my spine to counter-balance on one foot every time I walk – my brain is simply looking it up from prior experience. That’s what things like “practice” lead to.

“Common sense” sounds like it fits this heuristic-based description quite well, but all good skeptics and rationalists should recognise that doing exactly as you did before sometimes isn’t a good idea. It might work in 90% of situations, but applying those hard-learned rules to that remaining 10% could end pretty badly. Yes, we can live with those odds, a few screw ups is a small price to pay for efficiency the majority of the time – but I want to argue that if the only reason and rationale you give is “common sense” or “we’ve always done it like this”, then you’re far, far more likely to be operating in that 10% of situations where it’s not going to work. Because, of course, people who are right just give a reason that they’re right – they don’t need to say “this is just common sense” and refuse to give further reasons.

Anyway… back to the “death of common sense” trope. This is a very common meme associated with “common sense”, and I want to unpick that one I linked to in particular, and hopefully convince you that it is, in fact, utter nonsense…

Common Sense lived by simple, sound financial policies (don’t spend more than you can earn)…

Right from the off we have something that’s actually very much wrong – or at least painfully oversimplified to the point of being effectively useless. Everyone has to accrue some form of debt to get things done. No-one really buys a house in cash, few even buy a car outright – we all spend more than we earn, usually as a form of investment so we might earn more later. In fact, businesses run on this principle pretty much exclusively.

Reports of a 6-year-old boy charged with sexual harassment for kissing a classmate…

This is uncited bollocks, since it’s unlikely anyone would be able to charge a minor for sexual harassment – not because it’s “common sense” but because they don’t have legal capacity to be responsible for their actions. But mostly, damn fucking right that’s sexual harassment you stupid fuck.

It declined even further when schools were required to get parental consent to administer sun lotion or an aspirin to a student…

Further demonstrable nonsense because, get this, you’re their teacher not their doctor. It sounds like great “common sense” to just hand out painkillers to kids, but you don’t know their medical history. You don’t know what else they’re taking that could react with it. You don’t know their allergy information – and if you’re dealing with a school kid you can’t trust them to reliably tell you. What? The kid complains of a headache and wants teacher to give them paracetamol – you sure they didn’t take some before arriving at school? They go to a lesson an hour later, ask for more… before long this “common sense” has caused a kid liver failure.

Another version of this bit talks about elastoplasts – yeah, great, risk giving someone a dangerous reaction to latex because your “common sense” overruled actual medical responsibility.

…but could not inform parents when a student became pregnant and wanted to have an abortion.

Also not true. Generally speaking, teachers have no legal obligation to inform parents… nor do they have a legal obligation to confidentiality. But are you trying to imply that “common sense” dictates that you must go behind the back of a student you have a duty-of-care over in order to tell their parents about something without discussing it first? I really don’t want to be seen dead doing this “common sense” thing.

Common Sense finally gave up the will to live, after a woman failed to realize that a steaming cup of coffee was hot. She spilled a little in her lap, and was promptly awarded a huge settlement.

Randy Cassingham’s True Stella Awards is no longer active, but catch the page while it’s still up to learn why this one is true but misreported. The full list of other nonsense lawsuits that have been fabricated can be found here, in case you spot one in a variant of this meme.

Okay, so a lot of the above is wrong on a factual basis – yet they still are, to a degree, intuitively correct. People will certainly go around pretending some of the points sound reasonable. But reality is far from intuitive. First-aid (particularly mental health first-aid) is hugely counter-intuitive. “Oh, you’ve broken your arm, clearly… here, give it to me while I yank it about and put it into a splint… stop screaming! This is just common sense that you need to do this!!” – erm, no, if someone has broken a bone and they’re cradling it, it’ll literally be in the most stable and comfortable position you can get it in; so much for common sense. “You want to self harm? NO! Stop that! You’ll hurt yourself!!” someone will scream, citing that it’s “common sense” to protect someone – the reality says if you remove someone’s stabilising mechanism you’re likely to do more harm in the long run. Someone might say “but you’re describing common sense!” – but, no, I’m not. This isn’t common. These things are genuinely counter-intuitive. They’re not well-known facts. And I certainly won’t defend them by saying “it’s just common sense”.

Exceptions come along far more regularly than you might expect – and, remember, if your main reasoning is intuition and “common sense”, then you’re probably dealing with one of those exceptions.

To go back and answer a previous question – how is it used? What is the intention behind someone saying “common sense”? Far from being an example of reason, it seems to be there principally to rebuke it – screw your argument, it’s common sense! It seems to be used to defend the status quo – a six-year old kissing a classmate without their consent isn’t sexual harassment, common sense says so! It’s there to shut down reason, with the implicit assumption that someone proposing something lacks the fabled “common sense” and so is, deep down, just stupid just because.

So, instead of appealing to “common sense” to defend something, how about appealing to an actual reason? You should be able to give one if you happen to be right. After all, isn’t that just common sense?

What if Trump’s tax returns really aren’t incriminating?

 

Rachel Maddow says she has Trump’s tax returns!

Yay! At last we get to see what’s really controlling The Donald! Is it the Russians? Has he avoided tax for a few decades? Does he claim his ridiculous crotch-length ties as a deductible expense?!


Oh, they’re from 2005, and show he earned some millions and actually paid tax that year. Whoop-de-shit.

This is obviously quite a letdown and a damp squib for some people. They expected a smoking gun, a big chunk of money from the Russians to show he was in their pocket or something. Obviously, this hasn’t happened. The reality has been, so to speak, a pretty dull affair that’s hard to string out into more than two minutes of discussion.

“Ah, but wait…” I can hear someone say. “…these tax returns are 10 years old, what do the more recent ones say?”

“Quite…” says the next guy. “…they’re also just front sheets, they don’t say where the money comes from.”

“But we know he’s a criminal, we just have to go looking for the right tax return to show it.” – Genuine quote I’ve seen in a comments section.

Spot a pattern?

This is the foundation of conspiracy theory thinking. Not full-blown just yet, but it’s where it starts – because at this stage it still sounds reasonable to think like that.

But superficially reasonable or otherwise, it still says that the evidence didn’t support our hypothesis, therefore the evidence must be wrong, or limited, or incomplete. It says the real evidence is still out there to be found. Suddenly, this revelation, which doesn’t say much incriminating on its own, actually becomes incriminating by the sheer fact that it isn’t. It must be evidence of the cover-up!

It’s a train of thought to be aware of, and hopefully protect yourself against. Don’t fall for it, take the evidence on board like a good skeptic, or rationalist, and treat it as what it is. Don’t fill in the gaps yourself.

Maddow and others have begun to spin it already. The real story, so it goes, is that these returns can be found. And fair play, Trump’s insistence that he couldn’t release them is very evidently, to use the technical term, utter bollocks.

Yet, I think Maddow made a very piss-poor choice in releasing and publicising such weak sauce early (and David Cay Johnston, too, if he also had a hand in saying “go public with this now!”). It leads to the conspiracy-like thinking described above. It deepens the conjecture, and thanks to the backfire effect may well strengthen the opinion that Trump’s tax returns do show criminal behaviour and weaken belief in the alternative (whereas the evidence itself doesn’t seem to have had much of an effect so far).

I understand the desire to do so, though. Breaking the story of getting his tax returns? Gold dust. A quick victory for the liberal left? Definitely worth it after two months of groin kicks. But in the long term, I think this could push toward irrationality. The liberal-left can suddenly go crazy – being seen to overreact to a leak that means very little. Or it can descend into conspiracy theories – as more completely non-incriminating tax returns appear and everyone refuses to believe it. Either is likely, and would be a boon for right-wingers who already control the narrative that the left are stupid whiners who won’t give Trump the chance he deserves.

Or we can wait, quietly and attentively, for evidence that can actually say something useful.

Mayim Bialik and Melissa Rauch are paid less because they’re women – but the real reason why will shock you!

Clickbait title… #drink… But I couldn’t think of anything snappier that got the point across.

Yes, Mayim Bialik and Melissa Rauch are paid less to star in The Big Bang Theory than the rest of the cast, and that’s about to change, it seems, thanks to a fairly honourable decision by the rest to take pay cuts to finance this.

Yay for gender equality!

Or not… as people insist on saying, because being paid less has nothing to do with them being women.

Except it does. Really, it does. Just not in the way you think. No one in a studio has sat and said “they’re women, they don’t deserve better pay”. Precious few people think that, thankfully, so if that’s your impression when you hear some feminazi whine like a harpy about the wage gap, you’re just wrong. The actual reason is a bit more nuanced and complex than that.

To my regular readers – all 0.78 of you – the reason I’m about to go into won’t actually be shocking. In fact, it should be fairly trivial and obvious. Everyone else, though, this may take a bit of thought and processing.

First, I should say that I… like The Big Bang Theory, I suppose. I watch it, find it funny, and that’s it. I don’t roll around the floor like the studio audience does, it hasn’t changed my life, and I don’t eagerly await each new episode as if it’s the ejaculate of Christ himself; nor do I froth at the mouth every time it’s mentioned, feeling the need to take to the internet to proclaim my dismay that it’s still a thing. That’s allowed, you know. All I’m trying to say is that, unlike most other people who are bone-crunchingly critical of it, I actually bother to watch it.

Onward…

Let’s go through the non-gender reasons for why two actors would be paid less than the rest of the cast in any particular TV show, and break them down for this show in particular.

  • They’re not in the main cast. This is no longer true, Bialik and Rauch have been full-time main cast and not supporting/recurring for some time.
  • They’re main cast, but not as important to the story. Also no longer true. Since they finally painlessly euthanized the aged and decrepit will-they-won’t-they between Penny and Leonard, Amy and Bernadette are pretty much the only characters given any real plot or development any more. Good for them.
  • They’re main cast, and important, but not as good. Oh, please. You’ve watched this, right? No, of course you haven’t.
  • They’re main cast, important, and good at it, but haven’t been in it as long. Yep, there we go.

And that’s where the train of logic and reason ends for a large bunch of people. All is fair, they claim, because these two haven’t been in the show as long. And you know what? That is very much valid as a reason. It wouldn’t be practical, nor expected, that they’d be paid on par with Jim Parsons after a few weeks on the show.

However, it is a bit of a stretch to use this as the only reason, since Bialik and Rauch have now been in the show longer than they haven’t been. In fact, I think they’ve been main cast for longer than they were merely recurring. And it would be an extreme stretch to justify the absolute magnitude of the pay disparity between them and the rest, considering their position in the cast and the fact that the entire show now pivots around them as much as it does around the others.

Now we need to go into why they haven’t been in it as long as the other five actors. For this, we need to fly back to 2007 when I was a pesky project student, the worst we had to worry about was Sarah Palin, and The Big Bang Theory first started.

The show began as a pretty troperific sitcom about standard nerds-who-can’t-get-dates and the hot blonde that moved in next door. I don’t want to go into how tired that trope was even by 2007, but my main point is that out of five central characters, only one was female. The entire cast was systematically one-sided to favour male leads from the start. This is important to realise, although right now I can hear the cogs working in your brain screaming out “beta cuck mangina feminazi snowflake”.

In the grand scheme of things (we’re on double-digits of seasons now), this state of affairs didn’t last very long. Two women were added way back in season 3 and gradually made into main characters. Although the show’s had/has a remarkable habit of tossing interesting women aside quite quickly, these two stuck around and helped balance out the utter sausage-fest that was the first 2-3 years.

Their introduction wasn’t perfect, by any means. They were presented as two competent (by TBBT’s lax health & safety standards) and passionate scientists, and have often been allowed to be funny in their own right – but they still fell into the usual sitcom trope of the nagging, disapproving wife/girlfriend. They’re in with the science, but look down on the nerd-culture aspects of TBBT because girls simply aren’t allowed to be into cosplay, comics, superheroes or Star Trek Wars. (Note: my ‘Standard Nerds’ friends list on Facebook is predominantly female.) They were also underused when it came to making any kind of commentary on women and science in society; there was a brief flirtation with asking whether women in science could also be considered sexy, in addition to intellectual, but this important commentary came a couple of years too late to the party and was brushed aside quickly by writers who were clearly out of their depth in addressing social context. (If you think that’s a weird thing for me to want to see, note that the sexuality of the male characters informs much of the founding premise of the show.) Despite these flaws, on balance, Amy Farrah Fowler and Bernadette Rostenkowski are probably a net-positive contribution to the world of televised fiction for both women and science.

Yet there was absolutely no place for them at all for the first few years of the show. If the casting call went out, it would have been four reasonably developed male leads with backgrounds, ambitions and personalities and one girl that must be hot. If ten people auditioned with an equal 50:50 gender split, then one man would have been disappointed, but four women would have been.

This is what we mean when we talk about various -isms being systematic. The system was built up to disadvantage them in the first place. There’s actually no route Bialik and Rauch could have taken to satisfy the criterion of being on the show for as long as the rest of the cast. Even if they hopped into a time machine to get their past selves to audition for the first season, their gender would have systematically and significantly lowered the probability of them landing a leading role. Which is the point.
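For the numerically inclined, here’s the same audition arithmetic as a quick back-of-the-envelope Python sketch – the role and auditioner counts are just the illustrative numbers from the paragraph above, not real casting data:

```python
# Back-of-the-envelope sketch of the audition arithmetic above.
# Assumed illustrative numbers: 5 leading roles written as 4 male + 1 female,
# and 10 auditioners split 50:50 by gender.

male_roles, female_roles = 4, 1
male_auditioners, female_auditioners = 5, 5

p_male = male_roles / male_auditioners        # 4/5 -> 80% of the men land a lead
p_female = female_roles / female_auditioners  # 1/5 -> 20% of the women do

print(f"Chance of a lead if you're a man:   {p_male:.0%}")
print(f"Chance of a lead if you're a woman: {p_female:.0%}")

# Same pool, same audition, a fourfold difference in odds before anyone has
# read a single line – which is all "systematic" means here.
```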

More broadly, since there is more to life than one sitcom, there are fewer leading, well-paid roles available for women. When such roles open up in later seasons of a show, it becomes increasingly difficult for women to stick around and build themselves up to the same level as the initial cast.

Since the opportunities are fewer, and the availability to progress is lower, this manifests as a major disparity in pay between men and women. Yes, women do take on roles that are lower paid, but that explains the existence of a tangible wage gap in the same way that “because it’s in orbit” explains why the Earth goes around the sun – it’s a tautology that fails a test any four-year-old could administer by asking “…why?“.

Of course, immediately people are going to lash out and scream “but there are other shows with leading women!!” and probably mention Two Broke Girls (I don’t watch it, so have no opinion on it; I assume from the title that there are at least two female leads) or something. Yet actual studies show this holds systematically – with fewer roles made available to women who aren’t 20-something and hot, fewer in leading positions, and fewer with opportunities to earn as much as male leads. This situation is gradually improving, at least. There are more roles coming out and more opportunities now than there were even a decade ago – although people are still steadfast against accepting this incredibly basic statistical argument coupled to a simple logical syllogism: if all were equal, we wouldn’t see a broad discrepancy; we do see a broad discrepancy; so all can’t be equal.

Have shows, movies and games with female leads been less successful? Yeah, pretty much. Why is that? Is it because they’re objectively worse forms of art? Is it because only novice writers get to make them? Is it because they’re given low budgets because studios and publishers consider them a risk? Is it because they lack the publicity and marketing budgets to ensure their success? Is it because society will outright reject them because of conditioning? Or is it because women are literally worse at everything so either belong in the kitchen or deserve unequal representation in fiction?

Those are questions that are worth asking, and worth discussing. And, to be honest, I don’t particularly mind if you do think that “yes, women are shit and unfunny so don’t deserve to be well-paid to appear in a comedy” – because that’s at least a reason, and not a flat-out denial like “it has nothing to do with gender”.


Addendum: as, generally, a skepticism/rationalism blog – I think, what is my subject, again? – undoubtedly someone is going to bring up Bialik’s vaccine stance or her attachment-parenting thing. Great. But that’s not for here. Go find another blog that is hosting that discussion right now if you need to say it.