Every Conversation With A Brexiteer Ever

Well, perhaps not ever… but this seems to be the summary of many:

Leaver: It was a vote, you have to accept it!

Remainer: But the referendum was only ever advertised as advisory; it was never legally binding on the government to enact. So it really should be given parliamentary approval in a free vote. In particular, the terms agreed after two years of Article 50 negotiations should be ratified through our representative democracy.

Leaver: It was a vote, you have to accept it!

Remainer: But perhaps it’s dangerous to just enact something without proper expert consideration, especially now that multiple Vote Leave promises have been rescinded and it’s become clear that the population may have been (read: definitely were) misled. The political and economic landscape has changed significantly since June, so you can’t say a decision taken by non-experts in one situation should be, by default and without consideration, applicable to a very different situation at a later date.

Leaver: It was a vote, you have to accept it!

Remainer: But it was a very small win for Leave. The margin was a few percent, almost on par with a margin of error. Given the number of people expressing regret over their vote – a proportion that polls suggest would be high enough to swing the referendum in a different direction if it were done today – is it wise to plough on without further due consideration? Can we not take into account further opinion polls taken after a reflection on the impacts to the value of our currency, the economic impact, or the fact that many Vote Leave promises turned out to be complete fabrications?

Leaver: It was a vote, you have to accept it!

Remainer: Okay, but we have a constitution based around representative democracy. We elect people to make decisions on our behalf because they can take the time and do enough research to make an informed decision, whereas the general public can’t afford the time. In line with the country’s precedent-based constitution, parliament should have the final say on both leaving the EU and accepting post-EU terms. They should take popular opinion under advisement (as the referendum was advertised, and as they’ve always done) without accepting the narrow result as a mandate for sweeping, unilateral change.

Leaver: It was a vote, you have to accept it!

Remainer: Thing is, many aspects of democracy require supermajorities to enact rather than 50% + 1. Amending the US constitution, for example. That’s precisely to stop bad decisions being made on the back of populism, and to ensure broad, representative consensus rather than sweeping changes when there’s a clear split and the margin is tight. It’s also why arguments about the counterfactual case of ‘Remain’ winning by a small margin don’t hold up – you don’t need a supermajority or a large margin in favour of the status quo to keep the status quo, because a narrow result would still be no strong mandate for change. This is also the essence of basic conservatism, incidentally, as well as part of mainstream political thought about democracy since the term was invented.

Leaver: It was a vote, you have to accept it!

Remainer: Part of the democratic process is that you can’t just accept things blindly, even when popular – as you have to have safeguards against a tyranny of the majority, where the rights of minorities can be removed or oppressed just because a majority says so. If some groups will be more negatively affected by a decision than others, then not everyone is equal when it comes to a simple ballot. Something that sounds good to a large number of people but will probably not affect them might be absolutely devastating to a small number of people who will never have their voice heard in a popular vote. This should be taken into account when taking the voting results into consideration as this forms the basis of a representative, egalitarian and equal society – again, the basis of democracy and mainstream political theories of justice.

Leaver: It was a vote, you have to accept it!

Remainer: Democracy doesn’t begin and end at voting. It starts at representation, and ends with beneficial decisions made through consensus – with voting as a means, not an end. It’s an involved process that continues beyond just voting when and where they tell you. There are countless opportunities to petition, or get involved in decision making. It doesn’t stop, it continues. That’s the actual point of democracy if we want it to mean something positive and beneficial rather than just hanging on the idea that it’s a popularity contest and the majority rules. Leaving it at “vote, and the majority rules!” is a really stunted view of democracy, one which limits its ability to do the most good for the greatest number of people – particularly so when the question asked of the populace at large is a simple binary but the real-world options and their ramifications are numerous, complex, and nuanced.

Leaver: It was a vote, you have to accept it!

Remainer: Fuck it, I can’t be bothered with this shit anymore.



Consent Might Be Complicated If…

I’m re-blogging this owing to the whole “grab her by the pussy” thing. Not because of Donald Trump’s words, exactly, but because of the defenses made on his behalf.

I’ve seen things about supposedly hypocritical pop-starlets who grab their crotches being “offended” by it – because touching yourself and someone else touching you are obviously the same thing.

I’ve seen “but I thought liberals were about sexual liberation”, because, of course, liberation meant liberating ourselves from silly little things like consent.

And, of course, countless things about it being how “real men” talk. Now, that has numerous layers of bullshit, but suffice to say that if “real men” are supposed to casually ignore consent, we’ve got a serious problem.

Do I think all of those comments are spawned entirely by ignorance of consent and culture? Broadly, yes. I think there are shades between ignorance and malice, and rarely do you find one without a bit of the other propping it up. Ultimately, what the people defending “grab her by the pussy” demonstrate is that they literally do not understand the concept of consent, and that it’s foreign and alien to them. We can debate why, but it’s pretty clear they don’t understand it – to them, it literally does not compute.

Spherical Bullshit

I really like this post on consent, and it seems to have had a massive surge in popularity, and for good reason. As one of the later paragraphs concludes:

Do you think this is a stupid analogy? Yes, you all know this already – of course you wouldn’t force feed someone tea because they said yes to a cup last week. Of COURSE you wouldn’t pour tea down the throat of an unconscious person because they said yes to tea 5 minutes ago when they were conscious. But if you can understand how completely ludicrous it is to force people to have tea when they don’t want tea, and you are able to understand when people don’t want tea, then how hard is it to understand when it comes to sex?

It’s a testament to the power of analogy (and logic, in fact) that something can seem…


Confused? Good.

Are you unsure about an event that’s going on in the world?

Not sure what side to take?

Not sure what you think about it, or which is right and which is wrong?

Have you seen someone make a decision, and have no idea whether to support them or not?


Really, good for you. This is a good thing, so stop pretending otherwise.

Because it means you’re thinking about it. It means you’re open to any possibility or any conclusion. It means you have a good motive to really dive into it in some detail and make an informed decision.

People who make a sweeping didactic statement about something almost immediately – with certainty, with judgement, and with no tentative hesitation – are almost certainly operating on what their prior biases tell them. They were always destined to think that, and nothing else. The alternative was never an option for them.

That has its place, sure. Mostly when things aren’t terrifyingly important or inherently confusing and difficult.

But if you’re unsure, great. If you have to stop and scratch your chin for a bit, fantastic. If you need to think a bit longer and a bit harder before your decision, do that.



Teaching on TV Sucks – A Trope Over-Analysis

I’ve been teaching in some capacity for about 7 years now, about 3 of them properly and getting paid for it, and I’ve even recently got qualifications that say I can do it. While I haven’t got it all figured out by a long shot, I’m pretty sure I can spot it going awry.

So, while I haven’t seen this specific thing documented thoroughly on TV Tropes yet, I want to bang on about the following trope; you’re sure to see it pop up on TV frequently:


**Rows of students sit attentively as the TEACHER walks up and down between the individual desks talking.**


…And then we add one to the power and divide through by the new power to get the integral of the function, which gives us –

**The bell rings. Suddenly, all the students grab their bags and begin to move. The camera begins to focus on MAIN CHARACTER. The TEACHER begins to shout to be heard.**


…And don’t forget that the mid-term is on Friday so bring a spare pencil and also your essay on Franco’s Spain is due on my desk tomorrow and also remember student elections are this Thursday!


Hey, are you going to SPUNKY HARRISON’S big party on Sunday?


Man, I still don’t have a date for prom!

Trashy dialogue aside, there are a few things going on here and I’ll address them in turn shortly.

If you see classroom teaching on TV, this is what it will look like. I know why this is a thing. If you set your characters in a school, you can’t have them perpetually on a lunch break. Even the most lax of sitcoms know that you have to show your characters doing something occasionally, and this is a great case of “show, don’t tell”. And because you don’t want your scene to be 45 minutes of a grown adult talking to youngsters, you have to set it at the end of a lesson so it only lasts a minute at most – it’s a nice set-up to get MAIN CHARACTER and SECONDARY CHARACTER in the same room with a reason to speak and move the plot onward.

This is fine… but every time you see the trope in action, it exemplifies piss-poor teaching. As I will now over-labour to death below.

Bad classroom management

The bell is always there, and it always triggers an immediate exodus. You can make the disciplinarian argument that the teacher’s word is law, and that the bell is only a “suggestion” – a thing to keep the whole school in sync even if clocks are out by several minutes. As a result, students should stay seated until the teacher dismisses them, rather than rumble off immediately in defiance of the classroom’s authority. This tends to be what happens in schools as far as I’ve seen, so it’s at the very least an unrealistic trope.

Personally, I think a purely authoritarian/disciplinarian argument for this is a bit weak. Still, poor-to-non-existent classroom management is the underlying fault in this trope, and it informs the other reasons it’s unrealistic / bad teaching below.

Waiting for the cue

If you ever watch this sort of scene closely, you’ll note the speed at which the kids stand up and pack away the instant the bell rings. It’s as if they’re waiting for it, mentally preparing like a star sprinter waiting for the starting gun. But of course they are, because they’re actually actors waiting for the director to tell them to move (presumably the bell is added in post).

But assume they are real students for a moment. If their reaction times are so unified and sudden, it means they were preparing and waiting – possibly even clock-watching until the bell rang. That means they weren’t focused on what the teacher was saying or on their own learning. The classroom was background noise to their mental preparation to move. The whole mental effort of the background actors goes toward not missing their cue; a similar amount of effort would have to be spent by real students to react so quickly to such a signal.

Bad time for key reminders

Almost invariably, this trope shows the teacher giving out important information over the din. Exams, essay deadlines, where they’ll pick up next… yet you may as well start shouting scores from your favourite Rotten Tomatoes reviews for all that information will stick. The tiny humans in the classroom have moved on; their brains have shifted into “moving the plot on” mode and won’t switch back to “information receiving” mode quickly enough to take it in. Regardless of what you might think of your own ability, humans don’t multi-task, they task-switch. Students can’t take in this concluding information while shuffling about their desks, packing their books and preparing for the next bit of dialogue all at the same time.

From a trope perspective, it’s background noise. It’s filler until MAIN CHARACTER outlines their problem. But if this were to happen in reality… it would be equally banal and pointless background noise.

Do these people not have watches?

An invariable part of this trope is that the bell always rings in the middle of a point. Why? It’s as if the lesson started only a minute before the camera began to roll and suddenly the bell rings to signal the end. Which is probably true on TV, but again unrealistic for real-world teaching, where lesson lengths are known in advance rather than set to a random duration.

I don’t ask that directors and screenwriters develop a full lesson plan in advance and film it, but come on! Did the serious-but-fun teacher dude who’s about to break it to SUPPORTING CHARACTER that she’s about to flunk math not realise that time was ticking on? Did they seriously think that twenty seconds from the end of the time slot would be a fantastic time to start teaching a new thing?

You might think the example script above with integration is a bit extreme, but I’m pretty sure I lifted it directly from Season 3 of Gilmore Girls.

Not the time for new information

Thanks to a lot of psychology and educational studies (and some fun ones involving military snipers) we know a good deal about the attention span of a human being. It varies a little, but generally speaking, after approximately 1 hour without a break we’re absolutely frazzled. So at the 45-60 minute mark of a traditional lesson, students are effectively brain-dead. This isn’t the time to start making your key thesis or introducing a new topic – not just because you’ll run out of time, but because the students literally won’t take it in.

On a shorter time scale, our attention spans are about 10-15 minutes. That’s about the limit of our dead-set focus. As a result, good lesson plans tend to chunk things down into blocks. A typical (good) 50 minute lecture should break down into at least two 20 minute structured blocks with about 5 minutes between. So as attention begins to dwindle over the course of those structured blocks, your mode switches from presenting new information to reviewing old information. It’s all about keeping the cognitive load manageable.

So, in the example above, the teacher has blown the last five minutes of their time teaching a basic outline of integration rules. This is a complete waste of time, because they’ll need to spend at least 10 minutes re-capping and addressing the misconceptions that arose from teaching it while students’ attention was at its lowest – at the end of the lesson, distracted by clock-watching for the bell.

The end is for continuity announcements

I’ve discussed a bit about lesson plans and chunking, but the end of a lesson plan should always be used for a re-cap and review, and for building a lead-in to what happens next. That’s a good five minutes of your time, not 10 seconds after the bell has rung while everyone is moving. The trope usually fits in an “Okay, tomorrow we’ll discuss Chapter 4!” after the bell rings, which sounds like it should do the trick, but you need a bit more than that.

There are a lot of reasons why you use the end of the lesson to discuss, in depth, what comes next. Firstly, there are the principles around scaffolding. Here, you structure prior and existing knowledge so that students become more receptive to new material. This is a two-way street – before you present new material, you relate it to the older material, and after you present new material you relate it to what comes next. This primes students’ awareness that they will need to keep their minds open to bolt on some new information at a later date. It also provides instrumental value to the new material by showing them where it leads.

So, that last line should be more like “Okay, tomorrow we’ll discuss any questions you have about Chapter 3, and then move on to Chapter 4, where we get to apply the Thing in a new situation – so if you have time to revise the Thing, make sure you’re comfortable applying it to the situations of Chapter 3”. Or anything with a little substance, really.


So, from this, you should have learned that television tropes depicting teachers at the end of the lesson are often unrealistic, and frequently depict very bad teaching. The main take-home points are that the end of the lesson is a bad time to deal out novel information, and when students are distracted by packing up after the bell rings is a very bad time to deal out important information. Do remember to review the concept of cognitive load above, because we might discuss cognitive load theory in more detail in the future. Now, there’s about 2-3 minutes left until the bell, so take the time to pack up carefully and discuss the topic amongst yourselves and then you can leave when I tell you to.

People Are Good, But Stupid – A Maxim For Life

A while back, I ended up playing a game of Psychosis – a board game with questions loosely based around psychology studies, some of which are even still in-date. A more interactive element comes from group activities where Player A gets to answer a question in secret, while the others guess their answer. Usually, these take the format of “So tell me, __________, what is your favourite colour?” – but mostly a bit more interesting than that tepid example.

So I was asked, as you do in the game, “So tell me, ____________, do you think people are A) Mostly good, B) Mostly bad”. I think it may have been more of a scale, but I forget the precise details.

Do I think people are, generally speaking, good or bad?

That sparked off a bit of a debate, as these people know me quite well.

On the one hand, I display a huge amount of cynicism toward people. I generally believe the worst in them. I know the harm they cause, and my cynical reaction is to literally expect it at every turn. If someone is evil, I don’t seem to treat it as a mind-blowing exception to the pattern. On the other hand, came one argument, someone wouldn’t think such a thing if they didn’t fundamentally believe humans were, deep-down, good… but perhaps misguided. A social cynic would have to care about people, and care about their goodness, to rant and rave when they see it going awry.

And I suppose they got it right. I believe people are fundamentally good. I just also believe they’re too stupid to really know what that means.

Everyone wants to be “good”. The connotations of that term alone drive people toward it. It’s positive, it’s beneficial, it’s virtuous and admirable pretty much by definition. But even ignoring the definition, people try to act good – no-one truly wants to cause excessive harm and suffering, we all want to benefit the rest of the world. Even if all they have to go on is “to be good is to be like God”, they’ll instinctively drive toward the harm-reducing, well-being-maximising acts, and the Argumentum ad Dictionarium only comes out in the wash of post hoc rationalisation. We’re driven to be good, rather than bad, and broadly agree on what it means to act those ways even if we disagree when it comes to the tedious, academic unpacking of those terms.

The exceptions are usually driven either by a pragmatic need to break the vague Rules of Goodness (committing a theft because you need money) or a misunderstanding of what constitutes benefit to people (committing a theft because you believe it to be victimless or out of quasi-nihilistic self-interest). Even in the edge-cases of outright psychopathy, we attribute actions to a misfiring and a misinterpretation of morality rather than a drive to be evil.

Calling those exceptions “stupidity” may be an over-simplification – and I have something saved in my drafts folder about a better and more powerful definition of “stupid” to work with. Yet, “stupid” conveys the idea: we want to be good, we all agree that good means maximising well-being and reducing suffering… But we suck at the analytical component of figuring out what that all means in reality.

Mother Teresa thought she was doing good, reducing suffering, and bringing dignity to people by bringing them, and herself, closer to God – yet those with a keen eye for detail may have seen suffering increase as she deprived the poor and sick of medical treatment while keeping them in squalor, and then spent her donation money on establishing convents. We can’t deny her intention to do good, nor her justification that her acts were, ultimately, good. And I don’t think it’s a mere disagreement on the definition of good – she wanted to reduce harm and increase well-being, to bring dignity to people. She simply approached it in a… well, somewhat questionable way from the perspective of an outsider with identical motivations and values. Stupidity? Perhaps. Certainly a failure to objectively assess the situation and figure out exactly how to bring about more tangible well-being and happiness.

Look at, say, most racists, sexists or homophobes amongst others with an -ist or -phobe levelled at them. They probably don’t think that what they’re doing is bad. Even the hardened ones. They believe their opinions to be innocent and valid. They try to be good… at least, they don’t try to be evil. But do they understand the harm they cause? Is that because they’re stupid? Perhaps. “Stupid”, again, is not quite the right term – it’s the lack of a decent assessment of their actions.

This is perhaps where the social justice world fails to get through to them – by believing that a bigot is out to cause harm, rather than simply misunderstanding whether they cause harm in the first place, it alienates rather than educates. If we approached them as having good intentions, we might be able to convince someone that their (erroneous) approach to implementing those intentions is where the harm comes from. People who say #AllLivesMatter just don’t understand the need to say #BlackLivesMatter; they don’t intend to say #BlackLivesDoNotMatter. Ignorance – not wilful ignorance, just plain, innocent, blameless ignorance – rather than malice is at work here.

“Where’s the harm?” is, ultimately, what hides underneath all the usual defences of hatred and intolerance. At the thin end, someone might defend a racist joke because “it’s just a joke!”; they’re asking where is the harm in something they perceive as truly harmless because they literally don’t see any harm derived from it. And it goes all the way to the extremes of “yes, I might be herding these people into a gas chamber, but, it’s just following orders so I’m not really complicit, and, besides, it’s purifying our race so is obviously beneficial – if we don’t gas this menace we’ll just suffer in the long term”. Okay, maybe that last one requires a little more work to get around… but it’s work we’ll happily do in the name of being good.

We’ll always find a motive to justify ourselves. We’ll always find a reasoning to back up our acts. We wouldn’t do it if we weren’t, fundamentally, driven to be good – because otherwise we’d be happy to admit that, yes, our actions are harmful to others and we don’t care. We’d admit to wanting to cause harm, minimise well-being, and be evil. Yet this is largely not what we see.

We wouldn’t be happy with flawed reasoning if we had the self-awareness to fully analyse it and come to a better conclusion, and then re-address our actions appropriately.

Or, in a soundbite: we want to be good, but we’re too dumb to figure out how to do it properly.

A Crisis of Identity

Allow me to go all special-snowflake and super-self-indulgent for a bit. Normal service will resume shortly.

I’ve had trouble recently figuring out exactly where I fit in the world.

I feel too weird for ‘normal’ society, but too normal for ‘weird’ society.

I mean, consider: My week isn’t spent counting down to Friday where I go out to get drunk in a packed club; my political opinions go beyond “They’re all crooks!”; I don’t work in an office where my surname has remarkably transformed into ‘from accounts’ or ‘from purchasing’; I can count on one hand the exact number of times I’ve given a shit about sport in the last twenty years; And my main sexual fetish isn’t “phwoar, tits!”.

Meanwhile, at the same time: I hate whimsy; I can’t stand poetry; I’ve committed the ultimate sin in thinking that Doctor Who is just a TV show and, really, just a wee-little-bit shit; I don’t have any ironic hobbies like knitting or collecting tea; I don’t have any mental illnesses or disorders, neither self- nor professionally-diagnosed; And I’m basically cishet scum through-and-through.

So I wonder why either group puts up with me.

I could become a conservative, but I think they’re the Evil Fucking Empire. I’m obviously a liberal, but the liberal-left’s innate talent for self-destruction through its purity culture makes me want to curl into a ball and cry. I could go the South Park route and become apathetic and develop a disdain for any thought that challenges me to care or develop or change but, at the end of the day, I just give too much of a shit about things for that nonsense.

Is my real place with the more-mainstream nerds, fighting for Comic-Con tickets and arguing about X-Box vs the PlayStation 19? Probably not, since I have no idea where I’d find the disposable income for all that bullshit, and I find the casual misogyny and the neckbeardiness that comes with the territory utterly repellent. Does that mean I should join in full-time with the Social Justice Enthusiasts, instead? I suppose so, but I find them to be mostly cloud-cuckoolanders who need to learn to live in reality as it is, first, before they have a hope in hell of changing it because, goat-dammit, guys, perfection is the enemy of good/better, here!

A religious group is a non-starter, obviously. Maybe I could get in with the hardened, out-and-proud Atheists? Well, to be honest, I’d rather join a religious cult that was happy to admit to it, and I like that when I use the word “logic” I mean some bollocks like “(∃x∈X|x=n)⇔n∉Y” and not “Feminism and Islam are the greatest threat to humanity because Logic”.

Metalheads? Frankly, I’d rather be locked in a lift for 24 hours with a Trump fan than a Tool fan, and if I can’t stand the liberal purity culture I’ll last about half a second in the world of “METAAAAAALL!!!!!”. Besides, the broader ‘alternative’ crowd have always looked at me with suspicion for having zero interest in ever getting a piercing or tattoo.

So all those sub-cultures and movements are out, and I’ve never felt right nor welcome in any of them.

I’m not, and probably never will be, the great, perfect, stalwart LGBT ally people want me to be, but I’ll never go back to the “eugh, why does it always have to be about the gays!” crowd because fuck that. I know for a damn fact that privilege is very real, but I know there is literally fuck-all I can do about it – which I know because I once asked what I could do about it and had shit slung in my face for it. And, yes, quite, simply not talking about racism won’t make it magically go away but neither will only talking about it.

Or do I just bite the bullet and turn normal – Get a trendy haircut, support the local sports team (Go Sports Team!), share post-memes with Minions on them, comment on a Facebook post that already has 150,000 comments on it, roll back my self-awareness, and start regularly watching Eastenders? Or go full tits-to-the-wall odd – Shave one eyebrow because “that’s so random!”, take up body-painting, change my Facebook profile picture to the flag of whatever country is going through the shit this time, buy some goofy hats, take up barefoot running, and then invent my own sexual orientation because “there isn’t a word that describes me!”?

Or is this just normal and expected? Are we all like this, all thinking the same thing?

I’m Proud to be a Racist


I’m an increasingly-pudgy white guy, running headlong into middle age faster than I want to. I’m currently contract-hopping between various teaching positions at fairly decent universities. I’m a bit of a nerd.

Also, I’m a racist.

And, what the heck, while we’re at it, a bit of a sexist, and probably with a dash of homo-, trans- and xenophobe thrown in there, too. Add whatever else you like to the list. It’s probably the case.

“Holy fucking shit!” I can hear you cry already. “No, you can’t possibly say…”

Hold on a minute, Skippy. I’m going to unpack this one piece by piece. Although if you’re absolutely desperate for the reason for the title (it’s a cheeky bit of rhetoric), skip to the last few paragraphs.

Firstly, just to throw it out there, and I’ll re-colour this paragraph to set it aside and may refer to it later, I don’t quite believe the phrasing of “I am a…” has much use. In fact, I believe any phrase or sentence with the verb “to be” (am, is, was, were…) in it has inherent issues in meaning, though some use-cases have more trivial issues than others.

More specifically, I believe any phrase that begins with “I am…” is held hostage to the recipient’s conception of what follows it. “I am a feminist”, for instance, would flag up totally different conceptions if I said it to renowned feminist third-waver and YouTube vlogger, Bitchy McLesbianface, compared to saying it to renowned MRA, basement-dweller and masturbation enthusiast, Neckbeardsley Fedorason. Ultimately, their conception rules the resulting conversation, not mine.

To say that sort of “I am…” statement, I have to worry about everyone else’s prior biases, definitions and connotations in picking the words that follow. I believe such a phrase lacks an inherent meaning. And, if given the choice, I’d avoid it altogether and go straight to what someone actually believes, rather than an immutable identity, particularly a one-word identity. “I’m a feminist” results in a conflict between Fedorason and McLesbianface, but “I believe women are people, too, and deserve to be heard and have their problems addressed in the light of society’s well-documented systematic biases” has less (though, not zero) wiggle-room for interpretation, misinterpretation, and argumentum ad dictionarium responses.

BUT, that’s for another time. For now, I’m going to throw that personal philosophy aside, assume words have inherent, laconic meaning, and go for the phrasing that will resonate strongly with people (since people exist).

And so, instead of avoiding “I am a…”, I’ll unpack it instead.

So, I am a racist.

Oooooh… controversial!

Anyway, the reason I want to say this is because, as an increasingly-pudgy white guy heading inextricably toward middle age, I’ll often end up on the receiving end of such an accusation. Either explicitly to my face (rare) or implicitly when the words “white people” get thrown around as a broad-brush generalisation (pretty much whenever the day has a vowel in it).

Immediately, my first response – apparently – should be to absolutely lose my shit over this.

“I’m not a racist! I have plenty of black friends!” I should shout. Well, two, but I hail from the North, where even that’s considered a cultural invasion.

Or perhaps I’m supposed to get up and scream about social justice warriors and their assumptions. “Hey, you’re making it about race! Who’s the racist now?!” I scream, before high-fiving myself and leaving for a quick wank over how totally awesome that comeback was.

That’s what I’m supposed to do, if reactions elsewhere are any indication.

Screw that. I’ll take it like an adult. Yes, I probably am a racist (see blue paragraph above).


Why? In short, a little thing called “subconscious bias”.

Now, if you’re au fait with those two words, you can probably stop reading. This will mostly be revision. If not, then either keep reading or JFGI, and ram it into your squishy little brain in your own time.

Let’s start at the beginning: We can’t escape our subconscious.

It’s that thing that apparently makes us obsessed with our parents’ genitals, and so leads us to like being tied up in bed while staring at a strap-on shaped like a gummy bear (we’ve all had that one, right, guys? Come on, safe space here, just us bros…). But, more than just a one-shot joke for taking the piss out of amateur psychoanalysts, our subconscious is immensely useful to us. It allows us to walk without thinking about the complexities of counter-balancing with our spine and centre-of-gravity, it lets us drive and listen to the radio at the same time after some practice, and it allows us to immediately switch our fight-or-flight response to the ‘ON’ position without having to look behind that rustling bush to see if it’ll eat us first.

But, just like how our fight-or-flight response switches on regardless of whether that rustling bush is caused by a tiger or just the wind, our subconscious is prone to taking a lot of background noise and forming a lot of patterns that aren’t fully helpful in the civilised world.

For instance: commute via train into a busy city – say, London – in the early morning and you will almost undoubtedly come across a lot of cleaning staff. They’ll push bin-buggies around, empty the bags, pick up discarded paper coffee cups, or scrub out the toilets of the arriving trains. Almost certainly, they’ll be young, black males. Immediately your brain makes a connection, and it begins to wire memories and information together in a complex web for future access and quick reference. See another young, black male picking up litter, and that connection gets stronger. See a white youth doing the job and your brain might remark on it, save it for later, and give your conscious mind the false impression that more white men are doing the job than reality suggests. Then you pass another black male doing the cleaning job, and the brain ignores it – it’s part of the existing pattern, a pattern that keeps getting stronger even as your conscious mind begins to block it out.

{Young, black, male} = {blue-collar, low-paid, manual labour}

At the same time, you’re likely to see a lot of people commuting in to high-powered office jobs in the Big City. They’ll be largely white, probably male, a little older, and wearing suits. Again, your brain makes the connection. These are the commuters to be serviced – they’ll throw their coffee cups into those bins that are emptied by the young, black males.

{middle-aged, white, male} = {suit, good job, money}

Now, if you think these demographic descriptions are wrong, I’m afraid there’s little I can do for you. It’s not my place here to prove that they’re right; just accept that they are. But you might well assume they’re wrong, because your brain pushes the pattern into your subconscious side and highlights the exceptions to your conscious side. Much like someone who assumes 90% of their neighbours are foreign immigrants, when the reality is that fewer than 10% of them are. The minority and the exceptions to the real rule are the noticeable ones; your brain makes a note of them more often.

Your brain makes these connections and patterns automatically. Don’t claim it doesn’t. It does. Denial won’t help you here. This is a process we’ve evolved over time, and we wouldn’t survive without it. These connections and biases are known to be real. They’ve been demonstrated in the lab and in the wild, and they very much have an effect on our thoughts and actions.

Eventually the connections become very strong. Every time we see them reinforced, the connections begin to merge into one thought. And maybe, with a strong enough pattern, it begins to inform us of what the world should be like.

white = {suit, money, good job}

black = {low-paid, manual labour}

This is, of course, just one example. And it’s not just for skin-colour or ethnicity. It works for gender, sexuality, age and anything we can think of. So long as connections are made and stereotypes reinforced (“stereotype threat” is a related phenomenon, though outside the scope of this piece), the patterns will reinforce themselves and inform our attitudes.

The illustration may change location, it’ll change the details, but the overall story is the same. Patterns form – anything that conforms to the pattern reinforces it, anything that goes against it might flag up as an exception. Television advertising, for another instance, plays on this in both directions. It takes advantage of existing stereotypes in order to compress its story down to a few seconds: the woman of the house knows how to cook and clean, the man of the house is a feckless deadbeat, the children are suspiciously uniformly covered in just the right amount of dirt to make the power of the washing powder clear, the bank manager is crisply suited and trustworthy when talking about interest rates, the happy friends eating snacks are clearly well-paid so you don’t have to worry about “so how do they afford such a trendy, expensive apartment to eat their snack food in?”. It’s a cycle of stereotype and aspiration working together. These things prey on our preconceived notions to tell a story, and at the same time reinforce that pattern by their mere existence.

In this sense, racism, sexism and so on, are all things your brain does automatically. This isn’t to say that very conscious decisions such as “we should string up all the niggers because they’re sub-human” don’t exist, but those attitudes are increasingly pushed to the margins in the modern world. We are “post-racism” in the sense that such attitudes are rare, obviously wrong when presented, and it’s the norm to openly remark upon them as wrong and/or immoral.

We’ve effectively ostracised overt racism (well, just); now we need to deal with subconscious biases and the deleterious effects they can have on society. Simple societal things ranging from casual references to “going for a chinkies” (that’s “Chinese food” for non-English readers) to asking anyone with non-white skin “where are you from?” (and following it up with “no, where are you originally from?”). And, of course, other aspects, all the way up to things like “that’s totally gay” (that’s “gay = bad” for those more acquainted with older English literature, where it’d mean happy and carefree).

Learning that these biases exist, but are hard to spot, is a first step. No, you shouldn’t really win prizes for this alone, but let’s give credit for the baby-steps here.

The patterns are absolutely everywhere. And then they misfire.

You see a female secretary, and you see another female secretary, and another, and another… soon, your brain says…

woman = {secretary}

…and then you find yourself asking a CEO to make you a cup of tea just before your job interview with her just because she’s a she and dressed that way.

Is the black guy in a suit the doorman, security guard… or the academic researcher you’re meeting? Well, it’s a black suit and he’s beside the door so… Your brain has made so many connections, that its instant response is to tell you something that, upon rational reflection, is completely wrong. Sure, you’ll realise the mistake and come to the rational conclusion – but in those vital moments of a first impression, what have you thought? Undoubtedly something that will inform your thought processes for some time to come. Before long, you’re not just creating a cringe-story for the internet, you’re hiring and firing based on your biases.

Without realising it, you’ve become the linchpin of the cycle yourself. All the while muttering “but I’m not…”

Take, for instance, the well-known cases of blind CV studies. By merely changing the name on a CV from a “white-sounding” to an “ethnic-sounding” name, you can reduce the chances of getting a positive response to a job application dramatically. A first-class graduate called Muhammad will have to work harder than a first-class graduate called Dave (or even a second-class graduate called Dave, for that matter, as the effect here is very pronounced). By changing from a masculine name to a feminine name you can cause a significant drop in suggested starting salaries for the applicant, even though the CV content is the same. At no point, at all, do any of the people reviewing these CVs actively think something along the lines of “I just don’t like women and think they should be paid less” or “I don’t think we should employ ethnics because they smell funny” – it’s their subconscious biases talking, and having an unfortunate real-world effect. Their subconscious is so ingrained with ideas like “woman” = “menial typist” and “Muslim” = “terrorist” that they begin to act on them… whether they mean to or not.

And don’t think this is purely a “poor little old me and my oppressed minority” game at play. I mentioned above a bias that suggests “woman = secretary”, and anecdotally I know of a situation where a man’s CV was immediately thrown into the discard pile for a secretarial job, and the reason given was “why would a man want to be a secretary?”

So, really, these biases have a wide effect and can seriously pollute our conscious thinking, too. They’ve informed us of how we think the world should be. And that cuts in a lot of different directions when it comes to both individual actions and systematic results.

But mostly, here’s the main thing about these biases – they won’t go away by pretending they don’t exist. We need to seriously examine them. We need to admit they’re there, and begin to look at how we could possibly address them. We need to treat them as real things, that we can address as grown adults.

Yet, instead of that, we treat it as a stigma. We treat it as something that says “those silly little black folk say that I think they’re arbitrarily sub-human!” and act as if it throws us in with the Klan. We’re happy to accept that an overtly-racist and overtly-sexist crowd acts irrationally, but refuse to even consider that we might act irrationally in response to a programmed bias – even if that’s pretty much the definition of irrationality. We panic. We deny it’s true – we shout and scream and demand the accusation be withdrawn because it can’t possibly be true because it isn’t. We interpret an accusation as “You’re a racist!” and counter it with “But I’m not a racist!” (blue paragraph).

As a result, nothing of value happens.

Nothing gets done. Nothing gets improved. We go about our business, as usual, being both the cause and effect of subtle, subconscious, social and systematic biases and prejudices. All the while, stating very clearly that we are not racist, sexist or whatever… yet telegraphing to the world that we very much are. (*cough*blue paragraph*cough*).

So, in that respect, standing up to say “I am a racist” (again, see blue paragraph) may well be a vital first step to progress. It’s the stepping stone to “I’m not a racist” carrying some actual weight. It says I’m willing to admit I may have a bias. It means I’ve analysed the world, and figured out that it doesn’t revolve around me. It says I know what the real problem is. It says I accept it, that I’m not ashamed of it, and that I will try my best to change. Dealing with it is for another time, but it says I’m ready to at least try. That’s something worth standing up for.

So, yes, I suppose I am proud to be a racist, because it means the people who come after me definitely shouldn’t be.