Multifarious Threads

The perils of ranking people’s worth by their intelligence

20th August 2024 by Finn Gardiner

Content warning: racism, sexism, and ableism

(Originally written in late 2017 or early 2018. Major edits in summer 2024.)

It is laudable to honour intelligence as a positive attribute. The same applies to kindness, athletic prowess, musical ability, generosity, determination, patience or creativity—all these are traits that ought to be celebrated. But to designate the highly intelligent as morally and biologically superior is dangerous.

I was reminded of the perils of attaching moral and qualitative values to people’s intelligence when I came across the website of the so-called intelligence expert Paul Cooijmans. His website is often quoted as an authoritative source of information on intelligence, personality and other traits, and he hosts a number of tests and quizzes ready to share on social media. (Admittedly, most of the people citing him aren’t reporting for major media outlets, but articles in the Spectator and on BoingBoing are exceptions.) I haven’t seen anybody else criticise his awful content, so I decided to do it. Are people not paying attention, or are they so ensconced in privilege that they can afford to ignore it? His material often sounds as though it could have come straight from a Nazi speech or an American eugenics leaflet from the early 20th century.

The first hint that there is something unsavoury on his site appears in his guide to the ranges of human intelligence. There are a number of problems with this page:

  • This guide is entirely unsourced, apart from a postscript noting that he used a “combination of sources.” If he used a combination of sources, he needs to cite them; it’s good academic practice. It’s intellectually dishonest not to provide sources if they supposedly exist—people may very well assume that these data were pulled from the author’s rear orifice.
  • He provides a range of IQ scores, though he does not clarify which tests he is using to define these ranges. There are myriad IQ tests available—the Wechsler tests, the Stanford-Binet, the Kaufman tests, Raven’s Progressive Matrices, and the Cattell test, to name a few—and all of them will yield different results. They are built on different underlying theories and have different ranges, confidence intervals, and cutoff points. Moreover, previous versions of a test will also produce different results from current versions, which are based on different norms.
  • His classifications of “R*t*rd*d,” “Below Average,” “Average,” “Above Average,” “Gifted,” and “Intelligent” are not in current use on any real IQ test, apart from the terms containing the word “average.” The first term is the infamous “R-word”: an out-of-date and demeaning term for people with intellectual disabilities that was phased out in the late 2000s and early 2010s. “Gifted” is a predominantly educational term for anyone who scores above a certain cutoff—typically 120 or 130—and thereby qualifies for enrichment programmes or academic acceleration. As for “Intelligent,” that isn’t an IQ range at all; it’s a general term for especially adept learners. Again, he gives no citations for these classifications.
  • He thinks that only people with an IQ of 140 or higher can think rationally and communicate clearly. He provides no proof for this assertion.
  • The page is oozing with rank ableism. His disdain for people with intellectual disabilities is such that he revels in using outdated and dehumanising language.
  • This page appears on the first page of results when googling “IQ classifications,” above several legitimate websites.

Once one goes further into his site, though, his fascistic, hateful views come even more fully into view; it gets worse the deeper one goes. The ableism persists, and to the mix are added racism, sexism (including misogyny, homophobia, and transphobia), xenophobia, and classism. It is a toxic blend of reactionary, retrograde, benighted views that have long since been debunked by mainstream science. He is no iconoclast; in fact, he has merely reconstructed old icons to venerate.

For a man who purports to be an original thinker, Cooijmans covers little new ground when he considers the topic of human intelligence. He merely regurgitates white-supremacist talking points—the sort of thing one would find on American Renaissance, to which he links—and 1920s-era pseudoscience. If I wanted hundred-year-old insights on intelligence, I would much rather go to the source.

He has an online quiz to test whether the visitor is a cultural Marxist or not. Cultural Marxism is a dog-whistle term used by alt-righters and neofascists to refer to people who hold any views to the left of Donald Trump, Geert Wilders, or Marine Le Pen. It has little to do with the socioeconomic doctrine of Marxism. Liberals and progressives are unlikely to consider the proletarian ownership of the means of production their primary political aim.

The questions in this quiz are heavily loaded to favour right-wing views about gender, culture, disability, and race—and as always, Cooijmans’s contempt for anyone unlike him is evident in every question. (People with intellectual disabilities bear the brunt of this cruelty.)

I was most horrified, though, by the third page of his that I saw. Cooijmans presents a harrowing tale of societal degeneration spawned by intellectual decline. It is clear that he views people with intellectual disabilities as half-human brutes, unworthy of civil rights or fair treatment.

There are lots of classic authoritarian-right talking points scattered throughout this awful article, including support for the death penalty, corporal punishment, the loosening of human-rights laws, and other harsh social policies.

The ableism from his intelligence-ranking article and the “Cultural Marxism” quiz is on full display here, too. He claims that the mental faculties of intelligent people make them superior to others, regardless of their behaviour:

But in this article I have assumed a continuous decline to sketch what would happen. I hope this will open the eyes of those people who, knowing that I am occupied with intelligence, in a warning and patronizing manner say things to me like, “Intelligence is not important or valuable in itself!”, “A more intelligent person is not a better person!”, “A society with higher intelligence is not a better society!”, or “Higher intelligence is not something worth striving for!”

Of course intelligence is important and valuable. So are creativity, athletic talent, and persistence. Acknowledging a positive characteristic does not mean that we should eliminate those who lack that trait, or those who possess it to a lesser degree. Let us say that we replace “intelligence” with “athletic talent.” Shall we then move toward eliminating the athletically mediocre, regardless of how creative, intelligent, or persistent they are? (If that were the case, I’d be the first one out: I’m relatively intelligent, but I’m completely useless at sports and don’t have an athletic bone in my body.)

Moreover, these positive traits do not define a person’s morality or individual value. Yes, it is good to be intelligent. But as members of society, people have the obligation to live alongside their fellow human beings without inflicting pain and suffering on them.

For the intelligent to achieve goodness, it is not enough merely to be bright; they must use their mental faculties to benefit those around them. It is possible to be devastatingly intelligent but lack even a modicum of concern for others’ feelings, well-being, or existence. History is flooded with such people; they tend to become dictators, absolute monarchs, or tyrannical CEOs.

Cooijmans’s conflation of intelligence with goodness is particularly troubling given the lack of moral consideration he expresses for others. He seems to think of goodness as a fixed trait attached to a person’s status rather than to their behaviour. If a “good person” does something, it is good; if a “bad person” does something, it is bad. People at more sophisticated levels of moral reasoning think otherwise: it is one’s behaviour and beliefs that confer goodness, not a static definition of goodness that often relies on demographic characteristics.

(His personal profile also suggests that he does not count decency among his personal values, which I find horrifying.)

This line is particularly chilling:

Of course, a full degeneration like this can only occur when there is no other society around to destroy, enslave, colonize, or “help” the degenerating population. (emphasis mine)

This is advocacy for genocide, colonisation, and slavery. A person who thinks that slavery and colonisation are ways to prevent societal decline is downright fucking evil. Cooijmans is so consumed with hate that he will casually advocate for the elimination of populations because they learn too slowly. The author himself is a perfect example of how goodness and intelligence are not synonymous.

Anyone with at least a scintilla of decency and regard for human rights and the dignity of all human beings, regardless of their origins, identities or abilities, should stop citing this odious excuse for a human being and people like him. He does not deserve to be treated as an authority. We may as well cite Richard Spencer or some other neofascist, jackbooted thug who thinks that they are superior by virtue of the circumstances of their birth. By shunning Cooijmans, we avoid elevating the work of someone who diminishes the humanity and dignity of people with intellectual disabilities, as well as women, LGBTQ+ people, people of colour, and immigrants. We can avoid contributing to the mindset that only the highly intelligent can be moral actors, and promote the idea that our choices and attitudes toward ourselves and others determine our morality.

I’ll finish with some lyrics from Depeche Mode’s ‘People are People’:

So we’re different colours and we’re different creeds

And different people have different needs

It’s obvious you hate me though I’ve done nothing wrong

I’ve never even met you so what could I have done?

I can’t understand what makes another man hate another man

Help me understand

People are people so why should it be you and I get along so awfully?


On autism, early diagnosis and identity

11th April 2024 by Finn Gardiner

(Originally written in 2018)

I suppose this is an unorthodox Autism Acceptance Month post, but so was my previous one. This is an expansion of a Facebook status I wrote about two weeks ago.

I have a difficult time relating to the accounts of late-diagnosed autistic adults who did not grow up with this label being used to marginalise, segregate and discredit them. I see so many accounts of unalloyed joy at being diagnosed with autism and having the diagnosis explain most or all of the difficulties they’d encountered throughout their lives.

That’s not to say that late- and self-diagnosed adults don’t encounter oppression or stereotyping, especially from misinformed practitioners once they receive a formal diagnosis; it’s simply of a different nature from what early-diagnosed people can experience. For me, though, the label was a mark of Cain until I discovered the neurodiversity movement back in 2005 or so.

I was diagnosed before I started school and had my differences placed squarely within a pathologising framework. I was wrongly thought to be intellectually disabled – to the point that a doctor declared that I would ‘never learn’ – because I was late to start talking, though this impression of me was mercifully short-lived. People like my parents and teachers would emphasise my weaknesses, real or perceived, over my strengths.

While I had two concurrent labels when I was at school – autistic (officially PDD-NOS and Asperger Syndrome) and gifted¹ – much of what I dealt with was pigeonholing, pathologisation and exclusion from opportunities that would have benefited me socially and intellectually. I felt like a ‘fake intelligent person’ because of my diagnosis.

In fact, my methods of internalising and interpreting information were frequently treated as autism symptoms. I was forced to sit through boring classes that didn’t challenge me intellectually because ‘quiet hands’ and extinguishing ‘behaviours’ came first. While I did skip a grade and participate in gifted programming for the majority of my elementary- and middle-school years, I was still under-stimulated intellectually and often found myself disengaged from the general-education curriculum.

I also had a lot of neurodivergent traits that couldn’t be explained by my being autistic but made sense given my learning style; people tended to subsume all of them under the label of ‘autism’ anyway, just because they weren’t neurotypical. I realised that I couldn’t attribute everything different about me to autism after going through something of a quarter-life crisis at the age of seventeen.

I started rebuilding my self-concept to incorporate a more holistic interpretation of my cognition, though admittedly I was still struggling with internalised disablism and spent two years wondering whether I was Properly Autistic because I didn’t match the Asperger Syndrome stereotype. Some of this was because most of the literature I’d encountered characterised autism as a disability that primarily affected social interaction, even though social interaction was not, and is not, what I find most disabling, especially after the age of 14 or so. My social-skills difficulties were contextual, not global.

Compared to many of the accounts I read in books and on blogs online, I got off pretty easily socially.

I’m not saying that social interaction was necessarily easy for me – it wasn’t – but it was easier if I was talking to people who understood what I was trying to tell them, which my parents frequently didn’t. They’d shut down if I tried to explain the reasoning behind my behaviour, preferring superficial explanations that didn’t address the root problem. Somehow I was the one who was impaired for having complex interpretations of my behaviour, though, since they liked to pin anything they didn’t understand about me on the autism diagnosis. I remember arguing with my mother when I was 18, trying to explain myself, only for her to tell me ‘You have Asperger Syndrome!’ as though that invalidated the content of my argument.

My executive-functioning issues are vastly more disabling than my social ones; in fact, I find them the most disabling (and most expensive) part of being autistic. These days I have a reasonably active social life and a comparatively easy time making friends.

For a period in my mid- to late twenties, after I became more involved in public disability advocacy, I moved back towards attributing all my atypical perceptions to my being autistic, even though I knew multiple autistic people who saw the world very differently from me.

Much of this came from indirect peer pressure from late-diagnosed and self-diagnosed autistic people. I was using autism as a crude, brute-force method to identify and categorise anything that seemed to separate me from the general public, even when there was no direct evidence that the experiences in question could actually be explained within an autism-centric framework. I also felt that I had to do this to be the Right Kind of disability advocate, even though I knew plenty of other autistic people who didn’t have the same Weird Brain Things as me; it didn’t help that I had internalised the idea that reaching for other interpretations might somehow amount to disablism, because those interpretations didn’t centre disability.

I just felt crazy and isolated during that period. It is painful to read back through blog posts and private journal entries from between 2011 and 2016 in which I explain aspects of my thinking as part of being autistic when they’re not necessarily autistic traits in and of themselves.

I still think I’m autistic, of course, but I no longer feel comfortable treating it as the sole explanation for my divergent thinking. More specifically, I think the diagnostic label is useful for identifying specific supports for the issues I find disabling, and it is politically useful as a framework for advocating for the rights of a group of people with somewhat related experiences who face systemic marginalisation because of their disability.

This is different from internalised disablism or claiming that I don’t have a disability at all; it’s just that when I was growing up, I had a disability-centric narrative and identity imposed on me against my will. People who were diagnosed as teenagers and adults and encountered less pathologising narratives about autism when they found out about it are more likely to see a politicised disabled identity as something of a revelation. I won’t deny being disabled. I’m not a Shiny Aspie. There’s a difference, though, between denial and recognition of the complexity of one’s experiences.

Moving towards a more holistic way of interpreting my neurodivergence seems to be healthier for me. I can’t do to myself what people did to me when I was growing up.

  1. I am not a fan of the word ‘giftedness’, by the way. I use it reluctantly to refer to an educational label and a neurotype, or a set of neurotypes, but the current language around it is not that great.


Asides and aphorisms

5th April 2024 by Finn Gardiner

  • Paper may be only processed wood pulp, and pencil strokes only smears of graphite mixed with other materials, but it is the information they convey that is important. The same applies to digital media: the words I see on the screen may technically be made up of ones and zeroes, but they convey information to you and me thanks to the emergent nature of media.
  • Normative chauvinism is the practice of considering the normal, average or typical person superior to the outlier in some or all cases. Normative chauvinists think majorities are better than minorities. These norms can be real averages based on population statistics, or they can be idealised norms (like the Body Mass Index). They tend not to listen to the minorities they vilify, since their smugness seals them off from any criticism.
  • Making the internet accessible shouldn’t be a pain in the ass. Clearly, the people designing accessibility systems at companies like Adobe see it as an afterthought.
  • Personal health and finance websites exemplify the Protestant work ethic and puritanical thinking, delivered in the form of breezy soundbites about clean eating and budgeting.
  • This small-minded, supermarket-tabloid obsession with body weight says more about the intellectual bankruptcy of fat-shamers than it does about larger people themselves.
  • An objectivity measurable outside human perception is unlikely; it is easier to believe in the confluence of eight billion subjectivities.
  • People whose learning potential varies significantly from the norm—that is, people with intellectual disabilities and the highly intelligent—have often been characterised as supernatural or unnatural. Rather than being variations on the human theme, they are wrongly cast into a role that separates them from the rest of humankind: the highly intelligent are witches; those with intellectual disabilities, changelings.
  • Like good software developers, everyone should use better exception-handling when encountering people whose experiences diverge greatly from the central tendency.


A Weighty Issue: Enough with the O-Word.

2nd June 2023 by Finn Gardiner

(content warning: weight stigma, ableism, insulting medical terms in linked content)

A photo of a doctor’s scales.

Reading anything to do with weight and health is similar to reading twentieth-century articles about intellectual and developmental disabilities. Idiot. High-grade moron. Intellectually subnormal. Low-grade imbecile. Mental defectives. Feeble-minded. R*tarded, r*tardates. As a disability activist who focuses on intellectual and developmental disabilities, I find the parallels disturbing.

Why? One word keeps coming up, and it starts with an O and rhymes with fleece.

What’s the matter with the O-word?

It engenders disgust, loathing and judgement that even overweight does not. It comes from a Latin term meaning “having eaten to the point of fatness”—a behavioural judgement, not a neutral clinical term. Before it was a diagnostic term, it was an ordinary insult. It’s as neutral as gluttony. Even corpulence would be an improvement, since it focuses on someone’s size, rather than how they got there. (It’s still insulting, so I’m not advocating its use.)

In the medical literature, the O-word is used as blithely as feebleminded, mental defective and high-grade moron were. People use it ad nauseam without a thought—or if they do think about it, they double down, saying “doctors use it,” as though that absolves them of their responsibility to acknowledge others’ dignity. After all, doctors once referred to female hysteria and drapetomania.

I am focusing solely on abandoning the O-word in clinical practice and on the health and wellness websites that draw on the clinical literature. Metabolic science is still in the idiot and mental defective era. We categorise people by their size in ways that are uncomfortably parallel to high-grade moron, use disparaging diagnostic terms, and invoke “the science” to justify what would be called bullying outside a doctor’s office. Research has shown that higher-weight people, especially Black people, object to the O-word, yet clinicians continue to use it repeatedly in their articles. Although clinicians publishing scholarly articles may be following standard practice in their field, it still makes for painful reading.

Weight researchers have started moving toward person-first language, but this is only a Band-aid, just as person with mental r*tardation was back in the 1990s.

I don’t know the right approach to improving people’s metabolic health. But I do know that a field that continues to use pejoratives as diagnoses for the people it claims to support, even if they are shifting toward person-first language, has probably not advanced enough to find the right answers. That was the case with developmental disabilities in the twentieth century, and it’s the case now with weight and metabolism.

Even for those who think that high weight is caused solely by unhealthy behaviour, this is no excuse. Medical history is littered with moral judgements disguised as concern for people’s health, moral defective chief among them. There is also precedent in medicine for developing more sensitive terms in behavioural health: people with substance-use disorders rather than drunks or junkies. Most practitioners would blanch at applying a term like drunk or junkie in a clinical setting, so why persist in using its modern-day equivalent in metabolic science or endocrinology? Instead of supporting people, we are diagnosing them as food junkies.

What other names should we use?

Radicals in the body-positive and fat-acceptance movements prefer fat, but most higher-weight people continue to avoid it. Fat is analogous to crip: widespread in radical activist circles, but rejected by people outside the movement. For that reason, I don’t advocate the use of fat in clinical settings. Instead, I recommend using an expression like higher-weight.

Even if you consider a high Body Mass Index a medical condition, you are then saying that someone has a disability or chronic illness. By that measure, the continued use of the O-word is a kind of ableism, just as the R-word, moron and mental defective were before it. If you want a clinical term to describe high weight and the medical conditions that are often correlated with it, why not use metabolic syndrome? At the very least, it refers to bodily processes, as diabetes and lower-back pain do, and doesn’t have the whiff of a playground taunt.

How can we move forward?

The problems with the O-word go beyond the label itself. Social justice isn’t reducible to words—after all, there are people who use all the right nomenclature and still manage to be jerks. For example, I’ve seen a lot of pro-Russian or anti-Ukraine commentators using Kyiv, preferred by many Ukrainians, rather than the Russian-derived Kiev (CW: war coverage).

The O-word is harmful because it is an insult repurposed to be a medical term. It is more like junkies or gluttons than diabetics or people with cerebral palsy. It is used to justify “care” that fails to acknowledge people’s human dignity. It is used to blame and shame. Even as people come to understand the complexities of metabolic health, they continue to use a term that places all the blame on the individual rather than the psychological, social, material, cultural and interpersonal factors that affect their health. It is particularly jarring to see the O-word used in articles that decry weight stigma: it is similar to a substance-use specialist saying that “we should fight stigma against junkies,” or a clinical psychologist saying that “we must acknowledge the dignity of mental defectives.” If you want to avoid stigmatising people with substance-use disorders, you don’t call them junkies. If you want to acknowledge the dignity of people with intellectual and developmental disabilities, you don’t call them mental defectives. And if you want to end weight stigma, you shouldn’t use the O-word.

Weight stigma is detrimental to people’s mental health—and that stress can lead to adverse health outcomes. Ironically, stress can lead to the very thing that many clinicians want to avoid: weight gain. Because they’ve come to acknowledge the harmful effects of weight stigma, practitioners are starting to recommend health-promoting habits, like exercise and eating nutritious foods, rather than focusing on weight loss.

Clinicians are starting to take steps toward more compassionate ways of understanding weight and metabolic health, and it’s time to take another. Medicine abandoned mental defective, r*tarded, and feebleminded, recognising them for the terms of abuse they always were. It’s time to do the same with the O-word.


15 Rules for Writing Generic Scholarly Articles, White Papers, and Annual Reports That Absolutely No One Will Enjoy Reading

1st December 2022 by Finn Gardiner

A graphic that says, "15 rules for writing generic scholarly articles, white papers, and annual reports that absolutely no one will enjoy reading."
  1. The passive voice must always be used.
  2. Avoid simple words like “before” and “after.” Instead, opt for “prior to,” “in advance of,” and “subsequent to,” which will doubtless make you sound more intelligent.
  3. To leverage your core competencies, liaise with key stakeholders, and build capacity in oral and written expression, it is to be ensured that every management-speak cliché is utilized going forward by all personnel—or they may risk being rightsized for not using best practices in their writing methodology. Remember, you’ll never be impactful without lots of jargon and gobbledygook, so stay within the parameters.
  4. Never write “people.” Write “persons” and “individuals” instead—preferably ten times on the same page.
  5. Even though you finished grad school ten years ago, write as though you’ve got a professor who wants a specific word count. Pad your sentences and paragraphs with as many redundant, repetitious, pleonastic, redundant, and tautological phrases and locutions as possible. Make them circuitous and repetitive, too. “Overall,” “in the final analysis,” and “it is interesting to note” are also handy ways to lengthen a paper that’s otherwise short on content.
  6. Who needs “because” when you can use “due to the fact that,” “in light of the fact that,” and “owing to the fact that”?
  7. Add in some legalese for extra variety. You’ll always sound more authoritative if you say “pursuant to” instead of “under” or “according to.”
  8. Forget what Strunk and White said about omitting needless words. In fact, you should include a plethora of superfluous vocables in excess; otherwise, you won’t come across as scholarly enough. (See also Rule 5.)
  9. Spock and Data from Star Trek should be your models of good writing. Contractions make you sound personable, which is not the done thing.
  10. A paper is never complete without a few parenthetical references (PRs), especially for words that won’t be used anywhere else in the paper.
  11. Never start your sentences with “and” or “but.” Use heavy openers like “in addition” and “however” instead.
  12. When quoting sources, don’t use “said.” Words like “noted,” “indicated,” and “stated” sound more elegant, don’t you think?
  13. To sound more credible, say “evidence-based” and “best practices,” even when the evidence in question is a few online commenters, and the “best” practices aren’t even in the top ten.
  14. If you’re a psychology researcher, be sure to say that your respondents “endorsed” having depression and anxiety—because nine out of ten patients apparently endorse feeling like shit.
  15. Never get help with your writing. After all, your middle-school English teacher told you that you were the best writer in the class, so how much more help do you need?

