‘The Cult of Hattie’: ‘wilful blindness’?

“There is a science to learning and we are finding out more and more about what works best to support the learning processes that make a difference for your learners.” Advertising for a Visible Learning symposium at the Australian Council for Educational Leaders (ACEL) website

“Assisting practising teachers to maximise their impact on student learning relies on implementing practices that have been shown to benefit students the most – with constructive feedback on educational practices, collaboration and effective professional learning.”  From the Australian Institute for Teaching and School Leadership (AITSL) Chair John Hattie’s Statement of Intent

“Hattie’s work is everywhere in contemporary Australian school leadership. This is not to say that educators have no opportunity for resistance, but the presence and influence of brand Hattie cannot be ignored. The multiple partnerships and roles held by Hattie the man and the uptake of his work by systems and professional associations have canonised the work in contemporary dialogue and debate to the extent that it is now put forth as the solution to many of the woes of education.” Scott Eacott

“Unfortunately, in reading Visible Learning and subsequent work by Hattie and his team, anybody who is knowledgeable in statistical analysis is quickly disillusioned. Why? Because data cannot be collected in any which way nor analysed or interpreted in any which way either. Yet, this summarises the New Zealander’s actual methodology. To believe Hattie is to have a blind spot in one’s critical thinking when assessing scientific rigour. To promote his work is to unfortunately fall into the promotion of pseudoscience. Finally, to persist in defending Hattie after becoming aware of the serious critique of his methodology constitutes wilful blindness.”  Pierre-Jérôme Bergeron

Since the original publication of John Hattie’s book, Visible Learning, questions have been raised about the statistical methodology underpinning his research and his representation of ‘what works best for learning’. By 2014, the year Professor Hattie became Chair of AITSL, it was clear, even to tertiary statistics students, that serious mathematical errors had been made. There continues to be a steady flow of journal articles contesting Hattie’s ideas. By 2017, concerns about his flawed use of statistics, and the way the politics of education works in Australia, meant that many practitioners did not need to read a journal article to know all about “the cult of Hattie” in our schools.

Hattie continues to rank the “195 Influences And Effect Sizes Related To Student Achievement” without acknowledging the concerns raised by statisticians. After reading the latest paper which derides the methodology, I decided to see what some influential educators thought. A simple tweeted question:

Your thoughts about this new analysis of Hattie’s statistics?

resulted in Stephen Dinham, Scott Eacott and Dylan Wiliam responding, with the latter agreeing that the stats are flawed:

In my view, yes. Issues: age dependence of ES; sensitivity to instruction, study selection; publication bias; atheoretical categorization…

[Screenshot: Dylan Wiliam’s tweet, 26 August 2017]
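The first of Wiliam’s issues, the age dependence of effect sizes, is easy to illustrate. Cohen’s d divides the raw difference between two groups by the spread of scores, so the same absolute gain looks far larger in a population where scores cluster tightly (a single young cohort) than in one where they spread widely (older students). A minimal sketch, with purely illustrative numbers that come from no actual study:

```python
import statistics

def cohens_d(treatment, control):
    """Standardised mean difference: raw gain divided by the pooled SD."""
    n1, n2 = len(treatment), len(control)
    v1, v2 = statistics.variance(treatment), statistics.variance(control)
    pooled_sd = (((n1 - 1) * v1 + (n2 - 1) * v2) / (n1 + n2 - 2)) ** 0.5
    return (statistics.mean(treatment) - statistics.mean(control)) / pooled_sd

# Identical raw gain (5 points), very different population spreads.
# Young children: scores tightly clustered around the mean.
young_control   = [48, 50, 52, 49, 51, 50]
young_treatment = [53, 55, 57, 54, 56, 55]
# Teenagers: the same 5-point gain, but a far wider spread of scores.
teen_control    = [30, 50, 70, 40, 60, 50]
teen_treatment  = [35, 55, 75, 45, 65, 55]

print(round(cohens_d(young_treatment, young_control), 2))  # → 3.54
print(round(cohens_d(teen_treatment, teen_control), 2))    # → 0.35
```

The same teaching gain produces a d ten times larger simply because the comparison population is narrower, which is one reason averaging effect sizes across studies of different age groups, as a meta-analysis of meta-analyses does, is hazardous.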

Scott Eacott’s recent paper, School leadership and the cult of the guru: the neo-Taylorism of Hattie, places Hattie’s work in an Australian context. It really is essential reading for educational leaders. I urge you to read it and engage with him, perhaps on Twitter. Hattie’s reply to Eacott’s paper does not even remotely grapple with the issues raised, and I note that “no potential conflict of interest was reported by the author”. Eacott tweeted that the journal will not publish his response.

Corwin Australia (see screenshot from their website below) is onto a good thing. Google “Visible Learning” + your town and see how many primary and secondary school website links you find back to this business/service. There is a growing coterie of trainers around the world delivering this trademarked professional learning, based on Hattie’s meta-analyses.

[Screenshot: Corwin Australia website, 26 August 2017]

Just to be clear: Professor Hattie has had a stellar career, and much of his work makes complete sense without an iota of research. Who would argue with Hattie’s point that teachers with impact are:

  • passionate about helping their students learn
  • able to build strong relationships with their students
  • clear about what they want their students to learn
  • using evidence-based teaching strategies
  • monitoring their impact on students’ learning, and adjusting their approaches accordingly
  • actively seeking to improve their own teaching
  • viewed by the students as being credible 

However, when flawed statistical analysis results in advice that high-impact, evidence-based teaching strategies include:

  • Direct Instruction
  • Note Taking & Other Study Skills
  • Spaced Practice
  • Feedback
  • Teaching Metacognitive Skills
  • Teaching Problem Solving Skills
  • Reciprocal Teaching
  • Mastery Learning
  • Concept Mapping
  • Worked Examples

but there’s little or no impact with:

  • giving students control over their learning
  • problem-based learning
  • teaching test-taking
  • catering to learning styles
  • inquiry-based teaching

one feels a little less comfortable with the advice, considering that the statistical analysis of effect sizes is worse than merely dubious. Research can only tell us what may have happened, not what is needed next as we all grapple with the future.
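One concrete example of what the statisticians are objecting to: Bergeron’s paper notes that Visible Learning reports “common language effect sizes” (CLEs) that are negative, which is impossible, since a CLE is a probability — the chance that a randomly chosen student from the treated group outscores a randomly chosen student from the control group. Computed correctly under the usual normality assumption it is Φ(d/√2), which always lies strictly between 0 and 1. A quick sketch using only Python’s standard library:

```python
from statistics import NormalDist

def common_language_effect_size(d):
    """P(random treated student outscores random control student),
    assuming normally distributed scores with equal variances."""
    return NormalDist().cdf(d / 2 ** 0.5)

# Whatever d is, the CLE comes out as a probability between 0 and 1:
for d in (-1.0, -0.2, 0.0, 0.4, 1.0):
    print(f"d = {d:+.1f}  ->  CLE = {common_language_effect_size(d):.2f}")
```

A CLE of exactly 0.5 (for d = 0) is a coin flip; no correct calculation can ever yield the negative “probabilities” the critique highlights.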

Context is everything. That includes the context, thoroughly discussed by Dr Eacott, that has led to Australian schools looking for scientific, evidence-based solutions to the educational challenges highlighted by PISA and NAPLAN. Dylan Wiliam, since at least 2009, has questioned the use of meta-analysis in education. It seems pretty obvious that Hattie’s number-crunching has appealed to politicians and administrators looking to solve what often feels like a manufactured series of education crises.

It is worth quoting the conclusions from a 2009 paper that you really should read:

…we want to repeat our belief that John Hattie’s book makes a significant contribution to understanding the variables surrounding successful teaching and think that it is a very useful resource for teacher education. We are concerned, however, that:

(i) Despite his own frequent warnings, politicians may use his work to justify policies which he does not endorse and his research does not sanction;

(ii) Teachers and teacher educators might try to use the findings in a simplistic way and not, as Hattie wants, as a source for “hypotheses for intelligent problem solving”;

(iii) The quantitative research on ‘school effects’ might be presented in isolation from their historical, cultural and social contexts, and their interaction with home and community backgrounds; and

(iv) In concentrating on measurable school effects there may be insufficient discussion about the aims of education and the purposes of schooling without which the studies have little point.

It seems appropriate to close with one of the quotes that opened this brief post and to ask what you think. Your commentary is, as always, highly appreciated.

“To believe Hattie is to have a blind spot in one’s critical thinking when assessing scientific rigour. To promote his work is to unfortunately fall into the promotion of pseudoscience. Finally, to persist in defending Hattie after becoming aware of the serious critique of his methodology constitutes wilful blindness.” Pierre-Jérôme Bergeron


Bergeron, Pierre-Jérôme and Rivard, Lysanne, How to Engage in Pseudoscience With Real Data: A Criticism of John Hattie’s Arguments in Visible Learning From the Perspective of a Statistician, McGill Journal of Education / Revue des sciences de l’éducation de McGill, v. 52, n. 1, July 2017. ISSN 1916-0666. Available at: <http://mje.mcgill.ca/article/view/9475/7229>. Date accessed: 22 Aug. 2017.

Eacott, Scott, School leadership and the cult of the guru: the neo-Taylorism of Hattie, School Leadership & Management, 2017. DOI: 10.1080/13632434.2017.1327428.

Hattie, J., Visible learning for teachers: maximizing impact on learning, London: Routledge, 2012

Myburgh, Siebert J., Critique of Peer-reviewed Articles on John Hattie’s Use of Meta-Analysis in Education, Working Papers Series: International and Global Issues for Research, No. 2016/3, December 2016. Available at: <http://www.bath.ac.uk/education/documents/working-papers/critique-of-peer-reviewed-articles.pdf>. Date accessed: 26 Aug. 2017.

Snook, Ivan; O’Neill, John; Clark, John; O’Neill, Anne-Maree and Openshaw, Roger, Invisible Learnings? A Commentary on John Hattie’s Book ‘Visible Learning: A Synthesis of Over 800 Meta-analyses Relating to Achievement’, New Zealand Journal of Educational Studies, Vol. 44, No. 1, 2009: 93-106. ISSN 0028-8276. Available at: <http://search.informit.com.au/documentSummary;dn=467818990993648;res=IELNZC>. Date accessed: 26 Aug. 2017.



  1. Michelle Renshaw:

    The cultish nature & the use of stats to draw policy conclusions has always rested uncomfortably with me. Teaching is a reflective practice that is both contextual & relational. I’m always happy to read, reflect, trial & apply. I’ve found the professional development regarding Hattie to be helpful because it makes me think. My biggest bugbear is that his name is used to badger or thwart by zealots. As an HSC Studies of Religion teacher, the Hattie chant echoes like a new age religion in the Australian educational landscape. In essence, good teaching & learning is too complex & nuanced for cults, while fads are distilled in the daily practice of teachers & mined for what works today, with these young people, in this part of the course.

  2. wayne:

    I think you have been too kind. The research is fatally flawed, therefore any of its conclusions are tainted and also flawed. The only sensible approach is to reject the whole lot of it.

    There are promising insights BUT the actual academic work on those is yet to be done and as such they are anecdotal and of no substance. If Hattie was the academic he presents as then he would redo the research, properly and validate or disprove his erroneous findings. The very fact he has not suggests academic misconduct of a high order.

  3. Natasha Watt:

    There is purpose to the powerful wanting to uphold Hattie as a guru. It is that his work makes the complex nature of learning simple, i.e. politicians think that progressing students is as straightforward as pre and post testing. The other reason is that it throws validation behind the idea that there is a crisis in teaching, and reasons to take teachers in hand, take them on, de-professionalise teaching, and attempt to make teachers untrustworthy to the public so that the solution is for Pearson and crew to weigh in.

  4. Merrideth:

    Thanks Darcy – that was an interesting read and I love the phrase ‘wilful blindness’.
    I agree that much of Hattie’s work makes sense but the presentation of his findings as scientific fact is problematic. I think that Hattie is appealing to ‘leaders in education’ because he takes the complex nature of teaching and learning and breaks it down into tick boxes, providing solutions that will surely lead to ‘improvement’ – of course this improvement can be measured by external assessment that offers but a narrow report of student progress, but data can be collected! Politicians can report that ‘standards have been raised’ and schools can be congratulated for ‘bumping up’ their NAPLAN results. Where to from here?

    • Duane E Swacker:

      “of course this improvement can be measured by external assessment”

      While “measuring” is what is purported to be being done with those external assessments, nothing of the sort is actually being done with those standardized tests, i.e., external assessments. Noel Wilson delves into the nature of assessment in his seminal 1997 dissertation “Educational Standards and the Problem of Error”, arguably the most important piece of educational writing in the last half century. All should read his work: http://epaa.asu.edu/ojs/article/view/577/700

      Now as far as that measuring goes:

      The TESTS MEASURE NOTHING, quite literally when you realize what is actually happening with them. Richard Phelps, a staunch standardized test proponent (he has written at least two books defending the standardized testing malpractices) in the introduction to “Correcting Fallacies About Educational and Psychological Testing” unwittingly lets the cat out of the bag with this statement:

      “Physical tests, such as those conducted by engineers, can be standardized, of course [why of course of course], but in this volume, we focus on the measurement of latent (i.e., nonobservable) mental, and not physical, traits.” [my addition] (notice how he is trying to assert by proximity that educational standardized testing and the testing done by engineers are basically the same, in other words a “truly scientific endeavor”)

      Now since there is no agreement on a standard unit of learning, there is no exemplar of that standard unit and there is no measuring device calibrated against said non-existent standard unit, how is it possible to “measure the nonobservable”?

      THE TESTS MEASURE NOTHING for how is it possible to “measure” the nonobservable with a non-existing measuring device that is not calibrated against a non-existing standard unit of learning?????


  5. Askinggoodquestions:

    Thank you for this excellent post Darcy. A couple of thoughts. I see more principals and teachers starting to engage with the serious flaws in VL. However there is still too much ‘buying into’ what is held up as education research because it flies the ‘quantitative’ or ‘meta analysis’ banner. Recently I met a young teacher at an education conference who had just completed his Masters (Hons) thesis – it was all based on the JH work ie ‘effect size’ in particular. We had an excellent but robust conversation – he wanted to see the critique/read the critical blog posts/peer-reviewed papers that tackle the ‘JH juggernaut’ head on. He expressed frustration that none of this had been drawn to his attention while undertaking the study and he questioned the whole premise on which his study (now completed) was based – so there are ‘silences’ and some HE institutions are complicit too. The ‘guru’ is never a good thing and singing songs/constructing lyrics/wearing cult tee shirts at expensive VL seminars is not what our learning profession requires. Education research is messy and complex – the quantitative paradigm provides a partial view – deep studies of practice are necessary to really understand classrooms, learning, teachers, students and schools. Education researchers must ‘get their hands dirty’ in the process of engagement and change. The current and ongoing commercialisation of education and charging systems/schools and the community enormous fees for ‘service’ when you can position yourself as having all of the answers is not healthy and needs to be called out – it drains funds/resources and ‘dumbs down’ the extraordinary and relentless nature of the work that principals and teachers do in our schools every day.

  6. Deanna:

    The issue I had with the Cult of Visible Learning from the very first information session I went to was the “hard sell” to pay huge amounts of money “training” school teams in Hattie’s research findings. It is a business that is raking in the money. Questioning ANYTHING that was presented during the foundation course I attended was actively muted by the facilitators. I have always said that I walked out of the “training” feeling like I had escaped being sucked into the vortex of some cult – as a person who questioned aspects of the research and how the findings applied to my experiences in low sociology economic schools. I actually had a facilitator take me aside and attempt to “turn” me via some sort of interventionist approach. From that day I have been exceptionally cynical of the Visible Learning megalith.
    As a layman (within the academic research context) I have found it has been hard to argue against a “research based” framework, where I have just KNOWN, as an experienced educator, that effect size and ranked strategies DO NOT suit flexible approaches to education – where the student and local context means educational planning and implementation MUST specifically address the needs of this local context.
    Then – to have Hattie installed as Chairperson of AITSL- allows him to have a national vehicle for shoe-horning his PRODUCT into.
    Fascinating reading, Darcy, thanks for the thought provoking post.

  7. Lizzie Chase:

    I love Hattie’s stuff – to use it for trends – but because he uses meta-analyses, he’s presumably comparing apples with oranges sometimes (because different questions were posed or methodologies were used).

  8. Thanks for this post Darcy and the willingness to be a critical consumer of research. I have always been amazed when working with school leaders and they identify as either ‘working in a Hattie school’ or we do this because ‘Hattie says …’. Much, if not all, of what Hattie reports we already knew. That is the nature of meta-analysis. They call on previous work. The packaging has made it attractive (not to mention the many commercial partners – as you rightly identify) and a perception of it as the solution to the problems of under-achievement or mis-investment. To question Hattie is not to critique evidence, but to question any single solution. I truly hope the questioning continues and educators continue to embrace their professionalism as critical consumers of research.

  9. Alex Brown:

    The fatal flaw of Hattie’s work is its reliance on effect size. However, his broad strokes are useful as an entry point for the development of teachers as reflective practitioners. It would be great if the money being spent on AITSL, Hattie, etc. was instead being used to buy release time for teachers to professionalise during work hours, instead of in their free time. I’m not sure if you linked this, Darcy (your references appear comprehensive!), but this is the salient evaluation of Hattie for mine:


  10. Daren:

    Hattie’s work cannot be ignored. We have to take a long hard look at education from a global perspective and see that what we do in Australia might not necessarily be the best way to teach kids. He is looking to challenge the notion of what a great classroom looks like. Truth is, we need to re-examine how and why we give homework. We need to critically examine the effects meaningful goal setting and constructive feedback have on learning. I won’t touch on class size as that is outside my area of expertise. You can pull statistics apart and make them say whatever you want. Governments do this all the time. I think Hattie’s work is a call to re-examine and critically evaluate what we hold as true in teaching. Honestly, look at the homework the average child brings home, or the work that is done in mathematics classrooms across the world, and you will see irrelevant, textbook-based, one-size-fits-all wasted opportunities everywhere you look. If we are doing it right, then why are so many failing and hating maths, or even homework? Putting the statistics aside, change needs to happen in many of our pedagogical beliefs.

  11. Data troll:

    “Lies, damn lies and Education based data”

    The blind and simplistic adherence to Hattie’s research reflects a much wider issue in Australian education, which is that the teaching profession is not allowed to, and has forgotten how to, think for itself. There is also a vacuum in good educational discourse that Hattie has filled in terms of reflecting on what “works”. People need something to think and follow, and Hattie fills that void. (I’m thinking of the Monty Python “Life of Brian” scene – “He’s the Messiah!”)
    Authors and bloggers who denote the political element of this issue are exactly right. The research can be manipulated to suit political agendas so politicians can be seen to be “doing something”. Politics is at the root of all educational evil in Australia today. Short term, paternalistic and simplistic solutions that can be sold to a public at a level the public can grasp. Hattie has been misused to suit this.
    At a local level my school uses Hattie and as a leader I quote him, but with the qualification that he isn’t the guru in our setting and his research is a guide and should be questioned in our context. Interestingly, when effect sizes are questioned in our school the position on Hattie is immediately softened to recognise its flaws. There is some worth in the research to get us thinking, but “wilful blindness” and adherence mean we’ve stopped thinking as a profession.

  12. This is an excellent response to the growing concerns that many people have about the influence of Hattie’s work on national and international educators and educational institutions. I’ve also loved reading the very measured and insightful responses left in the comments. Thank you for sharing your thoughts and I hope that educators (and politicians!) read this and that rational dialogue and debate can occur about the manufacturing of the cult of Hattie.

  13. I was first introduced to the work of Hattie when I started my Masters of Education at USyd and my lecturer told me that I had better read his ‘Visible Learning’ because he had proven that PBL (what I planned to do my research on) was not effective. As someone who had been using PBL successfully with all students (from the incredibly disengaged to the very talented learners) in years 7-12, I found it confronting that someone had scientifically proven it was an ineffective methodology. Upon reading his book, and then reading more research into PBL and the strategies which underpin it (and, admittedly as I became more research literate) I decided that meta-analysis was the best way to approach edu research. So, I ignored Hattie and his work for many years until it was thrown in my face – I was made to attend a VL ‘Symposium’ earlier this year… and what I experienced was appalling. It was the cult of Hattie – THAT was the most visible thing. I recorded this on my Twitter feed, and don’t need to rewrite it here. What the event did force me to do was to think critically (and somewhat objectively) about the general tenets of VL, and relate them to PBL… essentially to show how ludicrous it is to denounce it when those things they champion as having significant effect sizes are evident in well-planned and run projects… you can read my take on it here: https://biancahewes.wordpress.com/2017/03/25/the-skill-will-and-thrill-of-project-based-learning/ Looking forward to your thoughts on VL and PBL, Darcy.

    • *meta-analysis was NOT the best way to approach edu research (sorry for typo!)

    • Darcy Moore:

      Thanks for your thoughtful comment, Bianca. Happy to share my thoughts about effect size, assessment, raising student achievement and PBL (although you are much more knowledgeable about this than I).

      Why does inquiry-based learning only have an effect size of 0.31? Hattie would say that inquiry-based learning is not introduced at the right time for the students. Watch his 2-minute spiel: https://www.youtube.com/watch?v=YUooOYbgSUg&ab_channel=Corwin
      I do not disagree with him but once again, it is all about context (inc. how good/bad the research he number-crunched actually was etc.).

      PBL can be done well or poorly, like any other strategy or approach. Teachers working together collaboratively has to be key. Assessment is important and backward mapping what skills/knowledge is needed for the student to produce the product requires a great deal of planning/thought/trial/error. However, the potential for developing students who are “futures-focused” and engaged in challenging, relevant, contextualised learning is just excellent.

      You have constantly striven to learn, share, model, share, learn, fail, share, improve, have fun, strive, write, present, blog, share, tweet and attend conferences in an effort to improve your skills and then share, learn, model, fail, grow, etc.

      With this in mind, how much of what Dylan Wiliam has to say makes a great deal of sense compared to the VL model?

      “…if you’re serious about raising student achievement you have to improve teachers’ use of assessment for learning; if you’re serious about helping teachers implement assessment for learning in their own practice, you have to help them do that for themselves as you cannot tell teachers what to do; and the only way to do that at scale is through school-based teacher learning communities. The good news is that you do not need experts to come in and tell you what to do. What you need is for you, as groups of teachers, to hold yourselves accountable for making changes in your practice.”

      Sounds sage to me. Sounds like what you endeavour to do.

      • Duane E Swacker:

        “if you’re serious about raising student achievement”

        As a teacher I never gave a damn about “raising student achievement”. I gave a damn about each individual student learning as much Spanish as they could in the 45 minutes a day I had with them in a class of 25-30.

        “raising student achievement” is an edudeformer term used to disparage the actual teaching and learning process that occurs on a minute to minute, hour by hour, day by day and year to year in classrooms around the world.

  14. Not a bad career trajectory – a politically highly regarded yet apparently statistically flawed book used as a CV to work for the Commonwealth-anointed teaching standards support act. Happily though, I would contend that no-one outside of teaching knows what AITSL actually does (actually, let me correct that – “few inside or many outside teaching”). I wonder if he’s proud to be used as part of the first concerted political push to mandate teacher accountability. That his effect sizes don’t stack up should have been discovered well before now. Admittedly, some of his work has presented useful leads for teachers seeking quality practice, but it has also (now erroneously, it seems) been used as a club to whack teachers into infighting about teaching effectiveness. The sad thing is, if you can’t trust the CEO of your peak teaching excellence organisation for guidance, who can you trust?

  15. Victor Davidson:

    In correspondence he refused to acknowledge the impact of trained and qualified teacher librarians on student outcomes. End story.

  16. Craig Petersen:

    Thanks for posting this, Darcy. It highlights the perennial problem of education systems wanting to cling to those that they see as being able to provide the ‘silver bullet’ – think Finland, Fullan and functional grammar! Teaching is more than a science – it is also a craft, and it is the skilful application of knowledge, along with the careful consideration of context and relationships, which (may) make the successful educator. I have always advocated the importance of teachers and leaders being critical in their adoption of new thinking. This is not to say that we should reject new research or ways of thinking, but the wise educator will consider research carefully in the context of their experience and their classroom. Critical is the first teaching standard – “Know students and how they learn.”

  17. With all the talk about Hattie, I am always left wondering how Marzano’s work is different? I was of the impression that his model(s) were built on meta-analyses? Could be wrong.

    • Duane E Swacker:

      Marzano? Built on meta-analysis?

      Ay ay ay!

      I had an asst supe adminimal try to get me to have an interactive smart board in my room. (I had been clamoring for another foreign language teacher to be hired for quite a while at that point and didn’t want that supposed smart board, didn’t need it.) She stated “Don’t you know that according to Marzano using a smartboard in class raises student achievement 17%. Why wouldn’t you want one?”

      I asked her if she had read the study. No, she hadn’t. I asked her if she had read any rebuttal to the study. No, she hadn’t. So when I got back to my room I emailed her the following links that destroy Marzano’s ‘meta-analysis’ of the supposed effects of using an interactive smart board. Do you know that Marzano was specifically hired by, paid for by a company, Promethean, that makes interactive boards, to come up with research that they could use in selling that technology?

      See: http://edinsanity.com/2009/06/02/marzano_part1/ And from there you can access the “rest of the story”.

      The edupreneur Marzano has product to sell. Don’t let a little analysis get in the way of buying his “products”

  18. Wayne:

    Make Duane E Swacker the Minister for Education!

    • Duane E Swacker:

      Sorry, Wayne, I don’t think I’d qualify, being a Gringo and all of that. But thanks for the kind thought!

      And that “academic misconduct” to which you refer is routine here in Gringolandia what with all the bought and paid for “research” (like Marzano’s to which I referred) put out by stink tanks, oops, that’s supposed to be think tanks, and the bought and paid for by the same culprits university departments. Lends a fine patina to the crap that comes out of those “institutions”. Think of it as gold plated bovine excrement.

  19. Katy Lumkin:

    It is never one size fits all. Hattie provides one perspective. There is a place for explicit and systematic teaching and there is a place for autonomy, mastery and purpose. I totally agree with you Darcy – it is about context. Watch these three different perspectives from the ILETC project, Melbourne: http://www.iletc.com.au/events/project-launch/455-2/
