“They should teach that in school….”

One of the academic studies I haven’t had time to pursue (and so can only blog about) is a thematic analysis of editorials in medical journals – with a focus on the many, many “musts”, “need tos”, “shoulds” and “have tos” imposed on doctors, “policymakers”, and so on.

Education is more prone to this, and from a wider group of people. Everyone has their idea of what “they” should teach, ascribing to schools magical powers to end social ills by simply putting something on the curriculum.

Much of this is very worthy and well-intentioned. People want their children to be prepared for life. That the things suggested may not lend themselves to “being on the curriculum” with any degree of effectiveness is rarely considered – nor is the fact that curricula are pretty overloaded already.

Anyway, the UK organisation “Parents and Teachers for Excellence” has been keeping track of these “X should be taught in schools” calls in 2018 so far:

How often do you hear the phrase “Schools should teach…” in the media?
We’ve noticed that barely a week goes by without a well-meaning person or organisation insisting that something else is added to the curriculum, often without any consideration as to how it could be fitted into an already-squeezed school day. Obviously the curriculum needs to be updated and improved upon over time, and some of the topics proposed are incredibly important. However, there are only so many hours in the school week, and we believe that teachers and schools are the ones best placed to decide what their students need to know, and not have loads of additional things forced on them by government because of lobbying by others.

As of today, this is the list:

So far this year we count 22 suggestions for what schools should do with pupils:
Why We Should Teach School Aged Children About Baby Loss
Make schools colder to improve learning
Schools ‘should help children with social media risk’
Pupils should stand or squat at their desks, celebrity GP says
MP’s call for national anthem teaching in schools to unite country
It’s up to us: heads and teachers must model principled, appropriate and ethical online behaviour
Primary school children need to learn about intellectual property, Government agency says
Call for more sarcasm at school is no joke
Schools should teach more ‘nuanced’ view of feminism, Girls’ School Association president says
Schools ‘should teach children about the dangers of online sexual content’
Schools should teach children resilience to help them in the workplace, new Education Secretary says
Government launches pack to teach pupils ‘importance of the Commonwealth’
Schools must not become like prisons in fight against knife crime, headteacher warns
Schools should teach all pupils first aid, MPs say
Call for agriculture GCSE to be introduced as UK prepares to leave the EU
Councils call for compulsory mental health counselling in all secondary schools
Set aside 15 minutes of dedicated reading time, secondary schools told
Pupils must be taught about architecture, says Gokay Deveci
A serious education on the consequences of obesity is needed for our most overweight generation
Teach girls how to get pregnant, say doctors
Start teaching children the real facts of life

I am confident there are plenty more out there that PTE haven’t come across – from sarcasm to “how to get pregnant” to first aid to intellectual property to resilience.

I do wish someone would do my study on medical journals’ imperatives for me!
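(If anyone is tempted: the crude counting part of such a study – as opposed to the actual thematic analysis, which is qualitative work – could be sketched in a few lines of Python. A minimal sketch, assuming a folder of editorials saved as plain-text files; the folder name and phrase list here are mine, purely for illustration:)

```python
import re
from collections import Counter
from pathlib import Path

# Hypothetical corpus: one plain-text file per editorial
CORPUS_DIR = Path("editorials")

# The imperative phrases of interest ("musts", "need tos", "shoulds", "have tos")
PHRASES = ["must", "need to", "should", "have to", "ought to"]

def count_imperatives(text: str) -> Counter:
    """Count occurrences of each imperative phrase, case-insensitively."""
    counts = Counter()
    for phrase in PHRASES:
        # \b ensures whole-word/phrase matches only
        pattern = r"\b" + re.escape(phrase) + r"\b"
        counts[phrase] = len(re.findall(pattern, text, flags=re.IGNORECASE))
    return counts

# Tally imperatives across the whole (hypothetical) corpus
totals = Counter()
for path in CORPUS_DIR.glob("*.txt"):
    totals += count_imperatives(path.read_text(encoding="utf-8"))

for phrase, n in totals.most_common():
    print(f"{phrase!r}: {n}")
```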


Why isn’t William C Campbell more famous in Ireland?

There have been only two Irish winners of Nobel Prizes other than Literature and Peace – Dungarvan-born Ernest Walton for physics in 1951 and Ramelton-born William C Campbell for Physiology or Medicine in 2015.

My memory of being in school in the 1990s was that Ernest Walton loomed fairly large in science popularisation at the time. I recall quite vividly coverage of his death in 1995, but also recall his being quoted and profiled fairly extensively. Of course, I could be a victim of a recall bias – I probably am. Yet it does seem that William C Campbell has not had nearly as much coverage, especially when you consider how media-saturated we are now.

Or perhaps that is the whole point. It feels like a silly comparison, but it may be like the Eurovision; once we cared deeply about winning this competition and getting recognition, now there is a flurry of excitement if we get to the final. Having said that, it isn’t like we have had any other science Nobels to get excited about since 1995.

Of course there is a reasonable amount of coverage of Campbell, in the Irish Times in particular, some of it quite recent. A fair percentage of online coverage seems to be from the Donegal papers, of the hail-the-local-hero variety, which is fair enough.

A search for ‘William Campbell “Irish Independent”’ starts with two articles from the Independent on Campbell, then has this, then a range of articles about unrelated topics.

I came across this excellent piece on “the fragile culture of Irish journalism” by Declan Fahy – the fragility exemplified by the coverage of Campbell’s prize:

The reporting of Campbell’s Nobel win illuminated several more general features of Irish media coverage of science. The story originated outside Ireland, yet its local dimension was stressed. Its tone was celebratory. It was not covered by specialist science journalists. Only The Irish Times probed deeper into the background of the scientist and his work.

The story was interesting also because of the aspects of Campbell’s story that were not developed. Reporters did not use the announcement as a jumping-off point to explore some of the novel dimensions of Campbell’s story, such as the rights and wrongs of pharmaceutical companies’ ownership of drugs that could help millions of the world’s poorest people, the unseen research work of an industry-based scientist, and the complex case of a scientist of faith with an admitted “complicated sense of religion”.

The superficial reporting of the Campbell story is not an isolated case. It reflects more generally the state of Irish science journalism, where there are few dedicated science journalists, a shortfall of science coverage compared to other countries, a neglect of science policy coverage, a reliance on one outlet for sustained coverage, a dependence on subsidies for the production of some forms of journalistic content, and a dominant style of reporting that lacks a critical edge.

(In passing, Walton was also a scientist of faith, although perhaps with a less “complicated sense of religion” than Campbell.)

Fahy goes on, in what is an extract from “Little Country, Big Talk”, a book co-edited by Fahy, to enumerate some of the issues, both within the structure of media institutions and within Irish society and culture overall, which contribute to this relative neglect. While there is an Irish Science and Technology Journalists Association, there is not a critical mass of science journalists. Writing in 2017, Fahy observes:

Compared to the US and UK, Ireland has a far less developed culture of science journalism. There are currently no full-time science journalists in mainstream Irish newspapers and broadcasters. The Irish Times had a dedicated science editor in Dick Ahlstrom, who has now retired (and, during his tenure, he had other significant editorial duties at the news organisation).

The Irish Times also had a longtime environmental correspondent, Frank McDonald, who retired in recent years. Earlier this year, former editor Kevin O’Sullivan combined these two roles, becoming environment and science editor. The paper also has a health correspondent and a specialist medical writer. The Irish Independent has an environment editor, Paul Melia.

The public service broadcaster, RTÉ, has had specialists in science or technology, but its correspondents have usually had dual briefs, reporting on education or health as well as science, and tending to cover education or health more so than science. That tendency, identified by Brian Trench in 2007’s Mapping Irish Media, has continued. In 2016, the incumbent in the role is responsible for science and technology, and tends to cover technology more than science.

Fahy also discusses the wider place of science in Irish culture and society. There are many, many fascinating stories to tell about science in Ireland, such as Erwin Schrödinger’s time here (perhaps illustrative of Fahy’s point is that the very first Google result for “Schrodinger in Ireland” is this) and the many stories collected by Mary Mulvihill in Ingenious Ireland. As I have just posted on Seamus Sweeney, I only learnt while researching this post that Mary Mulvihill died in 2015.

Of course, some of these stories can be told with a celebratory, or I-can’t-believe-this-happened-in-little-auld-Ireland, focus, which again illustrates Fahy’s point. My own perception is that in 1995 the situation was actually a little better than it is now – that Irish science journalism is not in stasis but actually in reverse.

One striking point made by Fahy is that the science beat is often combined with health or technology – and these tend to win out in terms of focus. And the hard, critical questions don’t tend to get asked – often there is a strong bang of barely rewritten press release about articles on science topics.

Another thought – the retirement of Dick Ahlstrom and the death of Mary Mulvihill alone robbed the already small pool of Irish science writers of some of its finest practitioners. Irish journalism – like Irish anything – is pretty much a small world, and a couple of such losses can have a huge impact.

The myth of digital natives and health IT 

I have a post on the CCIO website on the Digital Native myth and Health IT.

The opening paragraph: 

We hear a lot about digital natives. They are related to the similarly much-mentioned millennials; possibly they are exactly the same people (although as I am going to argue that digital natives do not exist, perhaps millennials will also disappear in a puff of logic). Born after 1980, or maybe after 1984, or maybe after 1993, or maybe after 2007, or maybe after 2010, the digital native grew up with IT, or maybe grew up with the internet, or grew up with social media, or at any rate grew up with something that the prior generation – the “digital immigrants” (born a couple of years before the first cut-off above; that’s where I am too) – didn’t.

A Way Out of Burnout: Cultivating Differentiated Leadership Through Lament

Some interesting (and provocative) thoughts from the world of church leadership. “Lament” is not prominent in our culture anymore, at least not in our official culture… and one could wonder how to translate these ideas into a secular setting. Nevertheless, there is much to ponder here, and I feel that anyone in a leadership position – or a role susceptible to burnout – could benefit from reading this, whether they have religious faith or not.

I found that the following paragraphs (of what is a long paper) especially resonated:

Leaders who are most likely to function poorly physically or emotionally are those who have failed to maintain a well-differentiated position. Either they have accepted the blame owing to irresponsibility and constant criticism of others, or they have gotten themselves into an overfunctioning position (that is, they tried too hard) and rushed in where angels and fools both fear to tread.[12]

Many programs often aim to cure clergy burnout by offering retreats that focus on rest and relaxation. However, Friedman asserts, “Resting and refreshment do not change triangles. Furthermore, because these programs focus on the burned-out ‘family’ member, they can actually add to his or her burden if such individuals are inclined to be soul searchers to begin with.”[13] These same soul-searching and empathetic clergy are vulnerable to seeing the overwhelming burdens that they carry for others as crosses that they ought to bear. Friedman calls this way of thinking “sheer theological camouflage for an ineffective immune system.”[14] When clergy bear other people’s burdens, they are encouraging others not to take personal responsibility. And often in bearing other people’s burdens, clergy easily tend to ignore their own “burdens” (ie. marriage issues, financial problems, etc.) and thus fail to be personally responsible for themselves.

London also discusses how “lament” – and in some ways “passing the buck onto God” – has Biblical roots:

God responds with sympathy to Jesus’ ad deum accusation and lament. Furthermore, one may easily interpret the empty tomb at the end of the Gospel as a sign of God’s ultimate response to Jesus’ lament: the resurrection (Mark 16:4-7). In the psalms of lament and in the cry of dereliction, we see that God does not respond with hostility but with a sympathetic openness to our struggle, our need for someone to blame and, in the words of Walter Brueggemann, our “genuine covenant interaction.”[34] God responds with sympathetic openness to Jesus’ ad deum accusation and then dispels the blame and emotional burden that no human could ever bear. Jesus receives the blame that humans cast upon him and then gives it to God who receives it, absorbs it and dispels it. Jesus let go of the blame by giving it to God. His cry of dereliction became his cry for differentiation. In this way, Jesus serves as a role model for leaders who receive blame from others and then need to differentiate in order to not take accusations personally. By practicing lament, leaders can turn the ad hominem accusations against themselves into ad deum accusations against God, who responds with sympathetic openness while receiving and dispelling the blame. Moreover, leaders can respond with empathy to the suffering of others, knowing that they will not have to bear the emotional burden that they have taken on, indefinitely. They can let go of the emotional burden by passing it on to God through the practice of lament.

This “passing of the buck” to God does not encourage irresponsibility. Rather, it gives the emotional baggage away to the only One who can truly bear it, thus freeing the other to take personal responsibility, without feeling weighed down by unbearable burdens. With this practice, a pastor can therefore receive blame and emotional baggage from parishioners in a pastoral setting because they can differentiate through lament. They can take the blame like Jesus because they, like Jesus, can also pass the buck to God through ad deum accusation. Eventually, the pastor will want to teach the parishioners to redirect their human need to blame onto God as well so as to occlude the cycle of scapegoating in the community.

Fr. Daniel London

This is the final paper I wrote for the class “Leading Through Lament” with Dr. Donn Morgan at the Church Divinity School of the Pacific.


On August 1, 2010, the New York Times published an article titled “Taking a Break From the Lord’s Work,” which began with the following statements: “Members of the clergy now suffer from obesity, hypertension and depression at rates higher than most Americans. In the last decade, their use of antidepressants has risen, while their life expectancy has fallen. Many would change jobs if they could.”[1] Although these are troubling reports, some of the statistics that came out of a study conducted by Fuller Theological Seminary in the late 1980s prove more disturbing: “80 percent [of pastors] believe that pastoral ministry is affecting their families negatively, 90 percent felt they were not adequately trained to cope with the ministry demands placed upon them, 70 percent…


“Working here makes us better humans”

A daily thought from Leandro Herrero:

I have had a brilliant two day meeting with a brilliant client. One aspect of my work with organizations that I truly enjoy is to help craft the ‘Behavioural DNA’ that shapes the culture of the company. This is a set of actionable behaviours that must be universal, from the CEO to the MRO (Mail Room Officer). They also need to pass the ‘new hire test’: would you put that list in front of a prospective employee and say ‘This is us’?

There was one ‘aspirational’ sentence that I put to the test: ‘Working here makes us better human beings’.

It was met with scepticism by the large group in the meeting, initially mainly manifested through body language including the, difficult to describe, cynical smiles. The rationalists in the group jumped in hard to ‘corporatize’ the sentence. ‘Do you mean better professionals?’ The long discussion had started. Or, perhaps, ‘do you mean…’ – and here the full blown corporate Academy of Language – from anything to do with skills, talent management, empowerment to being better managers, being better leaders, and so on.

‘No, I mean better human beings. Period!’ – I pushed back. Silence.

Next stage was the litany of adjectives coming from the collective mental thesaurus: fluffy, fuzzy, soft, vague…

I felt compelled to reframe the question: ‘OK, so who is against working in a place that makes you inhuman?’ Everybody. ‘OK, so who is against working in a place that makes you more human?’ Nobody. But still the defensive smiling.

It went on for a while until the group, ‘organically’, by the collective hearing of pros and cons, turned 180 degrees and everybody agreed that ‘Working in a place that makes you a better human being’ was actually very neat. But – there was a but – ‘Our leadership team won’t like it. They will say that it’s fluffy, fuzzy, soft etc…’ In the words of the group, it was not ‘them’ anymore who had a problem, it was the infamous ‘they’.

The “difficult to describe” cynical smiles are familiar… indeed, I am sure I have perpetrated such smiles more than once myself!

Medicine can be a dehumanising profession, sometimes literally. Dehumanising in both directions – of patients (especially some categories of patient) and colleagues, but also of ourselves. Of course, the rationalist part of us can pick apart what “better humans” means…

“a tendency to overhype fixes that later turn out to be complete turkeys”

An interesting passage on the contemporary dynamics of the quick fix, from “The Slow Fix: Solve Problems, Work Smarter and Live Better in a Fast World” by Carl Honore:

“The media add fuel to that fire. When anything goes wrong – in politics, business, a celebrity relationship – journalists pounce, dissecting the crisis with glee and demanding an instant remedy. When golfer Tiger Woods was outed as a serial philanderer, he vanished from the public eye for three months before finally breaking his silence to issue a mea culpa and announce he was in therapy for sex addiction. How did the media react to being made to wait that long? With fury and indignation. The worst sin for a public figure on the ropes is to fail to serve up an instant exit strategy.

“That impatience fuels a tendency to overhype fixes that later turn out to be complete turkeys. An engineer by training, Marco Petruzzi worked as a globetrotting management consultant for 15 years before abandoning the corporate world to build better schools for the poor in the United States. We will meet him again later in the book, but for now consider his attack on our culture of hot air. ‘In the past, hard-working entrepreneurs developed amazing stuff over time, and they did it, they didn’t just talk about it, they did it,’ he says. ‘We live in a world now where talk is cheap and bold ideas can create massive wealth without ever having to deliver. There are multi-billionaires out there who never did anything but capture the investment cycle and the spin cycle at the right moment, which just reinforces a culture where people don’t want to put in the time and effort to come up with real and lasting solutions to problems. Because if they play their cards right, and don’t worry about the future, they can get instant financial returns.’”

“The myths of the digital native and the multitasker”

One common rhetorical device heard in technology circles – including eHealth circles – is the idea that those born after 1980, or maybe 1984, or maybe 1993, or maybe 2000, or maybe 2010 (you get the picture) are “digital natives” – everyone else is a “digital immigrant”. In the current edition of Teaching and Teacher Education, Kirschner and de Bruyckere have an excellent paper on this myth, and the related myth of multitasking.

The “highlights” of the paper (I am not sure if these are selected by the authors or by the editors – UPDATE: see comment by Paul Kirschner below!) are pretty much to the point:


Information-savvy digital natives do not exist.

Learners cannot multitask; they task-switch, which negatively impacts learning.

Educational design assuming these myths hinders rather than helps learning.

The full article is available online via subscription/library access, and this recent post on Nature’s blog discusses this paper and others on this myth. This is Kirschner and de Bruyckere’s abstract:

Current discussions about educational policy and practice are often embedded in a mind-set that considers students who were born in an age of omnipresent digital media to be fundamentally different from previous generations of students. These students have been labelled digital natives and have been ascribed the ability to cognitively process multiple sources of information simultaneously (i.e., they can multitask). As a result of this thinking, they are seen by teachers, educational administrators, politicians/policy makers, and the media to require an educational approach radically different from that of previous generations. This article presents scientific evidence showing that there is no such thing as a digital native who is information-skilled simply because (s)he has never known a world that was not digital. It then proceeds to present evidence that one of the alleged abilities of students in this generation, the ability to multitask, does not exist and that designing education that assumes the presence of this ability hinders rather than helps learning. The article concludes by elaborating on possible implications of this for education/educational policy.

The paper is one of those trenchantly entertaining ones academia throws up every so often. For instance, here are the authors on the origins of the “digital native” terminology (and “homo zappiens”, a new one on me):


According to Prensky (2001), who coined the term, digital natives constitute an ever-growing group of children, adolescents, and nowadays young adults (i.e., those born after 1984; the official beginning of this generation) who have been immersed in digital technologies all their lives. The mere fact that they have been exposed to these digital technologies has, according to him, endowed this growing group with specific and even unique characteristics that make its members completely different from those growing up in previous generations. The name given to those born before 1984 – the year that the 8-bit video game saw the light of day, though others use 1980 – is digital immigrant. Digital natives are assumed to have sophisticated technical digital skills and learning preferences for which traditional education is unprepared and unfit. Prensky coined the term, not based upon extensive research into this generation and/or the careful study of those belonging to it, but rather upon a rationalisation of phenomena and behaviours that he had observed. In his own words, he saw children “surrounded by and using computers, videogames, digital music players, video cams, cell phones, and all the other toys and tools of the digital age” (2001, p. 1). Based only upon these observations, he assumed that these children understood what they were doing, were using their devices effectively and efficiently, and based upon this that it would be good to design education that allows them to do this. Prensky was not alone in this. Veen and Vrakking (2006), for example, went a step further, coining the catchy name homo zappiens to refer to a new breed of learners that has developed – without either help from or instruction by others – those metacognitive skills necessary for enquiry-based learning, discovery-based learning, networked learning, experiential learning, collaborative learning, active learning, self-organisation and self-regulation, problem solving, and making their own implicit (i.e., tacit) and explicit knowledge explicit to others.

The saw that children are invariably more tech-savvy than their parents is also a myth:

Looking at pupils younger than university students, the large-scale EU Kids Online report (Livingstone, Haddon, Görzig, & Ólafsson, 2011) placed the term ‘digital native’ in first place on its list of the ten biggest myths about young people and technology. They state: “Children knowing more than their parents has been exaggerated … Talk of digital natives obscures children’s need for support in developing digital skills” and that “… only one in five [children studied] used a file-sharing site or created a pet/avatar and half that number wrote a blog … While social networking makes it easier to upload content, most children use the internet for ready-made, mass produced content” (p. 42). While the concept of the digital native explicitly and/or implicitly assumes that the current generation of children is highly digitally literate, it is then rather strange to note that many curricula in many countries on many continents (e.g., North America, Europe) see information and technology literacy as 21st century skills that are core curriculum goals at the end of the educational process and that need to be…

Two more recent studies show that the supposed digital divide is a myth in itself. A study carried out by Romero, Guitert, Sangrà, and Bullen (2013) found that it was, in fact, older students (>30 years and thus born before 1984) who exhibited the characteristics attributed to digital natives more than their younger counterparts. In their research, 58% of their students were older than 30 years who “show the characteristics of this [Net Generation profile] claimed by the literature because, on analysing their habits, they can be labelled as ICT users more than digital immigrants” (p. 176). In a study on whether digital natives are more ‘technology savvy’ than their middle school science teachers, Wang, Hsu, Campbell, Coster, and Longhurst (2014) conclude that this is not the case.

The authors are not arguing that curricula and teaching methods do not need to change and evolve, but that the myth of the digital native should not be the reason for doing so:

Finally, this non-existence of digital natives makes clear that one should be wary about claims to change education because this generation of young people is fundamentally different from previous generations of learners in how they learn/can learn because of their media usage (De Bruyckere, Hulshof, & Kirschner, 2015). The claim of the existence of a generation of digital natives, thus, cannot be used as either a motive or an excuse to implement pedagogies such as enquiry-based learning, discovery-based learning, networked learning, experiential learning, collaborative learning, active learning, self-organisation and self-regulation or problem solving as Veen and Vrakking (2006) argued. This does not mean education should neither evolve nor change, but rather that proposed changes should be evidence informed both in the reasons for the change and the proposed changes themselves, something that ‘digital natives’ is not.

The non-existence of digital natives is definitely not the ‘reason’ why students today are disinterested at and even ‘alienated’ by school. This lack of interest and alienation may be the case, but the causes stem from quite different things such as the fact that diminished concentration and the loss of the ability to ignore irrelevant stimuli may be attributed to constant task switching between different devices (Loh & Kanai, 2016; Ophir, Nass, & Wagner, 2009; Sampasa-Kanyinga & Lewis, 2015). This, however, is the topic of a different article.

The paper also deals with multitasking. Firstly, the authors examine the nature of attention. “Multitasking” is an impossibility from this point of view, unless the tasks are automatic behaviours. They cite a range of research which, unsurprisingly enough, links heavy social media usage (especially with the user instantly replying to stimuli) with poorer educational outcomes:

Ophir et al. (2009), in a study in which university students who identified themselves as proficient multitaskers were asked to concentrate on rectangular stimuli of one colour on a computer monitor and ignore irrelevant stimuli entering their screen of a different colour, observed that

heavy media multitaskers are more susceptible to interference from irrelevant environmental stimuli and from irrelevant representations in memory. This led to the surprising result that heavy media multitaskers performed worse on a test of task-switching ability, likely because of reduced ability to filter out interference from the irrelevant task set (p. 15583).

Ophir et al. (2009) concluded that, faced with distractors, heavy multitaskers were slower in detecting changes in visual patterns, were more susceptible to false recollections of the distractors during a memory task, and were slower in task-switching. Heavy multitaskers were less able than light/occasional multitaskers to volitionally restrain their attention only to task-relevant…

The authors specifically urge caution about the drive for students to bring their own devices to school.

Why is this paper so important? As the authors show (and as the author of the Nature blog post linked above also observes), this is not a new finding. There are many pieces out there, both academic and journalistic, on the myth of the digital native. This paper specifically locates the discussion in education and in teacher training (the authors say much also on the issue of supposedly “digital native” teachers) and is a trenchant warning about the magical thinking that has grown up around technology.

There are obvious parallels with health and technology. The messianic, evangelical approach to healthtech is replete with its own assumptions about digital natives, and magical thinking about how easily they navigate online worlds. Using a handful of social media tools or apps with visual interactive systems does not translate into a deep knowledge of the online world, or indeed a wisdom about it (or anything else).