A surgical education: Joseph Queally on what surgical training can learn from music

It’s been a while since I posted anything on this blog, a longer while since I posted anything that wasn’t just a link to something else, and an even longer while since I posted anything all that related to medical education.

So here is Addenbrooke’s orthopaedic surgeon Joseph Queally with an excellent piece on the BMJ site on what surgical training can learn from music:

Anyone who has learnt a musical instrument knows that countless hours of practice are needed to achieve success. As a musician who has performed as an individual and as part of a group, I have spent many hours practicing before competitions and performances. It becomes apparent that how one practices is a skill in itself and the type or quality of practice is often more important than the quantity of practice. Ericsson formally described this phenomenon as deliberate practice after studying violinists in a music academy in Berlin. Rather than monotonous repetition of a skill or task, deliberate practice involves breaking the task up into chunks, identifying which ones need improvement, and performing focused practice on this chunk or task until a goal is achieved.

As a surgical educator, I can also see a role for deliberate practice in surgical training. As in music, complex tasks (e.g. percutaneous screw placement in fracture surgery) can be broken up into basic steps or “chunks,” such as image intensifier positioning, appropriate screw entry point identification, and trajectory planning. Trainees can then practice the steps they are deficient in under supervision. Here trainers provide critical feedback by identifying the troublesome parts of a technique that an individual trainee is struggling with. Simulation in particular can provide a safe environment for deliberate practice where trainees can practice tasks repeatedly without risk to patients.

Read the whole thing, as they say.

#LivingLibrary – College of Psychiatrists of Ireland event for #GreenRibbon month, 31st May 2018

I will be speaking as a living book in this:

The College is delighted to announce our 4th annual event in partnership with See Change for Green Ribbon Month – A Living Library
When it comes to mental health everyone has a story to share and we find comfort, empathy and compassion in shared experiences. Social contact is known to be one of the most effective ways of reducing mental health related stigma and discrimination so with this in mind, and to mark Green Ribbon month, the College is delighted to announce our ‘Living Library’ event, a library come to life in the outdoors!

At our library the ‘books’ are a little different, they are people; people with different experiences and stories to tell related to mental health including those who have experienced mental health issues and illness, their family members and carers, and the psychiatrists who help them towards the path of recovery. Mental health stigma too often creates discrimination and misunderstanding so we want to give members of the public the opportunity to connect and engage with psychiatrists and people they may not normally have the occasion to speak with.

The aim is to better understand the lived experiences of others who have experienced or facilitated recovery from mental illness and distress and to challenge their own assumptions, prejudices and stereotypes. We invite you to ‘read’ the human books through conversation and gain understanding of their experiences.

For Green Ribbon Month Let’s End the Stigma by not judging a book by its cover and develop a greater understanding of each other’s stories.

Thursday 31st May 2018
12.30pm – 2.30pm
St Stephens Green, Dublin

This is a Free Event, but space is limited. Book your place here.

The myth of digital natives and health IT 

I have a post on the CCIO website on the Digital Native myth and Health IT.

The opening paragraph: 

We hear a lot about digital natives. They are related to the similarly much-mentioned millennials; possibly they are exactly the same people (although as I am going to argue that digital natives do not exist, perhaps millennials will also disappear in a puff of logic). Born after 1980, or maybe after 1984, or maybe after 1993, or maybe after 2007, or maybe after 2010, the digital native grew up with IT, or maybe grew up with the internet, or grew up with social media, or at any rate grew up with something that the prior generation – the “digital immigrants” (born a couple of years before the first cut-off above, that’s where I am too) – didn’t.

A Way Out of Burnout: Cultivating Differentiated Leadership Through Lament

Some interesting (and provocative) thoughts from the world of church leadership. “Lament” is not prominent in our culture anymore, at least not in our official culture… and one could wonder how to translate these ideas into a secular setting. Nevertheless, there is much to ponder here, and I feel that anyone in a leadership position – or a role susceptible to burnout – could benefit from reading this, whether they have religious faith or not.

I found that the following paragraphs (of what is a long paper) resonated especially:

 

Leaders who are most likely to function poorly physically or emotionally are those who have failed to maintain a well-differentiated position. Either they have accepted the blame owing to irresponsibility and constant criticism of others, or they have gotten themselves into an overfunctioning position (that is, they tried too hard) and rushed in where angels and fools both fear to tread.[12]

Many programs often aim to cure clergy burnout by offering retreats that focus on rest and relaxation. However, Friedman asserts, “Resting and refreshment do not change triangles. Furthermore, because these programs focus on the burned-out ‘family’ member, they can actually add to his or her burden if such individuals are inclined to be soul searchers to begin with.”[13] These same soul-searching and empathetic clergy are vulnerable to seeing the overwhelming burdens that they carry for others as crosses that they ought to bear. Friedman calls this way of thinking “sheer theological camouflage for an ineffective immune system.”[14] When clergy bear other people’s burdens, they are encouraging others not to take personal responsibility. And often in bearing other people’s burdens, clergy easily tend to ignore their own “burdens” (ie. marriage issues, financial problems, etc.) and thus fail to be personally responsible for themselves.

 

London also discusses how “lament” – and, in some ways, “passing the buck onto God” – has Biblical roots:

God responds with sympathy to Jesus’ ad deum accusation and lament. Furthermore, one may easily interpret the empty tomb at the end of the Gospel as a sign of God’s ultimate response to Jesus’ lament: the resurrection (Mark 16:4-7). In the psalms of lament and in the cry of dereliction, we see that God does not respond with hostility but with a sympathetic openness to our struggle, our need for someone to blame and, in the words of Walter Brueggemann, our “genuine covenant interaction.”[34] God responds with sympathetic openness to Jesus’ ad deum accusation and then dispels the blame and emotional burden that no human could ever bear. Jesus receives the blame that humans cast upon him and then gives it to God who receives it, absorbs it and dispels it. Jesus let go of the blame by giving it to God. His cry of dereliction became his cry for differentiation. In this way, Jesus serves as a role model for leaders who receive blame from others and then need to differentiate in order to not take accusations personally. By practicing lament, leaders can turn the ad hominem accusations against themselves into ad deum accusations against God, who responds with sympathetic openness while receiving and dispelling the blame. Moreover, leaders can respond with empathy to the suffering of others, knowing that they will not have to bear the emotional burden that they have taken on, indefinitely. They can let go of the emotional burden by passing it on to God through the practice of lament.

This “passing of the buck” to God does not encourage irresponsibility. Rather, it gives the emotional baggage away to the only One who can truly bear it, thus freeing the other to take personal responsibility, without feeling weighed down by unbearable burdens. With this practice, a pastor can therefore receive blame and emotional baggage from parishioners in a pastoral setting because they can differentiate through lament. They can take the blame like Jesus because they, like Jesus, can also pass the buck to God through ad deum accusation. Eventually, the pastor will want to teach the parishioners to redirect their human need to blame onto God as well so as to occlude the cycle of scapegoating in the community.[

 

DANIEL DeFOREST LONDON

This is the final paper I wrote for the class “Leading Through Lament” with Dr. Donn Morgan at the Church Divinity School of the Pacific.

INTRODUCTION

On August 1, 2010, New York Times published an article titled “Taking a Break From the Lord’s Work,” which began with the following statements:  “Members of the clergy now suffer from obesity, hypertension and depression at rates higher than most Americans. In the last decade, their use of antidepressants has risen, while their life expectancy has fallen. Many would change jobs if they could.”[1] Although these are troubling reports, some of the statistics that came out of a study conducted by Fuller Theological Seminary in the late 1980s prove more disturbing: “80 percent [of pastors] believe that pastoral ministry is affecting their families negatively, 90 percent felt they were not adequately trained to cope with the ministry demands placed upon them, 70 percent…

View original post 5,779 more words

“Working here makes us better humans”

A daily thought from Leandro Herrero:

I have had a brilliant two day meeting with a brilliant client. One aspect of my work with organizations that I truly enjoy is to help craft the ‘Behavioural DNA’ that shapes the culture of the company. This is a set of actionable behaviours that must be universal, from the CEO to the MRO (Mail Room Officer). They also need to pass the ‘new hire test’: would you put that list in front of a prospect employee and say ‘This is us’?

There was one ‘aspirational’ sentence that I put to the test: ‘Working here makes us better human beings’.

It was met with scepticism by the large group in the meeting, initially mainly manifested through body language including the, difficult to describe, cynical smiles. The rationalists in the group jumped in hard to ‘corporatize’ the sentence. ‘Do you mean better professionals?’ The long discussion had started. Or, perhaps, ‘do you mean…’ – and here the full blown corporate Academy of Language – from anything to do with skills, talent management, empowerment to being better managers, being better leaders, and so on.

‘No, I mean better human beings. Period!’- I pushed back. Silence.

Next stage was the litany of adjectives coming from the collective mental thesaurus: fluffy, fuzzy, soft, vague…

I felt compelled to reframe the question: ‘OK, so who is against working in a place that makes you inhuman?’ Everybody. ‘OK, so who is against working in a place that makes you more human?’ Nobody. But still the defensive smiling.

It went on for a while until the group, ‘organically’, by the collective hearing of pros and cons, turned 180 degrees until everybody agreed that ‘Working in a place that makes you a better human being’ was actually very neat. But – there was a but – ‘Our leadership team won’t like it. They will say that it’s fluffy, fuzzy, soft etc…’ In the words of the group, it was not ‘them’ anymore who had a problem, it was the infamous ‘they’.

The “difficult to describe” cynical smiles are familiar… indeed I am sure I have perpetrated such smiles more than once myself!

Medicine can be a dehumanising profession, sometimes literally. Dehumanising in both directions – of patients (especially some categories of patient) and of colleagues, but also of ourselves. Of course, the rationalist part of us can pick apart what “better humans” means…

Marcus Aurelius: reflection good enough for an emperor but is it good enough for medicine?

Sati Heer-Stavert very kindly asked my permission to link to the paper I wrote a while back on Marcus Aurelius, stoicism and reflective practice – here is the resulting post, which I am very impressed by! Certainly Sati has provided an excellent framework to prompt students and learners to reflect on what reflection means and what the obstacles to it are…

UNEXAMINED MEDICINE

Reflection is an important part of training, appraisal and revalidation for doctors based in the UK. However, for many doctors the very thought of reflection can cause feelings of frustration, non-engagement or even rejection. Where did we go wrong?

Learning objectives

1. Consider the definition of reflection used in medicine

2. Understand how reflection can be assessed

3. Encourage you to read Meditations by Marcus Aurelius

Oh no! A patient has complained about your refusal to supply antibiotics for a cold. Wow! This would make a really good entry in your learning portfolio:

“That men of a certain type should behave as they do is inevitable. To wish it otherwise were to wish the fig-tree would not yield its juice. In any case, remember that in a very little while both you and he will be dead, and your very names will quickly be forgotten.”

You have to respond to…

View original post 885 more words

Presentation by Pedro de Bruyckere: Urban Myths about Learning and Technology

An excellent presentation by Pedro De Bruyckere, co-author of the recent paper on the myth of the digital native that I blogged about before… “I believe in education, I believe in teachers… but do I believe in technology in education? It depends”

Obviously these are just the slides, which can’t compete with the real thing, but clearly Pedro de Bruyckere has a rich sense of humour!

From experience to meaning...

This is the presentation I gave at the National ResearchED conference, September 9 2017. The presentation is in part based on our book Urban Myths about Learning and Education and in part based on the recent article I co-wrote with Paul Kirschner published in Teaching and Teacher Education (yes the one that was mentioned in Nature).

View original post

“a tendency to overhype fixes that later turn out to be complete turkeys”

An interesting passage on the contemporary dynamics of the quick fix, from “The Slow Fix: Solve Problems, Work Smarter and Live Better in a Fast World” by Carl Honore:

“The media add fuel to that fire. When anything goes wrong – in politics, business, a celebrity relationship – journalists pounce, dissecting the crisis with glee and demanding an instant remedy. When golfer Tiger Woods was outed as a serial philanderer, he vanished from the public eye for three months before finally breaking his silence to issue a mea culpa and announce he was in therapy for sex addiction. How did the media react to being made to wait that long? With fury and indignation. The worst sin for a public figure on the ropes is to fail to serve up an instant exit strategy.

“That impatience fuels a tendency to overhype fixes that later turn out to be complete turkeys. An engineer by training, Marco Petruzzi worked as a globetrotting management consultant for 15 years before abandoning the corporate world to build better schools for the poor in the United States. We will meet him again later in the book, but for now consider his attack on our culture of hot air. ‘In the past, hard-working entrepreneurs developed amazing stuff over time, and they did it, they didn’t just talk about it, they did it,’ he says. ‘We live in a world now where talk is cheap and bold ideas can create massive wealth without ever having to deliver. There are multi-billionaires out there who never did anything but capture the investment cycle and the spin cycle at the right moment, which just reinforces a culture where people don’t want to put in the time and effort to come up with real and lasting solutions to problems. Because if they play their cards right, and don’t worry about the future, they can get instant financial returns’

#Grief on a #Booterstown plaque: “A particularly bright, holy and gifted child” – the life and losses of Richard Robert #Madden

Richard Robert Madden was one of those polymathic doctors of the 19th century whose medical career, as I observe in passing here, was almost incidental to a life packed with incident and scholarship (though some aspects of that scholarship were clearly disputed). Nevertheless, he evidently rose through the institutional ranks of medical memberships and fellowships – and became a “convert” to homeopathy to boot (at a time when, after all, “mainstream” medicine was not exactly evidence-based itself).

For all these achievements, there is a keen poignancy to this plaque. I’ve read (I must track down the source) that the common contemporary belief that, in previous centuries, because of high child mortality, parents did not have the same emotional reaction to the loss of a child as we do now is in fact a myth (I think it was in a rebuttal to one of the historians cited by Neil Postman in his book The Disappearance of Childhood).

Séamus Sweeney

[Photograph of the memorial plaque]

In the Church of the Assumption, Booterstown, Dublin, we find the above poignant plaque. Here is the text, as the photograph turns out to be a little blurry:

MADDEN. Of your charity pray for the soul of
Richard Robert Madden, M.D.
formerly Colonial Secretary
of Western Australia &c. “A man who loved his Country.”
Author of “History of United Irishmen” and many other works.
Remarkable for Talents Piety, and Rectitude, the 21st and last surviving son of
Edward Madden, born in Dublin August 20th 1798 died at Booterstown Feb 5th 1886
and interred in Donnybrook Churchyard
also for the soul of his relict Mrs Harriet T Madden, the 21st and last surviving child of
John Elmslie Esq. Born in London August 4th 1801
converted by a singular grace to the Catholic Faith in Cuba (circa) 1837
died at Booterstown Feb 7th 1888
A woman of rare culture, endowments and piety, a…

View original post 1,043 more words

“The myths of the digital native and the multitasker”

One common rhetorical device heard in technology circles – including eHealth circles – is the idea that those born after 1980, or maybe 1984, or maybe 1993, or maybe 2000, or maybe 2010 (you get the picture) are “digital natives” – everyone else is a “digital immigrant”. In the current edition of Teaching and Teacher Education, Kirschner and de Bruyckere have an excellent paper on this myth, and on the related myth of multitasking.

The “highlights” of the paper (I am not sure if these are selected by the authors or by the editors – UPDATE: see comment by Paul Kirschner below!) are pretty much to the point:

Highlights

Information-savvy digital natives do not exist.

Learners cannot multitask; they task switch which negatively impacts learning.

Educational design assuming these myths hinders rather than helps learning.

The full article is available online via subscription or library access, and this recent post on the Nature blog discusses this paper and others on this myth. This is Kirschner and de Bruyckere’s abstract:

Current discussions about educational policy and practice are often embedded in a mind-set that considers students who were born in an age of omnipresent digital media to be fundamentally different from previous generations of students. These students have been labelled digital natives and have been ascribed the ability to cognitively process multiple sources of information simultaneously (i.e., they can multitask). As a result of this thinking, they are seen by teachers, educational administrators, politicians/policy makers, and the media to require an educational approach radically different from that of previous generations. This article presents scientific evidence showing that there is no such thing as a digital native who is information-skilled simply because (s)he has never known a world that was not digital. It then proceeds to present evidence that one of the alleged abilities of students in this generation, the ability to multitask, does not exist and that designing education that assumes the presence of this ability hinders rather than helps learning. The article concludes by elaborating on possible implications of this for education/educational policy.

The paper is one of those trenchantly entertaining ones academia throws up every so often. For instance here the authors are on the origins of the “digital native” terminology (and “homo zappiens”, a new one on me):

According to Prensky (2001), who coined the term, digital natives constitute an ever-growing group of children, adolescents, and nowadays young adults (i.e., those born after 1984; the official beginning of this generation) who have been immersed in digital technologies all their lives. The mere fact that they have been exposed to these digital technologies has, according to him, endowed this growing group with specific and even unique characteristics that make its members completely different from those growing up in previous generations. The name given to those born before 1984 – the year that the 8-bit video game saw the light of day, though others use 1980 – is digital immigrant. Digital natives are assumed to have sophisticated technical digital skills and learning preferences for which traditional education is unprepared and unfit. Prensky coined the term, not based upon extensive research into this generation and/or the careful study of those belonging to it, but rather upon a rationalisation of phenomena and behaviours that he had observed. In his own words, he saw children “surrounded by and using computers, videogames, digital music players, video cams, cell phones, and all the other toys and tools of the digital age” (2001, p.1). Based only upon these observations, he assumed that these children understood what they were doing, were using their devices effectively and efficiently, and based upon this that it would be good to design education that allows them to do this. Prensky was not alone in this. Veen and Vrakking (2006), for example, went a step further coining the catchy name homo zappiens to refer to a new breed of learners that has developed – without either help from or instruction by others – those metacognitive skills necessary for enquiry-based learning, discovery-based learning, networked learning, experiential learning, collaborative learning, active learning, self-organisation and self-regulation, problem solving, and making their own implicit (i.e., tacit) and explicit knowledge explicit to others.

The saw that children are invariably more tech-savvy than their parents is also a myth:

Looking at pupils younger than university students, the large-scale EU Kids Online report (Livingstone, Haddon, Görzig, & Ólafsson, 2011), placed the term ‘digital native’ in first place on its list of the ten biggest myths about young people and technology. They state: “Children knowing more than their parents has been exaggerated … Talk of digital natives obscures children’s need for support in developing digital skills” and that “… only one in five [children studied] used a file-sharing site or created a pet/avatar and half that number wrote a blog … While social networking makes it easier to upload content, most children use the internet for ready-made, mass produced content” (p. 42). While the concept of the digital native explicitly and/or implicitly assumes that the current generation of children is highly digitally literate, it is then rather strange to note that many curricula in many countries on many continents (e.g., North America, Europe) see information and technology literacy as 21st century skills that are core curriculum goals at the end of the educational process and that need to be acquired.

Two more recent studies show that the supposed digital divide is a myth in itself. A study carried out by Romero, Guitert, Sangrà, and Bullen (2013) found that it was, in fact, older students (>30 years and thus born before 1984) who exhibited the characteristics attributed to digital natives more than their younger counterparts. In their research, 58% of their students were older than 30 years who “show the characteristics of this [Net Generation profile] claimed by the literature because, on analysing their habits, they can be labelled as ICT users more than digital immigrants” (p. 176). In a study on whether digital natives are more ‘technology savvy’ than their middle school science teachers, Wang, Hsu, Campbell, Coster, and Longhurst (2014) conclude that this is not the case.

The authors are not arguing that curricula and teaching methods do not need to change and evolve, but that the myth of the digital native should not be the reason for doing so:

Finally, this non-existence of digital natives makes clear that one should be wary about claims to change education because this generation of young people is fundamentally different from previous generations of learners in how they learn/can learn because of their media usage (De Bruyckere, Hulshof, & Kirschner, 2015). The claim of the existence of a generation of digital natives, thus, cannot be used as either a motive or an excuse to implement pedagogies such as enquiry-based learning, discovery-based learning, networked learning, experiential learning, collaborative learning, active learning, self-organisation and self-regulation or problem solving as Veen and Vrakking (2006) argued. This does not mean education should neither evolve nor change, but rather that proposed changes should be evidence informed both in the reasons for the change and the proposed changes themselves, something that ‘digital natives’ is not.

The non-existence of digital natives is definitely not the ‘reason’ why students today are disinterested at and even ‘alienated’ by school. This lack of interest and alienation may be the case, but the causes stem from quite different things such as the fact that diminished concentration and the loss of the ability to ignore irrelevant stimuli may be attributed to constant task switching between different devices (Loh & Kanai, 2016; Ophir, Nass, & Wagner, 2009; Sampasa-Kanyinga & Lewis, 2015). This, however, is the topic of a different article.

The paper also deals with multi-tasking. First, they examine the nature of attention: “multi-tasking” is an impossibility from this point of view, unless the tasks are automatic behaviours. They cite a range of research which, unsurprisingly enough, links heavy social media usage (especially where the user instantly replies to stimuli) with poorer educational outcomes:

Ophir et al. (2009) in a study in which university students who identified themselves as proficient multitaskers were asked to concentrate on rectangular stimuli of one colour on a computer monitor and ignore irrelevant stimuli entering their screen of a different colour observed that

heavy media multitaskers are more susceptible to interference from irrelevant environmental stimuli and from irrelevant representations in memory. This led to the surprising result that heavy media multitaskers performed worse on a test of task-switching ability, likely because of reduced ability to filter out interference from the irrelevant task set (p. 15583).

Ophir et al. (2009) concluded that faced with distractors, heavy multitaskers were slower in detecting changes in visual patterns, were more susceptible to false recollections of the distractors during a memory task, and were slower in task-switching. Heavy multitaskers were less able than light/occasional multitaskers to volitionally restrain their attention only to task-relevant information.

The authors specifically urge caution about the drive for students to bring their own devices to school.

Why is this paper so important? As the authors show (and as the author of the Nature blog post linked to above also observes), this is not a new finding. There are many pieces out there, both academic and journalistic, on the myth of the digital native. This paper specifically locates the discussion in education and in teacher training (they also say much on the issue of supposedly “digital native” teachers) and is a trenchant warning about the magical thinking that has grown up around technology.

There are obvious parallels with health and technology. The messianic, evangelical approach to healthtech is replete with its own assumptions about digital natives, and with magical thinking about how easily they navigate online worlds. Using a handful of social media tools or apps with visual interactive systems does not translate into a deep knowledge of the online world, or indeed wisdom about it (or anything else).