Theranos, hype, fraud, solutionism, and eHealth


Anyone who has had to either give or take a blood sample has surely thought “there must be a better way.” The promise of replacing the pain of the needle and the seeming waste of multiple blood vials has an immediate appeal. If there was a technology that could do away with both, who wouldn’t want it to succeed?

Theranos was one of the hottest health tech startups of the last decade. Indeed, its USP – that existing blood testing could be replaced by a pinprick – would have been a genuinely disruptive one.

Theranos was founded in 2003 by Elizabeth Holmes, then 19 years old, who dropped out of studying engineering at Stanford in order to start the company. In 2015 she was named by Forbes magazine as the youngest self-made female billionaire in history, with an estimated worth of $4.5 billion. In June 2016, Forbes revised its estimate to zero. What happened?

At the time of writing, Holmes has been charged with “massive fraud” by the US Securities and Exchange Commission, and has agreed to pay a $500,000 fine and accept a ban from serving as a company director or officer for ten years. It is unclear whether a criminal investigation is also proceeding.

At its height, Theranos had a seemingly stellar team of advisors. The board of directors included such figures as Henry Kissinger, current US Secretary of Defence James “Mad Dog” Mattis, and various former US Senators and business figures. In early 2016, in response to criticism that, whatever their other qualities, the clinical expertise of Mad Dog Mattis et al was perhaps light, the company announced a medical advisory board including four medical doctors and six professors.


Elizabeth Holmes’ fall began in October 2015, when the Wall Street Journal’s John Carreyrou published an article detailing discrepancies between Theranos’ claims and the actual performance of its technology. This followed a 2014 Fortune cover story by Roger Parloff, who subsequently wrote a thoughtful piece on how he had been misled, but also how he had missed a hint that all was not as it seemed.


Theranos’ claims to be able to perform over 200 different investigations on a pinprick of blood were not borne out; and it turned out that other companies’ products were used for the analysis of many samples.


The fall of Theranos has led to some soul-searching among the health tech start-up community. Bill Rader, an entrepreneur and columnist at Forbes, wrote on “What Entrepreneurs Can Learn From Theranos”:


     I have been watching first in awe of perceived accomplishments, and then feeling burned, then later vindicated, when the actual facts were disclosed. Don’t get me wrong, I really wanted their efforts to have been both real and successful – they would have changed healthcare for the better. Now, that seems unlikely to be the case.


By now, almost everyone has heard of Holmes and her company, and how she built Theranos on hype and secrecy, and pushed investors into a huge, $9 billion valuation. Now everyone in the industry is talking about this and lawsuits are flying.

Just a couple months ago, a Silicon Valley venture capitalist appeared on CNBC’s “Closing Bell” and instead of talking about the elephant in the room, he diverted to a defense strategy for the Theranos CEO.


He claimed Elizabeth Holmes had been “totally attacked,” and that she is “a great example of maybe why the women are so frustrated.”

He also went on to say, “This is a great entrepreneur who wants to change health care as we know it.”


The last statement was the strangest thing he said. Wouldn’t we all like to change things for the better? But “wanting” and “doing” are two different things.


Rader’s piece is worth reading for clinicians and IT professionals involved in health technology. The major lesson he draws is the need for transparency. He describes being put under pressure by his own board: why wasn’t he able to raise as much money as Theranos? Theranos’ methods may well make life more difficult for start-ups in the future, and Rader fears that legitimate health tech may suffer:


Nothing good has come of the mess created by Theranos’ secrecy, or as some have characterized, deception. The investor has been burned, the patient has been left with unfilled promises (yet again) and life science industry start-ups, like my company, have been left with even more challenges in raising much needed investment. And worst of all, diagnostic start-ups in general are carrying an unearned stigma.


In this interesting piece, Christina Farr notes that the biggest biotech and health care venture capital firms did not invest in Theranos, nor did Silicon Valley firms with actual clinical practices attached. As Farr writes, the Theranos story reflects systemic issues in the funding of innovation, and the nature of hype. And one unfortunate consequence may be an excessive focus on Elizabeth Holmes; a charismatic figure lauded unrealistically at one moment is ripe to become a scapegoat for all the ills of an industry the next.


The “stealth mode” in which Theranos operated for the first ten years of its existence is incompatible with the values of healthcare and of the science on which it is based. Farr points out how unlikely it would be that a biotech firm vetting Theranos would let the lack of peer-reviewed studies pass. The process of peer review and building evidence is key to the modern practice of medicine.

Another lesson is simply to beware of what one wants to be true. As noted above, the idea of Theranos’ technology is highly appealing. The company, and Holmes, sailed on an ocean of hype and admiring magazine covers. The rhetoric of disrupting and revolutionising healthcare featured prominently, as the 2014 Fortune magazine cover story reveals.



Perhaps a healthy scepticism of claims to “revolutionise” health care will be one consequence of the Theranos affair, and a more robustly questioning attitude to the solutionism that plagues technology discourse in general.

Clinicians and health IT professionals should be open to innovation and new ideas, but should also hold on to their professional duty to be confident that new technologies will actually benefit the patient.

#LivingLibrary – College of Psychiatrists of Ireland event for #GreenRibbon month, 31st May 2018

I will be speaking as a living book at this event:

The College is delighted to announce our 4th annual event in partnership with See Change for Green Ribbon Month – A Living Library
When it comes to mental health everyone has a story to share, and we find comfort, empathy and compassion in shared experiences. Social contact is known to be one of the most effective ways of reducing mental health-related stigma and discrimination, so with this in mind, and to mark Green Ribbon month, the College is delighted to announce our ‘Living Library’ event, a library come to life in the outdoors!

At our library the ‘books’ are a little different: they are people; people with different experiences and stories to tell related to mental health including those who have experienced mental health issues and illness, their family members and carers, and the psychiatrists who help them towards the path of recovery. Mental health stigma too often creates discrimination and misunderstanding so we want to give members of the public the opportunity to connect and engage with psychiatrists and people they may not normally have the occasion to speak with.

The aim is to better understand the lived experiences of others who have experienced or facilitated recovery from mental illness and distress and to challenge their own assumptions, prejudices and stereotypes. We invite you to ‘read’ the human books through conversation and gain understanding of their experiences.

For Green Ribbon Month Let’s End the Stigma by not judging a book by its cover and develop a greater understanding of each other’s stories.

Thursday 31st May 2018
12.30pm – 2.30pm
St Stephen’s Green, Dublin

This is a Free Event, but space is limited. Book your place here.

“Mental health apps offer a head start on recovery” – Irish Times, 18/01/18

Here is a piece by Sylvia Thompson on a recent First Fortnight panel discussion I took part in on apps in mental health.

Dr Séamus Mac Suibhne, psychiatrist and member of the Health Service Executive research technology team says that while the task of vetting all apps for their clinical usefulness is virtually impossible, it would be helpful if the Cochrane Collaboration [a global independent network of researchers] had a specific e-health element so it could partner with internet companies to give a meaningful rubber stamp to specific mental health apps.

“There is potential for the use of mental health apps to engage people with diagnosed conditions – particularly younger patients who might stop going to their outpatients appointments,” says Dr Mac Suibhne. However, he cautions against their use as a replacement for therapy. “A lot of apps claim to use a psychotherapeutic approach but psychotherapy is about a human encounter and an app can’t replace that,” he says.

Here are some other posts from this blog on these issues:

Here is a post on mental health apps and the military.

Here is a general piece on evidence, clinical credibility and mental health apps.

Here is my rather sceptical take on a Financial Times piece on smartphones and healthcare.

Here is a piece on the dangers (and dynamics) of hype in health care tech.

Here is a post on a paper on the quality of smartphone apps for panic disorder.

The myth of digital natives and health IT 

I have a post on the CCIO website on the Digital Native myth and Health IT.

The opening paragraph: 

We hear a lot about digital natives. They are related to the similarly much-mentioned millennials; possibly they are exactly the same people (although as I am going to argue that digital natives do not exist, perhaps millennials will also disappear in a puff of logic). Born after 1980, or maybe after 1984, or maybe after 1993, or maybe after 2007, or maybe after 2010, the digital native grew up with IT, or maybe grew up with the internet, or grew up with social media, or at any rate grew up with something that the prior generation – the “digital immigrants” (born a couple of years before the first cut off above, that’s where I am too) – didn’t.

#digitalnatives and #edtech and #wollongong – The ‘digital natives’ debate: A critical review of the evidence. British Journal of Educational Technology, Bennett et al, Feb 2008

I blogged the other day on a recent paper on the myth of the digital native. Here is another paper, by Sue Bennett, Karl Maton and Lisa Kervin, from nearly a decade ago, on the same theme – and equally trenchant:

The idea that a new generation of students is entering the education system has excited recent attention among educators and education commentators. Termed ‘digital natives’ or the ‘Net generation’, these young people are said to have been immersed in technology all their lives, imbuing them with sophisticated technical skills and learning preferences for which traditional education is unprepared. Grand claims are being made about the nature of this generational change and about the urgent necessity for educational reform in response. A sense of impending crisis pervades this debate. However, the actual situation is far from clear. In this paper, the authors draw on the fields of education and sociology to analyse the digital natives debate. The paper presents and questions the main claims made about digital natives and analyses the nature of the debate itself. We argue that rather than being empirically and theoretically informed, the debate can be likened to an academic form of a ‘moral panic’. We propose that a more measured and disinterested approach is now required to investigate ‘digital natives’ and their implications for education.

On an entirely different note, the authors are/were affiliated with the University of Wollongong. Recent days have seen the death of Geoff Mack, who wrote the song “I’ve Been Everywhere”, originally a list of Australian placenames:

The song inspired versions internationally – the best known being Johnny Cash’s and The Simpsons’ – but the wittiest alternative version is this (NB – Dapto is a few miles from Wollongong).

Anyway, back to the digital natives. Bennett et al begin with a quote from Marcel Proust:

The one thing that does not change is that at any and every time it appears that there have been ‘great changes’.
Marcel Proust, Within a Budding Grove

The authors summarise what a digital native is supposed to be like – and the not exactly extensive evidence base for their existence:

The claim made for the existence of a generation of ‘digital natives’ is based on two main assumptions in the literature, which can be summarised as follows:

1. Young people of the digital native generation possess sophisticated knowledge of and skills with information technologies.
2. As a result of their upbringing and experiences with technology, digital natives have particular learning preferences or styles that differ from earlier generations of students.

In the seminal literature on digital natives, these assertions are put forward with limited empirical evidence (eg, Tapscott, 1998), or supported by anecdotes and appeals to common-sense beliefs (eg, Prensky, 2001a). Furthermore, this literature has been referenced, often uncritically, in a host of later publications (Gaston, 2006; Gros, 2003; Long, 2005; McHale, 2005; Skiba, 2005). There is, however, an emerging body of research that is beginning to reveal some of the complexity of young people’s computer use and skills.

No one denies that a lot of young people use a lot of technology – but not all:

In summary, though limited in scope and focus, the research evidence to date indicates that a proportion of young people are highly adept with technology and rely on it for a range of information gathering and communication activities. However, there also appears to be a significant proportion of young people who do not have the levels of access or technology skills predicted by proponents of the digital native idea. Such generalisations about a whole generation of young people thereby focus attention on technically adept students. With this comes the danger that those less interested and less able will be neglected, and that the potential impact of socio-economic and cultural factors will be overlooked. It may be that there is as much variation within the digital native generation as between the generations.

It is often suggested that children who are merrily exploring the digital world are ground down with frustration by not having the same access to computers in school. This is part of a more general demand for transformation, with rhetoric familiar to the health IT world (the word “disruptive” in its modern usage had not quite caught on in 2008). As is often the case, the empirical evidence (and also, I would say, a certain degree of common sense) is not with the disrupters:

The claim we will now examine is that current educational systems must change in response to a new generation of technically adept young people. Current students have been variously described as disappointed (Oblinger, 2003), dissatisfied (Levin & Arafeh, 2002) and disengaged (Prensky, 2005a). It is also argued that educational institutions at all levels are rapidly becoming outdated and irrelevant, and that there is an urgent need to change what is taught and how (Prensky, 2001a; Tapscott, 1998). For example, Tapscott (1999) urges educators and authorities to ‘[g]ive students the tools, and they will be the single most important source of guidance on how to make their schools relevant and effective places to learn’ (p. 11). Without such a transformation, commentators warn, we risk failing a generation of students and our institutions face imminent obsolescence.

However, there is little evidence of the serious disaffection and alienation among students claimed by commentators. Downes’ (2002) study of primary school children (5–12 years old) found that home computer use was more varied than school use and enabled children greater freedom and opportunity to learn by doing. The participants did report feeling limited in the time they were allocated to use computers at school and in the way their use was constrained by teacher-directed learning activities. Similarly, Levin and Arafeh’s (2002) study revealed students’ frustrations at their school Internet use being restricted, but crucially also their recognition of the school’s in loco parentis role in protecting them from inappropriate material. Selwyn’s (2006) student participants were also frustrated that their freedom of use was curtailed at school and ‘were well aware of a digital disconnect but displayed a pragmatic acceptance rather than the outright alienation from the school that some commentators would suggest’ (p. 5).

In 2008, Bennett et al summarised issues relating to students’ actual (rather than perceived) technical adeptness and net savviness similar to those raised by the 2016 authors:

Furthermore, questions must be asked about the relevance to education of the everyday ICTs skills possessed by technically adept young people. For example, it cannot be assumed that knowing how to look up ‘cheats’ for computer games on the Internet bears any relation to the skills required to assess a website’s relevance for a school project. Indeed, existing research suggests otherwise. When observing students interacting with text obtained from an Internet search, Sutherland-Smith (2002) reported that many were easily frustrated when not instantly gratified in their search for immediate answers and appeared to adopt a ‘snatch and grab philosophy’ (p. 664). Similarly, Eagleton, Guinee and Langlais (2003) observed middle-school students often making ‘hasty, random choices with little thought and evaluation’ (p. 30).

Such research observes shallow, random and often passive interactions with text, which raise significant questions about what digital natives can actually do as they engage with and make meaning from such technology. As noted by Lorenzo and Dziuban (2006), concerns over students’ lack of critical thinking when using Internet-based information sources imply that ‘students aren’t as net savvy as we might have assumed’ (p. 2). This suggests that students’ everyday technology practices may not be directly applicable to academic tasks, and so education has a vitally important role in fostering information literacies that will support learning.

Again, this is a paper I could quote bits from all day – so here are a couple of paragraphs from towards the end that summarise their (and my) take on the digital natives:

Neither dismissive scepticism nor uncritical advocacy enable understanding of whether the phenomenon of digital natives is significant and in what ways education might need to change to accommodate it. As we have discussed in this paper, research is beginning to expose arguments about digital natives to critical enquiry, but much more needs to be done. Close scrutiny of the assumptions underlying the digital natives notion reveals avenues of inquiry that will inform the debate. Such understanding and evidence are necessary precursors to change.

The claim that there is a distinctive new generation of students in possession of sophisticated technology skills and with learning preferences for which education is not equipped to support has excited much recent attention. Proponents arguing that education must change dramatically to cater for the needs of these digital natives have sparked an academic form of a ‘moral panic’ using extreme arguments that have lacked empirical evidence.

Finally, after posting the prior summary of Kirschner and De Bruyckere’s paper, I searched the hashtag #digitalnatives on Twitter and – self-promotingly – replied to some of the original tweeters with a link to the paper (interestingly, quite a few #digitalnatives tweets were links to discussions of the Kirschner/De Bruyckere paper). Some were very receptive, but others were markedly defensive. Obviously a total stranger coming along and pedantically pointing out that your hashtag is about something that doesn’t exist may not be the most polite way of interacting on Twitter – but it also seems that quite a lot of us are quite attached to the myth of the digital native.

“The myths of the digital native and the multitasker”

One common rhetorical device heard in technology circles – including eHealth circles – is the idea that those born after 1980, or maybe 1984, or maybe 1993, or maybe 2000, or maybe 2010 (you get the picture) are “digital natives” – everyone else is a “digital immigrant”. In the current edition of Teaching and Teacher Education, Kirschner and De Bruyckere have an excellent paper on this myth, and the related myth of multitasking.

The “highlights” of the paper (I am not sure if these are selected by the authors or by the editors – UPDATE: see comment by Paul Kirschner below!) are pretty much to the point:

Highlights

Information-savvy digital natives do not exist.

Learners cannot multitask; they task switch which negatively impacts learning.

Educational design assuming these myths hinders rather than helps learning.

The full article is available online via subscription or library access, and this recent post on the blog of Nature discusses this paper and others on this myth. This is Kirschner and De Bruyckere’s abstract:

Current discussions about educational policy and practice are often embedded in a mind-set that considers students who were born in an age of omnipresent digital media to be fundamentally different from previous generations of students. These students have been labelled digital natives and have been ascribed the ability to cognitively process multiple sources of information simultaneously (i.e., they can multitask). As a result of this thinking, they are seen by teachers, educational administrators, politicians/policy makers, and the media to require an educational approach radically different from that of previous generations. This article presents scientific evidence showing that there is no such thing as a digital native who is information-skilled simply because (s)he has never known a world that was not digital. It then proceeds to present evidence that one of the alleged abilities of students in this generation, the ability to multitask, does not exist and that designing education that assumes the presence of this ability hinders rather than helps learning. The article concludes by elaborating on possible implications of this for education/educational policy.

The paper is one of those trenchantly entertaining ones academia throws up every so often. For instance, here are the authors on the origins of the “digital native” terminology (and “homo zappiens”, a new one on me):

According to Prensky (2001), who coined the term, digital natives constitute an ever-growing group of children, adolescents, and nowadays young adults (i.e., those born after 1984; the official beginning of this generation) who have been immersed in digital technologies all their lives. The mere fact that they have been exposed to these digital technologies has, according to him, endowed this growing group with specific and even unique characteristics that make its members completely different from those growing up in previous generations. The name given to those born before 1984 – the year that the 8-bit video game saw the light of day, though others use 1980 – is digital immigrant. Digital natives are assumed to have sophisticated technical digital skills and learning preferences for which traditional education is unprepared and unfit. Prensky coined the term, not based upon extensive research into this generation and/or the careful study of those belonging to it, but rather upon a rationalisation of phenomena and behaviours that he had observed. In his own words, he saw children “surrounded by and using computers, videogames, digital music players, video cams, cell phones, and all the other toys and tools of the digital age” (2001, p.1). Based only upon these observations, he assumed that these children understood what they were doing, were using their devices effectively and efficiently, and based upon this that it would be good to design education that allows them to do this. Prensky was not alone in this. Veen and Vrakking (2006), for example, went a step further coining the catchy name homo zappiens to refer to a new breed of learners that has developed – without either help from or instruction by others – those metacognitive skills necessary for enquiry-based learning, discovery-based learning, networked learning, experiential learning, collaborative learning, active learning, self-organisation and self-regulation, problem solving, and making their own implicit (i.e., tacit) and explicit knowledge explicit to others.

The saw that children are invariably more tech-savvy than their parents is also a myth:

Looking at pupils younger than university students, the large-scale EU Kids Online report (Livingstone, Haddon, Görzig, & Ólafsson, 2011) placed the term ‘digital native’ in first place on its list of the ten biggest myths about young people and technology. They state: “Children knowing more than their parents has been exaggerated … Talk of digital natives obscures children’s need for support in developing digital skills” and that “… only one in five [children studied] used a file-sharing site or created a pet/avatar and half that number wrote a blog … While social networking makes it easier to upload content, most children use the internet for ready-made, mass produced content” (p. 42). While the concept of the digital native explicitly and/or implicitly assumes that the current generation of children is highly digitally literate, it is then rather strange to note that many curricula in many countries on many continents (e.g., North America, Europe) see information and technology literacy as 21st century skills that are core curriculum goals at the end of the educational process and that need to be acquired.

Two more recent studies show that the supposed digital divide is a myth in itself. A study carried out by Romero, Guitert, Sangrà, and Bullen (2013) found that it was, in fact, older students (>30 years and thus born before 1984) who exhibited the characteristics attributed to digital natives more than their younger counterparts. In their research, 58% of their students were older than 30 years who “show the characteristics of this [Net Generation profile] claimed by the literature because, on analysing their habits, they can be labelled as ICT users more than digital immigrants” (p. 176). In a study on whether digital natives are more ‘technology savvy’ than their middle school science teachers, Wang, Hsu, Campbell, Coster, and Longhurst (2014) conclude that this is not the case.

The authors are not arguing that curricula and teaching methods do not need to change and evolve, but that the myth of the digital native should not be the reason for doing so:

Finally, this non-existence of digital natives makes clear that one should be wary about claims to change education because this generation of young people is fundamentally different from previous generations of learners in how they learn/can learn because of their media usage (De Bruyckere, Hulshof, & Kirschner, 2015). The claim of the existence of a generation of digital natives, thus, cannot be used as either a motive or an excuse to implement pedagogies such as enquiry-based learning, discovery-based learning, networked learning, experiential learning, collaborative learning, active learning, self-organisation and self-regulation or problem solving as Veen and Vrakking (2006) argued. This does not mean education should neither evolve nor change, but rather that proposed changes should be evidence informed both in the reasons for the change and the proposed changes themselves, something that ‘digital natives’ is not.

The non-existence of digital natives is definitely not the ‘reason’ why students today are disinterested at and even ‘alienated’ by school. This lack of interest and alienation may be the case, but the causes stem from quite different things such as the fact that diminished concentration and the loss of the ability to ignore irrelevant stimuli may be attributed to constant task switching between different devices (Loh & Kanai, 2016; Ophir, Nass, & Wagner, 2009; Sampasa-Kanyinga & Lewis, 2015). This, however, is the topic of a different article.

The paper also deals with multitasking. Firstly, the authors examine the nature of attention: “multitasking” is an impossibility from this point of view, unless the tasks are automatic behaviours. They cite a range of research which, unsurprisingly enough, links heavy social media usage (especially with the user instantly replying to stimuli) with poorer educational outcomes:

Ophir et al. (2009), in a study in which university students who identified themselves as proficient multitaskers were asked to concentrate on rectangular stimuli of one colour on a computer monitor and ignore irrelevant stimuli entering their screen of a different colour, observed that

heavy media multitaskers are more susceptible to interference from irrelevant environmental stimuli and from irrelevant representations in memory. This led to the surprising result that heavy media multitaskers performed worse on a test of task-switching ability, likely because of reduced ability to filter out interference from the irrelevant task set (p. 15583).

Ophir et al. (2009) concluded that faced with distractors, heavy multitaskers were slower in detecting changes in visual patterns, were more susceptible to false recollections of the distractors during a memory task, and were slower in task-switching. Heavy multitaskers were less able than light/occasional multitaskers to volitionally restrain their attention only to task relevant information.

The authors specifically urge caution about the drive for students to bring their own devices to school.

Why is this paper so important? As the authors show (and as the author of the Nature blog post linked to above also observes), this is not a new finding. There are many pieces out there, both academic and journalistic, on the myth of the digital native. This paper specifically locates the discussion in education and in teacher training (the authors say much also on the issue of supposedly “digital native” teachers) and is a trenchant warning against the magical thinking that has grown up around technology.

There are obvious parallels with health and technology. The messianic, evangelical approach to healthtech is replete with its own assumptions about digital natives, and magical thinking about how easily they navigate online worlds. Using a handful of social media tools or apps with visual interactive systems does not translate into a deep knowledge of the online world, or indeed a wisdom about it (or anything else).

“Development is always going to destabilize a fragile balance of social forces.”

Via the work of John Adams, I have had some familiarity with the Douglas-Wildavsky Cultural Theory of Risk. Like this reviewer, I find the Douglas/Wildavsky treatment of environmentalism rather crude, but their overall cultural typology of risk stimulating. As the reviewer points out:

Most readers will be struck not by the abstract theory but by its application to the rise of environmentalism. This emphasis is unfortunate. The attempt to “explain” environmentalism makes a few good points, but on the whole this part of the book is crude, shortsighted, and snide. On the other hand, the sections that consider the relationship between risk and culture on a more fundamental level are sensitive and thoughtful.
Even at its best, Risk and Culture is not entirely successful at explaining the paradox of risk – the problem of managing the unknown – but parts of the book deserve to be read seriously by people interested in the problem of risk, including environmental lawyers.



I am now reading Mary Douglas directly, currently her Cultures and Crises: Understanding Risk and Resolution. Although she has a prose style that sometimes grates, and I am wary of possibly being unaware of technical anthropological issues that may be taken for granted, there is much to enjoy and think about.

Here is a brief quote from one essay – “Traditional Culture, Let’s Hear No More About It” – which follows a passage about the occasional pitting against each other of development and “traditional culture” (usually, under western eyes, to the detriment of traditional culture):

Development is always going to destabilise a fragile balance of social forces. The people are understandably reluctant to do the gruelling hard work and accept the diversion of resources if the resulting prosperity will only line the pockets of outsiders. Furthermore, if it is going to erode the community’s accumulated store of trust, and dissolve their traditional readiness to collaborate, the well-being of the community may be worse after development than before. There certainly is inherent ambiguity about the moral case. At least we can say that what stops development is not cultural traditionalism so much as the way it arrives, how it is organised.

This applies – in spades – to the many, many “cultural change” / “transformation” etc. projects that health services become the subject of. The suspicion that sacrifice and hard work on the part of staff will benefit only a narrow few (the Minister getting good headlines, various outside consultancies, higher management) surely underlies at least some of the cynicism about such projects that is undoubtedly prevalent.


#OceanOptimism, powerlessness, hope, and change.

The current BBC Wildlife Magazine has a fascinating article by Elin Kelsey, of the Ocean Optimism Project, on how media-fuelled environmental despair and nihilism ends up demoralising people to the degree that positive action seems impossible. She cites much research on the “finite pool of worry” and the paralysing effect of despair, and on the power of optimism to reverse this trend. The article isn’t available online, but in the post below from my other blog I highlight relevant passages from a Kelsey piece in Smithsonian Magazine on similar themes.

This article is obviously focused on ecology, but it is all too relevant to our healthcare systems. For similar reasons to those Kelsey ascribes to environmentalists who are wary of being overly focused on good news, frontline workers in the health service naturally tend to focus on what is wrong, what is proving impossible, what needs to change. This is necessary, but can become an overwhelming counsel of nihilism, fostering cynicism and very often helping to entrench negative practices.

This is very relevant to the various themes on values, morale, “blame culture”, and the possibility of positive change within not only the HSE but any healthcare organisation.


The current issue of BBC Wildlife Magazine has a fascinating cover story by Elin Kelsey on hope and optimism versus despair in how we think about the environment. Essentially, much media discourse on the environment tends to be gloomy, doom-laden, and generally despairing. Kelsey cites a wide range of research on how this negativity affects how we think about the environment and our beliefs about what can be done – and therefore what is done – to improve things. The full article is not available online. This article from Smithsonian Magazine is briefer, but captures her idea:

Things are far more resilient than I ever imagined. Me, green sea turtles, coral reefs blown to bits by atomic bombs. In a twist of fate that even surprised scientists, Bikini Atoll, site of one of the world’s biggest nuclear explosions, is now a scuba diver’s paradise. Bikini Atoll located in the Pacific’s…


Hype, The Life Study and trying to do too much

A while back I reviewed Helen Pearson’s “The Life Project” in the TLS. I had previously blogged on the perils of trying to do too much and on mission creep and overload.

From the original draft of the review (published version differed slightly):

Pearson is laudably clear that the story of the birth cohorts is also a study of failure; the failure of the NHS to improve the inequality of health outcomes between social classes, the failure of educational reforms and re-reforms to broach the similar academic achievement gap. Indeed, the book culminates in a failure which introduces a darker tone to the story of the birth cohort studies.

Launched in January 2015, the Life Study was supposed to follow 80,000 babies born in 2015 and was intended to be a birth cohort for the “Olympic Children.” It had a government patron in David Willetts, whose departure from politics in May 2015 perhaps set the stage for its collapse. Overstuffed antenatal clinics and a lack of health visitors meant that the Life Study’s participants would have to self-select. The optimistic scenario had 16,000 women signing up in the first eighteen months; in the first six months, 249 women did. By October 2015, just as Pearson was completing five years of work on this book, the study had officially been abandoned.

Along with the cancellation of the National Institute for Health’s National Children’s Study in December 2014, this made it clear that birth cohorts have been victims of their own success. An understandable tendency to include as much potentially useful information as possible seemed to have created massive, and ultimately unworkable cohorts. The Life Study would have generated vast data sets: “80,000 babies, warehouses of stool samples of placentas, gigabytes of video clips, several hundred thousand questionnaires and much more” (the history of the 1982 study repeated itself, perhaps.) Then there is the recruitment issue. Pregnant women volunteering for the Life Study would “travel to special recruitment centres set up for the study and then spend two hours there, answering questions and giving their samples of urine and blood.” Perhaps the surprise is that 249 pregnant women actually did volunteer for this.

Pearson’s book illustrates how tempting mission creep is. She recounts how birth cohorts went from obscure beginnings to official neglect with perpetual funding issues to suddenly becoming a crown jewel of British research. Indeed, as I observe in the review, while relatively few countries have emulated the NHS’ structure and funding model, very many have tried to get on the birth cohort train.

This situation of an understandable enthusiasm and sudden fascination has parallels across health services and research. It is particularly a risk in eHealth and connected health, especially as the systems are inherently complex, and there is a great deal of fashionability to using technology more effectively in healthcare. It is one of those mom-and-apple-pie things, a god term, that can shut down critical thinking at times.

Megaprojects are seductive also in an age where the politics of funding research loom large. The big, “transformative” projects can squeeze out the less ambitious, less hype-y, more human-scale approaches. It can be another version of the Big Man theory of leadership.

Whatever we do, it is made up of a collection of tiny, often implicit actions, attitudes, near-reflexes, and is embedded in some kind of system beyond ourselves that is ultimately made up of other people performing and enacting a collection of tiny, often implicit actions, attitudes, and near-reflexes.


“Blame Culture” in the Irish Healthcare System – another ARCH blog post

Another excellent blog post on the ARCH website, this time by Dr Marcella McGovern, on the blame culture of the Irish health system. The trigger for this article was the recent controversy on “hidden waiting lists.” This led to a rather predictable response from the current Minister for Health:

The Minister for Health, Simon Harris, responded to this programme by saying that he “intends to shine a light” on management in the Irish Health Service Executive (HSE) and that if management does not “measure up”, they will be removed from their roles.


Tough talk, but as Dr McGovern writes:

 it fails to acknowledge the Government’s responsibility for that problem. Governance, performance oversight and holding the HSE to account for the implementation of national health policy are key functions that the Minister for Health and his Department are responsible for performing on an ongoing basis; not in response to a crisis. The question put to Ministers for Health in a crisis therefore, should be where in your Department’s oversight of the HSE did you fail to detect this problem and what steps are you taking to correct the problem and ensure that it doesn’t happen again?


Of course, this is hardly new:

Paul Cullen highlighted in an analysis piece in the Irish Times (Irish Times, 11th February 2017) that Minister Harris’ predecessor, Leo Varadkar, similarly promised that “heads will roll” over hospital overcrowding. Yet, this winter again saw overcrowded Accident & Emergency Departments resulting in planned inpatient and outpatient appointments being postponed.  The back-log of these postponed appointments are now contributing to the current crisis over long waiting lists, illustrating that unjustifiably “blaming the bureaucrats” (Dubnick, 1996) has knock on effects.


Dr McGovern uses the work of Dubnick on “prejudicial blame culture” as a framework for her piece:

Three major conditions (for defining prejudicial blame culture):
1. It makes no requirement that the blamed person or collection of persons have assumed responsibility for the condition they are blamed for; rather, it targets an ill-defined but inclusive group that everyone knows to exist (e.g. bureaucrats);

2. It doesn’t require any role for the blamed in contributing to the cause of the blameworthy or harmful condition. It is assumed that the vaguely defined ‘they’ are highly influential in shaping the world;

3. [It] eliminates the need for any degree of specificity regarding what the harmful condition entails. It could be the decline of the economy, or the loss of national prestige, or the general malaise of society. (Dubnick, 1996: 22).

Dr McGovern’s work, in the ARCH context, is on the effect of blame culture on system readiness for innovation. Of course, blame culture has a much wider impact, paralysing innovations beyond the technological sphere:

From a Connected Health perspective, there is a danger that a blame culture demonstrated at the highest levels of the Irish health system will have a trickle-down effect, compromising system readiness for innovation. If the Department of Health blame the HSE for poor management and the HSE blame the Department of Health for inadequate resources, and if clinicians blame managers for excessive bureaucracy and managers blame clinicians for resisting change; organisational trust may be lost in the battles between “us” and “them” (Firth-Cozen, 2004). Within such environments, potentially transformative leaders and early adopters behave cautiously and become reluctant to take “ownership” of innovations (Heitmueller et al. 2016), which by their nature carry the risk of failure and unintended consequences (Ash et al. 2004).