#digitalnatives and #edtech and #wollongong – The ‘digital natives’ debate: A critical review of the evidence. British Journal of Educational Technology, Bennett et al., Feb 2008

I blogged the other day on a recent paper on the myth of the digital native. Here is another paper, by Sue Bennett, Karl Maton and Lisa Kervin, from nearly a decade ago, on the same theme – and equally trenchant:

The idea that a new generation of students is entering the education system has excited recent attention among educators and education commentators. Termed ‘digital natives’ or the ‘Net generation’, these young people are said to have been immersed in technology all their lives, imbuing them with sophisticated technical skills and learning preferences for which traditional education is unprepared. Grand claims are being made about the nature of this generational change and about the urgent necessity for educational reform in response. A sense of impending crisis pervades this debate. However, the actual situation is far from clear. In this paper, the authors draw on the fields of education and sociology to analyse the digital natives debate. The paper presents and questions the main claims made about digital natives and analyses the nature of the debate itself. We argue that rather than being empirically and theoretically informed, the debate can be likened to an academic form of a ‘moral panic’. We propose that a more measured and disinterested approach is now required to investigate ‘digital natives’ and their implications for education.

On an entirely different note, the authors are/were affiliated with the University of Wollongong. Recent days have seen the death of Geoff Mack, who wrote the song “I’ve Been Everywhere”, originally a list of Australian placenames:

The song inspired versions internationally – the best known being Johnny Cash’s and The Simpsons’ – but the wittiest alternative version is this (NB – Dapto is a few miles from Wollongong).

Anyway, back to the digital natives. Bennett et al begin with a quote from Marcel Proust:

The one thing that does not change is that at any and every time it appears that there have been ‘great changes’.
– Marcel Proust, Within a Budding Grove

The authors summarise what a digital native is supposed to be like – and the not exactly extensive evidence base for their existence:

The claim made for the existence of a generation of ‘digital natives’ is based on two main assumptions in the literature, which can be summarised as follows:

1. Young people of the digital native generation possess sophisticated knowledge of and skills with information technologies.
2. As a result of their upbringing and experiences with technology, digital natives have particular learning preferences or styles that differ from earlier generations of students.

In the seminal literature on digital natives, these assertions are put forward with limited empirical evidence (eg, Tapscott, 1998), or supported by anecdotes and appeals to common-sense beliefs (eg, Prensky, 2001a). Furthermore, this literature has been referenced, often uncritically, in a host of later publications (Gaston, 2006; Gros, 2003; Long, 2005; McHale, 2005; Skiba, 2005). There is, however, an emerging body of research that is beginning to reveal some of the complexity of young people’s computer use and skills.

No one denies that a lot of young people use a lot of technology – but not all:

In summary, though limited in scope and focus, the research evidence to date indicates that a proportion of young people are highly adept with technology and rely on it for a range of information gathering and communication activities. However, there also appears to be a significant proportion of young people who do not have the levels of access or technology skills predicted by proponents of the digital native idea. Such generalisations about a whole generation of young people thereby focus attention on technically adept students. With this comes the danger that those less interested and less able will be neglected, and that the potential impact of socio-economic and cultural factors will be overlooked. It may be that there is as much variation within the digital native generation as between the generations.

It is often suggested that children who are merrily exploring the digital world are ground down with frustration by not having the same access to computers in school. This is part of a more general demand for transformation, with rhetoric familiar to the health IT world (the word “disruptive” in its modern usage had not quite caught on in 2008). As is often the case, the empirical evidence (and also, I would say, a certain degree of common sense) is not with the disrupters:

The claim we will now examine is that current educational systems must change in response to a new generation of technically adept young people. Current students have been variously described as disappointed (Oblinger, 2003), dissatisfied (Levin & Arafeh, 2002) and disengaged (Prensky, 2005a). It is also argued that educational institutions at all levels are rapidly becoming outdated and irrelevant, and that there is an urgent need to change what is taught and how (Prensky, 2001a; Tapscott, 1998). For example, Tapscott (1999) urges educators and authorities to ‘[g]ive students the tools, and they will be the single most important source of guidance on how to make their schools relevant and effective places to learn’ (p. 11). Without such a transformation, commentators warn, we risk failing a generation of students and our institutions face imminent…

However, there is little evidence of the serious disaffection and alienation among students claimed by commentators. Downes’ (2002) study of primary school children (5–12 years old) found that home computer use was more varied than school use and enabled children greater freedom and opportunity to learn by doing. The participants did report feeling limited in the time they were allocated to use computers at school and in the way their use was constrained by teacher-directed learning activities. Similarly, Levin and Arafeh’s (2002) study revealed students’ frustrations at their school Internet use being restricted, but crucially also their recognition of the school’s in loco parentis role in protecting them from inappropriate material. Selwyn’s (2006) student participants were also frustrated that their freedom of use was curtailed at school and ‘were well aware of a digital disconnect but displayed a pragmatic acceptance rather than the outright alienation from the school that some commentators would suggest’ (p. 5).

In 2008, Bennett et al summarised issues very similar to those raised by the 2016 authors, relating to students’ actual rather than perceived technical adeptness and net savviness:

Furthermore, questions must be asked about the relevance to education of the everyday ICTs skills possessed by technically adept young people. For example, it cannot be assumed that knowing how to look up ‘cheats’ for computer games on the Internet bears any relation to the skills required to assess a website’s relevance for a school project. Indeed, existing research suggests otherwise. When observing students interacting with text obtained from an Internet search, Sutherland-Smith (2002) reported that many were easily frustrated when not instantly gratified in their search for immediate answers and appeared to adopt a ‘snatch and grab philosophy’ (p. 664). Similarly, Eagleton, Guinee and Langlais (2003) observed middle-school students often making ‘hasty, random choices with little thought and evaluation’ (p. 30).

Such research observes shallow, random and often passive interactions with text, which raise significant questions about what digital natives can actually do as they engage with and make meaning from such technology. As noted by Lorenzo and Dziuban (2006), concerns over students’ lack of critical thinking when using Internet-based information sources imply that ‘students aren’t as net savvy as we might have assumed’ (p. 2). This suggests that students’ everyday technology practices may not be directly applicable to academic tasks, and so education has a vitally important role in fostering information literacies that will support learning.

Again, this is a paper I could quote bits from all day – so here are a couple of paragraphs from towards the end that summarise their (and my) take on the digital natives:

Neither dismissive scepticism nor uncritical advocacy enable understanding of whether the phenomenon of digital natives is significant and in what ways education might need to change to accommodate it. As we have discussed in this paper, research is beginning to expose arguments about digital natives to critical enquiry, but much more needs to be done. Close scrutiny of the assumptions underlying the digital natives notion reveals avenues of inquiry that will inform the debate. Such understanding and evidence are necessary precursors to change.

The claim that there is a distinctive new generation of students in possession of sophisticated technology skills and with learning preferences for which education is not equipped to support has excited much recent attention. Proponents arguing that education must change dramatically to cater for the needs of these digital natives have sparked an academic form of a ‘moral panic’ using extreme arguments that have lacked empirical evidence.

Finally, after posting the prior summary of Kirschner and De Bruyckere’s paper, I searched the hashtag #digitalnatives on Twitter and – self-promotingly – replied to some of the original tweeters with a link to the paper (interestingly, quite a few #digitalnatives tweets were links to discussions of the Kirschner/De Bruyckere paper). Some were very receptive, but others were markedly defensive. Obviously, a total stranger coming along and pedantically pointing out that your hashtag refers to something that doesn’t exist may not be the most polite way of interacting on Twitter – but it is also true that quite a lot of us are quite attached to the myth of the digital native.

“actual clinic services with real doctors”

Again, from MobiHealthNews:

A new kind of doctor’s office opened in San Francisco this week: Forward, a membership-based healthcare startup founded by former Googler Adrian Aoun that infuses a brick-and-mortar office with data-driven technology and artificial intelligence.

For $149 per month, Forward members can come to the flagship office that features six examination rooms – equipped with interactive personalized displays – and doctors from some of the Bay Area’s top medical systems. Members are given wearable sensors that work with Forward’s proprietary AI for proactive monitoring that can alert members and their doctors of any abnormalities as well as capture, store and analyze data to develop personalized treatment plans. Members also have 24-7 mobile access to their data, rounding out what Aoun believes is a new type of preventative care.

What is interesting about this piece is that there are various other start-ups whose vision is based not on telemedicine or on “empowering consumers”, but on what is at its core the traditional surgery office, except with much slicker tech. It is also interesting that Forward’s approach is based on a personal experience:

The impetus for Forward came from a personal experience of Aoun’s. When one of his close relatives had a heart attack, he found himself sitting in the ICU and realizing healthcare wasn’t quite what he thought it was. Seeing doctors having to obtain health records from multiple sources and wait days or weeks for test results and suffering from all-around communication breakdowns within their health system, he was inspired to create an alternative model – one focused on prevention, efficiency and connected tools to create increasingly smart healthcare plans based on each individual’s needs and goals.

I took the title of this post from what I found a rather amusing aside in a later paragraph:

It also isn’t the first company to offer a hybrid of physical and digital services. In September 2016, startup Carbon Health opened its first clinic, also in San Francisco, that offers actual clinic services with real doctors.

“actual clinic services with real doctors”! – sounds truly revolutionary – and quite a difference from the techno-utopian slant of the Financial Times piece I blogged about earlier in the week. At times readers may detect a certain weariness with the hype that surrounds digital health, the overuse of “revolutionary” and “transformative” and so on, the goes-without-saying presumption that healthcare is bloated and inefficient while tech is gleaming and slick and frictionless. This is far from saying that healthcare doesn’t need change, and can’t learn from other fields – I look forward to hearing more about Forward.

Engaging clinicians and the evidence for informatics innovations

A few weeks ago Richard Gibson from Gartner spoke to members of the CCIO group. It was a fascinating, wide-ranging talk – managing the time effectively was a challenge. Dr Gibson talked about the implications of technological innovations for acute care and long term care – as might be obvious from my previous post here, I have a concern that much of the focus on empowerment via wearables and consumer technology misses the point that the vast bulk of healthcare is acute care and long term care. As Dr Gibson pointed out, at the rate things are going, healthcare will in years to come be the only economic, social, indeed human activity.

One long term concern I have about connected health approaches is engaging the wider group of clinicians. Groups like the CCIO do a good job (in my experience!) of engaging the already interested, who are more than likely unabashedly enthusiastic. At the other extreme, there is always going to be some resistance to innovation almost on principle. In between, there is a larger group who are interested but perhaps sceptical.

One occasional response from peers to what I will call “informatics innovations” (to emphasise that this is not only about ICT but also about care planning and various other approaches that do not depend on “tech” for implementation) is to ask “where is the evidence?” And often this is not a call for empirical studies as such, but for an impossible standard – RCTs!

Now, I advocate for empirical studies of any innovation, and a willingness to admit when things are going wrong based on actual experience rather than theoretical evidence. In education, I strongly support the concept of Best Evidence Medical Education and indeed in following public debates and media coverage about education I personally find it frustrating that there is a sense that educational practice is purely opinion-based.

With innovation, the demand for RCT-based evidence is something of a category error. There is also a wider issue of how “evidence-based” has migrated from healthcare to politics. In Helen Pearson’s Life Project we read how birth cohorts went from ignored, chronically underfunded studies run by a few eccentrics to celebrated, slightly less underfunded, flagship projects of British epidemiology and sociology. Since the 1990s, they have enjoyed a policy vogue in tandem with a political emphasis on “evidence-based policy.” My own thought on this is that it is one thing to have an evidence base for a specific therapy in medical practice, quite another for a specific intervention in society itself.

I am also reminded of a passage in the closing chapters of Donald Berwick’s Escape Fire (I don’t have a copy of the book to hand so bear with me) which essentially consists of a dialogue between a younger, reforming doctor and an older, traditionally focused doctor. Somewhat in the manner of the Socratic dialogues in which (despite the meaning ascribed now to “Socratic”) Socrates turns out to be correct and his interlocutors wrong, the younger doctor has ready counters for the grumpy arguments of the older one. That is until towards the very end, when in a heartfelt speech the older doctor reveals his concerns not only about the changes in practice but about what they mean for his own patients. It is easy to fall into a false dichotomy between doctors open to change and those closed to change; often what eager reformers perceive as resistance to change is based on legitimate concern about patient care. There are also concerns about an impersonal approach to medicine. Perhaps ensuring that colleagues know, to as robust a level as innovation allows, that patient care will be improved is one way through this impasse.


A Medical Informatics Education, 1996.

Today I walked to UCD much as I did nearly 20 years ago on 21st September 1996, to begin college. This time I was walking not to Belfield itself, but to UCD Nexus, located a little further on in Belfield Office Park, for a meeting in my new role as CCIO liaison to ARCH (if that’s too many acronyms, don’t ask).

Various nostalgic impressions mingled. Cyclists seem more aggressive than they were. UCD is a slicker operation and more given to self-promotion than it was. It had been a while since I had actually walked through campus; the last few times I had driven in, found parking near-impossible, gone to a meeting, and left. Belfield seemed to have become a bit like Docklands, a rather alienating landscape dominated by massive buildings without human scale.


Walking through, however, I find Belfield reassuringly unchanged at its core. The Science Block has greatly expanded, but the central lecture theatre structure is unchanged. The Arts Block, the fundamental library structure, the lake, the restaurant – all are different only superficially. The cafe that was officially known as “Finnegan’s Break” and was always called “Hilpers” is now gone.

I was also a little taken aback by how much human interaction there was. I expected serried ranks of screen-focused students. In the restaurant, I saw only one person texting while talking to her friends, and while that wouldn’t have happened in 1996, it would have in 2000. A few years ago there were PC terminals all over the place, which seem to have largely disappeared.

Given the nature of the meeting I was going to, I thought about one of the academic highlights of that first year of medicine: medical informatics. This was a subject which, frankly, was much derided. Why? Because it seemed irrelevant, I think, somewhat beneath those who knew anything much about computers and somewhat irksome to those who didn’t. Crucially, I can’t recall anything specifically medical about medical informatics.

We had lectures on what a CPU was and so forth (more of which anon) and workshops on the use of Word, Excel, Access and the other Microsoft biggies at the time. The undoubted high point was the lecturer, Mel Ó Cinnéide, suddenly pulling a mouse out of his pocket with the immortal words: “for those who haven’t seen one, this is a mouse.”

Now, the wheel has come full circle; one wonders how many of a laptop- and tablet-focused cohort of students would have seen a mouse. UCD Netsoc was, for a few years, the only way to get internet access as a student, and the enthusiastic queued up from early morning to get an account.

As with many other pre-clinical subjects at the time, Medical Informatics teaching was by academics in their specific discipline, who no doubt found the prospect of teaching medical students even less enticing than teaching students who at least were pursuing the subject at more length.

In subsequent years, Medical Informatics was revamped and, I gather, made more clinically relevant. And now, as Ireland slouches towards eHealth, the relevance of IT to medicine is much more obvious. I am sure that Medical Informatics in UCD, and equivalent courses in other medical schools, is now taught in a clinically relevant, pedagogically sound manner with defined learning objectives and so forth. Nevertheless, I have my doubts that in twenty years anyone will recall a moment from this teaching as vividly as the class of ’02 (mostly) recalls Mel whipping out the mouse.