While there are plenty of reasons why students should be exposed to technology in schools, educational research has yet to produce a consensus on the degree to which personal laptops boost learning.
Historically, when researchers examine what makes a difference in education, laptops and other technology rank well down the list. Some educationalists go so far as to describe computers in schools as a distraction, and there are also concerns about screen time.
A report from the European Commission which looked at 31 recent “one laptop per child” initiatives across 19 countries found little or no improvement in learning outcomes. However, recent research which examined a group of Australian schools found laptops did make a positive difference to learning. Not surprisingly, how the laptops were used determined the size of the benefit.
The Digital Education Revolution
In 2008, the then newly elected Labor government began implementing the (subsequently much maligned) A$2.1 billion “Digital Education Revolution”, whereby it was intended that every Year 9 student would receive a laptop over four or five years, thus creating a 1:1 computer-to-student ratio.
For 12 Catholic secondary schools in Sydney this meant that half of the Year 9 students in 2008 received a laptop and half did not. The distribution of who received the laptops was random in terms of socioeconomic status and average performance, having been imposed independently by a federal audit.
This ultimately led to a situation whereby, in 2011, half of the students in these schools sitting the NSW HSC had been schooled for over three years with 1:1 laptops and half had not.
This created a natural experiment beyond our influence rather than a researcher-designed randomised experiment. This was also quite timely as many principals and education authority directors were wondering what would happen to their exam results.
We looked at the examination data from the 12 schools to see if the students with laptops performed better or worse in the sciences (our field of research) than those without. We predicted a null result.
To our surprise, when controlling for other factors (socioeconomic status, gender, school type, prior attainment and more), we found that those who had been schooled with a laptop did better to varying degrees and that this was statistically significant in biology, chemistry and physics.
In senior science, laptops were found to have no effect, and the sample size for earth and environmental science was too small to produce a result.
We then found the “effect size” (an approach popularised by prominent education researcher John Hattie, who assigned an effect size to every kind of educational intervention so they can be compared) was much greater in physics than in biology or chemistry. This presented a follow-up question – why?
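For readers curious how an effect size of this kind is calculated, below is a minimal sketch of Cohen’s d, the standardised mean difference commonly used in Hattie-style comparisons. The exam marks here are invented purely for illustration; this is not the study’s data or its exact statistical model.

```python
import math

def cohens_d(group_a, group_b):
    """Cohen's d: difference in group means divided by the pooled standard deviation."""
    na, nb = len(group_a), len(group_b)
    mean_a = sum(group_a) / na
    mean_b = sum(group_b) / nb
    # Sample variances (n - 1 denominator)
    var_a = sum((x - mean_a) ** 2 for x in group_a) / (na - 1)
    var_b = sum((x - mean_b) ** 2 for x in group_b) / (nb - 1)
    # Pooled standard deviation across the two groups
    pooled_sd = math.sqrt(((na - 1) * var_a + (nb - 1) * var_b) / (na + nb - 2))
    return (mean_a - mean_b) / pooled_sd

# Hypothetical marks for laptop vs non-laptop students (illustrative only)
laptop = [72, 75, 78, 80, 74, 77]
no_laptop = [70, 73, 71, 74, 69, 72]
print(round(cohens_d(laptop, no_laptop), 2))  # → 1.84
```

By convention, an effect size around 0.2 is considered small, 0.5 medium and 0.8 large, which is what allows interventions as different as laptops and, say, feedback practices to be placed on a common scale.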
In our follow-up paper we investigated why the students with laptops did better, particularly in physics, by surveying how physics and biology teachers and students actually used their laptops.
Interestingly, the physics students and teachers consistently reported performing more “higher-order” activities such as simulations and spreadsheets with their laptops than their biology counterparts, and much more than those without laptops.
The biology students and teachers consistently reported more use of “lower-order” activities such as word processing, electronic textbooks and internet searching.
We also scrutinised the NSW HSC syllabuses. Although the biology and physics syllabuses provide identical motherhood statements about the use of technology in their guidelines, there were no explicit mandates or recommendations for the use of technology in the biology content, unlike physics, where there were many.
Ultimately we found that in HSC biology, chemistry and physics, those students schooled with laptops actually performed better than those without. This effect was much more pronounced in physics which correlated with greater higher-order use as mandated by the curriculum.
There are several repercussions from this research. The findings, as ever, are highly contextual (for these 12 schools; in southwest and south Sydney; in the HSC sciences; in 2011), but we now have some robust quantitative data on the use of technology and student academic performance in Australia. The raw data are freely available for anyone to perform their own analysis.
The research also suggests the “Digital Education Revolution” was not, in all cases, the shambles or waste of money it was portrayed as in the media. With the NSW HSC syllabuses about to be rewritten, we hope there will be greater consistency in capitalising on technology for “higher-order activities” across all subjects.