Yesterday the World Bank hosted a great discussion of strategies for tackling the high cost and low availability of textbooks, with a specific focus on needs and contexts across Sub-Saharan Africa.
As a complement to yesterday's discussions, a number of posts related to the use of digital teaching and learning materials that have appeared on the World Bank's EduTech blog have been collected here, both to make them easier to find and in the hope that doing so might, in some small way, help enrich related conversations.
(Please note that additional links will be added to this page over time as relevant related posts appear on the blog.)
When the Russian Empress Catherine the Great visited Crimea in the late 18th century, the nobleman Grigory Potemkin is said to have had fake village facades erected along her travel route, as well as to have spruced up some of the existing visible buildings (and people), so that she would be fooled into thinking that things were better than they really were. While historians have expressed considerable doubt about whether this actually occurred (indeed, many place it in the category of persistent "cultural myth"), the concept of a Potemkin Village, where things are tarted up so that occasional visitors get a false sense of reality, is not too difficult to understand.
Over the years I have visited hundreds and hundreds of schools in scores of countries to get a sense of how they are using (and not using) technology. Whether in rural Eritrea or highly developed, urban Singapore, as an outsider I am always conscious of the fact that there is an element of 'show' to what I am seeing -- or at least that there might be. The act of observing can often change what is being observed (social scientists refer to this as the Hawthorne effect). As an employee of the World Bank, I know that government officials who arrange and accompany me on school visits often want to showcase what 'works', and what is 'best practice'. This is especially the case where World Bank (or other external) funding has been involved, as people are eager to show that related monies have been well spent.
This is not always the case, of course. I was once lucky enough to visit a school in Latin America for children with special educational needs in a country that was buying *lots* of technology for use by teachers and students. We arranged to meet the relevant government officials at the school early in the morning so that they could act as our guides. However, it turned out that there were actually two schools for special needs students located on the same street in different parts of the city -- and we had gone to the wrong one. After waiting for a while in the office of the headmistress (who was clearly surprised that we were there), it was decided that we should just begin the tour and start talking to people. A few hours later, after the national educational officials had finally figured out where we were, we were picked up and driven to the 'correct' school. It will probably come as little surprise that our experiences in both places were quite different. Chats with teachers, administrators and parents at the first school contrasted rather markedly with the quite sunny picture presented to us at the second 'showcase' school. This is not to say that we couldn't learn anything from the showcase school, however, just that we learned different things -- and perhaps had to work a bit harder to do so.
For the past seven years the World Bank's EduTech blog has sought to "explore issues related to the use of information and communications technologies (ICTs) to benefit education in developing countries".
While there are plenty of sources for news, information and perspectives on the uses (and misuses) of educational technologies in the so-called 'highly industrialized' countries of North America, Europe, East Asia and Australia/New Zealand, regular comparative discussions and explorations of what is happening with the uses of ICTs in middle and low income (i.e. so-called 'developing') countries around the world can be harder to find, which is why this remains the focus of the EduTech blog.
The term 'developing countries' is employed here as convenient (if regrettable) shorthand in an attempt to reinforce the context in which the comments and questions explored on the blog are considered, and as a signal about its intended (or at least hoped for) audience. That said, given how much we still don't know and the fact that things continue to change so rapidly, when it comes to technology use in education, as a practical matter we all live in 'developing countries'.
When speaking about some of the early EduTech blog posts, one rather prominent and outspoken commenter (rather comfortably ensconced at an elite U.S. research university, for what that might be worth) said basically that 'there is nothing new here, we've been aware of all of these issues for some time'.
This might possibly be true – if you are a tenured professor sitting in Cambridge, perhaps, or a technology developer working out of Helsinki, Mountain View or Redmond.
(One could nonetheless note that being aware of something, and doing something useful and impactful as a result of this awareness, are not necessarily the same thing, a lesson that seems to need to be learned and re-learned again and again, often quite painfully and expensively, as 'innovations' from 'advanced' places are exported to other 'less advanced' places around the world with results that can at times be rather difficult to determine. It is also perhaps worth briefly recalling the insightful, if ungrammatical, words of the U.S. humorist Mark Twain, who observed back in the 19th century that, "It ain't what you don't know that gets you into trouble. It's what you know for sure that just ain't so.")
However, these are often relatively new discussions – and often very different discussions, it should be noted! – in other, less 'economically privileged' parts of the world. As computing devices and connectivity continue to proliferate, practical knowledge and know-how about what works, and what doesn't, when it comes to technology use in education is increasingly to be found in such places. It is to participate in, learn from and help catalyze related discussions that the EduTech blog was conceived and continues to operate.
While the posts in 2015 were published less frequently, they were on average much longer than in the past ("too long!", some might say) and largely explored themes (e.g. 'tablets', 'teachers', 'coding') drawing on experiences across multiple countries, rather than profiling specific individual projects or activities in one place, which was often the case in previous years.
It perhaps shouldn't need to be said (but I'll say it anyway, as I am obliged to do) that, whether taken individually or collectively, nothing here was or is meant to be definitive, exhaustive or 'official' in its consideration of a particular topic or activity. The EduTech blog serves essentially as a written excerpt of various ongoing conversations with a wide variety of groups and people around the world and as a mechanism for 'thinking aloud in public' about these conversations. Nothing is formally 'peer-reviewed' before it appears online, and the views expressed are those of the author(s) alone, and not the World Bank. (If you find a mistake, or just really disagree with something that appears on the EduTech blog, please feel free to blame the guy who writes this stuff, and not his bosses or the institution which employs him).
With those introductory comments out of the way, here are the ...
Testing: For better and/or for worse, many education systems are built around it (although some perhaps not as much as others). In numerous countries, a long-standing joke asks whether 'MOE' stands for 'Ministry of Education', or 'Ministry of Examination'? (This joke is not meant to be funny, of course.)
'Testing' is a source of and trigger for controversies of all different sorts, in different places around the world. The word 'standardized' is considered negative by many people when it is used to modify 'testing', but it is perhaps worth noting that a lack of 'standardized' tests can have important implications for equity. Within the U.S., the Obama administration recently declared that students should spend no more than 2% of classroom instruction time taking tests. However one feels about the wisdom of setting such hard targets (one could argue, for instance, that it's not 'testing' per se that's the problem, to the extent that it is indeed a problem, but rather 'bad testing') and the various types of time accounting shenanigans that might predictably emerge so that the letter but not the spirit of such declarations are met (a school could be creative about what it feels constitutes a 'test' or 'instruction time', for example), there is no denying the centrality of testing to approaches to education in schools around the world.
'Testing' means different things to different people. There are important distinctions between assessments that are formative (i.e. a low-stakes means of providing feedback to teachers and students on how much students are learning, as a way to identify strengths and weaknesses and act accordingly) and those that are summative (e.g. high-stakes final exams).
The point here is not to get into a debate about testing, as illuminating and energetic (or frustrating and political) as such a debate might be. Rather, it is to shine a light on some related things happening at the frontier of activities and experiences in this area that are comparatively little known in most of the world but which may be increasingly relevant to many education systems in the coming years.
The nature of tests and testing is changing, enabled in large part by new technologies. (Side note: One way to predict where there are going to be large upcoming public sector procurement activities to provide computing equipment and connectivity to schools is to identify places where big reforms around standardized testing are underway.) While there continues to be growing interest (and hype, and discussion, and confusion) surrounding the potential for technology to enable more 'personalized learning', less remarked on in many quarters is the potential rise in more personalized testing.
The science fiction author William Gibson has famously observed that "the future is already here; it's just not evenly distributed." When it comes to educational technology use around the world, there are lots of interesting 'innovations at the edges' that are happening far away from the spots where one might reflexively look (like Seoul, Silicon Valley or Shanghai, to cite just a few examples that begin with the letter 'S') to learn practical lessons about what might be coming next, and how this may come to pass.
When it comes to testing, one such place is ... Georgia. This is not the Georgia in the southern United States (capital = Atlanta, where people play football while wearing helmets), but rather the small, mountainous country that borders the Black Sea which emerged as a result of the breakup of the Soviet Union (capital = Tbilisi, where people play football with their feet).
Georgia is the first country in the world to utilize computer adaptive testing for all of its school leaving examinations.
What does this mean, what does it look like in practice, what is there to learn from this experience, and why should we care?
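The basic logic of computer adaptive testing can be sketched very simply: each answer a student gives informs the difficulty of the next item they see, so the test converges toward the student's actual ability level. The sketch below is a deliberately simplified, hypothetical illustration (real systems, such as those built on item response theory, estimate ability statistically and draw items from calibrated banks); every name and parameter here is an assumption for illustration only.

```python
# A deliberately simplified sketch of computer adaptive testing:
# the difficulty of the next item rises after a correct answer and
# falls after an incorrect one, with adjustments shrinking over time.
# Hypothetical illustration only; real adaptive tests (e.g. those
# built on item response theory) are far more sophisticated.

def run_adaptive_test(answer_is_correct, num_items=5, step=1.0):
    """answer_is_correct(difficulty) -> bool simulates a student."""
    difficulty = 5.0           # start in the middle of a 0-10 scale
    administered = []
    for _ in range(num_items):
        correct = answer_is_correct(difficulty)
        administered.append((difficulty, correct))
        # Move the difficulty toward the student's ability level
        difficulty += step if correct else -step
        difficulty = max(0.0, min(10.0, difficulty))
        step *= 0.8            # smaller adjustments as we learn more
    return administered

# Simulate a student who can handle items up to difficulty 6:
history = run_adaptive_test(lambda d: d <= 6)
```

The point of the sketch is only the feedback loop itself: unlike a fixed paper test, no two students need see the same sequence of questions, which is part of what makes adaptive testing at national scale both attractive and logistically demanding.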
This week over a million students around the world will participate in the Hour of Code, an annual event designed to excite interest in computer science and computer programming. This reflects a growing interest in some quarters to promote efforts within schools to broaden awareness of what it means to 'code' (i.e. write a set of step-by-step directions that instruct computers to do something) and to help students develop related skills.
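To make the parenthetical definition above concrete, here is the sort of thing 'a set of step-by-step directions that instruct computers to do something' can look like. This is a hypothetical, minimal example of my own; introductory exercises like those in the Hour of Code typically use visual drag-and-drop blocks rather than typed text, but the underlying idea is the same.

```python
# A minimal example of 'code' in the sense described above: a set of
# step-by-step directions the computer follows exactly.

def draw_staircase(steps):
    """Return a text 'staircase' built one row at a time."""
    rows = []
    for i in range(1, steps + 1):   # step 1: repeat once per stair
        rows.append("#" * i)        # step 2: draw a row of i blocks
    return "\n".join(rows)          # step 3: stack the rows together

print(draw_staircase(3))
# prints:
# #
# ##
# ###
```

The pedagogical claim behind such exercises is not that every student will become a programmer, but that decomposing a task into unambiguous steps is itself a transferable skill.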
Perhaps not surprisingly, many leading technology firms have been keen proponents and supporters of this educational coding 'movement'. While such support has been particularly pronounced and high profile in the United States -- many of the prominent organizations have close ties to and/or roots in Silicon Valley -- this is long past being only a North American phenomenon.
Citing the increasing importance of coding skills, and IT jobs more broadly, to their national economies, policymakers in many countries are considering national coding education efforts of various sorts – and a few education systems have already begun to implement related initiatives. From Trinidad and Tobago to Indonesia to Nigeria, 'coding' is being introduced into classrooms and curricula around the world in various ways, both informally and (increasingly) formally as well, for better and/or for worse (depending on your perspective, and the particular nature or rigor of the specific initiatives).
This phenomenon is notably observable across Europe, where, rather famously (at least within the communities of people who care about and pay attention to such things), Estonia and the United Kingdom have introduced coding curricula for students beginning in early primary grades (the UK has actually made this mandatory – as has Slovakia, for what that’s worth). Each year in October, CodeWeek.eu serves as a continental focal point and showcase for many of these sorts of national and regional efforts. A recent report from the European Schoolnet (Computer programming and coding - Priorities, school curricula and initiatives across Europe [pdf]) features results from a survey of 21 ministries of education about their current coding-related initiatives and plans for the future. To date, 16 European countries have integrated coding into their curricula at some level (with Finland and the Flemish part of Belgium expected to do so in 2016). While the focus of most of these countries has been at the upper secondary level, coding is increasingly to be found (or soon to be found) at the primary level in a majority of these countries as well. The report highlights a number of important related pedagogical questions that are emerging out of European experience:
How to effectively design learning processes and outcomes involving coding?
Which concrete activities (and programming languages) are most appropriate for different students, according to their age, interests and capacities?
What are the particular merits (and limits) of adopting a cross-curricular approach to teaching coding or a discrete computer science subject?
How to refine assessment, in particular where coding is integrated into other subjects through a cross-curricular approach?
It also highlights many challenges related to training and support for teachers. While many of the startups developing the tools and services that make the coding movement possible are in the United States, Europe is in many ways at the center of actual related activities in schools.
“Coding”, it is said by some, is the “new literacy”. The ability to write and understand computer code, some contend, is increasingly fundamental to understanding how to navigate one’s way through, to say nothing of succeeding in, a modern society where more and more of our lives are enabled and/or constrained by the actions of devices and information systems that run on computer code.
I expect that few people would argue that efforts to expose some students to ‘coding’ (and to develop some related skills) are a bad thing. That said:
Should *all* students learn how to code? "All? That’s ridiculous!" some would answer. "All? Absolutely!" others respond.
I’ve sat in on a number of related discussions in ministries of education and at education policy forums around the world. At times, it can seem like members of these two groups are not only on different pages, but reading from totally different books. "Those people just don’t get it," I’ve heard representatives from both groups lament about each other after the conclusion of such meetings.
For what it’s worth, and in case it might be of any interest to others, here are, in no particular order, some of the most common arguments I hear made both in support of, and against, educational coding initiatives:
(In my experience, this query may be a result of an evaluation that showed little or no 'impact of technology on student learning’, despite massive amounts of money that have been invested … or it may simply come about because a lot of initiatives are coming to an end and the groups involved in them are looking for stuff to do. There are other impetuses as well, but I am regularly reminded that the motivations which animate this sort of question can vary quite a bit!)
In such instances, I usually respond by first congratulating them on all of these great accomplishments. A lot of hard work by a lot of dedicated people was needed to make these things happen, and no doubt lots of difficult challenges popped up along the way.
As difficult (and expensive) as it may have been to achieve all of these things, in many ways they represent just table settings, and not the main course. In other words (and to adopt another metaphor, in case you didn't like that last one), they are some of the key raw materials that can be used to help do something purposeful with technology to support learning that has real, demonstrable impact.
It's about the content, not the container, after all. (That will be the last metaphor for a while, I promise.) As devices proliferate, and as better and more widespread connectivity enables connections to networks from different places using a variety of different devices, the value will decreasingly be in the devices themselves, but rather in the educational content they enable learners and teachers to access (and in the connections to communities of other people as well).
OK, they counter, if the value is indeed in the content, and not the container ... where do we get this content?
The textbooks we use do not have straightforward digital equivalents or complements.
We really haven't budgeted for any digital learning content.
We spent almost all of our money on hardware (and a little on related training).
We know that there is a lot of 'free' content available on the Internet that would be useful to teachers and students.
Many critics of contemporary schooling practices have noted that, if a teacher from the 19th century was magically transported into a typical classroom today, she would feel very comfortable with how things look. The room itself would be very familiar.
(Whether that teacher would be comfortable with today's students is another matter entirely, given that they probably look a little different than they did 'back in the day' -- to say nothing of how they might act and some of the opinions they might have!)
Contrast this, such critics note, with the situation of a surgeon from the 19th century teleported into an operating room today -- he would be bewildered, and perhaps disoriented, by all of the technology on display.
Few would deny that, in many fundamental and obvious ways, technology has revolutionized medicine and healthcare.
Why hasn't it done so (yet) for learning and education?
One way that critics illustrate and reinforce this question is to share pictures of 'typical' operating rooms in the 19th and 21st centuries, alongside pictures of 'typical' classrooms from both centuries. The classrooms in such examples usually do look much the same, with a teacher standing at the front of the room and neatly lined up rows of students intently (if metaphorically) drinking from the fountain of the teacher's knowledge. The chief noticeable difference (again, apart from the students themselves -- and the teachers as well) is that there are now computing devices of some sort on display in the 'modern' classroom, sometimes (depending on the country) lots of them, although the room essentially looks and functions the same way. The arrangement and nature of these ICT devices don't fundamentally alter the architecture of the room, nor what occurs inside it. In other words, the changes are additive, not transformative. (It is of course possible to provide pictures of some of today's 'innovative' classrooms that complicate this simple and popular narrative, as well as to ask some fundamental and important questions about what such pictures may obscure and what they illuminate, but I'll ignore such inconvenient complications here.)
Side note: Over a dozen years ago I visited the launch of a computer lab at a school in Cambodia. The headmaster had proudly transformed a room formerly used for sewing instruction into a 'technology lab': a new PC sat atop each desk in place of the 'old-fashioned' technology of the sewing machine, and neat rows of students faced forward toward a teacher who was energetically shouting instructions.
Let's also put aside for a moment whether all of this technology 'makes a difference' (as well as perhaps more relevant questions about how and under what circumstances ICTs have an 'impact'). Let's ignore discussions about whether or not today's classrooms are a legacy of a 'factory model of education' that once existed but is no longer useful, or about the potential need to re-think school architecture in the age of ICT. Let's also ignore related 'big picture' issues around policymaking and planning.
Let's focus instead just on the technology itself.
Many regular readers of the EduTech blog are no doubt familiar with scenes of ICT equipment sitting unused in schools, locked away in computer labs or even still resting peacefully (and undamaged!) in unopened boxes. Oftentimes, getting teachers and students to use such equipment, let alone to use it 'productively', can be a rather tall order, for all sorts of reasons. Nevertheless, education ministries, local educational authorities, and schools around the world are buying lots of technology: PCs, laptops, tablets, projectors, and lots of other devices and peripherals.
What are they doing to make sure that this stuff doesn't get stolen?
At 9:00 am this past Monday morning, almost 30 people crammed into a small conference room at the World Bank in DC to talk about ... videogames. (A good number more were queued up online to join in, but unfortunately technical snafus prevented them from participating -- our continued apologies if you count yourself among that group.) The featured presenter at this discussion, my colleague Mariam Adil ("Meet the Woman Who's Shaking Up Pakistan's Social Gaming Industry"), the founder of GRID (Gaming Revolution for International Development), shared some of the interesting and innovative things she has been doing to help create and roll out a number of educational mobile apps, as a contribution to broader discussions on topics related to 'early childhood development' (ECD).
Providing children and their caregivers with access to quality pre-school education opportunities is a primary activity of the World Bank's work related to early childhood development. No one who participated in Monday's discussion expressed the view that 'technology is the answer to the challenges of ECD'. That said:
Are there approaches and activities related to early childhood development worth pursuing that can be complemented, and in some cases helpfully enabled by, new technologies?
As the related World Bank strategy states, "Investing in young children through ECD programs—ensuring they have the right stimulation, nurturing and nutrition—is one of the smartest investments a country can make to address inequality, break the cycle of poverty, and improve outcomes later in life."
Given the proliferation of mobile phones in communities around the world, there can be no denying that such devices are increasingly in the hands of parents and caregivers (and, for better or worse, in the hands of children as well, both briefly and for extended periods of time).
What are we learning about what is possible, and what is useful, to do with these devices that can complement and extend many ECD activities and programs?
I began my career exploring the uses of information and communication technologies (ICTs) in education in Ghana, Uganda and a number of other places in Africa in the late 1990s, and have continued to stay engaged with lots of passionate and innovative groups and people working with ICTs in various ways to help meet a variety of challenges related to education across the continent. Because of this history, and continued connections to lots of folks doing related cool stuff, I am from time to time asked:
"So, what's happening with technology use in education in Africa these days?"
That said, while the impulse from some corners to refer to 'Africa' as a single entity may be unfortunate, it is nevertheless predictable, and being asked this sort of question at least provides an opportunity to unpack it in ways that are (hopefully) useful and interesting. The EduTech blog was conceived in part, and in a decidedly modest way, to help direct the gaze of some folks to some of the interesting questions and challenges being addressed in different ways in different communities in Africa related to ICT use in education by groups who are, along the way, coming up with some interesting answers and solutions.
The UNESCO Institute for Statistics (UIS), the arm of the United Nations charged with collecting global data related to education (and some other sectors as well), recently came out with a report that provides some useful data that collectively can help outline the general shape of some of what is happening across the African continent when it comes to the availability and use of educational technologies. Information and Communication Technology (ICT) in Education in Sub-Saharan Africa: A Comparative Analysis of Basic e-Readiness in Schools is certainly not the first such report that has taken a continent-wide perspective, but it is notable in a number of regards -- not only because it is the most recent such effort, but also because it is intended as a precursor to more regular, systematic data collection efforts going forward.
Written by Peter Wallet (with the assistance of Beatriz Valdes Melgar), the report presents data and related analysis from a survey co-sponsored by UIS, the Korea Education and Research Information Service (KERIS) and the Brazilian Regional Center for Studies on the Development of the Information Society (CETIC.br -- the group responsible for the annual Survey of ICT and Education in Brazil). The report notes that, unfortunately, "data on ICT in education in the region are sparse. Collecting more and better quality statistics will be a priority in the post-2015 development agenda given the growing role of ICT in education. In response, the UIS is working with countries to establish appropriate mechanisms to process and report data, and to better measure the impact of technology on the quality of education." With that caveat and announcement out of the way, the report then utilizes the UIS Guide to Measuring Information and Communication Technologies (ICT) in Education as a framework with which to examine what could be discovered about the existence of related national policies, data about learner-computer ratios, school electrification and connectivity, and ICT-related instruction and curricula in ways consistent with other regional reports that the UIS has published on Asia, Latin America and a number of Arab countries. (Here's a list of international surveys of this sort from UIS and others for anyone who might be interested, as well as some general information about efforts of this sort.)
The OECD today released a landmark report on students, technology and learning based on data from PISA, the international assessment of 15-year-old school pupils' scholastic performance on mathematics, science, and reading. This new publication presents the most detailed set of data and analysis to date on student access to computers, their use of computers, and learning outcomes (as measured by PISA).
Here’s an excerpt from the beginning of a related blog post by the OECD’s Andreas Schleicher:
"Totally wired. That’s our image of most 15-year-olds and the world they inhabit. But a new, ground-breaking report on students’ digital skills and the learning environments designed to develop those skills, paints a very different picture. Students, Computers and Learning: Making the Connection finds that, despite the pervasiveness of information and communication technologies (ICT) in our daily lives, these technologies have not yet been as widely adopted in formal education. And where they are used in the classroom, their impact on student performance is mixed, at best."
Press reports today have (unsurprisingly) not been terribly nuanced or sophisticated in their understanding or analysis of what the OECD report actually says. Witness the Irish Times: "Ireland has one of the lowest rates of internet use in schools in the world but, ironically, it may be doing students more good than harm, according to a global study published on Tuesday" or the BBC: "Computers 'do not improve' pupil results, says OECD". The Register concludes that the main message is "Don't bother buying computers for schools, says OECD report". More sophisticated and substantive takes on these findings will hopefully emerge in the coming weeks. (I don’t know about you, but it seems to me that a more relevant, and practical, directive might be to figure out how to make good use of all of this technology rather than simply to avoid it entirely, but maybe I am a biased observer here.)
My very quick summary take on a few of the key findings, for what it might be worth: