Mobile Learning Week 2016 begins on Monday, March 7 at UNESCO headquarters in Paris. The fifth such annual international gathering, #MLW2016 will feature a great lineup of speakers who will share information and perspectives on the use of 'mobile technologies' in education around the world, with specific attention to contexts, initiatives, perspectives and innovations in middle- and low-income countries. The program of the event itself looks to be great, with a mixture of workshops, a policy forum (together with the ITU) and a two-day symposium, all kicked off by a special online 'debate' at 6pm Paris time organized by the folks at Education Fast Forward ("Innovation & Quality: Two sides of the same coin?"). I expect the real attraction of the event for many won't be found on the official program itself. Rather, it will be the opportunities to meet like-minded folks from around the world who are asking lots of useful questions and doing cool stuff 'on-the-ground'. A lot of this stuff is largely under the radar of the press and blogosphere, which directs most of its attention to what's happening in the 'developed' countries of Europe and North America and so is often not clued into some of the fascinating 'innovations at the edges' that are emerging.
Mobile Learning Week is in many ways a companion event to the annual meeting of the mEducation Alliance, the USAID-led initiative which includes many of the same international institutions as sponsors and participants. The mEducation Alliance has also been bringing together people to talk about what is happening in the 'mobile learning' space in so-called 'developing countries' for five years. Having worked in this area for some time, it is clear to me that we all really live in 'developing countries' when it comes to 'the use of small mobile devices in education', but there have been some notable changes in the nature of related discussions over the past half-decade. In case anyone might care to listen, here are a few of them that I've observed:
Yesterday the World Bank hosted a great discussion related to strategies for tackling the high cost and low availability of textbooks, with a specific focus on needs and contexts across Sub-Saharan Africa.
As a complement to yesterday's discussions, a number of posts related to the use of digital teaching and learning materials that have appeared on the World Bank's EduTech blog have been collected here, to make them easier to find, and in case making them available in this way can help, in a small way, to enrich any related conversations.
(Please note that additional links will be added to this page over time as relevant related posts appear on the blog.)
When the Russian Empress Catherine the Great visited Crimea in the late 18th century, the nobleman Grigory Potemkin is said to have had fake village facades erected along her travel route, as well as to have spruced up some of the existing visible buildings (and people), so that she would be fooled into thinking that things were better than they really were. While historians have expressed considerable doubt about whether this actually occurred (indeed, many place it in the category of persistent "cultural myth"), the concept of a Potemkin Village, where things are tarted up so that occasional visitors get a false sense of reality, is not too difficult to understand.
Over the years I have visited hundreds and hundreds of schools in scores of countries to get a sense of how they are using (and not using) technology. Whether in rural Eritrea or highly developed, urban Singapore, as an outsider I am always conscious of the fact that there is an element of 'show' to what I am seeing -- or at least that there might be. The act of observing can often change what is being observed (social scientists refer to this as the Hawthorne effect). As an employee of the World Bank, I know that government officials who arrange and accompany me on school visits often want to showcase what 'works', and what is 'best practice'. This is especially the case where World Bank (or other external) funding has been involved, as people are eager to show that related monies have been well spent.
This is not always the case, of course. I was once lucky enough to visit a school in Latin America for children with special educational needs in a country that was buying *lots* of technology for use by teachers and students. We arranged to meet the relevant government officials at the school early in the morning so that they could act as our guides. However, it turned out that there were actually two schools for special needs students located on the same street in different parts of the city -- and we had gone to the wrong one. After waiting for a while in the office of the headmistress (who was clearly surprised that we were there), it was decided that we should just begin the tour and start talking to people. A few hours later, after the national educational officials had finally figured out where we were, we were picked up and driven to the 'correct' school. It will probably come as little surprise that our experiences in both places were quite different. Chats with teachers, administrators and parents at the first school contrasted rather markedly with the quite sunny picture presented to us at the second 'showcase' school. This is not to say that we couldn't learn anything from the showcase school, however, just that we learned different things -- and perhaps had to work a bit harder to do so.
Testing: For better and/or for worse, many education systems are built around it (although some perhaps not as much as others). In numerous countries, a long-standing joke asks whether 'MOE' stands for 'Ministry of Education', or 'Ministry of Examination'? (This joke is not meant to be funny, of course.)
'Testing' is a source of and trigger for controversies of all different sorts, in different places around the world. The word 'standardized' is considered negative by many people when it is used to modify 'testing', but it is perhaps worth noting that a lack of 'standardized' tests can have important implications for equity. Within the U.S., the Obama administration recently declared that students should spend no more than 2% of classroom instruction time taking tests. However one feels about the wisdom of setting such hard targets (one could argue, for instance, that it's not 'testing' per se that's the problem, to the extent that it is indeed a problem, but rather 'bad testing') and the various types of time accounting shenanigans that might predictably emerge so that the letter but not the spirit of such declarations are met (a school could be creative about what it feels constitutes a 'test' or 'instruction time', for example), there is no denying the centrality of testing to approaches to education in schools around the world.
'Testing' means different things to different people. There are important distinctions between assessments that are formative (i.e. low-stakes means of providing feedback to teachers and students on how much students are learning, as a way to identify strengths and weaknesses and act accordingly) and those that are summative (e.g. high-stakes final exams).
The point here is not to get into a debate about testing, as illuminating and energetic (or frustrating and political) as such a debate might be. Rather, it is to shine a light on some related things happening at the frontier of activities and experiences in this area that are comparatively little known in most of the world but which may be increasingly relevant to many education systems in the coming years.
The nature of tests and testing is changing, enabled in large part by new technologies. (Side note: One way to predict where there are going to be large upcoming public sector procurement activities to provide computing equipment and connectivity to schools is to identify places where big reforms around standardized testing are underway.) While there continues to be growing interest (and hype, and discussion, and confusion) surrounding the potential for technology to enable more 'personalized learning', less remarked on in many quarters is the potential rise in more personalized testing.
The science fiction author William Gibson has famously observed that "the future is already here; it's just not evenly distributed." When it comes to educational technology use around the world, there are lots of interesting 'innovations at the edges' that are happening far away from the spots where one might reflexively look (like Seoul, Silicon Valley or Shanghai, to cite just a few examples that begin with the letter 'S') to learn practical lessons about what might be coming next, and how this may come to pass.
When it comes to testing, one such place is ... Georgia. This is not the Georgia in the southern United States (capital = Atlanta, where people play football while wearing helmets), but rather the small, mountainous country that borders the Black Sea which emerged as a result of the breakup of the Soviet Union (capital = Tbilisi, where people play football with their feet).
Georgia is the first country in the world to utilize computer adaptive testing for all of its school leaving examinations.
What does this mean,
what does it look like in practice,
what is there to learn from this experience,
and why should we care?
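To make the general idea concrete before turning to those questions: in a computer adaptive test, each successive question is chosen based on a running estimate of the test-taker's ability, rather than being fixed in advance. Below is a deliberately minimal sketch, using a one-parameter (Rasch) item response model with an illustrative item bank and a crude fixed-step ability update; it is not a description of how Georgia's actual system works.

```python
import math
import random

def prob_correct(theta, difficulty):
    # Rasch (1PL) model: probability that a test-taker with
    # ability theta answers an item of this difficulty correctly.
    return 1.0 / (1.0 + math.exp(-(theta - difficulty)))

def run_cat(respond, item_bank, num_items=5):
    """Administer a simple adaptive test.

    respond(difficulty) -> True/False simulates the test-taker;
    item_bank is a list of item difficulties (illustrative values).
    """
    theta = 0.0                  # running ability estimate
    remaining = list(item_bank)
    for _ in range(num_items):
        # Pick the unused item whose difficulty is closest to the
        # current ability estimate (where the item is most informative).
        item = min(remaining, key=lambda d: abs(d - theta))
        remaining.remove(item)
        correct = respond(item)
        # Crude fixed-step update: nudge the estimate up or down.
        theta += 0.5 if correct else -0.5
    return theta

# Simulated test-taker whose true ability is 1.0
random.seed(42)
taker = lambda d: random.random() < prob_correct(1.0, d)
estimate = run_cat(taker, item_bank=[-2, -1, -0.5, 0, 0.5, 1, 1.5, 2])
```

Real systems use maximum-likelihood or Bayesian ability estimates and item banks with carefully calibrated difficulty (and discrimination) parameters, but the core loop (estimate, select the most informative item, update) is the same, and it is what distinguishes an adaptive exam from a fixed paper form.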
This week over a million students around the world will participate in the Hour of Code, an annual event designed to excite interest in computer science and computer programming. This reflects a growing interest in some quarters to promote efforts within schools to broaden awareness of what it means to 'code' (i.e. write a set of step-by-step directions that instruct computers to do something) and to help students develop related skills.
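For readers unfamiliar with what such a first lesson actually looks like, the 'step-by-step directions' in question can be remarkably modest. Here is a generic, illustrative snippet of the kind a beginner might write (it is not drawn from any particular Hour of Code tutorial):

```python
# A first 'program': a short list of step-by-step instructions
# that the computer follows exactly, in order.
def star_triangle(rows):
    """Return a list of strings forming a triangle of stars."""
    return ["*" * i for i in range(1, rows + 1)]

for line in star_triangle(5):
    print(line)
```

The point of such exercises is less the output itself than the habit of mind they cultivate: decomposing a task into unambiguous instructions that a machine can follow.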
Perhaps not surprisingly, many leading technology firms have been keen proponents and supporters of this educational coding 'movement'. While such support has been particularly pronounced and high profile in the United States -- many of the prominent organizations have close ties to and/or roots in Silicon Valley -- this is long past being only a North American phenomenon.
Citing the increasing importance of coding skills, and IT jobs more broadly, to their national economies, policymakers in many countries are considering national coding education efforts of various sorts – and a few education systems have already begun to implement related initiatives. From Trinidad and Tobago to Indonesia to Nigeria, 'coding' is being introduced into classrooms and curricula around the world in various ways, both informally and (increasingly) formally as well, for better and/or for worse (depending on your perspective, and the particular nature or rigor of the specific initiatives).
This phenomenon is notably observable across Europe, where, rather famously (at least within the communities of people who care about and pay attention to such things), Estonia and the United Kingdom have introduced coding curricula for students beginning in early primary grades (the UK has actually made this mandatory – as has Slovakia, for what that’s worth). Each year in October, CodeWeek.eu serves as a continental focal point and showcase for many of these sorts of national and regional efforts. A recent report from European Schoolnet (Computer programming and coding - Priorities, school curricula and initiatives across Europe [pdf]) features results from a survey of 21 ministries of education about their current coding-related initiatives and plans for the future. To date, 16 European countries have integrated coding into their curricula at some level (with Finland and the Flemish part of Belgium expected to do so in 2016). While the focus of most of these countries has been at the upper secondary level, coding is increasingly to be found (or soon to be found) at the primary level in a majority of these countries as well. The report highlights a number of important related pedagogical questions that are emerging out of European experience:
How to design effectively the learning processes and outcomes involving coding?
Which concrete activities (and programming languages) are most appropriate for different students, according to their age, interests and capacities?
What are the particular merits (and limits) of adopting a cross-curricular approach to teaching coding or a discrete computer science subject?
How to refine assessment, in particular where coding is integrated in a cross-curricular approach in other subjects?
It also highlights many challenges related to training and support for teachers. While many of the startups developing the tools and services that make the coding movement possible are in the United States, Europe is in many ways at the center of actual related activities in schools.
“Coding”, it is said by some, is the “new literacy”. The ability to write and understand computer code, some contend, is increasingly fundamental to understanding how to navigate one’s way through, to say nothing of succeeding in, a modern society where more and more of our lives are enabled and/or constrained by the actions of devices and information systems that run on computer code.
I expect that few people would argue that efforts to expose some students to ‘coding’ (and to develop some related skills) is a bad thing. That said:
Should *all* students learn how to code? All? That’s ridiculous! some would answer. All? Absolutely! others respond.
I’ve sat in on a number of related discussions in ministries of education and at education policy forums around the world. At times, it can seem like members of these two groups are not only on different pages, but reading from totally different books. Those people just don’t get it, I have heard representatives from both groups lament about each other after the conclusion of such meetings.
For what it’s worth, and in case it might be of any interest to others, here are, in no particular order, some of the most common arguments I hear made both in support of, and against, educational coding initiatives:
(In my experience, this query may be a result of an evaluation that showed little or no 'impact of technology on student learning’, despite massive amounts of money that have been invested … or it may simply come about because a lot of initiatives are coming to an end and the groups involved in them are looking for stuff to do. There are other impetuses as well, but I am regularly reminded that the motivations which animate this sort of question can vary quite a bit!)
In such instances, I usually respond by first congratulating them on all of these great accomplishments. A lot of hard work by a lot of dedicated people was needed to make these things happen, and no doubt lots of difficult challenges popped up along the way.
As difficult (and expensive) as it may have been to achieve all of these things, in many ways they represent just table settings, and not the main course. In other words (and to adopt another metaphor, in case you didn't like that last one), they are some of the key raw materials that can be used to help do something purposeful with technology to support learning that has real, demonstrable impact.
It's about the content, not the container, after all. (That will be the last metaphor for a while, I promise.) As devices proliferate, and as better and more widespread connectivity enables connections to networks from different places using a variety of different devices, the value will decreasingly be in the devices themselves, but rather in the educational content they enable learners and teachers to access (and in the connections to communities of other people as well).
OK, they counter, if the value is indeed in the content, and not the container ... where do we get this content?
The textbooks we use do not have straightforward digital equivalents or complements.
We really haven't budgeted for any digital learning content.
We spent almost all of our money on hardware (and a little on related training).
We know that there is a lot of 'free' content available on the Internet that would be useful to teachers and students.
Many critics of contemporary schooling practices have noted that, if a teacher from the 19th century was magically transported into a typical classroom today, she would feel very comfortable with how things look. The room itself would be very familiar.
(Whether that teacher would be comfortable with today's students is another matter entirely, given that they probably look a little different than they did 'back in the day' -- to say nothing of how they might act and some of the opinions they might have!)
Contrast this, such critics note, with the situation of a surgeon from the 19th century teleported into an operating room today -- he would be bewildered, and perhaps disoriented, by all of the technology on display.
Few would deny that, in many fundamental and obvious ways, technology has revolutionized medicine and healthcare.
Why hasn't it done so (yet) for learning and education?
One way that critics illustrate and reinforce this question is to share pictures of 'typical' operating rooms in the 19th and 21st centuries, alongside pictures of 'typical' classrooms from both centuries. The classrooms in such examples usually do look quite the same, with a teacher standing at the front of the room and neatly lined up rows of students intently (if metaphorically) drinking from the fountain of the teacher's knowledge. The chief noticeable difference (again, apart from the students themselves -- and the teachers as well) is that there are now computing devices of some sort on display in the 'modern' classroom, sometimes (depending on the country) lots of them, although the room essentially looks and functions the same way. The arrangement and nature of these ICT devices don't fundamentally alter the architecture of the room, nor what occurs inside it. In other words, the changes are additive, not transformative. (It is of course possible to provide pictures of some of today's 'innovative' classrooms that complicate this simple and popular narrative, as well as to ask some fundamental and important questions about what such pictures may obscure and what they illuminate, but I'll ignore such inconvenient complications here.)
Side note: Over a dozen years ago I visited the launch of a computer lab at a school in Cambodia. The headmaster had proudly transformed a room formerly used for sewing instruction into a 'technology lab': a new PC atop each desk in place of the 'old-fashioned' technology of the sewing machine, and neat rows of students facing forward toward a teacher who was energetically shouting instructions.
Let's also put aside for a moment whether all of this technology 'makes a difference' (as well as perhaps more relevant questions about how and under what circumstances ICTs have an 'impact'). Let's ignore discussions about whether or not today's classrooms are a legacy of a 'factory model of education' that once existed but is no longer useful, or about the potential need to re-think school architecture in the age of ICT. Let's also ignore related 'big picture' issues around policymaking and planning.
Let's focus instead just on the technology itself.
Many regular readers of the EduTech blog are no doubt familiar with scenes of ICT equipment sitting unused in schools, locked away in computer labs or even still resting peacefully (and undamaged!) in unopened boxes. Often, getting teachers and students to use such equipment, let alone to use it 'productively', can be a rather tall order, for all sorts of reasons. Nevertheless, education ministries, local educational authorities, and schools around the world are buying lots of technology: PCs, laptops, tablets, projectors, and lots of other devices and peripherals.
What are they doing to make sure that this stuff doesn't get stolen?
At 9:00 am this past Monday morning, almost 30 people crammed into a small conference room at the World Bank in DC to talk about ... videogames. (A good number more were queued up online to join in, but unfortunately technical snafus prevented them from participating -- our continued apologies if you count yourself among that group.) The featured presenter at this discussion, my colleague Mariam Adil ("Meet the Woman Who's Shaking Up Pakistan's Social Gaming Industry"), the founder of GRID (Gaming Revolution for International Development), shared some of the interesting and innovative things she has been doing to help create and roll out a number of educational mobile apps, as a contribution to broader discussions on topics related to 'early childhood development' (ECD).
Providing children and their caregivers with access to quality pre-school education opportunities is a primary activity of the World Bank's work related to early childhood development. No one who participated in Monday's discussion expressed the view that 'technology is the answer to the challenges of ECD'. That said:
Are there approaches and activities related to early childhood development worth pursuing that can be complemented, and in some cases helpfully enabled by, new technologies?
As the related World Bank strategy states, "Investing in young children through ECD programs—ensuring they have the right stimulation, nurturing and nutrition—is one of the smartest investments a country can make to address inequality, break the cycle of poverty, and improve outcomes later in life."
Given the proliferation of mobile phones in communities around the world, there can be no denying that such things are increasingly in the hands of parents and caregivers (and, for better or worse, in the hands of children as well, both briefly and for extended periods of time).
What are we learning about what is possible, and what is useful, to do with these devices that can complement and extend many ECD activities and programs?
I began my career exploring the uses of information and communication technologies (ICTs) in education in Ghana, Uganda and a number of other places in Africa in the late 1990s, and have continued to stay engaged with lots of passionate and innovative groups and people working with ICTs in various ways to help meet a variety of challenges related to education across the continent. Because of this history, and continued connections to lots of folks doing related cool stuff, I am from time to time asked:
"So, what's happening with technology use in education in Africa these days?"
That said, while the impulse from some corners to refer to 'Africa' may be unfortunate but nevertheless predictable, being asked this sort of question at least provides an opportunity to unpack it in ways that are (hopefully) useful and interesting. The EduTech blog was conceived in part, and in a decidedly modest way, to help direct the gaze of some folks to some of the interesting questions and challenges related to ICT use in education being addressed in different ways in different communities in Africa, by groups who are, along the way, coming up with some interesting answers and solutions.
The UNESCO Institute for Statistics (UIS), the arm of the United Nations charged with collecting global data related to education (and some other sectors as well), recently came out with a report that provides some useful data that collectively can help outline the general shape of some of what is happening across the African continent when it comes to the availability and use of educational technologies. Information and Communication Technology (ICT) in Education in Sub-Saharan Africa: A Comparative Analysis of Basic e-Readiness in Schools is certainly not the first such report that has taken a continent-wide perspective, but it is notable in a number of regards -- not only because it is the most recent such effort, but also because it is intended as a precursor to more regular, systematic data collection efforts going forward.
Written by Peter Wallet (with the assistance of Beatriz Valdes Melgar), the report presents data and related analysis from a survey co-sponsored by UIS, the Korea Education and Research Information Service (KERIS) and the Brazilian Regional Center for Studies on the Development of the Information Society (CETIC.br -- the group responsible for the annual Survey of ICT and Education in Brazil). The report notes that, unfortunately, "data on ICT in education in the region are sparse. Collecting more and better quality statistics will be a priority in the post-2015 development agenda given the growing role of ICT in education. In response, the UIS is working with countries to establish appropriate mechanisms to process and report data, and to better measure the impact of technology on the quality of education." With that caveat and announcement out of the way, the report then utilizes the UIS Guide to Measuring Information and Communication Technologies (ICT) in Education as a framework with which to examine what could be discovered about the existence of related national policies, data about learner-computer ratios, school electrification and connectivity, and ICT-related instruction and curricula in ways consistent with other regional reports that the UIS has published on Asia, Latin America and a number of Arab countries. (Here's a list of international surveys of this sort from UIS and others for anyone who might be interested, as well as some general information about efforts of this sort.)
Much is made of the need for 'innovation' in education. Bullet points containing words like 'disruption' and 'transformation' increasingly characterize presentations at big education gatherings -- especially in North America, and especially where educational entrepreneurs and 'Silicon Valley-types' are to be found. The popular press is replete with (sometimes breathless) articles about the 'revolutionary' potential of some new technology to impact teaching and learning in ways that are often quite exciting. Indeed: There can be little doubt that the increased diffusion of low(er) cost, (more) powerful, connected IT devices across and within communities offers exciting possibilities and potential to do things differently -- potentially in a good way.
For many people, the use of technology in education constitutes a de facto 'innovation'. Whether or not this belief is actually accurate, or useful, is a legitimate question for discussion. That said, there is no denying that many of the educational innovations celebrated (or at least touted) today are enabled by the use of such technologies in some way.
Around the world, there are few more conservative and traditional sectors than those related to public education. In many ways this is totally understandable, and appropriate. Investments in education represent investments in the future -- of our children, of our future citizens and workers and leaders and community members. We don't want to gamble with or experiment with the way we educate our children and try out too many new things, or so goes one line of thinking. The potential downside, or failure, carries with it consequences that are just too great.
And yet: We know that, for millions of children around the world, the education they are getting today isn't actually all that great. Some frightening stats from just one page of the latest Global Monitoring Report [pdf], drawing on recent research from RTI:
In Nicaragua in 2011, around 60% of second-graders could not identify numbers correctly and more than 90% were unable to answer a subtraction question.
In Malawi, 94% of second-graders could not respond correctly to a single question about a story they read in Chichewa, the national language.
In Iraq, 25% of third-graders were unable to tell the sound of a letter in Arabic.
And if you think that the situations in certain education systems are bad: Around the world, many children and adolescents -- 124 million, according to the latest figures from UNESCO -- are out of school and not getting any formal education at all.
In many cases then -- too many -- education systems aren't actually working all that well. In others -- like the global 'high performers' that are regularly held up as 'best practice' examples for other countries to emulate (Finland, Shanghai, Korea, Singapore) -- there is the danger that what worked well in the past (or what appears to be working well now) might not work so well in the future. The future is changing -- shouldn't we change the way we prepare for it? The riskiest course of action might well be one where people and institutions don't take risks.
Where business as usual is decidedly not working today,
or where it is feared that business as usual may not work tomorrow ...
what are some examples of business unusual from which
we might draw inspiration -- as well as practical insight?
Many good examples of this sort are regularly cited from experiences in highly developed, industrialized economies of North America, Europe and East Asia. No doubt much can be, and will be, profitably learned from what is happening in such places. That said, the challenges facing education systems and families around the world are particularly acute where the needs are greatest: in many low- and middle-income countries, and especially within remote communities and traditionally disadvantaged populations.
Examples of 'innovation in education' from such places might just be more relevant to policymakers in Phnom Penh or Quito than are ones which originate in, say, Palo Alto or Cambridge. (And, it is perhaps worth noting that, if you believe that innovation often arises 'at the edges', where constraints compel people to be inventive in their approaches to solving problems in ways that folks in more resource-rich environments may never consider, it may just be that policymakers in Paris and Canberra have something to learn from what's happening in 'developing countries' as well.)
What examples do we have of innovative uses of educational technologies in such places?