A number of places around the world have made very large, (hopefully) strategic investments in technology use across their formal education systems featuring so-called "1-to-1 computing", where every student has her own laptop or tablet learning device.
One of the largest national initiatives of this sort is largely unknown outside that country's borders. To the extent that Turkey's ambitious FATIH project is known around the world, it is probably as a result of headlines related to plans to buy massive numbers of tablets (news reports currently place the figures at about 11 million) and interactive whiteboards (over 450,000 will be placed in classrooms, labs, teacher rooms and kindergartens). The first big phase of the project began in 2011 with 52 schools receiving tablets and interactive whiteboards as a sort of pilot project to test implementation models, with results (here's one early evaluation report) meant to inform later, larger stages of (massive) roll-outs.
The project's acronymic title, FATIH (which stands for Fırsatları Artırma ve Teknolojiyi İyileştirme Hareketi, or 'Movement to Increase Opportunities and Technology'), deliberately recalls the conqueror of Istanbul, Fatih Sultan Mehmet. Speaking at the project's inauguration, Turkish Prime Minister Recep Tayyip Erdoğan noted that, “As Fatih Sultan Mehmet ended the Middle Ages and started a new era with the conquering of İstanbul in 1453, today we ended a dark age in education and started a new era, an era of information technology in Turkish education, with the FATİH project.”
What do we know about FATIH,
how might it develop,
and how might lessons from this development
be of interest and relevance to other countries
considering ambitious plans of their own to roll out educational technologies?
The New York Times famously labeled 2012 the 'year of the MOOC', acknowledging the attention and excitement generated by a few high profile 'massive open online courses' which enrolled tens of thousands of students from all over the world to participate in offerings from a few elite universities in the United States.
What might 2014 bring for MOOCs, especially as might relate to situations and circumstances in so-called ‘developing countries’?
It may be hard for some in North America to believe, given the near saturation coverage in some English language web sites that focus on higher education and in certain thematically-linked corners of the English-language blogosphere, but the 'MOOC' phenomenon is only just now starting to register with many educational policymakers in middle and low income countries around the world. While many MOOCs have (from the start, and increasingly) attracted students from all over the world, at the policy level, 'MOOCs' have not -- at least in my experience during the course of my work at the World Bank on education and technology issues -- been a topic much discussed by our counterparts in ministries of higher education and universities. Yes, one does see the occasional bullet point in a PowerPoint presentation towards the end of an institutional planning meeting, but my impression is that this can often be as much a reflection of the speaker's desire to project a familiarity with emerging buzzwords as it is a reflection of any sustained strategic or practical consideration of the potential relevance (or threat) of MOOCs to traditional practices in higher education outside of 'rich' countries.
More than a few commenters in North America have invoked the Technology Hype Cycle (a concept developed and popularized by Gartner to represent the maturity, adoption and social application of certain technologies) when proclaiming that MOOCs have now passed a 'peak of inflated expectations' to enter a period known as the 'trough of disillusionment', as a result of things like the recent change of course or 'pivot' of Udacity, one of the leading MOOC platform providers.
This assessment of the state of maturity/adoption may or may not be true from a North American perspective. We might even concede that technology hype cycles are being compressed (it took Second Life and other 'virtual worlds', another recent notable educational technology phenomenon, three times as long to move from a period of great hype in educational circles to one of 'disillusion'). Even so, such commenters often neglect to consider that many hype cycles can exist simultaneously for the same technology, or technology-enabled approach or service, depending on where in the world you might find yourself.
While perhaps unsure of the extent to which MOOCs represent a 'threat' to existing educational practices, a new avenue for higher education, or perhaps something else entirely, I agree with people who say that the reports of the death of the MOOC are highly exaggerated. Roy Amara, the longtime president of the Institute for the Future, famously remarked that "We tend to overestimate the effect of a technology in the short run and underestimate the effect in the long run." I would not be surprised if this holds for many of the trends that we, as a matter of convenience, and correctly or not, group together under the general heading of ‘MOOCs’ today.
In my personal experience working at the World Bank on projects at the intersection of technology and education sectors, and when in discussions in many similar sorts of international organizations, ‘MOOCs’ are, generally speaking, still not a hot topic of consideration for educational policymakers in most middle and low income countries. That said, they are starting to gain increasing mindshare in some places. At the very least, they are generating some real confusion (and where there is confusion, there is potentially opportunity as well, for better and for worse).
As a result, many folks in the international donor community are now beginning to ask themselves questions like:
• How can, or should, we be talking about MOOCs when speaking with our counterparts in government around the world?
• What are the real, practical opportunities to consider in the short and medium term?
• Where, and how, might education ministries and universities wish to engage with related issues -- and what role (if any) should organizations like the World Bank play in this process of engagement?
Many countries and education systems around the world are currently engaged in large-scale efforts to introduce huge numbers of computing devices (PCs, laptops, tablets) into schools and into the hands of teachers and students, and many more initiatives are under serious consideration. However one might feel about such projects (in general, or in particular instances), there is no denying that these can be quite complex undertakings, rolling out over many years, in multiple stages, with many interdependent components (related to e.g. infrastructure, content, training, assessment), and costing (tens of, sometimes hundreds of) millions of dollars. When planning such initiatives, there are many questions to be asked, large and small. One question that I don’t find is typically given much serious attention relates to what would, at first glance, probably appear to be a rather simple one, with a simple answer:
Who owns the laptops (tablets) that will be distributed to students (teachers)?
I regularly ask this question as part of my interactions with leaders of various such projects around the world. I find that I rarely get a simple or complete answer. This is potentially problematic, as the responses to this question, and a set of related ones, can have a very profound impact on how such projects function in practice, and thus on their (potential) impact as well.
Here’s one example of why this sort of thing is important:
For the past seven years, the Korean Education & Research Information Service (KERIS) has hosted an annual global symposium on ICT use in education, bringing senior policymakers and practitioners from around the world to Seoul to share emerging lessons from attempts to introduce and utilize information and communications technologies to help meet a wide variety of goals in the education sector. Each year this event, which is one important component of a strong multi-year partnership between the World Bank education sector and the Korean Ministry of Education, focuses on one particular theme. This year's symposium examined the 'changing role of teachers' and featured presentations from, and discussions with, policymakers from 22 countries. This was also the dominant theme of the first global symposium back in 2007 -- but, oh my, how the nature and content of the discussions have changed!
At that first symposium, much of the talk from policymakers in middle and low income countries was still of promise and potential, of the need to begin preparing for what was inevitably going to come. Where there were specific lessons and models and research to share, these were largely those from places like the United States, the UK, Australia -- and of course from Korea itself! For most of the policymakers from middle and low income countries participating in the event, helping to prepare and support teachers as they sought to use ICTs in various ways in support of their teaching, and to support student learning, was something being explored in various 'pilot' activities, and a topic perhaps given some (at least rhetorical) attention in national education policy documents. It wasn't yet a real area of large and sustained activity and expenditure -- largely because there just weren't that many computers in most schools, and those computers that were present were mostly to be found in computer labs, presided over by 'computer teachers' of various sorts. As this year's event made clear, the introduction of PCs, laptops, tablets, and interactive whiteboards is something that is happening *right now* in very large numbers in countries of all sorts, and ministries of education are ramping up and reforming their teacher training efforts as a result.
A few quick highlights from this year's symposium:
Following up on previous blog posts exploring issues related to planning for new investments in digital teaching and learning materials to be used across education systems, I thought I'd share some of the general recommendations that have often featured in related discussions with policymakers in which I have been involved, in case they might be of utility or interest to anyone else.
This list certainly isn't comprehensive. As with all posts on the EduTech blog, the standard disclaimers should apply (e.g. these are the views of the author and do not necessarily represent official views of the World Bank, etc.). It is perhaps worth noting that these sorts of suggestions are typically made and discussed within a specific context: A country has decided, for better or for worse, that it will consider significant new investments in digital teaching and learning materials. With this decision already made, policymakers are looking for some additional perspectives and inputs to help guide their thinking as they move forward.
In other words: These sorts of recommendations typically are not meant to inform higher level discussions about fundamental strategic priorities in the education sector (although, where they may help trigger reconsideration of some broader decisions made at higher levels, that may not always be such a bad thing). They are not meant to help, for example, policymakers assess whether or not to spend money on digital textbooks versus buying related hardware, let alone whether or not investments in digital learning resources should be made instead of spending money on things like school feeding programs, improvements in instruction at teacher training colleges, or hiring more teachers. Rather, they are more along the lines of:
So you have decided to buy a lot of 'digital textbooks'?
Here is some potential food for thought.
With that context and those caveats in place, here are ten general recommendations that education officials contemplating the use of digital teaching and learning materials at scale across a country’s education system may wish to consider during their related planning processes:
Across Africa, a variety of devices are increasingly being used to disseminate and display teaching and learning materials in electronic and digital formats. As costs for such devices continue to fall, and as the devices themselves become more widely available and used across communities, the small pilot, and largely NGO-led, projects that have characterized most efforts to introduce educational technologies in schools across Africa will inevitably be complemented, and in many cases superseded, by large-scale national initiatives of the sorts now taking place in Rwanda and Kenya, where hundreds of thousands of devices are being, or will soon be, distributed to schools.
Few would argue that the use of such devices does not offer great promise and potential to improve the access to and quality of education by providing access to more educational content than is currently available inside and outside of schools. Internet connectivity can provide access to millions of educational materials available on the Internet; low cost, handheld e-reading devices can hold more than a thousand books. Depending on the availability of connectivity, or local resourcefulness in transferring materials to devices manually, digital content used in schools can be updated more regularly than is possible with printed materials. Depending on the device utilized, this content can be presented as 'rich media', with audio, video and animations helping content be displayed in ways that are engaging and interactive. It is possible to track electronically how such content is used, and, depending on the technologies employed, to present content to teachers and learners in personalized ways. In some cases, this content can be delivered at lower costs than those incurred when providing traditional printed materials.
Given the increased availability and diffusion of consumer computing technologies across much of the continent in less than a decade, it is perhaps not surprising that a number of widespread misconceptions about the promise and potential of using digital technologies and devices across Africa to increase access to learning materials appear to have taken hold. On one level, this is consistent with the ‘hype cycle’ model of technology diffusion in which, according to Gartner, a technology breakthrough is soon followed by a period of time of “inflated expectations” about what sort of changes might be possible as a result.
A few countries across Africa are considering rather ambitious initiatives to roll out and utilize digital textbooks, a general catch-all term or metaphor which I understand in many circumstances to mean 'teaching and learning resources and materials presented in electronic and digital formats'.
How much will such initiatives cost?
Reflexively, some ministries of education (and donors!) may think this is a pretty straightforward question to answer. After all, they have been buying textbooks in printed formats for a long time, they have a good handle on what such materials traditionally cost, and so they may naturally presume that they can think about the costs of 'digital textbooks' in pretty similar ways.
Many people are surprised to discover that calculating costs associated with the introduction and use of digital teaching and learning materials is often a non-trivial endeavor. At a basic level, how much an education system spends will depend on what it intends to do, its current capacity to support such use – and of course what it can afford. As they investigate matters more deeply (and sit through many presentations from publishers and other vendors, sometimes wowed at what is now possible and available while at the same time rather confused about what is now possible and available), education officials seeking to acquire digital teaching and learning materials for use at scale across an education system may find costing exercises to be, in reality, rather challenging and (surprisingly) complex when compared to their ‘standard’ textbook procurement practices.
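To give a flavor of why such costing exercises are more complex than standard textbook procurement, here is a minimal sketch in Python. Every line item and figure below is a hypothetical placeholder of my own, chosen only to show how device amortization, breakage, content licensing, connectivity and training costs interact; a real total-cost-of-ownership model for an education system would involve many more variables (and much debate about each of them).

```python
# Hypothetical annualized per-student cost comparison: printed vs. digital
# textbooks. All figures are illustrative placeholders, not real data.

def annualized(capital_cost, useful_life_years):
    """Spread a one-time cost evenly over its useful life (straight-line)."""
    return capital_cost / useful_life_years

def printed_cost_per_student(book_price=5.0, books_per_student=6,
                             replacement_cycle_years=3,
                             distribution_overhead=0.15):
    # Books are replaced on a fixed cycle; add overhead for warehousing
    # and distribution to schools.
    base = annualized(book_price * books_per_student, replacement_cycle_years)
    return base * (1 + distribution_overhead)

def digital_cost_per_student(device_price=100.0, device_life_years=3,
                             content_license_per_year=8.0,
                             connectivity_per_year=10.0,
                             training_support_per_year=12.0,
                             breakage_rate=0.10):
    # The device itself is only one line item; recurring costs for content
    # licenses, connectivity, training and support often dominate over time.
    device = annualized(device_price, device_life_years)
    device *= (1 + breakage_rate)  # replacements for lost/broken devices
    return (device + content_license_per_year +
            connectivity_per_year + training_support_per_year)

print(f"printed: ${printed_cost_per_student():.2f} per student per year")
print(f"digital: ${digital_cost_per_student():.2f} per student per year")
```

Even this toy model shows how the answer swings with assumptions that have no analogue in traditional textbook procurement: halve the device life, or double the breakage rate, and the 'digital' column changes dramatically while the 'printed' column does not.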
Much of the discussion related to how new technologies can be used in classrooms in low and middle income countries focuses on the use of PCs, desktops and tablets. Less discussed, I often find, is the strategic potential of various so-called peripheral devices, which are (in my experience) typically only considered within the context of how they can be used to enhance or extend the functionality of the 'main' computing devices available in schools.
Many education systems (for better or for worse) have specific 'hardware' budgets, and, when they are looking to tap these budgets to introduce more hardware into schools, in my experience they often look to buy more of what they already have, supplemented in places by things like interactive whiteboards, or networked printers, as a complement to what is already available in a school.
When talking with educational planners contemplating how to use funds specifically dedicated to purchase computer hardware, I often counsel them to think much more broadly about what they may wish to buy with these monies, within a larger context of discussing things like how such equipment can be utilized to meet larger educational objectives, what sorts of training and maintenance support may be needed, and how the use of this technology can complement other, non-technology-enabled activities in a classroom. As part of such discussions, I often find myself attempting in various ways to challenge policymakers and planners to think beyond their current models for technology use.
One general type of gadget that I only rarely hear discussed is so-called 'probeware', which refers to a set of devices that are typically used in science classes to measure various things -- temperature, for example, or the pH level of soil, or the salinity of water. Despite the increasing emphasis on STEM subjects in many countries, and what is often a rhetorical linkage between the use of computers in schools and STEM topics, I rarely find that World Bank client countries are considering the widespread use of probeware in a strategic way as part of their discussions around ICT use in schools. That said, one suspects that such an interest is coming, especially once the big vendors direct more of their attentions to raising awareness among policymakers in such places (much like the interactive whiteboard vendors began to do a half-decade or so ago).
While probeware is a new type of peripheral for many education policymakers, there is another peripheral with which policymakers are already quite familiar, and which is already used in ad hoc ways in many schools, but which rarely seems to be considered at a system level for use in strategic ways. Once a critical mass of computers is in place, instead of buying one additional PC, might it be worth considering (for example) utilizing video cameras? Video can be put to lots of productive uses (and some perhaps not-so-productive uses). Considering three concrete examples from around the world may shed some light on how video can be used to improve teaching -- and support teachers.
Recent headlines from places as diverse as Kenya ("6,000 primary schools picked for free laptop project") and California ("Los Angeles plans to give 640,000 students free iPads") are just two announcements among many which highlight the increasing speed and scale by which portable computing devices (laptops, tablets) are being rolled out in school systems all over the world. Based on costs alone -- and the costs can be very large! -- such headlines suggest that discussions of technology use in schools are starting to become much more central to educational policies and planning processes in scores of countries, rich and poor, across all continents.
Are these sorts of projects good ideas? It depends. The devil is often in the details (and the cost-benefit analysis), I find. Whether or not they are good ideas, there is no denying that they are occurring, for better and/or for worse, with greater frequency, and in greater numbers. More practically, then:
What do we know about what works,
and what doesn't (and how?, and why?)
when planning for and implementing such projects,
what the related costs and benefits might be,
and where might we look as we try to find answers to such questions?
There are, broadly speaking, two strands of concurrent thinking that dominate discussions around the use of new technologies in education around the world. At one end of the continuum, talk is dominated by words like 'transformation'. The (excellent) National Education Technology Plan of the United States (Transforming American Education: Learning Powered by Technology), for example, calls for "applying the advanced technologies used in our daily personal and professional lives to our entire education system to improve student learning, accelerate and scale up the adoption of effective practices, and use data and information for continuous improvement."
This is, if you will, a largely 'developed' country sort of discourse, where new technologies and approaches are layered upon older approaches and technologies in systems that largely 'work', at least from a global perspective. While the citizens of such countries may talk about a 'crisis' in their education systems (and may indeed have been talking about such a crisis for more than a generation), citizens of many other, much 'less developed' countries would happily switch places.
If you want to see a true crisis in education, come have a look at our schools, they might (and do!) say, or at least the remote ones where a young teacher in an isolated village who has only received a tenth grade education tries to teach 60+ children in a dilapidated, multigrade classroom where books are scarce and many of the students (and even more of their parents) are often functionally illiterate.
Like so many things in life, it all depends on your perspective. One country's education crisis situation may be (for better or for worse) another country's aspiration. While talk in some places may be about how new technologies can help transform education, in other places it is about how such tools can help education systems function at a basic level.
The potential uses of information and communication technologies -- ICTs -- are increasingly part of considerations around education planning in both sorts of places. One challenge for educational policymakers and planners in the remote, low income scenario is that most models (and expertise, and research) related to ICT use come from high-income contexts and environments (typically urban, or at least peri-urban). One consequence is that technology-enabled 'solutions' are imported and (sort of) 'made to fit' into more challenging environments. When they don't work, this is taken as 'evidence' that ICT use in education in such places is irrelevant (and some folks go so far to state that related discussions are irresponsible as a result).
There is, thankfully, some emerging thinking coalescing around various types of principles and approaches that may be useful to help guide the planning and implementation of ICT in education initiatives in such environments. As part of my duties at the World Bank, I have been discussing a set of such principles and approaches with a number of groups recently, and thought I'd share them here, in case they might be of wider interest or utility to anyone else. Are they universally applicable or relevant? Probably not. But the hope is that they might be useful to organizations considering using ICTs in the education sector in very challenging environments -- especially where introducing these principles and approaches into planning discussions may cause such groups to challenge assumptions and conventional wisdom about what 'works', and how best to proceed.