The case for a new Global Edtech Readiness Index

Many middle- and low-income countries around the world are preparing to significantly increase their investments in educational technologies -- or have already begun to do so.

How might these countries, at a very high level, measure and track key components of their edtech investments -- and compare those measurements against what is happening in other countries -- in order to better understand what is working, and what isn't?

Let's be clear: Countries should ultimately measure the impact of their edtech-related investments against their educational goals ('improved student learning', for example -- whatever that may mean). But as countries plan and roll out large national edtech initiatives, a set of interim measures could be quite useful for tracking related progress (or the lack of it) on the input side. As part of such an effort, it might be helpful to adopt some standardized general measures, so as to allow for benchmarking what is happening in a given country against situations in other countries and to set related targets that are globally comparable.

More broadly: By articulating and highlighting a set of 'indicators' as part of a new global edtech readiness index, it might be possible to shape and influence high level discussions within education ministries, broadening related conversations beyond a traditional focus on just buying more (and more) hardware.

Unless you are the sort of person who questions the value of trying to measure almost anything in education (in which case, you should probably just stop reading at this point), all of this probably seems rather reasonable. Whether you are an evangelical enthusiast for, or a diehard skeptic of, investments in educational technologies, having data to support or confirm your biases and beliefs can be useful. And if your view of such things is a bit more nuanced, having data -- especially data that can be compared to what is happening in other places -- to help inform your thinking and actions would be useful as well.

Around 15 years ago, the UNESCO Institute for Statistics (UIS) recognized this challenge -- and opportunity -- and proposed a set of "ICT/education indicators" to help policymakers fill related knowledge gaps. With the help of statistical agencies and education ministries in over a dozen countries, UIS led a multi-year process that defined, debated and field-tested related indicators in a variety of high-, middle- and low-income countries around the world. The UIS Guide to Measuring Information and Communication Technologies in Education appeared in 2009, quickly set the global standard for the collection of globally comparable data sets related to 'edtech', and was used in official data collection efforts in many countries.

A lot can happen in ten years. Just as investments in educational technologies started to explode in many middle and low income countries around the world, the UIS ICT/education indicator initiative was phased out, a victim of budget cuts and other pressing priorities. Many of the technology tools and related practices defined and explored as part of the UIS ICT/education indicators work have changed, new ones have emerged, and the collective understanding and belief about what's important when it comes to technology use in education has changed in important ways as well.

Might it be worth reviving some elements of this effort, updating and adapting them to focus on a few key measures related to a country's perceived 'readiness' to utilize educational technologies within its education system?

---

Earlier this year, I was contacted 'out of the blue' by three different countries about to engage in massive new national edtech projects (more devices for more kids, more bandwidth for schools, more training programs to promote 'digital literacy', more digital textbooks -- the usual stuff).

All asked:

  • Is there some way we can compare our current situation to that of other countries? This could help inform us as we set some related investment targets.
  • In the long term, we are interested in the impact on 'student learning', but as we work toward that ultimate goal, it would be useful to have clarity on some of the things we should be measuring along the way related to our investments in educational technologies.

At the same time:

International donor agencies like the World Bank are considering ambitious 'moonshot' initiatives that will include significant investments in school connectivity and the development of a variety of 'digital skills' for young people.

  • They wonder: How might related 'progress' be measured and tracked over time?

And:

Some large philanthropies are exploring if there are critical gaps in national and local edtech-related 'ecosystems' that aren't being filled through existing public or private investments.

  • They ask things like: How might we quickly gauge whether the lion's share of edtech-related investments is going into buying hardware and software, while critical complementary investments in the capacity and skills of teachers and students to utilize increasingly available technology tools are not being made?

If such gaps exist, perhaps philanthropic monies can help to fill in some of them, and/or help build a case that others should do so?

---

A natural inclination is to measure things that are most easily counted. When it comes to the use of education technologies, people typically measure the number of digital devices for use by students, for example, or available bandwidth in schools. Such things aren't always that easy to count in practice, as it turns out, but they can in the end be measured, and it isn't too difficult to convince people that doing so is worthwhile. That said, there is little compelling evidence to suggest that the mere availability of devices and connectivity alone makes a positive impact on student learning. While many people argue that related investments are necessary in the 21st century, only the most extreme techno-utopians would argue that they are sufficient.

Given the amount of money being spent on large scale edtech projects around the world, it would (presumably) be broadly useful to be able to track and compare what is happening as a result. If related investments in digital infrastructure in education systems are all that are tracked and compared across countries, however, there is a real danger that policymakers will largely focus their attention on such measures, ignoring other ones that may be equally -- or perhaps even more -- important in the end.

One can imagine a scenario, for example, where a policymaker in Country X can proudly proclaim that her country jumped 42 places in an international survey of 'digital infrastructure for education' (and that neighboring countries did not) ... an achievement that obscures the fact that there is little likelihood that related investments will make tangible positive impacts on what students actually learn -- unless some other things happen as well.

With this in mind, implicitly 'suggesting' some additional measures via an edtech readiness index could potentially help highlight the need for a number of complementary investments *if* investments in devices and connectivity are to be building blocks for other activities more integral and fundamental to teaching and learning. It might be, for example, that investments in things like digital education content, 'human capacity' (i.e. the development of skills by teachers and students to use technology tools effectively) and the capacity of an overall education system (both at a policy and implementation level) are important complements to investments in digital infrastructure. If so, might it not be useful to measure them in some way as well?

At a high level, this is the thinking behind a growing movement to create some sort of new 'global edtech readiness index', comprising a limited set of key indicators that could help education policymakers, and decision makers at other organizations committed to supporting national education systems (in other public institutions, as well as in the private and non-profit sectors, in community organizations and academia), monitor and track related progress and better assess whether complementary investments might be useful or necessary.

By including indicators beyond things related to simple (if expensive) infrastructure-related investments (i.e. the stuff that people usually measure), such an edtech readiness index could signal to decision makers key elements of a 'broader approach to edtech' and help track progress related to investments in these elements over time. While scoring highly on such an index would offer no guarantee that the desired impact on student learning would be achieved, low scores might suggest that some of the vital preconditions for impact are not in sufficient evidence -- in which case, it might be worthwhile to reconsider whatever is being planned.

Follow-on posts will explore some potential principles that could inform the creation of a new global edtech readiness index, what components of such an index might look like, and examine the case for and against creating such an index in more detail.
Note: The image used at the top of this blog post ("pointing to an index") comes from Pixabay and is used according to the terms of the Pixabay license.

Authors

Michael Trucano

Global Lead for Innovation in Education, Sr. Education & Technology Policy Specialist

Join the Conversation

Nicola Pitchford
August 19, 2019

This is a really interesting idea. For a country to be EdTech ready they also need to have their eyes wide open to the benefits and pitfalls that can arise with EdTech. It's so much more than hardware: effective implementation is critical to the successful introduction and sustainability of an EdTech programme. It will be important to show what works and what doesn't and WHY. So, research findings will be critical in nurturing a country’s readiness for EdTech as a means of raising understanding as to what they are going into. In addition, scoping of attitudes and beliefs around EdTech in key players will be important. A country that thinks EdTech is a silver bullet is as ill-prepared as one that thinks it has no value. EdTech programmes can be transformational when implemented effectively. It’s not a trivial thing to do though!

August 19, 2019

Thanks for your comments, Nicola. They are well heard! Best, Mike (ps For those who don't recognize her name: Professor Pitchford has, among other things, published a number of very useful research papers related to the use of edtech, especially in what I guess could be deemed 'challenging contexts'. See, for example, her work investigating the impact of the onebillion initiative in Malawi and Tanzania https://onebillion.org/impact/evidence/.)

Ed Gaible
August 16, 2019

What a great idea! IMHO it's critical to recognize that a lot of the 'readiness' to be indexed lies only partially within the education sector -- factors like private-sector infrastructure, consumers' use of mobile devices, even (somehow) relations with multi-national tech and EdTech companies and presence of EdTech VC... You might start by asking: What are useful bases of comparison for EdTech readiness between the Kyrgyz Republic and Kenya (or other countries that start with "K"...)? Complementing the Readiness Index should probably be a series of case-study histories that explore the processes and factors that contribute to the rare passages made by some countries from UNready to highly ready. What happened and why? Why were these phases important?

In other words, for countries scoring low on the readiness index, what needs to happen? Why does it need to happen? How long might it take for it to happen? What can accelerate the process?

August 19, 2019

Hi Ed, Thanks for your comment. The complementary case studies you suggest would indeed be valuable (as Sarah already mentions in her comment, RTI did some of this sort of stuff in the reports for Omidyar around 'edtech ecosystems'). One challenge in trying to conceive of an index of this sort is that there is so much work to be done, and mission/scope creep can be irresistible. How much can you bite off and chew at one time? How can you help provide materials that others can use? Some attempt to help inform discussions about whether a country is 'ready' to consider the large scale use of edtech tools is, when viewed against larger, more important work to be done, admittedly rather modest, and certainly incomplete. But it might be helpful, and provide a spark to help others do work that is even more helpful (and eventually impactful). Or maybe not. But I thought that thinking aloud in public about the potential utility of such an index might be a low cost way to help clarify some related thinking (even if, in the end, the thinking is only my own. :-) ) -- thus this blog post, and a few related ones to follow in short order.

Sarah Pouezevara
August 19, 2019

Ed -- Have you seen the Omidyar Network 'EdTech ecosystems' framework and report? https://www.omidyar.com/insights/scaling-access-impact-realizing-power-…. I was co-author on this report, in which we did something like you suggest--case studies of the journeys certain countries made to scaling and sustaining EdTech. We found that these countries progressed in very different ways, but somehow there were similar, recurring themes that were then described as the "EdTech ecosystem". We were careful NOT to call this a readiness index nor imply that the strength of any one component or the 'score' as a whole had any kind of predictive validity related to impact. Rather, it was a way to describe a country at a certain point in time and begin exploring how inputs or influences in one area might impact another, emphasizing, as Mike does above, that neither technology alone, nor training alone, nor content alone will necessarily result in the desired outcomes. It is in that sense that I worry about a "readiness" index implying that there is any linear way of getting from A to B. Maybe there are certain pre-requisites, but more often than not I've seen a "chicken and egg" scenario with no consistent answers---which came first, the policy or the infrastructure? Which came first, the delivery of the hardware or the teacher training? Which came first, the supply (of apps and content) or the demand for them? Etc.... We hope this ecosystem model emphasizes the interrelatedness between the components that are present, but not necessarily a consistent directional influence, though certainly a larger and more systematic review of countries and experiences might uncover such patterns. It will be interesting to see the next steps in the readiness index! Thanks Mike, as always, for a thought-provoking blog post!

August 19, 2019

Hi Sarah, Thanks for your comment. The Omidyar work is really great, in my opinion. The ecosystem model advanced in the Omidyar report is super useful and can/should help inform high level conversations in many countries when it comes to 'edtech topics'. To one of your specific comments: Such a readiness index would, hopefully, not mean to imply a 'progression' from one stage to another. As a practical matter: if the hardware isn't there, and teachers aren't trained in how to use it, and there is no digital content available, considerations of how to use educational technologies at scale might not be very practical. And if you are thinking about only doing one of these things, and not the others, you might be missing some important pieces of the puzzle. Which order you do such things in is, of course, important, and most likely dependent on lots of contextual factors in a given operating environment. That said, might there be a small set of things that you probably need to have accomplished (in any order) before you start thinking about how you can use tech at scale across an education system? Maybe. And if so, a 'readiness index' of this sort might be helpful. There's also a quite practical potential utility to developing and implementing an index of this sort -- it could help in the collection of data that, for the most part, simply don't exist. How much edtech-related training have teachers received? What is the current state of pedagogically relevant digital skills within a teaching workforce? How many schools are connected to broadband -- or at all? How many functioning tech devices are in use in schools? Lots of people make huge (and very expensive) decisions without having even these sorts of basic data at hand to help inform their thinking. The reason for drafting this quick blog post is to try to think aloud in public about this stuff, in the hope of crowding in the ideas and perspectives of others much more expert in this stuff than I!
Thanks again for your comment, and for your work on the (excellent) Omidyar report.