As more and more countries make larger and larger investments to support the use of educational technologies, a case can be made that a new global edtech readiness index might help inform education policymakers as they set related targets and measure their progress against them. Such an index, if well constructed, could also help reveal where some of the likely preconditions for the successful use of educational technologies at scale across an education system are not in place. Making a set of globally comparable, ‘core’ edtech data widely available via such an index could help groups looking to support activities in the education sector identify where there are opportunities to make complementary investments, especially where a few of the necessary preconditions to utilize edtech are clearly not in place, as well as to identify where their own edtech-related activities might be most likely to bear fruit.
A starting principle to consider when constructing such an index would be to ‘do no harm’; a second could be simply summarized as ‘be useful’. When constructing the index, and using it to help inform discussions of the use of edtech at scale, it should always be acknowledged that not all things that can be measured are important, and that not all important things are easily measurable.
That said: Might some of them be?
A number of groups are exploring the potential feasibility of supporting the creation of a global ‘edtech readiness index’. In case it might be a useful input into any related deliberations and plans, here are ten quick additional thoughts about what such an index might aim to do.
A quick caveat: The ideas presented here should in no way be construed as the official views of the World Bank on any of this stuff. The point here is to help spark related discussion, and, by signaling that related conversations are underway, to help bring various groups talking about this stuff into the same room. This is 'thinking aloud in public', in the hope that doing so might be useful to other people smarter and more informed than I am about related topics and issues.
With this context in mind, a new global edtech readiness index could and/or should ...
1. ... provide a quick, high level overview of a few key elements within the operating environment in a given country, helping people better assess whether it might be feasible to deploy edtech at scale across an education system.
The objective would be to create something that is simple and easy to understand, and that does not include too many elements or components. ‘Quick’ and ‘high level’ do not imply that such an index would be ‘comprehensive’ or ‘in depth’, nor that the topic itself does not deserve comprehensive and in-depth analytical attention. The idea is to create a 'lite' index specific to the use of educational technologies that would function something like the World Economic Forum’s Networked Readiness Index, attempting to help gauge "whether a country possesses the drivers necessary for digital technologies to meet their potential". This would be a tool to help start conversations – not finish them.
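To make the idea of a 'lite' composite index a bit more concrete, here is a minimal, purely illustrative sketch of how a handful of pillar scores might be rolled up into a single readiness score. The pillar names, weights, and 0-100 scoring scale are all hypothetical assumptions for the sake of illustration, not a proposal for what the actual index should contain.

```python
# Purely illustrative sketch: combining a few hypothetical 'pillar' scores
# (each normalized to 0-100) into a single composite readiness score.
# The pillars, weights, and scale below are assumptions for demonstration only.

HYPOTHETICAL_WEIGHTS = {
    "connectivity": 0.30,       # e.g. share of schools with Internet access
    "devices": 0.25,            # e.g. availability of computers for pedagogy
    "teacher_capacity": 0.25,   # e.g. teachers trained to use ICT in teaching
    "policy_environment": 0.20, # e.g. existence of an edtech policy or agency
}

def composite_readiness_score(pillar_scores: dict) -> float:
    """Return a weighted average of pillar scores (each expected in 0-100)."""
    total = 0.0
    for pillar, weight in HYPOTHETICAL_WEIGHTS.items():
        total += weight * pillar_scores[pillar]
    return round(total, 1)

# Example with made-up numbers for a fictional country:
example = {"connectivity": 45, "devices": 60, "teacher_capacity": 30, "policy_environment": 70}
print(composite_readiness_score(example))  # -> 50.0
```

A real index would of course need to justify each pillar, weight, and data source; the point of the sketch is simply that a small, transparent calculation of this sort is enough to start (not finish) a conversation.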
2. ... be about readiness, not impact.
The objective here would be to focus on investments in a few areas, helping shine a light on a few key elements of the enabling environment that likely need to be in place if edtech-related initiatives are to be feasible to roll out at scale in a country, especially where such initiatives are meant to ‘transform teaching and learning’ (as the rhetoric often has it). Of course, many things need to be in place for this potential to be realized, but a comprehensive attempt to measure and assess such things would be beyond the scope of such a ‘readiness’ index. It is worth noting that an investment touted to help realize the potential for education reforms to take place at scale may in fact end up further entrenching the status quo. The impact of the use of a tool depends on the intention of the person who wields it, and that person’s related expertise. (You can use a hammer to build, to destroy, or to help support what is already in place.) In the end, 'impact' is what is important. But how can a country get itself ready to realize the potential impact of the use of educational technologies at scale?
3. ... draw on data that are practical to collect today, while helping to point to what additional data might be useful to collect tomorrow.
Collecting internationally comparable data is hard. Many of even the most basic, up-to-date data related to 'edtech', especially in low- and middle-income countries, simply don't exist -- and the capacity and political will to collect them are often missing. This lack of data contributes to decision-making processes related to the use of educational technologies in many countries that are, at their core, essentially faith-based. It might turn out that the sorts of basic data initially collected and utilized to create an edtech readiness index don't, in the end, help policymakers answer some of their most pressing questions related to the potential uses of educational technologies in their countries. But we need to start somewhere. If we don't have systems in place to collect even the most basic data, what hope is there that we'll be able to collect the more complicated and sophisticated types of data that could help inform decision-making processes going forward?
4. ... build off and be informed by past work led by the UNESCO Institute for Statistics (UIS).
The objective would be *not* to reinvent the wheel. Many groups have done – and continue to do -- foundational work in this area. UIS has been the leading global institution in articulating sets of key national indicators related to the use of ICTs in education and in collecting related data. Published in 2009, the UIS Guide to Measuring Information and Communication Technologies (ICT) in Education remains the most complete and useful reference document on this topic. This guide, and the capacity to collect globally comparable data related to edtech topics that UIS efforts have enabled, could inspire the creation and implementation of such an index, highlighting what is possible, useful, practical, and desirable.
5. ... draw on other analytical work already completed – and ongoing.
While grounded in work originally done by UIS, such an index could benefit from existing work developed by other institutions, such as the SABER-ICT framework at the World Bank, the Global EdTech Ecosystems initiative from Navitas Ventures, and the work of the Omidyar Network on Scaling Access & Impact. The International Computer and Information Literacy Study (ICILS) offers another useful point of reference, as does the SELFIE digital assessment tool for schools in Europe.
6. ... complement other data collection efforts already underway in the international donor community.
A new global edtech readiness index could also learn from efforts underway to measure progress against the UN Sustainable Development Goals (SDGs), especially those related to SDG 4, 'Quality Education', and SDG 9, 'Build resilient infrastructure, promote sustainable industrialization and foster innovation'. More concretely, UIS is helping lead a group of partners working on measures related to:
- Indicator 4.a.1: Proportion of schools with access to: (b) the Internet for pedagogical purposes; (c) computers for pedagogical purposes;
- Indicator 4.4.1: Proportion of youth and adults with information and communications technology (ICT) skills, by type of skill
Work led by UIS exploring how to define and measure digital literacy skills, as well as the Education Policy Dashboard initiative at the World Bank, are also potentially relevant here.
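As a small illustration of what computing a measure like indicator 4.a.1 from the list above might involve in practice, here is a hedged sketch based on a simple school-census record set. The field names and records are invented for demonstration and are not drawn from any actual UIS or SDG data structure.

```python
# Illustrative sketch: computing an indicator like SDG 4.a.1 (proportion of
# schools with Internet / computers for pedagogical purposes) from a simple
# school-census record set. Field names and records are invented assumptions.

schools = [
    {"id": 1, "internet_pedagogical": True,  "computers_pedagogical": True},
    {"id": 2, "internet_pedagogical": False, "computers_pedagogical": True},
    {"id": 3, "internet_pedagogical": False, "computers_pedagogical": False},
]

def proportion(records, field):
    """Share of schools (0-1) reporting True for the given field."""
    return sum(1 for r in records if r[field]) / len(records)

print(f"Internet for pedagogy: {proportion(schools, 'internet_pedagogical'):.0%}")    # 33%
print(f"Computers for pedagogy: {proportion(schools, 'computers_pedagogical'):.0%}")  # 67%
```

The calculation itself is trivial; the hard part, as noted above, is having a census or survey instrument in place that collects these fields reliably and comparably across countries.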
7. ... help kickstart and provide further momentum for edtech-related data collection efforts around the world.
Many ongoing efforts exist to collect edtech-related data at a national level. Such a readiness index could be inspired by, support, and complement efforts by national organizations, such as CETIC and CIEB in Brazil. The development and implementation of such an index could help contribute to discussions around standard definitions of key edtech-related indicators, as well as related approaches to measurement and data collection.
8. ... be especially relevant to a few specific countries in the early and middle stages of large-scale, national initiatives seeking to utilize educational technologies.
What is important when it comes to edtech investments, especially in countries just beginning to make them at scale? Much of what is known and believed in this regard is based on theory, intuition and extrapolation from lessons from small pilot projects. Related, reliable data are very limited. Beginning to collect data in rigorous, globally comparable ways about even a few related topics could help inform such efforts. Let's posit that countries like Singapore, Sweden and Canada are already 'ready', and so a readiness index of this sort may not be all that useful in those places. This may or may not be the case across the board, of course. Conditions in affluent suburbs of Toronto and Vancouver are not the same as in remote communities in the country's far north, for example. And, when it comes to edtech, readiness may well be a moving target, as new technologies emerge to challenge conceptions of what it means to be 'ready'. That said, a readiness index could be designed to be relevant to decisions currently being considered in countries like Kenya, Paraguay and Bangladesh, as well as to benefit less developed regions of middle income countries like Brazil, Indonesia and Turkey.
9. ... be directly relevant to a few specific ongoing and proposed activities around the world related in some way to the use of educational technologies.
Activities under the African Moonshot initiative and the EdTech Hub effort are two of many notable international initiatives exploring, analyzing and documenting the use of educational technologies that could potentially benefit from the existence of such an index.
10. ___
All thoughts presented here are tentative, initial, and, as the blank space here is meant to demonstrate: incomplete. Principles and objectives discussed here may not in the end be the operative ones. Indeed, what is presented here could come crashing down once it is subjected to the gaze and consideration of people much more expert in such matters than I am. But hopefully a few such people may find this enough of a provocation to come up with something much richer, more robust, and ultimately more practical and useful.
Informed by these principles and objectives, what might such a global edtech readiness index look like? In order to promote related discussion, a follow-on blog post will present one potential, and rather crude, model for consideration.
Blog posts in this series:
- The case for a new global edtech readiness index
- National online testing as an indicator for edtech readiness?
You may also be interested in the following EduTech blog posts:
- How to measure technology use in education
- Comparing ICT use in education across countries
- Surveying ICT use in education in Africa
- Surveying ICT use in education in Asia
- Surveying ICT use in education in five Arab States
- Surveying ICT use in education in Latin America & the Caribbean
- Surveying ICT use in education in Europe
- Key themes in national educational technology policies
- Collecting data about educational technology use in *all* countries in the world
- How many schools are connected to the Internet?
- How many computers are in schools can depend on who's asking
Note: The image used at the top of this blog post of a street lamp ("if you're looking for something, the natural impulse is to look where the light is brightest; the most important stuff may not be there -- but it might be a practical place to start looking") comes via Wikimedia Commons from Haddon561 and is used according to the terms of its Creative Commons Attribution-Share Alike 4.0 International license.