I have recently been involved in discussions with three countries that are considering *huge* new investments to introduce lots of new technologies in their primary and secondary education systems. Such discussions typically focus quite a bit on what technologies will be purchased; what additional products, services and support will need to be provided if the technology is to be used effectively; and how to pay for everything. Increasingly (and encouragingly), there is also talk of how to measure the impact of these sorts of investments. To measure 'impact' (however you choose to define it), you of course need to know what has actually happened (or not happened). When you are putting computers in all schools, or rolling out lots of new digital learning content, or training lots of teachers, how do you know that these sorts of things are actually taking place?
In the pre-digital age, the most common way to answer questions like, for example, Are any teachers actually using the computers and new multimedia materials that we purchased in their classes?, was often to simply ask teachers a related question as part of a survey. In some cases, in-person observations were then used to help validate the claims expressed in such surveys. Activities of these sorts are certainly useful, but interpreting the results can be problematic (respondents may tell you what they think you want to hear, your sampling of schools may leave a little to be desired, etc.). As schools become more 'connected', and as management information systems become more robust, there are increasingly technical means to help monitor automatically what is going on within parts of the system. Firms that have developed tools to remotely manage technical support across tens or hundreds or thousands of schools can also use those tools to help report to others on what is actually happening in the system. Educational publishers are increasingly baking into their product offerings the ability to know how many times a particular online resource was accessed, for how long, and by whom. Some vendors tout the ability to see at a very granular level how individual students are performing, tracking their clicks as they move through various online learning activities and interpreting and reporting on the results. Such things offer the potential to provide a much more comprehensive picture of the extent to which various technologies are being used. (They also potentially make issues related to data privacy even more acute, a topic we increasingly find ourselves challenging education ministries to think about proactively ... and perhaps a topic for a future blog post.)
Given the increasing prevalence of online 'dashboards' to report on system performance, ministries of education are (connectivity and infrastructure permitting) able to access related information in real time (or close to it).
This is exciting stuff, to be sure. But how do you know that the information system, or the vendor providing it, is telling you the truth? Ministries of education can quickly become overwhelmed by all of the data reported by such mechanisms, and may not have the expertise to judge whether the technical reporting mechanisms themselves are accurate. Independent third-party auditors can (and do) help with this sort of thing, but many ministries often revert to old, tried-and-true methods (e.g. teacher surveys, selected in-person observations of a few representative classrooms or schools, etc.) -- in other words, exactly the type of practices they found inadequate in the first place.
There is another group that can be very well placed to help in this regard: 'ordinary' citizens. Why not make available to the general public some (or all) of the monitoring information reported to the ministry of education about vendor performance? If a vendor creates an online 'dashboard' that allows a group in the ministry of education to better monitor the roll-out of a particular component of a large scale investment in the use of ICTs in education, why not make this available (in whole or in part) to the general public as well? Citizens are paying for this stuff with their tax dollars, after all, and many of them may be closer to the level of activity, and possibly more personally invested in the outcomes of specific activities, than are bureaucrats sitting in a faraway central government office.
When assigned a seemingly overwhelming task all by himself, Mark Twain's 19th-century literary figure Tom Sawyer got others to help him paint his fence. Why can't governments do the same? Many around the world are doing just this sort of thing, harnessing the power of various new technologies and the collective interests and energies of civil society groups and individual citizens to help monitor and report on the delivery of public services of various sorts. What might such a thing look like in practice in the education system in a developing country? A recent case study from the World Bank Institute provides a nuts-and-bolts description of how this occurred in a pilot 'crowdsourcing' initiative in Southeast Asia that has captured the attention of educational policymakers on other continents, from Latin America to Africa to Europe.
A Case Study on Citizens’ Monitoring of the Education Sector in the Philippines [pdf], by my colleague Jennifer Shkabatur (with assistance and guidance from Luiza Nora), documents what exactly has happened with the innovative CheckMySchool (CMS) project (which has been mentioned twice before on the EduTech blog), a "community monitoring project that aims to promote transparency and social accountability in the Philippine education sector by tracking the provision of services in public schools" -- things like the existence of sufficient numbers of textbooks, working toilets, teacher attendance, and use of school funds. This information is then made available on public web sites in easily accessible formats, so that citizens can comment on the accuracy of the data collected and voice related concerns and issues. This particular case study focuses not so much on what has been accomplished to date, but rather on *how* CMS was created and implemented, in order to help share some basic practical information with countries seeking to do something similar, but which are not sure where to start:
"CMS is often cited as a “good practice” in the field, and the governments of several countries, including Indonesia, Kenya, and Moldova, are interested in adapting the CMS model to their country contexts. This case study sheds light on the design and implementation features of the first pilot cycle of CMS in public schools across the Philippines. The case study discusses the general political background and operating environment of the CMS project, its concept and operating principles, the roles and incentives of the major stakeholders involved in its design and implementation, and the ways in which CMS aims to use ICTs. In addition, the case study provides a step-by-step analysis of the first CMS project cycle in 2011, examines its accomplishments and challenges, and provides lessons from the first pilot year of the project’s operation. The case study concludes with recommendations for projects that aim to follow the footsteps of CMS."
There is much here that should be of interest to education ministries grappling with challenging issues related to monitoring the extent to which various activities planned at some central level are actually being implemented 'on the ground' at the local level. One of the major lessons from the case study is that "innovative ICT-enabled projects are an investment in the future of community monitoring". At a conceptual level, most people will probably find this to be a rather obvious finding. But just what to do with such an 'obvious' lesson may not be so obvious at all. The CMS case study highlights the importance of non-ICT issues, like constructive, cooperative relations between civil society groups and government, noting that "complementarity with ongoing government projects creates an environment conducive to initiatives". It especially highlights the fact that, even (or perhaps especially) in ICT-related initiatives -- which are attractive because they often allow physical distances to be bridged and let many things that used to happen in person be done virtually -- an "organized presence on the ground is critical." In the case of CMS, the use of so-called 'infomediaries' (local community leaders and socially active individuals recruited to assist with data collection at the local level and share results online) was seen to be a key ingredient in its success, even where most of the actual exchange of project information was happening via the project web site and tools like Facebook.
A number of 'enabling conditions' in the Philippines were also very important. The fact that the Philippines has a very active and capable civil society, that the government was willing to make available basic data about public schools, that there were believers within government in the need for increased transparency and social accountability in the education system, and that the CMS project team tailored its activities and approaches to specific local contexts and cultural sensitivities -- all of these things were seen to be critical to the project, and the case study notes that this combination of factors may not be present in other countries. Countries looking to immediately replicate the Philippines' experience using ICTs under CMS to help promote greater transparency and accountability in the education sector would do well to note that this was not the first large-scale government accountability initiative in the education sector there. CMS was able to build on some of the partnerships and networks formed as a result of earlier projects (like Textbook Count [pdf], which began a decade ago), and lessons learned from those projects greatly influenced what the CMS team hoped to enable through the innovative use of ICTs to complement earlier and ongoing related activities.
The specific circumstances in the Philippines -- the local environment and cultural contexts, the alignment of incentives of various stakeholder groups -- may not apply in other places. Identifying where and how a country's circumstances differ in key regards from those of the Philippines (or other countries with interesting lessons from public accountability and transparency initiatives in the education sector, like India) will be important for places seeking to adopt some version of the CMS model for their own purposes. However this analysis plays out, I expect that the very practical, twelve-step implementation process that the case study documents will be of most immediate utility to policymakers in other countries planning similar transparency and accountability initiatives of their own. Many World Bank studies in both the education and ICT sectors attempt to identify and analyze key issues as an input into the policy formulation process. This quite useful case study functions more as a sort of how-to guide (or, given that it documents previous experience in the Philippines, perhaps calling it a 'how-did' guide might be more accurate) than as a means to advocate for a specific policy approach, or to evaluate the impact of a given policy approach.
Reading through the case study, it was clear to me that the potential of a mechanism like CheckMySchool is far from fully realized. That said, there can be little doubt that this sort of thing is quite promising, and one suspects that this is just the tip of the iceberg, providing some tangible examples of what the increased diffusion and use of a variety of ICT tools across societies makes possible -- and what might be coming.
Note: The image used at the top of this blog post of a lot of people jumping aboard the iconic method of conveyance in the Philippines, the jeepney ("let's all jump on and have a look inside and see what we might see") is adapted from an image that comes via Wikimedia Commons from Flickr user dboy and is used according to the terms of its Creative Commons Attribution 2.0 Generic license, as confirmed by the FlickreviewR bot.