“Do you decide on what types of clothes to wear based on your own preferences?” That’s a question on a survey instrument designed to assess whether Tamil Nadu’s Empowerment and Poverty Reduction Project (part of the Pudhu Vaazhvu Project or PVP) is actually having an impact on women’s empowerment. The question resonated strongly with the project beneficiaries I met. For them, it was a touchstone indicator of empowerment. That may be because it was crafted by a group of the women for whom the project is designed.
The project attracted me to Tamil Nadu because it sits at the intersection of my two current concerns: promoting better monitoring and evaluation, and assessing the potential of information and communication technologies (ICT) to improve development practice. The backdrop is an often-heard set of complaints about project monitoring: it is an ‘extractive industry’ that burdens beneficiaries and staff with demands to produce information they see as irrelevant, and the information it does manage to produce is too slow and thin to allow course correction and improvement. But in theory, and with the right incentives, ICT should be able to remedy this. It should make it cheaper and easier to gather better information, do so more inclusively, and get it to decision makers at all levels in a timely fashion.
Enter the Social Observatory, a DEC initiative. It seeks to build diagnostics and feedback loops into the implementation of PVP and similar livelihood projects in India. These are complex projects, operating in hundreds or thousands of locations. They have multiple objectives and lots of moving parts: job training, loans for livelihoods ranging from dairy production to sanitary napkin manufacture, even self-defense classes. So it’s hard to get a fix on what’s going on without a lot of information. To provide useful feedback, the Social Observatory deploys a variety of tools, including (arguably extractive, but definitive) randomized controlled trials, internet-connected management information systems, and “P-tracking”, the participatory approach to monitoring.
P-tracking builds on a few innovations. First, as the name suggests, the survey instrument was designed in a participatory fashion with a group of beneficiaries, to ensure that the information is seen as relevant. Indeed, the women I met said that the very acts of administering and responding to the survey had a consciousness-raising impact. (The survey is not village-specific; after testing, it was standardized for statewide use.) Second, it is designed to be administered by local staff, not by an external contractor. Third, it is designed not only to inform higher-level project management, but also to provide actionable information at the village level.
ICT plays an important role. The surveys are administered using tablet-based software, and the data is pooled via village-based internet connections. (These, by the way, also allow villagers to buy train tickets, pay electric bills, and check exam scores without hours-long travel and queueing.) Automation of the survey makes it easier to train the local interviewers. And it is difficult to imagine how the Social Observatory could realize its goal of scaling up to a million respondents, and providing timely annual information, using traditional paper-based surveys.
Most interestingly, the Social Observatory has tackled the problem of how to make this massive information set digestible. This is a problem facing all producers of data, big or otherwise, and it is particularly challenging when many of the users are illiterate. The Observatory commissioned an interactive graphical display that allows a village to compare its performance with neighbors’, catalyzing healthy competition. I watched a test of this interface on a group of villagers, who immediately grasped a beautifully designed graphic representing four aspects of women’s control over marriage.
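For readers curious about the mechanics, here is a minimal sketch, in Python, of the general idea behind pooling tablet-collected responses and letting a village see how it compares with its neighbors on a single indicator. It is purely illustrative: the record layout, the indicator name, and the village names are hypothetical, not the Social Observatory’s actual schema or software.

```python
# Illustrative sketch only: pool tablet-survey responses and let one village
# compare itself with neighbors on a single empowerment indicator.
# The record layout, indicator name, and villages are hypothetical; this is
# not the Social Observatory's actual schema or software.
from collections import defaultdict
from statistics import mean

# Each record is one response uploaded from a tablet via the village
# internet connection: (village, indicator, answer coded 1 = yes, 0 = no).
responses = [
    ("Village A", "decides_own_clothing", 1),
    ("Village A", "decides_own_clothing", 0),
    ("Village B", "decides_own_clothing", 1),
    ("Village B", "decides_own_clothing", 1),
    ("Village C", "decides_own_clothing", 0),
]

def indicator_shares(records, indicator):
    """Share of 'yes' answers to one indicator, per village."""
    by_village = defaultdict(list)
    for village, name, answer in records:
        if name == indicator:
            by_village[village].append(answer)
    return {village: mean(answers) for village, answers in by_village.items()}

def rank_among_neighbors(village, shares):
    """Rank one village against all others on the pooled indicator."""
    ranked = sorted(shares, key=shares.get, reverse=True)
    return ranked.index(village) + 1, len(ranked)

shares = indicator_shares(responses, "decides_own_clothing")
rank, total = rank_among_neighbors("Village B", shares)
print(f"Village B: {shares['Village B']:.0%} yes, ranked {rank} of {total}")
```

The dashboard I saw presents this kind of comparison graphically rather than as text, which is what makes it legible to users who cannot read, but the underlying calculation is no more complicated than this.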
Like so many promising ideas that we are looking at for the WDR 2016, P-tracking is still in a demonstration phase. The team has completed 15,000 of 32,000 interviews in a pilot district, and it will be a big leap to cover the projected million. Another challenge will be to mainstream the use of annual P-tracking data into project implementation at all levels. Information supply is just half the battle. So I look forward to feedback on the feedback system.