Assessments make a lot of people nervous, and I’m not just talking about the students who have to take them. As a psychometrician (assessment expert) and World Bank staffer, I’ve worked on assessment projects in more than 30 countries around the world over the past 10 years. Time and again, I’ve found great interest in student assessment as a tool for monitoring and supporting student learning coupled with great unease over how exactly to go about ‘doing’ an assessment.
Recently, the OECD released the results for PISA 2015, an international assessment that measures the skills of 15-year-old students in applying their knowledge of science, reading, and mathematics to real-life problems. There is a sense of urgency to ensure that students have solid skills amidst modest economic growth and long-term demographic decline in Europe and Central Asia (ECA).
Ed: This guest post is by Alan Ruby, senior scholar at the University of Pennsylvania’s Alliance for Higher Education and Democracy, who also serves as a consultant to the World Bank and an adviser to Nazarbayev University in Kazakhstan, the Head Foundation in Singapore, and the American Institutes for Research.
Nearly 50 years ago, 40 classmates and I spent the last two weeks of November taking our higher school certificate examinations. In a cavernous, hot, and poorly ventilated hall, we sat in widely spaced rows, writing essays, solving mathematics and science problems, and answering multiple-choice questions.
By Emily Gardner, READ Trust Fund
It's been a busy year and a half for the Russia Education Aid for Development (READ) trust fund since its launch in 2009 to further critical work on quality learning assessments. The program is gearing up for another productive year, working to advance the global imperative to measure progress in learning. Evidence on learning matters, and assessment is central to improving education effectiveness.