Around the world, there is no shortage of rhetoric related to the potential for the use of new information and communication technologies (ICTs) to 'transform teaching and learning'. Indeed, related pronouncements often serve as the rallying cry around, and justification for, the purchase of lots of educational technology hardware, software, and related goods and services. Where 'business as usual' is not thought to be working, some governments are increasingly open to considering 'business unusual' -- something that often involves the use of new technologies in some significant manner.
One challenge that many countries face along the way is that their procurement procedures are misaligned with what industry is able to provide, and with how industry is able to provide it. Technology changes quickly, and procurement guidelines originally designed to meet the needs of 20th century schooling (with a focus on school construction, for example, and the procurement of textbooks) may be inadequate when trying to operate in today's fast-changing technology environments. Indeed, in education as in other sectors, technological innovations typically far outpace the ability of policymakers to keep up.
Faced with considering the use of new, 'innovative' tools and approaches that haven't been tried before at any large scale within their country's schools, education policymakers may reflexively turn to precedent and 'old' practices to guide their decisions, especially when it comes to procurement. This is usually seen within government ministries as a prudent course of action, given that such an approach is consistent with the status quo, and that related safeguards are (hopefully) in place. As a result, however, they may end up driving forward into the future primarily by looking in the rear view mirror.
When considering the scope for introducing various types of technology-enabled 'innovations' (however one might like to define that term) into their education systems, many governments face some fundamental challenges:
They don't know exactly what they want.
And even where they do:
They don't have the in-house experience or expertise to determine if what they want is practical, or even feasible, nor do they know what everything should cost.
One common mechanism utilized in many countries is the establishment of a special 'innovation fund', designed to support the exploration of lots of 'new stuff' in the education sector. Such efforts can be quite valuable, and they often end up supporting lots of worthwhile, innovative small scale projects. (The World Bank supports many 'innovation funds' related to the education sector around the world, for what that might be worth, and the EduTech blog exists in part to help document and explore some of what is learned along the way.) There is nothing wrong with small scale, innovative pilot projects, of course. In fact, one can argue that we need many more of them -- or at least more of them with certain characteristics. That said, introducing and making something work at a very small scale is a much different task than exploring how innovations can be implemented at scale across an entire education system.
In such circumstances:
What is a ministry of education to do?
How can it explore innovative approaches to the procurement of 'innovative' large scale educational technology programs in ways that are practical, appropriate, cost-effective, likely to yield good results, informed by research and international 'good practice', and transparent?
Education is a ‘black box’ -- or so a prevailing view among many education policymakers and researchers goes.
For all of the recent explosion in data related to learning -- as a result of standardized tests, etc. -- remarkably little is known at scale about what exactly happens in classrooms around the world, and outside of them, when it comes to learning, and what impact this has.
This isn't to say that we know nothing, of course:
The World Bank (to cite an example from within my own institution) has been using standardized classroom observation techniques to help document what is happening in many classrooms around the world (see, for example, reports based on modified Stallings Method classroom observations across Latin America which seek to identify how much time is actually spent on instruction during school hours; in many cases, the resulting data generated are rather appalling).
Common sense holds various tenets dear when it comes to education, and to learning; many educators profess to know intuitively what works, based on their individual (and hard won) experience, even in the absence of rigorously gathered, statistically significant 'hard' data; the impact of various socioeconomic factors is increasingly acknowledged (even if many policymakers remain impervious to them); and cognitive neuroscience is providing many interesting insights.
But in many important ways, education policymaking and processes of teaching and learning are constrained by the fact that we don't have sufficient, useful, actionable data about what is actually happening with learners at a large scale across an education system -- and what impact this might have. Without data, as Andreas Schleicher likes to say, you are just another person with an opinion. (Of course, with data you might be a person with an ill-considered or poorly argued opinion, but that’s another issue.)
side observation: Echoing many teachers (but, in contrast to teaching professionals, usually with little or no formal teaching experience themselves), I find that many parents and politicians also profess to know intuitively ‘what works’ when it comes to teaching. When it comes to education, most everyone is an ‘expert’, because, well, after all, everyone was at one time a student. While not seeking to denigrate the ‘wisdom of the crowd’, or downplay the value of common sense, I do find it interesting that many leaders profess to have ready prescriptions at hand for what ‘ails education’ in ways that differ markedly from the ways in which they approach making decisions when it comes to healthcare policy, for example, or finance – even though they themselves have also been patients and make spending decisions in their daily lives.
One of the great attractions of educational technologies for many people is their potential to help open up and peer inside this so-called black box. For example:
When teachers talk in front of a class, there are only imperfect records of what transpired (teacher and student notes, memories of participants, what's left on the blackboard -- until that's erased). When lectures are recorded, on the other hand, there is a data trail that can be examined and potentially mined for related insights.
When students are asked to read in their paper textbook, there is no record of whether the book was actually opened, let alone whether it was opened to the correct page, how long a page was viewed, etc. Not so when using e-readers or reading on the web.
Facts, figures and questions scribbled on the blackboard disappear once the class bell rings; when this information is entered into, say, Blackboard™ (or any other digital learning management system, for that matter), it can potentially live on forever.
And because these data are, at their essence, just a collection of ones and zeroes, it is easy to share them quickly and widely using the various connected technology devices we increasingly have at our disposal.
A few years ago I worked on a large project where a government was planning to introduce lots of new technologies into classrooms across its education system. Policymakers were not primarily seeking to do this in order to ‘transform teaching and learning’ (although of course the project was marketed this way), but rather so that they could better understand what was actually happening in classrooms. If students were scoring poorly on their national end-of-year assessments, policymakers were wondering: Is this because the quality of instruction was insufficient? Because the learning materials used were inadequate? Or might it be because the teachers never got to that part of the syllabus, and so students were being assessed on things they hadn’t been taught? If technology use was mandated, at least they might get some sense about what material was being covered in schools – and what wasn’t. Or so the thinking went ....
Yes, such digital trails are admittedly incomplete, and can obscure as much as they illuminate, especially if the limitations of such data are poorly understood and data are investigated and analyzed incompletely, poorly, or with bias (or malicious intent). They also carry with them all sorts of very important and thorny considerations related to privacy, security, intellectual property and many other issues.
That said, used well, these new data points hold out the tantalizing promise of potentially new and/or deeper insights than have been possible within 'analogue' classrooms.
But there is another 'black box of education' worth considering.
In many countries, there have been serious and expansive efforts underway to compel governments to make available more ‘open data’ about what is happening in their societies, and to utilize more ‘open educational resources’ for learning – including in schools. Many international donor and aid agencies support related efforts in key ways. The World Bank is a big promoter of many of these so-called ‘open data’ initiatives, for example. UNESCO has long been a big proponent of ‘open educational resources’ (OERs). To some degree, pretty much all international donor agencies are involved in such activities in some way.
There is no doubt that increased ‘openness’ of various sorts can help make many processes and decisions in the education sector more transparent, as well as have other benefits (by allowing the re-use and ‘re-mixing’ of OERs, teachers and students can themselves help create new teaching and learning materials; civil society groups and private firms can utilize open data to help build new products and services; etc.).
What happens when governments promote the use of open education data and open education resources but, at the same time, refuse to make openly available the algorithms (formulas) that are utilized to draw insights from, and make key decisions based on, these open data and resources?
Are we in danger of opening up one black box, only to place another, more inscrutable black box inside of it?
One of the early, decidedly modest goals for this event was simply to bring together key decisionmakers from across Asia (and a few other parts of the world -- it would become more global with each passing year) in an attempt to help figure out what was actually going on with technology use in education in a cross-section of middle and low income countries, and to help policymakers make personal, working level connections with leading practitioners -- and with each other. Many countries were announcing ambitious new technology-related education initiatives, but it was often difficult to separate hope from hype, as well as to figure out how lofty policy pronouncements might actually translate to things happening at the level of teachers and learners 'on-the-ground'.
As the first country to move from being a recipient of World Bank donor assistance to becoming a full-fledged donor itself, Korea presented in many ways an ideal host for the event. (It still is!) The Korean story of economic development over the past half century has been the envy of policymakers in many other places, who see in that country's recent past many similarities to their own current situations. Known for its technological prowess (home to Samsung and many other high tech companies) and famous in education circles for the performance of its students on international assessments like PISA, Korea places educational technology at the intersection of two important components in a Venn diagram of 'Brand Korea'.
Since that first global symposium, over 1400 policymakers from (at least by my quick count) 65 countries have visited Korea annually as part of the global symposium to see and learn first hand from Korean experiences with the use of information and communication technologies (ICTs) in education, to be exposed to some of the latest related research around the world, to share information with each other about what was working -- and what wasn't -- and what might be worth trying in the future (and what to avoid). Along the way, Korea has come to be seen as a global hub for related information and knowledge, and KERIS itself is increasingly regarded by many countries as a useful organizational model to help guide their own efforts to implement large scale educational technology initiatives.
While international events bringing together policymakers to discuss policy issues related to the use of new technologies in education are increasingly common these days, across Asia and around the world, back in 2007 the Global Symposium on ICT Use in Education represented the first regularly scheduled annual event of its type (at least to my knowledge; there were many one-off regional events, of course, many of the good ones organized by UNESCO) bringing together policymakers from highly developed, middle and low income countries.
Participating in the event for each of the past ten years has offered me a front row seat to observe how comparative policy discussions have evolved over the past decade in a way that is, I think, somewhat unique. What follows is a quick attempt to describe some of what has changed over the years. (The indefatigable Jongwon Seo at KERIS is, I think, the only other person to have participated in all ten global symposia. As such, he is a sort of spiritual co-author of these reflections -- or at least the ones which may offer any useful insights. I'm solely responsible for any of the banal, boring or inaccurate comments that follow.)
It is conventional wisdom in many quarters -- indeed, for some people it approaches the level of 'incontrovertible fact' -- that young people are 'digital natives', possessed of some sort of innate ability to understand and utilize digital devices and applications merely because of their youth, because they have 'grown up surrounded by technology', in ways that older folks can't -- and perhaps never will. Anecdotes from amazed and proud parents and grandparents detailing how adept little Johnny (or Gianni, or Krishna, or Yidan, or Fatima, or Omar, or Maria) is at manipulating his (or her) parents' mobile phone or tablet "even though s/he doesn't even know how to read yet!" are commonly heard in conversations around the world.
In a very influential essay that appeared about 15 years ago ("Digital Natives, Digital Immigrants" [pdf]), Mark Prensky coined the term 'digital natives', asserting that "students today are all “native speakers” of the digital language of computers, video games and the Internet" and that, as a result, "today's students think and process information fundamentally differently from their predecessors". In contrast, "[t]hose of us who were not born into the digital world but have, at some later point in our lives, become fascinated by and adopted many or most aspects of the new technology are, and always will be compared to them, Digital Immigrants." While Prensky's views on this topic have evolved over the years and become more nuanced (those interested in his particular views may wish to visit his web site), this original definition and delineation of what it means to be a digital native and a digital immigrant remains quite potent for many people.
At the same time, and for over a decade, this assertion has come under consistent challenge and criticism from many academics, who contest various aspects of the 'digital natives myth', as well as the policy and design implications that often flow from them. The observable differences at the heart of the digital native narrative relate more to culture, or to geography, to socio-economic status or even just to personal preferences than they do to age, critics argue. No doubt some of these folks may glance at this post and ask: 'Digital natives', haven't we moved on from that stuff? When it comes to related academic discourse, the answer to this question is probably a qualified 'yes, at least in some circles'.
That said, in my experience, the digital natives hypothesis remains alive and well in many educational policymaking circles (as it does with many parents -- and grandparents, and marketers, and with many kids themselves), especially in places around the world that are just now beginning to roll out or consider the use of educational technologies at a wide scale. Indeed, while meeting with education ministries on three different continents over the course of the last month, I've had very senior education officials in three different governments explain to me how the concept of 'digital natives' was central to their vision for education going forward. These recent conversations -- and many others -- prompted me to write this quick blog post (as well as one that will follow).
"Digitale Teilhabe für alle" (digital participation for everyone) was the theme of last week's Volkshochschultag 2016, an international conference convened in Berlin by the German Adult Education Association (DVV) to explore the impact and consequences of the increasing use of digital technologies in education around the world, especially as they relate to equity and inclusion. "Does digitisation provide an opportunity for educational justice or does it strengthen the unequal access to education even more?" This question (which admittedly flows off the tongue a little better in German than it does in English) animated a related debate (in which I participated) on the last day of the conference.
In support of my pithy, one word response to this question (an enthusiastic and deliberately argumentative ja!), I drew heavily on the 2016 World Development Report, which the World Bank released earlier this year. This widely read, 'flagship' annual World Bank publication explores a topic of broad relevance in the fields of international development and development economics. The 2016 report, Digital Dividends [pdf, 10.8mb], examines the impact that the Internet and mobile networks are having (and not having) around the world.
As a primer on the uses of ‘information and communication technologies for development’ (what’s known as ‘ICT4D’ by those in related fields who like acronyms), the 2016 World Development Report is quite comprehensive. Surveying and exploring how ICTs are impacting fields such as agriculture, finance, government services, education, energy, the environment and healthcare (and many others), ‘Digital Dividends’ is a World Bank report written for people who don’t normally read (or perhaps even care about) World Bank reports.
It is relatively catholic in its worldview, although not surprisingly there is a decided focus on things the Bank cares about (e.g. economic growth, jobs), but thankfully in language a bit more accessible than what one often finds in publications put out by an institution which employs over 1,000 PhD economists. Happily, there’s not a single mention of a ‘production function’, for example; and I really like the cover!
But I don’t mean to ‘bury the lede’, as journalists say. Here, quickly, are the main messages from the 2016 World Development Report:
Recognizing its relevance in the global marketplace, the small South American country of Uruguay has placed increasing emphasis on improving the abilities of its schoolchildren to speak English. In trying to achieve this objective, however, it has faced a very acute resource constraint: There just aren't enough qualified teachers of English working in Uruguayan schools.
What to do?
How do you strike a balance between the immediate needs of students *right now* and an education system's requirements to train teachers to help meet such needs over the long term?
The traditional approach to dealing with such a challenge in many places has been to focus primarily on pre-service training -- gradually introducing into classrooms, over many years, new teachers who have prepared to teach the subject through dedicated courses of study at teacher training colleges -- together with occasional in-service professional development activities for existing teachers (normally during holiday breaks). In Uruguay, it was recognized that the gap between the abilities and capacities of many teachers to teach English and the needs of students to learn English (which became compulsory in 2008) was so huge in many parts of the country that they needed to do things differently than they had done in the past. Instead of having teachers learn English separately, might it be possible to have them learn alongside their students, in their own classrooms?
As it happens, almost a decade ago Uruguay began its ambitious and innovative Plan Ceibal, which (among other things, and as profiled in a number of previous posts on the EduTech blog) made this small South American nation the first country to connect all of its schools to the Internet and provide all primary school children with a free laptop.
Given the technical infrastructure and know-how that was developed under Plan Ceibal, Uruguayan policymakers asked themselves:
Now that all of the schools are connected to the Internet, and all students have their own laptops, might it be possible to offer high quality English language instruction live over the Internet, connecting to teachers many miles away from the schools?
The answer to this question, it would appear, is 'yes'. Working out of its remote teaching center in Buenos Aires, its global digital learning hub in the neighboring country of Argentina, the British Council is beaming out English lessons to children in hundreds of individual classrooms across Uruguay, complementing and supporting the work of local teachers in these same classrooms. This is not a 1-to-many broadcast of the sort commonly done in many countries through the use of broadcast television, but rather connects individual classrooms in Uruguay with individual teachers sitting in other places. Some of these English teachers are based in the Uruguayan capital of Montevideo, many others next door in Argentina, and still others much further afield -- including halfway around the world in the Philippines and the UK! Along the way, the capacity of local teachers, who continue to lead English classes on their own other days of the week, is developed, through their interactions with and observations of the remote teachers.
Over the past two decades, I've had the good fortune to visit hundreds and hundreds of schools across all six continents to learn about how they are using new technologies -- and hope to use them in the future. (Maybe some day I'll visit the Antarctic school that was connected to the Internet by Chile's pioneering Enlaces program and I'll be able to claim I've done this on *all* continents!)
From Korea to Costa Rica, Sri Lanka to Syria, Lesotho to Laos, Papua New Guinea to Puerto Rico: School visits in over 50 countries have run the gamut, from observing the shared use of quite old graphing calculators and lectures at the blackboard describing how to navigate Microsoft Windows (even though there was nary a PC to be found in the building) to marvelling at technology-rich classrooms filled with students and teachers doing things with hardware and software that I couldn't have dreamed of doing when I was a student myself, many years ago.
I have visited schools in prosperous countries in peacetime and in very poor countries emerging from conflict (and in some cases, still technically at war). I learned firsthand about technology use in schools in Iceland when that country was labelled the world's 'most developed' and in schools in Haiti, the poorest country in the Western Hemisphere, after that country suffered its devastating earthquake.
In pretty much all cases and contexts, investments in 'technology' were meant to be deliberately forward-looking (if not always necessarily that 'strategic' or well-planned), to some extent symbols (often explicit ones) of progress and optimism about the future, no matter the education system, from the most 'high performing' to the most dysfunctional.
Because I've had lots of comparative experiences visiting schools in 'other places' around the world, I am sometimes asked to provide an 'international perspective' on what is happening within a set of schools in a given country, part of a larger effort to benchmark what is being done and planned against norms in other countries. It can be a pretty cool gig at times (although the travel can be rather punishing). I am always learning, and the dynamism and determination of students, teachers, principals and education officials whom I observe and chat with quite often leaves me inspired and (re-)energized.
Since I have been doing this for so long, I sometimes help 'train' people (at ministries of education, at NGOs) who are assuming leadership positions in educational technology initiatives on how to develop their own "carpenter's eye" -- the ability to make quick assessments and judgments about what they are seeing in ways those less experienced in the field may struggle to do.
What's a 'carpenter's eye'?
A carpenter can often quickly judge whether an angle is truly 90 degrees, or that a wrong tool was used for a particular job, or make educated guesses about why one material was employed instead of another, or that something is destined to break. Such judgments may not always be accurate, and may be informed by various biases, but they are often qualitatively different than those of people less skilled and experienced with woodworking, who may not notice such things -- and who in fact may not care about them, nor understand why they might be important.
In my personal experience working with new technologies in the education sector, many of these folks have come from 'technical backgrounds' and typically direct their gaze toward, and ask the majority of their questions about, the technology itself. Oftentimes the end goal of such investigations is to build an accurate inventory of the equipment that is available in a school, rather than to learn about how the equipment itself is being used (and not used), why this might be the case, and how people feel about this. Fair enough: We all have different bosses, different ideas about what is important, and different incentives for doing whatever it is we may do. I don't mean to deny the importance of surveying what technologies are currently available in schools. But in my experience, visiting a school to learn about the technology it has and only focusing on that technology (what processor a device has, which operating system it runs, how much memory is available) represents a real lost opportunity to learn about and gain insight into many more things at the same time.
In case it might be of use to anyone else, I have assembled a quick list of some of the things that I often ask about and consider, usually automatically and unconsciously, when I visit a school to learn about how information and communication technologies (ICTs) are being used (and not used) for a variety of purposes. It's by no means comprehensive, and of course every context is different, but I find that these are often the types of things that I ask about and look for (in addition, of course, to the more general educational and demographic stuff that would be common areas of inquiry related to most school visits, and the hyper-specific stuff that might be the reason I am visiting one school in particular).
I have cobbled this list together from a much larger, slightly unruly 'master' list of questions that I maintain, which draws on notes and emails I have shared with people over almost two decades of school visits, working with hundreds of people, many of whom had little prior experience in visiting schools to assess what was happening with technology. Sometimes -- if not often -- sharing these sorts of questions is meant as much to spark discussion and debate within a team about what might be important (and what isn't so important), and how to go about finding this out, as it is to suggest actual questions that should be posed. Every context is different.
Over the past 15 years, tremendous strides have been made in providing computing equipment and Internet access to schools around the world. Despite this, however, many teachers and students – especially those in rural communities in middle and low-income countries (and occasionally in OECD countries as well) – remain largely unconnected.
In response, and as a (presumably, or at least hopefully) temporary stop-gap measure, scores of countries have piloted and championed the use of ‘mobile internet computing facilities’ of various sorts as a way to provide access for learners in remote communities to digital teaching and learning resources through the use of things like ‘internet buses’. For some students, ‘mobile learning’ takes place not with the aid of a smart phone, but rather through monthly visits of Internet-connected buses filled with computers. From Big Blue in Zimbabwe to the Google Internet Bus in India to similar sorts of efforts in countries as diverse as Tunisia, Pakistan, Rwanda, Mauritius, the Philippines, Malaysia, the United States, Canada, Mexico and China, technology-rich portable classrooms on wheels of various sorts are in use – and many more are being considered and planned.
Most efforts of these sorts seem to have been conceptualized and implemented in a vacuum, not informed by related experiences in other places. Even where such efforts help meet objectives that are (if we are honest) more related to politics and public relations than they are to learning, what guidance should the people in charge of such efforts consider in order to get the most out of related investments?
Might there be some related lessons and insights drawn from experience in operating mobile computing learning classrooms that can inform ongoing investments in other areas (school transportation, distance learning, school computer labs, rural Internet access)?
Over the past dozen years or so, I have seen and/or heard dozens (probably hundreds) of education project proposals that have sought in some way to include the use of text messages. Whether to send reminders to teachers about what they are meant to teach on a given topic, provide students with a 'learning fact of the day', disseminate exam results, inform parents of student absences, or make available simple SMS quizzes for language learners, many of these proposals have shared a common approach to financing one type of related expense.
"We'll ask the mobile phone company to give us lots of text messages for free. Since we are an education project, we are sure that they will do this." ("By the way," some of these project proponents subsequently asked me, "do you know anyone at the mobile provider we can talk to make this happen?")
Only in very rare cases does this approach to funding seem to work, however. When I explain this to people, noting that phone companies typically don't give away airtime for free and then ask, 'what makes you think they will do so for text messages?', most folks tend to explore a wider variety of potential financing options. (A few clever people will note that text messages don't really cost mobile providers anything to send; this may be true, but it doesn't change the fact that just because something costs very little, or even nothing at all, doesn't mean that someone is willing to give it away for free.) Most providers (and many third-parties) offer bulk ('high volume') SMS rates that can dramatically lessen the costs incurred when sending out thousands of text messages, but in my experience those costs are very rarely waived entirely by mobile providers as part of their corporate social responsibility efforts. (You can always try, though!)
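To make the budgeting point concrete, here is a minimal back-of-the-envelope sketch of what bulk rates can mean for a messaging-based project. All of the numbers are invented for illustration (real retail and high-volume SMS rates vary enormously by country and provider), but the arithmetic is the part that project planners too often skip:

```python
# Hypothetical illustration only: the rates below are made up, not quotes
# from any real mobile provider.
STANDARD_RATE = 0.05   # USD per SMS at an assumed retail price
BULK_RATE = 0.01       # USD per SMS at an assumed negotiated high-volume rate

def campaign_cost(messages_per_week, weeks, rate_per_sms):
    """Total cost of sending a fixed number of messages per week for a term."""
    return messages_per_week * weeks * rate_per_sms

# e.g. a 'learning fact of the day' sent to 10,000 students,
# 5 days a week, over a 40-week school year
messages_per_week = 10_000 * 5

standard = campaign_cost(messages_per_week, 40, STANDARD_RATE)
bulk = campaign_cost(messages_per_week, 40, BULK_RATE)

print(f"at retail rates: ${standard:,.0f}")
print(f"at bulk rates:   ${bulk:,.0f}")
```

Even a toy model like this makes clear why 'we'll just ask the provider to waive the fees' is a risky financing plan: at any realistic scale the absolute sums involved are large enough that a negotiated bulk rate, budgeted for explicitly, is usually the more dependable path.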
Whether it is the sender or the mobile provider that ends up covering the cost of sending a text message, pretty much all of these education-related project proposals insist that the cost to the beneficiary (a teacher, a student, a parent) should be *zero*.
The cost of receiving text messages in many countries is already zero, of course, and sending SMS is typically quite cheap as well. When it comes to Internet access, however, standard data rates and packages in most of the world can be quite expensive -- prohibitively so for people with low incomes. Paying so that you can receive information via text message on your mobile phone is one thing -- paying to access the Internet using your phone (or other device) can be another matter entirely.
Recognizing this, for the past few years there has been a movement to make certain types of educational content available for use by people on mobile networks without incurring any costs related to data transfer. When it comes to education, the Wikimedia Foundation famously pioneered this sort of thing by offering a way for people to receive information from Wikipedia via a free text message. Free text messages sound great, you might say, but there's something that would be even better: free access to educational content directly on the Internet itself. After all, even where such content is already available for 'free' online, users often have to pay their mobile or Internet provider in order to be able to download it!
Networked devices of various sorts (phones, tablets) are increasingly cheap and powerful, and in the hands of more and more teachers and students. Improvements in connectivity, however -- more bandwidth, greater reliability, lower costs -- are not happening anywhere near as quickly. Wouldn't it be great if people could use these devices to get access to the wealth of educational resources on the Internet (many of which are provided for free) and not have to pay for the bandwidth that would enable this?
As it turns out, this has actually been happening in some places around the world, a development that has been greeted by different people in different ways -- with delight, with debate, and, in some quarters, with disdain.
Not many education policymakers have entered into related debates, however, perhaps because they are scared away by some of the language and technical focus that characterize discussions around so-called 'net neutrality' issues. In fact, in my experience, few education policymakers are even aware of such discussions, or of why they should care about so-called 'zero-rating' and its potential relevance to, and application in, education.
Mobile Learning Week 2016 begins on Monday, March 7 at UNESCO headquarters in Paris. The fifth such annual international gathering, #MLW2016 will feature a great lineup of speakers who will share information and perspectives on the use of 'mobile technologies' in education around the world, with specific attention to contexts, initiatives, perspectives and innovations in middle- and low-income countries. The program itself looks promising, with a mixture of workshops, a policy forum (together with the ITU) and a two-day symposium, all kicked off by a special online 'debate' at 6pm Paris time organized by the folks at Education Fast Forward ("Innovation & Quality: Two sides of the same coin?").

I expect the real attraction of the event for many won't be found on the official program itself. Rather, it will be the opportunities to meet like-minded folks from around the world who are asking lots of useful questions and doing cool stuff 'on the ground'. A lot of this stuff flies largely under the radar of the press and blogosphere, which directs most of its attention to what's happening in the 'developed' countries of Europe and North America, and so is often not clued into some of the fascinating 'innovations at the edges' that are emerging.
Mobile Learning Week is in many ways a companion event to the annual meeting of the mEducation Alliance, the USAID-led initiative which includes many of the same international institutions as sponsors and participants. The mEducation Alliance has also been bringing together people to talk about what is happening in the 'mobile learning' space in so-called 'developing countries' for five years. As someone who has worked in this area for some time, I'd say it is clear that we all really live in 'developing countries' when it comes to the use of small mobile devices in education, but there have been some notable changes in the nature of related discussions over the past half-decade. In case anyone might care to listen, here are a few of them that I've observed: