Technology in education is a $7.9 billion-a-year industry built on a dual promise: to redefine traditional learning and to alleviate the socioeconomic disparities of American education. The term “educational technology” covers a range of meanings, from Web-based tools to mobile apps to instruction in computer programming. All of these digital permutations share two overlapping priorities: making (or saving) money and providing access. For Silicon Valley entrepreneurs, the big pot of funds available for “edtech” products is an alluring prospect wrapped in a high-minded mission. For schools, such technology carries the promise of innovative teaching and cost cutting. But where digital possibilities meet authentic needs, the complex question of equity in educational technology is usually misconstrued.
The concept of a digital divide, referring to the gap between classrooms with and without computers, gained national attention when President Bill Clinton invoked it in his 2000 State of the Union address. The country has certainly made strides, but nearly a decade and a half later, inequity remains. Only 30 percent of U.S. schools currently have reliable wireless access, and fewer than 10 percent teach computer science. Remarkably, some of the schools that teach computer science are the same ones that lack reliable Wi-Fi; they are forced to teach the subject unplugged, a Sisyphean task.