A Brief History of Computers in Schools
In the United States, desktop computers arrived in classrooms beginning in the late 1970s and early 1980s, at the same time they arrived on the consumer market. A visitor to one of those classrooms would probably have seen one or two computers; nearby, there would have been a box of disks holding applications and data. To start a word processing program, for example, one inserted the disk with the application, then booted the computer. Another disk was inserted to load a document. To use a different application, the user powered the computer down, inserted the appropriate disks, and rebooted.
Before desktop computers arrived in schools, some public schools connected to mainframe computers located at colleges and universities. These localized efforts were centered near colleges in New Hampshire, Minnesota, and Illinois that were recognized as early leaders in computer science research and education. Students, teachers, and other users wrote programs by entering them on teletype terminals in schools. Commands entered on the teletypes were sent over plain old telephone service lines to the mainframes, where they were executed, and the results were sent back to be displayed on the terminal. One of the largest expenses associated with these efforts was the cost of long-distance telephone charges.
Eventually, the teletype systems were abandoned, and one-computer classrooms were abandoned as well. As demand for computer courses increased, larger numbers of desktop computers were installed, and rather than being placed in individual classrooms, they were installed in “computer labs.” Until the early years of the 21st century, computer labs were the dominant model of computer-based education. In elementary schools, classroom teachers took all their students there at once for special instruction at regularly scheduled times, and in high schools, students enrolled in computer literacy or programming courses as electives. Teachers also took students to these computer labs to write papers and work on similar projects.
Late in the 1990s, two programs supported by the federal government increased the computing infrastructure in schools in the United States. Technology Literacy Challenge (TLC) grants provided funds to purchase computers and support professional development for teachers. The Schools and Libraries Program of the Universal Service Fund (E-Rate) provided financial support to connect computers to the Internet and sustain internet access. Schools have since largely assumed responsibility for purchasing devices and training teachers, while E-Rate funds continue to support internet access in schools. Another effort, led by local activists and called NetDay, brought in volunteers to install the cabling necessary to connect the new computers purchased with TLC funds to the Internet connections supported by E-Rate. This was necessary because most school buildings predated computer networks, and few budgets allowed for the considerable capital expense of installing Cat-5 cable throughout buildings.
At about the same time, the findings of the Apple Classrooms of Tomorrow project were influencing many educational technology initiatives. One of its important observations was that access to computers alone was not sufficient to create effective lessons. Teachers needed help understanding how to use the devices in their teaching as well as how to operate them. This explains, in part, the inclusion of support for professional development in the projects that were awarded TLC funds. Even today, IT professionals are involved in training teachers and others to use IT systems and, in some cases, to teach with the systems that they manage.
As high-speed Internet access reached each instructional space, and as teachers began to gain experience teaching with computers, there was increasing interest in moving computers back into classrooms. It was reasoned that teaching with computers required that they be in classrooms, available during all classroom activities, rather than being the focus of special learning activities, as was common when computers were confined to labs.
Since about 2010, one-to-one computing and cloud-based computing have come to dominate school computing. In many schools, students carry Chromebooks with them, and sometimes they take them home. (While the market share of educational computing devices is difficult to ascertain, estimates are that Chromebooks represent over 60% of the devices purchased for school users.) Some schools do continue to maintain computer labs for special functions, and computers with full operating systems for administrative staff, but in many schools, Chromebooks are the only devices maintained by IT professionals. Google Workspace provides most students with productivity applications, and student information systems (including grade books), library card catalogs, and learning management systems are web-based, so students access them from home and school alike. Because those systems are based in the cloud, robust, reliable, and secure networks are essential to school functions.
While the move to cloud-based computing has many benefits for students, teachers, and IT professionals, it has introduced inequity into education. The term “digital divide” has been used to describe inequitable access to digital learning for generations. Originally, it described the fact that marginalized populations attended schools with fewer computing devices. It has also been used to describe inequitable access to high-quality instruction with digital tools. As cloud computing became ubiquitous, it came to describe inequitable access to the network connections needed to use those resources away from school.