In 1913, Thomas Edison proudly proclaimed, “Books will soon be obsolete in the schools…It is possible to teach every branch of human knowledge with the motion picture. Our school system will be completely changed in the next ten years.”[1]

Perhaps we should all be reminded that Edison was not an educator. He was an inventor… and a shrewd, litigious businessman. In 1897, Edison was granted a patent for the Kinetograph, the motion picture camera whose films were viewed on his Kinetoscope, an early device for showing motion pictures. He also owned a majority of the other patents related to motion picture cameras. In 1909, Edison joined with other American film companies to form the Motion Picture Patents Company (MPPC), which protected their patent interests and severely limited new competition. In 1917, however, the Supreme Court found the MPPC in violation of the Sherman Antitrust Act, and the trust was dissolved. The following year, Edison left the film industry altogether. I wouldn’t be surprised if his interest in education and motion pictures ended at the same time.

Edison’s comments about the educational possibilities of film in 1913 should be read through his financial interests, not his pedagogical concerns. He was trying to make money in a field that is – understandably – interested in improving its methodology and overall efficacy. Moreover, Edison was part of an era in which new visual media, such as glass lantern slides, stereoviews, and even school museums, were championed as important advancements in education.[2] It must be added that these products were pushed by businesses as shrewd as Edison’s own, such as the Keystone View Company, which published a teacher’s guide to using lantern slides and stereoviews in the classroom entitled Visual Education.

This trend has continued into the twenty-first century. In her review of the worst of education technology in the past decade, Audrey Watters highlights the lingering presence of commercial interests in education, placing “venture capitalism” and “(venture) philanthropy” third and second, respectively, on her list of the 100 worst debacles of the 2010s (surpassed only by anti-school-shooter software at number one). Unsurprisingly, Watters notes that venture capitalists and venture philanthropists are responsible for many of the other entries on her list.

It is not uncommon, when I talk to my colleagues in higher education, for them to ask whether the insights I’ve gleaned from instructional development represent trends or fads. I always respond first by trying to draw a fine distinction between instructional development and instructional technology, although they are inextricably intertwined. Instructional technology, by its nature, is trendy because so much money (and cultural capital) is invested in advancing technology generally. The first real wave of calls for instructional change (and arguably the origins of instructional development) came on the heels of new technological advancements in visual media in the early twentieth century. Unsurprisingly, the continued advancement of technology, with computers, cell phones, and the internet as the most recent and significant examples, has also led to calls for educational change, reflected in new initiatives around the expanded use of MOOCs, phone apps, or online videos, to name just a few.

Instructional development does not tend to be as fickle, but it is also not immune to fad chasing. Nevertheless, there has been a slow and steady accumulation of research over the past few decades developing a repertoire of strategies to increase teaching efficacy. Now-standard aspects of university education, such as formulating clear learning objectives, derive from earlier theories about criterion-referenced measures first postulated by Robert Glaser and others in the 1960s.[3] Even the more recently contentious arguments over the move away from strict lecturing (not the abolishment of all lecturing!) have roots in insights about constructivism and active learning in the 1990s. One could argue there has been a slow building of knowledge about pedagogy over the past few decades, not a flip-flopping between mutually exclusive teaching methods. I’ve discussed the history and goals of instructional development elsewhere and will refer the reader to that post [here: Instructional Design in Higher Education: What is It?].

Though anecdotal, the advice I’ve gotten from most folks in instructional development has always been strategic and specific, not global. Group activities work in some scenarios, while in other scenarios a lecture will be better. I’ve always seen instructional development as building a toolbox of teaching skills that can be deployed when rhetorically and pedagogically appropriate. Even when I’ve talked about educational technology, I’ve always seen it as a tool that needs to be appropriate to one’s pedagogical purpose. The lingering question should not be how we integrate technology into our classrooms, but whether technology can sustain, or even advance, our educational aims. We should not adapt our teaching just to incorporate technology; to me, that is pedagogically unsound.


[1] Quoted in Reiser 2001a: 55.

[2] Reiser 2001a: 55.

[3] Reiser 2001b.


  • Reiser, Robert A. 2001a. “A History of Instructional Design and Technology: Part I: A History of Instructional Media,” Educational Technology Research and Development, Vol. 49, No. 1, pp. 53-64.
  • Reiser, Robert A. 2001b. “A History of Instructional Design and Technology: Part II: A History of Instructional Design,” Educational Technology Research and Development, Vol. 49, No. 2, pp. 57-67.
