There’s growing recognition today of a huge problem slowing innovation in personalized learning: we don’t have a clear pipeline for preparing and developing personalized learning teachers. Although many aspects of teaching translate across personalized and traditional settings, the schools driving personalized learning forward often find that their teachers need some additional skills and mindset shifts that they just don’t pick up in traditional teacher preparation.

The solution, as many funders, experts, and school leaders see it, is collective action. They talk of bringing together a diverse array of stakeholders to define a common set of educator competencies and then working with established teacher education programs to create new pathways for developing next-generation educators.

On the surface this approach makes sense; no single organization today has the scale to impact all of K–12 education. But if we look at how innovation problems have played out in other sectors, it’s clear that the “collective action” approach will likely fail to produce the pipeline of excellent personalized-learning teachers that the field needs.

The collaborative approach

A close analogy is the problem that the computer industry wrestled with when trying to launch touch-screen devices. Today everyone knows the story of how Apple created an entirely new product category with the iPad. What’s less known, however, is that PC makers tried for roughly a decade ahead of Apple to launch mobile tablets. So why did the PC makers flounder and Apple succeed? The answer is multifaceted, but innovation theory makes clear that “collaboration” was a major hindrance to success.

PC makers, like many education thought leaders today, tried developing something new with an ecosystem of partner organizations. No one company had enough scale across PC components to make a complete touch-screen device, so companies like HP and Lenovo worked on the overall hardware architecture; Microsoft made the operating system; and a host of other companies supplied the central processors, hard drives, etc. These companies thought they had everything they needed to make a successful mobile tablet: the core components and specifications were basically the same as the desktop and laptop machines they had built together in the past; the new devices just needed to be compact and touch-compatible.

A restricted design

But the devil was in the details. No one really knew how to design a great mobile tablet because it had never been done before. Getting the form factor just right in order to nail what customers needed meant making important tradeoffs between interdependent components—things like processing speed, weight, software compatibility, and cost. However, with all the companies relying on predetermined standards and specifications to define how the interdependent components would work together, no one had the design freedom to experiment with all the important feature tradeoffs to get the user experience just right. In other words, the ability to continue innovating on the overall design of the devices was ultimately sacrificed for the common set of group-determined design goals. Yet this freedom to test and experiment with new designs was critical for early innovation, since no one had yet proven how to design a great tablet.

The result: devices born of these partnerships came to market, but they never gained much traction beyond tech enthusiasts. They were too heavy to carry comfortably in one hand, their screen buttons and menus didn’t work very well with fingers, and their batteries didn’t last very long when untethered from a power cord. As this example shows, the supplier partnerships that worked well for building laptops actually held back efforts to create great tablets.

An explanation in Modularity Theory

Clayton Christensen’s Modularity Theory illustrates one important reason why the PC makers’ approach proved less effective. According to the theory, when new innovations are still stretching to meet our expectations, the best strategy for pushing a product’s performance forward is for a single entity to control all the interdependent pieces of the solution (e.g., the processor, screen, memory, and operating system) that affect performance. Only by doing this can innovators gain the degrees of freedom they need to tinker with the interdependent components of a solution to meet customers’ expectations. If a single PC component supplier had integrated its business across all the interdependent parts of a mobile tablet, it would have been better able to meet customers’ needs.

Hopefully, those working to develop teacher pipelines for personalized learning don’t make the same mistake. Although we have a rough idea of the instructional models, teaching practices, and educator mindsets and skills (i.e. interdependent components) we want teachers of the future to have, we’re still a ways off from having clear and reliable blueprints for effective personalized teaching and learning. Given this current reality, there’s little chance any collaborative group of stakeholders is going to collectively develop clear and common standards for defining the teacher of the future at this stage in the field’s development.

Potential solutions

So, what should personalized-learning proponents do instead? Given where the field of education stands today, the best solutions are going to come from integration. Rather than working to build consensus on common educator competencies and form partnerships with established teacher education programs, the field should focus on supporting leading innovators, like Summit or Lindsay Unified, in developing their own integrated talent pipelines to meet the needs of their particular contexts and instructional models.

This integrated approach is not without precedent in the education space. A decade ago, when a few equity-focused charter school networks in New York City found that traditional teacher preparation programs weren’t preparing teachers in line with their instructional philosophies and approaches, they launched their own teacher preparation program, which went on to become the Relay Graduate School of Education. Now with sites in 14 different metropolitan areas, Relay provides a unique, practice-oriented approach to teacher preparation, and its graduates go on to work across the district and charter landscape. In a parallel fashion, I can imagine Summit’s teacher residency, or something like it, becoming for personalized learning schools what Relay is for equity-focused schools.

The tale of touch-screen tablets also bears testament to the wisdom of an integrated approach. By engineering the iPad from end to end, Apple could be more strategic about tradeoffs between various design decisions in order to make sure it could deliver the optimal user experience. 

Lastly, for those worried about the need for partnerships in order to reach scale, Modularity Theory also offers hope. The theory predicts that modular, partnership-based solutions can eventually work—and may well dominate—once the integrated innovators pave the way. For example, Android and Windows tablets—whose components come from multiple suppliers—have gained substantial shares of the mobile device market today; they just needed Apple to first show the world how a good tablet should be made.

When it comes to training personalized learning teachers, modular “partnership” options will have only mediocre success until a single organization with an integrated solution proves how to do personalized learning and teacher development really well. Thus the better strategy, at least for now, is to put our bets on integrated solutions.


  • Thomas Arnett

    Thomas Arnett is a senior research fellow for the Clayton Christensen Institute. His work focuses on using the Theory of Disruptive Innovation to study innovative instructional models and their potential to scale student-centered learning in K–12 education. He also studies demand for innovative resources and practices across the K–12 education system using the Jobs to Be Done Theory.