Why chatbots make me nervous

Jul 7, 2017

For the past three years, I’ve been tracking down any and all edtech tools I can find that introduce students to new relationships—be those one-off interactions with experts over video chat, short-cycle tutoring sessions, or ongoing interactions through mentorship models that offer blended online and in-person interactions between young people and adults.

In the digital age, this may sound like a near-infinite category of tools. The distinguishing factor in what I’m researching, however, is identifying tools in which a human being exists on the other side of the digitally enabled interaction. That’s because an interaction with a real human plants the seed for a new or stronger relationship. And nurturing those seeds is critical: we know that throughout life and work, people’s networks—their reservoirs of social capital—are important drivers of opportunity and well-being.

But I’m finding that my focus on tools that help students forge new relationships can sound quaint in 2017. The past year’s swell of enthusiasm for automated interactions facilitated by the likes of AI, virtual reality, and chatbots makes my growing market map of relationship-enabling tools seem almost arcane. Who needs new, technology-enabled relationships when the one-off benefit of a relationship—such as sound advice or on-demand academic help—could pop up through a virtual chatbot programmed to meet your every need?

Chatbots’ disruptive potential

Now, I realize that I work at a think tank focused on disruptive innovation. Ergo, you might presume that I’d be delighted with anything with “bot” in its name. These emerging chatbot technologies do indeed hold massively disruptive potential, according to our own litmus test of disruption. A chatbot-enabled guidance model, for example, dramatically drives down the cost of delivering accurate information or targeted advice. Business models deploying chatbots tend to get their start in areas of nonconsumption, where students’ alternative is nothing at all at worst or an aimless Google search for information at best. And all such models hinge on a technological core—algorithm-based outreach and response—that positions them to scale quickly.

But sometimes spotting disruptions early on can be as much about warning people of those disruptions’ potential unintended consequences as celebrating their potential upside. Especially in a market like education, unless the metrics driving the market align to what’s best for students in the long run, disruptive innovations may scale without delivering on the impact promised or intended at the outset. In the case of chatbots in education, the metrics that investors and postsecondary institutions seem to be touting may be too shortsighted: retention and graduation. To this end, these tools do stand to tackle real access gaps—access to advice, accurate information, and nudges that can keep students well on track. Indeed, early data suggests that this access can make a real difference against indicators like college persistence and “summer melt,” especially among at-risk students.

But against these metrics, chatbots stand to offer the discrete benefits that a relationship with a human might offer at a single point in time. They are not, however, reliably scaling access to relationships themselves or the benefits that relationships stand to deliver over a longer time horizon.

Scaling help vs. scaling relationships 

The data on the benefits of completing college is undeniable, and if chatbots can produce useful nudges to get there, we should celebrate their potential. If we take a longer view, however, the reality is that gaps in students’ access to sound advice, on-demand support, and meaningful guidance are just the tip of the iceberg of a much bigger gap shaping students’ trajectories in postsecondary and beyond: relationship gaps. Students from more affluent backgrounds, on average, have measurably larger networks and report knowing significantly more adults working in the knowledge economy. They also, on average, have more informal mentors—coaches, parents’ friends, etc.—to whom they can turn. These relationships can provide not just sound advice or guidance, but internship and job opportunities, social safety nets, and emotional support.

And while chatbot technology may expand access to one-off accurate advice that otherwise remains locked in those more exclusive networks, that advice will not make a meaningful dent in relationship gaps over time, as new and different circumstances arise in students’ lives at school and beyond. Chatbots, in other words, don’t stand to expand students’ stock of social capital—they just provide a stopgap solution when a student’s network otherwise isn’t providing the information or interventions they need.

Skeptics of this argument will point out that today chatbots in higher education aren’t “replacing” humans. Rather, they are funneling students more efficiently to human resources on campuses. Chatbots, in other words, can be paired with relationship-rich environments to lend efficiencies to helping students continue to build deeper and more diverse networks. For example, a chatbot could lower the search costs of helping students find the person who can answer their questions—rather than providing that answer itself. Those models do exist, although they incur marginally higher costs to institutions.

Still, if chatbots take on greater and greater roles in doling out help and advice, they will start to replace humans. This will likely offer real cost savings for universities that are trying to keep students enrolled. And it could, as early data suggests, continue to serve as a promising lever to keep students on track to graduate. But over time, it risks flattening or even depleting, rather than increasing, students’ access to social capital and human relationships. And that social capital proves to be a crucial asset as students go searching for everything from internships to jobs to emotional support. Ironically, insofar as chatbots are used as retention tools, we may find ourselves deploying automated help for students at risk of dropping out, the very students who might benefit most from broader and deeper networks after graduation.

I hope I’m wrong—I hope that we start to see institutions taking a longer view and edtech tools finding creative ways to pair chatbots with human help that can plant the seed of real, potentially enduring relationships in students’ lives. Our social capital is our reservoir of connections that help us get ahead and our net of connections that help us get by. From what I see so far, chatbots rarely fill that reservoir or weave that net.

Julia is the director of education research at the Clayton Christensen Institute. She leads a team that educates policymakers and community leaders on the power of disruptive innovation in the K-12 and higher education spheres.