Artificial Intelligence Comes to Learning

Published November 21, 2017

By Zach Posner

Amazon, Pandora and Netflix pioneered our technology-driven future, making personalized recommendations a reality for millions. The time has come to take on learning and make a learner-centered vision of education a reality.

Artificial Intelligence

Artificial intelligence, an area of computer science concerned with the ability of machines to simulate human intelligence, is swiftly remaking the world of work.

From health care to industrial manufacturing, the ramifications are endless. In marketing, chatbots have already transformed customer experience. On-demand personalized service is the new norm.

Within human resources functions, the applications are numerous. From recruiting to personalized coaching and development, it will soon be unthinkable to manage human assets without machine intelligence capable of responding to the unique needs of every individual. For this reason, there is one area within HR for which artificial intelligence is uniquely suited: learning and development.

AI and the Knowledge Explosion

Artificial intelligence is well suited for learning because the needs of learners have grown increasingly complex and difficult to satisfy with a one-size-fits-all approach. There has never been so much knowledge in the world. Due to the increasing pace of change in business, there has never been so much debate about what to learn and how to learn it. Increasingly, learners need artificial intelligence to help them navigate knowledge in a way best suited to them.

The very concept of knowledge is now fraught with all the political and economic tensions of the day. With the growth of user-generated content, websites and blogs, content is proliferating at an astronomical rate. There has never been so much to read and learn.

The concept of the long tail, most saliently reflected at Amazon, means that any content, however niche, can find its ideal audience as long as the engine cataloging it and powering its discovery is up to the task. Optimists say this matching of content with every need and taste is a strength of the internet. More cynical observers point to its destructive aspects. Fake news is now here to stay. For better or worse, anyone can now exist in a world that mirrors whatever content they choose.

With that comes the question of what is necessary to learn. With so much to read and digest, what does it mean to be educated or proficient in an area? What knowledge is unquestionably part of a domain? Few industries have a broad certification exam, as with the CPA designation for accountants or the CFA series for finance.

The concept of a canon, as in English literature, is also debated. Who deserves to tell everyone else what must be read and learned in order to be considered educated? And now that all knowledge is theoretically archivable and searchable, what does it mean to be knowledgeable about an area, reading aside?

Some argue memory is no longer necessary — that basic recall no longer counts as knowledge since a computer can handle it. Perhaps standards have been raised such that application, evaluation and creation of knowledge are the bedrock skills.

Then there are authors such as Nicholas Carr who argue our reliance on AI to perform simple automated tasks like factual recall is decreasing our cognitive capacity and that it’s cognitive work — the simple act of struggling to remember something — that produces intelligence in the first place. In other words, can you apply, evaluate and create knowledge if you can’t recall knowledge in the first place and have grown accustomed to outsourcing that function to computers?

AI and Adaptive Learning at Work

These developments mean that employees arrive at the workplace with dramatically different strengths, weaknesses, backgrounds and beliefs about what constitutes knowledge and skill. Jobs and careers no longer follow predictable paths. Entirely new roles have been invented. Legions of new graduates now aspire to roles like “people scientist” and “master data storyteller.”

“The future of business is chaos,” wrote Robert Safian in a 2012 Fast Company article. This chaos is producing a new generation of workers who embrace instability, continually reinvent their careers and business models, and are poised to thrive in the new economy. He called these new workers Generation Flux.

A growing distrust of formal education has fed the widespread belief that technical skill, meaning mastery of various tools and software, is paramount. Some refine this and say it is not mastery of specific tools and software that matters, but rather generalized computational thinking: knowing how to make computers work for you. Others maintain that an education teaching how to think rather than what to think is what works in today’s fast-paced and disruptive world. It is not just computational thinking but systemic thinking and general analytical ability that matter.

Regardless, those who thrive in this environment are agile and adept at redefining themselves continually. It’s tiring for some and energizing for others because of the massive opportunity available.

Badging and microlearning have been touted as the solutions to the chaos and complexity of the new knowledge economy. If it’s not clear what to learn and how to learn it, bite-sized content makes it a lot easier to consume a range of content quickly.

Let’s take this a step further and suggest that the next generation of content has now arrived: It is no longer enough for content to be bite-sized, modularized and conducive to recombination. It must now be backed and synchronized by data, and personalized through artificial intelligence, as if governed by a robot that perfectly understands you, your needs, your background and your style. The technological term for this is adaptive learning.

What’s adaptive learning? It’s when learning data is used to personalize a learner’s experience so that it adapts in real-time to behavior and performance. Every moment is optimized so the perfect piece of content is presented at the perfect time for each learner.
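To make that real-time adaptation concrete, here is a minimal sketch of one selection loop: serve the next question from whichever topic shows the lowest observed accuracy. Every name, data structure and rule here is invented purely for illustration; real adaptive platforms use far richer models.

```python
import random

def next_item(items, history):
    """Pick the next practice item for a learner.

    items:   dict mapping topic -> list of questions
    history: dict mapping topic -> list of booleans (True = correct)
    Serves the topic with the lowest observed accuracy, so practice
    adapts in real time toward the learner's current weakest area.
    """
    def accuracy(topic):
        results = history.get(topic, [])
        if not results:
            return 0.0  # unseen topics are treated as weakest
        return sum(results) / len(results)

    weakest = min(items, key=accuracy)
    return weakest, random.choice(items[weakest])

# Usage: a learner strong on "budgeting" but weak on "forecasting"
items = {"budgeting": ["Q1", "Q2"], "forecasting": ["Q3", "Q4"]}
history = {"budgeting": [True, True, True], "forecasting": [True, False, False]}
topic, question = next_item(items, history)
# topic == "forecasting": the lowest-accuracy area is served next
```

A production system would also weigh confidence, response time and forgetting, as described below, rather than accuracy alone.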

The adaptive learning field, which uses computers to actively tailor content to each individual’s needs, draws upon knowledge domains as diverse as machine learning, cognitive science, predictive analytics and educational theory to make this learner-centered vision of education a reality.

This isn’t anything new. It is already the standard in our consumer lives. Amazon, Pandora and Netflix make personalized and continually tailored recommendations based on your past decisions and behavior. This trend is shifting from consumer entertainment to corporate learning.

How does it work and how do we ensure that learning accounts for holistic mastery? That it emphasizes efficiency and engagement as well as short-term and long-term goals?

The Four Educational Theories Embedded in AI

Adaptive learning comes in different varieties, but there are four educational theories embedded in the algorithms that power the personalized learning paths taken on our platforms. Together, these algorithms deliver holistic mastery, or true knowledge acquisition:

  • Metacognitive Theory: This theory holds that learners learn best when they know what they don’t know. From a technology standpoint, as learners move through content the platform captures data concerning accuracy, confidence and time. The platform takes this data into account, serving up content that helps each learner increase accuracy but also improve awareness around accuracy and confidence level so that the learner walks away "knowing what they know."
  • The Theory of Deliberate Practice: This principle holds that understanding where we are weakest helps us focus our practice. To address this, an AI platform continuously adjusts the content to focus on each individual’s weaknesses, ensuring that time is used efficiently and effectively.
  • The Theory of Fun for Game Design: Learners are most engaged when challenged but not too challenged. If too many questions are answered wrong in a row, for instance, an AI platform will serve up a question that proves a quick win for the learner and builds confidence.
  • Ebbinghaus Forgetting Curve: This theory holds that to truly learn something, learners need to commit it to long-term memory and that the best time to do so is just before learners are about to forget it. Incorporating this theory, AI platforms can use data to predict when someone is most likely to lose a concept from short-term memory and recharge it so that the learner commits it to long-term memory.
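The forgetting curve above is commonly modeled as an exponential decay of retention, R = e^(-t/S), where t is time since the last review and S is a memory-strength parameter. A small sketch of scheduling a review just before retention drops below a threshold; the threshold and strength values are chosen purely for illustration:

```python
import math

def retention(t_hours, strength):
    """Ebbinghaus-style retention estimate: R = e^(-t/S)."""
    return math.exp(-t_hours / strength)

def hours_until_review(strength, threshold=0.7):
    """Solve R = threshold for t: schedule the review just before
    retention falls below the threshold, i.e. t = -S * ln(threshold)."""
    return -strength * math.log(threshold)

# A concept with an assumed memory strength of S = 20 hours:
t = hours_until_review(20)  # about 7.1 hours until the next review
# At that moment, retention(t, 20) has decayed to exactly the 0.7 threshold.
```

Each successful review would then increase S, pushing the next review further out, which is the core of spaced-repetition scheduling.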

Learning is ripe for the artificial intelligence revolution. The needs of learners are increasingly complex. The incredible explosion of knowledge makes it nearly impossible to master a field as one once could. The pace of change in business means that by the time a learner has developed a new skill, it is already obsolete. Learners need artificial intelligence to help navigate knowledge in a way best suited to them and to the good of the organization.

This article was originally published in Chief Learning Officer Magazine online on October 23, 2017.