Why Leaders Must Inject the Possibility of Error Into AI
Dependence on artificial intelligence in business is now widespread. In fact, it can be argued that businesses that do not leverage AI will eventually dissolve; it is that powerful a productivity tool. Still, you have likely heard the stories of AI serving up incorrect information. It is, after all, a garbage-in, garbage-out system. Even a platform like Perplexity, which consults 8 to 10 different AI tools, each with its own database of information, and forms a consensus among them to provide the best result, sometimes just plain gets it wrong. That is an important lesson for managers of any business to keep in mind when leveraging AI.
FROM INC MAGAZINE / BY LARRY ROBERTSON, FOUNDER, LIGHTHOUSE CONSULTING
To work hand-in-hand with humans, AI needs to anticipate that it will be wrong.
Some jingles stick in your head, and for me, the Safelite tune is one. When you read the words “Safelite repair, Safelite replace,” there’s a good chance you too can provide the background music all on your own.
Like other marketing melodies, the aim of Safelite’s is to help you remember who they are and what they do. For a car window replacement company, their branding ditty is a well-conceived and thoughtfully created business tool. As proof, when I recently had a rock kick up and break my car window, my first thought was, “Call Safelite.” That’s when I met Scarlett.
Safelite is the largest and most successful car window repair business in America. Scarlett is Safelite’s customer-facing AI tool, and Scarlett is all about efficiently helping you. The tool offers immediate support with no waiting in a call queue, can schedule an appointment for you, and can file your insurance claim. It can also, seemingly, cover your every need when something goes wrong.
If that were the end of the story, you might conclude, especially as a leader of a business speeding to use AI, that it really works. But what if you’re wrong?
Seeking perfection by adding imperfection
By “What if you’re wrong?” I don’t mean wrong about AI’s potential to streamline your processes. I’m asking instead how good your AI will be when things in your business go unexpectedly wrong. This isn’t the top-of-mind question most think to ask when rushing to offer AI solutions to keep up with the Safelites of the world. By and large, the nearly singular emphasis is on how AI can enable an organization’s systems to work in the ideal.
However, organizations are imperfect. They can’t anticipate every scenario, and they will make mistakes; that isn’t all bad, either. Such imperfection is central, and indeed necessary, especially when it comes to innovation. If you fail to leave room for error, including in your AI design, you raise the risk of missing out on what you’re after: satisfied, loyal customers who stick with you even when you mess up.
An all-too-common case in point
Safelite is far from alone in this oversight. Still, my recent experience with them is a teachable moment. At the start, Scarlett appeared to me to have covered all my needs to get my repair work done. Information about my broken window was taken in detail, from the vehicle’s make, model and identification number, right down to the color of the window tinting. My appointment was easily scheduled in just minutes.
Over several days, I received regular reminders to be ready for my at-home repair. They even guided me to make sure the workspace at my home was safe for their repair person, an unanticipated human touch. I also received regular confirmations telling me my replacement part had been searched for, ordered, and shipped.
All of it raised my confidence in the company and the only representative I’d interfaced with: Scarlett. All of it suggested that Safelite was ready to serve me. Then, the morning of the planned repair, I learned my appointment was canceled and my part order pulled.
As it turned out, as efficient as Scarlett appeared, she failed to do the most important thing: inform and check in with Safelite’s own systems and employees. When the human team finally engaged with the information Scarlett had been gathering, they found that the promised part wasn’t in stock. It could take days to get it by special order.
By their own logic, different from Scarlett’s, the humans concluded that the appointment and part order should be canceled. Caught between the AI’s part of the process and the people’s was me, the customer. Because of that, neither Scarlett nor her human counterparts considered it their job to reach out and explain what had happened. I stumbled on the information by accident.
Know your business and why you’re in it
Days into the process, I was back at square one. As you might imagine, anything Scarlett had done to impress me at the beginning was lost, because I never got the thing that brought me to Safelite in the first place: they neither repaired nor replaced my broken window.
The signs suggest that Scarlett was never programmed to consider what should happen if something in the system went wrong. It was a blind spot, and one many companies developing their own AI tools don’t recognize they have, too. This isn’t to suggest that AI should be able to predict every scenario or know how to fix each one. Yet Scarlett didn’t even know how to hand the baton back to the human side for help when things went astray. Customer care means care in every scenario. That was something Scarlett, even if only by oversight, had never been taught. That’s a human error.
Any leader with any experience knows that as much as you plan and strategize for the ideal, reality rarely delivers it. The journey of business is messy, especially in uncertain times. My guess is that Safelite’s leadership knows this, at least conceptually. In practice, to really understand a business and give it the kind of support AI is capable of, the technology must reflect the business’s reality. Absent that, it’s no better than a forgotten jingle, only far more costly.