Algorithms And The Stories They Tell

“The Value [of products and services] will increasingly come from being great at reading the tea leaves in the data” — Randy Komisar (Partner at Kleiner Perkins Caufield & Byers)

Sensors are listening to us and gathering more of our data than ever before. The unspoken pact we have with machines is that their algorithms will “read the tea leaves” in that data and either tell us a story about ourselves in a new way or help us better manage aspects of our lives by keeping us informed.

For example, the motion sensor in a Fitbit listens to our movement, and its algorithms tell us a detailed story of our exercise and sleep (both quantity and quality) and where we need to improve.

Yet in spite of the ever-increasing sophistication of algorithms and machine learning, the stories that machines tell us don’t always make us more aware of our behavior, better informed about ourselves, or better equipped to make choices. Here are three examples of how algorithms can underdeliver on the value they promise:

  1. Although sensors are becoming ubiquitous, there are always gaps in the signal, resulting in stories with thin or no data to back them up.

  2. Algorithms are designed to optimize around specific metrics, and sometimes they optimize to a fault, no longer providing value to people.

  3. The intelligence behind an algorithm and its recommendations may not be transparent to users, making them difficult to trust.

Gaps in the data signal

In recent years the consumer health tech industry has exploded with “wearables” and apps that track our sleep, our exercise, our heart rates and more. Algorithms look at that data and offer recommendations on how to improve our health. The Apple Watch tells its user, “You got 5 minutes of exercise, but you should be getting 30 minutes.” But how does it know I didn’t take my watch off and go for an hour-long swim? Or that it hasn’t been tracking my movement because I simply forgot to charge it last night?

There are always gaps in the data, and the problem is only compounded when there is no way for humans to manually correct the record.
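As an illustration, a tracker could at least notice when it has been flying blind instead of reporting the silence as inactivity. Here is a minimal sketch in Python; the sample timestamps and the five-minute threshold are invented for illustration and don’t reflect any vendor’s actual data or API.

```python
from datetime import datetime, timedelta

# Minimal sketch: flag gaps in a wearable's sample stream before
# summarizing the day. The timestamps and the 5-minute threshold are
# invented for illustration, not any vendor's real data or API.
MAX_GAP = timedelta(minutes=5)

def find_gaps(timestamps, max_gap=MAX_GAP):
    """Return (start, end) pairs where no samples arrived for > max_gap."""
    ordered = sorted(timestamps)
    return [(prev, curr)
            for prev, curr in zip(ordered, ordered[1:])
            if curr - prev > max_gap]

samples = [
    datetime(2015, 6, 1, 8, 0),
    datetime(2015, 6, 1, 8, 1),
    # watch off the wrist (or battery dead) during an hour-long swim
    datetime(2015, 6, 1, 9, 2),
    datetime(2015, 6, 1, 9, 3),
]

for start, end in find_gaps(samples):
    print(f"No data from {start:%H:%M} to {end:%H:%M}; "
          "don't assume zero activity here.")
```

A flagged gap could then be surfaced to the user (“we weren’t tracking from 8:01 to 9:02; what were you doing?”) rather than silently counted as zero exercise.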


Optimizing to a fault

The Facebook feed is the lens through which users see the stories of their world of connections. By “liking” and commenting on posts in their feed, users signal to the social network’s algorithm that they care about a particular story or message. That in turn influences what the algorithm shows in their feed later.

But the algorithm has its own selfish goal as well: to deliver as many promoted articles and ad views as it can to a user’s feed. So what happens if a user likes everything he sees on Facebook for two days? Mat Honan did exactly that, and the result was very telling. His News Feed took on an entirely new character, devoid of any sign of friends and family. His feed became about brands and messaging, rather than real people with messages about their lives.
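A toy ranker shows how this happens: if the feed simply sorts by predicted engagement, indiscriminate liking inflates that prediction for brand and clickbait posts, and friends drop out of view. This is a deliberately simplistic sketch with invented posts and scores; Facebook’s real ranking system is far more complex and not public.

```python
# Toy sketch of an engagement-maximizing feed ranker. The posts and
# scores are invented; Facebook's real ranking model is far more
# complex and not public.
posts = [
    {"source": "friend", "text": "Baby photos!",        "expected_clicks": 0.8},
    {"source": "friend", "text": "New job update",      "expected_clicks": 0.6},
    {"source": "brand",  "text": "FLASH SALE!!!",       "expected_clicks": 1.9},
    {"source": "brand",  "text": "10 weird tricks ...", "expected_clicks": 2.4},
]

def rank(feed):
    # One metric, optimized to a fault: whatever the user clicked on
    # most floats to the top, no matter what it is or who posted it.
    return sorted(feed, key=lambda p: p["expected_clicks"], reverse=True)

for p in rank(posts):
    print(f"[{p['source']:>6}] {p['text']}")
# After two days of liking everything, the click signal inflates for
# brands and clickbait, and friends sink to the bottom of the feed.
```

Nothing in the ranker asks whether the user actually values what they click on; the single metric stands in for value, and the feed optimizes it faithfully.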

In more extreme cases, an algorithm can optimize its way into damage at a staggering scale, such as the 2010 Flash Crash, in which trading algorithms briefly wiped roughly a trillion dollars off the stock market in a matter of 36 minutes.


Lack of transparency or “Opaque Intelligence”

UPS reportedly spent 10 years developing its Orion algorithm to give drivers the most efficient route for completing their daily deliveries. The algorithm saves a dollar or two here and there, but scaled across UPS’s more than 55,000 daily delivery routes, the savings are huge: a single dollar saved on each route is $55,000 a day, or millions of dollars a year.

According to a WSJ article, “Driver reaction to Orion is mixed. The experience can be frustrating for some who might not want to give up a degree of autonomy, or who might not follow Orion’s logic. For example, some drivers don’t understand why it makes sense to deliver a package in one neighborhood in the morning, and come back to the same area later in the day for another delivery. But Orion often can see a payoff, measured in small amounts of time and money that the average person might not see.”

The article continues: “One driver, who declined to speak for attribution, said he has been on Orion since mid-2014 and dislikes it, because it strikes him as illogical.”

Alex Tabarrok, in The Rise of Opaque Intelligence, writes: “the problem isn’t artificial intelligence but opaque intelligence. Algorithms have now become so sophisticated that we humans can’t really understand why they are telling us what they are telling us.”
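A toy example makes the drivers’ confusion concrete. Say two stops sit in neighborhood A and one in neighborhood B, and one of the A customers only accepts deliveries in a narrow late-morning window. A brute-force search over every route order (a minimal sketch with invented travel times and windows; nothing here reflects Orion’s actual model) shows that leaving neighborhood A and coming back really is the shortest day:

```python
from itertools import permutations

# Toy sketch of why a route optimizer may revisit a neighborhood.
# Locations, times, and windows are invented for illustration; this is
# a brute-force tour with delivery windows, not UPS's actual Orion.
TRAVEL = {  # symmetric driving times in minutes
    ("depot", "A1"): 10, ("depot", "A2"): 10, ("depot", "B1"): 40,
    ("A1", "A2"): 5, ("A1", "B1"): 30, ("A2", "B1"): 30,
}
WINDOW = {  # (earliest, latest) delivery time, minutes after 9:00 a.m.
    "A1": (0, 480),
    "A2": (120, 150),   # this customer only accepts 11:00-11:30
    "B1": (0, 480),
}

def travel(a, b):
    return TRAVEL.get((a, b)) or TRAVEL[(b, a)]

def cost(order):
    """(shift end, minutes driven); infeasible routes cost infinity."""
    clock, driven, here = 0, 0, "depot"
    for stop in order:
        leg = travel(here, stop)
        clock, driven, here = clock + leg, driven + leg, stop
        early, late = WINDOW[stop]
        clock = max(clock, early)   # wait if we arrive before the window
        if clock > late:            # missed the window: route infeasible
            return (float("inf"), float("inf"))
    return (clock, driven)

best = min(permutations(WINDOW), key=cost)
print(" -> ".join(best), cost(best))
# A1 -> B1 -> A2: hit neighborhood A in the morning, detour to B,
# then return to A for the delivery window.
```

The winning route, A1 to B1 and back to A2, finishes 30 minutes earlier than the “obvious” route that does both A stops together, because the obvious route sits parked waiting for A2’s window to open. The driver experiences only the backtracking, not the wait it avoided.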


Only as good as the humans who build them

As algorithms and the stories they tell us become more prevalent in our lives, it’s important to recognize that those stories may be based on noisy data, over-optimized, or so seemingly illogical that people ignore them. The stories they tell are only as good as the humans who create them. And every day, people such as product designers and data scientists work to improve these algorithms so that they can tell more meaningful stories and add even more value to our lives.
