Scott Monty - Strategic Communications & Leadership Advisor

[Note: I typically post the transcript to my podcast episodes two weeks following the airing of the shows, but this was so topical, I bumped it up.]

Have you noticed anything different about the Trending section on Facebook? That is, if you even paid attention to the Trending section in the first place…

Back in May, we shared reports that Facebook was supposedly suppressing conservative news topics in the Trending section. This was according to the team of "news curators" that Facebook hired to oversee the feature. The process gave them access to topics generated by Facebook's algorithm, which determined which news stories would be shown to users in the Trending section. The curators then wrote headlines and summaries for each topic, with links to news sites.

Facebook denied that such activity was happening — at least not intentionally — and even hosted a number of prominent conservatives at its Menlo Park headquarters for an open discussion about its practices and how to improve them. Later, the company provided "bias training" to employees to ensure the fairness of the Facebook platform.

But it was clear from the arrangement that the group was operating like something of a newsroom rather than an agnostic machine. In fact, the New York Times even reported that:
"Most major social media platforms have, in recent years, amassed editorial teams of their own, groups that select, tame and fill gaps in the material produced by users and media companies."

Well, all of that came to an abrupt halt when Facebook suddenly fired all of its human editors. The team of "news curators," as they were known internally, was contracted through a third party and was notified at 4 pm on Friday that Facebook no longer had use for their services. The new machine-driven Trending section debuted at 5 pm.

Facebook says that its goal is "to enable Trending for as many people as possible," so that they can discover interesting and relevant conversations happening on Facebook. The company says it would be unable to do that at scale by employing people, and thus has turned to its algorithm to identify and share Trending topics.

This means that no two Trending sections will be alike. What you see, what your coworkers see, what your family sees, what I see — all will be personalized to our own interests and likings. While this might seem beneficial, ushering in the era of content personalization that many consumers have sought, it is also counterintuitive: stories that are truly trending may never make it to our lists.

And if you look at the way the Trending section works with Facebook's artificial intelligence, you'll see that each topic is identified only by its title; there is no news summary or context explaining why it is trending — only how many people are talking about it. One of the great values of curation is the ability to contextualize and analyze the news — something I do with The Full Monty newsletter every week. Without this, we're left with a Trending section that is about as useful as the one we see on Twitter. And we're already beginning to see the results as fake news creeps into the trends. Even Facebook's news curation team saw the writing on the wall.

The world is moving to a point where machines will replace a number of human functions in the workplace. Think of it as the automation of everything. While the eventuality is virtually inevitable, that doesn't mean that it's easy for humans to hear or even to experience firsthand.

We just heard that Uber's goal of automating its fleet will make human drivers superfluous, which surprised some of those drivers. One lamented to a reporter: “It feels like we’re just rentals. We’re kind of like placeholders until the technology comes out.”

Just because we're removing humans from business functions doesn't mean we have to be completely emotionless about it. Yes, the move may be necessary. But you could at least do it in a humane way.
Automation is predicted to stay,
The future's already here today,
  There's one missing part -
  Machines can't learn heart
So emotions won't get in the way

Bonus: if you'd like an eerie object lesson in how this was predicted 55 years ago, watch The Twilight Zone episode "The Brain Center at Whipple's," from Season 5, in which a heartless CEO completely automates his factory and lays off almost all of his workers over the objections of his employees.

What do you think? Was this a colossal mistake? Or will Facebook adapt to a human-lite approach?

If you liked this commentary, you can listen to it in my podcast The Full Monty. Subscribe via email, or on iTunes, Google Play, Stitcher, Spreaker, or SoundCloud.
