This article comes from the 36Kr compilation team "Translation Bureau," translated by boxi, and is published with the authorization of Ai Faner.

Artificial intelligence has been around for more than 60 years, yet some of the largest technology companies in the United States (Amazon, Microsoft, Google, Facebook, etc.) are only beginning to tap its potential and figure out how AI will change our future. This article is the third part of the series "The New Rules of AI" compiled from Fast Company. It examines the flaws in recommendation algorithms and how human editing is making a comeback. The original author is Jared Newman; the original title is "How human curation came back to clean up AI's messes."

Last month, HBO launched a handy website to help viewers decide which original show to watch. But instead of using computer algorithms to sort its vast catalogue, the company hired people to record recommendation videos telling viewers which series to watch.

The site, called "Recommended by Humans," is more of a marketing gimmick than a product strategy: HBO has no plans to launch an app version for mobile or TV devices, and declines to say what role, if any, it will play in the upcoming HBO Max service. But the website's existence is itself a statement about the state of the technology. We have come to realize that recommendation algorithms are not as reliable as technology companies have claimed, and that handing content curation to people still has value. It is probably no coincidence that a few weeks after HBO's website launched, Netflix began testing human-curated collections in its own apps.

HBO and Netflix are not the only companies rediscovering the human touch. Here are some other examples:

  • Apple hired an editor-in-chief for the Apple News app in 2017, and now employs more than a dozen journalists to decide which stories to highlight.
  • In 2017, Apple redesigned the iOS App Store around human editorial picks, with feature articles and lists of recommendations on the front page.
  • In 2018, under pressure to make YouTube Kids safer, YouTube launched "Collections," featuring hand-picked videos from trusted partners such as PBS.
  • Roku recently added a children's section to its Roku Channel app, with programming recommended by an internal editorial team.
  • Last month, Facebook started hiring journalists again, this time to select top stories for the app's upcoming "News Tab" section.

Similarly, Google recently rehired Krishna Bharat, the inventor of Google News, which arguably launched the era of algorithmic content curation when it debuted in 2002. During his absence from Google, Bharat criticized the company for failing in its responsibility to vet the sources behind headlines.

Even with algorithmic recommendations, technology companies still rely on people to train their machine learning models and to moderate suspicious content (sometimes at an enormous personal cost to the moderators). What we are seeing now is a stronger push toward human curation. But until the algorithms themselves become better tastemakers, we don't know whether companies see this effort as a permanent investment or a stopgap measure.

How human curation made a comeback

Technology companies have flirted with human curation for a while, most notably when Apple Music launched in 2015. Jimmy Iovine, then the head of Apple Music, said that algorithms were not up to the "emotional task" of choosing the right song at the right time, so the company invited DJs and music-industry celebrities to run the Beats 1 radio station, and hired people to create playlists. Around the same period, Apple also began hiring editorial staff for its fast-growing news business, and Twitter was preparing a human-curated news feature code-named Project Lightning, which later became Twitter Moments.

But while this trend took root four years ago, it has become more urgent as technology companies face a backlash over the negative effects of their own products. Both Facebook and Google have admitted that their algorithmic recommendations played a shameful role in spreading misinformation and showing inappropriate content to young viewers. As for Netflix, although its social harms are not as severe as the former two's, critics have begun to wonder whether the service's algorithms are still capable of recommending good shows to new audiences.

Putting people back in the loop is one obvious way to compensate for the algorithms' flaws. For example, according to a report in The Information, the human editors for Facebook's upcoming "News Tab" have been instructed to avoid articles that polarize readers and to prioritize reports whose sources are on the record. Such judgments would be difficult for an algorithm, because the algorithm does not understand the meaning of the underlying data.

"Machine learning (or whatever you call these algorithms) is still a long way from understanding the meaning of a sentence," said Jean-Louis Gassée, a former Apple executive and now a venture capitalist, who has called for more human curation in technology products.

The editorial recommendations on the Apple App Store are another example of where algorithms fall short. A recommendation engine can suggest an app based on your past behavior, but it can't explain what using a particular app feels like, or offer an informed opinion on why one app is better than another. With human editors, Apple began providing recommendations on the front page, category pages, and even search results pages, each accompanied by a write-up explaining why the recommended app is worth your time.

Michael Bhaskar, author of Curation: The Power of Selection in a World of Excess, says that algorithms have little chance of emulating this storytelling ability.

"I think machine-driven curation has its place, because the volume of information and media is just too big," Bhaskar said. "But beyond that, people prefer people."

The limits of the human factor

The main problem with most of the human curation we see now is that it tends to be all-or-nothing. Facebook's "News Tab" may have one section of top stories selected by journalists, but the other sections are run by algorithms. The App Store has plenty of good editorial picks, but if you want to explore its less-traveled corners, such as Markdown editors, you're on your own. Technology companies use recommendation algorithms precisely because they can personalize endlessly and process huge amounts of data at low cost.

Human effort can never reach that scale: even Apple cannot hire enough experienced writers to cover millions of apps. So the revival of people-driven curation may be just a stopgap that fades as the algorithms improve. Earlier this year, YouTube said that the Collections in its YouTube Kids app exist in part to help the company's algorithms learn to distinguish high-quality programs from junk. Facebook may have a similar plan for News Tab, which is the company's second attempt at curating popular news. (A few years ago, facing accusations of liberal bias, the company fired the human editors who selected stories for its "Trending" news sidebar. The column was then run by algorithms until Facebook shut it down entirely in 2018.)

Nevertheless, a model in which people and algorithms coexist, or even help each other, in content curation is still imaginable. The news app Flipboard is a good example: it uses algorithms for personalization, but keeps humans involved throughout the process.

First, Flipboard's own users act as curators, adding articles to digital magazines for other users to read. These curators in turn help Flipboard's algorithms determine which stories and sources to recommend when someone searches for a given topic.

But Flipboard doesn't stop there; it also employs human editors to fine-tune the output for each topic. For example, if someone searches for articles about rowing or cars, the algorithm may surface lots of articles about accidents or crime, because such sensational stories tend to get the most clicks. Human editors can then down-rank those articles and promote more worthwhile stories instead.
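The hybrid approach described above can be sketched in a few lines of code. This is a minimal illustration, not Flipboard's actual system: all names, weights, and data here are invented assumptions, showing only the general idea of an engagement-based ranking adjusted by editorial overrides.

```python
# Hypothetical sketch: an algorithm ranks articles by predicted engagement,
# then human editors apply multipliers that bury or boost specific items.
# Every identifier and number here is illustrative, not a real system.

def rank_articles(articles, editorial_overrides):
    """Sort articles by engagement score, adjusted by editor overrides.

    articles: list of dicts with "id" and "engagement" (predicted clicks).
    editorial_overrides: dict mapping article id to a score multiplier,
        e.g. 0.3 to bury a sensational piece, 1.5 to promote a good one.
    """
    def adjusted(article):
        multiplier = editorial_overrides.get(article["id"], 1.0)
        return article["engagement"] * multiplier

    return sorted(articles, key=adjusted, reverse=True)

articles = [
    {"id": "crash-story", "engagement": 0.9},   # sensational, high clicks
    {"id": "rowing-guide", "engagement": 0.5},  # useful, fewer clicks
]
overrides = {"crash-story": 0.3}  # an editor buries the crash story

ranked = rank_articles(articles, overrides)
# With the override, "rowing-guide" (0.5) outranks "crash-story" (0.9 * 0.3)
```

The design point is that the algorithm still does the heavy lifting across millions of items, while a small editorial team only has to touch the cases where engagement and quality diverge.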

"We have some models that try to learn curation from our community. Really, that's the output of our editors and topic-curation team, who have the final say on everything," said Flipboard machine learning engineer Arnie Bhadury. "Building tools like that helps us extend the reach of human curation."

Another approach is to invest in human curation for its own merits, regardless of whether it scales. Apple's Beats 1 radio station, for example, isn't merely staffed with qualified people: Apple invited famous DJs and musicians, and lets these curators play music live and then talk about the songs. Author Bhaskar says such elements of human curation are irreplaceable.

"There's one thing you can never get from an algorithm," he said, "and that's the story behind it."

All of this makes me wonder: what if Apple News or Facebook's News Tab not only recommended articles but also bylined them, letting editors publicly explain why they chose each one? What if Netflix's Collections took input from directors or critics? What if YouTube Kids offered a "best of" list of children's programs, introduced by popular characters? And why can't there be a Beats 2, 3, 4, and 5?

None of this would substitute for the fine-grained recommendations algorithms provide, and none of it would be as cheap to produce. But if it's compelling enough, it can offer a degree of trust, accountability, and human connection that cold algorithms cannot. Amid the current backlash against technology, those may be valuable assets.